Articles tagged with "tactile-sensors"
China’s new humanoid robot senses delicate touch with soft skin tech
Chinese startup Matrix Robotics has unveiled MATRIX-3, its third-generation humanoid robot, which marks a significant advancement in physical artificial intelligence. Unlike previous robots limited to pre-set tasks, MATRIX-3 is designed for adaptive, real-world interaction, aiming to operate safely and autonomously in everyday commercial, medical, and home environments. The robot features a biomimetic “skin” made of flexible woven fabric embedded with distributed sensors, enabling it to detect soft touch and real-time impacts, thus enhancing safety during human–robot interaction. Its tactile sensor arrays in the fingertips can sense pressures as low as 0.1 newtons, and combined with an advanced vision system, MATRIX-3 can assess object properties and handle fragile or deformable items reliably. MATRIX-3 also boasts human-like dexterity and mobility, with a 27-degree-of-freedom hand that mimics human anatomy and uses lightweight cable-driven actuators for precise, fast movements. Its full-body motion is powered by linear actuators.
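The summary does not describe the control loop behind the fragile-object handling, but the idea can be sketched: a grip controller that ramps fingertip force in small, sensor-observable steps toward a ceiling derived from a vision-based fragility estimate. Everything below except the quoted 0.1-newton sensitivity is an illustrative assumption, including the function and constant names.

```python
# Toy sketch only, not Matrix Robotics' published controller.
SENSOR_RESOLUTION_N = 0.1  # fingertip sensitivity quoted in the article

def next_grip_setpoint(measured_force_n: float, fragility: float) -> float:
    """Return the next fingertip force setpoint, given a vision-based
    fragility estimate in [0, 1] (1 = most fragile). Constants are made up."""
    # Fragile objects get a low force ceiling: 1 N (fragile) to 10 N (rugged).
    ceiling_n = 1.0 + (1.0 - fragility) * 9.0
    # Step only a few sensor resolutions per control tick, so each increment
    # remains observable in the tactile feedback before tightening further.
    step_n = 3 * SENSOR_RESOLUTION_N
    return min(ceiling_n, measured_force_n + step_n)

# Example: an egg-like object (fragility 0.9) currently gripped at 0.5 N
print(next_grip_setpoint(0.5, 0.9))  # 0.8, well under the ~1.9 N ceiling
```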
robotics, humanoid-robot, biomimetic-skin, tactile-sensors, artificial-intelligence, human-robot-interaction, dexterous-manipulation

China's ice cream-making humanoid robot wows crowds at US tech show
At CES 2026 in Las Vegas, PaXini Tech showcased its tactile humanoid robot TORA-ONE performing a complete ice cream-making workflow autonomously, demonstrating the practical application of touch-driven intelligence beyond research settings. The company presented its full embodied intelligence stack, including advanced tactile sensors, robotic hands, humanoid platforms, and large-scale data systems. Originating from Japan’s Sugano Laboratory, PaXini focuses on enabling AI systems to understand the physical world through high-precision touch, force, and motion sensing. Central to PaXini’s technology are its independently developed tactile sensors, such as the PX-6AX-GEN3, which provide multidimensional force sensing with exceptional resolution and repeatability. These sensors, along with wrist and joint force sensing, allow robots to perceive contact similarly to human touch. The company also introduced the DexH13 dexterous hand, featuring over a thousand tactile processing units, capable of delicate manipulation tasks like grasping irregular objects and turning knobs…
robot, humanoid-robot, tactile-sensors, embodied-AI, robotics-technology, dexterous-robotic-hand, CES-2026

Humanoid robot deals cards and builds paper windmill with nimble hands
At CES 2026, Singapore-based AI robotics company Sharpa unveiled its first full-body humanoid robot, North, designed for productivity-focused autonomy. North showcased impressive dexterity through live demonstrations, performing complex tasks such as playing ping-pong with a 0.02-second reaction time, dealing cards using real-time vision and language inputs, and assembling a paper windmill, a task involving over 30 steps and handling diverse materials like thin, deformable paper, plastic pins, and sticks. These feats highlight North’s ability to adapt its grasp and manipulate objects with human-like precision, enabled by Sharpa’s proprietary robotic hand, SharpaWave, which features human-scale size, 22 degrees of freedom, and thousands of tactile sensors per fingertip. Sharpa’s Vice President Alicia Veneziani emphasized that North’s dexterity stems from the anthropomorphic design of its hand, the integration of tactile feedback, and advanced AI training that allows the robot to adjust to different interaction scenarios. The company aims to move…
robot, humanoid-robot, robotic-hand, AI-robotics, dexterous-manipulation, tactile-sensors, robotics-technology

AMD hardware-powered humanoid robot uses body as computing system
Italian robotics company Generative Bionics unveiled its humanoid robot concept, GENE.01, at CES 2026. Scheduled for commercial launch in late 2026, GENE.01 is designed around the principle of Physical AI, using its entire body as a computing system. The robot features a full-body tactile skin embedded with a distributed network of touch and force sensors, enabling it to sense contact, pressure, and subtle physical interactions. This tactile input is integrated into its core decision-making processes, allowing real-time responses to human touch or collisions, thereby facilitating safer and more natural human-robot interactions. Powered by AMD’s suite of CPUs, GPUs, embedded processors, and FPGA-based systems, GENE.01 processes sensory data locally, near the sensors, rather than relying on a centralized brain. This distributed computing approach enables split-second reactions and smoother movements, reflecting an efficiency inspired by the way human intelligence resides in both the brain and the body. Generative Bionics emphasizes openness by leveraging AMD-supported open-source…
robotics, humanoid-robot, physical-AI, tactile-sensors, AMD-processors, industrial-automation, AI-computing

China's neuromorphic e-skin lets humanoid robots sense pain and react
Researchers in China have developed a neuromorphic robotic electronic skin (NRE-skin) that enables humanoid robots to sense touch, detect injury, and respond to harmful stimuli with rapid, reflex-like movements inspired by the human nervous system. Unlike traditional robotic skins that only detect contact pressure, this e-skin mimics biological skin by converting tactile inputs into electrical pulse trains similar to human nerve signals. It features a four-layer structure, including an outer protective layer and underlying sensors that continuously monitor pressure, force, and structural integrity. The skin detects damage through the cessation of its periodic electrical pulses, allowing the robot to identify the injury location. A key innovation of the NRE-skin is its ability to trigger immediate reflexive responses without routing signals through a central processor: when pressure exceeds a preset threshold indicating potential harm, a high-voltage signal bypasses the central unit and directly activates the robot’s motors to withdraw from the stimulus, mimicking human pain reflexes. This hierarchical, neural-inspired architecture enhances…
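The reflex pathway summarized above lends itself to a short sketch: a threshold crossing triggers withdrawal directly, without consulting the central planner, while a missing periodic pulse marks a patch as damaged. This is an illustration of the described architecture, not the paper's implementation; SkinPatch, withdraw_from, and ingest are hypothetical names.

```python
# Illustrative sketch of the NRE-skin's two described behaviors:
# (1) reflex withdrawal that bypasses central processing, and
# (2) damage detection via the cessation of periodic pulses.
import time

HARM_THRESHOLD_N = 5.0  # assumed pressure threshold for the reflex
PULSE_TIMEOUT_S = 0.5   # a missing periodic pulse implies skin damage

class SkinPatch:
    def __init__(self, patch_id: str):
        self.patch_id = patch_id
        self.last_pulse = time.monotonic()

    def on_pulse(self):
        # Intact skin emits periodic pulses; record each arrival.
        self.last_pulse = time.monotonic()

    def is_damaged(self, now: float) -> bool:
        # Pulses stopping at this patch localizes the injury.
        return (now - self.last_pulse) > PULSE_TIMEOUT_S

def handle_pressure(patch, pressure_n, motors, planner):
    if pressure_n > HARM_THRESHOLD_N:
        # Reflex path: act immediately, skipping the central processor.
        motors.withdraw_from(patch.patch_id)
    else:
        # Normal path: forward the reading for deliberate processing.
        planner.ingest(patch.patch_id, pressure_n)
```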
robotics, neuromorphic-engineering, electronic-skin, humanoid-robots, tactile-sensors, robotic-safety, reflexive-response

The science of human touch – and why it’s so hard to replicate in robots - Robohub
The article by Perla Maiolino from the University of Oxford explores the complexity of human touch and the challenges in replicating it in robots. While robots have advanced significantly in visual perception and navigation, their ability to touch objects gently, safely, and meaningfully remains limited. Human touch is highly sophisticated, involving multiple types of mechanoreceptors in the skin that detect various stimuli such as vibration, stretch, and texture. Moreover, touch is an active sense, involving constant movement and adjustment to transform raw sensory input into perception. Replicating this dynamic and distributed sensory system across a robot’s entire soft body, and enabling it to interpret the rich sensory data, presents a formidable challenge. The article also highlights the concept of distributed or embodied intelligence, where behavior emerges from the interaction between body, material, and environment rather than centralized brain control. The octopus is cited as an example, with most of its neurons located in its limbs, allowing local adaptation and movement. This principle is influential in soft robotics…
robotics, soft-robotics, tactile-sensors, artificial-skin, embodied-intelligence, human-robot-interaction, sensor-technology

Skin patch lets users type and read messages through touch
Researchers have developed a soft, skin-like patch that enables users to type and receive text messages through touch, leveraging advances in stretchable electronics, gel-based sensors, and AI. Unlike conventional digital devices that detect only simple taps and swipes, this patch uses an iontronic sensor array embedded in a flexible, stretchable copper circuit layered with silicone to detect subtle pressure changes on the skin. The patch encodes ASCII characters by dividing each character into four two-bit segments, with each sensor registering presses that correspond to segment values. Feedback is provided via vibration patterns, where actuators vibrate a specific number of times to represent each segment, creating a tactile communication system aligned with the ASCII standard. To interpret touch inputs without requiring extensive data collection, the researchers developed a mathematical model simulating pressing behavior, capturing variations in force, duration, and press count. Demonstrations of the patch include typing the phrase “Go!” with tactile confirmation and controlling a racing game where presses steer the vehicle and vibration intensity indicates proximity…
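The encoding scheme as described maps directly onto bit arithmetic: an 8-bit ASCII code splits into four 2-bit segments (values 0 to 3), each entered as a press count and echoed back as a vibration count. A minimal sketch; the most-significant-first segment order is an assumption:

```python
# Split an 8-bit ASCII code into four 2-bit segments and back.
# Segment order (most-significant first) is assumed, not confirmed.

def char_to_segments(ch: str) -> list[int]:
    code = ord(ch)                      # 8-bit ASCII code point
    return [(code >> shift) & 0b11      # two bits at a time
            for shift in (6, 4, 2, 0)]  # most-significant segment first

def segments_to_char(segments: list[int]) -> str:
    code = 0
    for seg in segments:
        code = (code << 2) | (seg & 0b11)
    return chr(code)

# 'G' = 0x47 = 0b01_00_01_11, so the four segments are 1, 0, 1, 3
assert char_to_segments("G") == [1, 0, 1, 3]
assert segments_to_char([1, 0, 1, 3]) == "G"
```

The same segment values would drive the feedback side, with each actuator's vibration count encoding one segment; how a zero-valued segment is distinguished is not specified in the summary.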
IoT, wearable-technology, soft-materials, human-computer-interaction, tactile-sensors, stretchable-electronics, AI-algorithms

Seventh sense: Humans can sense buried objects like shorebirds
A recent study by researchers at Queen Mary University of London and University College London reveals that humans possess a “remote touch” ability, enabling them to detect objects buried beneath sand without direct contact. This challenges the traditional view that touch is limited to physical contact with surfaces. Participants in the study were able to locate hidden cubes under sand by sensing subtle mechanical vibrations and displacements transmitted through the granular material, a capability similar to that of shorebirds like sandpipers and plovers, which detect prey beneath sand via mechanical cues. The study also compared human performance with a robotic tactile sensor trained using a Long Short-Term Memory (LSTM) algorithm. Humans achieved higher precision (70.7%) in detecting buried objects than the robot (40%); the robot sensed objects from slightly greater distances but produced more false positives. Both human and robotic detection approached the theoretical physical limits of sensitivity. These findings expand the scientific understanding of human touch, showing it extends beyond direct contact, and suggest new directions for designing tactile…
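The study's model details aren't given in this summary, but the robotic baseline it mentions, an LSTM over tactile vibration sequences, follows a standard pattern. A minimal PyTorch sketch under assumed input shapes and channel counts:

```python
# Illustrative only: an LSTM that reads a window of tactile-vibration
# samples and emits a logit for "buried object present". The model sizes
# and single-channel input are assumptions, not the study's setup.
import torch
import torch.nn as nn

class BuriedObjectDetector(nn.Module):
    def __init__(self, n_features: int = 1, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # binary logit

    def forward(self, x):           # x: (batch, time, n_features)
        _, (h_n, _) = self.lstm(x)  # final hidden state summarizes the window
        return self.head(h_n[-1])   # (batch, 1)

# One 200-sample window from a single vibration channel (made-up shape).
window = torch.randn(1, 200, 1)
p_object = torch.sigmoid(BuriedObjectDetector()(window))
```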
robotics, tactile-sensors, remote-touch, human-robot-interaction, machine-learning, LSTM-algorithm, robotic-exploration

Figure AI designs Figure 03 humanoid for AI, home use, and scaling - The Robot Report
Figure AI Inc. has unveiled its third-generation humanoid robot, Figure 03, featuring a comprehensive redesign of hardware and software aimed at enhancing AI integration, home usability, and scalability for mass production. The robot incorporates a new sensory suite and hand system designed to reduce manufacturing costs and improve suitability for household environments. The company, based in San Jose, California, recently established a new supply chain and manufacturing process to support large-scale production, with plans to ship 100,000 units over the next four years. Figure AI has rapidly advanced its humanoid technology, earning a 2024 RBR50 Robotics Innovation Award and securing over $1 billion in committed capital, resulting in a $39 billion valuation. Figure 03 is built around Figure AI’s Helix physical AI model, enabling advanced reasoning and intelligent navigation in complex, cluttered spaces like homes. The robot’s vision system offers twice the frame rate, significantly reduced latency, and a wider field of view compared to its predecessor, supporting high-frequency…
robot, humanoid-robot, artificial-intelligence, robotics-innovation, sensory-technology, tactile-sensors, AI-robotics

Women in robotics you need to know about 2025 - Robohub
The article "Women in Robotics You Need to Know About 2025" from Robohub celebrates International Women in Robotics Day by highlighting 20 influential women shaping the robotics field worldwide. Robotics today extends beyond traditional manufacturing to areas like space exploration, healthcare, agriculture, and global connectivity. The featured women include professors, engineers, startup founders, and communicators from diverse countries such as Australia, Brazil, Canada, China, Germany, Spain, Switzerland, the UK, and the US. Their work spans tactile sensing, swarm robotics, embodied AI, and more, demonstrating the broad scope and impact of robotics research and innovation. The article emphasizes the importance of recognizing women's contributions to robotics to combat their historical invisibility and encourage greater representation. Among the honorees are Heba Khamis, co-founder of Contactile developing tactile sensors; Kelen Teixeira Vivaldini, researching autonomous robots for environmental applications; Natalie Panek, a senior engineer in space robotics; and Joelle Pineau,
robotics, women-in-robotics, tactile-sensors, autonomous-robots, AI-in-robotics, swarm-robotics, robotics-innovation

3D-printed auxetic sensors promise leap in wearable electronics
A research team from Seoul National University of Science and Technology, led by Mingyu Kang and Dr. Soonjae Pyo, has developed a novel 3D-printed tactile sensor platform based on auxetic mechanical metamaterials (AMMs). These materials exhibit a negative Poisson’s ratio, meaning they contract inward under compression, which concentrates strain and enhances sensitivity. Using digital light processing (DLP)-based 3D printing, the team fabricated cubic lattice structures with spherical voids that improve sensor performance by increasing sensitivity, maintaining stability, and minimizing crosstalk. The sensors operate in capacitive and piezoresistive modes, with the latter utilizing a carbon nanotube coating to detect resistance changes under load. The researchers demonstrated the technology’s potential through applications such as tactile arrays for spatial pressure mapping and wearable smart insoles capable of monitoring gait patterns and detecting pronation types. Unlike conventional porous structures, the auxetic design prevents lateral expansion, making the sensors more wearable and less prone to…
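For readers unfamiliar with the term: Poisson's ratio is defined as ν = −ε_trans/ε_axial, and it is negative exactly when a squeezed material also contracts in the transverse direction. A quick numeric illustration with made-up strain values:

```python
# Poisson's ratio: nu = -eps_transverse / eps_axial.
# Conventional materials (nu > 0) bulge sideways when squeezed;
# auxetics (nu < 0) pull inward instead, concentrating strain.

def poissons_ratio(eps_transverse: float, eps_axial: float) -> float:
    return -eps_transverse / eps_axial

# Compress both samples axially by 1% (eps_axial = -0.01):
print(poissons_ratio(+0.003, -0.01))  # conventional: +0.3, expands sideways
print(poissons_ratio(-0.003, -0.01))  # auxetic: -0.3, contracts sideways
```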
3D-printing, auxetic-sensors, wearable-electronics, tactile-sensors, robotics, mechanical-metamaterials, health-monitoring

Loomia Smart Skin Developer Kit to help give humanoid robots a sense of touch - The Robot Report
The Loomia Smart Skin Developer Kit is a new product designed to help roboticists incorporate flexible tactile sensing into humanoid robots and other automation systems. Recognizing that most robots lack the ability to sense touch, Loomia developed this kit after extensive interviews with over 100 engineers across industrial automation, medical devices, and robotics sectors through the National Science Foundation’s I-Corps program. Loomia’s founder, Maddy Maxey, highlighted that pressure sensing is a critical missing component in robotic hands and grippers, with no robust, flexible, plug-and-play solutions previously available. Founded in 2014, Loomia specializes in patented soft circuit systems that enable sensing, heating, and lighting in environments unsuitable for traditional printed circuit boards, and has deployed its technology in automotive, industrial, and robotics applications. The company’s flexible tactile sensors, first developed in 2018, have been shipped in over 1,000 units to enterprise clients for custom prototyping. Loomia identified key challenges faced by robotics…
robotics, tactile-sensors, humanoid-robots, flexible-electronics, soft-circuits, industrial-automation, sensor-technology

New system helps robotic arm navigate using sound instead of vision
Researchers at Carnegie Mellon University have developed SonicBoom, a novel sensing system that enables robotic arms to navigate and localize objects using sound rather than relying on visual sensors. Traditional robotic arms depend heavily on cameras for tactile sensing, which can be obstructed or damaged in cluttered environments like agricultural fields. SonicBoom addresses these challenges by embedding contact microphones along the robot’s arm that detect sound waves generated when the arm touches objects, such as branches. By analyzing subtle variations in these sound waves with AI, the system can accurately determine the exact point of contact, achieving localization errors as low as 0.43 centimeters for trained objects and maintaining strong accuracy (2.22 cm error) even with unfamiliar materials. This acoustic-based approach offers several advantages: the microphones are well-protected from harsh contact, the system is more affordable and practical than camera-based tactile sensors, and it can function effectively in visually occluded environments. The researchers demonstrated SonicBoom’s utility by mapping occluded branch-like structures in a mock canopy.
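SonicBoom's own pipeline learns the mapping from sound to contact point; as a simpler classical stand-in for the same idea, the time difference of arrival (TDOA) between two contact microphones at known positions already constrains where along the arm a tap occurred. All constants below are assumptions:

```python
# Classical TDOA sketch, not SonicBoom's learned model: estimate where a
# tap landed between two contact microphones from the lag of the
# cross-correlation peak of their signals.
import numpy as np

FS = 48_000          # sample rate (Hz), assumed
WAVE_SPEED = 1500.0  # assumed structure-borne sound speed in the arm (m/s)
MIC_SPACING = 0.30   # assumed distance between the two microphones (m)

def tdoa_seconds(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    # Lag (in seconds) of the cross-correlation peak between the mics.
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_b) - 1)
    return lag / FS

def contact_offset(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    # Offset of the contact from the midpoint of the mic pair; with this
    # correlation convention, a positive lag means mic B heard the tap
    # first, so positive offsets point toward mic B.
    dt = tdoa_seconds(sig_a, sig_b)
    offset = (WAVE_SPEED * dt) / 2.0
    return float(np.clip(offset, -MIC_SPACING / 2, MIC_SPACING / 2))
```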
robotics, robotic-arm, sound-sensing, AI, tactile-sensors, agricultural-robots, obstacle-navigation

Brighter Signals emerges from stealth - The Robot Report
Brighter Signals B.V., an Amsterdam-based sensing technology company, has emerged from stealth mode, unveiling its patented multi-modal tactile sensor platform designed to detect and measure touch with real-time pressure gradients. Founded by Andrew Klein, Christine Fraser, and Edward Shim, the company’s lightweight, durable, and recyclable sensors can be embedded into fabrics, surfaces, and structural components. Brighter Signals is initially targeting three key industries: robotics, automotive, and healthcare. In robotics, their sensors enhance tactile sensing in grippers and humanoid systems, enabling precise handling of objects, including delicate and irregularly shaped items. In automotive, the technology is being tested for occupant classification and airbag control via in-seat sensors. In healthcare, it supports passive, continuous monitoring of vital signs such as heart rate, breathing, and blood pressure through wearables and smart mattresses. The company collaborates with robotics OEMs, Tier 1 tactile solution suppliers, automotive manufacturers, and academic and clinical partners to validate and deploy its technology.
robotics, tactile-sensors, multi-modal-sensing, robotic-grippers, wearable-technology, healthcare-monitoring, automotive-sensors

Congratulations to the #ICRA2025 best paper award winners - Robohub
The 2025 IEEE International Conference on Robotics and Automation (ICRA), held May 19–23 in Atlanta, USA, announced its best paper award winners and finalists across multiple categories. The awards recognized outstanding research contributions in areas such as robot learning, field and service robotics, human-robot interaction, mechanisms and design, planning and control, and robot perception. Each category featured a winning paper along with several finalists, highlighting cutting-edge advancements in robotics. Notable winners include "Robo-DM: Data Management for Large Robot Datasets" by Kaiyuan Chen et al. for robot learning, "PolyTouch: A Robust Multi-Modal Tactile Sensor for Contact-Rich Manipulation Using Tactile-Diffusion Policies" by Jialiang Zhao et al. for field and service robotics, and "Human-Agent Joint Learning for Efficient Robot Manipulation Skill Acquisition" by Shengcheng Luo et al. for human-robot interaction. Other winning papers addressed topics such as soft robot worm behaviors, robust sequential task solving via dynamically composed gradient descent, and metrics-aware covariance for stereo visual odometry. The finalists presented innovative work ranging from drone detection to adaptive navigation and assistive robotics, reflecting the broad scope and rapid progress in the robotics field showcased at ICRA 2025.
robotics, robot-learning, human-robot-interaction, tactile-sensors, robot-automation, soft-robotics, robot-navigation