Articles tagged with "human-robot-interaction"
Upcoming 'Yogi' humanoid robot to focus on human connections
Cartwheel Robotics is developing a humanoid robot named Yogi, designed primarily to foster genuine human connections and serve as a friendly, emotionally intelligent companion in homes and workplaces. Unlike many other robotics firms focusing on factory automation—such as Tesla’s Optimus robot—Cartwheel emphasizes natural movement, safety, and approachability. Yogi is constructed with medical-grade silicone and soft protective materials, features modular swappable batteries for extended operation, and incorporates precision-engineered actuators with overload protection. The robot aims to assist with light household tasks while maintaining intuitive and reliable interactions, reflecting Cartwheel’s goal to integrate humanoid AI into everyday life by enhancing how people live, work, and care for one another. Humanoid Global Holdings Corp., Cartwheel’s parent investment company, highlighted that Yogi is built on a proprietary full-stack humanoid platform combining custom hardware, AI models, motion systems, and software. Cartwheel is expanding operations with a new facility in Reno, Nevada, set to open in January
Tags: robot, humanoid-robot, AI, home-automation, robotics-technology, human-robot-interaction, battery-technology

China builds humanoid robot with realistic eye movements, bionic skin
China’s AheadForm Technology has developed a highly advanced humanoid robot named Elf V1, featuring lifelike bionic skin and realistic eye movements designed for natural daily interactions. The robot integrates 30 facial muscles controlled by brushless micro-motors and a high-precision control system, enabling expressive facial features, synchronized speech, and the ability to convey emotions and interpret human non-verbal cues. This design aims to overcome the “uncanny valley” effect, making interactions with humans more natural and engaging. Powered by self-supervised AI algorithms and enhanced with Large Language Models (LLMs) and Vision-Language Models (VLMs), Elf V1 can perceive its environment, communicate intelligently, and adapt in real-time to human emotions and behaviors. AheadForm envisions these robots providing assistance, companionship, and support across various industries, bridging the gap between humans and machines. The company’s previous Lan Series offered more cost-efficient humanoids with 10 degrees of freedom, while Elf V1 represents a
Tags: robotics, humanoid-robot, bionic-skin, AI-robotics, human-robot-interaction, advanced-control-systems, emotion-recognition

Diligent Robotics adds two members to AI advisory board - The Robot Report
Diligent Robotics, known for its Moxi mobile manipulator used in hospitals, has expanded its AI advisory board by adding two prominent experts: Siddhartha Srinivasa, a robotics professor at the University of Washington, and Zhaoyin Jia, a distinguished engineer specializing in robotic perception and autonomy. The advisory board, launched in late 2023, aims to guide the company’s AI development with a focus on responsible practices and advancing embodied AI. The board includes leading academics and industry experts who provide strategic counsel as Diligent scales its Moxi robot deployments across health systems nationwide. Srinivasa brings extensive experience in robotic manipulation and human-robot interaction, having led research and development teams at Amazon Robotics and Cruise, and contributed influential algorithms and systems like HERB and ADA. Jia offers deep expertise in computer vision and large-scale autonomous systems from his leadership roles at Cruise, DiDi, and Waymo, focusing on safe and reliable AI deployment in complex environments. Diligent Robotics’
Tags: robotics, AI, healthcare-robots, autonomous-robots, human-robot-interaction, robotic-manipulation, embodied-AI

IEEE study group publishes framework for humanoid standards
The IEEE Humanoid Study Group has published a comprehensive framework aimed at guiding the development of standards for humanoid robots. This framework addresses the unique risks and capabilities of humanoids to support their safe and effective deployment across industrial, service, and public sectors. The study group focused on three key interconnected areas: Classification, Stability, and Human-Robot Interaction (HRI). Classification involves creating a clear taxonomy to define humanoid robots by their physical and behavioral traits and application domains, serving as a foundation for identifying applicable standards and gaps. Stability focuses on developing measurable metrics and safety standards for balancing robots, including dynamic balance and fall-response behaviors. HRI guidelines aim to ensure safe, trustworthy interactions between humans and humanoid robots, covering collaborative safety, interpretable behavior, and user training. Led by Aaron Prather of ASTM International, the working group comprised over 60 experts from industry, academia, and regulatory bodies who collaborated for more than a year. Their efforts included market research, vendor and end-user interviews,
Tags: robotics, humanoid-robots, robot-standards, human-robot-interaction, robotics-safety, IEEE-standards, autonomous-systems

China's humanoid robot head shocks with 'lifelike facial expressions'
Chinese robotics company AheadForm has developed a humanoid robotic head capable of expressing a wide range of realistic facial emotions, aiming to enhance human-robot interaction. Their robot head, showcased in a viral YouTube video, features lifelike eye movements, blinking, and expressive facial cues achieved through a combination of self-supervised AI algorithms and advanced bionic actuation technology. AheadForm’s “Elf series” of robots, characterized by elf-like features such as large ears, incorporate up to 30 degrees of freedom in facial movement, powered by precise control systems and AI learning algorithms. Their latest model, “Xuan,” is a full-body bionic figure with a static body but a highly interactive head capable of rich facial expressions and lifelike gaze behaviors. A key innovation enabling these realistic expressions is a specialized brushless motor designed for ultra-quiet, responsive, and energy-efficient facial control, allowing subtle and precise movements. AheadForm’s founder, Hu Yuhang, envisions humanoid robots that feel
Tags: robot, humanoid-robot, AI-algorithms, bionic-actuation, brushless-motor, human-robot-interaction, lifelike-facial-expressions

Launch of the World's Cuddliest Robot
The article announces the release of the GR-3, described as the world’s cuddliest robot, now available for purchase. Developed by Fourier, the GR-3 embodies the company’s commitment to creating empathic robot companions designed to assist humans in everyday activities. The robot aims to provide emotional support and practical help, blending advanced technology with a comforting, approachable design. Key takeaways include Fourier’s emphasis on empathy in robotics, positioning the GR-3 not just as a functional assistant but also as a companion that can enhance users’ emotional well-being. While specific features and capabilities of the GR-3 are not detailed in the article, its launch marks a significant step in the integration of robotics into daily human life, focusing on both utility and emotional connection.
Tags: robot, robotics, empathic-robots, companion-robots, GR-3-robot, human-robot-interaction

This $30M startup built a dog crate-sized robot factory that learns by watching humans
San Francisco-based startup MicroFactory has developed a compact, dog crate-sized robotic manufacturing system designed for precision tasks such as circuit board assembly, soldering, and cable routing. Unlike traditional humanoid or large-scale factory robots, MicroFactory’s enclosed workstation features two robotic arms that can be trained through direct human demonstration as well as AI, enabling faster and more intuitive programming for complex manufacturing sequences. Co-founder and CEO Igor Kulakov emphasized that this approach simplifies both hardware and AI development while allowing users to observe the manufacturing process in real time. Founded in 2024 by Kulakov and Viktor Petrenko, who previously ran a manufacturing business, MicroFactory built its prototype within five months and has since received hundreds of preorders for diverse applications, including electronics assembly and even food processing. The company recently raised $1.5 million in a pre-seed funding round, valuing it at $30 million post-money, with investors including executives from Hugging Face and Naval Ravikant. MicroFactory plans to
Tags: robotics, manufacturing-automation, AI-robotics, robotic-arms, tabletop-robot-factory, human-robot-interaction, precision-manufacturing

'World’s cutest' humanoid carries out chores with warmth, care
The Fourier GR-3 humanoid robot, developed by Chinese firm Fourier Robotics, is designed to support meaningful human interaction by combining emotional intelligence with practical functionality. Unlike traditional robots, the GR-3 can express empathy and kindness, making it feel more like a companion than a machine. It demonstrates capabilities such as eidetic memory to assist an art curator, multilingual communication to guide museum visitors, and home assistance by managing daily schedules. The robot also exhibits advanced visual recognition and human-like locomotion, responding naturally to gestures like waving. Weighing 71 kg and standing 165 cm tall, the GR-3 features 55 degrees of freedom for balanced, fluid movement and an animated facial interface that enhances its lifelike presence. Its emotional intelligence is powered by Fourier’s Full-Perception Multimodal Interaction System, integrating sight, sound, and touch, with 31 pressure sensors enabling responsive actions such as blinking and eye tracking. The robot supports continuous operation with a swappable battery and adaptable movement modes
Tags: robot, humanoid-robot, emotional-intelligence, human-robot-interaction, robotics-technology, autonomous-robots, smart-robotics

Humans can ‘borrow’ robot hands as their own, scientists discover
Researchers from the Italian Institute of Technology and Brown University have discovered that humans can unconsciously incorporate a humanoid robot’s hand into their body schema—the brain’s internal map of the body and its spatial relationship to the environment—especially when collaborating on a task. In experiments involving a child-sized robot named iCub, participants who jointly sliced a soap bar with the robot showed faster reactions to visual cues near the robot’s hand, indicating that their brains treated the robot’s hand as part of their own near space. This effect was contingent on active collaboration and was influenced by the robot’s movement style, with broader, fluid, and well-synchronized gestures enhancing the cognitive integration. The study also found that physical proximity and the participant’s perception of the robot’s competence and pleasantness strengthened this integration. Participants who attributed more human-like traits or emotions to the robot exhibited a stronger cognitive bond, suggesting that empathy and partnership play important roles in human-robot interaction. These findings provide valuable insights for designing future robots that can
Tags: robot, humanoid-robot, human-robot-interaction, body-schema, cognitive-integration, rehabilitation-robotics, iCub-robot

Humanoid robot HITTER plays table tennis with human-like speed
UC Berkeley has developed a humanoid robot named HITTER that can play table tennis with human-like speed and agility. Demonstrated in a video, HITTER successfully engaged in rallies exceeding 100 shots against human opponents, using its left hand for balance and executing precise, fluid movements. The robot’s performance relies on a dual-system design: a high-level planner that tracks and predicts the ball’s trajectory using external cameras, and a low-level controller that converts these calculations into coordinated arm and leg motions. Trained on human motion data, HITTER can move naturally, reacting to balls traveling up to 5 m/s in under a second. The development team combined model-based planning with reinforcement learning to overcome the challenges of split-second decision-making and unpredictable shots inherent in table tennis. This hybrid approach enabled HITTER to fine-tune its movements through trial and error, resulting in lifelike swings and footwork. Tested on a general-purpose humanoid platform (likely the Unitree G1), HITTER demonstrated its
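As a rough illustration of the dual-system design described above, here is a minimal Python sketch of a high-level trajectory predictor feeding a low-level whole-body controller. The class names, the constant-velocity model, and all numbers are illustrative assumptions, not the Berkeley team's code:

```python
import numpy as np

class HighLevelPlanner:
    """Tracks the ball via external cameras and predicts an intercept."""

    def __init__(self, strike_plane_x: float = 0.3):
        self.strike_plane_x = strike_plane_x   # strike plane, metres from robot

    def predict_intercept(self, p_prev: np.ndarray, p_now: np.ndarray, dt: float):
        """Constant-velocity extrapolation; the real planner would also
        model gravity, drag, and the table bounce."""
        v = (p_now - p_prev) / dt
        if v[0] >= 0:                          # ball moving away: nothing to do
            return None
        t_hit = (self.strike_plane_x - p_now[0]) / v[0]
        return p_now + v * t_hit, t_hit

class LowLevelController:
    """Stand-in for the learned policy that turns the intercept target into
    coordinated arm and leg commands (trained on human motion data)."""

    def act(self, intercept: np.ndarray, joint_state: np.ndarray) -> np.ndarray:
        return np.zeros_like(joint_state)      # a policy network would run here

planner, controller = HighLevelPlanner(), LowLevelController()
result = planner.predict_intercept(np.array([2.00, 0.1, 1.0]),
                                   np.array([1.96, 0.1, 1.0]), dt=1 / 120)
if result is not None:
    intercept, t_hit = result                  # sub-second time budget
    command = controller.act(intercept, joint_state=np.zeros(23))
```

The split mirrors the article's description: perception and prediction run at camera rate, while the learned controller only has to map a target point to joint motion.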
Tags: robotics, humanoid-robot, reinforcement-learning, AI-planning, human-robot-interaction, table-tennis-robot, robot-motion-control

Emotional intelligence is ElliQ's core strength, says Intuition Robotics - The Robot Report
Intuition Robotics, founded in 2016 by Dor Skuler, developed ElliQ, an AI care companion robot designed to promote independence and healthy living among older adults. Skuler’s personal experiences caring for his grandfather highlighted the importance of emotional connection and personality in caregiving, beyond just technical skills. This insight led Intuition Robotics to focus on emotional intelligence as the core strength of ElliQ, aiming to create empathetic interactions that can address loneliness and provide meaningful companionship rather than merely performing physical tasks. Unlike many developers pursuing fully mobile humanoid robots, Intuition Robotics chose to create a stationary device that emphasizes social interaction and emotional engagement. ElliQ’s design centers on a “social interaction stack” that enables it to initiate conversations naturally and understand the nuances of human behavior and etiquette within the home environment. Skuler emphasized that true utility in caregiving robots requires blending seamlessly into the complexities of daily life, making ElliQ more of a friend or roommate than just a functional tool. The company’s approach reflects
Tags: robot, AI-care-companion, emotional-intelligence, human-robot-interaction, elder-care-technology, social-robots, Intuition-Robotics

New algorithm teaches robots how not to hurt humans in workplaces
Researchers at the University of Colorado Boulder have developed a new algorithm that enables robots to make safer decisions when working alongside humans in factory environments. Inspired by game theory, the algorithm treats the robot as a player seeking an “admissible strategy” that balances task completion with minimizing potential harm to humans. Unlike traditional approaches focused on winning or perfect prediction, this system prioritizes human safety by anticipating unpredictable human actions and choosing moves that the robot will not regret in the future. The algorithm allows robots to respond intelligently and proactively in collaborative workspaces. If a human partner acts unexpectedly or makes a mistake, the robot first attempts to correct the issue safely; if unsuccessful, it may relocate its task to a safer area to avoid endangering the person. This approach acknowledges the variability in human expertise and behavior, requiring robots to adapt to all possible scenarios rather than expecting humans to adjust. The researchers envision that such robots will complement human strengths by handling repetitive, physically demanding tasks, potentially addressing labor shortages in sectors like elder care.
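A toy sketch of the "safety-first" selection rule the summary describes: keep only actions whose worst-case harm stays bounded for every possible human action, then pick the best task progress among them. The payoff numbers and action names are invented for illustration; this is not the Boulder group's actual algorithm:

```python
# harm[robot_action][human_behaviour] and progress[robot_action] are
# invented payoffs for a single decision point.
HARM_LIMIT = 0.2

harm = {
    "continue_task": {"expected": 0.00, "mistake": 0.90},
    "correct_issue": {"expected": 0.10, "mistake": 0.15},
    "relocate_task": {"expected": 0.05, "mistake": 0.05},
}
progress = {"continue_task": 1.0, "correct_issue": 0.7, "relocate_task": 0.4}

def admissible_action() -> str:
    # Keep only actions whose harm is bounded for *all* human behaviours,
    # mirroring the idea of adapting to every possible scenario.
    safe = [a for a, h in harm.items() if max(h.values()) <= HARM_LIMIT]
    # Among the safe actions, maximise task completion.
    return max(safe, key=progress.get)

print(admissible_action())  # -> "correct_issue"
```

Note how the rule reproduces the behaviour in the summary: the robot prefers correcting the issue, and relocating the task remains the fallback if correction were ever unsafe.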
Tags: robot, robotics, human-robot-interaction, safety-algorithms, industrial-robots, workplace-safety, artificial-intelligence

MIT roboticists debate the future of robotics, data, and computing - The Robot Report
At the IEEE International Conference on Robotics and Automation (ICRA), leading roboticists debated the future direction of robotics, focusing on whether advances will be driven primarily by code-based models or data-driven approaches. The panel, moderated by Ken Goldberg of UC Berkeley and featuring experts such as Daniela Rus, Russ Tedrake, Leslie Kaelbling, and others, highlighted a growing divide in the field. Rus and Tedrake strongly advocated for data-centric methods, emphasizing that real-world robotics requires machines to learn from extensive, multimodal datasets capturing human actions and environmental variability. They argued that traditional physics-based models work well in controlled settings but fail to generalize to unpredictable, human-centered tasks. Rus’s team at MIT’s CSAIL is pioneering this approach by collecting detailed sensor data on everyday human activities like cooking, capturing nuances such as gaze and force interactions to train AI systems that enable robots to generalize and adapt. Tedrake illustrated how scaling data enables robots to develop "common sense" for dexterous manipulation.
Tags: robotics, artificial-intelligence, machine-learning, robotics-research, data-driven-robotics, human-robot-interaction, robotic-automation

UL Solutions opens 1st service robot testing lab
UL Solutions, a global leader in applied safety science, has opened its first testing laboratory for commercial and service robots in Seoul, South Korea. The lab aims to provide testing and certification services focused on identifying emerging hazards, especially those related to human-robot interactions. It will primarily test compliance with UL 3300, the Standard for Safety for Service, Communication, Information, Education and Entertainment Robots. This standard addresses critical safety aspects such as mobility, fire and shock hazards, and safe interaction with vulnerable individuals, requiring features like speed limits, object detection, and audible/visual indicators to ensure robots operate safely alongside people in public and commercial settings. The establishment of this lab reflects the rapid growth of the robotics industry, where robots are increasingly deployed in diverse environments including hotels, healthcare, retail, and delivery services. UL Solutions highlights the importance of addressing new safety concerns as robots take on more roles outside traditional industrial floors. The global service robotics market is expanding, particularly in the Asia-Pacific region, driven by labor shortages.
Tags: robot, service-robots, robot-testing, human-robot-interaction, UL-3300-standard, robotics-safety, commercial-robots

Soft robot jacket offers support for upper-limb disabilities
Researchers at Harvard John A. Paulson School of Engineering and Applied Sciences, in collaboration with Massachusetts General Hospital and Harvard Medical School, have developed a soft, wearable robotic jacket designed to assist individuals with upper-limb impairments caused by conditions such as stroke and ALS. This device uses a combination of machine learning and a physics-based hysteresis model to personalize movement assistance by accurately detecting the user’s motion intentions through sensors. The integrated real-time controller adjusts the level of support based on the user’s specific movements and kinematic state, enhancing control transparency and practical usability in daily tasks like eating and drinking. In trials involving stroke and ALS patients, the robotic jacket demonstrated a 94.2% accuracy in identifying subtle shoulder movements and reduced the force needed to lower the arm by nearly one-third compared to previous models. It also improved movement quality by increasing range of motion in the shoulder, elbow, and wrist, reducing compensatory trunk movements by up to 25.4%, and enhancing hand-path efficiency by up
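A minimal sketch of the intent-based assistance loop described above, assuming a simple velocity-threshold intent detector in place of the real learned model and hysteresis compensation; all names, thresholds, and gains are illustrative:

```python
from dataclasses import dataclass

@dataclass
class KinematicState:
    shoulder_angle: float     # rad
    shoulder_velocity: float  # rad/s

def detect_intent(state: KinematicState) -> str:
    """Stand-in for the learned intent detector (94.2% accuracy in trials)."""
    if state.shoulder_velocity > 0.05:
        return "raise"
    if state.shoulder_velocity < -0.05:
        return "lower"
    return "hold"

def assist_level(state: KinematicState) -> float:
    """Map intent to a normalised actuation level; lowering gets reduced
    support so the user needs less force to bring the arm down."""
    return {"raise": 1.0, "hold": 0.6, "lower": 0.3}[detect_intent(state)]

print(assist_level(KinematicState(shoulder_angle=0.8,
                                  shoulder_velocity=-0.2)))  # lowering -> 0.3
```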
Tags: soft-robotics, wearable-robots, upper-limb-support, assistive-technology, machine-learning, rehabilitation-robotics, human-robot-interaction

Interview with Haimin Hu: Game-theoretic integration of safety, interaction and learning for human-centered autonomy - Robohub
In this interview, Haimin Hu discusses his PhD research at Princeton Safe Robotics Lab, which centers on the algorithmic foundations of human-centered autonomy. His work integrates dynamic game theory, machine learning, and safety-critical control to develop autonomous systems—such as self-driving cars, drones, and quadrupedal robots—that are safe, reliable, and adaptable in human-populated environments. A key innovation is a unified game-theoretic framework that enables robots to plan motion by considering both physical and informational states, allowing them to interact safely with humans, adapt to their preferences, and even assist in skill refinement. His contributions span trustworthy human-robot interaction through real-time learning to reduce uncertainty, verifiable neural safety analysis for complex robotic systems, and scalable game-theoretic planning under uncertainty. Hu highlights the challenge of defining safety in human-robot interaction, emphasizing that statistical safety metrics alone are insufficient for trustworthy deployment. He argues for robust safety guarantees comparable to those in critical infrastructure, combined with runtime learning
Tags: robot, human-robot-interaction, autonomous-systems, safety-critical-control, game-theory, machine-learning, autonomous-vehicles

Elephant Robotics builds myCobot Pro 450 to meet industrial expectations - The Robot Report
Elephant Robotics has launched the myCobot Pro 450, a compact collaborative robot arm designed to meet industrial-level demands across education, research, and commercial applications. The robot features a modular design with a 1 kg payload, 450 mm reach, and high positioning accuracy of ±0.1 mm. Weighing under 5 kg, it incorporates harmonic reducers, servo motors, joint brakes, and integrated controllers within an all-metal, durable housing. The myCobot Pro 450 supports various end effectors such as cameras, suction pumps, and grippers, enabling rapid deployment for tasks like data collection, fine manipulation, and intelligent human-robot interaction (HRI). The cobot supports personalized applications including 3D visual random sorting, robotic writing and painting, and compound mobile inspections. It integrates with peripherals like 3D cameras, recognition software, industrial PCs, and mobile platforms (e.g., myAGV Pro) to offer scalable solutions. Notably, the myCobot Pro 450
Tags: robot, collaborative-robot, myCobot-Pro-450, industrial-automation, AI-integration, human-robot-interaction, robotic-arm

China’s Kaiwa plans world’s first pregnancy humanoid robot
Chinese tech company Kaiwa Technology, based in Guangzhou, is developing what it claims will be the world’s first pregnancy humanoid robot, set to debut by 2026 at a price under $13,900. This humanoid robot features an embedded artificial womb designed to carry a fetus through the entire ten-month gestation period, replicating natural pregnancy by using artificial amniotic fluid and nutrient delivery via a hose. The technology, reportedly mature in laboratory settings, aims to offer an alternative to human pregnancy, potentially benefiting those who wish to avoid the physical burdens of gestation. The project has sparked significant public debate over ethical, legal, and scientific implications, with discussions already underway with authorities in Guangdong Province. The artificial womb technology builds on prior advances, such as the 2017 “biobag” experiment where premature lambs were nurtured in artificial amniotic fluid, though current artificial wombs mainly support partial gestation rather than full-term pregnancy. Kaiwa’s vision requires further breakthroughs
Tags: robot, humanoid-robot, artificial-womb, AI-technology, pregnancy-robot, robotics-innovation, human-robot-interaction

Sensing robot hand flicks, flinches, and grips like a human
A student team at USC Viterbi, led by assistant professor Daniel Seita, has developed the MOTIF Hand, a robotic hand designed to mimic human touch by sensing multiple modalities such as pressure, temperature, and motion. Unlike traditional robot grippers, the MOTIF Hand integrates a thermal camera embedded in its palm to detect heat without physical contact, allowing it to "flinch" away from hot surfaces much like a human would. It also uses force sensors in its fingers to apply precise pressure and can gauge the weight or contents of objects by flicking or shaking them, replicating human instincts in object interaction. The MOTIF Hand builds on previous open-source designs like Carnegie Mellon’s LEAP Hand, with the USC team also committing to open-source their work to foster collaboration in the robotics community. The developers emphasize that this platform is intended as a foundation for further research, aiming to make advanced tactile sensing accessible to more teams. Their findings have been published on Arxiv, highlighting a significant step toward
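A small sketch of the non-contact "flinch" reflex described above: if the palm's thermal camera sees a region hotter than a threshold, retract before touching. The threshold, class, and method names are assumptions for illustration, not the MOTIF Hand's API:

```python
import numpy as np

FLINCH_TEMP_C = 60.0  # illustrative safety threshold, not from the paper

class Gripper:
    def retract(self) -> None:
        print("flinch: retracting away from heat")

    def continue_approach(self) -> None:
        print("approach: continuing toward object")

def palm_reflex(thermal_image: np.ndarray, gripper: Gripper) -> None:
    """thermal_image holds per-pixel temperatures (deg C) from the palm
    camera; flinch away from hot surfaces before any contact is made."""
    if thermal_image.max() > FLINCH_TEMP_C:
        gripper.retract()
    else:
        gripper.continue_approach()

palm_reflex(np.full((32, 32), 75.0), Gripper())  # hot mug -> flinch
```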
Tags: robot, robotic-hand, sensors, human-robot-interaction, tactile-sensing, thermal-detection, robotics-research

GR-3 humanoid robot debuts with empathy, emotion, and lifelike walk
The GR-3 humanoid robot, unveiled by Fourier on August 6, 2025, represents a significant advancement in human-robot interaction by emphasizing empathy, emotional awareness, and lifelike movement. Standing 165 cm tall and weighing 71 kg, GR-3 features 55 degrees of freedom enabling natural, balanced motion, including expressive gaits such as a “bouncy walk.” Its design incorporates a soft-touch shell with warm tones and premium upholstery to create a familiar, comforting presence rather than a mechanical one. Central to its capabilities is Fourier’s Full-Perception Multimodal Interaction System, which integrates vision, audio, and tactile inputs into a real-time emotional processing engine. This system allows GR-3 to localize voices, maintain eye contact, recognize faces, and respond to touch via 31 pressure sensors, producing subtle emotional gestures that simulate genuine empathy. Beyond sensing, GR-3 employs a dual-path cognitive architecture combining fast, reflexive responses with slower, context-aware reasoning
Tags: robotics, humanoid-robot, emotional-AI, human-robot-interaction, healthcare-robotics, empathetic-robots, assistive-technology

Japan team builds falcon-like drone that lands softly on your palm
Researchers at the University of Tokyo’s DRAGON Lab have developed a falcon-inspired flapping-wing drone capable of safely landing on a person’s palm without cushions. Unlike traditional propeller drones, this drone uses soft, flexible wings that mimic bird flight, resulting in quieter operation and a gentler presence ideal for close human interaction. The design is inspired by falconry and represents the first successful contact-based interaction between a flapping-wing drone and a human, emphasizing safety through careful flight planning that accounts for physical and psychological factors such as distance, altitude, approach direction, and velocity. The drone maintains a minimum distance of 0.3 meters from the user’s chest, slows down as it approaches, and stays within a comfortable altitude range between the elbow and eye level. It is controlled through intuitive hand gestures—bending the arm signals the drone to hover, while extending the arm commands it to approach and land. A sophisticated motion capture system with multiple cameras tracks markers on the user and drone, enabling
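A minimal sketch of the gesture protocol and comfort constraints reported above (bent arm to hover, extended arm to approach and land, a 0.3 m chest buffer, and an elbow-to-eye altitude band). The elbow-angle cutoff and the speed law are invented for illustration:

```python
MIN_CHEST_DISTANCE_M = 0.3  # minimum distance from the user's chest

def command_from_arm(elbow_angle_deg: float) -> str:
    """Bent arm (small elbow angle) means hover; extended arm means land."""
    return "hover" if elbow_angle_deg < 120 else "approach_and_land"

def approach_speed(distance_to_palm_m: float, v_max: float = 1.0) -> float:
    """Slow down as the drone nears the hand; stop inside the chest buffer."""
    if distance_to_palm_m <= MIN_CHEST_DISTANCE_M:
        return 0.0
    return min(v_max, 0.5 * (distance_to_palm_m - MIN_CHEST_DISTANCE_M))

def altitude_ok(z: float, elbow_z: float, eye_z: float) -> bool:
    """Stay within the comfortable band between elbow and eye level."""
    return elbow_z <= z <= eye_z

print(command_from_arm(95))          # bent arm -> "hover"
print(approach_speed(0.8))           # 0.25 m/s while still far out
print(altitude_ok(1.3, 1.1, 1.6))    # True: within the comfort band
```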
Tags: robot, drone, flapping-wing-drone, human-robot-interaction, gesture-control, motion-planning, safe-landing

Interview with Kate Candon: Leveraging explicit and implicit feedback in human-robot interactions - Robohub
In this interview, Kate Candon, a PhD student at Yale University, discusses her research on improving human-robot interaction by leveraging both explicit and implicit feedback. Traditional robot learning often relies on explicit feedback, such as simple "good job" or "bad job" signals from a human teacher who is not actively engaged in the task. However, Candon emphasizes that humans naturally provide a range of implicit cues—like facial expressions, gestures, or subtle actions such as moving an object away—that convey valuable information without additional effort. Her current research aims to develop a framework that combines these implicit signals with explicit feedback to enable robots to learn more effectively from humans in natural, interactive settings. Candon explains that interpreting implicit feedback is challenging due to variability across individuals and cultures. Her initial approach focuses on analyzing human actions within a shared task to infer appropriate robot responses, with plans to incorporate visual cues such as facial expressions and gestures in future work. The research is tested in a pizza-making scenario, chosen for
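A toy sketch of the core idea of fusing explicit and implicit feedback into a single learning signal. The weighting scheme and cue scores are invented; Candon's actual framework infers implicit signals from the human's actions within the shared task:

```python
def combined_feedback(explicit: float | None,
                      implicit_cues: dict[str, float],
                      w_explicit: float = 0.7) -> float:
    """explicit: +1 ("good job"), -1 ("bad job"), or None if absent.
    implicit_cues: scores in [-1, 1], e.g. a frown, or the human moving
    an object away from the robot (a corrective action)."""
    implicit = sum(implicit_cues.values()) / max(len(implicit_cues), 1)
    if explicit is None:
        return implicit
    return w_explicit * explicit + (1 - w_explicit) * implicit

# The human says nothing, but frowns and moves the object away:
print(combined_feedback(None, {"facial": -0.4, "moved_object_away": -0.8}))
# -> -0.6: a usable negative signal with zero extra effort from the human
```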
Tags: robot, human-robot-interaction, implicit-feedback, explicit-feedback, interactive-agents, robot-learning, AI

Cutest Humanoid Robot Ready For Launch
The article introduces the Fourier GR-3, a new humanoid robot designed primarily for companionship and caregiving purposes. It highlights the robot's notably cute appearance, which sets it apart from previous models and may enhance its acceptance and integration into human environments. The robot's design aims to foster more natural and engaging interactions between humans and robots. While specific capabilities of the Fourier GR-3 are not detailed in the provided content, the article suggests that its launch could mark a significant step forward in how robots assist with caregiving and social companionship. The potential impact includes improving the quality of life for individuals needing support and advancing the development of empathetic and interactive robotic companions. However, further information about its functionalities and deployment remains unclear from the excerpt.
Tags: robot, humanoid-robot, robotics, AI, companion-robot, caregiving-robot, human-robot-interaction

Fourier to unveil world's most 'adorable' humanoid robot next week
Shanghai-based robotics company Fourier Robotics is set to unveil its newest humanoid robot, the GR-3, on August 6, 2025. The GR-3 follows the GR-1 and GR-2 models but features a notably smaller and friendlier design, standing approximately 4 feet 5 inches (134 cm) tall, compared to the taller predecessors. The robot’s aesthetic is described as “softer” and more “adorable,” with expressive eyes aimed at enhancing user engagement. Designed primarily for domestic, educational, healthcare, and public environments, the GR-3 integrates a large language model (LLM) to facilitate natural speech interaction, positioning it as a companion or caregiver robot optimized for friendly human interaction. Building on Fourier’s previous models, which showcased advanced mobility, perception, and dexterous manipulation, the GR-3 is expected to emphasize compact hardware and approachable design suitable for home and classroom settings. While likely featuring simpler actuation and sensing compared to the GR-2
Tags: robotics, humanoid-robot, AI-companion, smart-actuators, domestic-robots, educational-robots, human-robot-interaction

New soft robot arm scrubs toilets and dishes with drill-level force
Researchers at Northeastern University have developed SCCRUB, a novel soft robotic arm designed to tackle tough cleaning tasks with drill-level scrubbing power while maintaining safety around humans. Unlike traditional rigid industrial robots, SCCRUB uses flexible yet strong components called TRUNC cells—torsionally rigid universal couplings—that allow the arm to bend and flex while transmitting torque comparable to a handheld drill. This combination enables the robot to apply significant force to remove stubborn grime without posing risks typical of hard robotic arms. Equipped with a counter-rotating scrubber brush and guided by a deep learning-based controller, SCCRUB can clean challenging messes such as microwaved ketchup and fruit preserves on glass dishes and toilet seats, removing over 99% of residue in lab tests. The counter-rotating brush design helps maintain firm pressure and stability by canceling frictional forces, enhancing cleaning effectiveness while preserving the arm’s soft and safe nature. The research team envisions expanding SCCRUB’s capabilities to assist humans
Tags: robot, soft-robotics, robotic-arm, machine-learning, automation, cleaning-robot, human-robot-interaction

MIT’s 3-in-1 training tool eases robot learning
MIT engineers have developed a novel three-in-one training interface that allows robots to learn new tasks through any of three common demonstration methods: remote control (teleoperation), physical manipulation (kinesthetic training), or by observing a human perform the task (natural teaching). This handheld, sensor-equipped tool can attach to many standard robotic arms, enabling users to teach robots in whichever way best suits the task or user preference. The interface was tested on a collaborative robotic arm by manufacturing experts performing typical factory tasks, demonstrating increased flexibility in robot training. This versatile demonstration interface aims to broaden the range of users who can effectively teach robots, potentially expanding robot adoption beyond manufacturing into areas like home care and healthcare. For example, one person could remotely train a robot to handle hazardous materials, another could physically guide the robot in packaging, and a third could demonstrate drawing a logo for the robot to mimic. The research, led by MIT’s Department of Aeronautics and Astronautics and CSAIL, was presented at the IEEE International Conference on Robotics and Automation (ICRA).
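A small sketch of the unifying idea behind the three-in-one interface: whichever way the demonstration comes in, normalize it to one trajectory format that a single learning pipeline can consume. The enum, field names, and frame layout are illustrative assumptions:

```python
from enum import Enum, auto

class DemoMode(Enum):
    TELEOPERATION = auto()  # remote control
    KINESTHETIC = auto()    # physically guiding the arm
    NATURAL = auto()        # human performs the task, robot observes

def record_demonstration(mode: DemoMode, raw_stream: list[dict]) -> list:
    """Convert any of the three input styles into one common trajectory
    format, so the downstream learner is agnostic to how it was taught."""
    if mode is DemoMode.TELEOPERATION:
        return [frame["commanded_pose"] for frame in raw_stream]
    if mode is DemoMode.KINESTHETIC:
        return [frame["measured_pose"] for frame in raw_stream]
    return [frame["tracked_tool_pose"] for frame in raw_stream]  # NATURAL

stream = [{"commanded_pose": (0.0, 0.0, 0.0)},
          {"commanded_pose": (0.1, 0.0, 0.0)}]
print(record_demonstration(DemoMode.TELEOPERATION, stream))
```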
Tags: robotics, robot-learning, human-robot-interaction, collaborative-robots, robot-training-tools, MIT-robotics, intelligent-robots

Unveiling the Tree of Robots: A new taxonomy for understanding robotic diversity - The Robot Report
Researchers at the Munich Institute of Robotics and Machine Intelligence (MIRMI) at the Technical University of Munich (TUM) have developed the “Tree of Robots,” a novel taxonomy and evaluation scheme designed to measure and compare the sensitivity of autonomous robots. Sensitivity, which is critical for safe and flexible human-robot interaction, previously lacked a standardized assessment method. This new framework enables the categorization of various robotic systems—including industrial robots, cobots, soft robots, and tactile robots—based on 25 specific measurements related to physical contact sensitivity, such as force alignment and safety in human interaction. The resulting spider diagrams provide an accessible visual summary of a robot’s sensitivity performance, facilitating better understanding and comparison even for non-experts. The Tree of Robots draws inspiration from Darwin’s Tree of Life, illustrating the diversity and specialization of robotic “species” according to their design and operational environments. By analyzing single-armed robots from different manufacturers, the researchers identified distinct capabilities related to sensors, motors, and control
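For illustration, here is how such a sensitivity profile can be rendered as a spider (radar) diagram with matplotlib; the five metric names and scores below are invented stand-ins for the framework's 25 measurements:

```python
import numpy as np
import matplotlib.pyplot as plt

metrics = ["force alignment", "contact detection", "collision reaction",
           "torque resolution", "safe speed"]
scores = [0.8, 0.6, 0.9, 0.4, 0.7]  # normalised to [0, 1], invented values

angles = np.linspace(0, 2 * np.pi, len(metrics), endpoint=False)
angles = np.concatenate([angles, angles[:1]])  # close the polygon
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(metrics)
ax.set_title("Example robot sensitivity profile")
plt.savefig("sensitivity_spider.png")
```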
Tags: robotics, robotic-manipulators, robot-sensitivity, human-robot-interaction, industrial-robots, autonomous-robots, robotic-taxonomy

Week in Review: X CEO Linda Yaccarino steps down
The Week in Review highlights several major tech developments, starting with the departure of Linda Yaccarino as CEO of X after a challenging two-year period marked by advertiser backlash, controversies involving Elon Musk, and AI-related issues on the platform. Despite her leadership, the company faces ongoing difficulties ahead. Apple is adjusting its user interface by reducing transparency in features like Notifications and Apple Music to improve readability ahead of its fall OS launch. Hugging Face introduced Reachy Mini, an affordable, programmable robot aimed at AI developers, priced from $299 and integrated with its AI hub. In consumer tech, Nothing launched its ambitious Phone 3 with innovative features like a second screen and AI capabilities, though mixed reactions to design and pricing may limit its market impact. Samsung released new foldable phones, including the Z Fold7, Z Flip7, and a more affordable Z Flip7 FE. Rivian unveiled a high-performance electric vehicle boasting over 1,000 horsepower and advanced software features, positioning it as a flagship
Tags: robot, AI, programmable-robots, Hugging-Face, robotics-safety, AI-developers, human-robot-interaction

Hugging Face launches Reachy Mini robot as embodied AI platform
Hugging Face, following its acquisition of Pollen Robotics in April 2025, has launched Reachy Mini, an open-source, compact robot designed to facilitate experimentation in human-robot interaction, creative coding, and AI. Standing 11 inches tall and weighing 3.3 pounds, Reachy Mini features motorized head and body rotation, expressive animated antennas, and multimodal sensing via an integrated camera, microphones, and speakers, enabling rich AI-driven audio-visual interactions. The robot is offered as a kit in two versions, encouraging hands-on assembly and deeper mechanical understanding, and will provide over 15 robot behaviors at launch. A key advantage of Reachy Mini is its seamless integration with Hugging Face’s AI ecosystem, allowing users to utilize advanced open-source models for speech, vision, and personality development. It is fully programmable in Python, with planned future support for JavaScript and Scratch, catering to developers of varying skill levels. The robot’s open-source hardware, software, and simulation
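As a hypothetical sketch only, here is the flavor of Python scripting the article describes: react to a localized sound by turning the head and playing a built-in behavior. The MiniBot class and its methods are invented for illustration and are not the actual Reachy Mini SDK:

```python
class MiniBot:
    """Invented stand-in for a desk-robot API; not the Reachy Mini SDK."""

    def rotate_head(self, yaw_deg: float) -> None:
        print(f"rotating head to {yaw_deg:+.0f} deg")

    def play_behavior(self, name: str) -> None:
        print(f"playing behavior: {name}")

def on_sound_localised(bot: MiniBot, direction_deg: float) -> None:
    """Turn toward a speaker and greet them with one of the built-in
    behaviors (the launch version reportedly ships with 15+)."""
    bot.rotate_head(direction_deg)
    bot.play_behavior("greet")

on_sound_localised(MiniBot(), -30.0)
```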
Tags: robot, embodied-AI, open-source-robotics, human-robot-interaction, AI-powered-robot, programmable-robot, Hugging-Face-robotics

Robot Talk Episode 125 – Chatting with robots, with Gabriel Skantze - Robohub
In episode 125 of the Robot Talk podcast, Claire interviews Gabriel Skantze, a Professor of Speech Communication and Technology at KTH Royal Institute of Technology. Skantze specializes in conversational AI and human-robot interaction, focusing on creating natural face-to-face conversations between humans and robots. His research integrates both verbal and non-verbal communication elements, such as prosody, turn-taking, feedback, and joint attention, to improve the fluidity and naturalness of spoken interactions with robots. Skantze also co-founded Furhat Robotics in 2014, where he continues to contribute as Chief Scientist. Furhat Robotics develops social robots designed to engage in human-like conversations, leveraging Skantze’s expertise in computational models of spoken interaction. The episode highlights ongoing advancements in conversational systems and the challenges involved in making robot communication more natural and effective, emphasizing the importance of combining multiple communication cues to enhance human-robot interaction.
Tags: robot, robotics, conversational-AI, human-robot-interaction, speech-communication, autonomous-machines, Furhat-Robotics

Tesla sues former Optimus engineer over alleged trade secret theft
Tesla has filed a lawsuit against Zhongjie “Jay” Li, a former engineer in its Optimus humanoid robotics program, accusing him of stealing trade secrets related to advanced robotic hand sensors. Li, who worked at Tesla from August 2022 to September 2024, allegedly downloaded confidential information onto personal devices and conducted research on humanoid robotic hands and startup funding sources during his final months at the company. Shortly after his departure, Li founded a startup called Proception, which claims to have developed advanced humanoid robotic hands resembling Tesla’s designs. The complaint highlights that Proception was incorporated less than a week after Li left Tesla and publicly announced its achievements within five months, raising concerns about the misuse of Tesla’s proprietary technology. Tesla’s Optimus program, launched in 2021, has faced development challenges and delays, with Elon Musk indicating in mid-2024 that the company would continue work on the project despite earlier setbacks. The lawsuit underscores ongoing tensions in the competitive field of humanoid robotics
Tags: robot, humanoid-robotics, Tesla-Optimus, robotic-hand-sensors, trade-secret-theft, robotics-startup, human-robot-interaction

Sensitive skin to help robots detect information about surroundings
Researchers from the University of Cambridge and University College London have developed a highly sensitive, low-cost, and durable robotic skin that can detect various types of touch and environmental information similarly to human skin. This flexible, conductive skin is made from a gelatine-based hydrogel that can be molded into complex shapes, such as a glove for robotic hands. Unlike traditional robotic touch sensors that require multiple sensor types for different stimuli, this new skin acts as a single sensor capable of multi-modal sensing, detecting taps, temperature changes, cuts, and multiple simultaneous touches through over 860,000 tiny conductive pathways. The team employed a combination of physical testing and machine learning to interpret signals from just 32 electrodes placed at the wrist, enabling the robotic skin to process more than 1.7 million data points across the hand. Tests included exposure to heat, gentle and firm touches, and even cutting, with the collected data used to train the system to recognize different types of contact efficiently. While not as sensitive as human skin
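A minimal sketch of the "physical testing plus machine learning" recipe described above: train a classifier to name the contact type from the 32 electrode channels. scikit-learn and random stand-in data are used for illustration; this is not the Cambridge/UCL pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N_ELECTRODES = 32  # electrodes placed at the wrist, per the article
labels = ["tap", "firm_press", "heat", "cut"]

# Stand-in dataset: 400 labelled electrode snapshots (random, not real data).
X = rng.normal(size=(400, N_ELECTRODES))
y = rng.integers(0, len(labels), size=400)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy (random data, so ~chance):",
      clf.score(X_test, y_test))
```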
Tags: robotics, robotic-skin, sensors, flexible-materials, conductive-hydrogel, multi-modal-sensing, human-robot-interaction

Interview with Amar Halilovic: Explainable AI for robotics - Robohub
Amar Halilovic, a PhD student at Ulm University in Germany, is conducting research on explainable AI (XAI) for robotics, focusing on how robots can generate explanations of their actions—particularly in navigation—that align with human preferences and expectations. His work involves developing frameworks for environmental explanations, especially in failure scenarios, using black-box and generative methods to produce textual and visual explanations. He also studies how to plan explanation attributes such as timing, representation, and duration, and is currently exploring dynamic selection of explanation strategies based on context and user preferences. Halilovic finds it particularly interesting how people interpret robot behavior differently depending on urgency or failure context, and how explanation expectations shift accordingly. Moving forward, he plans to extend his framework to enable real-time adaptation, allowing robots to learn from user feedback and adjust explanations on the fly. He also aims to conduct more user studies to validate the effectiveness of these explanations in real-world human-robot interaction settings. His motivation for studying explainable robot navigation stems from a broader interest in human-machine interaction and the importance of understandable AI for trust and usability. Before his PhD, Amar studied Electrical Engineering and Computer Science in Bosnia and Herzegovina and Sweden. Outside of research, he enjoys traveling and photography and values building a supportive network of mentors and peers for success in doctoral studies. His interdisciplinary approach combines symbolic planning and machine learning to create context-sensitive, explainable robot systems that adapt to diverse human needs.
Tags: robotics, explainable-AI, human-robot-interaction, robot-navigation, AI-research, PhD-research, autonomous-robots

Pepper humanoid robot powered by ChatGPT conducts real-world interaction
Researchers from the University of Canberra showcased Pepper, a humanoid robot integrated with ChatGPT, at an Australian innovation festival to study public reactions to AI-powered social robots in real-world settings. Pepper captures audio from users, transcribes it, generates responses via ChatGPT, and communicates back through text-to-speech. The trial involved 88 participants who interacted with Pepper, many for the first time, providing feedback that revealed a broad spectrum of emotions including curiosity, amusement, frustration, and unease. The study underscored the importance of first impressions and real-world contexts in shaping societal acceptance of humanoid robots, especially as they become more common in sectors like healthcare, retail, and education. Key findings highlighted four main themes: user suggestions for improvement, expectations for human-like interaction, emotional responses, and perceptions of Pepper’s physical form. Participants noted a disconnect between Pepper’s human-like appearance and its limited interactive capabilities, such as difficulties in recognizing facial expressions and following social norms like turn-taking. Feedback also pointed to technical and social challenges, including the need for faster responses, greater cultural and linguistic inclusivity—particularly for Indigenous users—and improved accessibility. The study emphasizes that testing social robots “in the wild” provides richer, human-centered insights into how society may adapt to embodied AI companions beyond controlled laboratory environments.
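A minimal sketch of the interaction loop the study describes: capture audio, transcribe it, generate a reply with a chat model, and speak it back. All three I/O helpers and the chat call are placeholders, since the article does not detail the speech stack used with Pepper:

```python
def record_audio() -> bytes:
    return b""                       # microphone capture (placeholder)

def transcribe(audio: bytes) -> str:
    return "hi pepper"               # speech-to-text (placeholder)

def speak(text: str) -> None:
    print("Pepper says:", text)      # text-to-speech (placeholder)

def chat_reply(history: list[dict]) -> str:
    # Placeholder for the ChatGPT call; a real implementation would send
    # `history` to the chat API and return the assistant message.
    return "Hello! How can I help you today?"

def one_turn(history: list[dict]) -> None:
    user_text = transcribe(record_audio())
    history.append({"role": "user", "content": user_text})
    reply = chat_reply(history)
    history.append({"role": "assistant", "content": reply})
    speak(reply)

history = [{"role": "system", "content": "You are Pepper, a friendly robot."}]
one_turn(history)
```

The participants' complaints about slow responses map directly onto this loop: every turn serializes recording, transcription, a network round-trip, and synthesis.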
Tags: robot, humanoid-robot, ChatGPT, AI-powered-robots, human-robot-interaction, social-robotics, SoftBank-Robotics

Congratulations to the #ICRA2025 best paper award winners - Robohub
The 2025 IEEE International Conference on Robotics and Automation (ICRA), held from May 19-23 in Atlanta, USA, announced its best paper award winners and finalists across multiple categories. The awards recognized outstanding research contributions in areas such as robot learning, field and service robotics, human-robot interaction, mechanisms and design, planning and control, and robot perception. Each category featured a winning paper along with several finalists, highlighting cutting-edge advancements in robotics. Notable winners include "Robo-DM: Data Management for Large Robot Datasets" by Kaiyuan Chen et al. for robot learning, "PolyTouch: A Robust Multi-Modal Tactile Sensor for Contact-Rich Manipulation Using Tactile-Diffusion Policies" by Jialiang Zhao et al. for field and service robotics, and "Human-Agent Joint Learning for Efficient Robot Manipulation Skill Acquisition" by Shengcheng Luo et al. for human-robot interaction. Other winning papers addressed topics such as soft robot worm behaviors, robust sequential task solving via dynamically composed gradient descent, and metrics-aware covariance for stereo visual odometry. The finalists presented innovative work ranging from drone detection to adaptive navigation and assistive robotics, reflecting the broad scope and rapid progress in the robotics field showcased at ICRA 2025.
Tags: robotics, robot-learning, human-robot-interaction, tactile-sensors, robot-automation, soft-robotics, robot-navigation

Why Intempus thinks robots should have a human physiological state
Tags: robot, robotics, AI, emotional-intelligence, human-robot-interaction, Intempus, machine-learning

What’s coming up at #ICRA2025?
Tags: robot, robotics, automation, ICRA2025, human-robot-interaction, soft-robotics, multi-robot-systems

AI model enables controlling robots with verbal commands
Tags: robot, AI, MotionGlot, machine-learning, robotics, human-robot-interaction, automation

Robot Talk Episode 110 – Designing ethical robots, with Catherine Menon
Tags: robot-ethics, assistive-technology, autonomous-systems, AI-safety, human-robot-interaction, ethical-design, public-trust-in-AI