Articles tagged with "autonomous-robots"
China's LimX Dynamics raises funds to build humanoid robot 'brains'
Chinese robotics company LimX Dynamics has secured approximately $200 million in Series B funding to advance its development of embodied intelligence in humanoid robots. The funding round included investors such as UAE-based Stone Venture, Oriental Fortune Capital, JD.com, and others. LimX Dynamics focuses on integrating AI with physical machines, enabling robots to learn and adapt through interaction with their environment—a concept known as embodied intelligence, which is a subset of Physical AI. The company has developed two core technologies: COSA (Cognitive OS of Agents), a software platform serving as the robot’s “brain” that controls whole-body motion, and Tron 2, a modular hardware system for building humanoid robots. COSA functions similarly to the human cerebellum, enabling fluid, coordinated movements and real-time task reprioritization without human intervention. Their humanoid robot, Oli, stands 5 ft 4 in tall, weighs 121 pounds, and features dual arms with seven degrees of freedom, capable of handling objects up
Tags: robotics, humanoid-robots, AI, embodied-intelligence, modular-robotics, autonomous-robots, physical-AI
World's first Large Plant Model trained on 150 million plants unveiled
US startup Carbon Robotics has introduced the world’s first Large Plant Model (LPM), an AI system trained on 150 million labeled plants to revolutionize crop management. This model powers the company’s LaserWeeder robots, enabling them to identify and laser-weed nearly any crop or field within minutes. The LPM continuously learns from data collected by a global fleet of machines, allowing real-time adaptation and shared performance improvements across the entire fleet. This technology aims to reduce labor costs, minimize herbicide use, and enhance crop yields by providing advanced, autonomous weeding capabilities. In addition to the LPM, Carbon Robotics has launched Plant Profiles, a personalization feature that lets farmers quickly customize the LaserWeeder to their specific crops, weeds, and field conditions via a tablet interface. By selecting just a few representative images, users can immediately adjust the AI’s plant identification and laser treatment processes, enabling rapid, real-time optimization without lengthy retraining. This user-friendly tool significantly lowers the barrier to adopting autonomous we
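Functionally, the "few representative images" workflow described above resembles few-shot adaptation on top of a fixed embedding model: class prototypes are updated from a handful of labeled examples rather than retraining the network. The sketch below is illustrative only and is not Carbon Robotics' method; the embeddings, labels, and class names are invented.

```python
import numpy as np

class PlantProfile:
    """Toy nearest-centroid classifier over precomputed image embeddings.
    Adding a handful of labelled examples updates the class prototype
    immediately, with no retraining of the underlying embedding model."""

    def __init__(self):
        self.prototypes = {}  # label -> mean embedding

    def add_examples(self, label, embeddings):
        self.prototypes[label] = np.mean(np.asarray(embeddings, dtype=float), axis=0)

    def classify(self, embedding):
        embedding = np.asarray(embedding, dtype=float)
        return min(self.prototypes,
                   key=lambda label: np.linalg.norm(self.prototypes[label] - embedding))

profile = PlantProfile()
profile.add_examples("crop", [[0.9, 0.1], [0.8, 0.2]])  # a few crop images
profile.add_examples("weed", [[0.1, 0.9], [0.2, 0.8]])  # a few weed images
print(profile.classify([0.15, 0.85]))  # -> "weed"
```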
Tags: robot, AI, agriculture-technology, autonomous-robots, machine-learning, precision-farming, LaserWeeder
Carbon Robotics built an AI model that detects and identifies plants
Carbon Robotics, a Seattle-based company known for its LaserWeeder robot fleet that uses lasers to eliminate weeds, has developed a new AI model called the Large Plant Model (LPM). This advanced model instantly recognizes plant species, enabling farmers to identify and target new weeds without the need for retraining the robots. Trained on over 150 million photos and data points collected from more than 100 farms across 15 countries, the LPM powers Carbon AI, the system controlling the company’s autonomous weed-killing machines. Previously, recognizing new or variant weeds required about 24 hours of data labeling and retraining, but the LPM can now learn and adapt in real time, allowing farmers to instruct the robots to kill new weeds immediately. The LPM represents a significant leap in agricultural AI, as it understands plant species at a deeper level, including their structure and relations, even if the specific plant has never been seen before. Carbon Robotics, founded in 2018, began developing this model after
Tags: robotics, artificial-intelligence, agriculture-technology, autonomous-robots, machine-learning, weed-detection, precision-farming
Photos: Apollo humanoid robots prepare to transition from factory floors to space
The Apollo humanoid robot, developed by Apptronik in collaboration with NASA over the past decade, represents a significant advancement in humanoid robotics designed for both industrial and space applications. Standing 5 feet 8 inches tall and weighing 160 pounds, Apollo’s architecture is derived from NASA’s Valkyrie project and incorporates liquid-cooled robotic actuator technology developed through NASA SBIR contracts. This actuator technology enables precise, reliable movements, allowing Apollo to perform physically demanding and repetitive tasks. Currently deployed in Mercedes-Benz factories for logistics work, Apollo serves as a practical testing platform to prepare the robot for the challenging environments of spacecraft and extraterrestrial habitats. NASA’s interest in humanoid robots like Apollo is driven by their compatibility with human-designed tools and environments, such as space stations and lunar habitats. Humanoids can operate existing interfaces and perform complex maintenance remotely, acting as avatars for human operators. Unlike earlier space robots like Robonaut 2, which lacked legs and thus mobility, Apollo features advanced actuators and
Tags: robotics, humanoid-robots, NASA, space-exploration, robotic-actuators, autonomous-robots, industrial-automation
Autonomous robot drills data centers 10x faster with 99.97% accuracy
DEWALT, a US power tool manufacturer, has developed an autonomous downward-drilling robot in partnership with August Robotics to automate the labor-intensive task of high-volume concrete drilling in data center construction. Designed specifically for hyperscale infrastructure projects, the robot significantly increases productivity by operating up to ten times faster than traditional drilling methods while maintaining 99.97% accuracy in hole placement and depth. During pilot deployments across ten data center build phases with an unnamed hyperscaler, the system completed over 90,000 holes, cutting a combined 80 weeks from construction schedules and reducing costs per drilled hole. Bill Beck, President of Tools and Outdoor at Stanley Black & Decker, emphasized that the rapid growth in data center demand necessitates faster construction speeds, which this autonomous drilling technology addresses by removing a major bottleneck in large-scale builds. The robot not only accelerates project timelines and lowers costs but also improves jobsite safety and precision. Following successful live demonstrations in early 2026, DEWALT plans to
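As a rough back-of-the-envelope check on those figures (purely illustrative, and assuming "99.97% accuracy" is read as the fraction of holes within placement and depth tolerance, which the article does not define precisely), the expected number of out-of-tolerance holes across the reported 90,000-hole pilot works out to a few dozen:

```python
# Illustrative arithmetic only; assumes "accuracy" means the fraction of
# holes landing within placement/depth tolerance.
accuracy = 0.9997
holes_drilled = 90_000

expected_out_of_tolerance = (1 - accuracy) * holes_drilled
print(f"Expected out-of-tolerance holes: {expected_out_of_tolerance:.0f}")  # ~27
```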
Tags: robotics, autonomous-robots, construction-automation, data-center-technology, drilling-robot, industrial-robots, productivity-enhancement
Watch China’s humanoid robots walk out of crates like Matrix scene
Chinese robotics company LimX Dynamics has demonstrated a significant advancement in humanoid robotics with its new COSA operating system, coordinating 18 full-size Oli humanoid robots in a fully autonomous deployment. In a recently released video, the robots emerge independently from shipping crates, stand up, walk in formation, avoid collisions, and perform a synchronized routine without human intervention. This demonstration marks what LimX describes as the world’s first practical autonomous deployment of humanoid robots, showcasing the potential for multi-robot coordination in industrial environments such as manufacturing floors. The COSA (Cognitive OS of Agents) system integrates cognition and physical action within a unified software framework, linking high-level decision-making directly with low-level motor control. This integration allows the robots to perceive, reason, plan, and move almost simultaneously, improving responsiveness to dynamic real-world conditions. COSA also incorporates memory, enabling the robots to store and use past experiences to adapt their behavior. The system processes real-time sensor data to make rapid adjustments in balance and
Tags: robotics, humanoid-robots, multi-robot-coordination, autonomous-robots, robot-operating-system, LimX-Dynamics, robot-deployment
World’s first robot astronaut: China’s Engine AI plans to send humanoid into space
Chinese robotics company Engine AI has announced plans to send its humanoid robot, PM01, into space, aiming to create the world’s first robot astronaut. Partnering with commercial space firm Beijing Interstellar Human Spaceflight Technology (Interstellor), the initiative—called the Humanoid Robot Astronaut Exploration Program—will focus on adapting PM01 for the extreme conditions of space, including vacuum, microgravity, temperature fluctuations, and radiation. Engine AI emphasizes that space missions demand exceptional stability, adaptability, and autonomous decision-making from robots, and the collaboration will work to enhance PM01’s resilience and independent operational capabilities for complex tasks in orbit. The PM01 humanoid robot is a compact, 1.38-meter-tall platform weighing about 40 kilograms, designed with a bionic structure and advanced sensors such as an Intel RealSense depth camera for spatial awareness. It features a dual-chip architecture combining NVIDIA Jetson Orin and Intel N97 CPUs to manage perception and motion control in real time. Engine
Tags: robot, humanoid-robot, space-exploration, AI-robotics, autonomous-robots, robotics-in-space, Engine-AI
Autonomous microrobots finally break the millimeter barrier
Researchers from the University of Pennsylvania and the University of Michigan have developed autonomous microrobots that break the longstanding millimeter-size barrier, achieving fully integrated sensing, computation, and motion control at a scale of just 210 × 340 × 50 micrometers—about the size of a paramecium. This represents a volume roughly 10,000 times smaller than previous programmable robots. Unlike earlier microrobots that rely on external control systems such as magnetic coils or ultrasound arrays, these new robots operate independently, sensing their environment, making decisions, and acting autonomously. The devices are manufactured using fully lithographic processes, enabling low-cost production (under a penny per unit at scale), and can be programmed wirelessly via LED light to perform complex behaviors like climbing temperature gradients and encoding sensor data through movement patterns. Historically, microrobots have faced a fundamental trade-off: either be very small but externally controlled with no onboard intelligence, or be larger (around one millimeter)
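The gradient-climbing behavior mentioned above can be illustrated with a toy control loop: compare successive sensor readings and reverse heading when the reading drops. This is a minimal sketch, not the robots' actual onboard program; the temperature field, step size, and iteration count are invented for illustration.

```python
import random

def temperature(x):
    """Hypothetical 1-D temperature field: warmer toward larger x."""
    return 25.0 + 0.5 * x

def climb_gradient(x=0.0, step=0.1, iterations=50):
    """Toy sketch of gradient climbing: keep the current heading only if the
    latest sensor reading is warmer than the previous one."""
    heading = random.choice([-1.0, 1.0])
    last_reading = temperature(x)
    for _ in range(iterations):
        x += heading * step
        reading = temperature(x)
        if reading < last_reading:  # got colder: reverse direction
            heading = -heading
        last_reading = reading
    return x

print(f"Final position: {climb_gradient():.2f}")  # drifts toward warmer region
```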
Tags: robotics, microrobots, autonomous-robots, microtechnology, sensors, onboard-computing, medical-robotics
Robot bat finds insects in darkness with 98% accuracy, mirroring bats
Scientists from the Smithsonian Tropical Research Institute, University of Cincinnati, and University of Antwerp investigated how big-eared bats (Micronycteris microtis) detect silent insects on leaves at night using echolocation. Building on prior behavioral studies, they hypothesized that bats exploit the acoustic properties of leaves: smooth, empty leaves reflect echolocation calls away, producing weak echoes, while insects on leaves scatter sound in multiple directions, creating distinctive echoes. However, precisely measuring leaf orientation to find prey seemed impractical for bats. Instead, the researchers proposed that bats rely on the steadiness of echoes over time rather than detailed spatial mapping. To test this, the team created a robot equipped with ultrasonic sensors mimicking bat echolocation. The robot scanned an array of cardboard leaves, one holding a fake insect, without measuring leaf size or angle. It successfully identified prey-occupied leaves with 98% accuracy and had a low false detection rate (18%) on empty leaves. The findings confirmed that bats detect prey by
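The detection principle the robot tests, judging a leaf by how steady its echoes are across repeated pings rather than by reconstructing its geometry, can be sketched as a simple classifier. The code below is illustrative only; the echo amplitudes, window, and threshold are hypothetical, not values from the study.

```python
import numpy as np

def is_occupied(echo_amplitudes, steadiness_threshold=0.15):
    """Classify a leaf as holding prey if repeated echoes stay consistently
    strong. `echo_amplitudes` is a sequence of peak echo strengths from
    successive pings at slightly different angles (hypothetical data)."""
    echoes = np.asarray(echo_amplitudes, dtype=float)
    # Coefficient of variation: low relative spread = steady echo over time
    steadiness = echoes.std() / (echoes.mean() + 1e-9)
    return steadiness < steadiness_threshold and echoes.mean() > 0.5

# Empty leaf: strong specular flash at one angle, weak echoes elsewhere
print(is_occupied([0.9, 0.1, 0.05, 0.08, 0.12]))   # False
# Leaf with insect: moderate echo scattered back at every angle
print(is_occupied([0.7, 0.65, 0.72, 0.68, 0.71]))  # True
```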
Tags: robotics, biomimicry, ultrasonic-sensors, echolocation-technology, robotic-sensing, autonomous-robots, acoustic-detection
World's first fleet drilling robot cuts data center build times
DEWALT, a U.S.-based power equipment maker owned by Stanley Black & Decker, has partnered with August Robotics to introduce the world’s first fleet-capable robot designed for downward concrete drilling. This robotic system targets a critical bottleneck in data center construction by automating the labor-intensive task of drilling thousands of precision holes needed to anchor server racks and support overhead mechanical, electrical, and plumbing systems. The robot operates autonomously and can work in fleets, allowing multiple units to drill simultaneously across large sites. According to DEWALT, the system drills up to 10 times faster than traditional methods, potentially reducing overall construction timelines by as much as 80 weeks while improving jobsite safety and cutting costs per hole. The robotic drilling system is already being piloted with one of the world’s largest hyperscalers and has completed work across 10 data center construction phases, achieving 99.97 percent accuracy in hole location and depth over more than 90,000 drilled holes. This high
Tags: robotics, construction-automation, data-center, drilling-robot, autonomous-robots, AI-infrastructure, fleet-robotics
AI robot dogs take over missions too risky for human firefighters
DEEP Robotics has developed the “Emergency Firefighting Solution,” an AI-driven robotic system designed to transform firefighting by reducing human risk and enhancing operational efficiency. This integrated platform employs a team of specialized robots—including quadruped reconnaissance units, water gun and water cannon firefighting robots, and logistics bots—to manage various disaster scenarios such as fires, chemical leaks, and natural disasters. Equipped with advanced sensors like LiDAR, dual-spectrum cameras, and gas detectors, the reconnaissance robots can navigate hazardous environments, detect dangers, and locate trapped individuals, while the firefighting robots use high-pressure water mist and foam with dual-layer cooling to suppress flames safely from a distance. The system features a closed-loop design that covers forward reconnaissance, precise firefighting response, intelligent transport of supplies, and reliable communication, including integration with drones for comprehensive situational awareness through real-time data and 3D mapping. DEEP Robotics’ technology has been validated in multiple real-world tests, including emergency drills and competitions, where their robots
Tags: robotics, AI-robots, firefighting-robots, emergency-response, robot-dogs, autonomous-robots, disaster-management
Airbus to test China-made humanoid robots in aviation production
Chinese robotics firm UBTech Robotics has entered a new partnership with European aerospace giant Airbus to supply its Walker S2 humanoid robots for use in aircraft manufacturing facilities. This collaboration aims to test and deploy the robots in aviation production environments, which demand high precision, strict safety, and reliable performance. The Walker S2, introduced in mid-2025, is a 5.8-foot-tall industrial humanoid robot equipped with dexterous arms, vision systems, and UBTech’s proprietary Co Agent AI, enabling it to perform complex tasks, recognize objects, and adapt to production line needs. Notably, it features an autonomous battery-swapping system allowing continuous operation without lengthy charging breaks. UBTech has already seen adoption of the Walker S2 across various industries, including automotive and electronics manufacturing, with companies like BYD and Foxconn integrating the robots into their facilities. The Airbus deal follows a similar agreement with US semiconductor company Texas Instruments, highlighting UBTech’s expanding presence in overseas markets such as aviation
Tags: robotics, humanoid-robots, industrial-automation, AI-robotics, aircraft-manufacturing, smart-factories, autonomous-robots
UK-built tiny sailing robots collect first-ever data from live Category 5 hurricane at sea
UK-based company Oshen, founded by Anahita Laverack, has achieved a groundbreaking milestone by collecting live data from inside a Category 5 hurricane at sea for the first time. Initially focused on building small autonomous sailing robots for ocean crossings, Laverack realized the critical challenge was the lack of real-time ocean and weather data, which made existing robots fragile and ineffective. This insight led to the creation of Oshen’s C-Stars—small, durable, and relatively inexpensive autonomous sailing drones capable of operating in swarms and surviving up to 100 days at sea while collecting valuable environmental data. In 2025, the National Oceanic and Atmospheric Administration (NOAA) commissioned Oshen to deploy a fleet of C-Stars into Hurricane Humberto. Of the five drones sent, three survived and successfully gathered crucial hurricane data, marking the first time such information was collected live from within a Category 5 hurricane. This data is expected to enhance weather prediction models, improve naval awareness, advance
Tags: robot, autonomous-robots, oceangoing-drones, environmental-data-collection, hurricane-data, ocean-sensors, remote-sensing
Oshen built the first ocean robot to collect data in a Category 5 hurricane
Anahita Laverack, originally aspiring to be an aerospace engineer and an experienced sailor, founded Oshen in April 2022 after recognizing a critical gap in ocean data collection. Inspired by her unsuccessful attempt to cross the Atlantic with an autonomous sail-powered micro-robot in the Microtransat Challenge, she discovered that a lack of reliable ocean and weather data was a major obstacle. Partnering with electrical engineer Ciaran Dowds, Laverack developed Oshen’s autonomous micro-robots, called C-Stars, designed to survive up to 100 days in harsh ocean conditions and deployed in swarms to gather detailed ocean data. The company initially operated on a shoestring budget, using a 25-foot sailboat as a testing platform while iterating on the technology through challenging weather conditions. Oshen’s innovation lies in creating micro-robots that are simultaneously mass deployable, cost-effective, and technologically advanced enough to operate autonomously for extended periods. This unique combination attracted interest from defense and
Tags: robot, autonomous-robots, ocean-data-collection, marine-robotics, environmental-monitoring, IoT-sensors, remote-sensing
Photos: Figure AI humanoid robot mimics human running style with impressive precision
California-based robotics company Figure AI has released a video showcasing its latest humanoid robot jogging outdoors alongside company employees, demonstrating significant advancements in bipedal locomotion. The robot exhibits a fluid and natural running gait, closely mimicking human biomechanics, a marked improvement over earlier robotic movements. However, it remains unclear whether the robot was operating autonomously or under remote control during the run, a detail critical to assessing its level of independence. The new Figure model, introduced in late 2025, is designed for versatile physical interactions, supporting various movements and weight distributions necessary for both outdoor navigation and indoor tasks. While primarily intended for home assistance, the jogging demonstration tests the robot’s balance and motor control in unconstrained environments. The robot is also being trained for household chores such as loading dishwashers, folding laundry, and distributing food and beverages, combining computer vision with precise hand-eye coordination. No commercial release date for the Figure 03 model has been announced, as it remains in development pending further reliability testing
Tags: robot, humanoid-robot, robotics, bipedal-movement, AI-robotics, autonomous-robots, physical-interaction
Taking humanoid soccer to the next level: An interview with RoboCup trustee Alessandra Rossi - Robohub
The article features an interview with Alessandra Rossi, a trustee of RoboCup and Assistant Professor of Computer Science at the University of Naples “Federico II,” who has been deeply involved in the RoboCup humanoid soccer league since 2016. Rossi’s engagement has grown from participating as a team member and leader of the UK’s Bold Hearts humanoid KidSize team to serving on the Technical and Organizing Committees, the Executive Committee of the Humanoid League, and most recently, the RoboCup Board of Trustees. She has also contributed to educational initiatives, such as an online robotics module using RoboCup as a teaching benchmark, and co-authored a significant paper on current and future challenges in humanoid robotics, highlighting collaboration across RoboCup leagues. Looking ahead, RoboCup aims to realize its ambitious 2050 goal: a fully autonomous humanoid robot team defeating the reigning FIFA World Cup champions. To accelerate progress, the Federation plans key changes, including a stronger emphasis on humanoid robots and the
Tags: robotics, humanoid-robots, RoboCup, AI-research, autonomous-robots, robotics-competition, humanoid-soccer
Neo humanoid maker 1X releases world model to help bots learn what they see
Robotics company 1X, known for its Neo humanoid robots, has introduced a new physics-based AI model called the "world model" designed to help its bots better understand and learn from the real world. This model leverages video data combined with user prompts to enable Neo robots to acquire new skills beyond their initial training. While the company claims that Neo can transform any prompt into new actions, this capability is not immediate or all-encompassing; for example, Neo cannot instantly learn complex tasks like driving a car. Instead, the process involves capturing video linked to specific prompts, feeding this data into the world model, and then distributing the enhanced knowledge back to the network of Neo robots to improve their understanding and behavior over time. 1X is preparing to launch its Neo humanoids for home use, with pre-orders reportedly exceeding expectations, though the company has not disclosed exact shipping timelines or order quantities. According to founder and CEO Bernt Børnich, the world model marks a significant step toward
Tags: robotics, AI, humanoid-robots, machine-learning, robotics-technology, autonomous-robots, robot-training
ByteDance backs China’s new humanoid robot maker in funding round
Chinese robotics startup X Square Robot has secured $143.3 million (1 billion yuan) in a Series A++ funding round led by major investors including ByteDance, HSG (formerly Sequoia Capital China), and government-backed firms such as Beijing Information Industry Development Investment Fund and Shenzhen Capital Group. Founded in 2023, X Square specializes in humanoid robots and embodied AI, aiming for applications in homes, hotels, and logistics. The company is known for its Quanta X1 and X2 wheeled humanoid robots with dexterous hands, powered by its proprietary vision–language–action (VLA) model called WALL-A. This model integrates world models and causal reasoning to enhance robots’ ability to generalize and perform complex tasks in unstructured environments without prior training. X Square’s product lineup includes the Quanta X1, a wheeled bimanual robot with 20 degrees of freedom and a working range of up to 1 meter, and the more advanced Quanta
Tags: robotics, humanoid-robots, embodied-AI, artificial-intelligence, robotics-startup, robotic-manipulation, autonomous-robots
NEO can now teach itself new skills using video-based AI models
1X has introduced a significant AI upgrade for its humanoid robot NEO, called the 1X World Model, which enables the robot to autonomously learn new physical skills from simple voice or text prompts—even for tasks and environments it has never encountered before. This breakthrough is powered by a video-based AI model grounded in real-world physics, allowing NEO to learn from vast internet-scale video data that captures human interactions with objects. Unlike traditional robots that rely heavily on pre-programmed behaviors or extensive human-operated data, NEO can generalize knowledge from videos and translate it into reliable physical actions, bridging a critical gap in humanoid robotics. The 1X World Model allows NEO to perceive its surroundings through cameras, generate visual predictions of future actions, and execute precise movements using an inverse dynamics model. Demonstrations show NEO performing everyday tasks such as packing a lunch box and handling novel actions like opening sliding doors, ironing clothes, and brushing hair without prior training examples. This capability reflects the transfer
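The loop described above, observe with cameras, predict what successful execution would look like, then convert the predicted motion into joint commands with an inverse dynamics model, can be outlined roughly as follows. This is a structural sketch only; the class and method names are hypothetical and are not 1X's API.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    rgb_frames: list   # recent camera frames
    joint_state: list  # current joint positions and velocities

class WorldModelPolicy:
    """Hypothetical skeleton of a video-prediction-driven control loop."""

    def __init__(self, world_model, inverse_dynamics):
        self.world_model = world_model            # predicts future video given a prompt
        self.inverse_dynamics = inverse_dynamics  # maps predicted motion to joint commands

    def step(self, obs: Observation, prompt: str):
        # 1. Imagine what completing the prompted task would look like.
        predicted_frames = self.world_model.predict(obs.rgb_frames, prompt)
        # 2. Recover the motion implied by the imagined frames.
        target_trajectory = self.world_model.extract_motion(predicted_frames)
        # 3. Convert that motion into low-level commands for the robot.
        return self.inverse_dynamics.solve(obs.joint_state, target_trajectory)
```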
Tags: robot, humanoid-robot, AI-learning, autonomous-robots, video-based-AI, robotics-innovation, machine-learning
Beatbot announces new pool robots in pool care automation at CES 2026
At CES 2026, Beatbot unveiled two new automated pool-cleaning robots, highlighting a significant innovation in pool care automation. The flagship product, Beatbot AquaSense X, introduces a comprehensive AI-driven ecosystem combining an advanced robotic pool cleaner with the world’s first self-cleaning pool cleaner station, called AstroRinse. This station automates the traditionally disliked post-cleaning maintenance by rinsing the robot’s internal filter, emptying debris, and recharging the unit in about three minutes, supporting up to two cleaning cycles per week for two months without manual intervention. The AquaSense X employs “Beatbot AI 2.0” and HybridSense AI Vision, which enhances debris detection (recognizing 40 types) and coverage across pool floors, walls, waterlines, and surfaces using a combination of camera, infrared, and ultrasonic sensors. Its adaptive navigation handles complex, multi-level pool layouts, ensuring thorough cleaning of steps, edges, and shallow zones. In addition to cleaning, AquaSense
Tags: robot, AI, pool-cleaning, automation, IoT, autonomous-robots, smart-home-devices
EngineAI's CEO-kicking, combat-ready humanoid robots dazzle CES 2026
At CES 2026, Chinese robotics firm EngineAI unveiled two advanced humanoid robots, the PM01 and the T800, showcasing significant progress in embodied intelligence and practical robotics applications. The PM01 is a lightweight, versatile robot designed for scaled deployment in sectors such as public transportation support, retail, guided tours, and automated inspections, emphasizing stable and repeatable performance tailored to real-world operational needs. The T800, making its global debut, is a full-scale humanoid featuring a high-torque joint system capable of delivering up to 450 Nm of peak torque and 14,000W of instantaneous power, enabling it to perform dynamic tasks like running and martial arts with anthropomorphic mobility and load handling. EngineAI addressed skepticism surrounding the T800’s capabilities by releasing footage of the robot safely kicking its CEO, Zhao Tongyang, demonstrating precise, high-speed movements without CGI or video manipulation. This stunt reinforced the robot’s stability and combat-ready design, distinguishing EngineAI from competitors focused mainly on industrial
Tags: robotics, humanoid-robots, autonomous-robots, EngineAI, CES-2026, embodied-intelligence, collaborative-robots
UK firm to debut world's fastest-developed humanoid robot at CES 2026
The UK-based company Humanoid has developed HMND 01 Alpha, a wheeled humanoid robot built in just seven months, marking the fastest development cycle for such a robot. Currently showcased at CES 2026, HMND 01 Alpha autonomously performs industrial bin picking by selecting metallic bearing rings from cluttered bins in a near-production factory environment. The robot stands 220 cm tall, moves on wheels at speeds up to 4.47 mph, and features 29 active degrees of freedom. It can carry bimanual payloads up to 33 pounds and reach items from floor level to two meters high, using AI-driven motion and task execution supported by advanced sensors including 360-degree RGB cameras and depth sensors. At CES, the robot is demonstrated at the Schaeffler Group booth, where it operates fully autonomously to pick unsorted bearing rings and place them onto a buffer table feeding into a ball-bearing assembly line. Schaeffler, both a user and supplier of
Tags: robot, humanoid-robot, industrial-automation, AI-robotics, autonomous-robots, CES-2026, robotic-bin-picking
Video: Humanoid robot obeys verbal commands, grabs Coke autonomously
Israel-based startup Mentee Robotics has demonstrated its Menteebot V3 humanoid robot autonomously responding to verbal commands, such as retrieving a can of Coke. The robot interprets spoken instructions, visually identifies the target object, navigates to it, grasps it, and returns to the user without human intervention. This capability is enabled by Mentee’s “Foundation Model,” which integrates language understanding, visual perception, navigation, and manipulation into a cohesive system. Training involves reinforcement learning in simulated environments, with skills transferred to real robots via Sim2Real techniques, allowing non-experts to teach robots naturally through speech and demonstration rather than coding. Founded in 2022 by Mobileye founder Prof. Amnon Shashua and AI experts, Mentee Robotics has raised over $40 million and employs about 70 people. On January 6 at CES, Mobileye announced plans to acquire Mentee in a deal valued up to $900 million, aiming to expand beyond autonomous vehicles into humanoid robotics
Tags: robotics, humanoid-robot, autonomous-robots, machine-learning, AI, robot-training, human-robot-collaboration
Keenon advances home robotics with autonomous mower, humanoid robot
Keenon Robotics, a leading Chinese service robot maker, showcased significant advancements in home and commercial robotics at CES 2026. The company introduced Keenmow K1, a fully autonomous robotic lawn mower featuring a 3D LiDAR-Vision fusion system that enables precise garden mapping without boundary wires or complex setup. This technology allows Keenmow to independently plan mowing routes, avoid obstacles, and navigate tight spaces with minimal human intervention, offering a true “set-and-forget” lawn care solution. The launch marks Keenon’s expansion from commercial service robots into smart home applications. In addition to Keenmow, Keenon demonstrated its humanoid service robot Xman-R1, designed for multifunctional roles in hospitality and dining, such as taking orders, preparing simple food items, delivering goods, and clearing dishes. The robot is part of Keenon’s broader ecosystem aimed at automating service workflows more efficiently. The company also unveiled new autonomous cleaning robots in its Kleenbot lineup and showcased delivery
Tags: robotics, autonomous-robots, home-robotics, robotic-lawn-mower, LiDAR-technology, smart-home-devices, humanoid-robots
Video: New humanoid robot operates on its own in crowded public setting
At CES 2026, IntBot, a California-based startup, showcased Nylo, a humanoid social robot operating fully autonomously in a crowded public setting without human backup. Nylo distinguishes itself from conventional AI by perceiving social cues, understanding intent, and engaging naturally with people in dynamic environments. Powered by IntBot’s proprietary multimodal social intelligence system, IntEngine, Nylo integrates vision, audio, and language to coordinate speech, facial expressions, and gestures in real time, enabling it to decide when and how to interact. This demonstration marks a significant advancement in robotics, moving social robots from controlled lab environments into real-world applications. IntBot’s humanoid robots are designed primarily for hospitality and public service roles, providing interactive assistance with a warm, human-like presence. They handle routine inquiries, offer accurate information and directions, and provide local recommendations in over 50 languages, effectively removing language barriers. The robots operate 24/7, allowing human staff to focus on more complex tasks.
Tags: robot, humanoid-robot, social-robot, AI-robotics, autonomous-robots, service-robots, physical-agents
Hyundai’s MobED wins robotics award for stable all-terrain motion
At CES 2026 in Las Vegas, Hyundai Motor Group’s Robotics LAB won the Best of Innovation Award in Robotics for its MobED (Mobile Eccentric Droid), a compact four-wheeled robot designed to navigate challenging terrains such as steep ramps and high curbs. Utilizing Hyundai’s proprietary Drive and Lift (DnL) technology, MobED maintains stability on uneven surfaces by independently adjusting each wheel to keep the platform level. The robot is offered in two versions: the Basic model, a controller-operated platform for developers, and the Pro model, equipped with LiDAR, camera sensors, and AI algorithms for autonomous navigation and a “follow-me” mode, making it suitable for urban environments and commercial use. MobED is intended for diverse applications including last-mile logistics, service industry roles as a digital guide or mobile advertising platform, and carrying various payloads up to 47 kg (Basic) or 57 kg (Pro). The Pro model can reach speeds up to 10 km/h. First
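The leveling idea behind DnL, raising or lowering each wheel independently so the body stays level as the ground tilts, can be illustrated with simple geometry. This is a toy sketch; the track and wheelbase values, sign conventions, and small-angle model are assumptions for illustration, not Hyundai specifications.

```python
import math

def wheel_height_offsets(roll_deg, pitch_deg, track_m=0.6, wheelbase_m=0.8):
    """Toy leveling sketch: given measured body roll and pitch, return the
    height correction (metres) each wheel actuator should apply to bring the
    platform back to level. Sign conventions are illustrative only."""
    roll, pitch = math.radians(roll_deg), math.radians(pitch_deg)
    half_track, half_base = track_m / 2, wheelbase_m / 2
    corners = {
        "front_left":  (+half_base, +half_track),
        "front_right": (+half_base, -half_track),
        "rear_left":   (-half_base, +half_track),
        "rear_right":  (-half_base, -half_track),
    }
    offsets = {}
    for name, (x, y) in corners.items():
        # Height deviation of this corner relative to the body centre
        corner_drop = x * math.tan(pitch) + y * math.tan(roll)
        offsets[name] = -corner_drop  # actuator compensates the deviation
    return offsets

print(wheel_height_offsets(roll_deg=5, pitch_deg=-3))
```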
Tags: robotics, autonomous-robots, Hyundai-MobED, all-terrain-robot, AI-navigation, last-mile-delivery, robotics-innovation
Autonomous robots to clean up polluted waters to stop 'dead zones'
South Korean company ECOPEACE is expanding its global operations for autonomous water-quality management systems, with upcoming pilot projects planned in Singapore and the United Arab Emirates (UAE). ECOPEACE uses AI-powered semi-submerged robots called ECOBOT to detect and remove algae and other pollutants from waterways in real time. These robots operate continuously, employing stainless-steel microfilters and electrochemical treatments to break down contaminants, aiming to prevent harmful algal blooms before they escalate into environmental crises. Algal blooms deplete oxygen in water, creating “dead zones” that threaten aquatic life and public health, while also impacting tourism and fishing industries. Traditional monitoring and cleanup methods are labor-intensive and reactive, whereas ECOPEACE’s automated system integrates real-time sensor data with AI to dynamically adjust treatment processes. Singapore’s advanced urban water governance and the UAE’s challenging hot, water-scarce environment provide contrasting testbeds for the technology, with success in these locations potentially demonstrating broad applicability worldwide.
Tags: robot, autonomous-robots, water-quality-management, AI, environmental-technology, pollution-control, smart-water-systems
The 33 top health and wellness startups from Disrupt Startup Battlefield
The article highlights 33 standout health and wellness startups selected from TechCrunch’s annual Startup Battlefield pitch contest, which narrows thousands of applicants to 200 contenders across categories. These startups showcase innovative technologies addressing critical healthcare challenges, ranging from AI-powered surgical room preparation (Akara) and affordable 3D-printed prosthetics (Arm Bionics) to electronic artificial skin for prosthetics (ArtSkin) and wearable EEG devices for stress monitoring (AWEAR). Many focus on accessibility and affordability, such as Care Hero’s tech-enabled caregiver network to address caregiver shortages, Che Innovations Uganda’s transport warmer for preterm babies in rural Africa, and MariTest’s bloodless malaria diagnostic tool designed for sub-Saharan Africa. Other notable innovations include AI-driven posture adjustment technology (ELLUSTRÖS), at-home heart and metabolic health assessments (Endless Health), and AI-based harmonization of fragmented electronic medical records (Eos.ai) to improve healthcare data utility. Additionally, startups like GLITCHERS
Tags: robot, prosthetics, AI-sensors, wearable-technology, medical-devices, electronic-skin, autonomous-robots
US Army funds $1.5M Purdue project on GPS-free AI robot teams
Purdue University has received $1.5 million in funding from the U.S. Army’s Combat Capabilities Development Command Army Research Laboratory to develop autonomous AI-driven robotic teams capable of operating in GPS-denied, hostile environments. Led by Associate Professor Aniket Bera, the five-year project focuses on integrating aerial drones with ground vehicles to create coordinated air–ground systems that can scout, map, and navigate complex terrains without human intervention. These robots aim to provide enhanced situational awareness and battlefield intuition by sensing their surroundings, sharing information, and making collective decisions, thereby reducing risks to soldiers. The research is conducted within Purdue’s Intelligent Design for Exploration and Augmented Systems (IDEAS) Lab and utilizes the Hicks Robotics and Autonomy Testbed, a large facility equipped for both simulation and real-world testing of diverse robotic platforms. Building on prior single-agent navigation work, this project advances multi-agent collaboration, enabling robotic teams to function as intelligent scouts and navigators connected through an AI framework that fuses perception
Tags: robotics, artificial-intelligence, autonomous-robots, military-technology, drone-technology, multi-agent-systems, GPS-denied-navigation
Top 7 must-read humanoid robot stories of 2025
The article highlights seven pivotal humanoid robot developments in 2025, marking a transition from experimental prototypes to practical, real-world applications. A standout story is from Chinese startup AheadForm Technology, which introduced the Elf V1 robot head featuring highly expressive, human-like facial micro-expressions powered by AI and micro-actuated muscles. This advancement pushes humanoid robots toward emotionally responsive interaction, enabling more natural communication by interpreting social cues and responding with lifelike expressions. This represents a major qualitative leap from the traditionally mechanical and stiff humanoid designs. Another significant development involved UBTech Robotics securing a $37 million contract to deploy their Walker S2 humanoid robots at the China-Vietnam border. These robots perform diverse public service roles such as guiding travelers, managing personnel flow, conducting patrols, and handling logistics with autonomous battery swapping and dexterous manipulation capabilities. This large-scale deployment underscores the growing practical integration of humanoids in government and industrial operations. Additionally, the article spotlights Clone Robotics’ Protoclone
Tags: robotics, humanoid-robots, AI, synthetic-muscles, facial-expression-technology, autonomous-robots, industrial-automation
Pickle Robot adds Tesla veteran as first CFO
Pickle Robot, a Charlestown, Massachusetts-based company specializing in autonomous unloading robots for warehouses and distribution centers, has appointed Tesla veteran Evanson as its first chief financial officer. Evanson had been consulting with Pickle Robot since September before joining full-time. He previously served as Tesla’s vice president of global investor relations and strategy from 2011 to 2017, working closely with Elon Musk and playing a key role in raising debt and equity financing to support Tesla’s vehicle launches and acquisitions. Founded in 2018, Pickle Robot has raised around $100 million in venture capital and is reportedly expanding its partnership with shipping giant UPS. Bloomberg reports that UPS is investing $120 million to purchase 400 of Pickle’s robots, with deployment expected to begin in late 2026 and early 2027. While Pickle Robot confirmed UPS has been a customer for several years, the company declined to comment on the recent investment news. This strategic hire and partnership expansion
Tags: robot, autonomous-robots, warehouse-automation, logistics-technology, Pickle-Robot, UPS-partnership, robotics-finance
World’s smallest robots swim, sense heat, and think autonomously
Researchers at the University of Pennsylvania and the University of Michigan have developed the world’s smallest fully programmable, autonomous robots, each measuring about 0.2 by 0.3 by 0.05 millimeters—comparable in size to bacteria. These microscopic swimming robots can sense their environment, make decisions, and operate independently for months. They move by generating electric fields that push ions in the surrounding fluid, creating thrust without any moving parts. This innovative propulsion system enables durable, long-lasting operation in fluid environments, and the robots can also coordinate their movements in groups, similar to schools of fish. The robots’ intelligence is powered by ultra-miniaturized computers from the University of Michigan, which operate on just 75 nanowatts of power—about 100,000 times less than a smartwatch. Their surfaces are mostly covered by solar cells that harvest light for energy and serve as optical receivers for programming via light pulses. Each robot carries a unique identifier for individualized instructions. Equipped with temperature sensors sensitive
Tags: robots, autonomous-robots, microscale-robotics, micro-robots, robotic-sensing, micro-robot-propulsion, programmable-robots
Generations in Dialogue: Human-robot interactions and social robotics with Professor Marynel Vasquez - Robohub
The article discusses the fourth episode of the AAAI podcast series "Generations in Dialogue: Bridging Perspectives in AI," which features a conversation between host Ella Lan and Professor Marynel Vázquez, a computer scientist and roboticist specializing in Human-Robot Interaction (HRI). The episode explores Professor Vázquez’s research journey and evolving perspectives on how robots navigate social environments, particularly in multi-party settings. Key topics include the use of graph-based models to represent social interactions, challenges in recognizing and addressing errors in robot behavior, and the importance of incorporating user feedback to create adaptive, socially aware robots. The discussion also highlights potential applications of social robotics in education and the broader societal implications of human-robot interactions. Professor Vázquez’s interdisciplinary approach combines computer science, behavioral science, and design to develop perception and decision-making algorithms that enable robots to understand and respond to complex social dynamics such as spatial behavior and social influence. The podcast, hosted by Ella Lan—a Stanford student passionate about AI ethics and interdisciplinary dialogue—
Tags: robot, human-robot-interaction, social-robotics, AI-ethics, autonomous-robots, multi-party-HRI, robotic-perception
Video: NASA's cute cube robot flies autonomously for first time on ISS
Stanford researchers have successfully demonstrated the first AI-based autonomous flight of Astrobee, a cube-shaped, fan-powered robot aboard the International Space Station (ISS). Astrobee is designed to navigate the ISS’s confined, equipment-filled corridors to perform tasks such as leak detection and supply delivery, potentially reducing astronauts’ workload. The team developed a novel route-planning system using sequential convex programming combined with machine learning, which enables the robot to generate safe and efficient trajectories more quickly by leveraging patterns learned from thousands of previous path solutions. This AI-assisted control marks a significant advancement in space robotics, where limited onboard computing resources and stringent safety requirements have traditionally constrained autonomy. During the ISS experiment, the AI system operated autonomously for four hours with minimal astronaut intervention, under remote supervision. The researchers compared conventional “cold start” planning with the new AI-assisted “warm start” approach, finding that the latter reduced trajectory planning time by 50–60%, especially in complex, cluttered environments. Multiple safety measures ensured
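The "warm start" idea, seeding the trajectory optimizer with a guess derived from previously solved paths instead of starting from scratch, can be demonstrated on a toy 1-D path problem. This is illustrative only: it uses SciPy's general-purpose optimizer rather than the sequential convex programming solver used on Astrobee, and the cost function and problem size are invented.

```python
import numpy as np
from scipy.optimize import minimize

N = 20  # intermediate waypoints between start (0.0) and goal (1.0)

def path_cost(waypoints):
    """Penalize jerky paths: sum of squared differences between neighbours."""
    pts = np.concatenate(([0.0], waypoints, [1.0]))
    return np.sum(np.diff(pts) ** 2)

cold_start = np.zeros(N)                     # naive initial guess
warm_start = np.linspace(0, 1, N + 2)[1:-1]  # guess "learned" from past solutions

for name, x0 in [("cold", cold_start), ("warm", warm_start)]:
    result = minimize(path_cost, x0, method="L-BFGS-B")
    print(f"{name} start: {result.nit} iterations, final cost {result.fun:.4f}")
```

The warm-started run converges in far fewer iterations because its initial guess is already close to the optimal straight-line path, which is the same effect the researchers exploit by reusing patterns from thousands of prior solutions.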
Tags: robotics, autonomous-robots, AI-control, space-robotics, NASA, ISS-technology, machine-learning
China’s AgiBot produces 5,000 humanoid robots in just three years
Shanghai-based robotics startup AgiBot announced it has produced its 5,000th humanoid robot less than three years after its founding in February 2023, marking a significant milestone in China’s rapidly growing humanoid robotics sector. The company’s production is diversified across three main product lines tailored to different commercial needs: the agile bipedal Lingxi X-Series (1,846 units), the full-sized Expedition A-Series designed for broad tasks including a recent 66-mile autonomous walk (1,742 units), and the task-focused, often wheeled Genie G-Series for industrial and logistics applications (1,412 units). The milestone was celebrated with the delivery of a Lingxi X2 robot to actor Huang Xiaoming’s studio, underscoring AgiBot’s expanding presence beyond industrial settings. AgiBot’s rapid production pace positions it ahead of domestic competitors like UBTECH, which aims to reach 5,000 units by 2026. Founder Peng Zhihui, a former Huawei
Tags: robotics, humanoid-robots, AgiBot, AI-robotics, industrial-robots, robot-manufacturing, autonomous-robots
China debuts robot dog that can map 10 million square feet nonstop
Chinese robotics company Pudu introduced its latest quadruped robot, the D5, at Tokyo’s International Robot Exhibition (iREX 2025). Standing nearly one meter tall, the D5 showcases advanced motion-control algorithms and embodied intelligence, enabling it to navigate complex environments autonomously. Powered by an NVIDIA Orin platform and RK3588 dual-processor architecture, the robot delivers up to 275 TOPS of computing power for real-time SLAM mapping, obstacle avoidance, and object recognition. It can continuously map and inspect up to one million square meters (approximately 10 million square feet) and travel up to 14 kilometers without human intervention. Equipped with fisheye cameras and LiDAR sensors, the D5 provides 360-degree perception and dense 3D point clouds, enhancing operational safety and efficiency. Designed for durability, it supports a 30-kilogram load for over two hours and is resistant to dust, water, and extreme temperatures. Pudu positions the D5 as
Tags: robot, autonomous-robots, quadruped-robot, SLAM-mapping, LiDAR, NVIDIA-Orin, industrial-robots
How robots like ANYbotics' Roberta are improving industry inspection
The article discusses how ANYbotics’ robotic platform, exemplified by their robot Roberta, is transforming industrial inspection by enhancing autonomy, safety, and sustainability in heavy industries such as oil, gas, and chemical sectors. Founded in 2016 by Dr. Péter Fankhauser and his team, ANYbotics aims to move robotics from research labs into real-world applications, addressing the need for remote monitoring in hazardous environments. Roberta is deployed at Equinor’s Northern Lights carbon-storage terminal, where it autonomously conducts regular inspections, including CO₂ level monitoring and perimeter surveys, reducing the need for on-site personnel and improving asset integrity. Roberta’s operations are integrated with a digital twin of the facility, enabling pre-planned missions that are supervised and triggered remotely via cloud-based platforms. This system allows for virtual rehearsal of routes and failure scenarios, ensuring the robot’s on-site behavior aligns with expectations. The robot collects multi-sensor data (visual, thermal, acoustic, gas), which
Tags: robotics, industrial-automation, autonomous-robots, digital-twin, predictive-maintenance, sensor-technology, oil-and-gas-industry
MobED: Hyundai’s first mass-produced robot for logistics, home use
Hyundai Motor Group has introduced MobED (Mobile Eccentric Droid), its first mass-produced autonomous robot designed for both industrial logistics and everyday use. Developed by Hyundai’s Robotics Lab, MobED was showcased at the International Robot Exhibition 2025 in Tokyo and is slated for sale in the first half of 2026. The compact, four-wheeled robot features AI-powered route planning and obstacle avoidance, utilizing LiDAR and cameras for environmental perception. Its modular design allows adaptation to various roles, including delivery, research, media, and lifestyle services like golf assistance. MobED’s standout feature is its eccentric control mechanism, enabling it to adjust posture and height actively to maintain balance on uneven terrain and narrow indoor corridors. Each wheel integrates Hyundai’s Drive and Lift (DnL) modules, combining driving, steering, and height adjustment in a compact unit. Two versions will be available: a pro model with full AI autonomy, advanced sensors, and a “follow-me” mode for logistics and inspection tasks
Tags: robotics, autonomous-robots, Hyundai-MobED, logistics-robots, AI-navigation, industrial-robots, mobile-robots
Elon Musk's Optimus humanoid robot achieves human-like smooth running
Tesla has released a new update on its humanoid robot, Optimus, showcasing a viral video of the robot running smoothly in a lab setting. Standing 5 feet 11 inches tall and weighing 160 pounds, Optimus features over 40 degrees of freedom, including highly dexterous hands with 11 degrees of freedom designed for human-like interaction. Powered by a 2.3 kWh battery, it operates with impressive energy efficiency, consuming as little as 100W at rest and 500W while walking. The latest update highlights significant improvements in balance, coordination, and gait control, marking a major milestone beyond earlier demonstrations of basic walking, object handling, and posture training. Optimus has shown rapid progress since early 2023, evolving from slow, basic movements to performing complex tasks such as pick-and-place operations, basic assembly, and even Kung Fu moves with smooth full-body coordination. Tesla aims to mass-produce the robot by the end of 2025, with Elon Musk
Tags: robot, humanoid-robot, Tesla-Optimus, robotics, artificial-intelligence, battery-technology, autonomous-robots
Hyundai Motor Group Unveils Production-Ready Autonomous Mobility Robot Platform ‘MobED’ at iREX 2025 - CleanTechnica
Hyundai Motor Group unveiled MobED (Mobile Eccentric Droid), its first mass-produced autonomous mobility robot platform, at the International Robot Exhibition 2025 (iREX 2025) in Tokyo. Developed by Hyundai’s Robotics LAB, MobED evolved from a 2022 CES concept into a production-ready AI-powered robot designed for diverse industrial and everyday applications. The platform emphasizes three core pillars: Adaptive Mobility (hardware), Intuitive Autonomy (software), and Infinite Journey (applications). MobED features an innovative eccentric posture control mechanism for stable movement on varied terrains, automotive-grade engineering for durability and precision, and AI-based autonomous navigation using LiDAR-camera fusion sensors. Two models, MobED Pro and MobED Basic, are slated for sales in the first half of 2026. MobED’s Adaptive Mobility allows it to dynamically adjust posture and height to maintain balance on uneven or inclined surfaces, enabling seamless navigation across indoor and outdoor environments without needing environment-specific designs. Its Intuitive Autonomy
Tags: robotics, autonomous-robots, AI-navigation, mobility-robot-platform, Hyundai-Motor-Group, industrial-robots, adaptive-mobility
China's bipedal robot turns into lifelike dinosaur in stunning demo
Chinese robotics company LimX Dynamics has unveiled a bipedal robot capable of transforming into a life-sized, highly detailed Tyrannosaurus rex, showcased in a recent demonstration. Built on the TRON1 platform, the robot features a realistic dinosaur skin with sculpted head, arms, and tail, creating an immersive prehistoric appearance. The robot maintains balance and stability through advanced real-time posture adjustments and sensor networks, allowing it to withstand pushes and kicks without falling. It walks on two legs at a controlled speed of about 3.1 mph, suitable for crowded public spaces. The robot is designed primarily as a mobile interactive display for cultural tourism, aiming to attract visitors to museums and parks by offering a unique, educational experience of walking alongside a lifelike extinct animal. The TRON1 platform supports versatile mobility, including point-foot balance on uneven terrain and wheeled movement on smooth surfaces, and can accommodate different skins that are easily swapped to transform the robot into various creatures. LimX Dynamics plans to offer
Tags: robotics, bipedal-robot, autonomous-robots, robot-balance-control, interactive-robotics, entertainment-robotics, cultural-tourism-technology
China’s robot dog autonomously finds victims in disaster drills
Chinese robotics firm Deep Robotics showcased its X30 quadruped robot dogs in a disaster rescue drill held in Hangzhou on November 27. The drill, named the “2025 Joint Emergency Rescue Drill for Concurrent Multi-Type Accidents,” tested the robots’ ability to autonomously locate seven simulated trapped victims across multiple high-risk scenarios. The X30 demonstrated advanced mobility and stability by navigating challenging terrain such as 45-degree stairs, hollow scaffolding, and debris-filled surfaces, supported by its four-legged bionic design, smart gait control, and IP67 protection rating that allows operation in dusty and wet conditions. These capabilities enabled the robots to safely enter hazardous zones and relay critical real-time data to command centers. The X30 robots employed full-scene scanning, long-distance video feeds, and a broadband self-organizing network to build 3D models of the disaster site, providing rescue teams with accurate environmental awareness and situational updates. Integrated with personnel search-and-rescue systems and facial recognition cameras, the
Tags: robotics, rescue-robots, autonomous-robots, disaster-response, quadruped-robots, robot-dog, Deep-Robotics
Manta ray soft robot uses magnetic fields to swim autonomously
Researchers at the National University of Singapore have developed a manta ray-inspired soft robot that uses magnetic fields not only to propel itself but also to enhance the performance of its flexible batteries, enabling autonomous and untethered operation. Traditional flexible batteries often stiffen soft robots or degrade quickly under strain, limiting their autonomy. The team addressed this by encapsulating zinc-manganese dioxide (Zn-MnO₂) batteries in soft silicone and stacking them vertically within the robot’s body, maximizing space and maintaining flexibility. Magnetic fields generated by the robot’s actuators stabilize the battery chemistry, reduce dendrite growth (which can cause short circuits), and maintain energy output after repeated bending, nearly doubling battery life compared to unenhanced samples. The magnetic field improves battery function through two mechanisms: the Lorentz force redirects zinc ion movement to promote uniform deposition and suppress dendrite formation, while alignment of electron spins within the manganese oxide lattice strengthens atomic bonds, preventing crystal degradation. The robot’s fins flap in response to external magnetic
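The first mechanism mentioned above is the ordinary Lorentz force acting on the moving zinc ions; in standard notation (the general physics relation, not a formula quoted from the paper):

```latex
\vec{F} = q\left(\vec{E} + \vec{v} \times \vec{B}\right)
```

Here \(\vec{F}\) is the force on an ion of charge \(q\) moving with velocity \(\vec{v}\) through electric field \(\vec{E}\) and magnetic field \(\vec{B}\); the magnetic term deflects the ions' paths, which is what promotes more uniform zinc deposition.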
Tags: soft-robotics, magnetic-fields, flexible-batteries, autonomous-robots, energy-management, manta-ray-robot, electrochemical-stabilization
Watch: Humanoid robots sort boxes in real-time warehouse demo
Mentee Robotics, an Israeli company founded by Mobileye co-founder Amnon Shashua, has released an unedited 18-minute video demonstrating two of its V3 humanoid robots autonomously sorting and moving boxes in a real warehouse environment. The robots successfully relocated 32 boxes from uneven piles to storage racks without any remote control, showcasing advanced live perception, motion planning, and multi-robot coordination. They maintained balance while carrying loads up to 55 pounds, navigated shared spaces without collisions, and operated at a steady, measured pace emphasizing reliability over speed. This continuous, uncut footage serves as evidence of the robots’ capability for long-duration, autonomous warehouse tasks. The V3 MenteeBot is designed specifically for industrial use, standing 5 feet 9 inches tall and equipped with dual NVIDIA Jetson Orin AGX processors to manage full 360-degree vision and onboard decision-making, which is critical in environments with unreliable wireless signals. Its custom actuators provide high power density for repeated
Tags: robot, humanoid-robots, warehouse-automation, autonomous-robots, robotics-technology, industrial-robots, robot-coordination
US OpenMind's BrainPack makes humanoid robots ‘real-world' smart
OpenMind has introduced BrainPack, a modular, backpack-sized platform that integrates key autonomous robot functions—such as advanced mapping, object labeling, privacy-protected vision, remote operation, and self-charging—into a single unit powered by Nvidia’s high-performance computing. Designed to bridge the gap between robotics and intelligence, BrainPack enables robots not only to move but also to observe, interpret, and learn from their surroundings by building detailed 3D maps and recognizing objects autonomously. Privacy features include automatic face detection and blurring to anonymize humans in view, while remote control and secure video streaming enhance usability. The platform combines research-grade reliability with consumer-level simplicity, making autonomous robotics more accessible without the need for specialized labs or complex setups. Early tests have demonstrated that BrainPack-equipped robots can perform self-guided patrols, map multi-room environments, recognize and label objects, and self-dock for charging—all without direct supervision. Additionally, OpenMind has developed OM1, a hardware-agnostic
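The face-anonymization step described here is, in spirit, a detect-then-blur pass over each camera frame. Below is a minimal stand-alone sketch using OpenCV; it is not OpenMind's implementation, and the Haar cascade is simply the most readily available face detector for illustration.

```python
import cv2

def anonymize_faces(frame):
    """Detect faces in a BGR frame and blur them in place (illustrative)."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        # Replace each detected face region with a heavily blurred copy
        frame[y:y+h, x:x+w] = cv2.GaussianBlur(frame[y:y+h, x:x+w], (51, 51), 0)
    return frame

# Usage (hypothetical file name): blurred = anonymize_faces(cv2.imread("camera_frame.jpg"))
```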
robotics, humanoid-robots, autonomous-robots, robot-autonomy, AI-in-robotics, robot-mapping, self-charging-robots
Agile Robots launches Agile ONE industrial humanoid - The Robot Report
Agile Robots SE, a Munich-based company, has launched Agile ONE, its first industrial humanoid robot designed to work safely and efficiently alongside humans and other systems in structured industrial environments. Agile ONE features intuitive human-robot interaction (HRI) capabilities, including responsive eye rings, proximity sensors, a rearview camera, and a chest display for real-time information. Its dexterous five-fingered hands, equipped with multiple sensors for force and tactile feedback, enable precise manipulation tasks such as handling tiny screws or operating power tools. The robot embodies Agile Robots’ vision of “physical AI,” combining intelligence, autonomy, and flexibility to perceive, understand, and act in the physical world. A key differentiator for Agile ONE is its layered AI approach, described as a “data pyramid” that integrates real-world teleoperation and field data, physical simulation data, and visual data from videos and images. Its cognitive architecture includes three layers: slow thinking for task planning, fast thinking for dynamic individual actions,
robot, humanoid-robot, industrial-automation, AI-robotics, human-robot-interaction, robotic-hand, autonomous-robots
Robots get new 'brain' inspired by birds, ants to navigate without GPS
Researchers have developed a novel navigation system for robots inspired by animals such as ants, birds, and rodents, enabling robust navigation in environments where GPS is unreliable or unavailable. Traditional non-GPS methods like cameras and sensors often fail under poor visibility or harsh conditions, but animals have evolved redundant and efficient navigation strategies that can be mimicked. The system integrates three overlapping navigation methods—ant-inspired internal step and direction tracking via a spiking neural network, bird-inspired multi-sensor fusion including quantum magnetometers and polarization compasses processed through Bayesian filters, and rodent-inspired cognitive mapping that updates only when significant landmarks are detected. This redundancy, known as degeneracy in biology, allows the robot to compensate if one system fails, enhancing reliability. This bio-inspired approach offers significant advantages over conventional methods like SLAM by conserving energy and reducing computational load, making it suitable for challenging applications such as search and rescue in collapsed buildings, planetary exploration, deep-sea missions, and industrial inspections in chaotic environments. Although currently theoretical,
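To make the redundancy ("degeneracy") idea concrete, here is a minimal Python sketch in which three hypothetical position estimators are fused by confidence and any failed subsystem simply drops out; the subsystem names and numbers are illustrative, not taken from the published design.

```python
import numpy as np

def fuse_estimates(estimates):
    """Combine redundant position estimates, each given as ((x, y), confidence).

    Illustrates 'degeneracy': a subsystem reporting zero confidence (e.g. a
    failed sensor) is ignored, and the remaining estimators still give a fix.
    """
    valid = [(np.asarray(p), w) for p, w in estimates if w > 0]
    if not valid:
        raise RuntimeError("all navigation subsystems failed")
    total = sum(w for _, w in valid)
    return sum(w * p for p, w in valid) / total

# Placeholder outputs from the three bio-inspired subsystems:
path_integration = ((10.2, 4.9), 0.5)   # ant-style step/heading dead reckoning
sensor_fusion    = ((10.6, 5.1), 0.8)   # bird-style magnetometer + polarization fix
landmark_map     = ((0.0, 0.0), 0.0)    # rodent-style map, no landmark seen -> ignored

print(fuse_estimates([path_integration, sensor_fusion, landmark_map]))
```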
robotics, navigation-systems, bio-inspired-robots, autonomous-robots, spiking-neural-networks, quantum-magnetometer, energy-efficient-robotics
Foxglove raises $40M to scale its data platform for roboticists
Foxglove, a San Francisco-based startup founded in 2021 by former Cruise engineers Adrian Macneil and Roman Shtylman, has raised $40 million in a Series B funding round, bringing its total funding to over $58 million. The company develops a data and observability platform designed to help robotics companies collect, analyze, and visualize sensor data from their robots, aiming to accelerate development and improve robot reliability. Foxglove’s platform provides robotics startups with infrastructure similar to that used internally by industry leaders like Waymo and Tesla, but without requiring large engineering teams. Its customers include Amazon, NVIDIA, Shield AI, and Dexterity, among others. Foxglove’s tools have demonstrated significant impact, such as helping Dexterity reduce tooling and development time by over 20%, saving $150,000 annually. Notably, Shield AI integrated Foxglove’s platform into its HiveMind autonomy stack, embedding it as part of its software development kit, highlighting Foxglove’s role
robotics, data-platform, robotics-development, machine-learning, autonomous-robots, robotics-startups, software-development-kit
Robotic kitchen in a box cooks, cleans and serves 120 meals an hour
A Munich-based robotics company, Circus SE, has introduced the CA-1 Series 4, a fully autonomous robotic kitchen, inside a REWE supermarket in Düsseldorf, Germany. This compact, glass-enclosed system autonomously handles the entire meal preparation process—from ingredient collection to cooking, plating, and cleaning—without human intervention. Capable of producing up to 120 meals per hour, the CA-1 offers restaurant-quality dishes priced from €6, cooked fresh on demand within minutes. The system’s AI-driven operations include real-time ingredient monitoring, adaptive stirring speeds, and self-cleaning via an integrated commercial dishwasher, all visible to customers through a transparent panel. This installation marks the first integration of AI-powered cooking robots directly within a supermarket, positioning REWE as a pioneer in retail automation and experiential food services. The collaboration between Circus and REWE is designed to be scalable, with two additional pilot sites planned and potential applications in hospitals, universities, factories, and even military settings. The CA-1
robotics, AI, automation, robotic-kitchen, food-service-automation, retail-technology, autonomous-robots
Robot Talk Episode 132 – Collaborating with industrial robots, with Anthony Jules - Robohub
In Robot Talk Episode 132, Claire interviews Anthony Jules, CEO and co-founder of Robust.AI, about their autonomous warehouse robots designed to collaborate seamlessly with human workers. Robust.AI’s flagship product, Carter™, is engineered to operate within existing warehouse environments without disrupting current workflows, emphasizing human-centered automation. Anthony Jules brings over 30 years of experience in robotics, AI, and business, combining technical expertise with operational leadership to advance AI-driven warehouse automation. The episode highlights the growing trend of integrating autonomous robots into human workplaces to enhance efficiency while maintaining collaboration and safety. Jules’ background, including his MIT training and leadership roles in various tech companies, underscores the innovative approach Robust.AI takes in developing robots that complement rather than replace human labor. This conversation fits within the broader context of Robot Talk’s focus on robotics, AI, and autonomous machines, showcasing practical applications and ongoing advancements in the field.
robotics, industrial-robots, warehouse-automation, AI-driven-robots, human-robot-collaboration, autonomous-robots, robotic-technology
US firm unveils mini ‘tank-killer’ robot built to hunt heavy armor
US robotics company Swarmbotics AI has introduced FireAnt, a new lightweight, ground-based autonomous unmanned vehicle designed to operate in coordinated swarms for anti-tank missions. FireAnt is built to detect, track, and engage heavy armored targets such as tanks, functioning semi-independently within swarms controlled by a single human operator. The system emphasizes swarm autonomy, enabling robots to share targeting data, adapt to battlefield changes in real-time, and accelerate the kill chain process. FireAnt is ruggedized to IP67 standards, resistant to dust, water immersion, heat, vibration, and shock, making it suitable for diverse combat environments. The FireAnt platform is modular and interoperable, supporting quick payload changes for various missions including reconnaissance, mapping, data relay, and anti-armor tasks. It integrates with common robotic software architectures like ROS 2 and JAUS, facilitating connection to broader defense networks. Swarmbotics co-founder Drew Watson likens FireAnt’s approach to the successful use
robotics, unmanned-ground-vehicles, autonomous-robots, robotic-warfare, swarm-robotics, military-technology, modular-robotics
100 robotics startups to watch - The Robot Report
The Robot Report’s inaugural Startup Radar is a comprehensive report profiling 100 robotics startups that are five years old or younger, highlighting the emerging leaders shaping the future of robotics. The report provides detailed insights into each company’s products, target markets, funding levels, employee counts, and other key data points. Covering a broad range of robotics sectors—including autonomous mobility, humanoid robots, and industrial automation—the Startup Radar identifies areas of rapid innovation and new market opportunities. Designed for investors, engineers, and component suppliers, the report offers a clear perspective on the evolving robotics landscape and helps stakeholders identify promising companies poised for growth. By showcasing the creativity and momentum within the robotics ecosystem, the Startup Radar serves as a valuable resource to stay informed about the next generation of robotics technologies and the startups driving industry transformation. The full report is available exclusively through The Robot Report.
robotics, startups, autonomous-robots, industrial-automation, humanoid-robots, robotics-innovation, robotics-industry
Top 10 robotics developments of October 2025 - The Robot Report
In October 2025, the robotics industry saw significant developments highlighted by The Robot Report, coinciding with the return of the RoboBusiness event in Santa Clara, California. Key advancements included EndoQuest Robotics completing the first robotic endoscopic submucosal dissection (ESD) procedure at the Mayo Clinic as part of a multicenter trial evaluating their Endoluminal Surgical System for gastrointestinal applications. Meanwhile, 1X Technologies opened preorders for NEO, a humanoid robot designed specifically for household use, marking a strategic shift toward consumer robotics. Revolute Robotics secured $1.9 million in funding to accelerate deployment of its autonomous ground and aerial robots for inspection, security, and defense sectors. Other notable updates involved Singapore’s National Robotics Programme unveiling initiatives to boost robot adoption and workforce readiness through shared testbeds, collaborations, and national standards. Serve Robotics planned to raise up to $100 million via stock sales to fund its sidewalk delivery robot operations. Amazon introduced the Blue Jay robot and Project
robotics, humanoid-robots, autonomous-robots, robotic-surgery, robot-adoption, robotics-funding, delivery-robots
Polish firm to demonstrate fully autonomous robot with new collaboration
Polish company Robotec.ai is set to demonstrate the first fully autonomous warehouse robot powered exclusively by AMD Ryzen AI processors. This robot utilizes advanced Agentic AI capabilities to dynamically plan and execute tasks in real time without relying on pre-programmed scripts. Through collaborations with AMD and Liquid AI, the robot integrates multiple cutting-edge technologies, including Liquid AI’s next-generation LFM2 Vision Language Models, which combine perception, reasoning, and natural language understanding. This enables the robot to interpret human commands, detect safety hazards such as spills or blocked exits, and autonomously take corrective actions, significantly enhancing operational safety and efficiency. The collaboration also leverages extensive testing in simulated environments created with the Open 3D Engine, allowing validation of the embedded AI on real hardware while minimizing physical testing costs and risks. The robot operates within mixed-traffic warehouse settings, adapting to changing conditions through real-time replanning and serving as an inspection agent that alerts operators to unexpected events or safety issues. The AMD Ryzen processor provides a powerful
robot, autonomous-robots, AI-robotics, warehouse-automation, AMD-Ryzen-AI, agentic-AI, robotics-collaboration
Robotec.ai works with AMD, Liquid AI to apply agentic AI to warehouse robots - The Robot Report
Robotec.ai, in collaboration with Liquid AI and AMD, has demonstrated a fully autonomous warehouse robot powered by agentic AI that dynamically plans and executes tasks in real time without relying on hard-coded scripts. The robot operates on AMD Ryzen AI processors and Liquid AI’s LFM2 vision language models (VLMs), which integrate perception, reasoning, and natural language understanding. This enables the robot to interpret commands, detect safety hazards like spills or blocked exits, and autonomously take corrective actions. Extensive simulation testing has enhanced system performance and validated embedded AI on real hardware, reducing the risks and costs associated with physical testing. The autonomous mobile robot (AMR) showcased at ROSCon 2025 in Singapore operates in a mixed-traffic warehouse environment, completing human-specified tasks via natural language and adapting to changing conditions through replanning. Liquid AI’s LFM2-VL model, optimized for AMD hardware, processes visual scenes, performs context-aware reasoning, and plans goal-driven actions entirely on-device. Robot
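The agentic behavior described here can be pictured as a perceive-reason-replan loop. The Python sketch below uses illustrative stub functions (capture_frame, vlm_describe, plan_tasks, execute); it is a generic outline of the idea, not the Robotec.ai, Liquid AI, or AMD API.

```python
# Hypothetical sketch of a perceive -> reason -> replan loop.
# All functions are illustrative stand-ins, not a real vendor API.

def capture_frame():
    return {"image": "frame-0001"}          # placeholder camera frame

def vlm_describe(frame):
    # A vision language model would return a structured scene description here.
    return {"hazard": None, "objects": ["pallet", "bin"]}

def plan_tasks(goal, scene=None):
    # A planner (here trivially hard-coded) turns the goal + scene into steps.
    return ["drive_to_pallet", "pick_bin", "deliver_bin"]

def execute(step):
    print("executing:", step)
    return True                              # pretend every step succeeds

def agentic_loop(goal, max_steps=10):
    plan = plan_tasks(goal)
    for _ in range(max_steps):
        scene = vlm_describe(capture_frame())
        if scene["hazard"]:                  # e.g. spill or blocked exit
            plan = plan_tasks(goal, scene)   # replan around the hazard
        if not plan:
            return "done"
        if not execute(plan.pop(0)):
            plan = plan_tasks(goal, scene)   # execution failure also triggers replanning
    return "timeout"

print(agentic_loop("move bins to staging area"))
```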
robot, AI, warehouse-automation, autonomous-robots, AMD-Ryzen-AI, Liquid-AI, robotics-simulation
LimX Dynamics' Oli humanoid robot performs autonomous get-up routine
LimX Dynamics has demonstrated a significant advancement in humanoid robotics with its Oli robot performing an autonomous get-up routine. In a video released on October 23, two Oli humanoids were shown lying down and then rising back to a standing position in a human-like manner, using coordinated movements involving their legs and hip joints. This demonstration highlights Oli’s ability to execute complex, joint-level motions autonomously, supported by its modular design and software development kit (SDK), aimed at researchers and integrators to accelerate humanoid system development. Oli stands 165 cm tall, weighs 55 kg, and features 31 degrees of freedom, enabling detailed movements such as bending, reaching, and grasping. Equipped with 3D cameras, LiDAR, and motion sensors, it can perceive its environment and interact with small objects. The robot supports Python development and popular simulation platforms such as NVIDIA Isaac Sim and Gazebo, with software updates delivered over the air. Available in Lite, EDU, and Super versions, Oli
robotics, humanoid-robot, autonomous-robots, LimX-Dynamics, robot-control-systems, robot-sensors, robot-simulation-platforms
Oso Electric Equipment acquires Electric Sheep Robotics - The Robot Report
Oso Electric Equipment has acquired Electric Sheep Robotics, a company specializing in AI-driven autonomous mowing robots. This merger combines Oso’s electric powertrain technology with Electric Sheep’s robotics and machine learning systems, aiming to advance the automation of outdoor work across various sectors including infrastructure, construction, agriculture, defense, and space exploration. The acquisition follows a prior partnership where Oso introduced a commercial electric smart lawn mower powered by Electric Sheep’s AI platform. Together, they plan to expand access to zero-emission, autonomous outdoor equipment suitable for challenging commercial environments, with deployments already underway in California and Texas. Electric Sheep, founded in 2019 and based in San Francisco, gained recognition as The Robot Report’s 2024 RBR50 Robotics Innovation Award Startup of the Year for its innovative business model. The company uniquely integrated landscaping services into its strategy, allowing it to deploy autonomous mowers gradually with direct operational experience. Its technology includes the ES1 learned-world model for reasoning and planning, powering products like the RAM
robotics, autonomous-robots, AI, electric-equipment, energy-efficiency, outdoor-automation, smart-lawn-mowers
Robot battle: Nearly 50 miniature bots fight at UK championship event
The UK Beetle Championship, held at St Michael’s Centre in Stoke Gifford, Bristol, showcased nearly 50 miniature robots weighing 3.3 pounds or less competing in head-to-head battles. Organized by the Bristol Bot Builders, the event aimed to promote STEM education by engaging participants and spectators in robotics through an interactive and accessible format. Unlike larger combat robots seen on shows like Robot Wars, these beetleweight bots are smaller, more affordable, and easier to build, yet still capable of delivering powerful hits using weapons such as spinning discs that can reach speeds of 250 mph. Around 300 spectators attended the event, which emphasized hands-on learning and community involvement over pure spectacle. Bristol has become a prominent hub for robot combat in the UK, attracting engineers, students, and hobbyists who dedicate significant time to designing, assembling, and programming their robots. Participants engage in a continuous cycle of building, testing, and refining their machines throughout the year, often programming them for autonomous operation
robotics, robot-combat, STEM-education, autonomous-robots, robot-engineering, robot-battles, UK-robotics-events
Humanoids need orchestration to be useful in manufacturing, notes Flexxbotics CEO - The Robot Report
The article by Tyler Bouchard, CEO of Flexxbotics, emphasizes that humanoid robots in manufacturing require sophisticated orchestration and coordination to be truly effective. Rather than automating isolated tasks, humanoids must perform multiple operations autonomously and work seamlessly alongside other robots, machines, and human workers within smart factories. Achieving this level of integration demands that humanoids operate with contextual awareness, communicating bi-directionally with business IT systems and factory equipment to receive instructions, provide updates, and adjust actions in real time. Bouchard highlights that humanoids need robotic production software capable of secure, real-time read/write communication with diverse factory assets to enable fully autonomous operation. This closed-loop communication system allows humanoids to move beyond simple automation toward connected autonomy, where they can make contextual decisions and interact dynamically within production processes. Without such orchestration and interoperability, the potential of humanoid robots to drive scalable, efficient manufacturing will remain unrealized.
robotics, humanoid-robots, smart-factory, industrial-automation, manufacturing-technology, robot-orchestration, autonomous-robots
Revolute Robotics brings in $1.9M to deploy its driving, flying robots - The Robot Report
Revolute Robotics, a Scottsdale-based startup founded in 2020, has raised $1.9 million to advance its hybrid aerial-terrestrial robots designed for autonomous inspection, security, and defense applications. The company’s robot features a durable exoskeleton and customizable payloads, enabling it to drive on the ground to conserve battery life and fly to overcome obstacles. This dual mobility allows longer inspection times over larger areas and access to confined, complex, and GPS-denied environments where traditional drones and robots cannot operate. Revolute’s platform supports multiple sensor types—including visual, thermal, gas, radiation detection, lidar mapping, and ultrasonic testing—making it a versatile “Swiss Army Knife” for industries such as oil and gas, power, chemicals, construction, and mining. The robot is already being used by security teams for perimeter patrols and threat response, and by defense teams for base patrol, intelligence, surveillance, reconnaissance (ISR), vehicle inspection, and search and rescue missions. The system also supports
robotics, autonomous-robots, hybrid-mobility-robot, drone-technology, inspection-robots, surveillance-robots, defense-technology
Humanoid robot 'superworker' offers dexterous industrial assistance
Ati Motors, an Indian AI and robotics company, has introduced the Sherpa Mecha humanoid-inspired robot designed specifically for practical industrial applications rather than human-like imitation. Unlike traditional humanoid robots that focus on replicating human appearance and motion, Sherpa Mecha prioritizes functionality on manufacturing floors, performing tasks such as machine tending, material transport, and heavy bin handling. The robot features high-performance actuators, precision gripping, 3D navigation, and a 26-pound payload capacity, moving on wheels for enhanced speed and safety in industrial environments. This design reflects Ati Motors’ philosophy of creating robots “for industry, not spectacle,” emphasizing utility and integration over biomimicry. Sherpa Mecha is positioned as a “tool-forward industrial superworker” capable of continuous, fatigue-free operation and seamless integration into existing automation lines. Developed in collaboration with research institutions and industrial partners, the robot has undergone extensive testing to ensure reliability and compatibility. Ati Motors promotes Sherpa Mecha as a customizable platform
robotics, industrial-automation, humanoid-robot, AI-robotics, manufacturing-technology, autonomous-robots, industrial-superworker
Autonomous ARGUS robot tracks hackers and guards physical spaces
Romanian researchers from Ștefan cel Mare University have developed ARGUS (Autonomous Robotic Guard System), an innovative autonomous robot that integrates physical security and cybersecurity into a unified defense platform. Equipped with LiDAR, RGB/IR cameras, an intrusion detection system (IDS) module, and AI-powered computer vision, ARGUS can simultaneously patrol physical spaces and monitor network traffic to detect intruders and cyber threats in near real-time. It uses deep learning to identify suspicious activities such as unauthorized personnel, weapons, abnormal sounds, and digital anomalies, enabling it to respond to both physical and cyber breaches concurrently. ARGUS employs advanced navigation technologies like Simultaneous Localization and Mapping (SLAM) and sophisticated control algorithms to autonomously maneuver through indoor and outdoor environments without human intervention. Its modular design allows integration with existing security infrastructures, making it suitable for complex environments such as industrial plants, smart cities, airports, and research labs where cyber-physical threats often overlap. Future developments envision multiple ARGUS units operating as
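One way to picture the cyber-physical fusion is a shared triage queue for vision, audio, and network events. The Python sketch below is a hypothetical illustration with made-up event names and thresholds, not the ARGUS implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch: physical detections and network alerts merged into one
# response queue. Event names, sources, and the threshold are illustrative only.

@dataclass
class Event:
    source: str      # "vision", "audio" or "ids"
    kind: str        # e.g. "unauthorized_person", "port_scan"
    severity: float  # 0.0 .. 1.0

def triage(events, threshold=0.6):
    """Return events that warrant a response, highest severity first."""
    return sorted((e for e in events if e.severity >= threshold),
                  key=lambda e: e.severity, reverse=True)

observed = [
    Event("vision", "unauthorized_person", 0.9),
    Event("ids", "port_scan", 0.7),
    Event("audio", "background_noise", 0.2),
]

for event in triage(observed):
    print(f"respond to {event.kind} reported by {event.source}")
```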
robotics, autonomous-robots, cybersecurity, AI, SLAM, smart-buildings, intrusion-detection
Women in robotics you need to know about 2025 - Robohub
The article "Women in Robotics You Need to Know About 2025" from Robohub celebrates International Women in Robotics Day by highlighting 20 influential women shaping the robotics field worldwide. Robotics today extends beyond traditional manufacturing to areas like space exploration, healthcare, agriculture, and global connectivity. The featured women include professors, engineers, startup founders, and communicators from diverse countries such as Australia, Brazil, Canada, China, Germany, Spain, Switzerland, the UK, and the US. Their work spans tactile sensing, swarm robotics, embodied AI, and more, demonstrating the broad scope and impact of robotics research and innovation. The article emphasizes the importance of recognizing women's contributions to robotics to combat their historical invisibility and encourage greater representation. Among the honorees are Heba Khamis, co-founder of Contactile developing tactile sensors; Kelen Teixeira Vivaldini, researching autonomous robots for environmental applications; Natalie Panek, a senior engineer in space robotics; and Joelle Pineau,
robotics, women-in-robotics, tactile-sensors, autonomous-robots, AI-in-robotics, swarm-robotics, robotics-innovation
Diligent Robotics adds two members to AI advisory board - The Robot Report
Diligent Robotics, known for its Moxi mobile manipulator used in hospitals, has expanded its AI advisory board by adding two prominent experts: Siddhartha Srinivasa, a robotics professor at the University of Washington, and Zhaoyin Jia, a distinguished engineer specializing in robotic perception and autonomy. The advisory board, launched in late 2023, aims to guide the company’s AI development with a focus on responsible practices and advancing embodied AI. The board includes leading academics and industry experts who provide strategic counsel as Diligent scales its Moxi robot deployments across health systems nationwide. Srinivasa brings extensive experience in robotic manipulation and human-robot interaction, having led research and development teams at Amazon Robotics and Cruise, and contributed influential algorithms and systems like HERB and ADA. Jia offers deep expertise in computer vision and large-scale autonomous systems from his leadership roles at Cruise, DiDi, and Waymo, focusing on safe and reliable AI deployment in complex environments. Diligent Robotics’
robotics, AI, healthcare-robots, autonomous-robots, human-robot-interaction, robotic-manipulation, embodied-AI
Adrian Stoch: Driving Hai Robotics' U.S. expansion
In Episode 215 of The Robot Report Podcast, Adrian Stoch, CEO of Hai Robotics USA, discusses his move from GXO Logistics to Hai Robotics, attracted by the company’s customer-focused approach and innovative culture under founder Richie Chen. Stoch emphasizes the importance of aligning automation solutions with customer needs, highlighting a trend toward large-scale automation driven by global supply chain challenges and labor shortages. His goals for Hai Robotics in the Americas include building a skilled team and implementing lean processes to support growth and enhance customer success. The episode also covers major robotics industry news, including Dexory’s milestone of 500 million warehouse scans and an $80 million Series B funding round aimed at expanding its AI-powered DexoryView platform and U.S. market presence. DoorDash introduced Dot, a compact autonomous delivery robot designed for neighborhood deliveries, capable of carrying up to 30 pounds and traveling at speeds up to 20 mph. Zoox has begun testing its self-driving robotaxi service in Washington, D.C., marking its
robotics, autonomous-robots, warehouse-automation, delivery-robots, AI-powered-robotics, logistics-automation, self-driving-vehicles
Globant invests in InOrbit Series A to advance robot orchestration - The Robot Report
InOrbit Inc., a Mountain View-based company specializing in AI-powered robot operations (RobOps) software, has closed its Series A funding round led by Globant and other investors. The capital will be used to accelerate platform development and expand InOrbit’s presence in key industries such as manufacturing, logistics, retail, and hospitality. InOrbit aims to address challenges like labor shortages and supply chain risks by providing a robot orchestration platform that integrates robots, human workers, and AI agents. The company’s software acts as a “central nervous system” for robot fleets, enabling autonomous decision-making and adaptive responses in real-world environments, with customers including Colgate-Palmolive and Genentech. The partnership between InOrbit and Globant builds on their previous collaboration, with Globant integrating InOrbit’s RobOps software into its Robotics Studio and offering it as part of its digital transformation services. Globant emphasizes that InOrbit’s platform complements existing enterprise systems such as WMS and ERP, enhancing orchestration of diverse
robot, robotics, AI, automation, robot-orchestration, enterprise-software, autonomous-robots
Google's Gemini model lets humanoid robot carry out multimodal tasks
Google DeepMind has unveiled advancements in its humanoid robots powered by the Gemini Robotics 1.5 AI models, enabling them to perform complex, multi-step tasks through multimodal reasoning. Demonstrated in a recent video, the bi-arm Franka robot successfully completed the "banana test," sorting different fruits by color into separate plates, showcasing improved capabilities over previous models that could only follow single-step instructions. Another test featured Apptronik’s Apollo humanoid sorting laundry by color, even adapting to changes in basket positions mid-task, highlighting the robots' enhanced perception and adaptability. The Gemini Robotics 1.5 family includes two complementary models: one that converts visual inputs and instructions into actions, and another that reasons about the environment to create step-by-step plans. This agentic framework allows robots to autonomously study their surroundings, make decisions, and execute tasks such as sorting waste according to local recycling rules by researching guidelines online and applying them in real time. Google emphasizes safety in these models, incorporating risk assessment
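The two-model split can be sketched as a planner that emits natural-language steps and an action model that turns each step, together with the current image, into motor commands. The stubs below are illustrative only and do not reflect the actual Gemini Robotics interfaces.

```python
# Hypothetical sketch of a planner + vision-language-action (VLA) pairing.
# These stubs are illustrative, not the Gemini Robotics API.

def planner_model(instruction, scene_image):
    # A reasoning model would study the scene and emit an ordered plan.
    return ["pick up banana", "place banana on yellow plate"]

def action_model(step, scene_image):
    # A VLA model would emit low-level motor commands for one step.
    return {"gripper": "close" if step.startswith("pick") else "open"}

def run(instruction):
    image = "camera_frame"                   # placeholder for a real observation
    for step in planner_model(instruction, image):
        command = action_model(step, image)
        print(step, "->", command)

run("sort the fruit by color")
```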
robotics, humanoid-robots, AI-models, multimodal-tasks, autonomous-robots, robot-perception, robot-reasoning
Spider-like robot can 3D print homes in a day to fight housing crunch
Australia has developed an advanced spider-like robot named Charlotte, designed to address the housing crisis by 3D printing low-cost, low-carbon homes rapidly. Created through a collaboration between Crest Robotics and Earthbuilt Technology, Charlotte can autonomously print a 200-square-metre house within 24 hours by transforming readily available materials such as sand, earth, and crushed brick into structural walls. The robot employs a sustainable Earthbagging-like technique, compacting these materials in fabric layers to build durable structures efficiently, offering a scalable solution to the slow and costly traditional construction methods. Beyond Earth, Charlotte is engineered for lunar construction, supporting NASA and other space agencies' ambitions to establish permanent bases on the Moon. Its lightweight, foldable hexapod design makes it highly portable for space travel, unlike bulky traditional 3D printers, and allows it to extrude and compact lunar soil to build habitats such as domed shelters. This innovation positions Charlotte within a competitive global effort alongside companies like ICON and AI SpaceFactory
robotics, 3D-printing, construction-technology, lunar-habitats, autonomous-robots, sustainable-building, space-exploration
Oxford Robotics Institute director discusses the truth about AI and robotics - The Robot Report
Nick Hawes, director of the Oxford Robotics Institute and professor at the University of Oxford, highlights significant advances in robotics and AI that are transforming business applications. He emphasizes that autonomous robotics—robots capable of operating independently without direct human control—are becoming increasingly common, especially in logistics and inspection tasks. Examples include quadruped robots and drones that autonomously monitor sites for issues requiring human attention. While humanoid robots generate excitement, Hawes advises caution for immediate business adoption, suggesting their practical use cases may emerge within the next five to ten years. In AI, he points to foundation models, such as large language and vision-language-action models, as pivotal technologies that enable robots to better understand and interact with complex, unstructured environments. Hawes draws on extensive experience deploying autonomous robots across diverse environments to illustrate their potential. Early projects involved autonomous mobile robots performing security patrols in offices and assisting nursing staff in care homes and hospitals, operating continuously without human intervention. His work also includes underwater autonomous robots collecting
robotics, artificial-intelligence, autonomous-robots, AI-in-robotics, robotics-applications, humanoid-robots, robotics-research
Wind-driven tumbleweed rovers could roll up to 1,740 miles on Mars
European researchers have developed a novel concept for Mars exploration using wind-driven “Tumbleweed” rovers—lightweight, spherical robots up to five meters in diameter designed to be propelled across the Martian surface by winds. Inspired by natural tumbleweeds, these rovers carry scientific instruments within their cores and can traverse diverse terrains, including sand, pebbles, rough ground, and slopes equivalent to 30 degrees on Mars. Initial field tests with a 2.7-meter prototype in a Dutch quarry and wind tunnel experiments simulating Martian atmospheric conditions demonstrated that these rovers can maintain sensor functionality while rolling and can be mobilized by wind speeds typical on Mars. The experiments validated fluid dynamics models predicting rover movement and confirmed the feasibility of using swarms of these rovers for low-cost, wide-ranging planetary exploration. Data suggest a single Tumbleweed rover could travel approximately 262 miles in 100 Martian sols at an average speed of 0.22 mph, with potential maximum distances up
robotics, Mars-exploration, wind-powered-rovers, planetary-rovers, autonomous-robots, space-robotics, robotic-sensors
OmniCore EyeMotion lets robots adapt to complex environments in real time, says ABB - The Robot Report
ABB Robotics has launched OmniCore EyeMotion, a new software solution that enables OmniCore-powered robots to recognize and adapt to their surroundings in real time using any third-party camera or sensor. This advancement allows robots to perform complex 2D and 3D vision-based tasks without requiring specialized camera hardware. Designed for ease of use with a simple drag-and-drop web interface, EyeMotion integrates fully with ABB’s RobotStudio programming tool, significantly reducing commissioning time by up to 90%. The system supports a wide range of applications across industries such as manufacturing, logistics, packaging, and food and beverage, handling tasks like item sorting and quality inspection. In more complex scenarios, OmniCore EyeMotion can be combined with ABB’s Automatic Path Planning Online to enable collision-free navigation around obstacles, potentially reducing cycle times by up to 50%. This innovation is part of ABB’s broader strategy to advance “autonomous versatile robotics” (AVR), aiming for robots that autonomously plan and execute diverse tasks in real time
robotics, industrial-robots, AI-vision, autonomous-robots, OmniCore-EyeMotion, ABB-Robotics, machine-automation
ANYbotics earns strategic investment from Climate Investment - The Robot Report
ANYbotics AG, a Zurich-based company specializing in quadruped robots for autonomous industrial inspections, has received a strategic investment from Climate Investment (CI), increasing its total funding to over $150 million. The company’s flagship robot, ANYmal, is designed to operate safely in hazardous, explosive, and all-weather conditions, providing early detection of asset degradation, such as equipment overheating, abnormal vibrations, and gas leaks. ANYmal D is already conducting thousands of inspections weekly, autonomously navigating complex industrial sites with AI-powered capabilities including collision avoidance and stair climbing. The investment will support the upcoming market launch of ANYmal X in 2026, an Ex-certified legged robot tailored for explosive environments, enabling continuous and safe inspections in such zones. ANYbotics also recently introduced enhanced gas-leak and presence-detection features, integrating gas detectors and acoustic imaging to precisely locate leaks and measure ambient gas concentrations. The funding will further aid ANYbotics’ global expansion and strengthen collaborations with CI’s extensive network
robotics, industrial-automation, autonomous-robots, energy-sector, emissions-reduction, AI-navigation, hazardous-environment-robots
Self-supervised learning for soccer ball detection and beyond: interview with winners of the RoboCup 2025 best paper award - Robohub
The article highlights the award-winning research on autonomous soccer ball detection by the SPQR team, who received the best paper award at RoboCup 2025 held in Salvador, Brazil. The team addressed a key challenge in robotic soccer: accurate ball detection under varying conditions. Traditional deep learning approaches require large labeled datasets, which are difficult and labor-intensive to produce for highly specific tasks like RoboCup. To overcome this, the researchers developed a self-supervised learning framework that reduces the need for manual labeling by leveraging pretext tasks that exploit the structure of unlabeled image data. Their method also incorporates external guidance from a pretrained object detection model (YOLO) to refine predictions from a general bounding box to a more precise circular detection around the ball. Deployed at RoboCup 2025, the new model demonstrated significant improvements over their 2024 benchmark, notably requiring less training data and exhibiting greater robustness to different lighting and environmental conditions. This adaptability is crucial given the variability of competition venues. The SPQR team
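A simple way to picture the box-to-circle refinement is to seed a circle estimate from the detector's bounding box and tighten it from there. The Python sketch below uses placeholder coordinates rather than real YOLO output and omits the self-supervised training itself.

```python
# Hypothetical illustration of converting a detector's bounding box into a
# circle estimate for the ball. The bbox values are placeholders, not YOLO output.

def box_to_circle(x1, y1, x2, y2):
    """Convert an axis-aligned box to a (cx, cy, r) circle estimate."""
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    r = min(x2 - x1, y2 - y1) / 2.0        # a ball is bounded by the shorter side
    return cx, cy, r

# Placeholder detection of the ball in image coordinates:
print(box_to_circle(120, 80, 160, 124))     # -> (140.0, 102.0, 20.0)
```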
robot, autonomous-robots, self-supervised-learning, deep-learning, RoboCup, soccer-robots, computer-vision
UK startup launches fastest-developed humanoid robot for logistics
UK-based robotics and AI startup Humanoid has unveiled the HMND 01 Alpha, the fastest-developed humanoid robot prototype designed for industrial logistics, retail, and manufacturing tasks. Developed in just seven months by a team of around 170 experts, the nearly 87-inch tall wheeled robot integrates multiple AI workflows powered by NVIDIA’s Jetson Thor platform, enabling autonomous operation, real-time adaptation to environments, and simultaneous running of large-scale generative AI models. The robot can move at speeds up to 4.4 mph, carry payloads of 33 lbs or more, and operate in confined spaces such as factory floors and store aisles. The HMND 01 Alpha features advanced perception capabilities with 360-degree RGB cameras and depth sensors, 29 active degrees of freedom (excluding end-effectors), and interchangeable end-effectors including a dexterous five-fingered hand or a simpler parallel gripper. It can reach items from floor level up to two meters high and handle shelf depths
robotics, humanoid-robot, AI-robotics, industrial-automation, logistics-technology, autonomous-robots, NVIDIA-Jetson-Thor
Swisslog Healthcare, Diligent Robotics to bring last-mile delivery to hospitals - The Robot Report
Swisslog Healthcare has formed a strategic alliance with Diligent Robotics to introduce autonomous last-mile delivery robots, specifically the Moxi robot, into hospitals across the U.S. This partnership aims to enhance hospital logistics by integrating Swisslog’s transport and pharmacy automation systems with Diligent’s autonomous mobile manipulation robots (AMMRs). The collaboration is designed to streamline workflows, automate routine tasks, and improve patient care by enabling faster, more accurate delivery of medications and other critical items, including high-risk drugs like pediatric chemotherapy and narcotics. The alliance also seeks to complement existing pneumatic tube systems with robotic solutions capable of handling deliveries that are too large or sensitive for tubes. The service robotics market in healthcare is projected to grow significantly, and this partnership addresses the increasing demand for efficient, end-to-end hospital logistics solutions. Swisslog and Diligent Robotics emphasize that their combined technology will improve tracking and traceability of transported goods, reduce operational waste, and allow healthcare staff to focus more on patient care. Dilig
robotics, healthcare-automation, autonomous-robots, hospital-logistics, last-mile-delivery, medical-robots, transport-automation
From teleoperation to autonomy: Inside Boston Dynamics' Atlas training
In Episode 212 of The Robot Report Podcast, Boston Dynamics’ VP of robotics research, Scott Kuindersma, discussed the development of large behavior models (LBMs) for the Atlas humanoid robot. The team collected 20 hours of teleoperation data to train these LBMs, which enable Atlas to generalize manipulation tasks such as bi-manual operations, including picking and placing parts for the Spot quadruped robot. The development process involved data collection, annotation, model training, and evaluation, with a strong emphasis on combining simulation data and human demonstration data. Boston Dynamics plans to further test Atlas in Hyundai facilities and leverage AI-driven advancements to improve humanoid manipulation and dynamic behaviors. The episode also covered recent robotics industry news, including Serve Robotics’ acquisition of Voysys’ assets to enhance its autonomous delivery fleet with low-latency video streaming for remote monitoring and teleoperation. Zoox, an Amazon subsidiary, launched a free robotaxi service on the Las Vegas Strip, with plans to expand testing
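In spirit, training on teleoperation data is imitation learning. The minimal behavior-cloning sketch below fits a linear policy to synthetic (observation, action) pairs; it is a generic illustration of the idea, not Boston Dynamics' large-behavior-model pipeline.

```python
import numpy as np

# Minimal behavior-cloning sketch: fit a linear policy to (observation, action)
# pairs standing in for teleoperated demonstrations. Shapes and data are synthetic.

rng = np.random.default_rng(0)
obs = rng.normal(size=(500, 12))            # placeholder proprioception + vision features
true_policy = rng.normal(size=(12, 4))
acts = obs @ true_policy                    # placeholder teleoperated joint targets

W = np.zeros((12, 4))
for _ in range(200):                        # plain gradient descent on squared error
    grad = obs.T @ (obs @ W - acts) / len(obs)
    W -= 0.1 * grad

print("mean action error:", np.abs(obs @ W - acts).mean())
```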
robotics, Boston-Dynamics, Atlas-robot, teleoperation, autonomous-robots, AI-in-robotics, robot-manipulation
'World’s cutest' humanoid carries out chores with warmth, care
The Fourier GR-3 humanoid robot, developed by Chinese firm Fourier Robotics, is designed to support meaningful human interaction by combining emotional intelligence with practical functionality. Unlike traditional robots, the GR-3 can express empathy and kindness, making it feel more like a companion than a machine. It demonstrates capabilities such as eidetic memory to assist an art curator, multilingual communication to guide museum visitors, and home assistance by managing daily schedules. The robot also exhibits advanced visual recognition and human-like locomotion, responding naturally to gestures like waving. Weighing 71 kg and standing 165 cm tall, the GR-3 features 55 degrees of freedom for balanced, fluid movement and an animated facial interface that enhances its lifelike presence. Its emotional intelligence is powered by Fourier’s Full-Perception Multimodal Interaction System, integrating sight, sound, and touch, with 31 pressure sensors enabling responsive actions such as blinking and eye tracking. The robot supports continuous operation with a swappable battery and adaptable movement modes
robot, humanoid-robot, emotional-intelligence, human-robot-interaction, robotics-technology, autonomous-robots, smart-robotics
Is Tesla's Robot Manifesto Simply An Investment Hail Mary? - CleanTechnica
The article from CleanTechnica examines Tesla's current strategic direction in light of its recently published fourth "Master Plan," questioning whether Elon Musk's focus on robotics represents a desperate investment move amid slowing vehicle sales growth. The author, with 13 years of experience covering Tesla and Musk, notes a significant shift from Tesla's previous rapid and near-continuous sales growth to a plateau and eventual decline by mid-2025. Despite Tesla's continued success and large sales volumes, the company appears to be struggling to attract buyers at the same pace, as evidenced by increased consumer incentives and marketing changes—signs that contrast sharply with Tesla's earlier growth trajectory. This slowdown poses a fundamental challenge because Tesla's stock valuation is heavily predicated on hypergrowth and disruptive market dominance. While traditional automakers like Ford and GM have shown steady growth, Tesla's faltering sales and profits raise questions about whether its market capitalization remains justified. The article implies that Musk's pivot toward robotics, as outlined in the new Master Plan
robot, Tesla, Elon-Musk, autonomous-robots, robotics-investment, robot-manifesto, technology-innovation
Top 10 robotics developments of August 2025 - The Robot Report
In August 2025, The Robot Report highlighted significant developments in the robotics industry, emphasizing both business dynamics and technological advancements. Robotics investments surged to over $4.35 billion in July 2025, driven primarily by the U.S. and China through 93 funding rounds. Key funding news included FORT Robotics securing an additional $18.9 million to enhance robotic safety and OpenMind raising $20 million to advance its OM1 operating system aimed at connecting intelligent machines globally. Teradyne Robotics reported $75 million in Q2 revenue, reflecting a 9% increase from the previous quarter despite a 17% year-over-year decline. On the innovation front, Boston Dynamics and TRI are leveraging large behavior models to train the Atlas humanoid robot for versatile task competence, including object manipulation and dynamic balance. University of Waterloo researchers are pioneering tiny robots designed to dissolve kidney stones, potentially transforming treatment for a condition affecting 12% of people. Unitree Robotics introduced the A2 quadruped robot
robotics, humanoid-robots, robot-investments, robotic-safety, medical-robots, AI-in-robotics, autonomous-robots
Researchers are teaching robots to walk on Mars from the sand of New Mexico - Robohub
Researchers are advancing the development of dog-like quadruped robots to perform scientific tasks on Mars by conducting field experiments at White Sands National Park in New Mexico, a Mars analog environment. These tests, part of the NASA-funded LASSIE Project (Legged Autonomous Surface Science in Analog Environments), involve a multidisciplinary team from several universities and NASA centers. The project aims to prepare legged robots for future crewed missions to the Moon and Mars, building on prior work with similar robots in lunar-like terrains such as Mount Hood, Oregon. The quadruped robots gather data from their foot interactions with the surface, enabling them to sense terrain stability and adapt their movements accordingly. During recent trials at White Sands, despite challenging high temperatures, the team achieved significant progress, including the robot autonomously making decisions for the first time. This autonomy is crucial for enabling simultaneous independent actions by astronauts and robots on Mars, thereby enhancing scientific productivity. The researchers also tested new locomotion strategies tailored to different surface conditions, which
robotics, quadruped-robots, Mars-exploration, autonomous-robots, NASA, lunar-exploration, robotic-field-testing
The startup journey, from prototype to production
In Episode 209 of The Robot Report Podcast, hosts Steve Crowe and Mike Oitzman interview Bren Pierce, CEO and founder of Kinisi Robotics, focusing on the challenges and strategies involved in deploying autonomous robots in warehouse environments. The discussion highlights the complexities of navigating logistics, integrating robots with existing systems, and how industry leaders are innovating to enhance efficiency and automation in warehouse operations. This episode sheds light on the transformative impact robotics can have on supply chain and warehouse management. The episode also covers recent industry news, including Boston Dynamics and Toyota Research Institute’s collaboration on developing large behavior models (LBMs) for the Atlas humanoid robot to enable it to perform complex, long-horizon manipulation tasks. Additionally, FieldAI announced a $405 million funding round to accelerate global growth and product development in locomotion and manipulation, leveraging their Field Foundation Models designed for embodied intelligence. The inaugural World Humanoid Robot Games in China showcased autonomous and manually controlled robots competing in various events, signaling growing interest and formalization
robotics, autonomous-robots, warehouse-automation, humanoid-robots, AI-in-robotics, Boston-Dynamics, Kinisi-Robotics
Video: Swiss robot dog plays perfect badminton match with a human
Researchers at Switzerland’s ETH Zurich have developed a quadruped robot dog named ANYmal, capable of playing badminton with a human at the skill level of a seven-year-old child. ANYmal, created by ANYbotics, uses a sophisticated control system equipped with two cameras to track and predict the shuttlecock’s trajectory. It swings a racket attached to a multi-axis arm to hit the shuttlecock precisely. The robot was trained using reinforcement learning in a virtual environment, where it practiced thousands of rallies to learn positioning, shot accuracy, and anticipatory movement, enabling it to perform with remarkable precision in real-world play. A key challenge addressed in the development was maintaining balance while lunging and moving quickly to return shots. ANYmal’s reinforcement learning algorithm enhances its coordination and stability, allowing it to move with agility and balance comparable to a human player. Originally designed for industrial inspection and navigating rough terrains, including disaster zones, ANYmal’s capabilities have now been extended to dynamic sports environments. Priced at around
robot, robotics, reinforcement-learning, quadruped-robot, robot-dog, autonomous-robots, robot-control-systems
RoboBall: Ball-like robot could easily map steep moon craters
The RoboBall project, originally conceptualized by NASA’s Dr. Robert Ambrose in 2003 and now being developed at Texas A&M University by graduate students Rishi Jangale and Derek Pravecek, aims to create a spherical robot capable of navigating terrains inaccessible to traditional rovers. Designed as a “robot in an airbag,” RoboBall’s unique spherical shape allows it to roll over steep, uneven, and abrupt terrain transitions without flipping over, making it ideal for exploring challenging environments such as the steep walls of lunar craters. Two versions are in development: the smaller RoboBall II, a 2-foot prototype for testing power and control, and the larger RoboBall III, a 6-foot model designed to carry scientific payloads like sensors and sampling tools. RoboBall’s versatility is being tested in real-world conditions, including field trials on the beaches of Galveston, Texas, where it demonstrates its ability to transition smoothly between water and land—something traditional wheeled or
robot, robotics, lunar-exploration, autonomous-robots, robotic-mapping, planetary-rovers, robotic-design
How to make robots predictable with a priority based architecture and a new legal model - The Robot Report
The article discusses the challenge of ensuring predictable and safe behavior in increasingly autonomous robots, such as Tesla's Optimus humanoid and Waymo's driverless cars. Traditional robotic control systems rely on predefined scripts or reactive responses to commands, which can lead to conflicting actions and hesitation in complex, dynamic environments. Such unpredictability poses significant safety risks, especially when robots receive simultaneous or contradictory commands or when technical faults occur. To address these issues, the author’s team developed a priority-based control architecture that moves beyond simple stimulus-response behavior. This system evaluates every event through mission and subject filters, considering environmental context and potential consequences before execution. The architecture features two interlinked hierarchies: a mission hierarchy that ranks goals from fundamental safety rules (e.g., “Do not harm a human”) to user-set and current tasks, and a hierarchy of interaction subjects that prioritizes commands based on their source, giving highest priority to owners or operators and lower priority to external parties. This approach aims to enable robots to act
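The two hierarchies can be sketched as a simple arbitration rule: an incoming command preempts the active one only if its mission level, and then its issuer's rank, outranks it. The Python sketch below uses illustrative priority tables and command names, not the authors' actual architecture.

```python
# Hypothetical sketch of priority-based command arbitration with a mission
# hierarchy and a subject (issuer) hierarchy. All values are illustrative.

MISSION_PRIORITY = {"safety": 0, "owner_task": 1, "current_task": 2}   # lower = more important
SUBJECT_PRIORITY = {"owner": 0, "operator": 1, "bystander": 2}

def arbitrate(active, incoming):
    """Return the command that should run, given the currently active one."""
    def rank(cmd):
        return (MISSION_PRIORITY[cmd["mission"]], SUBJECT_PRIORITY[cmd["subject"]])
    return incoming if rank(incoming) < rank(active) else active

active = {"mission": "current_task", "subject": "owner", "action": "deliver parts"}
incoming = {"mission": "safety", "subject": "bystander", "action": "stop near human"}
print(arbitrate(active, incoming)["action"])   # safety outranks the running task
```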
robotics, autonomous-robots, priority-based-control, Tesla-Optimus, robot-safety, humanoid-robots, autonomous-systems
Robot dog trains on White Sands dunes for future Mars exploration
Oregon State University engineers are training a dog-like quadruped robot on the shifting gypsum dunes of White Sands National Park, New Mexico, to simulate the unstable surfaces expected on the Moon and Mars. This work is part of NASA’s Moon to Mars program under the LASSIE Project (Legged Autonomous Surface Science in Analog Environments), which involves multiple universities and NASA’s Johnson Space Center. The goal is to develop autonomous legged robots capable of navigating and adapting to alien terrains without direct human commands, crucial for overcoming communication delays during extraterrestrial missions. During a five-day trial, the robot’s sensors collected data on surface texture and stability, enabling it to make independent decisions about movement and route selection using refined algorithms. This autonomy allows the robot to operate alongside astronauts, potentially accelerating exploration by scouting terrain, carrying instruments, or identifying scientific sites. The team also tested the robot in other analog environments, such as icy volcanic slopes on Mount Hood, Oregon, to simulate lunar polar conditions. These experiments demonstrate
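As a rough illustration of how foot-contact data could drive locomotion choices, the sketch below maps estimated sinkage and slip to a gait; the thresholds and gait names are hypothetical, not the LASSIE Project's algorithms.

```python
# Hypothetical sketch: pick a locomotion strategy from simple foot-contact
# statistics. Thresholds and gait names are illustrative placeholders.

def choose_gait(sinkage_cm, slip_ratio):
    """Pick a locomotion strategy from estimated foot sinkage and slip."""
    if sinkage_cm > 4.0 or slip_ratio > 0.3:
        return "slow crawl, wide stance"     # loose gypsum sand
    if sinkage_cm > 1.5:
        return "trot with shortened stride"  # moderately soft ground
    return "normal trot"                     # firm crust

print(choose_gait(sinkage_cm=5.2, slip_ratio=0.1))
print(choose_gait(sinkage_cm=0.8, slip_ratio=0.05))
```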
robot, robotics, autonomous-robots, Mars-exploration, space-technology, NASA, legged-robots
RoboCup@Work League: Interview with Christoph Steup - Robohub
The RoboCup@Work League is part of the Industrial League within the international RoboCup initiative, which aims to advance intelligent robotics, AI, and automation. The @Work League focuses on mimicking aspects of industrial production systems, particularly the concept of the "factory of the future," where autonomous robots build customized products efficiently on a small scale. Unlike traditional factories that mass-produce identical items on large conveyor belts, the @Work League emphasizes the production of individual pieces with automation. The robots used in the competition are compact, fitting within a one-meter cube, and operate entirely on the ground to simplify logistics and reduce costs. In the competition, robots must autonomously transport objects between various workstations with only a single restart allowed per team, highlighting the need for reliability and consistent performance. Beyond object transportation, teams face specialized tasks such as precision placement—fitting objects into cavities of matching shape and size—and handling objects on a rotating table that simulates a conveyor belt. This rotating table is a practical abstraction
robotics, automation, industrial-robots, RoboCup, factory-automation, autonomous-robots, AI-in-manufacturing
FieldAI raises $405M to scale 'physics first' foundation models for robots - The Robot Report
FieldAI, a Mission Viejo, California-based robotics company, has raised $405 million through two consecutive funding rounds to accelerate its global expansion and product development. The company plans to double its workforce by the end of the year as it advances its work in locomotion and manipulation for autonomous robots. FieldAI’s technology centers on its proprietary Field Foundation Models (FFMs), a novel class of AI models specifically designed for embodied intelligence in robotics. Unlike standard vision or language models adapted for robotics, FFMs are built from the ground up to handle uncertainty, risk, and physical constraints in dynamic, unstructured environments without relying on prior maps, GPS, or fixed paths. FieldAI’s FFMs enable robots to safely and reliably perform complex tasks in diverse real-world industrial settings such as construction, energy, manufacturing, urban delivery, and inspection. This approach allows robots to dynamically adapt to new and unexpected conditions without manual programming, marking a significant breakthrough in robotics AI. The company’s investors include prominent names such as
robotics, artificial-intelligence, autonomous-robots, Field-Foundation-Models, industrial-robots, robot-locomotion, robot-manipulation
Inaugural World Humanoid Robot Games step into the spotlight - The Robot Report
The inaugural World Humanoid Robot Games 2025 took place at Beijing’s National Speed Skating Oval, featuring 280 teams from 16 countries competing in 487 contests across 26 events. The competition included races, mixed martial arts, soccer, and a warehouse material sorting challenge that tested robots’ embodied AI, perception, and problem-solving skills. While many robots were teleoperated rather than fully autonomous, the event showcased both commercial and experimental humanoid robots, with Unitree Robotics’ H1 humanoid winning multiple foot races and setting a new world record in the 1,500 m event. Notably, an autonomous robot was awarded first place in the 100 m sprint after a time-coefficient advantage for autonomy was applied. A key outcome of the event was the creation of the World Humanoid Robot Sports Federation, which will govern future humanoid robot competitions. The games highlighted the current state of humanoid robotics, balancing teleoperation and autonomy, and emphasized real-world applications such as logistics and
robot, humanoid-robots, robotics-competition, autonomous-robots, teleoperated-robots, AI-in-robotics, robot-sports-federation
Unitree dominates inaugural humanoid robot games with four golds
At the inaugural World Humanoid Robot Games held in Beijing, Unitree Robotics emerged as the dominant force, securing four gold medals in key track events including the 400m dash, 1,500m race, 100m hurdles, and the 4×100m relay. The Hangzhou-based company’s H1 humanoid robots showcased superior mechanical design powered by their proprietary M107 joint motor, enabling longer strides and stronger kicks. Unitree topped the overall medal table with 11 medals, highlighting its leadership in humanoid robot performance. Independent teams using Unitree’s G1 platform also earned multiple medals, demonstrating the versatility of its hardware. Other Chinese teams also performed strongly, with X-Humanoid (Beijing Humanoid Robot Innovation Centre) winning 10 medals, including golds in the 100m sprint and a materials handling contest. Their Tien Kung robot, notable for running autonomously without remote control, recently won a half-marathon against human runners and is being developed as
Tags: robotics, humanoid-robots, Unitree-Robotics, robot-competitions, AI-in-robotics, robotic-motors, autonomous-robots

Mosquito-killing robot dogs to fight Chikungunya virus in Hong Kong
Hong Kong authorities are set to deploy robot dogs equipped with insecticide sprayers starting next month to combat the rising cases of the mosquito-borne Chikungunya virus. This initiative comes after nine imported cases were recorded locally and a significant outbreak in nearby Guangdong province. The robot dogs, capable of navigating difficult terrains like hillsides, aim to spray insecticides in hard-to-reach areas, thereby reducing the workload on frontline workers, especially during hot weather. If the trial is successful, the government plans to expand the use of these robotic dogs and continue researching innovative mosquito-control methods. These robotic dogs, developed by companies such as Boston Dynamics, integrate AI, cameras, and sensors to detect standing water and map mosquito breeding sites. They can analyze environmental data to predict high-risk areas, enabling targeted insecticide use that minimizes environmental impact. Additionally, Hong Kong is exploring other mosquito control strategies, including a WHO-recommended method involving bacteria introduced into mosquitoes to reduce their reproduction and virus transmission, with trials expected next
Tags: robot, robotics, AI, mosquito-control, public-health-technology, smart-sensors, autonomous-robots

Robots explore lunar caves using advanced autonomous descent system
Scientists have successfully tested autonomous robots exploring lava tubes in a volcanic cave on Lanzarote, chosen for its similarity to underground structures on Mars and the moon. These natural lava tubes, formed by flowing lava that leaves hollow tunnels, are considered promising sites for future extraterrestrial exploration because they could shield astronauts from extreme temperatures, radiation, and meteorite impacts, as well as potentially harbor microbial life. The 21-day field trials involved two rovers collaboratively mapping the cave entrance, deploying a sensor-laden cube to create a 3D model, and performing a coordinated descent into the cave, with the smaller rover detaching to travel 235 meters while building a 3D map of the tunnel. The experiments demonstrated the feasibility of robotic cooperation and 3D mapping in dark, confined environments, though challenges remain. Moisture affected ground-penetrating radar accuracy, some sensors experienced interference, and autonomous navigation without human intervention still requires more advanced algorithms and reliable inter-robot communication. Despite these hurdles, the
Tags: robots, autonomous-robots, lunar-exploration, cave-mapping, space-robotics, autonomous-navigation, extraterrestrial-exploration

Machines compete in martial arts at World Humanoid Robot Games
The World Humanoid Robot Games, held in Beijing at the National Speed Skating Oval, brought together over 500 humanoid robots from 280 teams across 16 countries to compete in a diverse range of events over three days. The competition featured 487 contests spanning 26 categories, including traditional sports like soccer and boxing, scenario-based challenges such as hospital medicine sorting and hotel cleaning, as well as fashion showcases and artistic performances. The opening ceremony highlighted human-robot collaboration through a blend of robotics and live performances, including robots executing complex martial arts movements and participating in a fashion runway that merged Chinese cultural heritage with robotics. The event emphasized both autonomous and teleoperated robot capabilities, with teams leveraging AI, visual recognition, and 5G networks to demonstrate advanced decision-making and adaptability in real-world tasks. Leading Chinese companies and top universities, alongside international teams from countries like the U.S., Germany, and Japan, showcased their humanoid robots, aiming to illustrate how these machines can integrate into human life
Tags: robots, humanoid-robots, AI-robotics, robot-competitions, autonomous-robots, teleoperation, robot-applications

Robot Team To Tunnel Deep Into Mars
The article discusses a team of three specialized robots engineered to explore and navigate lava tubes, both on Earth and in extraterrestrial environments such as Mars. These robots are designed with complementary capabilities that enable them to work together effectively in challenging subterranean conditions. Their combined skills allow them to survey, enter, and traverse lava tubes, which are considered promising sites for scientific exploration due to their potential to harbor signs of past or present life and to offer protection from harsh surface conditions. The key takeaway is that this robotic team represents a significant advancement in planetary exploration technology, particularly for missions targeting subsurface environments on Mars. By leveraging their unique abilities, these robots can perform detailed mapping and analysis of lava tubes, which could provide critical insights into Mars' geology and habitability. The article highlights the importance of such robotic systems in expanding our understanding of other planets while overcoming the limitations faced by human explorers in extreme environments.
Tags: robot, Mars-exploration, planetary-robotics, autonomous-robots, space-robotics, lava-tube-exploration, robotic-surveyors

August 2025 issue: Motion control enables robots from the ISS to the AGT stage - The Robot Report
The August 2025 issue of The Robot Report highlights the critical role of motion control technologies in advancing robotics applications both in space and on Earth. A key feature explores PickNik Inc.’s collaboration with the Japan Aerospace Exploration Agency (JAXA) to develop a multi-arm robotic system designed for complex manipulation tasks in microgravity. This innovation aims to enhance cargo handling capabilities aboard the International Space Station (ISS) and support future crewed and uncrewed space missions. PickNik’s MoveIt Pro software, integral to this project, also finds applications in terrestrial governmental and commercial robotics. Additionally, the issue covers Boston Dynamics’ efforts to showcase its Spot quadruped robot on NBC’s America’s Got Talent (AGT). The performance combined teleoperated and autonomous control with precise choreography, demonstrating both the technical prowess of the engineering team and the expanding commercial and industrial potential of robotics. The company also turned an on-air malfunction into a memorable moment, highlighting the human side of robotic innovation. The issue
Tags: robot, motion-control, robotics, space-robotics, Boston-Dynamics, autonomous-robots, robotic-manipulation

China unveils antelope robot to study endangered Tibetan species
China has introduced a lifelike robotic Tibetan antelope in the Hoh Xil National Nature Reserve, located over 4,600 meters above sea level in Qinghai Province, to study the endangered species in its natural habitat. Developed collaboratively by Xinhua News Agency, the Chinese Academy of Sciences, and DEEP Robotics, this bionic antelope is equipped with 5G ultra-low latency networks and advanced AI algorithms. Its realistic appearance allows it to blend into herds, enabling researchers to collect precise, real-time ecological data without disturbing the animals. This marks a significant advancement in wildlife research within one of the world’s most extreme environments. Designed to withstand Hoh Xil’s harsh conditions—characterized by high altitude, strong winds, and cold temperatures—the robot can navigate rugged terrain and operate up to 2 kilometers from its control point. It records videos to analyze herd size, migration patterns, and movement speed, which also aids in preventing road collisions by alerting protection stations to manage traffic.
Tags: robotics, artificial-intelligence, 5G-technology, wildlife-conservation, autonomous-robots, ecological-monitoring, Tibetan-antelope

Unitree launches A2 quadruped equipped with front and rear lidar - The Robot Report
Unitree Robotics has launched its latest quadruped robot, the Unitree A2, designed for industrial applications such as inspection, logistics, and research. The A2 features significant upgrades in perception, including dual industrial lidar sensors positioned at the front and rear, an HD camera, and a front light to improve environmental detection and eliminate blind spots. Weighing 37 kg unloaded, the A2 can carry a 25 kg payload while walking continuously for three hours or about 12.5 km, supported by hot-swappable dual batteries for extended missions. This model balances endurance, strength, speed, and perception, marking it as one of Unitree’s most advanced quadrupeds to date. Key specifications of the A2 include a top speed of 5 m/s, an unloaded range of 20 km, a maximum standing load of 100 kg, and the ability to climb steps up to 1 meter high. Compared to Unitree’s previous B2 model, the A2 is
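As a quick sanity check on the figures quoted above, the short sketch below (plain Python, with the quoted numbers hard-coded as assumptions rather than official Unitree specifications) works out the implied average loaded walking speed and compares it with the quoted top speed.

```python
# Back-of-the-envelope check of the A2 figures quoted above (assumed, not official data):
# 12.5 km of continuous walking in 3 hours with a 25 kg payload implies an average pace
# well below the quoted 5 m/s top speed.

loaded_distance_km = 12.5      # continuous walking range with 25 kg payload (as quoted)
loaded_duration_h = 3.0        # continuous walking time (as quoted)
top_speed_ms = 5.0             # quoted top speed
unloaded_range_km = 20.0       # quoted unloaded range

avg_loaded_speed_kmh = loaded_distance_km / loaded_duration_h
print(f"Average loaded walking speed: {avg_loaded_speed_kmh:.1f} km/h "
      f"({avg_loaded_speed_kmh / 3.6:.2f} m/s)")
print(f"Quoted top speed: {top_speed_ms * 3.6:.0f} km/h")
print(f"Unloaded range exceeds loaded range by {unloaded_range_km - loaded_distance_km:.1f} km")
```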
Tags: robot, quadruped-robot, lidar, autonomous-robots, robotics, AI-vision, battery-technology

Unitree Releases World's Fastest Quadruped Robot
The article announces Unitree's latest innovation in robotics, the Unitree A2 Stellar Explorer, which is touted as the world's fastest quadruped robot. Following the success of its predecessor, the Unitree R1, the A2 Stellar Explorer represents a significant advancement in speed and agility for four-legged robots. Although specific performance metrics and technical details are not provided in the excerpt, the emphasis is on the robot's enhanced capabilities and potential applications. Unitree continues to push the boundaries of robotic design, focusing on creating agile, dog-like robots that can navigate diverse environments quickly and efficiently. The A2 Stellar Explorer is positioned as a cutting-edge development in this field, likely aimed at industries requiring rapid and versatile robotic mobility. Further details on its features, use cases, and technological innovations would provide a clearer picture of its impact and significance.
Tags: robot, quadruped-robot, Unitree, robotics-technology, autonomous-robots, robot-innovation, robotic-exploration

30 humanoid robot teams to play soccer tournament in Beijing
China is preparing to host the World Humanoid Robot Games in Beijing from August 15 to 17, featuring athletic competitions among humanoid robots, with soccer as a highlight event. The tournament will include 30 teams from around the world, including China, the United States, Brazil, Germany, and Portugal, competing in fully autonomous five-a-side soccer matches. The robots are equipped with visual sensors to locate the ball and navigate the field, and they can recover from falls to continue playing. This event follows the RoboLeague held in June, which served as a precursor and showcased humanoid robots playing soccer for the first time. Teams are using advanced programming techniques such as imitation learning, where robots observe human movements and undergo extensive simulation training to master skills like dribbling, kicking, and shooting. Participants, including engineers from Tsinghua University, expressed excitement about competing on a global stage and demonstrating new algorithms. While some may view the competition as a novelty, experts see it as a valuable platform for
Tags: robot, humanoid-robots, robotics-competition, AI-robotics, autonomous-robots, robot-soccer, robot-learning-algorithms

China’s humanoid robot stuns by opening car door in a 'world-first'
AiMOGA Robotics has achieved a significant breakthrough with its humanoid robot, Mornine, which autonomously opened a car door inside a functioning Chery dealership in China—marking a world-first in embodied AI. Unlike scripted or teleoperated robots, Mornine used only onboard sensors, full-body motion control, and reinforcement learning to identify the door handle, adjust its posture, and apply coordinated force to open the door without any human input. This task, performed in a live commercial setting, demonstrates advanced autonomy and a shift from simulation-based robotics to real-world service applications. Mornine’s sophisticated sensor suite includes 3D LiDAR, depth and wide-angle cameras, and a visual-language model, enabling real-time perception and continuous learning through a cloud-based training loop. The robot was not explicitly programmed to recognize door handles but learned through millions of simulated cycles, with the learned model transferred to real-world operation via Sim2Real methods. Currently deployed in multiple Chery 4S dealerships
Tags: robotics, humanoid-robot, autonomous-robots, AI-robotics, service-robots, reinforcement-learning, sensor-technology

China: 'World’s first' robot dog–patrolled wind farm runs human-free
China has launched what is believed to be the world’s first fully autonomous wind farm, the 70-megawatt Ningxia Tongli Third Wind Farm, which has operated without any onsite human workers since September 2024. The facility, built by China Three Gorges and equipped with Goldwind turbines, uses an integrated system of four-legged inspection robots called X30 “robot dogs” from DEEP Robotics, drones, and over 5,000 sensors to monitor turbine conditions in real time. These robots can operate in extreme temperatures (–20 °C to 55 °C), climb stairs, and navigate in darkness, enabling continuous inspection and fault detection without human intervention. Data collected is streamed to a remote control center, though the robots can function autonomously if communication is lost. This innovation follows China’s broader push toward fully automated “dark” factories and infrastructure, aiming to reduce maintenance costs and improve safety in challenging environments. DEEP Robotics showcased the X30’s capabilities at the 202
Tags: robot, IoT, energy, renewable-energy, wind-farm, autonomous-robots, smart-sensors

#RoboCup2025: social media round-up part 2 - Robohub
RoboCup2025 was held from July 15 to 21 in Salvador, Brazil, attracting around 3,000 participants competing across various robotics leagues. The event featured intense competition culminating in final rounds during the last days. Notably, in the #RoboCup2025 @Home Open Platform League (OPL) Final, the NimbRo team’s robot demonstrated impressive capabilities such as opening doors, removing trash, and closing a cabinet door, ultimately securing second place behind Korea’s team Tidyboy. Social media updates highlighted the tense atmosphere as top robots advanced to the finals, with teams overcoming challenges such as equipment damage during transport. Collaborative efforts among teams like RoboCanes (University of Miami), PUMAS (UNAM), @_erasers, and TIDbots enabled them to reach the finals in the @Home DSPL league. Additionally, the event included discussions on the future of RoboCup, reflecting the community’s engagement with advancing robotics and AI technologies. Overall, Robo
Tags: robotics, RoboCup, AI, autonomous-robots, robot-competitions, service-robots, robotics-event

HomeBase USA enhances inventory operations with Simbe Robotics - The Robot Report
HomeBase USA has implemented Simbe Robotics’ Store Intelligence platform, featuring the autonomous Tally shelf-scanning robot, at its stores in Copperas Cove, Texas, and Laramie, Wyoming, the latter marking Tally’s first deployment in Wyoming. HomeBase stores carry tens of thousands of SKUs across diverse categories such as lumber, hardware, and farm supplies, making inventory management complex. Tally automates manual inventory tasks by scanning shelves multiple times daily to detect out-of-stock items, pricing errors, and misplaced products. This real-time data is accessible via a mobile app and dashboard, enabling store teams to prioritize and address issues promptly, thereby improving shelf availability, labor efficiency, and customer experience. The introduction of Tally aims to reduce the 30 hours per week that store associates typically spend on manual inventory duties, which are often error-prone and a major factor in employee attrition. By automating these tasks, HomeBase hopes to free associates to focus on higher-value activities and enhance operational efficiency.
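For readers curious how a shelf-scanning pass turns into actionable tasks, here is a minimal, hypothetical sketch of the reconciliation step: scan observations are compared against an expected planogram to flag out-of-stocks, pricing errors, and misplaced items. The data model, field names, and thresholds are invented for illustration and do not reflect Simbe's actual platform.

```python
from dataclasses import dataclass

@dataclass
class ShelfObservation:
    sku: str
    facings_seen: int
    price_seen: float
    location: str

# Hypothetical planogram: expected SKU -> (min facings, correct price, home location).
PLANOGRAM = {
    "HDW-0417": (4, 12.99, "A3"),
    "LUM-2x4-8": (10, 5.49, "B1"),
}

def flag_issues(observations):
    """Compare one scan pass against the planogram and emit issue records."""
    issues = []
    seen = {o.sku: o for o in observations}
    for sku, (min_facings, price, home) in PLANOGRAM.items():
        obs = seen.get(sku)
        if obs is None or obs.facings_seen == 0:
            issues.append((sku, "out_of_stock"))
            continue
        if abs(obs.price_seen - price) > 0.005:
            issues.append((sku, f"price_error: tag shows {obs.price_seen}, expected {price}"))
        if obs.location != home:
            issues.append((sku, f"misplaced: found at {obs.location}, belongs at {home}"))
        if obs.facings_seen < min_facings:
            issues.append((sku, "low_stock"))
    return issues

# One scan pass: the hardware item is underpriced and low on facings, the lumber SKU is missing.
print(flag_issues([ShelfObservation("HDW-0417", 2, 11.99, "A3")]))
```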
Tags: robotics, autonomous-robots, inventory-management, retail-automation, Simbe-Robotics, Tally-robot, store-intelligence

How TRIC Robotics is reducing pesticide use on strawberries using UV light
TRIC Robotics, a startup based in San Luis Obispo, California, is addressing the heavy pesticide reliance in strawberry farming by deploying autonomous robots equipped with UV-C light technology to reduce chemical use. These tractor-sized robots can treat up to 100 acres overnight, using UV-C light to kill bacteria and pests, and vacuums to remove bug residue without damaging crops. Rather than selling the robots directly, TRIC offers them as a service, aligning with farmers’ existing pest control payment models. This approach was developed through close collaboration with farmers to ensure practical adoption. The company was founded by Adam Stager, who pivoted from developing 3D-printed robots for SWAT teams to agriculture in 2020, seeking to make a meaningful impact. Through a USDA program connecting innovators with uncommercialized technology, Stager discovered the UV light application that became central to TRIC’s solution. Starting with small-scale trials on farmers’ land in 2021, the company has since expanded to work
Tags: robotics, agriculture-technology, UV-C-light, pest-control, autonomous-robots, sustainable-farming, pesticide-reduction

TRIC Robotics raises seed funding to help farmers control pests and plant disease - The Robot Report
TRIC Robotics, a company specializing in autonomous pest and plant disease control, has raised $5.5 million in seed funding to scale its robotic solutions for specialty crop farming, beginning with strawberries—a crop known for high labor demands and heavy pesticide use. Their flagship robot, Luna, operates at tractor scale and uses ultraviolet light to destroy pests and pathogens, alongside vacuum technology to remove insects, all without chemicals. This approach aims to reduce pesticide use significantly, with pilot programs reporting up to a 70% reduction, while helping farmers meet sustainability goals and manage labor costs. The company offers its technology as a robotics-as-a-service (RaaS) model, which includes a data-driven platform featuring vision systems and real-time field analytics to improve farm profitability and produce chemical-free crops. TRIC Robotics has already deployed nine robots, doubling its fleet in the past year, and plans to expand operations into additional California farming regions such as Oxnard and Watsonville. The new funding, led by Version One Ventures and
Tags: robot, agriculture-robotics, autonomous-robots, pest-control, sustainable-farming, robotics-as-a-service, precision-agriculture

#RoboCup2025: social media round-up 1 - Robohub
RoboCup2025 was held in Salvador, Brazil, attracting approximately 3,000 participants competing across multiple leagues. The event showcased a wide range of robotics competitions, highlighting advancements in AI and robotics technologies. During the initial days, teams engaged in various challenges, demonstrating innovative solutions and pushing the boundaries of autonomous systems. The coverage by Robohub and AIhub emphasized the event's role in fostering collaboration and knowledge exchange within the AI community. As a non-profit organization, AIhub aims to bridge the gap between AI experts and the public by delivering accessible, high-quality information. The RoboCup2025 event continues to be a significant platform for showcasing cutting-edge research and developments in robotics and artificial intelligence.
Tags: robot, RoboCup, robotics-competition, AI, autonomous-robots, robot-leagues, Salvador-Brazil

'Robot metabolism' concept could help them grow by consuming each other
Researchers at Columbia University have introduced the concept of "robot metabolism," a novel approach enabling robots to physically grow, heal, and improve themselves by absorbing and reusing parts from their environment or other robots. This idea draws inspiration from biological systems, where organisms adapt and sustain themselves through modular components like amino acids. The research team demonstrated this concept using "Truss Links," magnetic building blocks that can self-assemble into complex structures and integrate new parts to enhance functionality, such as a tetrahedron robot that developed a walking stick to increase its speed downhill by 66.5%. This advancement aims to overcome the current limitations of rigid robotic bodies that depend heavily on human intervention for repairs and upgrades. By mimicking nature’s modular and adaptive processes, robot metabolism could lead to truly autonomous machines capable of self-maintenance and physical evolution. Potential applications include disaster recovery, where robots could self-repair in unpredictable environments, and space exploration, where robots might build and adapt without human resupply. Published in Science Advances,
Tags: robot, self-healing-robots, modular-robotics, autonomous-robots, robot-metabolism, adaptive-robots, robot-self-repair

Unveiling the Tree of Robots: A new taxonomy for understanding robotic diversity - The Robot Report
Researchers at the Munich Institute of Robotics and Machine Intelligence (MIRMI) at the Technical University of Munich (TUM) have developed the “Tree of Robots,” a novel taxonomy and evaluation scheme designed to measure and compare the sensitivity of autonomous robots. Sensitivity, which is critical for safe and flexible human-robot interaction, previously lacked a standardized assessment method. This new framework enables the categorization of various robotic systems—including industrial robots, cobots, soft robots, and tactile robots—based on 25 specific measurements related to physical contact sensitivity, such as force alignment and safety in human interaction. The resulting spider diagrams provide an accessible visual summary of a robot’s sensitivity performance, facilitating better understanding and comparison even for non-experts. The Tree of Robots draws inspiration from Darwin’s Tree of Life, illustrating the diversity and specialization of robotic “species” according to their design and operational environments. By analyzing single-armed robots from different manufacturers, the researchers identified distinct capabilities related to sensors, motors, and control
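The spider diagrams mentioned above are standard radar charts over the sensitivity measurements. A minimal sketch of how one might be rendered is shown below; the six metric names and scores are placeholders, not MIRMI's actual 25 measurements.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder sensitivity scores on a 0-1 scale; the real scheme uses 25 measurements.
metrics = ["force alignment", "contact detection", "collision safety",
           "force resolution", "surface compliance", "reaction time"]
scores = [0.8, 0.6, 0.9, 0.5, 0.4, 0.7]

# Close the polygon by repeating the first point.
angles = np.linspace(0, 2 * np.pi, len(metrics), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, values, linewidth=2)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(metrics, fontsize=8)
ax.set_ylim(0, 1)
ax.set_title("Sensitivity profile (illustrative)")
plt.savefig("sensitivity_spider.png", dpi=150, bbox_inches="tight")
```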
Tags: robotics, robotic-manipulators, robot-sensitivity, human-robot-interaction, industrial-robots, autonomous-robots, robotic-taxonomy

China’s robot vacuum cleans factory floors with 200% suction power
China’s Pudu Robotics has introduced the MT1 Vac, an AI-powered robot vacuum designed for large commercial spaces such as airports, hotels, and metro stations. Unlike typical consumer models, the MT1 Vac combines sweeping, vacuuming, and dust mopping with a dual fan system that delivers up to 200% more suction power and HEPA-grade filtration. It features a large 20-liter trash capacity, smart navigation, and AI-driven surface recognition, enabling extended autonomous operation with minimal human intervention. The robot integrates with Pudu Link, an IoT-based management system that supports remote updates and adaptive AI routines, aiming for full autonomy in high-traffic environments. The MT1 Vac enters a competitive and rapidly evolving market of smart cleaning robots, where other players like LG and Roborock focus on hospitality and household sectors with different feature sets. While many current autonomous cleaners still require human oversight for maintenance and relocation, Pudu Robotics is pushing toward fully independent operation. The broader challenge in this space is
Tags: robot, autonomous-robots, AI-powered-vacuum, industrial-cleaning-robot, IoT-integration, smart-navigation, commercial-robotics

RoboCupRescue: an interview with Adam Jacoff - Robohub
The RoboCupRescue League, now in its 25th year, is a key component of the international RoboCup competition focused on advancing autonomous robotic technologies for emergency responders. Co-founded by Adam Jacoff, the league develops and validates robots designed to perform hazardous search and rescue tasks, such as navigating compromised or collapsed structures, thereby enabling safer operations from a distance. Unique among RoboCup leagues, RoboCupRescue emphasizes realistic, chaotic arenas and uses twenty standardized test methods—developed in collaboration with emergency responders—to simulate complex, real-world challenges. These tests progressively increase in difficulty from flat terrains in preliminaries to slippery, obstacle-laden environments in the finals, pushing both autonomous and remotely operated robots to adapt and perform effectively. The league serves three main purposes: guiding research with practical, reproducible challenges that reflect actual emergency scenarios; providing an intense educational experience that helps recruit and advance engineers and computer scientists into robotics careers; and bridging the gap between research and commercial deployment of robotic technologies. By focusing
Tags: robot, autonomous-robots, RoboCupRescue, search-and-rescue-robots, emergency-response-technology, robotics-competition, AI-in-robotics

An interview with Nicolai Ommer: the RoboCupSoccer Small Size League - Robohub
The article features an interview with Nicolai Ommer, an Executive Committee member of the RoboCup Small Size League (SSL), which is part of the international RoboCup initiative aimed at advancing intelligent robots, AI, and automation. The SSL involves teams of 11 small, cylindrical, wheeled robots that play soccer autonomously, with teams responsible for both hardware and software development. A central AI system processes data from an overhead vision system that tracks all robots and the ball, enabling teams to send commands to their robots. The robots can move up to 4 m/s and kick the ball at speeds up to 6.5 m/s, with recent rules reducing kick speed to enhance gameplay fairness and allow goalkeepers and defenders to intercept passes. A notable innovation in the SSL is the use of multiple independent auto referee systems to assist human referees in monitoring the fast-paced matches, particularly for fouls and collisions that are difficult to judge visually. These auto refs operate simultaneously and their decisions are combined via majority vote.
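A minimal sketch of that majority-vote combination might look like the following; the verdict strings and interface are assumptions for illustration, not the SSL's actual auto-referee protocol.

```python
from collections import Counter

def combine_auto_refs(verdicts):
    """Combine independent auto-referee verdicts for one incident by simple majority.

    `verdicts` is a list of strings such as "foul", "no_foul", "collision"; the call
    is only accepted when more than half of the referees agree, otherwise it is left
    to the human referee.
    """
    if not verdicts:
        return None
    call, votes = Counter(verdicts).most_common(1)[0]
    return call if votes > len(verdicts) / 2 else None

# Three auto refs watching the same incident:
print(combine_auto_refs(["collision", "collision", "no_foul"]))  # -> "collision"
print(combine_auto_refs(["foul", "no_foul"]))                    # -> None (tie, human decides)
```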
Tags: robot, robotics, RoboCup, AI, automation, autonomous-robots, robot-soccer

Harvard swarm robots curl and crawl like entangled living worms
Harvard researchers led by Justin Werfel at the John A. Paulson School of Engineering and Applied Sciences have developed a novel swarm robotic system inspired by the behavior of California blackworms (Lumbriculus variegatus). These freshwater worms naturally form tightly entangled blobs that enable them to regulate temperature, protect against predators, and move cohesively. Mimicking this, the team created flexible, worm-like robots about a foot long, made from synthetic polymers with pressurized internal air chambers that allow them to curl and physically entangle with one another. This entanglement not only provides cohesion but also serves as a potential channel for mechanical communication and coordination among the robots. The entangled robotic swarm can move collectively over land and water, achieving tasks beyond the capability of individual units. The researchers aim to harness these emergent group dynamics for practical applications such as disaster zone exploration, navigation of irregular terrains, and manipulation of large objects. While current robots are individually powered and tethered, future iterations are planned
Tags: robotics, swarm-robotics, soft-robots, biomimicry, autonomous-robots, robotic-materials, Harvard-SEAS

XTEND secures extension to Series B to scale autonomous tactical robots - The Robot Report
XTEND Reality Inc., a developer of tactical autonomous robots, announced a $30 million extension to its existing $70 million Series B funding round, co-led by Aliya Capital Partners and Protego Ventures. The company plans to use the new capital to scale production both in the U.S. and globally, integrate advanced real-time AI capabilities across its platforms, and expand deployments with U.S. and allied defense forces. XTEND’s CEO, Aviv Shapira, highlighted the growing demand for autonomous systems in defense and public safety, emphasizing that the investment reflects strong confidence in XTEND’s technology and mission. Originally founded as a gaming company, XTEND has evolved to create robots and autonomous systems that combine AI with human supervision to operate safely in complex, hazardous environments. Their patented XOS operating system enables “human-supervised autonomy,” allowing robots to perform complex tasks autonomously—such as building entry, floor scanning, and suspect pursuit—while leaving critical decision-making to human supervisors. This approach reduces the
Tags: robot, autonomous-robots, AI, defense-technology, tactical-robots, human-supervised-autonomy, robotics-systems

Zimmer Biomet to acquire Monogram Technologies for $177M - The Robot Report
Zimmer Biomet Holdings, a global medical technology company, announced its acquisition of Monogram Technologies, an orthopedic robotics firm, for $177 million. Monogram specializes in combining 3D printing, advanced machine vision, AI, and next-generation robotics, with a focus on semi- and fully autonomous robotic technologies for total knee arthroplasty (TKA). Their CT-based, AI-navigated mBôs system received FDA clearance in March 2025 and is expected to be commercialized with Zimmer Biomet implants by early 2027. Monogram is also developing a fully autonomous version of this technology, which aims to improve safety, efficiency, and surgical outcomes. The acquisition will integrate Monogram’s technology into Zimmer Biomet’s existing ROSA platform, which currently supports multiple orthopedic applications including knee and shoulder replacements. Zimmer Biomet expects this deal to enhance its surgical robotics portfolio by adding advanced semi- and fully autonomous capabilities, thereby broadening its product range and increasing market share, particularly in
Tags: robotics, surgical-robotics, AI, orthopedic-surgery, autonomous-robots, medical-technology, Zimmer-Biomet

New quadruped robot climbs vertically 50 times faster than rivals
Researchers at the University of Tokyo’s Jouhou System Kougaku Laboratory (JSK) have developed KLEIYN, a quadruped robot capable of climbing vertical walls up to 50 times faster than previous robots. Unlike other climbing robots that rely on grippers or claws, KLEIYN uses a chimney climbing technique, pressing its feet against two opposing walls for support. Its flexible waist joint allows adaptation to varying wall widths, particularly narrow gaps. The robot weighs about 40 pounds (18 kg), measures 2.5 feet (76 cm) in length, and features 13 joints powered by quasi-direct-drive motors for precise movement. KLEIYN’s climbing ability is enhanced through machine learning, specifically Reinforcement Learning combined with a novel Contact-Guided Curriculum Learning method, enabling it to transition smoothly from flat terrain to vertical surfaces. In tests, KLEIYN successfully climbed walls spaced between 31.5 inches (80 cm) and 39.4 inches (100 cm) apart.
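A toy sketch of the curriculum idea, gradually steepening the training walls from flat ground toward vertical as the policy's success rate improves, is shown below. The stages, thresholds, and stand-in training function are hypothetical and do not reproduce the JSK implementation.

```python
def train_one_batch(wall_angle_deg, iterations):
    """Stand-in for an RL training step: returns a success rate that improves with
    training and drops as the walls get steeper. A real version would roll out the
    climbing policy in simulation and measure successful ascents."""
    skill = min(1.0, iterations / 50.0)
    difficulty = wall_angle_deg / 90.0
    return max(0.0, skill - 0.3 * difficulty)

def curriculum(start_deg=0.0, end_deg=90.0, step_deg=10.0, promote_at=0.6):
    """Raise the wall inclination only once the policy succeeds often enough at the
    current stage, ending at fully vertical walls."""
    angle, iterations = start_deg, 0
    while angle < end_deg:
        iterations += 1
        if train_one_batch(angle, iterations) >= promote_at:
            angle = min(end_deg, angle + step_deg)  # promote to a steeper stage
        # otherwise keep training at the same inclination
    return angle, iterations

final_angle, total_iters = curriculum()
print(f"Reached {final_angle:.0f} degree walls after {total_iters} training batches")
```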
Tags: robot, quadruped-robot, machine-learning, reinforcement-learning, climbing-robot, robotics-innovation, autonomous-robots

Robot dog walks on tough terrain with two legs, withstands kicks
Researchers at the University of Hong Kong’s ArcLab have developed a quadruped robot capable of walking on two legs using a bio-inspired controller called TumblerNet, powered by Deep Reinforcement Learning. This system mimics human balance by integrating estimators for the robot’s center of mass and center of pressure into a closed-loop control, enabling seamless transitions between quadrupedal and bipedal locomotion. The robot can respond to various movement commands, including turning and walking in circles, demonstrating advanced adaptability. The robot’s robustness is notable, as it maintains balance on challenging terrains such as foam pads, loose rocks, sand, and even a beach environment. It withstands external disturbances like pushes and kicks without requiring a separate recovery model, and it can automatically recover from falls caused by obstacles. These capabilities highlight the potential advantages of bipedal robots over traditional quadrupeds, especially for navigating human environments and performing complex tasks in caregiving, disaster response, and human-robot collaboration. The researchers
Tags: robot, quadruped-robot, bipedal-locomotion, bio-inspired-controller, deep-reinforcement-learning, robot-stability, autonomous-robots

TRI: pretrained large behavior models accelerate robot learning
The Toyota Research Institute (TRI) has advanced the development of Large Behavior Models (LBMs) to accelerate robot learning, demonstrating that a single pretrained LBM can learn hundreds of tasks and acquire new skills using 80% less training data. LBMs are trained on large, diverse datasets of robot manipulation, enabling general-purpose robots to perform complex, long-horizon behaviors such as installing a bike rotor. TRI’s study involved training diffusion-based LBMs on nearly 1,700 hours of robot data and conducting thousands of real-world and simulation rollouts, revealing that LBMs consistently outperform policies trained from scratch, require 3-5 times less data for new tasks, and improve steadily as more pretraining data is added. TRI’s LBMs use a diffusion transformer architecture with multimodal vision-language encoders and a transformer denoising head, processing inputs from wrist and scene cameras, proprioception, and language prompts to predict short action sequences. The training data combines real-world teleoperation data,
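The interface described above (multimodal observations in, a short denoised action chunk out) can be sketched schematically as follows. This is a toy PyTorch stand-in with placeholder layers, not TRI's diffusion transformer; only the shape of the idea is illustrated.

```python
import torch
import torch.nn as nn

class ToyLBMPolicy(nn.Module):
    """Toy stand-in for a diffusion-policy interface: observations -> short action chunk.

    Real LBMs use pretrained vision-language encoders and a diffusion transformer
    denoising head; here each piece is a placeholder layer with made-up sizes.
    """

    def __init__(self, img_feat=64, prop_dim=14, text_feat=32, act_dim=7, horizon=16):
        super().__init__()
        self.horizon, self.act_dim = horizon, act_dim
        # Fuse wrist camera, scene camera, proprioception, and language features.
        self.fuse = nn.Linear(img_feat * 2 + prop_dim + text_feat, 128)
        self.denoise = nn.GRU(act_dim + 128, 128, batch_first=True)
        self.out = nn.Linear(128, act_dim)

    def forward(self, wrist, scene, proprio, text, noisy_actions):
        ctx = torch.relu(self.fuse(torch.cat([wrist, scene, proprio, text], dim=-1)))
        ctx_seq = ctx.unsqueeze(1).expand(-1, self.horizon, -1)
        h, _ = self.denoise(torch.cat([noisy_actions, ctx_seq], dim=-1))
        return self.out(h)  # predicted denoised action chunk

policy = ToyLBMPolicy()
batch = 2
chunk = policy(torch.randn(batch, 64), torch.randn(batch, 64),
               torch.randn(batch, 14), torch.randn(batch, 32),
               torch.randn(batch, 16, 7))
print(chunk.shape)  # torch.Size([2, 16, 7]): 16 future actions, 7 joint commands each
```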
Tags: robotics, large-behavior-models, robot-learning, pretrained-models, Toyota-Research-Institute, autonomous-robots, embodied-AI

World’s first robot dog learns animal gaits in 9 hours with AI power
Researchers at the University of Leeds have developed the world’s first robot dog capable of autonomously adapting its gait to mimic real animal movements across unfamiliar terrains. Using an AI system inspired by animals such as dogs, cats, and horses, the robot—nicknamed “Clarence”—learned to switch between walking styles like trotting, running, and bounding within just nine hours. This bio-inspired deep reinforcement learning framework enables the robot to adjust its stride for energy efficiency, balance, and coordination without human intervention or additional tuning, even in environments it has never encountered before. This breakthrough represents a significant advancement in legged robotics, with practical applications in hazardous environments like nuclear decommissioning and search and rescue, where human presence is risky. By training the robot entirely in simulation and then transferring the learned policies directly to the physical machine, the researchers achieved a high level of adaptability and resilience. The project also underscores the potential of biomimicry in robotics, offering insights into how biological intelligence principles can improve robotic
Tags: robot, AI, robotics, legged-robots, bio-inspired-robotics, autonomous-robots, robot-dog

Flipping Robot Senses and Movement On Its Head
The article discusses the AgiBot X2-N, a humanoid robot notable for its lack of cameras or visual sensors, challenging conventional robotic design that relies heavily on visual input. Despite having no "eyes," the AgiBot X2-N can navigate complex terrains such as steps and slopes with precise balance and movement. This capability is achieved through advanced internal sensing and control mechanisms that allow the robot to maintain stability and adapt to its environment without relying on vision. This innovative approach to robotic sensing and locomotion could significantly impact the field of robotics by demonstrating that visual input is not always necessary for effective movement and navigation. The AgiBot X2-N's design may lead to more robust and versatile robots capable of operating in environments where cameras and visual sensors are limited or ineffective, such as in low-light or visually obstructed conditions. Overall, the robot represents a shift in how sensory data is utilized in robotics, potentially broadening the applications and reliability of humanoid robots.
Tags: robotics, humanoid-robot, sensor-technology, robot-movement, AgiBot-X2-N, robotics-innovation, autonomous-robots

Google DeepMind's new AI lets robots learn by talking to themselves
Google DeepMind is developing an innovative AI system that endows robots with an "inner voice" or internal narration, allowing them to describe visual observations in natural language as they perform tasks. This approach, detailed in a recent patent filing, enables robots to link what they see with corresponding actions, facilitating "zero-shot" learning—where robots can understand and interact with unfamiliar objects without prior training. This method not only improves task learning efficiency but also reduces memory and computational requirements, enhancing robots' adaptability in dynamic environments. Building on this concept, DeepMind introduced Gemini Robotics On-Device, a compact vision-language model designed to run entirely on robots without cloud connectivity. This on-device model supports fast, reliable performance in latency-sensitive or offline contexts, such as healthcare, while maintaining privacy. Despite its smaller size, Gemini Robotics On-Device can perform complex tasks like folding clothes or unzipping bags with low latency and can adapt to new tasks with minimal demonstrations. Although it lacks built-in semantic safety features found in
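The "narrate, then act" idea can be illustrated with a toy example: a captioner describes the scene in words, and the description plus the instruction is matched against a small skill vocabulary. Everything below (the captioner, the skills, and the matching rule) is a hypothetical stand-in, not DeepMind's system.

```python
def narrate(observation):
    """Stand-in for a vision-language captioner producing the robot's 'inner voice'."""
    return f"I see a {observation['color']} {observation['shape']} on the {observation['surface']}"

def choose_action(narration, instruction, skills):
    """Pick the skill whose description overlaps most with the narration plus instruction."""
    context = set((narration + " " + instruction).lower().split())
    scored = [(len(context & set(desc.lower().split())), name) for name, desc in skills.items()]
    return max(scored)[1]

skills = {
    "pick_up": "grasp and pick up the object on the table",
    "push": "push the object to the side",
    "open_drawer": "pull the drawer handle to open the drawer",
}

obs = {"color": "red", "shape": "mug", "surface": "table"}
story = narrate(obs)
print(story)                                              # the robot's inner narration
print(choose_action(story, "pick up the red mug", skills))  # -> "pick_up"
```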
Tags: robotics, artificial-intelligence, machine-learning, zero-shot-learning, DeepMind, autonomous-robots, on-device-AI

Galbot picks up $153M to commercialize G1 semi-humanoid - The Robot Report
Galbot, a Beijing-based robotics startup founded in May 2023, has raised approximately $153 million (RMB 1.1 billion) in its latest funding round, bringing its total capital raised over the past two years to about $335 million. The company recently launched its flagship semi-humanoid robot, the G1, which features wheels and two arms designed to automate tasks such as inventory management, replenishment, delivery, and packaging. The G1 robot is capable of handling 5,000 different types of goods and can be deployed in new stores within a day. Currently, nearly 10 stores in Beijing use the robot, with plans to expand deployment to 100 stores nationwide within the year. Galbot’s technology is powered by three proprietary vision-language-action (VLA) models: GraspVLA, GroceryVLA, and TrackVLA. GraspVLA, pre-trained on synthetic data, enables zero-shot generalization for robotic grasping. GroceryVLA
Tags: robot, artificial-intelligence, semi-humanoid-robot, retail-automation, vision-language-action-models, autonomous-robots, robotics-funding

Top 10 robotics developments of June 2025 - The Robot Report
In June 2025, Automatica 2025 showcased significant robotics advancements, with The Robot Report highlighting the top 10 developments that captured industry and reader interest. Key product launches included Hexagon AB’s AEON humanoid robot designed to address labor shortages in industrial settings, and 1X Technologies’ Redwood AI model enhancing the autonomy of its NEO humanoid for household tasks. NEURA Robotics unveiled multiple innovations including the third generation of its 4NE1 humanoid, the MiPA cognitive robot, and the Neuraverse open robotics ecosystem, emphasizing cognitive and service robotics progress. Funding milestones marked the month as well, with Coco Robotics raising $80 million to expand its sidewalk delivery robot fleet and AI platform, and Pittsburgh-based Gecko Robotics achieving unicorn status with $125 million in Series D funding, doubling its valuation to $1.25 billion. Beewise secured $50 million to broaden access to its AI-powered BeeHome, a climate technology solution supporting pollination critical to global food crops.
Tags: robotics, humanoid-robots, AI-in-robotics, industrial-robots, robot-funding, autonomous-robots, robot-applications

Amazon’s 1 millionth robot powers world’s biggest mobile bot army
Amazon has reached a significant milestone by deploying its one millionth robot in a fulfillment center in Japan, solidifying its status as the world’s largest operator of mobile robotic systems with over 300 facilities utilizing such technology globally. Since beginning its robotics development in 2012 with a focus on shelf transport, Amazon now operates a diverse fleet of robots tailored to specific logistical tasks, including the Hercules platform for heavy inventory handling, the Pegasus system for sorting and routing, and Proteus, the first fully autonomous mobile robot designed to safely operate alongside human workers. The company emphasizes that robotics complements rather than replaces its workforce, supported by extensive employee upskilling programs. A key innovation accompanying this milestone is DeepFleet, a generative AI foundation model developed using Amazon’s internal logistics data and AWS tools like SageMaker. DeepFleet acts as an intelligent control layer that dynamically manages thousands of autonomous robots within high-density fulfillment centers, optimizing their movement to reduce congestion, improve throughput, and lower energy consumption. By integrating warehouse navigation
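To make the congestion-management idea concrete, here is a small, hypothetical example of the kind of decision a fleet-level control layer faces: routing one robot across a grid where busy cells are more expensive to enter. Nothing below reflects DeepFleet's actual design.

```python
import heapq

def congestion_aware_path(grid_size, start, goal, congestion):
    """Dijkstra over a grid where each cell's entry cost grows with predicted robot traffic."""
    def neighbors(cell):
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < grid_size and 0 <= ny < grid_size:
                yield (nx, ny)

    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return cost, path
        if cell in seen:
            continue
        seen.add(cell)
        for nxt in neighbors(cell):
            step = 1.0 + congestion.get(nxt, 0.0)   # busier cells cost more to enter
            heapq.heappush(frontier, (cost + step, nxt, path + [nxt]))
    return float("inf"), []

# A congested aisle at x=1 (rows 0-2) makes the planner detour through the open row.
busy = {(1, y): 10.0 for y in range(3)}
print(congestion_aware_path(4, (0, 0), (3, 0), busy))
```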
Tags: robotics, autonomous-robots, warehouse-automation, AI-in-robotics, mobile-robots, Amazon-robotics, robotic-logistics

Swiss robot dog can now pick up and throw a ball accurately like humans
ETH Zurich’s robotic dog ANYmal, originally designed for autonomous operation in challenging environments, has been enhanced with a custom arm and gripper, enabling it to pick up and throw objects with human-like accuracy. The robot’s advanced actuators and integrated sensors allow it to navigate complex terrain while maintaining stability and situational awareness. Unlike traditional factory robots, ANYmal is built to handle unpredictable outdoor conditions, making it suitable for tasks such as industrial inspection, disaster response, and exploration. The research team, led by Fabian Jenelten, trained ANYmal using reinforcement learning within a highly realistic virtual environment that simulated real-world physics. This approach, known as sim-to-real transfer, allowed the robot to practice millions of throws safely and ensured its skills transferred effectively to real-world scenarios. In testing, ANYmal successfully picked up and threw various objects—including balls, bottles, and fruit—across different surfaces and environmental challenges, such as wind and uneven ground, demonstrating adaptability and precise control without pre-programmed steps. This
Tags: robotics, autonomous-robots, reinforcement-learning, legged-robots, robot-manipulation, sim-to-real-transfer, robot-perception

MIT CSAIL's new vision system helps robots understand their bodies - The Robot Report
MIT CSAIL has developed a novel robotic control system called Neural Jacobian Fields (NJF) that enables robots to understand and control their own bodies using only visual data from a single camera, without relying on embedded sensors or pre-designed models. This approach allows robots to learn their own internal models by observing the effects of random movements, providing them with a form of bodily self-awareness. The system was successfully tested on diverse robotic platforms, including a soft pneumatic hand, a rigid Allegro hand, a 3D-printed arm, and a sensorless rotating platform, demonstrating its robustness across different morphologies. The key innovation of NJF lies in decoupling robot control from hardware constraints, thus enabling more flexible, affordable, and unconventional robot designs without the need for complex sensor arrays or reinforced structures. By leveraging a neural network that combines 3D geometry reconstruction with a Jacobian field predicting how robot parts move in response to commands, NJF builds on neural radiance fields (NeRF) to
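The core idea, learning a field that maps a point on the robot to a local Jacobian so that predicted motion is J(point) · command, can be sketched on a synthetic example as below. The network size, command dimension, and toy "robot" are invented for illustration and are far simpler than the published NJF system, which additionally reconstructs 3D geometry from camera views.

```python
import torch
import torch.nn as nn

class ToyJacobianField(nn.Module):
    """Maps a 3D point on the robot to a 3x2 Jacobian: motion = J(point) @ command."""

    def __init__(self, cmd_dim=2):
        super().__init__()
        self.cmd_dim = cmd_dim
        self.net = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3 * cmd_dim))

    def forward(self, points, command):
        jac = self.net(points).reshape(-1, 3, self.cmd_dim)   # per-point Jacobian
        return torch.einsum("nij,j->ni", jac, command)        # predicted point motion

# Synthetic "robot": every point translates by (c0, c1, 0), so the true Jacobian is constant.
def true_motion(points, command):
    n = len(points)
    return torch.cat([command.expand(n, 2), torch.zeros(n, 1)], dim=-1)

field = ToyJacobianField()
opt = torch.optim.Adam(field.parameters(), lr=1e-2)
points = torch.rand(256, 3)
for step in range(300):            # learn the field purely from observed motion
    cmd = torch.randn(2)
    loss = nn.functional.mse_loss(field(points, cmd), true_motion(points, cmd))
    opt.zero_grad(); loss.backward(); opt.step()
print(f"final loss: {loss.item():.5f}")
```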
Tags: robotics, soft-robotics, robotic-control, machine-learning, MIT-CSAIL, Neural-Jacobian-Fields, autonomous-robots

Autonomous humanoid robot teams compete in China's soccer tournament
In Beijing, the final leg of the Robo League robot football (soccer) tournament featured four teams of fully autonomous humanoid robots competing without any human intervention. The championship was won by THU Robotics from Tsinghua University, who defeated the Mountain Sea team from China Agricultural University 5:3. Each team had three humanoid robots playing in two 10-minute halves, relying on AI, sensors, and optical cameras to detect the ball and navigate the field with over 90% accuracy. Despite some limitations such as dynamic obstacle avoidance, the robots demonstrated the ability to walk, run, kick, and make split-second decisions autonomously, marking the first fully autonomous AI robot football match held in China. This tournament serves as a precursor to the upcoming 2025 World Humanoid Robot Sports Games, scheduled for August 15 to 17 in Beijing, which will showcase 11 humanoid sport events modeled on traditional human competitions, including track and field, gymnastics, soccer, and synchronized dancing.
Tags: robot, humanoid-robots, autonomous-robots, AI-robotics, robot-soccer, robotics-competition, artificial-intelligence

Autonomous robots to segregate radioactive waste at UK nuclear plant
The UK’s Nuclear Decommissioning Authority (NDA) is set to deploy autonomous robots at the former Oldbury nuclear power station site to segregate radioactive waste, marking a significant advancement in nuclear waste management. The project, named Auto-SAS (autonomous waste sorting and segregation system), aims to separate low-level waste from intermediate level waste retrieved from on-site vaults. This robotic system, developed through a collaboration between AtkinsRéalis and Createc (jointly known as ARCTEC), will use sensors and robotic manipulators to accurately categorize waste, thereby reducing reliance on costly disposal routes and enhancing safety by removing human workers from hazardous environments. The NDA has committed up to £9.5 million over four years to this initiative, which also involves Nuclear Restoration Services, Sellafield, and Nuclear Waste Services. The technology promises to save hundreds of millions of pounds in waste storage and disposal costs while enabling workers to develop new skills. UK Energy Minister Michael Shanks highlighted the project as a key innovation
Tags: robotics, autonomous-robots, nuclear-waste-management, radioactive-waste-segregation, nuclear-decommissioning, automation-technology, hazardous-environment-safety

China: Humanoid robots to dribble, score goals in 3-on-3 soccer game
China is hosting a groundbreaking robotic soccer event featuring four teams of humanoid robots competing in the finals of the RoBoLeague World Robot Soccer League on June 28, 2025, in Beijing’s Yizhuang Development Zone. This event marks the first fully autonomous 3-on-3 humanoid robot soccer game, with matches consisting of two 10-minute halves. The robots, developed by leading institutions such as Tsinghua University and Beijing Information Science and Technology University, use advanced optical cameras and sensors to detect the ball up to 65 feet away with over 90% accuracy. They autonomously make real-time decisions—such as passing, dribbling, or shooting—through AI powered by deep reinforcement learning, showcasing agility, strategy, and endurance without human control. This soccer competition serves as a preview for the upcoming 2025 World Humanoid Robot Sports Games scheduled for August 15–17 in Beijing, which will feature 11 humanoid sports events modeled on traditional athletic competitions,
Tags: robot, humanoid-robots, AI, robotics-soccer, autonomous-robots, deep-reinforcement-learning, robot-sports

NBC’s AGT pushes Spot to perform under pressure
Boston Dynamics showcased its Spot quadruped robots on NBC’s America’s Got Talent (AGT), performing a live, choreographed dance routine to Queen’s “Don’t Stop Me Now.” Five Spots danced synchronously, using their robot arms to “lip-sync” Freddie Mercury’s vocals, impressing all four AGT judges who voted to advance the act. This high-profile appearance was both an entertainment milestone and a rigorous technical stress test for the robots and engineering team. The performance combined autonomous dancing via proprietary choreography software with teleoperated interactions, pushing Spot’s capabilities with aggressive moves like high-speed spins and one-legged balancing. These advanced maneuvers, enabled by recent improvements in reinforcement learning and dynamic behavior modeling, also enhance Spot’s real-world applications, such as maintaining balance on slippery factory floors. The decision to bring Spot to AGT followed successful live performances at the 2024 Calgary Stampede, which built confidence in managing the technical and logistical challenges of a live broadcast. Despite over 100
Tags: robotics, Boston-Dynamics, Spot-robot, humanoid-robots, robot-performance, autonomous-robots, reinforcement-learning

NEURA Robotics launches latest cognitive robots, Neuraverse ecosystem - The Robot Report
NEURA Robotics unveiled several key innovations at Automatica 2025 in Munich, including the third-generation 4NE1 humanoid robot, the market launch of the MiPA cognitive household and service robot, and the introduction of the Neuraverse open robotics ecosystem. The company, based in Metzingen, Germany, positions these developments as a milestone in cognitive robotics, aiming to make advanced robotic technology accessible to the mass market for the first time. NEURA emphasizes its integrated approach, combining hardware, software, and AI to create robots capable of autonomous perception, decision-making, and learning from experience. The company aims to deliver 5 million robots by 2030 across industrial, service, and home applications. The 4NE1 humanoid robot features multiple sensors, including a patented Omnisensor and seven cameras, enabling it to distinguish and interact safely with humans and objects in real environments. It boasts an intelligent dual-battery system for continuous operation, joint technology capable of lifting up to 100 kg
Tags: robotics, cognitive-robots, humanoid-robots, artificial-intelligence, autonomous-robots, Neuraverse-ecosystem, industrial-robots

Black-I Robotics wins autonomous mobile robot picking challenge
Black-I Robotics won the Chewy Autonomous Mobile Picking (CHAMP) Challenge, a competition organized by Chewy and MassRobotics to develop fully autonomous robots capable of handling large, heavy, and non-rigid items in complex warehouse environments. The challenge addressed significant difficulties in warehouse automation, such as manipulating irregularly shaped, deformable items weighing over 40 pounds, which are difficult to grasp using conventional methods. Black-I Robotics’ winning system combined a mobile base with a 6-DOF industrial arm and custom multi-modal end effectors, integrating AI-driven perception, precise object detection, and pose estimation to enable reliable grasping and navigation in tight aisles alongside live warehouse operations. Their solution demonstrated full autonomy, adaptability, and seamless integration into fulfillment workflows, earning them the $30,000 first-place prize. The CHAMP Challenge emphasized not only manipulation but also system-level integration, requiring robots to navigate narrow aisles, avoid dynamic obstacles, and place items into shipping containers with mixed contents. Twelve
Tags: robotics, autonomous-robots, warehouse-automation, AI-perception, robotic-manipulation, industrial-robots, mobile-robots

Robot Talk Episode 126 – Why are we building humanoid robots? - Robohub
The article summarizes a special live episode of the Robot Talk podcast recorded at Imperial College London during the Great Exhibition Road Festival. The discussion centers on the motivations and implications behind building humanoid robots—machines designed to look and act like humans. The episode explores why humanoid robots captivate and sometimes unsettle us, questioning whether this fascination stems from vanity or if these robots could serve meaningful roles in future society. The conversation features three experts: Ben Russell, Curator of Mechanical Engineering at the Science Museum, Maryam Banitalebi Dehkordi, Senior Lecturer in Robotics and AI at the University of Hertfordshire, and Petar Kormushev, Director of the Robot Intelligence Lab at Imperial College London. Each brings a unique perspective, from historical and cultural insights to technical expertise in robotics, AI, and machine learning. Their dialogue highlights the rapid advancements in humanoid robotics and the ongoing research aimed at creating adaptable, autonomous robots capable of learning and functioning in dynamic environments. The episode underscores the multidisciplinary nature
Tags: robotics, humanoid-robots, artificial-intelligence, autonomous-robots, machine-learning, reinforcement-learning, robot-intelligence

Simbe, Coresight Research study finds retailers urgently need to reduce inefficiencies - The Robot Report
Simbe Robotics Inc. and Coresight Research released the “State of In-Store Retailing 2025” report, highlighting the urgent need for retailers to digitize stores through artificial intelligence and automation to address significant inefficiencies. Retailers currently lose $162.7 billion annually in margin due to in-store inefficiencies—a 27% increase from 2024—primarily driven by shrinkage, manual tasks, and employee turnover. Key operational challenges include promotion execution errors (39%), product pricing errors (37%), and misplaced or missing items on shelves (37%). Although 66% of retailers have begun adopting store intelligence technologies, only 20% have fully scaled these solutions, indicating substantial room for growth. Investment in store intelligence and automation technologies is rising sharply, with a 151% year-over-year increase in planned spending and notable adoption gains in shelf-digitization robotics. Simbe’s autonomous Tally robot exemplifies the benefits of automation, enabling retailers like ShopRite to reduce out
Tags: robot, automation, retail-technology, store-intelligence, inventory-management, AI-in-retail, autonomous-robots

PrismaX launches with $11M to scale virtual datasets for robotics foundation models - The Robot Report
PrismaX, a San Francisco-based startup founded in 2024 by Bayley Wang and Chyna Qu, has launched with $11 million in funding to address key challenges in the physical AI and robotics industry related to data quality, model development, and scalability. The company is developing a robotics teleoperations platform aimed at creating a decentralized ecosystem that incentivizes the collection and use of high-quality visual datasets. PrismaX’s approach focuses on establishing fair use standards where revenue generated from data powering AI models is shared with the communities that produce it, thereby tackling issues of data scarcity, bias, and affordability that have hindered robotics advancements. The platform is built around three foundational pillars: data, teleoperation, and models. PrismaX plans to validate and incentivize visual data to scale robotics datasets comparable to text data, define uniform teleoperation standards to streamline operator access and payments, and collaborate with AI teams to develop foundational models that enable more autonomous robots. This integrated approach aims to create a “data flywheel
Tags: robotics, artificial-intelligence, teleoperation, data-scalability, autonomous-robots, robotics-foundation-models, decentralized-technology

New robot swarm builds resilient structures without human interference
Engineers at the University of Pennsylvania have developed a novel swarm robotics system inspired by insect colonies, enabling robots to build resilient honeycomb-like structures without centralized plans, blueprints, or coordination. Mimicking how bees, ants, and termites construct complex nests through local environmental cues, these robots follow simple mathematical rules to self-assemble by reacting only to their immediate surroundings. This decentralized approach allows the swarm to continue building even if individual robots fail, enhancing resilience and adaptability in unpredictable conditions. The research team fine-tuned the swarm’s behavior through extensive simulations, adjusting parameters such as speed and turn angle to influence the geometry and toughness of the resulting structures. Their findings build on prior insights that introducing disorder into honeycomb lattices can increase material toughness, demonstrating that swarm behavior can autonomously generate such beneficial variations. While still primarily in simulation, early physical prototypes have been created, and future work aims to translate the system to real-world applications, potentially using electrochemical methods to grow metal structures. This approach represents a
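A toy version of such a decentralized rule is sketched below: agents random-walk on a grid and deposit material only when a neighboring cell is already filled, with a turn-bias parameter standing in for the tuned turn angle. The specific rules and numbers are invented for illustration, not the Penn team's model.

```python
import random

def build(steps=2000, size=41, turn_bias=0.3, seed=1):
    """Agents random-walk on a grid and deposit a block whenever a neighbor cell is filled.

    There is no blueprint and no communication: each agent reacts only to local material,
    yet a connected structure grows outward from a single seed block in the center, and
    losing any individual agent does not stop construction.
    """
    random.seed(seed)
    filled = {(size // 2, size // 2)}                       # seed block
    agents = [(random.randrange(size), random.randrange(size)) for _ in range(30)]
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]

    for _ in range(steps):
        for i, (x, y) in enumerate(agents):
            neighbors = [(x + dx, y + dy) for dx, dy in moves]
            if any(n in filled for n in neighbors) and (x, y) not in filled:
                filled.add((x, y))                          # deposit next to existing material
            dx, dy = random.choice(moves)
            if random.random() < turn_bias:                 # crude analogue of a turn-angle parameter
                dx, dy = -dy, dx
            agents[i] = ((x + dx) % size, (y + dy) % size)
    return filled

print(f"{len(build())} cells deposited")
```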
robotics, swarm-robotics, decentralized-manufacturing, autonomous-robots, resilient-structures, bio-inspired-robots, self-assembly
ANYbotics launches Gas Leak and Presence Detection for ANYmal inspection robot - The Robot Report
ANYbotics AG has introduced a new Gas Leak and Presence Detection capability for its autonomous quadruped robot, ANYmal, aimed at improving safety and reducing costs in industrial environments such as petrochemical plants. Gas leaks, which are often invisible and costly (potentially exceeding $57,000 annually per leak), pose significant safety and financial risks. Traditional manual inspections are typically infrequent and inconsistent, and they miss early warning signs such as subtle temperature changes or unusual noises, allowing leaks to go undetected. ANYmal’s new system combines autonomous navigation with modular gas detectors and a 360° acoustic imaging payload to precisely locate leaks, quantify gas concentrations, and alert personnel in real time, improving operational efficiency and safety while lowering emissions. The integrated acoustic imaging camera can reach difficult-to-access areas and detect leaks across a broad range of media, including steam, compressed air, vacuum, toxic gases, and hydrocarbons. The modular design supports hot-swappable detectors for different gases such as oxygen, hydrocarbons, and ammonia, allowing
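The article does not describe ANYbotics' software interfaces, so the sketch below is only a generic illustration of how per-gas thresholds applied to hot-swappable detector readings could be turned into real-time alerts along an inspection route; all names, thresholds, and waypoints are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical per-gas alarm thresholds in ppm; real limits depend on the
# detector module and the site's safety policy, not on ANYbotics.
THRESHOLDS_PPM = {"methane": 1000.0, "ammonia": 25.0}

@dataclass
class Reading:
    waypoint: str          # inspection point along the autonomous route
    gas: str               # which hot-swappable detector produced the value
    concentration: float   # measured concentration, ppm

def check(readings: list[Reading]) -> list[str]:
    """Return alert messages for readings above their gas-specific threshold."""
    alerts = []
    for r in readings:
        limit = THRESHOLDS_PPM.get(r.gas)
        if limit is not None and r.concentration > limit:
            alerts.append(f"ALERT {r.waypoint}: {r.gas} at {r.concentration:.0f} ppm "
                          f"(limit {limit:.0f} ppm)")
    return alerts

if __name__ == "__main__":
    route = [Reading("pump-skid-3", "methane", 1450.0),
             Reading("valve-bank-7", "ammonia", 4.0)]
    for msg in check(route):
        print(msg)
```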
robot, gas-leak-detection, industrial-safety, autonomous-robots, modular-sensors, acoustic-imaging, ANYmal
U.S. Air Force gives additional funding to Palladyne AI
Palladyne AI has received additional funding from the U.S. Air Force to advance its Palladyne IQ software, which enables robots to perceive and adapt to dynamic real-world environments. This funding is part of a multi-million-dollar Phase II contract with the Air Logistics Complex at Warner Robins Air Force Base, Georgia, where Palladyne AI is working to automate complex remediation tasks on aircraft components. The company recently completed key Military Utility Assessment milestones, including autonomous media blasting on aircraft parts and automated sanding at height using commercial robotic systems, demonstrating both the technical feasibility and operational value of their AI-driven robotic automation. The project is in its second year of a potential four-year effort under the Air Force’s Strategic Funding Increase (STRATFI) program, initiated through AFWERX AFVentures. Palladyne AI’s software operates on the edge, reducing programming effort and enabling autonomous capabilities for industrial and collaborative robots in complex environments. Beyond defense, the company highlights broad applicability of its technology across sectors
robot, AI-software, autonomous-robots, robotic-automation, industrial-robots, collaborative-robots, military-robotics
How Warp is introducing robots to automate its network of warehouses
Warp, founded in 2021, aims to improve supply chain efficiency by automating workflows across its network of warehouses with robotics. Because the company acknowledges it cannot automate long-haul trucking or last-mile delivery, it focuses on warehouse operations instead. Warp began by creating a digital twin of its Los Angeles test warehouse using cameras and computer vision, which allowed it to experiment with automation strategies. Initial attempts to train humanoid robots to operate pallet jacks failed; success came instead from retrofitting off-the-shelf robots with additional technology. Warp pairs AI tools for voice, text, and email with robotics to streamline unloading, storing, and reloading freight, aiming to reduce manual labor without expanding headcount. Warp’s robotic systems are intended to benefit its warehouse partners, who face staffing challenges and labor dissatisfaction. Although Warp does not own most of the warehouses in its network, it supplies robotic kits to these partners to improve operational efficiency and reduce costs. The company recently raised $10 million in a Series A funding round led by Up
robot, warehouse-automation, logistics-technology, supply-chain-robotics, AI-in-logistics, autonomous-robots, digital-twin-simulation
Sam Altman-backed Coco Robotics raises $80M
Coco Robotics, a Los Angeles-based startup specializing in last-mile delivery robots, has raised $80 million in a new funding round, bringing its total capital raised to over $120 million. The round included returning angel investors Sam Altman and Max Altman, alongside venture capital firms such as Pelion Venture Partners and Offline Ventures. Coco previously secured a $36 million Series A round in 2021. Founded in 2020 by Brad Squicciarini and Zach Rash, the company has completed more than 500,000 deliveries since launch with its zero-emissions robots, which can carry up to 90 liters of groceries or goods. Coco partners with national retailers including Subway, Wingstop, and Jack in the Box. Sam Altman’s involvement extends beyond personal investment: Coco benefits from access to OpenAI’s technology, while OpenAI gains valuable real-world data collected by Coco’s robots to enhance its AI models. This symbiotic relationship underscores the strategic value of the partnership.
robot, last-mile-delivery, autonomous-robots, zero-emissions, AI-integration, funding, startup
Meta’s new AI helps robots learn real-world logic from raw video
Meta has introduced V-JEPA 2, an advanced AI model trained solely on raw video data to help robots and AI agents better understand and predict physical interactions in the real world. Unlike traditional AI systems that rely on large labeled datasets, V-JEPA 2 operates in a simplified latent space, enabling faster and more adaptable simulations of physical reality. The model learns cause-and-effect relationships such as gravity, motion, and object permanence by analyzing how people and objects interact in videos, allowing it to generalize across diverse contexts without extensive annotations. Meta views this development as a significant step toward artificial general intelligence (AGI), aiming to create AI systems capable of thinking before acting. In practical applications, Meta has tested V-JEPA 2 on lab-based robots, which successfully performed tasks like picking up unfamiliar objects and navigating new environments, demonstrating improved adaptability in unpredictable real-world settings. The company envisions broad use cases for autonomous machines—including delivery robots and self-driving cars—that require quick interpretation of physical surroundings and real
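The core JEPA idea, predicting future representations in latent space rather than reconstructing pixels, can be sketched in a few lines of PyTorch. The architecture, dimensions, and training details below are illustrative placeholders, not Meta's V-JEPA 2 implementation (which, among other differences, uses a separate target encoder):

```python
import torch
import torch.nn as nn

LATENT_DIM = 128  # placeholder latent size

encoder = nn.Sequential(            # maps a (3, 64, 64) frame to a latent vector
    nn.Conv2d(3, 16, 4, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, 4, stride=2), nn.ReLU(),
    nn.Flatten(), nn.LazyLinear(LATENT_DIM),
)
predictor = nn.Sequential(          # predicts the next latent from the current one
    nn.Linear(LATENT_DIM, 256), nn.ReLU(), nn.Linear(256, LATENT_DIM),
)

def jepa_loss(frame_t: torch.Tensor, frame_t1: torch.Tensor) -> torch.Tensor:
    """Prediction error measured in latent space, not pixel space."""
    z_t = encoder(frame_t)
    with torch.no_grad():           # simplification: freeze the target latent
        z_t1 = encoder(frame_t1)
    return nn.functional.mse_loss(predictor(z_t), z_t1)

if __name__ == "__main__":
    a, b = torch.randn(8, 3, 64, 64), torch.randn(8, 3, 64, 64)
    print(float(jepa_loss(a, b)))
```

Because the loss lives in the latent space, the model is pushed to capture predictable structure (motion, object permanence) rather than to reproduce every pixel, which is what makes this style of training comparatively cheap and adaptable.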
robotics, artificial-intelligence, machine-learning, autonomous-robots, video-based-learning, physical-world-simulation, AI-models
NVIDIA Isaac, Omniverse, and Halos to aid European robotics developers - The Robot Report
At the GPU Technology Conference (GTC) in Paris, NVIDIA announced new AI-driven tools and platforms aimed at advancing robotics development, particularly for European manufacturers facing labor shortages and sustainability demands. Central to this initiative is NVIDIA Isaac GR00T N1.5, an open foundation model designed to enhance humanoid robot reasoning and skills, now available on Hugging Face. Alongside this, the company released Isaac Sim 5.0 and Isaac Lab 2.2, open-source robotics simulation frameworks optimized for NVIDIA RTX PRO 6000 systems, enabling developers to better train, simulate, and deploy robots across various applications. NVIDIA’s approach for the European robotics ecosystem revolves around a “three-computer” strategy: DGX systems and GPUs for AI model training, Omniverse and Cosmos platforms on OVX systems for simulation and synthetic data generation, and the DRIVE AGX in-vehicle computer for real-time autonomous driving processing. This scalable architecture supports diverse robotic forms, from industrial robots to humanoids. Several European robotics companies are actively integrating NVIDIA’s stack—Agile Robots uses Isaac Lab to train dual-arm manipulators, idealworks extends Omniverse Blueprints for humanoid fleet simulation, Neura Robotics collaborates with SAP to refine robot behavior in complex scenarios, Vorwerk enhances home robotics models with synthetic data pipelines, and Humanoid leverages the full NVIDIA stack to significantly reduce prototyping time and improve robot cognition. Overall, NVIDIA’s new tools and collaborative ecosystem aim to accelerate the development and deployment of smarter, safer robots in Europe, addressing critical challenges such as labor gaps and the need for sustainable manufacturing and automation solutions.
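Since GR00T N1.5 is distributed via Hugging Face, fetching the weights should follow the standard hub workflow; the repository id below is an assumption for illustration and should be checked against NVIDIA's Hugging Face organization before use:

```python
from huggingface_hub import snapshot_download

# Repo id is assumed for illustration; confirm the exact name on
# NVIDIA's Hugging Face organization page before relying on it.
REPO_ID = "nvidia/GR00T-N1.5-3B"

local_dir = snapshot_download(repo_id=REPO_ID)
print(f"Model files downloaded to: {local_dir}")
```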
robotics, artificial-intelligence, NVIDIA-Isaac, robot-simulation, autonomous-robots, industrial-robots, AI-driven-manufacturing
Interview with Amar Halilovic: Explainable AI for robotics - Robohub
Amar Halilovic, a PhD student at Ulm University in Germany, is conducting research on explainable AI (XAI) for robotics, focusing on how robots can generate explanations of their actions—particularly in navigation—that align with human preferences and expectations. His work involves developing frameworks for environmental explanations, especially in failure scenarios, using black-box and generative methods to produce textual and visual explanations. He also studies how to plan explanation attributes such as timing, representation, and duration, and is currently exploring dynamic selection of explanation strategies based on context and user preferences. Halilovic finds it particularly interesting how people interpret robot behavior differently depending on urgency or failure context, and how explanation expectations shift accordingly. Moving forward, he plans to extend his framework to enable real-time adaptation, allowing robots to learn from user feedback and adjust explanations on the fly. He also aims to conduct more user studies to validate the effectiveness of these explanations in real-world human-robot interaction settings. His motivation for studying explainable robot navigation stems from a broader interest in human-machine interaction and the importance of understandable AI for trust and usability. Before his PhD, Amar studied Electrical Engineering and Computer Science in Bosnia and Herzegovina and Sweden. Outside of research, he enjoys traveling and photography and values building a supportive network of mentors and peers for success in doctoral studies. His interdisciplinary approach combines symbolic planning and machine learning to create context-sensitive, explainable robot systems that adapt to diverse human needs.
robotics, explainable-AI, human-robot-interaction, robot-navigation, AI-research, PhD-research, autonomous-robots
Milan Kovac, Head Of Tesla Optimus Program, Departs - CleanTechnica
Milan Kovac, the head of engineering for Tesla's Optimus humanoid robot program, announced his departure on June 6, 2025, citing a desire to spend more time with his family. Despite initial speculation about possible conflicts with Elon Musk, Kovac expressed strong support for Musk and confidence in the Tesla team’s ability to advance the Optimus project. Tesla confirmed that Ashok Elluswamy, head of the Autopilot team, will assume leadership of the Optimus program. Both Kovac and Musk exchanged respectful public statements, indicating an amicable transition without signs of urgency or internal strife. Elon Musk continues to position Optimus as a transformative product, claiming it could be Tesla’s most significant development ever, with a target price around $30,000. Musk envisions the robot performing everyday tasks such as folding clothes, walking dogs, and cleaning dishes, and he has publicly declared that autonomy and Optimus are Tesla’s long-term priorities. However, skepticism remains among observers and industry watchers, who note that Tesla’s promotional videos—like the one showing Optimus folding a shirt—are staged and that the robot is not yet capable of autonomous operation. Critics draw parallels to previous Tesla hype cycles, such as with Full Self-Driving (FSD) technology, suggesting that Optimus may be similarly overpromised and years away from practical reality. Overall, while Kovac’s departure marks a leadership change, Tesla’s commitment to the Optimus project remains firm under Musk’s vision. Yet, the program faces scrutiny over its current capabilities versus public expectations, highlighting the challenges Tesla faces in delivering on its ambitious humanoid robot goals.
robot, Tesla-Optimus, humanoid-robot, autonomous-robots, AI-robotics, personal-assistant-robots, robotics-engineering
Spot robot dog gets AI boost to detect equipment failures early
Boston Dynamics has enhanced its Spot robot dog through an upgraded version of its Orbit intelligent automation platform, aimed at advancing predictive industrial maintenance. The new system enables Spot to autonomously inspect industrial sites, capturing consistent visual data that Orbit analyzes using vision-language prompts to quickly identify hazards such as overheating motors, air leaks, safety risks, corrosion, and equipment deterioration. This approach addresses traditional gaps in condition-based monitoring by providing repeatable, detailed inspections and transforming visuals into actionable insights, including numerical data and descriptive text. A notable addition is the Site View feature, which creates a lightweight visual history of facilities using 360° images, supporting remote monitoring and condition tracking over time. The updated Orbit platform also introduces centralized fleet management dashboards for enterprise users, allowing oversight of multiple robots across sites with customizable user permissions and detailed activity logs. Privacy is maintained through an automatic face-blurring function in images captured by Spot’s cameras. Software updates can be deployed over the air to multiple robots simultaneously, and Orbit can be hosted on-premise or in the cloud as a virtual machine. Integration with third-party systems is facilitated via APIs, webhooks, and a low-code beta for automated work order generation. Additionally, a dynamic thermal thresholding feature helps automatically detect temperature anomalies by analyzing statistical data, reducing the need for expert intervention and enhancing early failure detection in industrial environments.
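Boston Dynamics has not published the statistics behind dynamic thermal thresholding, but a rolling-baseline z-score test is a common way to flag temperature anomalies and gives a sense of the idea; the window size and threshold below are illustrative assumptions, not Orbit's actual method:

```python
import statistics

def thermal_anomalies(temps_c: list[float], window: int = 30, z_limit: float = 3.0):
    """Flag readings that deviate strongly from the recent rolling baseline.

    A simple z-score rule stands in for whatever statistics Orbit actually
    uses; window size and z_limit are illustrative.
    """
    flagged = []
    for i in range(window, len(temps_c)):
        baseline = temps_c[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-6  # avoid divide-by-zero
        z = (temps_c[i] - mean) / stdev
        if abs(z) > z_limit:
            flagged.append((i, temps_c[i], round(z, 1)))
    return flagged

if __name__ == "__main__":
    history = [41.0 + 0.2 * (i % 5) for i in range(60)] + [55.0]  # sudden spike
    print(thermal_anomalies(history))
```

The advantage of deriving the threshold from the recent data itself, rather than from a fixed limit, is that the same rule adapts to each motor, pump, or panel that Spot inspects, which matches the article's point about reducing the need for expert-set limits.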
robot, AI, predictive-maintenance, industrial-automation, Boston-Dynamics, facility-inspection, autonomous-robots