Articles tagged with "sensors"
New sensor achieves record-level alcohol sensitivity at ultra-low power
Researchers from Yonsei University and collaborators have developed a novel low-power gas sensor that achieves ultra-sensitive detection of ethanol at parts-per-billion levels. The sensor integrates ultrathin ruthenium dioxide nanosheets with a tin dioxide thin film, creating a hybrid structure that significantly enhances ethanol detection. The ruthenium dioxide nanosheets provide a high surface area and strong catalytic activity, accelerating ethanol molecule reactions on the sensor surface. Additionally, interactions between the nanosheets and tin dioxide amplify the electron depletion layer, increasing changes in electrical resistance and making the sensor over three times more responsive than conventional devices. Built on a suspended membrane with a microheater, the sensor operates continuously using less than 30 milliwatts of power, detecting ethanol concentrations from 10 parts per million down to about 5 parts per billion. It demonstrated stable performance over nearly a month, resisted interference from common gases, and reliably tracked real-time breath alcohol levels consistent with commercial breathalyzers. The design’s compatibility with existing microfabrication techniques
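The "changes in electrical resistance" mentioned above can be made concrete with a common figure of merit for n-type metal-oxide sensors exposed to a reducing gas like ethanol: the resistance ratio S = R_air / R_gas. The sketch below illustrates that calculation only; the resistance values and the 3x comparison are hypothetical numbers for demonstration, not data from the paper.

```python
# Illustrative only: response of an n-type oxide gas sensor to a reducing
# gas, defined as the ratio of baseline resistance in air to resistance
# under gas exposure. All numeric readings below are invented.

def sensor_response(r_air_ohms: float, r_gas_ohms: float) -> float:
    """S = R_air / R_gas; larger S means a more responsive sensor."""
    if r_air_ohms <= 0 or r_gas_ohms <= 0:
        raise ValueError("resistances must be positive")
    return r_air_ohms / r_gas_ohms

# Hypothetical readings: baseline in clean air vs. under ethanol exposure.
# For a reducing gas on an n-type oxide, resistance drops as trapped
# electrons return to the conduction band.
baseline = 1.2e6  # ohms in air
exposed = 2.0e5   # ohms under ethanol

print(f"response S = {sensor_response(baseline, exposed):.1f}")
```

A hybrid film like the one described would show a larger S than a plain tin dioxide film for the same gas concentration, which is what "over three times more responsive" refers to.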
Tags: energy, materials, sensors, low-power-technology, nanomaterials, gas-detection, ethanol-sensing

Luminar receives a larger $33 million bid for its lidar business
Luminar’s lidar business received a higher bid of $33 million from MicroVision during a court-run auction in the company’s bankruptcy case, surpassing the previous $28 million offer from Quantum Computing Inc., the stalking horse bidder. MicroVision, a company specializing in lidar sensor development, plans to acquire Luminar’s intellectual property, inventory related to Iris and Halo lidar sensors, key engineering and operations talent, and certain commercial contracts. MicroVision’s CEO Glen DeVos emphasized the company’s intent to disrupt and consolidate the lidar market by leveraging its automotive leadership and diverse lidar sensor portfolio to enhance commercial adoption and safety. The sale is pending approval by the bankruptcy judge, with a hearing scheduled shortly. It remains unclear whether Luminar’s founder and former CEO Austin Russell submitted a bid, though he had expressed interest through his new venture, Russell AI Labs. The bankruptcy proceedings have involved legal disputes between Russell and Luminar, including a subpoena and protective order concerning Russell’s personal information. If the sale is approved,
Tags: lidar, autonomous-vehicles, sensors, robotics, automotive-technology, advanced-perception, MicroVision

Uber launches an ‘AV Labs’ division to gather driving data for robotaxi partners
Uber has launched a new division called Uber AV Labs to collect and share real-world driving data with its autonomous vehicle (AV) partners, including companies like Waymo, Waabi, and Lucid Motors. Although Uber exited direct robotaxi development after a fatal accident in 2018 and sold off its AV division in 2020, it is now leveraging its fleet to gather sensor data (lidars, radars, cameras) from vehicles operating in cities. This data aims to support AV companies in training their systems, especially as the industry shifts from rule-based approaches to reinforcement learning, where extensive real-world data is critical for handling rare and complex driving scenarios. Currently, Uber AV Labs is in an early prototype phase with a single Hyundai Ioniq 5 equipped with sensors, and plans to scale up gradually. The data collected will not be provided raw; instead, Uber will process and refine it to create a semantic understanding layer that partners can use to improve their autonomous driving software. Additionally, Uber
Tags: robot, autonomous-vehicles, robotaxi, sensors, data-collection, Uber-AV-Labs, reinforcement-learning

This shape-shifting graphene material may power next-gen soft robots
Researchers at McGill University have developed ultra-thin graphene oxide films that can fold, move, and sense motion like animated origami, paving the way for advanced soft robotics and adaptive devices. These graphene oxide sheets are both strong and flexible, overcoming previous limitations of brittleness and manufacturing challenges. The material can be folded into complex shapes without cracking, enabling soft robots that operate safely around humans without rigid parts or heavy motors. The folded structures respond to environmental triggers such as humidity, opening and closing reversibly, or can be embedded with magnetic particles for remote control via external magnetic fields. This versatility allows the same base material to be adapted for diverse applications, from medical tools navigating delicate spaces to smart packaging reacting to environmental changes. Beyond actuation, the graphene oxide layers exhibit changes in electrical conductivity as they bend or fold, enabling the material to sense its own motion. This integrated sensing-actuation capability reduces the need for separate components, simplifying design and minimizing size. The researchers describe these as the first reconfig
Tags: robot, soft-robotics, graphene-oxide, origami-materials, actuators, smart-materials, sensors

Autonomous microrobots finally break the millimeter barrier
Researchers from the University of Pennsylvania and the University of Michigan have developed autonomous microrobots that break the longstanding millimeter-size barrier, achieving fully integrated sensing, computation, and motion control at a scale of just 210 × 340 × 50 micrometers—about the size of a paramecium. This represents a volume roughly 10,000 times smaller than previous programmable robots. Unlike earlier microrobots that rely on external control systems such as magnetic coils or ultrasound arrays, these new robots operate independently, sensing their environment, making decisions, and acting autonomously. The devices are manufactured using fully lithographic processes, enabling low-cost production (under a penny per unit at scale), and can be programmed wirelessly via LED light to perform complex behaviors like climbing temperature gradients and encoding sensor data through movement patterns. Historically, microrobots have faced a fundamental trade-off: either be very small but externally controlled with no onboard intelligence, or be larger (around one millimeter)
Tags: robotics, microrobots, autonomous-robots, microtechnology, sensors, onboard-computing, medical-robotics

Ethernovia raises $90M as investors rush to fund ‘Physical AI’
Ethernovia, a San Jose-based company specializing in Ethernet-based processors that facilitate rapid data transfer from distributed sensors to central computers in systems like autonomous vehicles, has raised $90 million in a Series B funding round. This investment reflects growing interest in "Physical AI," a sector focused on applying AI advancements to tangible technologies such as robotics and autonomous vehicles. The funding round was led by Maverick Silicon, an AI-focused fund launched in 2024 by Maverick Capital, marking the hedge fund’s first sector-specific fund in its 30-year history. Existing investors Porsche SE and Qualcomm Ventures also participated. The influx of capital into Ethernovia highlights a broader trend where investors are increasingly channeling funds into companies that provide critical infrastructure and behind-the-scenes technology enabling the practical deployment of AI in physical systems. This shift indicates heightened investor confidence in the potential of Physical AI to transform industries by integrating AI with hardware, signaling that more under-the-radar companies in this space are likely to attract significant funding in the
Tags: robot, AI, autonomous-vehicles, Ethernet-processors, sensors, Physical-AI, robotics-technology

Prosthetic hands get identification boost to predict precise grip strength need
Researchers at Guilin University of Electronic Technology in China have developed an advanced prosthetic hand system that integrates vision and machine learning to automate and optimize grip strength. Traditional prosthetics use Electromyography (EMG) sensors to detect a user’s intent to grasp but cannot accurately determine the necessary pressure, forcing users to consciously adjust their grip to avoid crushing or dropping objects. The new system employs a palm-mounted camera combined with pressure sensors on the prosthetic fingertips and EMG signals from the forearm. When the user reaches for an object, the camera identifies it, and a machine learning algorithm references a database of required grip strengths for common items, enabling the prosthetic to apply the appropriate force automatically. This innovation aims to make prosthetic hand use more intuitive by freeing users from the mental burden of calculating grip strength, allowing them to focus on the task itself. The researchers are also working on adding haptic feedback to create a two-way communication system that sends tactile sensations back to the user, enhancing the lif
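The pipeline described, camera identifies the object, a learned database supplies the required grip strength, and fingertip pressure sensors close the loop, can be sketched as follows. Every name, force value, and gain here is hypothetical; the researchers' actual models and data are not public in this summary.

```python
# Hypothetical sketch of vision-driven grip selection: object label -> target
# force lookup -> simple proportional servo toward that force.

GRIP_FORCE_N = {          # invented example database of grip forces (newtons)
    "egg": 1.5,
    "paper_cup": 2.0,
    "water_bottle": 8.0,
    "hammer": 25.0,
}
DEFAULT_FORCE_N = 5.0     # fallback when the camera cannot identify the object

def target_grip_force(object_label: str) -> float:
    return GRIP_FORCE_N.get(object_label, DEFAULT_FORCE_N)

def adjust_grip(current_force: float, target: float, gain: float = 0.5) -> float:
    """One control step: move fingertip force a fraction of the way to target."""
    return current_force + gain * (target - current_force)

force = 0.0
for _ in range(10):  # servo toward the force suitable for a fragile object
    force = adjust_grip(force, target_grip_force("egg"))
print(f"settled grip force: {force:.2f} N")
```

The point of the design is visible in the lookup step: the user's EMG signal only expresses intent to grasp, while the force target comes from the vision side.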
Tags: robot, prosthetics, machine-learning, sensors, EMG, haptic-feedback, assistive-technology

China-Chile team launch mission to study 435-mile stretch of the Atacama Trench
A joint China-Chile expedition has launched a three-month mission (January–March) to explore a 435-mile section of the Atacama Trench, a deep subduction zone in the eastern Pacific Ocean where the Nazca and South American tectonic plates collide. Operating from the Chilean port of Valparaiso, the mission is led by Chinese researcher Du Mengran and represents the largest deep-sea operation ever conducted in the region. The expedition utilizes the advanced Chinese manned submersible Fendouzhe ("Striver"), capable of reaching depths over 10,000 meters, allowing scientists to directly observe and sample the trench’s extreme environment, including chemosynthetic life forms that survive without sunlight. The mission aims to address three critical scientific frontiers: improving understanding of seismic activity to enhance tsunami and earthquake disaster prevention, investigating the trench’s role in global carbon cycling, and searching for rare biochemical compounds that could lead to medical breakthroughs. The team employs cutting-edge technology such as autonomous robotic
Tags: robot, deep-sea-submersible, autonomous-robotic-landers, sensors, energy-harvesting, materials-science, ocean-exploration

TechCrunch Mobility: ‘Physical AI’ enters the hype machine
The article from TechCrunch Mobility highlights the growing prominence of "physical AI" or "embodied AI" showcased at the 2026 Consumer Electronics Show (CES) in Las Vegas. With traditional U.S. automakers notably absent, the event was dominated by autonomous vehicle technology firms, Chinese automakers, and companies specializing in AI-driven robotics and automotive chips. Physical AI refers to AI systems integrated with sensors, cameras, and motor controls that enable machines—such as humanoid robots, drones, and autonomous vehicles—to perceive and interact with the physical world. Hyundai, for example, featured a range of robots, including those from its subsidiary Boston Dynamics, and innovations like an autonomous vehicle charging robot and a four-wheel electric platform called Mobile Eccentric Droid (MobEd), set for production in 2026. The enthusiasm around humanoid robots was significant, with industry leaders like Mobileye’s Amnon Shashua acknowledging the hype but affirming the long-term reality and potential of humanoid robotics despite
Tags: robot, autonomous-vehicles, physical-AI, embodied-AI, robotics, electric-vehicles, sensors

UK flies first autonomous helicopter with over one-tonne payload
The UK achieved a significant milestone in aviation with the maiden flight of Proteus, the country’s first fully autonomous full-size helicopter, at Predannack airfield in Cornwall. Developed by Leonardo for the Royal Navy, Proteus is a technology demonstrator designed to operate alongside crewed aircraft within a future hybrid air wing. Unlike smaller drones currently in service, Proteus matches the scale and capability of conventional helicopters, carrying payloads exceeding one tonne and capable of operating in challenging maritime conditions. Its advanced sensors and onboard computers enable real-time environmental assessment and autonomous decision-making, reducing risks to personnel and freeing crewed helicopters for other tasks. Built in Yeovil at a cost of £60 million, the Proteus programme supports around 100 skilled UK jobs and represents a major step in British helicopter innovation. The helicopter plays a central role in the UK’s Atlantic Bastion strategy to defend the North Atlantic and NATO allies through advanced hybrid forces. It can support anti-submarine warfare, maritime patrol, and surveillance missions
Tags: robot, autonomous-helicopter, military-aviation, sensors, control-systems, Leonardo, unmanned-aerial-vehicle

Photos: 1,044 marine animal observations analyzed for tidal turbine collision risks
Researchers in Washington State conducted a 141-day study using a small cross-flow tidal turbine equipped with optical cameras and sensors to assess collision risks between marine animals and tidal energy infrastructure. Over 109 days of optical monitoring, they recorded 1,044 observations of fish, seabirds, and seals. Notably, no collisions were observed involving seabirds or seals; seabirds appeared only during daylight when the turbine was stationary, while seals were present day and night, including when the turbine was rotating. Four fish collisions with turbine blades were documented, with most fish successfully avoiding contact, even at water flow speeds exceeding 2 m/s. This study is the first in North America to use optical imagery to directly observe interactions between marine wildlife and tidal turbines, moving beyond theoretical risk assessments. Data collection employed scheduled recordings and real-time sensor-triggered footage, refined during the study to improve detection. Machine learning models were tested for identifying animals underwater, revealing challenges such as differentiating wildlife from organic matter. The behavioral insights
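The two capture modes mentioned, scheduled recordings plus real-time sensor-triggered footage, amount to a simple decision rule. This is a minimal sketch under assumed parameters; the study's actual duty cycle and trigger hardware are not specified here.

```python
# Minimal sketch of dual-mode capture: record on a fixed schedule, or
# immediately when an optical/acoustic detector fires. The one-hour
# interval is a placeholder, not the study's configuration.

from datetime import datetime, timedelta

SCHEDULED_EVERY = timedelta(hours=1)  # hypothetical scheduled-recording interval

def should_record(now: datetime, last_scheduled: datetime,
                  detector_triggered: bool) -> bool:
    """True when a detection fires or the scheduled interval has elapsed."""
    return detector_triggered or (now - last_scheduled) >= SCHEDULED_EVERY

t0 = datetime(2024, 5, 1, 12, 0)
print(should_record(t0 + timedelta(minutes=10), t0, detector_triggered=True))   # trigger fires
print(should_record(t0 + timedelta(minutes=10), t0, detector_triggered=False))  # too soon
print(should_record(t0 + timedelta(hours=2), t0, detector_triggered=False))     # schedule due
```

Refining when this rule fires (and what counts as a detection) is exactly where the study's machine learning models ran into trouble distinguishing wildlife from drifting organic matter.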
Tags: energy, tidal-turbines, marine-wildlife, renewable-energy, sensors, optical-cameras, environmental-monitoring

Luminar lines up $22 million bidder for its lidar business
Luminar, a lidar technology company that filed for Chapter 11 bankruptcy in December 2025, has agreed to sell its lidar business to Quantum Computing Inc. for $22 million, subject to higher bids by a deadline on Monday. This sale follows Luminar’s plan to sell its semiconductor subsidiary to the same buyer for $110 million. Both transactions require approval from the bankruptcy court in the Southern District of Texas. Quantum Computing Inc. has been named the “stalking horse bidder,” setting a minimum price to discourage low offers. Luminar aims to expedite the bankruptcy process with support from its largest creditors, primarily financial institutions. The $22 million stalking horse bid marks a dramatic decline from Luminar’s peak valuation of approximately $11 billion in 2021, a period when the company was expected to secure large-scale contracts with automakers like Volvo, Mercedes-Benz, and Polestar—deals that eventually fell through. Austin Russell, Luminar’s founder and former CEO, has shown interest in bidding
Tags: robot, lidar, autonomous-vehicles, sensors, quantum-computing, bankruptcy, automotive-technology

ClearX shoe-cleaning robot uses sensors to wash and dry footwear
Brolan is introducing ClearX, an intelligent shoe-cleaning robot unveiled at CES 2026, designed to automate the process of washing, drying, and optionally sanitizing footwear. ClearX uses built-in sensors to analyze shoe material and dirt levels, automatically selecting an appropriate cleaning method. Unlike traditional approaches, it employs micro-bubble cleaning technology that cleans shoes using only water, avoiding harsh detergents. The drying process is low-temperature and gentle to protect delicate materials, while leather shoes are cleaned with a specialized mechanical method involving a water-soaked roller instead of direct water exposure. The system is designed for everyday home use and is compatible with most shoe types that can safely contact water. It features two 40-liter tanks to separate clean and dirty water, enabling water recycling and reducing waste, with each cleaning cycle using about one liter of water per pair of shoes. ClearX can operate as a standalone portable unit or connect directly to a water hose. Although not yet commercially available, Brolan plans
Tags: robot, automation, sensors, shoe-cleaning, micro-bubble-technology, smart-home-device, water-efficiency

Waymo Zeekr A No-Go, Waymo Ojai A Go - CleanTechnica
The article discusses Waymo’s ongoing development and deployment of its robotaxi fleet, focusing on the transition from the initially revealed Zeekr-based vehicle to the newly branded Waymo Ojai. While the Zeekr robotaxi was anticipated for some time, Waymo has decided not to emphasize the Chinese EV supplier’s name in its branding, instead naming the vehicle "Ojai" after a Californian village known for arts and wellness. The Ojai robotaxi, showcased at CES 2026, includes a steering wheel to comply with U.S. regulations and is equipped with an array of sensors—13 cameras, 4 lidar, 6 radar, and external audio receivers—along with innovative features like tiny sensor wipers. The vehicle is designed to greet passengers with a personalized “Oh, hi,” reflecting a friendly user experience. Despite significant progress and rapid scaling of Waymo’s robotaxi operations, some technical challenges remain, particularly with sensor interpretation around bridges. The article highlights incidents in Venice,
Tags: robot, autonomous-vehicles, Waymo, robotaxi, sensors, lidar, AI

Waymo is rebranding its Zeekr robotaxi
Waymo is rebranding its Zeekr RT robotaxi as the Ojai, named after a California village known for its arts and wellness focus. This change comes as the vehicle, developed in partnership with Chinese automaker Zeekr, prepares to join Waymo’s commercial fleet. The rebranding aims to improve U.S. market familiarity, as the Zeekr name is relatively unknown domestically and may impact rider experience. The Ojai robotaxi will feature a personalized greeting (“Oh hi”) for passengers, reflecting the new name’s pronunciation. The Ojai has undergone several years of development and testing in cities like Phoenix and San Francisco, evolving from a concept vehicle without a steering wheel to a refined model showcased at CES 2026 that includes one. It retains its advanced sensor suite—13 cameras, four lidar units, six radar sensors, and external audio receivers—though its paint color has shifted from a blueish tint to a more silver hue. Currently, Waymo employees and their acquaintances
Tags: robot, autonomous-vehicles, robotaxi, Waymo, sensors, lidar, transportation-technology

How Quilt solved the heat pump’s biggest challenge
Quilt, a smart home startup, has launched a three-zone heat pump system that significantly improves efficiency under challenging conditions that typically hinder traditional heat pumps. This new system features one outdoor unit capable of driving three indoor heads, simplifying large installations, reducing costs, and minimizing the outdoor footprint. Central to Quilt’s innovation is its extensive use of data collected from over a thousand internet-connected units equipped with numerous sensors. By analyzing real-world operational data, Quilt engineers enhanced heat pump capacity by 20% through a software update and developed the three-zone unit that overcomes common multi-zone heat pump challenges, particularly maintaining compressor stability at low speeds. Unlike most systems that sacrifice efficiency and comfort by stopping the compressor at low speeds, Quilt’s data-driven approach and design improvements—including a larger copper coil paired with a smaller compressor—allow the unit to operate efficiently even at very low temperatures (down to -13°F/-25°C). This design delivers nearly 90% of its rated capacity without losing low-demand performance
Tags: energy, heat-pump, smart-home, IoT, HVAC, energy-efficiency, sensors

Mercedes Launches Parking Lot to Destination Driver Assist in USA - CleanTechnica
Mercedes has introduced its MB.DRIVE ASSIST PRO, an SAE Level 2 driver-assist system, in the United States starting with the new electric CLA model. This technology integrates advanced driver assistance with navigation, enabling the vehicle to assist with driving from parking lots to destinations in city environments. The system features a cooperative steering approach that allows steering adjustments without deactivating the assistance, enhancing safety and convenience. The MB.DRIVE ASSIST PRO leverages a sophisticated sensor suite comprising 30 sensors, including 10 cameras, 5 radar sensors, and 12 ultrasonic sensors, feeding data into a powerful NVIDIA AI-powered supercomputer capable of 508 TOPS (trillions of operations per second). Developed in partnership with NVIDIA, the system uses full-stack software to deliver its capabilities. Notably, this technology was first launched in China at the end of 2023, ahead of its U.S. rollout in 2024. While its performance relative to Tesla’s Full Self-
Tags: robot, autonomous-driving, driver-assist-technology, sensors, AI, NVIDIA, electric-vehicles

The Future of War Was Built in 2025 — Here’s What You Missed
The article highlights how 2025 marked a transformative year in military strategy and technology, emphasizing that modern warfare extends far beyond individual weapons to encompass entire systems of production, logistics, and software-driven adaptability. Key examples include the U.S. Navy’s reintroduction of the USS Utah as a Virginia-class fast-attack submarine, reflecting a shift from traditional naval power based on visibility and mass to stealth, advanced sensors, and precision strikes. Concurrently, investments in Columbia-class ballistic missile submarines underscore the continued importance of stealthy deterrence platforms. Beyond platforms, 2025 revealed a growing focus on industrial geography and rapid reinforcement capabilities, exemplified by efforts to develop Subic Bay as a forward shipbuilding and logistics hub in the Indo-Pacific. Technological advances such as high-power microwave weapons to counter drone swarms, robotic ground systems integrated with soldiers, and enhanced sensors challenging conventional stealth illustrate a battlefield increasingly dominated by software and electromagnetic warfare. Airpower is evolving in two directions—toward autonomous,
Tags: robot, energy, materials, sensors, directed-energy-weapons, military-technology, autonomous-systems

Amazon’s Ring doorbells get fire alerts, an app store and new sensors
Amazon has introduced several new features and devices to enhance its Ring smart doorbell ecosystem. The updates include new Ring Sensors capable of detecting motion, openings, glass breakage, smoke, carbon monoxide, leaks, temperature changes, and air quality, while also allowing control over connected lighting and appliances. Additionally, Amazon is launching an app store within the Ring app (initially in the U.S.) to enable integration with third-party apps focused on small business operations and everyday home needs. To address increasing fire risks, especially in drought-affected areas, Amazon has partnered with an unspecified entity to provide real-time fire updates and early warnings through the Ring app’s Neighbors section, where users can also share live camera feeds. The new devices support Amazon’s Sidewalk network, which creates a mesh network among Echo and Ring devices to maintain connectivity beyond Wi-Fi range. Furthermore, Ring cameras now feature an AI-based “AI Unusual Event Alerts” system that learns property activity patterns and notifies users of unusual events,
Tags: IoT, smart-home, Ring-doorbell, sensors, AI-alerts, mesh-network, Amazon-Sidewalk

Photos: Lego’s new bricks react to movement and sounds without using screens
At CES 2026, Lego introduced Smart Play, a new system that integrates light, sound, and motion responses into traditional Lego bricks without using screens, preserving the classic hands-on building experience. Central to Smart Play is a standard-looking 2×4 brick embedded with advanced electronics, including a patented ASIC chip, accelerometer, LED array, speaker, and motion sensors. The system also features Smart Tag tiles with unique digital IDs and Smart Minifigures, enabling physical models to react dynamically to movement and proximity, such as a helicopter brick lighting up and playing propeller sounds that vary with motion. Smart Play includes BrickNet, a Bluetooth-based protocol allowing multiple Smart Bricks to communicate and coordinate effects across larger builds, with enhanced encryption and privacy controls. The system requires no setup or pairing, runs on rechargeable internal batteries charged wirelessly, and supports firmware updates via a phone app for ongoing feature additions. The first Smart Play sets, themed around Star Wars, will launch on March 1,
Tags: robot, IoT, smart-toys, Bluetooth, sensors, interactive-play, wireless-charging

LEGO SMART Bricks introduce a new way to build — and they don’t require screens
LEGO has introduced the SMART Play system, a new interactive building experience that does not require screens. The system features SMART Bricks, SMART Tag tiles, and SMART Minifigures that interact through near-field magnetic positioning. SMART Tags are 2×2 studless tiles with unique digital IDs that instruct SMART Bricks and Minifigures how to behave. For example, a helicopter set’s SMART Tag triggers lights and propeller sounds on the SMART Brick, which also uses an accelerometer to respond dynamically to movement, enhancing play realism. The SMART Bricks contain a patented ASIC chip smaller than a LEGO stud, equipped with a miniature speaker, accelerometer, and LED array. LEGO has developed a Bluetooth-based protocol called BrickNet, enabling multiple SMART Bricks to communicate securely with encryption and privacy controls. The system requires no setup or pairing, making it easy for children to use and appealing to parents due to its screen-free design. LEGO’s first SMART Play sets, both Star Wars-themed,
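Conceptually, a SMART Tag's digital ID selects an effect profile and the accelerometer modulates it, for instance, propeller sound swelling as the brick is swooshed. The sketch below illustrates that dispatch pattern; the tag IDs, profiles, and scaling are invented, since LEGO's actual protocol is not public.

```python
# Invented illustration of tag-driven behavior: tag ID -> effect profile,
# with accelerometer magnitude scaling the effect intensity.

import math

EFFECT_PROFILES = {
    0x21: {"sound": "propeller", "base_volume": 0.4},   # hypothetical helicopter tag
    0x22: {"sound": "lightsaber", "base_volume": 0.6},  # hypothetical Star Wars tag
}

def accel_magnitude(ax: float, ay: float, az: float) -> float:
    return math.sqrt(ax * ax + ay * ay + az * az)

def effect_for(tag_id: int, accel: tuple) -> dict:
    profile = EFFECT_PROFILES.get(tag_id)
    if profile is None:
        return {"sound": None, "volume": 0.0}  # unknown tag: stay silent
    g = accel_magnitude(*accel)
    # Scale volume with motion, capped at full volume. 9.8 m/s^2 = gravity.
    volume = min(1.0, profile["base_volume"] * (1.0 + g / 9.8))
    return {"sound": profile["sound"], "volume": round(volume, 2)}

print(effect_for(0x21, (0.0, 0.0, 9.8)))   # brick at rest
print(effect_for(0x21, (12.0, 3.0, 9.8)))  # brick being swooshed
```

The "no setup or pairing" property follows naturally from this design: the tag ID alone carries enough information to select a behavior, so no app-side configuration is needed.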
Tags: IoT, smart-toys, Bluetooth, sensors, accelerometer, near-field-communication, encryption

China’s Hesai will double production as lidar sensor industry shakes out
Chinese lidar manufacturer Hesai plans to double its production capacity from 2 million to 4 million units in 2026, aiming to dominate the global lidar sensor market. This expansion follows Hesai surpassing 1 million units in 2025 and is driven by growing demand in the automotive and robotics sectors. Hesai highlighted that lidar sensors are now installed in 25% of new electric vehicles in China, with many cars integrating three to six sensors each, significantly broadening the company’s market potential. Hesai currently serves 24 automotive customers, including a leading European automaker, and has secured orders for 4 million units of its latest ATX lidar sensor. Hesai’s growth contrasts with the recent bankruptcy of U.S. lidar maker Luminar, which struggled due to failed automotive partnerships and intense price competition from Chinese manufacturers like Hesai. Luminar’s bankruptcy filings cited cost pressures from lower-priced Chinese competitors as a key factor in its downfall. Hesai has also contributed to a 99.5%
Tags: lidar, robotics, autonomous-vehicles, sensors, electric-cars, automotive-technology, robotics-industry

Photos: New forehead glasses help blind people navigate without guide dogs
The assistive technology startup .lumen is set to unveil innovative forehead-worn glasses at CES 2026 that help blind people navigate independently without guide dogs. These glasses combine cameras, sensors, artificial intelligence, and robotics to create a “virtual guide dog” experience. By continuously scanning and building a 3D map of the user’s surroundings—including walls, doors, stairs, sidewalks, and moving objects—the device provides real-time navigation support. It tracks both the user’s movement and environmental changes, dynamically adjusting guidance to ensure safe travel through streets, buildings, and public spaces. The glasses use haptic feedback via vibrations on the forehead to replace the traditional leash signals from guide dogs, directing users to turn or proceed safely. They also incorporate sound cues to enhance spatial awareness. The AI processes environmental data over 100 times per second to rapidly identify safe paths and hazards, such as traffic or stairs. Designed to be scalable and more affordable than guide dogs—which are limited in number and costly to train—the
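The haptic steering described, vibrations on the forehead replacing a guide dog's leash signals, reduces to mapping the error between the planned path bearing and the user's heading onto left/right cues. This is a hypothetical sketch of that mapping; .lumen's actual guidance logic and thresholds are not published here.

```python
# Invented illustration: signed heading error (degrees, wrapped to
# [-180, 180)) selects which forehead vibration motor to pulse.

def steer_cue(path_bearing_deg: float, user_heading_deg: float,
              deadband_deg: float = 10.0) -> str:
    """Return 'left', 'right', or 'none' to keep the user on the path."""
    # Wrap the error so 350 deg vs 20 deg reads as -30, not +330.
    error = (path_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= deadband_deg:
        return "none"  # on course: stay silent
    return "right" if error > 0 else "left"

print(steer_cue(90.0, 80.0))   # within deadband
print(steer_cue(90.0, 40.0))   # needs a right turn
print(steer_cue(350.0, 20.0))  # wraparound case: slight left turn
```

In the real device this decision would be re-evaluated continuously against the 3D map (the article cites over 100 environment updates per second), with the deadband preventing constant buzzing when the user is roughly on course.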
Tags: robotics, wearable-technology, assistive-technology, artificial-intelligence, sensors, 3D-mapping, haptic-feedback

Can AI fix the operating room? This startup thinks so
The article discusses a significant yet often overlooked issue in healthcare: inefficiencies in operating room (OR) coordination, which result in hospitals losing two to four hours of OR time daily. This lost time is not due to the surgeries themselves but stems from manual scheduling, coordination challenges, and uncertainty around room turnover. Addressing this problem, the startup Akara has developed an AI-driven solution likened to "air traffic control" for hospitals, utilizing thermal sensors and artificial intelligence to optimize OR operations. Akara's innovative approach has gained notable recognition, including being named one of Time’s Best Inventions of 2025. The company aims to streamline the complex logistics of OR management, reducing wasted time and associated costs for hospitals. The article highlights a conversation on TechCrunch’s Equity podcast between AI Editor Russell Brandom and Akara’s CEO Conor McGinn, emphasizing the practical impact of AI in healthcare beyond the typical hype around robots and automation.
Tags: robot, AI, healthcare-technology, operating-room-optimization, sensors, automation, hospital-management

Ukraine's landmine crisis is driving a new wave of demining technology
The article highlights the critical challenge Ukraine faces in reclaiming land contaminated by landmines and unexploded ordnance following the large-scale conflict. Vast areas of farmland, residential zones, and infrastructure corridors remain unsafe, posing ongoing risks to civilian lives and hindering economic recovery and reconstruction efforts. Demining is essential for restoring agriculture, enabling displaced families to return, and allowing infrastructure repairs to proceed. Ukraine’s situation is unprecedented in modern Europe, with delays in clearance directly causing economic losses, prolonged displacement, and casualties. To address this, a new wave of “post-conflict recovery technology” is emerging, integrating drones, advanced sensors, robotics, and data processing to improve the speed, safety, and efficiency of mine detection. Broswarm, a Lithuanian startup led by CEO Ernestas Zvaigzdinas, is developing a drone-mounted synthetic-aperture radar system designed to detect buried threats, including plastic mines that traditional metal detectors miss. After extensive testing of various technologies such as drone-mounted metal detectors
robotics, drones, synthetic-aperture-radar, demining-technology, sensors, post-conflict-recovery-tech, defense-technology
Photos: World’s first robotic chessboard autonomously moves pieces without human touch
The Phantom Chessboard, launched in late 2025, is the world’s first robotic chessboard that autonomously moves pieces without any human touch. Crafted entirely from solid walnut and maple, it combines traditional woodworking with advanced robotics to create an elegant, heirloom-quality chess experience. The board features a patented layered architecture concealing a magnetic sensor grid and ultra-quiet linear actuators that move pieces silently at under 18 decibels, avoiding the mechanical noise common in previous robotic chessboards. Its design maintains the appearance of a classic chessboard with no visible motors or plastic components. Phantom easily connects via Bluetooth to a companion app, enabling seamless integration with popular online chess platforms like Lichess and Chess.com. It supports online matches by physically replicating moves on the board and offers AI opponents such as Stockfish and Maia for varied skill levels. A unique Sculpture Mode allows autonomous replay of historic or personal games, enhancing learning and enjoyment through a tactile, visual experience. The technology is protected by
robotics, robotic-chessboard, autonomous-systems, sensors, actuators, Bluetooth-connectivity, AI-integration
'World’s most advanced' robotic hand pairs vision and touch sensing
Sharpa Robotics has advanced its flagship robotic hand, SharpaWave, into mass production, marking a significant milestone in the general-purpose robotics market. Headquartered in Singapore, the company has implemented a rolling production process supported by automated testing systems to ensure the durability and reliability of thousands of microscale gears, motors, and sensors within each hand. Initial shipments began in October, with a broader launch planned at CES 2026. SharpaWave is designed to match human hand size and dexterity while providing exceptional strength and precision, attracting early orders from global technology firms. The SharpaWave hand features 22 active degrees of freedom and integrates proprietary Dynamic Tactile Array technology, combining miniature cameras with over 1,000 tactile pixels per fingertip to deliver visuo-tactile sensing capable of detecting forces as small as 0.005 newtons. This enables six-dimensional force sensing for adaptive grip control and slip prevention, allowing the hand to manipulate both delicate and heavy objects intelligently. Sharpa Robotics
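The summary describes six-dimensional force sensing used for adaptive grip control and slip prevention. A common textbook approach to slip prevention — offered here as an illustrative sketch, not Sharpa's actual controller — compares the tangential load at a fingertip against the friction-cone limit set by the normal (grip) force, and tightens the grip when the margin shrinks. The friction coefficient `mu` and safety `margin` below are assumed values:

```python
def grip_adjustment(f_normal, f_tangential, mu=0.5, margin=0.8):
    """Friction-cone slip check: if the tangential load approaches
    mu * f_normal, the object is about to slip, so increase grip.

    mu and margin are illustrative values, not Sharpa specifications."""
    if f_normal <= 0:
        return float("inf")  # no contact: clamp fully
    limit = mu * f_normal  # max tangential force friction can hold
    if f_tangential > margin * limit:
        # extra normal force needed to restore the safety margin
        return f_tangential / (margin * mu) - f_normal
    return 0.0  # current grip is sufficient

# A 2 N tangential pull against only 1 N of grip demands more normal force:
extra = grip_adjustment(f_normal=1.0, f_tangential=2.0)
```

With per-fingertip force resolution around 0.005 N, as cited above, a controller running this kind of check can react to incipient slip before the object visibly moves.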
robotics, robotic-hand, tactile-sensing, automation, dexterous-manipulation, sensors, industrial-robots
Photos: Marines’ mobile air defense drone killer passes live-fire validation test
The Marine Corps has officially introduced the first full-rate production version of the Marine Air Defense Integrated System (MADIS), marking a significant advancement in expeditionary air defense capabilities. Following intensive New Equipment Training (NET) and a live-fire exercise at the Marine Corps Air Ground Combat Center, MADIS transitions from prototype to active deployment. The system employs two Joint Light Tactical Vehicles (JLTVs) to deliver mobile short-range air defense (SHORAD) against unmanned aerial systems (UAS) and manned aircraft, capable of operating both stationary and on the move without external support. The production variant of MADIS features technical enhancements such as integrated sensors, updated targeting algorithms, and improved mobility, enabling faster detection and tracking of aerial threats. Its modular design supports future hardware and software upgrades, ensuring adaptability to evolving threats. Marines underwent rigorous training involving classroom instruction and field exercises, culminating in a live-fire event that validated the system’s operational readiness. The rollout of MADIS aligns with the Marine Corps’
robot, drone, mobile-air-defense, sensors, targeting-algorithms, military-technology, unmanned-aerial-systems
How Luminar’s doomed Volvo deal helped drag the company into bankruptcy
In early 2023, Luminar Technologies appeared poised for success, having secured major automotive customers including Volvo, Mercedes-Benz, and Polestar for its lidar sensors designed to enhance vehicle safety and autonomy. Volvo, a longstanding advocate for vehicle safety, initially committed to purchasing 39,500 sensors in 2020, then increased its order to 673,000 in 2021, and further to 1.1 million sensors in 2022. Luminar invested heavily—around $200 million—in manufacturing capabilities, including a new facility in Monterrey, Mexico, to fulfill Volvo’s large orders, particularly for the EX90 SUV. However, the relationship with Volvo deteriorated significantly by 2024. Volvo delayed the EX90 launch for additional software development and subsequently cut its sensor volume forecast by 75%. Other key partnerships also faltered: Polestar abandoned Luminar’s lidar integration due to software incompatibilities, and Mercedes-Benz terminated its sensor agreement in late 2024, citing unmet requirements
robot, lidar, autonomous-vehicles, automotive-technology, sensors, bankruptcy, manufacturing
Whole Foods to install smart food waste bins from Mill starting in 2027
Whole Foods plans to install smart food waste bins from the startup Mill in its produce departments nationwide starting in 2027. Mill, which has raised $250 million to date and received investment from Amazon’s Climate Pledge Fund (amount undisclosed), aims to tackle the significant issue of food waste in grocery stores. In the U.S., approximately 10% of all food—around 43 billion pounds annually—is discarded at grocery stores, representing both a lost economic opportunity and an increased carbon footprint. The smart bins developed by Mill are equipped with sensors to collect data on food waste, enabling Whole Foods to better understand and reduce produce waste. After collecting the waste, the bins dehydrate and grind it, converting the byproduct into chicken feed. This feed will then be supplied to Whole Foods’ private label egg producers, creating a circular system that minimizes waste and supports sustainability efforts within the supply chain.
IoT, smart-bins, food-waste-management, sensors, sustainability, data-analytics, climate-pledge
Lidar-maker Luminar files for bankruptcy
Lidar company Luminar has filed for Chapter 11 bankruptcy protection following a challenging year marked by executive departures, significant layoffs, and legal disputes. The company plans to sell its lidar business during the bankruptcy process and has already arranged to sell its semiconductor subsidiary. Despite continuing operations to minimize disruption for suppliers and customers, Luminar will ultimately cease to exist once the bankruptcy proceedings conclude. CEO Paul Ricci emphasized that a court-supervised sale is the best path forward after a thorough review of alternatives. Luminar’s troubles intensified after founder Austin Russell resigned as CEO amid an ethics inquiry but remained on the board and later launched a new venture, Russell AI Labs, while attempting to buy Luminar. The company faced a 25% workforce reduction, the departure of its CFO, loan defaults, an SEC investigation, and eviction lawsuits. A major setback occurred when Volvo, Luminar’s largest customer and early investor, canceled a five-year contract, prompting Luminar to take legal action. The company also faces legal claims
robot, lidar, autonomous-vehicles, sensors, bankruptcy, technology, automotive-technology
China's humanoid robot handles rough terrain with human-like motion
Chinese robotics company LimX Dynamics has introduced significant advancements in its full-size humanoid robot, Oli, demonstrating impressive human-like mobility across challenging terrains such as loose sand, rocks, unstable boards, and debris. Equipped with 31 finely tuned joints and a sophisticated perception system—including depth cameras and a motion-tracking unit—Oli continuously processes environmental data to maintain balance and adapt its movements in real time. During tests, the robot successfully compensated for shifting surfaces and obstacles, adjusting its gait dynamically to stay upright and stable without hesitation. Additional capabilities like object pickup and full-body stretching suggest practical applications in navigating cluttered or uneven environments and performing complex tasks. Oli, standing 165 centimeters tall and weighing 55 kilograms, features 31 degrees of freedom that enable fine motor skills through interchangeable end-effectors. Its modular design supports rapid disassembly and component swapping, facilitating accelerated research and development. The robot’s mobility is powered by high-fidelity sensors—including a 6-axis IMU, Intel RealSense depth
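The summary notes that Oli fuses IMU and depth-camera data to keep its balance on shifting ground. A standard building block for this kind of state estimation — shown here as a generic sketch, not LimX's actual estimator — is the complementary filter, which blends integrated gyro rate (accurate short-term, drifts long-term) with the accelerometer's gravity-derived tilt (noisy but drift-free). The blend weight `alpha` is an assumed value:

```python
import math

def complementary_filter(pitch, gyro_rate, accel_z, accel_x, dt, alpha=0.98):
    """One update step of a complementary filter for pitch estimation.

    Blends the gyro-integrated angle with the accelerometer's
    gravity-based tilt. alpha is an illustrative weight, not a
    LimX parameter."""
    gyro_pitch = pitch + gyro_rate * dt          # integrate angular velocity
    accel_pitch = math.atan2(accel_x, accel_z)   # tilt from gravity vector
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Stationary, level robot: gyro reads 0, gravity along z, pitch stays ~0
p = complementary_filter(pitch=0.0, gyro_rate=0.0,
                         accel_z=9.81, accel_x=0.0, dt=0.01)
```

Production humanoids typically use richer estimators (e.g., Kalman filters over the full 6-axis IMU), but the same short-term/long-term fusion principle applies.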
robotics, humanoid-robot, motion-control, sensors, autonomous-navigation, modular-design, artificial-intelligence
‘End-to-end encrypted’ smart toilet camera is not actually end-to-end encrypted
Kohler’s smart toilet camera, Dekoda, which captures images of users’ toilet bowls to analyze gut health, has been marketed as using “end-to-end encryption” to secure user data. However, security researcher Simon Fondrie-Teitler revealed that Kohler’s claim is misleading. The company actually employs TLS encryption, which protects data during transmission over the internet but does not provide true end-to-end encryption where only the communicating users can access the data. This distinction is critical because users might mistakenly believe Kohler cannot access their images, when in fact the company can decrypt and process the data on its servers. Kohler’s privacy contact confirmed that user data is encrypted at rest on devices and servers, and encrypted in transit, but is decrypted on Kohler’s systems for analysis. This means Kohler has access to the images, raising concerns about potential use of this data, such as training AI algorithms. The company stated that their algorithms are trained only on de-identified data,
IoT, smart-home, privacy, encryption, smart-toilet, data-security, sensors
'Human washing machine' that cleans head to toe in 15 mins hits Japan
Japanese company Science Inc. has launched the "Mirai Human Washing Machine," a high-tech spa pod that automatically cleans, rinses, and dries users from head to toe in about 15 minutes. The machine uses microbubbles to penetrate pores and remove oil, dirt, and dead skin, a technology already popular in Japanese baths and salons. Users recline inside the pod, which also monitors vital signs to ensure safety and plays relaxing music during the process. The pod is spacious enough for most people and represents a modern revival of a concept first introduced at the 1970 Osaka Expo. Priced at approximately 60 million yen ($385,000), the device targets luxury commercial spas, high-end hotels, onsens, and resorts rather than typical households. Production is limited to 40-50 hand-built units, with some already reserved by clients. Beyond its immediate use as a cleaning device, the machine reflects Japan’s broader interest in automation and robotic care, especially for its aging population, serving
robot, automation, spa-technology, sensors, elderly-care, microbubbles, Japanese-innovation
New humanoid robot head with sensory awareness, interactive ability
German semiconductor company Infineon Technologies AG and AI engineering firm HTEC have jointly unveiled a humanoid robotic head featuring 360-degree multi-sensory awareness at OktoberTech™ Silicon Valley 2025. The prototype integrates advanced sensing technologies—including Infineon’s XENSIV™ 60 GHz radar for spatial awareness, REAL3™ Time-of-Flight depth sensors, and XENSIV™ MEMS microphones for audio recognition—combined with onboard cameras and embedded AI software. This fusion enables the robot head to detect human presence, identify sound direction, orient itself accordingly, and analyze visual input, thereby creating a seamless, human-like perception of its environment. The project demonstrates how blending cutting-edge hardware with AI intelligence can push the boundaries of robotic perception and interaction. Built on standard embedded platforms, the system is designed for easy integration into various commercial and industrial robotics applications such as eldercare robots, autonomous delivery systems, smart home devices, and security robots. Although still a prototype, the humanoid head received
robot, robotics, AI, sensors, humanoid-robot, IoT, embedded-systems
'World's most powerful humanoid robot' aces backflip like parkour pro
Chinese robotics company PHYBOT has unveiled its new full-sized electric humanoid robot, the M1, which demonstrated impressive acrobatic ability by performing a standing backflip and nearly executing a perfect superman landing in a single take. PHYBOT markets the M1 as the "most powerful humanoid robot ever created," emphasizing its high torque density as a key advantage over competitors. Standing 172 cm tall and weighing under 60 kg, the M1 is equipped with a 72-volt power system, Jetson Orin and Intel Core i7 processors, and sensors including 3D LiDAR, stereo cameras, and an IMU for balance and environmental awareness. The robot can produce bursts of over 10 kilowatts of power, enabling dynamic movements, and its peak joint torque reaches 530 N·m. Designed for real-world applications beyond demonstrations, the M1 can lift 10 to 20 kilograms with its arms and carry over 50 kilograms using a backpack system. It offers
robot, humanoid-robot, robotics, electric-robot, high-torque-density, sensors, AI-powered-robot
Video: Baseball-playing robot swings, hits and catches with pinpoint accuracy
Researchers at the RAI Institute in Cambridge, Massachusetts, have developed two robots capable of playing baseball with human-like speed and precision. These robots can throw, catch, and hit baseballs, demonstrating advanced reflexes and control. The system uses soft joints and adaptive control to absorb impact forces, allowing smooth and safe interactions with the ball. Sensors track the ball’s trajectory, while prediction algorithms enable the robots to position their gloves accurately and respond almost instantly to throws. The robots can throw at speeds up to 70 mph and catch or hit balls thrown at speeds up to 41 mph and 30 mph respectively, from distances around 7 meters. The robots’ arms are designed with lightweight materials like carbon fiber rods and flexible joints, combined with electric actuators for quick, smooth movements. Their low center of gravity enhances stability during swings and catches. Using ordinary computer hardware, the system integrates live camera feeds with motion models to guide precise movements. The software continuously adjusts for the ball’s unpredictable spins and curves
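The summary says prediction algorithms let the robots position their gloves before the ball arrives. The simplest version of that idea — a drag-free ballistic predictor, which the RAI system refines further with online corrections for spin and curve — estimates where and when the ball will cross a vertical catch plane. This is a minimal sketch, not the institute's published method:

```python
def predict_intercept(p0, v0, plane_x, g=9.81):
    """Predict when and at what height a ball crosses the plane x = plane_x.

    p0, v0: (x, y) position [m] and velocity [m/s] at release.
    Assumes drag-free projectile motion; spin and drag corrections,
    which the real system applies continuously, are omitted."""
    x0, y0 = p0
    vx, vy = v0
    t = (plane_x - x0) / vx            # time to reach the catch plane
    y = y0 + vy * t - 0.5 * g * t * t  # height at that time
    return t, y

# Ball released 7 m away at ~18 m/s horizontally with slight upward angle:
t, y = predict_intercept(p0=(0.0, 1.5), v0=(18.0, 2.0), plane_x=7.0)
```

At the roughly 7-meter distances cited above, flight times are well under half a second, which is why the glove must start moving from the first few camera frames rather than waiting for a full trajectory.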
robotics, robots, robotic-arms, artificial-intelligence, automation, sensors, electric-actuators
US Army’s Apache V6 can reliably hunt drones using existing sensors, weapons, shows tests
The U.S. Army has successfully tested the AH-64E Apache Version 6 (V6) helicopter as an effective counter-drone platform using its existing sensors and weapons systems. In high-tempo trials, the Apache V6 achieved 13 kills out of 14 drones, demonstrating a high success rate against one of the modern battlefield’s most pressing threats—small drones. The tests utilized the Apache’s current arsenal, including Joint Air-to-Ground Missiles (JAGM), Hellfire missiles, Advanced Precision Kill Weapon System (APKWS) laser-guided rockets, and its 30mm chain gun, showing that no major upgrades are necessary for effective anti-drone operations. The Apache V6 variant is particularly suited for this role due to its advanced Longbow radar, improved electro-optical and infrared sensors, and Link 16 networking capabilities, which allow it to share and receive targeting data across multiple platforms. Additionally, it incorporates L3 Harris’ manned–unmanned teaming
robot, drones, military-technology, sensors, weapons-systems, unmanned-systems, defense-technology
When the fastest driver has no pulse
The Abu Dhabi Autonomous Racing League (A2RL) recently showcased the world’s first extreme autonomous motorsport series at Yas Marina Circuit, where driverless race cars equipped with advanced AI algorithms competed at speeds up to 185 mph. Notably, Italy’s Unimore Racing team achieved a 58.87-second lap time during qualifiers, surpassing professional human drivers for the first time on this track. This milestone highlights the rapid advancement of autonomous driving technology, demonstrating that AI can now perform complex, high-speed maneuvers traditionally reserved for human racers. Beyond racing, these developments have broader implications for autonomous navigation in urban delivery and air traffic management. The A2RL cars are based on Japan’s Super Formula SF23 chassis, modified to replace the driver with approximately 143 lbs of sophisticated electronics, including cameras, radars, and LiDAR sensors. These vehicles generate enormous amounts of data—up to 500 gigabytes per lap—to enable real-time perception, planning, and control. The AI systems
robot, autonomous-vehicles, AI-racing, sensors, computer-vision, LiDAR, high-performance-materials
AJAX enters UK service after £5.5B delays and safety issues
The British Army’s AJAX armored reconnaissance vehicle, developed by General Dynamics UK and based on the ASCOD 2 platform, has finally been declared ready for limited service after an eight-year delay and significant cost overruns. Initially contracted in 2010 to replace the aging CVR(T) fleet, the program aimed to deliver 589 vehicles with advanced armament and sensors capable of engaging targets from up to 8 km away. However, only about 165 vehicles have been delivered to date, with full operational capability expected by 2029–2030. Each AJAX unit now costs around £10 million, pushing total program costs between £5.5 billion and £6.3 billion. The program has faced numerous technical and safety challenges, including suspension issues, inability to reverse over certain obstacles, excessive noise, and severe vibration problems that caused hearing damage and nausea among soldiers. These health concerns led to the suspension of prototype trials in 2021 and the implementation of double hearing protection for crews. Despite
robot, military-robotics, armored-vehicles, defense-technology, sensors, reconnaissance-systems, autonomous-systems
World's first AI firefighting system extinguishes oil fires on ships
The Korea Institute of Machinery and Materials (KIMM) has developed the world’s first AI-driven autonomous firefighting system specifically designed to detect and extinguish oil fires aboard naval vessels, even under challenging sea conditions. Unlike traditional systems that flood entire compartments with extinguishing agents, KIMM’s technology uses AI-based fire verification and reinforcement learning to accurately identify real fires and target suppression precisely at the source. This approach minimizes unnecessary damage from false alarms. The system integrates sensors, fire monitors, and a control unit capable of estimating fire location with over 98% accuracy, and can discharge foam up to 24 meters. It has been successfully tested in simulated ship compartments and real-world conditions aboard the ROKS Ilchulbong amphibious assault ship, demonstrating stable operation in waves up to one meter high. Developed by Senior Researcher Hyuk Lee and his team, the system adapts to ship movement using a reinforcement learning algorithm that adjusts nozzle aiming based on six degrees of freedom acceleration data. It
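For context on what aiming a fire monitor involves: before any learned motion compensation, a baseline elevation angle for a given target range follows from the drag-free projectile range formula R = v² · sin(2θ) / g. The sketch below is an illustrative physics baseline only — the exit velocity is an assumed value, and KIMM's system instead learns aim corrections with reinforcement learning from 6-DOF ship-motion data, which this ignores:

```python
import math

def nozzle_elevation(target_range, v_exit=18.0, g=9.81):
    """Elevation angle (radians) that lands a drag-free jet at target_range.

    Inverts R = v^2 * sin(2*theta) / g. v_exit is an illustrative
    exit velocity, not a KIMM specification."""
    max_range = v_exit ** 2 / g  # range at the optimal 45-degree elevation
    if target_range > max_range:
        raise ValueError(f"target beyond max range {max_range:.1f} m")
    return 0.5 * math.asin(g * target_range / v_exit ** 2)

# Aim at a fire 24 m away (the discharge limit cited in the article):
theta_deg = math.degrees(nozzle_elevation(24.0))
```

On a rolling deck, the deck-relative angle that realizes this world-frame elevation changes continuously, which is the part the reinforcement-learning policy handles.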
AI, autonomous-systems, firefighting-technology, robotics, sensors, reinforcement-learning, maritime-safety
Stretchable liquid-metal fibers stretch 10x to power smart clothing
Researchers at EPFL have developed a novel fiber-based electronic sensor that remains fully functional even when stretched over ten times its original length, marking a significant advancement for wearable electronics. The key innovation lies in using a safe, flexible liquid metal alloy of indium and gallium, combined with a thermal drawing process adapted from optical fiber manufacturing. This technique involves creating a large-scale “preform” with a 3D pattern of liquid metal droplets embedded in a soft elastomer matrix, which, when heated and stretched, produces thin fibers with finely tuned electrical properties. This structure allows selective activation of conductive areas within the fiber, resulting in sensors that maintain high sensitivity and conductivity despite extreme stretching. To demonstrate practical applications, the team integrated these fibers into a soft knee brace capable of accurately monitoring joint movements during various activities such as walking, running, and jumping. The fibers’ combination of stretchability, conductivity, and ease of integration makes them promising for smart textiles used in sports, health monitoring, physical rehabilitation, and
materials, wearable-technology, smart-textiles, liquid-metal, stretchable-electronics, sensors, soft-robotics
Waymo is bringing its robotaxis to Las Vegas, San Diego, and Detroit - The Robot Report
Waymo has announced the expansion of its autonomous ride-hailing service to three new U.S. cities: Las Vegas, San Diego, and Detroit. The company began driving its fleet—comprising Jaguar I-PACE and Zeekr RT vehicles equipped with its sixth-generation Waymo Driver—in these cities, with plans to start rider services in San Diego in 2025 and Las Vegas in mid-2026; no timeline was provided for Detroit. This expansion follows Waymo’s recent announcement to launch robotaxi services in London in 2026 and ongoing international testing in Tokyo. Domestically, Waymo currently operates in Phoenix, San Francisco, Los Angeles, and Austin, having driven over 100 million fully autonomous miles and provided more than 10 million paid rides. The company aims to further expand to cities including Miami, Atlanta, Dallas, and Nashville. A significant focus of Waymo’s development is adapting its technology for challenging weather conditions, particularly snow, which none of its current operational cities
robot, autonomous-vehicles, Waymo, robotaxis, self-driving-technology, sensors, autonomous-driving
This Toyota self-driving bubble EV transports kids across town alone
At the Japan Mobility Show 2025, Toyota unveiled Mobi, a fully autonomous electric bubble car designed specifically to transport elementary school children across town without adult supervision. As part of Toyota’s “Mobility for All” initiative, Mobi aims to expand independent travel options for young children by leveraging an AI-driven system that controls navigation, speed, traffic management, and obstacle detection. The vehicle is equipped with multiple sensors and cameras to maintain situational awareness, while an integrated AI assistant named UX Friend communicates with the child passenger, providing instructions and engagement throughout the journey. The Mobi features a distinctive rounded design with a gullwing canopy and high-visibility colors to enhance safety and presence in traffic. Its interior is tailored for single-child occupancy, using comfortable, textured materials to create a secure and inviting environment. Although technical specifications remain undisclosed, the vehicle is described as compact and lightweight, optimized for urban use. However, despite its innovative approach, Mobi faces significant regulatory challenges, as current laws generally
robot, autonomous-vehicles, electric-vehicles, AI, child-transportation, sensors, mobility-technology
Luminar is cutting jobs, losing its CFO, and warning of a cash shortage
Luminar, a lidar sensor manufacturer, has announced a 25% workforce reduction—the second layoff this year—and warned shareholders it will run out of cash by early 2026 without additional funding. The company, which began the year with approximately 580 employees, did not specify the number affected in this latest cut. Luminar also disclosed that its CFO, Thomas Fennimore, will step down on November 13 to pursue other opportunities, with the company emphasizing that his departure is unrelated to any financial disagreements. These developments occur amid founder Austin Russell’s ongoing attempt to buy the company, following his replacement as CEO earlier in the year after an ethics inquiry. Luminar’s financial struggles are linked to weaker-than-expected sales, particularly to major customer Volvo, leading the company to sell sensors at a loss. As of October 24, Luminar held $72 million in cash and marketable securities but faces a burn rate that could deplete funds by the first quarter of 2026. The
robot, lidar, autonomous-vehicles, sensors, automotive-technology, robotics-industry, technology-startups
New modular truck can stretch and shrink body, wheels to fit any cargo
At the Japan Mobility Show 2025, Isuzu and UD Trucks introduced the Vertical Core Cycle Concept, a modular delivery truck prototype featuring a unique vertical frame that allows the vehicle’s body and wheels to expand, shrink, or be swapped out to suit different cargo or passenger needs. This design enables quick transformation between cargo boxes and passenger cabins, allowing a single vehicle to serve multiple roles—such as delivery, logistics transport, or passenger carrying—within the same day. The modular system includes detachable wheels on each cargo module that can move independently, improving load balance and simplifying attachment or removal. The front driving module houses essential systems like sensors, cameras, control units, electric motors, and batteries, functioning as the vehicle’s operational core. The cargo modules are box-shaped with flat surfaces, designed to carry various goods efficiently. Isuzu and UD Trucks are developing mechanical locking joints and electronic connectors to ensure secure and seamless integration between modules while maintaining communication across the vehicle’s sections. Beyond logistics, the concept’s adaptable
robot, IoT, energy, modular-vehicles, electric-motors, sensors, smart-technology, logistics-innovation
New microsensors for nuclear reactors can endure 1,832°F, radiation
Researchers at the University of Maine have developed innovative microelectronic sensors capable of withstanding extreme conditions inside advanced nuclear reactors, including temperatures up to 1,832°F (1,000°C) and intense nuclear radiation. These sensors represent a significant advancement over existing commercial sensors, which cannot operate reliably at such high temperatures. The new microchips can measure critical reactor parameters like power output and neutron flux in real time, enabling faster issue detection and reducing maintenance costs for nuclear power plants, which supply about 20% of U.S. electricity. After two years of development and extensive testing, including a successful week-long trial at Ohio State University’s Nuclear Research Laboratory, the sensors demonstrated stable performance under simultaneous high heat and radiation exposure. The research team plans to enhance the technology further by incorporating wireless connectivity powered solely by interrogation signals, eliminating the need for batteries. This breakthrough aims to overcome current technological barriers in monitoring advanced reactors, such as microreactors, and positions the University of Maine as a leader in
energy, nuclear-reactors, microchips, sensors, high-temperature-electronics, radiation-resistant-materials, power-plant-monitoring
Stellantis teams up with Pony.ai to develop robotaxis in Europe
Automaker Stellantis and Chinese autonomous vehicle firm Pony.ai have entered a non-binding agreement to develop robotaxis for the European market. The partnership will integrate Pony.ai’s self-driving software into Stellantis’s electric medium-size van platform, specifically starting with the Peugeot e-Traveller model equipped with advanced sensors for autonomous driving. Initial testing is set to begin soon in Luxembourg, which serves as Pony.ai’s European headquarters, with plans to expand deployment to other European cities by 2026. This collaboration follows Pony.ai’s recent partnership with Uber to deploy autonomous vehicles in international markets, including Europe and the Middle East, and comes shortly after Pony.ai received an autonomous vehicle testing permit from Luxembourg in April. As Pony.ai aims to grow beyond its established presence in China and increase its footprint in Europe, the company is also pursuing a secondary IPO on the Hong Kong Stock Exchange, complementing its existing Nasdaq listing.
robot, autonomous-vehicles, robotaxis, self-driving-software, electric-vehicles, sensors, mobility-technology
Chinese tanks could soon strike like fighter jets to kill beyond sight
China’s People’s Liberation Army (PLA) is revolutionizing its armored warfare by equipping its new-generation main battle tanks, notably the Type 100, with advanced sensors, artificial intelligence, and networked warfare capabilities. This transformation enables tanks to engage targets beyond visual range, a capability traditionally reserved for air and naval forces. The Type 100 tank integrates optical, infrared, radar sensors, and electronic warfare tools, allowing it to perceive the battlefield with full-circle awareness and coordinate long-range strikes in real time. This marks a significant shift from conventional close-range tank battles to a more sophisticated, information-driven combat approach. The PLA’s recent exercises demonstrated the integration of these tanks with other military branches, including helicopters, rocket launchers, electronic warfare units, and reconnaissance drones, forming a highly coordinated joint force. Military analysts highlight that China’s breakthroughs in miniaturizing radar and communication systems have overcome the challenges of fitting advanced beyond-visual-range capabilities into the limited space and power of ground vehicles. This development
robot, IoT, energy, materials, artificial-intelligence, sensors, networked-warfare
Figure 03 robot tackles household chores with realistic motion
Figure AI has introduced its third-generation humanoid robot, Figure 03, designed to perform household and warehouse tasks with enhanced realism and efficiency. Standing five-foot-six, Figure 03 improves on its predecessor with advanced sensory systems, including cameras that process twice as many frames per second and offer a 60% wider field of view, enabling smoother navigation in complex environments. Each hand features a palm camera and highly sensitive fingertip sensors capable of detecting minimal pressure, allowing delicate handling of objects like glassware. The robot is lighter, smaller, and covered in washable mesh fabric with foam padding for safety, and it supports wireless charging through coils in its feet, providing about five hours of operation per full charge. The robot’s AI, named Helix, integrates vision, language, and movement to learn from human behavior, while upgraded actuators deliver faster, more powerful motion suitable for tasks such as sorting parts and packaging. Audio improvements include a louder speaker and clearer microphone placement, facilitating natural communication without distortion. Figure
Tags: robot, humanoid-robot, AI, robotics, wireless-charging, sensors, automation

Video: Chinese humanoid robot picks up tennis balls like a human
The article highlights a new video from Chinese robotics company LimX Dynamics showcasing their humanoid robot, Oli, autonomously picking up tennis balls with human-like dexterity and balance. Without any remote control or motion-capture assistance, Oli visually tracks and retrieves tennis balls scattered on the floor, demonstrating real-time perception, adaptive locomotion, and precise manipulation. The robot repeatedly collects and deposits the balls into a basket, maintaining stable gait and fluid motion throughout the task, underscoring its advanced embodied intelligence and autonomous capabilities. Oli stands 165 cm tall, weighs 55 kg, and features 31 degrees of freedom, enabling fine motor control and agile movements such as bending, reaching, and grasping. Its modular design supports quick reconfiguration for research and development. Equipped with multi-sensor fusion—including IMUs and Intel RealSense depth cameras—Oli achieves 3D spatial awareness and object recognition critical for dynamic environments. The platform also offers extensive connectivity, development tools, and simulation support to facilitate
Tags: robot, humanoid-robot, autonomous-robot, robotics, motion-planning, sensors, artificial-intelligence

Robot that reads wind direction competes with archers in Korea event
At the 2025 Hyundai Motor Chung Mong-koo Cup Korea Archery Championship held on October 3 in Gwangju, South Korea, a cutting-edge archery robot developed by Hyundai Motor Group competed against the nation’s elite archers. Equipped with advanced sensors, the robot measured wind direction and speed, adjusting its arrow launch angles with millimeter-level precision. Despite a sudden storm that initially disrupted its performance, the robot recalibrated and delivered a streak of perfect 10-point shots. However, the human archers narrowly outscored the robot, with the men’s and women’s teams combining for a 55 to 54 victory in the recurve category, while also outperforming the robot in the compound bow category. The event recreated international tournament conditions to provide realistic domestic practice and served as a test of South Korea’s readiness for upcoming major competitions like the 2026 Asian Games and 2028 Olympics. Hyundai’s innovations extend beyond the robot, including a multi-camera posture analysis system
Tags: robotics, sensors, archery-robot, Hyundai, Boston-Dynamics, precision-technology, robotics-competition

US deploys space sensors to track nuclear explosions frame by frame
The United States has completed deployment of its latest space-based nuclear detonation detection system, the IIIA series of the Global Burst Detection system, which is hosted on GPS satellites and has been monitoring nuclear explosions worldwide for over 60 years. These sensors detect electromagnetic pulses, X-rays, and optical flashes from nuclear detonations, enabling real-time determination of the explosion’s time, location, and yield. Developed jointly by Sandia National Laboratories and Los Alamos National Laboratory, the IIIA series underwent successful calibration and testing after its final launch in May 2025, marking a significant advancement in the US’s nuclear detection capabilities. Looking ahead, the US is preparing to field the next-generation IIIF series starting in 2027, featuring a new core instrument called the Spectral Imaging Geolocation Hyper-Temporal Sensor (SIGHTS). This advanced optical sensor can capture tens of thousands of frames per second at megapixel resolution, allowing for faster and more accurate identification of nuclear events while reducing false positives. The II
Tags: energy, sensors, nuclear-detection, space-technology, satellite-systems, electromagnetic-pulses, national-security

From autonomous running coach to mini-scooter, Trego does it all
The Trego, developed by YUPD and Wooks designers, is an innovative AI-powered autonomous personal vehicle designed to support runners throughout their entire exercise routine. It operates in two main modes: AI Mode and Mobility Mode. In AI Mode, Trego runs alongside the user, using sensors to adapt to their pace and running conditions, helping maintain rhythm and efficiency. Mobility Mode transforms Trego into a mini-scooter with foldable handlebars, footrests, and a built-in seat, allowing users to comfortably travel to and from their running locations without walking. Equipped with a built-in display, Trego provides real-time running metrics such as distance, pace, and calories burned, while also allowing users to input or confirm destinations. Safety is prioritized with front and rear cameras and sensors that detect obstacles, pedestrians, and vehicles, automatically adjusting the device’s path to avoid collisions in both modes. Additionally, Trego features a storage compartment integrated into the seat for securing essentials, and a dedicated docking and
Tags: robot, AI, autonomous-vehicle, personal-mobility, sensors, electric-scooter, smart-device

China’s humanoid robot survives several kicks with 'anti-gravity mode'
Unitree’s G1 humanoid robot has demonstrated impressive resilience and balance in a recent series of physical tests, surviving repeated kicks, shoves, and body blows without being knocked down for good. Central to this capability is the robot’s new “Anti-Gravity mode,” which enables it to actively anticipate impacts, adjust its posture in real time, and recover quickly rather than simply falling and resetting. Equipped with depth cameras, 3D LiDAR, and multiple joint motors, the G1 continuously scans its environment and calculates how to shift its center of gravity, brace against hits, and regain balance smoothly—actions that resemble a human athlete’s reflexes. The robot’s ability to absorb shocks and recover rapidly has practical implications beyond technical demonstration. Designed for industrial and research environments, the G1’s durability and adaptability can minimize downtime caused by unexpected collisions or falls, allowing it to continue tasks without human intervention. Priced at around $16,000, the G1 offers a more affordable option compared to other advanced
Tags: robot, humanoid-robot, anti-gravity-mode, LiDAR, sensors, robotics-technology, industrial-robots

First quantum squeezing achieved with nanoscale particle motion
Researchers at the University of Tokyo have achieved a groundbreaking feat by demonstrating quantum squeezing of the motion of a levitated nanoscale particle. Quantum squeezing reduces the uncertainty in a particle’s position or velocity below the standard quantum limit set by zero-point fluctuations, a fundamental aspect of quantum mechanics. By levitating a glass nanoparticle in a vacuum and cooling it near its ground state, the team managed to measure a velocity distribution narrower than the quantum uncertainty limit, marking the first such observation for nanoscale particle motion. This experiment bridges the gap between microscopic quantum phenomena and larger-scale objects, offering a new platform to explore quantum mechanics at mesoscopic scales. The achievement required overcoming significant challenges, including stabilizing the levitated particle and minimizing environmental noise. The sensitivity of the nanoscale particle to external fluctuations, while initially a hurdle, now provides a powerful system for studying the boundary between classical and quantum physics. Beyond fundamental science, this advance holds promise for practical applications such as ultra-precise quantum sensors that could enable GPS
Tags: materials, quantum-physics, nanoscale-particles, quantum-squeezing, sensors, quantum-mechanics, nanotechnology

New AI-triggered airbag system could save lives in a plane crash
Engineers at BITS Pilani’s Dubai campus have developed Project REBIRTH, an AI-powered airplane crash survival system designed to protect passengers during unavoidable crashes. The system uses AI and sensors to detect imminent crashes below 3,000 feet, automatically deploying external airbags around the aircraft’s nose, belly, and tail within two seconds. These airbags, made from advanced materials like Kevlar and non-Newtonian fluids, absorb impact forces to reduce damage and increase passenger safety. Additionally, the system employs reverse thrust or gas thrusters to slow and stabilize the plane before impact. Post-crash, bright paint, infrared beacons, GPS, and flashing lights aid rescue teams in quickly locating the crash site. A 1:12 scale prototype combining sensors, microcontrollers, and CO2 canisters has been built, with computer simulations indicating a potential reduction in crash impact by over 60%. The team plans to collaborate with aircraft manufacturers for full-scale testing and aims to make the system compatible with both new
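The deployment logic described above can be pictured as a simple gate: fire the airbags only when the aircraft is below the 3,000-foot threshold and impact is both imminent and unavoidable. A minimal sketch, with all names and thresholds purely illustrative (the article gives only the altitude gate and the two-second deployment window):

```python
# Hypothetical sketch of REBIRTH's crash-detection gate; function and
# parameter names are illustrative, not from the project.

def should_deploy(altitude_ft: float, descent_rate_fps: float,
                  time_to_impact_s: float) -> bool:
    """Return True when the external airbags should fire."""
    below_gate = altitude_ft < 3000          # article's stated altitude gate
    imminent = time_to_impact_s <= 2.0       # airbags inflate within ~2 s
    falling = descent_rate_fps > 0           # descending, not climbing
    return below_gate and imminent and falling

# 800 ft up, descending at 400 ft/s -> impact in 2 s -> deploy
assert should_deploy(800, 400, 2.0)
assert not should_deploy(5000, 400, 12.5)    # still above the 3,000 ft gate
```

In a real system the time-to-impact estimate would come from fused sensor data rather than a single reading, and the gate would need hysteresis to avoid spurious deployments.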
Tags: robot, AI, sensors, safety-systems, materials, crash-survival, smart-airbags

Students build Bond-style micro pocket drone that flies instantly
Students at Texas A&M University’s Advanced Vertical Flight Laboratory have developed a groundbreaking micro air vehicle (MAV) weighing just 112 grams that folds to smartphone size and unfolds midair to stabilize itself within seconds. This pocket-sized drone features foldable propeller arms that extend and lock automatically when thrown, enabling it to recover from extreme spins—up to 2,500 degrees per second—and hover smoothly. Its stability is achieved through an advanced onboard feedback controller that uses sensors and algorithms to detect orientation and adjust propeller speeds in real time, allowing immediate flight readiness without careful handling or controlled takeoff. The design balances portability, strength, and performance, supported by a sophisticated six degrees of freedom (6DOF) flight dynamics model validated with real-world motion tracking data. This rigorous testing ensures reliable operation even under unpredictable launch conditions. The MAV’s compactness and rapid deployment make it ideal for practical applications such as emergency response, where first responders could quickly launch drones into hazardous zones for damage assessment or survivor
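The feedback controller described above follows a standard pattern: measure the tilt and tilt rate, compute a correction, and trim each propeller's speed. The following is a minimal sketch of that idea (not the team's published controller); the gains and the quad "X" mixer layout are assumptions:

```python
# Illustrative PD attitude loop: drive roll/pitch errors to zero by
# adjusting per-motor throttle. Gains and mixer are assumptions.

def pd_correction(angle_deg: float, rate_dps: float,
                  kp: float = 0.02, kd: float = 0.002) -> float:
    """PD term driving a tilt angle (and its rate) back to zero."""
    return -(kp * angle_deg + kd * rate_dps)

def motor_commands(roll, roll_rate, pitch, pitch_rate, hover_throttle=0.5):
    """Mix roll/pitch corrections into four throttles (quad 'X' layout)."""
    r = pd_correction(roll, roll_rate)
    p = pd_correction(pitch, pitch_rate)
    # front-left, front-right, rear-left, rear-right
    return [hover_throttle + p + r, hover_throttle + p - r,
            hover_throttle - p + r, hover_throttle - p - r]

# Tipped 10 deg right and still rolling right: right-side motors
# throttle up to lift the dropped side.
fl, fr, rl, rr = motor_commands(10.0, 50.0, 0.0, 0.0)
assert fl < fr and rl < rr
```

Recovering from 2,500 degrees per second additionally requires the controller to run at a high loop rate and to handle motor saturation, which this sketch omits.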
Tags: robot, drone, micro-air-vehicle, sensors, flight-control, stabilization, aerospace-engineering

How an over-the-air update made Quilt’s heat pumps more powerful
Quilt, a heat pump startup, has pioneered the use of over-the-air (OTA) software updates in residential HVAC systems, significantly enhancing the performance of its heat pumps without hardware changes. By integrating higher quality sensors—such as additional pressure sensors and more accurate temperature and current sensors—Quilt was able to collect detailed operational data that revealed untapped capacity in their units. This insight allowed the company to increase the maximum heating and cooling output from 20,500 and 19,700 BTUs per hour to 25,200 and 24,000 BTUs per hour, respectively, enabling the heat pumps to better handle extreme temperatures while maintaining efficiency. The OTA update involved both software and firmware improvements across the main processor and microcontrollers within the indoor and outdoor units. While the inclusion of advanced sensors and networking components added some cost to the bill of materials, Quilt’s leadership believes the benefits—such as continuous improvement, broader market applicability, and avoiding the need for new hardware models—far outweigh
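The quoted figures work out to roughly a fifth more capacity in each direction, which a quick check on the article's numbers confirms:

```python
# Arithmetic check on the BTU/h figures quoted above (before -> after
# the over-the-air update). Numbers are from the article.

def pct_gain(before: float, after: float) -> float:
    return (after - before) / before * 100

heating = pct_gain(20_500, 25_200)   # heating output gain, ~22.9%
cooling = pct_gain(19_700, 24_000)   # cooling output gain, ~21.8%
assert 22 < heating < 24 and 21 < cooling < 23
```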
Tags: energy, HVAC, heat-pumps, over-the-air-update, sensors, firmware, software-defined-HVAC

Humanoids, AVs, and what’s next in AI hardware with Waabi and Apptronik at TechCrunch Disrupt 2025
TechCrunch Disrupt 2025, taking place from October 27 to 29 at Moscone West in San Francisco, will feature a key session focused on the future of AI hardware, particularly in robotics and autonomous systems. The event will bring together over 10,000 startup and venture capital leaders to explore groundbreaking technologies and ideas. A highlight of the conference is a discussion with Raquel Urtasun, founder and CEO of Waabi, and Jeff Cardenas, co-founder and CEO of Apptronik, who will share insights on integrating AI with real-world physical systems such as autonomous vehicles and humanoid robots. The session will delve into the challenges and innovations involved in developing intelligent machines that operate safely and effectively in the physical world. Topics include the use of simulation, sensors, and software infrastructure critical to scaling these technologies. The conversation aims to provide a realistic and forward-looking perspective on how AI-driven robotics and self-driving platforms are evolving and the implications for industry, labor, and infrastructure.
Tags: robotics, autonomous-vehicles, AI-hardware, humanoid-robots, sensors, simulation-technology, intelligent-machines

Humanoids, AVs, and what’s next in AI hardware at TechCrunch Disrupt 2025
TechCrunch Disrupt 2025, taking place from October 27 to 29 at Moscone West in San Francisco, will gather over 10,000 startup and venture capital leaders to explore cutting-edge technology and future trends. A highlight of the event is a session focused on the future of AI hardware, particularly in robotics and autonomous systems. This session will feature live demonstrations and discussions on the advancements and challenges in developing humanoid robots and autonomous vehicles, emphasizing the integration of AI with real-world physics through simulation, sensors, and software infrastructure. Key speakers include Raquel Urtasun, founder and CEO of Waabi, and Jeff Cardenas, co-founder and CEO of Apptronik, who will share insights into the breakthroughs and bottlenecks in scaling intelligent machines safely and effectively. The discussion aims to provide a realistic and forward-looking perspective on how AI-driven robotics and autonomous platforms are evolving, highlighting their potential impact on industry, labor, and infrastructure. This session underscores the unique constraints and
Tags: robot, autonomous-vehicles, AI-hardware, robotics, humanoid-robots, sensors, autonomous-systems

Acoustic AI helps cars hear sirens, horns, and improve driver safety
The article discusses a pioneering development in autonomous vehicle technology called "The Hearing Car," which integrates acoustic AI and microphones to enable cars to "hear" their surroundings. Unlike traditional optical systems reliant on cameras and radar, this acoustic sensing technology allows vehicles to detect critical sounds such as emergency sirens, horns, and pedestrian noises even before they are visible. This capability enhances safety by providing early warnings, especially in scenarios where line of sight is obstructed, such as around corners or in crowded urban environments. The system is robustly designed to function under various weather conditions and at highway speeds, with important sounds transmitted directly to the driver through the headrest to prompt faster reactions. Beyond external sound detection, the technology also supports interactive features inside the vehicle. Drivers can use voice commands secured by speaker verification, and the system monitors driver health and attention through non-contact sensors measuring heart rate, breathing, and brain activity, as well as voice analysis for stress detection. Additionally, the YourSound system personalizes in-car audio
Tags: IoT, autonomous-vehicles, acoustic-AI, driver-safety, sensors, voice-recognition, automotive-technology

This headband uses electrical signals to help ease stress, balance mood
Studio Beyond, a Cambridge-based design firm led by Paul Gibson and Matt Maitland, has introduced Sometimes, a conceptual wearable headband designed to help users manage stress and balance mood through gentle electrical signals delivered to the nervous system. Drawing on research indicating that electrical stimulation can influence brain activity, the device uses electrodes to send carefully regulated pulses aimed at lowering stress without causing discomfort. Integrated sensors and a small processor continuously monitor and adjust these signals to ensure safety and user comfort. The Sometimes headband features an adjustable elastic band with electronic modules positioned likely over the temples or forehead, combining functionality with a consumer-friendly design that resembles familiar wearable products rather than clinical devices. It is intended for daily use without the need for surgery or medical training, reflecting a broader trend toward accessible, at-home neurotechnology solutions. However, it remains a concept at this stage, with no current plans for commercial release until further development transforms it into a functional product.
Tags: wearable-technology, neurotech, electrical-stimulation, consumer-electronics, health-tech, sensors, brain-activity

China's 'scissor wing' project could revive hypersonic drone concept
Chinese engineers are revisiting the oblique wing aircraft concept, originally developed in the 1940s, which features a single wing that pivots around the fuselage like a scissor blade. This design allows the wing to be perpendicular at low speeds for takeoff and landing, then rotate to align with the fuselage at high speeds, reducing drag and enabling hypersonic flight. Unlike previous variable-sweep wing aircraft like the F-14, the oblique wing uses a simpler mechanism involving just one wing. However, past attempts, such as NASA’s 1970s AD-1, faced significant stability and control challenges. To overcome these issues, the Chinese project incorporates advanced technologies including supercomputers, artificial intelligence for airflow modeling, smart materials, and sensors to manage structural stresses. The design also uses canards, tailplanes, and active control surfaces to maintain stability during wing movement. The aircraft aims to serve as a hypersonic “mother ship” drone carrier capable of Mach
Tags: robot, drone, hypersonic-technology, smart-materials, sensors, artificial-intelligence, aerospace-engineering

Photos: World's first Robocar promises pure autonomy with lidars, radars
The Tensor Robocar, introduced by California-based startup Tensor, is the world’s first personal autonomous vehicle designed from the ground up for private ownership rather than fleet use. Scheduled for delivery in late 2026, the Robocar features a comprehensive sensor suite including 37 cameras, 5 lidars, 11 radars, and multiple microphones and ultrasonic sensors, enabling Level 4 autonomy with no driver input required under defined conditions. Its architecture emphasizes safety and redundancy, meeting global automotive safety standards such as FMVSS and IIHS Top Safety Pick+, with full backup systems to prevent single points of failure. The vehicle’s autonomy is powered by a dual-system AI: one system handles rapid, reflexive driving responses based on expert driver data, while the other uses a multimodal Visual Language Model to reason through complex or unusual scenarios, including low-visibility conditions. The Robocar also functions as an "AI agentic car," featuring a Large Language Model that enables conversational interaction and adapts to the owner
Tags: robot, autonomous-vehicles, AI, sensors, lidar, radar, automotive-technology

Pebble’s smartwatch is back: Pebble Time 2 specs revealed
Pebble’s original creator, Eric Migicovsky, has unveiled the final design and specifications for the Pebble Time 2 smartwatch, marking the company’s return to the market under the Pebble brand after regaining the trademark. Previously referred to as Core 2 Duo and Core Time 2, the new watches will now be called Pebble 2 Duo and Pebble Time 2. The Time 2 features an updated industrial design and will debut in four colors, with buyers having input on the final choices. Key new features include a multicolor RGB LED backlight, a second microphone for potential noise cancellation, a compass sensor, and a screw-mounted stainless steel back cover, maintaining the premium feel of the Pebble Time Steel. The Pebble Time 2 retains many previously announced specifications, such as a 1.5-inch 64-color e-paper touchscreen, a quick-release 22mm strap, flat hardened glass lens, and an estimated 30-day battery life. It also includes
Tags: IoT, smartwatch, wearable-technology, Bluetooth, sensors, e-paper-display, battery-life

DigiKey, onsemi discuss the intersection of robotics and physical AI - The Robot Report
DigiKey and onsemi recently explored how advancements in sensing technologies and physical AI are driving the evolution of autonomous mobile robots (AMRs), which have the potential to transform industrial and commercial sectors. AMRs utilize a variety of sensors—including lidar, cameras, ultrasonic detectors, and radar—to enhance safety, improve productivity, and navigate complex environments. Similar to self-driving vehicles, AMRs employ technologies such as simultaneous localization and mapping (SLAM) to create real-time maps and localize themselves, enabling them to operate beyond controlled indoor settings into more unpredictable outdoor environments. These developments are supported by improvements in sensor integration, edge computing, and AI, which collectively make AMRs more autonomous, adaptive, and capable of performing a wider range of tasks safely alongside humans. The discussion also highlighted the shift in communication protocols within AMRs, moving from traditional CAN (Controller Area Network) to the newer 10BASE-T1S Ethernet-based protocol, led by onsemi. This protocol offers higher data rates (10 Mbps
Tags: robotics, autonomous-mobile-robots, physical-AI, sensors, industrial-robots, edge-computing, AI-integration

Fancy a personal dragon? US students build AI pet that you can touch
A team of students at Carnegie Mellon University’s Entertainment Technology Center (ETC) has developed Luceal, an innovative AI pet prototype that blends virtual reality with physical interaction. Created under the Physical Presence Pet (PPP) project, Luceal is a plush animal embedded with custom textile sensors that respond to touch, sending signals to Apple Vision Pro VR headsets to provide real-time virtual animations and reactions. This integration allows users to physically feel and interact with a virtual pet, combining tactile features with expressive digital behavior. The project was guided by professor Olivia Robinson, who introduced the team to e-textiles, enabling the seamless incorporation of conductive fabrics into the plush form. The concept was inspired by the desire for a constant companion, especially for those unable to have real pets, such as international students, and draws on nostalgia from digital pets like Tamagotchi. The team envisioned creating exotic virtual creatures—such as dragons or seals—that users could interact with in ways not possible with real animals. Designers on the
Tags: robot, AI-pet, virtual-reality, e-textiles, sensors, interactive-technology, wearable-technology

Sensing robot hand flicks, flinches, and grips like a human
A student team at USC Viterbi, led by assistant professor Daniel Seita, has developed the MOTIF Hand, a robotic hand designed to mimic human touch by sensing multiple modalities such as pressure, temperature, and motion. Unlike traditional robot grippers, the MOTIF Hand integrates a thermal camera embedded in its palm to detect heat without physical contact, allowing it to "flinch" away from hot surfaces much like a human would. It also uses force sensors in its fingers to apply precise pressure and can gauge the weight or contents of objects by flicking or shaking them, replicating human instincts in object interaction. The MOTIF Hand builds on previous open-source designs like Carnegie Mellon’s LEAP Hand, with the USC team also committing to open-source their work to foster collaboration in the robotics community. The developers emphasize that this platform is intended as a foundation for further research, aiming to make advanced tactile sensing accessible to more teams. Their findings have been published on Arxiv, highlighting a significant step toward
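The "flinch" behavior described above amounts to a reflex layer that checks the palm's thermal reading before contact and caps fingertip force during a grip. A hedged sketch of that decision logic (not USC's published code; all thresholds and names are assumptions):

```python
# Illustrative reflex layer for a tactile hand: pull away from heat,
# ease off when fingertip force exceeds a safe cap. Thresholds assumed.

HOT_SURFACE_C = 60.0      # assumed "too hot to touch" threshold
MAX_GRIP_N = 8.0          # assumed safe fingertip force for fragile objects

def reflex(max_palm_temp_c: float, fingertip_force_n: float) -> str:
    if max_palm_temp_c >= HOT_SURFACE_C:
        return "flinch"            # retract before making contact
    if fingertip_force_n > MAX_GRIP_N:
        return "ease_grip"         # reduce pressure, e.g. on glassware
    return "hold"

assert reflex(75.0, 1.0) == "flinch"
assert reflex(25.0, 10.0) == "ease_grip"
assert reflex(25.0, 2.0) == "hold"
```

The appeal of a thermal camera here is that the heat check happens without contact, so the hand can refuse a grasp before any finger touches the hot surface.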
Tags: robot, robotic-hand, sensors, human-robot-interaction, tactile-sensing, thermal-detection, robotics-research

Supersonic parachutes get upgrade, NASA conducts flight tests
NASA is advancing the reliability and safety of supersonic parachutes used for delivering scientific instruments and payloads to Mars through a series of flight tests led by the EPIC (Enhancing Parachutes by Instrumenting the Canopy) team. These tests, conducted at NASA’s Armstrong Flight Research Center in California, involved air-launching a capsule from a drone that deployed a parachute equipped with flexible, strain-measuring sensors. The sensors successfully collected data without interfering with the parachute canopy, validating the team's approach and providing valuable information for refining computer models and future tests. The parachute system, developed by NASA’s Langley Research Center with support from Armstrong interns, builds on previous supersonic parachute technology used during the Perseverance Mars Rover landing in 2021. That parachute, measuring 65 feet in diameter, deployed at hypersonic speeds and endured extreme aerodynamic forces. NASA’s ongoing work aims to improve numerical simulations of parachute inflation dynamics, which are complex due to
Tags: sensors, aerospace, flexible-strain-sensors, NASA, supersonic-parachutes, drone-technology, Mars-exploration

Exploring the future of humanoid robotics with Novanta
In episode 205 of The Robot Report Podcast, Nick Damiano, senior business development manager at Novanta Robotics and Automation, discusses the company's innovative approach to enhancing safety in humanoid robotics. Based in Bedford, Massachusetts, Novanta focuses on achieving component-level safety ratings and implementing advanced joint-level control to ensure safer robot operation. Damiano highlights the critical role of integrating high-performance sensors and drives in overcoming the technical challenges associated with developing safe humanoid robots. Novanta Robotics and Automation, a leader in motion control solutions since 2022, collaborates with top robotics platforms across various industries by providing key components such as drives, encoders, motors, and force torque sensors. These technologies aim to reduce risks and costs while accelerating time-to-market for robotics developers. The episode emphasizes Novanta’s commitment to addressing unique challenges in robotics safety and shaping the future of humanoid robot development through cutting-edge motion control innovations.
Tags: robotics, humanoid-robots, motion-control, sensors, automation, safety-in-robotics, robotics-innovation

US turns cargo containers into nuke bunkers for remote military bases
Sandia National Laboratories has developed a mobile, high-security vault housed within a 20-foot shipping container to safeguard nuclear weapons at remote or temporary military locations where permanent bunkers are not feasible. Created under the National Nuclear Security Administration’s Stockpile Responsiveness Program, the project was completed in six months using a rapid, adaptable design approach. The vault features advanced access control, alarm systems, sensors, and backup power, built with a combination of off-the-shelf parts, rapid prototyping, and additive manufacturing. Two additional prototypes are underway, with upcoming testing planned during the Department of Defense’s Grey Flag 25 exercise to simulate real-world conditions. This mobile vault offers a flexible and scalable solution for secure storage of nuclear weapons and other critical assets in field conditions, providing new capabilities for military and civilian missions. The technology aims to protect sensitive materials during transport or operations in locations lacking traditional infrastructure, such as battlefields or disaster zones. Sandia plans to transition the technology to industry for broader production and deployment
Tags: energy, materials, security-technology, additive-manufacturing, sensors, rapid-prototyping, nuclear-safety

Grab Prototypes Autonomous Shuttle with Employees in Singapore - CleanTechnica
Grab, the Southeast Asian superapp, is launching a pilot program in Singapore to test an autonomous electric shuttle bus for transporting its employees between the One-North headquarters and the nearby One-North MRT station. Developed in partnership with South Korean autonomous vehicle technology firm Autonomous A2Z (A2Z), the shuttle is equipped with A2Z’s autonomous software and hardware, alongside Grab’s IoT devices. The pilot, which begins operations with a trained safety driver onboard at all times, aims to evaluate the safety and feasibility of autonomous public transport in Singapore, while also exploring new job opportunities in the sector. This marks A2Z’s first deployment of autonomous technology in Singapore, with collaboration from the Land Transport Authority and local safety drivers to adapt the shuttle to Singapore’s transport environment. The shuttle has undergone over 100 hours of training on a fixed 3.9-kilometer route, collecting data on road infrastructure, traffic signals, and obstacles, and programming responses to real-world scenarios such as stopping
Tags: robot, autonomous-vehicles, IoT, electric-shuttle, transportation-technology, sensors, public-transport-innovation

Self-healing EV batteries designed to double lifespan, enhance range
Scientists involved in the EU-funded PHOENIX project are developing self-healing batteries for electric vehicles (EVs) that can diagnose internal damage and initiate repairs, potentially doubling battery lifespan and enhancing performance. This innovation aims to address battery degradation, a major limitation for EV longevity and adoption, while also reducing the carbon footprint associated with battery production. The PHOENIX system integrates advanced internal sensors that go beyond traditional Battery Management Systems by detecting physical swelling, generating heat maps, and identifying specific gases to provide early warnings of damage. When damage is detected, the system can activate repair mechanisms such as applying targeted heat to reform chemical bonds or using magnetic fields to break down harmful metallic dendrites that cause short circuits. The project has recently progressed to testing sensor and trigger prototypes on battery pouch cells. Additionally, the research explores incorporating silicon in battery anodes to increase energy density, which, combined with self-healing technology, could enable lighter EVs with longer ranges. While the sensors increase production costs, efforts
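The detect-then-repair loop described above can be summarized as: read the internal sensors, decide whether the readings cross a damage threshold, and schedule the matching repair action. A hedged sketch of that control flow (thresholds, units, and action names are illustrative assumptions, not PHOENIX specifications):

```python
# Illustrative damage-to-repair mapping for a self-healing pack:
# swelling or gas readings trigger targeted heat; a dendrite flag
# triggers the magnetic-field countermeasure. All thresholds assumed.

def pick_repair(swelling_pct: float, gas_ppm: float,
                dendrite_flag: bool) -> list[str]:
    actions = []
    if swelling_pct > 2.0 or gas_ppm > 50.0:
        actions.append("targeted_heat")      # reform broken chemical bonds
    if dendrite_flag:
        actions.append("magnetic_pulse")     # break down metallic dendrites
    return actions or ["monitor"]

assert pick_repair(3.1, 10.0, False) == ["targeted_heat"]
assert pick_repair(0.5, 5.0, True) == ["magnetic_pulse"]
assert pick_repair(0.5, 5.0, False) == ["monitor"]
```

The article's heat maps and gas identification would feed the inputs on the left; the value of the approach is that repairs are targeted at the detected fault rather than applied pack-wide.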
Tags: energy, electric-vehicles, battery-technology, self-healing-batteries, sensors, PHOENIX-project, sustainable-energy

AV startup Pronto.ai acquires off-road autonomous vehicle rival SafeAI
Pronto.ai, a San Francisco-based startup specializing in autonomous haulage systems for off-road vehicles used in construction and mining, has acquired its competitor SafeAI. The acquisition, reportedly valued in the millions, brings SafeAI’s 12-person engineering team and intellectual property under Pronto’s umbrella. Pronto CEO Anthony Levandowski described the move as both a talent and technology acquisition aimed at consolidating resources to accelerate growth. The deal positions Pronto as one of the two main players in the autonomous haulage space, enabling it to expand its customer base, including international markets, and serve a wider range of mining operations from small quarries to large mines. Pronto’s technology primarily relies on a camera-only approach combined with advanced sensors, AI, and a proprietary peer-to-peer mobile data network called Pollen, which supports high-speed data exchange in low-connectivity environments. SafeAI, founded in 2017 and backed by $38 million in funding, employs a multi-sensor system including cameras
Tags: robot, autonomous-vehicles, AI, mining-technology, sensors, safety-certification, off-road-vehicles

Lucid Motors will roll out hands-free highway driving this month
Lucid Motors is launching a software update on July 30, 2025, that will enable hands-free highway driving on its Air sedans, marking a significant advancement in the company’s advanced driver assistance system. This update places Lucid among a select group of automakers in the U.S. offering hands-free driving capabilities, alongside Ford’s BlueCruise, General Motors’ Super Cruise, and Mercedes-Benz’s Drive Pilot. The feature requires the $2,500 “Dream Drive Pro” package, which includes a comprehensive sensor suite with lidar, radar, cameras, and ultrasonics, and will initially be available only on compatible divided highways. Drivers must remain attentive and ready to take control, with monitoring via an in-cabin camera positioned above the steering column. The hands-free system rollout currently targets Air sedans, with plans to extend the update to the new Gravity SUV later in 2025, although only a handful of Gravity units have been delivered so far. Lucid has delivered approximately
IoT, autonomous-vehicles, advanced-driver-assistance-systems, sensors, lidar, radar, hands-free-driving

AI-powered graphene tongue detects flavors with 98% precision
Scientists have developed an AI-powered artificial tongue using graphene oxide within a nanofluidic device that mimics human taste with remarkable accuracy. The system integrates sensing and computing on a single platform, enabling it to detect chemical signals and classify flavors in real time, even in moist conditions similar to those of the human mouth. Trained on 160 chemicals representing common flavors, the device achieved about 98.5% accuracy in identifying known tastes (sweet, salty, sour, and bitter) and 75–90% accuracy on 40 new flavors, including complex mixtures like coffee and cola. This marks a significant advance over previous artificial taste systems, which separated sensing from processing. The sensor exploits graphene oxide’s sensitivity to chemical changes, detecting subtle conductivity variations when exposed to flavor compounds. Coupled with machine learning, it recognizes flavor patterns much as the human brain processes taste signals. The researchers highlight potential applications such as restoring taste perception for individuals affected by stroke or viral infections, among other uses.
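As a rough illustration of the pattern-recognition step (not the study’s actual model), a nearest-centroid rule over per-channel conductivity readings might look like the sketch below; the three channels and the per-flavor signatures are invented placeholders:

```python
import math
import random

# Illustrative sketch only: classify a flavor from a 3-channel
# conductivity reading with a nearest-centroid rule. The signature
# values below are invented placeholders, not data from the study.
CENTROIDS = {
    "sweet":  (0.2, 0.8, 0.4),
    "salty":  (0.9, 0.3, 0.1),
    "sour":   (0.5, 0.1, 0.9),
    "bitter": (0.1, 0.2, 0.2),
}

def classify(reading):
    """Return the flavor whose centroid is closest (Euclidean) to the reading."""
    return min(CENTROIDS, key=lambda f: math.dist(reading, CENTROIDS[f]))

# A lightly perturbed "salty" signature still lands on the salty centroid.
random.seed(0)
sample = tuple(x + random.gauss(0, 0.05) for x in CENTROIDS["salty"])
print(classify(sample))  # salty
```

The real device reportedly learns these signatures from training data rather than using hand-set centroids, but the closest-match idea is the same.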
graphene, artificial-tongue, AI, materials-science, sensors, machine-learning, nanotechnology

Flexible new polymer may replace toxic plastics in smart devices
Scientists at Case Western Reserve University have developed a novel fluorine-free ferroelectric polymer that promises to replace environmentally harmful plastics commonly used in electronics, such as poly(vinylidene fluoride) (PVDF), a persistent “forever chemical.” Led by Professor Lei Zhu, the team created a flexible, rubber-like material that exhibits ferroelectric properties without requiring crystallization, unlike traditional ferroelectric materials. The innovation offers tunable electrical characteristics, improved manufacturability into thin films or coatings, and acoustic compatibility with biological tissue, making it particularly suitable for wearable medical sensors, virtual and augmented reality devices, and other smart electronics. The new polymer addresses key limitations of existing ferroelectric materials, which are often brittle ceramics, by combining flexibility, lightness, and environmental safety. Although still in the development phase with small-scale synthesis underway, the material’s potential to reduce toxicity and waste in electronics is significant. The research was initially funded by a 2017 U.S. Department of Energy grant.
materials, polymer, ferroelectric, flexible-electronics, eco-friendly, sensors, wearable-technology

Slow-motion earthquake that travels miles in weeks captured in action
Researchers from the University of Texas at Austin have, for the first time, directly captured a slow slip earthquake traveling along the Nankai Fault off the coast of Japan. Using sensitive borehole sensors installed nearly 1,500 feet below the seafloor, the team recorded the gradual release of tectonic pressure over weeks as the fault slowly "unzipped" along a 20-mile stretch. These slow slip earthquakes, which unfold over days or weeks rather than seconds, are a relatively recent discovery and are believed to play a critical role in the earthquake cycle by gradually accumulating and releasing stress along fault lines. The study, published in the journal Science, revealed that the slow slip events initiated about 30 kilometers inland from the trench and migrated seaward at a rate of 1 to 2 kilometers per day, reaching close to, or possibly breaching, the trench itself. The events coincided with tremors and very-low-frequency earthquakes in a zone characterized by high pore fluid pressure and low stress.
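A quick back-of-envelope check, using only the figures quoted above, confirms that a 1–2 km/day migration along a roughly 20-mile stretch plays out over weeks rather than seconds:

```python
# Back-of-envelope check (not the paper's analysis): how long does a
# rupture migrating at 1-2 km/day take to cross a ~20-mile stretch?
MILE_KM = 1.609344
stretch_km = 20 * MILE_KM  # ~32.2 km

for rate_km_per_day in (1.0, 2.0):
    days = stretch_km / rate_km_per_day
    print(f"{rate_km_per_day:.0f} km/day -> {days:.0f} days (~{days / 7:.1f} weeks)")
```

At 1 km/day the crossing takes about a month; at 2 km/day, just over two weeks — consistent with the "weeks" timescale reported.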
energy, sensors, earthquake-monitoring, borehole-technology, seismic-activity, geophysics, slow-slip-earthquake

Volkswagen's 4-seat robotaxi with 27 sensors to hit US roads in 2026
Volkswagen has officially launched the production-ready ID. Buzz AD, a four-seat electric robotaxi equipped with 27 sensors (13 cameras, nine lidars, and five radars) and designed to compete with Tesla’s autonomous vehicles. Unlike Tesla’s current Level 2 autonomy, the ID. Buzz AD is built for SAE Level 4 autonomy, enabling fully driverless operation in designated areas without human intervention. The vehicle’s AI-powered control system, developed in partnership with Mobileye, processes real-time sensor data to handle varied driving scenarios and emergencies. The robotaxi also includes remote monitoring capabilities and software certification, features Tesla has yet to achieve. Volkswagen offers the ID. Buzz AD as a turnkey Autonomous Driving Mobility-as-a-Service (AD MaaS) platform, which integrates fleet management, passenger assistance, and compatibility with third-party ride-hailing services. This comprehensive package allows businesses, cities, and fleet operators to deploy autonomous vehicle services without developing infrastructure or software from scratch.
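The sensor counts above can be captured in a small data structure that makes the 27-sensor total explicit; the type and field names here are our own illustration, not Volkswagen's nomenclature:

```python
from dataclasses import dataclass

# Hypothetical tally of the ID. Buzz AD sensor suite as reported:
# 13 cameras, 9 lidars, 5 radars. Names are illustrative only.
@dataclass(frozen=True)
class SensorSuite:
    cameras: int
    lidars: int
    radars: int

    @property
    def total(self) -> int:
        return self.cameras + self.lidars + self.radars

id_buzz_ad = SensorSuite(cameras=13, lidars=9, radars=5)
print(id_buzz_ad.total)  # 27
```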
robot, autonomous-vehicles, sensors, AI, electric-vehicles, mobility-as-a-service, Volkswagen

US quantum tech tracks 3D acceleration to boost GPS-free navigation
Researchers at the University of Colorado Boulder have developed a novel quantum atom interferometer capable of measuring acceleration in all three dimensions, a significant advance over traditional accelerometers of this type, which measure acceleration along only one axis. The device uses six ultra-thin lasers and tens of thousands of rubidium atoms cooled to near absolute zero to create a Bose-Einstein condensate, placing the atoms in a superposition state. By manipulating these atoms with lasers and analyzing their interference patterns, the interferometer can precisely detect acceleration without the aging issues that affect conventional electronic sensors like those used in GPS systems. The compact system, roughly the size of an air hockey table, represents an engineering breakthrough with potential applications in spacecraft, submarines, and vehicles for GPS-free navigation. The researchers employed artificial intelligence to manage the complex laser operations required to split and recombine the atom clouds. Currently, the device can detect accelerations thousands of times smaller than Earth’s gravity, and the team anticipates further improvements.
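For context on how interference patterns encode acceleration: in a textbook Mach-Zehnder atom interferometer, the measured phase shift is Δφ = k_eff · a · T², where k_eff is the effective two-photon wavevector and T is the time between laser pulses. The sketch below inverts that relation with nominal rubidium numbers; these parameters are generic, not the Boulder team's:

```python
import math

# Textbook Mach-Zehnder relation (generic, not this device's specs):
# phase shift delta_phi = k_eff * a * T**2.
WAVELENGTH = 780e-9                      # rubidium D2 line, metres
K_EFF = 2 * (2 * math.pi / WAVELENGTH)   # two-photon effective wavevector, rad/m

def acceleration_from_phase(delta_phi: float, T: float) -> float:
    """Invert delta_phi = k_eff * a * T**2 to recover acceleration in m/s^2."""
    return delta_phi / (K_EFF * T ** 2)

# A 1-rad phase shift with T = 10 ms corresponds to ~6e-4 m/s^2,
# tens of thousands of times below g ~ 9.81 m/s^2 -- illustrating
# why such interferometers resolve tiny accelerations.
a = acceleration_from_phase(1.0, 10e-3)
print(f"{a:.2e} m/s^2")
```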
quantum-technology, atom-interferometer, 3D-acceleration-measurement, navigation-technology, sensors, Bose-Einstein-Condensate, rubidium-atoms

Sensitive skin to help robots detect information about surroundings
Researchers from the University of Cambridge and University College London have developed a highly sensitive, low-cost, and durable robotic skin that can detect various types of touch and environmental information much as human skin does. The flexible, conductive skin is made from a gelatine-based hydrogel that can be molded into complex shapes, such as a glove for robotic hands. Unlike traditional robotic touch sensors, which require a different sensor type for each stimulus, the new skin acts as a single multi-modal sensor, detecting taps, temperature changes, cuts, and multiple simultaneous touches through over 860,000 tiny conductive pathways. The team combined physical testing with machine learning to interpret signals from just 32 electrodes placed at the wrist, enabling the robotic skin to process more than 1.7 million data points across the hand. Tests included exposure to heat, gentle and firm touches, and even cutting, with the collected data used to train the system to recognize different types of contact efficiently. The skin is not yet as sensitive as human skin.
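To make the multi-modal idea concrete, here is a toy decision rule of our own devising (the actual system uses a learned classifier over 32-electrode data, not hand-written thresholds): a single conductance trace is summarized and labeled as one of the stimulus types the skin distinguishes. All thresholds are invented for illustration:

```python
# Toy illustration, not the authors' model: label a stimulus from one
# electrode's conductance trace using hand-set thresholds. The real
# skin feeds 32-electrode signals into a trained classifier.
def label_event(trace):
    baseline, peak, last = trace[0], max(trace), trace[-1]
    if last < 0.2 * baseline:
        return "cut"              # conductive pathways severed
    if peak > 3 * baseline and last <= 1.5 * baseline:
        return "tap"              # sharp transient that returns to baseline
    if last > 1.5 * baseline:
        return "sustained press"  # elevated and still elevated
    return "temperature drift"    # only a slow baseline change

print(label_event([1.0, 4.0, 1.1]))   # tap
print(label_event([1.0, 2.5, 2.4]))   # sustained press
print(label_event([1.0, 1.2, 0.05]))  # cut
```

The point of the design is that one material yields signals rich enough to separate these event types, rather than needing a dedicated sensor per stimulus.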
robotics, robotic-skin, sensors, flexible-materials, conductive-hydrogel, multi-modal-sensing, human-robot-interaction

TRON 1: China’s robot balances like Messi during moving truck test
robot, robotics, bipedal, balance, mobility, sensors, control-algorithms

US scientists harvest electrical energy from human movement
energy, triboelectric-generator, energy-harvesting, wearable-biosensors, mechanical-energy, power-generation, sensors