UK lab’s humanoid robots get NVIDIA grant to turn sound into motion

Source: interestingengineering
Author: @IntEngineering
Published: 1/16/2026
Chengxu Zhou, an associate professor at UCL Computer Science, has received an NVIDIA Academic Grant to advance his research on real-time, audio-driven whole-body motion for humanoid robots. The grant provides critical resources, including two NVIDIA RTX PRO 6000 GPUs and two Jetson AGX Orin devices, which will accelerate training and deployment cycles by enabling faster iteration and reducing the gap between simulation and real-robot testing. Zhou’s project, called Beat-to-Body, aims to develop humanoid robots that respond dynamically to audio cues such as tempo, accents, and loudness fluctuations, allowing them to adapt their movements in real time rather than following pre-scripted commands.
The Beat-to-Body system combines large-scale simulation training on GPU compute with low-latency inference directly on the robot, minimizing dependence on offboard processing and improving responsiveness to sound. This approach aligns with recent research demonstrating that robots can generate expressive locomotion and gestures from music and speech without predefined motion templates.
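The article does not describe Beat-to-Body's internals, but a minimal sketch can illustrate the kind of audio front end such a pipeline might use: extracting frame-wise loudness and a rough beat estimate from an audio buffer and mapping them to motion parameters (step frequency, gesture amplitude, accent flags). All function names, thresholds, and the feature-to-motion mapping below are assumptions for illustration, not details of the UCL system.

```python
# Illustrative sketch only: a hypothetical audio front end mapping loudness
# and beats to simple motion parameters. Not the Beat-to-Body implementation.
import numpy as np

SR = 16_000    # sample rate (Hz)
FRAME = 512    # samples per analysis frame (~32 ms)

def synth_click_track(bpm: float, seconds: float) -> np.ndarray:
    """Generate a synthetic click track so the sketch runs stand-alone."""
    t = np.arange(int(SR * seconds)) / SR
    period = 60.0 / bpm
    clicks = ((t % period) < 0.02).astype(float)     # 20 ms burst on each beat
    return clicks * np.sin(2 * np.pi * 1000.0 * t)   # 1 kHz click tone

def audio_to_motion_commands(audio: np.ndarray) -> list[dict]:
    """Map per-frame loudness and detected beats to simple motion parameters."""
    n_frames = len(audio) // FRAME
    frames = audio[: n_frames * FRAME].reshape(n_frames, FRAME)
    rms = np.sqrt((frames ** 2).mean(axis=1))            # loudness per frame
    onset = np.maximum(np.diff(rms, prepend=0.0), 0.0)   # rising-energy envelope
    peaks = np.flatnonzero(onset > 0.5 * onset.max())
    # Crude non-maximum suppression: keep peaks at least 100 ms apart.
    beats: list[int] = []
    for p in peaks:
        if not beats or (p - beats[-1]) * FRAME / SR > 0.1:
            beats.append(int(p))
    # Estimate tempo from the median spacing between detected beats.
    if len(beats) > 1:
        spacing_s = float(np.median(np.diff(beats))) * FRAME / SR
        bpm = 60.0 / spacing_s
    else:
        bpm = 0.0
    beat_set = set(beats)
    peak_rms = rms.max() + 1e-9
    return [
        {
            "step_frequency_hz": bpm / 60.0,                 # sync gait rate to tempo
            "gesture_amplitude": float(rms[i] / peak_rms),   # scale gestures with loudness
            "accent": i in beat_set,                         # emphasize motion on beats
        }
        for i in range(n_frames)
    ]

if __name__ == "__main__":
    commands = audio_to_motion_commands(synth_click_track(bpm=120.0, seconds=4.0))
    print(commands[0], f"... {len(commands)} frames")
```

In a real system these per-frame commands would feed a learned whole-body controller; running the feature extraction and policy on an onboard device such as a Jetson AGX Orin is what keeps the loop low-latency, as the article notes.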
Tags
robotics, humanoid-robots, NVIDIA, machine-learning, real-time-motion, audio-driven-control, human-robot-interaction