RIEM News

Articles tagged with "virtual-reality"

  • World's most advanced driving simulator uses VR for EV autonomy tests

    Graz University of Technology (TU Graz) and Magna have launched one of Europe’s most advanced driving simulators at the new Advanced Driving Simulation Center on TU Graz’s Campus Inffeldgasse. This state-of-the-art facility offers an exceptionally realistic driving experience that bridges the gap between mathematical vehicle modeling and human perception. By enabling engineers to test and fine-tune vehicle components such as chassis, tires, and advanced driver assistance systems (ADAS) long before physical prototypes exist, the simulator accelerates development cycles while reducing reliance on costly physical testing. Funded by Magna, with TU Graz covering operational costs, the center strengthens Austria’s position as a hub for mobility innovation. The simulator features high-fidelity feedback, including vibrations above 100 Hz, allowing test drivers to feel subtle road textures and vehicle responses, which is especially important for electric vehicles, where engine noise is minimal. Integrated virtual reality creates photorealistic traffic environments for testing displays and assistance features in realistic and potentially hazardous scenarios without risk.

    robot, electric-vehicles, advanced-driver-assistance-systems, driving-simulator, vehicle-development, mobility-technology, virtual-reality
  • Meta reportedly delays mixed reality glasses until 2027

    Meta has delayed the release of its new mixed reality glasses, codenamed Phoenix, from the second half of 2026 to the first half of 2027. Unlike its existing smart glasses, the new devices are expected to have a form factor similar to Apple’s Vision Pro, including a separate puck-like power source. The delay follows internal memos seen by Business Insider, in which Meta executives cited CEO Mark Zuckerberg’s directive to prioritize sustainability and higher-quality user experiences. According to Meta’s metaverse leaders Gabriel Aul and Ryan Cairns, the postponement will provide additional time to refine the product. The move aligns with Meta’s broader strategy of ensuring the business model behind the glasses is viable and the technology meets higher standards before launch. The article also references a recent Bloomberg report about Meta’s plans but is cut off before giving further details.

    IoT, mixed-reality, augmented-reality, wearable-technology, Meta, smart-glasses, virtual-reality
  • 1HMX introduces Nexus NX1 for full-body motion capture, teleoperation - The Robot Report

    1HMX has introduced the Nexus NX1, a comprehensive full-body motion capture and teleoperation system designed to enhance training and simulation for humanoid robotics, embodied AI, and virtual reality (VR). The system integrates advanced technologies including HaptX Gloves G1 for tactile and force feedback, Virtuix Omni One’s 360-degree movement platform, and Freeaim’s motorized robotic shoes. It offers 72 degrees of freedom (DoF) of body and hand tracking with sub-millimeter precision, capturing detailed data such as skeletal and soft-tissue models, tactile displacement, pressure points, center of mass, and locomotion metrics. An included software development kit (SDK) facilitates integration with VR and robotics applications, enabling realistic real-time sensory input and valuable output data for robotic control, AI training, and user performance feedback (a sketch of consuming such a tracking stream follows below). 1HMX envisions the Nexus NX1 as a transformative tool across industries including manufacturing, medical, defense, and research, supporting both single- and multi-user immersive experiences.

    robotics, teleoperation, motion-capture, humanoid-robots, AI-training, virtual-reality, human-machine-interface
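
    The Robot Report piece mentions an SDK for VR and robotics integration but does not document its API. As a rough illustration only, the Python sketch below shows what consuming a 72-DoF tracking stream might look like; every identifier in it is invented.

      # Hypothetical consumer of full-body tracking frames like those the
      # Nexus NX1 SDK is described as exposing. All names are invented; the
      # real SDK's API is not covered in the article.
      from dataclasses import dataclass
      import random
      import time

      NUM_DOF = 72  # body + hand degrees of freedom, per the article

      @dataclass
      class TrackingFrame:
          timestamp: float                             # seconds since epoch
          joint_angles: list[float]                    # one value per DoF, radians
          center_of_mass: tuple[float, float, float]   # metres, world frame
          palm_pressure_kpa: float                     # example tactile channel

      def latest_frame() -> TrackingFrame:
          """Stand-in for an SDK call returning the newest capture frame."""
          return TrackingFrame(
              timestamp=time.time(),
              joint_angles=[random.uniform(-3.14, 3.14) for _ in range(NUM_DOF)],
              center_of_mass=(0.0, 0.9, 0.0),
              palm_pressure_kpa=random.uniform(0.0, 40.0),
          )

      if __name__ == "__main__":
          frame = latest_frame()
          assert len(frame.joint_angles) == NUM_DOF, "expected a 72-DoF skeleton"
          print(f"{NUM_DOF} DoF at t={frame.timestamp:.3f}")
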
  • Nanoparticle screen hits record clarity visible to the human eye

    Researchers from Swedish institutions, including Chalmers University of Technology, the University of Gothenburg, and Uppsala University, have developed a groundbreaking display technology called retina E-paper, featuring pixels as small as 560 nanometres. This size is smaller than the wavelength of visible light, enabling a pixel density exceeding 25,000 pixels per inch (ppi), roughly 150 times denser than typical smartphone screens (the arithmetic is sketched below). The display uses tungsten oxide nanoparticles to control light scattering and produce highly accurate, tunable red, green, and blue colors. Unlike conventional LED or OLED screens, retina E-paper is reflective, relying on ambient light rather than emitting its own, which significantly reduces energy consumption and allows the screen to be positioned very close to the eye. The pixel size corresponds approximately to that of a single photoreceptor in the human retina, meaning the display achieves the maximum resolution perceivable by the human eye. The researchers demonstrated the technology by reproducing Gustav Klimt’s painting “The Kiss.”

    nanoparticles, display-technology, materials-science, energy-efficient-displays, virtual-reality, augmented-reality, tungsten-oxide-nanoparticles
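
    The density figures in this summary can be sanity-checked from the 560 nm pixel size alone; the sketch below does the arithmetic. The 300 ppi smartphone baseline is an assumption, not a number from the article.

      # Pixel density implied by a 560 nm pixel pitch.
      NM_PER_INCH = 25.4e6              # 25.4 mm expressed in nanometres

      pixel_pitch_nm = 560              # pixel size reported by the researchers
      ppi = NM_PER_INCH / pixel_pitch_nm
      print(f"{ppi:,.0f} ppi")          # ~45,357 ppi, i.e. "exceeding 25,000 ppi"

      typical_phone_ppi = 300           # assumed baseline for the comparison
      print(f"~{ppi / typical_phone_ppi:.0f}x a typical phone screen")  # ~151x
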
  • Samsung takes on Apple’s Vision Pro with new Galaxy XR headset

    Samsung has launched its Galaxy XR headset as a direct competitor to Apple’s Vision Pro, offering a more affordable option at $1,800, nearly half the price of Apple’s device. The Galaxy XR runs on Google’s Android XR OS and Qualcomm’s Snapdragon XR2+ Gen 2 platform. It features a micro-OLED display with 27 million pixels (surpassing the Vision Pro’s 21 million), a resolution of 3,552 x 3,840, and a 90 Hz refresh rate compared to the Vision Pro’s 120 Hz (the pixel arithmetic is checked below). Weighing 545 grams, it is lighter than Apple’s headset, which weighs between 750 g and 800 g. The device supports up to two hours of general use and two and a half hours of video playback, and includes multiple cameras for pass-through, world tracking, and eye tracking. Samsung emphasizes an ergonomic design for comfort, with a balanced frame to reduce facial pressure, and the headset supports a range of XR-optimized experiences.

    robot, IoT, wearable-technology, augmented-reality, virtual-reality, smart-devices, XR-headset
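
    The display numbers quoted above are mutually consistent if the 3,552 x 3,840 resolution is per eye, which the article does not state explicitly; the quick check below makes that assumption visible.

      # Does 3,552 x 3,840 per eye add up to "27 million pixels"?
      width, height = 3552, 3840
      per_eye = width * height          # 13,639,680 pixels
      total = 2 * per_eye               # 27,279,360 pixels across both eyes
      print(f"{per_eye:,} px per eye, {total:,} px total")
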
  • Fundamental XR launches Fundamental Touch for wireless haptics - The Robot Report

    Fundamental XR has launched Fundamental Touch, a wireless haptics platform designed to deliver precise, untethered tactile feedback across multiple industries beyond healthcare, including robotics, industrial training, automotive, aerospace, retail, and gaming. The software removes the physical tether traditionally required by high-fidelity kinesthetic haptic devices, enabling greater user mobility without sacrificing performance. Built on a client-server architecture, Fundamental Touch decouples haptic simulation from visual rendering and user interfaces, allowing sub-100 ms latency and scalable, real-time force feedback via a peer-to-peer network layer (the decoupling pattern is sketched below). The system supports various output devices such as XR headsets (e.g., Apple Vision Pro, Meta Quest), robotic platforms (e.g., Boston Dynamics’ Spot), and gaming peripherals. Fundamental XR, formerly FundamentalVR, has a strong track record in healthcare, where its immersive technologies have reduced onboarding time by over 60%, improved surgical accuracy by 44%, and increased sales performance by 22%.

    robot, wireless-haptics, human-machine-interaction, augmented-reality, virtual-reality, precision-kinesthetic-haptics, immersive-technology
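
    The article names the architectural pattern (haptics decoupled from rendering, client-server, sub-100 ms latency) without showing code. The sketch below is an illustrative reconstruction of that pattern, not Fundamental XR's implementation: a fast haptic loop updates shared force state while a slower render loop reads it.

      # Toy decoupling of a high-rate haptic loop from a slower render loop.
      import threading
      import time

      shared = {"force_n": 0.0}
      lock = threading.Lock()
      RUN_S = 0.5                        # run briefly for the demo

      def haptic_loop(rate_hz: int = 1000) -> None:
          """Stands in for the server-side force simulation."""
          t0 = time.monotonic()
          while time.monotonic() - t0 < RUN_S:
              with lock:
                  # toy force signal; a real server would solve contact dynamics
                  shared["force_n"] = 5.0 * abs((time.monotonic() * 2) % 2 - 1)
              time.sleep(1.0 / rate_hz)

      def render_loop(rate_hz: int = 60) -> None:
          """Stands in for the client-side visual renderer."""
          t0 = time.monotonic()
          while time.monotonic() - t0 < RUN_S:
              with lock:
                  force = shared["force_n"]
              print(f"render frame sees {force:.2f} N")
              time.sleep(1.0 / rate_hz)

      if __name__ == "__main__":
          threads = [threading.Thread(target=haptic_loop),
                     threading.Thread(target=render_loop)]
          for t in threads:
              t.start()
          for t in threads:
              t.join()
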
  • Budget exoskeleton delivers muscle-like VR feedback for $400

    Kinethreads is an innovative, budget-friendly exoskeleton suit designed to deliver realistic muscle-like feedback for virtual reality (VR) experiences and movement assistance at a fraction of traditional costs. Priced under $500, the lightweight suit uses nylon threads routed through fabric channels and connected to compact motors that act as synthetic tendons, tightening to guide muscles and stabilize joints. A Raspberry Pi runs the system, coordinating motor actions via Python scripts (a sketch of such a control loop follows below). Initially developed for arm support, the design expanded to include leg stabilization, with motors housed in a belt pack powered by a lithium-polymer battery that enables two hours of active use. Vibration motors provide additional haptic feedback, enhancing the user’s perception of assistance and improving natural control over time. The suit weighs less than five kilograms, can be donned in under 30 seconds, and delivers up to 120 newtons of force along with vibrotactile feedback at frequencies up to 200 hertz. Ten motorized reels on a vest create tension that the body interprets as weight.

    robot, exoskeleton, haptic-feedback, wearable-technology, virtual-reality, muscle-assistance, rehabilitation-technology
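
    The summary states that a Raspberry Pi coordinates the motors from Python but gives no implementation detail. Below is a guessed-at sketch of one reel's tension loop; MockReel stands in for the real motor-driver and tension-sensing hardware, which the article does not describe.

      # Proportional control of one synthetic-tendon reel toward a target tension.
      import time

      MAX_FORCE_N = 120.0               # peak force quoted in the article

      class MockReel:
          """Stand-in for a motorized thread reel with a tension sensor."""
          def __init__(self) -> None:
              self.tension_n = 0.0

          def apply_motor_effort(self, effort: float) -> None:
              # crude first-order response of thread tension to motor effort
              self.tension_n += 0.3 * (effort - self.tension_n)

      def tension_step(reel: MockReel, target_n: float, kp: float = 2.0) -> None:
          """One control tick: nudge motor effort toward the target tension."""
          target_n = min(target_n, MAX_FORCE_N)   # respect the suit's force limit
          error = target_n - reel.tension_n
          reel.apply_motor_effort(reel.tension_n + kp * error)

      if __name__ == "__main__":
          reel = MockReel()
          for _ in range(50):                     # ~0.5 s of ticks at 100 Hz
              tension_step(reel, target_n=40.0)
              time.sleep(0.01)
          print(f"settled at {reel.tension_n:.1f} N")
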
  • Fancy a personal dragon? US students build AI pet that you can touch

    A team of students at Carnegie Mellon University’s Entertainment Technology Center (ETC) has developed Luceal, an innovative AI pet prototype that blends virtual reality with physical interaction. Created under the Physical Presence Pet (PPP) project, Luceal is a plush animal embedded with custom textile sensors that respond to touch, sending signals to an Apple Vision Pro headset that renders real-time virtual animations and reactions (the plush-to-headset link is sketched below). This integration allows users to physically feel and interact with a virtual pet, combining tactile features with expressive digital behavior. The project was guided by professor Olivia Robinson, who introduced the team to e-textiles, enabling the seamless incorporation of conductive fabrics into the plush form. The concept was inspired by the desire for a constant companion, especially for those unable to keep real pets, such as international students, and draws on nostalgia for digital pets like the Tamagotchi. The team envisioned exotic virtual creatures, such as dragons or seals, that users could interact with in ways not possible with real animals.

    robot, AI-pet, virtual-reality, e-textiles, sensors, interactive-technology, wearable-technology
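
    The article describes touch signals flowing from the plush's textile sensors to the headset but not the transport or message format. The sketch below invents both purely for illustration: a polled sensor value is thresholded into "petted" events sent over UDP.

      # Speculative plush-to-headset link; sensor interface and format invented.
      import json
      import random
      import socket
      import time

      HEADSET_ADDR = ("192.168.1.50", 9000)   # hypothetical headset app endpoint

      def read_textile_sensor() -> float:
          """Stand-in for sampling a conductive-fabric touch sensor (0..1)."""
          return random.random()

      def main() -> None:
          sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
          for _ in range(20):                  # ~1 s of polling at 20 Hz
              level = read_textile_sensor()
              if level > 0.8:                  # arbitrary touch threshold
                  event = {"t": time.time(), "event": "petted", "strength": level}
                  sock.sendto(json.dumps(event).encode(), HEADSET_ADDR)
              time.sleep(0.05)

      if __name__ == "__main__":
          main()
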
  • Russian students build anti-drone simulator to train gunners

    Students at Russia’s Southern Federal University (SFU) have developed what is described as the world’s first dedicated anti-drone rifle simulator, powered by Unreal Engine, to train gunners in countering unmanned aerial vehicles (UAVs). The interactive platform places trainees in a virtual test range where radio signals, GPS interference, and device behaviors are modeled in near-real time (one standard ingredient of such radio modeling is sketched below). Users can operate digital replicas of Russian anti-drone rifles like the “Pars” and “Harpy,” as well as detectors such as the “Bulat” v3, against a variety of target drones ranging from small FPV quadcopters to commercial DJI models and Ukrainian military drones like the “Leleka-100.” The simulator includes theoretical lessons, educational materials, and video scenarios designed to improve practical skills, decision-making under stress, and familiarity with electronic-warfare conditions without the cost of live ammunition or hardware. The development responds to the increasing dominance of drones on the Russia-Ukraine battlefield, where UAVs are used extensively.

    robot, drone-technology, simulation-training, unmanned-aerial-vehicles, electronic-warfare, virtual-reality, defense-technology
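
    The summary says radio signals and GPS interference are modeled in near-real time but gives no equations. One standard building block for that kind of modeling is free-space path loss, sketched below; this is illustrative physics, not code from the SFU project.

      # Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
      import math

      def fspl_db(distance_km: float, freq_mhz: float) -> float:
          """Free-space path loss in dB for distance in km, frequency in MHz."""
          return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

      if __name__ == "__main__":
          # e.g. an FPV video link near 5.8 GHz at 500 m
          print(f"{fspl_db(0.5, 5800.0):.1f} dB path loss")   # ~101.7 dB
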
  • You Can Now Feel Touch In VR

    The USC Viterbi School of Engineering has developed a new haptic system that enables users to experience the sense of touch within virtual reality environments. This innovation marks a significant advancement in VR technology by adding tactile feedback, allowing users to physically feel interactions in a digital space. The system enhances immersion and could transform how people engage with virtual content, making online interactions more realistic and intuitive. This breakthrough has broad implications for various applications, including gaming, remote collaboration, education, and training, where the ability to feel virtual objects or interactions can improve user experience and effectiveness. While the article does not provide detailed technical specifications or deployment timelines, the introduction of touch sensation in VR represents a major step toward more comprehensive and multisensory virtual experiences.

    robot, haptics, virtual-reality, human-computer-interaction, wearable-technology, sensory-technology, USC-Viterbi
  • Chinese humanoid robot cooks steak by remote control from 1,118 miles away

    Chinese robotics company Dobot Robotics has demonstrated its humanoid robot, Atom, cooking a steak by remote control from 1,118 miles away using virtual reality (VR). The robot, equipped with five-fingered hands and 28 degrees of freedom, precisely mirrored an engineer’s hand gestures in real time with an accuracy of 0.05 millimeters. The demonstration showcased Atom performing delicate cooking tasks such as patting the steak, pouring oil, flipping the steak, and sprinkling salt, highlighting its human-like dexterity (a sketch of the kind of pose-streaming link involved follows below). Currently, only the robot’s upper body is controllable via VR, while walking remains autonomous or limited. Released in March 2025 at a price of around $27,500, Atom shows that high-precision teleoperation over long distances is feasible, and Dobot envisions applications in hazardous or inaccessible environments like nuclear plants or outer space, as well as precision-demanding tasks such as surgery and housework.

    robotics, humanoid-robot, teleoperation, virtual-reality, remote-control, precision-robotics, Dobot-Robotics
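
    Dobot has not published how operator gestures reach the robot; the sketch below only illustrates the general shape of such a link, streaming timestamped hand poses as fixed-layout UDP packets. The endpoint, rate, and message layout are all assumptions.

      # Operator-side pose streaming for a hypothetical teleoperation link.
      import socket
      import struct
      import time

      ROBOT_ADDR = ("203.0.113.10", 7000)     # placeholder remote endpoint

      def pack_pose(t: float, xyz_mm: tuple[float, float, float],
                    rpy_rad: tuple[float, float, float]) -> bytes:
          """Timestamp + position (mm) + orientation (rad). Doubles preserve far
          more than the 0.05 mm tracking accuracy quoted for Atom."""
          return struct.pack("<7d", t, *xyz_mm, *rpy_rad)

      if __name__ == "__main__":
          sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
          for i in range(100):                # 1 s of streaming at 100 Hz
              packet = pack_pose(time.time(),
                                 (120.00 + 0.05 * i, 40.0, 310.0),
                                 (0.0, 1.57, 0.0))
              sock.sendto(packet, ROBOT_ADDR)
              time.sleep(0.01)
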
  • This Chinese 'school' teaches robots to perform tasks using VR

    A specialized robot training facility in Hefei, China, known as an embodied-intelligence robot training environment, is pioneering the use of virtual reality (VR) to teach robots practical skills in real-world scenarios. Human trainers wearing VR headsets guide robot “students” through fine motor tasks such as picking up tools and tightening screws, with each robot receiving around 200 action sequences daily (a sketch of how such demonstrations might be logged follows below). This hands-on approach allows robots to gather physical data and develop machine-learning models that let them generalize tasks beyond memorized motions, adapting to variable conditions like different screw types or uneven surfaces. The school is China’s first public robot training platform offering shared resources such as computing power, datasets, and realistic simulated environments, which are typically costly for smaller companies to develop independently. It supports multiple business models, allowing companies to co-run the facility, operate it independently, or purchase training services. By bridging the gap between simulated training and real-world performance, the initiative aims to accelerate the development of versatile autonomous robots for sectors such as logistics.

    robot, robotics-training, virtual-reality, machine-learning, automation, industrial-robots, robot-education
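
    The facility's data format is not public; the sketch below shows one common way teleoperated demonstrations are logged for imitation learning, with the 200-sequences-a-day figure from the article as a quota check. All structure here is assumed.

      # Minimal demonstration log for learning-from-demonstration pipelines.
      from dataclasses import dataclass

      @dataclass
      class Step:
          joint_positions: list[float]   # robot arm state at this tick
          gripper_closed: bool           # cameras, forces, etc. would join these

      @dataclass
      class Demonstration:
          task: str                      # e.g. "tighten_screw_M4"
          steps: list[Step]

      def met_daily_quota(demos: list[Demonstration], quota: int = 200) -> bool:
          """Check one robot's day of demonstrations against the quota."""
          return len(demos) >= quota

      if __name__ == "__main__":
          demo = Demonstration("tighten_screw_M4",
                               [Step([0.0] * 7, False), Step([0.1] * 7, True)])
          print(met_daily_quota([demo]))   # False: 199 sequences to go
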
  • FANUC unveils ROBOGUIDE v10 robot simulation software - The Robot Report

    FANUC America has released ROBOGUIDE v10, the most advanced version of its offline robot programming and simulation software, designed to enhance automation design and implementation. The software enables manufacturers to create, program, and simulate robotic workcells in 3D without physical prototypes, reducing costs and improving accuracy. Key improvements in ROBOGUIDE v10 include new virtual reality capabilities for immersive workcell visualization, a high-performance 64-bit architecture for better processing of complex systems, and a modernized user interface with ribbon-style toolbars and drag-and-drop robot definition to streamline navigation and setup. ROBOGUIDE v10 also offers enhanced support for native CAD imports, simplifying the integration and optimization of automation layouts. The software is available alongside the previous version under a shared license, allowing existing users to access the latest features. FANUC supports users with tutorial videos, engineer-guided tips, and technical resources via its Tech Transfer website, aiming to help students, customers, and integrators maximize productivity and reliability.

    robotics, robot-simulation, FANUC, automation-software, offline-programming, virtual-reality, industrial-robots
  • Shape-shifting soft robots offer 16 ways to simulate human touch

    Engineers at EPFL’s Reconfigurable Robotics Lab have developed “Digits,” a modular, shape-shifting soft robotic system that delivers realistic human touch through 16 distinct haptic modes. Powered by compressed air, Digits uses flexible joints and rigid links to change shape and tactile feedback, enabling vibrations, stiffness modulation, and dynamic responses. Two prototypes, TangiGlove (an exoskeleton for the hand) and TangiBall (a handheld module), demonstrate the system’s versatility by morphing into multiple shapes and providing nuanced tactile cues. The approach closes a gap in haptic realism by combining open-chain and closed-chain robotic configurations, allowing complex interactions such as grasping and pressing that most existing devices cannot replicate. Digits is designed for user-friendliness, integrating with the open-source Feelix platform, which employs machine learning to generate intuitive, real-time haptic feedback without requiring users to write code. The pneumatic actuation underlying Digits offers precise control over shape (per-mode pressure control is sketched below).

    robot, soft-robotics, haptic-technology, virtual-reality, pneumatic-actuation, modular-robots, tactile-feedback
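
    The article does not enumerate Digits’ 16 haptic modes. Purely to illustrate the idea of per-mode pneumatic control, the sketch below treats each mode as a pressure waveform: stiffness as a DC offset, vibration as an AC component. The presets are invented.

      # Per-mode pressure setpoints for a pneumatic haptic actuator (illustrative).
      import math

      MODES = {
          "soft_hold":  {"base_kpa": 20.0, "amp_kpa": 0.0,  "freq_hz": 0.0},
          "firm_grasp": {"base_kpa": 80.0, "amp_kpa": 0.0,  "freq_hz": 0.0},
          "buzz":       {"base_kpa": 40.0, "amp_kpa": 15.0, "freq_hz": 50.0},
      }   # invented presets; the real system defines 16 such modes

      def pressure_setpoint(mode: str, t: float) -> float:
          """Pressure command in kPa for a given mode at time t (seconds)."""
          m = MODES[mode]
          return m["base_kpa"] + m["amp_kpa"] * math.sin(2 * math.pi * m["freq_hz"] * t)

      if __name__ == "__main__":
          for i in range(5):                 # a few 200 Hz control ticks
              t = i / 200.0
              print(f"t={t:.3f}s buzz -> {pressure_setpoint('buzz', t):.1f} kPa")
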