RIEM News

Articles tagged with "assistive-technology"

  • Neuralink's breakthrough lets patient control robot with thoughts

    Neuralink has achieved a significant breakthrough in brain-computer interface technology by enabling an amyotrophic lateral sclerosis (ALS) patient, Nick Wray, to control a robotic arm using only his thoughts. Through an implanted brain chip, Wray was able to perform everyday tasks such as microwaving food, drinking from a cup, opening a refrigerator, and even maneuvering his wheelchair. This milestone was demonstrated during the FDA-approved “CONVOY” study, which aims to restore independence for people with severe mobility impairments by translating neural signals into Bluetooth commands that control external devices. The implant, called the N1 chip, is a small device equipped with 128 ultra-fine threads containing about 1,000 electrodes that connect directly to the brain’s surface. These electrodes detect neural activity and convert it into precise digital commands. Neuralink began human trials in 2024 after overcoming initial FDA safety concerns. Eight participants have received the implant so far, including the first recipient, Noland Arbaugh

    robot, brain-computer-interface, Neuralink, assistive-technology, medical-robotics, brain-implant, robotic-arm-control
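
Neuralink has not published its decoder, but the pipeline described above (per-electrode activity binned into features, mapped to a movement command, and relayed to an external device) can be sketched generically. The snippet below is a hypothetical illustration only; the electrode count comes from the article, while the linear decoder, bin width, and device interface are assumptions.

```python
# Hypothetical sketch of the decode loop described above: binned neural
# features are mapped by a pre-trained linear decoder to a 2-D velocity
# command for an external device (cursor, wheelchair, robotic arm).
# This does not reflect Neuralink's actual firmware or APIs.
import numpy as np

N_CHANNELS = 1000          # approximate electrode count cited in the article
BIN_MS = 50                # feature bin width (assumed)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(2, N_CHANNELS))   # decoder weights (would be fit per user)
b = np.zeros(2)

def decode_velocity(spike_counts: np.ndarray) -> np.ndarray:
    """Map one bin of per-channel spike counts to a (vx, vy) command."""
    return W @ spike_counts + b

def send_to_device(velocity: np.ndarray) -> None:
    """Stand-in for the Bluetooth link to the external device."""
    print(f"cmd vx={velocity[0]:+.2f} vy={velocity[1]:+.2f}")

# Simulated real-time loop over a few bins of activity.
for _ in range(5):
    counts = rng.poisson(lam=2.0, size=N_CHANNELS)  # fake neural activity
    send_to_device(decode_velocity(counts))
```
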
  • This Startup Wants to Put Its Brain-Computer Interface in the Apple Vision Pro

    Startup Cognixion is launching a clinical trial to integrate its noninvasive brain-computer interface (BCI) technology with Apple’s Vision Pro headset to help paralyzed individuals with speech impairments communicate using their thoughts. Unlike implant-based BCIs from companies like Neuralink, Cognixion’s system uses a custom headband equipped with six EEG sensors that detect brain signals related to visual fixation, enabling users to select options via mental attention. The trial will involve up to 10 participants in the US with speech disorders caused by conditions such as spinal cord injury, stroke, traumatic brain injury, or ALS. Cognixion’s technology combines hardware with AI-driven software that customizes communication models based on each user’s speech history and patterns, allowing for near-normal conversation speeds. The technology was previously tested with ALS patients on Cognixion’s own Axon-R headset; the company now aims to leverage the broader functionality and app ecosystem of the Vision Pro to democratize access to BCI communication tools.

    robot, brain-computer-interface, wearable-technology, assistive-technology, augmented-reality, AI-communication, medical-devices
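
Cognixion has not detailed how its six-sensor headband decodes visual fixation. A common noninvasive approach to attention-based selection is SSVEP-style frequency tagging, where each on-screen option flickers at its own rate and the attended option's frequency dominates the EEG spectrum. The minimal sketch below illustrates that generic idea; the option names, sampling rate, and flicker frequencies are made up.

```python
# Minimal SSVEP-style selection sketch: each on-screen option flickers at a
# distinct frequency; the option whose frequency carries the most power in the
# EEG window is taken as the user's selection. Purely illustrative -
# Cognixion's actual decoder is not public.
import numpy as np

FS = 256                       # sample rate in Hz (assumed)
OPTION_FREQS = {"yes": 8.0, "no": 10.0, "help": 12.0}   # flicker rates in Hz (assumed)

def band_power(signal: np.ndarray, freq: float, fs: int, width: float = 0.5) -> float:
    """Power of `signal` within +/- `width` Hz of `freq`."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return float(spectrum[np.abs(freqs - freq) <= width].sum())

def classify_fixation(eeg_window: np.ndarray) -> str:
    """Pick the option whose tag frequency carries the most power."""
    return max(OPTION_FREQS, key=lambda k: band_power(eeg_window, OPTION_FREQS[k], FS))

# Simulate 2 s of EEG in which the user attends the 10 Hz ("no") target.
t = np.arange(2 * FS) / FS
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.normal(size=t.size)
print(classify_fixation(eeg))   # expected: "no"
```
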
  • Robotic exoskeleton gives YouTuber 63% aim boost, 17ms latency

    YouTuber Nick Zetta, known as Basically Homeless, developed a robotic exoskeleton aimed at enhancing aiming performance in the Aimlabs training program. Combining Nvidia Jetson hardware with a YOLO-powered AI vision system, motors, and 3D-printed components, the device physically guides his wrist and fingers to improve target acquisition. Initial tests showed a 20% accuracy drop as Zetta adapted to the system, but after hardware optimizations—such as reducing latency from 50ms to 17ms and increasing motor strength—he achieved a 63% boost in his Aimlabs score, propelling him to second place on the global leaderboard. The exoskeleton attaches to the forearm using 3D-printed hinges, with Kevlar lines and gimbal motors controlling wrist movements and solenoids managing finger clicks. A high-speed camera feeds real-time target data to the AI, which directs the motors to adjust hand positioning, effectively acting as a physical aimbot.

    robotics, robotic-exoskeleton, AI-vision, computer-vision, Nvidia-Jetson, 3D-printing, assistive-technology
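
The video does not publish code, but the loop described above (camera frame, YOLO-style target detection, motor correction within a tight latency budget) is easy to caricature. The toy proportional-correction loop below is illustrative only; the gain, the stubbed detector, and the motor interface are invented, and only the roughly 17 ms loop budget comes from the article.

```python
# Toy version of the aim-assist loop described above: a vision model reports
# the on-screen offset between crosshair and target, and a proportional
# controller converts that offset into small wrist-motor corrections.
# Gains and function names are illustrative only.
import time

KP = 0.4                 # proportional gain (made up)
LOOP_BUDGET_S = 0.017    # ~17 ms end-to-end target cited in the article

def detect_target_offset() -> tuple[float, float]:
    """Stand-in for the YOLO detector: returns (dx, dy) pixels to the target."""
    return (12.0, -5.0)

def move_wrist(dx_cmd: float, dy_cmd: float) -> None:
    """Stand-in for the gimbal-motor interface pulling the Kevlar lines."""
    print(f"wrist correction: dx={dx_cmd:+.1f}px dy={dy_cmd:+.1f}px")

for _ in range(3):
    start = time.perf_counter()
    dx, dy = detect_target_offset()
    move_wrist(KP * dx, KP * dy)            # proportional correction toward target
    elapsed = time.perf_counter() - start
    time.sleep(max(0.0, LOOP_BUDGET_S - elapsed))   # hold a fixed loop period
```
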
  • Neuralink performs first-ever brain implant surgeries in Canada

    Neuralink has successfully performed its first brain-computer interface implant surgeries in Canada, marking a significant expansion of its clinical trials beyond the United States and the United Kingdom. Two patients with cervical spinal cord injuries underwent robotic-assisted implantation of Neuralink’s wireless brain device at the University Health Network (UHN) in Toronto as part of the CAN-PRIME Study. This study aims to assess the safety of the implant and surgical robot, and to determine whether individuals with paralysis can use their thoughts to control external devices such as cursors, text messaging, or robotic arms. Recruitment for the study is ongoing, including patients with cervical spinal injuries or amyotrophic lateral sclerosis (ALS). The implants hold promise for dramatically improving the quality of life for people with paralysis by enabling them to perform everyday tasks like checking emails or using smart home devices through thought control. The surgeries underscore Canada’s growing prominence in neurotechnology research, with UHN recognized as a leading center for surgical innovation. Neuralink was founded by Elon Musk.

    robot, brain-computer-interface, neural-implants, neurotechnology, robotic-surgery, assistive-technology, wireless-devices
  • Neuralink’s Bid to Trademark ‘Telepathy’ and ‘Telekinesis’ Faces Legal Issues

    Neuralink, the brain implant company co-founded by Elon Musk, has encountered legal challenges in its attempt to trademark the terms "Telepathy" and "Telekinesis." The United States Patent and Trademark Office (USPTO) rejected Neuralink’s applications due to prior filings by Wesley Berry, a computer scientist and co-founder of tech startup Prophetic, who submitted trademark applications for "Telepathy" in May 2023 and "Telekinesis" in August 2024. Berry’s applications, filed as “intent-to-use,” describe software analyzing EEG data to decode internal dialogue for device control, though he has not yet commercialized products under these names. Additionally, the USPTO cited an existing trademark for Telepathy Labs, a company offering voice and chatbot technology, in its refusal to advance Neuralink’s application for "Telepathy." Neuralink has been using the name "Telepathy" for its brain implant product designed to enable paralyzed individuals to operate phones and computers via thought.

    robot, brain-computer-interface, neural-implants, wearable-technology, EEG-analysis, assistive-technology, human-machine-interaction
  • AI brain interface lets users move robot arm with pure thought

    Researchers at the University of California, Los Angeles (UCLA) have developed a new wearable, noninvasive brain-computer interface (BCI) system that uses artificial intelligence (AI) to help individuals with physical disabilities control robotic arms or computer cursors through thought. Unlike previous BCI devices that required invasive neurosurgery, this system combines an electroencephalography (EEG) cap with a camera-based AI platform to decode brain signals and interpret user intent in real time. The AI acts as a “co-pilot,” enhancing the user’s control by guiding actions such as moving objects, thereby offering a safer and more practical alternative for people with paralysis or neurological disorders. In trials involving four participants—including one paralyzed individual—the AI-assisted system enabled faster and more accurate task completion, such as moving a cursor to targets and manipulating blocks with a robotic arm. Notably, the paralyzed participant was able to complete a robotic arm “pick-and-place” task in about six and a half minutes

    robotics, brain-computer-interface, artificial-intelligence, assistive-technology, wearable-technology, neural-engineering, robotic-arm-control
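
The UCLA system frames the AI as a co-pilot that augments the EEG-decoded command rather than replacing it. A standard way to express that idea is shared autonomy: blend the decoded user velocity with an autonomous policy that moves toward the camera-detected target, weighted by an assistance level. The sketch below shows only that blending arithmetic; the function names and numbers are assumptions, not the published method.

```python
# Shared-autonomy sketch for the "AI co-pilot" idea: the final command sent to
# the robotic arm is a blend of the EEG-decoded user velocity and an
# autonomous policy that steers toward the target the camera system has
# identified. The blending weight is the assistance level. Illustrative only.
import numpy as np

def autonomous_policy(end_effector: np.ndarray, target: np.ndarray, gain: float = 0.8) -> np.ndarray:
    """Velocity that moves the gripper straight toward the detected target."""
    return gain * (target - end_effector)

def blend(user_cmd: np.ndarray, auto_cmd: np.ndarray, assistance: float) -> np.ndarray:
    """assistance=0 -> pure user control, assistance=1 -> fully autonomous."""
    return (1.0 - assistance) * user_cmd + assistance * auto_cmd

gripper = np.array([0.0, 0.0, 0.2])
block = np.array([0.3, 0.1, 0.0])          # target identified by the camera/AI
user_velocity = np.array([0.2, 0.0, 0.0])  # noisy EEG-decoded intent

print(blend(user_velocity, autonomous_policy(gripper, block), assistance=0.5))
```
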
  • China Is Building a Brain-Computer Interface Industry

    China has unveiled an ambitious policy roadmap aiming to establish itself as a global leader in brain-computer interface (BCI) technology by 2030, with breakthroughs targeted by 2027. BCIs, which decode neural activity to control external devices, hold significant promise for assisting people with severe physical disabilities. The policy, jointly issued by seven Chinese government departments, outlines 17 specific steps including developing advanced brain signal chips, improving decoding software, standardizing technology, and building manufacturing capacity. This initiative reflects China’s broader strength in rapidly translating research into commercial products, as seen in other sectors like photovoltaics and electric vehicles. Although BCI research began in the 1970s, practical applications have only recently become feasible due to technological advances. China entered the field later than the US but is quickly closing the gap. Chinese companies and research institutions have successfully implanted BCIs in paralyzed patients, enabling them to control computer cursors and robotic arms, and even to decode speech.

    robot, brain-computer-interface, neuralink, assistive-technology, neuroengineering, China-technology-policy, BCI-development
  • Wearable robot helps ALS patients regain daily function

    The article discusses a wearable robotic device developed by Harvard bioengineers to assist individuals with movement impairments caused by neurodegenerative diseases like ALS or stroke. The device, designed as a sensor-loaded vest with an inflatable balloon under the arm, provides mechanical assistance to weak limbs, helping users perform daily tasks such as eating, brushing teeth, or combing hair. A key advancement in the latest version is the integration of a machine learning model that personalizes assistance by learning the user’s specific intended movements through motion and pressure sensors. This personalized approach addresses previous challenges where users struggled to control the robot’s movements due to insufficient residual strength. The research, led by Conor Walsh at Harvard’s John A. Paulson School of Engineering and Applied Sciences in collaboration with clinicians from Massachusetts General Hospital and Harvard Medical School, emphasizes a multidisciplinary approach involving both patient and clinician input from the outset. ALS patient Kate Nycz, diagnosed in 2018, has actively contributed to the device’s development through data and user testing

    wearable-robot, assistive-technology, ALS, machine-learning, personalized-robotics, neurorehabilitation, mobility-aid
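
The summary says the vest learns each user's intended movements from motion and pressure sensors before triggering assistance. As a stand-in for that personalization step, the sketch below fits per-user class centroids on synthetic calibration windows and classifies new windows by nearest centroid; the feature set, movement classes, and classifier are assumptions rather than the published Harvard model.

```python
# Minimal stand-in for a personalized intent detector: short windows of
# motion/pressure features are compared to per-user centroids learned from
# that user's own calibration data (nearest-centroid classifier).
import numpy as np

CLASSES = ["rest", "reach_forward", "hand_to_mouth"]   # assumed movement classes

def fit_centroids(windows: np.ndarray, labels: np.ndarray) -> dict[str, np.ndarray]:
    """Average this user's feature windows per intended movement."""
    return {c: windows[labels == c].mean(axis=0) for c in CLASSES}

def predict_intent(window: np.ndarray, centroids: dict[str, np.ndarray]) -> str:
    """Return the class whose centroid is closest to the new window."""
    return min(centroids, key=lambda c: np.linalg.norm(window - centroids[c]))

# Tiny synthetic calibration set: 3 features (e.g. arm pitch, velocity, pressure).
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(m, 0.1, size=(20, 3)) for m in (0.0, 0.5, 1.0)])
y = np.repeat(CLASSES, 20)

model = fit_centroids(X, y)
print(predict_intent(np.array([0.95, 1.02, 1.05]), model))   # -> "hand_to_mouth"
```
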
  • Soft robot jacket offers support for upper-limb disabilities

    Researchers at Harvard John A. Paulson School of Engineering and Applied Sciences, in collaboration with Massachusetts General Hospital and Harvard Medical School, have developed a soft, wearable robotic jacket designed to assist individuals with upper-limb impairments caused by conditions such as stroke and ALS. This device uses a combination of machine learning and a physics-based hysteresis model to personalize movement assistance by accurately detecting the user’s motion intentions through sensors. The integrated real-time controller adjusts the level of support based on the user’s specific movements and kinematic state, enhancing control transparency and practical usability in daily tasks like eating and drinking. In trials involving stroke and ALS patients, the robotic jacket demonstrated a 94.2% accuracy in identifying subtle shoulder movements and reduced the force needed to lower the arm by nearly one-third compared to previous models. It also improved movement quality by increasing range of motion in the shoulder, elbow, and wrist, reducing compensatory trunk movements by up to 25.4%, and enhancing hand-path efficiency.

    soft-robotics, wearable-robots, upper-limb-support, assistive-technology, machine-learning, rehabilitation-robotics, human-robot-interaction
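
The jacket's controller reportedly combines learned intent detection with a physics-based hysteresis model of the soft actuator. One textbook way to represent actuator hysteresis is a play (backlash) operator, and a controller that knows the dead-band can pre-compensate its commands by movement direction. The sketch below demonstrates that generic idea only; it is not the model or controller from the paper.

```python
# A "play" (backlash) operator: the output only follows the input once the
# input has moved more than a dead-band r away from the last output. A
# direction-aware controller can pre-shift its command to cancel the dead-band.
# Generic illustration, not the published Harvard model.
import numpy as np

def play_operator(x: np.ndarray, r: float) -> np.ndarray:
    """Backlash hysteresis with half-width r applied to input sequence x."""
    y = np.empty_like(x)
    y[0] = x[0]
    for k in range(1, len(x)):
        y[k] = min(max(y[k - 1], x[k] - r), x[k] + r)
    return y

def precompensate(x: np.ndarray, r: float) -> np.ndarray:
    """Shift the command by +/- r in the direction of motion to cancel the dead-band."""
    direction = np.sign(np.diff(x, prepend=x[0]))
    return x + r * direction

t = np.linspace(0, 2 * np.pi, 200)
command = np.sin(t)                           # desired actuator pressure profile
naive = play_operator(command, r=0.2)         # what the hysteretic actuator does
compensated = play_operator(precompensate(command, r=0.2), r=0.2)

print("naive tracking error:      ", float(np.abs(naive - command).max()))
print("compensated tracking error:", float(np.abs(compensated - command).max()))
```
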
  • Woman regains speech 18 years after stroke with brain implant

    Eighteen years after suffering a brainstem stroke that left her with locked-in syndrome and near-total paralysis, Ann Johnson regained the ability to speak through an AI-powered brain-computer interface (BCI). The implant, placed over her brain’s speech motor cortex, detects neural signals when she attempts to speak and translates them via an AI decoder into audible words and facial animations on a digital avatar. Initially, the system had an eight-second delay due to sentence-based processing, but recent advances reported in 2025 have reduced this latency to about one second using a streaming AI architecture, enabling near-real-time communication. Johnson’s voice was personalized using recordings from her 2004 wedding speech, and she selected an avatar that mimics her facial expressions. The clinical trial, led by researchers at UC Berkeley and UCSF, aims to transform neuroprostheses from experimental devices into practical, plug-and-play clinical tools. Future developments may include wireless implants and photorealistic avatars to enhance natural interaction.

    robot, AI, brain-computer-interface, neuroprosthetics, medical-technology, speech-restoration, assistive-technology
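
The latency improvement is attributed to moving from sentence-level decoding to a streaming architecture. The toy sketch below contrasts the two scheduling styles: buffering an entire utterance versus emitting output every few decoded frames. The placeholder decoder just echoes text, and the chunk size is an assumption.

```python
# Toy contrast between sentence-level and streaming decoding, the change the
# article credits for cutting latency from ~8 s to ~1 s. The "decoder" here is
# a placeholder that simply echoes text.
from collections.abc import Iterator

def neural_frames() -> Iterator[str]:
    """Stand-in for decoded sub-word units arriving from the implant."""
    yield from ["I", " am", " thrilled", " to", " speak", " again", "."]

def sentence_level(frames: Iterator[str]) -> Iterator[str]:
    """Old approach: buffer everything until the sentence ends, then emit once."""
    yield "".join(frames)

def streaming(frames: Iterator[str], chunk: int = 2) -> Iterator[str]:
    """Streaming approach: emit audio-ready text every few frames."""
    pending: list[str] = []
    for frame in frames:
        pending.append(frame)
        if len(pending) >= chunk:
            yield "".join(pending)
            pending.clear()
    if pending:
        yield "".join(pending)

print(list(sentence_level(neural_frames())))   # one long, late output
print(list(streaming(neural_frames())))        # several small, early outputs
```
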
  • GR-3 humanoid robot debuts with empathy, emotion, and lifelike walk

    The GR-3 humanoid robot, unveiled by Fourier on August 6, 2025, represents a significant advancement in human-robot interaction by emphasizing empathy, emotional awareness, and lifelike movement. Standing 165 cm tall and weighing 71 kg, GR-3 features 55 degrees of freedom enabling natural, balanced motion, including expressive gaits such as a “bouncy walk.” Its design incorporates a soft-touch shell with warm tones and premium upholstery to create a familiar, comforting presence rather than a mechanical one. Central to its capabilities is Fourier’s Full-Perception Multimodal Interaction System, which integrates vision, audio, and tactile inputs into a real-time emotional processing engine. This system allows GR-3 to localize voices, maintain eye contact, recognize faces, and respond to touch via 31 pressure sensors, producing subtle emotional gestures that simulate genuine empathy. Beyond sensing, GR-3 employs a dual-path cognitive architecture combining fast, reflexive responses with slower, context-aware reasoning

    robotics, humanoid-robot, emotional-AI, human-robot-interaction, healthcare-robotics, empathetic-robots, assistive-technology
  • Neuralink brain chip trials launch in Britain for paralyzed patients

    Neuralink, Elon Musk’s brain-implant company, has initiated its first European clinical trial in the UK, aiming to test its brain-computer interface (BCI) technology on seven patients with severe paralysis caused by spinal cord injuries or neurological conditions like ALS. The trial, conducted in partnership with University College London Hospitals NHS Foundation Trust and Newcastle upon Tyne Hospitals, involves implanting Neuralink’s N1 chip under the skull to enable patients to control digital devices such as smartphones and tablets using only their thoughts. This marks the UK as the first European country to host such a study and builds on Neuralink’s earlier human trials in the US, where five paralyzed patients have already used the chip to operate devices mentally. Neuralink’s N1 chip is a small device, about the size of a 10-pence coin, equipped with 128 ultra-thin threads that connect approximately 1,000 electrodes to the brain to read electrical activity and translate it into digital commands.

    robot, IoT, brain-computer-interface, Neuralink, medical-technology, assistive-technology, neurotechnology
  • AI robot arm builds meals and helps users with limited mobility

    Engineers at Virginia Tech have developed an advanced robotic arm designed to assist people with limited mobility in performing everyday tasks, such as preparing meals. The system features adaptive grippers that combine rigid mechanics with soft, switchable adhesives, enabling the robot to handle a wide range of objects—from heavy items like metal pans to delicate ingredients like sprinkles. This innovation addresses the challenge that traditional robots face when gripping irregular or fragile items, by allowing the grippers to switch between strong adhesion and easy release. The robotic arm is controlled via a joystick-style interface, allowing users to guide the robot’s movements while artificial intelligence interprets and completes the tasks. This collaboration was demonstrated through complex activities like assembling a pizza, which involves handling diverse textures and shapes, and building an ice cream sundae with small, delicate toppings. Funded by over $600,000 from the National Science Foundation, the project aims to enhance independence for people with disabilities by making robotic assistance more intuitive and closely aligned with natural human motions.

    robotics, assistive-technology, robotic-arm, adaptive-grippers, AI-control, soft-robotics, disability-aid
  • Neuralink helps paralysed woman write her name after 20 years

    Audrey Crews, a woman paralyzed for over 20 years, has successfully written her name using only her mind, thanks to Elon Musk’s Neuralink brain-computer interface (BCI) technology. Crews, who lost movement at age 16, is the first woman to receive the Neuralink implant, which involves brain surgery to insert 128 threads into her motor cortex. The chip, about the size of a quarter, enables her to control a computer purely through brain signals, marking a significant milestone in BCI development. However, Crews clarified that the implant does not restore physical mobility but is designed solely for telepathic control of digital devices. Neuralink’s PRIME Study, which tests these implants in human subjects, includes other participants such as Nick Wray, who also shared positive early experiences with the technology. Wray, living with ALS, expressed hope and excitement about the potential for digital autonomy and the future impact of BCIs. Neuralink was founded in 2016.

    robot, brain-computer-interface, Neuralink, assistive-technology, medical-implant, human-machine-interaction, neurotechnology
  • Photos: Meta's new wristband translates hand movements to digital commands

    Meta researchers have developed a novel wristband called sEMG-RD (surface electromyography research device) that translates hand gestures into digital commands by interpreting electrical motor nerve signals from muscle movements at the wrist. The device uses 16 gold-plated dry electrodes arranged around the wrist to capture muscle contraction signals at a high sampling rate, enabling real-time gesture recognition without the need for skin preparation or conductive gels. Its modular design accommodates different wrist sizes and muscle configurations, while separating the heavier processing components into a separate capsule to enhance user comfort. The sEMG-RD supports a wide range of computer interactions beyond simple cursor control, including finger pinches, thumb swipes, thumb taps, and handwriting-like text entry at speeds of about 20.9 words per minute. By employing deep learning models trained on data from many users, the system can decode gestures generically without requiring personalized calibration, facilitating broad usability. The device is designed for ease of use, supporting both left- and right-handed users

    IoT, wearable-technology, electromyography, Bluetooth-devices, human-computer-interaction, gesture-recognition, assistive-technology
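
Meta's sEMG-RD decoding models are learned from large multi-user datasets, which the sketch below does not attempt to reproduce. Instead it shows the classical envelope-threshold version of gesture event detection on 16 channels: rectify, smooth, sum, and fire an event when the envelope crosses a threshold with a refractory period. The sampling rate, threshold, and simulated burst are assumptions.

```python
# Stand-in for the wristband's gesture-event stage: rectify and smooth each of
# the 16 sEMG channels, then report an event when the summed envelope crosses a
# threshold (with a refractory period so one pinch is not counted twice).
# The real sEMG-RD decoder uses learned models; this is only a classical sketch.
import numpy as np

FS = 2000                 # sampling rate in Hz (assumed)
N_CH = 16                 # electrode count from the article
THRESHOLD = 4.0           # envelope units (assumed)
REFRACTORY_S = 0.15

def envelope(emg: np.ndarray, win: int = 50) -> np.ndarray:
    """Moving average of the rectified signal per channel, then summed across channels."""
    rect = np.abs(emg)
    kernel = np.ones(win) / win
    smoothed = np.vstack([np.convolve(ch, kernel, mode="same") for ch in rect])
    return smoothed.sum(axis=0)

def detect_events(env: np.ndarray) -> list[float]:
    """Times (s) where the envelope crosses THRESHOLD, with a refractory gap."""
    events, last = [], -np.inf
    for i, v in enumerate(env):
        t = i / FS
        if v > THRESHOLD and t - last > REFRACTORY_S:
            events.append(round(t, 3))
            last = t
    return events

# Simulate 1 s of noise with a burst (a "pinch") at 0.5 s on all channels.
rng = np.random.default_rng(3)
emg = 0.1 * rng.normal(size=(N_CH, FS))
emg[:, 1000:1100] += 1.0
print(detect_events(envelope(emg)))   # roughly [0.5]
```
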
  • There's Neuralink—and There's the Mind-Reading Company That Might Surpass It

    The article contrasts two brain-computer interface (BCI) technologies aimed at helping people with paralysis regain autonomy: Elon Musk’s Neuralink and the startup Synchron. Unlike Neuralink, which requires invasive open-skull brain surgery, Synchron’s BCI is implanted via a less invasive procedure through blood vessels, avoiding direct brain surgery. The article follows Mark Jackson, a 65-year-old man with ALS (amyotrophic lateral sclerosis), who uses Synchron’s implant to control a computer game with his thoughts. Despite his paralysis, Jackson can steer a cursor by thinking about specific hand movements, demonstrating how the system decodes neural signals linked to intended actions using AI-powered software. Jackson’s journey highlights the potential of Synchron’s technology to restore independence for people with neurodegenerative diseases. After a multi-hour implantation procedure and months of calibration, Jackson successfully connected the internal implant with an external unit, enabling him to interact with digital devices through thought alone. The implant does not slow the progression of ALS itself.

    robot, brain-computer-interface, neural-technology, assistive-technology, medical-devices, neurotechnology, ALS-treatment
  • Robotic hand moves like magic, controlled by nothing but thought

    Researchers at Carnegie Mellon University have achieved a breakthrough in noninvasive brain-computer interface (BCI) technology by enabling real-time control of a robotic hand’s individual fingers using only human thought. Utilizing electroencephalography (EEG) combined with a novel deep-learning decoding strategy, the system translates brain signals into precise finger movements without any muscle activity. Volunteers successfully performed multi-finger tasks, demonstrating the system’s ability to overcome traditional EEG spatial limitations and achieve fine motor control. Led by Professor Bin He, whose lab has pioneered several EEG-powered robotic controls, this innovation offers a risk-free, external alternative to invasive BCIs that require surgery. The technology holds significant promise for a broad range of users, including people with motor impairments or those recovering from injuries, by enhancing hand function and quality of life. Beyond medical rehabilitation, the system’s natural dexterity opens possibilities for everyday tasks like typing or manipulating small objects, potentially redefining how assistive devices integrate seamlessly as intuitive extensions of the human body

    robotics, brain-computer-interface, noninvasive-BCI, deep-learning, prosthetics, assistive-technology, EEG-control
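
The CMU work uses a trained deep network for finger-level decoding; the skeleton below only shows the shape of such a pipeline with a crude substitute (per-channel mu/beta band power fed to a linear softmax classifier with random, untrained weights). The channel count, window length, and class labels are assumptions.

```python
# Skeleton of the decoding step described above: short EEG windows are reduced
# to per-channel band-power features and mapped to one of several finger
# movements. The random weights here only show the shape of the pipeline;
# the real system uses a trained deep network.
import numpy as np

FS = 250                       # sample rate in Hz (assumed)
CHANNELS = 64                  # electrode count (assumed)
FINGERS = ["thumb", "index", "middle", "ring+little"]

def band_power_features(window: np.ndarray, lo: float = 8.0, hi: float = 30.0) -> np.ndarray:
    """Mu/beta-band power per channel from one EEG window (channels x samples)."""
    spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / FS)
    band = (freqs >= lo) & (freqs <= hi)
    return np.log(spectrum[:, band].sum(axis=1) + 1e-9)

def classify(features: np.ndarray, weights: np.ndarray) -> str:
    """Softmax over linear scores; return the most likely finger class."""
    logits = weights @ features
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return FINGERS[int(np.argmax(probs))]

rng = np.random.default_rng(4)
W = rng.normal(size=(len(FINGERS), CHANNELS))     # would be learned from training data
eeg_window = rng.normal(size=(CHANNELS, FS))      # 1 s of simulated EEG
print(classify(band_power_features(eeg_window), W))
```
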
  • NAU researchers release open-source exoskeleton framework - The Robot Report

    Researchers at Northern Arizona University (NAU), working in associate professor Zach Lerner’s Biomechatronics Lab, have developed and released OpenExo, a comprehensive open-source robotic exoskeleton framework. This framework aims to lower the barriers to entry in exoskeleton development by providing free access to design files, code, and step-by-step building instructions for single- or multi-joint exoskeletons. OpenExo addresses the high costs, complexity, and interdisciplinary challenges involved in creating effective biomechanical exoskeletons, which traditionally require extensive trial, error, and collaboration across engineering, computer science, and physiology fields. Lerner’s team has a proven track record of applying exoskeleton technology to help children with cerebral palsy and patients with gait disorders, securing millions in grant funding and launching a spin-off company that brought a robotic ankle device to market. The lab has also been awarded nine patents related to exoskeleton development.

    robot, exoskeleton, biomechanics, rehabilitation-technology, open-source-robotics, wearable-robotics, assistive-technology
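
OpenExo's actual design files and code live in the project's own repository, so the snippet below is deliberately not an excerpt from it. It only gives the flavor of a single-joint assistance loop one might prototype with such a framework: estimate stride phase, look up a desired torque profile, and command the motor. The stride time, torque profile, and function names are invented.

```python
# Minimal flavor of a single-joint exoskeleton control loop: estimate gait
# phase, evaluate a desired assistance-torque profile, and command the motor.
# Illustrative only - not the OpenExo API or file layout.
import math

STRIDE_S = 1.2          # assumed stride duration in seconds
PEAK_TORQUE_NM = 10.0   # assumed peak assistance torque

def desired_torque(phase: float) -> float:
    """Smooth assistance bump centred late in stance; phase is 0..1 of the stride."""
    return PEAK_TORQUE_NM * math.exp(-((phase - 0.45) ** 2) / 0.01)

def command_motor(torque_nm: float) -> None:
    """Stand-in for the low-level motor driver."""
    print(f"ankle assistance: {torque_nm:5.2f} N*m")

# A few control ticks across one simulated stride (200 ms apart for brevity).
for step in range(6):
    t = step * 0.2
    phase = (t % STRIDE_S) / STRIDE_S
    command_motor(desired_torque(phase))
```
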
  • China tests neural implant that lets amputee move cursor with mind

    Chinese researchers have successfully tested an advanced invasive brain-computer interface (BCI) implant that enables a 37-year-old quadruple amputee to control a computer cursor with his mind. The implant, a coin-sized device with ultra-small, flexible electrodes developed by the Chinese Academy of Sciences (CAS), was implanted into the patient’s motor cortex. Within weeks, he was able to perform tasks such as playing chess and gaming with near-normal skill. The electrode is notable for being about one-fifth the thickness of Neuralink’s electrodes and highly flexible, minimizing tissue disruption and immune rejection. The implant underwent extensive preclinical testing on mice and macaques before human trials began. The surgical procedure took less than 30 minutes, using advanced 3D brain mapping and real-time navigation to ensure precise placement. Moving forward, the research team plans to expand trials to include up to 40 patients with paralysis or ALS by 2026. Future phases will focus on training participants to control robotic arms for practical tasks

    robot, brain-computer-interface, neural-implant, medical-robotics, brain-machine-interface, assistive-technology, neural-electrodes
  • New brain-computer tech lets paralyzed patient talk in real time

    A new investigational brain-computer interface (BCI) developed by researchers at the University of California, Davis, has enabled a paralyzed patient with amyotrophic lateral sclerosis (ALS) to communicate in real time using a synthesized version of his own voice. ALS causes loss of muscle control, including speech, making communication difficult or impossible. This BCI system uses surgically implanted microelectrode arrays in the brain’s speech region to capture neural activity, which is then decoded by advanced AI algorithms to produce near-instantaneous audible speech. The technology significantly reduces the delay seen in previous speech neuroprostheses, allowing for more natural, spontaneous conversations with a delay as low as one-fortieth of a second. The system was tested on a 45-year-old participant in the BrainGate2 clinical trial, who was asked to attempt speaking sentences displayed on a screen while his brain activity was recorded. The AI model mapped his neural firing patterns to intended speech sounds.

    robot, brain-computer-interface, neuroprosthetics, real-time-voice-synthesis, microelectrode-arrays, assistive-technology, paralysis-communication
  • A Neuralink Rival Just Tested a Brain Implant in a Person

    Paradromics, an Austin-based neurotechnology company founded in 2015, has conducted its first human test of Connexus, a brain implant designed to restore speech and communication in people with paralysis caused by spinal cord injury, stroke, or ALS. The device translates neural signals into synthesized speech, text, and cursor control by recording electrical activity from individual neurons via 420 tiny electrodes embedded in the brain tissue. The initial human implantation occurred on May 14 at the University of Michigan during epilepsy surgery, where the device was temporarily inserted into the temporal lobe using a specialized EpiPen-like tool. This procedure allowed researchers to confirm the device’s ability to capture neural signals with high resolution, which is critical for accurately decoding intended speech. Connexus is part of a growing field of brain-computer interface (BCI) technologies, including Elon Musk’s Neuralink and Synchron, which also develop implants to interpret neural signals but differ in electrode design and signal resolution. Unlike other devices that record from groups of neurons, Paradromics’ implant targets individual neurons to achieve higher-quality signals. BCIs do not read private thoughts but decode neural patterns associated with intended movements, such as facial muscle activity involved in speech. Recent studies from Stanford and UC San Francisco have demonstrated the ability to decode intended speech at rates approaching half of normal speaking speed in paralyzed individuals. Paradromics aims to launch a clinical trial by the end of 2025 to implant Connexus long-term in patients with paralysis, advancing toward commercial availability despite the regulatory and technical challenges of fully implantable brain devices.

    robot, brain-computer-interface, neural-implants, medical-devices, neurotechnology, assistive-technology, biomedical-engineering
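
For single-neuron recordings like those Connexus targets, the classic first processing stage is threshold-crossing spike detection on a filtered channel, followed by binning the counts for the downstream decoder. The sketch below implements that generic stage on simulated data; the sampling rate, threshold multiplier, and bin width are assumptions, and Paradromics' actual signal chain is not public.

```python
# Classic first stage for single-neuron recordings: detect spikes as negative
# threshold crossings (here -4.5 x a robust noise estimate) on one channel,
# then bin spike counts for the downstream speech/cursor decoder.
import numpy as np

FS = 30000            # typical extracellular sampling rate (assumption)
BIN_S = 0.02          # 20 ms decoder bins (assumption)

def detect_spikes(trace: np.ndarray, k: float = 4.5) -> np.ndarray:
    """Sample indices where the trace first dips below -k * estimated noise RMS."""
    noise = np.median(np.abs(trace)) / 0.6745      # robust noise estimate
    below = trace < -k * noise
    return np.flatnonzero(below & ~np.roll(below, 1))

def bin_counts(spike_idx: np.ndarray, n_samples: int) -> np.ndarray:
    """Histogram spike times into fixed-width decoder bins."""
    edges = np.arange(0, n_samples + 1, int(BIN_S * FS))
    counts, _ = np.histogram(spike_idx, bins=edges)
    return counts

# Simulate 0.2 s of noise with three injected negative-going spikes.
rng = np.random.default_rng(5)
trace = rng.normal(scale=10.0, size=int(0.2 * FS))
for idx in (1500, 3100, 5200):
    trace[idx:idx + 10] -= 120.0                  # crude spike waveform
spikes = detect_spikes(trace)
print(len(spikes), bin_counts(spikes, trace.size))
```
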
  • MIT engineers create elder assist robot E-BAR to prevent falls at home

    robot, eldercare, assistive-technology, fall-prevention, mobility-support, MIT, E-BAR
  • Brain chip helps edit videos and post to YouTube using thoughts

    robot, IoT, Neuralink, brain-computer-interface, assistive-technology, AI, ALS
  • Robot Talk Episode 110 – Designing ethical robots, with Catherine Menon

    robot-ethics, assistive-technology, autonomous-systems, AI-safety, human-robot-interaction, ethical-design, public-trust-in-AI