U.S. researchers have developed a technology that decodes the brain waves (EEG) of paralyzed patients, allowing them to control robotic arms and computers with their thoughts. Artificial intelligence (AI) made this possible by decoding intent from only the subtle currents flowing over the scalp, without inserting electrodes into the brain or placing devices in blood vessels.
The research team at the University of California, Los Angeles (UCLA) reported that paralyzed patients successfully used a non-invasive AI brain-computer interface (BCI) system to move robotic arms and computer cursors. The study was published in the international journal Nature Machine Intelligence on the 1st (local time).
BCI is a technology that decodes the brain's electrical signals to control machines or computers. Neuralink, a startup founded by Elon Musk, and Synchron of Australia are representative BCI companies. Traditional BCI methods, however, involved implanting small chips beneath the skull or in blood vessels in the neck, making them risky and cumbersome. They were also limited by the short lifespan of the implanted devices, which necessitated repeat surgeries.
Paralyzed patients have previously operated computers or robots through eye movements, but this was slow and inaccurate, making it difficult to sustain. The UCLA research team developed a customized BCI algorithm that detects brain waves and decodes the user's intentions. Instead of implanting electrodes in the brain, they detected brain waves from the subtle currents flowing over the scalp. Combined with a camera-based AI platform, this allowed tasks to be completed much faster than before.
Paralyzed patients wore a cap that detected the brain waves flowing over their scalps and performed tasks such as moving a cursor to a target point on a computer screen or using a robotic arm to move blocks on a table to designated positions. Without AI assistance, they were unable to finish moving the blocks; with the AI applied, they completed the task in 6 minutes and 30 seconds.
Earlier research operated computers or robots based on the thoughts of paralyzed patients by implanting electrodes in their brains. A research team at Stanford University even decoded inner speech that patients were not expressing aloud, opening a channel for paralyzed patients who find it burdensome to speak in order to convey their thoughts. In that work, AI interpreted brain signals so that machines could understand them, but it could only classify brain wave patterns into discrete commands such as 'left hand movement' or 'right hand movement.'
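The idea of classifying brain wave patterns into a small set of discrete commands can be illustrated with a toy example. The sketch below is purely hypothetical (synthetic features and a nearest-centroid rule, not the published pipeline): it separates two simulated classes of EEG band-power features into 'left' and 'right' commands.

```python
# Hypothetical sketch: classifying EEG band-power features into discrete
# commands ("left hand movement" = 0, "right hand movement" = 1).
# The data is synthetic; this is NOT the UCLA or Stanford algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 8-channel band-power features. By assumption, the "right"
# class carries extra power on the first four channels.
left = rng.normal(loc=0.0, scale=0.5, size=(50, 8))
right = rng.normal(loc=0.0, scale=0.5, size=(50, 8))
right[:, :4] += 2.0

X = np.vstack([left, right])
y = np.array([0] * 50 + [1] * 50)

# Nearest-centroid classifier: one mean feature vector per command.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def classify(trial):
    """Return the command whose centroid is closest to the trial's features."""
    dists = np.linalg.norm(centroids - trial, axis=1)
    return int(np.argmin(dists))

preds = np.array([classify(t) for t in X])
accuracy = (preds == y).mean()
```

Real systems use far richer decoders, but the principle is the same: map a pattern of scalp signals to one command out of a small vocabulary.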
Brain waves can be detected from the currents flowing over the scalp, but those signals are weak, so the accuracy and speed of scalp-based detection were lower than with implanted electrodes. The UCLA team therefore paired the cap-style brain wave detector with an AI that plays a kind of 'co-pilot' role. The AI co-pilot infers from context what action the user is trying to perform and combines that inference with the decoded brain waves, improving both the speed and accuracy of tasks.
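One common way such a co-pilot can combine the two sources of information is shared autonomy: blending the noisy EEG-decoded movement command with an assistance vector pointing at the target the AI infers from the camera. The sketch below is an assumed, simplified form of this idea, not the published algorithm; all names and the blending weight `alpha` are illustrative.

```python
# Hypothetical "co-pilot" blending (shared autonomy), not the UCLA method:
# mix a noisy EEG-decoded cursor velocity with an AI assistance direction
# that points toward the target inferred from camera context.
import math

def copilot_velocity(decoded_vel, cursor_pos, inferred_target, alpha=0.5):
    """Blend user intent with AI assistance.

    decoded_vel     -- (vx, vy) decoded from scalp EEG (noisy)
    cursor_pos      -- current cursor position (x, y)
    inferred_target -- target position the AI infers from the camera
    alpha           -- assistance level: 0 = user only, 1 = AI only
    """
    # Unit vector from the cursor toward the inferred target.
    dx = inferred_target[0] - cursor_pos[0]
    dy = inferred_target[1] - cursor_pos[1]
    dist = math.hypot(dx, dy) or 1.0
    assist = (dx / dist, dy / dist)

    # Keep the user's commanded speed; only the direction is corrected.
    speed = math.hypot(*decoded_vel)
    return ((1 - alpha) * decoded_vel[0] + alpha * speed * assist[0],
            (1 - alpha) * decoded_vel[1] + alpha * speed * assist[1])
```

For example, if the user's decoded velocity points straight right while the inferred target lies up and to the right, the blended velocity bends toward the target while preserving the user's speed, which is how assistance can raise both speed and accuracy without taking control away entirely.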
Jonathan Kao, a professor in the Department of Electrical and Computer Engineering at UCLA who led the study, said, "If AI complements BCI, it will enable much safer and less invasive rehabilitation in daily life," adding, "The ultimate goal is to enable people with paralysis or amyotrophic lateral sclerosis (ALS) to perform basic tasks such as eating and picking up objects on their own."
Dr. Johannes Y. Lee, a participant in the study, stated, "In the future, we need to implement more precise and faster robotic arms and develop an AI pilot equipped with a 'delicate touch' that can adapt to various objects," and predicted, "Utilizing large-scale training data will also enhance the ability to interpret brain wave signals."
References
Nature Machine Intelligence (2025), DOI: https://doi.org/10.1038/s42256-025-01090-y
Cell (2025), DOI: https://doi.org/10.1016/j.cell.2025.06.015