A 69-year-old man with paralysis and a brain implant was able to fly a virtual drone through complex obstacle courses, simply by thinking about moving his fingers, thanks to an experimental device developed by researchers from Stanford University.
The participant, who has quadriplegia from a C4 spinal cord injury, navigated obstacle courses and random flight patterns using neural signals from two tiny electrode arrays implanted in his brain. His ability to combine multiple movements simultaneously represents a significant advance in brain-computer interface technology.
For the experiment, researchers developed a system that can decode four distinct control dimensions from brain signals. This level of control matches what able-bodied gamers achieve with physical controllers, the researchers said.
“Just as able-bodied users of digital systems use their fingers to manipulate keyboards and game controllers, this system allows an intuitive framework for a brain-controlled digital interface, providing opportunities for recreation and socialization as well as eliciting feelings of enablement,” the researchers wrote.
“He expressed on multiple occasions (even before enrollment in the clinical trial) that one of his most important personal priorities was to use a BCI to control a quadcopter,” they added in the paper. “He felt controlling a quadcopter would enable him, for the first time since his injury, to figuratively ‘rise up’ from his bed/chair.”
This motivation drove impressive results: across multiple sessions, the unidentified participant completed 12 laps of an obstacle course, averaging 222 seconds per lap, and navigated through 28 randomly placed rings in just 10 minutes.
How the Technology Works
When you think about moving your fingers, neurons (brain cells) in the motor cortex (the brain’s movement control center) fire electrical signals. Even if the body is paralyzed, these signals still exist. Mind-reading studies have long tried to decode them in order to drive external devices that carry out what the brain intends.
The system that helped this man fly a virtual drone relies on two 96-channel silicon microelectrode arrays placed in the “hand knob” area of the participant’s motor cortex. The electrodes pick up spike-band power, a measure of how active nearby neurons are. When the participant imagines flexing his thumb, for example, specific neurons fire rapidly, and the electrodes detect those activity patterns.
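The article doesn’t give the exact signal-processing parameters, but spike-band power is typically computed by band-pass filtering the raw voltage around the frequencies of action potentials, squaring it, and averaging over short time bins. Here is a minimal Python sketch of that idea; the sampling rate, band edges, and bin width are assumptions, not values from the paper:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def spike_band_power(raw, fs=30_000, band=(250, 5_000), bin_ms=20):
    """Band-pass raw voltage into the spike band, square it,
    and average within short time bins.

    raw: (n_samples, n_channels) array of microelectrode voltages.
    Returns an (n_bins, n_channels) array of power values.
    """
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, raw, axis=0)   # zero-phase band-pass filter
    power = filtered ** 2                      # instantaneous power
    bin_len = int(fs * bin_ms / 1000)          # samples per bin
    n_bins = power.shape[0] // bin_len
    return power[: n_bins * bin_len].reshape(n_bins, bin_len, -1).mean(axis=1)
```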
A computer then uses a machine learning algorithm (like a smart translator) to convert these signals into finger movements in real time. The algorithm was trained by having the participant watch a virtual hand move and try to mimic it mentally. Over time, the system learned to associate specific electrical patterns with specific finger movements.
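The article doesn’t specify the decoding algorithm, and production BCI decoders are usually more sophisticated (Kalman filters or neural networks are common). The sketch below uses plain ridge regression purely to show the shape of the problem: binned neural features in, finger velocities out, trained on calibration data where the cued virtual-hand movement supplies the labels. All array sizes and variable names are illustrative placeholders:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Calibration: while the participant mentally mimics a virtual hand,
# each time bin pairs neural features with the cued finger velocities.
rng = np.random.default_rng(0)
X_train = rng.standard_normal((5_000, 192))  # 5,000 bins x 192 electrodes (placeholder)
y_train = rng.standard_normal((5_000, 4))    # 4 finger-group velocities (placeholder)

decoder = Ridge(alpha=1.0).fit(X_train, y_train)

# Run time: each new bin of spike-band power becomes a 4-D velocity estimate.
new_bin = rng.standard_normal((1, 192))
finger_velocities = decoder.predict(new_bin)[0]
```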
The control scheme then maps different imagined finger movements to specific drone actions (a hypothetical code sketch follows the list):
Thumb movements control forward/backward and left/right motion
Index and middle finger movements control altitude
Ring and little finger movements handle rotation
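Put together, those are the four control dimensions: thumb position in two axes, plus one flex/extend value for each of the two finger groups. The study’s exact mapping, gains, and sign conventions aren’t given in the article, so the sketch below is only a hypothetical illustration of how the decoded finger state could feed a quadcopter’s velocity axes:

```python
from dataclasses import dataclass

@dataclass
class DroneCommand:
    forward: float   # + forward / - backward
    lateral: float   # + right / - left
    vertical: float  # + up / - down
    yaw: float       # + clockwise / - counterclockwise

def fingers_to_command(thumb_x, thumb_y, index_middle, ring_little, gain=1.0):
    """Map the four decoded finger dimensions, each a deflection from a
    neutral midline (roughly -1..1), onto the drone's velocity axes.
    The gain and sign conventions are assumptions, not from the paper."""
    return DroneCommand(
        forward=gain * thumb_y,
        lateral=gain * thumb_x,
        vertical=gain * index_middle,
        yaw=gain * ring_little,
    )
```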
“Flying it is tiny little finesses off a middle line, a little bit up, a little bit down,” the participant explained. The system allows for smooth, simultaneous control across all dimensions, enabling complex maneuvers like combining forward movement with turns.
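How the system keeps control feeling smooth isn’t detailed in the article. One common trick, shown here purely as an assumption, is to exponentially smooth each decoded axis so moment-to-moment neural noise doesn’t jerk the drone:

```python
def smooth(prev, new, alpha=0.2):
    """Blend the previous command into the new one, per axis.
    alpha is a guessed smoothing factor, not a value from the study."""
    return [(1 - alpha) * p + alpha * n for p, n in zip(prev, new)]
```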
Mind-reading technology is not exactly new, but AI has given the field a massive boost. Recent advances across multiple labs and companies show how quickly it is evolving.
For instance, Neuralink, Elon Musk’s brain-computer interface company, has made headlines with its first two human patients. Its second participant, known as “Alex,” broke records for brain-computer interface cursor control and managed to play the video game Counter-Strike 2 and use 3D design software just one month after receiving the implant.
Elon Musk expects brain-computer interface devices to reach massive scale. “If all goes well, there will be hundreds of people with Neuralinks within a few years, maybe tens of thousands within five years, millions within ten years,” Musk tweeted shortly after sharing the results of Alex’s performance.
However, some experts believe Neuralink’s approach is too invasive. That concern led one of its researchers to leave the company and co-found another brain-computer interface startup, Precision Neuroscience, which is working on a device that records activity by “wrapping” the brain rather than sticking needles into it.
Synchron, a New York-based company, has developed a less invasive brain implant called the Stentrode that avoids traditional brain surgery by being inserted through blood vessels. Their patient, a 64-year-old identified as “Mark,” successfully controlled Amazon Alexa devices and interacted with an Apple Vision Pro headset using just his thoughts. The device is implanted via the jugular vein and positioned near the motor cortex.
There are many other examples, from practical to more experimental.
Unbabel has been able to convert thoughts directly into text, UC San Francisco researchers have developed a thoughts-to-speech system, and even Meta has been working on non-invasive brain-machine interfaces for augmented reality applications, developing a system that converts thoughts into images almost in real time.
In 2023, UC Berkeley researchers were able to reconstruct music directly from brain activity. Their system successfully recreated Pink Floyd’s “Another Brick in the Wall, Part 1” by analyzing neural signals from epilepsy patients. The breakthrough suggests potential applications for helping speech-impaired patients communicate through thought.
Source: https://decrypt.co/302292/paralyzed-man-controls-virtual-drone-with-mind