
Mind-Controlled Robotics: UCSF’s Brain-Computer Interface Success

by ccadm


A team of researchers at the University of California, San Francisco (UCSF), has developed a unique brain-computer interface (BCI) that brings the world one step closer to mind-controlled robots. Here’s how the new system could change how you interact with your devices over the long term and help those suffering from limb loss regain a better quality of life.

Brain-Computer Interface (BCI)

The use of BCIs continues to expand in the market. These devices enable humans to control machines using only their thoughts. They rely on sensors, such as electrode arrays, that monitor changes in brain activity. These systems use the brain’s distinct somatotopic representations of simple actions, like tapping a finger, to determine the user’s intended motions.
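To make that decoding step concrete, here is a minimal Python sketch of the general idea: distinct activity patterns for different imagined movements are used to train a classifier that predicts the intended action. The channel count, action labels, and simulated features are illustrative assumptions, not the study’s actual pipeline.

```python
# Minimal sketch of decoding imagined movements from neural features.
# Channel count, action labels, and feature statistics are illustrative assumptions;
# real systems extract features (e.g., high-gamma power) from electrode recordings.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
actions = ["thumb_tap", "lip_purse", "ankle_flex"]   # hypothetical imagined movements
n_channels = 64                                      # hypothetical electrode count

# Simulate trials: each action produces its own mean activity pattern (somatotopy).
means = rng.normal(0, 1, size=(len(actions), n_channels))
X = np.vstack([m + rng.normal(0, 0.5, size=(40, n_channels)) for m in means])
y = np.repeat(actions, 40)

decoder = LinearDiscriminantAnalysis().fit(X, y)
new_trial = means[1] + rng.normal(0, 0.5, n_channels)
print("decoded action for a new trial:", decoder.predict(new_trial.reshape(1, -1))[0])
```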

Problems with Today’s Brain-Computer Interface

BCIs offer exciting opportunities to the market, but the technology is still in a fledgling state. Significant drawbacks, such as the cost of programming these devices and the need to constantly recalibrate them, continue to limit adoption. Thankfully, a new study delves into why BCIs need frequent recalibration and introduces a novel system that provides long-term BCI support.

Brain-Computer Interface Study

The study, titled “Sampling representational plasticity of simple imagined movements across days enables long-term neuroprosthetic control”1 and published in the scientific journal Cell, details how to enable long-term, complex neuroprosthetic control.

Mind-Controlled Robotics

The goal of the study was to monitor, catalog, and discover shifts in brain activity for day-to-day tasks and simple movements. To accomplish this, the researchers tracked changes in the representational structure of brain activity across days during BCI control.

Electrocorticography Brain-Computer Interface

The electrocorticography (ECoG) BCI allowed the engineers to record neural activity from a unihemispheric ECoG grid while the participant imagined moving different body parts. This recording was required to determine the representational structure in the brain. Specifically, the team used pairwise separation, the distance between the neural patterns of any two actions, as a tracking metric.
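The sketch below illustrates what a pairwise-separation metric can look like in practice: compute a centroid (average activity pattern) for each imagined action and measure the distance between every pair of centroids. The action set, channel count, and simulated data are assumptions for illustration; the study computed its metrics from real ECoG recordings.

```python
# Illustrative "pairwise separation" metric: distance between the neural centroids
# of every pair of imagined actions. Data are simulated; the study derived its
# features from a unihemispheric ECoG grid.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
actions = ["hand", "lips", "foot"]                   # hypothetical action set
n_channels = 64
base = rng.normal(0, 1, size=(len(actions), n_channels))

# 30 trials per action -> one centroid (mean pattern) per action
centroids = {a: (base[i] + rng.normal(0, 0.4, size=(30, n_channels))).mean(axis=0)
             for i, a in enumerate(actions)}

# Pairwise separation: distance between every pair of centroids
for a, b in combinations(actions, 2):
    d = np.linalg.norm(centroids[a] - centroids[b])
    print(f"separation({a}, {b}) = {d:.2f}")
```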

The BCI integrates artificial intelligence (AI) models to adjust for gradual shifts in neural activity patterns over time. These shifts, known as representational drift, occur as the brain adapts to repeated motor tasks. The AI refines its interpretation of brain signals, allowing the participant to maintain control of the robotic arm for months. Unlike intracortical BCIs, which implant tiny electrodes directly into brain tissue to obtain high-resolution recordings, the ECoG approach used here places a sensor grid on the brain’s surface, a less invasive option that proved stable enough for long-term use.
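One common way to compensate for representational drift, shown below as an illustrative sketch rather than the paper’s exact method, is to learn a linear transform that maps a later session’s action centroids back onto a reference session. The repertoire size, feature dimension, and simulated drift are assumptions.

```python
# Illustrative drift compensation (not necessarily the paper's exact method):
# learn a linear map that re-aligns a later session's action centroids with a
# reference session. Sizes and the simulated drift are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_actions, n_dims = 30, 10                   # hypothetical repertoire and feature size
ref = rng.normal(0, 1, size=(n_actions, n_dims))   # reference-day centroids

# Simulate drift: a small rotation plus noise moves the same actions on a later day
rotation, _ = np.linalg.qr(np.eye(n_dims) + 0.1 * rng.normal(size=(n_dims, n_dims)))
drifted = ref @ rotation + 0.05 * rng.normal(size=ref.shape)

# Least-squares alignment: find W minimizing ||drifted @ W - ref||
W, *_ = np.linalg.lstsq(drifted, ref, rcond=None)
realigned = drifted @ W

print("mean centroid error before alignment:", np.linalg.norm(drifted - ref, axis=1).mean())
print("mean centroid error after alignment: ", np.linalg.norm(realigned - ref, axis=1).mean())
```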

Plasticity

Plasticity refers to the brain’s ability to adapt to changes in your environment, health, or experiences. Specifically, mechanisms such as synaptic plasticity, homeostatic plasticity, and adult neurogenesis drive adaptive structural and functional changes in the brain on a daily basis.

These tiny changes may not be noticeable to humans, but BCIs need to overcome them to remain stable. To make the changes trackable, the researchers plotted each session’s average Mahalanobis distance in chronological order.
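As a rough illustration of that tracking idea, the sketch below computes the Mahalanobis distance between a later session’s average activity for one action and a reference session’s distribution, which quantifies how far a representation has moved relative to its normal variability. The feature dimension and simulated drift are assumptions.

```python
# Illustrative tracking of session-to-session change with the Mahalanobis distance:
# how far a session's average activity for one action sits from a reference
# distribution, scaled by that distribution's covariance. Data are simulated.
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(3)
n_dims = 10                                          # hypothetical feature dimension

# Reference distribution for one imagined action (e.g., day 1)
ref_trials = rng.normal(0, 1, size=(200, n_dims))
mu = ref_trials.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ref_trials, rowvar=False))

# Later sessions drift a little further from the reference each day
for day in range(1, 6):
    session_mean = mu + 0.1 * day * rng.normal(size=n_dims)
    print(f"day {day}: Mahalanobis distance = {mahalanobis(session_mean, mu, cov_inv):.2f}")
```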

Representational Drift

Representational drift is another phenomenon the engineers had to account for when building their BCI system. Drift refers to gradual changes in the neural activity underlying a behavior over time, and it affects most long-term motor memories.

Understanding that neural representations of familiar movements continually evolve, the team constructed a common manifold using data pooled across days. Within this shared space, they monitored day-to-day changes, paying particular attention to neural centroids that had shifted away from their original positions.
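The sketch below shows one simple way to build such a shared space: fit a low-dimensional model (plain PCA here, as a stand-in for the study’s manifold approach) on trials pooled across days, then project each day into it to see how an action’s centroid moves. The channel count, trial counts, and drift are simulated assumptions.

```python
# Illustrative "common manifold" across days: fit a low-dimensional model (plain
# PCA here) on trials pooled from every session, then project each day into the
# shared space to see where an action's centroid lands. Data are simulated.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_channels, n_trials = 64, 50

# Three days of trials for one imagined action, with a gradual drift each day
days = [rng.normal(0, 1, size=(n_trials, n_channels)) + 0.2 * d for d in range(3)]

# Common manifold: PCA fit on all days concatenated
manifold = PCA(n_components=5).fit(np.vstack(days))

# Project each day and track its centroid within the shared space
for d, trials in enumerate(days):
    centroid = manifold.transform(trials).mean(axis=0)
    print(f"day {d}: centroid in shared manifold = {np.round(centroid, 2)}")
```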

Neural Representational Variance

The engineers were able to account for the neural representational variance of each action. In doing so, the team uncovered a meta-representational structure with generalizable decision boundaries for the action repertoire, one that could be tracked as it shifted within the neural network.

Notably, the team had previously studied neural variance in animals. It was during those studies that they first noticed that day-to-day actions could be decoded with high accuracy from BCI sensors. They also noted that the neural centroids associated with each action shifted across the network as time progressed.

Long-term Brain-Computer Interface

This discovery led the engineers to track the migration of representations across the brain’s network to achieve long-term BCI control. Specifically, the researchers tracked and adjusted for across-day plasticity and drift via a purpose-built AI model.

Brain-Computer Interface Test

As part of the testing phase, the engineers collected data across 49 trials of 30 actions and 48 trials of 32 actions. Notably, the test focused on a single body part, the hand. The first step was to select a subset of actions and measure their representational structure across different contexts, with the eventual goal of controlling a virtual Jaco robotic arm.

Brain-Computer Interface Test Participants

The engineers selected a participant with severe tetraparesis and anarthria caused by a bilateral brainstem stroke. The stroke was severe enough to take away his ability to speak or move, yet he suffered no cognitive damage, making him an ideal candidate for the study.

After connecting the patient to the updated BCI, the team gave him several tasks that ranged in difficulty from visualizing the movement of different body parts, such as his fingertip, head, or leg, to imagining the micro-movements of his index finger.

The team utilized the ECoG-based BCI to register the brain representations for each action. The enhanced BCI offered engineers increased resolution and the ability to conduct precise, feedback-driven manipulation of representations. Notably, the patient made no observable body movements, yet his neural activity resembled the patterns that would accompany the real movements if he were not paralyzed.

Control Robot Arm

The next step was to integrate the Kinova Jaco robot arm for testing. In the first testing phase, the patient was asked to manipulate the device using thought alone, attempting to lift an item and move it to a new location. This early-stage testing showed poor controllability and unreliable performance.

3D Virtual Environment

Recognizing that the user needed richer feedback, the team created a virtual robot arm. The virtual environment lets users refine their control and receive valuable feedback, enabling them to track their progress and capabilities. The engineers believe this rapid in-session learning will be crucial to future prosthetic training systems.

Brain-Computer Interface Adjustment Time Test

One of the biggest breakthroughs of this study is that the engineers were able to use the same prosthetic arm and patient with only a 15-minute recalibration after waiting months between sessions. The team used a deep recurrent neural network (RNN) to adjust for plasticity and drift.
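As a loose illustration of that recalibration step, the sketch below defines a small recurrent decoder and fine-tunes it briefly on a short block of cued trials, standing in for the roughly 15 minutes of recalibration data. The architecture, layer sizes, and training loop are assumptions; the paper’s actual network and procedure differ in detail.

```python
# Illustrative recurrent decoder plus a brief recalibration pass, loosely following
# the article's description of a deep RNN updated with ~15 minutes of new data.
# Architecture, sizes, and the simulated recalibration block are assumptions.
import torch
import torch.nn as nn

class RNNDecoder(nn.Module):
    def __init__(self, n_channels=64, hidden=128, n_actions=30):
        super().__init__()
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_actions)

    def forward(self, x):                   # x: (batch, time, channels)
        out, _ = self.rnn(x)
        return self.readout(out[:, -1])     # classify from the final time step

decoder = RNNDecoder()                       # in practice, loaded from a prior checkpoint

# Short block of cued trials standing in for the ~15-minute recalibration session
recal_x = torch.randn(200, 40, 64)           # (trials, time steps, channels)
recal_y = torch.randint(0, 30, (200,))       # cued imagined actions

opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(5):                        # brief update, not full retraining
    opt.zero_grad()
    loss = loss_fn(decoder(recal_x), recal_y)
    loss.backward()
    opt.step()
    print(f"recalibration epoch {epoch}: loss = {loss.item():.3f}")
```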

After a gap of several months, the patient returned to testing and was given specific tasks. Two complex reach-to-grasp and object-manipulation tests with varying levels of difficulty were set up to verify that the system was still functioning correctly.

The first task required the patient to reach and rotate the arm in order to grasp an object and move it to another location. Impressively, the team achieved a median success rate of 90% while finishing the task in only 60.8 seconds. The following tasks increased in difficulty, with the final one requiring the patient to open a cabinet, take out a cup, and hold it under a water dispenser until it was filled.

Brain-Computer Interface Test Results

The test results showed that the upgraded BCI could track neural variance and provide increased neural precision. The study showed that the brain’s movement signals remain broadly stable over time, but their representations shift slightly from day to day.

The AI adjusts automatically to track these changes, allowing for easily configured systems that operate similarly to plug-and-play devices on your PC. The team also discovered some interesting data along the way.

They noted that each limb produces similar activity patterns across people. For example, the researchers can look at brain patterns and distinguish right-hand from left-hand movements. The team also concluded that rapidly reducing variance is vital for perceptual decision-making.

Additionally, the study demonstrates that neural statistics like variance can be tracked and regulated to increase representational distances during BCI control without somatotopic changes.

Brain-Computer Interface Benefits

There’s a long list of benefits obtained by blending human and AI learning. These systems could one day help people living with paralysis or limb loss regain control over their lives and carry out daily activities with far less difficulty.

Stability

The study demonstrated how an adaptive BCI can provide stability to these control devices. The team’s decision to use a low-dimensional manifold and relative representational distances for a repertoire of simple imagined movements proved to be the right one.

New Record

Until this recent test, the longest a BCI had worked without recalibration was around two to three days. The need for constant recalibration relegated these devices to laboratory testing. Now, the upgraded BCI can run for up to seven months without major updates, opening the door to more responsive and affordable prosthetics and more.

More Efficient

The enhanced BCI takes only around 15 minutes to recalibrate after roughly six months of use. This is a major improvement over previous systems, which required recalibration every three days because performance on high-precision tasks degraded over extended periods.

Brain-Computer Interface Researchers

The UC San Francisco researchers were led by Karunesh Ganguly, MD, PhD, professor of neurology and a member of the UCSF Weill Institute for Neurosciences. The paper was co-authored by neurology researcher Nikhilesh Natraj, PhD, along with Sarah Seko, Adelyn Tu-Chan, and Reza Abiri of the University of Rhode Island. The project was funded by the National Institutes of Health and the UCSF Weill Institute for Neurosciences.

Future of Brain-Computer Interface

According to the team, the goal now is to make the robotic arm smoother and more responsive. They also want to expand the set of commands the BCI can map in order to increase the device’s versatility and capabilities, and in the future they hope to encompass other body parts.

Real-World Applications & Timeline for Brain-Computer Interfaces

This advancement could be transformative across multiple industries. The ability to control and interact with devices using thought alone would be a major upgrade over current input methods. It could also open the door to a new era in healthcare, electronics, and learning.

While current implementations are in experimental stages, widespread clinical applications could become feasible within the next 5 to 10 years, depending on further research outcomes and regulatory approvals. As such, there is a lot of excitement surrounding the future possibilities of this tech.

Medical

One area where this technology is expected to find an immediate use case is the prosthetics sector. The use of BCI technology has long been seen by many as the pinnacle of prosthetics control systems. This recent discovery holds significant promise for restoring autonomy to individuals with paralysis by allowing them to interact with their environment through thought-controlled devices.

Innovative Companies Leading Brain-Computer Interface Development

The race to create brain-controlled computers and devices has led several firms to invest millions in R&D. These companies seek to usher in a new age of health and science using devices that surpass today’s keyboards and traditional input methods. Here’s one company pioneering these efforts and making a name for itself in the market.

Synchron was founded in 2012 as a cutting-edge neurotechnology firm, originally under the name SmartStent. In 2016, the firm rebranded as Synchron, reflecting its focus on developing minimally invasive BCIs to help patients suffering from mobility loss.

Today, Synchron offers a variety of products, including an endovascular neural interface called the Stentrode. The device is delivered through the blood vessels and positioned in a vessel adjacent to the motor cortex, where it records the brain signals used to restore motor control. The product showcases Synchron’s continued innovation in the sector.

Additionally, the firm has secured grants from the U.S. Defense Advanced Research Projects Agency (DARPA), the United States Department of Defense (DoD), and the National Health and Medical Research Council of Australia.

Those seeking exposure to the BCI market should conduct further research on Synchron. Its market positioning and pioneering efforts continue to lay the groundwork for future AI-powered computer interfaces and more.

Brain-Computer Interfaces Will Change Everything

Today’s BCI strides could make your sci-fi dreams seem outdated. Computers of the future will be able to communicate directly with you via thoughts, opening the door to a new age of human evolution. For now, these engineers deserve a standing ovation for their efforts.

Learn about other robotics breakthroughs today.


Studies Referenced:

1. Ganguly, K., Natraj, N., Seko, S., Tu-Chan, A., & Abiri, R. (2024). Sampling representational plasticity of simple imagined movements across days enables long-term neuroprosthetic control. Cell. https://doi.org/10.1016/j.cell.2024.02.029


