
The Future of Touch: Enhancing Artificial Senses for BCI Users

by ccadm


Rapid technological advancements are increasingly blurring the line between humans and machines. At the forefront of this progress is the brain-computer interface (BCI), which creates a direct communication link between the brain’s electrical activity and an external device.

By facilitating communication between the brain and a computer, a BCI can help restore capabilities to people with physical disabilities. It converts brain activity into signals that can replace or improve body functions typically controlled by the brain, such as muscle movement. In this way, BCI shows huge potential to improve people’s quality of life.

In development for about half a century now, BCI has seen massive progress, with researchers demonstrating the technology’s ability to restore capabilities to people with disabilities such as paralysis, motor impairments, and speech difficulties, and even to help manage schizophrenia symptoms.

Studies have also shown that BCI can help disabled people regain a lost sensation of touch. Such tactile sensations, however, remain imperfect and feel much the same across objects with different textures or temperatures. Now, scientists are looking to create an intuitive sense of touch.

Touch is an integral part of our lives, helping us not only connect with others but also pick up objects and walk. According to Charles Greenspon, a neuroscientist at the University of Chicago:

“Most people don’t realize how often they rely on touch instead of vision. If you can’t feel, you have to constantly watch your hand while doing anything, and you still risk spilling, crushing or dropping objects.”

To restore sensation in prosthetic limbs, researchers use tiny electrode arrays placed in the brain areas responsible for that specific function. 

This allows participants to move a robotic arm simply by thinking about movement, while sensors on the arm trigger pulses of electrical activity in the brain regions dedicated to touch. While scientists could evoke feelings of touch this way, the sensations were weak and difficult to localize to where the contact actually occurred.

Now, new research has brain-computer interface users designing unique tactile experiences for different objects shown on a screen and then, using sensation alone, identifying those objects with some accuracy.

The Challenge with Integrating Sensory Feedback in Prosthetics

Scientists from the University of Pittsburgh School of Medicine have achieved a breakthrough that takes them a step closer to a BCI that restores the lost sense of touch in people with tetraplegia [1], also known as quadriplegia.

Tetraplegia is when someone loses movement in both their arms and legs, and often the torso too, usually because of an injury to the cervical spinal cord, strokes, or other neurological damage. 

The damage disrupts the signals that allow the brain to receive and process sensory information like touch, resulting in the patient’s loss of feeling in the affected limbs.

While prosthetics provide an artificial limb to replace the lost function, a full replacement also requires a sense of touch, much like a real limb. Traditional artificial limbs focused mainly on restoring movement, but technological advancements have made it possible to add a sense of touch using sensors and electrical stimulation.

To completely restore the lost function of limbs, the device has to be seamlessly integrated with a person’s existing sensorimotor system, which connects human perception and action.

To achieve this, tactile feedback is of key importance. This form of sensory interaction with devices is all about physical touch experiences.

A promising way to provide this kind of feedback is through intracortical microstimulation (ICMS) of the somatosensory cortex, which can evoke localized sensations on a person’s paralyzed limb. By delivering tactile information directly to the brain, ICMS makes for an attractive option for people with high-level amputation or spinal cord injury.

Achieving this, however, is very difficult because of our limited understanding of how the brain processes touch. Hardware restrictions also limit the ability to replicate neural responses naturally. In addition, the stimulation parameter space is complex, and participants’ reports of how real an artificial sensation feels are prone to bias and difficult to interpret.

While the position and strength of the touch evoked through microstimulation of the somatosensory cortex can be conveyed reliably, developing more intricate, natural sensations remains a challenge: techniques for efficiently scanning a wide stimulus space are lacking, and perceptual quality is hard to analyze.
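To get a feel for why scanning this stimulus space is hard, here is a minimal sketch: even a coarse grid over a handful of standard ICMS stimulation parameters multiplies into dozens of combinations per electrode. The parameter names and values below are illustrative assumptions, not figures from the study.

```python
# Illustrative only: a coarse grid over common ICMS stimulation parameters.
# The specific values are assumptions, not taken from the study.
from itertools import product

amplitudes = [20, 40, 60, 80]       # pulse amplitude, microamps
frequencies = [50, 100, 200, 300]   # pulse train frequency, Hz
pulse_widths = [200, 400]           # pulse width, microseconds
durations = [0.5, 1.0]              # train duration, seconds

grid = list(product(amplitudes, frequencies, pulse_widths, durations))
print(f"{len(grid)} parameter combinations per electrode")
```

And that is before considering which electrodes to stimulate, in what order, and how a human participant rates each resulting sensation.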

Most studies exploring the psychophysics of ICMS have manipulated one stimulation parameter at a time, and most have been conducted in non-human primates, which cannot verbalize the sensations they experience.

There is simply a need for more efficient methods to explore the quality of sensations evoked through ICMS.

So, in the latest study, a collaboration between Pitt and the University of Chicago, scientists have presented an interface that addresses the challenges of creating complex naturalistic sensations. 

The study noted that, with people ultimately expected to use closed-loop BCIs in daily life, it is important to investigate the functional use and experience of ICMS-evoked sensations. 

However, the sample of participants available in such research is limited. Only about seven people with bidirectional intracortical implants in their somatosensory and motor cortex are available, and the study included three of them.

Empowering BCI Users to Define Their Sensory Experience

The interface created by the scientists has been used by three men with tetraplegia to design their own sensations for different virtual objects.

Artificial tactile sensations were designed to represent interactions with a cat, an apple, a key, a towel, and a piece of toast. These objects were chosen for their range of tactile dimensions, including familiarity, pleasantness, temperature, micro and macro texture, moisture, friction, and compliance.

Participants used the residual function in their left hand to interact with the tablet interface that generated “touch” sensations on their palm surface.

While exploring the objects through their artificial touch, participants described the cool roundness of an apple, the smooth, rigid surface of a door key, and the warm fur of a cat. This is completely different from earlier experiments, where artificial touch often felt like tingling or buzzing, which didn’t even vary from one object to another.

What set this experiment apart from previous research was that participants were in control of their own stimulation and could actively explore an object presented visually. 

In passive stimulation, where there is no vision and exploration, people usually report skin-level sensations such as pressure or vibration. This is because participants’ attention is focused on their body. 

In contrast, participants in active exploration are likely focused on the external world. Hence, they interpret the same percept as object-oriented sensations like roughness. This could be why the latest study participants spontaneously reported more object-oriented sensation descriptors.

This ability of participants to control stimulation presentation by “touching” an object displayed on the tablet helped create a more realistic experimental context. Here, the experienced sensations weren’t the result of experimenter-driven stimulation without a meaningful context, but rather targeted explorative movements.

Rather than making those decisions themselves, the scientists gave BCI users control over the details of the electrical stimulation that creates tactile sensations, allowing them to recreate a sense of touch that felt intuitive to them. According to lead author Ceci Verbaarschot, a former postdoctoral fellow at Pitt Rehab Neural Engineering Labs and currently an assistant professor of neurological surgery and biomedical engineering at the University of Texas Southwestern:

“Touch is an important part of non-verbal social communication; it is a sensation that is personal and that carries a lot of meaning.” 

She added:

“Designing their own sensations allows BCI users to make interactions with objects feel more realistic and meaningful, which gets us closer to creating a neuroprosthetic that feels pleasant and intuitive to use.”

Reconstructing Sensory Reality from Neural Input Alone


During the study, while searching for the perfect touch, scientists first asked BCI participants to find a combination of stimulation parameters that felt like touching toast, a towel, a key, an apple, or a cat.

All the while, users explored the digitally presented object. Their descriptions were vivid but subjective: one participant described the cat as silky and smooth, while another described it as tappy and warm. Then the images were taken away, and participants had to recognize the objects using stimulation alone.

To simulate the “touching” experience for BCI users, the study stimulated three electrodes sequentially. Each electrode, driven by contact with a different region of the object, evoked a sensation in an adjacent area of the hand.
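A minimal sketch of that mapping idea (purely illustrative; the region names, electrode IDs, and hand locations below are invented, not from the study): as contact moves across regions of the virtual object, the corresponding electrodes fire in sequence, each evoking a sensation at a neighboring spot on the hand.

```python
# Illustrative sketch, not the study's actual code: contact with different
# regions of a virtual object drives different electrodes, each evoking a
# sensation at an adjacent hand location. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Electrode:
    id: int
    hand_location: str   # where the evoked sensation is felt

# Hypothetical mapping: object region -> implanted electrode
region_to_electrode = {
    "top":    Electrode(1, "index fingertip"),
    "middle": Electrode(2, "middle fingertip"),
    "bottom": Electrode(3, "palm"),
}

def stimulate(contact_regions):
    """Fire electrodes sequentially as contact crosses object regions."""
    return [region_to_electrode[r].hand_location for r in contact_regions]

# Dragging across the object from top to bottom:
print(stimulate(["top", "middle", "bottom"]))
```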

Without any visual cues, participants accurately identified one of the five objects 35% of the time, well above the 20% chance level, though with plenty of room for improvement. As expected, confusion increased between objects that shared more tactile characteristics.
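To see why 35% on a five-way task counts as significant, a quick one-sided binomial tail probability suffices. The trial count below is a hypothetical assumption (the article does not give the paper's exact number), so treat this as a sketch of the reasoning, not the study's statistics.

```python
# Sketch: is 35% accuracy meaningfully above the 20% chance level of a
# five-object identification task? One-sided binomial tail probability.
# The trial count is an assumed, illustrative number.
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_trials = 100            # hypothetical number of identification trials
n_correct = 35            # 35% accuracy
p_chance = 1 / 5          # five objects -> 20% by guessing

p_value = binom_tail(n_correct, n_trials, p_chance)
print(f"P(>= {n_correct} correct by chance) = {p_value:.5f}")
```

Under these assumptions the probability of reaching 35 correct by guessing alone is well below 1%, which is what "significantly above chance" means here.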

“We designed this study to shoot for the moon and made it into orbit. Participants had a really hard task of distinguishing between objects by tactile sensation alone, and they were quite successful at it.”

– Senior study author Robert Gaunt, Ph.D., who’s an associate professor of physical medicine and rehabilitation at Pitt

Gaunt pointed out that even the mistakes were “predictable,” since it’s harder to tell apart a towel and a cat, both being soft, whereas confusion was less likely between a cat and a key.
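The "predictable mistakes" point can be made concrete by treating each object as a rough vector of tactile dimensions and comparing distances. The dimension values below are invented for illustration; the idea is simply that a cat and a towel sit close together in tactile space while a cat and a key do not.

```python
# Illustrative sketch: objects as vectors of tactile dimensions
# [softness, warmth, roughness]; values are invented, not from the study.
import math

profiles = {
    "cat":   (0.9, 0.8, 0.3),
    "towel": (0.8, 0.5, 0.5),
    "key":   (0.0, 0.2, 0.1),
}

def dist(a, b):
    """Euclidean distance between two objects' tactile profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(profiles[a], profiles[b])))

print(f"cat-towel: {dist('cat', 'towel'):.2f}")   # small distance -> confusable
print(f"cat-key:   {dist('cat', 'key'):.2f}")     # large distance -> distinct
```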

Overall, the study found it promising that participants were able to create distinct object sensations even in the challenging environment of a large parameter space.

It concluded that microstimulation in the somatosensory cortex can evoke intuitive percepts with various tangible properties. More complex stimuli “unlock a greater perceptual space that may allow people to distinguish artificially perceived objects with increased precision and intuition,” it said.

This is a major step towards creating an artificial limb that seamlessly blends with an individual’s unique sensory world.

A Growing Focus on Refining Artificial Touch in BCIs

In the world of BCI, there is currently a growing focus on making touch more intuitive. Just this year alone, several researchers have made progress on robotic prosthetic arms and BCIs to restore motor control and a sense of touch.

Neuroscientist Greenspon and his team addressed the current limitations of creating natural touch sensations in artificial limbs by focusing on ensuring that electrically stimulated touch sensations are precisely localized, stable, and strong enough for daily use.

For this, they delivered short pulses to individual electrodes in the touch centers. Then, participants reported the location and strength of each sensation they felt, helping create a comprehensive map of brain regions for specific body parts. 

The testing showed that stimulating two closely spaced electrodes together had participants feeling a stronger, clearer touch. 

A complementary paper worked on making artificial touch more intuitive and immersive by placing clusters of electrodes with overlapping sensory locations. This generated feelings participants described as a tender, gliding touch, even though the stimulus was delivered in small, discrete steps.

Activating the electrodes sequentially also improved participants’ ability to differentiate intricate tangible shapes and respond to changes in the touched objects, helping move bionic feedback closer to the complex and precise abilities of natural touch.

“We conveyed tactile sensations related to orientation, curvature, motion, and 3D shapes for a participant using a brain-controlled bionic limb. We are in another level of artificial touch now. We think this richness is crucial for achieving the level of dexterity, manipulation, and a highly dimensional tactile experience typical of the human hand.”

– Lead study author Giacomo Valle, Assistant Professor at Chalmers University of Technology.

Meanwhile, scientists at the Max Planck Institute for Intelligent Systems invented wearable devices that can deliver expressive tactile sensations such as a calming touch, pressing on the skin, and vibrations at wide frequencies, going beyond current devices’ capabilities. 

To expand haptic sensations, they developed electrically driven cutaneous electrohydraulic (CUTE) wearables that can be customized to deliver multiple kinds of touches by changing voltage over time.
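The "changing voltage over time" idea can be sketched as a few parameterized drive waveforms, one per tactile effect. The waveform shapes, effect names, and frequencies below are assumptions for illustration, not the CUTE devices' actual drive signals.

```python
# Illustrative sketch: selecting a haptic effect by shaping a normalized
# drive voltage over time. Shapes and parameters are invented, not the
# actual CUTE drive signals.
import math

def waveform(effect, t):
    """Normalized drive voltage at time t (seconds) for a named effect."""
    if effect == "steady_press":       # constant indentation of the skin
        return 1.0
    if effect == "vibration":          # fast oscillation (150 Hz assumed)
        return math.sin(2 * math.pi * 150 * t)
    if effect == "calming_stroke":     # one slow 1-second swell
        return math.sin(math.pi * t)
    raise ValueError(f"unknown effect: {effect}")

# First few millisecond samples of the vibration waveform:
samples = [round(waveform("vibration", t / 1000), 3) for t in range(3)]
print(samples)
```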

“Our CUTE devices demonstrate the feasibility of creating lightweight wearable systems that provide pleasant and expressive tactile communication. Future developments could see this technology applied to larger areas of the body, producing more complex sensations, and even studying human perception of haptic cues that were previously difficult to create.”

– First author Natalia Sanchez, a Ph.D. student at MPI-IS

Investing in Brain Computer Interface Space

Most companies working on brain-computer interfaces are still private, which makes it hard for regular investors to get exposure. ClearPoint Neuro is one of the few public exceptions. It doesn’t build BCIs itself, but its MRI-guided surgical tools are already used in hospitals to implant neural devices. That makes it a behind-the-scenes player helping push the field forward—and a rare chance to invest in this space through the public market.

ClearPoint Neuro Inc. (CLPT -5.55%)

ClearPoint Neuro, a gene therapy-enabling company, specializes in precise navigation to the brain and spine. By providing platforms that facilitate precise placement of devices in the brain, its technology plays an important role in the development and implementation of BCI systems and advances their applications.

With over 50 partners in BioPharma and more than 75 neurosurgery centers globally, ClearPoint is well-positioned to grow in this emerging industry.


With a market cap of $387 million, the company’s shares are currently trading at $13.86, down 9.75% so far this year. Its EPS (TTM) is -0.70 and its P/E (TTM) is -19.68, and no dividend is offered to shareholders.

When it comes to company financials, ClearPoint reported revenue of $31.4 million for the year 2024. With an increase of 31% from the previous year, the company had its tenth consecutive year of growth. During this period, it also achieved a gross margin of 61% on its sales, a 4% increase from the previous year.
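As a quick sanity check on the figures quoted above (a sketch only; the article's quoted P/E of -19.68 likely reflects an unrounded EPS, so the rounded inputs here land slightly off):

```python
# Back-of-envelope check of the quoted figures. Inputs are the article's
# rounded numbers, so results differ slightly from the quoted P/E.
revenue_2024 = 31.4                      # $ millions, FY2024
growth = 0.31                            # 31% year-over-year
revenue_2023 = revenue_2024 / (1 + growth)

price, eps = 13.86, -0.70                # share price and EPS (TTM)
pe_ttm = price / eps

print(f"Implied 2023 revenue: ${revenue_2023:.1f}M")
print(f"P/E (TTM) from rounded inputs: {pe_ttm:.2f}")
```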

This was the “strongest financial and strategic performance,” said CEO Joe Burnett, adding, “Very importantly, we feel that we have entered the next phase of ClearPoint as a company, a phase that we call ‘Fast. Forward.”’

Having made an early repayment on a $10 million convertible note, the company had no outstanding debt at the end of the year. Cash burn during this period was $9 million, 35% less than what ClearPoint spent in 2023. As of Dec. 31, 2024, the company had cash and cash equivalents of $20.1 million.

Most recently, the company applied to the FDA to expand the use of its ClearPoint Prism Neuro Laser Therapy System for MRI. This move could open the currently inaccessible U.S. laser interstitial thermal therapy (LITT) market to it.


Conclusion

With their ability to determine our intent to move or control something in our environment directly from brain activity, brain-computer interfaces (BCIs) show immense potential to revolutionize how humans, particularly those with disabilities, interact with technology and the world around us. BCIs essentially allow users to control devices using just their thoughts.

The Pitt study, which lets users create their own tactile experiences, marks an important shift from simply restoring function to restoring experience. It shows that prosthetics can be more than just tools; they can actually be extensions of the self.

This is still the beginning, though, with significant room for improvement. As BCI systems evolve, the line between artificial and biological sensation will continue to blur, and with it will come advanced capabilities that reshape how we perceive and interact with the world, and perhaps even how we define consciousness.

Click here for a list of the best brain-computer interface companies.


Studies Referenced:

1. Verbaarschot, C., Karapetyan, V., Greenspon, C. M., Cramer, A., van der Kouwe, A., Wendelken, S., … & Andersen, R. A. (2025). Conveying tactile object characteristics through customized intracortical microstimulation of the human somatosensory cortex. Nature Communications, 16, 4017. https://doi.org/10.1038/s41467-025-58616-6


