Summary: In a groundbreaking study, researchers enabled tetraplegic brain-computer interface (BCI) users to design their own tactile sensations, bringing the field closer to restoring realistic touch. In contrast to previous attempts, in which artificial touch felt generic, participants could adjust stimulation parameters so that virtual objects such as a cat, an apple, or a key each felt distinct.
Participants described vivid, distinctive sensations and could identify objects by touch alone at rates better than chance. The study demonstrates how personalized sensory feedback can improve the comfort, personalization, and ultimately the usability of neuroprosthetics.
Key Facts
- BCI users customized stimulation parameters to give each virtual object its own distinct sensation.
- Genuine Feel: Participants reported rich, meaningful experiences such as "warm and tappy" or "smooth and silky."
- The findings bring the field closer to the goal of neuroprosthetic limbs that feel integrated and natural.
Source: University of Pittsburgh
Researchers at the University of Pittsburgh School of Medicine are one step closer to creating a brain-computer interface, or BCI, that restores the lost sense of touch in people with tetraplegia.
While exploring digitally represented objects through their artificially evoked sense of touch, BCI users described the warm fur of a purring cat, the smooth, rigid surface of a door key, and the cool roundness of an apple.
This study, a partnership between Pitt and the University of Chicago, was published today in Nature Communications.
In contrast to earlier experiments, in which artificial touch often felt like vague buzzing or tingling and did not vary from object to object, the scientists gave BCI users control over the details of the electrical stimulation that evokes tactile sensations, rather than making those decisions themselves.
This key change made it possible for participants to recreate a sense of touch that felt intuitive to them.
The tactile experience is important in non-verbal interpersonal communication because it is both personal and meaningful, said lead author Ceci Verbaarschot, Ph.D., a former postdoctoral fellow at Pitt's Rehab Neural Engineering Labs and now assistant professor of neurological surgery and biomedical engineering at the University of Texas Southwestern.
By designing their own sensations, BCI users can make interactions with objects feel more realistic and meaningful, bringing the field closer to a neuroprosthetic that feels intuitive and user-friendly.
A brain-computer interface is a system that translates brain activity into signals that can replace, restore, or enhance body functions normally controlled by the brain, such as muscle movement.
BCIs can also be used to restore lost sensation by stimulating the brain directly, reestablishing sensory feedback from the body.
Over the past decade of research, Pitt scientists have enabled a paralyzed man to experience the sensation of touch through a mind-controlled robotic arm, which made controlling the arm more efficient.
However, those sensory experiences were imperfect and remained the same across objects of different temperature and texture: touching someone's hand felt the same as picking up a hard, rough rock.
Scientists are now a step closer to their goal of developing artificial touch that is intuitive.
In the new study, BCI users were able to create distinct tactile experiences for different computer-generated objects and were moderately successful at identifying the objects based on sensation alone.
Searching for the right sensation resembled a game of "hot and cold" played in a dark room holding a near-infinite number of possible sensory experiences. Researchers asked study participants, all of whom had lost sensation in their hands as a result of a spinal cord injury, to explore an object presented to them on a screen and find a combination of stimulation parameters that felt like petting a cat or touching an apple, a key, a towel, or a piece of toast.
All three study participants gave rich, vivid descriptions of the objects in terms that were both sensible and subjective: one participant described the cat as warm and "tappy," while another found it smooth and silky.
When the visual image was removed and participants had to rely on stimulation alone, they correctly identified which of the five objects they were feeling 35% of the time: better than the 20% expected by chance, but far from perfect.
"We designed this study to shoot for the moon and made it into orbit," said senior author Robert Gaunt, Ph.D., associate professor of physical medicine and rehabilitation at Pitt.
Participants succeeded in distinguishing between objects using tactile sensation alone, and when they made mistakes, those mistakes were sensible: they were more likely to confuse a cat with a towel, since both are soft, than to mistake a cat for a key.
By evoking realistic touch sensations in a paralyzed hand, the study marks an important step toward an artificial limb that integrates seamlessly into a person's individual sensory world.
Other authors of the study are Vahagn Karapetyan, M.D., Ph.D., and Michael Boninger, both of Pitt; Charles Greenspon, Ph.D., and Sliman Bensmaia, Ph.D., both of the University of Chicago; and Bettina Sorger, Ph.D., of Maastricht University.
Funding: This study was funded by the National Institute of Neurological Disorders and Stroke (UH3 NS107714) and the Dutch Research Council (NWO Rubicon: 19.193SG.011, NWO Vidi: VI.Vidi.191.210).
About this BCI and neurotechnology research news
Author: Anastasia (Ana) Gorelova
Source: University of Pittsburgh
Contact: Anastasia (Ana) Gorelova – University of Pittsburgh
Image: The image is credited to Neuroscience News
Original Research: Open access.
"Conveying tactile object characteristics through customized intracortical microstimulation of the human somatosensory cortex" by Ceci Verbaarschot et al. Nature Communications
Abstract
Conveying tactile object characteristics through customized intracortical microstimulation of the human somatosensory cortex
In people with spinal cord injuries, microstimulation of the somatosensory cortex can induce tactile perceptions, giving a means of restoring touch.
Although location and intensity can be reliably conveyed, two issues hinder the creation of more complex, naturalistic sensations: difficulty in characterizing percept quality and a lack of tools to efficiently explore the large stimulation parameter space.
To address both challenges, we present a study in which three men with tetraplegia blindly controlled their own stimulation parameters to create sensations for different virtual objects.
Using this method, participants can reliably produce object-specific sensations and report vivid, object-appropriate characteristics.
Additionally, both participants and linear classifiers are able to match stimulation profiles to their respective objects significantly above chance, without the use of visual cues.
Confusion between two sensations increases as their associated objects share more tactile characteristics.
We conclude that, although visual information plays an important role in the processing of artificially evoked sensations, microstimulation of the somatosensory cortex can also produce intuitive percepts with a range of tactile characteristics.
This self-guided stimulation approach offers an effective way to characterize percepts in future stimulation paradigms.