1.
Nat Hum Behav. 2024 May 13.
Article in English | MEDLINE | ID: mdl-38740984

ABSTRACT

Speech brain-machine interfaces (BMIs) translate brain signals into words or audio outputs, enabling communication for people who have lost their speech abilities due to disease or injury. While important advances have been made in decoding vocalized, attempted and mimed speech, results for internal speech decoding are sparse and have yet to achieve high functionality. Notably, it is still unclear from which brain areas internal speech can be decoded. Here, two participants with tetraplegia, with microelectrode arrays implanted in the supramarginal gyrus (SMG) and primary somatosensory cortex (S1), performed internal and vocalized speech of six words and two pseudowords. In both participants, we found significant neural representation of internal and vocalized speech at the single-neuron and population level in SMG. From the recorded population activity in SMG, the internally spoken and vocalized words were significantly decodable. In an offline analysis, we achieved average decoding accuracies of 55% and 24% for the two participants, respectively (chance level 12.5%), and during an online internal speech BMI task, we averaged 79% and 23% accuracy, respectively. Evidence of shared neural representations between internal speech, word reading and vocalized speech processes was found in participant 1. SMG represented words as well as pseudowords, providing evidence for phonetic encoding. Furthermore, our decoder achieved high classification accuracy with multiple internal speech strategies (auditory imagination/visual imagination). Activity in S1 was modulated by vocalized but not internal speech in both participants, suggesting that no articulator movements of the vocal tract occurred during internal speech production. This work represents a proof of concept for a high-performance internal speech BMI.
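As a rough illustration of the offline decoding analysis described above, the sketch below (Python, scikit-learn) trains a cross-validated linear classifier on trial-by-neuron firing rates for eight classes (six words plus two pseudowords, chance level 12.5%). The array sizes, synthetic features and classifier choice are assumptions for illustration only, not the study's actual data or pipeline.

# Hedged sketch: offline word decoding from population firing rates.
# All sizes and the synthetic data below are illustrative placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

n_classes, n_trials_per_class, n_neurons = 8, 20, 60   # assumed sizes
labels = np.repeat(np.arange(n_classes), n_trials_per_class)

# Synthetic firing-rate features (trials x neurons); in practice these would
# be per-trial spike rates recorded from the SMG array during the
# internal-speech window.
class_means = rng.normal(0.0, 1.0, size=(n_classes, n_neurons))
rates = class_means[labels] + rng.normal(0.0, 1.5, size=(labels.size, n_neurons))

# Cross-validated linear discriminant classifier; accuracy is compared
# against the 12.5% chance level for eight classes.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(LinearDiscriminantAnalysis(), rates, labels, cv=cv)
print(f"mean decoding accuracy: {acc.mean():.2%} (chance 12.5%)")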

2.
Neuron. 110(11): 1777-1787.e3, 2022 Jun 01.
Article in English | MEDLINE | ID: mdl-35364014

ABSTRACT

The cortical grasp network encodes the planning and execution of grasps and also processes spoken and written aspects of language. High-level cortical areas within this network are attractive implant sites for brain-machine interfaces (BMIs). While a tetraplegic patient performed grasp motor imagery and vocalized speech, neural activity was recorded from the supramarginal gyrus (SMG), ventral premotor cortex (PMv) and somatosensory cortex (S1). In SMG and PMv, five imagined grasps were well represented by the firing rates of neuronal populations during visual cue presentation. During motor imagery, these grasps were significantly decodable from all brain areas. During speech production, SMG encoded both spoken grasp types and the names of five colors. Whereas PMv neurons significantly modulated their activity only during grasping, SMG's neural population broadly encoded features of both motor imagery and speech. Together, these results indicate that brain signals from high-level areas of the human cortex could be used for grasping and speech BMI applications.
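As a rough illustration of the single-unit modulation analysis mentioned above, the sketch below runs a per-unit one-way ANOVA across five grasp conditions on synthetic firing rates. The data sizes, significance threshold and choice of test are illustrative assumptions, not the study's actual methods.

# Hedged sketch: testing whether single units are modulated by grasp
# condition (5 imagined grasps). Firing rates are synthetic placeholders;
# the real analysis would use per-trial rates from SMG, PMv and S1 during
# the motor-imagery window.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
n_grasps, n_trials, n_units = 5, 16, 40                  # assumed sizes
labels = np.repeat(np.arange(n_grasps), n_trials)

tuning = rng.normal(0.0, 1.0, size=(n_grasps, n_units))  # per-unit grasp tuning
rates = tuning[labels] + rng.normal(0.0, 1.0, size=(labels.size, n_units))

# One-way ANOVA per unit across the five grasp conditions; units with
# p < 0.05 would be counted as significantly grasp-modulated.
p_values = np.array([
    f_oneway(*[rates[labels == g, u] for g in range(n_grasps)]).pvalue
    for u in range(n_units)
])
print(f"{(p_values < 0.05).sum()} / {n_units} units significantly modulated")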


Subject(s)
Brain-Computer Interfaces, Motor Cortex, Hand Strength/physiology, Humans, Motor Cortex/physiology, Parietal Lobe, Psychomotor Performance/physiology, Speech