Results 1 - 20 of 35
1.
Cell ; 181(2): 396-409.e26, 2020 04 16.
Article in English | MEDLINE | ID: mdl-32220308

ABSTRACT

Decades after the motor homunculus was first proposed, it is still unknown how different body parts are intermixed and interrelated in human motor cortical areas at single-neuron resolution. Using multi-unit recordings, we studied how face, head, arm, and leg movements are represented in the hand knob area of premotor cortex (precentral gyrus) in people with tetraplegia. Contrary to traditional expectations, we found strong representation of all movements and a partially "compositional" neural code that linked together all four limbs. The code consisted of (1) a limb-coding component representing the limb to be moved and (2) a movement-coding component where analogous movements from each limb (e.g., hand grasp and toe curl) were represented similarly. Compositional coding might facilitate skill transfer across limbs, and it provides a useful framework for thinking about how the motor system constructs movement. Finally, we leveraged these results to create a whole-body intracortical brain-computer interface that spreads targets across all limbs.
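
A toy sketch (not the authors' analysis code) of the compositional code described above: each condition's firing-rate vector is modeled as the sum of a limb-coding component and a movement-coding component, so analogous movements of different limbs end up correlated. The unit count and tuning vectors are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 96
limbs = ["face", "head", "arm", "leg"]
moves = ["flex", "extend", "grasp"]

# Toy compositional model: rate(limb, movement) = limb component + movement component.
limb_code = {l: rng.normal(size=n_units) for l in limbs}
move_code = {m: rng.normal(size=n_units) for m in moves}
rates = {(l, m): limb_code[l] + move_code[m] for l in limbs for m in moves}

def corr(x, y):
    return np.corrcoef(x, y)[0, 1]

# Analogous movements of different limbs (e.g., hand grasp and toe curl) share the
# movement component, so they correlate more strongly than non-analogous pairs.
print("analogous movements:    ", corr(rates[("arm", "grasp")], rates[("leg", "grasp")]))
print("non-analogous movements:", corr(rates[("arm", "grasp")], rates[("leg", "flex")]))
```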


Subjects
Frontal Lobe/physiology, Motor Cortex/anatomy & histology, Motor Cortex/physiology, Adult, Brain Mapping, Frontal Lobe/anatomy & histology, Human Body, Humans, Motor Cortex/metabolism, Movement/physiology
2.
Nature ; 620(7976): 1031-1036, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37612500

ABSTRACT

Speech brain-computer interfaces (BCIs) have the potential to restore rapid communication to people with paralysis by decoding neural activity evoked by attempted speech into text1,2 or sound3,4. Early demonstrations, although promising, have not yet achieved accuracies sufficiently high for communication of unconstrained sentences from a large vocabulary1-7. Here we demonstrate a speech-to-text BCI that records spiking activity from intracortical microelectrode arrays. Enabled by these high-resolution recordings, our study participant, who can no longer speak intelligibly owing to amyotrophic lateral sclerosis, achieved a 9.1% word error rate on a 50-word vocabulary (2.7 times fewer errors than the previous state-of-the-art speech BCI2) and a 23.8% word error rate on a 125,000-word vocabulary (the first successful demonstration, to our knowledge, of large-vocabulary decoding). Our participant's attempted speech was decoded at 62 words per minute, which is 3.4 times as fast as the previous record8 and begins to approach the speed of natural conversation (160 words per minute9). Finally, we highlight two aspects of the neural code for speech that are encouraging for speech BCIs: spatially intermixed tuning to speech articulators that makes accurate decoding possible from only a small region of cortex, and a detailed articulatory representation of phonemes that persists years after paralysis. These results show a feasible path forward for restoring rapid communication to people with paralysis who can no longer speak.
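
The 9.1% and 23.8% figures above are word error rates, the standard edit-distance metric for speech decoding; a minimal reference implementation (not the study's evaluation code) is sketched below.

```python
import numpy as np

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level Levenshtein distance (substitutions + insertions + deletions)
    divided by the number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    d = np.zeros((len(ref) + 1, len(hyp) + 1), dtype=int)
    d[:, 0] = np.arange(len(ref) + 1)
    d[0, :] = np.arange(len(hyp) + 1)
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1, j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i, j] = min(sub, d[i - 1, j] + 1, d[i, j - 1] + 1)
    return d[len(ref), len(hyp)] / len(ref)

# One substitution and one deletion over six reference words -> WER of 2/6.
print(word_error_rate("i would like some water please", "i would like sun water"))
```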


Subjects
Brain-Computer Interfaces, Neural Prostheses, Paralysis, Speech, Humans, Amyotrophic Lateral Sclerosis/physiopathology, Amyotrophic Lateral Sclerosis/rehabilitation, Cerebral Cortex/physiology, Microelectrodes, Paralysis/physiopathology, Paralysis/rehabilitation, Vocabulary
3.
N Engl J Med ; 391(7): 609-618, 2024 Aug 15.
Article in English | MEDLINE | ID: mdl-39141853

ABSTRACT

BACKGROUND: Brain-computer interfaces can enable communication for people with paralysis by transforming cortical activity associated with attempted speech into text on a computer screen. Communication with brain-computer interfaces has been restricted by extensive training requirements and limited accuracy. METHODS: A 45-year-old man with amyotrophic lateral sclerosis (ALS) with tetraparesis and severe dysarthria underwent surgical implantation of four microelectrode arrays into his left ventral precentral gyrus 5 years after the onset of the illness; these arrays recorded neural activity from 256 intracortical electrodes. We report the results of decoding his cortical neural activity as he attempted to speak in both prompted and unstructured conversational contexts. Decoded words were displayed on a screen and then vocalized with the use of text-to-speech software designed to sound like his pre-ALS voice. RESULTS: On the first day of use (25 days after surgery), the neuroprosthesis achieved 99.6% accuracy with a 50-word vocabulary. Calibration of the neuroprosthesis required 30 minutes of cortical recordings while the participant attempted to speak, followed by subsequent processing. On the second day, after 1.4 additional hours of system training, the neuroprosthesis achieved 90.2% accuracy using a 125,000-word vocabulary. With further training data, the neuroprosthesis sustained 97.5% accuracy over a period of 8.4 months after surgical implantation, and the participant used it to communicate in self-paced conversations at a rate of approximately 32 words per minute for more than 248 cumulative hours. CONCLUSIONS: In a person with ALS and severe dysarthria, an intracortical speech neuroprosthesis reached a level of performance suitable to restore conversational communication after brief training. (Funded by the Office of the Assistant Secretary of Defense for Health Affairs and others; BrainGate2 ClinicalTrials.gov number, NCT00912041.).


Subjects
Amyotrophic Lateral Sclerosis, Brain-Computer Interfaces, Dysarthria, Speech, Humans, Male, Middle Aged, Amyotrophic Lateral Sclerosis/complications, Amyotrophic Lateral Sclerosis/rehabilitation, Calibration, Communication Aids for Disabled, Dysarthria/rehabilitation, Dysarthria/etiology, Implanted Electrodes, Microelectrodes, Quadriplegia/etiology, Quadriplegia/rehabilitation
4.
Nature ; 593(7858): 249-254, 2021 05.
Article in English | MEDLINE | ID: mdl-33981047

ABSTRACT

Brain-computer interfaces (BCIs) can restore communication to people who have lost the ability to move or speak. So far, a major focus of BCI research has been on restoring gross motor skills, such as reaching and grasping1-5 or point-and-click typing with a computer cursor6,7. However, rapid sequences of highly dexterous behaviours, such as handwriting or touch typing, might enable faster rates of communication. Here we developed an intracortical BCI that decodes attempted handwriting movements from neural activity in the motor cortex and translates it to text in real time, using a recurrent neural network decoding approach. With this BCI, our study participant, whose hand was paralysed from spinal cord injury, achieved typing speeds of 90 characters per minute with 94.1% raw accuracy online, and greater than 99% accuracy offline with a general-purpose autocorrect. To our knowledge, these typing speeds exceed those reported for any other BCI, and are comparable to typical smartphone typing speeds of individuals in the age group of our participant (115 characters per minute)8. Finally, theoretical considerations explain why temporally complex movements, such as handwriting, may be fundamentally easier to decode than point-to-point movements. Our results open a new approach for BCIs and demonstrate the feasibility of accurately decoding rapid, dexterous movements years after paralysis.
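
A toy sketch of the recurrent decoding approach described above: a small GRU maps binned neural features to per-time-step character probabilities. The feature count, layer sizes, and character-set size are illustrative assumptions, not the study's network or training procedure.

```python
import torch
import torch.nn as nn

class HandwritingDecoder(nn.Module):
    """Toy recurrent decoder: binned neural features -> per-time-step character logits."""
    def __init__(self, n_features=192, hidden=256, n_chars=31):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, num_layers=2, batch_first=True)
        self.readout = nn.Linear(hidden, n_chars)

    def forward(self, x):              # x: (batch, time, n_features)
        h, _ = self.rnn(x)
        return self.readout(h)         # (batch, time, n_chars)

decoder = HandwritingDecoder()
fake_neural = torch.randn(1, 500, 192)          # 500 time bins of threshold-crossing counts
char_probs = decoder(fake_neural).softmax(-1)   # per-bin character probabilities
print(char_probs.argmax(-1).shape)              # greedy character stream: torch.Size([1, 500])
```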


Subjects
Brain-Computer Interfaces, Brain/physiology, Communication, Handwriting, Humans, Neural Networks (Computer), Spinal Cord Injuries, Time Factors
5.
J Neurosci ; 42(25): 5007-5020, 2022 06 22.
Article in English | MEDLINE | ID: mdl-35589391

ABSTRACT

Consolidation of memory is believed to involve offline replay of neural activity. While amply demonstrated in rodents, evidence for replay in humans, particularly regarding motor memory, is less compelling. To determine whether replay occurs after motor learning, we sought to record from motor cortex during a novel motor task and subsequent overnight sleep. A 36-year-old man with tetraplegia secondary to cervical spinal cord injury enrolled in the ongoing BrainGate brain-computer interface pilot clinical trial had two 96-channel intracortical microelectrode arrays placed chronically into left precentral gyrus. Single- and multi-unit activity was recorded while he played a color/sound sequence matching memory game. Intended movements were decoded from motor cortical neuronal activity by a real-time steady-state Kalman filter that allowed the participant to control a neurally driven cursor on the screen. Intracortical neural activity from precentral gyrus and 2-lead scalp EEG were recorded overnight as he slept. When decoded using the same steady-state Kalman filter parameters, intracortical neural signals recorded overnight replayed the target sequence from the memory game at intervals throughout the night, at a rate significantly greater than expected by chance. Replay events occurred at speeds ranging from 1 to 4 times as fast as initial task execution and were most frequently observed during slow-wave sleep. These results demonstrate that recent visuomotor skill acquisition in humans may be accompanied by replay of the corresponding motor cortex neural activity during sleep. SIGNIFICANCE STATEMENT: Within cortex, the acquisition of information is often followed by the offline recapitulation of specific sequences of neural firing. Replay of recent activity is enriched during sleep and may support the consolidation of learning and memory. Using an intracortical brain-computer interface, we recorded and decoded activity from motor cortex as a human research participant performed a novel motor task. By decoding neural activity throughout subsequent sleep, we find that neural sequences underlying the recently practiced motor task are repeated throughout the night, providing direct evidence of replay in human motor cortex during sleep. This approach, using an optimized brain-computer interface decoder to characterize neural activity during sleep, provides a framework for future studies exploring replay, learning, and memory.
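
At decode time, the steady-state Kalman filter mentioned above reduces to a fixed-gain update of the cursor state from each bin of neural features; a minimal sketch follows. The dynamics, observation, and gain matrices are random placeholders, not fitted parameters.

```python
import numpy as np

def steady_state_kalman_step(x_prev, y, A, C, K):
    """One decode step with a fixed (steady-state) Kalman gain K.
    x_prev: cursor state [px, py, vx, vy]; y: binned neural feature vector."""
    x_pred = A @ x_prev                     # predict from cursor dynamics
    return x_pred + K @ (y - C @ x_pred)    # correct with the neural observation

rng = np.random.default_rng(1)
A = np.eye(4); A[0, 2] = A[1, 3] = 0.02     # position integrates velocity (20 ms bins)
C = rng.normal(size=(192, 4)) * 0.1         # placeholder observation (tuning) model
K = np.linalg.pinv(C) * 0.5                 # placeholder fixed gain, not a fitted one

x = np.zeros(4)
for _ in range(50):                         # e.g., replaying recorded features offline
    y = C @ np.array([0.0, 0.0, 1.0, 0.0]) + rng.normal(size=192) * 0.05
    x = steady_state_kalman_step(x, y, A, C, K)
print("decoded velocity:", x[2:])
```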


Subjects
Learning/physiology, Motor Cortex/physiology, Sleep/physiology, Adult, Brain-Computer Interfaces, Cervical Vertebrae, Electroencephalography/methods, Humans, Male, Pilot Projects, Quadriplegia/etiology, Quadriplegia/physiopathology, Spinal Cord Injuries/complications, Spinal Cord Injuries/physiopathology
6.
Lancet ; 389(10081): 1821-1830, 2017 05 06.
Article in English | MEDLINE | ID: mdl-28363483

ABSTRACT

BACKGROUND: People with chronic tetraplegia, due to high-cervical spinal cord injury, can regain limb movements through coordinated electrical stimulation of peripheral muscles and nerves, known as functional electrical stimulation (FES). Users typically command FES systems through other preserved, but unrelated and limited in number, volitional movements (eg, facial muscle activity, head movements, shoulder shrugs). We report the findings of an individual with traumatic high-cervical spinal cord injury who coordinated reaching and grasping movements using his own paralysed arm and hand, reanimated through implanted FES, and commanded using his own cortical signals through an intracortical brain-computer interface (iBCI). METHODS: We recruited a participant into the BrainGate2 clinical trial, an ongoing study that obtains safety information regarding an intracortical neural interface device, and investigates the feasibility of people with tetraplegia controlling assistive devices using their cortical signals. Surgical procedures were performed at University Hospitals Cleveland Medical Center (Cleveland, OH, USA). Study procedures and data analyses were performed at Case Western Reserve University (Cleveland, OH, USA) and the US Department of Veterans Affairs, Louis Stokes Cleveland Veterans Affairs Medical Center (Cleveland, OH, USA). The study participant was a 53-year-old man with a spinal cord injury (cervical level 4, American Spinal Injury Association Impairment Scale category A). He received two intracortical microelectrode arrays in the hand area of his motor cortex, and 4 months and 9 months later received a total of 36 implanted percutaneous electrodes in his right upper and lower arm to electrically stimulate his hand, elbow, and shoulder muscles. The participant used a motorised mobile arm support for gravitational assistance and to provide humeral abduction and adduction under cortical control. We assessed the participant's ability to cortically command his paralysed arm to perform simple single-joint arm and hand movements and functionally meaningful multi-joint movements. We compared iBCI control of his paralysed arm with that of a virtual three-dimensional arm. This study is registered with ClinicalTrials.gov, number NCT00912041. FINDINGS: The intracortical implant occurred on Dec 1, 2014, and we are continuing to study the participant. The last session included in this report was Nov 7, 2016. The point-to-point target acquisition sessions began on Oct 8, 2015 (311 days after implant). The participant successfully cortically commanded single-joint and coordinated multi-joint arm movements for point-to-point target acquisitions (80-100% accuracy), using first a virtual arm and second his own arm animated by FES. Using his paralysed arm, the participant volitionally performed self-paced reaches to drink a mug of coffee (successfully completing 11 of 12 attempts within a single session 463 days after implant) and feed himself (717 days after implant). INTERPRETATION: To our knowledge, this is the first report of a combined implanted FES+iBCI neuroprosthesis for restoring both reaching and grasping movements to people with chronic tetraplegia due to spinal cord injury, and represents a major advance, with a clear translational path, for clinically viable neuroprostheses for restoration of reaching and grasping after paralysis. FUNDING: National Institutes of Health, Department of Veterans Affairs.


Subjects
Brain-Computer Interfaces/statistics & numerical data, Brain/physiopathology, Hand Strength/physiology, Muscle, Skeletal/physiopathology, Quadriplegia/diagnosis, Spinal Cord Injuries/physiopathology, Brain/surgery, Electric Stimulation Therapy/methods, Implanted Electrodes/standards, Feasibility Studies, Hand/physiology, Humans, Male, Microelectrodes/adverse effects, Middle Aged, Motor Cortex/physiopathology, Movement/physiology, Quadriplegia/physiopathology, Quadriplegia/surgery, Self-Help Devices/statistics & numerical data, Spinal Cord Injuries/therapy, United States, United States Department of Veterans Affairs, User-Computer Interface
7.
bioRxiv ; 2024 Apr 26.
Article in English | MEDLINE | ID: mdl-38712189

ABSTRACT

Keyboard typing with finger movements is a versatile digital interface for users with diverse skills, needs, and preferences. Currently, such an interface does not exist for people with paralysis. We developed an intracortical brain-computer interface (BCI) for typing with attempted flexion/extension movements of three finger groups on the right hand, or both hands, and demonstrated its flexibility in two dominant typing paradigms. The first paradigm is "point-and-click" typing, where a BCI user selects one key at a time using continuous real-time control, allowing selection of arbitrary sequences of symbols. During cued character selection with this paradigm, a human research participant with paralysis achieved 30-40 selections per minute with nearly 90% accuracy. The second paradigm is "keystroke" typing, where the BCI user selects each character by a discrete movement without real-time feedback, often giving a faster speed for natural language sentences. With 90 cued characters per minute, decoding attempted finger movements and correcting errors using a language model resulted in more than 90% accuracy. Notably, both paradigms matched the state-of-the-art for BCI performance and enabled further flexibility by the simultaneous selection of multiple characters as well as efficient decoder estimation across paradigms. Overall, the high-performance interface is a step towards the wider accessibility of BCI technology by addressing unmet user needs for flexibility.
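
A minimal sketch of the "keystroke" paradigm with language-model error correction: each decoded keystroke yields a probability distribution over a few keys, and a closed word list (a stand-in for the study's language model) rescores whole words. The key layout and vocabulary are invented for illustration.

```python
import numpy as np

keys = ["a", "b", "c"]                  # hypothetical 3-key layout, one key per finger group
vocab = ["cab", "bac", "abc"]           # toy "language model": a closed word list

def decode_word(keystroke_probs):
    """keystroke_probs: (n_keystrokes, n_keys) decoder outputs, one row per keystroke.
    Score each vocabulary word by the summed log-probability of its letters."""
    scores = {}
    for word in vocab:
        idx = [keys.index(ch) for ch in word]
        scores[word] = sum(np.log(keystroke_probs[t, k]) for t, k in enumerate(idx))
    return max(scores, key=scores.get)

# The third keystroke is mis-decoded as 'a' on its own ("caa" is not a word);
# the word-level prior recovers the intended word.
probs = np.array([[0.1, 0.2, 0.7],
                  [0.8, 0.1, 0.1],
                  [0.5, 0.4, 0.1]])
print(decode_word(probs))               # -> "cab"
```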

8.
Sci Rep ; 14(1): 1598, 2024 01 18.
Article in English | MEDLINE | ID: mdl-38238386

ABSTRACT

Brain-computer interfaces have so far focused largely on enabling the control of a single effector, for example a single computer cursor or robotic arm. Restoring multi-effector motion could unlock greater functionality for people with paralysis (e.g., bimanual movement). However, it may prove challenging to decode the simultaneous motion of multiple effectors, as we recently found that a compositional neural code links movements across all limbs and that neural tuning changes nonlinearly during dual-effector motion. Here, we demonstrate the feasibility of high-quality bimanual control of two cursors via neural network (NN) decoders. Through simulations, we show that NNs leverage a neural 'laterality' dimension to distinguish between left- and right-hand movements as neural tuning to the two hands becomes increasingly correlated. In training recurrent neural networks (RNNs) for two-cursor control, we developed a method that alters the temporal structure of the training data by dilating/compressing it in time and re-ordering it, which we show helps RNNs successfully generalize to the online setting. With this method, we demonstrate that a person with paralysis can control two computer cursors simultaneously. Our results suggest that neural network decoders may be advantageous for multi-effector decoding, provided they are designed to transfer to the online setting.
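
The augmentation described above can be sketched as re-sampling each training trial onto a randomly dilated or compressed time base and then re-ordering the trials, so an RNN cannot exploit session-level temporal structure. The warp range and array shapes are illustrative assumptions.

```python
import numpy as np

def time_warp_and_shuffle(trials, rng, min_scale=0.7, max_scale=1.3):
    """Augment a list of (time, features) trials by random temporal dilation/compression,
    then return them in shuffled order."""
    warped = []
    for trial in trials:
        scale = rng.uniform(min_scale, max_scale)
        t_old = np.arange(trial.shape[0])
        t_new = np.linspace(0, trial.shape[0] - 1, max(2, int(round(trial.shape[0] * scale))))
        # Linear interpolation of every feature channel onto the new time base.
        warped.append(np.stack([np.interp(t_new, t_old, trial[:, ch])
                                for ch in range(trial.shape[1])], axis=1))
    rng.shuffle(warped)
    return warped

rng = np.random.default_rng(0)
trials = [rng.normal(size=(100, 192)) for _ in range(5)]   # 5 fake trials, 192 channels
print([a.shape for a in time_warp_and_shuffle(trials, rng)])
```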


Subjects
Brain-Computer Interfaces, Neural Networks (Computer), Humans, Movement, Functional Laterality, Hand, Paralysis, Brain
9.
bioRxiv ; 2024 Feb 08.
Article in English | MEDLINE | ID: mdl-38370697

ABSTRACT

People with paralysis express unmet needs for peer support, leisure activities, and sporting activities. Many within the general population rely on social media and massively multiplayer video games to address these needs. We developed a high-performance finger brain-computer-interface system allowing continuous control of 3 independent finger groups with 2D thumb movements. The system was tested in a human research participant over sequential trials requiring fingers to reach and hold on targets, with an average acquisition rate of 76 targets/minute and completion time of 1.58 ± 0.06 seconds. Performance compared favorably to previous animal studies, despite a 2-fold increase in the decoded degrees-of-freedom (DOF). Finger positions were then used for 4-DOF velocity control of a virtual quadcopter, demonstrating functionality over both fixed and random obstacle courses. This approach shows promise for controlling multiple-DOF end-effectors, such as robotic fingers or digital interfaces for work, entertainment, and socialization.
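
A sketch of how decoded finger-group positions could be mapped to 4-DOF velocity commands for a virtual quadcopter. The gains, rest position, and axis assignments are invented for illustration; they are not the study's mapping.

```python
import numpy as np

GAINS = np.array([2.0, 2.0, 1.5, 1.0])   # forward, lateral, vertical, yaw (illustrative)

def fingers_to_velocity(thumb_xy, index_group, ring_group):
    """thumb_xy: decoded 2-D thumb position (0..1, 0.5 = rest); the two 1-D finger
    groups drive the remaining axes. Returns a 4-DOF velocity command."""
    deflection = np.array([thumb_xy[0] - 0.5, thumb_xy[1] - 0.5,
                           index_group - 0.5, ring_group - 0.5])
    return GAINS * deflection

print(fingers_to_velocity(thumb_xy=(0.9, 0.5), index_group=0.3, ring_group=0.5))
# -> mostly forward motion with a slight descent, no lateral or yaw command
```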

10.
medRxiv ; 2024 Apr 10.
Article in English | MEDLINE | ID: mdl-38645254

ABSTRACT

Brain-computer interfaces can enable rapid, intuitive communication for people with paralysis by transforming the cortical activity associated with attempted speech into text on a computer screen. Despite recent advances, communication with brain-computer interfaces has been restricted by extensive training data requirements and inaccurate word output. A man in his 40s with ALS, tetraparesis, and severe dysarthria (ALSFRS-R = 23) was enrolled in the BrainGate2 clinical trial. He underwent surgical implantation of four microelectrode arrays into his left precentral gyrus, which recorded neural activity from 256 intracortical electrodes. We report a speech neuroprosthesis that decoded his neural activity as he attempted to speak in both prompted and unstructured conversational settings. Decoded words were displayed on a screen, then vocalized using text-to-speech software designed to sound like his pre-ALS voice. On the first day of system use, following 30 minutes of attempted speech training data, the neuroprosthesis achieved 99.6% accuracy with a 50-word vocabulary. On the second day, the size of the possible output vocabulary increased to 125,000 words, and, after 1.4 additional hours of training data, the neuroprosthesis achieved 90.2% accuracy. With further training data, the neuroprosthesis sustained 97.5% accuracy beyond eight months after surgical implantation. The participant has used the neuroprosthesis to communicate in self-paced conversations for over 248 hours. In an individual with ALS and severe dysarthria, an intracortical speech neuroprosthesis reached a level of performance suitable to restore naturalistic communication after a brief training period.

11.
bioRxiv ; 2024 Sep 16.
Article in English | MEDLINE | ID: mdl-39345372

ABSTRACT

Understanding how the body is represented in motor cortex is key to understanding how the brain controls movement. The precentral gyrus (PCG) has long been thought to contain largely distinct regions for the arm, leg and face (represented by the "motor homunculus"). However, mounting evidence has begun to reveal a more intermixed, interrelated and broadly tuned motor map. Here, we revisit the motor homunculus using microelectrode array recordings from 20 arrays that broadly sample PCG across 8 individuals, creating a comprehensive map of human motor cortex at single neuron resolution. We found whole-body representations throughout all sampled points of PCG, contradicting traditional leg/arm/face boundaries. We also found two speech-preferential areas with a broadly tuned, orofacial-dominant area in between them, previously unaccounted for by the homunculus. Throughout PCG, movement representations of the four limbs were interlinked, with homologous movements of different limbs (e.g., toe curl and hand close) having correlated representations. Our findings indicate that, while the classic homunculus aligns with each area's preferred body region at a coarse level, at a finer scale, PCG may be better described as a mosaic of functional zones, each with its own whole-body representation.

12.
ArXiv ; 2023 Nov 06.
Article in English | MEDLINE | ID: mdl-37986728

ABSTRACT

Intracortical brain-computer interfaces (iBCIs) have shown promise for restoring rapid communication to people with neurological disorders such as amyotrophic lateral sclerosis (ALS). However, to maintain high performance over time, iBCIs typically need frequent recalibration to combat changes in the neural recordings that accrue over days. This requires iBCI users to stop using the iBCI and engage in supervised data collection, making the iBCI system hard to use. In this paper, we propose a method that enables self-recalibration of communication iBCIs without interrupting the user. Our method leverages large language models (LMs) to automatically correct errors in iBCI outputs. The self-recalibration process uses these corrected outputs ("pseudo-labels") to continually update the iBCI decoder online. Over a period of more than one year (403 days), we evaluated our Continual Online Recalibration with Pseudo-labels (CORP) framework with one clinical trial participant. CORP achieved a stable decoding accuracy of 93.84% in an online handwriting iBCI task, significantly outperforming other baseline methods. Notably, this is the longest-running iBCI stability demonstration involving a human participant. Our results provide the first evidence for long-term stabilization of a plug-and-play, high-performance communication iBCI, addressing a major barrier for the clinical translation of iBCIs.
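
A schematic of the pseudo-label loop described above, with stand-ins for both stages: a lookup-table "correction" replaces the large language model, and a single gradient step on a toy linear character decoder replaces the actual online decoder update. Shapes and the alphabet are illustrative.

```python
import numpy as np

ALPHABET = "abcdefghijklmnopqrstuvwxyz "           # toy output classes

def lm_correct(decoded: str) -> str:
    """Stand-in for the language-model correction step (CORP uses a large LM here)."""
    fixes = {"helo": "hello", "wrld": "world"}
    return " ".join(fixes.get(w, w) for w in decoded.split())

def pseudo_label_update(W, features, decoded_text, lr=1e-3):
    """Treat the corrected output as ground truth ("pseudo-labels") and take one
    gradient step on a toy linear character decoder."""
    corrected = lm_correct(decoded_text)
    labels = np.array([ALPHABET.index(c) for c in corrected])
    T = min(len(labels), features.shape[0])
    targets = np.eye(len(ALPHABET))[labels[:T]]
    grad = features[:T].T @ (features[:T] @ W - targets[:T]) / T
    return W - lr * grad, corrected

rng = np.random.default_rng(0)
features = rng.normal(size=(11, 64))               # fake per-character neural features
W = np.zeros((64, len(ALPHABET)))                  # toy linear decoder
W, corrected = pseudo_label_update(W, features, "helo wrld")
print(corrected)                                   # "hello world"
```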

13.
bioRxiv ; 2023 Apr 21.
Article in English | MEDLINE | ID: mdl-37131830

ABSTRACT

Advances in deep learning have given rise to neural network models of the relationship between movement and brain activity that appear to far outperform prior approaches. Brain-computer interfaces (BCIs) that enable people with paralysis to control external devices, such as robotic arms or computer cursors, might stand to benefit greatly from these advances. We tested recurrent neural networks (RNNs) on a challenging nonlinear BCI problem: decoding continuous bimanual movement of two computer cursors. Surprisingly, we found that although RNNs appeared to perform well in offline settings, they did so by overfitting to the temporal structure of the training data and failed to generalize to real-time neuroprosthetic control. In response, we developed a method that alters the temporal structure of the training data by dilating/compressing it in time and re-ordering it, which we show helps RNNs successfully generalize to the online setting. With this method, we demonstrate that a person with paralysis can control two computer cursors simultaneously, far outperforming standard linear methods. Our results provide evidence that preventing models from overfitting to temporal structure in training data may, in principle, aid in translating deep learning advances to the BCI setting, unlocking improved performance for challenging applications.

14.
Adv Neural Inf Process Syst ; 36: 42258-42270, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38738213

ABSTRACT

Intracortical brain-computer interfaces (iBCIs) have shown promise for restoring rapid communication to people with neurological disorders such as amyotrophic lateral sclerosis (ALS). However, to maintain high performance over time, iBCIs typically need frequent recalibration to combat changes in the neural recordings that accrue over days. This requires iBCI users to stop using the iBCI and engage in supervised data collection, making the iBCI system hard to use. In this paper, we propose a method that enables self-recalibration of communication iBCIs without interrupting the user. Our method leverages large language models (LMs) to automatically correct errors in iBCI outputs. The self-recalibration process uses these corrected outputs ("pseudo-labels") to continually update the iBCI decoder online. Over a period of more than one year (403 days), we evaluated our Continual Online Recalibration with Pseudo-labels (CORP) framework with one clinical trial participant. CORP achieved a stable decoding accuracy of 93.84% in an online handwriting iBCI task, significantly outperforming other baseline methods. Notably, this is the longest-running iBCI stability demonstration involving a human participant. Our results provide the first evidence for long-term stabilization of a plug-and-play, high-performance communication iBCI, addressing a major barrier for the clinical translation of iBCIs.

15.
bioRxiv ; 2023 Oct 12.
Article in English | MEDLINE | ID: mdl-37873182

ABSTRACT

How does the motor cortex combine simple movements (such as single finger flexion/extension) into complex movements (such as hand gestures or playing the piano)? Motor cortical activity was recorded using intracortical multi-electrode arrays in two people with tetraplegia as they attempted single, pairwise and higher-order finger movements. Neural activity for simultaneous movements was largely aligned with the linear summation of the corresponding single finger movement activities, with two violations. First, the neural activity was normalized, preventing its magnitude from growing as the number of moving fingers increased. Second, the neural tuning direction of weakly represented fingers (e.g. middle) changed significantly as a result of the movement of other fingers. These deviations from linearity resulted in non-linear methods outperforming linear methods for neural decoding. Overall, simultaneous finger movements are thus represented as a pseudo-linear summation of the corresponding individual finger movements.
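
A toy sketch of the pseudo-linear summation described above: simultaneous-movement activity is modeled as the sum of single-finger patterns, re-normalized so the population magnitude does not grow with the number of moving fingers. The patterns are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 128
single = {f: rng.normal(size=n_units) for f in ["thumb", "index", "middle", "ring_little"]}

def pseudo_linear(fingers):
    """Sum the single-finger patterns, then re-normalize the population magnitude."""
    s = sum(single[f] for f in fingers)
    return s / np.linalg.norm(s) * np.mean([np.linalg.norm(single[f]) for f in fingers])

combo = pseudo_linear(["thumb", "index"])
print(np.linalg.norm(combo), np.linalg.norm(single["thumb"]))        # comparable magnitudes
print(np.corrcoef(combo, single["thumb"] + single["index"])[0, 1])   # direction ~ linear sum
```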

16.
bioRxiv ; 2023 Feb 04.
Article in English | MEDLINE | ID: mdl-36778458

ABSTRACT

Intracortical brain-computer interfaces (iBCIs) require frequent recalibration to maintain robust performance due to changes in neural activity that accumulate over time. Compensating for this nonstationarity would enable consistently high performance without the need for supervised recalibration periods, where users cannot engage in free use of their device. Here we introduce a hidden Markov model (HMM) to infer what targets users are moving toward during iBCI use. We then retrain the system using these inferred targets, enabling unsupervised adaptation to changing neural activity. Our approach outperforms the state of the art in large-scale, closed-loop simulations over two months and in closed-loop with a human iBCI user over one month. Leveraging an offline dataset spanning five years of iBCI recordings, we further show how recently proposed data distribution-matching approaches to recalibration fail over long time scales; only target-inference methods appear capable of enabling long-term unsupervised recalibration. Our results demonstrate how task structure can be used to bootstrap a noisy decoder into a highly-performant one, thereby overcoming one of the major barriers to clinically translating BCIs.
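
A minimal Viterbi-style sketch of the target-inference idea: given decoded cursor kinematics and candidate target locations, infer the most likely intended target at each time step; the inferred targets can then serve as labels for retraining the decoder. The emission model (alignment between velocity and target direction) and the sticky transition prior are illustrative, not the paper's HMM.

```python
import numpy as np

def infer_targets(cursor_vel, cursor_pos, targets, stay_prob=0.95):
    """Viterbi decoding of the intended target index at each time step."""
    n_t, n_k = len(cursor_vel), len(targets)
    log_A = np.full((n_k, n_k), np.log((1 - stay_prob) / (n_k - 1)))
    np.fill_diagonal(log_A, np.log(stay_prob))
    # Emission score: cosine alignment between decoded velocity and target direction.
    dirs = targets[None, :, :] - cursor_pos[:, None, :]
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True) + 1e-9
    v = cursor_vel / (np.linalg.norm(cursor_vel, axis=-1, keepdims=True) + 1e-9)
    log_B = 2.0 * np.einsum("td,tkd->tk", v, dirs)
    # Standard Viterbi recursion with backtracking.
    delta, psi = log_B[0].copy(), np.zeros((n_t, n_k), dtype=int)
    for t in range(1, n_t):
        scores = delta[:, None] + log_A
        psi[t] = scores.argmax(0)
        delta = scores.max(0) + log_B[t]
    path = [int(delta.argmax())]
    for t in range(n_t - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return np.array(path[::-1])

# Toy example: the cursor heads toward target 1, then toward target 0.
targets = np.array([[1.0, 0.0], [0.0, 1.0]])
pos = np.zeros((20, 2))
vel = np.r_[np.tile([0.0, 1.0], (10, 1)), np.tile([1.0, 0.0], (10, 1))]
print(infer_targets(vel, pos, targets))   # mostly 1s followed by 0s
```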

17.
bioRxiv ; 2023 Apr 25.
Article in English | MEDLINE | ID: mdl-36711591

ABSTRACT

Speech brain-computer interfaces (BCIs) have the potential to restore rapid communication to people with paralysis by decoding neural activity evoked by attempted speaking movements into text or sound. Early demonstrations, while promising, have not yet achieved accuracies high enough for communication of unconstrained sentences from a large vocabulary. Here, we demonstrate the first speech-to-text BCI that records spiking activity from intracortical microelectrode arrays. Enabled by these high-resolution recordings, our study participant, who can no longer speak intelligibly due to amyotrophic lateral sclerosis (ALS), achieved a 9.1% word error rate on a 50-word vocabulary (2.7 times fewer errors than the prior state-of-the-art speech BCI2) and a 23.8% word error rate on a 125,000-word vocabulary (the first successful demonstration of large-vocabulary decoding). Our BCI decoded speech at 62 words per minute, which is 3.4 times faster than the prior record for any kind of BCI and begins to approach the speed of natural conversation (160 words per minute). Finally, we highlight two aspects of the neural code for speech that are encouraging for speech BCIs: spatially intermixed tuning to speech articulators that makes accurate decoding possible from only a small region of cortex, and a detailed articulatory representation of phonemes that persists years after paralysis. These results show a feasible path forward for using intracortical speech BCIs to restore rapid communication to people with paralysis who can no longer speak.

18.
eNeuro ; 8(1)2021.
Article in English | MEDLINE | ID: mdl-33495242

ABSTRACT

Intracortical brain-computer interfaces (iBCIs) have the potential to restore hand grasping and object interaction to individuals with tetraplegia. Optimal grasping and object interaction require simultaneous production of both force and grasp outputs. However, since overlapping neural populations are modulated by both parameters, grasp type could affect how well forces are decoded from motor cortex in a closed-loop force iBCI. Therefore, this work quantified the neural representation and offline decoding performance of discrete hand grasps and force levels in two human participants with tetraplegia. Participants attempted to produce three discrete forces (light, medium, hard) using up to five hand grasp configurations. A two-way Welch ANOVA was implemented on multiunit neural features to assess their modulation to force and grasp. Demixed principal component analysis (dPCA) was used to assess population-level tuning to force and grasp and to predict these parameters from neural activity. Three major findings emerged from this work: (1) force information was neurally represented and could be decoded across multiple hand grasps (and, in one participant, across attempted elbow extension as well); (2) grasp type affected force representation within multiunit neural features and offline force classification accuracy; and (3) grasp was classified more accurately and had greater population-level representation than force. These findings suggest that force and grasp have both independent and interacting representations within cortex, and that incorporating force control into real-time iBCI systems is feasible across multiple hand grasps if the decoder also accounts for grasp type.
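
A simplified stand-in for the offline decoding analysis: cross-validated linear discriminant classification of force and grasp from multiunit features (the paper additionally uses the two-way Welch ANOVA and dPCA named above). The data below are synthetic, constructed so that grasp is encoded more strongly than force, mirroring the third finding.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_feats = 300, 192
force = rng.integers(0, 3, n_trials)       # light / medium / hard
grasp = rng.integers(0, 5, n_trials)       # five grasp configurations

# Synthetic multiunit features: grasp carries a larger signal than force.
F = rng.normal(size=(3, n_feats))
G = rng.normal(size=(5, n_feats)) * 2.0
X = F[force] + G[grasp] + rng.normal(size=(n_trials, n_feats)) * 2.0

for name, y in [("force", force), ("grasp", grasp)]:
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
    print(f"{name} decoding accuracy: {acc:.2f}")
```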


Subjects
Motor Cortex, Hand, Hand Strength, Humans, Quadriplegia
19.
J Neural Eng ; 17(1): 016049, 2020 02 05.
Article in English | MEDLINE | ID: mdl-32023225

ABSTRACT

OBJECTIVE: Speech-related neural modulation was recently reported in 'arm/hand' area of human dorsal motor cortex that is used as a signal source for intracortical brain-computer interfaces (iBCIs). This raises the concern that speech-related modulation might deleteriously affect the decoding of arm movement intentions, for instance by affecting velocity command outputs. This study sought to clarify whether or not speaking would interfere with ongoing iBCI use. APPROACH: A participant in the BrainGate2 iBCI clinical trial used an iBCI to control a computer cursor; spoke short words in a stand-alone speech task; and spoke short words during ongoing iBCI use. We examined neural activity in all three behaviors and compared iBCI performance with and without concurrent speech. MAIN RESULTS: Dorsal motor cortex firing rates modulated strongly during stand-alone speech, but this activity was largely attenuated when speaking occurred during iBCI cursor control using attempted arm movements. 'Decoder-potent' projections of the attenuated speech-related neural activity were small, explaining why cursor task performance was similar between iBCI use with and without concurrent speaking. SIGNIFICANCE: These findings indicate that speaking does not directly interfere with iBCIs that decode attempted arm movements. This suggests that patients who are able to speak will be able to use motor cortical-driven computer interfaces or prostheses without needing to forgo speaking while using these devices.
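
The 'decoder-potent' projection described above can be sketched for a linear velocity decoder D: activity is split into the component in D's row space (which can move the cursor) and the orthogonal null component (which cannot). D and the activity vector below are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features = 192
D = rng.normal(size=(2, n_features))      # placeholder linear decoder: features -> (vx, vy)

P_potent = np.linalg.pinv(D) @ D          # projection onto the decoder's row space
speech_activity = rng.normal(size=n_features)
potent = P_potent @ speech_activity
null = speech_activity - potent

print("decoder output from full activity:", D @ speech_activity)
print("decoder output from null component:", D @ null)   # ~0: cannot move the cursor
print("decoder-potent fraction of variance:",
      np.sum(potent**2) / np.sum(speech_activity**2))
```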


Subjects
Brain-Computer Interfaces, Motor Cortex/physiology, Psychomotor Performance/physiology, Speech/physiology, Spinal Cord Injuries/rehabilitation, Aged, Brain-Computer Interfaces/trends, Cervical Vertebrae/injuries, Humans, Male, Movement/physiology, Pilot Projects, Spinal Cord Injuries/physiopathology
20.
J Neural Eng ; 17(6): 066007, 2020 11 25.
Article in English | MEDLINE | ID: mdl-33236720

ABSTRACT

OBJECTIVE: To evaluate the potential of intracortical electrode array signals for brain-computer interfaces (BCIs) to restore lost speech, we measured the performance of decoders trained to discriminate a comprehensive basis set of 39 English phonemes and to synthesize speech sounds via a neural pattern matching method. We decoded neural correlates of spoken-out-loud words in the 'hand knob' area of precentral gyrus, a step toward the eventual goal of decoding attempted speech from ventral speech areas in patients who are unable to speak. APPROACH: Neural and audio data were recorded while two BrainGate2 pilot clinical trial participants, each with two chronically-implanted 96-electrode arrays, spoke 420 different words that broadly sampled English phonemes. Phoneme onsets were identified from audio recordings, and their identities were then classified from neural features consisting of each electrode's binned action potential counts or high-frequency local field potential power. Speech synthesis was performed using the 'Brain-to-Speech' pattern matching method. We also examined two potential confounds specific to decoding overt speech: acoustic contamination of neural signals and systematic differences in labeling different phonemes' onset times. MAIN RESULTS: A linear decoder achieved up to 29.3% classification accuracy (chance = 6%) across 39 phonemes, while an RNN classifier achieved 33.9% accuracy. Parameter sweeps indicated that performance did not saturate when adding more electrodes or more training data, and that accuracy improved when utilizing time-varying structure in the data. Microphonic contamination and phoneme onset differences modestly increased decoding accuracy, but could be mitigated by acoustic artifact subtraction and using a neural speech onset marker, respectively. Speech synthesis achieved r = 0.523 correlation between true and reconstructed audio. SIGNIFICANCE: The ability to decode speech using intracortical electrode array signals from a nontraditional speech area suggests that placing electrode arrays in ventral speech areas is a promising direction for speech BCIs.
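
A minimal sketch of the linear classification step: multinomial logistic regression over 39 phoneme classes from binned spike-count features. The data are synthetic, and chance here is 1/39 rather than the frequency-weighted 6% reported above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_electrodes, n_bins, n_phonemes = 1500, 192, 5, 39
n_feats = n_electrodes * n_bins

# Synthetic binned spike counts with a weak phoneme-dependent offset.
labels = rng.integers(0, n_phonemes, n_trials)
tuning = rng.normal(size=(n_phonemes, n_feats)) * 0.4
X = rng.poisson(3.0, size=(n_trials, n_feats)) + tuning[labels]

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
print(f"accuracy: {clf.score(X_te, y_te):.2f} (chance ~ {1 / n_phonemes:.2f})")
```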


Subjects
Brain-Computer Interfaces, Speech, Electrodes, Hand, Humans, Language