Results 1 - 11 of 11
1.
Nat Hum Behav ; 8(6): 1136-1149, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38740984

ABSTRACT

Speech brain-machine interfaces (BMIs) translate brain signals into words or audio outputs, enabling communication for people who have lost their speech abilities due to disease or injury. While important advances in vocalized, attempted and mimed speech decoding have been achieved, results for internal speech decoding are sparse and have yet to achieve high functionality. Notably, it is still unclear from which brain areas internal speech can be decoded. Here, two participants with tetraplegia, implanted with microelectrode arrays in the supramarginal gyrus (SMG) and primary somatosensory cortex (S1), performed internal and vocalized speech of six words and two pseudowords. In both participants, we found significant neural representation of internal and vocalized speech in the SMG at both the single-neuron and population levels. From recorded population activity in the SMG, the internally spoken and vocalized words were significantly decodable. In an offline analysis, we achieved average decoding accuracies of 55% and 24% for the two participants, respectively (chance level 12.5%), and during an online internal speech BMI task, we averaged 79% and 23% accuracy, respectively. Evidence of shared neural representations between internal speech, word reading and vocalized speech processes was found in participant 1. SMG represented words as well as pseudowords, providing evidence for phonetic encoding. Furthermore, our decoder achieved high classification accuracy with multiple internal speech strategies (auditory imagination/visual imagination). Activity in S1 was modulated by vocalized but not internal speech in both participants, suggesting that no articulator movements of the vocal tract occurred during internal speech production. This work represents a proof of concept for a high-performance internal speech BMI.


Subject(s)
Brain-Computer Interfaces , Parietal Lobe , Speech , Humans , Speech/physiology , Male , Parietal Lobe/physiology , Parietal Lobe/physiopathology , Adult , Neurons/physiology , Quadriplegia/physiopathology , Female , Somatosensory Cortex/physiology , Somatosensory Cortex/physiopathology , Speech Perception/physiology
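The chance level quoted above (12.5%) follows directly from the eight-class design (six words plus two pseudowords). A minimal sketch of such an offline population-decoding analysis, using synthetic firing rates and a simple nearest-centroid classifier as an illustrative stand-in for the study's recorded SMG activity and decoder (all numbers and names here are invented for illustration, not taken from the paper):

```python
import random

random.seed(0)
N_CLASSES = 8          # six words + two pseudowords
N_NEURONS = 30         # synthetic "population" size
TRIALS_PER_CLASS = 20

# Class-specific mean firing rates (illustrative stand-in for
# recorded population activity; not data from the study).
means = [[random.uniform(5, 25) for _ in range(N_NEURONS)]
         for _ in range(N_CLASSES)]

def trial(c):
    """One noisy synthetic trial (firing-rate vector) for class c."""
    return [m + random.gauss(0, 4) for m in means[c]]

# Split synthetic trials into train/test halves per class.
train = {c: [trial(c) for _ in range(TRIALS_PER_CLASS // 2)]
         for c in range(N_CLASSES)}
test = [(c, trial(c)) for c in range(N_CLASSES)
        for _ in range(TRIALS_PER_CLASS // 2)]

# Nearest-centroid decoder: assign each test trial to the class whose
# mean training vector is closest in squared Euclidean distance.
centroids = {c: [sum(col) / len(col) for col in zip(*trs)]
             for c, trs in train.items()}

def decode(x):
    return min(centroids,
               key=lambda c: sum((a - b) ** 2
                                 for a, b in zip(x, centroids[c])))

accuracy = sum(decode(x) == c for c, x in test) / len(test)
chance = 1 / N_CLASSES
print(f"decoding accuracy: {accuracy:.2f} (chance {chance:.3f})")
```

The point of the sketch is the comparison against chance: with eight equally likely classes, any accuracy reliably above 1/8 indicates that the population activity carries class information.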
2.
bioRxiv ; 2024 May 14.
Article in English | MEDLINE | ID: mdl-38798438

ABSTRACT

Intra-cortical microstimulation (ICMS) is a technique to provide tactile sensations for a somatosensory brain-machine interface (BMI). A viable BMI must function within the rich, multisensory environment of the real world, but how ICMS is integrated with other sensory modalities is poorly understood. To investigate how ICMS percepts are integrated with visual information, ICMS and visual stimuli were delivered at varying times relative to one another. Both visual context and ICMS current amplitude were found to bias the qualitative experience of ICMS. In two tetraplegic participants, ICMS and visual stimuli were more likely to be experienced as occurring simultaneously when visual stimuli were more realistic, demonstrating an effect of visual context on the temporal binding window. The peak of the temporal binding window varied but was consistently offset from zero, suggesting that multisensory integration with ICMS can suffer from temporal misalignment. Recordings from primary somatosensory cortex (S1) during catch trials where visual stimuli were delivered without ICMS demonstrated that S1 represents visual information related to ICMS across visual contexts.
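The temporal binding window described above is typically estimated from simultaneity judgments collected across stimulus-onset asynchronies (SOAs); a peak offset from zero SOA indicates temporal misalignment between ICMS and vision. A toy sketch of that estimation, with entirely made-up judgment proportions and a crude probability-weighted peak estimate standing in for a proper curve fit:

```python
# Synthetic simultaneity-judgment data: proportion of trials judged
# "simultaneous" at each stimulus-onset asynchrony (SOA, in ms;
# positive = ICMS delivered after the visual stimulus).
# All values are illustrative, not data from the study.
soas = [-400, -300, -200, -100, 0, 100, 200, 300, 400]
p_simultaneous = [0.10, 0.25, 0.55, 0.85, 0.90, 0.95, 0.70, 0.35, 0.15]

# Estimate the window's peak as the probability-weighted mean SOA,
# a rough stand-in for fitting a Gaussian to the judgment curve.
peak = (sum(s * p for s, p in zip(soas, p_simultaneous))
        / sum(p_simultaneous))
print(f"estimated peak of temporal binding window: {peak:.1f} ms")
```

With these made-up proportions the estimated peak lands at a positive SOA rather than zero, mirroring the abstract's observation that the window's peak is consistently offset from physical simultaneity.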

3.
Cell Rep ; 42(4): 112312, 2023 04 25.
Article in English | MEDLINE | ID: mdl-37002922

ABSTRACT

Recent literature suggests that tactile events are represented in the primary somatosensory cortex (S1) beyond its long-established topography; in addition, the extent to which S1 is modulated by vision remains unclear. To better characterize S1, human electrophysiological data were recorded during touches to the forearm or finger. Conditions included visually observed physical touches, physical touches without vision, and visual touches without physical contact. Two major findings emerge from this dataset. First, vision strongly modulates S1 area 1, but only if there is a physical element to the touch, suggesting that passive touch observation is insufficient to elicit neural responses. Second, despite recording in a putative arm area of S1, neural activity represents both arm and finger stimuli during physical touches. Arm touches are encoded more strongly and specifically, supporting the idea that S1 encodes tactile events primarily through its topographic organization but also more generally, encompassing other areas of the body.


Subject(s)
Somatosensory Cortex , Touch Perception , Humans , Physical Stimulation , Somatosensory Cortex/physiology , Fingers , Touch Perception/physiology , Brain Mapping
4.
Neuron ; 110(11): 1777-1787.e3, 2022 06 01.
Article in English | MEDLINE | ID: mdl-35364014

ABSTRACT

The cortical grasp network encodes planning and execution of grasps and processes spoken and written aspects of language. High-level cortical areas within this network are attractive implant sites for brain-machine interfaces (BMIs). While a tetraplegic patient performed grasp motor imagery and vocalized speech, neural activity was recorded from the supramarginal gyrus (SMG), ventral premotor cortex (PMv), and somatosensory cortex (S1). In SMG and PMv, five imagined grasps were well represented by firing rates of neuronal populations during visual cue presentation. During motor imagery, these grasps were significantly decodable from all brain areas. During speech production, SMG encoded both spoken grasp types and the names of five colors. Whereas PMv neurons significantly modulated their activity during grasping, SMG's neural population broadly encoded features of both motor imagery and speech. Together, these results indicate that brain signals from high-level areas of the human cortex could be used for grasping and speech BMI applications.


Subject(s)
Brain-Computer Interfaces , Motor Cortex , Hand Strength/physiology , Humans , Motor Cortex/physiology , Parietal Lobe , Psychomotor Performance/physiology , Speech
5.
Elife ; 10, 2021 03 01.
Article in English | MEDLINE | ID: mdl-33647233

ABSTRACT

In the human posterior parietal cortex (PPC), single units encode high-dimensional information with partially mixed representations that enable small populations of neurons to encode many variables relevant to movement planning, execution, cognition, and perception. Here, we test whether a PPC neuronal population previously demonstrated to encode visual and motor information is similarly engaged in the somatosensory domain. We recorded neurons within the PPC of a human clinical trial participant during actual touch presentation and during a tactile imagery task. Neurons encoded actual touch at short latency with bilateral receptive fields, organized by body part, and covered all tested regions. The tactile imagery task evoked body part-specific responses that shared a neural substrate with actual touch. Our results are the first neuron-level evidence of touch encoding in human PPC and its cognitive engagement during a tactile imagery task, which may reflect semantic processing, attention, sensory anticipation, or imagined touch.


Subject(s)
Imagination/physiology , Parietal Lobe/physiology , Touch Perception/physiology , Cognition , Electrodes, Implanted , Female , Humans , Middle Aged , Neurons/physiology , Parietal Lobe/cytology , Quadriplegia
6.
J Neurosci ; 41(10): 2177-2185, 2021 03 10.
Article in English | MEDLINE | ID: mdl-33483431

ABSTRACT

Intracortical microstimulation (ICMS) in human primary somatosensory cortex (S1) has been used to successfully evoke naturalistic sensations. However, the neurophysiological mechanisms underlying the evoked sensations remain unknown. To understand how specific stimulation parameters elicit certain sensations, we must first understand how those sensations are represented in the brain. In this study, we recorded from intracortical microelectrode arrays implanted in S1, premotor cortex, and posterior parietal cortex of a male human participant performing a somatosensory imagery task. The sensations imagined were those previously elicited by ICMS of S1, via the same array in the same participant. In both spike and local field potential recordings, features of the neural signal could be used to classify different imagined sensations, and these features were stable over time. The sensorimotor cortices encoded the imagined sensation only during the imagery task, while posterior parietal cortex encoded the sensations starting with cue presentation. These findings demonstrate that different aspects of the sensory experience can be individually decoded from intracortically recorded human neural signals across the cortical sensory network. Activity underlying these unique sensory representations may inform the stimulation parameters for precisely eliciting specific sensations via ICMS in future work.

SIGNIFICANCE STATEMENT: Electrical stimulation of human cortex is increasingly common for providing feedback in neural devices. Understanding the relationship between naturally evoked and artificially evoked neurophysiology for the same sensations will be important in advancing such devices. Here, we investigate neural activity in human primary somatosensory, premotor, and parietal cortices during somatosensory imagery. The sensations imagined were those previously elicited during intracortical microstimulation (ICMS) of the same somatosensory electrode array. We elucidate the neural features during somatosensory imagery that significantly encode different aspects of individual sensations and demonstrate feature stability over almost a year. The correspondence between neurophysiology elicited with or without stimulation for the same sensations will inform methods to deliver more precise feedback through stimulation in the future.


Subject(s)
Electric Stimulation/methods , Imagination/physiology , Motor Cortex/physiology , Parietal Lobe/physiology , Somatosensory Cortex/physiology , Adult , Electrocorticography , Humans , Male , Neurophysiology/methods , Spinal Cord Injuries/physiopathology
7.
Commun Biol ; 3(1): 757, 2020 12 11.
Article in English | MEDLINE | ID: mdl-33311578

ABSTRACT

Classical systems neuroscience positions primary sensory areas as early feed-forward processing stations for refining incoming sensory information. This view may oversimplify their role, given their extensive bidirectional connectivity with multimodal cortical and subcortical regions. Here we show that single units in human primary somatosensory cortex encode imagined reaches in a cognitive motor task, but not other sensory-motor variables such as movement plans or imagined arm position. A population reference-frame analysis demonstrates coding relative to the cued starting hand location, suggesting that imagined reaching movements are encoded relative to imagined limb position. These results imply a potential role for primary somatosensory cortex in cognitive imagery and in motor production in the absence of sensation or expected sensation, and suggest that somatosensory cortex can provide control signals for future neural prosthetic systems.


Subject(s)
Imagination , Sensation , Somatosensory Cortex/physiology , Adult , Animals , Brain Mapping , Brain Waves , Cognition , Humans , Magnetic Resonance Imaging/methods , Male , Motor Cortex/diagnostic imaging , Motor Cortex/physiology , Neurons/physiology , Somatosensory Cortex/diagnostic imaging
8.
Neuron ; 102(3): 694-705.e3, 2019 05 08.
Article in English | MEDLINE | ID: mdl-30853300

ABSTRACT

Although animal studies have provided significant insights into the neural basis of learning and adaptation, they often cannot dissociate between different learning mechanisms because of the lack of verbal communication. To overcome this limitation, we examined the mechanisms and limits of learning in a human intracortical brain-machine interface (BMI) paradigm. A tetraplegic participant controlled a 2D computer cursor by modulating single-neuron activity in the anterior intraparietal area (AIP). When the neuron-to-movement mapping was perturbed, the participant learned to modulate the activity of the recorded neurons to counter the perturbations by adopting a target re-aiming strategy. However, when no cognitive strategy was adequate to produce correct responses, AIP failed to adapt to the perturbations. These findings suggest that learning is constrained by the pre-existing neuronal structure, although it is possible that AIP needs more training time to learn to generate novel activity patterns when cognitive re-adaptation fails to solve the perturbations.


Subject(s)
Brain-Computer Interfaces , Cognition/physiology , Learning/physiology , Neurons/physiology , Parietal Lobe/cytology , Quadriplegia/rehabilitation , Adaptation, Physiological/physiology , Cervical Vertebrae , Female , Humans , Middle Aged , Parietal Lobe/physiology , Spinal Cord Injuries/rehabilitation
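The target re-aiming strategy described above can be illustrated with a toy visuomotor rotation: if the neuron-to-cursor mapping is perturbed by a fixed rotation, aiming at a counter-rotated target restores accurate cursor movement. Everything below is a hypothetical construction for illustration, not the study's actual decoder or perturbation:

```python
import math

def rotate(v, deg):
    """Rotate a 2D vector by the given angle in degrees."""
    r = math.radians(deg)
    return (v[0] * math.cos(r) - v[1] * math.sin(r),
            v[0] * math.sin(r) + v[1] * math.cos(r))

# Toy decoder: the intended movement direction maps straight to cursor
# velocity; a perturbation then rotates the decoder's output.
PERTURBATION_DEG = 30
target = (1.0, 0.0)

# Naive aiming: the perturbed mapping drives the cursor 30 deg off-target.
off_target = rotate(target, PERTURBATION_DEG)

# Re-aiming strategy: aim at the target counter-rotated by -30 deg, so
# the perturbation brings the cursor back onto the true target.
re_aim = rotate(target, -PERTURBATION_DEG)
corrected = rotate(re_aim, PERTURBATION_DEG)

print("naive:", off_target, "re-aimed:", corrected)
```

This kind of cognitive correction works precisely because it requires no new neural activity patterns, only re-targeting of existing ones, which is consistent with the abstract's finding that adaptation fails once no re-aiming strategy can solve the perturbation.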
9.
Elife ; 7, 2018 04 10.
Article in English | MEDLINE | ID: mdl-29633714

ABSTRACT

Pioneering work with nonhuman primates and recent human studies established intracortical microstimulation (ICMS) in primary somatosensory cortex (S1) as a method of inducing discriminable artificial sensation. However, these artificial sensations do not yet provide the breadth of cutaneous and proprioceptive percepts available through natural stimulation. In a tetraplegic human with two microelectrode arrays implanted in S1, we report replicable elicitations of sensations in both the cutaneous and proprioceptive modalities localized to the contralateral arm, dependent on both amplitude and frequency of stimulation. Furthermore, we found a subset of electrodes that exhibited multimodal properties, and that proprioceptive percepts on these electrodes were associated with higher amplitudes, irrespective of the frequency. These novel results demonstrate the ability to provide naturalistic percepts through ICMS that can more closely mimic the body's natural physiological capabilities. Furthermore, delivering both cutaneous and proprioceptive sensations through artificial somatosensory feedback could improve performance and embodiment in brain-machine interfaces.


Subject(s)
Electric Stimulation/instrumentation , Electrodes, Implanted , Hand/physiology , Microelectrodes , Proprioception , Skin/physiopathology , Somatosensory Cortex/physiology , Brain-Computer Interfaces , Evoked Potentials, Somatosensory , Humans , Skin/innervation , Touch Perception
10.
J Neurosci ; 35(46): 15466-76, 2015 Nov 18.
Article in English | MEDLINE | ID: mdl-26586832

ABSTRACT

Humans shape their hands to grasp, to manipulate objects, and to communicate. From nonhuman primate studies, we know that visual and motor properties of grasps can be derived from cells in the posterior parietal cortex (PPC). Are non-grasp-related hand shapes in humans represented similarly? Here we show for the first time how single neurons in the PPC of humans are selective for particular imagined hand shapes independent of graspable objects. We find that motor imagery to shape the hand can be successfully decoded from the PPC by implementing a version of the popular Rock-Paper-Scissors game and its extension Rock-Paper-Scissors-Lizard-Spock. By presenting visual and auditory cues simultaneously, we can discriminate motor imagery from visual information and show differences in auditory and visual information processing in the PPC. These results also demonstrate that neural signals from human PPC can be used to drive a dexterous cortical neuroprosthesis.

SIGNIFICANCE STATEMENT: This study shows for the first time hand-shape decoding from human PPC. Unlike nonhuman primate studies in which the visual stimuli are the objects to be grasped, the visually cued hand shapes that we use are independent of the stimuli. Furthermore, we show that distinct neuronal populations are activated by the visual cue and by the imagined hand shape. Additionally, we found that auditory and visual stimuli that cue the same hand shape are processed differently in the PPC. Early in a trial, only the visual stimuli, and not the auditory stimuli, can be decoded. During later stages of a trial, the motor imagery for a particular hand shape can be decoded for both modalities.


Subject(s)
Brain Mapping , Hand Strength/physiology , Imagination/physiology , Parietal Lobe/physiology , Acoustic Stimulation , Cues , Humans , Image Processing, Computer-Assisted , Magnetic Resonance Imaging , Models, Neurological , Movement , Neurons/physiology , Oxygen/blood , Parietal Lobe/blood supply , Parietal Lobe/cytology , Photic Stimulation
11.
Science ; 348(6237): 906-10, 2015 May 22.
Article in English | MEDLINE | ID: mdl-25999506

ABSTRACT

Nonhuman primate and human studies have suggested that populations of neurons in the posterior parietal cortex (PPC) may represent high-level aspects of action planning that can be used to control external devices as part of a brain-machine interface. However, there is no direct neuron-recording evidence that human PPC is involved in action planning, and the suitability of these signals for neuroprosthetic control has not been tested. We recorded neural population activity with arrays of microelectrodes implanted in the PPC of a tetraplegic subject. Motor imagery could be decoded from these neural populations, including imagined goals, trajectories, and types of movement. These findings indicate that the PPC of humans represents high-level, cognitive aspects of action and that the PPC can be a rich source for cognitive control signals for neural prosthetics that assist paralyzed patients.


Subject(s)
Functional Neuroimaging/methods , Neural Prostheses , Neurons/physiology , Parietal Lobe/physiopathology , Quadriplegia/physiopathology , Quadriplegia/therapy , Brain-Computer Interfaces , Cognition , Electrodes, Implanted , Humans , Microelectrodes , Motor Activity , Movement