Results 1 - 8 of 8

1.
Sci Rep ; 14(1): 3433, 2024 02 10.
Article in English | MEDLINE | ID: mdl-38341457

ABSTRACT

Limitations in chronic pain therapies necessitate novel interventions that are effective, accessible, and safe. Brain-computer interfaces (BCIs) provide a promising modality for targeting the neuropathology underlying chronic pain by converting recorded neural activity into perceivable outputs. Recent evidence suggests that increased frontal theta power (4-7 Hz) reflects relief from chronic and acute pain. Further studies have suggested that vibrotactile stimulation decreases pain intensity in experimental and clinical models. The objective of this longitudinal, non-randomized, open-label pilot study was to reinforce frontal theta activity in six patients with chronic upper extremity pain using a novel vibrotactile neurofeedback BCI system. Patients increased their BCI performance, reflecting thought-driven control of neurofeedback, and showed a significant decrease in pain severity (1.29 ± 0.25 MAD, p = 0.03, q = 0.05) and pain interference (1.79 ± 1.10 MAD, p = 0.03, q = 0.05) scores without any adverse events. Pain relief correlated significantly with frontal theta modulation. These findings highlight the potential of BCI-mediated cortico-sensory coupling of frontal theta with vibrotactile stimulation for alleviating chronic pain.
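
As a rough illustration of the signal-processing loop described above (not the authors' implementation), the sketch below estimates frontal theta (4-7 Hz) power from a short EEG window and maps it to a vibrotactile feedback level; the sampling rate, baseline handling, and scaling are assumptions.

```python
# Minimal sketch of a theta-power neurofeedback mapping (illustrative only).
# Assumes a NumPy array of EEG samples and a 256 Hz sampling rate.
import numpy as np
from scipy.signal import welch

FS = 256            # assumed sampling rate (Hz)
THETA = (4.0, 7.0)  # frontal theta band from the abstract

def theta_power(window: np.ndarray, fs: int = FS) -> float:
    """Mean power spectral density in the theta band for one EEG window."""
    freqs, psd = welch(window, fs=fs, nperseg=min(len(window), fs))
    band = (freqs >= THETA[0]) & (freqs <= THETA[1])
    return float(psd[band].mean())

def feedback_level(power: float, baseline: float) -> float:
    """Map theta power relative to a baseline onto a 0-1 vibrotactile intensity."""
    ratio = power / max(baseline, 1e-12)
    return float(np.clip(ratio - 1.0, 0.0, 1.0))  # arbitrary scaling (assumption)

# Example with synthetic data: 2 s of noise plus a 6 Hz component.
t = np.arange(0, 2.0, 1.0 / FS)
eeg = 0.5 * np.sin(2 * np.pi * 6 * t) + np.random.randn(t.size) * 0.1
print(feedback_level(theta_power(eeg), baseline=0.01))
```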


Subject(s)
Brain-Computer Interfaces , Chronic Pain , Neurofeedback , Humans , Chronic Pain/therapy , Electroencephalography , Pilot Projects , Longitudinal Studies , Non-Randomized Controlled Trials as Topic
2.
PLoS Biol ; 21(8): e3002176, 2023 08.
Article in English | MEDLINE | ID: mdl-37582062

ABSTRACT

Music is core to human experience, yet the precise neural dynamics underlying music perception remain unknown. We analyzed a unique intracranial electroencephalography (iEEG) dataset of 29 patients who listened to a Pink Floyd song and applied a stimulus reconstruction approach previously used in the speech domain. We successfully reconstructed a recognizable song from direct neural recordings and quantified the impact of different factors on decoding accuracy. Combining encoding and decoding analyses, we found a right-hemisphere dominance for music perception with a primary role of the superior temporal gyrus (STG), evidenced a new STG subregion tuned to musical rhythm, and defined an anterior-posterior STG organization exhibiting sustained and onset responses to musical elements. Our findings show the feasibility of applying predictive modeling on short datasets acquired in single patients, paving the way for adding musical elements to brain-computer interface (BCI) applications.
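
Stimulus-reconstruction decoders of this kind are commonly formulated as time-lagged linear regressions from neural features to an audio representation. The sketch below shows that generic formulation on synthetic data with ridge regularization; it is an assumption-laden stand-in, not the model used in the study.

```python
# Generic time-lagged linear stimulus-reconstruction sketch (illustrative).
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_times, n_electrodes, n_freq_bins, n_lags = 2000, 29, 32, 10

neural = rng.standard_normal((n_times, n_electrodes))      # e.g. high-gamma envelopes
spectrogram = rng.standard_normal((n_times, n_freq_bins))  # target audio representation

def add_lags(X: np.ndarray, n_lags: int) -> np.ndarray:
    """Stack lagged copies of each feature so the decoder sees a short history."""
    lagged = [np.roll(X, lag, axis=0) for lag in range(n_lags)]
    return np.concatenate(lagged, axis=1)[n_lags:]  # drop wrap-around rows

X = add_lags(neural, n_lags)
y = spectrogram[n_lags:]

decoder = Ridge(alpha=1.0).fit(X[:1500], y[:1500])
reconstruction = decoder.predict(X[1500:])
corr = np.corrcoef(reconstruction.ravel(), y[1500:].ravel())[0, 1]
print(f"held-out reconstruction correlation: {corr:.3f}")
```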


Subject(s)
Auditory Cortex , Music , Humans , Auditory Cortex/physiology , Brain Mapping , Auditory Perception/physiology , Temporal Lobe/physiology , Acoustic Stimulation
3.
Cereb Cortex ; 33(14): 8837-8848, 2023 07 05.
Article in English | MEDLINE | ID: mdl-37280730

ABSTRACT

Context modulates sensory neural activations, enhancing perceptual and behavioral performance and reducing prediction errors. However, when and where these high-level expectations act on sensory processing remains unclear. Here, we isolate the effect of expectation in the absence of any auditory evoked activity by assessing the response to omitted expected sounds. Electrocorticographic signals were recorded directly from subdural electrode grids placed over the superior temporal gyrus (STG). Subjects listened to a predictable sequence of syllables, some of which were infrequently omitted. We found high-frequency band activity (HFA, 70-170 Hz) in response to omissions, which overlapped with a posterior subset of auditory-active electrodes in STG. The identity of heard syllables could be reliably decoded from STG activity, but the identity of the omitted stimulus could not. Both omission- and target-detection responses were also observed in the prefrontal cortex. We propose that the posterior STG is central for implementing predictions in the auditory environment. HFA omission responses in this region appear to index mismatch-signaling or salience detection processes.
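
High-frequency band activity is typically extracted by band-pass filtering and taking the analytic amplitude, then epoching around events of interest. The sketch below illustrates that generic pipeline for omission markers; the sampling rate, filter order, and epoch windows are assumptions rather than the study's parameters.

```python
# Sketch of high-frequency band activity (HFA, 70-170 Hz) extraction and
# epoching around omission events (illustrative; parameters are assumptions).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000  # assumed ECoG sampling rate (Hz)

def hfa_envelope(signal: np.ndarray, fs: int = FS) -> np.ndarray:
    """Band-pass 70-170 Hz and return the analytic amplitude envelope."""
    b, a = butter(4, [70, 170], btype="bandpass", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, signal)))

def epoch(envelope: np.ndarray, event_samples, pre=0.2, post=0.6, fs=FS):
    """Cut fixed-length windows around each event (e.g. an omitted syllable)."""
    pre_s, post_s = int(pre * fs), int(post * fs)
    return np.stack([envelope[s - pre_s:s + post_s] for s in event_samples])

# Synthetic example: white noise with omission markers every 2 s.
ecog = np.random.randn(20 * FS)
omission_onsets = np.arange(2 * FS, 18 * FS, 2 * FS)
epochs = epoch(hfa_envelope(ecog), omission_onsets)
print(epochs.shape)  # (n_events, samples per epoch)
```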


Subject(s)
Auditory Cortex , Humans , Auditory Cortex/physiology , Wernicke Area , Acoustic Stimulation , Evoked Potentials, Auditory/physiology , Brain Mapping , Auditory Perception/physiology
4.
Curr Biol ; 32(7): 1470-1484.e12, 2022 04 11.
Article in English | MEDLINE | ID: mdl-35196507

ABSTRACT

How is music represented in the brain? While neuroimaging has revealed some spatial segregation between responses to music and to other sounds, little is known about the neural code for music itself. To address this question, we developed a method to infer canonical response components of human auditory cortex using intracranial responses to natural sounds, and further used the superior coverage of fMRI to map their spatial distribution. The inferred components replicated many prior findings, including distinct neural selectivity for speech and music, but also revealed a novel component that responded nearly exclusively to music with singing. Song selectivity was not explainable by standard acoustic features, was located near speech- and music-selective responses, and was also evident in individual electrodes. These results suggest that representations of music are fractionated into subpopulations selective for different types of music, one of which is specialized for the analysis of song.
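
The component-inference method itself is specific to the paper, but the general idea of decomposing an electrode-by-sound response matrix into a small set of shared response components can be illustrated with an off-the-shelf factorization. The sketch below uses non-negative matrix factorization purely as a stand-in; the matrix sizes and the choice of NMF are assumptions, not the authors' algorithm.

```python
# Generic decomposition of an electrode x sound response matrix into a small
# number of shared response components (a stand-in for the paper's method).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
n_electrodes, n_sounds, n_components = 200, 165, 10

# Non-negative response magnitudes of each electrode to each natural sound.
responses = rng.gamma(shape=2.0, scale=1.0, size=(n_electrodes, n_sounds))

model = NMF(n_components=n_components, init="nndsvda", max_iter=500, random_state=0)
electrode_weights = model.fit_transform(responses)  # (electrodes x components)
component_profiles = model.components_              # (components x sounds)

print(electrode_weights.shape, component_profiles.shape)
```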


Subject(s)
Auditory Cortex , Music , Speech Perception , Acoustic Stimulation/methods , Auditory Cortex/physiology , Auditory Perception/physiology , Brain Mapping/methods , Humans , Speech/physiology , Speech Perception/physiology
5.
Neuroimage ; 243: 118498, 2021 11.
Article in English | MEDLINE | ID: mdl-34428572

ABSTRACT

Despite significant interest in the neural underpinnings of behavioral variability, little light has been shed on the cortical mechanism underlying the failure to respond to perceptual-level stimuli. We hypothesized that cortical activity resulting from perceptual-level stimuli is sensitive to moment-to-moment fluctuations in cortical excitability, and thus may not suffice to produce a behavioral response. We tested this hypothesis using electrocorticographic recordings to follow the propagation of cortical activity in six human subjects who responded to perceptual-level auditory stimuli. Here we show that for presentations that did not result in a behavioral response, the likelihood of cortical activity decreased from auditory cortex to motor cortex and was related to reduced local cortical excitability. Cortical excitability was quantified as the instantaneous voltage during a short window prior to cortical activity onset. Therefore, when humans are presented with an auditory stimulus close to the perceptual threshold, moment-by-moment fluctuations in cortical excitability determine whether cortical responses to sensory stimulation successfully connect auditory input to a resultant behavioral response.
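
One simple way to operationalize the excitability measure described above is to average the voltage in a short window before activity onset and compare trials with and without a behavioral response. The sketch below does this on synthetic data; the window length, onset definition, and statistical test are assumptions.

```python
# Sketch: quantify prestimulus "excitability" as mean voltage in a short
# window before cortical activity onset and compare responded vs. non-responded
# trials (window length and test are assumptions, not the study's method).
import numpy as np
from scipy.stats import mannwhitneyu

FS = 1000          # assumed sampling rate (Hz)
PRE_WINDOW = 0.1   # 100 ms window before cortical activity onset

def prestim_voltage(trial: np.ndarray, onset_sample: int, fs: int = FS) -> float:
    start = onset_sample - int(PRE_WINDOW * fs)
    return float(trial[start:onset_sample].mean())

rng = np.random.default_rng(2)
trials = rng.standard_normal((100, 2 * FS))  # 100 trials, 2 s each
responded = rng.random(100) > 0.5            # behavioral response flag
onset = FS                                   # assume activity onset at 1 s

excitability = np.array([prestim_voltage(tr, onset) for tr in trials])
stat, p = mannwhitneyu(excitability[responded], excitability[~responded])
print(f"Mann-Whitney U p-value: {p:.3f}")
```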


Subject(s)
Cortical Excitability/physiology , Acoustic Stimulation , Adult , Aged , Alpha Rhythm/physiology , Auditory Cortex/physiology , Brain Mapping/methods , Electrocorticography/methods , Female , Humans , Male , Middle Aged
6.
Sci Rep ; 6: 25803, 2016 05 11.
Article in English | MEDLINE | ID: mdl-27165452

ABSTRACT

People who cannot communicate due to neurological disorders would benefit from an internal speech decoder. Here, we showed the ability to classify individual words during imagined speech from electrocorticographic signals. In a word imagery task, we used high gamma (70-150 Hz) time features with a support vector machine (SVM) model to classify individual words from a pair of words. To account for temporal irregularities during speech production, we introduced a non-linear time alignment into the SVM kernel. Classification accuracy reached 88% in a two-class classification framework (50% chance level), and average classification accuracy across fifteen word pairs was significant across five subjects (mean = 58%; p < 0.05). We also compared classification accuracy between imagined speech, overt speech, and listening. As predicted, higher classification accuracy was obtained in the listening and overt speech conditions (mean = 89% and 86%, respectively; p < 0.0001), where speech stimuli were directly presented. The results provide evidence for a neural representation of imagined words in the temporal lobe, frontal lobe, and sensorimotor cortex, consistent with previous findings in speech perception and production. These data represent a proof-of-concept study for basic decoding of speech imagery and delineate a number of key challenges to the use of speech-imagery neural representations for clinical applications.
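
A time-alignment-aware kernel of the kind described can be illustrated by converting a dynamic-time-warping (DTW) distance between trial time courses into a similarity matrix and feeding it to a precomputed-kernel SVM. The sketch below is illustrative and not the paper's exact kernel; note that DTW-derived kernels are not guaranteed to be positive semi-definite.

```python
# Sketch of classifying word-pair trials with an SVM whose kernel is built
# from a dynamic-time-warping (DTW) distance between high-gamma time courses.
import numpy as np
from sklearn.svm import SVC

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(len(a)*len(b)) DTW on 1-D feature sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def dtw_kernel(X: np.ndarray, Y: np.ndarray, gamma: float = 0.1) -> np.ndarray:
    """Turn pairwise DTW distances into a similarity matrix."""
    return np.exp(-gamma * np.array([[dtw_distance(x, y) for y in Y] for x in X]))

rng = np.random.default_rng(3)
X_train = rng.standard_normal((40, 50))  # 40 trials, 50 high-gamma time samples
y_train = np.repeat([0, 1], 20)          # two words
X_test = rng.standard_normal((10, 50))

clf = SVC(kernel="precomputed").fit(dtw_kernel(X_train, X_train), y_train)
pred = clf.predict(dtw_kernel(X_test, X_train))
print(pred)
```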


Subject(s)
Brain Mapping , Brain/physiology , Electroencephalography , Imagination , Speech , Vocabulary , Acoustic Stimulation , Auditory Perception/physiology , Discrimination, Psychological , Electrodes , Gamma Rhythm/physiology , Humans , ROC Curve , Time Factors
7.
Neuroimage ; 97: 188-95, 2014 Aug 15.
Article in English | MEDLINE | ID: mdl-24768933

ABSTRACT

Neuroimaging approaches have implicated multiple brain sites in music perception, including the posterior part of the superior temporal gyrus and adjacent perisylvian areas. However, the detailed spatial and temporal relationships of the neural signals that support auditory processing are largely unknown. In this study, we applied a novel inter-subject analysis approach to electrophysiological signals recorded from the surface of the brain (electrocorticography, ECoG) in ten human subjects. This approach allowed us to reliably identify those ECoG features that were related to the processing of a complex auditory stimulus (i.e., a continuous piece of music) and to investigate their spatial, temporal, and causal relationships. Our results identified stimulus-related modulations in the alpha (8-12 Hz) and high gamma (70-110 Hz) bands at neuroanatomical locations implicated in auditory processing. Specifically, we identified stimulus-related ECoG modulations in the alpha band in areas adjacent to primary auditory cortex, which are known to receive afferent auditory projections from the thalamus (80 of a total of 15,107 tested sites). In contrast, we identified stimulus-related ECoG modulations in the high gamma band not only in areas close to primary auditory cortex but also in other perisylvian areas known to be involved in higher-order auditory processing, and in superior premotor cortex (412/15,107 sites). Across all implicated areas, modulations in the high gamma band preceded those in the alpha band by 280 ms, and activity in the high gamma band causally predicted alpha activity, but not vice versa (Granger causality, p < 1e-8). Additionally, detailed analyses using Granger causality identified causal relationships of high gamma activity between distinct locations in early auditory pathways within the superior temporal gyrus (STG) and posterior STG, between posterior STG and inferior frontal cortex, and between STG and premotor cortex. Evidence suggests that these relationships reflect direct cortico-cortical connections rather than common driving input from subcortical structures such as the thalamus. In summary, our inter-subject analyses defined the spatial and temporal relationships between music-related brain activity in the alpha and high gamma bands. They provide experimental evidence supporting current theories about the putative mechanisms of alpha and gamma activity, i.e., that they reflect thalamo-cortical interactions and local cortical neural activity, respectively. The results are also in agreement with existing functional models of auditory processing.
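
The causal part of the analysis can be illustrated with a standard bivariate Granger-causality test between band-limited envelopes. The sketch below runs such a test on synthetic envelopes in which high gamma leads alpha; the envelope sampling rate and lag order are assumptions, and this is a generic test rather than the study's full causal analysis.

```python
# Sketch of a bivariate Granger-causality test between a high-gamma envelope
# and an alpha envelope (synthetic data; parameters are assumptions).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(4)
n = 2000
high_gamma = rng.standard_normal(n)

# Construct an alpha envelope partly driven by high gamma ~280 ms earlier
# (28 samples at an assumed 100 Hz envelope sampling rate).
alpha = np.roll(high_gamma, 28) * 0.5 + rng.standard_normal(n) * 0.5

# Column order matters: the test asks whether column 2 Granger-causes column 1.
data = np.column_stack([alpha, high_gamma])
results = grangercausalitytests(data, maxlag=30)
p_value = results[28][0]["ssr_ftest"][1]
print(f"high gamma -> alpha, lag 28: p = {p_value:.2e}")
```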


Subject(s)
Alpha Rhythm/physiology , Auditory Perception/physiology , Electroencephalography/methods , Gamma Rhythm/physiology , Acoustic Stimulation , Adolescent , Adult , Brain Mapping , Causality , Epilepsy/psychology , Female , Humans , Individuality , Male , Middle Aged , Music/psychology , Young Adult
8.
Neuroimage ; 61(4): 841-8, 2012 Jul 16.
Article in English | MEDLINE | ID: mdl-22537600

ABSTRACT

Previous studies demonstrated that brain signals encode information about specific features of simple auditory stimuli or of general aspects of natural auditory stimuli. How brain signals represent the time course of specific features in natural auditory stimuli is not well understood. In this study, we show in eight human subjects that signals recorded from the surface of the brain (electrocorticography (ECoG)) encode information about the sound intensity of music. ECoG activity in the high gamma band recorded from the posterior part of the superior temporal gyrus as well as from an isolated area in the precentral gyrus was observed to be highly correlated with the sound intensity of music. These results not only confirm the role of auditory cortices in auditory processing but also point to an important role of premotor and motor cortices. They also encourage the use of ECoG activity to study more complex acoustic features of simple or natural auditory stimuli.
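
Correlating a high-gamma envelope with a stimulus intensity envelope is straightforward to sketch. The example below does so on synthetic signals; the band edges, filter settings, and common sampling rate are assumptions, not the study's parameters.

```python
# Sketch: correlate an ECoG high-gamma envelope with the sound-intensity
# envelope of a music stimulus (synthetic signals; parameters are assumptions).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import pearsonr

FS = 1000  # assumed common sampling rate for audio envelope and ECoG (Hz)

def band_envelope(x: np.ndarray, low: float, high: float, fs: int = FS) -> np.ndarray:
    """Band-pass filter and return the analytic amplitude envelope."""
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, x)))

rng = np.random.default_rng(5)
t = np.arange(0, 30, 1 / FS)
intensity = 1 + 0.5 * np.sin(2 * np.pi * 0.5 * t)  # slow loudness fluctuation
ecog = band_envelope(rng.standard_normal(t.size) * intensity, 70, 110)

r, p = pearsonr(ecog, intensity)
print(f"r = {r:.2f}, p = {p:.1e}")
```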


Subject(s)
Auditory Perception/physiology , Brain Mapping , Frontal Lobe/physiology , Music , Temporal Lobe/physiology , Acoustic Stimulation , Adult , Electrodes, Implanted , Electroencephalography , Epilepsy/physiopathology , Female , Humans , Male , Middle Aged , Young Adult