Results 1 - 20 of 33
1.
Cereb Cortex ; 34(7), 2024 Jul 03.
Article in English | MEDLINE | ID: mdl-39051660

ABSTRACT

What is the function of auditory hemispheric asymmetry? We propose that the identification of sound sources relies on the asymmetric processing of two complementary and perceptually relevant acoustic invariants: actions and objects. In a large dataset of environmental sounds, we observed that temporal and spectral modulations display only weak covariation. We then synthesized auditory stimuli by simulating various actions (frictions) occurring on different objects (solid surfaces). Behaviorally, discrimination of actions relies on temporal modulations, while discrimination of objects relies on spectral modulations. Functional magnetic resonance imaging data showed that actions and objects are decoded in the left and right hemispheres, respectively, in bilateral superior temporal and left inferior frontal regions. This asymmetry reflects a generic differential processing-through differential neural sensitivity to temporal and spectral modulations present in environmental sounds-that supports the efficient categorization of actions and objects. These results support an ecologically valid framework of the functional role of auditory brain asymmetry.


Subject(s)
Acoustic Stimulation , Auditory Perception , Functional Laterality , Magnetic Resonance Imaging , Humans , Male , Female , Magnetic Resonance Imaging/methods , Functional Laterality/physiology , Adult , Acoustic Stimulation/methods , Auditory Perception/physiology , Young Adult , Brain Mapping/methods , Auditory Cortex/physiology , Auditory Cortex/diagnostic imaging
2.
Elife ; 13, 2024 Jul 22.
Article in English | MEDLINE | ID: mdl-39038076

ABSTRACT

To what extent do speech and music processing rely on domain-specific and domain-general neural networks? Using whole-brain intracranial EEG recordings in 18 epilepsy patients listening to natural, continuous speech or music, we investigated the presence of frequency-specific and network-level brain activity. We combined it with a statistical approach in which a clear operational distinction is made between shared, preferred, and domain-selective neural responses. We show that the majority of focal and network-level neural activity is shared between speech and music processing. Our data also reveal an absence of anatomical regional selectivity. Instead, domain-selective neural responses are restricted to distributed and frequency-specific coherent oscillations, typical of spectral fingerprints. Our work highlights the importance of considering natural stimuli and brain dynamics in their full complexity to map cognitive and brain functions.


Subject(s)
Music , Humans , Male , Female , Adult , Nerve Net/physiology , Speech/physiology , Auditory Perception/physiology , Epilepsy/physiopathology , Young Adult , Electroencephalography , Cerebral Cortex/physiology , Electrocorticography , Speech Perception/physiology , Middle Aged , Brain Mapping
3.
Adv Exp Med Biol ; 1455: 199-213, 2024.
Article in English | MEDLINE | ID: mdl-38918353

ABSTRACT

Timing and motor function share neural circuits and dynamics, which underpin their close and synergistic relationship. For instance, the temporal predictability of a sensory event optimizes motor responses to that event. Knowing when an event is likely to occur lowers response thresholds, leading to faster and more efficient motor behavior, though in situations of response conflict it can also induce impulsive and inappropriate responding. In turn, through a process of active sensing, coupling action to temporally predictable sensory input enhances perceptual processing. Action not only hones perception of the event's onset or duration, but also boosts sensory processing of its non-temporal features such as pitch or shape. The effects of temporal predictability on motor behavior and sensory processing involve motor and left parietal cortices and are mediated by changes in delta and beta oscillations in motor areas of the brain.


Subject(s)
Motor Cortex , Humans , Motor Cortex/physiology , Psychomotor Performance/physiology , Time Perception/physiology , Parietal Lobe/physiology , Animals , Motor Activity/physiology
4.
Cognition ; 248: 105793, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38636164

ABSTRACT

Speech comprehension is enhanced when preceded (or accompanied) by a congruent rhythmic prime reflecting the metrical sentence structure. Although these phenomena have been described for auditory and motor primes separately, their respective and synergistic contribution has not been addressed. In this experiment, participants performed a speech comprehension task on degraded speech signals that were preceded by a rhythmic prime that could be auditory, motor or audiomotor. Both auditory and audiomotor rhythmic primes facilitated speech comprehension speed. While the presence of a purely motor prime (unpaced tapping) did not globally benefit speech comprehension, comprehension accuracy scaled with the regularity of motor tapping. In order to investigate inter-individual variability, participants also performed a Spontaneous Speech Synchronization test. The strength of the estimated perception-production coupling correlated positively with overall speech comprehension scores. These findings are discussed in the framework of the dynamic attending and active sensing theories.


Subject(s)
Comprehension , Speech Perception , Humans , Speech Perception/physiology , Male , Female , Young Adult , Comprehension/physiology , Adult , Acoustic Stimulation , Psychomotor Performance/physiology , Auditory Perception/physiology , Speech/physiology
5.
Sci Rep ; 14(1): 5501, 2024 03 06.
Article in English | MEDLINE | ID: mdl-38448636

ABSTRACT

Speech and music are two fundamental modes of human communication. Lateralisation of key processes underlying their perception has been related both to the distinct sensitivity to low-level spectrotemporal acoustic features and to top-down attention. However, the interplay between bottom-up and top-down processes needs to be clarified. In the present study, we investigated the contribution of acoustics and attention to melodies or sentences to lateralisation in fMRI functional network topology. We used sung speech stimuli selectively filtered in temporal or spectral modulation domains with crossed and balanced verbal and melodic content. Perception of speech decreased with degradation of temporal information, whereas perception of melodies decreased with spectral degradation. Applying graph theoretical metrics on fMRI connectivity matrices, we found that local clustering, reflecting functional specialisation, linearly increased when spectral or temporal cues crucial for the task goal were incrementally degraded. These effects occurred in a bilateral fronto-temporo-parietal network for processing temporally degraded sentences and in right auditory regions for processing spectrally degraded melodies. In contrast, global topology remained stable across conditions. These findings suggest that lateralisation for speech and music partially depends on an interplay of acoustic cues and task goals under increased attentional demands.
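The "local clustering" metric described in this abstract, computed on thresholded fMRI functional connectivity matrices, can be sketched in a few lines. The connectivity matrix, network size, and threshold below are illustrative placeholders, not values from the study:

```python
import numpy as np

# Toy functional connectivity matrix (symmetric, zero diagonal); in the
# study this would be an fMRI region-by-region connectivity estimate.
rng = np.random.default_rng(0)
n = 8
C = rng.random((n, n))
C = (C + C.T) / 2
np.fill_diagonal(C, 0.0)

# Binarise: keep only the strongest connections as edges (arbitrary threshold).
A = (C > 0.5).astype(int)

# Local clustering coefficient per node: the fraction of a node's
# neighbour pairs that are themselves connected.
deg = A.sum(axis=1)
triangles = np.diag(A @ A @ A) / 2       # closed triangles through each node
pairs = deg * (deg - 1) / 2              # possible neighbour pairs
clustering = np.divide(triangles, pairs, out=np.zeros(n), where=pairs > 0)
mean_clustering = clustering.mean()
```

In the study, this per-node statistic would be compared across degradation levels, with increasing clustering read as increased functional specialisation.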


Subject(s)
Cues , Magnetic Resonance Imaging , Humans , Communication , Acoustics , Perception
6.
Sci Adv ; 10(10): eadi2525, 2024 Mar 08.
Article in English | MEDLINE | ID: mdl-38446888

ABSTRACT

Why do humans spontaneously dance to music? To test the hypothesis that motor dynamics reflect predictive timing during music listening, we created melodies with varying degrees of rhythmic predictability (syncopation) and asked participants to rate their wanting-to-move (groove) experience. Degree of syncopation and groove ratings are quadratically correlated. Magnetoencephalography data showed that, while auditory regions track the rhythm of melodies, beat-related 2-hertz activity and neural dynamics at delta (1.4 hertz) and beta (20 to 30 hertz) rates in the dorsal auditory pathway code for the experience of groove. Critically, the left sensorimotor cortex coordinates these groove-related delta and beta activities. These findings align with the predictions of a neurodynamic model, suggesting that oscillatory motor engagement during music listening reflects predictive timing and is effected by the interaction of neural dynamics along the dorsal auditory pathway.


Subject(s)
Music , Humans , Cell Membrane , Cerebral Cortex , Magnetoencephalography
7.
Cognition ; 232: 105345, 2023 03.
Article in English | MEDLINE | ID: mdl-36462227

ABSTRACT

Humans are expert at processing speech but how this feat is accomplished remains a major question in cognitive neuroscience. Capitalizing on the concept of channel capacity, we developed a unified measurement framework to investigate the respective influence of seven acoustic and linguistic features on speech comprehension, encompassing acoustic, sub-lexical, lexical and supra-lexical levels of description. We show that comprehension is independently impacted by all these features, but at varying degrees and with a clear dominance of the syllabic rate. Comparing comprehension of French words and sentences further reveals that when supra-lexical contextual information is present, the impact of all other features is dramatically reduced. Finally, we estimated the channel capacity associated with each linguistic feature and compared them with their generic distribution in natural speech. Our data reveal that while acoustic modulation, syllabic and phonemic rates unfold respectively at 5, 5, and 12 Hz in natural speech, they are associated with independent processing bottlenecks whose channel capacities are 15, 15, and 35 Hz, respectively, as suggested by neurophysiological theories. They moreover point towards supra-lexical contextual information as the feature limiting the flow of natural speech. Overall, this study reveals how multilevel linguistic features constrain speech comprehension.


Subject(s)
Speech Perception , Speech , Humans , Speech/physiology , Comprehension/physiology , Speech Perception/physiology , Linguistics , Language
8.
PLoS Biol ; 20(7): e3001742, 2022 07.
Article in English | MEDLINE | ID: mdl-35905075

ABSTRACT

Categorising voices is crucial for auditory-based social interactions. A recent study by Rupp and colleagues in PLOS Biology capitalises on human intracranial recordings to describe the spatiotemporal pattern of neural activity leading to voice-selective responses in associative auditory cortex.


Subject(s)
Auditory Perception , Voice , Auditory Perception/physiology , Brain/physiology , Brain Mapping , Humans , Temporal Lobe , Voice/physiology
9.
J Neurosci ; 41(38): 7991-8006, 2021 09 22.
Article in English | MEDLINE | ID: mdl-34301825

ABSTRACT

Cortical oscillations have been proposed to play a functional role in speech and music perception, attentional selection, and working memory, via the mechanism of neural entrainment. One of the properties of neural entrainment that is often taken for granted is that its modulatory effect on ongoing oscillations outlasts rhythmic stimulation. We tested the existence of this phenomenon by studying cortical neural oscillations during and after presentation of melodic stimuli in a passive perception paradigm. Melodies were composed of ∼60 and ∼80 Hz tones embedded in a 2.5 Hz stream. Using intracranial and surface recordings in male and female humans, we reveal persistent oscillatory activity in the high-γ band in response to the tones throughout the cortex, well beyond auditory regions. By contrast, in response to the 2.5 Hz stream, no persistent activity in any frequency band was observed. We further show that our data are well captured by a model of damped harmonic oscillator and can be classified into three classes of neural dynamics, with distinct damping properties and eigenfrequencies. This model provides a mechanistic and quantitative explanation of the frequency selectivity of auditory neural entrainment in the human cortex.

SIGNIFICANCE STATEMENT

It has been proposed that the functional role of cortical oscillations is subtended by a mechanism of entrainment, the synchronization in phase or amplitude of neural oscillations to a periodic stimulation. One of the properties of neural entrainment that is often taken for granted is that its modulatory effect on ongoing oscillations outlasts rhythmic stimulation. Using intracranial and surface recordings of humans passively listening to rhythmic auditory stimuli, we reveal consistent oscillatory responses throughout the cortex, with persistent activity of high-γ oscillations. On the contrary, neural oscillations do not outlast low-frequency acoustic dynamics. We interpret our results as reflecting harmonic oscillator properties, a model ubiquitous in physics but rarely used in neuroscience.
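The damped harmonic oscillator model invoked in this abstract can be sketched numerically: drive an oscillator with a rhythm, stop the drive, and observe whether activity persists. The eigenfrequency, damping ratio, and drive below are illustrative assumptions, not the fitted values from the study:

```python
import numpy as np

# Damped harmonic oscillator driven by a rhythmic stimulus:
#   x'' + 2*zeta*w0*x' + w0^2*x = drive(t)
# The eigenfrequency f0 and damping ratio zeta are the two properties
# fitted per neural class in the model; values here are illustrative.
f0, zeta = 2.5, 0.3
w0 = 2 * np.pi * f0
fs = 1000.0                                 # integration rate (Hz)
t = np.arange(0, 4, 1 / fs)
drive = np.where(t < 2, np.sin(2 * np.pi * 2.5 * t), 0.0)  # stimulus off at 2 s

# Semi-implicit Euler integration of the oscillator.
x, v = 0.0, 0.0
trace = np.empty_like(t)
for i, d in enumerate(drive):
    a = d - 2 * zeta * w0 * v - w0 ** 2 * x
    v += a / fs
    x += v / fs
    trace[i] = x

# With non-negligible damping, the oscillation decays once the drive stops,
# i.e. entrainment does not outlast the rhythmic stimulation.
during = np.abs(trace[(t > 1.5) & (t < 2.0)]).max()
after = np.abs(trace[t > 3.0]).max()
```

Low damping would instead yield persistent post-stimulus ringing, which is the distinction the model uses to classify the recorded neural dynamics.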


Subject(s)
Auditory Cortex/physiology , Auditory Perception/physiology , Acoustic Stimulation , Adult , Female , Humans , Magnetic Resonance Imaging , Magnetoencephalography , Male , Periodicity , Speech/physiology , Young Adult
10.
Neurophysiol Clin ; 50(5): 331-338, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32888771

ABSTRACT

OBJECTIVES: Rhythmic body rocking movements may occur in prefrontal epileptic seizures. Here, we compare quantified time-evolving frequency of stereotyped rocking with signal analysis of intracerebral electroencephalographic data. METHODS: In a single patient, prefrontal seizures with rhythmic anteroposterior body rocking recorded on stereoelectroencephalography (SEEG) were analyzed using fast Fourier transform, time-frequency decomposition and phase amplitude coupling, with regards to quantified video data. Comparison was made with seizures without rocking in the same patient, as well as resting state data. RESULTS: Rocking movements in the delta (∼1 Hz) range began a few seconds after SEEG onset of low voltage fast discharge. During rocking movements: (1) presence of a peak of delta band activity was visible in bipolar montage, with maximal power in epileptogenic zone and corresponding to mean rocking frequency; (2) correlation, using phase amplitude coupling, was shown between the phase of this delta activity and high-gamma power in the epileptogenic zone and the anterior cingulate region. CONCLUSIONS: Here, delta range rhythmic body rocking was associated with cortical delta oscillatory activity and phase-coupled high-gamma energy. These results suggest a neural signature during expression of motor semiology incorporating both temporal features associated with rhythmic movements and spatial features of seizure discharge.
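Phase-amplitude coupling of the kind reported here (delta phase modulating high-gamma amplitude) is commonly quantified with a Hilbert-transform mean-vector-length index. The synthetic signal, frequencies, and coupling below are illustrative, not the patient's SEEG data:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic signal with built-in coupling: gamma (60 Hz) amplitude is
# modulated by the phase of a 1 Hz "rocking-rate" delta rhythm.
fs = 500.0
t = np.arange(0, 10, 1 / fs)
delta = np.sin(2 * np.pi * 1.0 * t)
gamma = (1 + delta) * np.sin(2 * np.pi * 60.0 * t)

# With real SEEG each band would be band-pass filtered first; here the
# components are known, so phase and amplitude are extracted directly.
delta_phase = np.angle(hilbert(delta))
gamma_amp = np.abs(hilbert(gamma))

# Mean vector length: high when gamma amplitude clusters at a delta phase.
mvl = np.abs(np.mean(gamma_amp * np.exp(1j * delta_phase)))

# Uncoupled control: constant-amplitude gamma gives a near-zero index.
mvl_ctl = np.abs(np.mean(np.abs(hilbert(np.sin(2 * np.pi * 60.0 * t)))
                         * np.exp(1j * delta_phase)))
```

In practice the index is compared against surrogate distributions (e.g. phase-shuffled data) rather than a single uncoupled control.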


Subject(s)
Epilepsy , Seizures , Stereotypic Movement Disorder , Cerebral Cortex , Electroencephalography , Humans
11.
Neuroimage ; 218: 116882, 2020 09.
Article in English | MEDLINE | ID: mdl-32439539

ABSTRACT

Neural oscillations in auditory cortex are argued to support parsing and representing speech constituents at their corresponding temporal scales. Yet, how incoming sensory information interacts with ongoing spontaneous brain activity, what features of the neuronal microcircuitry underlie spontaneous and stimulus-evoked spectral fingerprints, and what these fingerprints entail for stimulus encoding, remain largely open questions. We used a combination of human invasive electrophysiology, computational modeling and decoding techniques to assess the information encoding properties of brain activity and to relate them to a plausible underlying neuronal microarchitecture. We analyzed intracortical auditory EEG activity from 10 patients while they were listening to short sentences. Pre-stimulus neural activity in early auditory cortical regions often exhibited power spectra with a shoulder in the delta range and a small bump in the beta range. Speech decreased power in the beta range, and increased power in the delta-theta and gamma ranges. Using multivariate machine learning techniques, we assessed the spectral profile of information content for two aspects of speech processing: detection and discrimination. We obtained better phase than power information decoding, and a bimodal spectral profile of information content with better decoding at low (delta-theta) and high (gamma) frequencies than at intermediate (beta) frequencies. These experimental data were reproduced by a simple rate model made of two subnetworks with different timescales, each composed of coupled excitatory and inhibitory units, and connected via a negative feedback loop. Modeling and experimental results were similar in terms of pre-stimulus spectral profile (except for the iEEG beta bump), spectral modulations with speech, and spectral profile of information content. 
Altogether, we provide converging evidence from both univariate spectral analysis and decoding approaches for a dual timescale processing infrastructure in human auditory cortex, and show that it is consistent with the dynamics of a simple rate model.


Subject(s)
Auditory Cortex/physiology , Computer Simulation , Speech Perception/physiology , Adult , Electrocorticography , Female , Humans , Male , Signal Processing, Computer-Assisted
12.
PLoS Biol ; 18(3): e3000207, 2020 03.
Article in English | MEDLINE | ID: mdl-32119667

ABSTRACT

Speech perception is mediated by both left and right auditory cortices but with differential sensitivity to specific acoustic information contained in the speech signal. A detailed description of this functional asymmetry is missing, and the underlying models are widely debated. We analyzed cortical responses from 96 epilepsy patients with electrode implantation in left or right primary, secondary, and/or association auditory cortex (AAC). We presented short acoustic transients to noninvasively estimate the dynamical properties of multiple functional regions along the auditory cortical hierarchy. We show remarkably similar bimodal spectral response profiles in left and right primary and secondary regions, with evoked activity composed of dynamics in the theta (around 4-8 Hz) and beta-gamma (around 15-40 Hz) ranges. Beyond these first cortical levels of auditory processing, a hemispheric asymmetry emerged, with delta and beta band (3/15 Hz) responsivity prevailing in the right hemisphere and theta and gamma band (6/40 Hz) activity prevailing in the left. This asymmetry is also present during syllable presentation, but the evoked responses in AAC are more heterogeneous, with the co-occurrence of alpha (around 10 Hz) and gamma (>25 Hz) activity bilaterally. These intracranial data provide a more fine-grained and nuanced characterization of cortical auditory processing in the 2 hemispheres, shedding light on the neural dynamics that potentially shape auditory and speech processing at different levels of the cortical hierarchy.


Subject(s)
Auditory Cortex/physiology , Evoked Potentials, Auditory/physiology , Speech Perception/physiology , Acoustic Stimulation , Electrodes, Implanted , Electroencephalography , Epilepsy , Female , Functional Laterality/physiology , Humans , Male
13.
Science ; 367(6481): 1043-1047, 2020 02 28.
Article in English | MEDLINE | ID: mdl-32108113

ABSTRACT

Does brain asymmetry for speech and music emerge from acoustical cues or from domain-specific neural networks? We selectively filtered temporal or spectral modulations in sung speech stimuli for which verbal and melodic content was crossed and balanced. Perception of speech decreased only with degradation of temporal information, whereas perception of melodies decreased only with spectral degradation. Functional magnetic resonance imaging data showed that the neural decoding of speech and melodies depends on activity patterns in left and right auditory regions, respectively. This asymmetry is supported by specific sensitivity to spectrotemporal modulation rates within each region. Finally, the effects of degradation on perception were paralleled by their effects on neural classification. Our results suggest a match between acoustical properties of communicative signals and neural specializations adapted to that purpose.


Subject(s)
Auditory Cortex/physiology , Functional Laterality/physiology , Music , Pitch Perception/physiology , Speech Perception/physiology , Adolescent , Adult , Female , Humans , Magnetic Resonance Imaging , Male , Young Adult
14.
Nat Commun ; 11(1): 1051, 2020 02 26.
Article in English | MEDLINE | ID: mdl-32103014

ABSTRACT

That attention is a fundamentally rhythmic process has recently received abundant empirical evidence. The essence of temporal attention, however, is to flexibly focus in time. Whether this function is constrained by an underlying rhythmic neural mechanism is unknown. In six interrelated experiments, we behaviourally quantify the sampling capacities of periodic temporal attention during auditory or visual perception. We reveal the presence of limited attentional capacities, with an optimal sampling rate of ~1.4 Hz in audition and ~0.7 Hz in vision. Investigating the motor contribution to temporal attention, we show that it scales with motor rhythmic precision, maximal at ~1.7 Hz. Critically, motor modulation is beneficial to auditory but detrimental to visual temporal attention. These results are captured by a computational model of coupled oscillators that reveals the underlying structural constraints governing the temporal alignment between motor and attention fluctuations.
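The coupled-oscillator account can be illustrated with two Kuramoto-style phase oscillators, one at the motor optimum (~1.7 Hz) and one at the auditory attentional rate (~1.4 Hz) reported above. The coupling strength and simulation details are illustrative assumptions, not the fitted model:

```python
import numpy as np

# Two coupled phase oscillators: despite a 0.3 Hz detuning, sufficient
# coupling pulls them into a stable phase relationship (phase locking).
f_motor, f_attn = 1.7, 1.4      # intrinsic rates (Hz), from the abstract
K = 4.0                         # coupling strength (illustrative)
dt, T = 0.001, 30.0
steps = int(T / dt)

theta = np.array([0.0, np.pi])  # initial phases (rad)
diff = np.empty(steps)
for i in range(steps):
    d_motor = 2 * np.pi * f_motor + K * np.sin(theta[1] - theta[0])
    d_attn = 2 * np.pi * f_attn + K * np.sin(theta[0] - theta[1])
    theta += dt * np.array([d_motor, d_attn])
    diff[i] = theta[0] - theta[1]

# Phase-locking value over the last 10 s: close to 1 when locked.
plv = np.abs(np.mean(np.exp(1j * diff[-10000:])))
```

Locking here requires the coupling to exceed the frequency detuning (2K > 2π(1.7 - 1.4)); weakening K below that bound makes the phase difference drift and the locking value drop, which is the kind of structural constraint such models expose.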


Subject(s)
Attention/physiology , Auditory Perception/physiology , Periodicity , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Female , Humans , Male , Photic Stimulation , Time Factors , Young Adult
15.
J Neurophysiol ; 123(3): 1063-1071, 2020 03 01.
Article in English | MEDLINE | ID: mdl-32023136

ABSTRACT

During auditory perception, neural oscillations are known to entrain to acoustic dynamics but their role in the processing of auditory information remains unclear. As a complex temporal structure that can be parameterized acoustically, music is particularly suited to address this issue. In a combined behavioral and EEG experiment in human participants, we investigated the relative contribution of temporal (acoustic dynamics) and nontemporal (melodic spectral complexity) dimensions of stimulation on neural entrainment, a stimulus-brain coupling phenomenon operationally defined here as the temporal coherence between acoustical and neural dynamics. We first highlight that low-frequency neural oscillations robustly entrain to complex acoustic temporal modulations, which underscores the fine-grained nature of this coupling mechanism. We also reveal that enhancing melodic spectral complexity, in terms of pitch, harmony, and pitch variation, increases neural entrainment. Importantly, this manipulation enhances activity in the theta (5 Hz) range, a frequency-selective effect independent of the note rate of the melodies, which may reflect internal temporal constraints of the neural processes involved. Moreover, while both emotional arousal ratings and neural entrainment were positively modulated by spectral complexity, no direct relationship between arousal and neural entrainment was observed. Overall, these results indicate that neural entrainment to music is sensitive to the spectral content of auditory information and indexes an auditory level of processing that should be distinguished from higher-order emotional processing stages.

NEW & NOTEWORTHY

Low-frequency (<10 Hz) cortical neural oscillations are known to entrain to acoustic dynamics, the so-called neural entrainment phenomenon, but their functional implication in the processing of auditory information remains unclear. In a behavioral and EEG experiment capitalizing on parameterized musical textures, we disentangle the contribution of stimulus dynamics, melodic spectral complexity, and emotional judgments on neural entrainment and highlight their respective spatial and spectral neural signature.


Subject(s)
Auditory Perception/physiology , Brain Waves/physiology , Cerebral Cortex/physiology , Emotions/physiology , Music , Adolescent , Adult , Female , Humans , Male , Young Adult
16.
Brain Cogn ; 140: 105531, 2020 04.
Article in English | MEDLINE | ID: mdl-31986324

ABSTRACT

When listening to temporally regular rhythms, most people are able to extract the beat. Evidence suggests that the neural mechanism underlying this ability is the phase alignment of endogenous oscillations to the external stimulus, allowing for the prediction of upcoming events (i.e., dynamic attending). Relatedly, individuals with dyslexia may have deficits in the entrainment of neural oscillations to external stimuli, especially at low frequencies. The current experiment investigated rhythmic processing in adults with dyslexia and matched controls. Regular and irregular rhythms were presented to participants while electroencephalography was recorded. Regular rhythms contained the beat at 2 Hz; while acoustic energy was maximal at 4 Hz and 8 Hz. These stimuli allowed us to investigate whether the brain responds non-linearly to the beat-level of a rhythmic stimulus, and whether beat-based processing differs between dyslexic and control participants. Both groups showed enhanced stimulus-brain coherence for regular compared to irregular rhythms at the frequencies of interest, with an overrepresentation of the beat-level in the brain compared to the acoustic signal. In addition, we found evidence that controls extracted subtle temporal regularities from irregular stimuli, whereas dyslexics did not. Findings are discussed in relation to dynamic attending theory and rhythmic processing deficits in dyslexia.
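Stimulus-brain coherence at the 2 Hz beat rate, the measure behind these group comparisons, can be sketched with scipy. The synthetic "EEG", noise level, and parameters below are illustrative, not the study's recordings:

```python
import numpy as np
from scipy.signal import coherence

# Toy comparison: an "EEG" channel that tracks a 2 Hz beat versus a
# noise-only control channel.
fs = 250.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)

stimulus = np.sin(2 * np.pi * 2.0 * t)                        # beat-level rhythm
eeg_entrained = stimulus + 2.0 * rng.standard_normal(t.size)  # tracks the beat
eeg_control = 2.0 * rng.standard_normal(t.size)               # does not

# Magnitude-squared coherence; nperseg=500 puts 2 Hz exactly on a bin.
f, coh_ent = coherence(stimulus, eeg_entrained, fs=fs, nperseg=500)
_, coh_ctl = coherence(stimulus, eeg_control, fs=fs, nperseg=500)

beat = np.argmin(np.abs(f - 2.0))   # frequency bin at the beat rate
```

In the experiment the reference signal would be the acoustic envelope rather than a pure sinusoid, and coherence at the beat (2 Hz) and acoustic-energy (4 and 8 Hz) frequencies would be contrasted across groups and rhythm regularity.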


Subject(s)
Auditory Perception/physiology , Dyslexia/physiopathology , Time Perception/physiology , Adult , Electroencephalography , Female , Humans , Male , Young Adult
17.
Neurosci Biobehav Rev ; 107: 136-142, 2019 12.
Article in English | MEDLINE | ID: mdl-31518638

ABSTRACT

In the motor cortex, beta oscillations (∼12-30 Hz) are generally considered a principal rhythm contributing to movement planning and execution. Beta oscillations cohabit and dynamically interact with slow delta oscillations (0.5-4 Hz), but the role of delta oscillations and the subordinate relationship between these rhythms in the perception-action loop remains unclear. Here, we review evidence that motor delta oscillations shape the dynamics of motor behaviors and sensorimotor processes, in particular during auditory perception. We describe the functional coupling between delta and beta oscillations in the motor cortex during spontaneous and planned motor acts. In an active sensing framework, perception is strongly shaped by motor activity, in particular in the delta band, which imposes temporal constraints on the sampling of sensory information. By encoding temporal contextual information, delta oscillations modulate auditory processing and impact behavioral outcomes. Finally, we consider the contribution of motor delta oscillations in the perceptual analysis of speech signals, providing a contextual temporal frame to optimize the parsing and processing of slow linguistic information.


Subject(s)
Auditory Perception/physiology , Delta Rhythm/physiology , Motor Cortex/physiology , Speech Perception/physiology , Acoustic Stimulation , Humans , Speech
18.
Sci Rep ; 8(1): 13466, 2018 09 07.
Article in English | MEDLINE | ID: mdl-30194376

ABSTRACT

Anticipating the future rests upon our ability to exploit contextual cues and to formulate valid internal models or predictions. It is currently unknown how multiple predictions combine to bias perceptual information processing, and in particular whether this is determined by physiological constraints, behavioral relevance (task demands), or past knowledge (perceptual expertise). In a series of behavioral auditory experiments involving musical experts and non-musicians, we investigated the respective and combined contribution of temporal and spectral predictions in multiple detection tasks. We show that temporal and spectral predictions alone systematically increase perceptual sensitivity, independently of task demands or expertise. When combined, however, spectral predictions benefit non-musicians more and dominate over temporal ones, and the extent of the spectrotemporal synergistic interaction depends on task demands. This suggests that the hierarchy of dominance primarily reflects the tonotopic organization of the auditory system and that expertise or attention only have a secondary modulatory influence.


Subject(s)
Attention/physiology , Auditory Cortex/physiology , Evoked Potentials, Auditory/physiology , Music , Pitch Perception/physiology , Adolescent , Adult , Female , Humans , Male , Middle Aged
19.
Trends Cogn Sci ; 22(10): 870-882, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30266147

ABSTRACT

The ability to predict when something will happen facilitates sensory processing and the ensuing computations. Building on the observation that neural activity entrains to periodic stimulation, leading neurophysiological models imply that temporal predictions rely on oscillatory entrainment. Although they provide a sufficient solution to predict periodic regularities, these models are challenged by a series of findings that question their suitability to account for temporal predictions based on aperiodic regularities. Aiming for a more comprehensive model of how the brain anticipates 'when' in auditory contexts, we emphasize the capacity of motor and higher-order top-down systems to prepare sensory processing in a proactive and temporally flexible manner. Focusing on speech processing, we illustrate how this framework leads to new hypotheses.


Subject(s)
Anticipation, Psychological/physiology , Auditory Perception/physiology , Brain Waves/physiology , Time Factors , Time Perception/physiology , Humans
20.
Proc Natl Acad Sci U S A ; 114(42): E8913-E8921, 2017 10 17.
Article in English | MEDLINE | ID: mdl-28973923

ABSTRACT

In behavior, action and perception are inherently interdependent. However, the actual mechanistic contributions of the motor system to sensory processing are unknown. We present neurophysiological evidence that the motor system is involved in predictive timing, a brain function that aligns temporal fluctuations of attention with the timing of events in a task-relevant stream, thus facilitating sensory selection and optimizing behavior. In a magnetoencephalography experiment involving auditory temporal attention, participants had to disentangle two streams of sound on the unique basis of endogenous temporal cues. We show that temporal predictions are encoded by interdependent delta and beta neural oscillations originating from the left sensorimotor cortex, and directed toward auditory regions. We also found that overt rhythmic movements improved the quality of temporal predictions and sharpened the temporal selection of relevant auditory information. This latter behavioral and functional benefit was associated with increased signaling of temporal predictions in right-lateralized frontoparietal associative regions. In sum, this study points at a covert form of auditory active sensing. Our results emphasize the key role of motor brain areas in providing contextual temporal information to sensory regions, driving perceptual and behavioral selection.


Subject(s)
Attention/physiology , Auditory Perception/physiology , Brain/physiology , Acoustic Stimulation , Adult , Humans , Magnetoencephalography/methods , Male , Middle Aged , Nontherapeutic Human Experimentation