Results 1-7 of 7
1.
Neurosci Biobehav Rev; 107: 136-142, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31518638

ABSTRACT

In the motor cortex, beta oscillations (∼12-30 Hz) are generally considered a principal rhythm contributing to movement planning and execution. Beta oscillations cohabit and dynamically interact with slow delta oscillations (0.5-4 Hz), but the role of delta oscillations and the subordinate relationship between these rhythms in the perception-action loop remain unclear. Here, we review evidence that motor delta oscillations shape the dynamics of motor behaviors and sensorimotor processes, in particular during auditory perception. We describe the functional coupling between delta and beta oscillations in the motor cortex during spontaneous and planned motor acts. In an active sensing framework, perception is strongly shaped by motor activity, in particular in the delta band, which imposes temporal constraints on the sampling of sensory information. By encoding temporal contextual information, delta oscillations modulate auditory processing and impact behavioral outcomes. Finally, we consider the contribution of motor delta oscillations to the perceptual analysis of speech signals, providing a contextual temporal frame to optimize the parsing and processing of slow linguistic information.
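The delta-beta coupling discussed above is commonly quantified as phase-amplitude coupling. A minimal Python sketch on synthetic data (sampling rate, band edges, and signal are illustrative assumptions, not the authors' pipeline) shows one standard metric, the mean-vector-length modulation index:

```python
# Sketch only: quantify delta-beta phase-amplitude coupling in one signal
# with the mean-vector-length modulation index. Sampling rate, band edges
# and the synthetic signal are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 1000.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic "motor cortex" trace: beta bursts riding on the delta phase
delta = np.sin(2 * np.pi * 2 * t)
beta = (1 + delta) * np.sin(2 * np.pi * 20 * t)
x = delta + 0.3 * beta + 0.5 * rng.standard_normal(t.size)

def bandpass(sig, lo, hi, fs, order=2):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, sig)

delta_phase = np.angle(hilbert(bandpass(x, 0.5, 4, fs)))    # delta phase
beta_amp = np.abs(hilbert(bandpass(x, 12, 30, fs)))         # beta amplitude

# Mean vector length: large when beta amplitude concentrates at a
# preferred delta phase, near zero when amplitude is independent of phase.
mvl = np.abs(np.mean(beta_amp * np.exp(1j * delta_phase))) / beta_amp.mean()
print(f"delta-beta coupling (normalized MVL): {mvl:.3f}")
```

In practice such an index would be compared against a surrogate distribution (e.g., phase-shuffled data) before being interpreted.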


Subjects
Auditory Perception/physiology, Delta Rhythm/physiology, Motor Cortex/physiology, Speech Perception/physiology, Acoustic Stimulation, Humans, Speech
2.
Nat Commun; 10(1): 3671, 2019 Aug 14.
Article in English | MEDLINE | ID: mdl-31413319

ABSTRACT

Being able to produce sounds that capture attention and elicit rapid reactions is the prime goal of communication. One strategy, exploited by alarm signals, consists in emitting fast but perceptible amplitude modulations in the roughness range (30-150 Hz). Here, we investigate the perceptual and neural mechanisms underlying aversion to such temporally salient sounds. By measuring subjective aversion to repetitive acoustic transients, we identify a nonlinear pattern of aversion restricted to the roughness range. Using human intracranial recordings, we show that rough sounds do not merely affect local auditory processes but instead synchronise large-scale, supramodal, salience-related networks in a steady-state, sustained manner. Rough sounds synchronise activity throughout superior temporal regions, subcortical and cortical limbic areas, and the frontal cortex, a network classically involved in aversion processing. This pattern correlates with subjective aversion in all these regions, consistent with the hypothesis that roughness enhances auditory aversion through spreading of neural synchronisation.
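The roughness manipulation at the heart of this study amounts to amplitude-modulating sounds at rates inside versus outside the 30-150 Hz range. The sketch below (carrier frequency, duration, and modulation depth are illustrative assumptions, not the study's stimuli) generates such sounds for informal listening:

```python
# Sketch only: a 500 Hz carrier amplitude-modulated at rates below,
# inside, and above the roughness range (~30-150 Hz).
import numpy as np
from scipy.io import wavfile

fs = 44100
t = np.arange(int(fs * 1.0)) / fs                      # 1 s of signal
carrier = np.sin(2 * np.pi * 500 * t)

for am_rate in (4, 40, 70, 130, 250):                  # Hz; 40-130 are "rough"
    am = 0.5 * (1 + np.sin(2 * np.pi * am_rate * t))   # 100% sinusoidal AM
    sound = am * carrier
    pcm = np.int16(sound / np.abs(sound).max() * 0.9 * 32767)
    wavfile.write(f"am_{am_rate}hz.wav", fs, pcm)
```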


Subjects
Attention, Auditory Cortex/physiology, Auditory Perception/physiology, Sound, Acoustic Stimulation, Acoustics, Adolescent, Adult, Auditory Pathways/physiology, Drug Resistant Epilepsy/surgery, Electrocorticography, Epilepsies, Partial/surgery, Female, Humans, Male, Time Factors, Young Adult
3.
Neuropsychologia; 131: 9-24, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31158367

ABSTRACT

The amygdala is crucially implicated in processing emotional information from various sensory modalities. However, there is a dearth of knowledge concerning the integration and relative time-course of its responses across different channels, i.e., for auditory, visual, and audiovisual input. Functional neuroimaging data in humans point to a possible role of this region in the multimodal integration of emotional signals, but direct evidence for anatomical and temporal overlap of unisensory and multisensory-evoked responses in the amygdala is still lacking. We recorded event-related potentials (ERPs) and oscillatory activity from 9 amygdalae using intracranial electroencephalography (iEEG) in patients prior to epilepsy surgery, and compared electrophysiological responses to fearful, happy, or neutral stimuli presented as voices alone, faces alone, or voices and faces delivered simultaneously. Results showed differential amygdala responses to fearful stimuli, in comparison to neutral stimuli, reaching significance 100-200 ms post-onset for auditory, visual, and audiovisual stimuli. At later latencies, ∼400 ms post-onset, the amygdala response to audiovisual information was also amplified in comparison to auditory or visual stimuli alone. Importantly, however, we found no evidence for either super- or subadditivity effects in any of the bimodal responses. These results suggest, first, that emotion processing in the amygdala occurs at globally similar early stages of perceptual processing for auditory, visual, and audiovisual inputs; second, that overall larger responses to multisensory information occur only at later stages; and third, that the underlying mechanisms of this multisensory gain may reflect a purely additive response to concomitant visual and auditory inputs. Our findings provide novel insights into emotion processing across the sensory pathways and their convergence within the limbic system.
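The super-/subadditivity test mentioned here compares the audiovisual response with the sum of the two unisensory responses. A minimal sketch on synthetic single-trial amplitudes (window, trial counts, and effect sizes are illustrative, not the study's data):

```python
# Sketch only: test for super-/subadditivity by comparing ERP(AV) with
# ERP(A) + ERP(V) in a latency window. All numbers are illustrative.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n_trials, n_times = 60, 500                  # assumed: 500 samples @ 1 kHz
erp_a = 1.0 + rng.standard_normal((n_trials, n_times))   # auditory only
erp_v = 0.8 + rng.standard_normal((n_trials, n_times))   # visual only
erp_av = 1.8 + rng.standard_normal((n_trials, n_times))  # audiovisual

win = slice(350, 450)                        # ~350-450 ms window of interest
av = erp_av[:, win].mean(axis=1)
summed = erp_a[:, win].mean(axis=1) + erp_v[:, win].mean(axis=1)

t_stat, p_val = ttest_rel(av, summed)
print(f"AV vs A+V: t = {t_stat:.2f}, p = {p_val:.3f}")
# Here AV ~= A + V by construction, i.e. an additive pattern like the one
# the abstract reports; a reliable positive (negative) difference would
# instead indicate super- (sub-) additivity.
```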


Subjects
Amygdala/physiology, Emotions/physiology, Evoked Potentials/physiology, Acoustic Stimulation, Adolescent, Adult, Auditory Perception/physiology, Electrocorticography, Female, Humans, Male, Middle Aged, Photic Stimulation, Reaction Time/physiology, Visual Perception/physiology, Young Adult
4.
J Neurosci; 37(33): 7930-7938, 2017 Aug 16.
Article in English | MEDLINE | ID: mdl-28729443

ABSTRACT

Recent psychophysics data suggest that speech perception is not limited by the capacity of the auditory system to encode fast acoustic variations through neural γ activity, but rather by the time given to the brain to decode them. Whether the decoding process is bounded by the capacity of θ rhythm to follow syllabic rhythms in speech, or constrained by a more endogenous top-down mechanism, e.g., involving β activity, is unknown. We addressed the dynamics of auditory decoding in speech comprehension by challenging syllable tracking and speech decoding using comprehensible and incomprehensible time-compressed auditory sentences. We recorded EEGs in human participants and found that neural activity in both θ and γ ranges was sensitive to syllabic rate. Phase patterns of slow neural activity consistently followed the syllabic rate (4-14 Hz), even when this rate went beyond the classical θ range (4-8 Hz). The power of θ activity increased linearly with syllabic rate but showed no sensitivity to comprehension. Conversely, the power of β (14-21 Hz) activity was insensitive to the syllabic rate, yet reflected comprehension on a single-trial basis. We found different long-range dynamics for θ and β activity, with β activity building up in time while more contextual information becomes available. This is consistent with the roles of θ and β activity in stimulus-driven versus endogenous mechanisms. These data show that speech comprehension is constrained by concurrent stimulus-driven θ and low-γ activity, and by endogenous β activity, but not primarily by the capacity of θ activity to track the syllabic rhythm.

SIGNIFICANCE STATEMENT Speech comprehension partly depends on the ability of the auditory cortex to track syllable boundaries with θ-range neural oscillations. The reason comprehension drops when speech is accelerated could hence be that θ oscillations can no longer follow the syllabic rate. Here, we presented subjects with comprehensible and incomprehensible accelerated speech, and show that neural phase patterns in the θ band consistently reflect the syllabic rate, even when speech becomes too fast to be intelligible. The drop in comprehension, however, is signaled by a significant decrease in the power of low-β oscillations (14-21 Hz). These data suggest that speech comprehension is not limited by the capacity of θ oscillations to adapt to syllabic rate, but by an endogenous decoding process.
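The phase-tracking result rests on a phase-consistency measure computed at the presented syllabic rate. A minimal sketch on synthetic trials (sampling rate, rate, and trial count are assumptions, not the authors' pipeline) of inter-trial phase coherence in a narrow band around the syllable rate:

```python
# Sketch only: inter-trial phase coherence (ITC) of band-limited EEG at
# an assumed syllabic rate, on synthetic single trials.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs, dur, n_trials = 250, 2.0, 80            # assumed EEG sampling rate etc.
t = np.arange(int(fs * dur)) / fs
rng = np.random.default_rng(2)
syllable_rate = 9.0                          # Hz, a compressed-speech rate

# Synthetic trials: a rate-following component plus noise
trials = (np.sin(2 * np.pi * syllable_rate * t)[None, :]
          + 2.0 * rng.standard_normal((n_trials, t.size)))

# Narrow band around the syllabic rate, then instantaneous phase
sos = butter(2, [syllable_rate - 1, syllable_rate + 1], btype="band",
             fs=fs, output="sos")
phase = np.angle(hilbert(sosfiltfilt(sos, trials, axis=1), axis=1))

# ITC: length of the mean phase vector across trials, per time point
itc = np.abs(np.exp(1j * phase).mean(axis=0))
print(f"mean ITC at {syllable_rate:.0f} Hz: {itc.mean():.2f}")
```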


Subjects
Acoustic Stimulation/methods, Auditory Cortex/physiology, Beta Rhythm/physiology, Comprehension/physiology, Speech Perception/physiology, Theta Rhythm/physiology, Adult, Electroencephalography/methods, Female, Humans, Male, Random Allocation, Speech/physiology, Time Factors, Young Adult
5.
Curr Biol; 25(15): 2051-6, 2015 Aug 03.
Article in English | MEDLINE | ID: mdl-26190070

ABSTRACT

Screaming is arguably one of the most relevant communication signals for survival in humans. Despite their practical relevance and their theoretical significance as innate [1] and virtually universal [2, 3] vocalizations, what makes screams a unique signal and how they are processed is not known. Here, we use acoustic analyses, psychophysical experiments, and neuroimaging to isolate the features that give screams their alarming nature, and we track their processing in the human brain. Using the modulation power spectrum (MPS [4, 5]), a recently developed, neurally informed characterization of sounds, we demonstrate that human screams cluster within a restricted portion of the acoustic space (between ∼30 and 150 Hz modulation rates) that corresponds to a well-known perceptual attribute, roughness. In contrast to the received view that roughness is irrelevant for communication [6], our data reveal that the acoustic space occupied by the rough vocal regime is segregated from other signals, including speech, a prerequisite for avoiding false alarms in normal vocal communication. We show that roughness is present in natural alarm signals as well as in artificial alarms, and that the presence of roughness in sounds boosts their detection in various tasks. Using fMRI, we show that acoustic roughness engages subcortical structures critical to rapidly appraise danger. Altogether, these data demonstrate that screams occupy a privileged acoustic niche that, being separated from other communication signals, ensures their biological and, ultimately, social efficiency.
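A modulation power spectrum of the kind referred to here can be approximated as the 2-D Fourier transform of a spectrogram, giving power as a function of temporal and spectral modulation rates; roughness corresponds to strong energy at roughly 30-150 Hz temporal modulation. The sketch below is a simplified approximation on a synthetic "rough" sound, not the MPS implementation used in the paper:

```python
# Sketch only: a simplified modulation power spectrum via a 2-D FFT of a
# spectrogram (full MPS implementations typically use a log-amplitude,
# log-frequency representation). The stimulus is a synthetic rough sound.
import numpy as np
from scipy.signal import spectrogram

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
x = (1 + np.sin(2 * np.pi * 70 * t)) * np.sin(2 * np.pi * 500 * t)  # 70 Hz AM

# Short windows (8 ms, 2 ms hop) so modulations up to ~250 Hz are resolvable
f, times, sxx = spectrogram(x, fs=fs, nperseg=128, noverlap=96)

mps = np.abs(np.fft.fftshift(np.fft.fft2(sxx - sxx.mean())))
temporal_rates = np.fft.fftshift(np.fft.fftfreq(times.size,
                                                d=times[1] - times[0]))

# Collapse over spectral modulations and find the dominant temporal rate
tm_power = mps.mean(axis=0)
peak = temporal_rates[np.argmax(np.where(temporal_rates > 0, tm_power, 0.0))]
print(f"dominant temporal modulation rate: {peak:.1f} Hz (expected ~70 Hz)")
```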


Subjects
Speech Acoustics, Speech Intelligibility, Speech Perception, Acoustic Stimulation, Adult, Female, Humans, Magnetic Resonance Imaging, Male, Sound, Young Adult
6.
Cereb Cortex; 25(9): 3077-85, 2015 Sep.
Article in English | MEDLINE | ID: mdl-24846147

ABSTRACT

The ability to generate temporal predictions is fundamental for adaptive behavior. Precise timing at the time-scale of seconds is critical, for instance to predict trajectories or to select relevant information. What mechanisms form the basis for such accurate timing? Recent evidence suggests that (1) temporal predictions adjust sensory selection by controlling neural oscillations in time and (2) the motor system plays an active role in inferring "when" events will happen. We hypothesized that oscillations in the delta and beta bands are instrumental in predicting the occurrence of auditory targets. Participants listened to brief rhythmic tone sequences and detected target delays while undergoing magnetoencephalography recording. Prior to target occurrence, we found that coupled delta (1-3 Hz) and beta (18-22 Hz) oscillations temporally align with upcoming targets and bias decisions towards correct responses, suggesting that delta-beta coupled oscillations underpin prediction accuracy. Subsequent to target occurrence, subjects update their decisions using the magnitude of the alpha-band (10-14 Hz) response as internal evidence of target timing. These data support a model in which the orchestration of oscillatory dynamics between sensory and motor systems is exploited to accurately select sensory information in time.
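One way to make "coupled delta and beta oscillations temporally align with upcoming targets and bias decisions" concrete is to compare the concentration of pre-target delta phase between correct and incorrect trials. A minimal sketch on synthetic phases (trial counts and concentration parameters are assumptions, not the study's data):

```python
# Sketch only: compare delta-phase concentration at target onset between
# correct and error trials via the resultant vector length.
import numpy as np

rng = np.random.default_rng(3)
n_correct, n_error = 300, 100
# Synthetic delta phases at target onset: correct trials cluster near a
# preferred phase, error trials are closer to uniform.
phase_correct = rng.vonmises(mu=0.0, kappa=1.5, size=n_correct)
phase_error = rng.vonmises(mu=0.0, kappa=0.2, size=n_error)

def resultant_length(phases):
    """Length of the mean phase vector (1 = perfectly aligned)."""
    return np.abs(np.exp(1j * phases).mean())

print(f"R correct: {resultant_length(phase_correct):.2f}, "
      f"R error: {resultant_length(phase_error):.2f}")
```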


Subjects
Auditory Perception/physiology, Beta Rhythm/physiology, Brain Mapping, Decision Making/physiology, Delta Rhythm/physiology, Acoustic Stimulation, Adolescent, Adult, Electroencephalography, Female, Humans, Magnetoencephalography, Male, Periodicity, Psychoacoustics, Spectrum Analysis, Statistics as Topic, Time Factors, Young Adult
7.
Neuroimage; 85 Pt 2: 761-8, 2014 Jan 15.
Article in English | MEDLINE | ID: mdl-23791839

ABSTRACT

A growing body of research suggests that intrinsic slow (<10 Hz) neuronal oscillations in auditory cortex track incoming speech and other spectro-temporally complex auditory signals. Within this framework, several recent studies have identified critical-band temporal envelopes as the specific acoustic feature reflected by the phase of these oscillations. However, how this alignment between speech acoustics and neural oscillations might underpin intelligibility is unclear. Here we test the hypothesis that the 'sharpness' of temporal fluctuations in the critical-band envelope acts as a temporal cue to speech syllabic rate, driving delta-theta rhythms to track the stimulus and facilitate intelligibility. Using magnetoencephalographic recordings, we show that removing temporal fluctuations that occur at the syllabic rate reduces envelope-tracking activity, and that artificially reinstating these fluctuations restores it. These changes in tracking correlate with the intelligibility of the stimulus. We interpret these findings as evidence that sharp events in the stimulus cause cortical rhythms to re-align and parse the stimulus into syllable-sized chunks for further decoding. Together, the results suggest that the sharpness of fluctuations in the stimulus, as reflected in the cochlear output, drives oscillatory activity to track and entrain to the stimulus at its syllabic rate. This process likely facilitates parsing of the stimulus into meaningful chunks appropriate for subsequent decoding, enhancing perception and intelligibility.
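The two quantities at stake here, band-limited temporal envelopes and the 'sharpness' of their fluctuations, can be sketched with a crude filterbank; one common proxy for sharpness is the half-wave-rectified derivative of the smoothed envelope. Band edges, smoothing cutoff, and the stimulus below are assumptions for illustration, not the paper's processing chain:

```python
# Sketch only: band-limited ("critical band") envelopes and an envelope
# "sharpness" signal, taken here as the half-wave-rectified derivative of
# the smoothed envelope.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 16000
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(4)
# Noise carrier modulated at a syllable-like 4 Hz rate
x = 0.5 * (1 + np.sin(2 * np.pi * 4 * t)) * rng.standard_normal(t.size)

bands = [(100, 300), (300, 700), (700, 1500), (1500, 3000)]   # assumed edges
sos_lp = butter(2, 10, btype="low", fs=fs, output="sos")      # keep < 10 Hz

edge_strength = []
for lo, hi in bands:
    sos_bp = butter(2, [lo, hi], btype="band", fs=fs, output="sos")
    env = np.abs(hilbert(sosfiltfilt(sos_bp, x)))              # band envelope
    env = sosfiltfilt(sos_lp, env)                             # smooth it
    edges = np.clip(np.gradient(env, 1 / fs), 0, None)         # rising slopes
    edge_strength.append(float(edges.mean()))

print("mean envelope edge strength per band:",
      [round(v, 3) for v in edge_strength])
```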


Subjects
Auditory Cortex/physiology, Comprehension/physiology, Delta Rhythm/physiology, Speech Perception/physiology, Theta Rhythm/physiology, Acoustic Stimulation, Adolescent, Adult, Cues, Female, Humans, Magnetoencephalography, Male, Young Adult