Results 1 - 20 of 110
1.
J Neurosci ; 43(45): 7668-7677, 2023 11 08.
Article in English | MEDLINE | ID: mdl-37734948

ABSTRACT

Hearing is an active process, and recent studies show that even the ear is affected by cognitive states or motor actions. One example is the movement of the eardrum induced by saccadic eye movements, known as "eye movement-related eardrum oscillations" (EMREOs). While these are systematically shaped by the direction and size of saccades, the consequences of saccadic eye movements and their resulting EMREOs for hearing remain unclear. We here studied their implications for the detection of near-threshold clicks in human participants. Across three experiments, sound detection was not affected by the time of presentation relative to saccade onset, nor by saccade amplitude or direction. While the EMREOs were shaped by the direction and amplitude of the saccadic movement, inducing covert shifts in spatial attention did not affect the EMREO, suggesting that this signature of active sensing is restricted to overt changes in visual focus. Importantly, in our experiments, fluctuations in the EMREO amplitude were not related to detection performance, at least when monaural cues were sufficient. Hence, while eye movements may shape the transduction of acoustic information, the behavioral implications remain to be understood.

SIGNIFICANCE STATEMENT Previous studies suggest that oculomotor behavior may influence how we perceive spatially localized sounds. Recent work has introduced a new perspective on this question by showing that eye movements can directly modulate the eardrum. Yet, it remains unclear whether this signature of active hearing accounts for behavioral effects. We here show that overt but not covert changes in visual attention modulate the eardrum, but that these modulations do not interfere with the detection of sounds. Our results provide a starting point for a deeper understanding of the interplay of oculomotor behavior and the active ear.


Subjects
Eye Movements, Saccades, Humans, Tympanic Membrane, Hearing, Sound
2.
J Neurophysiol ; 131(4): 723-737, 2024 Apr 01.
Article in English | MEDLINE | ID: mdl-38416720

ABSTRACT

The brain engages the processes of multisensory integration and recalibration to deal with discrepant multisensory signals. These processes consider the reliability of each sensory input, with the more reliable modality receiving the stronger weight. Sensory reliability is typically assessed via the variability of participants' judgments, yet these can be shaped by factors both external and internal to the nervous system. For example, motor noise and a participant's dexterity with the specific response method contribute to judgment variability, and different response methods applied to the same stimuli can result in different estimates of sensory reliability. Here we ask how such variations in reliability induced by the response method affect multisensory integration and sensory recalibration, as well as motor adaptation, in a visuomotor paradigm. Participants performed center-out hand movements and were asked to judge the position of the hand or of rotated visual feedback at the movement end points. We manipulated the variability, and thus the reliability, of repeated judgments by asking participants to respond using either a visual or a proprioceptive matching procedure. We find that the relative weights of visual and proprioceptive signals, and thus the asymmetry of multisensory integration and recalibration, depend on the reliability modulated by the judgment method. Motor adaptation, in contrast, was insensitive to this manipulation. Hence, the outcome of multisensory binding is shaped by the noise introduced by sensorimotor processing, in line with perception and action being intertwined.

NEW & NOTEWORTHY Our brain tends to combine multisensory signals based on their respective reliability. This reliability depends on sensory noise in the environment, noise in the nervous system, and, as we show here, variability induced by the specific judgment procedure.
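The reliability weighting described in this abstract follows the standard maximum-likelihood cue-combination rule, in which each cue is weighted by its inverse variance. The sketch below is illustrative only, not code from the study; `integrate_cues` is a hypothetical helper:

```python
def integrate_cues(estimates, variances):
    """Maximum-likelihood cue combination: each unimodal estimate is
    weighted by its reliability (inverse variance); the fused estimate
    has lower variance than either cue alone."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    fused = sum(w * e for w, e in zip(weights, estimates))
    return fused, weights, 1.0 / total

# A visual cue (variance 1) and a noisier proprioceptive cue (variance 3):
# the fused estimate is pulled toward the more reliable visual signal.
fused, weights, fused_var = integrate_cues([10.0, 0.0], [1.0, 3.0])
# fused ~ 7.5, weights ~ [0.75, 0.25], fused_var ~ 0.75
```

Making one judgment procedure noisier (as the study did by changing the response method) increases that cue's variance and thus shifts the weights toward the other modality.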


Subjects
Judgment, Visual Perception, Humans, Judgment/physiology, Visual Perception/physiology, Reproducibility of Results, Hand/physiology, Movement/physiology, Proprioception/physiology
3.
Eur J Neurosci ; 59(7): 1770-1788, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38230578

ABSTRACT

Studies on multisensory perception often focus on simplistic conditions in which a single stimulus is presented per modality. Yet, in everyday life, we usually encounter multiple signals per modality. To understand how multiple signals within and across the senses are combined, we extended the classical audio-visual spatial ventriloquism paradigm to combine two visual stimuli with one sound. The individual visual stimuli presented in the same trial differed in their relative timing and spatial offsets to the sound, allowing us to contrast their individual and combined influence on sound localization judgements. We find that the ventriloquism bias is not dominated by a single visual stimulus but rather is shaped by the collective multisensory evidence. In particular, the contribution of an individual visual stimulus to the ventriloquism bias depends not only on its own relative spatio-temporal alignment to the sound but also on the spatio-temporal alignment of the other visual stimulus. We propose that this pattern of multi-stimulus multisensory integration reflects the evolution of evidence for sensory causal relations during individual trials, highlighting the need to extend established models of multisensory causal inference to more naturalistic conditions. Our data also suggest that this pattern of multisensory interactions extends to the ventriloquism aftereffect, a bias in sound localization observed in unisensory judgements following a multisensory stimulus.


Subjects
Auditory Perception, Sound Localization, Acoustic Stimulation, Photic Stimulation, Visual Perception, Humans
4.
Neuroimage ; 273: 120093, 2023 06.
Article in English | MEDLINE | ID: mdl-37028733

ABSTRACT

Crossmodal correspondences describe our tendency to associate sensory features from different modalities with each other, such as the pitch of a sound with the size of a visual object. While such crossmodal correspondences (or associations) are described in many behavioural studies, their neurophysiological correlates remain unclear. Under the current working model of multisensory perception, both a low- and a high-level account seem plausible. That is, the neurophysiological processes shaping these associations could commence in low-level sensory regions, or may predominantly emerge in high-level association regions of semantic and object identification networks. We exploited steady-state visual evoked potentials (SSVEP) to directly probe this question, focusing on the associations between pitch and the visual features of size, hue or chromatic saturation. We found that SSVEPs over occipital regions are sensitive to the congruency between pitch and size, and a source analysis pointed to an origin around primary visual cortices. We speculate that this signature of the pitch-size association in low-level visual cortices reflects the successful pairing of congruent visual and acoustic object properties and may contribute to establishing causal relations between multisensory objects. Besides this, our study also provides a paradigm that can be exploited to study other crossmodal associations involving visual stimuli in the future.
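The core readout in SSVEP frequency-tagging paradigms is spectral power at the stimulation frequency. The sketch below illustrates this on synthetic data; the function name, sampling rate, and frequencies are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def ssvep_power(eeg, fs, tag_hz):
    """Spectral power at the tagging frequency, read from the trial's DFT."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - tag_hz))]

# Synthetic 4 s trial sampled at 250 Hz: a 7.5 Hz steady-state
# response buried in broadband noise.
fs, tag = 250, 7.5
t = np.arange(4 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * tag * t) + 0.5 * rng.standard_normal(t.size)
```

In a congruency design, such power estimates would be compared across congruent and incongruent pitch-size pairings at the same tag frequency.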


Subjects
Visual Cortex, Visual Perception, Humans, Visual Perception/physiology, Visual Evoked Potentials, Attention/physiology, Semantics, Photic Stimulation, Acoustic Stimulation
5.
J Neurophysiol ; 129(2): 465-478, 2023 02 01.
Article in English | MEDLINE | ID: mdl-36651909

ABSTRACT

Information about the position of our hand is provided by multisensory signals that are often not perfectly aligned. Discrepancies between the seen and felt hand position or its movement trajectory engage the processes of 1) multisensory integration, 2) sensory recalibration, and 3) motor adaptation, which adjust perception and behavioral responses to apparently discrepant signals. To foster our understanding of the coemergence of these three processes, we probed their short-term dependence on multisensory discrepancies in a visuomotor task that has previously served as a model for multisensory perception and motor control. We found that the well-established integration of discrepant visual and proprioceptive signals is tied to the immediate discrepancy and independent of the outcome of the integration of discrepant signals in immediately preceding trials. However, the strength of integration was context dependent, being stronger in an experiment featuring stimuli that covered a smaller range of visuomotor discrepancies (±15°) compared with one covering a larger range (±30°). Both sensory recalibration and motor adaptation for nonrepeated movement directions were absent after two bimodal trials with the same or opposite visuomotor discrepancies. Hence, our results suggest that short-term sensory recalibration and motor adaptation are not an obligatory consequence of the integration of preceding discrepant multisensory signals.

NEW & NOTEWORTHY The functional relation between multisensory integration and recalibration remains debated. We here refute the notion that they coemerge in an obligatory manner and support the hypothesis that they serve distinct goals of perception.


Subjects
Psychomotor Performance, Visual Perception, Visual Perception/physiology, Psychomotor Performance/physiology, Sensory Feedback/physiology, Proprioception/physiology, Physiological Adaptation/physiology
6.
Eur J Neurosci ; 58(5): 3253-3269, 2023 09.
Article in English | MEDLINE | ID: mdl-37461244

ABSTRACT

Perceptual coherence in the face of discrepant multisensory signals is achieved via the processes of multisensory integration, recalibration and sometimes motor adaptation. These supposedly operate on different time scales, with integration reducing immediate sensory discrepancies and recalibration and motor adaptation reflecting the cumulative influence of their recent history. Importantly, whether discrepant signals are bound during perception is guided by the brain's inference of whether they originate from a common cause. When combined, these two notions lead to the hypothesis that the time scales on which integration and recalibration (or motor adaptation) operate are associated with different time scales of evidence about a common cause underlying two signals. We tested this prediction in a well-established visuo-motor paradigm, in which human participants performed visually guided hand movements. The kinematic correlation between hand and cursor movements indicates their common origin, which allowed us to manipulate the common-cause evidence by titrating this correlation. Specifically, we dissociated hand and cursor signals during individual movements while preserving their correlation across the series of movement endpoints. In line with our hypothesis, this manipulation reduced integration compared with a condition in which visual and proprioceptive signals were perfectly correlated. In contrast, recalibration and motor adaptation were not affected by this manipulation. This supports the notion that multisensory integration and recalibration deal with sensory discrepancies on different time scales guided by common-cause evidence: integration is prompted by local common-cause evidence and reduces immediate discrepancies, whereas recalibration and motor adaptation are prompted by global common-cause evidence and reduce persistent discrepancies.


Subjects
Proprioception, Visual Perception, Humans, Physiological Adaptation, Movement, Hand, Psychomotor Performance
7.
J Neurosci ; 41(5): 1068-1079, 2021 02 03.
Article in English | MEDLINE | ID: mdl-33273069

ABSTRACT

Our senses often receive conflicting multisensory information, which our brain reconciles by adaptive recalibration. A classic example is the ventriloquism aftereffect, which emerges following both cumulative (long-term) and trial-wise exposure to spatially discrepant multisensory stimuli. Despite the importance of such adaptive mechanisms for interacting with environments that change over multiple timescales, it remains debated whether the ventriloquism aftereffects observed following trial-wise and cumulative exposure arise from the same neurophysiological substrate. We address this question by probing electroencephalography recordings from healthy humans (both sexes) for processes predictive of the aftereffect biases following the exposure to spatially offset audiovisual stimuli. Our results support the hypothesis that discrepant multisensory evidence shapes aftereffects on distinct timescales via common neurophysiological processes reflecting sensory inference and memory in parietal-occipital regions, while the cumulative exposure to consistent discrepancies additionally recruits prefrontal processes. During the subsequent unisensory trial, both trial-wise and cumulative exposure bias the encoding of the acoustic information, but do so distinctly. Our results posit a central role of parietal regions in shaping multisensory spatial recalibration, suggest that frontal regions consolidate the behavioral bias for persistent multisensory discrepancies, but also show that trial-wise and cumulative exposure bias sound position encoding via distinct neurophysiological processes.

SIGNIFICANCE STATEMENT Our brain easily reconciles conflicting multisensory information, such as seeing an actress on screen while hearing her voice over headphones. These adaptive mechanisms exert a persistent influence on the perception of subsequent unisensory stimuli, known as the ventriloquism aftereffect. While this aftereffect emerges following trial-wise or cumulative exposure to multisensory discrepancies, it remained unclear whether both arise from a common neural substrate. We here test this hypothesis using human electroencephalography recordings. Our data suggest that parietal regions involved in multisensory processing and spatial memory mediate the aftereffect following both trial-wise and cumulative adaptation, but also show that additional and distinct processes are involved in consolidating and implementing the aftereffect following prolonged exposure.


Subjects
Acoustic Stimulation/methods, Parietal Lobe/physiology, Photic Stimulation/methods, Psychomotor Performance/physiology, Sound Localization/physiology, Visual Perception/physiology, Adult, Auditory Perception/physiology, Electroencephalography/methods, Female, Humans, Male, Young Adult
8.
J Neurosci ; 40(17): 3443-3454, 2020 04 22.
Article in English | MEDLINE | ID: mdl-32179571

ABSTRACT

Biases in sensory perception can arise from both experimental manipulations and personal trait-like features. These idiosyncratic biases and their neural underpinnings are often overlooked in studies on the physiology underlying perception. A potential candidate mechanism reflecting such idiosyncratic biases could be spontaneous alpha band activity, a prominent brain rhythm known to influence perceptual reports in general. Using a temporal order judgment task, we here tested the hypothesis that alpha power reflects the overcoming of an idiosyncratic bias. Importantly, to understand the interplay between idiosyncratic biases and contextual (temporary) biases induced by experimental manipulations, we quantified this relation before and after temporal recalibration. Using EEG recordings in human participants (male and female), we find that prestimulus frontal alpha power correlates with the tendency to respond relative to an individual's idiosyncratic bias, with stronger alpha power leading to responses matching the bias. In contrast, alpha power does not predict response correctness. These results also hold after temporal recalibration and are specific to the alpha band, suggesting that alpha band activity reflects, directly or indirectly, processes that help to overcome an individual's momentary bias in perception. We propose that, combined with the established role of parietal alpha in the encoding of sensory information, frontal alpha reflects complementary mechanisms influencing perceptual decisions.

SIGNIFICANCE STATEMENT The brain is a biased organ, frequently generating systematically distorted percepts of the world, leading each of us to evolve in our own subjective reality. However, such biases are often overlooked or considered noise when studying the neural mechanisms underlying perception. We show that spontaneous alpha band activity predicts the degree of bias in human choices in a time perception task, suggesting that alpha activity indexes processes needed to overcome an individual's idiosyncratic bias. This result provides a window onto the neural underpinnings of subjective perception, and offers the possibility to quantify or manipulate such priors in future studies.


Subjects
Alpha Rhythm/physiology, Auditory Perception/physiology, Brain/physiology, Time Perception/physiology, Visual Perception/physiology, Adult, Electroencephalography, Female, Humans, Individuality, Male, Young Adult
9.
Neuroimage ; 233: 117958, 2021 06.
Article in English | MEDLINE | ID: mdl-33744458

ABSTRACT

The representation of speech in the brain is often examined by measuring the alignment of rhythmic brain activity to the speech envelope. To conveniently quantify this alignment (termed 'speech tracking'), many studies consider the broadband speech envelope, which combines acoustic fluctuations across the spectral range. Using EEG recordings, we show that relying on this broadband envelope can provide a distorted picture of speech encoding. We systematically investigated the encoding of spectrally limited speech-derived envelopes, presented via individual and multiple noise carriers, in the human brain. Tracking in the 1 to 6 Hz EEG bands differentially reflected low (0.2-0.83 kHz) and high (2.66-8 kHz) frequency speech-derived envelopes. This was independent of the specific carrier frequency but sensitive to attentional manipulations, and may reflect the context-dependent emphasis of information from distinct spectral ranges of the speech envelope in low frequency brain activity. As low and high frequency speech envelopes relate to distinct phonemic features, our results suggest that functionally distinct processes contribute to speech tracking in the same EEG bands, and are easily confounded when considering the broadband speech envelope.
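The contrast between broadband and band-limited envelopes can be illustrated by extracting the envelope of one spectral band of a signal. This is a generic sketch (FFT band-pass, rectification, smoothing), not the study's pipeline, and `band_envelope` is a hypothetical helper:

```python
import numpy as np

def band_envelope(signal, fs, lo_hz, hi_hz, smooth_hz=10.0):
    """Envelope of a band-limited version of `signal`: band-pass via
    FFT masking, rectification, then moving-average smoothing."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < lo_hz) | (freqs > hi_hz)] = 0.0
    band = np.fft.irfft(spec, n=len(signal))
    win = max(1, int(fs / smooth_hz))
    return np.convolve(np.abs(band), np.ones(win) / win, mode="same")

# A pure 300 Hz tone has energy only in the low band, so its
# high-band envelope is essentially zero while a broadband envelope
# would lump both bands together.
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 300 * t)
low_env = band_envelope(tone, fs, 200.0, 800.0)
high_env = band_envelope(tone, fs, 2000.0, 8000.0)
```

Speech-tracking analyses would then correlate such band-limited envelopes, rather than their broadband sum, with low-frequency EEG activity.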


Subjects
Acoustic Stimulation/methods, Brain Mapping/methods, Brain/physiology, Delta Rhythm/physiology, Speech Perception/physiology, Theta Rhythm/physiology, Adult, Brain/diagnostic imaging, Electroencephalography/methods, Female, Humans, Male, Speech/physiology, Young Adult
10.
PLoS Biol ; 16(3): e2004473, 2018 03.
Article in English | MEDLINE | ID: mdl-29529019

ABSTRACT

During online speech processing, our brain tracks the acoustic fluctuations in speech at different timescales. Previous research has focused on generic timescales (for example, delta or theta bands) that are assumed to map onto linguistic features such as prosody or syllables. However, given the high intersubject variability in speaking patterns, such a generic association between the timescales of brain activity and speech properties can be ambiguous. Here, we analyse speech tracking in source-localised magnetoencephalographic data by directly focusing on timescales extracted from statistical regularities in our speech material. This revealed widespread significant tracking at the timescales of phrases (0.6-1.3 Hz), words (1.8-3 Hz), syllables (2.8-4.8 Hz), and phonemes (8-12.4 Hz). Importantly, when examining its perceptual relevance, we found stronger tracking for correctly comprehended trials in the left premotor (PM) cortex at the phrasal scale as well as in left middle temporal cortex at the word scale. Control analyses using generic bands confirmed that these effects were specific to the speech regularities in our stimuli. Furthermore, we found that the phase at the phrasal timescale coupled to power at beta frequency (13-30 Hz) in motor areas. This cross-frequency coupling presumably reflects top-down temporal prediction in ongoing speech perception. Together, our results reveal specific functional and perceptually relevant roles of distinct tracking and cross-frequency processes along the auditory-motor pathway.
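The phrasal-phase-to-beta-power coupling reported here is a form of cross-frequency phase-amplitude coupling. A common way to quantify it is the mean-vector-length modulation index (Canolty-style); the sketch below on synthetic data is illustrative, not the study's analysis, and both function names are hypothetical:

```python
import numpy as np

def analytic(x):
    """Analytic signal via the FFT (the same construction as
    scipy.signal.hilbert)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(np.fft.fft(x) * h)

def pac_mvl(low_band, high_band):
    """Mean-vector-length phase-amplitude coupling: high-band amplitude
    weighted by the low-band phase."""
    phase = np.angle(analytic(low_band))
    amp = np.abs(analytic(high_band))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

# 4 s at 200 Hz: a 20 Hz signal whose amplitude waxes and wanes with
# the phase of a 1 Hz rhythm (coupled) versus constant amplitude (flat).
fs = 200
t = np.arange(4 * fs) / fs
slow = np.sin(2 * np.pi * 1.0 * t)
coupled = (1.0 + slow) * np.sin(2 * np.pi * 20.0 * t)
flat = np.sin(2 * np.pi * 20.0 * t)
```

A genuinely coupled pair yields a clearly non-zero modulation index, whereas an amplitude-constant pair averages out to near zero.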


Subjects
Auditory Cortex/physiology, Motor Cortex/physiology, Speech Perception, Speech, Acoustic Stimulation, Adolescent, Adult, Brain Mapping, Female, Humans, Magnetoencephalography, Male
11.
Neuroimage ; 186: 22-32, 2019 02 01.
Article in English | MEDLINE | ID: mdl-30391564

ABSTRACT

As we get older, perception in cluttered environments becomes increasingly difficult as a result of changes in peripheral and central neural processes. Given the aging society, it is important to understand the neural mechanisms constraining perception in the elderly. In young participants, the state of rhythmic brain activity prior to a stimulus has been shown to modulate the neural encoding and perceptual impact of this stimulus - yet it remains unclear whether, and if so, how, the perceptual relevance of pre-stimulus activity changes with age. Using the auditory system as a model, we recorded EEG activity during a frequency discrimination task from younger and older human listeners. By combining single-trial EEG decoding with linear modelling we demonstrate consistent statistical relations between pre-stimulus power and the encoding of sensory evidence in short-latency EEG components, and more variable relations between pre-stimulus phase and subjects' decisions in longer-latency components. At the same time, we observed a significant slowing of auditory evoked responses and a flattening of the overall EEG frequency spectrum in the older listeners. Our results point to mechanistically consistent relations between rhythmic brain activity and sensory encoding that emerge despite changes in neural response latencies and the relative amplitude of rhythmic brain activity with age.


Subjects
Aging/physiology, Auditory Perception/physiology, Brain Waves/physiology, Cerebral Cortex/physiology, Auditory Evoked Potentials/physiology, Human Development/physiology, Adult, Aged, Psychological Discrimination/physiology, Female, Humans, Male, Middle Aged, Young Adult
12.
Proc Natl Acad Sci U S A ; 113(17): 4842-7, 2016 Apr 26.
Article in English | MEDLINE | ID: mdl-27071110

ABSTRACT

The qualities of perception depend not only on the sensory inputs but also on the brain state before stimulus presentation. Although the collective evidence from neuroimaging studies for a relation between prestimulus state and perception is strong, the interpretation in the context of sensory computations or decision processes has remained difficult. In the auditory system, for example, previous studies have reported a wide range of effects in terms of the perceptually relevant frequency bands and state parameters (phase/power). To dissociate influences of state on earlier sensory representations and higher-level decision processes, we collected behavioral and EEG data in human participants performing two auditory discrimination tasks relying on distinct acoustic features. Using single-trial decoding, we quantified the relation between prestimulus activity, relevant sensory evidence, and choice in different task-relevant EEG components. Within auditory networks, we found that phase had no direct influence on choice, whereas power in task-specific frequency bands affected the encoding of sensory evidence. Within later-activated frontoparietal regions, theta and alpha phase had a direct influence on choice, without involving sensory evidence. These results delineate two consistent mechanisms by which prestimulus activity shapes perception. However, the timescales of the relevant neural activity depend on the specific brain regions engaged by the respective task.


Subjects
Acoustic Stimulation, Brain Mapping, Pitch Discrimination/physiology, Pitch Perception/physiology, Adult, Alpha Rhythm, Decision Making, Electroencephalography, Female, Humans, Male, Theta Rhythm, Time Factors, Young Adult
13.
Nat Rev Neurosci ; 14(11): 770-85, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24135696

ABSTRACT

The past decade has witnessed a renewed interest in cortical local field potentials (LFPs)--that is, extracellularly recorded potentials with frequencies of up to ~500 Hz. This is due to both the advent of multielectrodes, which has enabled recording of LFPs at tens to hundreds of sites simultaneously, and the insight that LFPs offer a unique window into key integrative synaptic processes in cortical populations. However, owing to its numerous potential neural sources, the LFP is more difficult to interpret than are spikes. Careful mathematical modelling and analysis are needed to take full advantage of the opportunities that this signal offers in understanding signal processing in cortical circuits and, ultimately, the neural basis of perception and cognition.


Subjects
Cerebral Cortex/physiology, Evoked Potentials/physiology, Neurological Models, Neural Pathways/physiology, Action Potentials, Algorithms, Animals, Electroencephalography, Electrophysiological Phenomena, Humans, Nerve Net/cytology, Nerve Net/physiology, Neurons/physiology, Synapses/physiology
14.
PLoS Biol ; 13(2): e1002075, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25710476

ABSTRACT

At any given moment, our brain processes multiple inputs from its different sensory modalities (vision, hearing, touch, etc.). In deciphering this array of sensory information, the brain has to solve two problems: (1) which of the inputs originate from the same object and should be integrated and (2) for the sensations originating from the same object, how best to integrate them. Recent behavioural studies suggest that the human brain solves these problems using optimal probabilistic inference, known as Bayesian causal inference. However, how and where the underlying computations are carried out in the brain have remained unknown. By combining neuroimaging-based decoding techniques and computational modelling of behavioural data, a new study now sheds light on how multisensory causal inference maps onto specific brain areas. The results suggest that the complexity of neural computations increases along the visual hierarchy and link specific components of the causal inference process with specific visual and parietal regions.
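The Bayesian causal inference model referred to here has a closed-form Gaussian version (Körding et al., 2007): the posterior probability that two sensory samples share one cause is computed from their discrepancy and the sensory and prior variances. The sketch below implements that textbook model, not the reviewed study's code, and all parameter values are illustrative:

```python
import math

def norm_pdf(x, mean, var):
    """Density of a univariate Gaussian."""
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

def p_common(xv, xa, var_v, var_a, var_p=100.0, mu_p=0.0, prior_c1=0.5):
    """Posterior probability that a visual sample xv and an auditory
    sample xa arose from one common source, using the closed-form
    Gaussian causal-inference model."""
    d = var_v * var_a + var_v * var_p + var_a * var_p
    quad = ((xv - xa) ** 2 * var_p
            + (xv - mu_p) ** 2 * var_a
            + (xa - mu_p) ** 2 * var_v)
    like_c1 = math.exp(-0.5 * quad / d) / (2 * math.pi * math.sqrt(d))
    like_c2 = norm_pdf(xv, mu_p, var_v + var_p) * norm_pdf(xa, mu_p, var_a + var_p)
    return like_c1 * prior_c1 / (like_c1 * prior_c1 + like_c2 * (1 - prior_c1))
```

Coincident samples yield a high probability of a common cause (and thus integration), while widely discrepant samples favor separate causes (segregation).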


Subjects
Auditory Perception/physiology, Neurological Models, Nerve Net/physiology, Neural Pathways/physiology, Visual Perception/physiology, Acoustic Stimulation, Auditory Cortex/anatomy & histology, Auditory Cortex/physiology, Bayes Theorem, Brain Mapping, Cognition/physiology, Humans, Magnetic Resonance Imaging, Parietal Lobe/anatomy & histology, Parietal Lobe/physiology, Photic Stimulation, Psychophysics, Visual Cortex/anatomy & histology, Visual Cortex/physiology
15.
Proc Natl Acad Sci U S A ; 112(1): 273-8, 2015 Jan 06.
Article in English | MEDLINE | ID: mdl-25535356

ABSTRACT

When social animals communicate, the onset of informative content in one modality varies considerably relative to the other, such as when visual orofacial movements precede a vocalization. These naturally occurring asynchronies do not disrupt intelligibility or perceptual coherence. However, they occur on time scales where they likely affect integrative neuronal activity in ways that have remained unclear, especially for hierarchically downstream regions in which neurons exhibit temporally imprecise but highly selective responses to communication signals. To address this, we exploited naturally occurring face- and voice-onset asynchronies in primate vocalizations. Using these as stimuli we recorded cortical oscillations and neuronal spiking responses from functional MRI (fMRI)-localized voice-sensitive cortex in the anterior temporal lobe of macaques. We show that the onset of the visual face stimulus resets the phase of low-frequency oscillations, and that the face-voice asynchrony affects the prominence of two key types of neuronal multisensory responses: enhancement or suppression. Our findings show a three-way association between temporal delays in audiovisual communication signals, phase-resetting of ongoing oscillations, and the sign of multisensory responses. The results reveal how natural onset asynchronies in cross-sensory inputs regulate network oscillations and neuronal excitability in the voice-sensitive cortex of macaques, a suggested animal model for human voice areas. These findings also advance predictions on the impact of multisensory input on neuronal processes in face areas and other brain regions.


Subjects
Animal Communication, Auditory Perception/physiology, Brain/physiology, Macaca mulatta/physiology, Neurons/physiology, Sensation, Visual Perception/physiology, Voice, Animals, Visual Evoked Potentials, Humans, Male
16.
Neuroimage ; 148: 31-41, 2017 03 01.
Article in English | MEDLINE | ID: mdl-28082107

ABSTRACT

Sensory discriminations, such as judgements about visual motion, often benefit from multisensory evidence. Despite many reports of enhanced brain activity during multisensory conditions, it remains unclear which dynamic processes implement the multisensory benefit for an upcoming decision in the human brain. Specifically, it remains difficult to attribute perceptual benefits to specific processes, such as early sensory encoding, the transformation of sensory representations into a motor response, or more unspecific processes such as attention. We combined an audio-visual motion discrimination task with the single-trial mapping of dynamic sensory representations in EEG activity to localize when and where multisensory congruency facilitates perceptual accuracy. Our results show that a congruent sound facilitates the encoding of motion direction in occipital sensory - as opposed to parieto-frontal - cortices, and facilitates later - as opposed to early (i.e. below 100 ms) - sensory activations. This multisensory enhancement was visible as an earlier rise of motion-sensitive activity in middle-occipital regions about 350 ms after stimulus onset, which reflected the better discriminability of motion direction from brain activity and correlated with the perceptual benefit provided by congruent multisensory information. This supports a hierarchical model of multisensory integration in which the enhancement of relevant sensory cortical representations is transformed into a more accurate choice.
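Single-trial decoding of a stimulus feature from multichannel activity, as used here, can be caricatured by a cross-validated linear classifier. The sketch below uses a nearest-class-mean decoder on synthetic data as a simplified stand-in; it is not the study's method, and `decode_accuracy` is a hypothetical helper:

```python
import numpy as np

def decode_accuracy(X, y, n_folds=5):
    """Cross-validated accuracy of a nearest-class-mean decoder over
    trials (rows of X are trials, columns are channels/features)."""
    rng = np.random.default_rng(1)
    order = rng.permutation(len(y))
    correct = 0
    for fold in np.array_split(order, n_folds):
        train = np.setdiff1d(order, fold)
        means = {c: X[train][y[train] == c].mean(axis=0)
                 for c in np.unique(y[train])}
        for i in fold:
            pred = min(means, key=lambda c: np.linalg.norm(X[i] - means[c]))
            correct += int(pred == y[i])
    return correct / len(y)
```

In an analysis like the one described, such decoding accuracy (or the underlying decision values) would be computed per time point to track when motion direction becomes discriminable from brain activity.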


Subjects
Psychological Discrimination/physiology, Motion Perception/physiology, Occipital Lobe/physiology, Sound, Visual Perception/physiology, Acoustic Stimulation, Alpha Rhythm/physiology, Brain Mapping, Electroencephalography, Female, Humans, Male, Psychomotor Performance/physiology, Young Adult
17.
Neuroimage ; 147: 32-42, 2017 02 15.
Article in English | MEDLINE | ID: mdl-27903440

ABSTRACT

The timing of slow auditory cortical activity aligns to the rhythmic fluctuations in speech. This entrainment is considered to be a marker of the prosodic and syllabic encoding of speech, and has been shown to correlate with intelligibility. Yet, whether and how auditory cortical entrainment is influenced by the activity in other speech-relevant areas remains unknown. Using source-localized MEG data, we quantified the dependency of auditory entrainment on the state of oscillatory activity in fronto-parietal regions. We found that delta band entrainment interacted with the oscillatory activity in three distinct networks. First, entrainment in the left anterior superior temporal gyrus (STG) was modulated by beta power in orbitofrontal areas, possibly reflecting predictive top-down modulations of auditory encoding. Second, entrainment in the left Heschl's Gyrus and anterior STG was dependent on alpha power in central areas, in line with the importance of motor structures for phonological analysis. And third, entrainment in the right posterior STG modulated theta power in parietal areas, consistent with the engagement of semantic memory. These results illustrate the topographical network interactions of auditory delta entrainment and reveal distinct cross-frequency mechanisms by which entrainment can interact with different cognitive processes underlying speech perception.


Subjects
Auditory Cortex/physiology , Delta Rhythm/physiology , Frontal Lobe/physiology , Magnetoencephalography , Parietal Lobe/physiology , Acoustic Stimulation , Adult , Alpha Rhythm/physiology , Beta Rhythm/physiology , Female , Humans , Male , Nerve Net/physiology , Speech Perception/physiology , Temporal Lobe/physiology , Theta Rhythm/physiology , Young Adult
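One simple way to quantify how entrainment depends on the oscillatory state of another region, in the spirit of the analysis above, is to bin epochs by band-limited power in the modulating region and compute phase locking within each bin. This is only a rough illustrative sketch, not the authors' source-localized MEG pipeline:

```python
import numpy as np

def entrainment_by_power_state(phase_diff, power, n_bins=3):
    """Phase locking of the brain-stimulus phase difference, computed
    separately within bins of oscillatory power in a modulating region.

    phase_diff : (epochs,) phase difference (radians) between auditory
                 activity and the speech envelope in the entrained band
    power      : (epochs,) band-limited power in the modulating region
    Returns one inter-trial phase coherence value per power bin.
    """
    order = np.argsort(power)                 # epochs sorted by power
    bins = np.array_split(order, n_bins)      # low- to high-power bins
    # Resultant vector length of the phase differences within each bin
    return np.array([np.abs(np.exp(1j * phase_diff[b]).mean()) for b in bins])
```

A systematic increase (or decrease) of phase coherence across power bins would indicate that entrainment depends on the oscillatory state of the other region.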
18.
Eur J Neurosci ; 46(10): 2565-2577, 2017 Nov.
Article in English | MEDLINE | ID: mdl-28940728

ABSTRACT

To make accurate perceptual estimates, observers must take the reliability of sensory information into account. Despite many behavioural studies showing that subjects weight individual sensory cues in proportion to their reliabilities, it is still unclear when during a trial neuronal responses are modulated by the reliability of sensory information or when they reflect the perceptual weights attributed to each sensory input. We investigated these questions using a combination of psychophysics, EEG-based neuroimaging and single-trial decoding. Our results show that the weighted integration of sensory information in the brain is a dynamic process; effects of sensory reliability on task-relevant EEG components were evident 84 ms after stimulus onset, while neural correlates of perceptual weights emerged 120 ms after stimulus onset. These neural processes had different underlying sources, arising from sensory and parietal regions, respectively. Together these results reveal the temporal dynamics of perceptual and neural audio-visual integration and support the notion of temporally early and functionally specific multisensory processes in the brain.


Subjects
Auditory Perception/physiology , Brain/physiology , Visual Perception/physiology , Acoustic Stimulation , Adult , Choice Behavior , Discriminant Analysis , Electroencephalography , Female , Humans , Male , Photic Stimulation , Psychophysics , Young Adult
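The reliability-weighted integration referenced above has a standard closed form: each cue is weighted by its inverse variance (its reliability), normalized by the summed reliabilities. A minimal sketch of that textbook rule; the function name and example values are illustrative, not taken from the study:

```python
import numpy as np

def reliability_weighted_estimate(cues, variances):
    """Maximum-likelihood combination of independent sensory cues.

    Each cue is weighted by its reliability (1/variance) normalized by the
    summed reliabilities; the combined estimate has variance equal to the
    inverse of the summed reliabilities (never worse than the best cue).
    """
    cues = np.asarray(cues, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    estimate = np.dot(weights, cues)
    combined_variance = 1.0 / reliabilities.sum()
    return estimate, weights, combined_variance

# A reliable auditory cue (variance 1) and a noisy visual cue (variance 4):
est, w, var = reliability_weighted_estimate([10.0, 14.0], [1.0, 4.0])
# weights are 0.8 and 0.2, the estimate is 10.8, the combined variance 0.8
```

Behavioural studies of the kind cited above test whether subjects' empirical cue weights track these reliability-derived predictions.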
19.
Hum Brain Mapp ; 38(3): 1541-1573, 2017 03.
Article in English | MEDLINE | ID: mdl-27860095

ABSTRACT

We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541-1573, 2017. © 2016 Wiley Periodicals, Inc.


Subjects
Brain Mapping , Brain/diagnostic imaging , Brain/physiology , Information Theory , Neuroimaging/methods , Normal Distribution , Computer Simulation , Electroencephalography , Entropy , Humans , Sensitivity and Specificity
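The core of the estimator sketched in this abstract, combining a copula rank-transform with the closed-form entropy of Gaussian variables, fits in a few lines. This is a simplified, bias-uncorrected sketch of the idea, assuming continuous data without ties; it is not the open-source Matlab/Python toolbox the article provides:

```python
import numpy as np
from scipy.stats import norm

def copnorm(x):
    """Copula-normalization: rank each row, map ranks into (0, 1), then
    through the inverse standard-normal CDF, so every row is marginally
    Gaussian while its rank (copula) structure is preserved."""
    x = np.atleast_2d(x)
    ranks = x.argsort(axis=1).argsort(axis=1) + 1.0
    return norm.ppf(ranks / (x.shape[1] + 1))

def gcmi(x, y):
    """Mutual information estimate (bits) between x and y via the
    closed-form Gaussian entropy applied to copula-normalized data.

    Uses I(X;Y) = H(X) + H(Y) - H(X,Y); the (2*pi*e)^d factors of the
    Gaussian entropies cancel, leaving only log-determinants of the
    covariance matrices."""
    cx, cy = copnorm(x), copnorm(y)
    def logdet(z):
        return np.linalg.slogdet(np.atleast_2d(np.cov(z)))[1]
    joint = np.vstack([cx, cy])
    return 0.5 * (logdet(cx) + logdet(cy) - logdet(joint)) / np.log(2)
```

Because only the rank structure of the data enters, the estimate is robust to monotonic transformations of each variable, which is one of the properties the abstract emphasizes.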
20.
J Neurosci ; 35(20): 7750-62, 2015 May 20.
Article in English | MEDLINE | ID: mdl-25995464

ABSTRACT

The phase of low-frequency network activity in the auditory cortex captures changes in neural excitability, entrains to the temporal structure of natural sounds, and correlates with the perceptual performance in acoustic tasks. Although these observations suggest a causal link between network rhythms and perception, it remains unknown how precisely they affect the processes by which neural populations encode sounds. We addressed this question by analyzing neural responses in the auditory cortex of anesthetized rats using stimulus-response models. These models included a parametric dependence on the phase of local field potential rhythms in both stimulus-unrelated background activity and the stimulus-response transfer function. We found that phase-dependent models better reproduced the observed responses than static models, during both stimulation with a series of natural sounds and epochs of silence. This was attributable to two factors: (1) phase-dependent variations in background firing (most prominent for delta; 1-4 Hz); and (2) modulations of response gain that rhythmically amplify and attenuate the responses at specific phases of the rhythm (prominent for frequencies between 2 and 12 Hz). These results provide a quantitative characterization of how slow auditory cortical rhythms shape sound encoding and suggest a differential contribution of network activity at different timescales. In addition, they highlight a putative mechanism that may implement the selective amplification of appropriately timed sound tokens relative to the phase of rhythmic auditory cortex activity.


Subjects
Auditory Cortex/physiology , Delta Rhythm , Models, Neurological , Animals , Auditory Perception , Male , Rats , Rats, Sprague-Dawley
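The two phase-dependent factors the abstract identifies, background firing that varies with phase and a response gain that rhythmically amplifies or attenuates the stimulus-driven response, can be written as a simple rate model. A sketch with arbitrary illustrative parameters, not the values fitted in the study:

```python
import numpy as np

def phase_dependent_rate(stim_drive, phase,
                         b0=5.0, b1=2.0,      # mean and depth of background modulation
                         g0=1.0, g1=0.5,      # mean and depth of gain modulation
                         phi_b=0.0, phi_g=np.pi / 2):  # preferred phases
    """Firing rate with phase-dependent background and multiplicative gain:

        rate = [b0 + b1*cos(phase - phi_b)]            # background firing
             + [g0 + g1*cos(phase - phi_g)] * drive    # gain on the stimulus
    """
    background = b0 + b1 * np.cos(phase - phi_b)
    gain = g0 + g1 * np.cos(phase - phi_g)
    return background + gain * stim_drive
```

In this form, the "static" models the abstract compares against correspond to setting the modulation depths b1 and g1 to zero, so that neither background firing nor gain depends on the local field potential phase.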