Results 1 - 20 of 36
1.
Neuroimage ; 233: 117958, 2021 06.
Article in English | MEDLINE | ID: mdl-33744458

ABSTRACT

The representation of speech in the brain is often examined by measuring the alignment of rhythmic brain activity to the speech envelope. To conveniently quantify this alignment (termed 'speech tracking'), many studies consider the broadband speech envelope, which combines acoustic fluctuations across the spectral range. Using EEG recordings, we show that relying on this broadband envelope can provide a distorted picture of speech encoding. We systematically investigated how the human brain encodes spectrally limited speech-derived envelopes presented on individual and multiple noise carriers. Tracking in the 1 to 6 Hz EEG bands differentially reflected low (0.2 - 0.83 kHz) and high (2.66 - 8 kHz) frequency speech-derived envelopes. This was independent of the specific carrier frequency but sensitive to attentional manipulations, and may reflect the context-dependent emphasis of information from distinct spectral ranges of the speech envelope in low frequency brain activity. As low and high frequency speech envelopes relate to distinct phonemic features, our results suggest that functionally distinct processes contribute to speech tracking in the same EEG bands and are easily confounded when considering the broadband speech envelope.
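The band-limited envelope analysis described in this abstract can be illustrated with a minimal sketch (toy signal; the band edges follow the abstract, but the signal, sampling rate, and filter settings are illustrative assumptions, not the authors' pipeline):

```python
import numpy as np
from scipy.signal import butter, hilbert, sosfiltfilt

def band_envelope(x, fs, low, high, order=4):
    """Bandpass-filter x and return its amplitude envelope
    (magnitude of the analytic signal)."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return np.abs(hilbert(sosfiltfilt(sos, x)))

fs = 22050                      # hypothetical sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
# toy "speech": low- and high-frequency carriers with independent slow modulations
x = (1 + np.sin(2 * np.pi * 3 * t)) * np.sin(2 * np.pi * 500 * t) \
    + (1 + np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 4000 * t)

low_env = band_envelope(x, fs, 200, 830)     # 0.2 - 0.83 kHz band
high_env = band_envelope(x, fs, 2660, 8000)  # 2.66 - 8 kHz band
broad_env = np.abs(hilbert(x))               # broadband envelope mixes both bands
```

Here the broadband envelope conflates the two independent slow modulations (3 Hz and 5 Hz) that the two band-limited envelopes keep separate, which is the confound the abstract warns about.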


Subject(s)
Acoustic Stimulation/methods, Brain Mapping/methods, Brain/physiology, Delta Rhythm/physiology, Speech Perception/physiology, Theta Rhythm/physiology, Adult, Brain/diagnostic imaging, Electroencephalography/methods, Female, Humans, Male, Speech/physiology, Young Adult
2.
Sci Rep ; 11(1): 2370, 2021 01 27.
Article in English | MEDLINE | ID: mdl-33504860

ABSTRACT

Many studies speak in favor of a rhythmic mode of listening, by which the encoding of acoustic information is structured by rhythmic neural processes at the time scale of about 1 to 4 Hz. Indeed, psychophysical data suggest that humans sample acoustic information in extended soundscapes not uniformly, but weigh the evidence at different moments for their perceptual decision at the time scale of about 2 Hz. We here test the critical prediction that such rhythmic perceptual sampling is directly related to the state of ongoing brain activity prior to the stimulus. Human participants judged the direction of frequency sweeps in 1.2 s long soundscapes while their EEG was recorded. We computed the perceptual weights attributed to different epochs within these soundscapes contingent on the phase or power of pre-stimulus EEG activity. This revealed a direct link between 4 Hz EEG phase and power prior to the stimulus and the phase of the rhythmic component of these perceptual weights. Hence, the temporal pattern by which the acoustic information is sampled over time for behavior is directly related to pre-stimulus brain activity in the delta/theta band. These results close a gap in the mechanistic picture linking ongoing delta band activity with its role in shaping the segmentation and perceptual influence of subsequent acoustic information.


Subject(s)
Acoustic Stimulation, Auditory Cortex/physiology, Auditory Perception, Delta Rhythm, Electroencephalography, Theta Rhythm, Adult, Data Analysis, Female, Humans, Male, Young Adult
3.
J Neurosci ; 41(5): 1068-1079, 2021 02 03.
Article in English | MEDLINE | ID: mdl-33273069

ABSTRACT

Our senses often receive conflicting multisensory information, which our brain reconciles by adaptive recalibration. A classic example is the ventriloquism aftereffect, which emerges following both cumulative (long-term) and trial-wise exposure to spatially discrepant multisensory stimuli. Despite the importance of such adaptive mechanisms for interacting with environments that change over multiple timescales, it remains debated whether the ventriloquism aftereffects observed following trial-wise and cumulative exposure arise from the same neurophysiological substrate. We address this question by probing electroencephalography recordings from healthy humans (both sexes) for processes predictive of the aftereffect biases following the exposure to spatially offset audiovisual stimuli. Our results support the hypothesis that discrepant multisensory evidence shapes aftereffects on distinct timescales via common neurophysiological processes reflecting sensory inference and memory in parietal-occipital regions, while the cumulative exposure to consistent discrepancies additionally recruits prefrontal processes. During the subsequent unisensory trial, both trial-wise and cumulative exposure bias the encoding of the acoustic information, but do so distinctly. Our results posit a central role of parietal regions in shaping multisensory spatial recalibration, suggest that frontal regions consolidate the behavioral bias for persistent multisensory discrepancies, but also show that the trial-wise and cumulative exposure bias sound position encoding via distinct neurophysiological processes. SIGNIFICANCE STATEMENT: Our brain easily reconciles conflicting multisensory information, such as seeing an actress on screen while hearing her voice over headphones. These adaptive mechanisms exert a persistent influence on the perception of subsequent unisensory stimuli, known as the ventriloquism aftereffect. While this aftereffect emerges following trial-wise or cumulative exposure to multisensory discrepancies, it remained unclear whether both arise from a common neural substrate. We here test this hypothesis using human electroencephalography recordings. Our data suggest that parietal regions involved in multisensory and spatial memory mediate the aftereffect following both trial-wise and cumulative adaptation, but also show that additional and distinct processes are involved in consolidating and implementing the aftereffect following prolonged exposure.


Subject(s)
Acoustic Stimulation/methods, Parietal Lobe/physiology, Photic Stimulation/methods, Psychomotor Performance/physiology, Sound Localization/physiology, Visual Perception/physiology, Adult, Auditory Perception/physiology, Electroencephalography/methods, Female, Humans, Male, Young Adult
4.
Nat Commun ; 11(1): 5440, 2020 10 28.
Article in English | MEDLINE | ID: mdl-33116148

ABSTRACT

Despite recent progress in understanding multisensory decision-making, a conclusive mechanistic account of how the brain translates the relevant evidence into a decision is lacking. Specifically, it remains unclear whether perceptual improvements during rapid multisensory decisions are best explained by sensory (i.e., 'Early') processing benefits or post-sensory (i.e., 'Late') changes in decision dynamics. Here, we employ a well-established visual object categorisation task in which early sensory and post-sensory decision evidence can be dissociated using multivariate pattern analysis of the electroencephalogram (EEG). We capitalize on these distinct neural components to identify when and how complementary auditory information influences the encoding of decision-relevant visual evidence in a multisensory context. We show that it is primarily the post-sensory, rather than the early sensory, EEG component amplitudes that are being amplified during rapid audiovisual decision-making. Using a neurally informed drift diffusion model we demonstrate that a multisensory behavioral improvement in accuracy arises from an enhanced quality of the relevant decision evidence, as captured by the post-sensory EEG component, consistent with the emergence of multisensory evidence in higher-order brain areas.
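The drift diffusion framework invoked in this abstract can be sketched as a plain simulation (illustrative parameters only, not the paper's neurally informed, fitted model): a higher drift rate stands in for the enhanced quality of decision evidence.

```python
import numpy as np

def simulate_ddm(drift, boundary=1.0, noise=1.0, dt=0.001, max_t=3.0, rng=None):
    """Simulate one drift-diffusion trial: noisy evidence accumulates at rate
    `drift` until it hits +boundary (correct) or -boundary (error); trials
    that never cross within max_t count as errors in this simple sketch."""
    rng = np.random.default_rng() if rng is None else rng
    x, t = 0.0, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return x >= boundary, t

rng = np.random.default_rng(0)
# higher drift (better evidence quality) -> more accurate and faster decisions
acc_low = np.mean([simulate_ddm(0.5, rng=rng)[0] for _ in range(500)])
acc_high = np.mean([simulate_ddm(2.0, rng=rng)[0] for _ in range(500)])
```

In this toy setting, raising the drift rate alone reproduces the qualitative behavioral improvement the abstract attributes to the post-sensory evidence component.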


Subject(s)
Auditory Perception/physiology, Decision Making/physiology, Visual Perception/physiology, Acoustic Stimulation, Adolescent, Adult, Choice Behavior/physiology, Electroencephalography/statistics & numerical data, Female, Humans, Male, Models, Neurological, Models, Psychological, Multivariate Analysis, Photic Stimulation, Young Adult
5.
Elife; 9, 2020 Aug 24.
Article in English | MEDLINE | ID: mdl-32831168

ABSTRACT

Visual speech carried by lip movements is an integral part of communication. Yet, it remains unclear to what extent visual and acoustic speech comprehension are mediated by the same brain regions. Using multivariate classification of full-brain MEG data, we first probed where the brain represents acoustically and visually conveyed word identities. We then tested where these sensory-driven representations are predictive of participants' trial-wise comprehension. The comprehension-relevant representations of auditory and visual speech converged only in anterior angular and inferior frontal regions and were spatially dissociated from those representations that best reflected the sensory-driven word identity. These results provide a neural explanation for the behavioural dissociation of acoustic and visual speech comprehension and suggest that cerebral representations encoding word identities may be more modality-specific than often assumed.
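The multivariate-classification logic used here can be illustrated with a minimal synthetic sketch (a nearest-centroid decoder on simulated multichannel trials; the trial counts, channel counts, and classifier are illustrative assumptions, not the study's actual decoder):

```python
import numpy as np

def nearest_centroid_cv(X, y, n_folds=5):
    """Cross-validated decoding with a nearest-class-centroid classifier:
    assign each held-out trial to the class whose training-set mean pattern
    is closest, and report mean accuracy across folds."""
    idx = np.arange(len(y))
    accs = []
    for test in np.array_split(idx, n_folds):
        train = np.setdiff1d(idx, test)
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        pred = (np.linalg.norm(X[test] - c1, axis=1)
                < np.linalg.norm(X[test] - c0, axis=1)).astype(int)
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

rng = np.random.default_rng(3)
n_trials, n_channels = 200, 32
labels = rng.integers(0, 2, n_trials)
pattern = rng.standard_normal(n_channels)   # condition-specific sensor topography
X = rng.standard_normal((n_trials, n_channels)) + np.outer(labels - 0.5, pattern)
acc = nearest_centroid_cv(X, labels)        # far above the 0.5 chance level here
```

Cross-validated accuracy above chance is the basic evidence that a region's multichannel activity carries the decoded stimulus information.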


Subject(s)
Brain/anatomy & histology, Brain/physiology, Phonetics, Speech, Acoustic Stimulation/methods, Adolescent, Adult, Brain Mapping, Female, Humans, Photic Stimulation, Reading, Speech Perception, Young Adult
6.
Elife; 8, 2019 Jun 27.
Article in English | MEDLINE | ID: mdl-31246172

ABSTRACT

Perception adapts to mismatching multisensory information, both when different cues appear simultaneously and when they appear sequentially. While both multisensory integration and adaptive trial-by-trial recalibration are central for behavior, it remains unknown whether they are mechanistically linked and arise from a common neural substrate. To relate the neural underpinnings of sensory integration and recalibration, we measured whole-brain magnetoencephalography while human participants performed an audio-visual ventriloquist task. Using single-trial multivariate analysis, we localized the perceptually-relevant encoding of multisensory information within and between trials. While we found neural signatures of multisensory integration within temporal and parietal regions, only medial superior parietal activity encoded past and current sensory information and mediated the perceptual recalibration within and between trials. These results highlight a common neural substrate of sensory integration and perceptual recalibration, and reveal a role of medial parietal regions in linking present and previous multisensory evidence to guide adaptive behavior.


Subject(s)
Neurons/physiology, Perception/physiology, Sensation/physiology, Acoustic Stimulation, Auditory Perception, Behavior, Calibration, Eye Movements/physiology, Female, Humans, Magnetoencephalography, Male, Young Adult
7.
Neuron ; 102(5): 1076-1087.e8, 2019 06 05.
Article in English | MEDLINE | ID: mdl-31047778

ABSTRACT

When combining information across different senses, humans need to flexibly select cues of a common origin while avoiding distraction from irrelevant inputs. The brain could solve this challenge using a hierarchical principle by deriving rapidly a fused sensory estimate for computational expediency and, later and if required, filtering out irrelevant signals based on the inferred sensory cause(s). Analyzing time- and source-resolved human magnetoencephalographic data, we unveil a systematic spatiotemporal cascade of the relevant computations, starting with early segregated unisensory representations, continuing with sensory fusion in parietal-temporal regions, and culminating as causal inference in the frontal lobe. Our results reconcile previous computational accounts of multisensory perception by showing that prefrontal cortex guides flexible integrative behavior based on candidate representations established in sensory and association cortices, thereby framing multisensory integration in the generalized context of adaptive behavior.


Subject(s)
Auditory Perception/physiology, Decision Making/physiology, Frontal Lobe/physiology, Parietal Lobe/physiology, Temporal Lobe/physiology, Visual Perception/physiology, Acoustic Stimulation, Adult, Bayes Theorem, Female, Humans, Magnetoencephalography, Male, Models, Neurological, Models, Theoretical, Photic Stimulation, Prefrontal Cortex/physiology, Young Adult
8.
PLoS Biol ; 16(3): e2004473, 2018 03.
Article in English | MEDLINE | ID: mdl-29529019

ABSTRACT

During online speech processing, our brain tracks the acoustic fluctuations in speech at different timescales. Previous research has focused on generic timescales (for example, delta or theta bands) that are assumed to map onto linguistic features such as prosody or syllables. However, given the high intersubject variability in speaking patterns, such a generic association between the timescales of brain activity and speech properties can be ambiguous. Here, we analyse speech tracking in source-localised magnetoencephalographic data by directly focusing on timescales extracted from statistical regularities in our speech material. This revealed widespread significant tracking at the timescales of phrases (0.6-1.3 Hz), words (1.8-3 Hz), syllables (2.8-4.8 Hz), and phonemes (8-12.4 Hz). Importantly, when examining its perceptual relevance, we found stronger tracking for correctly comprehended trials in the left premotor (PM) cortex at the phrasal scale as well as in left middle temporal cortex at the word scale. Control analyses using generic bands confirmed that these effects were specific to the speech regularities in our stimuli. Furthermore, we found that the phase at the phrasal timescale coupled to power at beta frequency (13-30 Hz) in motor areas. This cross-frequency coupling presumably reflects top-down temporal prediction in ongoing speech perception. Together, our results reveal specific functional and perceptually relevant roles of distinct tracking and cross-frequency processes along the auditory-motor pathway.
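The phase-to-power (cross-frequency) coupling reported here can be sketched with a synthetic signal in which beta amplitude waxes and wanes with a slow phrasal-scale phase (toy frequencies and a simple mean-vector-length measure; not the authors' analysis):

```python
import numpy as np
from scipy.signal import butter, hilbert, sosfiltfilt

def bandpass(x, fs, low, high, order=2):
    """Zero-phase Butterworth bandpass filter."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

fs = 250
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(2)
# toy signal: beta (20 Hz) amplitude is modulated by the phase of a 1 Hz rhythm
sig = np.sin(2 * np.pi * 1 * t) \
    + (1 + np.cos(2 * np.pi * 1 * t)) * np.sin(2 * np.pi * 20 * t) \
    + 0.1 * rng.standard_normal(t.size)

phase = np.angle(hilbert(bandpass(sig, fs, 0.5, 2)))   # slow phrasal-scale phase
amp = np.abs(hilbert(bandpass(sig, fs, 13, 30)))       # beta-band amplitude
mvl = np.abs(np.mean(amp * np.exp(1j * phase)))        # mean-vector-length coupling
```

A mean vector length well above zero indicates that beta power is systematically concentrated at a preferred slow-rhythm phase, the signature of phase-amplitude coupling.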


Subject(s)
Auditory Cortex/physiology, Motor Cortex/physiology, Speech Perception, Speech, Acoustic Stimulation, Adolescent, Adult, Brain Mapping, Female, Humans, Magnetoencephalography, Male
9.
Eur J Neurosci ; 46(10): 2565-2577, 2017 Nov.
Article in English | MEDLINE | ID: mdl-28940728

ABSTRACT

To make accurate perceptual estimates, observers must take the reliability of sensory information into account. Despite many behavioural studies showing that subjects weight individual sensory cues in proportion to their reliabilities, it is still unclear when during a trial neuronal responses are modulated by the reliability of sensory information or when they reflect the perceptual weights attributed to each sensory input. We investigated these questions using a combination of psychophysics, EEG-based neuroimaging and single-trial decoding. Our results show that the weighted integration of sensory information in the brain is a dynamic process; effects of sensory reliability on task-relevant EEG components were evident 84 ms after stimulus onset, while neural correlates of perceptual weights emerged 120 ms after stimulus onset. These neural processes had different underlying sources, arising from sensory and parietal regions, respectively. Together these results reveal the temporal dynamics of perceptual and neural audio-visual integration and support the notion of temporally early and functionally specific multisensory processes in the brain.


Subject(s)
Auditory Perception/physiology, Brain/physiology, Visual Perception/physiology, Acoustic Stimulation, Adult, Choice Behavior, Discriminant Analysis, Electroencephalography, Female, Humans, Male, Photic Stimulation, Psychophysics, Young Adult
10.
Neuroimage ; 148: 31-41, 2017 03 01.
Article in English | MEDLINE | ID: mdl-28082107

ABSTRACT

Sensory discriminations, such as judgements about visual motion, often benefit from multisensory evidence. Despite many reports of enhanced brain activity during multisensory conditions, it remains unclear which dynamic processes implement the multisensory benefit for an upcoming decision in the human brain. Specifically, it remains difficult to attribute perceptual benefits to specific processes, such as early sensory encoding, the transformation of sensory representations into a motor response, or to more unspecific processes such as attention. We combined an audio-visual motion discrimination task with the single-trial mapping of dynamic sensory representations in EEG activity to localize when and where multisensory congruency facilitates perceptual accuracy. Our results show that a congruent sound facilitates the encoding of motion direction in occipital sensory - as opposed to parieto-frontal - cortices, and facilitates later - as opposed to early (i.e. below 100ms) - sensory activations. This multisensory enhancement was visible as an earlier rise of motion-sensitive activity in middle-occipital regions about 350ms from stimulus onset, which reflected the better discriminability of motion direction from brain activity and correlated with the perceptual benefit provided by congruent multisensory information. This supports a hierarchical model of multisensory integration in which the enhancement of relevant sensory cortical representations is transformed into a more accurate choice.


Subject(s)
Discrimination, Psychological/physiology, Motion Perception/physiology, Occipital Lobe/physiology, Sound, Visual Perception/physiology, Acoustic Stimulation, Alpha Rhythm/physiology, Brain Mapping, Electroencephalography, Female, Humans, Male, Psychomotor Performance/physiology, Young Adult
11.
Neuroimage ; 147: 32-42, 2017 02 15.
Article in English | MEDLINE | ID: mdl-27903440

ABSTRACT

The timing of slow auditory cortical activity aligns to the rhythmic fluctuations in speech. This entrainment is considered to be a marker of the prosodic and syllabic encoding of speech, and has been shown to correlate with intelligibility. Yet, whether and how auditory cortical entrainment is influenced by the activity in other speech-relevant areas remains unknown. Using source-localized MEG data, we quantified the dependency of auditory entrainment on the state of oscillatory activity in fronto-parietal regions. We found that delta band entrainment interacted with the oscillatory activity in three distinct networks. First, entrainment in the left anterior superior temporal gyrus (STG) was modulated by beta power in orbitofrontal areas, possibly reflecting predictive top-down modulations of auditory encoding. Second, entrainment in the left Heschl's Gyrus and anterior STG was dependent on alpha power in central areas, in line with the importance of motor structures for phonological analysis. And third, entrainment in the right posterior STG modulated theta power in parietal areas, consistent with the engagement of semantic memory. These results illustrate the topographical network interactions of auditory delta entrainment and reveal distinct cross-frequency mechanisms by which entrainment can interact with different cognitive processes underlying speech perception.


Subject(s)
Auditory Cortex/physiology, Delta Rhythm/physiology, Frontal Lobe/physiology, Magnetoencephalography, Parietal Lobe/physiology, Acoustic Stimulation, Adult, Alpha Rhythm/physiology, Beta Rhythm/physiology, Female, Humans, Male, Nerve Net/physiology, Speech Perception/physiology, Temporal Lobe/physiology, Theta Rhythm/physiology, Young Adult
12.
Elife; 5, 2016 May 05.
Article in English | MEDLINE | ID: mdl-27146891

ABSTRACT

During continuous speech, lip movements provide visual temporal signals that facilitate speech processing. Here, using MEG we directly investigated how these visual signals interact with rhythmic brain activity in participants listening to and seeing the speaker. First, we investigated coherence between oscillatory brain activity and speaker's lip movements and demonstrated significant entrainment in visual cortex. We then used partial coherence to remove contributions of the coherent auditory speech signal from the lip-brain coherence. Comparing this synchronization between different attention conditions revealed that attending visual speech enhances the coherence between activity in visual cortex and the speaker's lips. Further, we identified a significant partial coherence between left motor cortex and lip movements and this partial coherence directly predicted comprehension accuracy. Our results emphasize the importance of visually entrained and attention-modulated rhythmic brain activity for the enhancement of audiovisual speech processing.
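The lip-brain coherence measure at the heart of this study can be illustrated with a minimal synthetic sketch (SciPy's Welch-based magnitude-squared coherence on toy signals; the paper's partial coherence, which removes the shared auditory contribution, is not reproduced here, and the sampling rate and rhythm frequency are illustrative assumptions):

```python
import numpy as np
from scipy.signal import coherence

fs = 250                                  # hypothetical sampling rate (Hz)
rng = np.random.default_rng(1)
t = np.arange(0, 60, 1 / fs)

lip = np.sin(2 * np.pi * 4 * t)           # 4 Hz "lip movement" rhythm
brain = 0.5 * lip + rng.standard_normal(t.size)  # entrained signal buried in noise

# magnitude-squared coherence, estimated with Welch's method over 2 s segments
f, coh = coherence(lip, brain, fs=fs, nperseg=2 * fs)
peak_freq = f[np.argmax(coh)]             # expected to sit near 4 Hz
```

Even with the entrained component buried in noise, coherence peaks at the shared rhythm's frequency, which is what makes it a useful entrainment measure.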


Subject(s)
Brain Waves, Lip/physiology, Motor Cortex/physiology, Movement, Psychomotor Performance, Speech Intelligibility, Acoustic Stimulation, Adolescent, Adult, Female, Healthy Volunteers, Humans, Male, Visual Cortex, Young Adult
13.
Proc Natl Acad Sci U S A ; 113(17): 4842-7, 2016 Apr 26.
Article in English | MEDLINE | ID: mdl-27071110

ABSTRACT

The qualities of perception depend not only on the sensory inputs but also on the brain state before stimulus presentation. Although the collective evidence from neuroimaging studies for a relation between prestimulus state and perception is strong, the interpretation in the context of sensory computations or decision processes has remained difficult. In the auditory system, for example, previous studies have reported a wide range of effects in terms of the perceptually relevant frequency bands and state parameters (phase/power). To dissociate influences of state on earlier sensory representations and higher-level decision processes, we collected behavioral and EEG data in human participants performing two auditory discrimination tasks relying on distinct acoustic features. Using single-trial decoding, we quantified the relation between prestimulus activity, relevant sensory evidence, and choice in different task-relevant EEG components. Within auditory networks, we found that phase had no direct influence on choice, whereas power in task-specific frequency bands affected the encoding of sensory evidence. Within later-activated frontoparietal regions, theta and alpha phase had a direct influence on choice, without involving sensory evidence. These results delineate two consistent mechanisms by which prestimulus activity shapes perception. However, the timescales of the relevant neural activity depend on the specific brain regions engaged by the respective task.


Subject(s)
Acoustic Stimulation, Brain Mapping, Pitch Discrimination/physiology, Pitch Perception/physiology, Adult, Alpha Rhythm, Decision Making, Electroencephalography, Female, Humans, Male, Theta Rhythm, Time Factors, Young Adult
14.
J Neurosci ; 35(44): 14691-701, 2015 Nov 04.
Article in English | MEDLINE | ID: mdl-26538641

ABSTRACT

The entrainment of slow rhythmic auditory cortical activity to the temporal regularities in speech is considered to be a central mechanism underlying auditory perception. Previous work has shown that entrainment is reduced when the quality of the acoustic input is degraded, but has also linked rhythmic activity at similar time scales to the encoding of temporal expectations. To understand these bottom-up and top-down contributions to rhythmic entrainment, we manipulated the temporal predictive structure of speech by parametrically altering the distribution of pauses between syllables or words, thereby rendering the local speech rate irregular while preserving intelligibility and the envelope fluctuations of the acoustic signal. Recording EEG activity in human participants, we found that this manipulation did not alter neural processes reflecting the encoding of individual sound transients, such as evoked potentials. However, the manipulation significantly reduced the fidelity of auditory delta (but not theta) band entrainment to the speech envelope. It also reduced left frontal alpha power, and this alpha reduction was predictive of the reduced delta entrainment across participants. Our results show that rhythmic auditory entrainment in delta and theta bands reflects functionally distinct processes. Furthermore, they reveal that delta entrainment is under top-down control and likely reflects prefrontal processes that are sensitive to acoustical regularities rather than the bottom-up encoding of acoustic features. SIGNIFICANCE STATEMENT: The entrainment of rhythmic auditory cortical activity to the speech envelope is considered to be critical for hearing. Previous work has proposed divergent views in which entrainment reflects either early evoked responses related to sound encoding or high-level processes related to expectation or cognitive selection. Using a manipulation of speech rate, we dissociated auditory entrainment at different time scales. Specifically, our results suggest that delta entrainment is controlled by frontal alpha mechanisms and thus support the notion that rhythmic auditory cortical entrainment is shaped by top-down mechanisms.


Subject(s)
Acoustic Stimulation/methods, Alpha Rhythm/physiology, Auditory Cortex/physiology, Evoked Potentials, Auditory/physiology, Speech Production Measurement/methods, Speech/physiology, Adolescent, Adult, Female, Frontal Lobe/physiology, Humans, Male, Young Adult
15.
Trends Cogn Sci ; 19(12): 783-796, 2015 Dec.
Article in English | MEDLINE | ID: mdl-26454482

ABSTRACT

Social animals can identify conspecifics by many forms of sensory input. However, whether the neuronal computations that support this ability to identify individuals rely on modality-independent convergence or involve ongoing synergistic interactions along the multiple sensory streams remains controversial. Direct neuronal measurements at relevant brain sites could address such questions, but this requires better bridging the work in humans and animal models. Here, we review recent studies in nonhuman primates on voice and face identity-sensitive pathways and evaluate the correspondences to relevant findings in humans. This synthesis provides insights into converging sensory streams in the primate anterior temporal lobe (ATL) for identity processing. Furthermore, we advance a model and suggest how alternative neuronal mechanisms could be tested.


Subject(s)
Brain Mapping, Brain/physiology, Models, Neurological, Neural Pathways/physiology, Acoustic Stimulation, Animals, Auditory Perception, Functional Laterality, Humans, Photic Stimulation
16.
PLoS Biol ; 13(2): e1002075, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25710476

ABSTRACT

At any given moment, our brain processes multiple inputs from its different sensory modalities (vision, hearing, touch, etc.). In deciphering this array of sensory information, the brain has to solve two problems: (1) which of the inputs originate from the same object and should be integrated and (2) for the sensations originating from the same object, how best to integrate them. Recent behavioural studies suggest that the human brain solves these problems using optimal probabilistic inference, known as Bayesian causal inference. However, how and where the underlying computations are carried out in the brain have remained unknown. By combining neuroimaging-based decoding techniques and computational modelling of behavioural data, a new study now sheds light on how multisensory causal inference maps onto specific brain areas. The results suggest that the complexity of neural computations increases along the visual hierarchy and link specific components of the causal inference process with specific visual and parietal regions.


Subject(s)
Auditory Perception/physiology, Models, Neurological, Nerve Net/physiology, Neural Pathways/physiology, Visual Perception/physiology, Acoustic Stimulation, Auditory Cortex/anatomy & histology, Auditory Cortex/physiology, Bayes Theorem, Brain Mapping, Cognition/physiology, Humans, Magnetic Resonance Imaging, Parietal Lobe/anatomy & histology, Parietal Lobe/physiology, Photic Stimulation, Psychophysics, Visual Cortex/anatomy & histology, Visual Cortex/physiology
17.
J Neurosci ; 34(7): 2524-37, 2014 Feb 12.
Article in English | MEDLINE | ID: mdl-24523543

ABSTRACT

Effective interactions between conspecific individuals can depend upon the receiver forming a coherent multisensory representation of communication signals, such as merging voice and face content. Neuroimaging studies have identified face- or voice-sensitive areas (Belin et al., 2000; Petkov et al., 2008; Tsao et al., 2008), some of which have been proposed as candidate regions for face and voice integration (von Kriegstein et al., 2005). However, it was unclear how multisensory influences occur at the neuronal level within voice- or face-sensitive regions, especially compared with classically defined multisensory regions in temporal association cortex (Stein and Stanford, 2008). Here, we characterize auditory (voice) and visual (face) influences on neuronal responses in a right-hemisphere voice-sensitive region in the anterior supratemporal plane (STP) of Rhesus macaques. These results were compared with those in the neighboring superior temporal sulcus (STS). Within the STP, our results show auditory sensitivity to several vocal features, which was not evident in STS units. We also newly identify a functionally distinct neuronal subpopulation in the STP that appears to carry the area's sensitivity to voice identity related features. Audiovisual interactions were prominent in both the STP and STS. However, visual influences modulated the responses of STS neurons with greater specificity and were more often associated with congruent voice-face stimulus pairings than STP neurons. Together, the results reveal the neuronal processes subserving voice-sensitive fMRI activity patterns in primates, generate hypotheses for testing in the visual modality, and clarify the position of voice-sensitive areas within the unisensory and multisensory processing hierarchies.


Subject(s)
Auditory Perception/physiology, Brain Mapping, Neurons/physiology, Temporal Lobe/physiology, Visual Perception/physiology, Acoustic Stimulation, Animals, Macaca mulatta, Magnetic Resonance Imaging, Male, Neurons/cytology, Photic Stimulation, Temporal Lobe/cytology
18.
Neuropsychologia ; 53: 84-93, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24262657

ABSTRACT

Multisensory interactions shape everyday perception, and stimuli in one modality can enhance perception in another even when they are not directly task relevant. While the underlying neural principles are slowly becoming evident, most work has focused on transient stimuli, and little is known about the mechanisms underlying audio-visual motion processing. We studied the facilitation of visual motion perception by auxiliary sounds, i.e. sounds that by themselves do not provide the specific evidence required for the perceptual task at hand. In our experiment human observers became significantly better at detecting visual random dot motion when it was accompanied by auxiliary acoustic motion rather than stationary sounds. EEG measurements revealed that both auditory and visual motion modulated low frequency oscillations over the respective sensory cortices. Using single-trial decoding we quantified the oscillatory signatures that permitted discriminating visual motion in a manner similar to the subjects' task. This revealed visual motion-related signatures in low (1-4 Hz) and alpha (8-12 Hz) bands that were significantly enhanced during congruent compared to disparate audio-visual conditions. Importantly, the auditory enhancement of these oscillatory signatures was predictive of the perceptual multisensory facilitation. These findings emphasise the importance of slow and alpha rhythms for perception in a multisensory context and suggest that acoustic motion can enhance visual perception by means of attention- or priming-related mechanisms that are reflected in rhythmic activity over parieto-occipital regions.


Subject(s)
Alpha Rhythm , Auditory Perception/physiology , Brain/physiology , Motion Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Electroencephalography , Female , Humans , Male , Motion (Physics) , Neuropsychological Tests , Photic Stimulation , Psychophysics , Visual Perception/physiology , Young Adult
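The single-trial decoding of band-limited oscillatory activity described in the abstract above can be sketched roughly as follows. This is an illustrative toy example, not the study's analysis pipeline: the sampling rate, trial counts, simulated signal, and the use of alpha-band log-power with a linear discriminant are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical setup: 200 trials of 1 s single-channel "EEG" at 250 Hz.
rng = np.random.default_rng(0)
fs, n_trials, n_samples = 250, 200, 250
labels = rng.integers(0, 2, n_trials)          # 0 = one condition, 1 = the other

# Simulate data in which one condition carries extra alpha (10 Hz) power.
t = np.arange(n_samples) / fs
trials = rng.standard_normal((n_trials, n_samples))
trials[labels == 1] += 0.8 * np.sin(2 * np.pi * 10 * t)

# Band-limit each trial to the alpha band (8-12 Hz) and take log-power,
# a common single-trial feature for decoding oscillatory signatures.
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
alpha = filtfilt(b, a, trials, axis=1)
log_power = np.log(np.mean(alpha ** 2, axis=1, keepdims=True))

# Cross-validated decoding accuracy: above chance when the alpha
# signature discriminates the two conditions on single trials.
acc = cross_val_score(LinearDiscriminantAnalysis(), log_power, labels, cv=5).mean()
print(f"decoding accuracy: {acc:.2f}")
```

In the actual study this logic would be applied per band (delta and alpha) and per condition (congruent vs. disparate audio-visual motion), comparing the resulting decoding performance across conditions.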
19.
J Cogn Neurosci ; 26(4): 699-711, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24236698

ABSTRACT

Perception is a multisensory process, and previous work has shown that multisensory interactions occur not only for object-related stimuli but also for simplistic and apparently unrelated inputs to the different senses. We here compare the facilitation of visual perception induced by transient (target-synchronized) sounds to the facilitation provided by continuous, background noise-like sounds. Specifically, we show that continuous acoustic noise improves visual contrast detection by systematically shifting psychometric curves in an amplitude-dependent manner. This multisensory benefit was found to be both qualitatively and quantitatively similar to that induced by a transient, target-synchronized sound in the same paradigm. Studying the underlying neural mechanisms using electrical neuroimaging (EEG), we found that acoustic noise alters occipital alpha (8-12 Hz) power and decreases beta-band (14-20 Hz) coupling of occipital and temporal sites. Task-irrelevant and continuous sounds thereby have an amplitude-dependent effect on cortical mechanisms implicated in shaping visual cortical excitability. The same oscillatory mechanisms also mediate visual facilitation by transient sounds, and our results suggest that task-related sounds and task-irrelevant background noises could induce perceptually and mechanistically similar enhancement of visual perception. Given the omnipresence of sounds and noises in our environment, such multisensory interactions may affect perception in many everyday scenarios.


Subject(s)
Auditory Perception/physiology , Brain/physiology , Noise , Visual Perception/physiology , Acoustic Stimulation , Adolescent , Adult , Brain Waves , Electroencephalography , Auditory Evoked Potentials , Visual Evoked Potentials , Female , Humans , Male , Photic Stimulation , Psychometrics , Sensory Thresholds , Young Adult
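The psychometric-curve shift reported in the abstract above can be illustrated by fitting a cumulative-Gaussian psychometric function to detection data from two conditions and comparing the fitted thresholds. The contrast levels and proportions correct below are invented for demonstration and do not come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Cumulative-Gaussian psychometric function: detection probability as a
# function of visual contrast, with threshold mu and slope sigma.
def psychometric(contrast, mu, sigma):
    return norm.cdf(contrast, loc=mu, scale=sigma)

# Hypothetical detection data at seven contrast levels, with and without
# continuous background noise; the noise condition is modeled as a
# leftward (facilitatory) shift of the curve.
contrast = np.array([0.02, 0.04, 0.06, 0.08, 0.10, 0.12, 0.14])
p_silent = np.array([0.05, 0.15, 0.35, 0.60, 0.80, 0.92, 0.98])
p_noise  = np.array([0.10, 0.30, 0.55, 0.78, 0.90, 0.97, 0.99])

(mu_s, sig_s), _ = curve_fit(psychometric, contrast, p_silent, p0=[0.08, 0.03])
(mu_n, sig_n), _ = curve_fit(psychometric, contrast, p_noise,  p0=[0.08, 0.03])

# A lower threshold in the noise condition corresponds to the reported
# multisensory benefit for visual contrast detection.
threshold_shift = mu_s - mu_n
print(f"silent threshold {mu_s:.3f}, noise threshold {mu_n:.3f}")
```

The amplitude dependence reported in the study would correspond to fitting such curves separately per noise level and tracking how the threshold shift grows with sound amplitude.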
20.
J Neurosci ; 33(46): 18277-87, 2013 Nov 13.
Article in English | MEDLINE | ID: mdl-24227737

ABSTRACT

The encoding of sensory information by populations of cortical neurons forms the basis for perception but remains poorly understood. To understand the constraints of cortical population coding, we analyzed neural responses to natural sounds recorded in the auditory cortex of primates (Macaca mulatta). We estimated stimulus information while varying the composition and size of the considered population. Consistent with previous reports, we found that when choosing subpopulations randomly from the recorded ensemble, the average population information increases steadily with population size. This scaling was explained by a model assuming that each neuron carried equal amounts of information, and that any overlap between the information carried by individual neurons arises purely from random sampling within the stimulus space. However, when studying subpopulations selected to optimize information for each given population size, the scaling of information was strikingly different: a small fraction of temporally precise cells carried the vast majority of information. This scaling could be explained by an extended model assuming that the amount of information carried by individual neurons was highly non-uniform, with few neurons carrying large amounts of information. Importantly, these optimal populations can be determined by a single biophysical marker, the neuron's encoding time scale, allowing their detection and readout within biologically realistic circuits. These results show that extrapolations of population information based on random ensembles may overestimate the population size required for stimulus encoding, and that sensory cortical circuits may process information using small but highly informative ensembles.


Subject(s)
Acoustic Stimulation/methods , Action Potentials/physiology , Auditory Cortex/physiology , Neurons/physiology , Animals , Macaca mulatta , Male , Random Allocation , Time Factors
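The contrast between random and optimized subpopulations described in the abstract above can be sketched with simulated data. This toy example uses cross-validated decoding accuracy as a simple stand-in for the study's information-theoretic measure; the population sizes, tuning gains, and noise model are all assumptions for illustration.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Simulated population with highly non-uniform coding: a few neurons are
# strongly stimulus-tuned, the majority are only weakly tuned.
rng = np.random.default_rng(1)
n_stim, n_rep, n_neurons = 5, 40, 30
stimuli = np.repeat(np.arange(n_stim), n_rep)
gain = np.full(n_neurons, 0.1)      # weakly tuned majority
gain[:5] = 2.0                      # informative minority
responses = gain * stimuli[:, None] + rng.standard_normal((len(stimuli), n_neurons))

def decode(cols):
    """Cross-validated stimulus-decoding accuracy from a subpopulation."""
    return cross_val_score(GaussianNB(), responses[:, cols], stimuli, cv=5).mean()

# Compare a randomly chosen subpopulation of size k against the k neurons
# ranked best by single-neuron decoding accuracy.
k = 5
random_cols = list(rng.choice(n_neurons, k, replace=False))
ranks = np.argsort([decode([i]) for i in range(n_neurons)])[::-1]
acc_random, acc_best = decode(random_cols), decode(list(ranks[:k]))
print(f"random subpopulation: {acc_random:.2f}, optimized: {acc_best:.2f}")
```

The gap between the two accuracies mirrors the study's finding: extrapolating from random ensembles understates how much a small, well-chosen subset of informative neurons can carry.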