Results 1 - 2 of 2
1.
J Neurophysiol; 132(3): 1098-1114, 2024 Sep 01.
Article in English | MEDLINE | ID: mdl-39140590

ABSTRACT

Sinusoidal amplitude modulation (SAM) is a key feature of complex sounds. Although psychophysical studies have characterized SAM perception, and neurophysiological studies in anesthetized animals report a transformation from a temporal code in the cochlear nucleus (CN; brainstem) to a rate code in the inferior colliculus (IC; midbrain), none have used awake animals or nonhuman primates to compare CN and IC coding strategies with modulation-frequency perception. To address this, we recorded single-unit responses and compared derived neurometric measures in the CN and IC to psychometric measures of modulation frequency (MF) discrimination in macaques. IC and CN neurons often exhibited tuned responses to SAM in both rate and spike-timing measures of modulation coding. Neurometric thresholds spanned a large range (2-200 Hz ΔMF). The lowest 40% of IC thresholds were less than or equal to psychometric thresholds, regardless of which code was used, whereas CN thresholds were greater than psychometric thresholds. Discrimination at 10-20 Hz could be explained by indiscriminately pooling 30 units in either structure, whereas discrimination at higher MFs was best explained by more selective pooling. This suggests that pooled CN activity was sufficient for AM discrimination. Psychometric and neurometric thresholds decreased as stimulus duration increased, but IC and CN thresholds were higher and more variable than behavioral thresholds at short durations. This slower subcortical temporal integration relative to behavior was consistent with a drift diffusion model that reproduced individual differences in performance and can constrain future neurophysiological studies of temporal integration. Together, these measures provide an account of AM perception at the neurophysiological, computational, and behavioral levels.

NEW & NOTEWORTHY In everyday environments, the brain is tasked with extracting information from sound envelopes, which involves both sensory encoding and perceptual decision-making. Different neural codes for envelope representation have been characterized in midbrain and cortex, but studies of brainstem nuclei such as the cochlear nucleus (CN) have usually been conducted under anesthesia in nonprimate species. Here, we found that subcortical activity in awake monkeys, combined with a biologically plausible perceptual decision-making model, accounted for sound envelope discrimination behavior.
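The drift-diffusion account mentioned in the abstract can be illustrated with a minimal simulation. This is a generic sketch of the model class, not the authors' fitted model: the drift rate, boundary, and noise values below are illustrative assumptions, and `ddm_trial` is a hypothetical helper name.

```python
import numpy as np

def ddm_trial(drift, boundary=1.0, noise=1.0, dt=0.001, max_t=3.0, rng=None):
    """Simulate one drift-diffusion trial; return (correct, decision_time).

    Evidence accumulates with mean rate `drift` plus Gaussian noise until it
    crosses +boundary (correct choice) or -boundary (error), or time runs out.
    """
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (x >= boundary), t

# Accumulating noisy evidence to a bound links stimulus duration to
# accuracy: longer accumulation yields more reliable decisions, the
# qualitative pattern of threshold vs. duration described above.
rng = np.random.default_rng(0)
trials = [ddm_trial(drift=0.8, rng=rng) for _ in range(200)]
accuracy = np.mean([correct for correct, _ in trials])
```

Varying `drift` with ΔMF and fitting `boundary` per subject is one standard way such a model captures individual differences in performance.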


Subject(s)
Inferior Colliculi, Macaca mulatta, Wakefulness, Animals, Inferior Colliculi/physiology, Wakefulness/physiology, Male, Cochlear Nucleus/physiology, Auditory Perception/physiology, Neurons/physiology, Female, Auditory Pathways/physiology, Acoustic Stimulation
2.
J Assoc Res Otolaryngol; 22(4): 365-386, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34014416

ABSTRACT

In a naturalistic environment, auditory cues are often accompanied by information from other senses, which can be redundant with or complementary to the auditory information. Although the multisensory interactions that arise from this combination of information and shape auditory function occur across all sensory modalities, our greatest body of knowledge to date centers on how vision influences audition. In this review, we attempt to capture the current state of understanding on this topic. Following a general introduction, the review is divided into five sections.

The first section reviews the psychophysical evidence in humans for vision's influence on audition, distinguishing vision's ability to enhance versus alter auditory performance and perception. Three examples then highlight vision's ability to modulate auditory processes: spatial ventriloquism, cross-modal dynamic capture, and the McGurk effect. The final part of this section discusses models built on the available psychophysical data that seek to provide greater mechanistic insight into how vision can impact audition.

The second section reviews the extant neuroimaging and far-field imaging work on this topic, with a strong emphasis on the roles of feedforward and feedback processes, on imaging insights into the causal nature of audiovisual interactions, and on the limitations of current imaging-based approaches. These limitations point to a greater need for machine-learning-based decoding approaches to understand how auditory representations are shaped by vision.

The third section reviews the wealth of neuroanatomical and neurophysiological data from animal models that highlights audiovisual interactions at the neuronal and circuit level in both subcortical and cortical structures. It also speaks to the functional significance of audiovisual interactions for two critically important facets of auditory perception: scene analysis and communication.

The fourth section presents current evidence for alterations in audiovisual processes in three clinical conditions: autism, schizophrenia, and sensorineural hearing loss. These changes in audiovisual interactions are postulated to have cascading effects on higher-order domains of dysfunction in these conditions.

The final section highlights ongoing work seeking to leverage our knowledge of audiovisual interactions to develop better remediation approaches to these sensory-based disorders, founded in concepts of perceptual plasticity in which vision has been shown to have the capacity to facilitate auditory learning.
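A standard mechanistic account of spatial ventriloquism of the kind discussed above is reliability-weighted (maximum-likelihood) cue fusion. The sketch below illustrates that general model, not a specific model from the review; the example variances and the function name `fuse_estimates` are illustrative assumptions.

```python
def fuse_estimates(mu_a, var_a, mu_v, var_v):
    """Reliability-weighted fusion of auditory and visual location estimates.

    Each cue is weighted by its inverse variance (its reliability); the
    fused estimate has lower variance than either cue alone.
    """
    w_v = (1.0 / var_v) / (1.0 / var_a + 1.0 / var_v)
    mu = w_v * mu_v + (1.0 - w_v) * mu_a
    var = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return mu, var

# Vision is typically the more reliable spatial cue, so the fused percept
# is "captured" toward the visual location: an auditory source at 10 deg
# paired with a visual source at 0 deg is heard near the visual location.
mu, var = fuse_estimates(mu_a=10.0, var_a=16.0, mu_v=0.0, var_v=1.0)
```

Under this scheme the amount of capture depends only on the relative reliabilities, which is why degrading the visual stimulus weakens the ventriloquist effect.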


Subject(s)
Auditory Perception, Visual Perception, Acoustic Stimulation, Animals, Hearing, Humans, Photic Stimulation