Results 1 - 9 of 9
1.
J Neurosci; 38(11): 2854-2862, 2018 Mar 14.
Article in English | MEDLINE | ID: mdl-29440554

ABSTRACT

The cerebral cortex is a major hub for the convergence and integration of signals from across the sensory modalities; sensory cortices, including primary regions, are no exception. Here we show that visual stimuli influence neural firing in the auditory cortex of awake male and female mice, using multisite probes to sample single units across multiple cortical layers. We demonstrate that visual stimuli influence firing in both primary and secondary auditory cortex. We then determine the laminar location of recording sites through electrode track tracing with fluorescent dye and optogenetic identification using layer-specific markers. Spiking responses to visual stimulation occur deep in auditory cortex and are particularly prominent in layer 6. Visual modulation of firing rate occurs more frequently in areas with secondary-like auditory responses than in those with primary-like responses. Auditory cortical responses to drifting visual gratings are not orientation-tuned, unlike visual cortex responses. The deepest cortical layers thus appear to be an important locus for cross-modal integration in auditory cortex.

SIGNIFICANCE STATEMENT: The deepest layers of the auditory cortex are often considered its most enigmatic, possessing a wide range of cell morphologies and atypical sensory responses. Here we show that, in mouse auditory cortex, these layers represent a locus of cross-modal convergence, containing many units responsive to visual stimuli. Our results suggest that this visual signal conveys the presence and timing of a stimulus rather than specifics about that stimulus, such as its orientation. These results shed light on both how and what types of cross-modal information are integrated at the earliest stages of sensory cortical processing.


Subject(s)
Auditory Cortex/physiology, Visual Perception/physiology, Acoustic Stimulation, Animals, Auditory Cortex/cytology, Brain Mapping, Electrodes, Evoked Potentials, Visual/physiology, Female, Male, Mice, Mice, Inbred C57BL, Optogenetics, Orientation/physiology, Photic Stimulation, Visual Cortex/physiology
2.
Proc Natl Acad Sci U S A; 110(5): 1959-63, 2013 Jan 29.
Article in English | MEDLINE | ID: mdl-23319616

ABSTRACT

Human speech universally exhibits a 3- to 8-Hz rhythm, corresponding to the rate of syllable production, which is reflected in both the sound envelope and the visual mouth movements. Artificial perturbation of the speech rhythm outside the natural range reduces speech intelligibility, demonstrating a perceptual tuning to this frequency band. One theory posits that the mouth movements at the core of this speech rhythm evolved through modification of ancestral primate facial expressions. Recent evidence shows that one such communicative gesture in macaque monkeys, lip-smacking, has motor parallels with speech in its rhythmicity, its developmental trajectory, and the coordination of vocal tract structures. Whether monkeys also exhibit a perceptual tuning to the natural rhythms of lip-smacking is unknown. To investigate this, we tested rhesus monkeys in a preferential-looking procedure, measuring the time spent looking at each of two side-by-side computer-generated monkey avatars lip-smacking at natural versus sped-up or slowed-down rhythms. Monkeys showed an overall preference for the natural rhythm compared with the perturbed rhythms. This lends behavioral support to the hypothesis that perceptual processes in monkeys, like those in humans, are tuned to the natural frequencies of communication signals. Our data provide perceptual evidence for the theory that speech may have evolved from ancestral primate rhythmic facial expressions.


Subject(s)
Facial Expression, Lip/physiology, Macaca mulatta/physiology, Vocalization, Animal/physiology, Animals, Biological Evolution, Humans, Movement/physiology, Periodicity, Speech/physiology, Theta Rhythm/physiology
3.
Am J Primatol; 75(9): 904-16, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23592313

ABSTRACT

Noisy acoustic environments present several challenges for the evolution of acoustic communication systems. Among the most significant is the need to limit degradation of spectro-temporal signal structure in order to maintain communicative efficacy. This can be achieved by selecting for several potentially complementary processes. Selection can act on behavioral mechanisms permitting signalers to control the timing and occurrence of signal production to avoid acoustic interference. Likewise, the signal itself may be the target of selection, biasing the evolution of its structure to comprise acoustic features that avoid interference from ambient noise or degrade minimally in the habitat. Here, we address the latter topic for common marmoset (Callithrix jacchus) long-distance contact vocalizations, known as phee calls. Our aim was to test whether this vocalization is specifically adapted for transmission in a species-typical forest habitat, the Atlantic forests of northeastern Brazil. We combined seasonal analyses of ambient habitat acoustics with experiments in which pure tones, clicks, and vocalizations were broadcast and rerecorded at different distances to characterize signal degradation in the habitat. Ambient sound was analyzed from intervals throughout the day and over rainy and dry seasons, showing temporal regularities across varied timescales. Broadcast experiment results indicated that the tone and click stimuli showed the typically inverse relationship between frequency and signaling efficacy. Although marmoset phee calls degraded over distance with marked predictability compared with artificial sounds, they did not otherwise appear to be specially designed for increased transmission efficacy or minimal interference in this habitat. We discuss these data in the context of other similar studies and evidence of potential behavioral mechanisms for avoiding acoustic interference in order to maintain effective vocal communication in common marmosets.


Subject(s)
Acoustics, Callithrix/physiology, Ecosystem, Vocalization, Animal/physiology, Animals, Brazil, Seasons
4.
Dev Sci; 15(4): 557-68, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22709404

ABSTRACT

Across all languages studied to date, audiovisual speech exhibits a consistent rhythmic structure. This rhythm is critical to speech perception. Some have suggested that the speech rhythm evolved de novo in humans. An alternative account--the one we explored here--is that the rhythm of speech evolved through the modification of rhythmic facial expressions. We tested this idea by investigating the structure and development of macaque monkey lipsmacks and found that their developmental trajectory is strikingly similar to the one that leads from human infant babbling to adult speech. Specifically, we show that: (1) younger monkeys produce slower, more variable mouth movements and as they get older, these movements become faster and less variable; and (2) this developmental pattern does not occur for another cyclical mouth movement--chewing. These patterns parallel human developmental patterns for speech and chewing. They suggest that, in both species, the two types of rhythmic mouth movements use different underlying neural circuits that develop in different ways. Ultimately, both lipsmacking and speech converge on a ~5 Hz rhythm, the frequency that characterizes the speech rhythm of human adults. We conclude that monkey lipsmacking and human speech share a homologous developmental mechanism, lending strong empirical support to the idea that the human speech rhythm evolved from the rhythmic facial expressions of our primate ancestors.


Subject(s)
Lip/physiology, Macaca mulatta/physiology, Periodicity, Speech/physiology, Adult, Animals, Animals, Newborn, Biological Evolution, Female, Humans, Infant, Language Development, Mouth/physiology, Movement/physiology, Speech Perception/physiology, Time Factors
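The ~5 Hz convergence described in the abstract above is the kind of quantity that falls out of a simple spectral analysis of a mouth-movement trace. The sketch below is a generic illustration, not the authors' pipeline; the 100 Hz sampling rate and the synthetic "lipsmack" trace are assumptions for demonstration only.

```python
import numpy as np

def peak_frequency(signal, fs):
    """Return the frequency (Hz) of the largest spectral peak, ignoring the DC bin."""
    signal = np.asarray(signal, dtype=float) - np.mean(signal)  # remove DC offset
    power = np.abs(np.fft.rfft(signal)) ** 2                    # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(power[1:]) + 1]                      # skip the 0 Hz bin

# Synthetic 5 Hz "lipsmack" trace: 10 s sampled at 100 Hz, plus noise (hypothetical data)
fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 5.0 * t) + 0.2 * rng.standard_normal(t.size)
print(peak_frequency(trace, fs))  # 5.0
```

With a 10 s window the frequency resolution is 0.1 Hz, so a stable ~5 Hz rhythm lands cleanly on a spectral bin; shorter recordings would need windowing or averaging across segments.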
5.
Curr Res Neurobiol; 3: 100040, 2022.
Article in English | MEDLINE | ID: mdl-36518337

ABSTRACT

Recent studies have established significant anatomical and functional connections between visual areas and primary auditory cortex (A1), which may be important for cognitive processes such as communication and spatial perception. These studies have raised two important questions: First, which cell populations in A1 respond to visual input and/or are influenced by visual context? Second, which aspects of sound encoding are affected by visual context? To address these questions, we recorded single-unit activity across cortical layers in awake mice during exposure to auditory and visual stimuli. Neurons responsive to visual stimuli were most prevalent in the deep cortical layers and included both excitatory and inhibitory cells. The overwhelming majority of these neurons also responded to sound, indicating unimodal visual neurons are rare in A1. Other neurons for which sound-evoked responses were modulated by visual context were similarly excitatory or inhibitory but more evenly distributed across cortical layers. These modulatory influences almost exclusively affected sustained sound-evoked firing rate (FR) responses or spectrotemporal receptive fields (STRFs); transient FR changes at stimulus onset were rarely modified by visual context. Neuron populations with visually modulated STRFs and sustained FR responses were mostly non-overlapping, suggesting spectrotemporal feature selectivity and overall excitability may be differentially sensitive to visual context. The effects of visual modulation were heterogeneous, increasing and decreasing STRF gain in roughly equal proportions of neurons. Our results indicate visual influences are surprisingly common and diversely expressed throughout layers and cell types in A1, affecting nearly one in five neurons overall.
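A spectrotemporal receptive field (STRF) of the kind analyzed above is commonly estimated as a spike-triggered average of the stimulus spectrogram. The sketch below is a generic illustration of that technique with toy data, not the study's actual estimation method; all names and numbers are assumptions.

```python
import numpy as np

def strf_sta(spectrogram, spike_bins, n_lags):
    """Spike-triggered-average STRF.

    spectrogram : (n_freq, n_time) stimulus power
    spike_bins  : indices of time bins containing spikes
    n_lags      : bins of stimulus history to average before each spike
    """
    n_freq, _ = spectrogram.shape
    sta = np.zeros((n_freq, n_lags))
    count = 0
    for t in spike_bins:
        if t >= n_lags:                        # need a full history window
            sta += spectrogram[:, t - n_lags:t]
            count += 1
    return sta / max(count, 1)

# Toy cell that fires whenever frequency channel 2 was strongly active 3 bins earlier
rng = np.random.default_rng(0)
spec = rng.random((8, 500))
spikes = [t for t in range(3, 500) if spec[2, t - 3] > 0.9]
strf = strf_sta(spec, spikes, n_lags=5)
# The STA peaks at channel 2, three bins before the spike; other entries stay near
# the stimulus mean. A "visually modulated STRF" would be a change in this map
# (e.g., its gain) between visual-context conditions.
```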

6.
Elife; 11, 2022 Aug 18.
Article in English | MEDLINE | ID: mdl-35980027

ABSTRACT

In everyday behavior, sensory systems are in constant competition for attentional resources, but the cellular and circuit-level mechanisms of modality-selective attention remain largely uninvestigated. We conducted translaminar recordings in mouse auditory cortex (AC) during an audiovisual (AV) attention shifting task. Attending to sound elements in an AV stream reduced both pre-stimulus and stimulus-evoked spiking activity, primarily in deep-layer neurons and neurons without spectrotemporal tuning. Despite reduced spiking, stimulus decoder accuracy was preserved, suggesting improved sound encoding efficiency. Similarly, task-irrelevant mapping stimuli during inter-trial intervals evoked fewer spikes without impairing stimulus encoding, indicating that attentional modulation generalized beyond training stimuli. Importantly, spiking reductions predicted trial-to-trial behavioral accuracy during auditory attention, but not visual attention. Together, these findings suggest auditory attention facilitates sound discrimination by filtering sound-irrelevant background activity in AC, and that the deepest cortical layers serve as a hub for integrating extramodal contextual information.


Subject(s)
Auditory Cortex, Acoustic Stimulation, Animals, Auditory Cortex/physiology, Auditory Perception/physiology, Mice, Photic Stimulation, Sound, Visual Perception/physiology
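The abstract above reports that stimulus decoder accuracy was preserved despite reduced spiking. The specific decoder is not given here; a leave-one-out nearest-centroid decoder on trial spike-count vectors is one minimal way such an accuracy can be computed. Everything below (function name, toy Poisson data) is an illustrative assumption, not the study's code.

```python
import numpy as np

def nearest_centroid_accuracy(responses, labels):
    """Leave-one-out nearest-centroid decoding of stimulus identity.

    responses : (n_trials, n_neurons) spike counts
    labels    : (n_trials,) stimulus id per trial
    """
    responses = np.asarray(responses, dtype=float)
    labels = np.asarray(labels)
    classes = np.unique(labels)
    correct = 0
    for i in range(len(labels)):
        keep = np.arange(len(labels)) != i            # hold out trial i
        centroids = np.array([responses[keep & (labels == c)].mean(axis=0)
                              for c in classes])
        pred = classes[np.argmin(np.linalg.norm(centroids - responses[i], axis=1))]
        correct += pred == labels[i]
    return correct / len(labels)

# Toy data: two stimuli with distinct mean population rates (hypothetical numbers)
rng = np.random.default_rng(1)
a = rng.poisson(lam=[10, 2, 5], size=(20, 3))   # stimulus A trials
b = rng.poisson(lam=[2, 10, 5], size=(20, 3))   # stimulus B trials
X = np.vstack([a, b])
y = np.repeat([0, 1], 20)
acc = nearest_centroid_accuracy(X, y)
```

The finding in the abstract corresponds to this accuracy staying constant across attention conditions even when overall counts in `X` shrink.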
7.
Eur J Neurosci; 31(10): 1807-17, 2010 May.
Article in English | MEDLINE | ID: mdl-20584185

ABSTRACT

Audiovisual speech has a stereotypical rhythm that is between 2 and 7 Hz, and deviations from this frequency range in either modality reduce intelligibility. Understanding how audiovisual speech evolved requires investigating the origins of this rhythmic structure. One hypothesis is that the rhythm of speech evolved through the modification of some pre-existing cyclical jaw movements in a primate ancestor. We tested this hypothesis by investigating the temporal structure of lipsmacks and teeth-grinds of macaque monkeys and the neural responses to these facial gestures in the superior temporal sulcus (STS), a region implicated in the processing of audiovisual communication signals in both humans and monkeys. We found that both lipsmacks and teeth-grinds have consistent but distinct peak frequencies and that both fall well within the 2-7 Hz range of mouth movements associated with audiovisual speech. Single neurons and local field potentials of the STS of monkeys readily responded to such facial rhythms, but also responded just as robustly to yawns, a nonrhythmic but dynamic facial expression. All expressions elicited enhanced power in the delta (0-3 Hz), theta (3-8 Hz), alpha (8-14 Hz) and gamma (>60 Hz) frequency ranges, and suppressed power in the beta (20-40 Hz) range. Thus, STS is sensitive to, but not selective for, rhythmic facial gestures. Taken together, these data provide support for the idea that audiovisual speech evolved (at least in part) from the rhythmic facial gestures of an ancestral primate and that the STS was sensitive to and thus 'prepared' for the advent of rhythmic audiovisual communication.


Subject(s)
Biological Evolution, Facial Expression, Speech/physiology, Temporal Lobe/physiology, Acoustic Stimulation, Animals, Data Interpretation, Statistical, Electroencephalography, Evoked Potentials/physiology, Gestures, Macaca mulatta, Magnetic Resonance Imaging, Male, Mouth/physiology, Movement/physiology, Neurons/physiology, Photic Stimulation
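The band-power comparison reported above (enhanced delta/theta/alpha/gamma, suppressed beta) can be computed from a local field potential trace with a one-sided FFT. The band edges below come from the abstract; the function, sampling rate, and synthetic trace are illustrative assumptions, not the study's analysis code.

```python
import numpy as np

BANDS = {"delta": (0, 3), "theta": (3, 8), "alpha": (8, 14),
         "beta": (20, 40), "gamma": (60, 120)}  # Hz, edges as in the abstract

def band_power(lfp, fs, bands=BANDS):
    """Mean power per frequency band from a one-sided FFT power spectrum."""
    lfp = np.asarray(lfp, dtype=float) - np.mean(lfp)
    power = np.abs(np.fft.rfft(lfp)) ** 2 / len(lfp)
    freqs = np.fft.rfftfreq(len(lfp), d=1.0 / fs)
    return {name: power[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Toy LFP: a strong 5 Hz (theta-band) oscillation in noise, 2 s at 1 kHz
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
lfp = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)
bp = band_power(lfp, fs)   # bp["theta"] dominates the other bands
```

A real analysis would average over trials and use a tapered estimator (e.g. Welch or multitaper) rather than a single raw FFT, but the band bookkeeping is the same.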
8.
eNeuro; 6(5), 2019.
Article in English | MEDLINE | ID: mdl-31481397

ABSTRACT

Information processing in sensory cortex is highly sensitive to nonsensory variables such as anesthetic state, arousal, and task engagement. Recent work in mouse visual cortex suggests that evoked firing rates, stimulus-response mutual information, and encoding efficiency increase when animals are engaged in movement. A disinhibitory circuit appears central to this change: inhibitory neurons expressing vasoactive intestinal peptide (VIP) are activated during movement and disinhibit pyramidal cells by suppressing other inhibitory interneurons. Paradoxically, although movement activates a similar disinhibitory circuit in auditory cortex (ACtx), most ACtx studies report reduced spiking during movement. It is unclear whether the resulting changes in spike rates result in corresponding changes in stimulus-response mutual information. We examined ACtx responses evoked by tone cloud stimuli, in awake mice of both sexes, during spontaneous movement and still conditions. VIP+ cells were optogenetically activated on half of trials, permitting independent analysis of the consequences of movement and VIP activation, as well as their intersection. Movement decreased stimulus-related spike rates as well as mutual information and encoding efficiency. VIP interneuron activation tended to increase stimulus-evoked spike rates but not stimulus-response mutual information, thus reducing encoding efficiency. The intersection of movement and VIP activation was largely consistent with a linear combination of these main effects: VIP activation recovered movement-induced reduction in spike rates, but not information transfer.


Subject(s)
Acoustic Stimulation/methods, Auditory Cortex/metabolism, Interneurons/metabolism, Movement/physiology, Vasoactive Intestinal Peptide/metabolism, Action Potentials/physiology, Animals, Auditory Cortex/chemistry, Female, Gene Knock-In Techniques, Interneurons/chemistry, Male, Mice, Mice, Inbred C57BL, Mice, Transgenic, Optogenetics/methods, Vasoactive Intestinal Peptide/analysis
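Stimulus-response mutual information, as used in the abstract above, can be estimated with a plug-in (histogram) estimator over paired stimulus/spike-count samples. This is a generic sketch of that estimator, not the study's method; a real analysis would add bias correction for limited sampling, and the toy data are hypothetical.

```python
import numpy as np

def mutual_information(stimuli, spike_counts):
    """Plug-in estimate of I(stimulus; spike count) in bits from paired samples."""
    stimuli = np.asarray(stimuli)
    spike_counts = np.asarray(spike_counts)
    mi = 0.0
    for s in np.unique(stimuli):
        p_s = np.mean(stimuli == s)                      # P(stimulus)
        for r in np.unique(spike_counts):
            p_r = np.mean(spike_counts == r)             # P(response)
            p_sr = np.mean((stimuli == s) & (spike_counts == r))  # joint
            if p_sr > 0:
                mi += p_sr * np.log2(p_sr / (p_s * p_r))
    return mi

# Toy example: the spike count perfectly identifies which of two stimuli occurred
stims = np.array([0, 0, 0, 0, 1, 1, 1, 1])
counts = np.array([2, 2, 2, 2, 7, 7, 7, 7])
mi = mutual_information(stims, counts)  # 1.0 bit: two equiprobable, fully separable stimuli
```

Encoding efficiency in this framing is information per spike, so movement can lower spike rates and information together (as reported above) while efficiency stays flat or rises.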
9.
Sci Rep; 6: 25431, 2016 May 5.
Article in English | MEDLINE | ID: mdl-27145729

ABSTRACT

A common pattern in dominance hierarchies is that some ranks result in higher levels of psychosocial stress than others. Such stress can lead to negative health outcomes, possibly through altered levels of stress hormones. The dominance rank-stress physiology relationship is known to vary between species; sometimes dominants show higher levels of glucocorticoid stress hormones, whereas in other cases subordinates show higher levels. It is less clear how this relationship varies between groups of different ages or cultures. In this study, we used long-term cortisol measurement methods to compare the effect of rank on cortisol levels in adult and adolescent male rhesus macaques. In the adult groups, subordinates had significantly higher cortisol levels. In the adolescents, no significant correlation between cortisol and status was found. Further analysis demonstrated that the adult hierarchy was stricter than that of the adolescents. Adult subordinates received extreme aggression more frequently than dominants, and this class of behavior was positively correlated with cortisol; by contrast, adolescents showed neither trend. Together, these findings provide evidence for a cortisol-rank relationship determined by social factors, namely, despotism of the group, and highlight the importance of group-specific social analysis when comparing or combining results obtained from different groups of animals.


Subject(s)
Hydrocortisone/analysis, Macaca mulatta/metabolism, Stress, Psychological/metabolism, Age Factors, Aggression, Animals, Behavior, Animal, Male, Social Dominance
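A cortisol-rank relationship like the one described above is typically quantified with a rank correlation, which is robust to the skewed distributions common in hormone data. Below is a generic Spearman correlation sketch on hypothetical numbers; it is not the study's analysis, and the data are invented for illustration (no ties are assumed).

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation (no tie correction) between two paired samples."""
    def ranks(v):
        r = np.empty(len(v))
        r[np.argsort(v)] = np.arange(1, len(v) + 1)  # rank 1 = smallest value
        return r
    rx = ranks(np.asarray(x, dtype=float))
    ry = ranks(np.asarray(y, dtype=float))
    rx -= rx.mean()
    ry -= ry.mean()
    return float(np.sum(rx * ry) / np.sqrt(np.sum(rx**2) * np.sum(ry**2)))

# Hypothetical data: rank 1 = most dominant; cortisol rising toward subordinates,
# as in the adult groups described above
rank = [1, 2, 3, 4, 5, 6]
cortisol = [11.0, 12.5, 13.1, 15.0, 16.2, 18.4]
rho = spearman_rho(rank, cortisol)  # 1.0: perfectly monotone increase with rank
```

A positive rho here matches the adults' pattern (subordinates higher); the adolescents' null result would correspond to rho near zero.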