Results 1 - 20 of 109
1.
PLoS Comput Biol ; 20(1): e1011783, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38206969

ABSTRACT

Neurons throughout the brain modulate their firing rate lawfully in response to sensory input. Theories of neural computation posit that these modulations reflect the outcome of a constrained optimization in which neurons aim to robustly and efficiently represent sensory information. Our understanding of how this optimization varies across different areas in the brain, however, is still in its infancy. Here, we show that neural sensory responses transform along the dorsal stream of the visual system in a manner consistent with a transition from optimizing for information preservation towards optimizing for perceptual discrimination. Focusing on the representation of binocular disparities (the slight differences in the retinal images of the two eyes), we re-analyze measurements characterizing neuronal tuning curves in brain areas V1, V2, and MT (middle temporal) in the macaque monkey. We compare these to measurements of the statistics of binocular disparity typically encountered during natural behaviors using a Fisher Information framework. The differences in tuning curve characteristics across areas are consistent with a shift in optimization goals: V1 and V2 population-level responses are more consistent with maximizing the information encoded about naturally occurring binocular disparities, while MT responses shift towards maximizing the ability to support disparity discrimination. We find that a change towards tuning curves preferring larger disparities is a key driver of this shift. These results provide new insight into previously identified differences between disparity-selective areas of cortex and suggest these differences play an important role in supporting visually-guided behavior. Our findings emphasize the need to consider not just information preservation and neural resources, but also relevance to behavior, when assessing the optimality of neural codes.
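The Fisher Information framework mentioned above can be illustrated with a minimal sketch. Assuming independent Poisson-spiking neurons with Gaussian disparity tuning (a textbook simplification; the paper's actual analysis may differ, and all tuning parameters here are hypothetical), each neuron contributes f'(s)²/f(s) to the population Fisher information about disparity s:

```python
import numpy as np

def gaussian_tuning(s, center, width=0.3, peak=50.0, baseline=1.0):
    """Hypothetical Gaussian tuning curve over binocular disparity s (deg)."""
    return baseline + peak * np.exp(-0.5 * ((s - center) / width) ** 2)

def population_fisher_info(s_grid, centers):
    """Fisher information at each disparity, assuming independent Poisson neurons."""
    info = np.zeros_like(s_grid)
    for c in centers:
        f = gaussian_tuning(s_grid, c)
        fprime = np.gradient(f, s_grid)   # numerical tuning-curve slope
        info += fprime ** 2 / f           # Poisson single-neuron contribution
    return info

s_grid = np.linspace(-2.0, 2.0, 401)      # disparities in degrees
centers = np.linspace(-1.5, 1.5, 25)      # preferred disparities of the population
info = population_fisher_info(s_grid, centers)
```

Discrimination threshold scales as 1/sqrt(I), so where the population packs preferred disparities densely, information is high; shifting preferred disparities (as observed in MT) redistributes this information.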


Subject(s)
Visual Cortex , Animals , Visual Cortex/physiology , Macaca , Vision Disparity , Brain , Neurons/physiology , Photic Stimulation/methods
2.
J Neurosci ; 43(11): 1888-1904, 2023 03 15.
Article in English | MEDLINE | ID: mdl-36725323

ABSTRACT

Smooth eye movements are common during natural viewing; we frequently rotate our eyes to track moving objects or to maintain fixation on an object during self-movement. Reliable information about smooth eye movements is crucial to various neural computations, such as estimating heading from optic flow or judging depth from motion parallax. While it is well established that extraretinal signals (e.g., efference copies of motor commands) carry critical information about eye velocity, the rotational optic flow field produced by eye rotations also carries valuable information. Although previous work has shown that dynamic perspective cues in optic flow can be used in computations that require estimates of eye velocity, it has remained unclear where and how the brain processes these visual cues and how they are integrated with extraretinal signals regarding eye rotation. We examined how neurons in the dorsal region of the medial superior temporal area (MSTd) of two male rhesus monkeys represent the direction of smooth pursuit eye movements based on both visual cues (dynamic perspective) and extraretinal signals. We find that most MSTd neurons have matched preferences for the direction of eye rotation based on visual and extraretinal signals. Moreover, neural responses to combinations of these signals are well predicted by a weighted linear summation model. These findings demonstrate a neural substrate for representing the velocity of smooth eye movements based on rotational optic flow and establish area MSTd as a key node for integrating visual and extraretinal signals into a more generalized representation of smooth eye movements.

SIGNIFICANCE STATEMENT

We frequently rotate our eyes to smoothly track objects of interest during self-motion. Information about eye velocity is crucial for a variety of computations performed by the brain, including depth perception and heading perception.
Traditionally, information about eye rotation has been thought to arise mainly from extraretinal signals, such as efference copies of motor commands. Previous work shows that eye velocity can also be inferred from rotational optic flow that accompanies smooth eye movements, but the neural origins of these visual signals about eye rotation have remained unknown. We demonstrate that macaque neurons signal the direction of smooth eye rotation based on visual signals, and that they integrate both visual and extraretinal signals regarding eye rotation in a congruent fashion.
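The weighted-linear-summation model described above can be sketched in a few lines: a neuron's response to combined visual and extraretinal rotation signals is modeled as a weighted sum of its responses to each signal alone, with weights recovered by ordinary least squares. All responses and weights below are simulated and hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
r_visual = rng.uniform(5, 40, size=n)   # responses to dynamic perspective cues alone
r_extra = rng.uniform(5, 40, size=n)    # responses to pursuit (extraretinal) signals alone

# Simulated combined-condition responses from hypothetical "true" weights.
true_wv, true_we, true_b = 0.6, 0.3, 4.0
r_combined = true_wv * r_visual + true_we * r_extra + true_b \
             + rng.normal(0, 1.0, size=n)

# Fit the linear summation model: design matrix [r_visual, r_extra, 1].
X = np.column_stack([r_visual, r_extra, np.ones(n)])
weights, *_ = np.linalg.lstsq(X, r_combined, rcond=None)
wv, we, b = weights
```

A good fit of this model (recovered weights close to the generating ones, high explained variance) is the kind of evidence the study reports for MSTd neurons.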


Subject(s)
Motion Perception , Optic Flow , Animals , Male , Eye Movements , Cues , Smooth Pursuit , Neurons/physiology , Macaca mulatta , Motion Perception/physiology , Photic Stimulation
3.
J Neurosci ; 42(7): 1235-1253, 2022 02 16.
Article in English | MEDLINE | ID: mdl-34911796

ABSTRACT

There are two distinct sources of retinal image motion: objects moving in the world and observer movement. When the eyes move to track a target of interest, the retinal velocity of some object in the scene will depend on both eye velocity and that object's motion in the world. Thus, to compute the object's velocity relative to the head, a coordinate transformation must be performed by vectorially adding eye velocity and retinal velocity. In contrast, a very different interaction between retinal and eye velocity signals has been proposed to underlie estimation of depth from motion parallax, which involves computing the ratio of retinal and eye velocities. We examined how neurons in the middle temporal (MT) area of male macaques combine eye velocity and retinal velocity, to test whether this interaction is more consistent with a partial coordinate transformation (for computing head-centered object motion) or a multiplicative gain interaction (for computing depth from motion parallax). We find that some MT neurons show clear signatures of a partial coordinate transformation for computing head-centered velocity. Even a small shift toward head-centered velocity tuning can account for the observed depth-sign selectivity of MT neurons, including a strong dependence on speed preference that was previously unexplained. A formal model comparison reveals that the data from many MT neurons are equally well explained by a multiplicative gain interaction or a partial transformation toward head-centered tuning, although some responses can only be well fit by the coordinate transform model. Our findings shed new light on the neural computations performed in area MT, and raise the possibility that depth-sign selectivity emerges from a partial coordinate transformation toward representing head-centered velocity.

SIGNIFICANCE STATEMENT

Eye velocity signals modulate the responses of neurons in the middle temporal (MT) area to retinal image motion.
Two different types of interactions between retinal and eye velocities have previously been considered: a vector addition computation for computing head-centered velocity, and a multiplicative gain interaction for computing depth from motion parallax. Whereas previous work favored a multiplicative gain interaction in MT, we demonstrate that some MT neurons show clear evidence of a partial shift toward coding head-centered velocity. Moreover, we demonstrate that even a small shift toward head coordinates is sufficient to account for the depth-sign selectivity observed previously in area MT, thus raising the possibility that a partial coordinate transformation may also provide a mechanism for computing depth from motion parallax.
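The two competing interaction models can be contrasted in a toy simulation (all tuning parameters hypothetical): in a partial coordinate transform, eye velocity shifts the neuron's retinal-velocity tuning toward head-centered coordinates, whereas in a multiplicative gain interaction it only rescales the tuning curve:

```python
import numpy as np

def tuning(v, pref=8.0, width=4.0):
    """Hypothetical Gaussian tuning over velocity v (deg/s)."""
    return np.exp(-0.5 * ((v - pref) / width) ** 2)

def coordinate_transform_response(v_retinal, v_eye, shift=0.5):
    # Partial transform: tuning operates on retinal velocity plus a
    # fraction of eye velocity (shift=1 would be fully head-centered).
    return tuning(v_retinal + shift * v_eye)

def gain_response(v_retinal, v_eye, gain_slope=0.05):
    # Gain model: eye velocity multiplicatively scales retinal tuning.
    return (1.0 + gain_slope * v_eye) * tuning(v_retinal)

v = np.linspace(-20, 20, 201)                       # retinal velocities, step 0.2
r_ct = coordinate_transform_response(v, v_eye=10.0)
r_gain = gain_response(v, v_eye=10.0)

# Signature difference: the transform model shifts the preferred retinal
# velocity (to pref - shift*v_eye = 3 deg/s here); the gain model keeps
# the peak at 8 deg/s and only changes its height.
peak_ct = v[np.argmax(r_ct)]
peak_gain = v[np.argmax(r_gain)]
```

This peak-shift versus peak-rescale signature is what distinguishes the two hypotheses when responses are measured across combinations of retinal and eye velocity.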


Subject(s)
Models, Neurological , Motion Perception/physiology , Neurons/physiology , Temporal Lobe/physiology , Animals , Macaca mulatta , Male
4.
J Neurosci ; 41(49): 10108-10119, 2021 12 08.
Article in English | MEDLINE | ID: mdl-34716232

ABSTRACT

Multisensory plasticity enables our senses to dynamically adapt to each other and the external environment, a fundamental operation that our brain performs continuously. We searched for neural correlates of adult multisensory plasticity in the dorsal medial superior temporal area (MSTd) and the ventral intraparietal area (VIP) in 2 male rhesus macaques using a paradigm of supervised calibration. We report little plasticity in neural responses in the relatively low-level multisensory cortical area MSTd. In contrast, neural correlates of plasticity are found in higher-level multisensory VIP, an area with strong decision-related activity. Accordingly, we observed systematic shifts of VIP tuning curves, which were reflected in the choice-related component of the population response. This is the first demonstration of neuronal calibration, together with behavioral calibration, in single sessions. These results lay the foundation for understanding multisensory neural plasticity, applicable broadly to maintaining accuracy for sensorimotor tasks.

SIGNIFICANCE STATEMENT

Multisensory plasticity is a fundamental and continual function of the brain that enables our senses to adapt dynamically to each other and to the external environment. Yet, very little is known about the neuronal mechanisms of multisensory plasticity. In this study, we searched for neural correlates of adult multisensory plasticity in the dorsal medial superior temporal area (MSTd) and the ventral intraparietal area (VIP) using a paradigm of supervised calibration. We found little plasticity in neural responses in the relatively low-level multisensory cortical area MSTd. By contrast, neural correlates of plasticity were found in VIP, a higher-level multisensory area with strong decision-related activity. This is the first demonstration of neuronal calibration, together with behavioral calibration, in single sessions.


Subject(s)
Neuronal Plasticity/physiology , Parietal Lobe/physiology , Temporal Lobe/physiology , Animals , Macaca mulatta , Male
5.
J Neurosci ; 41(14): 3254-3265, 2021 04 07.
Article in English | MEDLINE | ID: mdl-33622780

ABSTRACT

Perceptual decision-making is increasingly being understood to involve an interaction between bottom-up sensory-driven signals and top-down choice-driven signals, but how these signals interact to mediate perception is not well understood. The parieto-insular vestibular cortex (PIVC) is an area with prominent vestibular responsiveness, and previous work has shown that inactivating PIVC impairs vestibular heading judgments. To investigate the nature of PIVC's contribution to heading perception, we recorded extracellularly from PIVC neurons in two male rhesus macaques during a heading discrimination task, and compared findings with data from previous studies of dorsal medial superior temporal (MSTd) and ventral intraparietal (VIP) areas using identical stimuli. By computing partial correlations between neural responses, heading, and choice, we find that PIVC activity reflects a dynamically changing combination of sensory and choice signals. In addition, the sensory and choice signals are more balanced in PIVC, in contrast to the sensory dominance in MSTd and choice dominance in VIP. Interestingly, heading and choice signals in PIVC are negatively correlated during the middle portion of the stimulus epoch, reflecting a mismatch in the polarity of heading and choice signals. We anticipate that these results will help unravel the mechanisms of interaction between bottom-up sensory signals and top-down choice signals in perceptual decision-making, leading to more comprehensive models of self-motion perception.

SIGNIFICANCE STATEMENT

Vestibular information is important for our perception of self-motion, and various cortical regions in primates show vestibular heading selectivity. Inactivation of the macaque vestibular cortex substantially impairs the precision of vestibular heading discrimination, more so than inactivation of other multisensory areas.
Here, we record for the first time from the vestibular cortex while monkeys perform a forced-choice heading discrimination task, and we compare results with data collected previously from other multisensory cortical areas. We find that vestibular cortex activity reflects a dynamically changing combination of sensory and choice signals, with both similarities and notable differences with other multisensory areas.


Subject(s)
Choice Behavior/physiology , Head Movements/physiology , Motion Perception/physiology , Parietal Lobe/physiology , Somatosensory Cortex/physiology , Vestibule, Labyrinth/physiology , Animals , Cerebral Cortex/diagnostic imaging , Cerebral Cortex/physiology , Discrimination Learning/physiology , Macaca mulatta , Magnetic Resonance Imaging/methods , Male , Parietal Lobe/diagnostic imaging , Photic Stimulation/methods , Somatosensory Cortex/diagnostic imaging , Vestibule, Labyrinth/diagnostic imaging
6.
Proc Natl Acad Sci U S A ; 116(18): 9060-9065, 2019 04 30.
Article in English | MEDLINE | ID: mdl-30996126

ABSTRACT

The brain infers our spatial orientation and properties of the world from ambiguous and noisy sensory cues. Judging self-motion (heading) in the presence of independently moving objects poses a challenging inference problem because the image motion of an object could be attributed to movement of the object, self-motion, or some combination of the two. We test whether perception of heading and object motion follows predictions of a normative causal inference framework. In a dual-report task, subjects indicated whether an object appeared stationary or moving in the virtual world, while simultaneously judging their heading. Consistent with causal inference predictions, the proportion of object stationarity reports, as well as the accuracy and precision of heading judgments, depended on the speed of object motion. Critically, biases in perceived heading declined when the object was perceived to be moving in the world. Our findings suggest that the brain interprets object motion and self-motion using a causal inference framework.
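The causal-inference computation described above can be sketched with a simple two-hypothesis Bayesian model (priors and noise levels hypothetical, not taken from the paper): under the "stationary" hypothesis, the object's image motion should match the flow predicted from self-motion alone, up to sensory noise; under the "moving" hypothesis, the object's motion is broadly distributed:

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def p_stationary(measured_flow, predicted_flow, sigma=1.0,
                 prior_stationary=0.5, sigma_moving=5.0):
    """Posterior probability the object is stationary in the world."""
    # Stationary: measured flow ≈ flow predicted from self-motion alone.
    like_stat = normal_pdf(measured_flow, predicted_flow, sigma)
    # Moving: object motion adds extra variability to the measured flow.
    like_mov = normal_pdf(measured_flow, predicted_flow,
                          np.sqrt(sigma ** 2 + sigma_moving ** 2))
    return (like_stat * prior_stationary
            / (like_stat * prior_stationary + like_mov * (1 - prior_stationary)))

# Slow residual object motion -> likely "stationary"; fast -> likely "moving",
# matching the speed dependence of stationarity reports described above.
p_slow = p_stationary(measured_flow=0.5, predicted_flow=0.0)
p_fast = p_stationary(measured_flow=8.0, predicted_flow=0.0)
```

In the full framework, heading estimates are also conditioned on (or averaged over) this stationarity judgment, which is why heading biases shrink when the object is perceived as moving.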


Subject(s)
Motion Perception/physiology , Space Perception/physiology , Visual Perception/physiology , Adult , Animals , Cues , Female , Healthy Volunteers , Humans , Judgment/physiology , Macaca mulatta , Male , Motion (Physics) , Movement/physiology , Orientation/physiology , Photic Stimulation/methods
7.
J Neurosci ; 40(5): 1066-1083, 2020 01 29.
Article in English | MEDLINE | ID: mdl-31754013

ABSTRACT

Identifying the features of population responses that are relevant to the amount of information encoded by neuronal populations is a crucial step toward understanding population coding. Statistical features, such as tuning properties, individual and shared response variability, and global activity modulations, could all affect the amount of information encoded and modulate behavioral performance. We show that two features in particular affect information: the modulation of population responses across conditions (population signal) and the inverse population covariability along the modulation axis (projected precision). We demonstrate that fluctuations of these two quantities are correlated with fluctuations of behavioral performance in various tasks and brain regions consistently across 4 monkeys (1 female and 1 male Macaca mulatta; and 2 male Macaca fascicularis). In contrast, fluctuations in mean correlations among neurons and global activity have negligible or inconsistent effects on the amount of information encoded and behavioral performance. We also show that differential correlations reduce the amount of information encoded in finite populations by reducing projected precision. Our results are consistent with predictions of a model that optimally decodes population responses to produce behavior.

SIGNIFICANCE STATEMENT

The last two or three decades of research have seen hot debates about what features of population tuning and trial-by-trial variability influence the information carried by a population of neurons, with some camps arguing, for instance, that mean pairwise correlations or global fluctuations are important while other camps report opposite results. In this study, we identify the most important features of neural population responses that determine the amount of encoded information and behavioral performance by combining analytic calculations with a novel nonparametric method that allows us to isolate the effects of different statistical features.
We tested our hypothesis on 4 macaques, three decision-making tasks, and two brain areas. The predictions of our theory were in agreement with the experimental data.
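The two features named above relate to linear Fisher information in a simple way (notation assumed, not taken from the paper's code): writing Δμ for the mean-response difference between conditions and Σ for the noise covariance, the information Δμᵀ Σ⁻¹ Δμ factors exactly into the squared population signal |Δμ|² times the projected precision uᵀ Σ⁻¹ u along the unit modulation axis u = Δμ/|Δμ|:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20                                    # neurons
delta_mu = rng.normal(0, 1, n)            # modulation of mean responses across conditions
A = rng.normal(0, 1, (n, n))
sigma = A @ A.T / n + np.eye(n)           # well-conditioned noise covariance (SPD)

pop_signal = np.linalg.norm(delta_mu)     # population signal |Δμ|
u = delta_mu / pop_signal                 # unit modulation axis
projected_precision = u @ np.linalg.solve(sigma, u)

info = pop_signal ** 2 * projected_precision            # factored form
info_direct = delta_mu @ np.linalg.solve(sigma, delta_mu)  # Δμᵀ Σ⁻¹ Δμ
```

Differential correlations (noise aligned with Δμ) inflate the variance along u, shrinking the projected precision and hence the information, which is the mechanism the abstract describes.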


Subject(s)
Neural Networks, Computer , Neurons/physiology , Prefrontal Cortex/physiology , Psychomotor Performance/physiology , Temporal Lobe/physiology , Animals , Attention/physiology , Behavior, Animal , Discriminant Analysis , Female , Macaca fascicularis , Macaca mulatta , Male , Models, Neurological , Motion Perception/physiology , Visual Perception/physiology
8.
Cereb Cortex ; 30(4): 2658-2672, 2020 04 14.
Article in English | MEDLINE | ID: mdl-31828299

ABSTRACT

Visual motion processing is a well-established model system for studying neural population codes in primates. The common marmoset, a small new world primate, offers unparalleled opportunities to probe these population codes in key motion processing areas, such as cortical areas MT and MST, because these areas are accessible for imaging and recording at the cortical surface. However, little is currently known about the perceptual abilities of the marmoset. Here, we introduce a paradigm for studying motion perception in the marmoset and compare their psychophysical performance with human observers. We trained two marmosets to perform a motion estimation task in which they provided an analog report of their perceived direction of motion with an eye movement to a ring that surrounded the motion stimulus. Marmosets and humans exhibited similar trade-offs in speed versus accuracy: errors were larger and reaction times were longer as the strength of the motion signal was reduced. Reverse correlation on the temporal fluctuations in motion direction revealed that both species exhibited short integration windows; however, marmosets had substantially less nondecision time than humans. Our results provide the first quantification of motion perception in the marmoset and demonstrate several advantages to using analog estimation tasks.


Subject(s)
Eye Movements/physiology , Motion Perception/physiology , Photic Stimulation/methods , Reaction Time/physiology , Visual Cortex/physiology , Adult , Animals , Callithrix , Female , Humans , Male , Middle Aged , Species Specificity , Young Adult
9.
Proc Natl Acad Sci U S A ; 115(14): E3305-E3312, 2018 04 03.
Article in English | MEDLINE | ID: mdl-29555744

ABSTRACT

By systematically manipulating head position relative to the body and eye position relative to the head, previous studies have shown that vestibular tuning curves of neurons in the ventral intraparietal (VIP) area remain invariant when expressed in body-/world-centered coordinates. However, body orientation relative to the world was not manipulated; thus, an egocentric, body-centered representation could not be distinguished from an allocentric, world-centered reference frame. We manipulated the orientation of the body relative to the world such that we could distinguish whether vestibular heading signals in VIP are organized in body- or world-centered reference frames. We found a hybrid representation, depending on gaze direction. When gaze remained fixed relative to the body, the vestibular heading tuning of VIP neurons shifted systematically with body orientation, indicating an egocentric, body-centered reference frame. In contrast, when gaze remained fixed relative to the world, this representation changed to be intermediate between body- and world-centered. We conclude that the neural representation of heading in posterior parietal cortex is flexible, depending on gaze and possibly attentional demands.


Subject(s)
Eye Movements/physiology , Head Movements/physiology , Motion Perception/physiology , Neurons/physiology , Parietal Lobe/physiology , Animals , Orientation
10.
Cereb Cortex ; 29(9): 3932-3947, 2019 08 14.
Article in English | MEDLINE | ID: mdl-30365011

ABSTRACT

We examined the responses of neurons in posterior parietal area 7a to passive rotational and translational self-motion stimuli, while systematically varying the speed of visually simulated (optic flow cues) or actual (vestibular cues) self-motion. Contrary to a general belief that responses in area 7a are predominantly visual, we found evidence for a vestibular dominance in self-motion processing. Only a small fraction of neurons showed multisensory convergence of visual/vestibular and linear/angular self-motion cues. These findings suggest possibly independent neuronal population codes for visual versus vestibular and linear versus angular self-motion. Neural responses scaled with self-motion magnitude (i.e., speed) but temporal dynamics were diverse across the population. Analyses of laminar recordings showed a strong distance-dependent decrease for correlations in stimulus-induced (signal correlation) and stimulus-independent (noise correlation) components of spike-count variability, supporting the notion that neurons are spatially clustered with respect to their sensory representation of motion. Single-unit and multiunit response patterns were also correlated, but no other systematic dependencies on cortical layers or columns were observed. These findings describe a likely independent multimodal neural code for linear and angular self-motion in a posterior parietal area of the macaque brain that is connected to the hippocampal formation.


Subject(s)
Motion Perception/physiology , Movement/physiology , Neurons/physiology , Parietal Lobe/physiology , Vestibule, Labyrinth/physiology , Animals , Macaca mulatta , Male , Motion (Physics) , Optic Flow/physiology
11.
J Vis ; 20(10): 8, 2020 10 01.
Article in English | MEDLINE | ID: mdl-33016983

ABSTRACT

During self-motion, an independently moving object generates retinal motion that is the vector sum of its world-relative motion and the optic flow caused by the observer's self-motion. A hypothesized mechanism for the computation of an object's world-relative motion is flow parsing, in which the optic flow field due to self-motion is globally subtracted from the retinal flow field. This subtraction generates a bias in perceived object direction (in retinal coordinates) away from the optic flow vector at the object's location. Despite psychophysical evidence for flow parsing in humans, the neural mechanisms underlying the process are unknown. To build the framework for investigation of the neural basis of flow parsing, we trained macaque monkeys to discriminate the direction of a moving object in the presence of optic flow simulating self-motion. Like humans, monkeys showed biases in object direction perception consistent with subtraction of background optic flow attributable to self-motion. The size of perceptual biases generally depended on the magnitude of the expected optic flow vector at the location of the object, which was contingent on object position and self-motion velocity. There was a modest effect of an object's depth on flow-parsing biases, which reached significance in only one of two subjects. Adding vestibular self-motion signals to optic flow facilitated flow parsing, increasing biases in direction perception. Our findings indicate that monkeys exhibit perceptual hallmarks of flow parsing, setting the stage for the examination of the neural mechanisms underlying this phenomenon.
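The flow-parsing computation described above reduces to a vector subtraction, and the perceptual biases follow naturally if the subtraction is incomplete. A minimal sketch with hypothetical 2-D flow vectors:

```python
import numpy as np

object_world_motion = np.array([2.0, 0.0])   # deg/s, object's motion in the world
self_motion_flow = np.array([-1.5, 0.5])     # optic-flow vector at the object's
                                             # retinal location due to self-motion

# Retinal motion confounds the two components (vector sum).
retinal_motion = object_world_motion + self_motion_flow

# Ideal flow parsing: globally subtract the self-motion flow field.
recovered = retinal_motion - self_motion_flow

# Incomplete subtraction (gain < 1) leaves a residual along the flow
# vector, i.e. a bias in perceived object direction whose size grows with
# the magnitude of the expected flow vector, as reported above.
gain = 0.8
biased = retinal_motion - gain * self_motion_flow
```

Here `recovered` equals the true world motion, while `biased` retains 20% of the self-motion flow, illustrating why bias magnitude depends on object position and self-motion velocity.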


Subject(s)
Motion Perception , Optic Flow/physiology , Animals , Haplorhini , Humans , Macaca mulatta , Male , Retina/physiology
12.
J Neurophysiol ; 121(4): 1207-1221, 2019 04 01.
Article in English | MEDLINE | ID: mdl-30699042

ABSTRACT

Multiple areas of macaque cortex are involved in visual motion processing, but their relative functional roles remain unclear. The medial superior temporal (MST) area is typically divided into lateral (MSTl) and dorsal (MSTd) subdivisions that are thought to be involved in processing object motion and self-motion, respectively. Whereas MSTd has been studied extensively with regard to processing visual and nonvisual self-motion cues, little is known about self-motion signals in MSTl, especially nonvisual signals. Moreover, little is known about how self-motion and object motion signals interact in MSTl and how this differs from interactions in MSTd. We compared the visual and vestibular heading tuning of neurons in MSTl and MSTd using identical stimuli. Our findings reveal that both visual and vestibular heading signals are weaker in MSTl than in MSTd, suggesting that MSTl is less well suited to participate in self-motion perception than MSTd. We also tested neurons in both areas with a variety of combinations of object motion and self-motion. Our findings reveal that vestibular signals improve the separability of coding of heading and object direction in both areas, albeit more strongly in MSTd due to the greater strength of vestibular signals. Based on a marginalization technique, population decoding reveals that heading and object direction can be more effectively dissociated from MSTd responses than MSTl responses. Our findings help to clarify the respective contributions that MSTl and MSTd make to processing of object motion and self-motion, although our conclusions may be somewhat specific to the multipart moving objects that we employed.

NEW & NOTEWORTHY

Retinal image motion reflects contributions from both the observer's self-motion and the movement of objects in the environment. The neural mechanisms by which the brain dissociates self-motion and object motion remain unclear.
This study provides the first systematic examination of how the lateral subdivision of area MST (MSTl) contributes to dissociating object motion and self-motion. We also examine, for the first time, how MSTl neurons represent translational self-motion based on both vestibular and visual cues.


Subject(s)
Motion Perception , Movement , Temporal Lobe/physiology , Animals , Cues , Macaca mulatta , Male , Neurons/physiology , Temporal Lobe/cytology , Vestibule, Labyrinth/physiology , Visual Perception
13.
PLoS Comput Biol ; 14(9): e1006371, 2018 09.
Article in English | MEDLINE | ID: mdl-30248091

ABSTRACT

Studies of neuron-behaviour correlation and causal manipulation have long been used separately to understand the neural basis of perception. Yet these approaches sometimes lead to drastically conflicting conclusions about the functional role of brain areas. Theories that focus only on choice-related neuronal activity cannot reconcile those findings without additional experiments involving large-scale recordings to measure interneuronal correlations. By expanding current theories of neural coding and incorporating results from inactivation experiments, we demonstrate here that it is possible to infer decoding weights of different brain areas at a coarse scale without precise knowledge of the correlation structure. We apply this technique to neural data collected from two different cortical areas in macaque monkeys trained to perform a heading discrimination task. We identify two opposing decoding schemes, each consistent with data depending on the nature of correlated noise. Our theory makes specific testable predictions to distinguish these scenarios experimentally without requiring measurement of the underlying noise correlations.


Subject(s)
Brain/physiology , Motion Perception/physiology , Neurons/physiology , Algorithms , Animals , Choice Behavior , Computer Simulation , Macaca mulatta , Models, Neurological , Motion (Physics) , Normal Distribution
14.
Proc Natl Acad Sci U S A ; 113(18): 5077-82, 2016 May 03.
Article in English | MEDLINE | ID: mdl-27095846

ABSTRACT

Terrestrial navigation naturally involves translations within the horizontal plane and eye rotations about a vertical (yaw) axis to track and fixate targets of interest. Neurons in the macaque ventral intraparietal (VIP) area are known to represent heading (the direction of self-translation) from optic flow in a manner that is tolerant to rotational visual cues generated during pursuit eye movements. Previous studies have also reported that eye rotations modulate the response gain of heading tuning curves in VIP neurons. We tested the hypothesis that VIP neurons simultaneously represent both heading and horizontal (yaw) eye rotation velocity by measuring heading tuning curves for a range of rotational velocities of either real or simulated eye movements. Three findings support the hypothesis of a joint representation. First, we show that rotation velocity selectivity based on gain modulations of visual heading tuning is similar to that measured during pure rotations. Second, gain modulations of heading tuning are similar for self-generated eye rotations and visually simulated rotations, indicating that the representation of rotation velocity in VIP is multimodal, driven by both visual and extraretinal signals. Third, we show that roughly one-half of VIP neurons jointly represent heading and rotation velocity in a multiplicatively separable manner. These results provide the first evidence, to our knowledge, for a joint representation of translation direction and rotation velocity in parietal cortex and show that rotation velocity can be represented based on visual cues, even in the absence of efference copy signals.
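"Multiplicatively separable" joint tuning, as described above, means the response matrix over heading × rotation velocity is an outer product of a heading tuning curve and a rotation gain profile. A toy sketch with hypothetical tuning shapes, using the fact that such a matrix is rank 1 (checkable via SVD):

```python
import numpy as np

headings = np.linspace(0, 360, 36, endpoint=False)   # heading directions (deg)
rot_velocities = np.array([-20, -10, 0, 10, 20])     # yaw rotation velocities (deg/s)

f_heading = 10 + 8 * np.cos(np.deg2rad(headings - 90))  # heading tuning curve
g_rotation = 1 + 0.03 * rot_velocities                  # rotation gain profile

# Separable joint tuning: R(h, r) = f(h) * g(r), i.e. an outer product.
response = np.outer(f_heading, g_rotation)

# Fraction of variance captured by the first singular value is ~1 for a
# rank-1 (separable) response matrix; real neurons would fall short of 1.
s = np.linalg.svd(response, compute_uv=False)
sep_index = s[0] ** 2 / np.sum(s ** 2)
```

An index near 1 indicates the gain-modulation form the paper reports for roughly half of VIP neurons; mixtures or additive interactions would spread variance across additional singular values.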


Subject(s)
Cues , Eye Movements/physiology , Motion Perception/physiology , Optic Flow/physiology , Parietal Lobe/physiology , Spatial Navigation/physiology , Animals , Macaca mulatta , Male , Orientation/physiology , Rotation
15.
J Neurosci ; 37(46): 11204-11219, 2017 11 15.
Article in English | MEDLINE | ID: mdl-29030435

ABSTRACT

We use visual image motion to judge the movement of objects, as well as our own movements through the environment. Generally, image motion components caused by object motion and self-motion are confounded in the retinal image. Thus, to estimate heading, the brain would ideally marginalize out the effects of object motion (or vice versa), but little is known about how this is accomplished neurally. Behavioral studies suggest that vestibular signals play a role in dissociating object motion and self-motion, and recent computational work suggests that a linear decoder can approximate marginalization by taking advantage of diverse multisensory representations. By measuring responses of MSTd neurons in two male rhesus monkeys and by applying a recently developed method to approximate marginalization by linear population decoding, we tested the hypothesis that vestibular signals help to dissociate self-motion and object motion. We show that vestibular signals stabilize tuning for heading in neurons with congruent visual and vestibular heading preferences, whereas they stabilize tuning for object motion in neurons with discrepant preferences. Thus, vestibular signals enhance the separability of joint tuning for object motion and self-motion. We further show that a linear decoder, designed to approximate marginalization, allows the population to represent either self-motion or object motion with good accuracy. Decoder weights are broadly consistent with a readout strategy, suggested by recent computational work, in which responses are decoded according to the vestibular preferences of multisensory neurons. These results demonstrate, at both single neuron and population levels, that vestibular signals help to dissociate self-motion and object motion.

SIGNIFICANCE STATEMENT

The brain often needs to estimate one property of a changing environment while ignoring others. This can be difficult because multiple properties of the environment may be confounded in sensory signals.
The brain can solve this problem by marginalizing over irrelevant properties to estimate the property of interest. We explore this problem in the context of self-motion and object motion, which are inherently confounded in the retinal image. We examine how diversity in a population of multisensory neurons may be exploited to decode self-motion and object motion from the population activity of neurons in macaque area MSTd.
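The linear-decoding idea summarized in this abstract can be sketched numerically. The following is a minimal, hypothetical illustration, not the paper's model or data: a simulated population whose units mix heading and object-motion tuning with diverse weights, and a least-squares linear readout of heading that approximately marginalizes out object motion. All tuning functions, unit counts, and noise levels here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: each unit has a preferred heading and a
# preferred object direction; responses confound the two variables.
n_units, n_trials = 60, 4000
pref_h = rng.uniform(-np.pi, np.pi, n_units)
pref_o = rng.uniform(-np.pi, np.pi, n_units)
mix = rng.uniform(0.3, 0.7, n_units)   # diverse heading/object weighting

def responses(h, o):
    """Toy tuning: a weighted sum of heading and object-motion tuning."""
    rh = np.cos(h[:, None] - pref_h[None, :])
    ro = np.cos(o[:, None] - pref_o[None, :])
    return mix * rh + (1 - mix) * ro

h = rng.uniform(-np.pi, np.pi, n_trials)       # heading on each trial
o = rng.uniform(-np.pi, np.pi, n_trials)       # object direction, varied
R = responses(h, o) + 0.1 * rng.standard_normal((n_trials, n_units))

# Linear readout of heading (as a 2D unit vector, to handle the circular
# variable); the fitted weights approximately null the object-motion
# subspace -- this is the "approximate marginalization" step.
Y = np.column_stack([np.cos(h), np.sin(h)])
W, *_ = np.linalg.lstsq(R, Y, rcond=None)

h_hat = np.arctan2(R @ W[:, 1], R @ W[:, 0])
err = np.angle(np.exp(1j * (h_hat - h)))       # wrapped heading error
print(f"median |heading error|: {np.degrees(np.median(np.abs(err))):.1f} deg")
```

Decoding heading as a cosine/sine pair rather than a raw angle avoids the wrap-around discontinuity at ±180°, which would otherwise break the least-squares fit.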


Subject(s)
Brain/physiology, Head Movements/physiology, Motion Perception/physiology, Photic Stimulation/methods, Animals, Macaca mulatta, Male
16.
J Neurosci ; 37(34): 8180-8197, 2017 08 23.
Article in English | MEDLINE | ID: mdl-28739582

ABSTRACT

Observer translation produces differential image motion between objects that are located at different distances from the observer's point of fixation [motion parallax (MP)]. However, MP can be ambiguous with respect to depth sign (near vs far), and this ambiguity can be resolved by combining retinal image motion with signals regarding eye movement relative to the scene. We have previously demonstrated that both extra-retinal and visual signals related to smooth eye movements can modulate the responses of neurons in area MT of macaque monkeys, and that these modulations generate neural selectivity for depth sign. However, the neural mechanisms that govern this selectivity have remained unclear. In this study, we analyze responses of MT neurons as a function of both retinal velocity and direction of eye movement, and we show that smooth eye movements modulate MT responses in a systematic, temporally precise, and directionally specific manner to generate depth-sign selectivity. We demonstrate that depth-sign selectivity is primarily generated by multiplicative modulations of the response gain of MT neurons. Through simulations, we further demonstrate that depth can be estimated reasonably well by a linear decoding of a population of MT neurons with response gains that depend on eye velocity. Together, our findings provide the first mechanistic description of how visual cortical neurons signal depth from MP.

SIGNIFICANCE STATEMENT: Motion parallax is a monocular cue to depth that commonly arises during observer translation. To compute from motion parallax whether an object appears nearer or farther than the point of fixation requires combining retinal image motion with signals related to eye rotation, but the neurobiological mechanisms have remained unclear. This study provides the first mechanistic account of how this interaction takes place in the responses of cortical neurons.
Specifically, we show that smooth eye movements modulate the gain of responses of neurons in area MT in a directionally specific manner to generate selectivity for depth sign from motion parallax. We also show, through simulations, that depth could be estimated from a population of such gain-modulated neurons.
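The multiplicative gain-modulation scheme described in this abstract can be illustrated with a toy computation. This is a hedged sketch, not the authors' model: the Gaussian tuning curve, gain parameters, and numbers are all hypothetical.

```python
import numpy as np

# Hypothetical MT-like unit: retinal-velocity tuning scaled by a
# multiplicative gain that depends on the direction of the smooth eye
# movement (+1 for one pursuit direction, -1 for the other).
def mt_response(retinal_vel, eye_dir, pref_vel=4.0, sigma=2.0, g=0.5):
    tuning = np.exp(-(retinal_vel - pref_vel) ** 2 / (2 * sigma ** 2))
    gain = 1.0 + g * eye_dir        # eye-velocity-dependent response gain
    return gain * tuning

# Under motion parallax, the same retinal motion paired with opposite eye
# movements corresponds to opposite depth signs.  The gain modulation makes
# the unit fire differently in the two cases, so a linear readout of such
# gain-modulated units can signal near vs. far.
r_one_sign = mt_response(4.0, eye_dir=+1.0)   # identical retinal stimulus...
r_other_sign = mt_response(4.0, eye_dir=-1.0) # ...opposite eye movement
print(r_one_sign, r_other_sign)
```

The key point of the sketch is that without the eye-dependent gain the two responses would be identical, and depth sign would be unrecoverable from this unit.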


Subject(s)
Depth Perception/physiology, Eye Movements/physiology, Motion Perception/physiology, Photic Stimulation/methods, Visual Cortex/physiology, Animals, Macaca mulatta, Male
17.
J Neurophysiol ; 119(3): 1113-1126, 2018 03 01.
Article in English | MEDLINE | ID: mdl-29187554

ABSTRACT

The ventral intraparietal area (VIP) of the macaque brain is a multimodal cortical region, with many cells tuned to both optic flow and vestibular stimuli. Responses of many VIP neurons also show robust correlations with perceptual judgments during a fine heading discrimination task. Previous studies have shown that heading tuning based on optic flow is represented in a clustered fashion in VIP. However, it is unknown whether vestibular self-motion selectivity is clustered in VIP. Moreover, it is not known whether stimulus- and choice-related signals in VIP show clustering in the context of a heading discrimination task. To address these issues, we compared the response characteristics of isolated single units (SUs) with those of the undifferentiated multiunit (MU) activity corresponding to several neighboring neurons recorded from the same microelectrode. We find that MU activity typically shows selectivity similar to that of simultaneously recorded SUs, for both the vestibular and visual stimulus conditions. In addition, the choice-related activity of MU signals, as quantified using choice probabilities, is correlated with the choice-related activity of SUs. Overall, these findings suggest that both sensory and choice-related signals regarding self-motion are clustered in VIP.

NEW & NOTEWORTHY: We demonstrate, for the first time, that the vestibular tuning of ventral intraparietal area (VIP) neurons in response to both translational and rotational motion is clustered. In addition, heading discriminability and choice-related activity are also weakly clustered in VIP.
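Choice probability, the metric used in this abstract to quantify choice-related activity, is conventionally computed as the area under an ROC curve comparing spike-count distributions sorted by the animal's choice. A small self-contained sketch (the spike counts below are simulated, not recorded data):

```python
import numpy as np

def choice_probability(counts_pref_choice, counts_null_choice):
    """ROC-based choice probability: the probability that a random count
    from trials ending in the neuron's preferred choice exceeds one from
    the other choice, with ties counted as half (area under the ROC)."""
    a = np.asarray(counts_pref_choice, float)
    b = np.asarray(counts_null_choice, float)
    greater = (a[:, None] > b[None, :]).mean()
    ties = (a[:, None] == b[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(1)
# Simulated spike counts sorted by choice; a small mean offset between the
# two distributions yields a CP above the chance level of 0.5.
pref = rng.poisson(22, 200)   # trials ending in the preferred choice
null = rng.poisson(20, 200)   # trials ending in the other choice
cp = choice_probability(pref, null)
print(f"choice probability: {cp:.2f}")
```

A CP of 0.5 indicates no relationship between firing rate and choice; values approaching 1.0 indicate that single-trial counts reliably predict the upcoming decision.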


Subject(s)
Choice Behavior, Motion Perception/physiology, Neurons/physiology, Parietal Lobe/physiology, Vestibule, Labyrinth/physiology, Animals, Macaca mulatta, Male, Photic Stimulation
18.
Nat Rev Neurosci ; 14(6): 429-42, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23686172

ABSTRACT

The richness of perceptual experience, as well as its usefulness for guiding behaviour, depends on the synthesis of information across multiple senses. Recent decades have witnessed a surge in our understanding of how the brain combines sensory cues. Much of this research has been guided by one of two distinct approaches: one is driven primarily by neurophysiological observations, and the other is guided by principles of mathematical psychology and psychophysics. Conflicting results and interpretations have contributed to a conceptual gap between psychophysical and physiological accounts of cue integration, but recent studies of visual-vestibular cue integration have narrowed this gap considerably.


Subject(s)
Brain/physiology, Cues, Models, Neurological, Perception/physiology, Sensory Receptor Cells/physiology, Vision, Ocular/physiology, Animals, Brain/cytology, Humans, Psychological Theory
19.
Proc Natl Acad Sci U S A ; 112(20): 6467-72, 2015 May 19.
Article in English | MEDLINE | ID: mdl-25941358

ABSTRACT

How the activity of sensory neurons leads to perceptual decisions remains a challenge to understand. Correlations between choices and single neuron firing rates have been found early in vestibular processing, in the brainstem and cerebellum. To investigate the origins of choice-related activity, we have recorded from otolith afferent fibers while animals performed a fine heading discrimination task. We find that afferent fibers have discrimination thresholds similar to those of central cells, and the most sensitive fibers have thresholds that are only twofold or threefold greater than perceptual thresholds. Unlike brainstem and cerebellar nuclei neurons, spike counts from afferent fibers do not exhibit trial-by-trial correlations with perceptual decisions. This finding may reflect the fact that otolith afferent responses are poorly suited for driving heading perception because they fail to discriminate self-motion from changes in orientation relative to gravity. Alternatively, if choice probabilities reflect top-down inference signals, they are not relayed to the vestibular periphery.


Subject(s)
Choice Behavior/physiology, Orientation/physiology, Otolithic Membrane/innervation, Sensory Thresholds/physiology, Space Perception/physiology, Afferent Pathways/physiology, Animals, Macaca mulatta, Male, Motion Perception/physiology, ROC Curve
20.
J Neurosci ; 36(13): 3789-98, 2016 Mar 30.
Article in English | MEDLINE | ID: mdl-27030763

ABSTRACT

Multisensory convergence of visual and vestibular signals has been observed within a network of cortical areas involved in representing heading. Vestibular-dominant heading tuning has been found in the macaque parietoinsular vestibular cortex (PIVC) and the adjacent visual posterior sylvian (VPS) area, whereas relatively balanced visual/vestibular tuning was encountered in the ventral intraparietal (VIP) area and visual-dominant tuning was found in the dorsal medial superior temporal (MSTd) area. Although the respective functional roles of these areas remain unclear, perceptual deficits in heading discrimination following reversible chemical inactivation of area MSTd suggested that areas with vestibular-dominant heading tuning also contribute to behavior. To explore the roles of other areas in heading perception, muscimol injections were used to reversibly inactivate either the PIVC or the VIP area bilaterally in macaques. Inactivation of the anterior PIVC increased psychophysical thresholds when heading judgments were based on either optic flow or vestibular cues, although effects were stronger for vestibular stimuli. All behavioral deficits recovered within 36 h. Visual deficits were larger following inactivation of the posterior portion of the PIVC, likely because these injections encroached upon the VPS area, which contains neurons with optic flow tuning (unlike the PIVC). In contrast, VIP inactivation led to no behavioral deficits, despite the fact that VIP neurons show much stronger choice-related activity than MSTd neurons. These results suggest that the VIP area either provides a parallel and partially redundant pathway for this task, or does not participate in heading discrimination. In contrast, the PIVC and VPS areas, along with the MSTd area, make causal contributions to heading perception based on either vestibular or visual signals.
SIGNIFICANCE STATEMENT: Multisensory vestibular and visual signals are found in multiple cortical areas, but their causal contribution to self-motion perception has been previously tested only in the dorsal medial superior temporal (MSTd) area. In these experiments, we show that inactivation of the parietoinsular vestibular cortex (PIVC) also results in causal deficits during heading discrimination for both visual and vestibular cues. In contrast, ventral intraparietal (VIP) area inactivation led to no behavioral deficits, despite the fact that VIP neurons show much stronger choice-related activity than MSTd or PIVC neurons. These results demonstrate that choice-related activity does not always imply a causal role in sensory perception.


Subject(s)
Head Movements/physiology, Motion Perception/physiology, Optic Flow/physiology, Parietal Lobe/physiology, Temporal Lobe/physiology, Animals, Brain Mapping, Cues, Discrimination, Psychological, Functional Laterality/drug effects, GABA-A Receptor Agonists/pharmacology, Head Movements/drug effects, Image Processing, Computer-Assisted, Macaca mulatta, Magnetic Resonance Imaging, Male, Motion Perception/drug effects, Muscimol/pharmacology, Nerve Net/drug effects, Nerve Net/physiology, Parietal Lobe/drug effects, Photic Stimulation, Psychometrics, Psychophysics, Temporal Lobe/drug effects