Results 1 - 7 of 7
1.
PLoS Biol ; 22(6): e3002670, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38917200

ABSTRACT

Low and high beta frequency rhythms have been observed in the motor cortex, but their respective sources and behavioral correlates remain unknown. We studied local field potentials (LFPs) during pre-cued reaching behavior in macaques. They contained a low beta band (<20 Hz) dominant in primary motor cortex and a high beta band (>20 Hz) dominant in dorsal premotor cortex (PMd). Low beta correlated positively with reaction time (RT) from visual cue onset and negatively with uninstructed hand postural micro-movements throughout the trial. High beta reflected temporal task prediction, with selective modulations before and during cues that were enhanced in moments of increased focal attention, when the gaze was on the work area. This double dissociation in the sources and behavioral correlates of motor cortical low and high beta, with respect to both task-instructed and spontaneous behavior, reconciles the largely disparate roles proposed for the beta rhythm by suggesting band-specific roles in both movement control and spatiotemporal attention.
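The abstract describes trial-wise correlations between band-limited beta power and behavior. As a rough illustration of that kind of analysis, not the authors' pipeline, the sketch below filters LFP trials into a low-beta and a high-beta band and correlates low-beta power with reaction time; the band edges, sampling rate, and data are placeholder assumptions.

```python
# Hypothetical sketch (not the authors' pipeline): per-trial band power vs. RT.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import pearsonr

def band_power(lfp, fs, low, high):
    """Mean envelope power of `lfp` (trials x samples) in the [low, high] Hz band."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, lfp, axis=-1)
    envelope = np.abs(hilbert(filtered, axis=-1))
    return (envelope ** 2).mean(axis=-1)           # one value per trial

fs = 1000.0                                         # assumed sampling rate (Hz)
lfp_trials = np.random.randn(200, 2000)             # placeholder: 200 trials, 2 s each
reaction_times = np.random.uniform(0.2, 0.6, 200)   # placeholder RTs (s)

low_beta = band_power(lfp_trials, fs, 13, 20)        # assumed low-beta band
high_beta = band_power(lfp_trials, fs, 20, 35)       # assumed high-beta band
r, p = pearsonr(low_beta, reaction_times)
print(f"low-beta power vs RT: r = {r:.2f}, p = {p:.3f}")
```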


Subject(s)
Attention, Beta Rhythm, Macaca mulatta, Motor Cortex, Movement, Reaction Time, Animals, Motor Cortex/physiology, Attention/physiology, Beta Rhythm/physiology, Movement/physiology, Reaction Time/physiology, Macaca mulatta/physiology, Male, Cues, Psychomotor Performance/physiology
2.
J Neurophysiol ; 120(2): 539-552, 2018 Aug 01.
Article in English | MEDLINE | ID: mdl-29718806

ABSTRACT

Large-scale network dynamics across multiple visuomotor areas are of great interest in the study of eye-hand coordination in both humans and monkeys. To explore this, it is essential to develop a setup that allows precise tracking of eye and hand movements, and that can generate mechanical or visual perturbations of hand trajectories so that eye-hand coordination can be studied under a variety of conditions. Simple solutions exist that satisfy these requirements when hand movements are performed in the horizontal plane while visual stimuli and hand feedback are presented in the vertical plane. However, this spatial dissociation requires cognitive rules for eye-hand coordination different from those used when eye and hand movements are performed in the same space, as is the case in most natural conditions. Here we present an innovative solution for the precise tracking of eye and hand movements in a single reference frame. Importantly, our solution allows behavioral explorations under normal and perturbed conditions in both humans and monkeys. It is based on the integration of two noninvasive, commercially available systems to achieve online control and synchronous recording of eye (EyeLink) and hand (KINARM) positions during interactive visuomotor tasks. We also present an eye calibration method, compatible with different eye trackers, that compensates for nonlinearities caused by the system's geometry. Our setup monitors the two effectors in real time with high spatial and temporal resolution and simultaneously outputs behavioral and neuronal data to an external data acquisition system using a common data format. NEW & NOTEWORTHY We developed a new setup for studying eye-hand coordination in humans and monkeys that monitors the two effectors in real time in a common reference frame. Our eye calibration method allows us to track gaze positions relative to visual stimuli presented in the horizontal workspace of the hand movements. This method compensates for nonlinearities caused by the system's geometry and transforms kinematic signals from the eye tracker into the same coordinate system as the hand and targets.
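To make the calibration idea concrete, here is a minimal sketch under the assumption of a second-order polynomial mapping fit by least squares; it only illustrates compensating geometric nonlinearities by mapping raw tracker coordinates into the hand/target workspace, and is not the actual EyeLink/KINARM integration code.

```python
# Minimal sketch of a polynomial gaze calibration (assumed form, not the paper's).
import numpy as np

def design_matrix(xy):
    """Quadratic polynomial features of raw gaze samples (N x 2)."""
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_calibration(raw_gaze, target_xy):
    """Least-squares map from raw tracker units to workspace coordinates."""
    A = design_matrix(raw_gaze)
    coeffs, *_ = np.linalg.lstsq(A, target_xy, rcond=None)
    return coeffs                                   # shape (6, 2)

def apply_calibration(raw_gaze, coeffs):
    return design_matrix(raw_gaze) @ coeffs         # gaze in the hand/target frame

# Placeholder calibration data: fixations on 9 known targets in the workspace (cm).
raw = np.random.randn(9, 2) * 100                   # raw tracker units
targets = np.array([[x, y] for x in (-10, 0, 10) for y in (-10, 0, 10)], float)
coeffs = fit_calibration(raw, targets)
gaze_cm = apply_calibration(raw, coeffs)
```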


Subject(s)
Electroencephalography/instrumentation, Eye Movement Measurements/instrumentation, Eye Movements, Hand/physiology, Psychomotor Performance, Animals, Biomechanical Phenomena, Calibration, Female, Humans, Macaca mulatta, Software
3.
Cell Rep ; 43(7): 114371, 2024 Jun 25.
Article in English | MEDLINE | ID: mdl-38923458

ABSTRACT

High-dimensional brain activity is often organized into lower-dimensional neural manifolds. However, the neural manifolds of the visual cortex remain understudied. Here, we study large-scale multi-electrode electrophysiological recordings of macaque (Macaca mulatta) areas V1, V4, and DP with high spatiotemporal resolution. We find that the population activity of V1 contains two separate neural manifolds, which correlate strongly with eye closure (eyes open vs. closed) and have distinct dimensionalities. Moreover, we find strong top-down signals from V4 to V1, particularly to the foveal region of V1, which are significantly stronger during the eyes-open periods. Finally, in silico simulations of a balanced spiking neuron network qualitatively reproduce the experimental findings. Taken together, our analyses and simulations suggest that top-down signals modulate the population activity of V1. We postulate that the top-down modulation during the eyes-open periods prepares V1 for fast and efficient visual responses, resulting in a type of visual stand-by state.
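As a hedged illustration of how distinct manifold dimensionalities could be compared, the sketch below estimates the linear dimensionality of population activity with PCA for eyes-open versus eyes-closed data; the 90% variance criterion and the placeholder data are assumptions, not the authors' method.

```python
# Illustrative comparison of linear dimensionality (not the authors' analysis code).
import numpy as np
from sklearn.decomposition import PCA

def linear_dimensionality(activity, variance_threshold=0.9):
    """`activity`: time bins x neurons. Number of PCs needed to reach the threshold."""
    explained = PCA().fit(activity).explained_variance_ratio_
    return int(np.searchsorted(np.cumsum(explained), variance_threshold) + 1)

# Placeholder data standing in for binned population activity.
rng = np.random.default_rng(0)
eyes_open = rng.normal(size=(5000, 100))                              # high-dimensional
eyes_closed = rng.normal(size=(5000, 5)) @ rng.normal(size=(5, 100))  # low-dimensional latents
eyes_closed += 0.1 * rng.normal(size=eyes_closed.shape)               # observation noise

print("eyes open  :", linear_dimensionality(eyes_open))
print("eyes closed:", linear_dimensionality(eyes_closed))
```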

4.
eNeuro ; 2022 Jun 27.
Article in English | MEDLINE | ID: mdl-35760525

ABSTRACT

In human and non-human primates, reflexive tracking eye movements can be initiated at very short latency in response to a rapid shift of the image. Previous studies in humans have shown that only a part of the central visual field is optimal for driving ocular following responses. Here, we investigated spatial summation of motion information across a wide range of spatial frequencies and speeds of drifting gratings by recording short-latency ocular following responses in macaque monkeys. We show that the optimal stimulus size for driving ocular responses covers a small (<20° diameter), central part of the visual field that shrinks with higher spatial frequency. This signature of linear motion integration remains invariant with speed and temporal frequency. For low and medium spatial frequencies, we found a strong suppressive influence from surround motion, evidenced by a decrease of response amplitude for stimulus sizes larger than optimal. Such suppression disappears with gratings at high frequencies. The contribution of peripheral motion was investigated by presenting grating annuli of increasing eccentricity. We observed an exponential decay of response amplitude with grating eccentricity, the decrease being faster for higher spatial frequencies. Weaker surround suppression can thus be explained by sparser eccentric inputs at high frequencies. A Difference-of-Gaussians model best renders the antagonistic contributions of peripheral and central motion. Its best-fit parameters coincide with several well-known spatial properties of area MT neuronal populations. These results describe the mechanism by which central motion information is automatically integrated in a context-dependent manner to drive ocular responses. Significance statement: Ocular following is driven by visual motion at ultra-short latency in both humans and monkeys. Its dynamics reflect the properties of low-level motion integration. Here, we show that a strong center-surround suppression mechanism modulates initial eye velocity. Its spatial properties depend upon the spatial frequency of the visual input but are insensitive to either its temporal frequency or speed. These properties are best described with a Difference-of-Gaussians model of spatial integration. The model parameters reflect many spatial characteristics of motion-sensitive neuronal populations in monkey area MT. Our results further outline the computational properties of the behavioral receptive field underpinning automatic, context-dependent motion integration.
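The Difference-of-Gaussians size-tuning model mentioned above can be sketched as follows; this is one common parameterization (excitatory minus inhibitory summation over stimulus diameter) with illustrative parameter names and placeholder data, not the paper's fitted model.

```python
# Hedged sketch of a Difference-of-Gaussians (DoG) size-tuning fit (illustrative only).
import numpy as np
from scipy.special import erf
from scipy.optimize import curve_fit

def dog_response(diameter, k_e, w_e, k_i, w_i):
    """Excitatory minus inhibitory spatial summation over stimulus diameter (deg)."""
    excite = erf(diameter / w_e) ** 2
    inhibit = erf(diameter / w_i) ** 2
    return k_e * excite - k_i * inhibit

diameters = np.array([2, 5, 10, 20, 40, 80], float)     # stimulus diameters (deg)
responses = np.array([0.2, 0.6, 1.0, 0.9, 0.7, 0.6])     # placeholder response amplitudes

popt, _ = curve_fit(dog_response, diameters, responses,
                    p0=[1.0, 10.0, 0.4, 40.0], maxfev=10000)
print(dict(zip(["k_e", "w_e", "k_i", "w_i"], popt.round(2))))
```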

5.
J Neurophysiol ; 103(3): 1275-82, 2010 Mar.
Article in English | MEDLINE | ID: mdl-20032230

ABSTRACT

Several recent studies have shown that extracting pattern motion direction is a dynamical process in which edge motion is first extracted and pattern-related information is then encoded by MT neurons with a small time lag. Similar dynamics were found for human reflexive and voluntary tracking. Here, we provide an essential, but still missing, piece of information by documenting macaque ocular following responses to gratings, unikinetic plaids, and barber-poles. We found that ocular tracking was always initiated first in the grating motion direction, with ultra-short latencies (approximately 55 ms). A second component was driven only 10-15 ms later, rotating tracking toward the pattern motion direction. At the end of the open-loop period, tracking direction was aligned with the pattern motion direction (plaids) or the average of the line-ending motion directions (barber-poles). We characterized the contrast dependency of each component. Both the timing and direction of ocular following were quantitatively very consistent with the dynamics of neuronal responses reported by others. Overall, we found a remarkable consistency between neuronal dynamics and monkey behavior, arguing for a direct link between the neuronal solution of the aperture problem and primate perception and action.


Subject(s)
Eye Movements/physiology, Psychomotor Performance/physiology, Visual Perception/physiology, Algorithms, Animals, Fixation, Ocular, Macaca mulatta, Motion Perception/physiology, Pattern Recognition, Visual/physiology, Photic Stimulation
6.
Vision Res ; 48(4): 501-22, 2008 Feb.
Article in English | MEDLINE | ID: mdl-18221979

ABSTRACT

Integrating information is essential to measure the physical 2D motion of a surface from both the ambiguous local 1D motion of its elongated edges and the non-ambiguous 2D motion of its features, such as corners or texture elements. The dynamics of this motion integration shows a complex time course, as read from tracking eye movements: first, local 1D motion signals are extracted and pooled to initiate ocular responses; then 2D motion signals are integrated to adjust the tracking direction until it matches the surface motion direction. The nature of these 1D and 2D motion computations is still unclear. One hypothesis is that their different dynamics may be explained by different contrast sensitivities. To test this, we measured contrast-response functions of the early, 1D-driven and late, 2D-driven components of ocular following responses to different motion stimuli: gratings, plaids, and barber-poles. We found that the contrast dynamics of 1D-driven responses are nearly identical across the different stimuli. On the contrary, the late 2D-driven components with either plaids or barber-poles have similar latencies but different contrast dynamics. The temporal dynamics of both 1D- and 2D-driven responses demonstrate that the different contrast gains are set very early during the response time course. Running a Bayesian model of motion integration, we show that a large family of contrast-response functions can be predicted from the probability distributions of 1D and 2D motion signals for each stimulus and from the shape of the prior distribution. However, the pure delay (i.e., largely independent of contrast) observed between 1D- and 2D-driven motion supports the idea that the 1D and 2D probability distributions are computed independently. This two-pathway Bayesian model supports the idea that 1D and 2D mechanisms represent edge and feature motion in parallel.
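To illustrate the two-pathway Bayesian idea, the sketch below combines a 1D (edge) likelihood that constrains only the velocity component orthogonal to the edge, an independent 2D (feature) likelihood, and a low-speed prior over a grid of candidate velocities; all distributions and parameter values are assumptions for illustration, not the published model.

```python
# Toy two-pathway Bayesian combination of 1D and 2D motion evidence (illustrative).
import numpy as np

# Grid of candidate velocities (vx, vy) in deg/s.
v = np.stack(np.meshgrid(np.linspace(-10, 10, 201),
                         np.linspace(-10, 10, 201)), axis=-1)

def gaussian(v, mean, sigma):
    return np.exp(-np.sum((v - np.asarray(mean)) ** 2, axis=-1) / (2 * sigma ** 2))

prior = gaussian(v, [0.0, 0.0], 4.0)                        # prior favoring slow speeds
lik_1d = np.exp(-((v[..., 0] - 5.0) ** 2) / (2 * 1.0 ** 2))  # edge motion: constrains vx only
lik_2d = gaussian(v, [5.0, 3.0], 2.0)                        # feature motion: constrains both

posterior = prior * lik_1d * lik_2d                          # pathways combined independently
posterior /= posterior.sum()
v_hat = (posterior[..., None] * v).sum(axis=(0, 1))          # posterior mean velocity
print("estimated velocity:", v_hat.round(2))
```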


Subject(s)
Eye Movements/physiology, Motion Perception/physiology, Bayes Theorem, Contrast Sensitivity, Humans, Pattern Recognition, Visual/physiology, Photic Stimulation/methods, Psychomotor Performance, Reaction Time/physiology
7.
J Neurophysiol ; 95(6): 3712-26, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16554515

ABSTRACT

Visual neurons integrate information over a finite part of the visual field with high selectivity. This classical receptive field is modulated by peripheral inputs that play a role in both neuronal response normalization and contextual modulations. However, the consequences of these properties for visuomotor transformations remain incompletely understood. To explore them, we recorded short-latency ocular following responses in humans to large center-only and center-surround stimuli. We found that eye movements are triggered by a mechanism that integrates motion over a restricted portion of the visual field, the size of which depends on stimulus contrast and increases as a function of time after response onset. We also found evidence for a strong nonisodirectional center-surround organization, responsible for normalizing the central, driving input so that motor responses are set to their most linear contrast dynamics. Such response normalization is delayed by about 20 ms relative to tracking onset, gradually builds up over time, and is partly tuned for surround orientation/direction. These results outline the spatiotemporal organization of a behavioral receptive field, which might reflect linear integration among subpopulations of cortical visual motion detectors.
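A toy version of the kind of divisive normalization by a surround pool that the abstract invokes is sketched below; the gain, exponent, and surround weight are illustrative assumptions, not values fitted in the study.

```python
# Toy divisive-normalization model of center-surround contrast interaction (illustrative).
import numpy as np

def normalized_response(center_contrast, surround_contrast,
                        gain=1.0, sigma=0.1, n=2.0, w_surround=0.6):
    """Center drive divided by a semisaturation term plus a pooled surround drive."""
    drive = gain * center_contrast ** n
    pool = sigma ** n + center_contrast ** n + w_surround * surround_contrast ** n
    return drive / pool

contrasts = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
print("center alone :", normalized_response(contrasts, 0.0).round(3))
print("with surround:", normalized_response(contrasts, 0.8).round(3))
```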


Subject(s)
Behavior/physiology, Evoked Potentials, Visual/physiology, Motion Perception/physiology, Saccades/physiology, Space Perception/physiology, Visual Cortex/physiology, Visual Fields/physiology, Adult, Computer Simulation, Female, Humans, Male, Models, Neurological, Photic Stimulation/methods, Reaction Time/physiology