Results 1 - 20 of 30,254
1.
Cell; 186(14): 3062-3078.e20, 2023 Jul 6.
Article in English | MEDLINE | ID: mdl-37343561

ABSTRACT

Seemingly simple behaviors such as swatting a mosquito or glancing at a signpost involve the precise coordination of multiple body parts. Neural control of coordinated movements is widely thought to entail transforming a desired overall displacement into displacements for each body part. Here we reveal a different logic implemented in the mouse gaze system. Stimulating superior colliculus (SC) elicits head movements with stereotyped displacements but eye movements with stereotyped endpoints. This is achieved by individual SC neurons whose branched axons innervate modules in medulla and pons that drive head movements with stereotyped displacements and eye movements with stereotyped endpoints, respectively. Thus, single neurons specify a mixture of endpoints and displacements for different body parts, not overall displacement, with displacements for different body parts computed at distinct anatomical stages. Our study establishes an approach for unraveling motor hierarchies and identifies a logic for coordinating movements and the resulting pose.


Subject(s)
Fixation, Ocular; Saccades; Animals; Mice; Eye Movements; Neurons/physiology; Superior Colliculi/physiology; Rhombencephalon; Head Movements/physiology
2.
Nature; 623(7986): 381-386, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37880369

ABSTRACT

To maintain a stable and clear image of the world, our eyes reflexively follow the direction in which a visual scene is moving. Such gaze-stabilization mechanisms reduce image blur as we move in the environment. In non-primate mammals, this behaviour is initiated by retinal output neurons called ON-type direction-selective ganglion cells (ON-DSGCs), which detect the direction of image motion and transmit signals to brainstem nuclei that drive compensatory eye movements [1]. However, ON-DSGCs have not yet been identified in the retina of primates, raising the possibility that this reflex is mediated by cortical visual areas. Here we mined single-cell RNA transcriptomic data from primate retina to identify a candidate ON-DSGC. We then combined two-photon calcium imaging, molecular identification and morphological analysis to reveal a population of ON-DSGCs in the macaque retina. The morphology, molecular signature and GABA (γ-aminobutyric acid)-dependent mechanisms that underlie direction selectivity in primate ON-DSGCs are highly conserved with those in other mammals. We further identify a candidate ON-DSGC in human retina. The presence of ON-DSGCs in primates highlights the need to examine the contribution of subcortical retinal mechanisms to normal and aberrant gaze stabilization in the developing and mature visual system.


Subject(s)
Eye Movements; Macaca; Retina; Retinal Ganglion Cells; Animals; Humans; Eye Movements/physiology; Photic Stimulation; Retina/cytology; Retina/physiology; Retinal Ganglion Cells/cytology; Retinal Ganglion Cells/physiology; Motion; Single-Cell Gene Expression Analysis; gamma-Aminobutyric Acid/metabolism; Calcium Signaling; Fixation, Ocular/physiology
3.
Nature; 618(7967): 1000-1005, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37258667

ABSTRACT

A hallmark of human intelligence is the ability to plan multiple steps into the future [1,2]. Despite decades of research [3-5], it is still debated whether skilled decision-makers plan more steps ahead than novices [6-8]. Traditionally, the study of expertise in planning has used board games such as chess, but the complexity of these games poses a barrier to quantitative estimates of planning depth. Conversely, common planning tasks in cognitive science often have a lower complexity [9,10] and impose a ceiling for the depth to which any player can plan. Here we investigate expertise in a complex board game that offers ample opportunity for skilled players to plan deeply. We use model fitting methods to show that human behaviour can be captured using a computational cognitive model based on heuristic search. To validate this model, we predict human choices, response times and eye movements. We also perform a Turing test and a reconstruction experiment. Using the model, we find robust evidence for increased planning depth with expertise in both laboratory and large-scale mobile data. Experts memorize and reconstruct board features more accurately. Using complex tasks combined with precise behavioural modelling might expand our understanding of human planning and help to bridge the gap with progress in artificial intelligence.
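The model class invoked in this abstract can be illustrated with a minimal depth-limited heuristic search. Everything below (the integer "game", the heuristic, and the depth values) is an invented toy stand-in, not the authors' task or fitted model:

```python
# Toy sketch of depth-limited heuristic search (illustrative only; the game,
# heuristic, and states below are invented, not taken from the paper).

def heuristic_search(state, depth, children, heuristic):
    """Best heuristic value reachable when planning `depth` steps ahead."""
    kids = children(state)
    if depth == 0 or not kids:
        return heuristic(state)
    # A deeper planner can revise the myopic evaluation of a state.
    return max(heuristic_search(k, depth - 1, children, heuristic) for k in kids)

# Invented game: states are integers; moves increment or double them.
children = lambda s: [s + 1, s * 2] if s < 8 else []
heuristic = lambda s: -abs(s - 7)  # prefer states close to 7

shallow = heuristic_search(1, 1, children, heuristic)  # novice-like depth
deep = heuristic_search(1, 4, children, heuristic)     # expert-like depth
print(shallow, deep)
```

In a sketch like this, the `depth` argument plays the role of the planning-depth parameter one would fit per participant; the deeper planner finds the state the shallow one misses.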


Subject(s)
Choice Behavior; Game Theory; Games, Experimental; Intelligence; Models, Psychological; Humans; Artificial Intelligence; Cognition; Eye Movements; Heuristics; Memory; Reaction Time; Reproducibility of Results
4.
Nature; 612(7938): 116-122, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36289333

ABSTRACT

Most animals have compound eyes, with tens to thousands of lenses attached rigidly to the exoskeleton. A natural assumption is that all of these species must resort to moving either their head or their body to actively change their visual input. However, classic anatomy has revealed that flies have muscles poised to move their retinas under the stable lenses of each compound eye [1-3]. Here we show that Drosophila use their retinal muscles to smoothly track visual motion, which helps to stabilize the retinal image, and also to perform small saccades when viewing a stationary scene. We show that when the retina moves, visual receptive fields shift accordingly, and that even the smallest retinal saccades activate visual neurons. Using a head-fixed behavioural paradigm, we find that Drosophila perform binocular, vergence movements of their retinas (which could enhance depth perception) when crossing gaps, and impairing the physiology of retinal motor neurons alters gap-crossing trajectories during free behaviour. That flies evolved an ability to actuate their retinas suggests that moving the eye independently of the head is broadly paramount for animals. The similarities of smooth and saccadic movements of the Drosophila retina and the vertebrate eye highlight a notable example of convergent evolution.


Subject(s)
Drosophila; Eye Movements; Muscles; Retina; Vision, Ocular; Animals; Drosophila/physiology; Eye Movements/physiology; Muscles/physiology; Retina/physiology; Saccades/physiology; Vision, Binocular; Depth Perception; Motor Neurons; Head/physiology; Drosophila melanogaster/physiology; Biological Evolution
5.
Nature; 604(7907): 708-713, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35444285

ABSTRACT

Looking and reaching are controlled by different brain regions and are coordinated during natural behaviour [1]. Understanding how flexible, natural behaviours such as coordinated looking and reaching are controlled depends on understanding how neurons in different regions of the brain communicate [2]. Neural coherence in a gamma-frequency (40-90 Hz) band has been implicated in excitatory multiregional communication [3]. Inhibitory control mechanisms are also required to flexibly control behaviour [4], but little is known about how neurons in one region transiently suppress individual neurons in another to support behaviour. How neuronal firing in a sender region transiently suppresses firing in a receiver region remains poorly understood. Here we study inhibitory communication during a flexible, natural behaviour, termed gaze anchoring, in which saccades are transiently inhibited by coordinated reaches. During gaze anchoring, we found that neurons in the reach region of the posterior parietal cortex can inhibit neuronal firing in the parietal saccade region to suppress eye movements and improve reach accuracy. Suppression is transient, only present around the coordinated reach, and greatest when reach neurons fire spikes with respect to beta-frequency (15-25 Hz) activity, not gamma-frequency activity. Our work provides evidence in the activity of single neurons for a novel mechanism of inhibitory communication in which beta-frequency neural coherence transiently inhibits multiregional communication to flexibly coordinate natural behaviour.


Subject(s)
Motor Skills; Parietal Lobe; Psychomotor Performance; Saccades; Animals; Eye Movements; Fixation, Ocular; Macaca mulatta; Neurons/physiology; Parietal Lobe/physiology; Psychomotor Performance/physiology
6.
PLoS Biol; 22(1): e3002485, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38271460

ABSTRACT

Planning a rapid eye movement (saccade) changes how we perceive our visual world. Even before we move the eyes, visual discrimination sensitivity improves at the impending target of eye movements, a phenomenon termed "presaccadic attention." Yet, it is unknown if such presaccadic selection merely affects perceptual sensitivity, or also affects downstream decisional processes, such as choice bias. We report a surprising lack of presaccadic perceptual benefits in a common, everyday setting: detection of changes in the visual field. Despite the lack of sensitivity benefits, choice bias for reporting changes increased reliably for the saccade target. With independent follow-up experiments, we show that presaccadic change detection is rendered more challenging because percepts at the saccade target location are biased toward, and more precise for, only the most recent of two successive stimuli. With a Bayesian model, we show how such perceptual and choice biases are crucial to explain the effects of saccade plans on change detection performance. In sum, visual change detection sensitivity does not improve presaccadically, a result that is readily explained by teasing apart distinct components of presaccadic selection. The findings may have critical implications for real-world scenarios, like driving, that require rapid gaze shifts in dynamically changing environments.
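The sensitivity-versus-bias distinction at the heart of this abstract can be seen in a standard signal-detection sketch (my illustration of the general idea, not the paper's Bayesian model): shifting the decision criterion changes how often "change" is reported while leaving sensitivity (d') untouched.

```python
# Signal-detection sketch: criterion shifts move choice bias, not sensitivity.
# The d' and criterion values below are arbitrary illustrative choices.
from statistics import NormalDist

Phi = NormalDist().cdf       # standard normal CDF
Z = NormalDist().inv_cdf     # its inverse (probit)

def rates(dprime, criterion):
    hit = 1 - Phi(criterion - dprime)   # P(report "change" | change)
    fa = 1 - Phi(criterion)             # P(report "change" | no change)
    return hit, fa

def dprime_from(hit, fa):
    """Recover sensitivity from observed hit and false-alarm rates."""
    return Z(hit) - Z(fa)

h1, f1 = rates(1.0, 0.5)    # conservative criterion
h2, f2 = rates(1.0, -0.2)   # liberal criterion: more "change" reports overall
print(round(dprime_from(h1, f1), 6), round(dprime_from(h2, f2), 6))
```

Both conditions yield the same recovered d' even though the liberal criterion produces more hits and more false alarms, which is the signature of a pure bias change.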


Subject(s)
Visual Fields; Visual Perception; Bayes Theorem; Attention; Eye Movements; Saccades; Photic Stimulation
7.
PLoS Biol; 22(5): e3002614, 2024 May.
Article in English | MEDLINE | ID: mdl-38743775

ABSTRACT

The processing of sensory information, even at early stages, is influenced by the internal state of the animal. Internal states, such as arousal, are often characterized by relating neural activity to a single "level" of arousal, defined by a behavioral indicator such as pupil size. In this study, we expand the understanding of arousal-related modulations in sensory systems by uncovering multiple timescales of pupil dynamics and their relationship to neural activity. Specifically, we observed a robust coupling between spiking activity in the mouse dorsolateral geniculate nucleus (dLGN) of the thalamus and pupil dynamics across timescales spanning a few seconds to several minutes. Throughout all these timescales, 2 distinct spiking modes (individual tonic spikes and tightly clustered bursts of spikes) preferred opposite phases of pupil dynamics. This multi-scale coupling reveals modulations distinct from those captured by pupil size per se, locomotion, and eye movements. Furthermore, coupling persisted even during viewing of a naturalistic movie, where it contributed to differences in the encoding of visual information. We conclude that dLGN spiking activity is under the simultaneous influence of multiple arousal-related processes associated with pupil dynamics occurring over a broad range of timescales.
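The tonic/burst distinction is conventionally operationalized from inter-spike intervals; a common thalamic criterion labels a spike a burst onset when it follows a long silence (on the order of 100 ms) and is followed by very short intervals (a few ms). The sketch below assumes that convention rather than reproducing this paper's exact definition:

```python
# Hedged sketch: classify spikes as tonic vs burst from inter-spike intervals,
# using an assumed (conventional) criterion, not the paper's exact parameters.
def label_burst_spikes(times, silence=0.1, isi=0.004):
    """times: sorted spike times in seconds. Returns 'tonic'/'burst' per spike."""
    labels = []
    for i, t in enumerate(times):
        prev_gap = t - times[i - 1] if i else float("inf")
        next_gap = times[i + 1] - t if i + 1 < len(times) else float("inf")
        in_burst = bool(labels) and labels[-1] == "burst"
        if prev_gap > silence and next_gap <= isi:
            labels.append("burst")   # burst onset: long quiescence, then rapid firing
        elif prev_gap <= isi and in_burst:
            labels.append("burst")   # continuation of an ongoing burst
        else:
            labels.append("tonic")   # isolated spike
    return labels

spikes = [0.00, 0.30, 0.302, 0.305, 0.70]  # toy spike train (seconds)
print(label_burst_spikes(spikes))
```

Here the three tightly clustered spikes around 0.3 s form one burst, while the isolated spikes are tonic.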


Subject(s)
Action Potentials; Arousal; Geniculate Bodies; Pupil; Animals; Pupil/physiology; Geniculate Bodies/physiology; Mice; Action Potentials/physiology; Arousal/physiology; Male; Mice, Inbred C57BL; Photic Stimulation/methods; Neurons/physiology; Thalamus/physiology; Eye Movements/physiology; Time Factors; Visual Pathways/physiology
8.
Proc Natl Acad Sci U S A; 121(17): e2403858121, 2024 Apr 23.
Article in English | MEDLINE | ID: mdl-38635638

ABSTRACT

Functional neuroimaging studies indicate that the human brain can represent concepts and their relational structure in memory using coding schemes typical of spatial navigation. However, whether we can read out the internal representational geometries of conceptual spaces solely from human behavior remains unclear. Here, we report that the relational structure between concepts in memory might be reflected in spontaneous eye movements during verbal fluency tasks: When we asked participants to randomly generate numbers, their eye movements correlated with distances along the left-to-right one-dimensional geometry of the number space (mental number line), while they scaled with distance along the ring-like two-dimensional geometry of the color space (color wheel) when they randomly generated color names. Moreover, when participants randomly produced animal names, eye movements correlated with low-dimensional similarity in word frequencies. These results suggest that the representational geometries used to internally organize conceptual spaces might be read out from gaze behavior.
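One way to read out a representational geometry from gaze, in the spirit of this abstract, is to correlate distances between successively generated items in the hypothesized conceptual space with the corresponding gaze displacements. The data and mapping below are toy values I invented for illustration:

```python
# Toy readout of a "mental number line" from gaze (invented data, illustrative
# analysis only): correlate number-line distances with horizontal gaze shifts.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

numbers = [3, 9, 1, 7, 4]                 # randomly generated numbers (toy)
gaze_x = [0.30, 0.92, 0.11, 0.68, 0.39]   # horizontal gaze per item (toy)

# Distance between consecutive items, in concept space and on the screen.
num_dist = [abs(b - a) for a, b in zip(numbers, numbers[1:])]
gaze_dist = [abs(b - a) for a, b in zip(gaze_x, gaze_x[1:])]
print(round(pearson(num_dist, gaze_dist), 2))
```

A strong correlation in such an analysis would indicate that gaze displacements scale with distances along the hypothesized left-to-right geometry; for the ring-like color space one would substitute angular distance on the color wheel.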


Subject(s)
Eye Movements; Spatial Navigation; Humans; Brain; Movement; Functional Neuroimaging
9.
Proc Natl Acad Sci U S A; 121(15): e2310291121, 2024 Apr 9.
Article in English | MEDLINE | ID: mdl-38564641

ABSTRACT

Humans blink their eyes frequently during normal viewing, more often than it seems necessary for keeping the cornea well lubricated. Since the closure of the eyelid disrupts the image on the retina, eye blinks are commonly assumed to be detrimental to visual processing. However, blinks also provide luminance transients rich in spatial information to neural pathways highly sensitive to temporal changes. Here, we report that the luminance modulations from blinks enhance visual sensitivity. By coupling high-resolution eye tracking in human observers with modeling of blink transients and spectral analysis of visual input signals, we show that blinking increases the power of retinal stimulation and that this effect significantly enhances visibility despite the time lost in exposure to the external scene. We further show that, as predicted from the spectral content of input signals, this enhancement is selective for stimuli at low spatial frequencies and occurs irrespective of whether the luminance transients are actively generated or passively experienced. These findings indicate that, like eye movements, blinking acts as a computational component of a visual processing strategy that uses motor behavior to reformat spatial information into the temporal domain.
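The core spectral point (a blink transient injects power into the temporal frequency content of the retinal input) can be shown with a toy discrete Fourier transform. The signal model below is my simplification, not the paper's analysis pipeline:

```python
# Toy model: a blink as a brief luminance drop in an otherwise constant input.
# Its transient adds power at nonzero (low) temporal frequencies.
import cmath, math

def dft_power(signal, k):
    """Power of the k-th DFT bin of a real-valued signal."""
    n = len(signal)
    c = sum(x * cmath.exp(-2j * math.pi * k * i / n) for i, x in enumerate(signal))
    return abs(c) ** 2 / n

steady = [1.0] * 64            # constant luminance: no temporal modulation
blinked = list(steady)
for i in range(30, 34):        # brief eyelid closure zeroes the luminance
    blinked[i] = 0.0

print(dft_power(steady, 1), dft_power(blinked, 1))
```

The steady input carries essentially no power at nonzero temporal frequencies, while the blinked input does, which is the kind of luminance transient the visual system's temporally sensitive pathways can exploit.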


Subject(s)
Blinking; Eye Movements; Humans; Photic Stimulation; Visual Perception/physiology; Vision, Ocular
10.
Proc Natl Acad Sci U S A; 121(3): e2309906121, 2024 Jan 16.
Article in English | MEDLINE | ID: mdl-38198528

ABSTRACT

During free viewing, faces attract gaze and induce specific fixation patterns corresponding to the facial features. This suggests that neurons encoding the facial features are in the causal chain that steers the eyes. However, there is no physiological evidence to support a mechanistic link between face-encoding neurons in high-level visual areas and the oculomotor system. In this study, we targeted the middle face patches of the inferior temporal (IT) cortex in two macaque monkeys using a functional magnetic resonance imaging (fMRI) localizer. We then utilized muscimol microinjection to unilaterally suppress IT neural activity inside and outside the face patches and recorded eye movements while the animals freely viewed natural scenes. Inactivation of the face-selective neurons altered the pattern of eye movements on faces: The monkeys found faces in the scene but neglected the eye contralateral to the inactivation hemisphere. These findings reveal the causal contribution of the high-level visual cortex to eye movements.


Subject(s)
Eye Movements; Neurons; Animals; Eye; Histological Techniques; Macaca
11.
Proc Natl Acad Sci U S A; 121(36): e2405602121, 2024 Sep 3.
Article in English | MEDLINE | ID: mdl-39213176

ABSTRACT

Complex visual stimuli evoke diverse patterns of gaze, but previous research suggests that their neural representations are shared across brains. Here, we used hyperalignment to compare visual responses between observers viewing identical stimuli. We find that individual eye movements enhance cortical visual responses but also lead to representational divergence. Pairwise differences in the spatial distribution of gaze and in semantic salience predict pairwise representational divergence in V1 and inferior temporal cortex, respectively. This suggests that individual gaze sculpts individual visual worlds.


Subject(s)
Eye Movements; Humans; Male; Female; Adult; Eye Movements/physiology; Photic Stimulation; Fixation, Ocular/physiology; Visual Perception/physiology; Temporal Lobe/physiology; Visual Cortex/physiology; Young Adult
12.
PLoS Biol; 21(1): e3001968, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36649331

ABSTRACT

We saccade 3 to 5 times per second when reading. However, little is known about the neuronal mechanisms coordinating the oculomotor and visual systems during such rapid processing. Here, we ask if brain oscillations play a role in the temporal coordination of the visuomotor integration. We simultaneously acquired MEG and eye-tracking data while participants read sentences silently. Every sentence was embedded with a target word of either high or low lexical frequency. Our key finding was that saccade onsets were locked to the phase of alpha oscillations (8 to 13 Hz), particularly for saccades towards low-frequency words. Source modelling demonstrated that the alpha oscillations to which the saccades were locked were generated in the right visual motor cortex (BA 7). Our findings suggest that the alpha oscillations serve to time the processing between the oculomotor and visual systems during natural reading, and that this coordination becomes more pronounced for demanding words.
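Phase locking of events (here, saccade onsets) to an oscillation is typically quantified with circular statistics; a standard measure is the mean resultant length of the oscillation's phase at each event. The sketch below shows that measure on invented phase samples, not the paper's MEG data:

```python
# Standard circular statistic (not this paper's exact pipeline): the mean
# resultant length of event phases is 1 for perfect locking, ~0 for uniform.
import cmath, math

def phase_locking(phases):
    """Mean resultant length of phases given in radians."""
    return abs(sum(cmath.exp(1j * p) for p in phases)) / len(phases)

locked = [0.1, -0.05, 0.2, 0.0, 0.15]              # tightly clustered phases
uniform = [2 * math.pi * k / 8 for k in range(8)]  # evenly spread phases
print(round(phase_locking(locked), 2), round(phase_locking(uniform), 2))
```

In a saccade-to-alpha analysis, `phases` would be the instantaneous alpha phase at each saccade onset; a value near 1 indicates that saccades consistently land on one phase of the oscillation.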


Subject(s)
Reading; Saccades; Humans; Eye Movements; Language; Brain/physiology; Fixation, Ocular
13.
PLoS Biol; 21(5): e3002120, 2023 May.
Article in English | MEDLINE | ID: mdl-37155704

ABSTRACT

In the search for the neural basis of conscious experience, perception and the cognitive processes associated with reporting perception are typically confounded as neural activity is recorded while participants explicitly report what they experience. Here, we present a novel way to disentangle perception from report using eye movement analysis techniques based on convolutional neural networks and neurodynamical analyses based on information theory. We use a bistable visual stimulus that instantiates two well-known properties of conscious perception: integration and differentiation. At any given moment, observers either perceive the stimulus as one integrated unitary object or as two differentiated objects that are clearly distinct from each other. Using electroencephalography, we show that measures of integration and differentiation based on information theory closely follow participants' perceptual experience of those contents when switches were reported. We observed increased information integration between anterior to posterior electrodes (front to back) prior to a switch to the integrated percept, and higher information differentiation of anterior signals leading up to reporting the differentiated percept. Crucially, information integration was closely linked to perception and even observed in a no-report condition when perceptual transitions were inferred from eye movements alone. In contrast, the link between neural differentiation and perception was observed solely in the active report condition. Our results, therefore, suggest that perception and the processes associated with report require distinct amounts of anterior-posterior network communication and anterior information differentiation. 
While front-to-back directed information is associated with changes in the content of perception when viewing bistable visual stimuli, regardless of report, frontal information differentiation was absent in the no-report condition and therefore is not directly linked to perception per se.


Subject(s)
Brain; Electroencephalography; Humans; Feedback; Eye Movements; Perception; Visual Perception; Photic Stimulation
14.
Proc Natl Acad Sci U S A; 120(20): e2220552120, 2023 May 16.
Article in English | MEDLINE | ID: mdl-37155892

ABSTRACT

Reliable, noninvasive biomarkers that reveal the internal state of a subject are an invaluable tool for neurological diagnoses. Small fixational eye movements, called microsaccades, are a candidate biomarker thought to reflect a subject's focus of attention [Z. M. Hafed, J. J. Clark, Vision Res. 42, 2533-2545 (2002); R. Engbert, R. Kliegl, Vision Res. 43, 1035-1045 (2003)]. The linkage between the direction of microsaccades and attention has mainly been demonstrated using explicit and unambiguous attentional cues. However, the natural world is seldom predictable and rarely provides unambiguous information. Thus, a useful biomarker must be robust to such changes in environmental statistics. To determine how well microsaccades reveal visual-spatial attention across behavioral contexts, we analyzed these fixational eye movements in monkeys performing a conventional change detection task. The task included two stimulus locations and variable cue validities across blocks of trials. Subjects were adept at the task, showing precise and graded modulations of visual attention for subtle target changes and performing better and faster when the cue was more reliable [J. P. Mayo, J. H. R. Maunsell, J. Neurosci. 36, 5353 (2016)]. However, over tens of thousands of microsaccades, we found no difference in microsaccade direction between cued locations when cue variability was high, nor between hit and miss trials. Instead, microsaccades were made toward the midpoint of the two target locations, not toward individual targets. Our results suggest that the direction of microsaccades should be interpreted with caution and may not be a reliable measure of covert spatial attention in more complex viewing conditions.


Subject(s)
Fixation, Ocular; Visual Perception; Saccades; Cues; Eye Movements
15.
Proc Natl Acad Sci U S A; 120(38): e2305759120, 2023 Sep 19.
Article in English | MEDLINE | ID: mdl-37695898

ABSTRACT

Movement control is critical for successful interaction with our environment. However, movement does not occur in complete isolation of sensation, and this is particularly true of eye movements. Here, we show that the neuronal eye movement commands emitted by the superior colliculus (SC), a structure classically associated with oculomotor control, encompass a robust visual sensory representation of eye movement targets. Thus, similar saccades toward different images are associated with different saccade-related "motor" bursts. Such sensory tuning in SC saccade motor commands appeared for all image manipulations that we tested, from simple visual features to real-life object images, and it was also strongest in the most motor neurons in the deeper collicular layers. Visual-feature discrimination performance in the motor commands was also stronger than in visual responses. Comparing SC motor command feature discrimination performance to that in the primary visual cortex during steady-state gaze fixation revealed that collicular motor bursts possess a reliable perisaccadic sensory representation of the peripheral saccade target's visual appearance, exactly when retinal input is expected to be most uncertain. Our results demonstrate that SC neuronal movement commands likely serve a fundamentally sensory function.


Subject(s)
Eye Movements; Movement; Motor Neurons; Saccades; Discrimination, Psychological
16.
Proc Natl Acad Sci U S A; 120(43): e2303763120, 2023 Oct 24.
Article in English | MEDLINE | ID: mdl-37844238

ABSTRACT

Perceptual learning is the ability to enhance perception through practice. The hallmark of perceptual learning is its specificity for the trained location and stimulus features, such as orientation. For example, training in discriminating a grating's orientation improves performance only at the trained location but not in other untrained locations. Perceptual learning has mostly been studied using stimuli presented briefly while observers maintained gaze at one location. However, in everyday life, stimuli are actively explored through eye movements, which results in successive projections of the same stimulus at different retinal locations. Here, we studied perceptual learning of orientation discrimination across saccades. Observers were trained to saccade to a peripheral grating and to discriminate its orientation change that occurred during the saccade. The results showed that training led to transsaccadic perceptual learning (TPL) and performance improvements that did not generalize to an untrained orientation. Remarkably, however, for the trained orientation, we found a complete transfer of TPL to the untrained location in the opposite hemifield, suggesting high flexibility of reference frame encoding in TPL. Three control experiments in which participants were trained without saccades did not show such transfer, confirming that the location transfer was contingent upon eye movements. Moreover, performance at the trained location, but not at the untrained location, was also improved in an untrained fixation task. Our results suggest that TPL has both a location-specific component that occurs before the eye movement and a saccade-related component that involves location generalization.


Subject(s)
Saccades; Visual Perception; Humans; Learning; Eye Movements; Retina; Discrimination Learning; Photic Stimulation
17.
Proc Natl Acad Sci U S A; 120(48): e2303562120, 2023 Nov 28.
Article in English | MEDLINE | ID: mdl-37988462

ABSTRACT

Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements. We show that EMREOs contain parametric information about horizontal and vertical eye displacement as well as initial/final eye position with respect to the head. The parametric information in the horizontal and vertical directions can be modeled as combining linearly, allowing accurate prediction of the EMREOs associated with oblique (diagonal) eye movements. Target location can also be inferred from the EMREO signals recorded during eye movements to those targets. We hypothesize that the (currently unknown) mechanism underlying EMREOs could impose a two-dimensional eye-movement-related transfer function on any incoming sound, permitting subsequent processing stages to compute the positions of sounds in relation to the visual scene.


Subject(s)
Eye Movements; Saccades; Movement; Ocular Physiological Phenomena; Sound
18.
Proc Natl Acad Sci U S A; 120(40): e2303523120, 2023 Oct 3.
Article in English | MEDLINE | ID: mdl-37748075

ABSTRACT

Sensorimotor transformation is the process of first sensing an object in the environment and then producing a movement in response to that stimulus. For visually guided saccades, neurons in the superior colliculus (SC) emit a burst of spikes to register the appearance of the stimulus, and many of the same neurons discharge another burst to initiate the eye movement. We investigated whether the neural signatures of sensation and action in SC depend on context. Spiking activity along the dorsoventral axis was recorded with a laminar probe as Rhesus monkeys generated saccades to the same stimulus location in tasks that require either executive control to delay saccade onset until permission is granted or the production of an immediate response to a target whose onset is predictable. Using dimensionality reduction and discriminability methods, we show that the subspaces occupied during the visual and motor epochs were both distinct within each task and differentiable across tasks. Single-unit analyses, in contrast, show that the movement-related activity of SC neurons was not different between tasks. These results demonstrate that statistical features in neural activity of simultaneously recorded ensembles provide more insight than single neurons. They also indicate that cognitive processes associated with task requirements are multiplexed in SC population activity during both sensation and action and that downstream structures could use this activity to extract context. Additionally, the entire manifolds associated with sensory and motor responses, respectively, may be larger than the subspaces explored within a certain set of experiments.
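The notion of "distinct subspaces" of population activity can be illustrated in its simplest form: the angle between two one-dimensional activity axes. The axes below are invented three-neuron patterns, and this is a generic sketch of the geometry, not the paper's dimensionality-reduction analysis:

```python
# Minimal subspace-geometry sketch (invented data; not the paper's analysis):
# the angle between two 1-D population-activity axes measures their distinctness.
import math

def unit(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def subspace_angle_deg(u, v):
    """Principal angle between two 1-D subspaces spanned by u and v."""
    u, v = unit(u), unit(v)
    c = abs(sum(a * b for a, b in zip(u, v)))
    return math.degrees(math.acos(min(1.0, c)))

visual_axis = [1.0, 0.2, 0.0]  # dominant population pattern, visual epoch (toy)
motor_axis = [0.1, 0.9, 0.4]   # dominant population pattern, motor epoch (toy)
print(round(subspace_angle_deg(visual_axis, visual_axis), 1))  # 0.0: same subspace
print(round(subspace_angle_deg(visual_axis, motor_axis), 1))
```

A large angle means the two epochs engage nearly orthogonal activity patterns even if individual neurons participate in both, which is why population-level analyses can separate conditions that single-unit analyses cannot.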


Subject(s)
Body Fluids; Superior Colliculi; Animals; Eye Movements; Neurons; Macaca mulatta; Sensation
19.
Proc Natl Acad Sci U S A; 120(18): e2213438120, 2023 May 2.
Article in English | MEDLINE | ID: mdl-37094161

ABSTRACT

Rapid eye movement sleep (REM) is believed to have a binary temporal structure with "phasic" and "tonic" microstates, characterized by motoric activity versus quiescence, respectively. However, we observed in mice that the frequency of theta activity (a marker of rodent REM) fluctuates in a nonbinary fashion, with the extremes of that fluctuation correlating with phasic-type and tonic-type facial motricity. Thus, phasic and tonic REM may instead represent ends of a continuum. These cycles of brain physiology and facial movement occurred at 0.01 to 0.06 Hz, or infraslow frequencies, and affected cross-frequency coupling and neuronal activity in the neocortex, suggesting network functional impact. We then analyzed human data and observed that humans also demonstrate nonbinary phasic/tonic microstates, with continuous 0.01 to 0.04-Hz respiratory rate cycles matching the incidence of eye movements. These fundamental properties of REM can yield insights into our understanding of sleep health.


Subject(s)
Neocortex; Sleep, REM; Humans; Animals; Mice; Sleep, REM/physiology; Sleep/physiology; Eye Movements; Neocortex/physiology
20.
Proc Natl Acad Sci U S A; 120(9): e2210839120, 2023 Feb 28.
Article in English | MEDLINE | ID: mdl-36812207

ABSTRACT

During visual search, it is important to reduce the interference of distracting objects in the scene. The neuronal responses elicited by the search target stimulus are typically enhanced. However, it is equally important to suppress the representations of distracting stimuli, especially if they are salient and capture attention. We trained monkeys to make an eye movement to a unique "pop-out" shape stimulus among an array of distracting stimuli. One of these distractors had a salient color that varied across trials and differed from the color of the other stimuli, causing it to also pop-out. The monkeys were able to select the pop-out shape target with high accuracy and actively avoided the pop-out color distractor. This behavioral pattern was reflected in the activity of neurons in area V4. Responses to the shape targets were enhanced, while the activity evoked by the pop-out color distractor was only briefly enhanced, directly followed by a sustained period of pronounced suppression. These behavioral and neuronal results demonstrate a cortical selection mechanism that rapidly inverts a pop-out signal to "pop-in" for an entire feature dimension thereby facilitating goal-directed visual search in the presence of salient distractors.


Subject(s)
Color Perception; Visual Cortex; Animals; Color Perception/physiology; Haplorhini; Attention/physiology; Eye Movements; Visual Cortex/physiology; Reaction Time/physiology; Visual Perception/physiology