Results 1 - 4 of 4
1.
J Neurosci ; 39(49): 9818-9830, 2019 12 04.
Article in English | MEDLINE | ID: mdl-31666357

ABSTRACT

A central function of the brain is to plan, predict, and imagine the effect of movement in a dynamically changing environment. Here we show that in mice head-fixed in a plus-maze floating on air and trained to pick lanes based on visual stimuli, the asymmetric movement and position of whiskers on the two sides of the face signal whether the animal is moving, turning, expecting reward, or licking. We show that (1) whisking asymmetry is coordinated with behavioral state, and that behavioral state can be decoded and predicted from asymmetry, (2) even in the absence of tactile input, whisker positioning and asymmetry still relate to behavioral state, and (3) movement of the nose correlates with asymmetry, indicating that the facial expression of the mouse is itself correlated with behavioral state. These results indicate that the movement of whiskers, a behavior that is neither instructed nor necessary in the task, can inform an observer about what a mouse is doing in the maze. Thus, the position of these mobile tactile sensors reflects a behavioral and movement-preparation state of the mouse.

SIGNIFICANCE STATEMENT Behavior is a sequence of movements, where each movement can be related to or can trigger a set of other actions. Here we show that, in mice, the movement of whiskers (tactile sensors used to extract information about texture and location of objects) is coordinated with and predicts the behavioral state of mice: that is, what mice are doing, where they are in space, and where they are in the sequence of behaviors.
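Below is a minimal, illustrative sketch (not the authors' analysis code) of how a behavioral state could be decoded from whisker asymmetry, in the spirit of this abstract. The feature names, state labels, and synthetic data are hypothetical placeholders.

```python
# Illustrative sketch only: decoding behavioral state from whisker asymmetry.
# Variable names and the synthetic data are hypothetical, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_frames = 3000
states = rng.integers(0, 3, n_frames)             # toy labels: 0=moving, 1=turning, 2=licking
asymmetry = rng.normal(5.0 * (states - 1), 3.0)   # toy left-right whisker set-point difference (deg)
protraction = rng.normal(90.0, 10.0, n_frames)    # toy mean whisker protraction angle (deg)

X = np.column_stack([asymmetry, protraction])
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, states, cv=5)    # chance level is ~1/3 for three states
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```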


Subject(s)
Maze Learning/physiology; Vibrissae/innervation; Vibrissae/physiology; Animals; Behavior, Animal; Exploratory Behavior/physiology; Facial Expression; Functional Laterality/physiology; Male; Mice; Mice, Inbred C57BL; Nose/innervation; Nose/physiology; Orientation/physiology; Photic Stimulation; Psychomotor Performance/physiology; Somatosensory Cortex/physiology; Touch/physiology
2.
J Neurophysiol ; 116(4): 1542-1553, 2016 10 01.
Article in English | MEDLINE | ID: mdl-27486102

ABSTRACT

Natural behavior occurs in multiple sensory and motor modalities and, in particular, depends on sensory feedback that constantly adjusts behavior. To investigate the underlying neuronal correlates of natural behavior, it is useful to have access to state-of-the-art recording equipment (e.g., two-photon imaging or patch recordings), which frequently requires head fixation. This limitation has been addressed with various approaches, such as virtual reality, air-ball, or treadmill systems. However, achieving multimodal, realistic behavior in these systems can be challenging, and they are often complex and expensive to implement. Here we present "Air-Track," an easy-to-build head-fixed behavioral environment that requires only minimal computational processing. The Air-Track is a lightweight physical maze floating on an air table that has all the properties of the "real" world, including multiple sensory modalities tightly coupled to motor actions. To test this system, we trained mice in Go/No-Go and two-alternative forced-choice tasks in a plus maze. Mice chose lanes and discriminated apertures or textures by moving the Air-Track back and forth and rotating it around themselves. Mice rapidly adapted to moving the track and used visual, auditory, and tactile cues to guide their performance of the tasks. A custom-controlled camera system monitored animal location and generated data from which reaction times in the visual and somatosensory discrimination tasks could be calculated. We conclude that the Air-Track system is ideal for eliciting natural behavior in concert with virtually any system for monitoring or manipulating brain activity.
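A minimal sketch of how a reaction time might be computed from camera-tracked position, in the spirit of the analysis described above. This is an assumption for illustration, not the Air-Track software; the sampling rate, threshold, and variable names are hypothetical.

```python
# Illustrative sketch: latency from stimulus onset to the first supra-threshold movement.
# Sampling rate, threshold, and the synthetic trajectory are hypothetical placeholders.
import numpy as np

def reaction_time(timestamps, position, stim_onset, move_threshold=2.0):
    """Return latency (s) from stimulus onset to the first supra-threshold movement."""
    baseline = np.median(position[timestamps < stim_onset])   # pre-stimulus position
    post = timestamps >= stim_onset
    moved = np.abs(position[post] - baseline) > move_threshold
    if not moved.any():
        return np.nan                                          # no detectable response
    return timestamps[post][np.argmax(moved)] - stim_onset

t = np.arange(0, 5, 1 / 60)                                    # toy 60 Hz camera timestamps
pos = np.where(t > 2.4, (t - 2.4) * 30.0, 0.0)                 # movement begins 0.4 s after onset
print(reaction_time(t, pos, stim_onset=2.0))                   # ~0.5 s with these toy values
```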


Subject(s)
Auditory Perception; Models, Animal; Psychological Tests; Touch Perception; Visual Perception; Animals; Automation, Laboratory/instrumentation; Choice Behavior; Cues; Discrimination, Psychological; Equipment Design; Head; Learning; Maze Learning; Mice, Inbred C57BL; Printing, Three-Dimensional; Reaction Time; Restraint, Physical; Reward; Video Recording
3.
PLoS One ; 17(11): e0276531, 2022.
Article in English | MEDLINE | ID: mdl-36355714

ABSTRACT

The use of head fixation has become routine in systems neuroscience. However, whether behavior changes with head fixation, and whether animals can learn aspects of a task while freely moving and transfer this knowledge to the head-fixed condition, have not been examined in much detail. Here, we used a novel floating platform, the "Air-Track", which simulates free movement in a real-world environment, to address the effect of head fixation, and we developed methods to accelerate training of head-fixed mice on behavioral tasks. We trained mice on a two-choice discrimination task in a Y-maze. One group was trained while head-fixed and compared to a separate group that was pre-trained while freely moving and then trained on the same task while head-fixed. Pre-training significantly reduced the time needed to relearn the discrimination task under head fixation. Freely moving and head-fixed mice displayed similar behavioral patterns; however, head fixation significantly slowed movement. The speed of movement in head-fixed mice depended on the weight of the platform. We conclude that home-cage pre-training improves the learning performance of head-fixed mice and that, while head fixation limits some aspects of movement, the patterns of behavior observed in head-fixed and freely moving mice are similar.
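As a purely illustrative sketch (fabricated placeholder numbers, not the study's data), the group comparison described above could be run as a simple nonparametric test on sessions-to-criterion:

```python
# Illustrative sketch: comparing relearning speed under head fixation between a
# freely-moving pre-trained group and a head-fixed-only group. Numbers are placeholders.
import numpy as np
from scipy import stats

sessions_pretrained = np.array([4, 5, 3, 6, 4, 5])    # hypothetical sessions to criterion
sessions_naive = np.array([9, 8, 11, 7, 10, 9])       # hypothetical sessions to criterion

u, p = stats.mannwhitneyu(sessions_pretrained, sessions_naive, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")
```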


Subject(s)
Head Movements; Learning; Mice; Animals; Behavior, Animal
4.
eNeuro ; 4(1), 2017.
Article in English | MEDLINE | ID: mdl-28275712

ABSTRACT

Here, we describe an automated optical method for tracking animal behavior in both head-fixed and freely moving animals, in real time and offline. It takes advantage of an off-the-shelf camera system, the Pixy camera, designed as a fast vision sensor for robotics, which uses a color-based filtering algorithm at 50 Hz to track objects. Using customized software, we demonstrate the versatility of our approach by first tracking the rostro-caudal motion of individual adjacent row (D1, D2) or arc whiskers (β, γ), or a single whisker and points on the whisker pad, in head-fixed mice performing a tactile task. Next, we acquired high-speed video and Pixy data simultaneously and applied the Pixy-based real-time tracking to the high-speed video data. With this approach, we extend the temporal resolution of the Pixy camera and track motion (post hoc) at the limit of high-speed video frame rates. Finally, we show that this system is flexible: it can track individual whisker or limb position without any sophisticated object-tracking algorithm; it can be used in many lighting conditions, including infrared (IR); and it can track the head rotation and location of multiple animals simultaneously. Our system makes behavioral monitoring possible in virtually any biological setting.
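A minimal sketch of the kind of post-processing such tracking data could feed into: converting two tracked markers into a rostro-caudal whisker angle and aligning the 50 Hz trace to high-speed-video timestamps. This is an assumption for illustration, not the published software; the marker layout, sampling rates, and names are hypothetical.

```python
# Illustrative sketch: whisker angle from two tracked markers, resampled to video frames.
# Marker coordinates, sampling rates, and the toy whisking signal are synthetic placeholders.
import numpy as np

def whisker_angle(pad_xy, tip_xy):
    """Rostro-caudal angle (deg) of the pad->tip vector in image coordinates."""
    dx = tip_xy[:, 0] - pad_xy[:, 0]
    dy = tip_xy[:, 1] - pad_xy[:, 1]
    return np.degrees(np.arctan2(dy, dx))

t_pixy = np.arange(0, 2, 1 / 50)                                      # 50 Hz Pixy samples
pad = np.column_stack([np.full_like(t_pixy, 100.0),
                       np.full_like(t_pixy, 120.0)])                  # static pad marker (px)
tip = pad + 40.0 * np.column_stack([np.cos(2 * np.pi * 8 * t_pixy),
                                    np.sin(2 * np.pi * 8 * t_pixy)])  # toy 8 Hz whisking
angle_50hz = whisker_angle(pad, tip)

t_video = np.arange(0, 2, 1 / 500)                                    # 500 Hz high-speed video
angle_on_video = np.interp(t_video, t_pixy, angle_50hz)               # align to video timestamps
print(angle_on_video.shape)
```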


Subject(s)
Electronic Data Processing; Head Movements/physiology; Touch/physiology; Vibrissae/innervation; Video Recording; Wakefulness/physiology; Algorithms; Animals; Female; Mice; Motor Activity; Software