1.
Article in English | MEDLINE | ID: mdl-31401034

ABSTRACT

The strategy of integrating motor signals with sensory information during voluntary behavior is a general feature of sensory processing. It is required to distinguish externally applied (exafferent) from self-generated (reafferent) sensory inputs. This distinction, in turn, underlies our ability to achieve both perceptual stability and accurate motor control during everyday activities. In this review, we consider the results of recent experiments that have provided circuit-level insight into how motor-related inputs to sensory areas selectively cancel self-generated sensory inputs during active behaviors. These studies have revealed both common strategies and important differences across systems. Sensory reafference is suppressed at the earliest stages of central processing in the somatosensory, vestibular, and auditory systems, with the cerebellum and cerebellum-like structures playing key roles. Furthermore, motor-related inputs can also suppress reafferent responses at higher levels of processing such as the cortex, a strategy preferentially used in visual processing. These recent findings have important implications for understanding how the brain achieves the flexibility required to continuously calibrate relationships between motor signals and the resultant sensory feedback, a computation necessary for our subjective awareness that we control both our actions and their sensory consequences.
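The cancellation scheme described in this review can be summarized as a simple subtraction: a forward model driven by an efference copy of the motor command predicts the self-generated (reafferent) sensory input, and removing that prediction from the measured signal leaves the externally applied (exafferent) component. The sketch below is illustrative only; the function names, the scalar signals, and the unit forward-model gain are assumptions, not quantities from any of the studies reviewed.

```python
# Hedged sketch of reafference cancellation. A forward model predicts the
# sensory consequences of a motor command; subtracting that prediction from
# the total measured input isolates the externally generated component.

def predicted_reafference(motor_command, gain=1.0):
    """Forward-model prediction of self-generated sensory input (gain assumed)."""
    return gain * motor_command

def exafference(sensory_input, motor_command, gain=1.0):
    """Sensory prediction error: what remains after cancelling reafference."""
    return sensory_input - predicted_reafference(motor_command, gain)

# Active head turn of 10 deg/s plus an unexpected 3 deg/s perturbation:
total_input = 10.0 + 3.0
print(exafference(total_input, motor_command=10.0))  # -> 3.0, the perturbation alone
```

With a purely active movement the prediction matches the input and the output is zero, which is the selective suppression of reafference the review describes.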


Subject(s)
Brain/physiology , Psychomotor Performance/physiology , Cerebellum/physiology , Feedback, Sensory/physiology , Humans , Models, Neurological , Perception/physiology
2.
J Neurosci ; 38(14): 3584-3602, 2018 04 04.
Article in English | MEDLINE | ID: mdl-29487123

ABSTRACT

Many daily behaviors rely critically on estimates of our body motion. Such estimates must be computed by combining neck proprioceptive signals with vestibular signals that have been transformed from a head- to a body-centered reference frame. Recent studies showed that deep cerebellar neurons in the rostral fastigial nucleus (rFN) reflect these computations, but whether they explicitly encode estimates of body motion remains unclear. A key limitation in addressing this question is that, to date, cell tuning properties have only been characterized for a restricted set of motions across head-re-body orientations in the horizontal plane. Here we examined, for the first time, how 3D spatiotemporal tuning for translational motion varies with head-re-body orientation in both horizontal and vertical planes in the rFN of male macaques. While vestibular coding was profoundly influenced by head-re-body position in both planes, neurons typically reflected at most a partial transformation. However, their tuning shifts were not random but followed the specific spatial trajectories predicted for a 3D transformation. We show that these properties facilitate the linear decoding of fully body-centered motion representations in 3D with a broad range of temporal characteristics from small groups of 5-7 cells. These results demonstrate that the vestibular reference frame transformation required to compute body motion is indeed encoded by cerebellar neurons. We propose that maintaining partially transformed rFN responses with different spatiotemporal properties facilitates the creation of downstream body motion representations with a range of dynamic characteristics, consistent with the functional requirements for tasks such as postural control and reaching.

SIGNIFICANCE STATEMENT

Estimates of body motion are essential for many daily activities. Vestibular signals are important contributors to such estimates but must be transformed from a head- to a body-centered reference frame. Here, we provide the first direct demonstration that the cerebellum computes this transformation fully in 3D. We show that the output of these computations is reflected in the tuning properties of deep cerebellar rostral fastigial nucleus neurons in a specific distributed fashion that facilitates the efficient creation of body-centered translation estimates with a broad range of temporal properties (i.e., from acceleration to position). These findings support an important role for the rostral fastigial nucleus as a source of body translation estimates functionally relevant for behaviors ranging from postural control to perception.
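The reference frame transformation at the heart of this study amounts to rotating a head-centered motion vector by the head-re-body orientation, which is available from neck proprioception. The horizontal-plane sketch below illustrates the geometry only; the function name, the yaw-only simplification, and the sign convention are assumptions (the study itself concerns the full 3D case, which would use a rotation matrix built from both yaw and pitch).

```python
import math

# Hedged sketch: rotate a head-centered, horizontal-plane translation vector
# into body-centered coordinates using the head-re-body yaw angle.

def head_to_body(vx_head, vy_head, head_re_body_yaw_deg):
    """Rotate a 2D motion vector from head to body coordinates (yaw only)."""
    a = math.radians(head_re_body_yaw_deg)
    vx_body = math.cos(a) * vx_head - math.sin(a) * vy_head
    vy_body = math.sin(a) * vx_head + math.cos(a) * vy_head
    return vx_body, vy_body

# With the head turned 90 deg on the body, motion that is "forward" in head
# coordinates is sideways in body coordinates:
print(head_to_body(1.0, 0.0, 90.0))  # -> approximately (0.0, 1.0)
```

A "partial transformation", as observed in many rFN neurons, would correspond to tuning that shifts by only a fraction of the head-re-body angle, which is why decoding fully body-centered motion requires pooling across several cells.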


Subject(s)
Body Image , Cerebellar Nuclei/physiology , Head Movements , Orientation, Spatial , Animals , Cerebellar Nuclei/cytology , Macaca mulatta , Male , Neurons/physiology , Vestibule, Labyrinth/physiology
3.
Nat Neurosci ; 18(9): 1310-7, 2015 Sep.
Article in English | MEDLINE | ID: mdl-26237366

ABSTRACT

There is considerable evidence that the cerebellum has a vital role in motor learning by constructing an estimate of the sensory consequences of movement. Theory suggests that this estimate is compared with the actual feedback to compute the sensory prediction error. However, direct proof for the existence of this comparison is lacking. We carried out a trial-by-trial analysis of cerebellar neurons during the execution and adaptation of voluntary head movements and found that neuronal sensitivities dynamically tracked the comparison of predictive and feedback signals. When the relationship between the motor command and resultant movement was altered, neurons robustly responded to sensory input as if the movement was externally generated. Neuronal sensitivities then declined with the same time course as the concurrent behavioral learning. These findings demonstrate the output of an elegant computation in which rapid updating of an internal model enables the motor system to learn to expect unexpected sensory inputs.
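The trial-by-trial dynamics described above have the structure of error-driven updating: each trial's sensory prediction error nudges the internal model's prediction toward the actual feedback, so the residual error (and with it the neuronal response to the now-expected input) decays over trials. The sketch below is a generic delta-rule illustration of that logic; the learning rate, trial count, and scalar feedback value are arbitrary choices, not parameters estimated from the recordings.

```python
# Hedged sketch of trial-by-trial internal-model updating: the prediction is
# moved toward the actual feedback by a fraction (the learning rate) of each
# trial's sensory prediction error, so the error decays across trials.

def adapt(actual_feedback, n_trials=10, learning_rate=0.3):
    prediction = 0.0            # model initially expects no altered feedback
    errors = []
    for _ in range(n_trials):
        error = actual_feedback - prediction   # sensory prediction error
        errors.append(error)
        prediction += learning_rate * error    # delta-rule update
    return errors

errors = adapt(actual_feedback=5.0)
print(errors[0], errors[-1])  # the error shrinks as the model is updated
```

The exponential-like decline of `errors` mirrors the finding that neuronal sensitivities fell with the same time course as the concurrent behavioral learning.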


Subject(s)
Cerebellum/physiology , Extinction, Psychological/physiology , Head Movements/physiology , Learning/physiology , Neurons/physiology , Action Potentials/physiology , Animals , Cerebellum/cytology , Macaca mulatta , Male , Photic Stimulation/methods , Time Factors
4.
J Neurosci ; 35(8): 3555-65, 2015 Feb 25.
Article in English | MEDLINE | ID: mdl-25716854

ABSTRACT

Traditionally, the neural encoding of vestibular information is studied by applying either passive rotations or translations in isolation. However, natural vestibular stimuli are typically more complex. During everyday life, our self-motion is generally not restricted to one dimension, but rather comprises both rotational and translational motion that will simultaneously stimulate receptors in the semicircular canals and otoliths. In addition, natural self-motion is the result of self-generated and externally generated movements. However, to date, it remains unknown how information about rotational and translational components of self-motion is integrated by vestibular pathways during active and/or passive motion. Accordingly, here, we compared the responses of neurons at the first central stage of vestibular processing to rotation, translation, and combined motion. Recordings were made in alert macaques from neurons in the vestibular nuclei involved in postural control and self-motion perception. In response to passive stimulation, neurons did not combine canal and otolith afferent information linearly. Instead, inputs were subadditively integrated with a weighting that was frequency dependent. Although canal inputs were more heavily weighted at low frequencies, the weighting of otolith input increased with frequency. In response to active stimulation, neuronal modulation was significantly attenuated (∼ 70%) relative to passive stimulation for rotations and translations and even more profoundly attenuated for combined motion due to subadditive input integration. Together, these findings provide insights into neural computations underlying the integration of semicircular canal and otolith inputs required for accurate posture and motor control, as well as perceptual stability, during everyday life.
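The integration rule reported here, subadditive weighting that shifts from canal-dominated at low frequencies to otolith-dominated at high frequencies, can be sketched as a weighted sum whose weights total less than one. The crossover frequency, total weight, and functional form below are hypothetical values chosen only to illustrate the scheme, not quantities fitted to the neuronal data.

```python
# Hedged sketch of subadditive, frequency-dependent canal/otolith integration.
# Weights sum to total_weight < 1 (subadditive), and the otolith weight grows
# with stimulus frequency while the canal weight shrinks.

def weights(freq_hz, crossover_hz=1.0, total_weight=0.8):
    """Return (canal, otolith) weights; all parameter values are illustrative."""
    w_otolith = total_weight * freq_hz / (freq_hz + crossover_hz)
    return total_weight - w_otolith, w_otolith

def combined_response(canal, otolith, freq_hz):
    w_c, w_o = weights(freq_hz)
    return w_c * canal + w_o * otolith

print(weights(0.1))  # low frequency: canal weight dominates
print(weights(5.0))  # high frequency: otolith weight dominates
```

For equal unit inputs the combined response is 0.8 rather than 2.0, i.e., markedly less than the linear sum of the two afferent signals.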


Subject(s)
Head Movements , Otolithic Membrane/physiology , Semicircular Canals/physiology , Sensory Receptor Cells/physiology , Vestibular Nuclei/physiology , Action Potentials , Animals , Macaca mulatta , Male , Space Perception , Vestibular Nuclei/cytology
5.
Cerebellum ; 14(1): 31-4, 2015 Feb.
Article in English | MEDLINE | ID: mdl-25287644

ABSTRACT

During self-motion, the vestibular system makes essential contributions to postural stability and self-motion perception. To ensure accurate perception and motor control, it is critical to distinguish between vestibular sensory inputs that are the result of externally applied motion (exafference) and that are the result of our own actions (reafference). Indeed, although the vestibular sensors encode vestibular afference and reafference with equal fidelity, neurons at the first central stage of sensory processing selectively encode vestibular exafference. The mechanism underlying this reafferent suppression compares the brain's motor-based expectation of sensory feedback with the actual sensory consequences of voluntary self-motion, effectively computing the sensory prediction error (i.e., exafference). It is generally thought that sensory prediction errors are computed in the cerebellum, yet it has been challenging to explicitly demonstrate this. We have recently addressed this question and found that deep cerebellar nuclei neurons explicitly encode sensory prediction errors during self-motion. Importantly, in everyday life, sensory prediction errors occur in response to changes in the effector or world (muscle strength, load, etc.), as well as in response to externally applied sensory stimulation. Accordingly, we hypothesize that altering the relationship between motor commands and the actual movement parameters will result in the updating of the cerebellum-based computation of exafference. If our hypothesis is correct, under these conditions, neuronal responses should initially be increased, consistent with a sudden increase in the sensory prediction error. Then, over time, as the internal model is updated, response modulation should decrease in parallel with a reduction in sensory prediction error, until vestibular reafference is again suppressed. The finding that the internal model predicting the sensory consequences of motor commands adapts to new relationships would have important implications for understanding how responses to passive stimulation endure despite the cerebellum's ability to learn new relationships between motor commands and sensory feedback.


Subject(s)
Cerebellum/physiology , Perception/physiology , Psychomotor Performance/physiology , Animals , Feedback , Haplorhini
7.
J Neurophysiol ; 111(12): 2465-78, 2014 Jun 15.
Article in English | MEDLINE | ID: mdl-24671531

ABSTRACT

Most of our sensory experiences are gained by active exploration of the world. While the ability to distinguish sensory inputs resulting from our own actions (termed reafference) from those produced externally (termed exafference) is well established, the neural mechanisms underlying this distinction are not fully understood. We have previously proposed that vestibular signals arising from self-generated movements are inhibited by a mechanism that compares the internal prediction of the proprioceptive consequences of self-motion to the actual feedback. Here we directly tested this proposal by recording from single neurons in monkeys during vestibular stimulation that was externally produced and/or self-generated. We show for the first time that vestibular reafference is equivalently canceled for self-generated sensory stimulation produced by activation of the neck musculature (head-on-body motion), or axial musculature (combined head and body motion), when there is no discrepancy between the predicted and actual proprioceptive consequences of self-motion. However, if a discrepancy does exist, central vestibular neurons no longer preferentially encode vestibular exafference. Specifically, when simultaneous active and passive motion resulted in activation of the same muscle proprioceptors, neurons robustly encoded the total vestibular input (i.e., responses to vestibular reafference and exafference were equally strong), rather than exafference alone. Taken together, our results show that the cancellation of vestibular reafference in early vestibular processing requires an explicit match between expected and actual proprioceptive feedback. We propose that this vital neuronal computation, necessary for both accurate sensory perception and motor control, has important implications for a variety of sensory systems that suppress self-generated signals.
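The key result above is a gating condition: the cancellation signal is applied only when predicted and actual proprioceptive feedback match; on a mismatch the neuron encodes the total vestibular input. The sketch below illustrates that logic with hypothetical scalar signals; the function name, tolerance, and the all-or-none form of the gate are assumptions made for clarity.

```python
# Hedged sketch of match-gated reafference cancellation: the motor-based
# prediction is subtracted only when predicted and actual proprioceptive
# feedback agree; otherwise the total vestibular input is encoded.

def central_response(vestibular_input, predicted_reafference,
                     predicted_proprio, actual_proprio, tol=1e-6):
    if abs(predicted_proprio - actual_proprio) < tol:
        # Match: cancellation is gated in, leaving only exafference.
        return vestibular_input - predicted_reafference
    # Mismatch: cancellation fails; the neuron encodes the total input.
    return vestibular_input

# Purely active head turn: proprioceptive feedback matches the prediction,
# so the self-generated 10 deg/s input is cancelled:
print(central_response(10.0, 10.0, predicted_proprio=10.0, actual_proprio=10.0))
# Simultaneous active and passive motion driving the same proprioceptors:
# feedback no longer matches, so the total 13 deg/s input is encoded:
print(central_response(13.0, 10.0, predicted_proprio=10.0, actual_proprio=13.0))
```

The second case corresponds to the finding that responses to reafference and exafference were equally strong when the same muscle proprioceptors were activated by both active and passive motion.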


Subject(s)
Feedback, Sensory/physiology , Movement/physiology , Neurons/physiology , Proprioception/physiology , Vestibular Nuclei/physiology , Action Potentials , Animals , Head Movements/physiology , Macaca mulatta , Microelectrodes , Neck Muscles/physiology , Physical Stimulation , Signal Processing, Computer-Assisted , Volition
8.
J Neurosci ; 33(50): 19555-66, 2013 Dec 11.
Article in English | MEDLINE | ID: mdl-24336720

ABSTRACT

The ability to keep track of where we are going as we navigate through our environment requires knowledge of our ongoing location and orientation. In response to passively applied motion, the otolith organs of the vestibular system encode changes in the velocity and direction of linear self-motion (i.e., heading). When self-motion is voluntarily generated, proprioceptive and motor efference copy information is also available to contribute to the brain's internal representation of current heading direction and speed. However, to date, how the brain integrates these extra-vestibular cues with otolith signals during active linear self-motion remains unknown. Here, to address this question, we compared the responses of macaque vestibular neurons during active and passive translations. Single-unit recordings were made from a subgroup of neurons at the first central stage of sensory processing in the vestibular pathways involved in postural control and the computation of self-motion perception. Neurons responded far less robustly to otolith stimulation during self-generated than passive head translations. Yet, the mechanism underlying the marked cancellation of otolith signals did not affect other characteristics of neuronal responses (i.e., baseline firing rate, tuning ratio, orientation of maximal sensitivity vector). Transiently applied perturbations during active motion further established that an otolith cancellation signal was only gated in conditions where proprioceptive sensory feedback matched the motor-based expectation. Together our results have important implications for understanding the brain's ability to ensure accurate postural and motor control, as well as perceptual stability, during active self-motion.


Subject(s)
Models, Neurological , Motion Perception/physiology , Proprioception/physiology , Vestibular Nuclei/physiology , Vestibule, Labyrinth/physiology , Animals , Cues , Head Movements/physiology , Macaca mulatta , Male , Motion , Movement/physiology , Neurons/physiology
9.
Curr Biol ; 23(11): 947-55, 2013 Jun 03.
Article in English | MEDLINE | ID: mdl-23684973

ABSTRACT

BACKGROUND: The ability to distinguish sensory signals that register unexpected events (exafference) from those generated by voluntary actions (reafference) during self-motion is essential for accurate perception and behavior. The cerebellum is most commonly considered in relation to its contributions to the fine tuning of motor commands and sensorimotor calibration required for motor learning. During unexpected motion, however, the sensory prediction errors that drive motor learning potentially provide a neural basis for the computation underlying the distinction between reafference and exafference. RESULTS: Recording from monkeys during voluntary and applied self-motion, we demonstrate that individual cerebellar output neurons encode an explicit and selective representation of unexpected self-motion by means of an elegant computation that cancels the reafferent sensory effects of self-generated movements. During voluntary self-motion, the sensory responses of neurons that robustly encode unexpected movement are canceled. Neurons with vestibular and proprioceptive responses to applied head and body movements are unresponsive when the same motion is self-generated. When sensory reafference and exafference are experienced simultaneously, individual neurons provide a precise estimate of the detailed time course of exafference. CONCLUSIONS: These results provide an explicit solution to the longstanding problem of understanding mechanisms by which the brain anticipates the sensory consequences of our voluntary actions. Specifically, by revealing a striking computation of a sensory prediction error signal that effectively distinguishes between the sensory consequences of self-generated and externally produced actions, our findings overturn the conventional thinking that the sensory errors coded by the cerebellum principally contribute to the fine tuning of motor activity required for motor learning.


Subject(s)
Cerebellum/physiology , Macaca mulatta/physiology , Motion , Movement , Animals , Head Movements , Neurons/physiology , Proprioception
10.
Exp Brain Res ; 210(3-4): 377-88, 2011 May.
Article in English | MEDLINE | ID: mdl-21286693

ABSTRACT

In everyday life, vestibular sensors are activated by both self-generated and externally applied head movements. The ability to distinguish inputs that are a consequence of our own actions (i.e., active motion) from those that result from changes in the external world (i.e., passive or unexpected motion) is essential for perceptual stability and accurate motor control. Recent work has made progress toward understanding how the brain distinguishes between these two kinds of sensory inputs. We have performed a series of experiments in which single-unit recordings were made from vestibular afferents and central neurons in alert macaque monkeys during rotation and translation. Vestibular afferents showed no differences in firing variability or sensitivity during active movements when compared to passive movements. In contrast, the analyses of neuronal firing rates revealed that neurons at the first central stage of vestibular processing (i.e., in the vestibular nuclei) were effectively less sensitive to active motion. Notably, however, this ability to distinguish between active and passive motion was not a general feature of early central processing, but rather was a characteristic of a distinct group of neurons known to contribute to postural control and spatial orientation. Our most recent studies have addressed how vestibular and proprioceptive inputs are integrated in the vestibular cerebellum, a region likely to be involved in generating an internal model of self-motion. We propose that this multimodal integration within the vestibular cerebellum is required for eliminating self-generated vestibular information from the subsequent computation of orientation and posture control at the first central stage of processing.


Subject(s)
Afferent Pathways/physiology , Computer Simulation , Models, Neurological , Motion , Neurons/physiology , Vestibule, Labyrinth/physiology , Action Potentials/physiology , Animals , Cerebellum/cytology , Cerebellum/physiology , Head Movements/physiology , Humans , Vestibule, Labyrinth/cytology
11.
J Neurosci ; 29(34): 10499-511, 2009 Aug 26.
Article in English | MEDLINE | ID: mdl-19710303

ABSTRACT

The ability to accurately control posture and perceive self-motion and spatial orientation requires knowledge of the motion of both the head and body. However, whereas the vestibular sensors and nuclei directly encode head motion, no sensors directly encode body motion. Instead, the convergence of vestibular and neck proprioceptive inputs during self-motion is generally believed to underlie the ability to compute body motion. Here, we provide evidence that the brain explicitly computes an internal estimate of body motion at the level of single cerebellar neurons. Neuronal responses were recorded from the rostral fastigial nucleus, the most medial of the deep cerebellar nuclei, during whole-body, body-under-head, and head-on-body rotations. We found that approximately half of the neurons encoded the motion of the body in space, whereas the other half encoded the motion of the head in space in a manner similar to neurons in the vestibular nuclei. Notably, neurons encoding body motion responded to both vestibular and proprioceptive stimulation (accordingly termed bimodal neurons). In contrast, neurons encoding head motion were sensitive only to vestibular inputs (accordingly termed unimodal neurons). Comparison of the proprioceptive and vestibular responses of bimodal neurons further revealed similar tuning in response to changes in head-on-body position. We propose that the similarity in nonlinear processing of vestibular and proprioceptive signals underlies the accurate computation of body motion. Furthermore, the same neurons that encode body motion (i.e., bimodal neurons) most likely encode vestibular signals in a body-referenced coordinate frame, since the integration of proprioceptive and vestibular information is required for both computations.
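The bimodal computation described above reduces to a subtraction: vestibular afferents encode head-in-space motion, neck proprioceptors encode head-on-body motion, and their difference gives body-in-space motion. The sketch below illustrates this with scalar rotation velocities; the function name and the purely linear form are illustrative simplifications (the abstract itself notes that the underlying vestibular-proprioceptive tuning is nonlinear).

```python
# Hedged sketch of the body-motion computation attributed to bimodal rFN
# neurons: body-in-space velocity is head-in-space velocity (vestibular)
# minus head-on-body velocity (neck proprioception).

def body_in_space(head_in_space, head_on_body):
    """Body velocity = head velocity minus head-relative-to-body velocity."""
    return head_in_space - head_on_body

# Whole-body rotation (head fixed on the body): the body moves with the head.
print(body_in_space(20.0, 0.0))   # -> 20.0
# Head-on-body rotation (body stationary): head motion is fully explained by
# neck motion, so the computed body motion is zero.
print(body_in_space(20.0, 20.0))  # -> 0.0
```

Unimodal neurons, by contrast, would report only the first term (head-in-space), consistent with the head-motion coding described for the vestibular nuclei.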


Subject(s)
Motion , Movement/physiology , Proprioception/physiology , Vestibular Nuclei/physiology , Action Potentials/physiology , Animals , Head Movements/physiology , Macaca mulatta , Models, Neurological , Neck/innervation , Neurons/classification , Neurons/physiology , Orientation , Physical Stimulation/methods , Posture/physiology , Predictive Value of Tests , Vestibular Nuclei/cytology , Vestibule, Labyrinth/physiology
12.
Ann N Y Acad Sci ; 1164: 29-36, 2009 May.
Article in English | MEDLINE | ID: mdl-19645877

ABSTRACT

Our vestibular organs are simultaneously activated by our own actions as well as by stimulation from the external world. The ability to distinguish sensory inputs that are a consequence of our own actions (vestibular reafference) from those that result from changes in the external world (vestibular exafference) is essential for perceptual stability and accurate motor control. Recent work in our laboratory has focused on understanding how the brain distinguishes between vestibular reafference and exafference. Single-unit recordings were made in alert rhesus monkeys during passive and voluntary (i.e., active) head movements. We found that neurons in the first central stage of vestibular processing (vestibular nuclei), but not the primary vestibular afferents, can distinguish between active and passive movements. In order to better understand how neurons differentiate active from passive head motion, we systematically tested neuronal responses to different combinations of passive and active motion resulting from rotation of the head-on-body and/or head-and-body in space. We found that during active movements, a cancellation signal was generated when the activation of proprioceptors matched the motor-generated expectation.


Subject(s)
Vestibule, Labyrinth/physiology , Animals , Head Movements , Macaca mulatta , Models, Biological , Neck/physiology , Neurons, Afferent , Proprioception