ABSTRACT
Bowers et al. focus their criticisms on research that compares behavioral and brain data from the ventral stream with a class of deep neural networks for object recognition. While they are right to identify issues with current benchmarking research programs, they overlook a much more fundamental limitation of this literature: its disregard of the importance of action and interaction for perception.
Subject(s)
Visual Pattern Recognition, Visual Perception, Humans, Brain, Brain Mapping
ABSTRACT
Self-motion through an environment induces various sensory signals, e.g., visual, vestibular, auditory, or tactile. Numerous studies have investigated the role of visual and vestibular stimulation for the perception of self-motion direction (heading). Here, we investigated the rarely considered interaction of visual and tactile stimuli in heading perception. Participants were presented with optic flow simulating forward self-motion across a horizontal ground plane (visual), airflow toward the participants' forehead (tactile), or both. In separate blocks of trials, participants indicated perceived heading from unimodal visual or tactile or bimodal sensory signals. In bimodal trials, presented headings were either spatially congruent or incongruent, with a maximum offset between visual and tactile heading of 30°. To investigate the reference frame in which visuo-tactile heading is encoded, we varied head and eye orientation during presentation of the stimuli. Visual and tactile stimuli were designed to achieve comparable precision of heading reports between modalities. Nevertheless, in bimodal trials heading perception was dominated by the visual stimulus. A change of head orientation had no significant effect on perceived heading, whereas, surprisingly, a change in eye orientation affected tactile heading perception. Overall, we conclude that tactile flow is more important to heading perception than previously thought. NEW & NOTEWORTHY We investigated heading perception from visual-only (optic flow), tactile-only (tactile flow), or bimodal self-motion stimuli in conditions varying in head and eye position. Overall, heading perception was body or world centered and non-Bayes optimal and revealed a centripetal bias. Although visually dominated, bimodal heading perception was significantly influenced by tactile flow.
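Visual dominance in bimodal conflict trials of this kind is commonly quantified as a cue weight. The following is a minimal illustrative sketch, not the authors' analysis code, assuming hypothetical heading reports and modeling perceived heading as a weighted average of the visual and tactile headings.

```python
import numpy as np

# Illustrative sketch (not the authors' analysis code): estimating the visual
# cue weight from bimodal conflict trials, assuming
#   perceived = w_visual * visual + (1 - w_visual) * tactile
# All values below are hypothetical example data in degrees.
visual_heading = np.array([-15.0, -10.0, 10.0, 15.0])    # presented visual headings
tactile_heading = np.array([15.0, 10.0, -10.0, -15.0])   # conflicting tactile headings
perceived = np.array([-12.0, -8.0, 7.5, 12.5])           # hypothetical bimodal reports

# Least-squares estimate of the visual weight across conflict trials.
offset = visual_heading - tactile_heading
w_visual = np.sum((perceived - tactile_heading) * offset) / np.sum(offset ** 2)
print(f"estimated visual weight: {w_visual:.2f}")         # 1.0 would mean complete visual dominance
```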
Subject(s)
Motion Perception, Optic Flow, Touch Perception, Vestibule of the Labyrinth, Humans, Motion Perception/physiology, Vestibule of the Labyrinth/physiology, Touch, Photic Stimulation, Visual Perception/physiology
ABSTRACT
Successful interaction with the environment requires the dissociation of self-induced from externally induced sensory stimulation. Temporal proximity of action and effect is often used as an indicator of whether an observed event should be interpreted as a result of one's own actions. We tested how the delay between an action (press of a touch bar) and an effect (onset of simulated self-motion) influences the processing of visually simulated self-motion in the ventral intraparietal area (VIP) of macaque monkeys. We found that a delay between the action and the start of the self-motion stimulus led to a rise of activity above baseline before motion onset in a subpopulation of 21% of the investigated neurons. In the responses to the stimulus, we found significantly lower sustained activity when the press of the touch bar and the motion onset were contiguous compared with the condition in which motion onset was delayed. We speculate that this weak inhibitory effect might be part of a mechanism that sharpens the tuning of VIP neurons during self-induced motion and thus has the potential to increase the precision of heading information required to adjust the orientation of self-motion in everyday navigational tasks. NEW & NOTEWORTHY Neurons in macaque ventral intraparietal area (VIP) respond to sensory stimulation related to self-motion, e.g., visual optic flow. Here, we found that self-motion-induced activation depends on the sense of agency, i.e., it differed when optic flow was perceived as self-induced versus externally induced. This demonstrates that area VIP is well suited for studying the interplay between active behavior and sensory processing during self-motion.
Subject(s)
Kinesthesis/physiology, Motion Perception/physiology, Motor Activity/physiology, Optic Flow/physiology, Parietal Lobe/physiology, Animals, Electrocorticography, Macaca mulatta, Male, Neurons/physiology
ABSTRACT
The accurate processing of temporal information is of critical importance in everyday life. Yet, psychophysical studies in humans have shown that the perception of time is distorted around saccadic eye movements. The neural correlates of this misperception are still poorly understood. Behavioral and neural evidence suggests that it is tightly linked to other known perisaccadic modulations of visual perception. To further our understanding of how temporal processing is affected by saccades, we studied the representation of brief visual time intervals during fixation and saccades in area V4 of two awake macaques. We presented random sequences of vertical bar stimuli and extracted neural responses to double-pulse stimulation at varying interstimulus intervals. Our results show that temporal information about intervals as brief as 20 ms is reliably represented in the multiunit activity in area V4. Response latencies were not systematically modulated by the saccade. However, a general increase in perisaccadic activity altered the ratio of response amplitudes within stimulus pairs compared with fixation. In line with previous studies showing that the perception of brief time intervals is partly based on response levels, this may be seen as a possible correlate of the perisaccadic misperception of time. NEW & NOTEWORTHY We investigated for the first time how temporal information on very brief timescales is represented in area V4 around the time of saccadic eye movements. Overall, the responses showed an unexpectedly precise representation of time intervals. Our finding of a perisaccadic modulation of relative response amplitudes introduces a new possible correlate of saccade-related perceptual distortions of time.
Subject(s)
Photic Stimulation/methods, Reaction Time/physiology, Saccades/physiology, Time Perception/physiology, Visual Cortex/physiology, Visual Perception/physiology, Animals, Macaca, Macaca mulatta, Male
ABSTRACT
Postural instability is one of the most disabling features of Parkinson's disease (PD), but it only reveals itself after affected brain areas have already been significantly damaged. Thus there is a need to detect deviations in balance and postural control before visible symptoms occur. In this study, we visually perturbed balance in the anterior-posterior direction using sinusoidal oscillations of a moving room in virtual reality at different frequencies. We tested three groups: individuals with PD under dopaminergic medication, an age-matched control group, and a group of young healthy adults. We tracked their center of pressure and their full-body motion, from which we also extracted the center of mass. We investigated sway amplitudes and applied newly introduced phase-locking analyses to investigate responses across participants' bodies. Patients exhibited significantly higher sway amplitudes than the control subjects. However, their sway was phase locked to the visual motion like that of age-matched and young healthy adults; thus all groups successfully compensated for the visual perturbation by phase locking their sway to the stimulus. As the frequency of the perturbation increased, the distribution of phase locking (PL) across the body revealed a shift of the highest PL values from the upper body toward the hip region for young healthy adults, which could not be observed in patients or elderly healthy adults. Our findings suggest impaired motor control but intact visuomotor processing in early stages of PD, whereas reduced flexibility in adapting the postural strategy to different perturbations appears to be an effect of age rather than disease. NEW & NOTEWORTHY A better understanding of visuomotor control in Parkinson's disease (PD) potentially serves as a tool for earlier diagnosis, which is crucial for improving patients' quality of life. In our study, we assess body sway responses to visual perturbations of the balance control system in patients with early-to-mid stage PD, using motion tracking along with recently established phase-locking techniques. Our findings suggest patients at this stage have impaired muscular stability but intact visuomotor control.
Subject(s)
Parkinson Disease/physiopathology, Postural Balance/physiology, Psychomotor Performance/physiology, Visual Perception/physiology, Adult, Aged, Female, Humans, Male, Middle Aged, Young Adult
ABSTRACT
Vision plays a central role in maintaining balance. When humans perceive their body as moving, they trigger counter movements. This results in body sway, which has typically been investigated by measuring the body's center of pressure (COP). Here, we aimed to induce visually evoked postural responses (VEPR) by simulating self-motion in virtual reality (VR) using a sinusoidally oscillating "moving room" paradigm. Ten healthy subjects participated in the experiment. Stimulation consisted of a 3D cloud of random dots, presented through a VR headset, which oscillated sinusoidally in the anterior-posterior direction at different frequencies. We used a force platform to measure subjects' COP over time and quantified the resulting trajectory by wavelet analyses, including inter-trial phase coherence (ITPC). Subjects exhibited significant coupling of their COP to the respective stimulus. Even when spectral analysis of postural sway showed only small responses in the expected frequency bands (power), ITPC revealed an almost constant strength of coupling to the stimulus both within and across subjects and presented frequencies. Remarkably, ITPC even revealed strong phase coupling to stimulation at 1.5 Hz, which exceeds the frequency range that has generally been attributed to the coupling of human postural sway to oscillatory visual scenery. These findings suggest phase-locking to be an essential feature of visuomotor control.
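As a rough illustration of the ITPC measure, the sketch below computes phase coherence of epoched COP traces at the stimulus frequency from one Fourier coefficient per trial; it is a simplified stand-in for the wavelet analysis described above, and all data are simulated.

```python
import numpy as np

# Simplified stand-in for the wavelet-based ITPC analysis (all data simulated):
# phase coherence of epoched COP traces at the stimulus frequency, computed
# from one complex Fourier coefficient per trial.
def itpc_at_frequency(trials, fs, f_stim):
    """trials: array (n_trials, n_samples) of anterior-posterior COP traces."""
    n_samples = trials.shape[1]
    t = np.arange(n_samples) / fs
    coeffs = trials @ np.exp(-2j * np.pi * f_stim * t)    # complex amplitude per trial
    phases = np.angle(coeffs)
    # ITPC: length of the mean unit phase vector across trials (0 = random, 1 = perfect locking).
    return np.abs(np.mean(np.exp(1j * phases)))

# Hypothetical example: 20 trials of 30 s at 100 Hz, visual scene oscillating at 0.2 Hz.
fs, f_stim = 100.0, 0.2
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
trials = np.sin(2 * np.pi * f_stim * t) + rng.normal(0.0, 1.0, (20, t.size))
print(itpc_at_frequency(trials, fs, f_stim))              # approaches 1 for strongly phase-locked sway
```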
Subject(s)
Motion Perception/physiology, Postural Balance/physiology, Psychomotor Performance/physiology, Virtual Reality, Adult, Humans
ABSTRACT
Keeping track of objects in our environment across body and eye movements is essential for perceptual stability and for the localization of external objects. As yet, it is largely unknown how this perceptual stability is achieved. A common behavioral approach to investigating potential neuronal mechanisms underlying spatial vision has been the presentation of one brief visual stimulus across eye movements. Here, we adopted this approach and aimed to determine the reference frame of the perceptual localization of two successively presented flashes during fixation and smooth pursuit eye movements (SPEMs). To this end, eccentric flashes with a stimulus onset asynchrony of zero or ±200 ms had to be localized with respect to each other during fixation and SPEMs. The results were used to evaluate different models predicting the reference frame in which the spatial information is represented. First, we were able to reproduce the well-known effect of relative mislocalization during fixation. Second, smooth pursuit led to a characteristic relative mislocalization, different from that during fixation. A model assuming that relative localization takes place in a nonretinocentric reference frame described our data best. This suggests that the relative localization judgment is performed at a stage of visual processing in which retinal and nonretinal information is available.
Subject(s)
Ocular Fixation/physiology, Photic Stimulation, Smooth Pursuit/physiology, Retina/radiation effects, Adult, Female, Humans, Judgment, Male, Motion Perception/physiology, Visual Perception/physiology, Young Adult
ABSTRACT
Vision represents the most important sense of primates. To understand visual processing, various methods are employed, for example electrophysiology, psychophysics, or eye tracking. For the latter method, researchers have recently begun to step outside the artificial environments of laboratory setups toward the more natural conditions we usually face in the real world. To get a better understanding of the advantages and limitations of modern mobile eye-trackers, we quantitatively compared one of the most advanced mobile eye-trackers available, the EyeSeeCam, with a commonly used laboratory eye-tracker, the EyeLink II, serving as a gold standard. We aimed to investigate whether fully mobile eye-trackers are capable of providing data adequate for direct comparison with data recorded by stationary eye-trackers. Therefore, we recorded three commonly used types of eye movements (fixations, saccades, and smooth-pursuit eye movements) with both eye-trackers, in successive standardized paradigms in a laboratory setting with eight human subjects. Despite major technical differences between the devices, most eye movement parameters did not differ statistically between the two systems. Differences were found only in overall gaze accuracy and for time-critical parameters such as saccade duration, for which a higher sampling frequency is especially useful. Although the stationary EyeLink II system proved superior, especially on a single-subject or even a single-trial basis, the EyeSeeCam showed similar performance for parameters averaged across trials and subjects. We concluded that modern mobile eye-trackers are well suited to providing reliable oculomotor data at the required spatial and temporal resolutions.
Subject(s)
Eye Movements, Humans, Smooth Pursuit, Ocular Vision
ABSTRACT
Natural orienting of gaze often results in a retinal image that is rotated relative to space due to ocular torsion. However, we perceive neither this rotation nor a moving world, despite visual rotational motion on the retina. This perceptual stability is often attributed to the phenomenon known as predictive remapping, but the current remapping literature ignores this torsional component. In addition, studies often measure remapping across either space or features (e.g., orientation), but in natural circumstances both components are bound together for stable perception. One natural circumstance in which the perceptual system must account for the current and future eye orientation to correctly interpret the orientation of external stimuli occurs during movements to or from oblique eye orientations (i.e., eye orientations with both a horizontal and a vertical angular component relative to the primary position). Here we took advantage of the ocular torsion induced by oblique eye orientations to examine perisaccadic orientation perception. First, we found that orientation perception was largely predicted by the rotated retinal image. Second, we observed a presaccadic remapping of orientation perception consistent with maintaining a stable (but spatially inaccurate) retinocentric perception throughout the saccade. These findings strongly suggest that our seamless perceptual stability relies on retinocentric signals that are predictively remapped in all three ocular dimensions with each saccade.
Subject(s)
Spatial Orientation/physiology, Saccades/physiology, Visual Perception/physiology, Adult, Female, Ocular Fixation/physiology, Humans, Male, Retina/physiology, Rotation, Ocular Vision/physiology, Young Adult
ABSTRACT
During self-motion through an environment, our sensory systems are confronted with a constant flow of information from different modalities. To successfully navigate, self-induced sensory signals have to be dissociated from externally induced sensory signals. Previous studies have suggested that the processing of self-induced sensory information is modulated by means of predictive coding mechanisms. However, the neural correlates of processing self-induced sensory information from different modalities during self-motion are largely unknown. Here, we asked whether and how the processing of visually simulated self-motion and/or associated auditory stimuli is modulated by self-controlled action. Participants were asked to actively reproduce a previously observed simulated self-displacement (path integration). Blood oxygen level-dependent (BOLD) activation during this path integration was compared with BOLD activation during a condition in which we passively replayed the exact sensory stimulus that had been produced by the participants in previous trials. We found supramodal BOLD suppression in parietal and frontal regions. Remarkably, BOLD contrast in sensory areas was enhanced in a modality-specific manner. We conclude that the effect of action on sensory processing is strictly dependent on the respective behavioral task and its relevance.
Subject(s)
Auditory Perception/physiology, Frontal Lobe/physiology, Mental Processes/physiology, Motion Perception/physiology, Sensation/physiology, Acoustic Stimulation, Adult, Cognition/physiology, Female, Ocular Fixation/physiology, Humans, Magnetic Resonance Imaging, Male, Movement/physiology, Photic Stimulation, Young Adult
ABSTRACT
In the natural world, self-motion always stimulates several different sensory modalities. Here we investigated the interplay between a visual optic flow stimulus simulating self-motion and a tactile stimulus (air flow resulting from self-motion) while human observers were engaged in a distance reproduction task. We found that adding congruent tactile information (i.e., speed of the air flow and speed of visual motion directly proportional) to the visual information significantly improves the precision of the actively reproduced distances. This improvement, however, was smaller than predicted for an optimal integration of visual and tactile information. In contrast, incongruent tactile information (i.e., speed of the air flow and speed of visual motion inversely proportional) did not improve subjects' precision, indicating that incongruent tactile information and visual information were not integrated. One possible interpretation of the results is a link to properties of neurons in the ventral intraparietal area that have been shown to have spatially and action-congruent receptive fields for visual and tactile stimuli. NEW & NOTEWORTHY This study shows that tactile and visual information can be integrated to improve estimates of the parameters of self-motion. This, however, happens only if the two sources of information are congruent, as they are in a natural environment. In contrast, an incongruent tactile stimulus is still used as a source of information about self-motion, but it is not integrated with visual information.
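The optimal-integration benchmark referred to above is the standard maximum-likelihood prediction for combining two independent cues. A minimal sketch with hypothetical unimodal variances (not the study's data or code):

```python
import numpy as np

# Maximum-likelihood ("optimal") integration prediction referred to above,
# with hypothetical unimodal standard deviations (not the study's data):
sigma_visual = 0.9    # SD of reproduced distance, visual-only condition (arbitrary units)
sigma_tactile = 1.1   # SD of reproduced distance, congruent-tactile-only condition

# Reliability-weighted combination predicts a lower bimodal variance:
#   1 / sigma_bimodal^2 = 1 / sigma_visual^2 + 1 / sigma_tactile^2
sigma_bimodal_pred = np.sqrt(1.0 / (1.0 / sigma_visual**2 + 1.0 / sigma_tactile**2))
w_visual = (1.0 / sigma_visual**2) / (1.0 / sigma_visual**2 + 1.0 / sigma_tactile**2)
print(f"predicted bimodal SD: {sigma_bimodal_pred:.2f}, visual weight: {w_visual:.2f}")
# An observed bimodal SD above this prediction indicates sub-optimal integration.
```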
Subject(s)
Motion Perception, Touch Perception, Adult, Female, Humans, Male, Movement, Parietal Lobe/physiology
ABSTRACT
The dependence of neuronal discharge on the position of the eyes in the orbit is a functional characteristic of many visual cortical areas of the macaque. It has been suggested that these eye-position signals provide relevant information for a coordinate transformation of visual signals into a non-eye-centered frame of reference. This transformation could be an integral part of achieving visual perceptual stability across eye movements. Previous studies demonstrated close-to-veridical eye-position decoding during stable fixation as well as characteristic erroneous decoding across saccadic eye movements. Here we aimed to decode eye position during smooth pursuit. We recorded neural activity in macaque area VIP during steady fixation, saccades, and smooth pursuit and investigated the temporal and spatial accuracy of eye position as decoded from the neuronal discharges. Confirming previous results, the activity of the majority of neurons depended linearly on horizontal and vertical eye position. The application of a previously introduced computational approach (isofrequency decoding) allowed eye-position decoding with considerable accuracy during steady fixation. We applied the same decoder to the activity of the same neurons during smooth pursuit. On average, the decoded signal led the current eye position. A model combining this constant lead of the decoded eye position with a previously described attentional bias ahead of the pursuit target describes the asymmetric mislocalization pattern for briefly flashed stimuli during smooth pursuit eye movements found in human behavioral studies.
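The linear dependence of firing rate on eye position is what makes such decoding possible. The sketch below is a generic least-squares decoder run on simulated linear gain fields; it is not the isofrequency decoder used in the study, and all neurons and parameters are made up for illustration.

```python
import numpy as np

# Generic least-squares sketch on simulated linear gain fields (this is not the
# isofrequency decoder used in the study; all parameters are made up):
# each neuron's rate is modeled as rate = gain_x * x + gain_y * y + offset,
# so a population rate vector can be inverted to recover eye position (x, y).
rng = np.random.default_rng(1)
n_neurons = 50
B = rng.normal(0.0, 1.0, (n_neurons, 3))                  # per-neuron [gain_x, gain_y, offset]

def rates_for(eye_xy):
    x, y = eye_xy
    return B @ np.array([x, y, 1.0]) + rng.normal(0.0, 0.5, n_neurons)  # noisy rates

# "Training": fit the linear model from fixations at known eye positions.
train_pos = np.array([[x, y] for x in (-10, 0, 10) for y in (-10, 0, 10)], dtype=float)
train_rates = np.array([rates_for(p) for p in train_pos])
X = np.column_stack([train_pos, np.ones(len(train_pos))])
B_hat = np.linalg.lstsq(X, train_rates, rcond=None)[0].T  # estimated (n_neurons, 3)

# Decoding: least-squares inversion of the fitted model for a new rate vector.
true_pos = np.array([5.0, -3.0])
decoded = np.linalg.lstsq(B_hat[:, :2], rates_for(true_pos) - B_hat[:, 2], rcond=None)[0]
print("decoded eye position:", np.round(decoded, 1))      # close to [5, -3]
```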
Subject(s)
Brain/physiology, Smooth Pursuit, Space Perception, Animals, Attention, Brain/cytology, Visual Evoked Potentials, Macaca, Neurons/physiology, Saccades
ABSTRACT
Alterations of eye movements in schizophrenia patients have been widely described for laboratory settings. For example, gain during smooth tracking is reduced, and fixation patterns differ between patients and healthy controls. The question remains whether such results are related to the specifics of the experimental environment or whether they transfer to natural settings. Twenty ICD-10-diagnosed schizophrenia patients and 20 healthy age-matched controls participated in the study, each performing four different oculomotor tasks corresponding to natural everyday behavior in an indoor environment: (I) fixating stationary targets, (II) sitting in a hallway with free gaze, (III) walking down the hallway, and (IV) visually tracking a target on the floor while walking straight ahead. In all conditions, eye movements were continuously recorded binocularly by a mobile lightweight eye tracker (EyeSeeCam). When patients looked at predefined targets, they showed more fixations with reduced durations than controls. The opposite was true when participants were sitting in a hallway with free gaze. During visual tracking, patients showed a significantly greater root-mean-square error (representing the mean deviation from optimal) of retinal target velocity. Different from previous results on smooth-pursuit eye movements obtained in laboratory settings, no such difference was found for velocity gain. Taken together, we have identified significant differences in fundamental oculomotor parameters between schizophrenia patients and healthy controls during natural behavior in a real environment. Moreover, our data provide evidence that in natural settings, patients overcome some impairments, which might be present only in laboratory studies, by as yet unknown compensatory mechanisms or strategies.
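To make the two tracking measures concrete, the sketch below computes the root-mean-square error of retinal target velocity and the velocity gain from hypothetical eye- and target-velocity traces; it illustrates the definitions only and is not the study's analysis code.

```python
import numpy as np

# Illustration of the two tracking measures (definitions only; not the study's
# analysis code), assuming eye and target velocities in deg/s at a common rate.
def tracking_measures(eye_vel, target_vel):
    retinal_vel = target_vel - eye_vel                    # retinal slip of the target
    rms_error = np.sqrt(np.mean(retinal_vel ** 2))        # RMS retinal target velocity
    gain = np.mean(eye_vel) / np.mean(target_vel)         # velocity gain (1 = perfect tracking)
    return rms_error, gain

# Hypothetical traces: target moving at 10 deg/s, noisy and slightly slow pursuit.
rng = np.random.default_rng(2)
target_vel = np.full(200, 10.0)
eye_vel = 9.0 + rng.normal(0.0, 2.0, 200)
rms_error, gain = tracking_measures(eye_vel, target_vel)
print(f"RMS retinal velocity: {rms_error:.2f} deg/s, velocity gain: {gain:.2f}")
```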
Subject(s)
Environment, Ocular Motility Disorders/etiology, Schizophrenia/complications, Adult, Case-Control Studies, Eye Movements, Female, Humans, Male, Ocular Motility Disorders/diagnosis, Photic Stimulation, Psychomotor Performance, Visual Perception, Young Adult
ABSTRACT
Eye-position signals (EPS) are found throughout the primate visual system and are thought to provide a mechanism for representing spatial locations in a manner that is robust to changes in eye position. It remains unknown, however, whether cortical EPS (also known as "gain fields") have the necessary spatial and temporal characteristics to fulfill their purported computational roles. To quantify these EPS, we combined single-unit recordings in four dorsal visual areas of behaving rhesus macaques (lateral intraparietal area, ventral intraparietal area, middle temporal area, and the medial superior temporal area) with likelihood-based population-decoding techniques. The decoders used knowledge of spiking statistics to estimate eye position during fixation from a set of observed spike counts across neurons. Importantly, these samples were short in duration (100 ms) and from individual trials to mimic the real-time estimation problem faced by the brain. The results suggest that cortical EPS provide an accurate and precise representation of eye position, albeit with unequal signal fidelity across brain areas and a modest underestimation of eye eccentricity. The underestimation of eye eccentricity predicted a pattern of mislocalization that matches the errors made by human observers. In addition, we found that eccentric eye positions were associated with enhanced precision relative to the primary eye position. This predicts that positions in visual space should be represented more reliably during eccentric gaze than while looking straight ahead. Together, these results suggest that cortical eye-position signals provide a useable head-centered representation of visual space on timescales that are compatible with the duration of a typical ocular fixation.
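A toy version of the likelihood-based decoding approach described above: Poisson spike counts from a single 100-ms window, simulated linear gain fields, and a maximum-likelihood scan over candidate eye positions. Everything here is simulated and hypothetical; it illustrates the principle, not the study's decoder.

```python
import numpy as np

# Toy likelihood-based decoder (simulated data, hypothetical parameters):
# Poisson spike counts from one 100-ms window, linear gain fields, and a
# maximum-likelihood scan over candidate horizontal eye positions.
rng = np.random.default_rng(3)
n_neurons, window = 40, 0.1                               # 100-ms counting window
positions = np.linspace(-20, 20, 81)                      # candidate eye positions (deg)

base = rng.uniform(10.0, 30.0, n_neurons)                 # baseline rates (spikes/s)
slope = rng.normal(0.0, 1.0, n_neurons)                   # eye-position gain per neuron

def mean_rates(x):
    return np.clip(base + slope * x, 0.1, None)           # linear gain field, kept positive

true_x = 8.0
counts = rng.poisson(mean_rates(true_x) * window)         # one 100-ms sample of spike counts

# Poisson log likelihood (up to a count-dependent constant) for each candidate position.
lam = mean_rates(positions[:, None]) * window             # (n_positions, n_neurons)
log_like = np.sum(counts * np.log(lam) - lam, axis=1)
print("decoded eye position (deg):", positions[np.argmax(log_like)])   # near 8, within a few degrees
```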
Subject(s)
Visual Evoked Potentials/physiology, Ocular Fixation/physiology, Neurological Models, Psychomotor Performance/physiology, Space Perception/physiology, Visual Pathways/physiology, Action Potentials/physiology, Animals, Bayes Theorem, Humans, Likelihood Functions, Macaca mulatta, Male, Reaction Time/physiology
ABSTRACT
The patterns of optic flow seen during self-motion can be used to determine the direction of one's own heading. Tracking eye movements, which typically occur during everyday life, alter this task, since they add further retinal image motion and (predictably) distort the retinal flow pattern. Humans employ both visual and nonvisual (extraretinal) information to solve a heading task in such cases. Likewise, it has been shown that neurons in the monkey medial superior temporal area (area MST) use both signals during the processing of self-motion information. In this article we report that neurons in the macaque ventral intraparietal area (area VIP) use visual information derived from the distorted flow patterns to encode heading during (simulated) eye movements. We recorded responses of VIP neurons to simple radial flow fields and to distorted flow fields that simulated self-motion plus eye movements. In 59% of the cases, cell responses compensated for the distortion and kept the same heading selectivity irrespective of the different simulated eye movements. In addition, response modulations during real eye movements were smaller than during simulated eye movements, consistent with reafferent signaling involved in the processing of the visual consequences of eye movements in area VIP. We conclude that the motion selectivities found in area VIP, like those in area MST, provide a way to successfully analyze and use flow fields during self-motion and simultaneous tracking movements.
Subject(s)
Motion Perception/physiology, Neurons/physiology, Orientation/physiology, Parietal Lobe/physiology, Animals, Eye Movements/physiology, Macaca mulatta, Optic Flow, Photic Stimulation/methods
ABSTRACT
PURPOSE: To assess the reproducibility of brain-activation and eye-movement patterns in a saccade paradigm when comparing subjects, tasks, and magnetic resonance (MR) systems. MATERIALS AND METHODS: Forty-five healthy adults at two different sites performed saccade tasks with varying levels of target predictability: predictable (PRED), position predictable (pPRED), time predictable (tPRED), and prosaccade (SAC). Eye-movement patterns were tested with a repeated-measures analysis of variance. Activation-map reproducibility was estimated with the cluster-overlap Jaccard index and the coefficient of determination of signal variance for within-subjects test-retest data and for between-subjects data from the same and different sites. RESULTS: In all groups, latencies increased with decreasing target predictability: PRED < pPRED < tPRED < SAC (P < 0.001). Activation overlap was good to fair (>0.40) in all tasks in the within-subjects test-retest comparisons and poor (<0.40) in tPRED for different subjects. The overlap of the different tasks for within-groups data was higher (0.40-0.68) than for between-groups data (0.30-0.50). Activation consistency was 60-85% in the same subjects, 50-79% in different subjects, and 50-80% across different sites. In SAC, the activation found in the same and in different subjects was more consistent than in other tasks (50-80%). CONCLUSION: The predictive saccade tasks produced evidence for brain-activation and eye-movement reproducibility.
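For reference, the cluster-overlap Jaccard index used above is simply intersection over union of two thresholded (binary) activation maps. A minimal sketch on toy masks (hypothetical data, not the study's pipeline):

```python
import numpy as np

# Minimal sketch of the cluster-overlap Jaccard index: intersection over union
# of two thresholded (binary) activation maps. Toy masks, not the study's data.
def jaccard(mask_a, mask_b):
    intersection = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return intersection / union if union > 0 else np.nan

rng = np.random.default_rng(4)
map_run1 = rng.random((16, 16, 10)) > 0.7                 # suprathreshold voxels, session 1
map_run2 = rng.random((16, 16, 10)) > 0.7                 # suprathreshold voxels, session 2
print(f"Jaccard overlap: {jaccard(map_run1, map_run2):.2f}")   # >0.40 was counted as fair-to-good overlap above
```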
Subject(s)
Brain Mapping/methods, Brain/physiology, Eye Movements/physiology, Ocular Fixation/physiology, Magnetic Resonance Imaging/methods, Nerve Net/physiology, Saccades/physiology, Adult, Female, Humans, Male, Reference Values, Reproducibility of Results, Sensitivity and Specificity
ABSTRACT
Introduction: Numerous previous studies have shown that eye movements induce errors in the localization of briefly flashed stimuli. Remarkably, the error pattern is indicative of the underlying eye movement and the exact experimental condition. For smooth pursuit eye movements (SPEM) and the slow phase of the optokinetic nystagmus (OKN), perceived stimulus locations are shifted in the direction of the ongoing eye movement, with a hemifield asymmetry observed only during SPEM. During the slow phases of the optokinetic afternystagmus (OKAN), however, the error pattern can be described as a perceptual expansion of space. Different from SPEM and OKN, the OKAN is an open-loop eye movement. Methods: Visually guided smooth pursuit can be transformed into an open-loop eye movement by briefly blanking the pursuit target (gap). Here, we examined flash localization during open-loop pursuit and asked whether localization is also prone to errors and whether these are similar to those found during SPEM or during OKAN. Human subjects tracked a pursuit target. In half of the trials, the target was extinguished for 300 ms (gap) during the steady state, inducing open-loop pursuit. Flashes were presented during this gap or during steady-state (closed-loop) pursuit. Results: In both conditions, perceived flash locations were shifted in the direction of the eye movement. The overall error pattern was very similar, with error sizes slightly smaller in the gap condition. The differences between errors in the open- and closed-loop conditions were largest in the central visual field and smallest in the periphery. Discussion: We discuss the findings in light of the neural substrates driving the different forms of eye movements.
ABSTRACT
Self-motion induces sensory signals that allow one to determine travel distance (path integration). For veridical path integration, one must distinguish self-generated from externally induced sensory signals. Predictive coding has been suggested to attenuate self-induced sensory responses, while task relevance can reverse the attenuating effect of prediction. But how is self-motion processing affected by prediction and task demands, and do effects generalize across senses? In this fMRI study, we investigated visual and tactile self-motion processing and its modulation by task demands. Visual stimuli simulated forward self-motion across a ground plane. Tactile self-motion stimuli were delivered by airflow across the subjects' forehead. In one task, subjects replicated a previously observed distance (Reproduction/Active; high behavioral demand) of passive self-displacement (Reproduction/Passive). In a second task, subjects travelled a self-chosen distance (Self/Active; low behavioral demand), which was recorded and played back to them (Self/Passive). For both tasks and sensory modalities, Active as compared with Passive trials showed enhancement in early visual areas and suppression in higher-order areas of the inferior parietal lobule (IPL). Contrasting high- and low-demand active trials yielded supramodal enhancement in the anterior insula. The suppression in the IPL suggests that this area acts as a comparator of sensory self-motion signals and predictions thereof.
Subject(s)
Motion Perception, Humans, Motion Perception/physiology, Touch/physiology, Parietal Lobe/physiology, Photic Stimulation
ABSTRACT
While deep brain stimulation (DBS) of the subthalamic nucleus (STN) improves motor functions in Parkinson's disease (PD), it may also increase impulsivity by interfering with the inhibition of reflexive responses. The aim of this study was to investigate whether varying the pulse frequency of STN-DBS has a modulating effect on response inhibition and its neural correlates. For this purpose, 14 persons with PD repeated an antisaccade task in three stimulation settings (DBS off, high-frequency DBS (130 Hz), and mid-frequency DBS (60 Hz)) in randomized order, while eye movements and brain activity (high-density EEG) were recorded. On a behavioral level, 130 Hz DBS had no effect on response inhibition measured as antisaccade error rate, while 60 Hz DBS induced a slight but significant reduction of directional errors compared with the DBS-off state and 130 Hz DBS. Further, stimulation at both frequencies decreased the onset latency of correct antisaccades while increasing the latency of directional errors. Time-frequency analysis of the EEG data revealed that 60 Hz DBS was associated with an increase in preparatory theta power over a midfrontal region of interest compared with the DBS-off state, which is generally regarded as a marker of increased cognitive control. While no significant differences in brain activity over mid- and lateral prefrontal regions of interest emerged between the 60 Hz and 130 Hz conditions, both stimulation frequencies were associated with a stronger midfrontal beta desynchronization during the mental preparation for correct antisaccades compared with the DBS-off state, which is discussed in the context of a potentially enhanced proactive recruitment of the oculomotor network. Our preliminary findings suggest that mid-frequency STN-DBS may provide beneficial effects on response inhibition, while both 130 Hz and 60 Hz STN-DBS may promote voluntary actions at the expense of slower reflexive responses.
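As one concrete ingredient of such a time-frequency analysis, the sketch below estimates theta-band (4-8 Hz) power of a single midfrontal EEG channel during a preparatory window from its Welch power spectrum; it is a simplified illustration on simulated data, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import welch

# Simplified illustration on simulated data (not the authors' pipeline):
# theta-band (4-8 Hz) power of one midfrontal EEG channel during a preparatory
# window, estimated from the Welch power spectral density.
def theta_power(segment, fs):
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), int(fs)))
    band = (freqs >= 4.0) & (freqs <= 8.0)
    df = freqs[1] - freqs[0]
    return psd[band].sum() * df                           # integrated theta power

fs = 250.0                                                # hypothetical sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)                             # 1-s preparatory window
rng = np.random.default_rng(5)
segment = np.sin(2 * np.pi * 6.0 * t) + rng.normal(0.0, 0.5, t.size)   # toy 6-Hz "theta" plus noise
print(f"theta power: {theta_power(segment, fs):.3f}")
```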
Subject(s)
Deep Brain Stimulation, Parkinson Disease, Subthalamic Nucleus, Humans, Electroencephalography, Eye-Tracking Technology, Parkinson Disease/therapy, Subthalamic Nucleus/physiology
ABSTRACT
Visual landmarks influence spatial cognition and behavior, but their influence on visual codes for action is poorly understood. Here, we test the influence of landmarks on the visual response to saccade targets recorded from 312 frontal and 256 supplementary eye field neurons in rhesus macaques. We characterize visual response fields by recording neural responses to various target-landmark combinations and then test them against several candidate spatial models. Overall, frontal/supplementary eye field response fields preferentially code either saccade targets (40%/40%) or landmarks (30%/4.5%) in gaze fixation-centered coordinates, but most cells show multiplexed target-landmark coding within intermediate reference frames (between fixation-centered and landmark-centered). Further, these coding schemes interact: neurons with near-equal target and landmark coding show the biggest shift from fixation-centered toward landmark-centered target coding. These data show that landmark information is preserved and influences target coding in prefrontal visual responses, likely to stabilize movement goals in the presence of noisy egocentric signals.