Results 1 - 20 of 64
1.
Cell Rep ; 43(3): 113884, 2024 Mar 26.
Article in English | MEDLINE | ID: mdl-38458194

ABSTRACT

Primate hands house an array of mechanoreceptors and proprioceptors, which supply the tactile and kinematic information crucial for daily motor action. While the regulation of these somatosensory signals is essential for hand movements, the specific location and mechanism within the central nervous system (CNS) remain unclear. Our study demonstrates attenuation of somatosensory signals in the cuneate nucleus during voluntary movement, indicating significant modulation at this initial relay station in the CNS. The attenuation is comparable to that in the cerebral cortex but more pronounced than in the spinal cord, pointing to the cuneate nucleus's role in modulating somatosensory perception during movement. Moreover, our findings suggest that the descending motor tract may regulate somatosensory transmission in the cuneate nucleus, enhancing relevant signals and suppressing unnecessary ones for the regulation of movement. This recurrent somatosensory modulation between cortical and subcortical areas could be a basic mechanism for shaping somatosensory signals to achieve active perception.


Subject(s)
Hand, Medulla Oblongata, Animals, Medulla Oblongata/physiology, Spinal Cord/physiology, Touch, Primates, Somatosensory Cortex/physiology, Movement/physiology
2.
Front Hum Neurosci ; 18: 1336629, 2024.
Article in English | MEDLINE | ID: mdl-38419960

ABSTRACT

Various functional modulations of the stretch reflex help to stabilize actions, but the computational mechanism behind its context-dependent tuning remains unclear. While many studies have demonstrated that motor contexts associated with the task goal cause functional modulation of the stretch reflex of the upper limbs, it is not well understood how visual contexts independent of the task requirements affect the stretch reflex. To explore this issue, we conducted two experiments testing 20 healthy human participants (age range 20-45 years, mean 31.3 ± 9.0), in which visual contexts were manipulated in a visually guided reaching task. During wrist flexion movements toward a visual target, a mechanical load was applied to the wrist joint to evoke the stretch reflex of a wrist flexor muscle (flexor carpi radialis). The first experiment (n = 10) examined the effect of altering the visuomotor transformation on the stretch reflex, which was evaluated with surface electromyography. We found that the amplitude of the stretch reflex decreased (p = 0.024) when a rotational transformation of 90° was introduced between the hand movement and the visual cursor, whereas the amplitude did not change significantly (p = 0.26) when the rotational transformation was accompanied by a head rotation so that the configuration of the visual feedback was maintained in visual coordinates. These results suggest that the stretch reflex was regulated depending on whether the visuomotor mapping had already been acquired. In the second experiment (n = 10), we examined how uncertainty about the visual target or the hand cursor affects the stretch reflex by removing these visual stimuli. We found that the reflex amplitude was reduced by the disappearance of the hand cursor (p = 0.039) but was not affected by removal of the visual target (p = 0.27), suggesting that the visual states of the body and the target contribute differently to reflex tuning. These findings support the idea that visual updating of the body state is crucial for the regulation of quick motor control driven by proprioceptive signals.

3.
Proc Biol Sci ; 291(2015): 20231753, 2024 Jan 31.
Article in English | MEDLINE | ID: mdl-38228504

ABSTRACT

Bodily self-awareness relies on a constant integration of visual, tactile, proprioceptive, and motor signals. In the 'rubber hand illusion' (RHI), conflicting visuo-tactile stimuli lead to changes in self-awareness. It remains unclear whether other, somatic signals could compensate for the alterations in self-awareness caused by visual information about the body. Here, we used the RHI in combination with robot-mediated self-touch to systematically investigate the role of tactile, proprioceptive, and motor signals in maintaining and restoring bodily self-awareness. Participants moved the handle of a leader robot with their right hand and simultaneously received corresponding tactile feedback on their left hand from a follower robot. This self-touch stimulation was performed either before or after the induction of a classical RHI. Across three experiments, active self-touch delivered after, but not before, the RHI significantly reduced the proprioceptive drift caused by the RHI, supporting a restorative role of active self-touch in bodily self-awareness. The effect was not present during involuntary self-touch. Unimodal control conditions confirmed that both the tactile and motor components of self-touch were necessary to restore bodily self-awareness. We hypothesize that active self-touch transiently boosts the precision of the proprioceptive representation of the touched body part, thus counteracting the visual capture effects that underlie the RHI.
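
The precision-boosting interpretation in the final sentence can be pictured with a standard reliability-weighted cue-combination sketch (an illustration only, not the authors' model; the positions and variances below are invented for the example): raising proprioceptive precision pulls the fused hand-position estimate back toward the true hand, i.e., the drift shrinks.

```python
def fused_estimate(x_prop, var_prop, x_vis, var_vis):
    # Minimum-variance (reliability-weighted) combination of two Gaussian cues.
    w_prop = (1.0 / var_prop) / (1.0 / var_prop + 1.0 / var_vis)
    return w_prop * x_prop + (1.0 - w_prop) * x_vis

true_hand = 0.0   # actual hand position along the drift axis (cm, assumed)
seen_hand = 15.0  # position signalled by vision of the rubber hand (cm, assumed)

drift_before = fused_estimate(true_hand, var_prop=9.0, x_vis=seen_hand, var_vis=4.0)
drift_after = fused_estimate(true_hand, var_prop=1.0, x_vis=seen_hand, var_vis=4.0)
print(drift_before, drift_after)  # the estimate moves closer to the true hand
```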


Subject(s)
Illusions, Touch Perception, Humans, Touch/physiology, Illusions/physiology, Visual Perception/physiology, Touch Perception/physiology, Hand/physiology, Proprioception/physiology, Body Image
4.
Neural Netw ; 162: 516-530, 2023 May.
Article in English | MEDLINE | ID: mdl-36990001

ABSTRACT

Visual motion analysis is crucial for humans to detect external moving objects and self-motion, both of which inform the planning and execution of actions during interactions with the environment. Here we show that a convolutional neural network trained to decode self-motion from image motion during natural human movements exhibits specificities similar to those of the reflexive ocular and manual responses induced by large-field visual motion, in terms of stimulus spatiotemporal frequency tuning. The spatiotemporal frequency tuning of the decoder peaked at high temporal and low spatial frequencies, as observed in the reflexive ocular and manual responses, but differed significantly from the frequency power of the visual image itself and from the density distribution of self-motion. Further, artificial manipulations of the training data sets predicted marked changes in the specificity of the spatiotemporal tuning. Interestingly, although the spatiotemporal frequency tunings for full-field visual stimuli were similar for the vertical-axis rotational direction and the transversal direction, the tunings for center-masked stimuli differed between those directions, and this difference in specificity is qualitatively similar to the discrepancy between the ocular and manual responses, respectively. In addition, representational analysis demonstrated that head-axis rotation was decoded by relatively simple spatial accumulation over the visual field, whereas transversal motion was decoded by more complex spatial interactions of visual information. These synthetic-model examinations support the idea that the visual motion analyses eliciting reflexive motor responses, which are critical for interacting with the external world, are acquired for decoding self-motion.
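
How such spatiotemporal frequency tuning can be measured is sketched below: probe a trained decoder with drifting sinusoidal gratings over a grid of spatial and temporal frequencies and record its response amplitude. This is a rough illustration, not the authors' pipeline; dummy_decoder is a hypothetical stand-in for the trained convolutional network.

```python
import numpy as np

def drifting_grating(sf_cpd, tf_hz, size=64, n_frames=16, fov_deg=40.0, fps=60.0):
    # Stack of frames containing a sinusoidal grating drifting horizontally.
    deg = np.linspace(-fov_deg / 2, fov_deg / 2, size)
    x, _ = np.meshgrid(deg, deg)
    t = np.arange(n_frames) / fps
    return np.stack([np.sin(2 * np.pi * (sf_cpd * x - tf_hz * ti)) for ti in t])

def dummy_decoder(frames):
    # Placeholder for the trained network: responds to frame-to-frame change.
    return np.mean(np.diff(frames, axis=0) ** 2)

def tuning_map(decoder, sfs, tfs):
    # Decoder response for every (spatial frequency, temporal frequency) pair.
    return np.array([[decoder(drifting_grating(sf, tf)) for tf in tfs] for sf in sfs])

sfs = np.logspace(-2, 0.5, 6)    # spatial frequencies (cycles/deg)
tfs = np.logspace(-0.5, 1.5, 6)  # temporal frequencies (Hz)
resp = tuning_map(dummy_decoder, sfs, tfs)
# A peak in the low-spatial / high-temporal corner of `resp` would correspond
# to the tuning profile reported in the abstract.
```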


Subject(s)
Motion Perception, Humans, Motion Perception/physiology, Movement/physiology, Rotation, Photic Stimulation/methods
5.
iScience ; 26(1): 105751, 2023 Jan 20.
Article in English | MEDLINE | ID: mdl-36590158

ABSTRACT

Hierarchical schemes of brain information processing have frequently assumed that flexible but slow voluntary action modulates a direct sensorimotor process that can quickly generate a reaction during dynamical interaction. Here we show that the quick visuomotor process for manual movement is modulated by contexts of postural and visual instability, which are related to, but remote from and prior to, the manual movement itself. A preceding unstable postural context significantly enhanced the reflexive manual response induced by large-field visual motion during hand reaching, whereas the response was clearly weakened by imposing a preceding random-visual-motion context. These modulations are successfully explained by a Bayesian optimal formulation in which the manual response elicited by visual motion is ascribed to a compensatory response to the estimated self-motion, which is affected by the preceding contextual situations. Our findings suggest an implicit and functional mechanism that links the variability and uncertainty of remote states to the quick sensorimotor transformation.
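
The Bayesian account can be illustrated with a one-dimensional Gaussian sketch (a reconstruction for illustration, not the paper's actual formulation; all numbers are assumed): if the manual response is proportional to the posterior estimate of self-motion, a broader prior on self-motion (unstable posture) amplifies the response, while a less reliable visual signal (a preceding random-motion context) attenuates it.

```python
def estimated_self_motion(visual_motion, var_visual, var_prior):
    # Posterior mean of self-motion for a zero-mean Gaussian prior and a
    # Gaussian likelihood centred on the observed large-field visual motion.
    gain = var_prior / (var_prior + var_visual)
    return gain * visual_motion

v = 10.0  # observed full-field visual motion (deg/s, assumed)
baseline = estimated_self_motion(v, var_visual=1.0, var_prior=1.0)               # gain 0.50
unstable_posture = estimated_self_motion(v, var_visual=1.0, var_prior=4.0)       # gain 0.80, larger response
noisy_visual_context = estimated_self_motion(v, var_visual=4.0, var_prior=1.0)   # gain 0.20, weaker response
print(baseline, unstable_posture, noisy_visual_context)
```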

6.
Proc Biol Sci ; 289(1988): 20221977, 2022 12 14.
Article in English | MEDLINE | ID: mdl-36475437

ABSTRACT

During the haptic exploration of a planar surface, slight resistances against the hand's movement are illusorily perceived as asperities (bumps) in the surface. If the surface being touched is one's own skin, an actual bump would also produce increased tactile pressure from the moving finger onto the skin. We investigated how kinaesthetic and tactile signals combine to produce haptic perceptions during self-touch. Participants performed two successive movements with the right hand. A haptic force-control robot applied resistances to both movements, and participants judged which movement was felt to contain the larger bump. An additional robot delivered simultaneous but task-irrelevant tactile stroking to the left forearm. These strokes contained either increased or decreased tactile pressure synchronized with the resistance-induced illusory bump encountered by the right hand. We found that the size of bumps perceived by the right hand was enhanced by an increase in left tactile pressure, but also by a decrease. Tactile event detection was thus transferred interhemispherically, but the sign of the tactile information was not respected. Randomizing (rather than blocking) the presentation order of left tactile stimuli abolished these interhemispheric enhancement effects. Thus, interhemispheric transfer during bimanual self-touch requires a stable model of temporally synchronized events, but does not require geometric consistency between hemispheric information, nor between tactile and kinaesthetic representations of a single common object.


Subject(s)
Communication, Self Concept, Humans
7.
Sci Rep ; 12(1): 16708, 2022 10 06.
Article in English | MEDLINE | ID: mdl-36202958

ABSTRACT

Sensory prediction error is vital for discriminating whether sensory inputs are caused externally or are the consequence of self-action, thereby contributing to a stable perception of the external world and to building the sense of agency. However, it remains unexplored whether the prediction error of self-action is also used to estimate the internal condition of the body. To address this point, we examined whether prediction error affects the perceived intensity of muscle fatigue. Participants evaluated fatigue while maintaining repetitive finger movements. To provide prediction error, we inserted a temporal delay into the online visual feedback of their movements. The results show that the subjective rating of muscle fatigue significantly increased under delayed visual feedback, suggesting that prediction error enhances the perception of muscle fatigue. Furthermore, we introduced visual feedback that preceded the actual finger movements to test whether the temporal direction of the mismatch is crucial in estimating muscle fatigue. We found that perceived fatigue was significantly weaker with preceding visual feedback than with normal feedback, showing that the perception of muscle fatigue is affected by the signed prediction error. Our findings support the idea that the brain flexibly attributes prediction errors either to a self-origin, while maintaining the sense of agency, or to an external origin, depending on the context and the characteristics of the error.


Subject(s)
Muscle Fatigue, Psychomotor Performance, Sensory Feedback/physiology, Humans, Movement, Psychomotor Performance/physiology, Social Perception, Visual Perception/physiology
8.
iScience ; 25(9): 105018, 2022 Sep 16.
Article in English | MEDLINE | ID: mdl-36105590

ABSTRACT

Directional tactile pulling sensations are integral to everyday life, but their neural mechanisms remain unknown. Prior accounts hold that primary somatosensory cortex (SI) activity is sufficient to generate pulling sensations, with alternative proposals suggesting that amodal frontal or parietal regions may be critical. We combined high-density EEG with asymmetric vibration, which creates an illusory pulling sensation, thereby unconfounding pulling sensations from unrelated sensorimotor processes. Oddball stimuli that created pulls in the direction opposite to the common stimuli were compared with the same oddballs presented after neutral common stimuli (symmetric vibration) and with neutral oddballs. We found evidence against the sensory-frontal N140 and in favor of the midline P200 tracking the emergence of pulling sensations, specifically contralateral parietal lobe activity at 264-320 ms, centered on the intraparietal sulcus. This suggests that SI is not sufficient to generate pulling sensations, which instead depend on the parietal association cortex, and may reflect the extraction of orientation information and related spatial processing.

9.
J Neurophysiol ; 128(2): 418-433, 2022 08 01.
Article in English | MEDLINE | ID: mdl-35822710

ABSTRACT

Interactions with objects involve simultaneous contact with multiple, not necessarily adjacent, skin regions. Although advances have been made in understanding the capacity to selectively attend to a single tactile element among distracting stimulations, here, we examine how multiple stimulus elements are explicitly integrated into an overall tactile percept. Across four experiments, participants averaged the direction of two simultaneous tactile motion trajectories of varying discrepancy delivered to different fingerpads. Averaging performance differed between within- and between-hands conditions in terms of sensitivity and precision but was unaffected by somatotopic proximity between stimulated fingers. First, precision was greater in between-hand compared with within-hand conditions, demonstrating a bimanual perceptual advantage in multi-touch integration. Second, sensitivity to the average direction was influenced by the discrepancy between individual motion signals, but only for within-hand conditions. Overall, our experiments identify key factors that influence perception of simultaneous tactile events. In particular, we show that multi-touch integration is constrained by hand-specific rather than digit-specific mechanisms.

NEW & NOTEWORTHY Object manipulation involves encoding spatially and temporally extended tactile signals, yet most studies emphasize minimal units of tactile perception (e.g., selectivity). Instead, we asked participants to average two tactile motion trajectories delivered simultaneously to two different fingerpads. Our results show strong integration between multiple tactile inputs, but subject to limitations for inputs delivered within a hand. As such, the present study establishes a paradigm for studying unified experience of touch despite distinct stimulus elements.
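
For reference, the "correct" average of two motion directions in a task like this is usually defined as the circular (vector) mean; the snippet below is a generic sketch of that definition, not code from the study.

```python
import numpy as np

def circular_average(theta1, theta2, w1=0.5, w2=0.5):
    # Weighted circular mean of two motion directions given in radians.
    return np.angle(w1 * np.exp(1j * theta1) + w2 * np.exp(1j * theta2))

# Example: trajectories at 10 deg and 70 deg have an average direction of 40 deg.
print(np.rad2deg(circular_average(np.deg2rad(10), np.deg2rad(70))))
```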


Subject(s)
Motion Perception, Touch Perception, Fingers, Hand, Humans, Motion (Physics), Touch
10.
Curr Biol ; 32(12): 2747-2753.e6, 2022 06 20.
Article in English | MEDLINE | ID: mdl-35580606

ABSTRACT

Numerous studies have proposed that our adaptive motor behaviors depend on learning a map between sensory information and limb movement [1-3], called an "internal model." From this perspective, how the brain represents internal models is a critical issue in motor learning, especially regarding their association with spatial frames processed in motor planning [4,5]. Extensive experimental evidence suggests that during planning stages for visually guided hand reaching, the brain transforms visual target representations in gaze-centered coordinates to motor commands in limb coordinates, via hand-target vectors in workspace coordinates [6-9]. While numerous studies have intensively investigated whether the learning for reaching occurs in workspace or limb coordinates [10-20], the association of the learning with gaze coordinates still remains untested [21]. Given the critical role of gaze-related spatial coding in reaching planning [22-26], the potential role of gaze states for learning is worth examining. Here, we show that motor memories for reaching are separately learned according to target location in gaze coordinates. Specifically, two opposing visuomotor rotations, which normally interfere with each other, can be simultaneously learned when each is associated with reaching to a foveal target or a peripheral one. We also show that this gaze-dependent learning occurs in force-field adaptation. Furthermore, generalization of gaze-coupled reach adaptation is limited across central, right, and left visual fields. These results suggest that gaze states are available in the formation and recall of multiple internal models for reaching. Our findings provide novel evidence that a gaze-dependent spatial representation can provide a spatial coordinate framework for context-dependent motor learning.
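
One way to picture how opposing rotations can coexist when tagged by gaze context is a context-gated state-space learner in which each gaze context (foveal vs. peripheral target) owns a separate adaptation state and only the active state is updated. The sketch below is a generic illustration of that idea with assumed retention and learning rates; it is not the authors' model.

```python
import numpy as np

def simulate(contexts, perturbations, retention=0.99, learning_rate=0.2):
    # Context-gated state-space adaptation: contexts[t] selects which memory is
    # expressed and updated on trial t; perturbations[t] is the imposed rotation (deg).
    states = np.zeros(2)
    for c, p in zip(contexts, perturbations):
        error = p - states[c]
        states[c] = retention * states[c] + learning_rate * error
    return states

# Alternate foveal (0) and peripheral (1) targets, each paired with an opposing rotation.
contexts = np.tile([0, 1], 100)
perturbations = np.where(contexts == 0, 30.0, -30.0)
print(simulate(contexts, perturbations))  # the two states approach +30 and -30 deg
```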


Subject(s)
Hand, Psychomotor Performance, Generalization (Psychology), Learning, Movement
11.
Curr Biol ; 32(6): 1301-1309.e3, 2022 03 28.
Article in English | MEDLINE | ID: mdl-35167805

ABSTRACT

During active movement, there is normally a tight relation between motor command and sensory representation about the resulting spatial displacement of the body. Indeed, some theories of space perception emphasize the topographic layout of sensory receptor surfaces, while others emphasize implicit spatial information provided by the intensity of motor command signals. To identify which has the primary role in spatial perception, we developed experiments based on everyday self-touch, in which the right hand strokes the left arm. We used a robot-mediated form of self-touch to decouple the spatial extent of active or passive right hand movements from their tactile consequences. Participants made active movements of the right hand between unpredictable, haptically defined start and stop positions, or the hand was passively moved between the same positions. These movements caused a stroking tactile motion by a brush along the left forearm, with minimal delay, but with an unpredictable spatial gain factor. Participants judged the spatial extent of either the right hand's movement, or of the resulting tactile stimulation to their left forearm. Across five experiments, we found that movement extent strongly interfered with tactile extent perception, and vice versa. Crucially, interference in both directions was stronger during active than passive movements. Thus, voluntary motor commands produced stronger integration of multiple sensorimotor signals underpinning the perception of personal space. Our results prompt a reappraisal of classical theories that reduce space perception to motor command information.


Subject(s)
Touch Perception, Touch, Hand/physiology, Humans, Movement/physiology, Self Concept, Space Perception, Touch/physiology, Touch Perception/physiology
12.
Cogn Neurosci ; 13(1): 47-59, 2022 01.
Article in English | MEDLINE | ID: mdl-33307992

ABSTRACT

Many perceptual studies focus on the brain's capacity to discriminate between stimuli. However, our normal experience of the world also involves integrating multiple stimuli into a single perceptual event. Neural mechanisms such as lateral inhibition are believed to enhance local differences between sensory inputs from nearby regions of the receptor surface. However, this mechanism would seem dysfunctional when sensory inputs need to be combined rather than contrasted. Here, we investigated whether the brain can strategically regulate the strength of suppressive interactions that underlie lateral inhibition between finger representations in human somatosensory processing. To do this, we compared sensory processing between conditions that required either comparing or combining information. We delivered two simultaneous tactile motion trajectories to index and middle fingertips of the right hand. Participants had to either compare the directions of the two stimuli, or to combine them to form their average direction. To reveal preparatory tuning of somatosensory cortex, we used an established event-related potential design to measure the interaction between cortical representations evoked by digital nerve shocks immediately before each tactile stimulus. Consistent with previous studies, we found a clear suppression between cortical activations when participants were instructed to compare the tactile motion directions. Importantly, this suppression was significantly reduced when participants had to combine the same stimuli. These findings suggest that the brain can strategically switch between a comparative and a combinative mode of somatosensory processing, according to the perceptual goal, by preparatorily adjusting the strength of a process akin to lateral inhibition.
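
The preparatory adjustment of suppression described here can be caricatured with a two-unit interaction in which the strength of mutual inhibition between finger representations is set by the task mode. This is a toy sketch with assumed weights, not a model from the paper.

```python
import numpy as np

def finger_responses(drive, inhibition):
    # Steady state of two mutually inhibiting finger representations:
    # solves r = drive - inhibition * W @ r, with W coupling each unit to the other.
    W = np.array([[0.0, 1.0], [1.0, 0.0]])
    return np.linalg.solve(np.eye(2) + inhibition * W, drive)

drive = np.array([1.0, 0.8])                              # inputs to index and middle finger
compare_mode = finger_responses(drive, inhibition=0.6)    # strong mutual suppression
combine_mode = finger_responses(drive, inhibition=0.1)    # suppression relaxed for averaging
print(compare_mode, combine_mode)
```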


Subject(s)
Somatosensory Evoked Potentials, Touch, Somatosensory Evoked Potentials/physiology, Fingers/physiology, Humans, Somatosensory Cortex/physiology, Touch/physiology
13.
J Neurophysiol ; 127(1): 16-26, 2022 01 01.
Article in English | MEDLINE | ID: mdl-34879215

ABSTRACT

Humans continuously adapt their movement to a novel environment by recalibrating their sensorimotor system. Recent evidence, however, shows that explicit planning to compensate for external changes, i.e., a cognitive strategy, can also aid performance. If such a strategy is planned in external space, it should improve performance in an effector-independent manner. We tested this hypothesis by examining whether promoting a cognitive strategy during a visual-force adaptation task performed with one hand can facilitate learning for the opposite hand. Participants rapidly adjusted the height of a visual bar on a screen to a target level by isometrically exerting force on a handle with their right hand. The visuomotor gain increased during the task, and participants learned the increased gain. Visual feedback was continuously provided for one group, whereas for another group only the endpoint of the force trajectory was presented. The latter has been reported to promote cognitive strategy use. We found that endpoint feedback produced stronger intermanual transfer of learning and slower response times than continuous feedback. In a separate experiment, we found evidence that aftereffects are reduced when only endpoint feedback is provided, a finding that has been consistently observed when cognitive strategies are used. The results suggest that intermanual transfer can be facilitated by a cognitive strategy. This indicates that the behavioral observation of intermanual transfer can be achieved either by forming an effector-independent motor representation or by sharing an effector-independent cognitive strategy between the hands.

NEW & NOTEWORTHY The causes and consequences of cognitive strategy use are poorly understood. We tested whether a visuomotor task learned in a manner that may promote cognitive strategy use causes greater generalization across effectors. Visual feedback was manipulated to promote cognitive strategy use. Learning consistent with cognitive strategy use for one hand transferred to the unlearned hand. Our results suggest that intermanual transfer can result from a common cognitive strategy used to control both hands.


Subject(s)
Physiological Adaptation/physiology, Sensory Feedback/physiology, Hand/physiology, Psychomotor Performance/physiology, Thinking/physiology, Transfer (Psychology)/physiology, Adult, Female, Humans, Male, Young Adult
14.
iScience ; 24(12): 103390, 2021 Dec 17.
Article in English | MEDLINE | ID: mdl-34841229

ABSTRACT

Can we recover self-motion from vision? This basic issue remains unsolved: although the human visual system is known to estimate the direction of self-motion from optic flow, it is unclear whether it also estimates the speed. Importantly, the latter requires disentangling self-motion speed from the depths of objects in the scene, because retinal velocity depends on both. Here we show that our vision-based automatic regulator of walking speed, which estimates the speed and maintains it within its preferred range by adjusting stride length, is robust to changes in scene depth. The robustness was not explained by temporal-frequency-based speed coding previously suggested to underlie depth-invariant object-motion perception. It broke down, however, not only when the interocular distance was virtually manipulated but also when monocular depth cues were deceptive. These observations suggest that our visuomotor system embeds a speedometer that calculates self-motion speed from vision by integrating monocular/binocular depth and motion cues.
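
The ambiguity the abstract alludes to follows from the geometry of optic flow: for a laterally viewed point, retinal angular speed is roughly walking speed divided by viewing distance, so recovering walking speed requires a depth estimate, for example from binocular disparity scaled by the interocular distance. The sketch below illustrates that reasoning under these simplifying small-angle assumptions; it is not the authors' analysis, and the numbers are invented.

```python
def retinal_speed(walk_speed, depth):
    # Angular speed (rad/s) of a laterally viewed point, small-angle approximation.
    return walk_speed / depth

def estimated_walk_speed(retinal_speed_obs, disparity, interocular=0.063):
    # Depth from binocular disparity (depth ~ interocular / disparity), then
    # self-speed = retinal angular speed * estimated depth.
    return retinal_speed_obs * (interocular / disparity)

walk_speed, depth = 1.4, 3.0               # m/s and m (assumed)
omega = retinal_speed(walk_speed, depth)   # what the retina measures
disparity_normal = 0.063 / depth           # disparity actually produced by that depth
disparity_doubled_iod = 0.126 / depth      # disparity under a virtually doubled interocular distance
print(estimated_walk_speed(omega, disparity_normal))       # ~1.4 m/s: speed recovered
print(estimated_walk_speed(omega, disparity_doubled_iod))  # ~0.7 m/s: estimate breaks down
```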

15.
Neural Netw ; 144: 573-590, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34634605

ABSTRACT

Understanding information processing in the brain, and creating general-purpose artificial intelligence, are long-standing aspirations of scientists and engineers worldwide. The distinctive features of human intelligence are high-level cognition and control across various interactions with the world, including the self, which are not defined in advance and vary over time. The challenge of building human-like intelligent machines, as well as progress in brain science, behavioural analyses, robotics, and their associated theoretical formalisations, speaks to the importance of world-model learning and inference. In this article, after briefly surveying the history and challenges of internal model learning and probabilistic learning, we introduce the free energy principle, which provides a useful framework within which to consider neuronal computation and probabilistic world models. Next, we showcase examples of human behaviour and cognition explained under that principle. We then describe symbol emergence in the context of probabilistic modelling, as a topic at the frontiers of cognitive robotics. Lastly, we review recent progress in creating human-like intelligence with novel probabilistic programming languages. The striking consensus that emerges from these studies is that probabilistic descriptions of learning and inference are powerful and effective ways to create human-like artificial intelligent machines and to understand intelligence in the context of how humans interact with their world.
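
As a concrete toy instance of the free-energy framing discussed in this review (a minimal sketch of one-dimensional Gaussian predictive coding, not an example taken from the article): perception can be cast as gradient descent on a free energy that penalizes both the sensory prediction error and the deviation of the belief from its prior.

```python
def infer(obs, prior_mu, var_obs=1.0, var_prior=1.0, lr=0.1, steps=200):
    # Gradient descent on F(mu) = (obs - mu)**2 / (2 * var_obs)
    #                           + (mu - prior_mu)**2 / (2 * var_prior)
    # for a 1-D Gaussian generative model with identity mapping g(s) = s.
    mu = prior_mu
    for _ in range(steps):
        grad = -(obs - mu) / var_obs + (mu - prior_mu) / var_prior
        mu -= lr * grad
    return mu

# The belief settles between the prior (0.0) and the observation (2.0),
# weighted by their precisions; with equal precisions it converges to ~1.0.
print(infer(obs=2.0, prior_mu=0.0))
```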


Subject(s)
Artificial Intelligence, Statistical Models, Brain, Cognition, Humans, Intelligence
16.
Neural Netw ; 144: 507-521, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34601363

ABSTRACT

Our brain can be recognized as a network of largely hierarchically organized neural circuits that operate to control specific functions, but when acting in parallel, enable the performance of complex and simultaneous behaviors. Indeed, many of our daily actions require concurrent information processing in sensorimotor, associative, and limbic circuits that are dynamically and hierarchically modulated by sensory information and previous learning. This organization of information processing in biological organisms has served as a major inspiration for artificial intelligence and has helped to create in silico systems capable of matching or even outperforming humans in several specific tasks, including visual recognition and strategy-based games. However, the development of human-like robots that are able to move as quickly as humans and respond flexibly in various situations remains a major challenge and indicates an area where further use of parallel and hierarchical architectures may hold promise. In this article we review several important neural and behavioral mechanisms organizing hierarchical and predictive processing for the acquisition and realization of flexible behavioral control. Then, inspired by the organizational features of brain circuits, we introduce a multi-timescale parallel and hierarchical learning framework for the realization of versatile and agile movement in humanoid robots.


Subject(s)
Artificial Intelligence, Robotics, Behavior Control, Computer Simulation, Humans, Learning
17.
J Neurophysiol ; 126(3): 816-826, 2021 09 01.
Article in English | MEDLINE | ID: mdl-34320845

ABSTRACT

When reaching for an object with the hand, the gaze is usually directed at the target. In a laboratory setting, fixation is strongly maintained at the reach target until the reaching is completed, a phenomenon known as "gaze anchoring." While conventional accounts of such tight eye-hand coordination have often emphasized the internal synergetic linkage between the two motor systems, more recent optimal control theories regard motor coordination as the adaptive solution to task requirements. Here, we investigated to what degree gaze control during reaching is modulated by task demands. We adopted a gaze-anchoring paradigm in which participants had to reach for a target location. During the reach, they additionally had to make a saccadic eye movement to a salient visual cue presented at locations other than the target. We manipulated the task demands by independently changing the reward contingencies for saccade reaction time (RT) and reaching accuracy. On average, both saccade RTs and reach error varied systematically with reward condition, with reach accuracy improving when the saccade was delayed. The distribution of saccade RTs showed two types of eye movements: fast saccades with short RTs, and voluntary saccades with longer RTs. Increased reward for high reach accuracy reduced the probability of fast saccades but left their latency unchanged. The results suggest that gaze anchoring acts through a suppression of fast saccades, a mechanism that can be adaptively adjusted to the current task demands.

NEW & NOTEWORTHY During visually guided reaching, our eyes usually fixate the target and saccades elsewhere are delayed ("gaze anchoring"). Here we show that the degree of gaze anchoring is flexibly modulated by the reward contingencies of saccade latency and reach accuracy. Reach error became larger when saccades occurred earlier. These results suggest that early saccades are costly for reaching and that the brain modulates inhibitory online coordination from the hand system to the eye system depending on task requirements.


Subject(s)
Hand/physiology, Movement, Psychomotor Performance, Saccades, Adult, Female, Humans, Male, Reward
18.
Exp Brain Res ; 239(4): 1047-1059, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33528597

ABSTRACT

Previous studies (Haswell et al. in Nat Neurosci 12:970-972, 2009; Marko et al. in Brain J Neurol 138:784-797, 2015) reported that people with autism rely less on vision for learning to reach in a force field. This suggested the possibility that they have difficulty extracting force information from visual motion signals, a process called inverse dynamics computation. Our recent study (Takamuku et al. in J Int Soc Autism Res 11:1062-1075, 2018) examined the ability to perform this inverse computation with two perceptual tasks and found similar performance in typical and autistic adults. However, this tested the computation only in the context of sensory perception, leaving open the possibility that the suspected deficit is specific to the motor domain. Here, to address this concern, we tested the use of inverse dynamics computation in the context of motor control by measuring changes in grip timing caused by seeing, or not seeing, a controlled object. The motion of the object was informative about its inertial force, and typical participants improved their grip timing based on this visual feedback. Our interest was in whether the autistic participants showed the same improvement. While some autistic participants showed atypical hand slowing when seeing the controlled object, we found no evidence of abnormalities in the inverse computation in our grip-timing task or in a replication of the perceptual task. This suggests that the ability to perform inverse dynamics computation is preserved not only for sensory perception but also for motor control in adults with autism.


Subject(s)
Autistic Disorder, Adult, Sensory Feedback, Hand, Hand Strength, Humans, Motion (Physics), Psychomotor Performance
19.
Neuropsychologia ; 151: 107729, 2021 01 22.
Article in English | MEDLINE | ID: mdl-33346045

ABSTRACT

Perception of space has puzzled scientists since antiquity, and is among the foundational questions of scientific psychology. Classical "local sign" theories assert that perception of spatial extent ultimately derives from efferent signals specifying the intensity of motor commands. Everyday cases of self-touch, such as stroking the left forearm with the right index fingertip, provide an important platform for studying spatial perception, because of the tight correlation between motor and tactile extents. Nevertheless, if the motor and sensory information in self-touch were artificially decoupled, these classical theories would clearly predict that motor signals, especially if self-generated rather than passive, should influence spatial perceptual judgements, but not vice versa. We tested this hypothesis by quantifying the contribution of tactile, kinaesthetic, and motor information to judgements of spatial extent. In a self-touch paradigm involving two coupled robots in master-slave configuration, voluntary movements of the right hand produced simultaneous tactile stroking on the left forearm. Crucially, the coupling between robots was manipulated so that tactile stimulation could be shorter, equal, or longer in extent than the movement that caused it. Participants judged either the extent of the movement or the extent of the tactile stroke. By controlling sensorimotor gains in this way, we quantified how motor signals influence tactile spatial perception, and vice versa. Perception of tactile extent was strongly biased by the amplitude of the movement performed. Importantly, touch also affected the perceived extent of movement. Finally, the effect of movement on touch was significantly stronger when movements were actively generated than when the participant's right hand was passively moved by the experimenter. Overall, these results suggest that motor signals indeed dominate the construction of spatial percepts, at least when the normal tight correlation between motor and sensory signals is broken. Importantly, however, this dominance is not total, contrary to what classical theory might suggest.


Subject(s)
Touch Perception, Touch, Hand, Humans, Movement, Space Perception
20.
J Neurosci ; 40(31): 6035-6048, 2020 07 29.
Article in English | MEDLINE | ID: mdl-32611708

ABSTRACT

Control of the body requires inhibiting complex actions that involve both contracting and relaxing muscles. However, little is known about how voluntary commands to relax a muscle are cancelled. Action inhibition causes both suppression of muscle activity and transient excitation of antagonist muscles, the latter being termed active braking. We hypothesized that active braking is present when stopping muscle relaxations. Stop signal experiments were used to compare the mechanisms of active braking for muscle relaxations and contractions in male and female human participants. In experiments 1 and 2, go signals were presented that required participants to contract or relax their biceps or triceps muscle. Infrequent Stop signals occurred after fixed delays (0-500 ms), requiring participants to cancel the go commands. In experiment 3, participants increased (contract) or decreased (relax) an existing isometric finger abduction depending on the go signal, and cancelled these force changes whenever Stop signals occurred (dynamically adjusted delay). We found that muscle relaxations were stopped rapidly, met predictions of existing race models, and had Stop signal reaction times that correlated with those observed during the stopping of muscle contractions, suggesting shared control mechanisms. However, stopped relaxations were preceded by transient increases in electromyography (EMG), while stopped contractions were preceded by decreases in EMG, suggesting a later divergence of control. Muscle state-specific active braking occurred simultaneously across muscles, consistent with a central origin. Our results indicate that the later stages of action inhibition involve separate excitatory and inhibitory pathways, which act automatically to cancel complex body movements.

SIGNIFICANCE STATEMENT The mechanisms by which muscle relaxations are cancelled are poorly understood. We showed in three experiments involving multiple effectors that stopping muscle relaxations involves transient bursts of EMG activity, which resemble cocontraction and have onsets that correlate with Stop signal reaction time. Comparison with the stopping of matched muscle contractions showed that active braking was muscle state specific, being positive for relaxations and negative for contractions. The two processes were also observed to co-occur in agonist-antagonist pairs, suggesting separate pathways. The rapid, automatic activation of both pathways may explain how complex actions can be stopped at any stage of their execution.
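
For context, the Stop signal reaction times referred to here are commonly estimated with the integration method of the independent race model: find the point in the go-RT distribution corresponding to the probability of responding despite a stop signal, and subtract the stop-signal delay. The sketch below shows that generic method with simulated data; it is not necessarily the exact analysis used in the study.

```python
import numpy as np

def ssrt_integration(go_rts, ssd, p_respond_given_stop):
    # Integration-method SSRT for a single stop-signal delay (SSD, in ms):
    # the p-th quantile of the go-RT distribution minus the SSD, where p is
    # the probability of failing to stop.
    go_sorted = np.sort(np.asarray(go_rts))
    idx = max(int(np.ceil(p_respond_given_stop * len(go_sorted))) - 1, 0)
    return go_sorted[idx] - ssd

rng = np.random.default_rng(0)
go_rts = rng.normal(450.0, 60.0, size=400)  # simulated go-trial RTs (ms)
print(ssrt_integration(go_rts, ssd=200.0, p_respond_given_stop=0.5))  # ~250 ms
```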


Subject(s)
Muscle Contraction/physiology, Muscle Relaxation/physiology, Skeletal Muscle/physiology, Adult, Electromyography, Female, Fingers/physiology, Hamstring Muscles/physiology, Humans, Isometric Contraction, Male, Movement/physiology, Reaction Time