J Neurophysiol; 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38863427

ABSTRACT

Everyday actions like moving the head, walking around and grasping objects are typically self-controlled. This presents a problem when studying the signals encoding such actions, because active self-movement is difficult to control experimentally. Available techniques demand repeatable trials, yet each action is unique, making it difficult to measure fundamental properties like psychophysical thresholds. We present a novel paradigm that recovers both the precision and the bias of self-movement signals with minimal constraint on the participant. The paradigm links image motion to a previous self-movement and uses two experimental phases to extract the signal encoding that movement. It also accounts for a hidden source of external noise not previously addressed by techniques that link display motion to self-movement in real time (e.g., virtual reality). Using head rotation as an example of self-movement, we show that the precision of the signals encoding head movement depends on whether they are used to judge visual motion or auditory motion. In both cases, perceived motion is slowed during head movement: the 'non-image' signals encoding active head rotation (motor commands, proprioception and vestibular cues) are therefore biased towards lower speeds and/or displacements. In a second experiment, we trained participants to rotate their heads at different rates and found that the imprecision of the head-rotation signal rises in proportion to head speed (Weber's law). We discuss the findings in terms of the different motion cues used by vision and hearing, and their implications for Bayesian models of motion perception.
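The abstract describes the analysis only at a high level. As a hedged illustration of the kind of estimation it implies (not the authors' actual procedure), the Python sketch below fits a cumulative-Gaussian psychometric function to simulated responses: the point of subjective equality (PSE) plays the role of the self-movement signal's bias, and the slope parameter its precision. It then shows the Weber's-law scaling reported in the second experiment, where imprecision grows in proportion to head speed. All gain values, head speeds, trial counts and the Weber fraction are hypothetical.

# Minimal sketch (assumed analysis, not the paper's code): recover bias and
# precision of a self-movement signal from psychophysical judgements.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def psychometric(gain, pse, sigma):
    # Probability of judging the display as matching the head movement,
    # as a function of the display-to-head motion gain.
    return norm.cdf(gain, loc=pse, scale=sigma)

# Simulate one observer: a true PSE below 1 mimics the reported slowing of
# perceived motion during head movement; sigma sets the signal's imprecision.
true_pse, true_sigma = 0.85, 0.15            # illustrative values
gains = np.linspace(0.4, 1.4, 11)            # hypothetical display gains
n_trials = 40
p = psychometric(gains, true_pse, true_sigma)
k = rng.binomial(n_trials, p)                # simulated responses per gain

# Fit: PSE estimates the bias, sigma the precision of the head-movement signal.
(pse_hat, sigma_hat), _ = curve_fit(psychometric, gains, k / n_trials,
                                    p0=[1.0, 0.2])
print(f"bias (PSE) = {pse_hat:.3f}, precision (sigma) = {sigma_hat:.3f}")

# Weber's-law scaling: sigma(speed) = w * speed, with an assumed Weber fraction.
weber_fraction = 0.15
for speed in (20, 40, 80):                   # deg/s, illustrative head speeds
    print(f"head speed {speed:3d} deg/s -> sigma ~ {weber_fraction * speed:.1f}")

Under these assumptions, repeating the fit at each trained head speed and regressing the fitted sigma against speed would expose the proportional (Weber's-law) relationship the abstract reports.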
