Embodied Cross-Modal Interactions Based on an Altercentric Reference Frame.
Guo, Guanchen; Wang, Nanbo; Sun, Chu; Geng, Haiyan.
Affiliations
  • Guo G; School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China.
  • Wang N; Department of Psychology, School of Health, Fujian Medical University, Fuzhou 350122, China.
  • Sun C; School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China.
  • Geng H; School of Psychological and Cognitive Sciences, Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China.
Brain Sci ; 14(4)2024 Mar 27.
Article in En | MEDLINE | ID: mdl-38671966
ABSTRACT
Accurate comprehension of others' thoughts and intentions is crucial for smooth social interactions, and understanding their perceptual experiences serves as a fundamental basis for this high-level social cognition. However, previous research on perceptual processing from others' perspectives has focused predominantly on the visual modality, leaving multisensory inputs during this process largely unexplored. By incorporating auditory stimuli into visual perspective-taking (VPT) tasks, we designed a novel experimental paradigm in which the spatial correspondence between visual and auditory stimuli was limited to the altercentric rather than the egocentric reference frame. Overall, we found that when individuals engaged in explicit or implicit VPT to process visual stimuli from an avatar's viewpoint, the concomitantly presented auditory stimuli were also processed within this avatar-centered reference frame, revealing altercentric cross-modal interactions.
Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Brain Sci Publication year: 2024 Document type: Article Country of affiliation: China