Results 1 - 3 of 3
1.
Sci Rep ; 14(1): 10011, 2024 May 1.
Article in English | MEDLINE | ID: mdl-38693174

ABSTRACT

Interacting with the environment often requires the integration of visual and haptic information. Notably, perceiving external objects depends on how our brain binds sensory inputs into a unitary experience. The feedback objects provide when we interact with them (through our movements) may therefore influence our perception. In VR, the interaction with an object can be dissociated from the size of the object itself by means of 'colliders' (interactive spaces surrounding the objects). The present study investigates possible after-effects in size discrimination for virtual objects after prolonged exposure to an interaction characterized by visual and haptic incongruencies. Ninety-six participants took part in this virtual reality study. They were distributed into four groups and performed a size discrimination task between two cubes before and after 15 min of a visuomotor task involving interaction with the same virtual cubes. Each group interacted with a different cube in which the visual (normal vs. small collider) and haptic (vibration vs. no vibration) features were manipulated. The quality of interaction (number of touches and trials performed) was used as a dependent variable to assess performance in the visuomotor task. To measure bias in size perception, we compared changes in the point of subjective equality (PSE) before and after the task across the four groups. The results showed that a small visual collider decreased manipulation performance, regardless of the presence or absence of the haptic signal. However, a change in PSE was found only in the group exposed to the small visual collider with haptic feedback, leading to an increased perceived cube size. This after-effect was absent in the visual-only incongruency condition, suggesting that haptic information and multisensory integration played a crucial role in inducing perceptual changes.
The results are discussed in light of recent findings on visual-haptic integration during multisensory information processing in real and virtual environments.
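The PSE comparison described above is typically obtained by fitting a psychometric function to the proportion of "comparison larger" responses across comparison sizes; the size at which this function crosses 50% is the PSE. A minimal sketch with a logistic fit, using entirely hypothetical stimulus sizes and response proportions (the paper's actual data and fitting procedure are not given here):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Logistic psychometric function: P('comparison judged larger')."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Hypothetical data: comparison-cube sizes (cm) and the proportion of
# trials on which each was judged larger than a 5 cm standard cube.
sizes = np.array([4.0, 4.5, 5.0, 5.5, 6.0])
p_larger = np.array([0.05, 0.20, 0.55, 0.85, 0.95])

# The fitted 'pse' parameter is the size perceived as equal to the standard;
# a pre/post shift in this value would indicate a size-perception after-effect.
(pse, slope), _ = curve_fit(logistic, sizes, p_larger, p0=[5.0, 1.0])
print(f"PSE = {pse:.2f} cm")
```

Comparing the fitted PSE before and after the visuomotor task, separately per group, is one standard way to quantify the bias the abstract reports.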


Subject(s)
Virtual Reality , Visual Perception , Humans , Male , Female , Adult , Visual Perception/physiology , Young Adult , Psychomotor Performance/physiology , Touch Perception/physiology , Size Perception/physiology
2.
Front Hum Neurosci ; 18: 1354633, 2024.
Article in English | MEDLINE | ID: mdl-38445099

ABSTRACT

Introduction: Our brain continuously maps our body in space. It has been suggested that at least two main frames of reference are used to process somatosensory stimuli presented on our own body: an anatomical frame of reference (based on the somatotopic representation of the body in the somatosensory cortex) and a spatial frame of reference (in which body parts are mapped in external space). Interestingly, a mismatch between somatotopic and spatial information significantly affects the processing of bodily information, as demonstrated by the "crossed hands" effect. However, it is not clear whether this impairment occurs only when the conflict between these frames of reference arises from a static change in body position (e.g., crossing the hands), or also when new associations between motor and sensory responses are created artificially (e.g., by presenting feedback stimuli on a side of the body that is not involved in the movement). Methods: In the present study, 16 participants performed a temporal order judgment (TOJ) task before and after a congruent or incongruent visual-tactile-motor task in virtual reality. During the VR task, participants had to move a cube using a virtual stick. In the congruent condition, haptic feedback during the interaction with the cube was delivered to the right hand (the one used to control the stick). In the incongruent condition, haptic feedback was delivered to the contralateral hand, simulating a kind of 'active' crossed feedback during the interaction. Using a psychophysical approach, the point of subjective equality (PSE, i.e., the point at which participants were equally likely to report either stimulus as occurring first) and the just noticeable difference (JND, a measure of precision) were calculated for both conditions, before and after the VR task.
Results: After the VR task, compared to baseline, the PSE shifted toward the hand that received the haptic feedback during the interaction (toward the right hand in the congruent condition and toward the left hand in the incongruent condition). Discussion: This study demonstrates that spatial biases in the processing of bodily information can be induced by modulating the sensory-motor relationship between stimuli in virtual environments, while keeping the actual position of the body in space constant.
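In a TOJ task of this kind, PSE and JND are conventionally read off a cumulative Gaussian fitted to the proportion of (say) "right first" responses as a function of stimulus onset asynchrony (SOA): the mean gives the PSE and the spread gives the JND. A sketch under those standard conventions, with hypothetical SOAs and response proportions (not the study's data):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, mu, sigma):
    """P('right stimulus first') as a function of SOA (ms; positive = right led)."""
    return norm.cdf(soa, loc=mu, scale=sigma)

# Hypothetical TOJ data: SOAs in ms and proportion of 'right first' responses.
soa = np.array([-120.0, -60.0, -30.0, 0.0, 30.0, 60.0, 120.0])
p_right = np.array([0.04, 0.18, 0.35, 0.52, 0.70, 0.85, 0.97])

(mu, sigma), _ = curve_fit(cum_gauss, soa, p_right, p0=[0.0, 50.0])
pse = mu               # SOA at which either order is reported equally often
jnd = 0.6745 * sigma   # half the 25%-75% interquantile range, a common JND convention
```

A pre-to-post shift of `pse` toward one hand, as the Results describe, then quantifies the induced spatial bias, while `jnd` tracks whether temporal precision changed.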

3.
Front Psychol ; 14: 1301981, 2023.
Article in English | MEDLINE | ID: mdl-38274671

ABSTRACT

Finding one's way in unfamiliar environments is an essential ability. When navigating, people are confronted with an enormous amount of information, some of which may be more relevant than the rest. Despite mounting knowledge about the mechanisms underlying orientation skills, and the notable effects of facial emotions on human behavior, little is known about the effects of emotions on spatial navigation. This study therefore aimed to explore how exposure to others' negative emotional facial expressions affects wayfinding performance. Gender differences characterizing both processes were also considered. Fifty-five participants (31 females) entered three realistic virtual reality environments twice: first to encode a route to find an object, and then to recall the learned path to reach the same object again. Between the two explorations of the virtual environment, participants performed a gender categorization task during which they were exposed to sixty faces showing neutral, fearful, or angry expressions. Results showed a significant interaction between emotion, time, and gender. In particular, exposure to fearful faces, but not angry or neutral ones, decreased males' wayfinding performance (as measured by travel times and distance travelled), while females' performance was unaffected. Possible explanations for these gender and emotional differences are discussed.
