Results 1 - 2 of 2
1.
MethodsX; 12: 102662, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38577409

ABSTRACT

This article provides a step-by-step guideline for measuring and analyzing visual attention in 3D virtual reality (VR) environments based on eye-tracking data. We propose a solution to the challenges of obtaining relevant eye-tracking information in a dynamic 3D virtual environment and calculating interpretable indicators of learning and social behavior. With a method called "gaze-ray casting," we simulated 3D-gaze movements to obtain information about the gazed objects. This information was used to create graphical models of visual attention, establishing attention networks. These networks represented participants' gaze transitions between different entities in the VR environment over time. Measures of centrality, distribution, and interconnectedness of the networks were calculated to describe the network structure. The measures, derived from graph theory, allowed for statistical inference testing and the interpretation of participants' visual attention in 3D VR environments. Our method provides useful insights when analyzing students' learning in a VR classroom, as reported in a corresponding evaluation article with N = 274 participants.
• Guidelines on implementing gaze-ray casting in VR using the Unreal Engine and the HTC VIVE Pro Eye.
• Creating gaze-based attention networks and analyzing their network structure.
• Implementation tutorials and the open-source software code are provided via OSF: https://osf.io/pxjrc/?view_only=1b6da45eb93e4f9eb7a138697b941198.
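To illustrate the kind of attention-network analysis this abstract describes, here is a minimal sketch in Python using networkx. The gaze-log format, the entity names, and the use of networkx are assumptions made for illustration; this is not the authors' OSF code, which targets the Unreal Engine pipeline referenced above.

```python
# Hedged sketch (not the authors' OSF implementation): building a gaze-based
# attention network from a time-ordered log of gazed entities and computing
# simple graph-theoretic measures. The gaze_log format is hypothetical.
import networkx as nx

# Hypothetical gaze-ray-casting output: which entity the gaze ray hit per sample.
gaze_log = [
    (0.00, "teacher"), (0.25, "teacher"), (0.50, "blackboard"),
    (0.75, "peer_03"), (1.00, "peer_03"), (1.25, "teacher"),
]

# Directed, weighted attention network: nodes are gazed entities, edge weights
# count transitions of visual attention between consecutively gazed entities.
G = nx.DiGraph()
previous = None
for _, entity in gaze_log:
    if previous is not None and entity != previous:
        if G.has_edge(previous, entity):
            G[previous][entity]["weight"] += 1
        else:
            G.add_edge(previous, entity, weight=1)
    previous = entity

# Network-structure measures in the spirit of those named in the abstract:
# centrality (importance of an entity) and density (interconnectedness).
centrality = nx.degree_centrality(G)
density = nx.density(G)
print(centrality)
print(round(density, 3))
```

With this representation, entities that repeatedly draw and redirect attention (e.g., the teacher) appear as high-centrality nodes, which is the kind of interpretable indicator the abstract refers to.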

2.
Sci Rep; 13(1): 14672, 2023 Sep 06.
Article in English | MEDLINE | ID: mdl-37673939

ABSTRACT

Higher-achieving peers have repeatedly been found to negatively impact students' evaluations of their own academic abilities (i.e., the Big-Fish-Little-Pond Effect). Building on social comparison theory, this pattern is assumed to result from students comparing themselves to their classmates; however, based on existing research designs, it remains unclear how exactly students make use of social comparison information in the classroom. To determine the extent to which students (N = 353 sixth graders) actively attend and respond to social comparison information in the form of peers' achievement-related behaviour, we used eye-tracking data from an immersive virtual reality (IVR) classroom. IVR classrooms offer unprecedented opportunities for psychological classroom research because they allow researchers to integrate authentic classroom scenarios with maximum experimental control. In the present study, we experimentally varied virtual classmates' achievement-related behaviour (i.e., their hand-raising in response to the teacher's questions) during instruction, and students' eye and gaze data showed that they actively processed this social comparison information. Students who attended more to social comparison information (as indicated by more frequent and longer gazes at peer learners) had less favourable self-evaluations. We discuss implications for the future use of IVR environments to study behaviours in the classroom and beyond.
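As a rough illustration of the gaze measures mentioned above (how frequently and how long students looked at peer learners), the following Python sketch aggregates gaze episodes and dwell time per target from labelled eye-tracking samples. The sample format, sampling rate, and target names are hypothetical and are not taken from the study's actual analysis pipeline.

```python
# Hedged sketch (illustrative only): counting gaze episodes and total dwell
# time per target, assuming each eye-tracking sample is already labelled with
# the entity being fixated. All names and the sampling rate are hypothetical.
from collections import defaultdict

SAMPLE_RATE_HZ = 90  # assumed headset sampling rate

# Hypothetical per-sample fixation labels from the IVR classroom.
samples = ["teacher", "peer_01", "peer_01", "peer_01", "screen",
           "peer_02", "peer_02", "teacher", "peer_01"]

gaze_counts = defaultdict(int)      # number of gaze episodes per target
gaze_duration = defaultdict(float)  # total dwell time per target (seconds)

previous = None
for target in samples:
    gaze_duration[target] += 1.0 / SAMPLE_RATE_HZ
    if target != previous:
        gaze_counts[target] += 1  # a new gaze episode begins
    previous = target

# Gaze frequency and dwell time directed at peers: the social comparison
# information of interest in the abstract above.
peer_frequency = sum(n for t, n in gaze_counts.items() if t.startswith("peer"))
peer_dwell = sum(d for t, d in gaze_duration.items() if t.startswith("peer"))
print(peer_frequency, round(peer_dwell, 3))
```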


Subject(s)
Social Comparison, Virtual Reality, Animals, Humans, Social Behavior, Interpersonal Relations, Students