Results 1 - 20 of 79
1.
J Vis ; 24(7): 6, 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38984899

ABSTRACT

It is reasonable to assume that where people look in the world is largely determined by what they are doing, because the activity determines where it is useful to look at each moment in time. Assuming that it is vital to accurately judge the positions of the steps when navigating a staircase, it is surprising that people differ considerably in the extent to which they look at the steps. Apparently, some people consider the accuracy of peripheral vision, the predictability of the step size, and feeling the edges of the steps with their feet to be good enough. If so, occluding part of the view of the staircase and making it more important to place one's feet gently should increase the benefit of looking directly at the steps before stepping onto them, so that people look at the steps more consistently. We tested this idea by asking people to walk on staircases, either with or without a tray holding two cups of water. When carrying the tray, people walked more slowly, but they shifted their gaze across the steps in much the same way as they did when walking without the tray. They did not look at more steps. There was a clear positive correlation between the fraction of steps that people looked at when walking with and when walking without the tray. Thus, the variability in the extent to which people look at the steps persists when walking on the staircase is made more challenging.


Subject(s)
Ocular Fixation, Walking, Humans, Walking/physiology, Ocular Fixation/physiology, Male, Adult, Female, Young Adult, Eye Movements/physiology, Visual Perception/physiology
2.
Behav Brain Sci ; 47: e48, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-38311450

ABSTRACT

We disagree with Almaatouq et al. that no realistic alternative exists to the "one-at-a-time" paradigm. Seventy years ago, Egon Brunswik introduced representative design, which offers a clear path to commensurability and generality. Almaatouq et al.'s integrative design cannot guarantee the external validity and generalizability of results that is sorely needed, whereas representative design tackles the problem head on.

3.
Behav Res Methods ; 2024 Jan 10.
Article in English | MEDLINE | ID: mdl-38200239

ABSTRACT

We built a novel setup to record large gaze shifts (up to 140°). The setup consists of a wearable eye tracker and a high-speed camera with fiducial marker technology to track the head. We tested our setup by replicating findings from the classic eye-head gaze shift literature. We conclude that our new, inexpensive setup is good enough to investigate the dynamics of large eye-head gaze shifts. This novel setup could be used for future research on large eye-head gaze shifts, but also for research on gaze during, for example, human interaction. We further discuss reference frames and terminology in head-free eye tracking. Despite the transition from head-fixed eye tracking to head-free gaze tracking, researchers still use head-fixed eye-movement terminology when discussing world-fixed gaze phenomena. We propose to use more specific terminology for world-fixed phenomena, including gaze fixation, gaze pursuit, and gaze saccade.
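To illustrate the world-fixed terminology the abstract argues for, here is a minimal sketch of how a world-fixed gaze direction follows from combining an eye-in-head signal (wearable eye tracker) with a head-in-world orientation (e.g., from fiducial-marker head tracking). The one-dimensional horizontal simplification, the function name, and the example angles are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def world_gaze_direction(eye_in_head_deg, head_yaw_deg):
    """Combine an eye-in-head azimuth with a world-referenced head yaw
    (both in degrees) into a world-fixed gaze azimuth.
    Illustrative 1-D (horizontal) case only."""
    return eye_in_head_deg + head_yaw_deg

# Hypothetical example: a 140-degree gaze shift achieved by a 40-degree
# eye-in-head rotation combined with a 100-degree head rotation.
eye = np.array([0.0, 40.0])    # eye-in-head azimuth before/after the shift
head = np.array([0.0, 100.0])  # head yaw before/after the shift
gaze = world_gaze_direction(eye, head)
print("gaze shift amplitude:", gaze[1] - gaze[0], "deg")  # 140.0 deg
```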

4.
Behav Res Methods ; 56(4): 3280-3299, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38424292

ABSTRACT

Blinks, the closing and opening of the eyelids, are used in a wide array of fields where human function and behavior are studied. In data from video-based eye trackers, blink rate and duration are often estimated from the pupil-size signal. However, blinks and their parameters can be estimated only indirectly from this signal, since it does not explicitly contain information about the eyelid position. We ask whether blinks detected from an eye openness signal that estimates the distance between the eyelids (EO blinks) are comparable to blinks detected with a traditional algorithm using the pupil-size signal (PS blinks), and how robust blink detection is when data quality is low. In terms of rate, there was an almost perfect overlap between EO and PS blinks (F1 score: 0.98) when the head was in the center of the eye tracker's tracking range, where data quality was high, and a high overlap (F1 score: 0.94) when the head was at the edge of the tracking range, where data quality was worse. When there was a difference in blink rate between EO and PS blinks, it was mainly due to data loss in the pupil-size signal. Blink durations were about 60 ms longer for EO blinks than for PS blinks. Moreover, the dynamics of EO blinks were similar to results from previous literature. We conclude that the eye openness signal, together with our proposed blink detection algorithm, provides an advantageous method to detect and describe blinks in greater detail.


Subject(s)
Algorithms, Blinking, Eye-Tracking Technology, Humans, Blinking/physiology, Pupil/physiology, Eyelids/physiology, Male, Adult, Female, Eye Movements/physiology
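A minimal sketch of the general idea of detecting blinks from an eye openness signal: treat episodes where the estimated distance between the eyelids drops below a threshold as blinks. The threshold, minimum duration, and sampling rate below are illustrative assumptions, not the algorithm proposed in the paper.

```python
import numpy as np

def detect_blinks(eye_openness_mm, fs_hz, closed_threshold_mm=2.0,
                  min_duration_ms=30):
    """Detect blinks as episodes where the eye-openness signal (distance
    between the eyelids, in mm) drops below a threshold.

    Returns a list of (onset_s, duration_ms) tuples. Threshold and minimum
    duration are illustrative defaults, not values from the paper."""
    closed = eye_openness_mm < closed_threshold_mm
    # Find rising/falling edges of the "closed" episodes.
    edges = np.diff(closed.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if closed[0]:
        onsets = np.r_[0, onsets]
    if closed[-1]:
        offsets = np.r_[offsets, closed.size]
    blinks = []
    for on, off in zip(onsets, offsets):
        duration_ms = (off - on) / fs_hz * 1000
        if duration_ms >= min_duration_ms:
            blinks.append((on / fs_hz, duration_ms))
    return blinks

# Synthetic 1-second recording at 600 Hz with one 150 ms eye closure.
fs = 600
openness = np.full(fs, 10.0)          # eyelids ~10 mm apart
openness[300:390] = 0.5               # brief closure
print(detect_blinks(openness, fs))    # [(0.5, 150.0)]
```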
5.
Behav Res Methods ; 56(3): 1476-1484, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37326770

ABSTRACT

According to the proposal for a minimum reporting guideline for an eye tracking study by Holmqvist et al. (2022), the accuracy (in degrees) of eye tracking data should be reported. Currently, there is no easy way to determine accuracy for wearable eye tracking recordings. To enable determining the accuracy quickly and easily, we have produced a simple validation procedure using a printable poster and accompanying Python software. We tested the poster and procedure with 61 participants using one wearable eye tracker. In addition, the software was tested with six different wearable eye trackers. We found that the validation procedure can be administered within a minute per participant and provides measures of accuracy and precision. Calculating the eye-tracking data quality measures can be done offline on a simple computer and requires no advanced computer skills.


Subject(s)
Data Accuracy, Eye-Tracking Technology, Humans, Software
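The two data-quality measures the procedure reports can be sketched as follows: accuracy as the mean angular offset between gaze samples and a validation target, and precision as the root-mean-square of sample-to-sample angular distances. The small-angle approximation, function names, and single-target example are assumptions made here for illustration; this is not the published Python software.

```python
import numpy as np

def accuracy_deg(gaze_azimuth, gaze_elevation, target_azimuth, target_elevation):
    """Accuracy: mean angular offset (deg) between gaze samples and a
    validation target, using a small-angle approximation."""
    offsets = np.hypot(gaze_azimuth - target_azimuth,
                       gaze_elevation - target_elevation)
    return float(np.mean(offsets))

def precision_rms_s2s_deg(gaze_azimuth, gaze_elevation):
    """Precision: root mean square of sample-to-sample angular distances (deg)."""
    d = np.hypot(np.diff(gaze_azimuth), np.diff(gaze_elevation))
    return float(np.sqrt(np.mean(d ** 2)))

# Hypothetical fixation on a target at (5, 0) degrees with noise and a small bias.
rng = np.random.default_rng(0)
az = 5.3 + rng.normal(0, 0.05, 300)
el = 0.1 + rng.normal(0, 0.05, 300)
print(f"accuracy: {accuracy_deg(az, el, 5, 0):.2f} deg")
print(f"precision (RMS-S2S): {precision_rms_s2s_deg(az, el):.3f} deg")
```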
6.
Behav Res Methods ; 56(3): 1900-1915, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37101100

ABSTRACT

Computer-vision-based gaze estimation refers to techniques that estimate gaze direction directly from video recordings of the eyes or face without the need for an eye tracker. Although many such methods exist, their validation is often found in the technical literature (e.g., computer science conference papers). We aimed to (1) identify which computer-vision-based gaze estimation methods are usable by the average researcher in fields such as psychology or education, and (2) evaluate these methods. We searched for methods that do not require calibration and have clear documentation. Two toolkits, OpenFace and OpenGaze, were found to fulfill these criteria. First, we present an experiment where adult participants fixated on nine stimulus points on a computer screen. We filmed their face with a camera and processed the recorded videos with OpenFace and OpenGaze. We conclude that OpenGaze is accurate and precise enough to be used in screen-based experiments with stimuli separated by at least 11 degrees of gaze angle. OpenFace was not sufficiently accurate for such situations but can potentially be used in sparser environments. We then examined whether OpenFace could be used with horizontally separated stimuli in a sparse environment with infant participants. We compared dwell measures based on OpenFace estimates to the same measures based on manual coding. We conclude that OpenFace gaze estimates may potentially be used with measures such as relative total dwell time to sparse, horizontally separated areas of interest, but should not be used to draw conclusions about measures such as dwell duration.


Subject(s)
Eye Movements, Ocular Vision, Adult, Humans, Eye, Calibration, Video Recording
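A minimal sketch of the relative total dwell time measure mentioned in the abstract, computed from a generic per-frame horizontal gaze estimate. The AOI bounds, the NaN coding of unavailable estimates, and the example data are assumptions; the sketch does not call OpenFace or OpenGaze.

```python
import numpy as np

def relative_dwell(gaze_x_deg, aois):
    """Relative total dwell time per AOI from a per-frame horizontal gaze
    estimate (degrees). `aois` maps AOI names to (min_deg, max_deg) bounds.
    Frames falling in no AOI still count toward the denominator."""
    gaze_x_deg = np.asarray(gaze_x_deg, dtype=float)
    valid = ~np.isnan(gaze_x_deg)
    total = valid.sum()
    return {name: float(((gaze_x_deg >= lo) & (gaze_x_deg <= hi) & valid).sum() / total)
            for name, (lo, hi) in aois.items()}

# Hypothetical infant session: left and right AOIs, horizontally separated.
gaze_x = np.concatenate([np.full(200, -15.0),    # looking left
                         np.full(100, 15.0),     # looking right
                         np.full(50, np.nan)])   # estimate unavailable
aois = {"left": (-25, -5), "right": (5, 25)}
print(relative_dwell(gaze_x, aois))  # {'left': ~0.67, 'right': ~0.33}
```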
7.
Behav Res Methods ; 55(2): 657-669, 2023 02.
Article in English | MEDLINE | ID: mdl-35419703

ABSTRACT

Estimating the gaze direction with a digital video-based pupil and corneal reflection (P-CR) eye tracker is challenging, partly because a video camera is limited in spatial and temporal resolution and because the captured eye images contain noise. Through computer simulation, we evaluated the localization accuracy of pupil and CR centers in the eye image for small eye rotations (≪ 1 deg). The results highlight how inaccuracies in center localization are related to 1) how many pixels the pupil and CR span in the eye camera image, 2) the method used to compute the centers of the pupil and CRs, and 3) the level of image noise. Our results provide a possible explanation for why the amplitude of small saccades may not be accurately estimated by many currently used video-based eye trackers. We conclude that eye movements with arbitrarily small amplitudes can be accurately estimated using the P-CR eye-tracking principle, given that the level of image noise is low and the pupil and CR span enough pixels in the eye camera image, or if localization of the CR is based on the intensity values in the eye image instead of a binary representation.


Subject(s)
Eye Movements, Saccades, Humans, Computer Simulation, Pupil
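A minimal sketch of the contrast the abstract draws between center localization based on a binary representation and localization based on intensity values, together with the P-CR principle that gaze is estimated from the vector between the pupil center and the CR center. The threshold, the linear gain, and the tiny synthetic image are illustrative assumptions, not the simulation used in the paper.

```python
import numpy as np

def binary_centroid(image, threshold):
    """Feature center from a binarized image (pixels above threshold)."""
    ys, xs = np.nonzero(image > threshold)
    return xs.mean(), ys.mean()

def intensity_centroid(image):
    """Feature center weighted by pixel intensity values."""
    ys, xs = np.indices(image.shape)
    w = image.astype(float)
    return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()

def pcr_gaze_deg(pupil_center, cr_center, gain_deg_per_px=0.2):
    """P-CR principle: gaze is estimated from the vector between the pupil
    center and the corneal-reflection center. The linear gain stands in
    for a proper calibration."""
    dx = pupil_center[0] - cr_center[0]
    dy = pupil_center[1] - cr_center[1]
    return gain_deg_per_px * dx, gain_deg_per_px * dy

# Tiny synthetic CR blob whose true center lies between pixel centers.
cr_image = np.zeros((5, 5))
cr_image[2, 2] = 1.0
cr_image[2, 3] = 0.5   # sub-pixel shift toward x = 2.33
print(binary_centroid(cr_image, 0.25))   # (2.5, 2.0): coarse binary estimate
print(intensity_centroid(cr_image))      # (~2.33, 2.0): intensity-weighted estimate
```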
8.
Behav Res Methods ; 55(8): 4128-4142, 2023 12.
Article in English | MEDLINE | ID: mdl-36326998

ABSTRACT

How well can modern wearable eye trackers cope with head and body movement? To investigate this question, we asked four participants to stand still, walk, skip, and jump while fixating a static physical target in space. We did this for six different eye trackers. All the eye trackers were capable of recording gaze during the most dynamic episodes (skipping and jumping). The accuracy became worse as movement got wilder. During skipping and jumping, the biggest error was 5.8°. However, most errors were smaller than 3°. We discuss the implications of decreased accuracy in the context of different research scenarios.


Subject(s)
Eye Movements, Wearable Electronic Devices, Humans, Movement, Eye Movement Measurements, Head Movements
9.
Behav Res Methods ; 2023 Jul 28.
Article in English | MEDLINE | ID: mdl-37507649

ABSTRACT

A guideline is proposed that comprises the minimum items to be reported in research studies involving an eye tracker and human or non-human primate participant(s). This guideline was developed over a 3-year period using a consensus-based process via an open invitation to the international eye tracking community. This guideline will be reviewed at maximum intervals of 4 years.

10.
Behav Res Methods ; 55(1): 364-416, 2023 01.
Article in English | MEDLINE | ID: mdl-35384605

ABSTRACT

In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline").


Subject(s)
Eye Movements, Eye-Tracking Technology, Humans, Empirical Research
11.
Infancy ; 27(1): 25-45, 2022 01.
Article in English | MEDLINE | ID: mdl-34687142

ABSTRACT

The Tobii Pro TX300 is a popular eye tracker in developmental eye-tracking research, yet it is no longer manufactured. If a TX300 breaks down, it may have to be replaced. The data quality of the replacement eye tracker may differ from that of the TX300, which may affect the experimental outcome measures. This is problematic for longitudinal and multi-site studies, and for researchers replacing eye trackers between studies. We therefore ask how the TX300 and its successor, the Tobii Pro Spectrum, compare in terms of eye-tracking data quality. Data quality, operationalized through precision, accuracy, and data loss, was compared between eye trackers for three age groups (around 5 months, 10 months, and 3 years). Precision was better for all gaze position signals obtained with the Spectrum than for those obtained with the TX300. Accuracy of the Spectrum was higher for the 5-month-old and 10-month-old children. For the three-year-old children, accuracy was similar across both eye trackers. Gaze position signals from the Spectrum exhibited lower proportions of data loss, and the durations of the data loss periods tended to be shorter. In conclusion, the Spectrum produces gaze position signals with higher data quality, especially for the younger infants. Implications for data analysis are discussed.


Subject(s)
Data Accuracy, Eye-Tracking Technology, Child, Preschool Child, Data Collection, Eye Movements, Humans, Infant
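A minimal sketch of the data-loss measures used in the comparison: the proportion of lost samples and the durations of contiguous loss periods, assuming lost samples are coded as NaN in the gaze position signal. The sampling rate and example values are illustrative, not taken from the study.

```python
import numpy as np

def data_loss(gaze_x, fs_hz):
    """Proportion of lost samples and durations (ms) of contiguous loss
    periods in a gaze position signal where loss is coded as NaN."""
    lost = np.isnan(np.asarray(gaze_x, dtype=float))
    proportion = float(lost.mean())
    edges = np.diff(lost.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if lost[0]:
        onsets = np.r_[0, onsets]
    if lost[-1]:
        offsets = np.r_[offsets, lost.size]
    durations_ms = (offsets - onsets) / fs_hz * 1000
    return proportion, durations_ms

# Hypothetical 2-second infant recording at 300 Hz with two loss periods.
fs = 300
x = np.zeros(2 * fs)
x[100:160] = np.nan     # 200 ms loss
x[400:430] = np.nan     # 100 ms loss
prop, durs = data_loss(x, fs)
print(f"data loss: {prop:.2%}, periods: {durs} ms")  # 15.00%, [200. 100.] ms
```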
12.
Behav Res Methods ; 54(6): 2765-2776, 2022 Dec.
Article in English | MEDLINE | ID: mdl-35023066

ABSTRACT

Eye trackers are applied in many research fields (e.g., cognitive science, medicine, marketing research). To give meaning to the eye-tracking data, researchers have a broad choice of classification methods to extract various behaviors (e.g., saccade, blink, fixation) from the gaze signal. There is extensive literature about the different classification algorithms. Surprisingly, not much is known about the effect of the fixation and saccade selection rules that are usually (implicitly) applied. We want to answer the following question: what is the impact of the selection-rule parameters (minimal saccade amplitude and minimal fixation duration) on the distribution of fixation durations? To answer this question, we used eye-tracking data of high and low quality and seven different classification algorithms. We conclude that selection rules play an important role in merging and selecting fixation candidates. For eye-tracking data with good-to-moderate precision (RMSD < 0.5°), the choice of classification algorithm does not matter too much as long as it is sensitive enough and is followed by a rule that selects saccades with amplitudes larger than 1.0° and a rule that selects fixations with durations longer than 60 ms. Because of the importance of selection, researchers should always report whether they performed selection and the values of their parameters.
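A minimal sketch of the two selection rules the abstract singles out: merging fixation candidates separated by a gaze shift smaller than the minimal saccade amplitude, and discarding candidates shorter than the minimal fixation duration. The data structure and the duration-weighted merging are illustrative assumptions; in practice, any of the classification algorithms would precede this step.

```python
def apply_selection_rules(fixations, min_saccade_amp_deg=1.0, min_fix_dur_ms=60):
    """Post-process fixation candidates: merge candidates separated by a
    gaze shift smaller than the minimal saccade amplitude, then discard
    candidates shorter than the minimal fixation duration.

    Each fixation is a dict with onset/offset (ms) and x/y position (deg).
    Illustrative sketch, not tied to a specific classification algorithm."""
    merged = []
    for fix in fixations:
        if merged:
            prev = merged[-1]
            amp = ((fix["x"] - prev["x"]) ** 2 + (fix["y"] - prev["y"]) ** 2) ** 0.5
            if amp < min_saccade_amp_deg:
                # Intervening saccade is too small: merge the two candidates.
                n_prev = prev["offset"] - prev["onset"]
                n_fix = fix["offset"] - fix["onset"]
                prev["x"] = (prev["x"] * n_prev + fix["x"] * n_fix) / (n_prev + n_fix)
                prev["y"] = (prev["y"] * n_prev + fix["y"] * n_fix) / (n_prev + n_fix)
                prev["offset"] = fix["offset"]
                continue
        merged.append(dict(fix))
    return [f for f in merged if f["offset"] - f["onset"] >= min_fix_dur_ms]

candidates = [
    {"onset": 0,   "offset": 200, "x": 0.0, "y": 0.0},
    {"onset": 210, "offset": 250, "x": 0.4, "y": 0.0},   # 0.4 deg away: merged
    {"onset": 300, "offset": 340, "x": 5.0, "y": 0.0},   # 40 ms: discarded
    {"onset": 350, "offset": 600, "x": 5.1, "y": 2.0},   # kept
]
print(apply_selection_rules(candidates))
```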

13.
Behav Res Methods ; 53(5): 1986-2006, 2021 10.
Article in English | MEDLINE | ID: mdl-33709298

ABSTRACT

The pupil size artefact (PSA) is the gaze deviation reported by an eye tracker during pupil size changes when the eye does not rotate. In the present study, we ask three questions: 1) how stable is the PSA over time, 2) does the PSA depend on properties of the eye-tracker setup, and 3) does the PSA depend on the participants' viewing direction? We found that the PSA is very stable over time for periods as long as 1 year, but may differ between participants. When comparing the magnitude of the PSA between eye trackers, we found the magnitude of the obtained PSA to be related to the direction of the eye-tracker-camera axis, suggesting that the angle between the participants' viewing direction and the camera axis affects the PSA. We then investigated the PSA as a function of the participants' viewing direction. The PSA was non-zero for a viewing direction of 0° and depended on the viewing direction. These findings corroborate the suggestion by Choe et al. (Vision Research 118(6755):48-59, 2016) that the PSA can be described by an idiosyncratic and a viewing direction-dependent component. Based on a simulation, we cannot claim that the viewing direction-dependent component of the PSA is caused by the optics of the cornea.


Subject(s)
Artifacts, Pupil, Eye Movement Measurements, Eye Movements, Humans
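One way to quantify the PSA during steady fixation, where any reported gaze change is by definition artefactual, is to regress the reported gaze position on pupil size; the slope then expresses the artefact in degrees per millimetre of pupil-size change. This regression sketch and its example values are assumptions made here for illustration, not the analysis used in the paper.

```python
import numpy as np

def pupil_size_artefact_slope(reported_gaze_deg, pupil_diameter_mm):
    """Estimate the pupil size artefact as the slope (deg/mm) of the
    reported gaze position against pupil diameter during steady fixation,
    where any reported gaze change is artefactual."""
    slope, _intercept = np.polyfit(pupil_diameter_mm, reported_gaze_deg, 1)
    return float(slope)

# Hypothetical fixation data: the pupil constricts from 6 mm to 3 mm while
# the eye does not rotate, yet the tracker reports a 1.2 deg drift.
pupil = np.linspace(6.0, 3.0, 500)
gaze = 0.4 * (6.0 - pupil) + np.random.default_rng(1).normal(0, 0.05, 500)
print(f"PSA: {pupil_size_artefact_slope(gaze, pupil):.2f} deg per mm")  # ~ -0.40
```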
14.
Behav Res Methods ; 53(4): 1592-1608, 2021 08.
Article in English | MEDLINE | ID: mdl-33409984

ABSTRACT

There is a long history of interest in looking behavior during human interaction. With the advance of (wearable) video-based eye trackers, it has become possible to measure gaze during many different interactions. We outline the different types of eye-tracking setups that currently exist to investigate gaze during interaction. The setups differ mainly with regard to the nature of the eye-tracking signal (head- or world-centered) and the freedom of movement allowed for the participants. These features place constraints on the research questions that can be answered about human interaction. We end with a decision tree to help researchers judge the appropriateness of specific setups.


Subject(s)
Eye Movements, Eye-Tracking Technology, Head, Humans, Movement
15.
J Vis ; 20(10): 5, 2020 10 01.
Article in English | MEDLINE | ID: mdl-33007079

ABSTRACT

As humans move through parts of their environment, they meet others who may or may not try to interact with them. Where do people look when they meet others? We had participants wearing an eye tracker walk through a university building. On the way, they encountered nine "walkers." Walkers were instructed to, for example, ignore the participant, greet him or her, or attempt to hand out a flyer. The participant's gaze was mostly directed to the currently relevant body parts of the walker. Thus, the participant's gaze depended on the walker's action. Individual differences in participants' looking behavior were consistent across walkers. Participants who did not respond to the walker seemed to look less at that walker, although this difference was not statistically significant. We suggest that models of gaze allocation should take social motivation into account.


Subject(s)
Ocular Fixation/physiology, Walking, Adult, Eye Movements/physiology, Female, Humans, Male
16.
Behav Res Methods ; 52(3): 1140-1160, 2020 06.
Article in English | MEDLINE | ID: mdl-31898290

ABSTRACT

Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant's head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs' Pupil in 3D mode, and (iv) Pupil-Labs' Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8-3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial expression tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup.


Subject(s)
Data Accuracy, Eye Movements, Adult, Eye Movement Measurements, Female, Head Movements, Humans, Male, Pupil
17.
Behav Res Methods ; 51(6): 2712-2721, 2019 12.
Article in English | MEDLINE | ID: mdl-30350022

ABSTRACT

Most modern video eye trackers deliver binocular data. Many researchers take the average of the left and right eye signals (the version signal) to decrease the variable error (precision) by up to a factor of √2. What happens to the systematic error (accuracy) if the left and right eye signals are averaged? To determine the systematic error, we conducted a calibration validation in two experiments (n = 79 and n = 64). The systematic error was computed for the left eye, right eye, and version signals separately. In 29.5% and 25.8% of the participants, respectively, the systematic error of a single eye signal was lower than that of the version signal, at the cost of a higher variable error. If a small variable error is desirable, and the difference between the left and the right eye is not the topic of study, one should average position data from the left and the right eye (in other words, use the version signal). If a small systematic error is desirable, one should use the signal (from the left eye, right eye, or version) that delivers the best accuracy. In the latter case, this may come at the cost of worse precision than that of the version signal.


Subject(s)
Calibration, Data Accuracy, Eye Movements/physiology, Scientific Experimental Error/statistics & numerical data, Adult, Aged, Female, Humans, Male, Middle Aged, Young Adult
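A minimal sketch of the trade-off described in the abstract: the version signal (the average of the left and right eye signals) reduces the variable error by up to √2 when the monocular noise is independent, while its systematic error lies between the two monocular offsets and can therefore exceed that of the better eye. The noise level and offsets below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
true_pos = 0.0  # validation target at 0 deg

# Hypothetical monocular signals: independent noise plus different offsets.
left = true_pos + 0.40 + rng.normal(0, 0.20, n)    # +0.40 deg systematic error
right = true_pos - 0.05 + rng.normal(0, 0.20, n)   # -0.05 deg systematic error
version = (left + right) / 2                        # average of the two eyes

def systematic_error(sig):   # accuracy: mean offset from the target
    return float(np.mean(sig - true_pos))

def variable_error(sig):     # precision: standard deviation of the signal
    return float(np.std(sig))

for name, sig in [("left", left), ("right", right), ("version", version)]:
    print(f"{name:8s} accuracy {systematic_error(sig):+.2f} deg, "
          f"precision {variable_error(sig):.3f} deg")
# The version precision is about 0.20/sqrt(2) ~ 0.14 deg, but its systematic
# error (~ +0.18 deg) is larger in magnitude than that of the right eye
# (~ -0.05 deg), illustrating that a single eye can be more accurate.
```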
20.
Perception ; 47(2): 125-142, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29183222

ABSTRACT

Several models of selection in search predict that saccades are biased toward conspicuous objects (also referred to as salient objects). Indeed, it has been demonstrated that initial saccades are biased toward the most conspicuous candidate. However, in a recent study, no such bias was found for the second saccade, and it was concluded that the attraction of conspicuous elements is limited to short-latency initial saccades. This conclusion is based on only a single feature manipulation (orientation contrast) and conflicts with the prediction of influential salience models. Here, we investigate whether this result can be generalized beyond the domain of orientation. In displays containing three luminance annuli (Experiment 1), we found a considerable bias toward the most conspicuous candidate for the second saccade. In Experiment 1, the target could not be discriminated peripherally. When we made the target peripherally discriminable, the second saccade was no longer biased toward the more conspicuous candidate (Experiment 2). Thus, conspicuity plays a role in saccadic selection beyond the initial saccade. Whether second saccades are biased toward conspicuous objects appears to depend on the type of feature contrast underlying the conspicuity and the peripheral discriminability of target properties.


Subject(s)
Attention/physiology, Ocular Fixation/physiology, Saccades/physiology, Visual Perception/physiology, Adult, Eye Movement Measurements, Humans