1.
Article En | MEDLINE | ID: mdl-38648151

Areas of interest (AOIs) are well-established means of providing semantic information for visualizing, analyzing, and classifying gaze data. However, the usual manual annotation of AOIs is time-consuming and further impaired by ambiguities in label assignments. To address these issues, we present an interactive labeling approach that combines visualization, machine learning, and user-centered explainable annotation. Our system provides uncertainty-aware visualization to build trust in classification with an increasing number of annotated examples. It combines specifically designed EyeFlower glyphs, dimensionality reduction, and selection and exploration techniques in an integrated workflow. The approach is versatile and hardware-agnostic, supporting video stimuli from stationary and unconstrained mobile eye tracking alike. We conducted an expert review to assess labeling strategies and trust building.

2.
IEEE Trans Vis Comput Graph ; 29(1): 896-906, 2023 Jan.
Article En | MEDLINE | ID: mdl-36191101

This work investigates and compares the performance of node-link diagrams, adjacency matrices, and bipartite layouts for visualizing networks. In a crowd-sourced user study (n=150), we measure the task accuracy and completion time of the three representations for different network classes and properties. In contrast to the literature, which covers mostly topology-based tasks (e.g., path finding) in small datasets, we mainly focus on overview tasks for large and directed networks. We consider three overview tasks on networks with 500 nodes: (T1) network class identification, (T2) cluster detection, and (T3) network density estimation, and two detailed tasks: (T4) node in-degree vs. out-degree and (T5) representation mapping, on networks with 50 and 20 nodes, respectively. Our results show that bipartite layouts are beneficial for revealing the overall network structure, while adjacency matrices are most reliable across the different tasks.
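Task T3 asks participants to estimate network density. For a directed simple graph this quantity is conventionally defined as |E| / (n(n-1)), the fraction of possible ordered node pairs that are connected. A minimal sketch of that ground-truth computation (the function name and edge representation are illustrative, not from the paper):

```python
def directed_density(n_nodes, edges):
    """Density of a directed simple graph: |E| / (n * (n - 1))."""
    if n_nodes < 2:
        return 0.0
    # Deduplicate edges and drop self-loops, which do not count
    # toward the density of a simple directed graph.
    unique = {(u, v) for (u, v) in edges if u != v}
    return len(unique) / (n_nodes * (n_nodes - 1))
```

For example, a 3-node directed cycle has density 3 / (3 * 2) = 0.5.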

3.
IEEE Comput Graph Appl ; 42(2): 33-44, 2022.
Article En | MEDLINE | ID: mdl-35263250

Modern machines continuously log status reports over long periods of time, which are valuable data to optimize working routines. Data visualization is a commonly used tool to gain insights into these data, mostly in retrospect (e.g., to determine causal dependencies between the faults of different machines). We present an approach to bring such visual analyses to the shop floor to support reacting to faults in real time. This approach combines spatio-temporal analyses of time series using a handheld touch device with augmented reality for live monitoring. Important information is overlaid directly onto machines in their real-world context, and detailed logs of current and historical events are displayed on the handheld device. In collaboration with an industry partner, we designed and tested our approach on a live production line to obtain feedback from operators. We compare our approach for monitoring and analysis with existing solutions that are currently deployed.


Augmented Reality , Commerce , Feedback , Industry , Retrospective Studies
4.
IEEE Trans Vis Comput Graph ; 23(1): 301-310, 2017 Jan.
Article En | MEDLINE | ID: mdl-27875146

The analysis of eye tracking data often requires the annotation of areas of interest (AOIs) to derive semantic interpretations of human viewing behavior during experiments. This annotation is typically the most time-consuming step of the analysis process. Especially for data from wearable eye tracking glasses, every independently recorded video has to be annotated individually and corresponding AOIs between videos have to be identified. We provide a novel visual analytics approach to ease this annotation process by image-based, automatic clustering of eye tracking data integrated in an interactive labeling and analysis system. The annotation and analysis are tightly coupled by multiple linked views that allow for a direct interpretation of the labeled data in the context of the recorded video stimuli. The components of our analytics environment were developed with a user-centered design approach in close cooperation with an eye tracking expert. We demonstrate our approach with eye tracking data from a real experiment and compare it to an analysis of the data by manual annotation of dynamic AOIs. Furthermore, we conducted an expert user study with 6 external eye tracking researchers to collect feedback and identify analysis strategies they used while working with our application.
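The image-based clustering described above could, in a much simplified form, group gaze points by the pixel content around them. The following sketch is hypothetical: it uses mean patch intensity as a stand-in feature and a tiny 1-D k-means, whereas the paper's actual image features and clustering method are not specified here.

```python
def crop_patch(image, x, y, r):
    """Pixels in a (2r+1)x(2r+1) window around (x, y), clipped to the image bounds."""
    h, w = len(image), len(image[0])
    return [image[j][i]
            for j in range(max(0, y - r), min(h, y + r + 1))
            for i in range(max(0, x - r), min(w, x + r + 1))]

def kmeans_1d(values, k=2, iters=20):
    """Tiny 1-D k-means; returns one cluster label per input value."""
    centers = sorted(values)[::max(1, len(values) // k)][:k]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(len(centers)), key=lambda c: abs(v - centers[c]))
                  for v in values]
        for c in range(len(centers)):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels

def cluster_gaze_points(image, gaze_points, r=1, k=2):
    """Cluster gaze points by the mean intensity of the patch around each one."""
    feats = [sum(p) / len(p)
             for p in (crop_patch(image, x, y, r) for x, y in gaze_points)]
    return kmeans_1d(feats, k)
```

Gaze points landing on similar-looking image regions end up in the same cluster, which is the starting point for assigning them a shared AOI label.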


Computer Graphics , Eye Movements/physiology , Image Processing, Computer-Assisted/methods , Video Recording , Algorithms , Humans
5.
IEEE Trans Vis Comput Graph ; 22(1): 61-70, 2016 Jan.
Article En | MEDLINE | ID: mdl-26529687

Evaluation has become a fundamental part of visualization research, and researchers have employed many approaches from the field of human-computer interaction, such as measures of task performance, thinking-aloud protocols, and analysis of interaction logs. Recently, eye tracking has also become popular for analyzing the visual strategies of users in this context. This adds another modality and more data, which require specialized visualization techniques for analysis. However, only a few approaches exist that aim at an integrated analysis of multiple concurrent evaluation procedures. The variety, complexity, and sheer amount of such coupled multi-source data streams require a visual analytics approach. Our approach provides a highly interactive visualization environment to display and analyze thinking-aloud, interaction, and eye movement data in close relation. Automatic pattern-finding algorithms allow an efficient exploratory search and support the reasoning process of deriving common eye-interaction-thinking patterns between participants. In addition, our tool equips researchers with mechanisms for searching and verifying expected usage patterns. We apply our approach to a user study involving a visual analytics application and discuss insights gained from this joint analysis. We anticipate our approach to be applicable to other combinations of evaluation techniques and a broad class of visualization applications.


Computer Graphics , Eye Movements/physiology , User-Computer Interface , Adult , Female , Humans , Male , Task Performance and Analysis , Young Adult
6.
IEEE Trans Vis Comput Graph ; 22(1): 1005-14, 2016 Jan.
Article En | MEDLINE | ID: mdl-26529744

We present a new visualization approach for displaying eye tracking data from multiple participants. We aim to show the spatio-temporal data of the gaze points in the context of the underlying image or video stimulus without occlusion. Our technique, denoted as gaze stripes, does not require the explicit definition of areas of interest but directly uses the image data around the gaze points, similar to thumbnails for images. A gaze stripe consists of a sequence of such gaze point images, oriented along a horizontal timeline. By displaying multiple aligned gaze stripes, it is possible to analyze and compare the viewing behavior of the participants over time. Since the analysis is carried out directly on the image data, no expensive post-processing or manual annotation is required. Hence, not only can patterns and outliers in the participants' scanpaths be detected, but the context of the stimulus is available as well. Furthermore, our approach is especially well suited for dynamic stimuli due to the non-aggregated temporal mapping. Complementary views, i.e., markers, notes, screenshots, histograms, and results from automatic clustering, can be added to the visualization to display analysis results. We illustrate the usefulness of our technique on static and dynamic stimuli. Furthermore, we discuss the limitations and scalability of our approach in comparison to established visualization techniques.
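The core construction of a gaze stripe can be sketched as cropping a small thumbnail around each sampled gaze point and placing the thumbnails side by side along the timeline. A hypothetical, grayscale-only sketch, assuming one 2-D intensity frame per gaze sample (the published technique operates on full video frames; names and parameters here are illustrative):

```python
def crop_thumbnail(frame, x, y, r, pad=0):
    """(2r+1)x(2r+1) intensity patch centred on (x, y); out-of-bounds pixels are padded."""
    h, w = len(frame), len(frame[0])
    return [[frame[j][i] if 0 <= j < h and 0 <= i < w else pad
             for i in range(x - r, x + r + 1)]
            for j in range(y - r, y + r + 1)]

def gaze_stripe(frames, gaze_points, r=1):
    """Concatenate one thumbnail per (frame, gaze point) pair along a horizontal timeline."""
    thumbs = [crop_thumbnail(f, x, y, r) for f, (x, y) in zip(frames, gaze_points)]
    # Join row-wise: row j of the stripe is row j of every thumbnail, side by side.
    return [[px for t in thumbs for px in t[j]] for j in range(2 * r + 1)]
```

Stacking one such stripe per participant yields the aligned view the abstract describes, with time running left to right.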


Computer Graphics , Eye Movement Measurements , Image Processing, Computer-Assisted/methods , Eye Movements/physiology , Humans , Spatio-Temporal Analysis
7.
IEEE Comput Graph Appl ; 35(4): 64-72, 2015.
Article En | MEDLINE | ID: mdl-25974928

In many research fields, eye tracking has become an established method to analyze the distribution of visual attention in various scenarios. With the trend toward increasingly affordable and easy-to-use consumer hardware, we expect mobile eye tracking to become ubiquitous, recording massive amounts of gaze data on a regular basis in everyday personal situations. To make use of this data, new approaches for personal visual analytics will be necessary to make the data accessible to non-expert users for self-reflection and re-experiencing interesting events. We discuss how eye tracking fits in the context of personal visual analytics, the challenges that arise with its application to everyday situations, and the research perspectives of personal eye tracking. In this context, the extraction and representation of areas of interest (AOIs) from the recorded data is a crucial part of data processing. We present a new technique to represent these AOIs from multiple videos: the AOI cloud. In our example, we apply this technique to examine the personal encounters of a user with other people. The technique provides an accessible user interface that also works on touch devices and is therefore suitable for integration into a user's everyday life.

8.
IEEE Trans Vis Comput Graph ; 19(12): 2129-38, 2013 Dec.
Article En | MEDLINE | ID: mdl-24051779

We introduce a visual analytics method to analyze eye movement data recorded for dynamic stimuli such as video or animated graphics. The focus lies on the analysis of data of several viewers to identify trends in the general viewing behavior, including time sequences of attentional synchrony and objects with strong attentional focus. By using a space-time cube visualization in combination with clustering, the dynamic stimuli and associated eye gazes can be analyzed in a static 3D representation. Shot-based, spatiotemporal clustering of the data generates potential areas of interest that can be filtered interactively. We also facilitate data drill-down: the gaze points are shown with density-based color mapping and individual scan paths as lines in the space-time cube. The analytical process is supported by multiple coordinated views that allow the user to focus on different aspects of spatial and temporal information in eye gaze data. Common eye-tracking visualization techniques are extended to incorporate the spatiotemporal characteristics of the data. For example, heat maps are extended to motion-compensated heat maps, and trajectories of scan paths are included in the space-time visualization. Our visual analytics approach is assessed in a qualitative user study with expert users, which showed the usefulness of the approach and uncovered that the experts applied different analysis strategies supported by the system.
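The density-based color mapping mentioned above ultimately rests on binning gaze points into a spatial grid of counts, which a color map then turns into a heat map. A minimal sketch, assuming pixel-coordinate gaze points and square grid cells (names and the cell-based binning are illustrative, not the paper's implementation):

```python
def gaze_density_grid(gaze_points, width, height, cell):
    """Count gaze points per square grid cell -- the basis of a density heat map."""
    cols, rows = -(-width // cell), -(-height // cell)  # ceiling division
    grid = [[0] * cols for _ in range(rows)]
    for x, y in gaze_points:
        if 0 <= x < width and 0 <= y < height:
            grid[y // cell][x // cell] += 1
    return grid
```

A motion-compensated variant, as the abstract suggests for dynamic stimuli, would subtract the tracked object's frame-to-frame displacement from each gaze point before binning, so that attention on a moving object accumulates in one cell instead of smearing along its trajectory.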


Attention/physiology , Computer Graphics , Databases, Factual , Eye Movements/physiology , Fixation, Ocular/physiology , Information Storage and Retrieval/methods , User-Computer Interface , Algorithms , Humans , Spatio-Temporal Analysis