Results 1 - 4 of 4

1.
IEEE Trans Vis Comput Graph ; 27(2): 1193-1203, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33074810

ABSTRACT

Collaborative visual analytics leverages social interaction to support data exploration and sensemaking. These processes are typically imagined as formalised, extended activities between groups of dedicated experts, requiring expertise with sophisticated data analysis tools. However, many professional domains benefit from support for short 'bursts' of data exploration between a subset of stakeholders with a diverse breadth of knowledge. Such 'casual collaborative' scenarios require engaging features to draw users' attention, with intuitive, 'walk-up and use' interfaces. This paper presents Uplift, a novel prototype system to support 'casual collaborative visual analytics' for a campus microgrid, co-designed with local stakeholders. An elicitation workshop with key members of the building management team revealed that relevant knowledge is distributed among multiple experts in their team, each using bespoke analysis tools. Uplift combines an engaging 3D model on a central tabletop display with intuitive tangible interaction and augmented-reality mid-air data visualisation to support casual collaborative visual analytics for this complex domain. Evaluations with expert stakeholders from the building management and energy domains were conducted during and following prototype development and indicate that Uplift is successful as an engaging backdrop for casual collaboration. Experts see high potential in such a system to bring together diverse knowledge holders and reveal complex interactions between structural, operational, and financial aspects of their domain. Such systems have further potential in other domains that require collaborative discussion or demonstration of models, forecasts, or cost-benefit analyses to high-level stakeholders.

2.
Front Robot AI ; 6: 5, 2019.
Article in English | MEDLINE | ID: mdl-33501022

ABSTRACT

Augmented and Virtual Reality provide unique capabilities for Mixed Reality collaboration. This paper explores how different combinations of virtual awareness cues can provide users with valuable information about their collaborator's attention and actions. In a user study (n = 32, 16 pairs), we compared different combinations of three cues: Field-of-View (FoV) frustum, Eye-gaze ray, and Head-gaze ray, against a baseline condition showing only virtual representations of each collaborator's head and hands. In a collaborative object-finding and placing task, the results showed that awareness cues significantly improved user performance, usability, and subjective preferences, with the combination of the FoV frustum and the Head-gaze ray performing best. This work establishes the feasibility of room-scale MR collaboration and the utility of providing virtual awareness cues.
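The two best-performing cues in this study can be derived directly from a collaborator's tracked head pose. The minimal sketch below (plain Python with NumPy) illustrates the general idea only; the function names, the 90°/60° FoV defaults, and the assumption that -Z is the view axis are illustrative choices, not the study's implementation.

# Illustrative sketch (not the study's implementation): deriving a head-gaze
# ray and a field-of-view (FoV) frustum containment test from a collaborator's
# head pose. Names and parameters are hypothetical.
import numpy as np

def head_gaze_ray(head_pos, head_rot):
    """Return (origin, direction) of the head-gaze ray in world space.
    head_rot is a 3x3 rotation matrix; -Z is assumed to be the view axis."""
    forward = head_rot @ np.array([0.0, 0.0, -1.0])
    return head_pos, forward / np.linalg.norm(forward)

def in_fov_frustum(head_pos, head_rot, point, h_fov_deg=90.0, v_fov_deg=60.0):
    """True if `point` lies inside the collaborator's FoV frustum,
    e.g. to decide when a shared object is within their view."""
    local = head_rot.T @ (np.asarray(point, dtype=float) - head_pos)  # world -> head space
    x, y, z = local
    if z >= 0:                      # behind the head (view axis is -Z)
        return False
    depth = -z
    return (abs(x) <= depth * np.tan(np.radians(h_fov_deg / 2)) and
            abs(y) <= depth * np.tan(np.radians(v_fov_deg / 2)))

# Example: a target roughly 2 m straight ahead of an upright collaborator.
origin, direction = head_gaze_ray(np.zeros(3), np.eye(3))
print(direction)                                                  # [0. 0. -1.]
print(in_fov_frustum(np.zeros(3), np.eye(3), [0.2, 0.1, -2.0]))   # True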

3.
IEEE Trans Vis Comput Graph ; 24(11): 2974-2982, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30387715

ABSTRACT

Advances in Mixed Reality (MR), Unmanned Aerial Vehicles (UAVs), and multi-scale collaborative virtual environments have led to new interface opportunities for remote collaboration. This paper explores a novel concept of flying telepresence for multi-scale mixed reality remote collaboration, which could enable remote collaboration at larger scales, such as building construction. We conducted a user study with three experiments. The first experiment compared two interfaces, static and dynamic inter-pupillary distance (IPD), on simulator sickness and body size perception. The second experiment tested user perception of virtual object size under three levels of IPD and movement-gain manipulation, with a fixed eye height, in virtual environments with reduced or rich visual cues. The last experiment investigated participants' body size perception under two levels of IPD and eye-height manipulation, using stereo video footage to simulate a flying telepresence experience. The studies found that manipulating IPD and eye height influenced users' size perception. We present our findings and share recommendations for designing a multi-scale MR flying telepresence interface.


Subjects
Computer Graphics , Imaging, Three-Dimensional/methods , Space Perception/physiology , Spatial Navigation/physiology , Virtual Reality , Adult , Algorithms , Female , Humans , Male , User-Computer Interface , Young Adult
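The core multi-scale mechanism this abstract describes, scaling inter-pupillary distance and eye height together to change perceived world scale, can be illustrated with a short sketch. The class name, default values, and the simple common-gain model below are assumptions for illustration only, not the paper's implementation.

# Minimal sketch of the multi-scale idea: scaling a viewer's inter-pupillary
# distance (IPD) and eye height by a common gain makes the scene appear
# correspondingly smaller or larger (e.g. a "giant" view for flying
# telepresence). Names and defaults are hypothetical.
from dataclasses import dataclass

@dataclass
class StereoCamera:
    ipd_m: float = 0.063        # physical inter-pupillary distance (metres)
    eye_height_m: float = 1.7   # physical standing eye height (metres)

def apply_scale_gain(cam: StereoCamera, gain: float) -> StereoCamera:
    """Scale IPD and eye height together so perceived world scale ~ 1/gain.
    gain > 1 -> giant viewer, the scene reads as a miniature model;
    gain < 1 -> shrunken viewer, the scene reads as enlarged."""
    return StereoCamera(ipd_m=cam.ipd_m * gain,
                        eye_height_m=cam.eye_height_m * gain)

# A 10x "giant" view of a construction site for flying telepresence.
giant = apply_scale_gain(StereoCamera(), gain=10.0)
print(giant)   # StereoCamera(ipd_m=0.63, eye_height_m=17.0)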
4.
IEEE Comput Graph Appl ; 37(2): 66-79, 2017.
Article in English | MEDLINE | ID: mdl-28113834

ABSTRACT

As wearable devices gain acceptance, we need to ask: what will user interfaces look like in a post-smartphone world? Will these future interfaces support sophisticated interactions in a mobile context? The authors draw on visual analytics concepts to address the growing need for individuals to manage information on personal devices. Spatial analytic interfaces (SAIs) can leverage the benefits of spatial interaction to enable everyday visual analytics tasks to be performed in situ, at the most beneficial place and time. The authors explore the possibilities for such interfaces using head-worn display technology and discuss current developments and future research goals for the successful development of SAIs.
