Results 1 - 8 of 8
1.
Stud Health Technol Inform; 167: 170-5, 2011.
Article in English | MEDLINE | ID: mdl-21685662

ABSTRACT

This paper introduces the technical foundations of a system designed to embed a lightweight, faithful, and spatially manipulable representation of the user's hand into an otherwise virtual world, i.e. Augmented Virtuality (AV). The system gives the user highly intuitive control during pointing-like near-space interaction, and gives experimenters a very flexible tool in a variety of medical and non-medical contexts. Our approach essentially relies on stereoscopic video see-through Augmented Reality (AR) technology and a generic, extensible framework for managing 3-D visual hand displacements. Research from human-computer interaction, perception, and motor control has informed our proposal, which combines a) acting in co-location, b) avoiding occlusion violations by ensuring a correct scene depth ordering, and c) providing convincing visual feedback of the user's hand. We further present two cases in which this system has already been used successfully, and then outline other promising applications, for instance in neuromotor rehabilitation and experimental neuroscience.
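As a rough, minimal sketch of the depth-ordering step described above (not the authors' actual pipeline), the following Python fragment composites a segmented camera image of the hand into a rendered virtual scene with a per-pixel depth test; the array names, the depth convention (smaller = closer), and the hand-mask input are illustrative assumptions.

import numpy as np

def composite_hand(virtual_rgb, virtual_depth, hand_rgb, hand_depth, hand_mask):
    # Show the camera pixel only where it is segmented as hand AND closer
    # to the eye than the virtual scene, so occlusion ordering stays correct.
    # Any managed 3-D hand displacement would be applied to hand_depth (and
    # to the hand's image position) before this compositing step.
    hand_in_front = hand_mask & (hand_depth < virtual_depth)
    out = virtual_rgb.copy()
    out[hand_in_front] = hand_rgb[hand_in_front]
    return out

In a stereoscopic video see-through setup, such a compositing step would run once per frame and per eye.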


Subject(s)
Computer Simulation, Hand, Three-Dimensional Imaging/methods, User-Computer Interface, Feedback, Humans, Three-Dimensional Imaging/instrumentation, Proprioception, Visual Perception
2.
IEEE Trans Vis Comput Graph; 13(3): 458-69, 2007.
Article in English | MEDLINE | ID: mdl-17356213

ABSTRACT

This paper describes a generalization of the god-object method for haptic interaction between rigid bodies. Our approach separates the computation of the motion of the six-degree-of-freedom god-object from the computation of the force applied to the user. The motion of the god-object is computed using continuous collision detection and constraint-based quasi-statics, which enables high-quality haptic interaction between contacting rigid bodies. The force applied to the user is computed using a novel constraint-based quasi-static approach, which allows us to suppress the force artifacts typically found in previous methods. The constraint-based force applied to the user, which handles any number of simultaneous contact points, is computed within a few microseconds, while the configuration of the rigid god-object is updated within a few milliseconds for rigid bodies containing up to tens of thousands of triangles. Our approach has been successfully tested on complex benchmarks. Our results show that the separation into asynchronous processes allows us to satisfy the different update rates required by the haptic and visual displays. Force shading and textures can be added to enlarge the range of haptic perception of a virtual environment. This paper is an extension of [1].
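For readers unfamiliar with the underlying god-object idea that the paper generalizes, here is a deliberately simplified, single-plane, 3-DOF sketch in Python: the proxy (god-object) is kept on the admissible side of a surface while a virtual spring between proxy and device produces the force sent to the user. This is not the paper's 6-DOF constraint-based quasi-static method; the plane constraint and the stiffness value are assumptions.

import numpy as np

def update_proxy(device_pos, plane_point, plane_normal):
    # Quasi-static step for a single plane constraint: the proxy tries to
    # follow the device but is projected back onto the surface if the
    # device has penetrated it.
    n = plane_normal / np.linalg.norm(plane_normal)
    d = np.dot(device_pos - plane_point, n)        # signed distance to plane
    return device_pos - min(d, 0.0) * n

def coupling_force(device_pos, proxy_pos, stiffness=800.0):
    # Virtual spring between device and proxy; zero while the device stays
    # on the free side of the surface.
    return stiffness * (proxy_pos - device_pos)

# Device pushed 2 mm into a horizontal floor at z = 0:
device = np.array([0.10, 0.05, -0.002])
proxy = update_proxy(device, np.zeros(3), np.array([0.0, 0.0, 1.0]))
print(coupling_force(device, proxy))               # ~[0, 0, 1.6] N at k = 800 N/m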

3.
Article in English | MEDLINE | ID: mdl-21095949

ABSTRACT

Most conventional computer-aided navigation systems assist the surgeon visually by tracking the position of an ancillary instrument and superimposing this position onto the 3D preoperative imaging exam. This paper aims to add to such navigation systems a device that guides the surgeon towards the target along a complex preplanned ancillary trajectory. We propose to use tactile stimuli for this guidance, through the design of a vibrating belt. An experiment using a virtual surgery simulator for skull base surgery was conducted with 9 naïve subjects to assess the effectiveness of vibrotactile guidance for complex trajectories. Comparisons between visual guidance and visual+tactile guidance are encouraging, supporting the relevance of such a tactile guidance paradigm.
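As an illustration only (not the authors' belt controller), the Python sketch below maps the deviation of the tool tip from the planned trajectory to a command for a belt of vibrators: the deviation direction selects the motor and its magnitude sets the intensity. The motor count, the choice of the transverse plane, and the saturation distance are assumptions.

import numpy as np

def belt_command(tip_pos, nearest_path_point, n_motors=8, max_error=0.02):
    # Deviation of the tool tip from the closest point on the planned
    # trajectory, taken in the transverse (x, y) plane.
    error = np.asarray(nearest_path_point, float) - np.asarray(tip_pos, float)
    angle = np.arctan2(error[1], error[0]) % (2 * np.pi)
    motor = int(round(angle / (2 * np.pi) * n_motors)) % n_motors
    # Intensity grows with the deviation and saturates at max_error metres.
    intensity = min(np.hypot(error[0], error[1]) / max_error, 1.0)
    return motor, intensity

print(belt_command([0.0, 0.0, 0.0], [0.01, 0.0, 0.0]))   # -> (0, 0.5)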


Subject(s)
Computer-Assisted Surgery/instrumentation, Computer Simulation, Computer Systems, Equipment Design, Humans, Computer-Assisted Image Processing/instrumentation, Three-Dimensional Imaging/instrumentation, Magnetic Resonance Imaging/methods, Statistical Models, Skull Base/pathology, Computer-Assisted Surgery/methods, X-Ray Computed Tomography/methods, Touch, User-Computer Interface
5.
Cyberpsychol Behav; 12(2): 175-81, 2009 Apr.
Article in English | MEDLINE | ID: mdl-19361298

ABSTRACT

Unilateral spatial neglect is a disabling condition frequently occurring after stroke. People with neglect suffer from various spatial deficits in several modalities, which in many cases impair everyday functioning. A successful treatment is yet to be found. Several techniques have been proposed in the last decades, but only a few showed long-lasting effects and none could completely rehabilitate the condition. Diagnostic methods of neglect could be improved as well. The disorder is normally diagnosed with pen-and-paper methods, which generally do not assess patients in everyday tasks and do not address some forms of the disorder. Recently, promising new methods based on virtual reality have emerged. Virtual reality technologies hold great opportunities for the development of effective assessment and treatment techniques for neglect because they provide rich, multimodal, and highly controllable environments. In order to stimulate advancements in this domain, we present a review and an analysis of the current work. We describe past and ongoing research of virtual reality applications for unilateral neglect and discuss the existing problems and new directions for development.


Subject(s)
Perceptual Disorders/diagnosis, Perceptual Disorders/rehabilitation, User-Computer Interface, Audiovisual Aids, Computer Graphics, Humans, Neuropsychological Tests, Orientation, Psychomotor Disorders/diagnosis, Psychomotor Disorders/rehabilitation, Space Perception, Treatment Outcome
6.
PLoS One; 3(3): e1775, 2008 Mar 12.
Article in English | MEDLINE | ID: mdl-18335049

ABSTRACT

BACKGROUND: Learning to perform new movements is usually achieved by following visual demonstrations. Haptic guidance by a force feedback device is a recent and original technology that provides additional proprioceptive cues during visuo-motor learning tasks. The effects of two types of haptic guidance, control in position (HGP) or in force (HGF), on visuo-manual tracking ("following") of trajectories are still under debate. METHODOLOGY/PRINCIPAL FINDINGS: Three training conditions (HGP, HGF, or a control condition without haptic guidance, NHG) were evaluated in two experiments. Movements produced by adults were assessed in terms of shape (dynamic time warping) and kinematic criteria (number of velocity peaks and mean velocity) before and after the training sessions. Trajectories consisted of two Arabic and two Japanese-inspired letters in Experiment 1 and ellipses in Experiment 2. We observed that the use of HGF globally improves the fluency of visuo-manual tracking of trajectories, while no significant improvement was found for HGP or NHG. CONCLUSION/SIGNIFICANCE: These results show that the addition of haptic information, probably encoded in force coordinates, plays a crucial role in the visuo-manual tracking of new trajectories.
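The two families of outcome measures named above can be sketched generically as follows; this is not the authors' analysis code, and the trajectory format (N x 2 arrays of positions) and the fixed sampling step are assumptions.

import numpy as np

def dtw_distance(a, b):
    # Dynamic time warping distance between two 2-D trajectories
    # (arrays of shape (N, 2) and (M, 2)); smaller = more similar shapes.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def kinematic_criteria(traj, dt):
    # Tangential velocity profile; fewer local velocity peaks and a higher
    # mean velocity are read here as smoother, more fluent movement.
    v = np.linalg.norm(np.diff(traj, axis=0), axis=1) / dt
    peaks = sum(1 for k in range(1, len(v) - 1) if v[k - 1] < v[k] > v[k + 1])
    return peaks, float(v.mean())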


Subject(s)
Movement, Ocular Vision/physiology, Adolescent, Adult, Humans
8.
IEEE Comput Graph Appl; 28(6): 20-36, 2008.
Article in English | MEDLINE | ID: mdl-19004682

ABSTRACT

Three-dimensional user interfaces (3D UIs) let users interact with virtual objects, environments, or information using direct 3D input in the physical and/or virtual space. In this article, the founders and organizers of the IEEE Symposium on 3D User Interfaces reflect on the state of the art in several key aspects of 3D UIs and speculate on future research.


Subject(s)
Computer Graphics/trends, Three-Dimensional Imaging/trends, Information Dissemination/methods, Information Storage and Retrieval/trends, Internet/trends, Software/trends, User-Computer Interface, Forecasting