A Multimodal Direct Gaze Interface for Wheelchairs and Teleoperated Robots.
Annu Int Conf IEEE Eng Med Biol Soc; 2021: 4796-4800, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34892283
Gaze-based interfaces are especially useful for people with disabilities involving the upper limbs or hands. Typically, users select from a number of options (e.g. letters or commands) displayed on a screen by gazing at the desired option. However, in some applications, such as gaze-based driving, it may be dangerous to direct gaze away from the environment towards a separate display. In addition, a purely gaze-based interface can impose a high cognitive load on users, as gaze is not normally used for selection or control, but rather for other purposes, such as information gathering. To address these issues, this paper presents a cost-effective multimodal system for gaze-based driving which combines appearance-based gaze estimates derived from webcam images with push-button inputs that trigger command execution. The system uses an intuitive "direct interface", where users determine the direction of motion by gazing in the corresponding direction in the environment. We have implemented the system for both wheelchair control and robotic teleoperation. Our system should provide substantial benefits for patients with severe motor disabilities, such as ALS, by offering a more natural and affordable method of wheelchair control. We compare the performance of our system to the more conventional and common "indirect" system, where gaze is used to select commands from a separate display, showing that our system enables faster and more efficient navigation.
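The abstract describes two key design choices: gaze direction in the environment maps directly to a motion direction, and a push button gates command execution so that incidental glances do not move the chair. A minimal sketch of such a control loop might look as follows; the function names, thresholds, and speed values are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of a multimodal "direct interface" control step:
# a horizontal gaze angle (e.g. from an appearance-based webcam gaze
# estimator) selects a motion direction, and motion is executed only
# while a push button is held. All thresholds/speeds are illustrative.

from dataclasses import dataclass


@dataclass
class Command:
    name: str
    angular: float  # turn rate (illustrative units)
    linear: float   # forward speed (illustrative units)


def gaze_to_command(gaze_yaw_deg: float, dead_zone_deg: float = 10.0) -> Command:
    """Map a horizontal gaze angle to a motion command.

    Gazing left or right of a central dead zone turns the chair;
    gazing roughly straight ahead drives forward.
    """
    if gaze_yaw_deg < -dead_zone_deg:
        return Command("turn_left", angular=0.5, linear=0.0)
    if gaze_yaw_deg > dead_zone_deg:
        return Command("turn_right", angular=-0.5, linear=0.0)
    return Command("forward", angular=0.0, linear=0.3)


def control_step(gaze_yaw_deg: float, button_pressed: bool) -> Command:
    """Multimodal gating: gaze steers, the button authorizes motion."""
    if not button_pressed:
        return Command("stop", angular=0.0, linear=0.0)
    return gaze_to_command(gaze_yaw_deg)
```

The button gating is what separates this multimodal design from a purely gaze-based one: gaze can still be used freely for information gathering, and only becomes a control signal while the button is pressed.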
Database: MEDLINE
Main subject: Wheelchairs / Robotics / Disabled Persons
Limit: Humans
Language: En
Journal: Annu Int Conf IEEE Eng Med Biol Soc
Publication year: 2021
Document type: Article
Country of publication: United States