A novel gaze-supported multimodal human-computer interaction for ultrasound machines.
Zhu, Hongzhi; Salcudean, Septimiu E; Rohling, Robert N.
Affiliation
  • Zhu H; Department of Biomedical Engineering, University of British Columbia, Vancouver, Canada. hzhu@ece.ubc.ca.
  • Salcudean SE; Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, Canada.
  • Rohling RN; Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, Canada.
Int J Comput Assist Radiol Surg; 14(7): 1107-1115, 2019 Jul.
Article in En | MEDLINE | ID: mdl-30977092
ABSTRACT

PURPOSE:

Conventional ultrasound (US) machines employ a physical control panel (PCP) as the primary user interface for machine control. This panel sits beside the main machine display, which requires the operator's constant attention, so switching attention to the control panel can interrupt the flow of the medical examination. Some ultraportable machines also lack many physical controls. Furthermore, the need to both control the US machine and observe the US image may lead practitioners to adopt unergonomic postures and repetitive motions that can cause work-related injuries. There is therefore a need for a more efficient human-computer interaction method for US machines.

METHODS:

To address some of the limitations of the PCP, we propose to merge the PCP into the main screen of the US machine. We propose to use gaze tracking and a handheld controller so that machine control can be achieved through a multimodal human-computer interaction (HCI) method that requires the operator neither to touch the screen nor to look away from the US image. As a first step, a pop-up menu and a measurement tool, both positioned according to the operator's gaze, were designed on top of the US image for efficient machine control.
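The abstract does not give implementation details. The Python sketch below is only an illustration, under assumed names (GazeSample, GazeMenu, ITEM_W/ITEM_H are not from the paper), of one way a gaze-anchored pop-up menu could be combined with a handheld-controller button press so that selection never requires looking away from the US image or down at a physical panel.

from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple

ITEM_W, ITEM_H = 160, 40  # menu item size in screen pixels (assumed)

@dataclass
class GazeSample:
    x: float      # horizontal gaze coordinate on the main display (pixels)
    y: float      # vertical gaze coordinate (pixels)
    valid: bool   # False when the eye tracker loses the eyes

class GazeMenu:
    """Pop-up menu anchored at the operator's current gaze point over the US image."""

    def __init__(self, items: Dict[str, Callable[[], None]]):
        self.items = list(items.items())  # (label, action) pairs, drawn top to bottom
        self.anchor: Optional[Tuple[float, float]] = None

    def open_at(self, gaze: GazeSample) -> None:
        # Anchor the menu where the operator is already looking, so no
        # glance away from the image is needed to reach the controls.
        if gaze.valid:
            self.anchor = (gaze.x, gaze.y)

    def confirm(self, gaze: GazeSample) -> Optional[str]:
        # Called when the handheld controller button is pressed: selection is
        # gaze position plus an explicit button press, not dwell time alone.
        if self.anchor is None or not gaze.valid:
            return None
        ax, ay = self.anchor
        for i, (label, action) in enumerate(self.items):
            top = ay + i * ITEM_H
            if ax <= gaze.x <= ax + ITEM_W and top <= gaze.y <= top + ITEM_H:
                action()
                self.anchor = None
                return label
        return None

# Example wiring with two controls that might move off the physical panel:
menu = GazeMenu({"Freeze": lambda: print("freeze image"),
                 "Measure": lambda: print("start ellipse measurement")})
menu.open_at(GazeSample(400, 300, True))   # menu button pressed on the controller
menu.confirm(GazeSample(410, 320, True))   # trigger press confirms the "Freeze" item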

RESULTS:

A comparative study was performed on the BK Medical SonixTOUCH US machine. Participants were asked to measure the area of an ellipse-shaped tumor in a phantom using both our gaze-supported HCI method and the traditional method. The user study indicates that task completion time can be reduced by [Formula see text] when using our gaze-supported HCI, while no extra workload is imposed on the operators.
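The abstract does not state how the ellipse area is obtained. Assuming the usual US caliper workflow (the operator places the endpoints of the two axes on the image), the sketch below computes the area with the standard formula A = πab from the semi-axis lengths; the function name and the millimetre coordinates in the example are illustrative, not from the paper.

import math
from typing import Tuple

Point = Tuple[float, float]

def ellipse_area(major_end1: Point, major_end2: Point,
                 minor_end1: Point, minor_end2: Point) -> float:
    """Ellipse area from caliper endpoints of the major and minor axes (in mm)."""
    a = math.dist(major_end1, major_end2) / 2.0  # semi-major axis length
    b = math.dist(minor_end1, minor_end2) / 2.0  # semi-minor axis length
    return math.pi * a * b

# e.g. a 20 mm x 12 mm lesion: area = pi * 10 * 6, about 188.5 mm^2
print(round(ellipse_area((0, 0), (20, 0), (10, -6), (10, 6)), 1))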

CONCLUSIONS:

Our preliminary study suggests that, when combined with a simple handheld controller, eye gaze tracking can be integrated into the US machine HCI for more efficient machine control.
Subjects
Keywords

Full text: 1 Databases: MEDLINE Main subject: User-Computer Interface / Ultrasonography / Eye Movements Study type: Diagnostic_studies Limit: Humans Language: En Journal: Int J Comput Assist Radiol Surg Journal subject: RADIOLOGY Year of publication: 2019 Document type: Article Country of affiliation: Canada