UltrARsound: in situ visualization of live ultrasound images using HoloLens 2.
von Haxthausen, Felix; Moreta-Martinez, Rafael; Pose Díez de la Lastra, Alicia; Pascau, Javier; Ernst, Floris.
Affiliation
  • von Haxthausen F; Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Schleswig-Holstein, Germany; Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain. vonhaxthausen@rob.uni-luebeck.de.
  • Moreta-Martinez R; Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain.
  • Pose Díez de la Lastra A; Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain.
  • Pascau J; Departamento de Bioingeniería e Ingeniería Aeroespacial, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911, Leganés, Spain.
  • Ernst F; Institute for Robotics and Cognitive Systems, University of Lübeck, Ratzeburger Allee 160, 23562, Lübeck, Schleswig-Holstein, Germany.
Int J Comput Assist Radiol Surg ; 17(11): 2081-2091, 2022 Nov.
Article in English | MEDLINE | ID: mdl-35776399
ABSTRACT

PURPOSE:

Augmented Reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images with the 3D anatomy of the patient. This work describes and evaluates a novel approach that tracks retroreflective spheres attached to the US probe using an inside-out technique with the HoloLens 2 AR glasses. Live US images are then displayed in situ on the imaged anatomy.

METHODS:

The Unity application UltrARsound performs spatial tracking of the US probe and its attached retroreflective markers using the depth camera integrated into the AR glasses, thus eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to improve the noisy measurements of the camera. US images are streamed wirelessly to HoloLens 2 via the PLUS toolkit. The technical evaluation comprises static and dynamic tracking accuracy as well as the frequency and latency of the displayed images.
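The abstract does not give the filter equations or parameters. Purely as an illustration of the kind of smoothing described, and not the authors' implementation, the following Python sketch applies a linear Kalman filter with a constant-velocity motion model to noisy 3D marker positions; the class name, time step, and noise covariances are assumptions chosen for readability.

import numpy as np

class MarkerKalmanFilter:
    """Smooths noisy 3D marker positions (e.g. from a depth camera) with a
    constant-velocity Kalman filter. All noise parameters are illustrative."""

    def __init__(self, dt=0.05, process_noise=1e-3, measurement_noise=4e-6):
        # State vector: [x, y, z, vx, vy, vz]; measurements are 3D positions in metres.
        self.x = np.zeros(6)                                  # state estimate
        self.P = np.eye(6)                                    # state covariance
        self.F = np.eye(6)                                    # constant-velocity transition
        self.F[:3, 3:] = dt * np.eye(3)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])     # only position is observed
        self.Q = process_noise * np.eye(6)                    # process noise covariance
        self.R = measurement_noise * np.eye(3)                # measurement noise covariance

    def update(self, z):
        # Predict step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct step with the new position measurement z (shape (3,)).
        y = z - self.H @ self.x                               # innovation
        S = self.H @ self.P @ self.H.T + self.R               # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)              # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                                     # filtered 3D position

# Example: feed a stream of noisy marker positions, one per depth-camera frame at ~20 Hz.
kf = MarkerKalmanFilter(dt=1 / 20)
for z in np.random.normal([0.1, 0.2, 0.5], 0.002, size=(100, 3)):
    filtered = kf.update(z)

A constant-velocity model is only one plausible choice for a hand-held probe; the paper itself should be consulted for the actual state model and tuning.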

RESULTS:

Tracking is performed with a median accuracy of 1.98 mm/1.81° for the static setting when using the Kalman filter. In a dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz. 83% of the displayed US images had a latency lower than 16 ms.

CONCLUSIONS:

In this work, we showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving a promising accuracy for in situ visualization of live US images. Tracking requires neither additional hardware nor modifications to HoloLens 2, making it an inexpensive and easy-to-use approach. Moreover, the minimal latency of the displayed images enables real-time perception for the sonographer.

Full text: 1 Database: MEDLINE Main subject: Cross-Sectional Studies Language: English Year of publication: 2022 Document type: Article