Results 1 - 4 of 4
1.
Int J Comput Assist Radiol Surg; 15(3): 545-553, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31520326

ABSTRACT

PURPOSE: MRI-guided interventions allow minimally invasive, radiation-free treatment but rely on real-time image data and free slice positioning. Interventional interaction with the data and the MRI scanner is cumbersome due to the diagnostic focus of current systems, the confined space and sterile conditions.
METHODS: We present a touchless, hand-gesture-based interaction concept to control functions of the MRI scanner that are typically used during MRI-guided interventions. The system consists of a hand-gesture sensor customised for MRI compatibility and a specialised UI developed based on clinical needs. A user study with 10 radiologists was performed to compare the gesture interaction concept and its components to task delegation, the prevalent method in clinical practice.
RESULTS: Both methods performed comparably in terms of task duration and subjective workload. Subjective performance with gesture input was perceived as worse than with task delegation, but gesture input was rated acceptable in terms of usability while task delegation was not.
CONCLUSION: This work contributes by (1) providing access to relevant functions of an MRI scanner during percutaneous interventions and (2) doing so in a manner suitable for sterile human-computer interaction. The introduced concept removes the indirect interaction with the scanner via an assistant, leading to comparable subjective workload and task completion times while showing higher perceived usability.
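
The abstract does not describe the scanner interface or the gesture vocabulary in detail, so the following Python sketch is only a rough illustration of the dispatch pattern such a system might use; all names here (ScannerRemote, handle_gesture, the gesture labels) are hypothetical and not taken from the paper.

# Hypothetical sketch: dispatching recognised hand-gesture events to MRI
# scanner functions needed during an intervention. Names and gestures are
# illustrative only, not the interface described in the paper.

class ScannerRemote:
    """Stand-in for a vendor-specific scanner remote-control interface."""

    def start_realtime(self):
        print("scanner: start real-time imaging")

    def stop_realtime(self):
        print("scanner: stop real-time imaging")

    def shift_slice(self, mm):
        print(f"scanner: shift slice position by {mm:+.1f} mm")

    def toggle_overlay(self):
        print("scanner: toggle needle path overlay")


def handle_gesture(scanner, gesture):
    """Map a recognised gesture label to a scanner function; return True if handled."""
    actions = {
        "swipe_up":    lambda: scanner.shift_slice(+5.0),
        "swipe_down":  lambda: scanner.shift_slice(-5.0),
        "open_hand":   scanner.start_realtime,
        "fist":        scanner.stop_realtime,
        "two_fingers": scanner.toggle_overlay,
    }
    action = actions.get(gesture)
    if action is None:
        return False   # unrecognised gesture: ignore rather than guess
    action()
    return True


if __name__ == "__main__":
    remote = ScannerRemote()
    for g in ("open_hand", "swipe_up", "fist", "wave"):
        handle_gesture(remote, g)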


Subject(s)
Gestures; Magnetic Resonance Imaging/methods; User-Computer Interface; Humans
2.
Int J Med Robot; 15(1): e1950, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30168639

ABSTRACT

BACKGROUND: Navigation support in interventional magnetic resonance imaging (MRI) is separated from the operating field, which makes it difficult to interpret positions and orientations and to coordinate the necessary hand movements.
METHODS: We developed a projector-based augmented reality system that enables visual navigation of tracked instruments along pre-planned paths and visualization of risk structures directly on the patient inside the MRI bore. To assess the accuracy of the system, a user study was carried out with clinicians in a needle navigation test scenario.
RESULTS: The targets were reached with an error of 1.7 ± 0.5 mm and the entry points with an error of 1.7 ± 0.8 mm.
CONCLUSION: The accuracy is similar to that reported for live image-guided interventions and related work, confirming that this projective augmented reality prototype for interventional MRI can serve as a platform for current and future research on augmented reality visualization and dynamic registration.
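
The abstract reports accuracy as mean ± standard deviation in millimetres; it does not state the underlying formula, but such figures are commonly the mean and standard deviation of Euclidean distances between planned and reached points. The NumPy sketch below illustrates that computation with invented coordinates; it is not the authors' evaluation code.

# Illustrative "mean ± SD" placement error over several needle insertions.
# All coordinates are invented; this is not the study's evaluation code.
import numpy as np

def placement_error(planned, reached):
    """Return (mean, sample SD) of Euclidean distances in mm between
    planned and actually reached 3D points (one row per insertion)."""
    distances = np.linalg.norm(planned - reached, axis=1)
    return distances.mean(), distances.std(ddof=1)

planned_targets = np.array([[10.0, 22.0, 31.0],
                            [14.5, 20.1, 29.8],
                            [ 9.8, 25.3, 33.2]])
deviations      = np.array([[ 1.2, -0.8,  0.5],
                            [-0.9,  1.1, -0.7],
                            [ 0.6, -1.3,  1.0]])
reached_targets = planned_targets + deviations

mean_err, sd_err = placement_error(planned_targets, reached_targets)
print(f"target error: {mean_err:.1f} ± {sd_err:.1f} mm")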


Subject(s)
Magnetic Resonance Imaging, Interventional/instrumentation; Magnetic Resonance Imaging, Interventional/methods; Needles; Phantoms, Imaging; Abdomen; Calibration; Equipment Design; Hand/physiology; Humans; Image Enhancement; Image Processing, Computer-Assisted/methods; Movement; Reproducibility of Results; Risk; User-Computer Interface
3.
Healthc Technol Lett; 5(5): 172-176, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30464849

ABSTRACT

During MRI-guided interventions, navigation support is often separated from the operating field on displays, which impedes the interpretation of positions and orientations of instruments inside the patient's body as well as hand-eye coordination. To overcome these issues, projector-based augmented reality can be used to support needle guidance directly in the operating field inside the MRI bore. The authors present two visualisation concepts for needle navigation aids, which were compared in an accuracy and usability study with eight participants, four of whom were experienced radiologists. The results show that both concepts are equally accurate (2.0 ± 0.6 mm and 1.7 ± 0.5 mm), useful and easy to use, with clear visual feedback about the state and success of the needle puncture. For easier clinical applicability, dynamic projection on moving surfaces and organ movement tracking are needed. For now, tests with patients with respiratory arrest are feasible.
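
The abstract does not disclose the projection pipeline; as a minimal sketch of the underlying idea, a calibrated projector can be modelled as a 3x4 projection matrix that maps a tracked 3D point (for example a planned entry point) to the projector pixel at which a navigation cue should be drawn. The matrix and the point below are invented for illustration only.

# Minimal sketch: projecting a planned entry point onto the patient surface
# with a calibrated projector modelled as a 3x4 projection matrix P = K [R|t].
# The calibration matrix and the entry point below are invented.
import numpy as np

P = np.array([[1400.0,    0.0, 640.0,   50.0],
              [   0.0, 1400.0, 400.0,  -30.0],
              [   0.0,    0.0,   1.0,  800.0]])

def project(point_3d):
    """Map a 3D point (tracker coordinates, mm) to projector pixel coordinates."""
    x = P @ np.append(point_3d, 1.0)   # homogeneous projection
    return x[0] / x[2], x[1] / x[2]

entry_point = np.array([12.0, -35.0, 250.0])   # planned entry point, mm
u, v = project(entry_point)
print(f"draw entry-point marker at projector pixel ({u:.0f}, {v:.0f})")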

4.
Int J Comput Assist Radiol Surg; 12(2): 291-305, 2017 Feb.
Article in English | MEDLINE | ID: mdl-27647327

ABSTRACT

PURPOSE: In this article, we systematically examine the current state of research on systems for touchless human-computer interaction in operating rooms and interventional radiology suites. We further discuss the drawbacks of current solutions and highlight promising technologies for future development.
METHODS: A systematic literature search was performed for scientific papers that deal with touchless control of medical software in the immediate environment of the operating room and interventional radiology suite. This includes methods for touchless gesture interaction, voice control and eye tracking.
RESULTS: Fifty-five research papers were identified and analyzed in detail, including 33 journal publications. Most of the identified literature (62%) deals with the control of medical image viewers. The remainder present interaction techniques for laparoscopic assistance (13%), telerobotic assistance and operating room control (9% each), as well as for robotic operating room assistance and intraoperative registration (3.5% each). Only 8 systems (14.5%) were tested in a real clinical environment, and 7 (12.7%) were not evaluated at all.
CONCLUSION: In the last 10 years, many advancements have led to robust touchless interaction approaches. However, only a few have been systematically evaluated in real operating room settings. Further research is required to cope with the current limitations of touchless software interfaces in clinical environments. The main challenges for future research are improving and evaluating the usability and intuitiveness of touchless human-computer interaction, fully integrating it into productive systems, reducing the number of necessary interaction steps and further developing hands-free interaction.
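
The percentages in the results are shares of the 55 identified papers, rounded to one decimal place; the short check below uses only the counts stated explicitly in the abstract (8 clinically tested systems, 7 unevaluated systems, 33 journal publications) and is purely illustrative arithmetic.

# Quick arithmetic check: each percentage is a count out of 55 papers.
total_papers = 55

def pct(count, total=total_papers):
    return round(100.0 * count / total, 1)

print(pct(8))    # 14.5 -> systems tested in a real clinical environment
print(pct(7))    # 12.7 -> systems not evaluated at all
print(pct(33))   # 60.0 -> journal publications among the identified papers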


Subject(s)
Gestures; Software; Surgical Procedures, Operative/methods; User-Computer Interface; Humans; Laparoscopy; Operating Rooms; Radiology, Interventional/methods; Robotic Surgical Procedures; Robotics; Voice