Results 1 - 5 of 5
1.
J Digit Imaging; 34(4): 1014-1025, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34027587

ABSTRACT

The recent introduction of wireless head-mounted displays (HMD) promises to enhance 3D image visualization by immersing the user into 3D morphology. This work introduces a prototype holographic augmented reality (HAR) interface for the 3D visualization of magnetic resonance imaging (MRI) data for planning neurosurgical procedures. The computational platform generates a HAR scene that fuses pre-operative MRI sets, segmented anatomical structures, and a tubular tool for planning an access path to the targeted pathology. The operator can manipulate the presented images and segmented structures and perform path planning using voice and gestures. On the fly, the software uses defined forbidden regions to prevent the operator from harming vital structures. In silico studies using the platform with a HoloLens HMD assessed its functionality, as well as its computational load and memory use, for different tasks. A preliminary qualitative evaluation revealed that holographic visualization of high-resolution 3D MRI data offers an intuitive and interactive perspective of the complex brain vasculature and anatomical structures. This initial work suggests that immersive experiences may be an unparalleled tool for planning neurosurgical procedures.
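The forbidden-region check described above can be illustrated with a short sketch that tests whether a straight access path intersects any forbidden region. This is an assumption-laden illustration, not the paper's implementation: the helper name and the spherical approximation of forbidden regions are invented here for clarity.

```python
import numpy as np

def path_violates_forbidden_regions(entry, target, centers, radii, n_samples=100):
    """Return True if the straight path entry->target enters any forbidden region.

    Regions are approximated as spheres (center, radius) purely for
    illustration; the paper does not specify its region representation.
    """
    entry, target = np.asarray(entry, float), np.asarray(target, float)
    t = np.linspace(0.0, 1.0, n_samples)[:, None]        # (n, 1) path parameters
    points = entry + t * (target - entry)                 # (n, 3) samples along path
    centers = np.asarray(centers, float)                  # (m, 3) region centers
    radii = np.asarray(radii, float)                      # (m,)  region radii
    # Distance of every path sample to every region center.
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    return bool((d < radii[None, :]).any())
```

A planner would run this check after each voice or gesture edit of the path and reject updates that return True.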


Subject(s)
Augmented Reality, Holography, Surgery, Computer-Assisted, Imaging, Three-Dimensional, Magnetic Resonance Imaging, Neurosurgical Procedures, Software, User-Computer Interface
2.
J Digit Imaging; 32(3): 420-432, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30483988

ABSTRACT

This work presents a platform that integrates a customized MRI data acquisition scheme with reconstruction and three-dimensional (3D) visualization modules, along with a module for controlling an MRI-compatible robotic device, to facilitate robot-assisted, MRI-guided interventional procedures. Using dynamically acquired MRI data, the computational framework of the platform generates and updates a 3D model representing the area of the procedure (AoP). To image structures of interest in the AoP that do not reside in the same or parallel slices, the MRI acquisition scheme was modified to collect a multi-slice set of slices oriented oblique to one another, termed composing slices. Moreover, this approach interleaves the collection of the composing slices so that the same k-space segments of all slices are collected at similar time instances. This time matching of the k-space segments results in spatial matching of the imaged objects in the individual composing slices. The composing slices were used to generate and update the 3D model of the AoP. The MRI acquisition scheme was evaluated with computer simulations and experimental studies. Computer simulations demonstrated that k-space segmentation and time-matched interleaved acquisition of these segments provide spatial matching of the structures imaged with composing slices. Experimental studies used the platform to image the maneuvering of an MRI-compatible manipulator that carried tubing filled with MRI contrast agent. In vivo studies imaging the abdomen and contrast-enhanced heart of free-breathing subjects without cardiac triggering demonstrated spatial matching of imaged anatomies in the composing planes. The described interventional MRI framework could assist in performing real-time MRI-guided interventions.
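The interleaving idea above, collecting the same k-space segment of all composing slices at similar times rather than acquiring each slice completely before the next, can be sketched as a segment-major acquisition schedule. This is a hypothetical illustration of the ordering only; the actual pulse-sequence implementation is not given in the abstract.

```python
def interleaved_acquisition_order(n_slices, n_segments):
    """Return a (slice, segment) acquisition schedule that is segment-major:
    segment 0 of every composing slice is acquired first, then segment 1 of
    every slice, and so on, so matching k-space segments are time-matched.
    """
    return [(sl, seg) for seg in range(n_segments) for sl in range(n_slices)]
```

For three composing slices and two k-space segments this yields segment 0 of slices 0, 1, 2 back-to-back, then segment 1 of slices 0, 1, 2, which is what produces the spatial matching described above.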


Subject(s)
Image Processing, Computer-Assisted/methods, Imaging, Three-Dimensional, Magnetic Resonance Imaging, Interventional, Robotics/instrumentation, Abdomen/diagnostic imaging, Computer Simulation, Contrast Media, Humans
3.
Comput Methods Programs Biomed; 198: 105779, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33045556

ABSTRACT

BACKGROUND AND OBJECTIVE: Modern imaging scanners produce an ever-growing body of 3D/4D multimodal data requiring image analytics and visualization of fused images, segmentations, and information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has shown potential. This work describes a framework (FI3D) for interactive immersion with data, integration of image processing and analytics, and rendering and fusion with an AR interface. METHODS: FI3D was designed and endowed with modules to communicate with peripherals, including imaging scanners and HMDs, and to provide computational power for data acquisition and processing. The core of FI3D is deployed to a dedicated computational unit that performs the computationally demanding processes in real time, while the HMD serves as a display output peripheral and as an input peripheral through gestures and voice commands. FI3D supports user-made modules dedicated to processing and analysis, which users can customize and optimize for a particular workflow while incorporating current or future libraries. RESULTS: The FI3D framework was used to develop a workflow for processing, rendering, and visualizing CINE MRI cardiac sets. In this version, the data were loaded from a remote database, and the endocardium and epicardium of the left ventricle (LV) were segmented using a machine learning model and transmitted to a HoloLens HMD to be visualized in 4D. Performance results show that the system maintains an image stream of one 512 × 512 image per second and can modify visual properties of the holograms at one update per 16 milliseconds (62.5 Hz), while providing enough resources for the segmentation and surface reconstruction tasks without hindering the HMD.
CONCLUSIONS: We provide a system design and framework to serve as a foundation for medical applications that benefit from AR visualization, removing several technical challenges from the development pipeline.
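The decoupling described above, where a slow image stream (about one frame per second) feeds a much faster hologram-update loop (62.5 Hz), can be sketched with a single-slot latest-value buffer. This is an illustrative sketch only; the class name is invented here and FI3D's actual concurrency mechanisms are not described in the abstract.

```python
import threading

class LatestFrame:
    """Single-slot, thread-safe buffer: a slow producer publishes frames and a
    fast consumer always reads the most recent one, never blocking on the
    producer's rate. Older frames are simply overwritten.
    """
    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def publish(self, frame):
        # Called by the acquisition/segmentation pipeline (~1 Hz).
        with self._lock:
            self._frame = frame

    def latest(self):
        # Called by the hologram-update loop (~62.5 Hz).
        with self._lock:
            return self._frame
```

Because the render loop only reads the latest slot, hologram property updates stay at display rate even while segmentation and surface reconstruction lag behind.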


Subject(s)
Augmented Reality, Image Processing, Computer-Assisted, Imaging, Three-Dimensional, Immersion, User-Computer Interface
4.
Int J Med Robot; 17(1): 1-12, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33047863

ABSTRACT

BACKGROUND: This study presents user evaluation studies assessing the effect of information rendered by interventional planning software on the operator's ability to plan transrectal magnetic resonance (MR)-guided prostate biopsies using actuated robotic manipulators. METHODS: Intervention planning software was developed based on the clinical workflow followed for MR-guided transrectal prostate biopsies. The software was designed to interface with a generic virtual manipulator and simulate an intervention environment using 2D and 3D scenes. User studies were conducted with urologists using the developed software to plan virtual biopsies. RESULTS: The user studies demonstrated that urologists with prior experience using 3D software completed the planning in less time. 3D scenes were required to control all degrees of freedom of the manipulator, while 2D scenes were sufficient for planar motion of the manipulator. CONCLUSIONS: The study provides insights, from a urologist's perspective, on using 2D versus 3D environments for different operational modes of MR-guided prostate biopsy systems.


Subject(s)
Prostatic Neoplasms, Biopsy, Humans, Image-Guided Biopsy, Magnetic Resonance Imaging, Magnetic Resonance Spectroscopy, Male, Prostatic Neoplasms/diagnostic imaging, Software
5.
Int J Med Robot; 17(5): e2290, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34060214

ABSTRACT

BACKGROUND: User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot-assisted MR-guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment. METHODS: End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes. The information from the system to the operator was rendered on a HoloLens as the output interface. A joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system. RESULTS: The studies indicated that using a joystick improved interactive capacity and enabled the operator to plan MRgPBx in less time; it efficiently captures the operator's commands to manipulate the augmented environment representing the state of the MRgPBx system. CONCLUSIONS: The study demonstrates an alternative to conventional input interfaces for interacting with and manipulating an AR environment in the context of MRgPBx planning.


Subject(s)
Augmented Reality, Surgery, Computer-Assisted, Biopsy, Humans, Magnetic Resonance Imaging, Male, Prostate/surgery