Results 1 - 5 of 5
1.
Int J Med Robot; 17(5): e2290, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34060214

ABSTRACT

BACKGROUND: User interfaces play a vital role in the planning and execution of an interventional procedure. The objective of this study is to investigate the effect of using different user interfaces for planning transrectal robot-assisted MR-guided prostate biopsy (MRgPBx) in an augmented reality (AR) environment. METHOD: End-user studies were conducted by simulating an MRgPBx system with end- and side-firing modes. The information from the system to the operator was rendered on a HoloLens as the output interface. A joystick, mouse/keyboard, and holographic menus were used as input interfaces to the system. RESULTS: The studies indicated that using a joystick improved the interactive capacity and enabled the operator to plan MRgPBx in less time. It efficiently captures the operator's commands to manipulate the augmented environment representing the state of the MRgPBx system. CONCLUSIONS: The study demonstrates an alternative to conventional input interfaces for interacting with and manipulating an AR environment within the context of MRgPBx planning.
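
As a rough illustration of the joystick input mode described in this abstract, the sketch below maps joystick axis deflections to incremental adjustments of a planned target point in the augmented scene. It is a minimal, hypothetical example; the gain, dead zone, and axis assignments are assumptions and are not taken from the study.

    # Hypothetical mapping from joystick deflections to target-point adjustments.
    # Gain, dead zone, and axis order are illustrative assumptions.
    def joystick_to_target_update(target_xyz, axes, gain_mm=0.5, dead_zone=0.1):
        """axes: (x_axis, y_axis, z_axis) deflections in [-1, 1]."""
        updated = list(target_xyz)
        for i, deflection in enumerate(axes):
            if abs(deflection) > dead_zone:          # ignore small stick noise
                updated[i] += gain_mm * deflection   # incremental move in mm
        return tuple(updated)

    # Example: half deflection on the x axis nudges the target by 0.25 mm.
    print(joystick_to_target_update((10.0, 20.0, 30.0), (0.5, 0.0, 0.0)))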


Subject(s)
Augmented Reality; Surgery, Computer-Assisted; Biopsy; Humans; Magnetic Resonance Imaging; Male; Prostate/surgery
2.
J Digit Imaging; 34(4): 1014-1025, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34027587

ABSTRACT

The recent introduction of wireless head-mounted displays (HMDs) promises to enhance 3D image visualization by immersing the user into the 3D morphology. This work introduces a prototype holographic augmented reality (HAR) interface for 3D visualization of magnetic resonance imaging (MRI) data for the purpose of planning neurosurgical procedures. The computational platform generates a HAR scene that fuses pre-operative MRI sets, segmented anatomical structures, and a tubular tool for planning an access path to the targeted pathology. The operator can manipulate the presented images and segmented structures and perform path planning using voice and gestures. On the fly, the software uses defined forbidden regions to prevent the operator from harming vital structures. In silico studies using the platform with a HoloLens HMD assessed its functionality and measured the computational load and memory for different tasks. A preliminary qualitative evaluation revealed that holographic visualization of high-resolution 3D MRI data offers an intuitive and interactive perspective of the complex brain vasculature and anatomical structures. This initial work suggests that immersive experiences may be an unparalleled tool for planning neurosurgical procedures.
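
To make the forbidden-region idea concrete, the sketch below rejects a straight access path (entry to target) that passes within a safety radius of any structure marked as forbidden. Modeling the regions as spheres is an illustrative assumption, not the representation used by the platform.

    # Hypothetical forbidden-region check for a straight access path.
    # Forbidden regions are simplified to spheres: (center_xyz, radius).
    import numpy as np

    def segment_point_distance(a, b, p):
        """Shortest distance from point p to the line segment a-b."""
        a, b, p = map(np.asarray, (a, b, p))
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        return np.linalg.norm(p - (a + t * ab))

    def path_is_safe(entry, target, forbidden_regions):
        """Return True only if the path clears every forbidden sphere."""
        return all(segment_point_distance(entry, target, c) > r
                   for c, r in forbidden_regions)

    # Example: a path that grazes a 5 mm forbidden sphere around a vessel.
    print(path_is_safe((0, 0, 0), (0, 0, 100), [((0, 3, 50), 5.0)]))  # False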


Subject(s)
Augmented Reality; Holography; Surgery, Computer-Assisted; Imaging, Three-Dimensional; Magnetic Resonance Imaging; Neurosurgical Procedures; Software; User-Computer Interface
3.
Int J Med Robot; 17(1): 1-12, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33047863

ABSTRACT

BACKGROUND: This study presents user evaluation studies to assess the effect of information rendered by interventional planning software on the operator's ability to plan transrectal magnetic resonance (MR)-guided prostate biopsies using actuated robotic manipulators. METHODS: Intervention planning software was developed based on the clinical workflow followed for MR-guided transrectal prostate biopsies. The software was designed to interface with a generic virtual manipulator and simulate an intervention environment using 2D and 3D scenes. User studies were conducted with urologists who used the developed software to plan virtual biopsies. RESULTS: The user studies demonstrated that urologists with prior experience in using 3D software completed the planning in less time. 3D scenes were required to control all degrees of freedom of the manipulator, while 2D scenes were sufficient for planar motion of the manipulator. CONCLUSIONS: The study provides insights into using 2D versus 3D environments, from a urologist's perspective, for different operational modes of MR-guided prostate biopsy systems.
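
The 2D-versus-3D finding can be pictured as a constraint on which degrees of freedom each scene exposes to the operator. The sketch below is a hypothetical illustration of that idea; the pose parameters and scene modes are assumptions, not the software's actual interface.

    # Hypothetical gating of manipulator degrees of freedom by scene type:
    # a 2D scene exposes only planar DOFs, a 3D scene exposes all of them.
    from dataclasses import dataclass

    @dataclass
    class ManipulatorPose:
        x: float = 0.0
        y: float = 0.0
        z: float = 0.0
        roll: float = 0.0
        pitch: float = 0.0
        yaw: float = 0.0

    PLANAR_DOFS = {"x", "y", "yaw"}                   # adjustable from a 2D scene
    ALL_DOFS = PLANAR_DOFS | {"z", "roll", "pitch"}   # adjustable from a 3D scene

    def apply_command(pose, scene_mode, **updates):
        allowed = PLANAR_DOFS if scene_mode == "2D" else ALL_DOFS
        for dof, value in updates.items():
            if dof not in allowed:
                raise ValueError(f"{dof} cannot be set from a {scene_mode} scene")
            setattr(pose, dof, value)
        return pose

    # Example: planar adjustment is allowed in 2D; pitch would raise an error.
    pose = apply_command(ManipulatorPose(), "2D", x=5.0, yaw=0.3)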


Subject(s)
Prostatic Neoplasms; Biopsy; Humans; Image-Guided Biopsy; Magnetic Resonance Imaging; Magnetic Resonance Spectroscopy; Male; Prostatic Neoplasms/diagnostic imaging; Software
4.
Comput Methods Programs Biomed; 198: 105779, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33045556

ABSTRACT

BACKGROUND AND OBJECTIVE: Modern imaging scanners produce an ever-growing body of 3D/4D multimodal data that requires image analytics and visualization of fused images, segmentations, and information. For the latter, augmented reality (AR) with head-mounted displays (HMDs) has shown potential. This work describes a framework (FI3D) for interactive immersion with data, integration of image processing and analytics, and rendering and fusion with an AR interface. METHODS: FI3D was designed and endowed with modules to communicate with peripherals, including imaging scanners and HMDs, and to provide computational power for data acquisition and processing. The core of FI3D is deployed to a dedicated computational unit that performs the computationally demanding processes in real time, while the HMD serves as a display output peripheral and, through gestures and voice commands, an input peripheral. FI3D offers dedicated, user-made processing and analysis modules, which users can customize and optimize for a particular workflow while incorporating current or future libraries. RESULTS: The FI3D framework was used to develop a workflow for processing, rendering, and visualizing CINE MRI cardiac sets. In this version, the data were loaded from a remote database, and the endocardium and epicardium of the left ventricle (LV) were segmented using a machine learning model and transmitted to a HoloLens HMD to be visualized in 4D. Performance results show that the system can maintain an image stream of one 512 × 512 image per second and can modify visual properties of the holograms at one update per 16 milliseconds (62.5 Hz), while providing enough resources for the segmentation and surface reconstruction tasks without hindering the HMD. CONCLUSIONS: We provide a system design and framework to be used as a foundation for medical applications that benefit from AR visualization, removing several technical challenges from the development pipeline.
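
The two reported update rates, one 512 × 512 image per second for the stream and one hologram property update per 16 ms, can be sketched as two concurrent loops. This is a minimal, hypothetical sketch; send_frame and update_hologram are placeholder stubs and are not part of the published FI3D API.

    # Hypothetical two-rate loop: slow image streaming plus fast hologram updates.
    import threading
    import time
    import numpy as np

    def send_frame(frame):
        pass  # stand-in for transmitting a reconstructed slice to the HMD

    def update_hologram(**visual_properties):
        pass  # stand-in for sending a hologram property update to the HMD

    def stream_frames(stop_event):
        while not stop_event.is_set():
            frame = np.zeros((512, 512), dtype=np.uint16)  # placeholder CINE MRI slice
            send_frame(frame)
            time.sleep(1.0)                                # ~1 image per second

    def update_hologram_properties(stop_event):
        while not stop_event.is_set():
            update_hologram(opacity=0.8)
            time.sleep(0.016)                              # ~62.5 updates per second

    stop = threading.Event()
    threading.Thread(target=stream_frames, args=(stop,), daemon=True).start()
    threading.Thread(target=update_hologram_properties, args=(stop,), daemon=True).start()
    time.sleep(3.0)   # let both loops run briefly
    stop.set()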


Subject(s)
Augmented Reality; Image Processing, Computer-Assisted; Imaging, Three-Dimensional; Immersion; User-Computer Interface
5.
J Digit Imaging; 32(3): 420-432, 2019 Jun.
Article in English | MEDLINE | ID: mdl-30483988

ABSTRACT

This work presents a platform that integrates a customized MRI data acquisition scheme with reconstruction and three-dimensional (3D) visualization modules, along with a module for controlling an MRI-compatible robotic device, to facilitate the performance of robot-assisted, MRI-guided interventional procedures. Using dynamically acquired MRI data, the computational framework of the platform generates and updates a 3D model representing the area of the procedure (AoP). To image structures of interest in the AoP that do not reside in the same or parallel slices, the MRI acquisition scheme was modified to collect a multi-slice set of slices that are intraoblique to each other, termed composing slices. Moreover, this approach interleaves the collection of the composing slices so that the same k-space segments of all slices are collected at similar time instances. This time matching of the k-space segments results in spatial matching of the imaged objects in the individual composing slices. The composing slices were used to generate and update the 3D model of the AoP. The MRI acquisition scheme was evaluated with computer simulations and experimental studies. Computer simulations demonstrated that k-space segmentation and time-matched interleaved acquisition of these segments provide spatial matching of the structures imaged with composing slices. Experimental studies used the platform to image the maneuvering of an MRI-compatible manipulator that carried tubing filled with MRI contrast agent. In vivo studies imaging the abdomen and the contrast-enhanced heart of free-breathing subjects without cardiac triggering demonstrated spatial matching of the imaged anatomies in the composing planes. The described interventional MRI framework could assist in performing real-time MRI-guided interventions.
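
The interleaving described here amounts to looping over k-space segments in the outer loop and over composing slices in the inner loop, so the same segment of every slice is collected at nearly the same time. The sketch below generates that ordering; the function name and parameters are illustrative and not taken from the pulse-sequence implementation.

    # Hypothetical ordering for time-matched, interleaved k-space segment acquisition.
    def interleaved_acquisition_order(n_slices, n_segments):
        """Return (segment_index, slice_index) pairs in acquisition order."""
        order = []
        for segment in range(n_segments):        # outer loop: k-space segment
            for slice_idx in range(n_slices):    # inner loop: composing slice
                order.append((segment, slice_idx))
        return order

    # Example: 3 intraoblique composing slices, k-space split into 4 segments.
    for segment, slice_idx in interleaved_acquisition_order(3, 4):
        print(f"acquire segment {segment} of slice {slice_idx}")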


Subject(s)
Image Processing, Computer-Assisted/methods; Imaging, Three-Dimensional; Magnetic Resonance Imaging, Interventional; Robotics/instrumentation; Abdomen/diagnostic imaging; Computer Simulation; Contrast Media; Humans