Results 1 - 7 of 7
1.
Article in English | MEDLINE | ID: mdl-38708142

ABSTRACT

Biopsies play a crucial role in the diagnosis of various diseases, including cancer. In this study, we developed an augmented reality (AR) system to improve biopsy procedures and increase targeting accuracy. Our AR-guided biopsy system uses high-speed motion tracking technology and an AR headset to display a holographic representation of the organ, lesions, and other structures of interest superimposed on real physical objects. The first application of our AR system is prostate biopsy. By incorporating preoperative scans, such as computed tomography (CT) or magnetic resonance imaging (MRI), into real-time ultrasound-guided procedures, the AR-guided system enables clinicians to see the lesion as well as the organs in real time. With this enhanced visualization of the prostate, lesion, and surrounding organs, surgeons can perform prostate biopsies with increased accuracy. Our AR-guided biopsy system yielded an average targeting accuracy of 2.94 ± 1.04 mm and can be applied to real-time guidance of prostate biopsy as well as other biopsy procedures.
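A targeting accuracy such as the 2.94 ± 1.04 mm reported above is commonly summarized as the mean and standard deviation of the Euclidean distance between each planned target and the corresponding tracked needle-tip position. The abstract does not state its exact computation, so the following is a minimal sketch under that assumption; the coordinate arrays are hypothetical.

```python
import numpy as np

def targeting_accuracy(planned, reached):
    """Mean and standard deviation of the Euclidean distances (mm)
    between planned target points and tracked needle-tip positions.

    planned, reached: (N, 3) arrays of x, y, z coordinates in mm.
    """
    planned = np.asarray(planned, dtype=float)
    reached = np.asarray(reached, dtype=float)
    errors = np.linalg.norm(planned - reached, axis=1)  # per-target error
    return errors.mean(), errors.std()

# Hypothetical example: three targets in a phantom
planned = [[10.0, 22.5, 5.0], [14.2, 18.0, 7.5], [9.8, 25.1, 6.2]]
reached = [[11.5, 23.0, 6.1], [13.0, 19.4, 8.8], [10.9, 27.0, 5.0]]
mean_err, std_err = targeting_accuracy(planned, reached)
print(f"targeting accuracy: {mean_err:.2f} ± {std_err:.2f} mm")
```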

2.
Article in English | MEDLINE | ID: mdl-38708143

ABSTRACT

While minimally invasive laparoscopic surgery can reduce blood loss, shorten hospital stays, and speed recovery compared to open surgery, it has the disadvantages of a limited field of view and difficulty in locating subsurface targets. Our proposed solution applies an augmented reality (AR) system to overlay pre-operative images, such as those from magnetic resonance imaging (MRI), onto the target organ in the user's real-world environment. Our system can provide critical information about the location of subsurface lesions to guide surgical procedures in real time. An infrared motion tracking camera system was employed to obtain real-time position data for the patient and surgical instruments. To perform hologram registration, fiducial markers were used to track and map virtual coordinates to the real world. In this study, phantom models of each organ were constructed to test the reliability and accuracy of the AR-guided laparoscopic system. Root mean square error (RMSE) was used to evaluate the targeting accuracy of the laparoscopic interventional procedure. Our results demonstrated a registration error of 2.42 ± 0.79 mm and a procedural targeting error of 4.17 ± 1.63 mm using our AR-guided laparoscopic system, which will be further refined for potential clinical use.
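The abstract describes registration of fiducial markers from virtual (image) coordinates to tracked real-world coordinates, with the residual reported as an RMSE. The paper does not specify its registration algorithm; a standard point-based approach is a rigid least-squares (Kabsch/SVD) fit, sketched below with hypothetical marker coordinates.

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst
    (Kabsch algorithm). src, dst: (N, 3) corresponding fiducial positions."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                           # proper rotation (det = +1)
    t = dst_c - R @ src_c
    return R, t

def rmse(src, dst, R, t):
    """Root-mean-square distance (mm) between transformed src and dst."""
    residuals = (np.asarray(src, float) @ R.T + t) - np.asarray(dst, float)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())

# Hypothetical fiducials: image-space vs. tracked real-world positions (mm)
image_pts   = [[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]]
tracked_pts = [[120.4, 80.1, 30.2], [170.0, 81.0, 29.5],
               [119.8, 130.6, 31.1], [121.2, 79.4, 80.0]]
R, t = rigid_fit(image_pts, tracked_pts)
print(f"fiducial registration error (RMSE): {rmse(image_pts, tracked_pts, R, t):.2f} mm")
```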

3.
Article in English | MEDLINE | ID: mdl-38708144

ABSTRACT

Neuroblastoma is the most common extracranial solid tumor in children and can often result in death if not treated. High-intensity focused ultrasound (HIFU) is a non-invasive technique for treating tissue deep within the body. It avoids ionizing radiation and thus the long-term side effects associated with radiation-based treatments. The goal of this project was to develop the rendering component of an augmented reality (AR) system with potential applications in image-guided HIFU treatment of neuroblastoma. Our project focuses on taking 3D models of neuroblastoma lesions obtained from PET/CT and displaying them in our AR system in near real time for use by physicians. We used volume ray casting with raster graphics as our preferred rendering method, as it allows real-time editing of our 3D radiologic data. Unique features of our AR system include intuitive hand gestures and virtual user interfaces that allow the user to interact with the rendered data and process PET/CT images for optimal visualization. We implemented features for setting a custom transfer function, defining custom intensity cutoff points, and extracting regions of interest via cutting planes. In the future, we hope to incorporate this work into a complete system for focused ultrasound treatment by adding ultrasound simulation, visualization, and deformable registration.
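The abstract mentions a user-defined transfer function and intensity cutoff points for rendering the PET/CT volume. The rendering itself runs on the AR headset and is not reproduced here; the core idea, mapping voxel intensity to color and opacity, with values outside the cutoff window made fully transparent, can be sketched as follows. The array shapes and the linear opacity ramp are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def transfer_function(volume, lo, hi, color=(1.0, 0.4, 0.2)):
    """Map a scalar volume to RGBA values for volume ray casting.

    volume : 3-D array of voxel intensities (e.g. CT HU or PET SUV).
    lo, hi : intensity cutoff points; voxels outside are fully transparent.
    color  : base RGB assigned to visible voxels.
    Returns an array of shape volume.shape + (4,) with values in [0, 1].
    """
    v = np.asarray(volume, dtype=float)
    alpha = np.clip((v - lo) / (hi - lo), 0.0, 1.0)   # linear opacity ramp
    alpha[(v < lo) | (v > hi)] = 0.0                  # hard intensity cutoffs
    rgba = np.empty(v.shape + (4,))
    rgba[..., :3] = color
    rgba[..., 3] = alpha
    return rgba

# Hypothetical 64^3 test volume with intensities in [0, 100]
vol = np.random.default_rng(0).uniform(0, 100, size=(64, 64, 64))
rgba = transfer_function(vol, lo=40, hi=80)
print(rgba.shape, rgba[..., 3].max())
```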

4.
Article in English | MEDLINE | ID: mdl-38560640

ABSTRACT

Augmented reality (AR) is becoming an increasingly common addition to physicians' repertoire for resident training and patient interactions. However, the use of AR in clinical settings is still beset by complications, including limited physician control over the systems, fixed modes of interaction, and physicians' lack of familiarity with such systems. In this paper, we expand on our previous prostate biopsy AR system by adding improved user interfaces within the virtual environment, allowing the user to visualize only the parts of the scene they consider useful at a given time. To accomplish this, we incorporated three-dimensional virtual sliders, built from the ground up in Unity, that afford control over each model's RGB values as well as its transparency. The user can therefore quickly and easily edit the color and transparency of each individual model in real time while remaining immersed in the augmented space. This allows users to view internal holograms without sacrificing the ability to view the external structure. Such flexibility could be invaluable when visualizing a tumor within the prostate, giving the physician the ability to view as much or as little of the surrounding virtual models as desired and to reinstate them at will. The AR system provides a new approach for potential use in image-guided interventions, including targeted biopsy of the prostate.
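The sliders described above are implemented in Unity, which is not reproduced here. The sketch below only illustrates, in Python, the underlying idea of mapping normalized slider values to a per-model RGBA state that a renderer could consume; all names and values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ModelAppearance:
    """Per-hologram color/transparency state driven by four virtual sliders."""
    r: float = 1.0
    g: float = 1.0
    b: float = 1.0
    alpha: float = 1.0  # 0 = fully transparent, 1 = opaque

    def set_slider(self, channel: str, value: float) -> None:
        """Update one channel from a slider position normalized to [0, 1]."""
        value = min(max(value, 0.0), 1.0)
        setattr(self, channel, value)

# Hypothetical use: fade the prostate surface so an interior lesion stays visible
appearance = {"prostate": ModelAppearance(),
              "lesion": ModelAppearance(r=1.0, g=0.2, b=0.2)}
appearance["prostate"].set_slider("alpha", 0.25)
print(appearance["prostate"])
```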

5.
Article in English | MEDLINE | ID: mdl-36793657

ABSTRACT

Ultrasound-guided biopsy is widely used for disease detection and diagnosis. We plan to register preoperative imaging, such as positron emission tomography / computed tomography (PET/CT) and/or magnetic resonance imaging (MRI), with real-time intraoperative ultrasound imaging for improved localization of suspicious lesions that may not be seen on ultrasound but are visible on other imaging modalities. Once image registration is completed, we will combine the images from two or more modalities and use a Microsoft HoloLens 2 augmented reality (AR) headset to display three-dimensional (3D) segmented lesions and organs from previously acquired images together with real-time ultrasound images. In this work, we are developing a multi-modal, 3D augmented reality system for potential use in ultrasound-guided prostate biopsy. Preliminary results demonstrate the feasibility of combining images from multiple modalities into an AR-guided system.
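Once the preoperative PET/CT or MRI is registered to the intraoperative ultrasound frame, segmented lesion geometry can be carried into the AR scene by applying the resulting transform to its vertices. The abstract does not detail this step; the sketch below assumes a 4 × 4 homogeneous rigid transform has already been estimated (e.g. from fiducials, as in the registration sketch above), and the transform and point arrays are hypothetical.

```python
import numpy as np

def apply_transform(T, points):
    """Map (N, 3) points from preoperative image space into the
    ultrasound/AR frame using a 4x4 homogeneous transform T."""
    pts = np.asarray(points, dtype=float)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))])  # (N, 4) homogeneous
    return (homog @ T.T)[:, :3]

# Hypothetical transform: rotate 90 degrees about z, translate by (100, 20, -5) mm
T = np.array([[0.0, -1.0, 0.0, 100.0],
              [1.0,  0.0, 0.0,  20.0],
              [0.0,  0.0, 1.0,  -5.0],
              [0.0,  0.0, 0.0,   1.0]])
lesion_vertices_mri = np.array([[10.0, 5.0, 2.0], [12.0, 6.5, 3.0]])
lesion_vertices_us = apply_transform(T, lesion_vertices_mri)
print(lesion_vertices_us)
```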

6.
Article in English | MEDLINE | ID: mdl-35177877

ABSTRACT

Cardiac catheterization is a delicate procedure used in many cardiac interventions. However, it carries a myriad of risks, including damage to the vessel or the heart itself, blood clots, and arrhythmias. Many of these risks become more likely as the length of the operation increases, creating a demand for procedures that are both more accurate and shorter. To this end, we developed an adaptable virtual reality simulation and visualization method to provide essential information to the physician ahead of time, with the goals of reducing potential risks, decreasing operation time, and improving the accuracy of cardiac catheterization procedures. We additionally conducted a phantom study to evaluate the impact of using our virtual reality system prior to a procedure.

7.
Article in English | MEDLINE | ID: mdl-32528216

ABSTRACT

Guided biopsy of soft tissue lesions can be challenging in the presence of sensitive organs or when the lesion itself is small. Computed tomography (CT) is the modality most frequently used to target soft tissue lesions. To aid physicians, small field-of-view (FOV), low-dose, non-contrast CT volumes are acquired prior to the intervention, while the patient is on the procedure table, to localize the lesion and plan the best approach. However, patient motion between the end of the scan and the start of the biopsy can make it difficult for the physician to translate the lesion location from the CT onto the patient's body, especially for a deep-seated lesion. In addition, the needle trajectory must be carefully managed in three dimensions to reach the lesion while avoiding vital structures, which is especially challenging for less experienced interventionists. These challenges usually result in multiple additional image acquisitions during the procedure to ensure accurate needle placement, especially when multiple core biopsies are required. In this work, we present an augmented reality (AR)-guided biopsy system and procedure for soft tissue and lung lesions and quantify the results using a phantom study. For soft tissue lesions, we found an average error of 0.75 cm from the center of the lesion when AR guidance was used, compared to 1.52 cm for unguided biopsy. For lung lesions, the average error was 0.62 cm from the center of the tumor with AR guidance versus 1.12 cm with unguided biopsy. The AR-guided system improves accuracy and could be useful in clinical applications.
