Results 1 - 4 of 4
1.
Article in English | MEDLINE | ID: mdl-39162975

ABSTRACT

PURPOSE: The operating microscope plays a central role in middle and inner ear procedures, which involve working within tightly confined spaces under limited exposure. Augmented reality (AR) may improve surgical guidance by combining preoperative computed tomography (CT) imaging, which provides precise anatomical information, with the intraoperative microscope video feed. With current technology, the operator must interact with the AR interface manually at a computer, which disrupts the surgical flow and is suboptimal for maintaining the sterility of the operating environment. The purpose of this study was to implement and evaluate free-hand interaction concepts leveraging hand tracking and gesture recognition, aiming to reduce disruption during surgery and improve human-computer interaction.

METHODS: An electromagnetically tracked surgical microscope was calibrated using a custom 3D-printed calibration board. This allowed the augmentation of the microscope feed with virtual models segmented from preoperative CT. Ultraleap's Leap Motion Controller 2 was coupled to the microscope and used to implement hand-tracking capabilities. End-user feedback was gathered from a surgeon during development. Finally, users were asked to complete tasks that involved interacting with the virtual models, aligning them to physical targets, and adjusting the AR visualization.

RESULTS: Following observations and user feedback, we upgraded the functionalities of the hand-interaction system. User feedback showed a preference for the new interaction concepts, which minimized disruption of the surgical workflow and made interaction with the virtual content more intuitive.

CONCLUSION: We integrated hand-interaction concepts, typically used with head-mounted displays (HMDs), into a surgical stereo microscope system intended for AR in otologic microsurgery. The concepts presented in this study demonstrated a more favorable approach to human-computer interaction in a surgical context. They hold potential for more efficient execution of surgical tasks under microscopic AR guidance.
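Gesture recognition of the kind described above typically reduces to geometric tests on tracked fingertip positions. The sketch below shows one common example, pinch detection, on hypothetical (x, y, z) fingertip coordinates in millimeters; the actual Leap Motion Controller 2 API, gesture set, and thresholds used in the study are not given in the abstract, so all names and values here are illustrative assumptions.

```python
import math

# Assumed activation distance in mm; not a parameter from the paper.
PINCH_THRESHOLD_MM = 25.0

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_MM):
    """Report a pinch gesture when thumb and index fingertips are close."""
    return distance(thumb_tip, index_tip) < threshold

# Fingertips 10 mm apart register as a pinch; 60 mm apart do not.
print(is_pinching((0, 0, 0), (10, 0, 0)))  # True
print(is_pinching((0, 0, 0), (60, 0, 0)))  # False
```

In a real system, a pinch event like this would be mapped to grabbing and moving a virtual model, with hysteresis or debouncing added to avoid flicker at the threshold.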

2.
Plast Reconstr Surg Glob Open ; 12(7): e5940, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38957720

ABSTRACT

We introduce a novel technique using augmented reality (AR) on smartphones and tablets, making it possible for surgeons to review perforator anatomy in three dimensions on the go. Autologous breast reconstruction with abdominal flaps remains challenging due to the highly variable anatomy of the deep inferior epigastric artery. Computed tomography angiography has mitigated some, but not all, of these challenges. Previously, volume rendering and various headsets were used to enable better three-dimensional (3D) review for surgeons; however, surgeons have depended on others to provide 3D imaging data. Leveraging the ubiquity of Apple devices, our approach permits surgeons to review 3D models of deep inferior epigastric artery anatomy, segmented from abdominal computed tomography angiography, directly on their iPhone/iPad. Segmentation can be performed in common radiology software. The models are converted to the universal scene description zipped (USDZ) format, which allows immediate use on Apple devices without third-party software. They can be easily shared using secure, Health Insurance Portability and Accountability Act-compliant sharing services already provided by most hospitals. Surgeons can simply open the file on their mobile device to explore the images in 3D using "object mode" natively, without additional applications, or can switch to AR mode to pin the model in their real-world surroundings for intuitive exploration. We believe patient-specific 3D anatomy models are a powerful tool for intuitive understanding and communication of complex perforator anatomy and would be a valuable addition to routine clinical practice and education. Using this one-click solution on existing devices that is simple to implement, we hope to streamline the adoption of AR models by plastic surgeons.
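The pipeline above ends with a USD-family file that Apple devices can open natively. As a rough illustration of what sits inside such a file, the sketch below serializes a triangle mesh into a minimal ASCII USD (`.usda`) document using only the standard library. This is an assumption-laden simplification: real clinical models come from radiology segmentation software, and producing a shareable `.usdz` package requires USD tooling (e.g., Pixar's usdzip) rather than hand-written text.

```python
def mesh_to_usda(points, face_indices, name="Anatomy"):
    """Serialize a triangle mesh into a minimal .usda document string.

    points: list of (x, y, z) vertex tuples.
    face_indices: flat list of vertex indices, three per triangle.
    """
    pts = ", ".join(f"({x}, {y}, {z})" for x, y, z in points)
    counts = ", ".join("3" for _ in range(len(face_indices) // 3))
    idx = ", ".join(str(i) for i in face_indices)
    return (
        "#usda 1.0\n"
        f'def Mesh "{name}"\n'
        "{\n"
        f"    point3f[] points = [{pts}]\n"
        f"    int[] faceVertexCounts = [{counts}]\n"
        f"    int[] faceVertexIndices = [{idx}]\n"
        "}\n"
    )

# Smoke test: a single triangle.
doc = mesh_to_usda([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [0, 1, 2])
print(doc.startswith("#usda 1.0"))  # True
```

In practice, segmented meshes are exported from the radiology software and converted with established USD tools; the point here is only that the underlying format is a plain, inspectable scene description.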

3.
Surg Innov ; : 15533506241262946, 2024 Jun 21.
Article in English | MEDLINE | ID: mdl-38905568

ABSTRACT

Plastic surgeons routinely use 3D models in their clinical practice, from 3D photography and surface imaging to 3D segmentations from radiological scans. However, these models continue to be viewed on flat 2D screens, which do not enable an intuitive understanding of 3D relationships and create challenges for collaboration with colleagues. The Metaverse has been proposed as a new generation of applications building on modern mixed reality headset technology that allows remote, real-time collaboration on virtual 3D models in a shared physical-virtual space. We demonstrate the first use of the Metaverse in the context of reconstructive surgery, focusing on preoperative planning discussions and trainee education. Using a HoloLens headset with the Microsoft Mesh application, we performed planning sessions for 4 deep inferior epigastric perforator (DIEP) flaps in our reconstructive metaverse on virtual patient models segmented from routine CT angiography. In these sessions, surgeons discuss perforator anatomy and perforator selection strategies while comprehensively assessing the respective models. We demonstrate the workflow for a one-on-one interaction between an attending surgeon and a trainee in a video featuring both viewpoints as seen through the headset. We believe the Metaverse will provide novel opportunities to use the 3D models already created in everyday plastic surgery practice in a more collaborative, immersive, accessible, and educational manner.

4.
Plast Reconstr Surg ; 2024 Feb 14.
Article in English | MEDLINE | ID: mdl-38351515

ABSTRACT

Preoperative CT angiography (CTA) is increasingly performed prior to perforator flap-based reconstruction. However, radiological 2D thin slices do not allow for intuitive interpretation and translation to intraoperative findings. 3D volume rendering has been used to alleviate the need for mental 2D-to-3D abstraction. Even though volume rendering allows for a much easier understanding of anatomy, it currently has limited utility because the skin obstructs the view of critical structures. Using free, open-source software, we introduce a new skin-masking technique that allows surgeons to easily create a segmentation mask of the skin, which can later be used to toggle the skin on and off. Additionally, the mask can be used in other rendering applications. We use Cinematic Anatomy for photorealistic volume rendering and interactive exploration of the CTA with and without skin. We present results from using this technique to investigate perforator anatomy in deep inferior epigastric perforator flaps and demonstrate that the skin-masking workflow takes less than 5 minutes. In Cinematic Anatomy, the view onto the abdominal wall, and especially onto perforators, becomes significantly sharper and more detailed when no longer obstructed by the skin. We perform a virtual, partial muscle dissection to show the intramuscular and submuscular course of the perforators. The skin-masking workflow allows surgeons to improve arterial and perforator detail in volume renderings easily and quickly by removing skin, and it could alternatively be performed solely with free, open-source software. The workflow can be easily expanded to other perforator flaps without modification.
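At its core, a skin mask of the kind described above separates the thin band of tissue at the air-body boundary from the rest of the volume. The toy sketch below applies this idea to a single row of CT values in Hounsfield units (HU): the outermost tissue voxels on each side are flagged as skin so a renderer could exclude them. The thresholds, peel depth, and one-dimensional simplification are all illustrative assumptions; the article's actual segmentation is performed in dedicated software on full 3D volumes.

```python
AIR_HU = -1000             # typical HU value for air
SOFT_TISSUE_MIN_HU = -200  # assumed lower bound for skin/soft tissue

def skin_mask_row(row_hu, peel=1):
    """Mark the outermost `peel` tissue voxels on each side of one CT row.

    Returns a boolean mask the same length as row_hu; True marks voxels
    that would be toggled off to 'remove' the skin in a rendering.
    """
    mask = [False] * len(row_hu)
    tissue_idx = [i for i, hu in enumerate(row_hu) if hu > SOFT_TISSUE_MIN_HU]
    if not tissue_idx:
        return mask  # row is all air; nothing to peel
    # The first and last `peel` tissue voxels sit at the air boundary.
    for i in tissue_idx[:peel] + tissue_idx[-peel:]:
        mask[i] = True
    return mask

# air | skin | fat/muscle | skin | air
row = [AIR_HU, AIR_HU, -50, 40, 60, -50, AIR_HU]
print(skin_mask_row(row))  # [False, False, True, False, False, True, False]
```

A renderer that skips masked voxels would then show the abdominal wall and perforators unobstructed, which is the effect the article reports in Cinematic Anatomy.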
