Results 1-7 of 7
1.
Surg Innov; 31(5): 563-566, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38905568

ABSTRACT

Plastic surgeons routinely use 3D models in their clinical practice, from 3D photography and surface imaging to 3D segmentations from radiological scans. However, these models are still viewed on flat 2D screens, which neither enable an intuitive understanding of 3D relationships nor lend themselves to collaboration with colleagues. The Metaverse has been proposed as a new generation of applications, built on modern mixed-reality headset technology, that allows remote, real-time collaboration on virtual 3D models in a shared physical-virtual space. We demonstrate the first use of the Metaverse in reconstructive surgery, focusing on preoperative planning discussions and trainee education. Using a HoloLens headset with the Microsoft Mesh application, we performed planning sessions for 4 DIEP flaps in our reconstructive metaverse, using virtual patient models segmented from routine CT angiography. In these sessions, surgeons discussed perforator anatomy and perforator-selection strategies while comprehensively assessing the respective models. We demonstrate the workflow for a one-on-one interaction between an attending surgeon and a trainee in a video featuring both viewpoints as seen through the headset. We believe the Metaverse will provide novel opportunities to use the 3D models already created in everyday plastic surgery practice in a more collaborative, immersive, accessible, and educational manner.


Subjects
Three-Dimensional Imaging, Microsurgery, Plastic Surgery Procedures, Humans, Plastic Surgery Procedures/education, Plastic Surgery Procedures/methods, Microsurgery/education, Microsurgery/methods, Virtual Reality, Anatomic Models, Augmented Reality
2.
Article in English | MEDLINE | ID: mdl-39162975

ABSTRACT

PURPOSE: The operating microscope plays a central role in middle and inner ear procedures, which involve working within tightly confined spaces under limited exposure. Augmented reality (AR) may improve surgical guidance by combining preoperative computed tomography (CT) imaging, which provides precise anatomical information, with the intraoperative microscope video feed. With current technology, the operator must interact with the AR interface manually through a computer, which disrupts the surgical flow and is suboptimal for maintaining the sterility of the operating environment. The purpose of this study was to implement and evaluate free-hand interaction concepts leveraging hand tracking and gesture recognition, in an attempt to reduce disruption during surgery and improve human-computer interaction. METHODS: An electromagnetically tracked surgical microscope was calibrated using a custom 3D-printed calibration board. This allowed the augmentation of the microscope feed with segmented, preoperative CT-derived virtual models. Ultraleap's Leap Motion Controller 2 was coupled to the microscope and used to implement hand-tracking capabilities. End-user feedback was gathered from a surgeon during development. Finally, users were asked to complete tasks that involved interacting with the virtual models, aligning them to physical targets, and adjusting the AR visualization. RESULTS: Guided by these observations and by user feedback, we upgraded the functionalities of the hand-interaction system. Users preferred the new interaction concepts, which caused minimal disruption of the surgical workflow and allowed more intuitive interaction with the virtual content. CONCLUSION: We integrated hand-interaction concepts, typically used with head-mounted displays (HMDs), into a surgical stereo microscope system intended for AR in otologic microsurgery. The concepts presented in this study demonstrate a more favorable approach to human-computer interaction in a surgical context and hold potential for more efficient execution of surgical tasks under microscopic AR guidance.
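The abstract does not spell out how hand gestures map onto manipulation of the virtual models. As a rough, SDK-agnostic illustration, the sketch below shows one common mapping, pinch-to-drag with hysteresis, driven by per-frame thumb and index fingertip positions such as a hand tracker like the Leap Motion Controller 2 might report. All names, thresholds, and the mapping itself are assumptions, not the authors' implementation.

```python
# Hypothetical pinch-to-drag interaction layer, independent of any tracker SDK:
# per-frame thumb/index tip positions (assumed in millimetres) are turned into
# rigid translations of a virtual model.
import numpy as np

PINCH_ON_MM = 25.0   # pinch engages below this thumb-index distance
PINCH_OFF_MM = 35.0  # and releases above this one (hysteresis avoids flicker)

class PinchDrag:
    """Maps a pinch-and-drag hand gesture onto a virtual model's position."""

    def __init__(self, model_position):
        self.model_position = np.asarray(model_position, dtype=float)
        self.dragging = False
        self.grab_point = None   # hand position when the pinch engaged
        self.grab_model = None   # model position when the pinch engaged

    def update(self, thumb_tip, index_tip):
        thumb_tip = np.asarray(thumb_tip, dtype=float)
        index_tip = np.asarray(index_tip, dtype=float)
        pinch_distance = np.linalg.norm(thumb_tip - index_tip)
        hand = 0.5 * (thumb_tip + index_tip)  # midpoint acts as the grab point

        if not self.dragging and pinch_distance < PINCH_ON_MM:
            self.dragging = True
            self.grab_point = hand
            self.grab_model = self.model_position.copy()
        elif self.dragging and pinch_distance > PINCH_OFF_MM:
            self.dragging = False

        if self.dragging:
            # Rigid translation: the model follows the hand's displacement.
            self.model_position = self.grab_model + (hand - self.grab_point)
        return self.model_position

# Example: a pinch dragged 10 mm along x moves the model by the same amount.
interaction = PinchDrag(model_position=[0.0, 0.0, 100.0])
interaction.update(thumb_tip=[0, 0, 0], index_tip=[10, 0, 0])   # engages pinch
print(interaction.update(thumb_tip=[10, 0, 0], index_tip=[20, 0, 0]))
```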

3.
Plast Reconstr Surg; 154(4S): 63S-67S, 2024 Oct 01.
Article in English | MEDLINE | ID: mdl-38351515

ABSTRACT

SUMMARY: Preoperative computed tomographic angiography is increasingly performed before perforator flap-based reconstruction. However, radiologic two-dimensional thin slices do not allow for intuitive interpretation and translation to intraoperative findings. Three-dimensional volume rendering has been used to alleviate the need for mental two-dimensional-to-three-dimensional abstraction. Even though volume rendering allows for a much easier understanding of anatomy, its utility is currently limited because the skin obstructs the view of critical structures. Using free, open-source software, the authors introduce a new skin-masking technique that lets surgeons easily create a segmentation mask of the skin, which can later be used to toggle the skin on and off; the mask can also be used in other rendering applications. The authors use Cinematic Anatomy for photorealistic volume rendering and interactive exploration of computed tomographic angiography with and without skin. The authors present results from using this technique to investigate perforator anatomy in deep inferior epigastric perforator flaps and demonstrate that the skin-masking workflow takes less than 5 minutes. In Cinematic Anatomy, the view onto the abdominal wall, and especially onto the perforators, becomes significantly sharper and more detailed when no longer obstructed by the skin. The authors perform a virtual partial muscle dissection to show the intramuscular and submuscular course of the perforators. The skin-masking workflow allows surgeons to quickly and easily improve arterial and perforator detail in volume renderings by removing the skin, and it can be performed entirely with free, open-source software. The workflow extends readily to other perforator flaps without modification.


Subjects
Computed Tomography Angiography, Three-Dimensional Imaging, Perforator Flap, Humans, Perforator Flap/blood supply, Epigastric Arteries/anatomy & histology, Epigastric Arteries/diagnostic imaging, Mammaplasty/methods, Abdominal Wall/blood supply, Abdominal Wall/surgery, Abdominal Wall/diagnostic imaging, Abdominal Wall/anatomy & histology, Software
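The article names only free, open-source tooling, not a specific recipe. One plausible reimplementation of the skin-masking idea with SimpleITK is sketched below; the HU threshold, skin thickness, and morphology settings are assumed values, not the authors' parameters.

```python
# Hedged sketch of a toggleable skin mask built from a CTA volume: threshold
# the body out of air, erode it by an assumed skin thickness, and take the
# difference as the skin shell.
import SimpleITK as sitk

def make_skin_mask(ct_path, body_threshold_hu=-300, skin_thickness_mm=3.0):
    ct = sitk.ReadImage(ct_path)

    # 1. Everything denser than roughly air counts as "body".
    body = sitk.BinaryThreshold(ct, lowerThreshold=body_threshold_hu,
                                upperThreshold=3000, insideValue=1, outsideValue=0)
    body = sitk.BinaryFillhole(body)  # close internal air pockets (bowel, lungs)

    # 2. Erode the body by the skin thickness, converting mm to voxels per axis.
    radius_vox = [max(1, int(round(skin_thickness_mm / s))) for s in ct.GetSpacing()]
    core = sitk.BinaryErode(body, radius_vox)

    # 3. The skin shell is the body minus its eroded core.
    return sitk.And(body, sitk.Not(core))

def remove_skin(ct_path, skin_mask, background_hu=-1000):
    """Replace skin voxels with air so a volume renderer no longer shows them."""
    ct = sitk.ReadImage(ct_path)
    return sitk.Mask(ct, sitk.Not(skin_mask), outsideValue=background_hu)
```

The resulting mask can be exported alongside the original volume, so the renderer can toggle the skin on and off rather than discarding it.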
4.
Plast Reconstr Surg Glob Open; 12(7): e5940, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38957720

ABSTRACT

We introduce a novel technique using augmented reality (AR) on smartphones and tablets, making it possible for surgeons to review perforator anatomy in three dimensions on the go. Autologous breast reconstruction with abdominal flaps remains challenging due to the highly variable anatomy of the deep inferior epigastric artery. Computed tomography angiography has mitigated some, but not all, of these challenges. Previously, volume rendering and various headsets were used to give surgeons better three-dimensional (3D) review, but surgeons have depended on others to provide the 3D imaging data. Leveraging the ubiquity of Apple devices, our approach lets surgeons review 3D models of deep inferior epigastric artery anatomy, segmented from abdominal computed tomography angiography, directly on their iPhone or iPad. Segmentation can be performed in common radiology software. The models are converted to the Universal Scene Description zipped (USDZ) format, which allows immediate use on Apple devices without third-party software. They can be easily shared using the secure, Health Insurance Portability and Accountability Act-compliant sharing services already provided by most hospitals. Surgeons can simply open the file on their mobile device to explore the model in 3D using "object mode" natively, without additional applications, or switch to AR mode to pin the model in their real-world surroundings for intuitive exploration. We believe patient-specific 3D anatomy models are a powerful tool for intuitive understanding and communication of complex perforator anatomy and would be a valuable addition to routine clinical practice and education. With this one-click solution on existing devices, which is simple to implement, we hope to streamline the adoption of AR models by plastic surgeons.
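The abstract leaves the conversion step abstract. One way to script it with open tools is sketched below, using the `trimesh` and `usd-core` Python packages to read an exported STL and write a USDZ package; the tooling, file names, and unit handling are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: segmented mesh (STL) -> USDZ package that iPhones/iPads open natively.
# pip install trimesh usd-core
import trimesh
from pxr import Sdf, Usd, UsdGeom, UsdUtils

def stl_to_usdz(stl_path, usdz_path, millimetres=True):
    mesh = trimesh.load(stl_path, force="mesh")

    stage = Usd.Stage.CreateNew("tmp_model.usdc")
    if millimetres:
        # Medical segmentations are typically in mm; AR viewers expect metres.
        UsdGeom.SetStageMetersPerUnit(stage, 0.001)

    prim = UsdGeom.Mesh.Define(stage, "/Model")
    prim.CreatePointsAttr([tuple(map(float, v)) for v in mesh.vertices])
    prim.CreateFaceVertexCountsAttr([3] * len(mesh.faces))
    prim.CreateFaceVertexIndicesAttr(mesh.faces.flatten().tolist())
    stage.SetDefaultPrim(prim.GetPrim())
    stage.Save()

    # Zip the saved layer into a .usdz package.
    UsdUtils.CreateNewUsdzPackage(Sdf.AssetPath("tmp_model.usdc"), usdz_path)

# Hypothetical usage with placeholder file names:
# stl_to_usdz("diea_segmentation.stl", "diea_segmentation.usdz")
```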

5.
Int J Comput Assist Radiol Surg; 18(11): 2033-2041, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37450175

ABSTRACT

PURPOSE: Middle and inner ear procedures target hearing loss, infections, and tumors of the temporal bone and lateral skull base. Despite advances in surgical techniques, these procedures remain challenging due to limited haptic and visual feedback. Augmented reality (AR) may improve operative safety by allowing the 3D visualization of anatomical structures from preoperative computed tomography (CT) scans on the real intraoperative microscope video feed. The purpose of this work was to develop a real-time CT-augmented stereo microscope system using camera calibration and electromagnetic (EM) tracking. METHODS: A 3D-printed, electromagnetically tracked calibration board was used to compute the intrinsic and extrinsic parameters of the surgical stereo microscope. These parameters established a transformation between the EM tracker coordinate system and the stereo microscope image space, such that any tracked 3D point can be projected onto the left and right images of the microscope video stream. This allowed the augmentation of the microscope feed of a 3D-printed temporal bone with its corresponding CT-derived virtual model. Finally, the calibration board was also used to evaluate the accuracy of the calibration. RESULTS: We evaluated the accuracy of the system by calculating the registration error (RE) in 2D and 3D in a microsurgical laboratory setting. Our calibration workflow achieved an RE of 0.11 ± 0.06 mm in 2D and 0.98 ± 0.13 mm in 3D. In addition, we overlaid a 3D CT model on the microscope feed of a resin-printed model of a segmented temporal bone. The system exhibited low latency and good registration accuracy. CONCLUSION: We present the calibration of an electromagnetically tracked surgical stereo microscope for augmented reality visualization. The calibration method achieved accuracy within a range suitable for otologic procedures. The AR overlay enhances visualization of the surgical field while preserving depth perception.
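The projection step described in METHODS, mapping an EM-tracked 3D point onto the calibrated microscope image, can be illustrated with OpenCV's standard pinhole camera model. The paper uses its own calibration workflow, so this is only an analogous sketch; every numeric value below is a placeholder, not a calibration result from the study.

```python
# Sketch: project an EM-tracked 3D point onto one eye of a calibrated stereo
# microscope using OpenCV's pinhole model.
import cv2
import numpy as np

# Intrinsics from calibration (assumed values, one set per eye).
K = np.array([[2400.0,    0.0, 960.0],
              [   0.0, 2400.0, 540.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)  # distortion coefficients from the same calibration

# Extrinsics mapping EM-tracker coordinates into this eye's camera frame:
# a Rodrigues rotation vector and a translation in mm.
rvec = np.zeros(3)
tvec = np.array([0.0, 0.0, 300.0])  # e.g. an assumed working distance

# An EM-tracked landmark (e.g. a registered point on the CT-derived temporal
# bone model), expressed in tracker coordinates.
point_tracker = np.array([[5.0, -2.0, 10.0]])

pixels, _ = cv2.projectPoints(point_tracker, rvec, tvec, K, dist)
u, v = pixels.ravel()
print(f"overlay at pixel ({u:.1f}, {v:.1f})")  # draw onto the microscope frame
```

Running the same projection with the right eye's intrinsics and extrinsics yields the second image point, which is what gives the stereo overlay its depth cue.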

6.
Otol Neurotol; 44(8): e602-e609, 2023 Sep 01.
Article in English | MEDLINE | ID: mdl-37464458

ABSTRACT

OBJECTIVE: To objectively evaluate vestibular schwannomas (VSs) and their spatial relationships with the ipsilateral inner ear (IE) in magnetic resonance imaging (MRI) using deep learning. STUDY DESIGN: Cross-sectional study. PATIENTS: A total of 490 adults with VS, high-resolution MRI scans, and no previous neurotologic surgery. INTERVENTIONS: MRI studies of VS patients were split into training (390 patients) and test (100 patients) sets. A three-dimensional convolutional neural network model was trained to segment VS and IE structures using contrast-enhanced T1-weighted and T2-weighted sequences, respectively. Manual segmentations were used as ground truths. Model performance was evaluated on the test set and on an external set of 100 VS patients from a public data set (Vestibular-Schwannoma-SEG). MAIN OUTCOME MEASURES: Dice score, relative volume error, average symmetric surface distance, 95th-percentile Hausdorff distance, and centroid locations. RESULTS: Dice scores for VS and IE volume segmentations were 0.91 and 0.90, respectively. On the public data set, the model segmented VS tumors with a Dice score of 0.89 ± 0.06 (mean ± standard deviation), relative volume error of 9.8 ± 9.6%, average symmetric surface distance of 0.31 ± 0.22 mm, and 95th-percentile Hausdorff distance of 1.26 ± 0.76 mm. Predicted VS segmentations overlapped with ground truth segmentations in all test subjects. Mean errors of predicted VS volume, VS centroid location, and IE centroid location were 0.05 cm³, 0.52 mm, and 0.85 mm, respectively. CONCLUSIONS: A deep learning system can segment VS and IE structures in high-resolution MRI scans with excellent accuracy. This technology offers promise to improve the clinical workflow for assessing VS radiomics and enhance the management of VS patients.


Subjects
Inner Ear, Acoustic Neuroma, Adult, Humans, Artificial Intelligence, Acoustic Neuroma/diagnostic imaging, Cross-Sectional Studies, Magnetic Resonance Imaging/methods
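The outcome measures reported above are standard and straightforward to reproduce. As a reference, a minimal numpy sketch of Dice score, relative volume error, and centroid offset on toy binary masks follows; the toy data is illustrative only, not from the study.

```python
# Standard segmentation-overlap metrics on binary masks (numpy arrays), with
# voxel spacing converting voxel indices to millimetres.
import numpy as np

def dice(pred, truth):
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum())

def relative_volume_error(pred, truth):
    return abs(int(pred.sum()) - int(truth.sum())) / truth.sum()

def centroid_mm(mask, spacing_mm):
    # Mean voxel index per axis, scaled by spacing, gives the centroid in mm.
    return np.argwhere(mask).mean(axis=0) * np.asarray(spacing_mm)

# Toy example: two overlapping cubes in a 64^3 volume, 0.5 mm isotropic voxels.
truth = np.zeros((64, 64, 64), dtype=bool)
pred = np.zeros_like(truth)
truth[20:40, 20:40, 20:40] = True
pred[22:40, 20:40, 20:40] = True

print(f"Dice: {dice(pred, truth):.3f}")                       # 0.947
print(f"Relative volume error: {relative_volume_error(pred, truth):.1%}")
offset = np.linalg.norm(centroid_mm(pred, 0.5) - centroid_mm(truth, 0.5))
print(f"Centroid offset: {offset:.2f} mm")                    # 0.50 mm
```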
7.
Int J Comput Assist Radiol Surg; 18(1): 85-93, 2023 Jan.
Article in English | MEDLINE | ID: mdl-35933491

ABSTRACT

PURPOSE: Virtual reality (VR) simulation has the potential to advance surgical education, procedural planning, and intraoperative guidance. "SurgiSim" is a VR platform developed for the rehearsal of complex procedures using patient-specific anatomy, high-fidelity stereoscopic graphics, and haptic feedback. SurgiSim is the first VR simulator to include a virtual operating room microscope. We describe the process of designing and refining the VR microscope user experience (UX) and user interaction (UI) to optimize surgical rehearsal and education. METHODS: Human-centered VR design principles were applied in designing the SurgiSim microscope to optimize the user's sense of presence. Throughout the development of the UX, the team of developers met regularly with surgeons to gather end-user feedback. Supplemental testing was performed on four participants. RESULTS: Through observation and participant feedback, we made iterative design upgrades to the SurgiSim platform. We identified the following key characteristics of the VR microscope UI: overall appearance, hand-controller interface, and microscope movement. CONCLUSION: Our design process identified challenges, arising from the disparity between VR and physical environments, that pertain to microscope education and deployment; these roadblocks were addressed with creative solutions. Future studies will investigate the efficacy of VR surgical microscope training on real-world microscope skills as assessed by validated performance metrics.


Subjects
Simulation Training, Surgeons, Virtual Reality, Humans, Computer Simulation, Surgeons/education, Operating Rooms, Simulation Training/methods, Clinical Competence, User-Computer Interface