Results 1 - 8 of 8
1.
Sensors (Basel) ; 22(13)2022 Jun 29.
Article in English | MEDLINE | ID: mdl-35808407

ABSTRACT

This work analyzed the use of Microsoft HoloLens 2 in orthopedic oncological surgery and compared it to its predecessor (Microsoft HoloLens 1). Specifically, we developed two equivalent applications, one for each device, and evaluated the augmented reality (AR) projection accuracy in an experimental scenario using phantoms based on two patients. We achieved automatic registration between the virtual and real worlds using patient-specific surgical guides on each phantom. The guides contained a small adaptor for a 3D-printed AR marker, the characteristic patterns of which were easily recognized by both Microsoft HoloLens devices. The newer model improved the AR projection accuracy by almost 25%, and both devices yielded an RMSE below 3 mm. After confirming the improvement of the second model in this respect, we went a step further with Microsoft HoloLens 2 and tested it during the surgical intervention of one of the patients. During this experience, we collected the surgeons' feedback on comfort, usability, and ergonomics. Our goal was to assess whether the improved technical features of the newer model facilitate its implementation in actual surgical scenarios. All of the results indicate that Microsoft HoloLens 2 is better in every aspect affecting surgical interventions and support its use in future interventions.
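As a rough illustration of the accuracy metric reported above, the RMSE between AR-projected landmark positions and their ground-truth counterparts can be computed as follows (the landmark coordinates here are invented for illustration, not taken from the study):

```python
import math

def rmse(projected, reference):
    """Root-mean-square error between paired 3D points (in mm)."""
    assert len(projected) == len(reference)
    squared = [
        sum((p - r) ** 2 for p, r in zip(pt, ref))
        for pt, ref in zip(projected, reference)
    ]
    return math.sqrt(sum(squared) / len(squared))

# Hypothetical landmark positions (mm) measured on a phantom
projected = [(10.2, 5.1, 3.0), (20.0, 4.8, 7.1), (15.5, 9.9, 2.2)]
reference = [(10.0, 5.0, 3.2), (19.7, 5.0, 7.0), (15.8, 10.1, 2.0)]
print(rmse(projected, reference))  # well below the 3 mm threshold
```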


Subjects
Augmented Reality , Orthopedic Procedures , Surgery, Computer-Assisted , Ergonomics , Humans , Phantoms, Imaging , Software , Surgery, Computer-Assisted/methods
2.
Sensors (Basel) ; 21(4)2021 Feb 15.
Article in English | MEDLINE | ID: mdl-33672053

ABSTRACT

During the last decade, orthopedic oncology has benefited from computerized medical imaging, which reduces human dependency and improves accuracy and clinical outcomes. However, traditional surgical navigation systems do not always adapt well to this kind of intervention. Augmented reality (AR) and three-dimensional (3D) printing are technologies recently introduced into the surgical environment, with promising results. Here we present an innovative solution combining 3D printing and AR in orthopedic oncological surgery. A new surgical workflow is proposed, including 3D-printed models and a novel AR-based smartphone application (app). The app can display the patient's anatomy and the tumor's location. A 3D-printed reference marker, designed to fit a unique position on the affected bone tissue, enables automatic registration. The system was evaluated in terms of visualization accuracy and usability throughout the surgical workflow. Experiments on six realistic phantoms yielded a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results, together with the positive feedback from surgeons and patients, suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and patients' experience.


Subjects
Augmented Reality , Imaging, Three-Dimensional , Smartphone , Surgery, Computer-Assisted , Humans , Printing, Three-Dimensional , Workflow
3.
Sensors (Basel) ; 21(23)2021 Nov 24.
Article in English | MEDLINE | ID: mdl-34883825

ABSTRACT

Patient-specific instruments (PSIs) have become a valuable tool for osteotomy guidance in complex surgical scenarios such as pelvic tumor resection. They provide accuracy similar to surgical navigation systems but are generally more convenient and faster. However, their correct placement can be challenging in some anatomical regions and cannot be verified objectively during the intervention. Incorrect placement can result in large deviations from the planned osteotomy, increasing the risk of positive resection margins. In this work, we propose to use augmented reality (AR) to guide and verify PSI placement. We designed an experiment to assess the accuracy of the system using a smartphone and the HoloLens 2, and compared the results with the conventional freehand method. The results showed significant differences: AR guidance prevented large osteotomy deviations, reducing the maximal deviation from 54.03 mm for freehand placement to less than 5 mm with AR guidance. The experiment was performed on two versions of a plastic three-dimensional (3D) printed phantom, one including a silicone layer to simulate tissue and provide more realism. We also studied how differences in the shape and location of PSIs affect their accuracy, concluding that those with smaller sizes and a homogeneous target surface are more prone to errors. Our study presents promising results that demonstrate AR's potential to overcome the present limitations of PSIs conveniently and effectively.


Subjects
Augmented Reality , Pelvic Neoplasms , Surgery, Computer-Assisted , Humans , Imaging, Three-Dimensional , Pelvis/surgery , Phantoms, Imaging
4.
Entropy (Basel) ; 23(7)2021 Jun 26.
Article in English | MEDLINE | ID: mdl-34206962

ABSTRACT

Deep learning has shown excellent capabilities for recognition and identification tasks. This study applies these techniques to open cranial vault remodeling surgeries performed to correct craniosynostosis. The objective was to automatically recognize surgical tools in real time and to estimate the surgical phase based on those predictions. For this purpose, we implemented, trained, and tested three algorithms based on previously proposed convolutional neural network architectures (VGG16, MobileNetV2, and InceptionV3) and one new architecture with fewer parameters (CranioNet). A novel 3D Slicer module was developed specifically to run these networks and recognize surgical tools in real time via video streaming. The training and test data were acquired during a surgical simulation using a 3D-printed, patient-based realistic phantom of an infant's head. The results showed that CranioNet has the lowest tool-recognition accuracy (93.4%), while the highest accuracy is achieved by MobileNetV2 (99.6%), followed by VGG16 and InceptionV3 (98.8% and 97.2%, respectively). For phase detection, InceptionV3 and VGG16 obtained the best results (94.5% and 94.4%), whereas MobileNetV2 and CranioNet performed worse (91.1% and 89.8%). Our results demonstrate the feasibility of applying deep learning architectures to real-time tool detection and phase estimation in craniosynostosis surgeries.
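The abstract does not describe how the per-frame tool predictions are turned into a phase estimate. One simple approach consistent with "estimate the surgical phase based on those predictions" is a sliding-window majority vote over the recognized tool labels; the tool-to-phase mapping and window size below are purely hypothetical:

```python
from collections import Counter, deque

# Hypothetical mapping from recognized tool to surgical phase
TOOL_TO_PHASE = {
    "scalpel": "incision",
    "raspatory": "exposure",
    "craniotome": "osteotomy",
    "plate_bender": "remodeling",
}

def phase_estimator(window_size=5):
    """Return an update function that keeps a sliding window of
    per-frame phase votes and emits the current majority phase."""
    window = deque(maxlen=window_size)
    def update(tool_label):
        window.append(TOOL_TO_PHASE.get(tool_label, "unknown"))
        return Counter(window).most_common(1)[0][0]
    return update

estimate = phase_estimator()
stream = ["scalpel", "scalpel", "craniotome", "scalpel", "scalpel"]
for frame_label in stream:
    current_phase = estimate(frame_label)
print(current_phase)  # majority vote over the last 5 frames
```

The voting window suppresses spurious single-frame misclassifications, which matters when the per-frame recognizer is only ~90-99% accurate.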

5.
3D Print Med ; 10(1): 17, 2024 May 31.
Article in English | MEDLINE | ID: mdl-38819536

ABSTRACT

BACKGROUND: Microtia is a congenital malformation of the auricle that affects approximately 4 of every 10,000 live newborns. Traditionally, radiographic film paper is used to trace the structures of the contralateral healthy ear bidimensionally, in a quasi-artistic manner, with anatomical points providing linear and angular measurements. However, this technique is time-consuming, subjective, and greatly dependent on surgeon expertise, making it susceptible to shape errors and misplacement. METHODS: We present an innovative clinical workflow that combines 3D printing and augmented reality (AR) to increase the objectivity and reproducibility of these procedures. Specifically, we introduce patient-specific 3D cutting templates and remodeling molds to carve and construct the cartilaginous framework that will form the new ear. Moreover, we developed an in-house AR application compatible with any commercial Android tablet. It precisely guides the positioning of the new ear during surgery, ensuring symmetrical alignment with the healthy one and avoiding time-consuming intraoperative linear or angular measurements. Our solution was evaluated in one case, first in controlled experiments in a simulation scenario and finally during surgery. RESULTS: Overall, the ears placed in the simulation scenario had a mean absolute deviation of 2.2 ± 1.7 mm with respect to the reference plan. During the surgical intervention, the reconstructed ear was 3.1 mm longer and 1.3 mm wider than the ideal plan and had a positioning error of 2.7 ± 2.4 mm relative to the contralateral side. Note that in this case, additional morphometric variations were induced by inflammation and other issues intended to be addressed in a subsequent stage of surgery, which are independent of our proposed solution. CONCLUSIONS: In this work we propose an innovative workflow that combines 3D printing and AR to improve ear reconstruction and positioning in microtia correction procedures. Its implementation in the surgical workflow showed good accuracy, empowering surgeons to attain consistent and objective outcomes.

6.
Int J Comput Assist Radiol Surg ; 18(11): 2023-2032, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37310561

ABSTRACT

PURPOSE: To date, there has been a lack of software infrastructure to connect 3D Slicer to any augmented reality (AR) device. This work describes a novel connection approach using Microsoft HoloLens 2 and OpenIGTLink, with a demonstration in pedicle screw placement planning. METHODS: We developed an AR application in Unity that is wirelessly rendered on Microsoft HoloLens 2 using Holographic Remoting. Simultaneously, Unity connects to 3D Slicer using the OpenIGTLink communication protocol. Geometrical transform and image messages are transferred between both platforms in real time. Through the AR glasses, a user visualizes a patient's computed tomography overlaid with virtual 3D models of anatomical structures. We technically evaluated the system by measuring message transfer latency between the platforms. Its functionality was assessed in pedicle screw placement planning: six volunteers planned the position and orientation of pedicle screws with the AR system and with a 2D desktop planner, and we compared the placement accuracy of each screw between the two methods. Finally, all participants completed a questionnaire about their experience with the AR system. RESULTS: The latency in message exchange is low enough to enable real-time communication between the platforms. The AR method was non-inferior to the 2D desktop planner, with a mean error of 2.1 ± 1.4 mm. Moreover, 98% of the screw placements performed with the AR system were successful according to the Gertzbein-Robbins scale. The average questionnaire score was 4.5/5. CONCLUSIONS: Real-time communication between Microsoft HoloLens 2 and 3D Slicer is feasible and supports accurate planning for pedicle screw placement.
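The geometrical transform messages mentioned above carry 4×4 homogeneous matrices describing pose. A minimal sketch of applying such a transform to a planned screw entry point is shown below; the rotation, translation, and coordinates are invented for illustration and do not come from the study:

```python
import math

def apply_transform(matrix, point):
    """Apply a 4x4 homogeneous transform to a 3D point (mm)."""
    x, y, z = point
    vec = (x, y, z, 1.0)
    return tuple(
        sum(matrix[row][col] * vec[col] for col in range(4))
        for row in range(3)
    )

# A 90-degree rotation about Z plus a translation, the kind of pose a
# TRANSFORM message might encode (values are illustrative only)
theta = math.pi / 2
T = [
    [math.cos(theta), -math.sin(theta), 0.0, 10.0],
    [math.sin(theta),  math.cos(theta), 0.0,  0.0],
    [0.0,              0.0,             1.0, -5.0],
    [0.0,              0.0,             0.0,  1.0],
]
entry_point = (1.0, 0.0, 0.0)  # hypothetical planned screw entry, mm
print(apply_transform(T, entry_point))
```

Exchanging poses in this homogeneous form lets either side compose transforms by plain matrix multiplication, which is why the protocol streams the full 4×4 matrix rather than separate rotation and translation fields.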

7.
Int J Comput Assist Radiol Surg ; 17(11): 2081-2091, 2022 Nov.
Article in English | MEDLINE | ID: mdl-35776399

ABSTRACT

PURPOSE: Augmented reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images with the 3D anatomy of the patient. This work describes and evaluates a novel approach that tracks retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2, so that live US images are displayed in situ on the imaged anatomy. METHODS: The Unity application UltrARsound performs spatial tracking of the US probe and attached retroreflective markers using the depth camera integrated into the AR glasses, eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to smooth the noisy camera measurements. US images are streamed wirelessly to HoloLens 2 via the PLUS toolkit. The technical evaluation covers static and dynamic tracking accuracy, and the frequency and latency of displayed images. RESULTS: Tracking achieved a median accuracy of 1.98 mm/1.81° in the static setting when using the Kalman filter. In a dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz, and 83% of the displayed US images had a latency lower than 16 ms. CONCLUSIONS: We showed that spatial tracking of retroreflective spheres with the depth camera of HoloLens 2 is feasible, achieving promising accuracy for in situ visualization of live US images. Tracking requires neither additional hardware nor modifications to HoloLens 2, making it a cheap and easy-to-use approach. Moreover, the minimal latency of displayed images enables real-time perception for the sonographer.
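The abstract does not publish the filter design; a minimal constant-position Kalman filter for smoothing one noisy coordinate of the marker position might look like the sketch below. The process and measurement variances are invented placeholders, not values from the paper:

```python
def kalman_1d(measurements, process_var=1e-3, meas_var=0.5):
    """Smooth a stream of noisy 1D position measurements with a
    constant-position Kalman filter; returns the filtered estimates."""
    x, p = measurements[0], 1.0  # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p = p + process_var        # predict: position assumed static
        k = p / (p + meas_var)     # Kalman gain
        x = x + k * (z - x)        # correct with measurement z
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Hypothetical noisy depth-camera readings of one marker axis (mm)
noisy = [10.3, 9.8, 10.1, 10.4, 9.9, 10.0]
print(kalman_1d(noisy)[-1])
```

Each update is a convex combination of the prediction and the new measurement, so the output stays within the measurement range while the gain, and hence the jitter, shrinks over time.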


Subjects
Cross-Sectional Studies , Humans , Ultrasonography
8.
Article in English | MEDLINE | ID: mdl-33259295

ABSTRACT

A methodology for the assessment of cell concentration in the range 5-100 cells/µL, suitable for in vivo analysis of serous body fluids, is presented in this work. The methodology is based on the quantitative analysis of ultrasound images obtained from cell suspensions and considers applicability criteria such as short analysis times, moderate frequency, and absolute concentration estimation, all necessary to deal with the variability of tissues among patients. Numerical simulations provided the framework to analyze the impact of echo overlapping and the polydispersion of scatterer sizes on the cell concentration estimate. The range of cell concentrations that can be analyzed as a function of the transducer and emitted waveform is also discussed. Experiments evaluated the performance of the method using 7-µm and 12-µm polystyrene particles in water suspensions in the 5-100 particles/µL range; a single scanning focused transducer working at a central frequency of 20 MHz was used to obtain the ultrasound images. The proposed concentration estimator proved robust to different particle sizes and to variations in the gain acquisition settings. The effect of tissues placed in the ultrasound path between the probe and the sample was also investigated using 3-mm-thick tissue mimics. Under these conditions, the algorithm remained robust for the concentration analysis of 12-µm particle suspensions, yet significant deviations were obtained for the smallest particles.
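At its core, the quantitative analysis described above counts discrete echoes per insonified volume. A toy sketch of that idea, counting local maxima above a noise threshold in a simulated echo envelope, is shown below; the signal values, threshold, and sampled volume are invented for illustration and assume non-overlapping echoes, the simplification the paper's numerical simulations were designed to probe:

```python
def count_echoes(envelope, threshold):
    """Count local maxima above threshold in a 1D echo envelope."""
    count = 0
    for i in range(1, len(envelope) - 1):
        if (envelope[i] > threshold
                and envelope[i] >= envelope[i - 1]
                and envelope[i] > envelope[i + 1]):
            count += 1
    return count

def concentration(envelope, threshold, sampled_volume_ul):
    """Estimate scatterer concentration (per microliter), assuming
    one echo per scatterer and no echo overlap."""
    return count_echoes(envelope, threshold) / sampled_volume_ul

# Hypothetical envelope: three clear echoes over a noise floor
envelope = [0.1, 0.2, 0.9, 0.2, 0.1, 0.8, 0.1, 0.1, 1.0, 0.3, 0.1]
print(concentration(envelope, threshold=0.5, sampled_volume_ul=0.05))
```

Once echoes overlap at higher concentrations, simple peak counting undercounts scatterers, which is why the method's working range has to be characterized against the transducer and waveform used.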


Subjects
Algorithms , Body Fluids , Humans , Ultrasonography