Results 1 - 20 of 51
1.
J Sports Sci; 38(5): 486-493, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31865835

ABSTRACT

Biomechanical analysis has typically been confined to a laboratory setting. While attempts have been made to take laboratory testing into the field, this study was designed to assess whether augmented reality (AR) could be used to bring the field into the laboratory. This study aimed to measure knee load in volleyball players through a jump task incorporating AR while maintaining perception-action couplings by replicating the visual features of a volleyball court. Twelve male volleyball athletes completed four tasks: drop landing, hop jump, spike jump, and spike jump while wearing AR smart glasses. Biomechanical variables included patellar tendon force, knee moment, and kinematics of the ankle, knee, hip, pelvis and thorax. The drop landing showed differences in patellar tendon force and knee moment compared with the other conditions. The hop jump did not differ in kinetics from the spike conditions but displayed the greatest kinematic differences. As a measure of patellar tendon loading, the AR condition closely approximated the spike jump, with no differences in landing forces or mechanics. Thus, AR may be used in a clinical assessment to better replicate information from the competitive environment.
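The abstract does not state how patellar tendon force was derived. A common approach in jump-landing biomechanics, sketched below purely for illustration, is to divide the net knee extension moment from inverse dynamics by the patellar tendon moment arm; the function name and example values are assumptions, not the authors' data.

```python
def patellar_tendon_force(knee_extension_moment_nm: float,
                          moment_arm_m: float) -> float:
    """Estimate patellar tendon force (N) as the net knee extension moment
    divided by the patellar tendon moment arm. Illustrative only; in practice
    the moment arm is usually modelled as a function of knee flexion angle."""
    return knee_extension_moment_nm / moment_arm_m

# Hypothetical example: a 250 N*m knee extension moment, 0.045 m moment arm
print(patellar_tendon_force(250.0, 0.045))  # about 5600 N
```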


Subjects
Athletic Performance/physiology, Augmented Reality, Volleyball/physiology, Adolescent, Ankle Joint/physiology, Biomechanical Phenomena/physiology, Hip Joint/physiology, Humans, Knee Joint/physiology, Male, Patellar Ligament/physiology, Reproducibility of Results, Young Adult
2.
Bone Joint J; 101-B(12): 1479-1488, 2019 12.
Article in English | MEDLINE | ID: mdl-31786992

ABSTRACT

AIMS: Computer-based applications are increasingly being used by orthopaedic surgeons in their clinical practice. With the integration of technology in surgery, augmented reality (AR) may become an important tool for surgeons in the future. By superimposing a digital image on a user's view of the physical world, this technology shows great promise in orthopaedics. The aim of this review is to investigate the current and potential uses of AR in orthopaedics. MATERIALS AND METHODS: A systematic review of the PubMed, MEDLINE, and Embase databases up to January 2019 using the keywords 'orthopaedic' OR 'orthopedic AND augmented reality' was performed by two independent reviewers. RESULTS: A total of 41 publications were included after screening. Applications were divided by subspecialty: spine (n = 15), trauma (n = 16), arthroplasty (n = 3), oncology (n = 3), and sports (n = 4). Out of these, 12 were clinical in nature. AR-based technologies have a wide variety of applications, including direct visualization of radiological images by overlaying them on the patient and intraoperative guidance using preoperative plans projected onto real anatomy, enabling hands-free real-time access to operating room resources, and promoting telemedicine and education. CONCLUSION: There is an increasing interest in AR among orthopaedic surgeons. Although studies show similar or better outcomes with AR compared with traditional techniques, many challenges need to be addressed before this technology is ready for widespread use. Cite this article: Bone Joint J 2019;101-B:1479-1488.


Subjects
Augmented Reality, Orthopedic Procedures/methods, Computer-Assisted Surgery/methods, Attitude of Health Personnel, Humans, Orthopedic Procedures/trends, Orthopedics/methods, Orthopedics/trends, Surgeons, Computer-Assisted Surgery/trends
3.
Adv Exp Med Biol; 1171: 15-23, 2019.
Article in English | MEDLINE | ID: mdl-31823236

ABSTRACT

Emerging technologies have the potential to transform our approach to medical education. A goal of this chapter is to inspire researchers, educators and scholars in the biomedical visualisation field who can benefit from integrating wearable Augmented Reality (AR) technologies, like the HoloLens, into their existing teaching and learning environments. We draw from case studies, existing research and the educational technology literature to propose the design of purposeful learner-centered experiences that might benefit from wearable AR technologies in the classroom.


Subjects
Augmented Reality, Medical Education, Wearable Electronic Devices, Medical Education/methods, Medical Education/trends, Learning
4.
Adv Exp Med Biol; 1171: 105-126, 2019.
Article in English | MEDLINE | ID: mdl-31823243

ABSTRACT

The use of augmented reality (AR) has a rich history and spans a number of fields. Its application in healthcare and anatomy education is attracting considerable interest. However, although its popularity is on the rise, its use as an educational and practical tool has not been sufficiently evaluated, especially with children. Therefore, this study presents the design, development and evaluation of an educational tablet-based application with AR functionality for children. A distal radius fracture was chosen, as it is one of the more common fractures in the younger age group. Following a standardized software engineering methodology, we identified functional and non-functional requirements and created a child-friendly, tablet-based AR application. This used industry-standard software and incorporated three-dimensional models of a buckle fracture, object and image target marker recognition, interactivity and educational elements. In addition, we surveyed children at the Glasgow Science Centre on its usability, design and educational effectiveness. Seventy-one children completed a questionnaire (25 also underwent a short structured interview). Overall, the feedback was positive regarding the entertainment value, graphic design, usability and educational scope of the application. Notably, the application was shown to increase user understanding of radiology across all age groups following a trial. This study shows the great potential of using digital technologies, and more particularly augmented information, in engaging future generations in science from a young age. Creating educational materials using digital technologies, and evaluating their effectiveness, highlights the great scope novel technology could have in anatomical education and training.


Subjects
Augmented Reality, Patient Education as Topic, User-Computer Interface, Child, Humans, Patient Education as Topic/standards, Patient-Centered Care, Scotland, Software/standards, Surveys and Questionnaires
5.
BMC Oral Health; 19(1): 238, 2019 11 08.
Article in English | MEDLINE | ID: mdl-31703708

ABSTRACT

BACKGROUND: Virtual reality is the science of creating a virtual environment for the assessment of various anatomical regions of the body for diagnosis, planning and surgical training. Augmented reality is the superimposition of a patient-specific 3D environment onto the surgical field using semi-transparent glasses to augment the virtual scene. The aim of this study is to provide an overview of the literature on the application of virtual and augmented reality in oral & maxillofacial surgery. METHODS: We reviewed the literature and the existing databases using Ovid MEDLINE, the Cochrane Library and PubMed. All studies published in English in the last 10 years, from 2009 to 2019, were included. RESULTS: We identified 101 articles related to the broad application of virtual reality in oral & maxillofacial surgery. These included eight systematic reviews, 4 expert reviews, 9 case reports, 5 retrospective surveys, 2 historical perspectives, 13 manuscripts on virtual education and training, 5 on haptic technology, 4 on augmented reality, 10 on image fusion, and 41 on prediction planning for orthognathic surgery and maxillofacial reconstruction. Dental implantology and orthognathic surgery are the most frequent applications of virtual reality and augmented reality. Virtual planning improved the accuracy of inserting dental implants using either static guidance or dynamic navigation. In orthognathic surgery, prediction planning and intraoperative navigation are the main applications of virtual reality. Virtual reality has been utilised to improve the delivery of education and the quality of training in oral & maxillofacial surgery by creating a virtual environment of the surgical procedure. Haptic feedback provided an additional layer of immersion to improve manual dexterity and clinical training. CONCLUSION: Virtual and augmented reality have contributed to the planning of maxillofacial procedures and to surgical training. Few articles highlighted the importance of this technology in improving the quality of patient care. There are limited prospective randomized studies comparing the impact of virtual reality with standard methods of delivering oral surgery education.


Subjects
Augmented Reality, Oral Surgery, Virtual Reality, Humans, Prospective Studies, Retrospective Studies
6.
Nature; 575(7783): 453-454, 2019 11.
Article in English | MEDLINE | ID: mdl-31748719
7.
Curr Urol Rep; 20(12): 81, 2019 Nov 28.
Article in English | MEDLINE | ID: mdl-31782033

ABSTRACT

PURPOSE OF REVIEW: Postgraduate medical training has evolved considerably from an emphasis on hands-on, autonomous learning to a paradigm where simulation technologies are used to introduce and augment certain skill sets. This review is intended to provide an update on surgical simulators and tools for urological trainee education. RECENT FINDINGS: We provide an overview of simulation platforms for robotics, endoscopy, and laparoscopic practice and training. In general, these simulators provide face, content, and construct validity. Various educational and evaluation tools have been adopted. Simulation platforms have been developed for technical and non-technical surgical skills, educational bootcamps, and tools for evaluation and feedback. While trainees find the opportunity to practice their skills beneficial, there may be difficulty with access due to cost and availability. Additionally, there is a need for more objective metrics demonstrating improvement in skill or patient outcome.


Subjects
Computer Simulation, Simulation Training, Urologic Surgical Procedures/education, Urology/education, Augmented Reality, Cadaver, Clinical Competence, Educational Measurement, Endoscopy/education, Humans, Three-Dimensional Imaging, Internship and Residency, Laparoscopy/education, Mobile Applications, Three-Dimensional Printing, Robotic Surgical Procedures/education, Smartphone, Computer-Assisted Surgery/education, Teaching Rounds, Urologic Surgical Procedures/methods
8.
J Laparoendosc Adv Surg Tech A; 29(11): 1419-1426, 2019 Nov.
Article in English | MEDLINE | ID: mdl-31613679

ABSTRACT

Background: The eoSim® laparoscopic augmented reality (AR) simulator has instrument tracking capabilities that may be suitable for implementation in laparoscopic training. The objective was to assess the face, content, and construct validity of this simulator for basic laparoscopic skills training. Methods: Participants were divided into three groups: novices (no training), intermediates (<50 laparoscopic procedures), and experts (>50 laparoscopic procedures). Three basic tasks were completed on the simulator: thread transfer (1), cyst dissection (2), and tube ligation (3). A questionnaire was completed on the realism, didactic value, and usability of the simulator. Measured outcome parameters were time, distance, time off screen, average speed, acceleration, and smoothness. Results: Mean ± standard deviation scores on realism were positive (T1: 3.9 ± 0.7, P = .13; T2: 3.7 ± 0.7, P = .07; T3: 3.7 ± 0.07), as were scores on didactic value (T1: 3.9 ± 0.8, P = .71; T2: 3.9 ± 0.8, P = .31; T3: 4.0 ± 0.8, P = .40). Usability was valued the highest, with mean scores between 3.9 and 4.3 (T1: P = .71; T2: P = .80; T3: P = .85). Scores did not differ significantly between groups. Experts were significantly faster (Task 1: P < .001; Task 2: P = .042; Task 3: P < .001), with higher handling speed for tasks 2 and 3 (Task 1: P = .20; Task 2: P = .034; Task 3: P = .049). Results for other outcome parameters also indicated experts had better instrument control and efficiency than novices, although these differences did not reach statistical significance. Conclusions: The eoSim laparoscopic AR simulator is regarded as a realistic, accessible, and useful tool for the training of basic laparoscopic skills, with good face validity. Construct validity of the eoSim AR simulator was demonstrated on several core variables, but not all.
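The outcome parameters listed above (distance, average speed, acceleration, smoothness) are not defined in the abstract. A minimal sketch of how such metrics are commonly computed from sampled 3D instrument-tip positions follows; the function name and the dimensionless-jerk smoothness proxy are assumptions rather than the eoSim's internal definitions.

```python
import numpy as np

def motion_metrics(positions: np.ndarray, fs: float) -> dict:
    """Generic kinematic metrics from an (N, 3) array of instrument-tip
    positions (metres) sampled at fs Hz."""
    dt = 1.0 / fs
    steps = np.diff(positions, axis=0)                 # per-sample displacement
    path_length = np.linalg.norm(steps, axis=1).sum()  # total distance travelled
    duration = (len(positions) - 1) * dt
    velocity = steps / dt
    accel = np.diff(velocity, axis=0) / dt
    jerk = np.diff(accel, axis=0) / dt
    # Dimensionless (normalised) jerk: one common smoothness proxy;
    # lower values indicate smoother movement.
    norm_jerk = np.sqrt(0.5 * np.sum(np.linalg.norm(jerk, axis=1) ** 2) * dt
                        * duration ** 5 / path_length ** 2)
    return {
        "duration_s": duration,
        "path_length_m": path_length,
        "avg_speed_m_s": path_length / duration,
        "avg_accel_m_s2": float(np.mean(np.linalg.norm(accel, axis=1))),
        "normalised_jerk": float(norm_jerk),
    }

# Example on synthetic tracking data: 5 s sampled at 30 Hz
t = np.arange(0, 5, 1 / 30)
track = np.c_[0.1 * np.sin(t), 0.1 * np.cos(t), 0.02 * t]
print(motion_metrics(track, fs=30.0))
```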


Subjects
Augmented Reality, Clinical Competence, Laparoscopy/education, Simulation Training/methods, Adult, Cysts/surgery, Dissection, Female, Humans, Male, Middle Aged, Reproducibility of Results, Software, Tubal Sterilization, Surveys and Questionnaires, Young Adult
9.
Cogn Behav Neurol; 32(3): 172-178, 2019 09.
Article in English | MEDLINE | ID: mdl-31517700

ABSTRACT

BACKGROUND: Mixed reality (MR) technology, which combines the best features of augmented reality and virtual reality, has recently emerged as a promising tool in cognitive rehabilitation therapy. OBJECTIVE: To investigate the effectiveness of an MR-based cognitive training system for individuals with mild cognitive impairment (MCI). METHODS: Twenty-one individuals aged 65 years and older who had been diagnosed with MCI were recruited for this study and were divided into two groups. Participants in the MR group (n=10, aged 70.5±4.2 years) received 30 minutes of training 3 times a week for 6 weeks using a newly developed MR-based cognitive training system. Participants in the control group (n=11, aged 72.6±5.3 years) received the same amount of training using a conventional computer-assisted cognitive training system. Both groups completed the Korean version of the Consortium to Establish a Registry for Alzheimer's Disease battery (CERAD-K) both before and after the intervention. To determine the effect of the intervention on cognitive function, we compared the change in each group's CERAD-K scores. RESULTS: There was a statistically significant interaction between intervention (MR group vs control group) and time (before vs after intervention) as assessed by the Constructional Recall Test. The individuals with MCI who participated in the MR training showed significantly improved performance in visuospatial working memory compared with the individuals with MCI who participated in the conventional training. CONCLUSION: An MR-based cognitive training system can be used as a cognitive training tool to improve visuospatial working memory in individuals with MCI.


Subjects
Cognition Disorders/diagnosis, Neuropsychological Tests/standards, Aged, Aged 80 and over, Augmented Reality, Cognitive Dysfunction/diagnosis, Female, Humans, Male, Pilot Projects, Virtual Reality
10.
Clin Hemorheol Microcirc; 73(1): 125-133, 2019.
Article in English | MEDLINE | ID: mdl-31561348

ABSTRACT

BACKGROUND: A physiological and minimally invasive form of surgery with minimal risk for treating lymphedema is supermicrosurgical lymphovenous anastomosis (LVA), in which a lymph vessel is connected to a venule. METHODS: Thirty patients (between 2018 and 2019) with secondary upper extremity lymphedema refractory to conservative therapy (manual lymph drainage and compression therapy) were operated on using the "simplified lymphovenous anastomosis" method. For the assessment of lymphatic supermicrosurgery, an operating microscope with an integrated near-infrared illumination system (Leica M530 OHX with glow technology ULT530, Leica Microsystems) and the IC-Flow™ Imaging System (Diagnostic Green)/Visionsense System (Medtronic) together with a ZEISS S8 microscope were used. Augmented reality intraoperative indocyanine green (ICG) lymphography-navigated modified "simplified lymphovenous anastomoses" were performed under the Leica microscope. All patients were informed about the off-label use of ICG lymphography. RESULTS: Fifty-seven LVAs were performed with modified "simplified lymphovenous anastomosis" lymphography guidance on 30 upper extremities. All patients showed good patency after lymphovenous anastomosis. CONCLUSIONS: Supermicrosurgery in the form of LVA is minimally invasive, highly effective, and has a very low complication rate. Surgeon- and equipment-related factors restrict the practice of LVA, and its effectiveness is limited by technical constraints.


Subjects
Surgical Anastomosis/methods, Fluorescence, Indocyanine Green/chemistry, Lymphatic Vessels/surgery, Lymphedema/surgery, Vascular Surgical Procedures/methods, Adult, Aged, Augmented Reality, Female, Humans, Male, Middle Aged
11.
J Orthop Surg Res; 14(1): 255, 2019 Aug 08.
Article in English | MEDLINE | ID: mdl-31395071

ABSTRACT

BACKGROUND: The purpose of this study was to assess the clinical outcome of percutaneous kyphoplasty (PKP) assisted by mixed reality (MR) technology in the treatment of osteoporotic vertebral compression fracture (OVCF) with intravertebral vacuum cleft (IVC). METHOD: Forty cases of OVCF with IVC undergoing PKP were randomized into an MR technology-assisted group (group A) and a traditional C-arm fluoroscopy group (group B). Both groups underwent PKP and were evaluated by VAS scores, ODI scores, and radiological measures of vertebral body height and kyphotic angle (KA) pre- and postoperatively. The volume of injected cement, fluoroscopy times, and operation time were recorded, and cases of non-PMMA-endplate-contact (NPEC) on postoperative imaging were also recorded. The clinical outcomes and complications were evaluated afterwards. All patients were followed up for 10 to 14 months, with an average of 12 months. RESULT: The MR-assisted group (group A) had a larger volume of injected polymethyl methacrylate (PMMA), greater postoperative vertebral height, smaller postoperative KA, fewer fluoroscopy exposures, and shorter operation time compared with the control group (group B) (P < 0.05). VAS and ODI scores improved in both groups, but more markedly in group A (P < 0.05). More cases achieved cement contact with both endplates in group A (P < 0.05), and group A showed less loss of vertebral height, less KA progression, and fewer vertebral re-collapses during follow-up (P < 0.05). CONCLUSION: PKP assisted by MR technology can accurately locate the IVC area, which can then be expanded by the balloon, leading to better vertebral height restoration, cement diffusion, and pain relief. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT03959059. Registered 25 September 2016.


Subjects
Augmented Reality, Compression Fractures/surgery, Kyphoplasty/methods, Osteoporotic Fractures/surgery, Spinal Fractures/surgery, Aged, Aged 80 and over, Female, Compression Fractures/diagnostic imaging, Humans, Male, Osteoporotic Fractures/diagnostic imaging, Prospective Studies, Spinal Fractures/diagnostic imaging, Vacuum
12.
IEEE Int Conf Rehabil Robot; 2019: 181-186, 2019 06.
Article in English | MEDLINE | ID: mdl-31374627

ABSTRACT

Occupational rehabilitation is an integral part of the recovery process for workers who have sustained injuries at the workplace. It often requires the injured worker to engage in functional tasks that simulate the workplace environment to help regain their functional capabilities and allow for a return to employment. We present a system comprising a robotic arm for recreating the physical dynamics of functional tasks and a 3D Augmented Reality (AR) display for immersive visualization of the tasks. While this system can be used to simulate a multitude of occupational tasks, we focus on one specific functional task. Participants perform a virtual version of the task using the robot-AR system, and a physical version of the same task without the system. This study presents results for two able-bodied users to determine whether the robot-AR system produces upper-limb movements similar to the real-life equivalent task. The similarity between relative joint positions, i.e., hand-to-elbow (H2E) and elbow-to-shoulder (E2S) displacements, is evaluated within clusters based on the spatial position of the user's hand. The H2E displacements for approximately 50% of hand-position clusters were consistent between the robot-AR and real-world conditions, as were approximately 30% of E2S displacements. However, the similar clusters are distributed across the entire task space, indicating that the robot-AR system has the potential to properly simulate real-world equivalent tasks.
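As a rough illustration of the cluster-based comparison described above, the sketch below computes H2E and E2S displacement vectors from tracked joint positions and groups samples by hand position with k-means (scikit-learn assumed available); the variable names, cluster count, and the simple distance threshold standing in for a statistical test are assumptions, not the authors' protocol.

```python
import numpy as np
from sklearn.cluster import KMeans

def relative_displacements(hand, elbow, shoulder):
    """Hand-to-elbow (H2E) and elbow-to-shoulder (E2S) displacement vectors
    from (N, 3) arrays of tracked joint positions."""
    return elbow - hand, shoulder - elbow

def fraction_of_consistent_clusters(hand, h2e_a, h2e_b, n_clusters=20, tol=0.05):
    """Cluster samples by hand position and report the fraction of clusters
    whose mean H2E displacement differs between conditions A and B by less
    than `tol` metres (a simple stand-in for a statistical comparison)."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(hand)
    consistent = []
    for c in range(n_clusters):
        mask = labels == c
        diff = np.linalg.norm(h2e_a[mask].mean(axis=0) - h2e_b[mask].mean(axis=0))
        consistent.append(diff < tol)
    return float(np.mean(consistent))

# Synthetic example with 500 tracked samples per condition
rng = np.random.default_rng(0)
hand = rng.uniform(-0.5, 0.5, (500, 3))
elbow = hand + np.array([0.0, -0.25, 0.0]) + rng.normal(0.0, 0.02, (500, 3))
shoulder = elbow + np.array([0.0, -0.30, 0.0])
h2e_real, _ = relative_displacements(hand, elbow, shoulder)
h2e_robot_ar = h2e_real + rng.normal(0.0, 0.03, h2e_real.shape)  # simulated condition
print(fraction_of_consistent_clusters(hand, h2e_real, h2e_robot_ar))
```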


Subjects
Augmented Reality, Occupational Injuries/physiopathology, Occupational Injuries/rehabilitation, Robotics, Elbow/physiopathology, Hand/physiopathology, Humans, Joints/physiopathology, Male, Shoulder/physiopathology, Task Performance and Analysis, Young Adult
14.
Appl Ergon; 80: 17-27, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31280802

ABSTRACT

Wearable Augmented Reality Displays (WARDs) present situated, real-time information visually, providing immediate access to information to support decision making. The impacts of WARD use on operator performance, Situation Awareness (SA), and communication in one safety-critical system, marine transportation, were examined in a real-time physical simulator. WARD use improved operator trackkeeping performance, the practice of good seamanship, and SA, although operator responsiveness decreased. WARD users who used more closed-loop communication and information sharing showed improved threat avoidance, suggesting that operators can avoid accidents and failure through WARD use that promotes sharing and confirming information. WARD use also promoted information source diversity, a means of developing requisite variety. These operational impacts are important in safety-critical settings where failures can be catastrophic.


Subjects
Augmented Reality, Awareness, Communication, Task Performance and Analysis, Wearable Electronic Devices/psychology, Adult, Decision Making, Female, Humans, Male, Military Personnel/psychology, Ships, User-Computer Interface
15.
Chin Med Sci J; 34(2): 103-109, 2019 Jun 30.
Article in English | MEDLINE | ID: mdl-31315751

ABSTRACT

With the continuous progress of virtual simulation technology, medical surgical visualization systems have evolved from two-dimensional to three-dimensional and from standalone digital tools toward networked, intelligent systems. Visualization systems based on mixed reality technology will also be used at all stages of surgery, such as case discussion, surgical planning, intraoperative guidance, postoperative evaluation, and rehabilitation, further promoting intelligent, high-precision surgery and consequently improving treatment effectiveness and the quality of medical service. This paper discusses the composition and technical characteristics of mixed reality-based surgical visualization systems and introduces some typical applications of mixed reality technology in surgical visualization, providing a new perspective on the application of mixed reality in surgery.


Subjects
Biomedical Technology/methods, Computer Simulation, General Surgery/methods, Augmented Reality, Humans, Three-Dimensional Imaging
16.
J Craniomaxillofac Surg; 47(8): 1280-1284, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31337569

ABSTRACT

PURPOSE: Augmented reality (AR) is considered a valuable tool in craniofacial surgery for preoperative design, intraoperative navigation, and postoperative assessment. Corrective surgery is necessary in synostotic plagiocephaly for functional and aesthetic outcomes. Open calvarial reconstruction is a difficult classic surgical procedure with a high accuracy requirement. The purpose of this study was to introduce the application of an AR system in synostotic plagiocephaly surgery. MATERIALS AND METHODS: Seven plagiocephaly patients (ages 6-24 months, average 16.7 months) were enrolled. Preoperative design was based on three-dimensional computed tomography (CT) data for patients with synostotic plagiocephaly. We completed the registration with predefined markers through an image registration process preoperatively, then overlaid the registration results onto the surgical field to assist surgeons intraoperatively. CT scans were performed postoperatively. Intracranial volume was measured to judge the surgical outcomes. We performed a quantitative craniometric analysis between the planned reconstruction and the postoperative results, with intracranial volume asymmetry as the main evaluation indicator. RESULTS: We successfully applied the AR system in patients undergoing synostotic plagiocephaly surgery, providing real-time navigational images with position and orientation information during open calvarial reconstruction in 7 plagiocephaly patients within a span of 5 years. Good appearances were observed after surgery. Cranial volume asymmetry decreased from 27.87% to 16.57%, achieving the precise intraoperative goals. No significant differences were found between the planned and postoperative results. CONCLUSIONS: The AR system can be applied to plagiocephaly procedures, guiding precise osteotomies to obtain reliable and accurate results.
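The craniometric outcome above is an intracranial volume asymmetry percentage, but the exact formula is not given in the abstract. The sketch below assumes one plausible definition: the relative difference between the intracranial volumes on either side of an assumed midsagittal plane in a segmented CT volume.

```python
import numpy as np

def volume_asymmetry(mask: np.ndarray, voxel_volume_mm3: float,
                     midline_index: int) -> float:
    """Percentage asymmetry between the two intracranial half-volumes.
    `mask` is a binary (z, y, x) segmentation of the intracranial space,
    split along the x axis at `midline_index` (assumed midsagittal plane)."""
    left = mask[:, :, :midline_index].sum() * voxel_volume_mm3
    right = mask[:, :, midline_index:].sum() * voxel_volume_mm3
    return 100.0 * abs(left - right) / max(left, right)

# Toy example: a 100x100x100 binary volume with 1 mm^3 voxels
mask = np.zeros((100, 100, 100), dtype=np.uint8)
mask[20:80, 20:80, 10:55] = 1   # deliberately lopsided "intracranial" region
print(f"{volume_asymmetry(mask, 1.0, midline_index=50):.1f}% asymmetry")
```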


Subjects
Augmented Reality, Craniosynostoses, Preschool Child, Dental Esthetics, Humans, Three-Dimensional Imaging, Infant, Skull, X-Ray Computed Tomography
17.
Int J Comput Assist Radiol Surg; 14(9): 1553-1563, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31350704

ABSTRACT

PURPOSE: Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgeries. To advance surgical tools in complex bony structures during these procedures with confidence, a large number of images is acquired. While image-guidance is the de facto standard to guarantee acceptable outcome, when these images are presented on monitors far from the surgical site the information content cannot be associated easily with the 3D patient anatomy. METHODS: In this article, we propose a collaborative augmented reality (AR) surgical ecosystem to jointly co-localize the C-arm X-ray and surgeon viewer. The technical contributions of this work include (1) joint calibration of a visual tracker on a C-arm scanner and its X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of human and X-ray observers in shared tracking and augmentation environments using vision-based simultaneous localization and mapping. RESULTS: We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when using 50 pose pairs or more. The mean translation and rotation errors at convergence are 5.7 mm and [Formula: see text], respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in real and virtual environment was 10.8 mm. CONCLUSIONS: The proposed AR solution provides a shared augmented experience between the human and X-ray viewer. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons or intuitively inform C-arm technologists for prospective X-ray view-point planning.
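The joint calibration described above is an instance of the classic AX = XB hand-eye problem between the tracker and the X-ray source. Below is a minimal, self-contained sketch that simulates noise-free pose pairs and recovers the fixed camera-to-tracker transform with OpenCV's generic hand-eye solver (OpenCV and SciPy assumed available); the frame names and synthetic data are assumptions, and the article's actual pipeline is more involved.

```python
import cv2
import numpy as np
from scipy.spatial.transform import Rotation

def to_mat(R, t):
    """Assemble a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

rng = np.random.default_rng(0)
# Ground-truth camera-to-tracker transform X and fixed base-to-target transform W
X = to_mat(Rotation.random(random_state=1).as_matrix(), rng.uniform(-0.1, 0.1, 3))
W = to_mat(Rotation.random(random_state=2).as_matrix(), rng.uniform(-0.5, 0.5, 3))

R_g2b, t_g2b, R_t2c, t_t2c = [], [], [], []
for seed in range(50):  # ~50 pose pairs, as the abstract suggests for convergence
    G = to_mat(Rotation.random(random_state=100 + seed).as_matrix(),
               rng.uniform(-0.5, 0.5, 3))          # simulated tracker pose in the base frame
    T = np.linalg.inv(X) @ np.linalg.inv(G) @ W    # implied target pose in the camera frame
    R_g2b.append(G[:3, :3]); t_g2b.append(G[:3, 3])
    R_t2c.append(T[:3, :3]); t_t2c.append(T[:3, 3])

R_est, t_est = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c,
                                    method=cv2.CALIB_HAND_EYE_TSAI)
print("rotation error:", np.linalg.norm(R_est - X[:3, :3]))
print("translation error (m):", np.linalg.norm(t_est.ravel() - X[:3, 3]))
```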


Subjects
Augmented Reality, Computer-Assisted Radiographic Image Interpretation/methods, Radiography/methods, X-Rays, Algorithms, Calibration, Equipment Design, Fluoroscopy, Humans, Three-Dimensional Imaging, Statistical Models, Motor Skills, Prospective Studies, Computer-Assisted Surgery/methods, X-Ray Computed Tomography
18.
Comput Methods Programs Biomed; 177: 253-268, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31319954

ABSTRACT

BACKGROUND AND AIM: Surgical telepresence has been implemented using mixed reality (MR), but MR remains largely theoretical and confined to research settings. The aim of this paper is to propose and implement a new solution that merges an augmented video (generated at the local site) with a virtual video of the expert surgeon's hand (at the remote site). The system is intended to improve the visualization of the surgical area and the overlay accuracy of the merged video, without discoloured patterns on the hand, smudging artefacts at the surgeon's hand boundary, or occluded areas of the surgical field. METHODOLOGY: The proposed system consists of an Enhanced Multi-Layer Mean Value Cloning (EMLMV) algorithm that improves overlay accuracy, visualization accuracy, and processing time. The algorithm includes trimap generation and alpha matting as a pre-processing stage of the merging process, which helps remove the smudging and discoloured artefacts surrounding the remote surgeon's hand. RESULTS: The proposed system improved accuracy by reducing the overlay error of the merged image from 1.3 mm to 0.9 mm. Furthermore, it improved the visibility of the surgeon's hand in the final merged image from 98.4% to 99.1% of pixels. Similarly, the processing time was reduced: the proposed solution took 10 s to produce 50 frames, whereas the state-of-the-art solution took 11 s for the same number of frames. CONCLUSION: The proposed system focuses on merging the augmented reality video (local site) and the virtual reality video (remote site) with accurate visualization. Discoloured areas, smudging artefacts, and occlusion are considered the main aspects for improving the accuracy of the merged video in terms of overlay error and visualization error. The proposed system therefore produces a merged video with the artefacts around the expert surgeon's hand removed.
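The EMLMV algorithm itself is not spelled out in the abstract. The sketch below shows only the basic alpha-matte compositing step that such a merging pipeline builds on, blending the remote surgeon's hand onto the local surgical frame with a soft matte so the hand boundary does not smudge; the function name and toy matte are assumptions.

```python
import numpy as np

def composite_with_matte(local_frame: np.ndarray,
                         remote_hand: np.ndarray,
                         alpha: np.ndarray) -> np.ndarray:
    """Blend the remote surgeon's hand onto the local surgical frame.
    local_frame, remote_hand: (H, W, 3) uint8 images of the same size.
    alpha: (H, W) float matte in [0, 1] (e.g. produced by trimap-based
    alpha matting), where 1 = pure hand and 0 = pure background."""
    a = alpha[..., None].astype(np.float32)
    fg = remote_hand.astype(np.float32)
    bg = local_frame.astype(np.float32)
    return np.clip(a * fg + (1.0 - a) * bg, 0, 255).astype(np.uint8)

# Toy example: a soft-edged square "hand" blended onto a grey frame
h, w = 240, 320
local = np.full((h, w, 3), 128, np.uint8)
hand = np.zeros((h, w, 3), np.uint8); hand[60:180, 100:220] = (40, 90, 200)
alpha = np.zeros((h, w), np.float32); alpha[60:180, 100:220] = 1.0
alpha[55:185, 95:225] = np.maximum(alpha[55:185, 95:225], 0.3)  # feathered border
merged = composite_with_matte(local, hand, alpha)
print(merged.shape, merged.dtype)
```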


Subjects
Augmented Reality, Breast Neoplasms/diagnostic imaging, Breast/surgery, Mandible/surgery, Mandibular Reconstruction/methods, Telemedicine/methods, Video-Assisted Surgery/methods, Algorithms, Computer Simulation, Expert Systems, Female, Humans, Computer-Assisted Image Processing, Three-Dimensional Imaging/methods, Reproducibility of Results, Software, Computer-Assisted Surgery/methods, User-Computer Interface, Video Recording
19.
World Neurosurg; 129: e767-e775, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31203062

ABSTRACT

OBJECTIVE: To explore the effect of preoperative planning using mixed reality (MR) on training for percutaneous transforaminal endoscopic discectomy (PTED). METHODS: Before the training, an experienced chief physician was invited to plan the PTED puncture path on X-ray films of a lumbar spine model and on the 3D Slicer platform, respectively, and this was used as the standard to guide trainees. In total, 60 young residents were randomly divided into Group A (N = 30) and Group B (N = 30). Group A learned the 2-dimensional standard planning route, whereas Group B learned the standard route planning based on MR through the 3D Slicer platform. Trainees were then asked to perform PTED puncture on a lumbar spine model. Questionnaires were distributed to trainees before and after the training. During the training, puncture times, operating time (minutes), and fluoroscopy times were recorded. RESULTS: After the training, more trainees expressed recognition of MR, believing that it could help with preoperative planning and PTED training, and their high satisfaction indicated the success of the training. Moreover, the puncture times, operating time (minutes), and fluoroscopy times of Group B were significantly lower than those of Group A. CONCLUSIONS: MR technology contributes to preoperative planning of PTED and is beneficial in PTED training. It significantly reduces puncture times and fluoroscopy times, providing a standardized method for PTED training.


Subjects
Augmented Reality, Percutaneous Diskectomy/education, Intervertebral Disc Displacement/surgery, Simulation Training, Percutaneous Diskectomy/methods, Feasibility Studies, Humans
20.
Med Phys; 46(8): 3709-3718, 2019 Aug.
Article in English | MEDLINE | ID: mdl-31169914

ABSTRACT

PURPOSE: An accurate needle position is vitally important in low-dose-rate seed implantation brachytherapy. This paper aims to implement a mixed reality navigation system to assist needle placement in I-125 seed implantation brachytherapy for thoracoabdominal tumors and to validate the accuracy and quality of this type of method. METHODS: The surgical navigation system, based on mixed reality and a novel modified multi-information fusion method, fuses virtual organs and the preoperative plan with the real patient and tracks surgical tools in real time. Personalized image recognition and pose estimation were used to track needle punctures in real time and to perform the registration process. After a one-time registration with a hexagonal prism tracker using an iterative closest point algorithm, all information, including medical images and volume renderings of organs, needles, and seeds, was precisely merged with the patient. Doctors were able to observe the tumor target and visualize the preoperative plan. The system was validated in both phantom and animal experiments. Its accuracy was validated by calculating the positional and rotational error of each needle insertion, and the accuracy of implantation of each seed was determined in an animal experiment to test low-dose-rate brachytherapy accuracy. The efficiency of the system was also validated through time consumption assessments. RESULTS: In the phantom experiment, the average error of the needle locations was 0.664 mm and the angle error was 4.74°; average time consumption was 16.1 min with six needles inserted. In the animal experiment, the accuracy of needle insertion was 1.617 mm, the angle error was 5.574°, and the average error of the seed positions was 1.925 mm. CONCLUSIONS: This paper describes the design and experimental validation of a novel surgical navigation system based on mixed reality for I-125 seed brachytherapy of thoracoabdominal tumors. The system was validated through a series of phantom and animal experiments. Compared with a traditional image-guided system, the procedure presented here is convenient, displays clinically acceptable accuracy, and reduces the number of CT scans, allowing doctors to perform surgery based on a visualized plan. All the experimental results indicate that the procedure is ready to be applied in further clinical studies.
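The one-time registration above relies on an iterative closest point (ICP) alignment of the tracked hexagonal prism to its model. A generic, textbook point-to-point ICP (nearest-neighbour correspondences plus an SVD/Kabsch fit) and the kind of needle position/angle error computation reported in the results are sketched below under assumed inputs; this is not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=50, tol=1e-9):
    """Point-to-point ICP aligning src (N, 3) onto dst (M, 3)."""
    tree = cKDTree(dst)
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        _, idx = tree.query(cur)                   # nearest-neighbour matches
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = np.mean(np.linalg.norm(cur - dst[idx], axis=1))
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total

def needle_errors(tip_est, tip_true, dir_est, dir_true):
    """Needle tip position error (input units) and axis angle error (degrees)."""
    pos_err = np.linalg.norm(np.asarray(tip_est) - np.asarray(tip_true))
    cosang = np.clip(np.dot(dir_est, dir_true)
                     / (np.linalg.norm(dir_est) * np.linalg.norm(dir_true)), -1.0, 1.0)
    return pos_err, float(np.degrees(np.arccos(cosang)))

# Toy check: a small, well-initialised misalignment is recovered by ICP
rng = np.random.default_rng(3)
model = rng.uniform(-1.0, 1.0, (500, 3))
theta = np.radians(5)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.03, -0.02, 0.01])
observed = model @ R_true.T + t_true
R_hat, t_hat = icp(model, observed)
print("residual alignment error:",
      np.mean(np.linalg.norm(model @ R_hat.T + t_hat - observed, axis=1)))
print(needle_errors(tip_est=t_hat, tip_true=t_true,
                    dir_est=R_hat[:, 2], dir_true=R_true[:, 2]))
```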


Subjects
Augmented Reality, Brachytherapy/instrumentation, Animals, Equipment Design, Needles, Imaging Phantoms, Radiotherapy Dosage, Swine, X-Ray Computed Tomography, Workflow