Results 1 - 20 of 229
1.
Sensors (Basel) ; 21(6)2021 Mar 15.
Article in English | MEDLINE | ID: mdl-33804253

ABSTRACT

Metaverses embedded in our lives create virtual experiences inside the physical world. Moving toward metaverses in aircraft maintenance, mixed reality (MR) creates enormous opportunities for interaction with virtual airplanes (digital twins) that deliver a near-real experience while maintaining physical distancing during pandemics. 3D twins of modern machines exported to MR can be easily manipulated, shared, and updated, which is a major benefit for aviation colleges that still rely on retired aircraft for practice. We therefore propose mixed reality education and training in Boeing 737 aircraft maintenance on smart glasses, enhanced with a deep learning speech interaction module that lets trainee engineers control virtual assets and workflow with speech commands, keeping both hands free. Using a convolutional neural network (CNN) architecture to extract audio features, followed by classification layers for command and language identification, the speech module handles intermixed requests in English and Korean and gives corresponding feedback. Evaluation on test data showed high prediction accuracy, with average F1-scores of 95.7% and 99.6% for command and language prediction, respectively. The proposed speech interaction module further improved education and training in the aircraft maintenance metaverse, giving intuitive and efficient control over the operation and enhancing interaction with virtual objects in mixed reality.
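As an aside, a two-headed classifier of the kind described above (shared convolutional audio features, separate command and language outputs) can be sketched in a few lines of numpy. Everything here is illustrative: the layer sizes, weights, and the three-command vocabulary are invented, not taken from the paper.

```python
import numpy as np

def conv1d(x, kernel):
    """Valid 1-D convolution of a feature sequence with a small kernel."""
    n = len(x) - len(kernel) + 1
    return np.array([np.dot(x[i:i + len(kernel)], kernel) for i in range(n)])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy stand-in for an MFCC-like audio feature sequence.
features = np.sin(np.linspace(0, 6, 32))

hidden = np.maximum(conv1d(features, np.array([0.5, -0.5, 0.5])), 0.0)  # conv + ReLU
pooled = hidden.mean()                                                  # global pooling

# Two output heads share the pooled representation (weights are invented):
W_cmd = np.array([1.0, -1.0, 0.3])   # e.g. a {"rotate", "zoom", "next"} vocabulary
W_lang = np.array([0.8, -0.8])       # {"en", "ko"}
p_command = softmax(W_cmd * pooled)
p_language = softmax(W_lang * pooled)
```

In a real system each head would be trained jointly on labeled utterances; the point of the sketch is only the shared-feature, two-head shape.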


Subjects
Augmented Reality, Aircraft, Maintenance, Neural Networks (Computer), Speech
2.
Sensors (Basel) ; 21(5)2021 Mar 07.
Article in English | MEDLINE | ID: mdl-33800070

ABSTRACT

Augmented reality (AR) provides an alternative to traditional forms of human-machine interaction and facilitates access to certain technologies for groups with special needs, such as children. In pediatric healthcare, for instance, it is important to help children feel comfortable during the medical procedures and tests that may be performed on them. To tackle this issue with AR-based solutions, this article presents the design, implementation, and evaluation of a novel open-source collaborative framework for developing teaching, training, and monitoring applications in pediatric healthcare. The framework supports collaborative applications and shared experiences for AR devices, providing functionality for connecting with other AR devices and enabling real-time visualization and simultaneous interaction with virtual objects. Since all communications involved in AR interactions are handled by the AR devices themselves, the framework operates autonomously over a Local Area Network (LAN), requiring no cloud or external servers. To demonstrate its potential, a practical use case is presented: an application designed to motivate pediatric patients and encourage them to increase their physical activity through AR games. The games require no prior configuration, as they use ARCore's automatic surface detection. Moreover, the AR mobile gaming framework allows multiple players to join the same AR experience, so children can interact and collaborate while sharing the same AR content. In addition, the system provides a remote web application that collects and visualizes data on patient use, giving healthcare professionals qualified data about the mobility and mood of their patients through an intuitive, user-friendly web tool. Finally, to determine the performance of the proposed AR system, this article evaluates its latency and processing time; both are low enough to provide a good user experience.
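The autonomous-LAN design lends itself to a simple latency measurement. The sketch below times a UDP round trip on the loopback interface as a stand-in for two AR devices on the same LAN; the payload name and port handling are assumptions, not the framework's actual protocol.

```python
import socket
import threading
import time

def echo_server(sock):
    """Echo a single datagram back to its sender (stands in for a peer AR device)."""
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

# Peer socket bound on loopback; real devices would use their LAN addresses.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))  # OS picks a free port
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
t0 = time.perf_counter()
client.sendto(b"pose-update", server.getsockname())  # hypothetical message name
reply, _ = client.recvfrom(1024)
latency_ms = (time.perf_counter() - t0) * 1000.0     # round-trip time in ms
```

On a real LAN the same loop, run repeatedly, would give the latency distribution the paper reports on.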


Subjects
Augmented Reality, Delivery of Health Care, Pediatrics, Child, Health Personnel, Humans
3.
Sensors (Basel) ; 21(8)2021 Apr 11.
Article in English | MEDLINE | ID: mdl-33920452

ABSTRACT

Serious games are a promising approach to improving gait rehabilitation for people with gait disorders. Combined with a wearable augmented reality headset, serious games for gait rehabilitation in a clinical setting become feasible, allowing patients to move through a real environment while receiving fun and feedback that enhance motivation. This requires a method for obtaining accurate spatiotemporal gait parameters of the playing patient. To this end, we propose a new algorithm, HoloStep, that computes spatiotemporal gait parameters using only the head pose provided by an augmented reality headset (HoloLens). It detects peaks associated with initial contact events, using a combination of locking distance, locking time, and peak amplitude detection with custom thresholds for children with cerebral palsy (CP). The performance of HoloStep was compared, during a walking session at comfortable speed, against Zeni's reference algorithm, which is based on kinematics from a full 3D motion capture system. Our study included 62 children with CP, classified between levels I and III of the Gross Motor Function Classification System (GMFCS), and 13 healthy participants (HP). Sensitivity, specificity, accuracy, and precision for step detection with HoloStep were all above 96%. The intraclass correlation coefficient between step lengths computed with HoloStep and the reference was 0.92 (GMFCS I), 0.86 (GMFCS II/III), and 0.78 (HP). HoloStep performed well across a wide range of gait patterns, including children with CP using walking aids. These findings provide important insights for future gait interventions using augmented reality games for children with CP.
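The peak-detection idea behind HoloStep (an amplitude threshold plus a minimum interval between accepted peaks, analogous to its locking time) can be illustrated on synthetic head-bobbing data. This is a toy reconstruction from the abstract, not the published algorithm; the thresholds and the 60 Hz sampling rate are assumed.

```python
import numpy as np

def detect_steps(vertical_pos, fs, min_interval_s=0.3, min_amp=0.01):
    """Find initial-contact peaks in the head's vertical trajectory.

    A sample counts as a step peak if it is a local maximum, exceeds the
    mean height by min_amp (amplitude threshold), and comes at least
    min_interval_s after the previous accepted peak (a stand-in for a
    locking-time rule)."""
    baseline = vertical_pos.mean()
    min_gap = int(min_interval_s * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(vertical_pos) - 1):
        if (vertical_pos[i] > vertical_pos[i - 1]
                and vertical_pos[i] >= vertical_pos[i + 1]
                and vertical_pos[i] - baseline > min_amp
                and i - last >= min_gap):
            peaks.append(i)
            last = i
    return peaks

# Synthetic head bobbing at ~2 steps/s, sampled at 60 Hz for 5 s.
fs = 60
t = np.arange(0, 5, 1 / fs)
head_y = 1.6 + 0.02 * np.sin(2 * np.pi * 2 * t)  # metres
steps = detect_steps(head_y, fs)                 # indices of detected steps
```

Step times and, with a position estimate, step lengths would follow from the peak indices.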


Subjects
Augmented Reality, Cerebral Palsy, Smart Glasses, Adult, Biomechanical Phenomena, Cerebral Palsy/diagnosis, Child, Gait, Humans, Walking
4.
Sensors (Basel) ; 21(9)2021 Apr 22.
Article in English | MEDLINE | ID: mdl-33922079

ABSTRACT

In the medical field, guidance in following the surgical plan is crucial. Image overlay projection links the surgical plan to the patient: it realizes augmented reality (AR) by projecting a computer-generated image onto the surface of the target through a projector, visualizing additional information in the scene. By overlaying anatomical information or surgical plans on the surgical area, projection enhances the surgeon's understanding of the anatomical structure, intuitively visualizes the surgical target and the key structures of the operation, and avoids diverting the surgeon's sight between monitor and patient. However, projecting surgical navigation information onto the target precisely and efficiently remains a challenge. In this study, we propose a projector-based surgical navigation system. Through a gray-code-based calibration method, the projector is calibrated with a camera and then integrated with an optical spatial locator, so that navigation information can be accurately projected onto the target area. We validated the projection accuracy through back projection, with an average projection error of 3.37 pixels in the x direction and 1.51 pixels in the y direction, and through model projection, with an average position error of 1.03 ± 0.43 mm. Puncture experiments using the system achieved a success rate of 99%, and the system's performance was qualitatively analyzed through a questionnaire. The results demonstrate the efficacy of our proposed AR system.
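Gray-code structured light, the basis of the calibration described above, encodes each projector column so that adjacent stripe patterns differ in a single bit, making decoded boundaries robust to one-pixel errors. A minimal sketch of the encoding and decoding (illustrative, not the authors' calibration pipeline):

```python
def gray_encode(n):
    """Binary-reflected Gray code: successive values differ in exactly one bit."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Invert the Gray code by cumulative XOR of the shifted bits."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Each projector column c is projected as the bit pattern gray_encode(c)
# across successive stripe frames; the camera reads the bits per pixel and
# decodes the column index, yielding camera-projector correspondences.
bits = 10                        # 2**10 = 1024 projector columns
column = 618
observed = gray_encode(column)   # stripe pattern seen by the camera
recovered = gray_decode(observed)
```

The recovered correspondences then feed a standard camera-projector calibration.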


Subjects
Augmented Reality, Computer-Assisted Surgery, Calibration, Humans, Three-Dimensional Imaging, Imaging Phantoms, User-Computer Interface
5.
Sensors (Basel) ; 21(9)2021 Apr 28.
Article in English | MEDLINE | ID: mdl-33924773

ABSTRACT

In recent years, we have witnessed impressive advances in augmented reality systems and computer vision algorithms based on image processing and artificial intelligence. Thanks to these technologies, mainstream smartphones can estimate their own motion in 3D space with high accuracy. In this paper, we exploit such technologies to support the autonomous mobility of people with visual disabilities, identifying predefined virtual paths and providing context information, thus reducing the distance between the digital and real worlds. In particular, we present ARIANNA+, an extension of ARIANNA, a system explicitly designed for indoor and outdoor localization and navigation by visually impaired people. While ARIANNA assumes that landmarks, such as QR codes, and physical paths (colored tape, painted lines, or tactile paving) are deployed in the environment and recognized by the camera of a common smartphone, ARIANNA+ eliminates the need for any physical support thanks to the ARKit library, which we exploit to build a completely virtual path. Moreover, ARIANNA+ lets users interact more richly with the surrounding environment, through convolutional neural networks (CNNs) trained to recognize objects or buildings and to give access to content associated with them. By using a common smartphone as a mediating instrument with the environment, ARIANNA+ leverages augmented reality and machine learning to enhance physical accessibility. The proposed system allows visually impaired people to navigate easily in indoor and outdoor scenarios simply by loading a previously recorded virtual path, with automatic guidance along the route through haptic, speech, and sound feedback.
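Guidance along a recorded virtual path reduces, at its core, to finding the nearest waypoint to the current device pose and steering toward it. A minimal 2D sketch (the waypoints, units, and planar simplification are assumptions; ARIANNA+'s actual guidance logic is not described at this level in the abstract):

```python
import numpy as np

def guidance(path, position):
    """Return the distance to the recorded virtual path and the index of the
    nearest waypoint, from which heading feedback can be derived."""
    d = np.linalg.norm(path - position, axis=1)
    i = int(d.argmin())
    return float(d[i]), i

# Hypothetical recorded path (x, z) in metres, and a current ARKit position.
path = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [2.0, 1.0]])
dist, idx = guidance(path, np.array([1.1, 0.2]))
```

A larger `dist` would trigger stronger haptic or audio correction toward waypoint `idx`.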


Subjects
Augmented Reality, Visually Impaired Persons, Algorithms, Artificial Intelligence, Humans, Neural Networks (Computer)
6.
Sensors (Basel) ; 21(6)2021 Mar 23.
Article in English | MEDLINE | ID: mdl-33806863

ABSTRACT

An increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases for these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is of interest for applied eye tracking research in the cognitive and educational sciences, for example. While some research toolkits for VR already exist, few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR, based on Unity 3D and the Microsoft HoloLens 2, together with an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n = 21). On average, gaze estimates were reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user was at rest, which is on par with state-of-the-art mobile eye trackers.
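Angular accuracy and precision of a gaze estimate can be computed from direction vectors roughly as follows. The exact definitions used in the paper may differ (e.g., RMS-from-mean versus sample-to-sample precision), so treat this as one common convention on synthetic data:

```python
import numpy as np

def angle_deg(u, v):
    """Angle between two 3-D direction vectors, in degrees."""
    u, v = u / np.linalg.norm(u), v / np.linalg.norm(v)
    return np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))

def accuracy_precision(gaze_dirs, target_dir):
    """Accuracy: mean angular offset from the target direction.
    Precision: RMS of the angles between successive gaze samples."""
    offsets = [angle_deg(g, target_dir) for g in gaze_dirs]
    succ = [angle_deg(gaze_dirs[i], gaze_dirs[i + 1])
            for i in range(len(gaze_dirs) - 1)]
    return float(np.mean(offsets)), float(np.sqrt(np.mean(np.square(succ))))

# Synthetic gaze samples fixed exactly 1 degree off the target axis.
target = np.array([0.0, 0.0, 1.0])
gaze = [np.array([np.tan(np.radians(1.0)), 0.0, 1.0]) for _ in range(5)]
acc, prec = accuracy_precision(gaze, target)
```

With real data, `gaze` would hold the per-frame gaze rays reported by the headset during a fixation.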


Subjects
Augmented Reality, Smart Glasses, Virtual Reality
7.
Sensors (Basel) ; 21(4)2021 Feb 15.
Article in English | MEDLINE | ID: mdl-33672053

ABSTRACT

During the last decade, orthopedic oncology has benefited from computerized medical imaging, which reduces human dependency and improves accuracy and clinical outcomes. However, traditional surgical navigation systems do not always adapt well to these interventions. Augmented reality (AR) and three-dimensional (3D) printing are technologies recently introduced into the surgical environment with promising results. Here we present an innovative solution combining 3D printing and AR in orthopedic oncological surgery. A new surgical workflow is proposed, including 3D-printed models and a novel AR-based smartphone application (app) that can display the patient's anatomy and the tumor's location. A 3D-printed reference marker, designed to fit a unique position on the affected bone tissue, enables automatic registration. The system was evaluated in terms of visualization accuracy and usability throughout the surgical workflow. Experiments on six realistic phantoms yielded a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results and the positive feedback from surgeons and patients suggest that combining AR and 3D printing can improve efficacy, accuracy, and the patient experience.
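Automatic registration from a fitted reference marker amounts to estimating a rigid transform between corresponding point sets. A standard Kabsch/SVD sketch on invented points (not the authors' implementation):

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rotation R and translation t with dst ≈ src @ R.T + t
    (Kabsch algorithm on corresponding 3-D point sets)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

# Marker points in model coordinates, and the same points observed on the
# patient: here, a known 90-degree rotation about z plus a translation.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
dst = src @ Rz.T + np.array([5.0, 2.0, 0.0])

R, t = rigid_register(src, dst)
err = np.linalg.norm(src @ R.T + t - dst, axis=1).max()  # residual alignment error
```

In practice the marker's unique fit supplies the correspondences, and the residual plays the role of the reported visualization error.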


Subjects
Augmented Reality, Three-Dimensional Imaging, Smartphone, Computer-Assisted Surgery, Humans, Three-Dimensional Printing, Workflow
8.
BMC Geriatr ; 21(1): 144, 2021 02 26.
Article in English | MEDLINE | ID: mdl-33637043

ABSTRACT

BACKGROUND: Impaired balance leading to falls is common in older adults, and there is strong evidence that balance training reduces falls and increases independence. Reduced resources in health care will mean that fewer people get help with rehabilitation training. In this regard, the new technology of augmented reality (AR) could be helpful. With AR, older adults can receive instructions and feedback on their progress in balance training. The purpose of this pilot study was to examine the feasibility of using AR-based visual-interactive tools in balance training for older adults. METHODS: Seven older adults (66-88 years old) with impaired balance trained under the supervision of a physiotherapist twice a week for six weeks using AR-based visual-interactive guidance, facilitated through a Microsoft HoloLens holographic display. Afterwards, participants and physiotherapists were interviewed about the technology and their experience of the training. Fear of falling and balance ability were also measured before and after training. RESULTS: Five participants experienced the technology as positive in terms of increased motivation and feedback. Experiences were mixed regarding the physical and technical aspects of the HoloLens and the design of its application. Participants also described issues needing improvement: the training program was difficult and monotonous, the HoloLens hardware felt heavy, the application's menu was difficult to control with different hand manoeuvres, and calibration took a long time. Suggestions for improvement were recorded. The balance tests and self-assessment instruments indicated no improvement in balance performance after AR training. CONCLUSIONS: The study showed that training with this technology is, to some extent, feasible for older adults, but needs further development. The technology also seemed to stimulate motivation, a prerequisite for adherence to training. However, both the technology and the training require further development and testing in a larger context.


Subjects
Accidental Falls, Augmented Reality, Accidental Falls/prevention & control, Aged, Aged (80 and over), Fear, Feasibility Studies, Humans, Pilot Projects, Postural Balance, Technology
9.
Zentralbl Chir ; 146(1): 37-43, 2021 Feb.
Article in German | MEDLINE | ID: mdl-33588501

ABSTRACT

BACKGROUND: The digital transformation of healthcare is changing the medical profession. Augmented/virtual reality (AR/VR) and robotics are increasingly used in different clinical contexts and require supporting education and training, which must begin in medical school. There is currently a large discrepancy between the high demand and the number of scientifically validated concepts. The aim of this work was the conceptual design and structured evaluation of a newly developed learning/teaching concept for the digital transformation of medicine, with a special focus on surgical teaching. METHODS: Thirty-five students participated in three courses of the blended learning curriculum "Medicine in the digital age". The fourth module of this course deals with virtual reality, augmented reality, and robotics in surgery. It is divided into the following parts: (1) immersive surgical simulation of a laparoscopic cholecystectomy, (2) liver surgery planning using AR/VR, (3) basic skills on the VR simulator for robotic surgery, (4) collaborative surgery planning in virtual space, and (5) expert discussion. After completion of the overall curriculum, the course concept was evaluated qualitatively and quantitatively by means of semi-structured interviews and standardized pre-/post-evaluation questionnaires. RESULTS: In the qualitative analysis of the interviews, 79 statements were assigned to four main categories. The largest share (35%) concerned the "expert discussion", which the students considered an essential part of the course concept. In addition, the students perceived the course as a horizon-widening "learning experience" (29% of statements) with high "practical relevance" (27%). The quantitative evaluation showed positive development in the three sub-competences of knowledge, skills, and attitude. CONCLUSION: Surgical teaching can be used profitably to develop digital skills. The pace of the digital transformation in the surgical specialty must be taken into account, and curricular adaptation should be anchored in the course concept.


Subjects
Virtual Reality, Augmented Reality, Clinical Competence, Curriculum, Humans, Medical Schools
10.
PLoS One ; 16(1): e0242581, 2021.
Article in English | MEDLINE | ID: mdl-33481778

ABSTRACT

Artists can represent a 3D object using only contours in a 2D drawing. Prior studies have shown that people can use such drawings to perceive 3D shapes reliably, but it is not clear how useful this contour information actually is in a real dynamic scene in which people interact with objects. To address this issue, we developed an augmented reality (AR) device that can show a participant either a contour drawing or a grayscale image of a real dynamic scene in an immersive manner. In three behavioral experiments, we compared people's performance in a variety of everyday tasks with both contour drawings and grayscale images under natural viewing conditions. The results showed that people performed almost equally well with both types of images. Contour information may therefore be sufficient for our visual system to obtain much of the 3D information needed for successful visuomotor interactions in everyday life.
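A contour drawing of a scene can be approximated from a grayscale image by thresholding the gradient magnitude; this numpy sketch is only a stand-in for whatever contour rendering the AR device actually used:

```python
import numpy as np

def contour_map(img, thresh=0.2):
    """Binary contour image from a grayscale frame via gradient magnitude.

    Pixels whose gradient magnitude exceeds `thresh` times the maximum are
    marked as contour; everything else is blank."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return (mag > thresh * mag.max()).astype(np.uint8)

# Synthetic grayscale scene: a bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
edges = contour_map(img)  # contour pixels trace the square's border
```

Run per frame, such a map would produce the "contour drawing" condition of a dynamic scene.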


Subjects
Art, Three-Dimensional Imaging, Augmented Reality, Humans, Play and Playthings, Reaction Time, Task Performance and Analysis
11.
BMJ Case Rep ; 14(1)2021 Jan 11.
Article in English | MEDLINE | ID: mdl-33431465

ABSTRACT

Body integrity identity disorder (BIID) is a rare condition characterised by a discrepancy between specific areas of an individual's perceived body image and body schema which causes the individual to disassociate those physical areas of their body from their internal representation. There are currently no efficacious, ethically unambiguous means for achieving long-lasting symptom reductions. In the case we present, two patients with BIID underwent an augmented reality (AR)-based simulation that virtually amputated their alienated limbs, allowing them to experience their ideal selves. During the exposure, both patients reported reductions in BIID-related complaints. These preliminary results suggest the existence of a possible therapeutic and diagnostic potential that AR possesses, which warrants further consideration within clinical healthcare settings.


Subjects
Augmented Reality, Body Integrity Identity Disorder/therapy, Mind-Body Therapies, Adult, Body Integrity Identity Disorder/diagnosis, Body Integrity Identity Disorder/psychology, Humans, Male
12.
Expert Rev Med Devices ; 18(1): 1-8, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33322948

ABSTRACT

Introduction: The field of augmented reality-mediated spine surgery is growing rapidly and holds great promise for improving surgical capabilities and patient outcomes. Augmented reality can assist with complex or atypical cases involving challenging anatomy. As neuronavigation evolves, fundamental technical limitations remain in line-of-sight interruption and operator attention shift, which this novel augmented reality technology helps to address. Areas covered: XVision is a recently FDA-approved head-mounted display for intraoperative neuronavigation, compatible with all current conventional pedicle screw technology. The device is a wireless, customizable headset with an integrated surgical tracking system and a transparent retinal display. This review discusses the available literature on the safety and efficacy of XVision, as well as the current state of augmented reality technology in spine surgery. Expert opinion: Augmented reality spine surgery is an emerging technology that may increase the precision, efficiency, and safety of manual and robotic computer-navigated pedicle screw insertion while decreasing radiation exposure. Initial clinical experience with XVision has shown good outcomes and positive operator feedback. Now that initial clinical safety and efficacy have been demonstrated, ongoing experience must be studied to empirically validate this technology and drive further innovation in this rapidly evolving field.


Subjects
Augmented Reality, Equipment and Supplies, /adverse effects, Clinical Trials as Topic, Humans, Postmarketing Product Surveillance, Spine/surgery, Treatment Outcome
13.
Plast Reconstr Surg ; 147(1): 25e-29e, 2021 01 01.
Article in English | MEDLINE | ID: mdl-33370048

ABSTRACT

BACKGROUND: During deep inferior epigastric perforator (DIEP) flap harvest, identifying and localizing the epigastric arteries and their perforators is crucial. Holographic augmented reality is an innovative technique that can visualize this patient-specific anatomy, extracted from a computed tomography scan, directly on the patient. This study describes an innovative workflow to achieve this. METHODS: A software application for the Microsoft HoloLens was developed to visualize the anatomy as a hologram. Using abdominal nevi as natural landmarks, the anatomy hologram is registered to the patient. To ensure that the hologram remains correctly positioned when the patient or the user moves, real-time patient tracking is obtained with a quick response (QR) marker attached to the patient. RESULTS: Holographic augmented reality can be used to visualize the epigastric arteries and their perforators in preparation for DIEP flap harvest. CONCLUSIONS: Potentially, this workflow can be used to visualize the vessels intraoperatively. Furthermore, the workflow is intuitive and could be applied to other flaps or other types of surgery.
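Keeping a hologram registered to a moving QR marker is, at bottom, a matter of re-composing rigid transforms each time the tracker reports a new marker pose. A minimal sketch with invented poses (the HoloLens tracking pipeline itself is not described in the abstract):

```python
import numpy as np

def make_pose(R, t):
    """4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# The anatomy hologram is fixed relative to the QR marker: here, 10 cm above it.
marker_to_hologram = make_pose(np.eye(3), np.array([0.0, 0.1, 0.0]))

def hologram_world_pose(world_to_marker):
    """Re-compose the hologram pose whenever the marker pose is updated."""
    return world_to_marker @ marker_to_hologram

# Patient (and marker) shifts 5 cm along x: the hologram follows rigidly.
pose0 = hologram_world_pose(make_pose(np.eye(3), np.array([0.0, 0.0, 0.0])))
pose1 = hologram_world_pose(make_pose(np.eye(3), np.array([0.05, 0.0, 0.0])))
```

Because only the marker pose changes per frame, the hologram stays locked to the patient without re-registration.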


Subjects
Augmented Reality, Mammaplasty/methods, Perforator Flap/transplantation, Computer-Assisted Surgery/methods, Tissue and Organ Harvesting/methods, Epigastric Arteries/diagnostic imaging, Epigastric Arteries/surgery, Feasibility Studies, Female, Holography, Humans, Three-Dimensional Imaging/methods, Intraoperative Care/methods, Perforator Flap/blood supply, X-Ray Computed Tomography
14.
Methods Mol Biol ; 2199: 347-356, 2021.
Article in English | MEDLINE | ID: mdl-33125660

ABSTRACT

Augmented reality (AR) allows a computer-generated 3D model to be superimposed onto a real-world environment in real time. The model can then be manipulated or probed interactively as if it were part of the real world. The application of AR to visualizing macromolecular structures is growing, primarily in showing preset collections of scenes for educational purposes. Here, however, our emphasis is on exploiting AR as a tool to facilitate scientific communication on the go. We surveyed freely available mobile software and custom-built tools that allow the display of user-specified protein structures. We provide step-by-step guides for a standalone app, Ollomol (iOS and Android), as well as an in-browser web app, WebAR-PDB. Both allow users to specify entries from the Protein Data Bank (PDB) for an elementary AR experience. AR enhances interactivity and stimulates the imagination in macromolecular visualization.
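An AR viewer that accepts user-specified PDB entries ultimately needs atom coordinates from the fixed-column PDB format. A minimal, illustrative parser for ATOM records follows; the two-atom snippet is fabricated for demonstration and is not from a real entry:

```python
# ATOM records use fixed columns: name in 13-16, x/y/z in 31-54, element in 77-78.
PDB_SNIPPET = """\
ATOM      1  N   MET A   1      38.715  17.173   4.648  1.00 15.00           N
ATOM      2  CA  MET A   1      38.020  16.062   5.324  1.00 15.00           C
"""

def parse_atoms(pdb_text):
    """Extract name, coordinates, and element from ATOM records."""
    atoms = []
    for line in pdb_text.splitlines():
        if line.startswith("ATOM"):
            atoms.append({
                "name": line[12:16].strip(),
                "x": float(line[30:38]),
                "y": float(line[38:46]),
                "z": float(line[46:54]),
                "element": line[76:78].strip(),
            })
    return atoms

atoms = parse_atoms(PDB_SNIPPET)
```

A real app would fetch the text for a user-specified PDB ID and hand the parsed atoms to its renderer.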


Subjects
Augmented Reality, Computer Graphics, Data Visualization, Mobile Applications, User-Computer Interface
15.
Orthop Clin North Am ; 52(1): 15-26, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33222981

ABSTRACT

Augmented reality (AR) technology enhances a user's perception by superimposing digital information on physical images while still allowing interaction with the physical world. The tracking, data processing, and display technology of traditional computer-assisted surgery (CAS) navigation could potentially be consolidated into an AR headset equipped with high-fidelity cameras, microcomputers, and optical see-through lenses that create digital holographic images. This article evaluates AR applications specific to total knee arthroplasty and total hip arthroplasty, and the opportunities for AR to enhance arthroplasty education and professional development.


Assuntos
Artroplastia de Quadril/educação , Artroplastia do Joelho/educação , Realidade Aumentada , Instrução por Computador , Ortopedia/educação , Competência Clínica , Humanos , Cirurgia Assistida por Computador/instrumentação , Cirurgia Assistida por Computador/métodos
16.
BMC Med Educ ; 20(1): 510, 2020 Dec 16.
Article in English | MEDLINE | ID: mdl-33327963

ABSTRACT

BACKGROUND: Cost-effective methods to facilitate practical medical education are in high demand, and mixed reality (MR) technology seems suitable for giving students instructions when learning a new practical task. To evaluate a step-by-step MR guidance system for teaching a practical medical procedure, we conducted a randomized, single-blinded prospective trial on medical students learning bladder catheter placement. METHODS: We enrolled 164 medical students, randomized into two groups, who received instructions on performing bladder catheter placement on a male catheterization training model. One group (107 students) was instructed by an instructor, while the other (57 students) was instructed via an MR guidance system on a Microsoft HoloLens. Both groups did hands-on training. A standardized questionnaire covering previous knowledge, interest in modern technologies, and a self-evaluation was completed, and students were also asked to rate the system's usability. We assessed both groups' learning outcomes via a standardized objective structured clinical examination (OSCE). RESULTS: The evaluation of the learning outcome revealed an average score of 19.96 ± 2.42 points for the control group and 21.49 ± 2.27 for the MR group; the MR group's result was significantly better (p = 0.00). The self-evaluations revealed no difference between groups; however, the control group gave higher ratings for the quality of instruction. The MR system showed lower usability, with a cumulative system usability scale (SUS) score of 56.6 (lower half) and a cumulative score of 24.2 ± 7.3 (n = 52) out of 100 on the NASA task load index. CONCLUSIONS: MR is a promising tool for teaching practical skills and has the potential to enable superior learning outcomes. Advances in MR technology are necessary to improve the usability of current systems. TRIAL REGISTRATION: German Clinical Trial Register ID: DRKS00013186.
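The SUS score quoted above (56.6, in the lower half of the 0-100 scale) follows the standard System Usability Scale arithmetic, which can be sketched as:

```python
def sus_score(responses):
    """System Usability Scale: 10 items, each rated 1-5.

    Odd-numbered items contribute (rating - 1), even-numbered items
    (5 - rating); the sum is scaled by 2.5 to a 0-100 range."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# All-neutral answers (3 everywhere) land at the scale midpoint of 50.
midpoint = sus_score([3] * 10)
```

Averaging per-participant scores gives the cumulative figure a study reports.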


Subjects
Augmented Reality, Computer-Assisted Instruction/methods, Graduate Medical Education/methods, Urinary Catheterization, Virtual Reality, Adult, Clinical Competence, Diagnostic Self Evaluation, Educational Measurement, Female, Humans, Male, Prospective Studies, Single-Blind Method, Young Adult
17.
Beijing Da Xue Xue Bao Yi Xue Ban ; 52(6): 1124-1129, 2020 Dec 18.
Article in Chinese | MEDLINE | ID: mdl-33331325

ABSTRACT

OBJECTIVE: To explore the application of mixed reality to surgery for oral and maxillofacial tumors. METHODS: Patients diagnosed with an oral and maxillofacial tumor who were referred to the Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology from December 2018 to January 2020 were selected. The patients' preoperative contrast-enhanced computed tomography data were imported into the StarAtlas Holographic Medical Imaging System (Visual 3D Corp., Beijing, China). A three-dimensional (3D) model of the tumor and key structures, such as bone and vessels, was reconstructed to present their spatial relationship, followed by delineation of the key structures and preoperative virtual surgical planning. Using mixed reality, the real-time 3D model was displayed stereotactically over the surgical site. While remaining sterile during the operation, the surgeon could adjust the 3D model with simple gestures and observe the location, extent, and size of the tumor and the key structures adjacent to it. Mixed reality was used to assist the operation: 3D model registration was performed for guidance before tumor excision, and intraoperative real-time verification was performed during tumor exposure and after excision. A Likert scale was used to evaluate the technique after the operation. RESULTS: Eight patients underwent mixed reality-assisted tumor resection, and all operations were completed successfully. The average time for 3D model registration was 12.0 minutes. In all cases, the surgeon could intuitively and three-dimensionally observe the 3D model of the tumor and the surrounding anatomical structures, and could adjust the model during the operation. The Likert scale results showed that the mixed reality technique scored highly on perceptual accuracy, help in locating anatomical structures, the role of model guidance during surgery, and the potential for improving surgical safety (4.22, 4.19, 4.16, and 4.28 points, respectively). All eight patients healed well without perioperative complications. CONCLUSION: By providing real-time stereotactic visualization of the surgical-site anatomy and guiding the operation through a 3D model, mixed reality can improve the accuracy and safety of oral and maxillofacial tumor excision.


Subjects
Neoplasms, Computer-Assisted Surgery, Augmented Reality, China, Humans, Three-Dimensional Imaging, Retrospective Studies
18.
JMIR Mhealth Uhealth ; 8(12): e21643, 2020 12 31.
Article in English | MEDLINE | ID: mdl-33382377

ABSTRACT

BACKGROUND: The recent widespread availability of augmented reality via smartphone offers an opportunity to translate cue exposure therapy for smoking cessation from the laboratory to the real world. Despite significant reductions in the smoking rates in the last decade, approximately 13.7% of the adults in the United States continue to smoke. Smoking-related cue exposure has demonstrated promise as an adjuvant therapy in the laboratory, but practical limitations have prevented its success in the real world. Augmented reality technology presents an innovative approach to overcome these limitations. OBJECTIVE: The aim of this study was to develop a smartphone app that presents smoking-related augmented reality images for cue exposure. Smokers provided feedback on the images and reported on the perceived urge to smoke, qualities of reality/coexistence, and general feedback about quality and functioning. The feedback was used to refine the augmented reality images within the app. METHODS: In collaboration with an augmented reality design company, we developed 6 smoking-related images (cigarette, lighter, ashtray, lit cigarette in ashtray, etc) and 6 neutral images similar in size or complexity for comparison (pen, eraser, notebook, soda bottle with droplets, etc). Ten smokers completed a survey of demographic characteristics, smoking history and behavior, dependence on nicotine, motivation to quit smoking, and familiarity with augmented reality technology. Then, participants viewed each augmented reality image and provided ratings on 10-point Likert scales for urge to smoke and reality/coexistence of the image into the scene. Participants were also queried with open-ended questions regarding the features of the images. RESULTS: Of the 10 participants, 5 (50%) had experienced augmented reality prior to the laboratory visit, but only 4 of those 5 participants used augmented reality at least weekly. 
Although the sample was small (N=10), smokers reported significantly higher urge to smoke after viewing the smoking-related augmented reality images (mean 4.58, SD 3.49) than after viewing the neutral images (mean 1.42, SD 3.01) (Z=-2.14, P=.03; d=0.70). The average reality and coexistence ratings did not differ between smoking-related and neutral images (all P>.29). On the 10-point scale, the augmented reality images were on average rated as realistic (mean [SD] score 6.49 [3.11]), with good environmental coexistence (mean [SD] score 6.93 [3.04]) and user coexistence (mean [SD] score 6.38 [3.27]). Participant interviews revealed areas of excellence (eg, details of the lit cigarette) and areas for improvement (eg, stability of images, lighting). CONCLUSIONS: All images were generally perceived as realistic and well integrated into the environment. Moreover, the smoking-related augmented reality images produced a higher urge to smoke than the neutral ones. Overall, our findings support the potential utility of augmented reality for cue exposure therapy. Future directions and next steps are discussed.
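The paired comparison of urge ratings reported above (Z=-2.14) is the form of result a Wilcoxon signed-rank test produces. As a rough illustration only (hypothetical ratings, not the study's data or code), a minimal normal-approximation version of the statistic can be sketched in Python:

```python
import math

def wilcoxon_signed_rank_z(x, y):
    """Normal-approximation z statistic for the Wilcoxon signed-rank test
    on paired samples x and y. Zero differences are dropped and tied
    absolute differences receive average ranks (no tie-variance or
    continuity correction -- a simplified sketch)."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    n = len(diffs)
    if n == 0:
        return 0.0
    # Rank absolute differences, assigning average ranks to ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return (w_plus - mean) / sd

# Hypothetical per-participant urge ratings (smoking vs neutral images).
smoking = [8, 5, 6, 2, 7, 4, 3, 6, 5, 1]
neutral = [2, 1, 4, 2, 3, 1, 2, 5, 1, 1]
print(round(wilcoxon_signed_rank_z(smoking, neutral), 2))
```

In practice, `scipy.stats.wilcoxon` would be preferred for small samples, since it offers exact p-values and tie corrections that this sketch omits.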


Assuntos
Realidade Aumentada , Motivação , Abandono do Hábito de Fumar , Adulto , Humanos , Aplicativos Móveis , Fumantes , Fumar/efeitos adversos , Estados Unidos
19.
Urologiia ; (5): 37-40, 2020 Nov.
Artigo em Russo | MEDLINE | ID: mdl-33185344

RESUMO

AIM: To evaluate the efficiency and usefulness of augmented reality (AR) technology using HoloLens glasses for laparoscopic partial nephrectomy (LPN). MATERIALS AND METHODS: From July to December 2019, a total of 5 patients with localized kidney cancer (cT1aN0M0) underwent AR-assisted LPN. The mean RENAL score was 6 points (5-8). Preoperatively, all patients underwent contrast-enhanced multispiral computed tomography (MSCT). Three-dimensional reconstructions of the kidney, the tumor, part of the abdominal aorta with the renal artery and its branches, and part of the inferior vena cava with the renal vein were segmented with color coding and combined into a single virtual 3D model, which was loaded into the program to display the image in the AR glasses. The duration of surgery and thermal ischemia, the type and frequency of intraoperative complications, and the time spent preparing the 3D model and the Microsoft HoloLens device were evaluated. To assess the feasibility of using AR technology intraoperatively, a Likert scale was filled out by the surgeon. RESULTS: Preparing the model took 10 (9-11) hours, including the time to optimize the model and to set up its display and interactions. The setup of HoloLens required an average of 7.8 (5-12) min. The total duration of the operation and the period of warm ischemia were 108 (90-120) and 20 (15-25) min, respectively, while intraoperative blood loss was 160 (110-250) ml. In all cases, a negative surgical margin was achieved. The surgeon who performed all the operations assessed the use of AR technology with the HoloLens device as highly beneficial in all clinical cases. CONCLUSION: The use of AR technology with a HoloLens holographic device during LPN can lead to improved treatment outcomes.


Assuntos
Neoplasias Renais , Laparoscopia , Urologia , Realidade Aumentada , Humanos , Neoplasias Renais/diagnóstico por imagem , Neoplasias Renais/cirurgia , Nefrectomia , Estudos Retrospectivos
20.
Annu Int Conf IEEE Eng Med Biol Soc ; 2020: 4612-4615, 2020 07.
Artigo em Inglês | MEDLINE | ID: mdl-33019021

RESUMO

Marker tracking for postural and range of motion (ROM) measurements spans multiple disciplines (e.g., healthcare, ergonomics, engineering), yet a viable real-time mobile application for measuring limb angles and body posture is currently lacking. To address this need, a novel Android smartphone augmented-reality-based application was developed using the AprilTag2 fiducial marker system. To evaluate the app, two markers were printed on paper and attached to a wall. A Samsung S6 mobile phone was fixed on a tripod, parallel to the wall. The smartphone app tracked and recorded marker orientation and 2D position data in the camera frame, from the front and rear cameras, for different smartphone placements. The average error between mobile phone and reference angle measurements was less than 1 degree for all test settings (back camera=0.29°, front camera=0.33°, yaw rotation=0.75°, tilt rotation=0.22°). The average error between mobile phone and reference distance measurements was less than 4 mm for all test settings (back camera=1.8 mm, front camera=2.5 mm, yaw rotation=3 mm, tilt rotation=3.8 mm). Overall, the app obtained valid and reliable angle and distance measurements with smartphone positions and cameras that would be expected in practice. Thus, this app is viable for clinical ROM and posture assessments.
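Fiducial detectors such as AprilTag2 report each detected marker's corner pixel coordinates, from which an in-plane angle of the kind validated above can be derived. A stdlib-only sketch (the corner ordering and function name are illustrative assumptions, not the app's actual code):

```python
import math

def marker_angle(corners):
    """In-plane rotation of a square fiducial marker from its corner pixel
    coordinates, as an AprilTag-style detector would report them.
    corners: [(x, y), ...] ordered bottom-left, bottom-right, top-right,
    top-left. Returns the angle (degrees) of the bottom edge relative to
    the image x-axis."""
    (x0, y0), (x1, y1) = corners[0], corners[1]
    return math.degrees(math.atan2(y1 - y0, x1 - x0))

# Simulate a marker rotated by 30 degrees: rotate unit-square corners
# about the origin and recover the angle from the corner coordinates.
theta = math.radians(30)
base = [(0, 0), (1, 0), (1, 1), (0, 1)]
rot = [(x * math.cos(theta) - y * math.sin(theta),
        x * math.sin(theta) + y * math.cos(theta)) for x, y in base]
print(round(marker_angle(rot), 2))  # 30.0
```

A relative joint or limb angle would then be the difference between the angles of two such markers in the same camera frame.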


Assuntos
Aplicativos Móveis , Smartphone , Realidade Aumentada , Postura , Amplitude de Movimento Articular