Results 1 - 12 of 12
1.
Cureus ; 16(4): e59296, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38813320

ABSTRACT

Background: Suturing requires repeated practice with guidance to prevent skill deterioration; however, guidance is often limited by expert availability. There is evidence that augmented reality (AR) may assist procedural skill acquisition among learners. This study examines the use of an AR suture guidance application to assist the independent practice of suturing. Methodology: A novel suture guidance application was designed for the Microsoft HoloLens. The guidance system included a calibration system and holograms projected over a suture pad in a stepwise manner. To assess the application, 30 medical students were recruited and randomly assigned to two groups. The control group (n = 16) was given 30 minutes of independent suture practice, while the experimental group (n = 14) used the suture guidance application. Both groups completed a pre- and post-test wound closure assessment. After the post-test, the control group trialed the suture guidance application, and all participants completed a feedback survey on it. Statistical analysis was completed in Stata (StataCorp., College Station, TX, USA) using paired Student's t-tests and Welch's t-tests at a 0.05 significance level (95% confidence). Results: Both groups demonstrated a significant improvement in total time and time per stitch on the post-test. Additionally, comparing pre- and post-test assessments in the experimental group revealed a significant improvement in the total number of stitches (p = 0.007), the ratio of bisecting stitches (p = 0.02), and the symmetry of stitch bite (p = 0.03). The feedback survey supported the application for guiding suture placement and spacing; participants identified limitations in hologram stability and neck positioning. Conclusions: This study suggests the potential of AR to facilitate the independent practice of wound closure within simulation environments.
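For illustration only, the within-group (pre vs. post) and between-group comparisons described above map onto a paired Student's t-test and a Welch's t-test, respectively. A minimal Python sketch of the equivalent tests, using SciPy rather than Stata and made-up placeholder values rather than the study's data, is:

```python
# Minimal sketch of the statistical comparisons described above, using SciPy
# instead of Stata. All numbers are made-up placeholders, not study data.
from scipy import stats

# Hypothetical total suturing times (seconds) for the experimental group, pre vs. post.
pre_times  = [410, 395, 430, 460, 388, 402, 415, 440, 399, 425, 433, 407, 418, 452]
post_times = [350, 340, 365, 390, 330, 345, 355, 380, 338, 360, 372, 348, 352, 385]

# Paired Student's t-test: pre- vs. post-test within the same participants.
t_paired, p_paired = stats.ttest_rel(pre_times, post_times)

# Welch's t-test: experimental vs. control post-test times (unequal variances allowed).
control_post = [370, 355, 380, 400, 345, 362, 375, 395, 350, 368, 384, 358, 366, 398, 372, 361]
t_welch, p_welch = stats.ttest_ind(post_times, control_post, equal_var=False)

# Significance judged at the 0.05 level (95% confidence).
print(f"paired: t={t_paired:.2f}, p={p_paired:.4f}; Welch: t={t_welch:.2f}, p={p_welch:.4f}")
```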

2.
Cureus ; 16(3): e55901, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38463412

ABSTRACT

Background: Operating room (OR) nurses' training for surgical fields such as neurosurgery is often inconsistent and overly lengthy because procedures are not consistently scheduled and are, for the most part, emergencies. Virtual reality (VR) simulation has been explored for nurse training in various contexts with positive results. Objectives: To develop a VR simulation replicating a pediatric neurosurgical craniotomy, reflecting a real OR scenario and the surgical procedural sequence of a craniotomy, and to assess OR nurses' confidence in assisting craniotomy procedures as scrub nurses before and after the VR simulation. Methods: A pediatric craniotomy procedure was replicated using VR technology through a collaborative partnership between education, content, and technology experts within the Hospital for Sick Children, Toronto. OR nurses' self-confidence in assisting craniotomy procedures was explored pre- and post-VR training sessions using a seven-item questionnaire developed by the authors to evaluate knowledge relevant to assisting craniotomy procedures. Results: In total, 7 OR nurses participated in the study. The post-VR questionnaires showed an increase in the answers "extremely comfortable with the procedure" and "moderately comfortable with the procedure" compared with the pre-VR sessions for all items except "identify the hemostatic agents required during a bleed," for which no difference was noted. There were no issues with the equipment. Conclusion: A VR simulation session is an acceptable model for training OR nurses in the scrub nurse role for craniotomy procedures. VR simulation is a practical learning strategy for clinical situations that occur inconsistently in real-time practice.

3.
Cureus ; 15(11): e48450, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38073980

ABSTRACT

Hepatocellular carcinoma spreads within the liver via the portal vein. Appropriate mapping of portal segments is therefore necessary for laparoscopic anatomical liver resection. However, because of the lack of tactile feedback and the limited surgical view in laparoscopy, augmented reality (AR) has recently been utilized in laparoscopic liver surgery to identify the tumor, vessels, and portal segments. Moreover, artificial intelligence (AI) has been employed to identify landmarks in two-dimensional (2D) images because of concerns regarding the accuracy of superimposing a three-dimensional (3D) model onto a 2D laparoscopic image. In this study, we report an AI-assisted, AR-based projection mapping method that superimposes preoperative 3D models of the portal segments onto laparoscopic images. To superimpose the 3D models, the liver silhouette must first be detected in the laparoscopic images. For AI-based silhouette detection, labeled liver silhouettes were obtained from 380 images in surgical videos as training images. We implemented this technique with Detectron2, a PyTorch-based object detection library by Facebook AI Research (now Meta AI, Menlo Park, California, United States). In the videos, the detected liver edges were displayed as green outlines. Additionally, 3D liver models with segmental mapping were generated from computed tomography images using the open-source software 3D Slicer. For the AR display, we utilized the model target function of the Vuforia SDK (PTC, Inc., Boston, Massachusetts, United States), an industrial AR library with silhouette-based AR display. Lastly, we merged the AI output video with the 3D model in Unity (Unity Software Inc., San Francisco, California, United States) to establish projection mapping of the portal segments onto 2D surgical images. Accuracy was assessed by measuring the maximum error between the liver edges in laparoscopic images and the 3D liver silhouettes in five surgical videos. The maximum error between liver edges and 3D model silhouettes ranged from 4 mm to 22 mm in the AI-based approach and from 12 mm to 55 mm in the non-AI-based approach; the mean error was 14.5 mm and 31.2 mm, respectively. The 3D AR display was maintained despite camera movement. Thus, our AI-assisted projection mapping of the portal segments could offer a new approach for laparoscopic anatomical liver resection.
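The silhouette-detection step can be pictured with a short Detectron2 inference sketch. This is a generic illustration, not the authors' code: the weights file, video path, and single "liver" class are assumptions, and only standard Detectron2/OpenCV calls are used.

```python
# Sketch: liver-silhouette detection on laparoscopic frames with Detectron2,
# drawing the detected liver edge as a green outline (illustrative only;
# model weights and video path are hypothetical placeholders).
import cv2
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 1           # single class: liver silhouette
cfg.MODEL.WEIGHTS = "liver_silhouette.pth"    # hypothetical fine-tuned weights
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.7
predictor = DefaultPredictor(cfg)

cap = cv2.VideoCapture("laparoscopy.mp4")     # placeholder video file
while True:
    ok, frame = cap.read()
    if not ok:
        break
    instances = predictor(frame)["instances"].to("cpu")
    for mask in instances.pred_masks.numpy().astype("uint8"):
        # Extract the mask boundary and overlay it as a green outline.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(frame, contours, -1, (0, 255, 0), 2)
    cv2.imshow("silhouette", frame)
    if cv2.waitKey(1) == 27:                  # Esc to quit
        break
cap.release()
```

In the pipeline described above, the detected outline is then matched to the silhouette of the preoperative 3D model by Vuforia's model-target tracking inside Unity to place the segmental projection mapping.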

4.
Cureus ; 15(9): e45943, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37885489

ABSTRACT

Background: Virtual reality (VR) simulation is a potential solution to the barriers surgical trainees face, but its implementation within current training requires validation. We aimed to compare VR simulation with traditional methods for acquiring surgical skills with the TFN-ADVANCED™ Proximal Femoral Nailing System (TFNA; DePuy Synthes, Auckland, New Zealand). Methods: Thirty-one surgical trainees were randomised to two groups for insertion of a short cephalomedullary TFNA nail: a traditional-training group (control) and a VR-training group (intervention). Both groups then inserted the same TFNA system into saw-bone femurs. Surveys evaluated the validity of the relevant activities, perception of the simulation, confidence, stress, and anxiety. The primary outcomes were tip-apex distance (TAD) and user anxiety/confidence levels. Secondary outcomes included the number of screw- and nail-guidewire insertion attempts, the time taken to complete the task, and user-rated validity of the VR system. Results: There was no statistical difference in TAD between the intervention and control groups (9 mm vs 15 mm, p = 0.0734). The only TAD at risk of cut-out was in the control group (25 mm). There was no statistical difference in time taken (2547.5 s vs 2395 s, p = 0.668), nail guidewire attempts (two for both groups, p = 0.355), or screw guidewire attempts (one for both groups, p = 0.702). The control group had higher anxiety levels (50% vs 33%) and lower confidence (61% vs 84%) than the intervention group. Interpretation: There was no objective difference in performance on a saw-bone model between groups. However, the VR simulator resulted in higher confidence and lower anxiety levels whilst performing a simulated TFNA insertion. Whilst further studies with larger sample sizes and exploration of transfer validity to the operating theatre are required, this study indicates potential benefits of VR within surgical training.
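For context, the tip-apex distance used as the primary outcome is conventionally defined (this definition is not restated in the abstract) as the sum of the tip-to-apex distances measured on the anteroposterior and lateral radiographs, each scaled by the known lag-screw diameter to correct for radiographic magnification:

```latex
\[
\mathrm{TAD} \;=\; X_{\mathrm{AP}}\cdot\frac{D_{\mathrm{true}}}{D_{\mathrm{AP}}}
\;+\; X_{\mathrm{lat}}\cdot\frac{D_{\mathrm{true}}}{D_{\mathrm{lat}}}
\]
```

where X is the measured distance from the screw tip to the femoral head apex on each view, and D_true and D_AP / D_lat are the true and measured screw diameters used for magnification correction.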

5.
Stud Health Technol Inform ; 302: 433-437, 2023 May 18.
Article in English | MEDLINE | ID: mdl-37203711

ABSTRACT

ENTICE aimed to use co-creative methodologies to build a solid creation pipeline for medical experiential content. The project has developed and evaluated immersive learning resources and tools that support well-defined learning objectives using tangible and intangible resources (AR/VR/MR, 3D printing) that are highly sought after in the fields of anatomy and surgery. In this paper, the preliminary results from the evaluation of the learning resources and tools in three countries, as well as the lessons learnt, are presented with a view to improving the medical education process.


Subject(s)
Education, Medical , Virtual Reality , Learning , Behavior Therapy , Printing, Three-Dimensional
6.
Front Robot AI ; 9: 927660, 2022.
Article in English | MEDLINE | ID: mdl-36246493

ABSTRACT

A novel haptic grasper that renders touch sensations to the user in 3 DoF (degrees of freedom), namely linear, rotary, and grasping motions, is presented. The grasper's touch sensations combine kinesthetic and tactile modalities such as stiffness, texture, and shape. The device is equipped with two swappable modular segments that provide stiffness and shape sensations. To increase haptic fidelity, vibro-actuators are mounted beneath the textured surfaces that surround the outer surface of the segments. These vibro-actuators increase the number of perceivable textures by varying the amplitude, frequency, duration, and envelope of the vibrations. The proposed device is characterized in terms of its stiffness, shape, and texture rendering capabilities. The experimental results validate the effectiveness of the developed haptic grasper in virtual/remote interactions, and the user studies and statistical analysis demonstrate that users could perceive high-fidelity haptic feedback with unified kinesthetic and tactile cues.
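As a purely illustrative sketch, not the authors' firmware, a vibrotactile texture cue of the kind described, parameterized by amplitude, frequency, duration, and envelope, can be synthesized as a windowed sinusoid before being sent to a vibro-actuator; all parameter values below are hypothetical.

```python
# Illustrative synthesis of a vibrotactile texture cue parameterized by
# amplitude, frequency, duration, and envelope (hypothetical values only).
import numpy as np

def texture_cue(amplitude, freq_hz, duration_s, envelope="hann", fs=8000):
    """Return one vibration burst as a sampled waveform."""
    t = np.arange(int(duration_s * fs)) / fs
    carrier = amplitude * np.sin(2 * np.pi * freq_hz * t)
    if envelope == "hann":
        win = np.hanning(t.size)          # smooth attack and decay
    else:
        win = np.ones(t.size)             # rectangular (abrupt) envelope
    return carrier * win

# Two distinguishable "textures": a soft low-frequency burst vs. a sharp high-frequency one.
smooth = texture_cue(amplitude=0.3, freq_hz=80, duration_s=0.15)
rough  = texture_cue(amplitude=0.8, freq_hz=250, duration_s=0.05, envelope="rect")
```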

7.
Soft Robot ; 9(1): 173-186, 2022 02.
Article in English | MEDLINE | ID: mdl-33571060

ABSTRACT

Variable stiffness actuation has applications in a wide range of fields, including wearable haptics, soft robots, and minimally invasive surgical devices. There have been numerous design approaches to control and tune stiffness and rigidity; however, most have relatively low specific load-carrying capacities (especially for flexural loads) in the most rigid state, which restricts their use in small or slender devices. In this article, we present an approach to the design of slender, high-flexural-stiffness modules based on the principle of fiber jamming. The proposed fiber jamming modules (FJMs) consist of axially packed fibers in an airtight envelope that transition from a flexible to a rigid beam when a vacuum is created inside the envelope. In the rigid state, an FJM can provide a flexural stiffness of up to eight times that of a particle jamming module. Unlike layer jamming modules, the design of FJMs further allows them to control stiffness while bending in space. We present an analytical model to guide the parameter choices for the design of fiber jamming devices. Finally, we demonstrate applications of FJMs, including as a versatile tool, as part of a kinesthetic force-feedback haptic glove, and as a programmable structure.
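A common back-of-envelope way to see why jamming raises flexural stiffness, offered here only as a hedged illustration and not necessarily the analytical model the authors present, is to compare n fibers of diameter d bending independently with the same fibers locked into a single solid-like bundle of equivalent cross-sectional area:

```latex
\[
(EI)_{\text{unjammed}} \approx n\,E\,\frac{\pi d^{4}}{64},
\qquad
(EI)_{\text{jammed}} \approx E\,\frac{\pi D^{4}}{64}
\;\;\text{with}\;\; D \approx d\sqrt{n}
\;\;\Rightarrow\;\;
\frac{(EI)_{\text{jammed}}}{(EI)_{\text{unjammed}}} \approx n .
\]
```

This idealized bound, which assumes perfect shear coupling between jammed fibers, suggests that the achievable stiffening grows with the number of packed fibers.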


Subject(s)
Robotics , Wearable Electronic Devices , Equipment Design , Haptic Technology
8.
Int J Hosp Manag ; 94: 102869, 2021 Apr.
Article in English | MEDLINE | ID: mdl-34785847

ABSTRACT

The sudden outbreak of COVID-19 has severely affected the global hospitality industry, and the hygiene and cleanliness of hotels have become a focal point of recovery plans during COVID-19. This study investigates the effects of past disasters on the global hospitality industry and how the industry responded to them. Since past pandemics and epidemics identified hygiene and cleanliness as important factors, the study further explores the role of technology in ensuring them. It then examines how Industry 5.0 design principles can be scaled to the hospitality context, leading to a "Hospitality 5.0" that improves operational efficiency, and delineates how Hospitality 5.0 technologies can ensure hygiene and cleanliness at various touchpoints in the customer's journey. This study serves as a foundation for understanding how synergy between humans and machines can be achieved through Hospitality 5.0. The theoretical and practical implications are discussed.

9.
Adv Sci (Weinh) ; 8(14): e2100230, 2021 07.
Article in English | MEDLINE | ID: mdl-34037331

ABSTRACT

Rapid advancements in artificial intelligence of things (AIoT) technology pave the way for developing a digital-twin-based remote interactive system for advanced robot-enabled industrial automation and virtual shopping. An embedded multifunctional perception system is needed for better interaction and user experience. To realize such a system, a smart soft robotic manipulator is presented that consists of a triboelectric nanogenerator tactile (T-TENG) sensor, a triboelectric length (L-TENG) sensor, and a poly(vinylidene fluoride) (PVDF) pyroelectric temperature sensor. With the aid of machine learning (ML) for data processing, the fusion of the T-TENG and L-TENG sensors enables automatic recognition of grasped objects with an accuracy of 97.143% across 28 different object shapes, while the temperature distribution is obtained through the pyroelectric sensor. By leveraging IoT and artificial intelligence (AI) analytics, a digital-twin-based virtual shop is implemented that provides users with real-time feedback about product details. In general, by offering a more immersive experience in human-machine interaction, the proposed remote interactive system shows great potential as an advanced human-machine interface for unmanned-workspace applications.
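A minimal sketch of the sensor-fusion classification step, with synthetic feature vectors standing in for the T-TENG/L-TENG signals and an off-the-shelf classifier rather than the paper's own ML pipeline, might look like this:

```python
# Sketch: classifying grasped-object shape from fused T-TENG (tactile) and
# L-TENG (length) features. Data and feature extraction are hypothetical
# placeholders; the paper's actual ML pipeline may differ.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_classes, n_per_class, n_features = 28, 40, 16   # 28 object shapes

# Fake dataset: each row concatenates tactile-channel and length-channel features.
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_train, y_train)
print(f"shape-recognition accuracy: {clf.score(X_test, y_test):.3f}")
```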

10.
Curr Robot Rep ; 2(1): 55-71, 2021.
Article in English | MEDLINE | ID: mdl-34977593

ABSTRACT

PURPOSE OF REVIEW: This review provides an overview of the robotic ultrasound systems that have emerged over the past five years, highlighting their status and future directions. The systems are categorized based on their level of robot autonomy (LORA). RECENT FINDINGS: Teleoperated systems show the highest level of technical maturity. Collaborative assisting and autonomous systems are still in the research phase, with a focus on ultrasound image processing and force adaptation strategies. However, clinical studies and appropriate safety strategies are still lacking. Future research will likely focus on artificial intelligence and virtual/augmented reality to improve image understanding and ergonomics. SUMMARY: A review of robotic ultrasound systems is presented in which technical specifications are first outlined. Thereafter, the literature of the past five years is subdivided into teleoperation, collaborative assistance, and autonomous systems based on LORA. Finally, future trends for robotic ultrasound systems are reviewed, with a focus on artificial intelligence and virtual/augmented reality.

11.
Unfallchirurg ; 123(11): 836-842, 2020 Nov.
Article in German | MEDLINE | ID: mdl-33037457

ABSTRACT

INTRODUCTION: In its digital agenda, the German Federal Government pursues the ambitious objective of fostering digital competence and researching digital learning and teaching processes. Considerable investments are to be directed toward the future viability of education, academic research, and digitalization. With respect to academic teaching and further education, not only in the field of orthopedics and trauma surgery, three aspects can be identified: digital organization, digital competence, and digital tools. DIGITAL APPLICATIONS: New formats, such as the elective subject Digital Health at the Charité in Berlin, enable digital competences to be taught in a multimodal and interdisciplinary way. With the help of a newly developed app, the University of Essen provides teachers and students with mobile, flexible access to the content and organization of lectures. In particular because of its transparency, high legal compliance, and predictability, the digital logbook for the resident training program promises to be a real innovation for trainees in the reform of further training. Augmented and virtual reality play a crucial role in imparting practical skills and connect high-tech with classical craftsmanship. Digital training course formats have gained significantly in importance and are now well-established tools for efficient advanced medical training. OUTLOOK: If orthopedic and trauma surgeons take an active role in the digitalization of teaching, they can take part in decisions, adequately prepare the colleagues of tomorrow, optimize patient care, encourage innovation, and improve the discipline even further.


Subject(s)
Education, Medical, Undergraduate , Education, Medical , Orthopedics , Clinical Competence , Curriculum , Humans , Orthopedics/education , Students
12.
Urologe A ; 55(3): 350-5, 2016 Mar.
Article in German | MEDLINE | ID: mdl-26893136

ABSTRACT

More than any other medical discipline, radiology is marked by technical innovation, continuous development, and the optimization of its underlying physical principles. Several trends can be observed that will crucially change and develop radiology over the next decade. Through the use of ever faster computed tomography with ever-decreasing radiation exposure, this "workhorse" of radiology will gain an even greater place and further displace conventional X-ray techniques. In addition, hybrid imaging, which combines nuclear medicine and radiological techniques (keywords: PET/CT, PET/MRI), will become much more established and, in particular, will further improve oncological imaging, allowing increasingly individualized imaging with specific tracers and functional magnetic resonance imaging techniques for a particular tumour. Future radiology will be strongly shaped by innovations in the software and Internet industries, which will enable new image viewing and processing methods and open up new possibilities for the organization of radiological work.


Subject(s)
Diagnostic Imaging/trends , Forecasting , Radiology/trends , Urologic Neoplasms/diagnostic imaging , Urologic Neoplasms/pathology , Germany , Humans , Neoplasm Staging