Integrating Stereoscopic Video with Modular 3D Anatomic Models for Lateral Skull Base Training.
Barber, Samuel R; Jain, Saurabh; Son, Young-Jun; Almefty, Kaith; Lawton, Michael T; Stevens, Shawn M.
Affiliation
  • Barber SR; Department of Otolaryngology-Head and Neck Surgery, University of Arizona College of Medicine, Tucson, Arizona, United States.
  • Jain S; Department of Systems and Industrial Engineering, University of Arizona, Tucson, Arizona, United States.
  • Son YJ; Department of Systems and Industrial Engineering, University of Arizona, Tucson, Arizona, United States.
  • Almefty K; Division of Neurotology and Lateral Skull Base Surgery, Barrow Brain and Spine, Barrow Neurological Institute, Phoenix, Arizona, United States.
  • Lawton MT; Division of Neurotology and Lateral Skull Base Surgery, Barrow Brain and Spine, Barrow Neurological Institute, Phoenix, Arizona, United States.
  • Stevens SM; Division of Neurotology and Lateral Skull Base Surgery, Barrow Brain and Spine, Barrow Neurological Institute, Phoenix, Arizona, United States.
J Neurol Surg B Skull Base; 82(Suppl 3): e268-e270, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34306948
Introduction: Current virtual reality (VR) technology allows the creation of instructional video formats that incorporate three-dimensional (3D) stereoscopic footage. Combined with 3D anatomic models, any surgical procedure or pathology could be represented virtually to supplement learning or preoperative surgical planning. We propose a standalone VR app that allows trainees to interact with modular 3D anatomic models corresponding to stereoscopic surgical videos.

Methods: Stereoscopic video was recorded using an OPMI Pentero 900 microscope (Zeiss, Oberkochen, Germany). Digital Imaging and Communications in Medicine (DICOM) images from an axial temporal bone computed tomography series were segmented, and each anatomic structure was exported separately. The 3D models included the semicircular canals, facial nerve, sigmoid sinus and jugular bulb, carotid artery, tegmen, canals within the temporal bone, cochlear and vestibular aqueducts, endolymphatic sac, and all branches of cranial nerves VII and VIII. Finished files were imported into the Unreal Engine, and the resultant application was viewed using an Oculus Go.

Results: The VR environment facilitated viewing of stereoscopic video and interactive model manipulation using the VR controller. Interactive models allowed users to toggle transparency, enable highlighted segmentation, and activate labels for each anatomic structure. Based on 20 variable components, 1.1 × 10^12 combinations of structures per DICOM series were possible for representing patient-specific anatomy in 3D.

Conclusion: This investigation provides proof of concept that a hybrid of stereoscopic video and VR simulation is possible, and that this tool may significantly aid lateral skull base trainees as they learn to navigate a complex 3D surgical environment. Future studies will validate the methodology.
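The reported figure of 1.1 × 10^12 is consistent with each of the 20 segmented structures taking one of four display states, since 4^20 ≈ 1.1 × 10^12; the abstract does not spell out the exact state breakdown, so the short C++ sketch below is only an illustrative reading of that arithmetic, and all type and state names in it are hypothetical rather than taken from the authors' application.

    // Illustrative sketch only (not the authors' implementation): reproduces the
    // order of magnitude of the reported combination count under the assumption
    // that each segmented structure has four mutually exclusive display states.
    #include <cmath>
    #include <iostream>
    #include <string>
    #include <vector>

    // Hypothetical display states; the abstract mentions transparency toggling,
    // highlighted segmentation, and labels as per-structure interactions.
    enum class DisplayState { Hidden, Opaque, Transparent, Highlighted };

    // Hypothetical per-structure record in the modular model.
    struct AnatomicStructure {
        std::string name;
        DisplayState state = DisplayState::Opaque;
    };

    int main() {
        const int kStructureCount = 20;     // "20 variable components" per DICOM series
        const int kStatesPerStructure = 4;  // assumption: 4 display states each

        // 4^20 = 1,099,511,627,776, i.e., ~1.1 x 10^12 combinations.
        const double combinations =
            std::pow(static_cast<double>(kStatesPerStructure), kStructureCount);
        std::cout << "Possible display combinations: " << combinations << "\n";

        // A few of the structures listed in the Methods section, with one toggled
        // transparent as a VR controller interaction might do.
        std::vector<AnatomicStructure> model = {
            {"facial nerve"}, {"sigmoid sinus"}, {"semicircular canals"}};
        model[0].state = DisplayState::Transparent;
        return 0;
    }

The point of the sketch is simply that independent per-structure display toggles multiply, which is how a modest set of 20 segmented structures yields the very large number of patient-specific display configurations reported in the Results.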
Full text: 1 | Database: MEDLINE | Study type: Prognostic studies | Language: English | Publication year: 2021 | Document type: Article
