Results 1 - 11 of 11
1.
J Vasc Interv Radiol ; 22(11): 1613-1618.e1, 2011 Nov.
Article in English | MEDLINE | ID: mdl-21959057

ABSTRACT

PURPOSE: To develop a consistent and reproducible method in an animal model for studies of radiofrequency (RF) ablation of primary hepatocellular carcinoma (HCC).

MATERIALS AND METHODS: Fifteen woodchucks were inoculated with woodchuck hepatitis virus (WHV) to establish chronic infections. When serum γ-glutamyl transpeptidase levels became elevated, the animals were evaluated with ultrasound and, in most cases, preoperative magnetic resonance (MR) imaging to confirm tumor development. Ultimately, RF ablation of tumors was performed using a 1-cm probe with the animal submerged in a water bath for grounding. Ablation effectiveness was evaluated with contrast-enhanced MR imaging and gross and histopathologic analysis.

RESULTS: RF ablation was performed in 15 woodchucks. Modifications were made to the initial study design to adapt the methodology for the woodchuck. The last 10 of these animals were treated with a standardized protocol using a 1-cm probe that produced a consistent area of tumor necrosis (mean ablation size, 10.2 mm × 13.1 mm) and led to no complications.

CONCLUSIONS: A safe, reliable, and consistent method was developed to study RF ablation of spontaneous primary HCC using chronically WHV-infected woodchucks, an animal model of hepatitis B virus-induced HCC.


Subject(s)
Carcinoma, Hepatocellular/surgery , Catheter Ablation , Hepatitis B Virus, Woodchuck/pathogenicity , Hepatitis B/virology , Liver Neoplasms, Experimental/surgery , Animals , Biopsy , Carcinoma, Hepatocellular/pathology , Carcinoma, Hepatocellular/virology , Catheter Ablation/instrumentation , Contrast Media , Equipment Design , Hepatitis B/complications , Liver Neoplasms, Experimental/pathology , Liver Neoplasms, Experimental/virology , Magnetic Resonance Imaging , Marmota , Necrosis , Reproducibility of Results , Time Factors
2.
IEEE Trans Vis Comput Graph ; 25(5): 1970-1980, 2019 05.
Article in English | MEDLINE | ID: mdl-30843843

ABSTRACT

This paper presents the implementation and evaluation of a 50,000-pose-sample-per-second, 6-degree-of-freedom optical head-tracking instrument with a motion-to-pose latency of 28 µs and a dynamic precision of 1-2 arcminutes. The instrument uses high-intensity infrared emitters and two duo-lateral photodiode-based optical sensors to triangulate pose. It serves two purposes: it is the first step toward the requisite head-tracking component in sub-100 µs motion-to-photon latency optical see-through augmented reality (OST AR) head-mounted display (HMD) systems, and it enables new avenues of research into human visual perception, including measuring the thresholds for perceptible real-virtual displacement during head rotation and other human research requiring high-sample-rate motion tracking. The instrument's tracking volume is limited to about 120×120×250 but allows for the full range of natural head rotation and is sufficient for research involving seated users. We discuss how the instrument's tracking volume can be scaled in multiple ways and some of the trade-offs involved. Finally, we introduce a novel laser-pointer-based measurement technique for assessing the instrument's tracking latency and repeatability. We show that the instrument's motion-to-pose latency is 28 µs and that it is repeatable within 1-2 arcminutes at mean rotational (yaw) velocities in excess of 500°/s.


Subject(s)
Imaging, Three-Dimensional/instrumentation , Smart Glasses , User-Computer Interface , Virtual Reality , Computer Graphics , Equipment Design , Head Movements/physiology , Humans , Imaging, Three-Dimensional/methods , Time Factors
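The relationship between the reported latency, head velocity, and angular precision can be checked with simple arithmetic on the abstract's own numbers (this calculation is ours, not the paper's):

```python
def perceived_displacement_arcmin(angular_velocity_deg_per_s: float,
                                  latency_s: float) -> float:
    """Angular error (in arcminutes) accumulated during the tracker's
    motion-to-pose latency at a constant head rotation rate."""
    return angular_velocity_deg_per_s * latency_s * 60.0

# At the reported peak yaw rate (500°/s) and 28 µs motion-to-pose latency:
error = perceived_displacement_arcmin(500.0, 28e-6)  # ≈ 0.84 arcmin
```

This suggests that even at 500°/s the latency-induced error stays below the instrument's stated 1-2 arcminute dynamic precision, consistent with the repeatability claim.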
3.
IEEE Trans Vis Comput Graph ; 25(11): 3114-3124, 2019 11.
Article in English | MEDLINE | ID: mdl-31403422

ABSTRACT

In this paper, we present a novel design for switchable AR/VR near-eye displays that helps address the vergence-accommodation conflict. The principal idea is to time-multiplex virtual imagery and real-world imagery and to use a tunable lens to adjust focus for the virtual display and the see-through scene separately. With this design, prescription eyeglasses for near- and far-sighted users become unnecessary: the wearer's corrective optical prescription is integrated into the tunable lens for both the virtual display and the see-through environment. We built a prototype based on the design, comprising a micro-display, optical systems, a tunable lens, and active shutters. The experimental results confirm that the proposed near-eye display can switch between AR and VR and can provide correct accommodation for both.


Subject(s)
Augmented Reality , Computer Graphics , Image Processing, Computer-Assisted/methods , Virtual Reality , Equipment Design , Eyeglasses , Holography , Humans
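The idea of folding a prescription into a tunable lens can be illustrated with a thin-lens sketch. This is our illustration, not the paper's method: it assumes lens powers in diopters add linearly, and the function name is hypothetical.

```python
def tunable_lens_power(focus_distance_m: float,
                       prescription_diopters: float = 0.0) -> float:
    """Total power (diopters) the tunable element must provide to place
    focus at focus_distance_m while also absorbing the wearer's
    corrective prescription (thin-lens approximation: powers add)."""
    return 1.0 / focus_distance_m + prescription_diopters

# Focusing a virtual image at 2 m for a -3 D myopic wearer:
p = tunable_lens_power(2.0, -3.0)  # 0.5 - 3.0 = -2.5 D
```

Because the same tunable element serves both the virtual display and the see-through scene in alternating time slices, the prescription term is simply carried in both states.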
4.
Stud Health Technol Inform ; 132: 126-31, 2008.
Article in English | MEDLINE | ID: mdl-18391272

ABSTRACT

Radio-frequency ablation is a minimally invasive intervention that introduces high-frequency electrical current into non-resectable hepatic tumors via a needle-like probe under 2D ultrasound guidance. These tumors recur mostly at the periphery, indicating errors in probe placement. Hypothesizing that a contextually correct 3D display will aid targeting and decrease recurrence, we have developed a prototype guidance system based on a head-tracked 3D display and motion-tracked instruments. We describe our reasoning and our experience in selecting components for, designing, and constructing the 3D display. The initial candidates were an augmented reality see-through head-mounted display and a virtual reality "fish tank" system. We describe the system requirements and explain how we arrived at the final decision. We show the operational guidance system in use on phantoms and animals.


Subject(s)
Catheter Ablation , Computer Terminals , Head , Liver Neoplasms/surgery , User-Computer Interface , Equipment Design , Humans , United States
5.
IEEE Trans Vis Comput Graph ; 24(11): 2993-3004, 2018 11.
Article in English | MEDLINE | ID: mdl-30207957

ABSTRACT

We propose a new approach for 3D reconstruction of dynamic indoor and outdoor scenes in everyday environments, leveraging only cameras worn by a user. This approach allows 3D reconstruction of experiences at any location and virtual tours from anywhere. The key innovation of the proposed egocentric reconstruction system is to capture the wearer's body pose and facial expression from near-body views, e.g., cameras on the user's glasses, and to capture the surrounding environment using outward-facing views. The main challenge of egocentric reconstruction, however, is the poor coverage of the near-body views: the user's body and face are observed from vantage points that are convenient for wear but inconvenient for capture. To overcome this challenge, we propose a parametric-model-based approach to user motion estimation. This approach utilizes convolutional neural networks (CNNs) for near-view body pose estimation, and we introduce a CNN-based approach for facial expression estimation that combines audio and video. For each time point during capture, the intermediate model-based reconstructions from these systems are used to re-target a high-fidelity pre-scanned model of the user. We demonstrate that the proposed self-sufficient, head-worn capture system is capable of reconstructing the wearer's movements and their surrounding environment in both indoor and outdoor situations without any additional views. As a proof of concept, we show how the resulting 3D-plus-time reconstruction can be immersively experienced within a virtual reality system (e.g., the HTC Vive). We expect that the size of the proposed egocentric capture-and-reconstruction system will eventually be reduced to fit within future AR glasses, and will be widely useful for immersive 3D telepresence, virtual tours, and general use-anywhere 3D content creation.


Subject(s)
Facial Expression , Imaging, Three-Dimensional/methods , Posture/physiology , User-Computer Interface , Video Recording/methods , Humans , Internet , Neural Networks, Computer
6.
IEEE Trans Vis Comput Graph ; 22(4): 1367-76, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26780797

ABSTRACT

We describe an augmented reality, optical see-through display based on a DMD chip with an extremely fast (16 kHz) binary update rate. We combine the techniques of post-rendering 2-D offsets and just-in-time tracking updates with a novel modulation technique for turning binary pixels into perceived gray scale. These processing elements, implemented in an FPGA, are physically mounted along with the optical display elements in a head tracked rig through which users view synthetic imagery superimposed on their real environment. The combination of mechanical tracking at near-zero latency with reconfigurable display processing has given us a measured average of 80 µs of end-to-end latency (from head motion to change in photons from the display) and also a versatile test platform for extremely-low-latency display systems. We have used it to examine the trade-offs between image quality and cost (i.e. power and logical complexity) and have found that quality can be maintained with a fairly simple display modulation scheme.
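The abstract does not specify the modulation scheme, but one common, simple way to turn a fast binary pixel stream into perceived gray scale is first-order sigma-delta (pulse-density) modulation, in which the density of 1-frames tracks the target intensity. The sketch below is ours, offered only to illustrate the general technique:

```python
def binary_frames(gray: float, n_frames: int) -> list[int]:
    """First-order sigma-delta modulator: emit a 0/1 frame sequence whose
    running average converges to the target gray level in [0, 1]."""
    acc, frames = 0.0, []
    for _ in range(n_frames):
        acc += gray            # accumulate the desired intensity
        if acc >= 1.0:         # emit a lit frame once a full unit accrues
            frames.append(1)
            acc -= 1.0
        else:
            frames.append(0)
    return frames

seq = binary_frames(0.25, 16)  # one lit frame in every four
```

At a 16 kHz binary update rate, even dozens of binary frames per perceived gray level still fit comfortably within one perceptual integration window, which is what makes such simple schemes viable at low logical complexity.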

7.
Stud Health Technol Inform ; 220: 55-62, 2016.
Article in English | MEDLINE | ID: mdl-27046554

ABSTRACT

This paper introduces a computer-based system designed to record a surgical procedure with multiple depth cameras and to reconstruct, in three dimensions, the dynamic geometry of the actions and events that occur during the procedure. The resulting 3D-plus-time data takes the form of dynamic, textured geometry and can be examined immersively at a later time; equipped with a virtual reality headset such as the Oculus Rift DK2, a user can walk around the reconstruction of the procedure room while controlling playback of the recorded procedure with simple VCR-like controls (play, pause, rewind, fast forward). The reconstruction can be annotated in space and time to provide users with additional information about the scene. We expect such a system to be useful in applications such as the training of medical students and nurses.


Subject(s)
Educational Measurement/methods , General Surgery/education , Imaging, Three-Dimensional/methods , Operating Rooms/methods , Photography/methods , Surgery, Computer-Assisted/methods , Computer-Assisted Instruction , Humans , Imaging, Three-Dimensional/instrumentation , Pattern Recognition, Automated/methods , Photography/instrumentation , Reproducibility of Results , Sensitivity and Specificity , Software , Surgery, Computer-Assisted/instrumentation , Systems Integration , Video Games , Whole Body Imaging/instrumentation , Whole Body Imaging/methods
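The VCR-like playback over time-indexed reconstructed frames can be sketched as a small controller; the class and method names here are hypothetical, not from the paper:

```python
class Playback:
    """Minimal VCR-style controller over a time-indexed recording.
    Integer frame indices stand in for reconstructed 3D frames."""

    def __init__(self, n_frames: int):
        self.n, self.pos, self.rate = n_frames, 0, 0  # rate: frames per tick

    def play(self):         self.rate = 1
    def pause(self):        self.rate = 0
    def rewind(self):       self.rate = -2   # play backward at 2x
    def fast_forward(self): self.rate = 2    # play forward at 2x

    def step(self) -> int:
        """Advance one tick, clamping to the recording's bounds."""
        self.pos = max(0, min(self.n - 1, self.pos + self.rate))
        return self.pos

p = Playback(100)
p.play()
for _ in range(5):
    p.step()        # pos advances to frame 5
p.fast_forward()
p.step()            # pos jumps to frame 7
```

The clamping in step() reflects the natural boundary behavior of such controls: rewinding past the start or fast-forwarding past the end simply holds at the first or last frame.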
9.
Med Image Anal ; 6(3): 313-20, 2002 Sep.
Article in English | MEDLINE | ID: mdl-12270235

ABSTRACT

We report the results of a randomized, controlled trial to compare the accuracy of standard ultrasound-guided needle biopsy to biopsies performed using a 3D Augmented Reality (AR) guidance system. A board-certified radiologist conducted 50 core biopsies of breast phantoms, with biopsies randomly assigned to one of the methods in blocks of five biopsies each. The raw ultrasound data from each biopsy was recorded. Another board-certified radiologist, blinded to the actual biopsy guidance mechanism, evaluated the ultrasound recordings and determined the distance of the biopsy from the ideal position. A repeated measures analysis of variance indicated that the head-mounted display method led to a statistically significantly smaller mean deviation from the desired target than did the standard display method (2.48 mm for control versus 1.62 mm for augmented reality, p<0.02). This result suggests that AR systems can offer improved accuracy over traditional biopsy guidance methods.


Subject(s)
Biopsy, Needle/instrumentation , Breast Neoplasms/pathology , Computer Graphics , Imaging, Three-Dimensional/methods , Ultrasonography, Mammary/instrumentation , User-Computer Interface , Biopsy, Needle/methods , Breast Neoplasms/diagnostic imaging , Equipment Design , Humans , Imaging, Three-Dimensional/instrumentation , Models, Anatomic , Phantoms, Imaging , Reproducibility of Results , Sensitivity and Specificity , Ultrasonography, Mammary/methods
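The practical size of the reported effect is easy to quantify from the two mean deviations in the abstract (this arithmetic is ours, not the paper's):

```python
control_mm = 2.48  # mean deviation from ideal position, standard display
ar_mm = 1.62       # mean deviation, AR head-mounted display guidance

# Relative reduction in mean targeting error with AR guidance:
reduction = (control_mm - ar_mm) / control_mm  # ≈ 0.35, i.e. about 35%
```

A roughly one-third reduction in mean targeting error is the headline effect behind the reported p < 0.02 result.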
10.
Stud Health Technol Inform ; 94: 325-8, 2003.
Article in English | MEDLINE | ID: mdl-15455917

ABSTRACT

This paper shows a number of stereoscopic images depicting the UNC augmented reality guidance system for medical visualization in operation.


Subject(s)
Biopsy, Needle/methods , Breast Neoplasms/diagnosis , Computer Simulation , Diagnostic Imaging , Breast Neoplasms/pathology , North Carolina , User-Computer Interface
11.
J Biomed Discov Collab ; 4: 4, 2009 Apr 19.
Article in English | MEDLINE | ID: mdl-19521951

ABSTRACT

Two-dimensional (2D) videoconferencing has been explored widely over the past 15-20 years to support collaboration in healthcare. Two issues that arise in most evaluations of 2D videoconferencing in telemedicine are the difficulty of obtaining optimal camera views and poor depth perception. To address these problems, we are exploring the use of a small array of cameras to reconstruct dynamic three-dimensional (3D) views of a remote environment and of the events taking place within it. The 3D views could be sent across wired or wireless networks to remote healthcare professionals equipped with fixed displays or with mobile devices such as personal digital assistants (PDAs). The remote professionals' viewpoints could be specified manually or automatically (continuously) via user head or PDA tracking, giving the remote viewers head-slaved or hand-slaved virtual cameras for monoscopic or stereoscopic viewing of the dynamic reconstructions. We call this idea remote 3D medical collaboration. In this article we motivate and explain the vision for 3D medical collaboration technology; we describe the relevant computer vision, computer graphics, display, and networking research; we present a proof-of-concept prototype system; and we present evaluation results supporting the general hypothesis that 3D remote medical collaboration technology could offer benefits over conventional 2D videoconferencing in emergency healthcare.
