Article in English | MEDLINE | ID: mdl-31808070


PURPOSE: Knee arthroscopy suffers from a lack of depth information and easy occlusion of the visual field. To overcome these limitations, we propose an arthroscopic navigation system based on self-positioning technology, guided by virtual-vision views. The system works without any external tracking devices or added markers, which increases the working range and improves robustness during rotating operations.

METHODS: The fly-through view and the global positioning view for surgical guidance are rendered in real time through virtual-vision rendering. The fly-through view gives surgeons a virtual camera perspective for navigating the arthroscope through the internal anatomical structures. The global positioning view shows the posture of the arthroscope relative to the preoperative model in a transparent manner. The posture of the arthroscope is estimated by fusing visual and inertial data with visual-inertial stereo SLAM. A flexible calibration method that transforms the posture of the arthroscope in the physical world into the virtual-vision rendering framework is proposed for the self-positioning arthroscopic navigation system.

RESULTS: Quantitative experiments evaluating self-positioning accuracy were performed. The mean translation error was 0.41 ± 0.28 mm, and the mean rotation error was 0.11° ± 0.07°. The tracking range of the proposed system was approximately 1.4 times that of a traditional external optical tracking system for the rotating operation. Simulated surgical operations were performed on a phantom, with the fly-through and global positioning views paired with the original arthroscopic images for intuitive surgical guidance.

CONCLUSION: The proposed system provides surgeons with both fly-through and global positioning views for surgical guidance without depending on traditional external tracking systems. The feasibility and robustness of the system were evaluated, and it shows promise for medical applications.
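The abstract describes transforming the SLAM-estimated arthroscope pose into the virtual-vision rendering frame but gives no implementation details. A minimal sketch of that coordinate-frame chain, assuming homogeneous 4x4 transforms and hypothetical frame names (`T_model_world` for the calibrated world-to-preoperative-model transform, `T_world_cam` for the SLAM output), could look like:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def arthroscope_pose_in_model(T_model_world, T_world_cam):
    """Map the SLAM-estimated camera pose (world frame) into the preoperative
    model frame used by the virtual-vision renderer.
    NOTE: frame names are illustrative, not from the paper."""
    return T_model_world @ T_world_cam

# Example: camera 10 mm along world x; world-to-model is a pure 90-degree yaw.
theta = np.pi / 2
R_yaw = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
T_model_world = make_pose(R_yaw, np.zeros(3))
T_world_cam = make_pose(np.eye(3), np.array([10.0, 0.0, 0.0]))
T_model_cam = arthroscope_pose_in_model(T_model_world, T_world_cam)
```

The composed pose `T_model_cam` would then drive both the virtual camera of the fly-through view and the arthroscope model drawn in the global positioning view; the paper's actual calibration procedure for obtaining the world-to-model transform is not reproduced here.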

IEEE Trans Vis Comput Graph; 24(9): 2600-2609, 2018 Sep.
Article in English | MEDLINE | ID: mdl-28961116


We propose a computer-generated integral photography (CGIP) method that employs a lens-based rendering (LBR) algorithm for super-multiview displays, achieving higher frame rates and better image quality without pixel resampling or view interpolation. The algorithm can exploit both fixed and programmable graphics pipelines to accelerate CGIP rendering and inter-perspective antialiasing. Two hardware prototypes were fabricated from two high-resolution liquid crystal displays and micro-lens arrays (MLAs). Qualitative and quantitative experiments were performed to evaluate the feasibility of the proposed algorithm. To the best of our knowledge, the proposed LBR method outperforms state-of-the-art CGIP algorithms in rendering speed and image quality under our super-multiview hardware configurations. A demonstration experiment was also conducted to show the interactivity of a super-multiview display using the proposed algorithm.
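The core geometric relationship in integral photography is the mapping from each display pixel, through the micro-lens that covers it, to an emitted light ray. The abstract does not specify the paper's LBR formulation, so the following is only a hedged sketch under assumed parameters (square pixels, a square lens grid aligned to the panel, and a fixed panel-to-MLA gap):

```python
import numpy as np

def pixel_to_ray(px, py, pixel_pitch, lens_pitch, gap):
    """Map a display pixel to the ray it emits through its micro-lens.

    px, py      : integer pixel indices on the display panel
    pixel_pitch : physical pixel size (mm) -- assumed square pixels
    lens_pitch  : micro-lens pitch (mm) -- assumed square grid aligned to panel
    gap         : panel-to-MLA distance (mm)

    Returns (origin, direction): the ray origin at the lens center and a unit
    direction from the pixel through that center. Parameters are illustrative,
    not taken from the paper's hardware prototypes.
    """
    # Physical position of the pixel center on the panel (z = 0 plane).
    x = (px + 0.5) * pixel_pitch
    y = (py + 0.5) * pixel_pitch
    # Index and center of the micro-lens covering this pixel.
    lx = int(x // lens_pitch)
    ly = int(y // lens_pitch)
    cx = (lx + 0.5) * lens_pitch
    cy = (ly + 0.5) * lens_pitch
    # Ray from the pixel through the lens center (lens plane at z = gap).
    origin = np.array([cx, cy, gap])
    direction = np.array([cx - x, cy - y, gap])
    return origin, direction / np.linalg.norm(direction)

# Example: 0.1 mm pixels under 1.0 mm lenses with a 3.0 mm gap.
origin, direction = pixel_to_ray(4, 4, 0.1, 1.0, 3.0)
```

A lens-based renderer in this spirit iterates per lens rather than per view: each micro-lens defines a small camera frustum, and all pixels behind it are shaded from rays like the one above, which is what lets such methods avoid pixel resampling and view interpolation.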