ABSTRACT
Recent years have seen a significant increase in the use of Interventional Radiology (IR) as an alternative to open surgery. A large number of IR procedures commence with needle puncture of a vessel to insert guidewires and catheters: these clinical skills are acquired by all radiologists during training on patients, which is associated with some discomfort and, occasionally, complications. While some visual skills can be acquired using models such as those used in surgery, these have limitations for IR, which relies heavily on a sense of touch. Both patients and trainees would benefit from a virtual environment (VE) conveying touch sensation to realistically mimic procedures. The authors are developing a high-fidelity VE providing a validated alternative to the traditional apprenticeship model used for teaching the core skills. The current version of the CRaIVE simulator combines in-house software, haptic devices and commercial equipment.
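The abstract gives no implementation detail of how touch sensation is rendered. Purely as a generic illustration, and not as the CRaIVE method, the Java sketch below shows a minimal one-dimensional force model for needle puncture of a vessel wall: spring-like resistance builds until a puncture threshold is crossed, after which only a small friction force remains, giving the characteristic "pop". The class name NeedleHaptics and all numeric parameters are assumptions.

    /**
     * Generic 1-D force model for needle puncture, for illustration only.
     * Not the CRaIVE implementation: the class name, field names and threshold
     * values are hypothetical. A real haptic device would call computeForce()
     * from its high-rate servo loop and send the result to the device API.
     */
    public class NeedleHaptics {

        private final double wallStiffness = 600.0;    // N/m, vessel wall spring constant (assumed)
        private final double punctureThreshold = 1.5;  // N, force at which the wall gives way (assumed)
        private final double frictionForce = 0.3;      // N, constant drag after puncture (assumed)

        private boolean punctured = false;

        /** depth = needle-tip penetration beyond the wall surface in metres; returns resistance in newtons. */
        public double computeForce(double depth) {
            if (depth <= 0.0) {
                punctured = false;                      // needle withdrawn: reset state
                return 0.0;
            }
            if (!punctured) {
                double springForce = wallStiffness * depth;   // elastic deformation of the wall
                if (springForce >= punctureThreshold) {
                    punctured = true;                         // sudden give felt by the trainee
                    return frictionForce;
                }
                return springForce;
            }
            return frictionForce;                       // sliding through tissue after puncture
        }

        public static void main(String[] args) {
            NeedleHaptics model = new NeedleHaptics();
            for (int mm = 0; mm <= 5; mm++) {           // 0 to 5 mm insertion depth
                double depth = mm / 1000.0;
                System.out.printf("depth=%.3f m  force=%.2f N%n", depth, model.computeForce(depth));
            }
        }
    }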
Subject(s)
Clinical Competence, Physics, Interventional Radiology/education, User-Computer Interface, Humans, Physical Phenomena, Interventional Radiology/standards, Touch, United Kingdom

ABSTRACT
Virtual Reality offers great potential for surgical training, yet is typically limited by the dedicated and expensive equipment required. Web-based VR has the potential to offer a much cheaper alternative, in which simulations of fundamental techniques are downloaded from a server and run within a web browser. The equipment requirement is modest, an Internet-connected PC or small workstation, and the simulation can be accessed worldwide. In a collaboration between computer scientists and neurosurgeons, we have studied the use of web-based VR to train neurosurgeons in Percutaneous Rhizotomy, a treatment for the intractable facial pain that occurs in trigeminal neuralgia. This involves inserting a needle to puncture the foramen ovale and lesion the nerve. Our simulation uses VRML to provide a 3D visualization environment, but the work immediately exposes a key limitation of VRML for surgical simulation: VRML does not support collision detection between objects, only between viewpoint and object. Thus collisions between the needle and the skull cannot be detected and fed back to the trainee. We have developed a novel solution in which the training simulation has linked views: a normal view, plus a view as seen from the tip of the needle. Collision detection is captured in the needle view and fed back to the trainee. A happy consequence of this approach is that the additional view from the needle tip also helps the trainee locate the foramen ovale. The technology to achieve this is Java software communicating with the VRML worlds through the External Authoring Interface (EAI). The training simulator is available on the Web, with an accompanying tutorial on its use. A major advantage of web-based VR is that the techniques generalize to a whole range of surgical simulations. Thus we have been able to use exactly the same approach described above for neurosurgery to develop a shoulder arthroscopy simulator, where again collision detection and the view from the scope are fundamental.
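The abstract names the actual mechanism, Java communicating with the VRML world through the External Authoring Interface, but not the code. The sketch below is an assumed reconstruction, not the authors' source: a Java applet uses the standard vrml.external EAI classes to observe the collideTime eventOut of a VRML Collision node (assumed to be DEF'd as SKULL_COLLISION) wrapping the skull geometry. With a Viewpoint bound at the needle tip, the browser's viewpoint-object collision test then stands in for needle-skull collision, which is the linked-view trick described above. The DEF name and the feedback message are hypothetical.

    import java.applet.Applet;
    import vrml.external.Browser;
    import vrml.external.Node;
    import vrml.external.field.EventOut;
    import vrml.external.field.EventOutObserver;
    import vrml.external.field.EventOutSFTime;

    /**
     * Sketch of the EAI side of the linked-view approach (assumed, not the
     * authors' source). The VRML world is assumed to contain:
     *   DEF SKULL_COLLISION Collision { children [ ...skull geometry... ] }
     * plus a Viewpoint bound at the needle tip, so that the browser's
     * viewpoint-object collision test fires collideTime when the needle
     * tip meets the skull.
     */
    public class NeedleCollisionApplet extends Applet implements EventOutObserver {

        public void start() {
            Browser browser = Browser.getBrowser(this);        // EAI handle to the VRML plug-in
            Node skull = browser.getNode("SKULL_COLLISION");   // assumed DEF name
            EventOutSFTime collideTime =
                    (EventOutSFTime) skull.getEventOut("collideTime");
            collideTime.advise(this, null);                    // register for collision events
        }

        // Called by the browser whenever the needle-tip viewpoint hits the skull.
        public void callback(EventOut value, double timeStamp, Object userData) {
            double when = ((EventOutSFTime) value).getValue();
            showStatus("Needle touched bone at t=" + when + ", adjust trajectory");
        }
    }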
Subject(s)
Computer-Assisted Instruction, Endoscopy, Internet, Rhizotomy, User-Computer Interface, Humans, Computer-Assisted Image Processing, Microcomputers

ABSTRACT
Training of medical staff in minimally invasive surgery (MIS) is an area where new, effective training methods are needed. We have studied stent grafting, a type of MIS used to treat abdominal aortic aneurysm. Our analysis revealed that this procedure requires a range of motor, perceptual and cognitive skills. In this paper, we present a training environment that could be used to acquire these skills. Our proposed solution differs from the usual VR solutions by using the World Wide Web as the environment for our system. This paper discusses how our solution covers these training skills and presents the results of an appraisal process conducted to evaluate it.
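No implementation detail is given in the abstract. Purely as an illustrative assumption, the sketch below shows one simple way a web-delivered curriculum could record which of the motor, perceptual and cognitive skill categories each training task exercises, the kind of coverage an appraisal would examine. All task names and the SkillCoverage class are hypothetical.

    import java.util.EnumSet;
    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.Set;

    /**
     * Illustrative sketch only (not the authors' system): a mapping from
     * hypothetical web-based stent-grafting training tasks to the skill
     * categories identified in the task analysis.
     */
    public class SkillCoverage {

        enum SkillCategory { MOTOR, PERCEPTUAL, COGNITIVE }

        public static void main(String[] args) {
            Map<String, Set<SkillCategory>> tasks = new LinkedHashMap<>();
            tasks.put("Guidewire navigation under simulated fluoroscopy",
                      EnumSet.of(SkillCategory.MOTOR, SkillCategory.PERCEPTUAL));
            tasks.put("Stent-graft sizing from aneurysm measurements",
                      EnumSet.of(SkillCategory.COGNITIVE));
            tasks.put("Deployment-site identification on the 3D model",
                      EnumSet.of(SkillCategory.PERCEPTUAL, SkillCategory.COGNITIVE));

            // Report how many training tasks exercise each skill category.
            for (SkillCategory skill : SkillCategory.values()) {
                long covering = tasks.values().stream().filter(s -> s.contains(skill)).count();
                System.out.println(skill + " skills exercised by " + covering + " task(s)");
            }
        }
    }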