ABSTRACT
PURPOSE: C-arm fluoroscopy reconstruction, such as that used in prostate brachytherapy, requires that the relative poses of the individual C-arm fluoroscopy images be known prior to reconstruction. Radiographic fiducials can provide excellent C-arm pose tracking, but they need to be segmented in the image. The authors report an automated and unsupervised method that does not require prior segmentation of the fiducial. METHODS: The authors compute the individual C-arm poses relative to a stationary radiographic fiducial of known geometry. The authors register a filtered 2D fluoroscopy image of the fiducial to its 3D model by using image intensity alone, without prior segmentation. To enhance the C-arm images, the authors investigated a three-step cascade filter and a line enhancement filter. The authors tested the method on a composite fiducial containing beads, straight lines, and ellipses. Ground-truth C-arm pose was provided by a clinically proven method. RESULTS: Using 111 clinical C-arm images and +/- 10 degrees and +/- 10 mm random perturbation around the ground-truth pose, a total of 2775 cases were evaluated. The average rotation and translation errors were 0.62 degrees (STD = 0.31 degrees) and 0.72 mm (STD = 0.55 mm) for the three-step filter and 0.67 degrees (STD = 0.40 degrees) and 0.87 mm (STD = 0.27 mm) for the line enhancement filter. CONCLUSIONS: The C-arm pose tracking method was sufficiently accurate and robust on human patient data for subsequent 3D implant reconstruction.
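The rotation and translation errors reported above compare an estimated pose against ground truth. A minimal sketch of how such pose errors are conventionally computed (the function names are ours, not from the paper):

```python
import numpy as np

def rotation_error_deg(R_est, R_true):
    """Angle (degrees) of the residual rotation between two 3x3 rotation matrices."""
    R_delta = R_est @ R_true.T
    # Clip guards against floating-point drift just outside arccos's domain.
    cos_theta = np.clip((np.trace(R_delta) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

def translation_error_mm(t_est, t_true):
    """Euclidean distance (mm) between estimated and ground-truth translations."""
    return float(np.linalg.norm(np.asarray(t_est, float) - np.asarray(t_true, float)))
```

Averaging these two quantities over the 2775 perturbed trials yields exactly the kind of summary statistics quoted in the RESULTS section.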
Subject(s)
Fiducial Markers , Fluoroscopy/standards , Image Processing, Computer-Assisted/methods , Humans , Male , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/radiotherapy
ABSTRACT
Central Line Tutor is a system that facilitates real-time feedback during training for central venous catheterization. One limitation of Central Line Tutor is its reliance on expensive, cumbersome electromagnetic tracking to facilitate various training aids, including ultrasound task identification and segmentation of neck vasculature. The purpose of this study is to validate deep learning methods for vessel segmentation and ultrasound pose classification in order to mitigate the system's reliance on electromagnetic tracking. A large dataset of segmented and classified ultrasound images was generated from participant data captured using Central Line Tutor. A U-Net architecture was used to perform vessel segmentation, while a shallow Convolutional Neural Network (CNN) architecture was designed to classify the pose of the ultrasound probe. A second classifier architecture was also tested that used the U-Net output as the CNN input. The mean testing set Intersection over Union score for U-Net cross-validation was 0.746 ± 0.052. The mean test set classification accuracy for the CNN was 92.0% ± 3.0%, while the U-Net + CNN achieved 92.7% ± 2.1%. This study highlights the potential for deep learning on ultrasound images to replace the current electromagnetic tracking-based methods for vessel segmentation and ultrasound pose classification, and represents an important step towards removing the electromagnetic tracker altogether. Removing the need for an external tracking system would significantly reduce the cost of Central Line Tutor and make it far more accessible to the medical trainees who would benefit from it most.
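The Intersection over Union score used to evaluate the U-Net can be computed directly from a predicted and a ground-truth binary mask. A minimal sketch, assuming masks are NumPy arrays of 0/1 values (not the study's actual evaluation code):

```python
import numpy as np

def intersection_over_union(pred, target):
    """IoU between two binary segmentation masks of the same shape."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        # Both masks empty: conventionally treated as perfect agreement.
        return 1.0
    inter = np.logical_and(pred, target).sum()
    return float(inter / union)
```

A score of 1.0 means the predicted vessel mask exactly matches the annotation; the 0.746 mean reported above is averaged over the cross-validation test set.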
Subject(s)
Catheterization, Central Venous , Humans , Image Processing, Computer-Assisted , Neural Networks, Computer , Ultrasonography
ABSTRACT
PURPOSE: In prostate brachytherapy, transrectal ultrasound (TRUS) is used to visualize the anatomy, while implanted seeds can be visualized by fluoroscopy. Intraoperative dosimetry optimization is possible using a combination of TRUS and fluoroscopy, but requires localization of the fluoroscopy-derived seed cloud relative to the anatomy as seen on TRUS. The authors propose to develop a method of registration between TRUS images and the implants reconstructed from fluoroscopy. METHODS: A phantom was implanted with 48 seeds and then imaged with TRUS and CT. Seeds were reconstructed from CT, yielding a cloud of seeds. Fiducial-based ground-truth registration was established between the TRUS and CT. TRUS images are filtered, compounded, and registered to the reconstructed implants by using an intensity-based metric. The authors evaluated a volume-to-volume and a point-to-volume registration scheme. In total, seven TRUS filtering techniques and three image similarity metrics were analyzed. The method was also tested on human subject data captured from a brachytherapy procedure. RESULTS: For volume-to-volume registration, the noise reduction filter and normalized correlation metric yielded the best result: an average of 0.54 +/- 0.11 mm seed localization error relative to ground truth. For point-to-volume registration, noise reduction combined with the beam profile filter and mean squares metric yielded the best result: an average of 0.38 +/- 0.19 mm seed localization error relative to the ground truth. In human patient data, C-arm fluoroscopy images showed 81 radioactive seeds implanted inside the prostate. A qualitative analysis showed clinically correct agreement between the seeds visible in TRUS and reconstructed from intraoperative fluoroscopy imaging. The measured registration error compared to the seed locations manually selected by the clinician was 2.86 +/- 1.26 mm.
CONCLUSIONS: Fully automated registration between TRUS and the reconstructed seeds performed well in ground-truth phantom experiments and qualitative observation showed adequate performance on early clinical patient data.
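The normalized correlation metric that performed best in the volume-to-volume experiments can be sketched as follows. This is a simplified, hypothetical implementation over two full arrays; a real registration framework evaluates it only over the overlapping, resampled voxels at each candidate transform:

```python
import numpy as np

def normalized_correlation(a, b):
    """Normalized cross-correlation of two equally shaped image arrays.

    Returns a value in [-1, 1]; 1.0 indicates a perfect linear
    intensity relationship between the two images.
    """
    a = np.asarray(a, float).ravel()
    b = np.asarray(b, float).ravel()
    a = a - a.mean()  # remove mean so the metric is contrast/offset invariant
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0  # one image is constant; correlation undefined
    return float(np.dot(a, b) / denom)
```

An optimizer maximizes this value over rigid transform parameters; its invariance to linear intensity scaling is what makes it suitable for comparing filtered TRUS with a rendered seed model.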
Subject(s)
Fluoroscopy/methods , Prostatic Neoplasms/diagnosis , Prostatic Neoplasms/radiotherapy , Subtraction Technique , Tomography, X-Ray Computed/methods , Ultrasonography/methods , Brachytherapy , Humans , Image Enhancement/methods , Image Interpretation, Computer-Assisted/methods , Male , Radiotherapy, Computer-Assisted/methods , Reproducibility of Results , Sensitivity and Specificity
ABSTRACT
Breast-conserving surgery, also known as lumpectomy, is an early-stage breast cancer treatment that aims to spare as much healthy breast tissue as possible. A risk associated with lumpectomy is the presence of cancer-positive margins after the operation. Surgical navigation has been shown to reduce cancer-positive margins but requires manual segmentation of the tumor intraoperatively. In this paper, we propose an end-to-end solution for automatic contouring of breast tumors from intraoperative ultrasound images using two convolutional neural network architectures, the U-Net and residual U-Net. The networks are trained on annotated intraoperative breast ultrasound images and evaluated on the quality of predicted segmentations. This work brings us one step closer to providing surgeons with an automated surgical navigation system that helps reduce cancer-positive margins during lumpectomy.
Subject(s)
Breast Neoplasms , Mastectomy, Segmental , Breast/diagnostic imaging , Breast Neoplasms/diagnostic imaging , Female , Humans , Neural Networks, Computer , Ultrasonography, Mammary
ABSTRACT
Three dimensional dosimetry is being used in an increasingly wide variety of clinical applications as more gel and radiochromic plastic dosimeters become available. However, accessible 3D dosimetry analysis tools have not kept pace. 3D dosimetry data analysis is time consuming and laborious, creating a barrier to entry for busy clinical environments. To help in the adoption of 3D dosimetry, we have produced a streamlined, open-source dosimetry analysis system by developing a custom extension in 3D Slicer, called the Gel Dosimetry Analysis slicelet, which enables rapid and accurate data analysis. To assist those interested in adopting 3D dosimetry in their clinic or those unfamiliar with what is involved in a 3D dosimeter experiment, we first present the workflow of a typical gel dosimetry experiment. This is followed by the results of experiments used to validate, step-wise, each component of our software. Overall, our software has made a full 3D gel dosimeter analysis roughly 20 times faster than previous analysis systems.
ABSTRACT
In prostate cancer treatment, there is a move toward targeted interventions for biopsy and therapy, which has precipitated the need for precise image-guided methods for needle placement. This paper describes an integrated system for planning and performing percutaneous procedures with robotic assistance under MRI guidance. A graphical planning interface allows the physician to specify the set of desired needle trajectories, based on anatomical structures and lesions observed in the patient's registered pre-operative and pre-procedural MR images, immediately prior to the intervention in an open-bore MRI scanner. All image-space coordinates are automatically computed, and are used to position a needle guide by means of an MRI-compatible robotic manipulator, thus avoiding the limitations of the traditional fixed needle template. Automatic alignment of real-time intra-operative images aids visualization of the needle as it is manually inserted through the guide. Results from in-scanner phantom experiments are provided.
Subject(s)
Biopsy, Needle , Magnetic Resonance Imaging/methods , Prostatic Neoplasms/diagnosis , Robotics , Humans , Magnetic Resonance Imaging/instrumentation , Male , Neuronavigation
ABSTRACT
This work explores an image-based approach for localizing needles during MRI-guided interventions, for the purpose of tracking and navigation. Susceptibility artifacts for several needles of varying thickness were imaged, in phantoms, using a 3 tesla MRI system, under a variety of conditions. The relationship between the true needle positions and the locations of artifacts within the images, determined both by manual and automatic segmentation methods, has been quantified and is presented here.
Subject(s)
Artifacts , Magnetic Resonance Imaging , Needles , United States
ABSTRACT
This work describes an integrated system for planning and performing percutaneous procedures, such as prostate biopsy, with robotic assistance under MRI guidance. The physician interacts with a planning interface in order to specify the set of desired needle trajectories, based on anatomical structures and lesions observed in the patient's MR images. All image-space coordinates are automatically computed, and used to position a needle guide by means of an MRI-compatible robotic manipulator, thus avoiding the limitations of the traditional fixed needle template. Direct control of real-time imaging aids visualization of the needle as it is manually inserted through the guide. Results from in-scanner phantom experiments are provided.
Subject(s)
Biopsy, Needle , Magnetic Resonance Imaging , Robotics/instrumentation , Computer Systems , Humans , Male , Prostatic Neoplasms/diagnosis
ABSTRACT
We present the prototype of an image-guided robotic system for accurate and consistent placement of percutaneous needles in soft-tissue targets under CT guidance inside the gantry of a CT scanner. The couch-mounted system consists of a seven-degrees-of-freedom passive mounting arm, a remote center-of-motion robot, and a motorized needle-insertion device. Single-image-based coregistration of the robot and image space is achieved by stereotactic localization using a miniature version of the BRW head frame built into the radiolucent needle driver. The surgeon plans and controls the intervention in the scanner room on a desktop computer that receives DICOM images from the scanner. The system does not need calibration, employs pure image-based registration, and does not utilize any vendor-specific hardware or software features. In the open air, where there is no needle-tissue interaction, we systematically achieved an accuracy better than 1 mm in hitting targets at 5-8 cm from the fulcrum point. In the phantom, the orientation accuracy was 0.6 degrees, and the distance between the needle tip and the target was 1.04 mm. Experiments indicated that this robotic system is suitable for a variety of percutaneous clinical applications.
Subject(s)
Biopsy, Needle/instrumentation , Robotics/instrumentation , Surgery, Computer-Assisted/instrumentation , Tomography, X-Ray Computed/instrumentation , Humans , Image Processing, Computer-Assisted , Phantoms, Imaging , Tomography, X-Ray Computed/methods , User-Computer Interface
ABSTRACT
Previously, static and adjustable image overlay systems were proposed for aiding needle interventions. These systems were either fixed to a scanner or mounted on a large articulated, counterbalanced arm. Certain drawbacks associated with these systems limited their clinical translation. To minimize these limitations, we present a mobile image overlay system with the objectives of reduced system weight, smaller dimensions, and increased tracking accuracy. The design study includes optimal workspace definition and selection of the display device, mirror, and laser source. The laser plane alignment, phantom design, image overlay plane calibration, and system accuracy validation methods are discussed. The virtual image is generated by a tablet device and projected onto the patient by using a beamsplitter mirror. The viewbox weight (1.0 kg) was reduced by a factor of 8.2 and the image overlay plane tracking precision (0.21 mm, STD = 0.05) was improved by a factor of 5 compared to the previous system. The automatic self-calibration of the image overlay plane was achieved in two simple steps and can be done away from the patient table. The fiducial registration error of the physical phantom to scanned image volume registration was 1.35 mm (STD = 0.11). The reduced system weight and increased accuracy of optical tracking should enable the system to be hand held by the physician and used to explore the image volume over the patient for needle interventions.
Subject(s)
Surgery, Computer-Assisted/instrumentation , Cell Phone , Equipment Design , Humans , Image Processing, Computer-Assisted , Lasers , Needles , Phantoms, Imaging , Surgery, Computer-Assisted/methods , Tomography, X-Ray Computed
ABSTRACT
PURPOSE: Facet syndrome is a condition that may cause 15-45 % of chronic lower back pain. It is commonly diagnosed and treated using facet joint injections. This needle technique demands high accuracy, and ultrasound (US) is a potentially useful modality to guide the needle. US-guided injections, however, require physicians to interpret 2-D sonographic images while simultaneously manipulating an US probe and needle. Therefore, US-guidance for facet joint injections needs advanced training methodologies that will equip physicians with the requisite skills. METHODS: We used Perk Tutor-an augmented reality training system for US-guided needle insertions-in a configuration for percutaneous procedures of the lumbar spine. In a pilot study of 26 pre-medical undergraduate students, we evaluated the efficacy of Perk Tutor training compared to traditional training. RESULTS: The Perk Tutor Trained group, which had access to Perk Tutor during training, had a mean success rate of 61.5 %, while the Control group, which received traditional training, had a mean success rate of 38.5 % ([Formula: see text]). No significant differences in procedure times or needle path lengths were observed between the two groups. CONCLUSIONS: The results of this pilot study suggest that Perk Tutor provides an improved training environment for US-guided facet joint injections on a synthetic model.
Subject(s)
Education, Medical/methods , Injections, Intra-Articular/instrumentation , Low Back Pain/drug therapy , Zygapophyseal Joint/diagnostic imaging , Equipment Design , Humans , Low Back Pain/diagnostic imaging , Needles , Reproducibility of Results , Ultrasonography
ABSTRACT
We present a needle deflection estimation method to compensate for needle bending during insertion into deformable tissue. We combine a kinematic needle deflection estimation model, electromagnetic (EM) trackers, and a Kalman filter (KF). We reduce the impact of error from the needle deflection estimation model by using the fusion of two EM trackers to report the approximate needle tip position in real-time. One reliable EM tracker is installed on the needle base, and estimates the needle tip position using the kinematic needle deflection model. A smaller but much less reliable EM tracker is installed on the needle tip, and estimates the needle tip position through direct noisy measurements. Using a KF, the sensory information from both EM trackers is fused to provide a reliable estimate of the needle tip position with much reduced variance in the estimation error. We then implement this method to compensate for needle deflection during simulated prostate cancer brachytherapy needle insertion. At a typical maximum insertion depth of 15 cm, needle tip mean estimation error was reduced from 2.39 mm to 0.31 mm, which demonstrates the effectiveness of our method, offering a clinically practical solution.
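The fusion step at the heart of the method above, weighting the model-based estimate against the noisy tip-sensor measurement, reduces for a single static update to inverse-variance weighting. A minimal 1-D illustration of that principle (not the paper's full Kalman filter, which also propagates state through time):

```python
def fuse_estimates(x1, var1, x2, var2):
    """Fuse two independent, unbiased estimates of the same quantity by
    inverse-variance weighting; returns the fused estimate and its variance.

    The more reliable estimate (smaller variance) receives the larger weight,
    and the fused variance is always smaller than either input variance.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    x_fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)
    return x_fused, var_fused
```

In the paper's setting, one input plays the role of the kinematic-model prediction from the base tracker and the other the direct but noisy tip-tracker measurement; fusing them yields the reduced estimation-error variance the authors report.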
Subject(s)
Brachytherapy , Electromagnetic Fields , Models, Biological , Needles , Prostatic Neoplasms/therapy , Biomechanical Phenomena , Brachytherapy/instrumentation , Brachytherapy/methods , Humans , Male
ABSTRACT
PURPOSE: Brachytherapy is an important mode of breast cancer treatment; however, improvements in both treatment planning and delivery are needed. In order to meet these specific needs, integration of pre-operative imaging, supplemented by computerized surgical planning and mathematical optimization, was used to develop and test an intra-operative immobilization and catheter guidance system. METHOD: A custom template specific to each patient, with optimally placed guide holes for catheter insertion, was designed and fabricated. Creation of the template is based on a virtual reality reconstruction of the patient's anatomy from computed tomography imaging. The template fits on the patient's breast, immobilizing the soft tissue, and provides pre-planned catheter insertion holes for guidance to the tumor site. Agar-based phantom and target models were used for quantitative validation by ascertaining the precision and accuracy of the templates. RESULTS: Tests were performed on agar-based tissue models using computed tomography imaging for template planning and validation. Planned catheter tracks were compared to post-insertion image data, and distance measurements from the target location were used to create an error measure. Initial results yielded an average error of 4.5 mm. Once the workflow and template design were improved, an average error of 2.6 mm was observed, bringing the error close to a clinically acceptable range. CONCLUSION: Use of a patient-specific template for breast brachytherapy is feasible and may improve the procedure accuracy and outcome.
Subject(s)
Brachytherapy/methods , Breast Neoplasms/radiotherapy , Catheterization/instrumentation , Immobilization/instrumentation , Radiotherapy Planning, Computer-Assisted/methods , Breast Neoplasms/diagnostic imaging , Female , Humans , Imaging, Three-Dimensional , Phantoms, Imaging , Software , Tomography, X-Ray Computed , User-Computer Interface
ABSTRACT
PURPOSE: Brachytherapy (radioactive seed insertion) has emerged as one of the most effective treatment options for patients with prostate cancer, with the added benefit of a convenient outpatient procedure. The main limitation in contemporary brachytherapy is faulty seed placement, predominantly due to the presence of intra-operative edema (tissue expansion). Though currently not available, the capability to intra-operatively monitor the seed distribution can make a significant improvement in cancer control. We present such a system here. METHODS: Intra-operative measurement of edema in prostate brachytherapy requires localization of inserted radioactive seeds relative to the prostate. Seeds were reconstructed using a typical non-isocentric C-arm, and exported to a commercial brachytherapy treatment planning system. Technical obstacles for 3D reconstruction on a non-isocentric C-arm include pose-dependent C-arm calibration; distortion correction; pose estimation of C-arm images; seed reconstruction; and C-arm to TRUS registration. RESULTS: In precision-machined hard phantoms with 40-100 seeds and soft tissue phantoms with 45-87 seeds, we correctly reconstructed the seed implant shape with an average 3D precision of 0.35 mm and 0.24 mm, respectively. In a DoD Phase-1 clinical trial on six patients with 48-82 planned seeds, we achieved intra-operative monitoring of seed distribution and dosimetry, correcting for dose inhomogeneities by inserting an average of over four additional seeds in the six enrolled patients (minimum 1; maximum 9). Additionally, in each patient, the system automatically detected intra-operative seed migration induced by edema (mean 3.84 mm, STD 2.13 mm, Max 16.19 mm). CONCLUSIONS: The proposed system is the first of its kind that makes intra-operative detection of edema (and subsequent re-optimization) possible on any typical non-isocentric C-arm, at negligible additional cost to the existing clinical installation.
It achieves a significantly more homogeneous seed distribution, and has the potential to affect a paradigm shift in clinical practice. Large scale studies and commercialization are currently underway.
Subject(s)
Brachytherapy/adverse effects , Edema/diagnostic imaging , Edema/etiology , Imaging, Three-Dimensional/methods , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/radiotherapy , Ultrasonography/methods , Edema/prevention & control , Humans , Male , Preoperative Care/methods , Prostatic Neoplasms/complications , Radiotherapy, Computer-Assisted/methods
ABSTRACT
In prostate brachytherapy, a transrectal ultrasound (TRUS) will show the prostate boundary but not all the implanted seeds, while fluoroscopy will show all the seeds clearly but not the boundary. We propose an intensity-based registration between TRUS images and the implant reconstructed from fluoroscopy as a means of achieving accurate intra-operative dosimetry. The TRUS images are first filtered and compounded, and then registered to the fluoroscopy model via mutual information. A training phantom was implanted with 48 seeds and imaged. Various ultrasound filtering techniques were analyzed, and the best results were achieved with the Bayesian combination of adaptive thresholding, phase congruency, and compensation for the non-uniform ultrasound beam profile in the elevation and lateral directions. The average registration error between corresponding seeds relative to the ground truth was 0.78 mm. The effect of false positives and false negatives in ultrasound were investigated by masking true seeds in the fluoroscopy volume or adding false seeds. The registration error remained below 1.01 mm when the false positive rate was 31%, and 0.96 mm when the false negative rate was 31%. This fully automated method delivers excellent registration accuracy and robustness in phantom studies, and promises to demonstrate clinically adequate performance on human data as well.
Keywords: Prostate brachytherapy, Ultrasound, Fluoroscopy, Registration.
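The mutual information driving this registration can be estimated from the joint intensity histogram of the two images. A minimal sketch of a generic estimator (not the study's exact implementation, which operates on resampled 3D volumes inside an optimizer loop):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information (in nats) between two equally shaped images,
    estimated from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(np.ravel(img_a), np.ravel(img_b), bins=bins)
    pxy = joint / joint.sum()                 # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)       # marginal of image B
    nz = pxy > 0                              # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

Mutual information peaks when the intensity patterns of the two images are most predictive of each other, which is why it tolerates the very different appearance of seeds in TRUS and in the fluoroscopy-derived model.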
ABSTRACT
PURPOSE: In prostate brachytherapy, determining the 3D location of the seeds relative to surrounding structures is necessary for calculating dosimetry. Ultrasound imaging provides the ability to visualize soft tissues, and implanted seeds can be reconstructed from C-arm fluoroscopy. Registration between these two complementary modalities would allow us to make immediate provisions for dosimetric deviation from the optimal implant plan. METHODS: We propose intensity-based registration between ultrasound and a reconstructed model of seeds from fluoroscopy. The ultrasound images are pre-processed with recursive thresholding and phase congruency. Then a 3D ultrasound volume is reconstructed and registered to the implant model using mutual information. RESULTS: A standard training phantom was implanted with 49 seeds. The average registration error between corresponding seeds relative to the ground truth was 0.09 mm. The effect of false positives in ultrasound was investigated by masking seeds from the fluoroscopy-reconstructed model. The registration error remained below 0.5 mm at a rate of 30% false positives. CONCLUSION: Our method promises to be clinically adequate, given a registration accuracy requirement of 1.5 mm.
ABSTRACT
This paper describes a novel image-based method for tracking robotic mechanisms and interventional devices during Magnetic Resonance Image (MRI)-guided procedures. It takes advantage of the multi-planar imaging capabilities of MRI to optimally image a set of localizing fiducials for passive motion tracking in the image coordinate frame. The imaging system is servoed to adaptively position the scan plane based on automatic detection and localization of fiducial artifacts directly from the acquired image stream. This closed-loop control system has been implemented using an open-source software framework and currently operates with GE MRI scanners. Accuracy and performance were evaluated in experiments, the results of which are presented here.
Subject(s)
Algorithms , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Magnetic Resonance Imaging/methods , Pattern Recognition, Automated/methods , Surgery, Computer-Assisted/methods , User-Computer Interface , Anatomy, Cross-Sectional/instrumentation , Anatomy, Cross-Sectional/methods , Artificial Intelligence , Humans , Image Enhancement/methods , Magnetic Resonance Imaging/instrumentation , Phantoms, Imaging , Reproducibility of Results , Robotics/methods , Sensitivity and Specificity , Surgery, Computer-Assisted/instrumentation
ABSTRACT
Intra-operative guidance in Transrectal Ultrasound (TRUS) guided prostate brachytherapy requires localization of inserted radioactive seeds relative to the prostate. Seeds were reconstructed using a typical C-arm, and exported to a commercial brachytherapy system for dosimetry analysis. Technical obstacles for 3D reconstruction on a non-isocentric C-arm included pose-dependent C-arm calibration; distortion correction; pose estimation of C-arm images; seed reconstruction; and C-arm to TRUS registration. In precision-machined hard phantoms with 40-100 seeds, we correctly reconstructed 99.8% of the seeds with a mean 3D accuracy of 0.68 mm. In soft tissue phantoms with 45-87 seeds and clinically realistic 15-degree C-arm motion, we correctly reconstructed 100% of the seeds with an accuracy of 1.3 mm. The reconstructed 3D seed positions were then registered to the prostate segmented from TRUS. In a Phase-1 clinical trial, so far on 4 patients with 66-84 seeds, we achieved intra-operative monitoring of seed distribution and dosimetry. We optimized the 100% prescribed iso-dose contour by inserting an average of 3.75 additional seeds, making intra-operative dosimetry possible on a typical C-arm, at negligible additional cost to the existing clinical installation.
Subject(s)
Brachytherapy/instrumentation , Imaging, Three-Dimensional/methods , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/radiotherapy , Prosthesis Implantation/instrumentation , Radiography, Interventional/instrumentation , Tomography, X-Ray Computed/instrumentation , Algorithms , Brachytherapy/methods , Equipment Design , Equipment Failure Analysis , Humans , Intraoperative Care/instrumentation , Intraoperative Care/methods , Male , Prosthesis Implantation/methods , Radiographic Image Enhancement/instrumentation , Radiographic Image Enhancement/methods , Radiographic Image Interpretation, Computer-Assisted/methods , Radiography, Interventional/methods , Radiotherapy, Computer-Assisted/instrumentation , Radiotherapy, Computer-Assisted/methods , Reproducibility of Results , Sensitivity and Specificity , Tomography, X-Ray Computed/methods
ABSTRACT
Medical practice continues to move toward less invasive procedures. Many of these procedures require the precision placement of a needle in the anatomy. Over the past several years, our research team has been investigating the use of a robotic needle driver to assist the physician in this task. This paper summarizes our work in this area. The robotic system is briefly described, followed by a description of a clinical trial in spinal nerve blockade. The robot was used under joystick control to place a 22 gauge needle in the spines of 10 patients using fluoroscopic imaging. The results were equivalent to the current manual procedure. We next describe our follow-up clinical application in lung biopsy for lung cancer screening under CT fluoroscopy. The system concept is discussed and the results of a phantom study are presented. A start-up company named ImageGuide has recently been formed to commercialize the robot. Their revised robot design is presented, along with plans to install a ceiling-mounted version of the robot in the CT fluoroscopy suite at Georgetown University.