Results 1 - 20 of 57
1.
BJU Int ; 133(6): 709-716, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38294145

ABSTRACT

OBJECTIVE: To report the learning curve of multiple operators for fusion magnetic resonance imaging (MRI) targeted biopsy and to determine the number of cases needed to achieve proficiency. MATERIALS AND METHODS: All adult males who underwent fusion MRI targeted biopsy between February 2012 and July 2021 for clinically suspected prostate cancer (PCa) in a single centre were included. Fusion transrectal MRI targeted biopsy was performed under local anaesthesia using the Koelis platform. Learning curves for segmentation of transrectal ultrasonography (TRUS) images and the overall MRI targeted biopsy procedure were estimated with locally weighted scatterplot smoothing by computing each operator's timestamps for consecutive procedures. Non-risk-adjusted cumulative sum (CUSUM) methods were used to create learning curves for clinically significant (i.e., International Society of Urological Pathology grade ≥ 2) PCa detection. RESULTS: Overall, 1721 patients underwent MRI targeted biopsy in our centre during the study period. The median (interquartile range) times for TRUS segmentation and for the MRI targeted biopsy procedure were 4.5 (3.5, 6.0) min and 13.2 (10.6, 16.9) min, respectively. Among the 14 operators with experience of more than 50 cases, a plateau was reached after 40 cases for TRUS segmentation time and 50 cases for overall MRI targeted biopsy procedure time. CUSUM analysis showed that the learning curve for clinically significant PCa detection required 25 to 45 procedures to achieve clinical proficiency. Pain scores ranged between 0 and 1 for 84% of patients, and a plateau phase was reached after 20 to 100 cases. CONCLUSIONS: A minimum of 50 cases of MRI targeted biopsy are necessary to achieve clinical and technical proficiency and to reach reproducibility in terms of timing, clinically significant PCa detection, and pain.
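The CUSUM analysis mentioned above lends itself to a short illustration. Below is a minimal sketch of a non-risk-adjusted CUSUM learning curve for clinically significant PCa detection, assuming a reference (target) detection rate; the rate, the synthetic data, and the function name are illustrative, not the authors' values.

```python
import numpy as np

def cusum_curve(detections, target_rate=0.4):
    """Non-risk-adjusted CUSUM: cumulative sum of (outcome - target) per case.

    detections  : sequence of 0/1 outcomes (csPCa found on targeted biopsy)
    target_rate : acceptable detection rate used as the reference level
    """
    increments = np.asarray(detections, dtype=float) - target_rate
    return np.cumsum(increments)

# Illustrative use: one operator's first consecutive cases (synthetic data).
rng = np.random.default_rng(0)
outcomes = rng.binomial(1, 0.45, size=60)
curve = cusum_curve(outcomes, target_rate=0.4)
# A sustained upward slope suggests performance above the reference rate;
# proficiency is typically read off where the slope stabilises.
print(curve[:10])
```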


Subject(s)
Image-Guided Biopsy; Learning Curve; Prostate; Prostatic Neoplasms; Humans; Male; Prostatic Neoplasms/pathology; Prostatic Neoplasms/diagnostic imaging; Image-Guided Biopsy/methods; Aged; Middle Aged; Prostate/pathology; Prostate/diagnostic imaging; Ultrasonography, Interventional/methods; Imaging, Three-Dimensional; Magnetic Resonance Imaging; Magnetic Resonance Imaging, Interventional; Clinical Competence; Retrospective Studies
2.
Eur Urol Oncol ; 2023 Aug 19.
Article in English | MEDLINE | ID: mdl-37599199

ABSTRACT

BACKGROUND: Segmentation of three-dimensional (3D) transrectal ultrasound (TRUS) images is known to be challenging, and the clinician often lacks a reliable and easy-to-use indicator to assess its accuracy during the fusion magnetic resonance imaging (MRI)-targeted prostate biopsy procedure. OBJECTIVE: To assess the effect of the relative volume difference between 3D-TRUS and MRI segmentation on the outcome of a targeted biopsy. DESIGN, SETTING, AND PARTICIPANTS: All adult males who underwent an MRI-targeted prostate biopsy for clinically suspected prostate cancer between February 2012 and July 2021 were consecutively included. INTERVENTION: All patients underwent a fusion MRI-targeted prostate biopsy with a Koelis device. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Three-dimensional TRUS and MRI prostate volumes were calculated using 3D prostate models issued from the segmentations. The primary outcome was the relative segmentation volume difference (SVD) between transrectal ultrasound and MRI, normalized by the MRI volume (SVD = [MRI volume - TRUS volume]/MRI volume), and its correlation with clinically significant prostate cancer (i.e., International Society of Urological Pathology [ISUP] grade ≥2) positivity on targeted biopsy cores. RESULTS AND LIMITATIONS: Overall, 1721 patients underwent a targeted biopsy, resulting in a total of 5593 targeted cores. The median relative SVD was significantly lower in patients diagnosed with clinically significant prostate cancer than in those with ISUP 0-1 (6.7% [interquartile range {IQR} -2.7, 13.6] vs 8.0% [IQR 3.3, 16.4], p < 0.01). A multivariate regression analysis showed that a relative SVD of >10% of the MRI volume was associated with a lower detection rate of clinically significant prostate cancer (odds ratio = 0.74 [95% confidence interval: 0.55-0.98]; p = 0.038). CONCLUSIONS: A relative SVD of >10% of the MRI segmented volume was associated with a lower detection rate of clinically significant prostate cancer on targeted biopsy cores. The relative SVD can be used as a per-procedure quality indicator of 3D-TRUS segmentation. PATIENT SUMMARY: A discrepancy of ≥10% between the segmented magnetic resonance imaging and transrectal ultrasound volumes is associated with a reduced ability to detect significant prostate cancer on targeted biopsy cores.
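For clarity, a minimal sketch of the relative SVD computation and the >10% quality flag described above; the function name, units, and threshold variable are illustrative.

```python
def relative_svd(mri_volume_ml: float, trus_volume_ml: float) -> float:
    """Relative segmentation volume difference, as defined in the abstract:
    SVD = (MRI volume - TRUS volume) / MRI volume."""
    return (mri_volume_ml - trus_volume_ml) / mri_volume_ml

# Example: a 50 mL MRI segmentation vs. a 44 mL 3D-TRUS segmentation.
svd = relative_svd(50.0, 44.0)          # 0.12 -> 12 %
flag_poor_segmentation = svd > 0.10     # >10 % was associated with lower csPCa detection
print(f"SVD = {svd:.1%}, re-check segmentation: {flag_poor_segmentation}")
```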

3.
IEEE Trans Biomed Eng ; 70(8): 2338-2349, 2023 08.
Article in English | MEDLINE | ID: mdl-37022829

ABSTRACT

OBJECTIVE: The accuracy of biopsy targeting is a major issue for prostate cancer diagnosis and therapy. However, navigation to biopsy targets remains challenging due to the limitations of transrectal ultrasound (TRUS) guidance combined with prostate motion. This article describes a rigid 2D/3D deep registration method, which provides continuous tracking of the biopsy location w.r.t. the prostate for enhanced navigation. METHODS: A spatiotemporal registration network (SpT-Net) is proposed to localize the live 2D US image relative to a previously acquired US reference volume. The temporal context relies on prior trajectory information based on previous registration results and probe tracking. Different forms of spatial context were compared through inputs (local, partial, or global) or using an additional spatial penalty term. The proposed 3D CNN architecture with all combinations of spatial and temporal context was evaluated in an ablation study. To provide a realistic clinical validation, a cumulative error was computed through series of registrations along trajectories, simulating a complete clinical navigation procedure. We also proposed two dataset generation processes with increasing levels of registration complexity and clinical realism. RESULTS: The experiments show that a model using local spatial information combined with temporal information performs better than more complex spatiotemporal combinations. CONCLUSION: The best proposed model demonstrates robust real-time 2D/3D US cumulative registration performance on trajectories. These results meet clinical requirements and application feasibility, and they outperform similar state-of-the-art methods. SIGNIFICANCE: Our approach seems promising for clinical prostate biopsy navigation assistance and other US image-guided procedures.
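The SpT-Net architecture is not detailed in the abstract; the sketch below only illustrates the general idea of regressing a rigid 6-DoF update from a live 2D frame, a local 3D sub-volume, and the previous pose estimate, assuming PyTorch. Layer sizes, input shapes, and the pose parameterisation are illustrative, not the authors' design.

```python
import torch
import torch.nn as nn

class Rigid2D3DRegNet(nn.Module):
    """Illustrative sketch: predict a 6-DoF rigid update (3 translations,
    3 rotations) locating a live 2D US frame inside a reference 3D US volume,
    conditioned on the previous pose estimate (temporal context)."""
    def __init__(self):
        super().__init__()
        self.enc2d = nn.Sequential(                 # live 2D frame encoder
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.enc3d = nn.Sequential(                 # local sub-volume encoder
            nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten())
        self.head = nn.Sequential(
            nn.Linear(32 + 32 + 6, 64), nn.ReLU(),
            nn.Linear(64, 6))                       # 6-DoF rigid update

    def forward(self, frame2d, subvol3d, prev_pose6):
        f = torch.cat([self.enc2d(frame2d), self.enc3d(subvol3d), prev_pose6], dim=1)
        return self.head(f)

net = Rigid2D3DRegNet()
pose = net(torch.randn(1, 1, 128, 128),       # live 2D frame
           torch.randn(1, 1, 32, 64, 64),     # local 3D context around the last pose
           torch.zeros(1, 6))                 # previous pose estimate
print(pose.shape)  # torch.Size([1, 6])
```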


Subject(s)
Prostate; Prostatic Neoplasms; Male; Humans; Prostate/diagnostic imaging; Prostate/pathology; Imaging, Three-Dimensional/methods; Biopsy; Prostatic Neoplasms/diagnostic imaging; Prostatic Neoplasms/pathology; Ultrasonography/methods
4.
Med Phys ; 49(8): 5268-5282, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35506596

ABSTRACT

PURPOSE: Precise determination of the target is essential in prostate interventions, such as prostate biopsy, lesion detection, and targeted therapy. However, prostate delineation can be difficult in some cases due to tissue ambiguity or partially missing anatomical boundaries. In this study, we propose a novel supervised registration-based algorithm for precise prostate segmentation, which combines a convolutional neural network (CNN) with a statistical shape model (SSM). METHODS: The proposed network mainly consists of two branches. One branch, called SSM-Net, was exploited to predict the shape transform matrix, shape control parameters, and shape fine-tuning vector, for the generation of the prostate boundary. Furthermore, according to the inferred boundary, a normalized distance map was calculated as the output of SSM-Net. Another branch, named ResU-Net, was employed to predict a probability label map from the input images at the same time. Integrating the output of these two branches, the optimal weighted sum of the distance map and the probability map was regarded as the prostate segmentation. RESULTS: Two public data sets, PROMISE12 and NCI-ISBI 2013, were utilized to evaluate the performance of the proposed algorithm. The results demonstrated that the segmentation algorithm achieved the best performance with an SSM of 9500 nodes, obtaining a Dice score of 0.907 and an average surface distance of 1.85 mm. Compared with other methods, our algorithm delineates the prostate region more accurately and efficiently. In addition, we verified the impact of model elasticity augmentation and the fine-tuning term on the network segmentation capability. Both factors improved the delineation accuracy, with the Dice score increased by 10% and 7%, respectively. CONCLUSIONS: Our segmentation method has the potential to be an effective and robust approach for prostate segmentation.
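A minimal sketch of the fusion step described above — combining the ResU-Net probability map with the SSM-derived normalized distance map through a weighted sum and a threshold. The weight and threshold values are illustrative; the paper optimises them.

```python
import numpy as np

def fuse_maps(prob_map, dist_map, w=0.6, threshold=0.5):
    """Combine the ResU-Net probability map and the SSM-Net normalized
    distance map into a binary prostate mask (weights are illustrative)."""
    fused = w * prob_map + (1.0 - w) * dist_map
    return fused >= threshold

# Toy example on a 3D grid (real inputs would be the two network outputs).
prob = np.random.rand(64, 64, 32)
dist = np.random.rand(64, 64, 32)   # normalized distance map in [0, 1]
mask = fuse_maps(prob, dist)
print(mask.shape, mask.dtype)
```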


Subject(s)
Imaging, Three-Dimensional; Prostate; Algorithms; Humans; Image Processing, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Magnetic Resonance Imaging/methods; Male; Models, Statistical; Neural Networks, Computer; Prostate/diagnostic imaging
5.
Med Phys ; 49(8): 5138-5148, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35443086

ABSTRACT

PURPOSE: Prostate segmentation of 3D TRUS images is a prerequisite for several diagnostic and therapeutic applications. Unfortunately, this difficult task suffers from high intra- and inter-observer variability, even for experienced urologists/radiologists. This is why automatic segmentation algorithms could have significant clinical added value. METHODS: This paper introduces a new deep segmentation architecture consisting of two main phases: view-specific segmentation of 2D slices and their fusion. The segmentation phase is based on three segmentation networks trained in parallel on specific slice viewing directions: axial, coronal, and sagittal. The proposed fusion network is then fed with the output of the segmentation networks and trained to produce three confidence maps. These maps correspond to the local trust granted by the fusion network to each view-specific segmentation network. Finally, for a given slice, the segmentation is computed by combining these confidence maps with their corresponding segmentations. The 3D segmentation of the prostate is obtained by restacking all the segmented slices to form a volume. RESULTS: This approach was evaluated on a database of 100 patients with several combinations of network architectures (for both the segmentation phase and the fusion phase) to show the flexibility and reliability of the framework. The proposed approach was also compared to STAPLE, to the majority voting strategy, and to a direct 3D approach tested on the same database. The new method outperforms these three approaches on all evaluation criteria. Finally, the results of the multi-eXpert fusion (MXF) framework compare favorably with other state-of-the-art methods, although these methods typically work on smaller databases. CONCLUSIONS: We proposed a novel MXF framework to segment 3D TRUS images of the prostate. The main feature of this approach is the fusion of expert network results at the pixel level using computed confidence maps. Experiments conducted on a clinical database have shown the robustness and flexibility of this approach and its superiority over state-of-the-art approaches. Finally, the MXF framework demonstrated its ability to capture and preserve the underlying gland structures, particularly in the base and apex regions.
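A minimal sketch of the pixelwise fusion step: each view-specific soft segmentation is weighted by its (normalised) confidence map and the results are summed. The normalisation and threshold are assumptions; the abstract does not specify how the confidence maps are combined.

```python
import numpy as np

def fuse_views(segs, confidences, threshold=0.5):
    """Pixelwise fusion of view-specific segmentations.

    segs        : (3, H, W) soft segmentations from the axial/coronal/sagittal nets
    confidences : (3, H, W) confidence maps produced by the fusion network
    """
    weights = confidences / (confidences.sum(axis=0, keepdims=True) + 1e-8)
    fused = (weights * segs).sum(axis=0)
    return fused >= threshold

segs = np.random.rand(3, 128, 128)
conf = np.random.rand(3, 128, 128)
print(fuse_views(segs, conf).shape)  # (128, 128)
```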


Subject(s)
Imaging, Three-Dimensional; Prostate; Algorithms; Humans; Image Processing, Computer-Assisted/methods; Imaging, Three-Dimensional/methods; Machine Learning; Male; Prostate/diagnostic imaging; Reproducibility of Results
6.
Med Eng Phys ; 95: 30-38, 2021 09.
Article in English | MEDLINE | ID: mdl-34479690

ABSTRACT

In this study, we investigated a method allowing the determination of the femur bone surface as well as its mechanical axis from a few easy-to-identify bony landmarks. The reconstruction of the whole femur is performed from these landmarks using a Statistical Shape Model (SSM). The aim of this research is to assess the impact of the number, position, and accuracy of the landmarks on the reconstruction of the femur and the determination of its related mechanical axis, an important clinical parameter to consider for lower limb analysis. Two statistical femur models were created, one from our in-house dataset and one from a publicly available dataset. Both were evaluated in terms of average point-to-point surface distance error and through the mechanical axis of the femur. Furthermore, the clinical impact of using landmarks on the skin in place of bony landmarks was investigated. The proximal femurs predicted from bony landmarks were more accurate than those predicted from on-skin landmarks, while both had a mechanical axis angle deviation error of less than 3.5°. The results regarding the non-invasive determination of the mechanical axis are very encouraging and could open interesting clinical perspectives for the analysis of the lower limb, either for orthopedics or for functional rehabilitation.
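As background for the reconstruction step, a minimal sketch of a standard linear-SSM fit from sparse landmarks, assuming a precomputed mean shape, PCA modes, and known landmark-to-vertex correspondences. The ridge regularisation and all names are illustrative; the paper's exact fitting procedure is not given in the abstract.

```python
import numpy as np

def fit_ssm_to_landmarks(mean_shape, modes, landmark_idx, landmarks, reg=1.0):
    """Estimate shape coefficients b so that (mean + modes @ b), restricted to
    the landmark vertices, best matches the measured landmarks (ridge-regularised).

    mean_shape   : (3N,) mean shape, xyz stacked per vertex
    modes        : (3N, K) PCA modes scaled by their standard deviations
    landmark_idx : indices of the vertices corresponding to the landmarks
    landmarks    : (L, 3) measured landmark positions, already aligned to the SSM frame
    """
    rows = np.concatenate([[3 * i, 3 * i + 1, 3 * i + 2] for i in landmark_idx])
    A = modes[rows]                               # (3L, K)
    y = landmarks.reshape(-1) - mean_shape[rows]  # (3L,)
    b = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ y)
    return mean_shape + modes @ b                 # full reconstructed surface (3N,)

# Synthetic example: 100-vertex model, 5 modes, 4 landmarks.
rng = np.random.default_rng(1)
mean = rng.normal(size=300); modes = rng.normal(size=(300, 5))
idx = [0, 10, 50, 99]
lms = mean.reshape(-1, 3)[idx] + 0.01 * rng.normal(size=(4, 3))
surface = fit_ssm_to_landmarks(mean, modes, idx, lms)
print(surface.shape)  # (300,)
```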


Subject(s)
Femur; Plastic Surgery Procedures; Bone and Bones; Feasibility Studies; Femur/diagnostic imaging; Femur/surgery; Imaging, Three-Dimensional; Models, Statistical
7.
Prog Urol ; 31(16): 1115-1122, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34303611

ABSTRACT

INTRODUCTION: Simulation-based training has proven to be a promising option allowing for initial and continuous training while limiting the impact of the learning curve on the patient. The Biopsym simulator was developed as a complete teaching environment for the prostate biopsy procedure. This paper presents the results of an external validation of this simulator, involving urology residents recruited during a regional teaching seminar. METHODS: Residents from 4 academic urology departments of the French Auvergne Rhône-Alpes region, who did not take part in the previous simulator validation studies, were enrolled. After a short presentation and a standardized initiation session, residents carried out a simulated systematic 12-core biopsy procedure and were asked to fill in a questionnaire collecting their expectations and evaluation of the Biopsym simulator. The number of biopsies reaching each targeted sector, the total score provided by the simulator, and the duration of the procedure were recorded. RESULTS: Twenty-three residents were recruited. The overall added value (/100) for learning was rated at a median of 100 (interquartile range 83-100), and the overall realism of the biopsy procedure at 80 (65-89). The median percentage of biopsies reaching the targeted sector was 66.7% (62-75). The median score provided by the simulator was 50% (37-60). For both metrics, the difference between residents with and without prior biopsy experience was not statistically significant. The median duration of the simulated biopsy procedure was 4:58 (minutes:seconds) (3:49-6:00). Residents with prior experience required less time to complete the biopsy procedure: 3:53 (3:39-4:56) vs. 5:10 (4:59-7:10), P=0.01. CONCLUSION: This external validation study confirms a high acceptance of the simulator by the target audience. To our knowledge, the Biopsym simulator is the only prostate biopsy simulator that has demonstrated such validity as evaluated by clinicians outside the center involved in its early development. LEVEL OF EVIDENCE: 3.


Subject(s)
Prostate; Simulation Training; Biopsy; Clinical Competence; Computer Simulation; Humans; Learning Curve; Male
8.
Med Phys ; 48(3): 1144-1156, 2021 Mar.
Article in English | MEDLINE | ID: mdl-33511658

ABSTRACT

PURPOSE: New radiation therapy protocols, in particular adaptive, focal, or boost brachytherapy treatments, require precise determination of the position and orientation of the implanted radioactive seeds from real-time ultrasound (US) images. This is necessary to compare them to the planned positions and to automatically adjust the dosimetric plan for subsequent seed implantations. The image modality, the small size of the seeds, and the artifacts they produce make it a very challenging problem. The objective of the presented work is to set up and evaluate a robust and automatic method for seed localization in three-dimensional (3D) US images. METHODS: The presented method is based on a prelocalization of the needles through which the seeds are injected into the prostate. This prelocalization allows focusing the search on a region of interest (ROI) around the needle tip. Seed localization starts by binarizing the ROI and removing false positives using, respectively, a Bayesian classifier and a support vector machine (SVM). This is followed by a registration stage using, first, an iterative closest point (ICP) algorithm for localizing the connected set of seeds (named a strand) inserted through a needle, and, second, a refinement of each seed position using the sum of squared differences (SSD) as a similarity criterion. ICP registers a geometric model of the strand to the candidate voxels, while SSD compares an appearance model of a single seed to a subset of the image. The method was evaluated both on 3D images of an agar-agar phantom and on a dataset of clinical 3D images. It was tested on stranded and on loose seeds. RESULTS: Results on phantom and clinical images were compared with a manual localization, giving mean errors of 1.09 ± 0.61 mm on the phantom images and 1.44 ± 0.45 mm on clinical images. On clinical images, the mean error in individual seed orientation was 4.33° ± 8.51°. CONCLUSIONS: The proposed algorithm for radioactive seed localization is robust across different US images, accurate, with small mean errors, and returns the five degrees of freedom of each cylindrical seed.
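A minimal sketch of the ICP stage only: rigidly aligning a geometric strand model (seed centres along the needle axis) to candidate voxel positions. The SSD refinement and the classification steps are omitted; seed spacing and all data are synthetic.

```python
import numpy as np
from scipy.spatial import cKDTree

def rigid_icp(model_pts, candidate_pts, n_iter=30):
    """Point-to-point ICP aligning a strand model (seed centres along the
    needle axis) to candidate voxel positions extracted from the 3D US ROI."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(candidate_pts)
    src = model_pts.copy()
    for _ in range(n_iter):
        _, idx = tree.query(src)                   # closest candidate for each model point
        tgt = candidate_pts[idx]
        mu_s, mu_t = src.mean(0), tgt.mean(0)
        H = (src - mu_s).T @ (tgt - mu_t)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T                    # best rotation (Kabsch)
        t_step = mu_t - R_step @ mu_s
        src = src @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t                                    # maps model_pts into image space

# Toy usage: a 3-seed strand model (spacing in mm) vs. shifted, noisy candidates.
model = np.array([[0, 0, z] for z in (0.0, 5.5, 11.0)], dtype=float)
cand = model + np.array([2.0, -1.0, 0.5]) + 0.1 * np.random.randn(3, 3)
R, t = rigid_icp(model, cand)
print(np.round(model @ R.T + t - cand, 2))         # residuals, close to zero
```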


Subject(s)
Brachytherapy; Machine Learning; Prostatic Neoplasms; Bayes Theorem; Humans; Male; Phantoms, Imaging; Prostatic Neoplasms/diagnostic imaging; Prostatic Neoplasms/radiotherapy
9.
IEEE Trans Biomed Eng ; 68(4): 1166-1177, 2021 04.
Article in English | MEDLINE | ID: mdl-32897859

ABSTRACT

This paper presents a new solution for 3D steering of flexible needles guided by 3D B-mode ultrasound imaging. It aims to achieve robust steering, by accounting for uncertainties, noise, and tissue heterogeneities, while limiting tissue-related disturbances. The proposed solution features interconnected state observer, automatic needle tip segmentation, and path planning algorithms. Measurement quality, state uncertainties, and tissue heterogeneity are considered for robust needle steering with helical paths of variable curvature. Fast replanning allows for adaptability to unexpected disturbances. An experimental validation was carried out through 62 insertions of 24-gauge bevel-tip nitinol needles in various tissues. Results are promising, characterized by mean targeting errors of less than 1 mm in homogeneous phantoms, 1.5 ± 0.9 mm in heterogeneous phantoms, and 1.7 ± 0.8 mm in ex-vivo tissue. This new approach is a step towards a precise and robust patient-specific procedure.
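The observer and planner themselves are not described in the abstract; as background, here is a minimal sketch of the standard bevel-tip needle kinematic model that such steering and path-planning methods typically build on: the tip follows constant-curvature arcs, and rotating the needle about its axis changes the bending plane. The curvature value and pose convention are illustrative assumptions.

```python
import numpy as np

def propagate_tip(T, insertion_mm, axial_rot_rad, curvature=0.005):
    """Propagate a 4x4 tip pose for a bevel-tip needle: rotate about the needle
    axis (local z), then advance along a circular arc of the given curvature
    (1/mm) bending towards the local +y direction. Standard kinematic model;
    the curvature value is illustrative."""
    cz, sz = np.cos(axial_rot_rad), np.sin(axial_rot_rad)
    Rz = np.array([[cz, -sz, 0, 0], [sz, cz, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])
    theta = curvature * insertion_mm              # arc angle swept during insertion
    r = 1.0 / curvature
    c, s = np.cos(theta), np.sin(theta)
    arc = np.array([[1,  0, 0, 0],
                    [0,  c, s, r * (1 - c)],      # bend in the local y-z plane
                    [0, -s, c, r * s],
                    [0,  0, 0, 1]])
    return T @ Rz @ arc

T = np.eye(4)
for rot in (0.0, np.pi / 2, np.pi):              # alternate bending planes
    T = propagate_tip(T, insertion_mm=10.0, axial_rot_rad=rot)
print(np.round(T[:3, 3], 2))                     # resulting tip position (mm)
```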


Subject(s)
Algorithms; Needles; Humans; Phantoms, Imaging; Ultrasonography; Ultrasonography, Interventional
10.
Med Phys ; 48(7): 3904-3915, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33159811

ABSTRACT

PURPOSE: Performing a transrectal ultrasound (TRUS) prostate biopsy is at the heart of the current prostate cancer detection procedure. With today's two-dimensional (2D) live ultrasound (US) imaging equipment, this task remains complex due to the poor visibility of cancerous tissue on TRUS images and the limited anatomical context available in the 2D TRUS plane. This paper presents a rigid 2D/3D US registration method for navigated prostate biopsy. This allows continuous localization of the biopsy trajectory during the procedure. METHODS: We proposed an organ-based approach to achieve real-time rigid registration without the need for any probe localization device. The registration method combines image similarity and geometric proximity of detected features. Additions to our previous work include a multi-level approach and the use of a rejection rate favouring the best matches, whose aim is to improve accuracy and computation time. These modifications, their in-depth evaluation on real clinical cases, and a comparison to the previous work are described. We performed static and dynamic evaluations along biopsy trajectories on a very large amount of data acquired under uncontrolled routine conditions. The computed transforms are compared to a ground truth obtained either from corresponding manually detected fiducials or from an already evaluated registration method. RESULTS: All results show that the current method outperforms its previous version, both in terms of accuracy (the average error reported here is 12 to 17% smaller depending on the experiment) and processing time (20 to 60 times faster than the previous implementation). The dynamic registration experiment demonstrates that the method can be successfully used for continuous tracking of the biopsy location w.r.t. the prostate at a rate that varies between 5 and 15 Hz. CONCLUSIONS: This work shows that on-the-fly 2D/3D US registration can be performed very efficiently on biopsy trajectories. This allows us to plan further improvements in prostate navigation and a clinical transfer.
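A minimal sketch of the "rejection rate" idea mentioned above: rank candidate feature matches by a cost blending geometric proximity and image similarity, and keep only the best fraction. The cost weighting and rejection fraction are illustrative assumptions, not the paper's values.

```python
import numpy as np

def combined_cost(geom_dist_mm, descr_dist, alpha=0.5):
    """Blend geometric proximity and descriptor (image-similarity) distance;
    alpha is an illustrative weighting."""
    return alpha * geom_dist_mm + (1.0 - alpha) * descr_dist

def keep_best_matches(costs, rejection_rate=0.3):
    """Keep only the best (1 - rejection_rate) fraction of feature matches,
    ranked by their combined cost (a simple trimmed robust-matching step)."""
    order = np.argsort(costs)
    n_keep = max(1, int(round((1.0 - rejection_rate) * len(costs))))
    return order[:n_keep]

geom = np.random.rand(200) * 10.0     # geometric distances (mm)
descr = np.random.rand(200)           # normalized descriptor distances
kept = keep_best_matches(combined_cost(geom, descr), rejection_rate=0.3)
print(len(kept))                      # 140 matches retained
```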


Subject(s)
Imaging, Three-Dimensional; Prostatic Neoplasms; Biopsy; Humans; Male; Prostatic Neoplasms/diagnostic imaging; Ultrasonography
11.
J Surg Educ ; 77(4): 953-960, 2020.
Article in English | MEDLINE | ID: mdl-32201141

ABSTRACT

OBJECTIVES: To evaluate the ability of students to reproduce in a real-life situation the skills acquired on a prostate biopsy simulator. DESIGN: A prospective randomized controlled study was conducted. Medical students with no experience of prostate biopsy were randomized between arm A ("conventional training") and arm B ("simulator-enhanced training"). The training was performed for both groups on the simulator. The students in arm B were provided with visual and numerical feedback. The transfer of skills was assessed by recording the position of the 12 biopsies performed by each student on an unembalmed human cadaver using a 3D ultrasound mapping device. SETTING: The study was conducted in an academic urology department, and the cadaver experiments in the adjoining anatomy laboratory. RESULTS: Twenty-four students were included, and 22 completed the study. The median score obtained on the simulator at the end of the training was 57% (53-61) for arm A and 66% (59-71) for arm B. The median score obtained on the cadaver by students trained with the simulator was 75% (60-80), statistically superior to the 45% (30-60) obtained by conventionally trained students, p < 0.0001. The median score obtained by all students when performing biopsies in a real-life situation was 63% (50-80), versus 60% (56-70) for their last training session on the simulator. CONCLUSION: These results support the transfer of skills acquired on the simulator and the superiority of a training curriculum integrating simulation and performance feedback.


Subject(s)
Prostate; Students, Medical; Biopsy; Clinical Competence; Computer Simulation; Humans; Male; Prospective Studies
12.
Minim Invasive Ther Allied Technol ; 29(6): 359-365, 2020 Dec.
Article in English | MEDLINE | ID: mdl-31430218

ABSTRACT

Objectives: The Biopsym simulator, a virtual-reality simulator for prostate biopsies, was designed to offer enhanced teaching of the biopsy procedure. The objectives of the present article are to describe the new version of the simulator and report the results of a new validation study. Material and methods: A prospective validation study was conducted between January and March 2017. The new version of the simulator, with improved physical realism, ultrasound image deformation, and a new scoring system, was evaluated by novice and confirmed users. Results: Twenty-one users evaluated the simulator, including ten novices and 11 confirmed users. The overall realism of the biopsy procedure was rated at 7.7/10 (IQR 5.7-9). The differences between the rates given by confirmed users and novices were not statistically significant. The median overall score obtained for the performance of 12 systematic ultrasound-guided biopsies was 43% (IQR 33-55). The median score obtained by confirmed users was 54% (IQR 46-62), and the median score obtained by novices was 31% (IQR 20-35). The difference between the scores was statistically significant (p = 0.005). Conclusions: This study allowed us to gather evidence towards the validation, and particularly towards the construct validation, of the new version of the Biopsym simulator.


Subject(s)
Prostate; Simulation Training; Biopsy; Clinical Competence; Computer Simulation; Male; Prospective Studies; User-Computer Interface
13.
Annu Rev Biomed Eng ; 21: 193-218, 2019 06 04.
Article in English | MEDLINE | ID: mdl-30822100

ABSTRACT

Medical robotics is poised to transform all aspects of medicine-from surgical intervention to targeted therapy, rehabilitation, and hospital automation. A key area is the development of robots for minimally invasive interventions. This review provides a detailed analysis of the evolution of interventional robots and discusses how the integration of imaging, sensing, and robotics can influence the patient care pathway toward precision intervention and patient-specific treatment. It outlines how closer coupling of perception, decision, and action can lead to enhanced dexterity, greater precision, and reduced invasiveness. It provides a critical analysis of some of the key interventional robot platforms developed over the years and their relative merit and intrinsic limitations. The review also presents a future outlook for robotic interventions and emerging trends in making them easier to use, lightweight, ergonomic, and intelligent, and thus smarter, safer, and more accessible for clinical use.


Subject(s)
Biomedical Engineering/trends; Robotics/trends; Translational Research, Biomedical/trends; Biomedical Engineering/methods; Drug Delivery Systems; Economics, Medical; Equipment Design; Humans; Laparoscopy/trends; Minimally Invasive Surgical Procedures/trends; Neurosurgery/trends; Orthopedics/trends; Robotic Surgical Procedures/trends; Translational Research, Biomedical/methods
14.
Ann Biomed Eng ; 46(9): 1385-1396, 2018 Sep.
Article in English | MEDLINE | ID: mdl-29845413

ABSTRACT

Robotic control of needle bending aims at increasing the precision of percutaneous procedures. Ultrasound feedback is preferable for its clinical ease of use, cost and compactness but raises needle detection issues. In this paper, we propose a complete system dedicated to robotized guidance of a flexible needle under 3D ultrasound imaging. This system includes a medical robot dedicated to transperineal needle positioning and insertion, a rapid path planning for needle steering using bevel-tip needle natural curvature in tissue, and an ultrasound-based automatic needle detection algorithm. Since ultrasound-based automatic needle steering is often made difficult by the needle localization in biological tissue, we quantify the benefit of using flexible echogenic needles for robotized guidance under 3D ultrasound. The "echogenic" term refers to the etching of microstructures on the needle shaft. We prove that these structures improve needle visibility and detection robustness in ultrasound images. We finally present promising results when reaching targets using needle steering. The experiments were conducted with various needles in different media (synthetic phantoms and ex vivo biological tissue). For instance, with nitinol needles the mean accuracy is 1.2 mm (respectively 3.8 mm) in phantoms (resp. biological tissue).


Subject(s)
Needles; Robotics; Alloys; Animals; Imaging, Three-Dimensional; Phantoms, Imaging; Swine; Ultrasonography
15.
Int J Comput Assist Radiol Surg ; 13(7): 987-995, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29557082

ABSTRACT

PURPOSE: We present a hybrid 2D-3D ultrasound (US) rigid registration method for navigated prostate biopsy that enables continuous localization of the biopsy trajectory during the exam. METHODS: Current clinical computer-assisted biopsy systems use either sensor-based or image-based approaches. We combine the advantages of both in order to obtain accurate and real-time navigation based only on an approximate localization of the US probe. Starting with features extracted in both 2D and 3D images, our method introduces a variant of the iterative closest point (ICP) algorithm. Among other differences from ICP, a combination of both the Euclidean distance of feature positions and the similarity distance of feature descriptors is used to find matches between 2D and 3D features. The evaluation of the method is twofold. First, an analysis of variance on input parameters is conducted to estimate the sensitivity of our method to their initialization. Second, for a selected set of their values, the target registration error (TRE) was calculated on 29,760 (resp. 4000) registrations in two different experiments. It was obtained using manually identified anatomical fiducials. RESULTS: For 160 US volumes, from 20 patients, recorded during routine biopsy procedures performed in two hospitals by six operators, the mean TRE was [Formula: see text] mm (resp. [Formula: see text] mm). CONCLUSION: This work allows envisioning further developments for prostate navigation and their clinical transfer.
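A minimal sketch of the matching step described above — each 2D feature is paired with the 3D feature minimising a combined cost of Euclidean position distance and descriptor distance. The weights, descriptor dimension, and data are illustrative assumptions; the paper's exact cost is not given in the abstract.

```python
import numpy as np

def match_features(pos2d, desc2d, pos3d, desc3d, w_pos=1.0, w_desc=1.0):
    """For each 2D feature, pick the 3D feature minimizing a combined cost of
    Euclidean position distance (after mapping 2D features into volume space
    with the current pose estimate) and descriptor distance."""
    d_pos = np.linalg.norm(pos2d[:, None, :] - pos3d[None, :, :], axis=2)
    d_desc = np.linalg.norm(desc2d[:, None, :] - desc3d[None, :, :], axis=2)
    cost = w_pos * d_pos + w_desc * d_desc
    return cost.argmin(axis=1)          # index of the matched 3D feature per 2D feature

# Toy usage with random features (positions in mm, 8-D descriptors).
p2, d2 = np.random.rand(50, 3) * 40, np.random.rand(50, 8)
p3, d3 = np.random.rand(400, 3) * 40, np.random.rand(400, 8)
matches = match_features(p2, d2, p3, d3)
print(matches.shape)                    # (50,)
```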


Subject(s)
Imaging, Three-Dimensional/methods; Prostate/diagnostic imaging; Prostate/pathology; Ultrasonography, Interventional/methods; Algorithms; Biopsy/methods; Fiducial Markers; Humans; Male
16.
Front Psychol ; 8: 1689, 2017.
Article in English | MEDLINE | ID: mdl-29062287

ABSTRACT

Manual gestures can facilitate problem solving as well as language or conceptual learning. Both seeing and making the gestures during learning seem to be beneficial. However, the stronger activation of the motor system in the second case should provide supplementary cues to consolidate and re-enact the mental traces created during learning. We tested this hypothesis in the context of anatomy learning by naïve adult participants. Anatomy is a challenging topic to learn and is of specific interest for research on embodied learning, as the learning content can be directly linked to learners' own bodies. Two groups of participants were asked to watch a video lecture on forearm anatomy. The video included a model making gestures related to the content of the lecture. Both groups saw the gestures, but only one also imitated the model. Tests of knowledge were run just after learning and a few days later. The results revealed that imitating gestures improves the recall of structure names and their localization on a diagram. This effect was, however, significant only in the long-term assessments. This suggests that: (1) the integration of motor actions and knowledge may require sleep; (2) a specific activation of the motor system during learning may improve the consolidation and/or the retrieval of memories.

17.
Annu Int Conf IEEE Eng Med Biol Soc ; 2016: 4109-4112, 2016 Aug.
Article in English | MEDLINE | ID: mdl-28269186

ABSTRACT

The aim of this paper is to describe a 3D-2D ultrasound feature-based registration method for navigated prostate biopsy and its first results obtained on patient data. A system combining a low-cost tracking system and a 3D-2D registration algorithm was designed. The proposed 3D-2D registration method combines geometric and image-based distances. After extracting features from ultrasound images, 3D and 2D features within a defined distance are matched using an intensity-based function. The results are encouraging and show acceptable errors with simulated transforms applied on ultrasound volumes from real patients.


Subject(s)
Algorithms; Image-Guided Biopsy/methods; Imaging, Three-Dimensional/methods; Prostate/pathology; Ultrasonography/methods; Feasibility Studies; Humans; Male; Prostate/diagnostic imaging
18.
IEEE Trans Biomed Eng ; 62(8): 2012-24, 2015 Aug.
Article in English | MEDLINE | ID: mdl-25769143

ABSTRACT

GOAL: In this paper, we address the development of an automatic approach for the computation of pose information (position + orientation) of prostate brachytherapy loose seeds from 3-D CT images. METHODS: From an initial detection of a set of seed candidates in CT images using a threshold and connected-component method, the orientation of each individual seed is estimated using principal component analysis. The main originality of this approach is the ability to classify the detected objects based on a priori intensity and volume information and to separate groups of closely spaced seeds using three competing clustering methods: the standard and a modified k-means method, and a Gaussian mixture model with an expectation-maximization algorithm. Experiments were carried out on a series of CT images of two phantoms and of patients. The fourteen patients correspond to a total of 1063 implanted seeds. Detections are compared to manual segmentation and to related work in terms of detection performance and calculation time. RESULTS: This automatic method has proved to be accurate and fast, including the ability to separate groups of seeds in a reliable way and to determine the orientation of each seed. SIGNIFICANCE: Such a method is mandatory for precisely computing the actual dose delivered to the patient postoperatively, instead of assuming that the seeds are aligned along the theoretical insertion direction of the brachytherapy needles.
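A minimal sketch of the detection front end described above — intensity thresholding, connected-component labelling, a priori volume filtering, and a PCA-based orientation per component. The threshold and size bounds are illustrative, not the paper's calibrated values; the clustering step that splits groups of touching seeds is only noted in a comment.

```python
import numpy as np
from scipy import ndimage

def detect_seed_candidates(ct, intensity_thr=2000, min_vox=3, max_vox=200):
    """Threshold + connected-component detection of seed candidates in a CT
    volume, with a PCA-based orientation estimate per component."""
    labels, n = ndimage.label(ct > intensity_thr)
    seeds = []
    for lab in range(1, n + 1):
        coords = np.argwhere(labels == lab).astype(float)    # voxel coordinates
        if not (min_vox <= len(coords) <= max_vox):
            continue                                          # reject by a priori volume
        centre = coords.mean(axis=0)
        cov = np.cov((coords - centre).T)
        evals, evecs = np.linalg.eigh(cov)
        direction = evecs[:, np.argmax(evals)]                # principal axis = seed axis
        seeds.append((centre, direction, len(coords)))
    return seeds

# Components outside the volume bounds are simply skipped here; in the paper,
# oversized components (groups of touching seeds) are instead split by k-means
# or a Gaussian mixture with expectation-maximization.
vol = np.zeros((40, 40, 40)); vol[10:14, 20, 20] = 3000       # one synthetic seed
for centre, direction, size in detect_seed_candidates(vol):
    print(np.round(centre, 1), np.round(direction, 2), size)
```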


Subject(s)
Brachytherapy/instrumentation; Brachytherapy/methods; Image Processing, Computer-Assisted/methods; Prostatic Neoplasms/radiotherapy; Radiotherapy, Image-Guided/methods; Tomography, X-Ray Computed/methods; Algorithms; Humans; Male; Phantoms, Imaging; Prostatic Neoplasms/diagnostic imaging; Ultrasonography
19.
Article in English | MEDLINE | ID: mdl-26736191

ABSTRACT

3D UltraSound (US) probes are used in clinical applications for their ease of use and ability to obtain intra-operative volumes. In surgical navigation applications, a calibration step is needed to localize the probe in a general coordinate system. This paper presents a new hand-eye calibration method that directly uses the kinematic model of a robot and US volume registration data, without requiring any 3D localizer. First results show a targeting error of 2.34 mm on an experimental setup using manual segmentation of five beads in ten US volumes.
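For context, a minimal sketch of the classic AX = XB hand-eye formulation this kind of calibration typically relies on, solved in two textbook steps (rotation via orthogonal Procrustes on rotation vectors, then translation via linear least squares). This is a generic approach under stated assumptions, not necessarily the paper's algorithm.

```python
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def hand_eye_calibration(A_list, B_list):
    """Solve A_i X = X B_i for the probe-to-robot transform X (4x4).

    A_i : relative robot motions (from the kinematic model)
    B_i : corresponding relative motions measured by US volume registration
    """
    # Rotation: rotation vectors satisfy a_i = R_X b_i  ->  orthogonal Procrustes.
    a = np.array([Rot.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    b = np.array([Rot.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])
    H = b.T @ a
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])
    R_X = Vt.T @ D @ U.T
    # Translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked least squares.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    v = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_X, *_ = np.linalg.lstsq(M, v, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X

# Synthetic check: recover a known X from a few simulated relative motions.
X_true = np.eye(4)
X_true[:3, :3] = Rot.from_euler("xyz", [20, -10, 35], degrees=True).as_matrix()
X_true[:3, 3] = [10.0, -5.0, 30.0]
Bs = []
for ang in ([30, 0, 0], [0, 40, 10], [15, 25, -20]):
    B = np.eye(4)
    B[:3, :3] = Rot.from_euler("xyz", ang, degrees=True).as_matrix()
    B[:3, 3] = np.random.randn(3)
    Bs.append(B)
As = [X_true @ B @ np.linalg.inv(X_true) for B in Bs]
print(np.abs(hand_eye_calibration(As, Bs) - X_true).max())   # ~0 (machine precision)
```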


Subject(s)
Imaging, Three-Dimensional/methods; Robotics; Ultrasonography/instrumentation; Biomechanical Phenomena; Calibration; Humans; Robotics/instrumentation; Robotics/methods; Robotics/standards
20.
Annu Int Conf IEEE Eng Med Biol Soc ; 2015: 1544-7, 2015 Aug.
Article in English | MEDLINE | ID: mdl-26736566

ABSTRACT

This paper demonstrates a new way to detect needles in 3D color-Doppler volumes of biological tissues. It uses the rotation capability of an existing robotic brachytherapy system to generate vibrations of the needle. The results of our detection for color-Doppler and B-mode ultrasound are compared to a needle location reference given by robot odometry and robot-ultrasound calibration. Average errors between detection and reference are 5.8 mm on the needle tip for B-mode images and 2.17 mm for color-Doppler images. These results show that color-Doppler imaging leads to more robust needle detection in noisy environments with poor needle visibility or when the needle interacts with other objects.
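A minimal sketch of one plausible detection scheme consistent with the abstract: threshold the Doppler power raised by the needle vibration, then fit the needle axis as the principal direction of the retained voxels and take the farthest voxel along it as the tip. The threshold and the sign convention for the tip are assumptions, not the paper's method.

```python
import numpy as np

def fit_needle_axis(doppler_power, power_thr=0.6):
    """Fit a needle axis to a 3D color-Doppler power volume: threshold the
    vibration signal, then take the principal axis of the retained voxels.
    Returns (centroid, unit direction, tip estimate)."""
    coords = np.argwhere(doppler_power > power_thr).astype(float)
    centroid = coords.mean(axis=0)
    _, _, Vt = np.linalg.svd(coords - centroid)
    direction = Vt[0]                           # dominant direction of the voxel cloud
    proj = (coords - centroid) @ direction
    tip = centroid + proj.max() * direction     # farthest voxel along the axis
    return centroid, direction, tip

# Toy volume: a synthetic vibrating needle along one axis.
vol = np.zeros((30, 30, 60))
vol[15, 15, 10:50] = 1.0
centroid, direction, tip = fit_needle_axis(vol)
print(np.round(direction, 2), np.round(tip, 1))  # sign depends on orientation convention
```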


Subject(s)
Ultrasonography, Doppler; Imaging, Three-Dimensional; Needles; Reproducibility of Results; Rotation