Results 1 - 8 of 8
1.
Ultrason Imaging ; 43(5): 262-272, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34180737

ABSTRACT

Needle visualization in the ultrasound image is essential for successfully performing ultrasound-guided core needle biopsy. Automatic needle detection can significantly reduce procedure time and the false-negative rate, and improve diagnosis. In this paper, we present a CNN-based, fully automatic method for detecting the core needle in 2D ultrasound images. The network is trained with the adaptive moment estimation (Adam) optimizer, and the Radon transform is then applied to locate the needle. The model was trained and tested on a total of 619 2D images from 91 cases of breast cancer. It achieved an average weighted intersection over union (weighted Jaccard index) of 0.986, an F1 score of 0.768, and an angle RMSE of 3.73°. These results exceed other solutions by at least 0.27 in F1 score and 7° in angle RMSE. Finally, the needle is detected in a single frame in 21.6 ms on average on a modern PC.
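
The abstract gives no implementation detail beyond the Radon step, but the line-localization idea is easy to illustrate: apply the Radon transform to a (here synthetic) binary needle mask and read the needle angle off the sinogram peak. A minimal Python sketch assuming scikit-image; the angle grid and toy mask are illustrative, not from the paper:

```python
import numpy as np
from skimage.transform import radon

def needle_angle_from_mask(mask: np.ndarray) -> float:
    """Estimate the needle angle (degrees) from a binary segmentation
    mask via the peak of its Radon transform."""
    angles = np.arange(0.0, 180.0, 0.5)  # projection angles to test
    sinogram = radon(mask.astype(float), theta=angles, circle=False)
    # A straight bright streak concentrates into a single high-valued
    # sinogram point; its column index gives the streak's angle.
    _, angle_idx = np.unravel_index(np.argmax(sinogram), sinogram.shape)
    return angles[angle_idx]

# Toy example: a synthetic diagonal "needle" in a 128 x 128 mask.
mask = np.zeros((128, 128))
rows = np.arange(20, 100)
mask[rows, rows // 2 + 10] = 1.0  # a straight streak
print(needle_angle_from_mask(mask))
```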


Subjects
Breast Neoplasms , Neural Networks, Computer , Breast Neoplasms/diagnostic imaging , Female , Humans , Image Processing, Computer-Assisted , Image-Guided Biopsy , Ultrasonography
2.
Phys Med Biol ; 69(11), 2024 May 21.
Article in English | MEDLINE | ID: mdl-38684166

ABSTRACT

Objective. Automated biopsy needle segmentation in 3D ultrasound images can be used for biopsy navigation, but it is quite challenging due to low ultrasound image resolution and interference whose appearance resembles the needle. Deep learning networks such as convolutional neural networks and transformers have been investigated for 3D medical image segmentation, but these methods require large amounts of labeled training data, have difficulty meeting real-time segmentation requirements, and involve high memory consumption. Approach. In this paper, we propose a temporal-information-based semi-supervised training framework for fast and accurate needle segmentation. First, a novel circle transformer module based on static and dynamic features is placed after the encoders to extract and fuse temporal information. Then, consistency constraints between the outputs before and after combining temporal information provide semi-supervision for unlabeled volumes. Finally, the model is trained with a loss function that combines cross-entropy and Dice similarity coefficient (DSC) segmentation losses with a mean-squared-error consistency loss. The trained model, taking a single ultrasound volume as input, is applied to segment the needle in that volume. Main results. Experimental results on three needle ultrasound datasets acquired during beagle biopsies show that our approach outperforms the most competitive mainstream temporal segmentation model and semi-supervised method, providing a higher DSC (77.1% versus 76.5%) and smaller needle tip position (1.28 mm versus 1.87 mm) and length (1.78 mm versus 2.19 mm) errors on the kidney dataset, as well as a higher DSC (78.5% versus 76.9%) and smaller tip position (0.86 mm versus 1.12 mm) and length (1.01 mm versus 1.26 mm) errors on the prostate dataset. Significance. The proposed method significantly enhances needle segmentation accuracy by training with sequential images at no additional cost, which may further improve the effectiveness of biopsy navigation systems.
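
The loss described in the Approach (cross-entropy plus Dice on labeled volumes, mean-squared-error consistency between the with- and without-temporal-information outputs on unlabeled volumes) can be sketched in PyTorch. The weighting `lambda_c` and all tensor shapes are assumptions for illustration; the abstract does not give the paper's actual weights or output formats:

```python
import torch
import torch.nn.functional as F

def dice_loss(probs, target, eps=1e-6):
    """Soft Dice loss on foreground probabilities."""
    inter = (probs * target).sum()
    return 1.0 - (2.0 * inter + eps) / (probs.sum() + target.sum() + eps)

def semi_supervised_loss(logits_labeled, labels,
                         logits_static, logits_temporal, lambda_c=0.1):
    """Supervised CE + Dice on labeled volumes plus an MSE consistency
    term between predictions made without (static) and with (temporal)
    temporal fusion on unlabeled volumes; lambda_c is assumed."""
    ce = F.cross_entropy(logits_labeled, labels)
    probs_fg = torch.softmax(logits_labeled, dim=1)[:, 1]
    dsc = dice_loss(probs_fg, (labels == 1).float())
    consistency = F.mse_loss(torch.softmax(logits_static, dim=1),
                             torch.softmax(logits_temporal, dim=1))
    return ce + dsc + lambda_c * consistency

# Toy shapes: batch of 2, two classes, 8 x 8 x 8 volumes.
logits_l = torch.randn(2, 2, 8, 8, 8)
labels = torch.randint(0, 2, (2, 8, 8, 8))
print(semi_supervised_loss(logits_l, labels,
                           torch.randn(2, 2, 8, 8, 8),
                           torch.randn(2, 2, 8, 8, 8)))
```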


Subjects
Imaging, Three-Dimensional , Ultrasonography , Imaging, Three-Dimensional/methods , Needles , Time Factors , Image Processing, Computer-Assisted/methods , Animals , Dogs , Humans , Supervised Machine Learning , Biopsy, Needle
3.
Int J Comput Assist Radiol Surg ; 17(2): 295-303, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34677747

ABSTRACT

PURPOSE: Robot-assisted needle insertion guided by 2D ultrasound (US) can effectively improve the accuracy and success rate of clinical puncture. To this end, automatic and accurate needle-tracking methods are important for monitoring the puncture process, preventing the needle from deviating from the intended path, and reducing the risk of injury to surrounding tissues. This work aims to develop a framework for automatic and accurate detection of an inserted needle in 2D US images during insertion. METHODS: We propose a novel convolutional neural network architecture comprising a two-channel encoder and a single-channel decoder that segments the needle using motion information extracted from two adjacent US image frames. Building on this network, we further propose an automatic needle detection framework: based on the prediction for the previous frame, a region of interest around the needle is extracted from the US image and fed into the network, achieving finer and faster continuous needle localization. RESULTS: The performance of our method was evaluated on 1000 pairs of US images extracted from robot-assisted needle insertions into freshly excised bovine and porcine tissues. The needle segmentation network achieved 99.7% accuracy, 86.2% precision, 89.1% recall, and an F1 score of 0.87. The needle detection framework successfully localized the needle with a mean tip error of 0.45 ± 0.33 mm and a mean orientation error of 0.42° ± 0.34°, at a total processing time of 50 ms per image. CONCLUSION: The proposed framework demonstrated robust, accurate, real-time needle localization during robot-assisted needle insertion. It has promising applications in tracking the needle and ensuring the safety of robot-assisted automatic puncture during challenging US-guided minimally invasive procedures.
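
The region-of-interest step in METHODS (crop the current frame pair around the previous frame's prediction) might look like the NumPy sketch below; the margin size and full-image fallback are assumptions, not details from the paper:

```python
import numpy as np

def needle_roi(prev_mask: np.ndarray, margin: int = 32):
    """Bounding box of the previous frame's needle prediction, expanded
    by a margin and clipped to the image bounds; returned as slices for
    cropping the next frame pair (margin value is an assumption)."""
    ys, xs = np.nonzero(prev_mask)
    if ys.size == 0:  # no needle found: fall back to the full image
        return slice(0, prev_mask.shape[0]), slice(0, prev_mask.shape[1])
    y0 = max(int(ys.min()) - margin, 0)
    y1 = min(int(ys.max()) + margin + 1, prev_mask.shape[0])
    x0 = max(int(xs.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin + 1, prev_mask.shape[1])
    return slice(y0, y1), slice(x0, x1)

# Usage: crop both adjacent frames to the same ROI before feeding
# them to the two-channel encoder.
prev_mask = np.zeros((480, 640), dtype=np.uint8)
prev_mask[200:210, 100:400] = 1
roi = needle_roi(prev_mask)
frame = np.random.rand(480, 640)
print(frame[roi].shape)
```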


Subjects
Robotics , Surgery, Computer-Assisted , Animals , Cattle , Needles , Neural Networks, Computer , Swine , Ultrasonography
4.
Med Phys ; 47(10): 4956-4970, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32767411

ABSTRACT

PURPOSE: Many interventional procedures require the precise placement of needles or therapy applicators (tools) to correctly achieve planned targets for optimal diagnosis or treatment of cancer, typically leveraging the temporal resolution of ultrasound (US) to provide real-time feedback. Identifying tools in two-dimensional (2D) images can be time-consuming, and their precise positions are often difficult to distinguish. We have developed and implemented a deep learning method to segment tools in 2D US images in near real-time for multiple anatomical sites, despite the widely varying appearances across interventional applications. METHODS: A U-Net architecture with a Dice similarity coefficient (DSC) loss function was used to perform segmentation on input images resized to 256 × 256 pixels. The U-Net was modified by adding 50% dropout and using transpose convolutions in the decoder section of the network. The proposed approach was trained with 917 images and manual segmentations from prostate/gynecologic brachytherapy, liver ablation, and kidney biopsy/ablation procedures, as well as phantom experiments. Real-time data augmentation was applied to improve generalizability and doubled the dataset for each epoch. Postprocessing to identify the tool tip and trajectory was performed using two different approaches, comparing a linear fit to the largest island with random sample consensus (RANSAC) fitting. RESULTS: Comparing predictions from 315 unseen test images to manual segmentations, the overall median [first quartile, third quartile] tip error, angular error, and DSC were 3.5 [1.3, 13.5] mm, 0.8 [0.3, 1.7]°, and 73.3 [56.2, 82.3]%, respectively, following RANSAC postprocessing. The predictions with the lowest median tip and angular errors were observed in the gynecologic images (median tip error: 0.3 mm; median angular error: 0.4°), with the highest errors in the kidney images (median tip error: 10.1 mm; median angular error: 2.9°). The poorer performance on the kidney images was likely due to a reduction in acoustic signal associated with oblique insertions relative to the US probe and the increased number of anatomical interfaces with similar echogenicity. Unprocessed segmentations were performed in approximately 50 ms per image on average. CONCLUSIONS: We have demonstrated that our proposed approach can accurately segment tools in near real-time in 2D US images from multiple anatomical locations and a variety of clinical interventional procedures, providing the potential to improve image guidance during a broad range of diagnostic and therapeutic cancer interventions.
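
The RANSAC postprocessing step, fitting a line to the predicted tool pixels and reading off a tip estimate, can be sketched with scikit-image's ransac and LineModelND. The residual threshold, trial count, and tip heuristic below are assumptions for illustration, not the paper's settings:

```python
import numpy as np
from skimage.measure import LineModelND, ransac

def tool_trajectory(mask: np.ndarray):
    """RANSAC line fit to predicted tool pixels; the tip is taken as
    the inlier farthest along the line direction (a heuristic)."""
    ys, xs = np.nonzero(mask)
    points = np.column_stack([xs, ys]).astype(float)
    model, inliers = ransac(points, LineModelND, min_samples=2,
                            residual_threshold=2.0,  # pixels; assumed
                            max_trials=500)
    origin, direction = model.params
    proj = (points[inliers] - origin) @ direction
    tip = points[inliers][np.argmax(proj)]
    return origin, direction, tip

# Synthetic mask containing a straight tool-like streak.
mask = np.zeros((256, 256), dtype=np.uint8)
for t in range(40, 200):
    mask[t // 2 + 30, t] = 1
origin, direction, tip = tool_trajectory(mask)
print(direction, tip)
```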


Subjects
Deep Learning , Female , Liver/diagnostic imaging , Male , Needles , Phantoms, Imaging , Ultrasonography
5.
Brachytherapy ; 19(5): 659-668, 2020.
Article in English | MEDLINE | ID: mdl-32631651

ABSTRACT

PURPOSE: The purpose of this study was to evaluate a semiautomatic algorithm that simultaneously segments multiple high-dose-rate (HDR) gynecologic interstitial brachytherapy (ISBT) needles in three-dimensional (3D) transvaginal ultrasound (TVUS) images, with the aim of providing a clinically useful tool for intraoperative implant assessment. METHODS AND MATERIALS: A needle segmentation algorithm previously developed for HDR prostate brachytherapy was adapted and extended to 3D TVUS images from gynecologic ISBT patients with vaginal tumors. Two patients were used for refining and validating the modified algorithm, and five patients (8-12 needles/patient) were reserved as an unseen test data set. The images were filtered to enhance needle edges, intensity peaks were used to generate feature points, and the randomized 3D Hough transform was leveraged to identify candidate needle trajectories. Algorithmic segmentations were compared against manual segmentations, and the calculated dwell positions were evaluated. RESULTS: All 50 test data set needles were successfully segmented; 96% of algorithmically segmented needles had angular differences <3° compared with manually segmented needles, and the maximum Euclidean distance was <2.1 mm. The median distance between corresponding dwell positions was 0.77 mm, with 86% of needles having maximum differences <3 mm. The mean segmentation time was <30 s/patient. CONCLUSIONS: We successfully segmented multiple needles simultaneously in intraoperative 3D TVUS images from gynecologic HDR-ISBT patients with vaginal tumors and demonstrated the robustness of the algorithmic approach to image artifacts. The method provided accurate segmentations within a clinically efficient timeframe and has the potential to be translated into intraoperative clinical use for implant assessment.
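
The randomized 3D Hough step the algorithm leverages can be illustrated as repeated two-point line sampling with a support count over the feature points; the trial count, distance tolerance, and support threshold below are illustrative assumptions:

```python
import numpy as np

def randomized_hough_lines(points, n_trials=2000, dist_tol=1.5, min_support=30):
    """Randomized 3D Hough-style search: sample two feature points,
    score the line they define by how many points lie within dist_tol
    of it, and keep well-supported candidate trajectories."""
    rng = np.random.default_rng(0)
    candidates = []
    for _ in range(n_trials):
        i, j = rng.choice(len(points), size=2, replace=False)
        d = points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm < 1e-6:
            continue
        d /= norm
        v = points - points[i]
        # Orthogonal distance of every point to the sampled line.
        dist = np.linalg.norm(v - np.outer(v @ d, d), axis=1)
        support = int((dist < dist_tol).sum())
        if support >= min_support:
            candidates.append((support, points[i], d))
    return sorted(candidates, key=lambda c: c[0], reverse=True)

# Toy cloud: one straight "needle" plus uniform background clutter.
t = np.linspace(0, 50, 60)
needle = np.column_stack([t, 0.5 * t + 3, 0.2 * t + 8])
clutter = np.random.default_rng(1).uniform(0, 60, size=(200, 3))
cands = randomized_hough_lines(np.vstack([needle, clutter]))
print(len(cands), cands[0][0])
```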


Subjects
Adenocarcinoma, Clear Cell/radiotherapy , Brachytherapy/methods , Carcinoma, Endometrioid/radiotherapy , Carcinoma, Squamous Cell/radiotherapy , Vaginal Neoplasms/radiotherapy , Adenocarcinoma, Clear Cell/secondary , Aged , Aged, 80 and over , Algorithms , Brachytherapy/instrumentation , Carcinoma, Endometrioid/secondary , Carcinoma, Squamous Cell/pathology , Carcinoma, Squamous Cell/secondary , Endometrial Neoplasms/pathology , Female , Humans , Image Processing, Computer-Assisted , Imaging, Three-Dimensional/methods , Male , Middle Aged , Needles , Ovarian Neoplasms/pathology , Prostate/diagnostic imaging , Radiotherapy Planning, Computer-Assisted , Ultrasonography/methods , Vaginal Neoplasms/pathology , Vaginal Neoplasms/secondary
6.
Med Image Anal ; 53: 104-110, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30763829

ABSTRACT

2D ultrasound (US) image guidance is used in minimally invasive liver procedures to visualize the target and the needle. Inserting the needle while keeping the 2D US transducer positioned so that both the needle and the target remain in view is challenging. Dedicated needle holders attached to the US transducer help target in plane and at a specific angle. A drawback is that the probe is then fixed to the needle and cannot be rotated to assess the needle's position in a perpendicular plane. In this study, we propose an automatic needle detection and tracking method using 3D US imaging to improve image guidance and visualization of the target in the liver relative to the needle during these interventional procedures. The method uses a convolutional neural network to detect the needle in 3D US images. In a subsequent step, the network output is used to detect needle candidates, which are fed into a final tracking step that determines the true needle position. The needle position is then used to present two perpendicular cross-sectional planes of the 3D US image, each containing the needle. Performance was evaluated on phantom and in vivo data by calculating the position distance and orientation angle between segmented needles and reference ground-truth needles manually annotated by an observer. The method successfully detects the needle position and orientation with mean errors of 1 mm and 2°, respectively, and yields robust automatic needle detection and visualization at a frame rate of 3 Hz in 3D ultrasound imaging of the liver.
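
The final display step, two perpendicular cross-sectional planes that both contain the detected needle, can be sketched by resampling the volume along the needle axis and two mutually orthogonal in-plane directions. A sketch using SciPy; the plane size, spacing, and interpolation order are assumptions:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def needle_planes(volume, tip, direction, size=64, spacing=1.0):
    """Resample two perpendicular image planes that both contain the
    needle axis, for side-by-side display."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    # Two unit vectors orthogonal to the needle and to each other.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(d @ helper) > 0.9:  # avoid a helper parallel to the needle
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    grid = (np.arange(size) - size // 2) * spacing
    a, b = np.meshgrid(grid, grid, indexing="ij")
    planes = []
    for normal in (u, v):  # plane 1 has normal u, plane 2 has normal v
        in_plane = np.cross(d, normal)  # spans the plane together with d
        coords = (np.asarray(tip, float)[:, None, None]
                  + d[:, None, None] * a + in_plane[:, None, None] * b)
        planes.append(map_coordinates(volume, coords, order=1))
    return planes  # two (size x size) images, each containing the needle

vol = np.random.rand(80, 80, 80)
p1, p2 = needle_planes(vol, tip=(40, 40, 40), direction=(1, 1, 0))
print(p1.shape, p2.shape)
```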


Subjects
Imaging, Three-Dimensional/methods , Liver/diagnostic imaging , Needles , Ultrasonography, Interventional/methods , Humans , Image-Guided Biopsy , Neural Networks, Computer , Phantoms, Imaging , Transducers
7.
Med Phys ; 44(4): 1234-1245, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28160517

ABSTRACT

PURPOSE: Sagittally reconstructed 3D (SR3D) ultrasound imaging shows promise for improved needle localization for high-dose-rate prostate brachytherapy (HDR-BT); however, needles must be manually segmented intraoperatively while the patient is anesthetized to create a treatment plan. The purpose of this article was to describe and validate an automatic needle segmentation algorithm designed for HDR-BT, specifically capable of simultaneously segmenting all needles in an HDR-BT implant using a single SR3D image with ~5 mm interneedle spacing. MATERIALS AND METHODS: The segmentation algorithm involves regularized feature point classification and line trajectory identification based on the randomized 3D Hough transform modified to handle multiple straight needles in a single image simultaneously. Needle tips are identified based on peaks in the derivative of the signal intensity profile along the needle trajectory. For algorithm validation, 12 prostate cancer patients underwent HDR-BT during which SR3D images were acquired with all needles in place. Needles present in each of the 12 images were segmented manually, providing a gold standard for comparison, and using the algorithm. Tip errors were assessed in terms of the 3D Euclidean distance between needle tips, and trajectory error was assessed in terms of 2D distance in the axial plane and angular deviation between trajectories. RESULTS: In total, 190 needles were investigated. Mean execution time of the algorithm was 11.0 s per patient, or 0.7 s per needle. The algorithm identified 82% and 85% of needle tips with 3D errors ≤3 mm and ≤5 mm, respectively, 91% of needle trajectories with 2D errors in the axial plane ≤3 mm, and 83% of needle trajectories with angular errors ≤3°. The largest tip error component was in the needle insertion direction. CONCLUSIONS: Previous work has indicated HDR-BT needles may be manually segmented using SR3D images with insertion depth errors ≤3 mm and ≤5 mm for 83% and 92% of needles, respectively. The algorithm shows promise for reducing the time required for the segmentation of straight HDR-BT needles, and future work involves improving needle tip localization performance through improved image quality and modeling curvilinear trajectories.
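
The tip-localization idea, peaks in the derivative of the signal intensity profile along the needle trajectory, is straightforward to sketch: sample the volume along the trajectory, smooth the profile, differentiate, and take the sharpest intensity drop. The sampling step and smoothing below are assumptions, not the paper's values:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d, map_coordinates

def needle_tip(volume, entry, direction, max_depth=100.0, step=0.5):
    """Locate the needle tip along a known trajectory as the strongest
    drop in the smoothed intensity profile."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    depths = np.arange(0.0, max_depth, step)
    coords = np.asarray(entry, float)[:, None] + d[:, None] * depths
    profile = map_coordinates(volume, coords, order=1)
    profile = gaussian_filter1d(profile, sigma=2.0)  # suppress speckle
    grad = np.gradient(profile, step)
    tip_depth = depths[np.argmin(grad)]  # most negative slope = sharpest drop
    return np.asarray(entry, float) + d * tip_depth

# Toy volume: a bright needle entering at (10, 40, 10) along +z,
# ending at depth 50 (so the tip is near (10, 40, 60)).
vol = np.zeros((80, 80, 80))
vol[10, 40, 10:60] = 1.0
print(needle_tip(vol, entry=(10, 40, 10), direction=(0, 0, 1)))
```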


Subjects
Brachytherapy/instrumentation , Imaging, Three-Dimensional/methods , Needles , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/radiotherapy , Radiation Dosage , Algorithms , Artifacts , Automation , Humans , Male , Radiotherapy Dosage , Time Factors , Ultrasonography
8.
Ultrasound Med Biol ; 40(4): 804-16, 2014 Apr.
Article in English | MEDLINE | ID: mdl-24462163

ABSTRACT

We propose a robust and efficient needle segmentation method to localize and track the needle in 3-D trans-rectal ultrasound (TRUS)-guided prostate therapy. The algorithm begins by cropping the 3-D US image containing the needle; all voxels in the cropped 3-D image are then grouped into line support regions (LSRs) based on the outer product of adjacent voxels' gradient vectors. Two methods for extracting the needle axis within a candidate LSR are presented: least-squares fitting and the 3-D randomized Hough transform. Subsequent local optimization refines the position of the needle axis. Finally, the needle endpoint is localized by finding an intensity drop along the needle axis. The proposed methods were validated on 3-D TRUS images of tissue-mimicking agar phantoms and chicken-breast phantoms, and on patient images obtained during prostate cryotherapy. The in vivo results indicate that our method localizes the needle accurately and robustly, with a needle endpoint localization error <1.43 mm and a detection accuracy >84%, which is favorable for 3-D TRUS-guided trans-perineal prostate therapy.
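
The least-squares axis-extraction variant can be sketched as a principal-direction fit to the voxels of a candidate line support region: the axis passes through the centroid along the first principal component. The toy data are illustrative:

```python
import numpy as np

def least_squares_axis(voxels: np.ndarray):
    """Least-squares line fit to the voxel coordinates of a candidate
    line support region. The top right-singular vector of the centered
    points minimizes the summed squared orthogonal distances."""
    centroid = voxels.mean(axis=0)
    _, _, vt = np.linalg.svd(voxels - centroid, full_matrices=False)
    return centroid, vt[0]  # a point on the axis and its direction

# Toy LSR: noisy voxel coordinates along a straight needle.
rng = np.random.default_rng(0)
t = np.linspace(0, 40, 120)
pts = np.column_stack([t, 0.3 * t, 0.1 * t]) + rng.normal(0, 0.2, (120, 3))
centroid, axis = least_squares_axis(pts)
print(centroid.round(2), axis.round(3))
```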


Subjects
Brachytherapy/methods , Endoscopic Ultrasound-Guided Fine Needle Aspiration/methods , Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Prostate/diagnostic imaging , Surgery, Computer-Assisted/methods , Ultrasonography, Interventional/methods , Brachytherapy/instrumentation , Humans , Male , Needles , Pattern Recognition, Automated/methods , Prostate/pathology , Reproducibility of Results , Sensitivity and Specificity