ABSTRACT
Objective. Automated biopsy needle segmentation in 3D ultrasound images can be used for biopsy navigation, but it is quite challenging due to the low resolution of ultrasound images and interference with an appearance similar to the needle's. Deep learning networks such as convolutional neural networks and transformers have been investigated for 3D medical image segmentation. However, these segmentation methods require large amounts of labeled data for training, have difficulty meeting real-time segmentation requirements, and involve high memory consumption. Approach. In this paper, we propose a temporal information-based semi-supervised training framework for fast and accurate needle segmentation. First, a novel circle transformer module based on static and dynamic features is placed after the encoders to extract and fuse temporal information. Then, consistency constraints between the outputs before and after combining temporal information are proposed to provide semi-supervision for unlabeled volumes. Finally, the model is trained with a loss function that combines a cross-entropy and Dice similarity coefficient (DSC) based segmentation loss with a mean-square-error based consistency loss.
The trained model, taking a single ultrasound volume as input, is applied to segment the needle in the volume. Main results. Experimental results on three needle ultrasound datasets acquired during beagle biopsy show that our approach outperforms the most competitive mainstream temporal segmentation model and semi-supervised method, providing a higher DSC (77.1% versus 76.5%) and smaller needle tip position (1.28 mm versus 1.87 mm) and length (1.78 mm versus 2.19 mm) errors on the kidney dataset, as well as a higher DSC (78.5% versus 76.9%) and smaller needle tip position (0.86 mm versus 1.12 mm) and length (1.01 mm versus 1.26 mm) errors on the prostate dataset. Significance. The proposed method can significantly enhance needle segmentation accuracy by training with sequential images at no additional cost. This enhancement may further improve the effectiveness of biopsy navigation systems.
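The combined training objective described above (cross-entropy plus Dice-based segmentation loss for labeled volumes, mean-square-error consistency between the outputs before and after fusing temporal information for all volumes) can be sketched as follows. This is a minimal NumPy sketch, not the paper's implementation: the function names, the weighting factor `lam`, and the exact way the terms are summed are illustrative assumptions.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    # Soft Dice loss between predicted probabilities and a binary mask.
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def cross_entropy_loss(pred, target, eps=1e-7):
    # Binary cross-entropy averaged over voxels.
    p = np.clip(pred, eps, 1.0 - eps)
    return float(np.mean(-(target * np.log(p) + (1 - target) * np.log(1 - p))))

def consistency_loss(pred_a, pred_b):
    # MSE between two predictions of the same volume (before/after
    # combining temporal information) -- the semi-supervised signal.
    return float(np.mean((pred_a - pred_b) ** 2))

def total_loss(pred, pred_temporal, target=None, lam=1.0):
    # Consistency term applies to every volume; the supervised
    # CE + Dice term applies only when a label is available.
    loss = lam * consistency_loss(pred, pred_temporal)
    if target is not None:
        loss += cross_entropy_loss(pred, target) + dice_loss(pred, target)
    return loss
```

For an unlabeled volume, only the consistency term contributes, which is what lets unannotated sequential frames participate in training.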
Subjects
Imaging, Three-Dimensional , Ultrasonography , Imaging, Three-Dimensional/methods , Needles , Time Factors , Image Processing, Computer-Assisted/methods , Animals , Dogs , Humans , Supervised Machine Learning , Biopsy, Needle
ABSTRACT
BACKGROUND AND OBJECTIVE: Automated follicle detection in ovarian ultrasound volumes remains a challenging task. An objective comparison of different follicle-detection approaches is only possible when all are tested on the same data. This paper describes the development and structure of USOVA3D, the first publicly accessible database of annotated ultrasound volumes with ovarian follicles. METHODS: The ovary and all follicles were annotated in each volume by two medical experts. The USOVA3D database is supplemented by a general verification protocol for unbiased assessment of detection algorithms, which can be compared and ranked by scoring according to this protocol. This paper also introduces two baseline automated follicle-detection algorithms, the first based on the Directional 3D Wavelet Transform (3D DWT) and the second based on Convolutional Neural Networks (CNN). RESULTS: The USOVA3D testing data set was used to verify the variability and reliability of the follicle annotations. The intra-rater overall score was around 83 (out of a maximum of 100), while both baseline algorithms showed only slightly lower performance, with the 3D DWT-based algorithm performing better, at an overall score of around 78. CONCLUSIONS: The development of the CNN-based algorithm demonstrated that the USOVA3D database contains sufficient data for successful training without overfitting. The inter-rater reliability analysis and the statistical effectiveness metrics obtained for both baseline algorithms confirm that the USOVA3D database is a reliable source for developing new automated detection methods.
Subjects
Ovarian Follicle , Ovary , Algorithms , Female , Ovarian Follicle/diagnostic imaging , Ovary/diagnostic imaging , Reproducibility of Results , Ultrasonography
ABSTRACT
In this paper, we present a real-time approach for tracking deformable structures in 3D ultrasound sequences. Our method obtains the target displacements by combining robust dense motion estimation with mechanical model simulation. We evaluate our method on simulated data, phantom data, and real data. The results demonstrate that this novel approach provides correct motion estimation despite various ultrasound shortcomings, including speckle noise, large shadows, and ultrasound gain variation. Furthermore, we show that our method performs well relative to state-of-the-art techniques when tested on the 3D databases provided by the MICCAI CLUST'14 and CLUST'15 challenges.
Subjects
Imaging, Three-Dimensional/methods , Ultrasonography/methods , Algorithms , Computer Simulation , Databases, Factual , Phantoms, Imaging , Reproducibility of Results
ABSTRACT
A robust and efficient needle segmentation method is proposed to localize and track the needle in 3-D trans-rectal ultrasound (TRUS)-guided prostate therapy. The algorithmic procedure begins by cropping the 3-D US image containing a needle; all voxels in the cropped 3-D image are then grouped into different line support regions (LSRs) based on the outer product of adjacent voxels' gradient vectors. Two methods for extracting the needle axis in the candidate LSR are presented: least-squares fitting and the 3-D randomized Hough transform. Subsequent local optimization refines the position of the needle axis. Finally, the needle endpoint is localized by finding an intensity drop along the needle axis. The proposed methods were validated on 3-D TRUS tissue-mimicking agar phantom images, chicken breast phantom images, and patient images obtained during prostate cryotherapy. The results of the in vivo test indicate that our method localizes the needle accurately and robustly, with a needle endpoint localization error <1.43 mm and a detection accuracy >84%, which is favorable for 3-D TRUS-guided prostate trans-perineal therapy.
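The final step of the method above, localizing the needle endpoint by finding an intensity drop along the fitted needle axis, can be sketched as below. This is a hedged illustration only: the smoothing window, the running-peak rule, and the `drop_ratio` threshold are assumptions for the sketch, not parameters reported in the paper.

```python
import numpy as np

def find_needle_tip(intensity_profile, drop_ratio=0.5, window=3):
    # intensity_profile: voxel intensities sampled along the fitted
    # needle axis, ordered from the entry point toward the tip.
    # The tip index is taken where the smoothed intensity first falls
    # below drop_ratio times the running peak intensity seen so far.
    kernel = np.ones(window) / window
    smoothed = np.convolve(intensity_profile, kernel, mode="same")
    peak = smoothed[0]
    for i, v in enumerate(smoothed):
        peak = max(peak, v)
        if v < drop_ratio * peak:
            return i  # first sample past the intensity drop
    return len(smoothed) - 1  # no drop found: tip at profile end
```

For a bright needle (high intensity along the shaft) ending in darker tissue, the first sample whose smoothed intensity falls below half the running peak marks the candidate tip.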