Results 1-3 of 3
1.
Phys Med Biol. 2024 May 21;69(11).
Article in English | MEDLINE | ID: mdl-38684166

ABSTRACT

Objective. Automated biopsy needle segmentation in 3D ultrasound images can support biopsy navigation, but it is challenging because of the low resolution of ultrasound images and the presence of needle-like interference. Deep learning networks such as convolutional neural networks and transformers have been investigated for 3D medical image segmentation, but these methods require large amounts of labeled training data, have difficulty meeting real-time segmentation requirements, and consume considerable memory.

Approach. We propose a temporal-information-based semi-supervised training framework for fast and accurate needle segmentation. First, a novel circle transformer module based on static and dynamic features is placed after the encoders to extract and fuse temporal information. Then, consistency constraints between the outputs before and after incorporating temporal information provide semi-supervision for unlabeled volumes. Finally, the model is trained with a loss function that combines a cross-entropy and Dice similarity coefficient (DSC) based segmentation loss with a mean-squared-error-based consistency loss. At inference, the trained model takes a single ultrasound volume as input to segment the needle.

Main results. Experiments on three needle ultrasound datasets acquired during beagle biopsies show that our approach outperforms the most competitive mainstream temporal segmentation model and semi-supervised method: it achieves a higher DSC (77.1% versus 76.5%) and smaller needle tip position (1.28 mm versus 1.87 mm) and length (1.78 mm versus 2.19 mm) errors on the kidney dataset, and a higher DSC (78.5% versus 76.9%) with smaller tip position (0.86 mm versus 1.12 mm) and length (1.01 mm versus 1.26 mm) errors on the prostate dataset.

Significance. The proposed method significantly enhances needle segmentation accuracy by training with sequential images at no additional cost, which may further improve the effectiveness of biopsy navigation systems.
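For illustration, the following is a minimal sketch, assuming a PyTorch binary-segmentation setup, of how such a combined objective can be wired together. The function names, the sigmoid/soft-Dice formulation, and the consistency weight are assumptions for the sketch, not the authors' implementation.

```python
# Hedged sketch: cross-entropy + Dice segmentation loss on labeled volumes,
# plus an MSE consistency loss between predictions made before and after
# fusing temporal information (the semi-supervised signal for unlabeled data).
import torch
import torch.nn.functional as F

def dice_loss(logits, target, eps=1e-6):
    """Soft Dice loss on per-voxel foreground probabilities."""
    pred = torch.sigmoid(logits)
    inter = (pred * target).sum()
    return 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def combined_loss(logits_labeled, labels,
                  logits_plain, logits_temporal,
                  lambda_consistency=0.1):  # weight is an illustrative choice
    # Supervised part: cross-entropy + Dice on the labeled volume.
    seg = F.binary_cross_entropy_with_logits(logits_labeled, labels.float()) \
          + dice_loss(logits_labeled, labels)
    # Semi-supervised part: outputs with and without temporal information
    # should agree on unlabeled volumes.
    cons = F.mse_loss(torch.sigmoid(logits_plain),
                      torch.sigmoid(logits_temporal))
    return seg + lambda_consistency * cons
```

Weighting the consistency term lets unlabeled volumes contribute a training signal without overwhelming the supervised segmentation loss.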


Subjects
Imaging, Three-Dimensional; Ultrasonography; Imaging, Three-Dimensional/methods; Needles; Time Factors; Image Processing, Computer-Assisted/methods; Animals; Dogs; Humans; Supervised Machine Learning; Biopsy, Needle
2.
Med Image Anal. 2024 May;94:103130.
Article in English | MEDLINE | ID: mdl-38437787

ABSTRACT

Robot-assisted prostate biopsy is an emerging technology for diagnosing prostate cancer, but its safety is limited by the robot's inability to sense the tool-tissue interaction force accurately during biopsy. Vision-based force sensing (VFS) offers a potential solution by inferring the interaction force from image sequences. However, existing mainstream VFS methods cannot sense force accurately because they rely on convolutional or recurrent neural networks to learn deformation from optical images, and some are inefficient, especially when recurrent convolutional operations are involved. This paper presents a Transformer-based VFS (TransVFS) method that leverages ultrasound volume sequences acquired during prostate biopsy. TransVFS uses a spatio-temporal local-global Transformer to capture local image details and global dependencies simultaneously, learning prostate deformations for force estimation. Distinctively, our method exploits both spatial and temporal attention mechanisms for image feature learning, thereby mitigating the effects of low ultrasound image resolution and unclear prostate boundaries on force estimation accuracy. Meanwhile, two efficient local-global attention modules reduce the 4D spatio-temporal computation burden through a factorized spatio-temporal processing strategy, enabling fast force estimation. Experiments on a prostate phantom and on beagle dogs show that our method significantly outperforms existing VFS methods and other spatio-temporal Transformer models. TransVFS surpasses the most competitive baseline, ResNet3dGRU, with a mean absolute force estimation error of 70.4 ± 60.0 millinewtons (mN) versus 123.7 ± 95.6 mN on the transabdominal ultrasound dataset of the dogs.
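As a rough illustration of the factorization idea in general (not the TransVFS local-global modules themselves), the sketch below, assuming a PyTorch token representation of each ultrasound volume, attends over space within each frame and then over time at each spatial location. Full attention over t·n tokens costs O((t·n)²); the factorized form costs O(t·n² + n·t²).

```python
# Hedged sketch of factorized spatio-temporal attention; module and
# dimension names are assumptions, not the paper's architecture.
import torch
import torch.nn as nn

class FactorizedSpatioTemporalAttention(nn.Module):
    def __init__(self, dim, heads=4):
        super().__init__()
        self.spatial = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.temporal = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        # x: (batch, time, tokens, dim) -- tokens are flattened voxel patches
        b, t, n, d = x.shape
        # Spatial attention within each frame: fold time into the batch.
        xs = x.reshape(b * t, n, d)
        xs, _ = self.spatial(xs, xs, xs)
        x = xs.reshape(b, t, n, d)
        # Temporal attention at each location: fold tokens into the batch.
        xt = x.permute(0, 2, 1, 3).reshape(b * n, t, d)
        xt, _ = self.temporal(xt, xt, xt)
        return xt.reshape(b, n, t, d).permute(0, 2, 1, 3)
```

Splitting attention this way keeps memory and compute tractable for 4D ultrasound sequences while still letting every voxel attend across the whole sequence indirectly.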


Subjects
Prostate; Prostatic Neoplasms; Male; Humans; Animals; Dogs; Prostate/diagnostic imaging; Prostatic Neoplasms/diagnostic imaging; Biopsy; Learning; Ultrasonography, Interventional; Image Processing, Computer-Assisted
3.
Int J Med Robot. 2023 Nov 20:e2597.
Article in English | MEDLINE | ID: mdl-37984069

ABSTRACT

BACKGROUND: Robotic systems are increasingly used to enhance clinical outcomes in prostate intervention. To evaluate the clinical value of the proposed portable robot, robot-assisted and robot-targeted punctures were validated experimentally.

METHODS: Robot registration using an electromagnetic tracker establishes the coordinate transformation from the ultrasound (US) image to the robot. Transrectal ultrasound (TRUS)-guided phantom trials were then conducted for robot-assisted, free-hand, and robot-targeted punctures.

RESULTS: The accuracy of robot registration was 0.95 mm, and the accuracies of robot-assisted, free-hand, and robot-targeted punctures were 2.38 ± 0.64 mm, 3.11 ± 0.72 mm, and 3.29 ± 0.83 mm, respectively.

CONCLUSION: The registration method was successfully applied to robot-targeted puncture. Current results indicate that robot-targeted puncture is slightly less accurate than manual operation, whereas robot assistance improves the accuracy of free-hand puncture. Accuracy better than 3.5 mm demonstrates the clinical applicability of both robot-assisted and robot-targeted punctures.
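As a minimal sketch of what such a registration chain yields, the composition below maps a biopsy target from US image coordinates through the electromagnetic (EM) tracker frame into the robot frame by chaining 4x4 homogeneous transforms. The identity rotations and translation values are purely illustrative assumptions, not the paper's calibration results.

```python
# Hedged sketch: image -> EM sensor on the probe -> tracker -> robot.
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t (mm)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed transforms obtained from calibration and registration:
T_sensor_image = homogeneous(np.eye(3), [5.0, -2.0, 10.0])     # US calibration
T_tracker_sensor = homogeneous(np.eye(3), [100.0, 0.0, 50.0])  # tracked probe pose
T_robot_tracker = homogeneous(np.eye(3), [-30.0, 80.0, 0.0])   # robot registration

def image_to_robot(p_image_mm):
    """Map a target point (mm) from US image coordinates to robot coordinates."""
    p = np.append(p_image_mm, 1.0)  # homogeneous coordinates
    return (T_robot_tracker @ T_tracker_sensor @ T_sensor_image @ p)[:3]

print(image_to_robot(np.array([12.0, 34.0, 7.0])))
```

Any error in the intermediate transforms propagates directly to the needle target, which is why the 0.95 mm registration accuracy bounds the achievable puncture accuracy.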
