Ultrasound-guided needle tracking with deep learning: A novel approach with photoacoustic ground truth.
Hui, Xie; Rajendran, Praveenbalaji; Ling, Tong; Dai, Xianjin; Xing, Lei; Pramanik, Manojit.
Affiliation
  • Hui X; School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore.
  • Rajendran P; Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States.
  • Ling T; School of Chemistry, Chemical Engineering and Biotechnology, Nanyang Technological University, Singapore 637459, Singapore.
  • Dai X; School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore 637459, Singapore.
  • Xing L; Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States.
  • Pramanik M; Stanford University, Department of Radiation Oncology, Stanford, California 94305, United States.
Photoacoustics ; 34: 100575, 2023 Dec.
Article in En | MEDLINE | ID: mdl-38174105
ABSTRACT
Accurate needle guidance is crucial for safe and effective clinical diagnosis and treatment procedures. Conventional ultrasound (US)-guided needle insertion often encounters challenges in consistently and precisely visualizing the needle, necessitating the development of reliable methods to track it. As a powerful tool in image processing, deep learning has shown promise for enhancing needle visibility in US images, although its dependence on manual annotation or simulated data as ground truth can introduce bias or limit generalization to real US images. Photoacoustic (PA) imaging has demonstrated its capability for high-contrast needle visualization. In this study, we explore the potential of PA imaging as a reliable ground truth for deep learning network training without the need for expert annotation. Our network (UIU-Net), trained on ex vivo tissue image datasets, has shown remarkable precision in localizing needles within US images. The evaluation of needle segmentation performance extends across previously unseen ex vivo data and in vivo human data (collected from an open-source data repository). Specifically, for human data, the Modified Hausdorff Distance (MHD) value is approximately 3.73 and the targeting error is around 2.03, indicating strong agreement and only a small orientation deviation between the predicted and actual needle locations. A key advantage of our method is that it generalizes beyond US images captured from a specific imaging system to images acquired with other US imaging systems.
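The abstract reports the Modified Hausdorff Distance (MHD) between the predicted and actual needle as an evaluation metric. The following is a minimal sketch of how MHD (Dubuisson and Jain, 1994) is commonly computed between two point sets, e.g. pixel coordinates extracted from a predicted and a reference needle mask; it is not the authors' evaluation code, and the masks below are hypothetical examples.

```python
import numpy as np
from scipy.spatial.distance import cdist

def modified_hausdorff_distance(points_a, points_b):
    """Modified Hausdorff Distance between two 2-D point sets.

    MHD = max( mean_a min_b ||a - b|| , mean_b min_a ||a - b|| )
    """
    d = cdist(points_a, points_b)    # pairwise Euclidean distances
    d_ab = d.min(axis=1).mean()      # mean nearest-neighbour distance, A -> B
    d_ba = d.min(axis=0).mean()      # mean nearest-neighbour distance, B -> A
    return max(d_ab, d_ba)

# Hypothetical example: compare pixel coordinates of two binary needle masks.
pred_mask = np.zeros((128, 128), dtype=bool)
true_mask = np.zeros((128, 128), dtype=bool)
pred_mask[60:64, 20:100] = True      # assumed predicted needle region
true_mask[62:66, 22:102] = True      # assumed ground-truth needle region

pred_pts = np.argwhere(pred_mask)    # (row, col) coordinates of needle pixels
true_pts = np.argwhere(true_mask)
print(modified_hausdorff_distance(pred_pts, true_pts))  # distance in pixels
```

In practice the distance can be converted from pixels to millimetres using the known image spacing of the US system before comparing against reported values.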
Full text: 1 Database: MEDLINE Study type: Guideline Language: En Journal: Photoacoustics Publication year: 2023 Document type: Article Affiliation country: Singapore
