3D EAGAN: 3D edge-aware attention generative adversarial network for prostate segmentation in transrectal ultrasound images.
Liu, Mengqing; Shao, Xiao; Jiang, Liping; Wu, Kaizhi.
Affiliation
  • Liu M; School of Computer and Information Engineering, Nantong Institute of Technology, Nantong, China.
  • Shao X; School of Information Engineering, Nanchang Hangkong University, Nanchang, China.
  • Jiang L; School of Computer Science, Nanjing University of Information Science and Technology, Nanjing, China.
  • Wu K; Department of Ultrasound Medicine, The First Affiliated Hospital of Nanchang University, Nanchang, China.
Quant Imaging Med Surg ; 14(6): 4067-4085, 2024 Jun 01.
Article in En | MEDLINE | ID: mdl-38846298
ABSTRACT

Background:

The segmentation of the prostate from transrectal ultrasound (TRUS) images is a critical step in the diagnosis and treatment of prostate cancer. However, manual segmentation by physicians is time-consuming and laborious, so there is a pressing need for computerized algorithms that can segment the prostate from TRUS images automatically. Automatic prostate segmentation in TRUS images has long been a challenging problem, since prostates in TRUS images have ambiguous boundaries and inhomogeneous intensity distributions. Although many prostate segmentation methods have been proposed, they remain limited by a lack of sensitivity to edge information. The objective of this study is therefore to devise a highly effective prostate segmentation method that overcomes these limitations and achieves accurate segmentation of the prostate in TRUS images.

Methods:

A three-dimensional (3D) edge-aware attention generative adversarial network (3D EAGAN)-based prostate segmentation method is proposed in this paper, consisting of an edge-aware segmentation network (EASNet) that performs the prostate segmentation and a discriminator network that distinguishes predicted prostates from real ones. The proposed EASNet is composed of an encoder-decoder U-Net backbone, a detail compensation module (DCM), four 3D spatial and channel attention modules (3D SCAM), an edge enhancement module (EEM), and a global feature extractor (GFE). The DCM compensates for the loss of detailed information caused by the down-sampling in the encoder, and its features are selectively enhanced by the 3D SCAM. Furthermore, the EEM guides the shallow layers of the EASNet to focus on contour and edge information of the prostate. Finally, features from the shallow layers and hierarchical features from the decoder are fused through the GFE to predict the segmented prostate.
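
The abstract does not give the exact formulation of the 3D SCAM, so the following is a minimal, hypothetical PyTorch sketch of a 3D spatial-and-channel attention module, assuming a squeeze-and-excitation-style channel branch followed by a convolutional spatial branch; the class and parameter names are illustrative, not taken from the paper.

# Hypothetical sketch of a 3D spatial-and-channel attention module.
# Assumes SE-style channel gating followed by spatial gating over
# 5D feature tensors of shape (N, C, D, H, W).
import torch
import torch.nn as nn

class SpatialChannelAttention3D(nn.Module):
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Channel branch: global average pool -> bottleneck MLP -> sigmoid gate.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool3d(1),
            nn.Conv3d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial branch: channel-pooled maps -> 3D conv -> sigmoid gate.
        self.spatial_gate = nn.Sequential(
            nn.Conv3d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)                    # reweight channels
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True),               # average over channels
             x.amax(dim=1, keepdim=True)], dim=1)       # max over channels
        return x * self.spatial_gate(pooled)            # reweight voxels

# Usage: gate a 32-channel encoder feature map of size 16x64x64.
attn = SpatialChannelAttention3D(channels=32)
feat = torch.randn(1, 32, 16, 64, 64)
out = attn(feat)  # same shape, attention-weighted

Gating channels before voxels lets the module first emphasize informative feature maps and then localize salient regions, which matches the stated role of selectively enhancing the DCM features.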

Results:

The proposed method was evaluated on our in-house TRUS image dataset and on the open-source µRegPro dataset. Across the two datasets, it improved the average segmentation Dice score from 85.33% to 90.06%, the Jaccard score from 76.09% to 84.11%, the precision from 86.48% to 90.58%, and the recall from 84.79% to 89.24%, while reducing the Hausdorff distance (HD) from 8.59 mm to 4.58 mm.
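
For reference, the Dice and Jaccard scores quoted above are standard overlap metrics on binary volumes. The paper's actual evaluation code is not given; this is a minimal sketch assuming non-empty boolean NumPy masks of identical shape.

# Standard overlap metrics for binary segmentation volumes.
import numpy as np

def dice_score(pred: np.ndarray, gt: np.ndarray) -> float:
    # Dice = 2|A ∩ B| / (|A| + |B|)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def jaccard_score(pred: np.ndarray, gt: np.ndarray) -> float:
    # Jaccard = |A ∩ B| / |A ∪ B|
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union

# Example: two overlapping cubic masks.
pred = np.zeros((8, 8, 8), dtype=bool); pred[2:6, 2:6, 2:6] = True
gt = np.zeros((8, 8, 8), dtype=bool); gt[3:7, 3:7, 3:7] = True
print(dice_score(pred, gt))     # 54/128 ≈ 0.42
print(jaccard_score(pred, gt))  # 27/101 ≈ 0.27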

Conclusions:

A novel 3D EAGAN-based prostate segmentation method is proposed, consisting of an EASNet and a discriminator network. Experimental results demonstrate that the proposed method achieves satisfactory results for prostate segmentation in 3D TRUS images.

Full text: 1 Collection: 01-internacional Database: MEDLINE Language: En Journal: Quant Imaging Med Surg Year: 2024 Document type: Article Affiliation country: China
