Quantitative susceptibility mapping based basal ganglia segmentation via AGSeg: leveraging active gradient guiding mechanism in deep learning.
Xi, Jiaxiu; Huang, Yuqing; Bao, Lijun.
Affiliation
  • Xi J; Department of Electronic Science, Xiamen University, Xiamen, China.
  • Huang Y; Department of Electronic Science, Xiamen University, Xiamen, China.
  • Bao L; Department of Electronic Science, Xiamen University, Xiamen, China.
Quant Imaging Med Surg ; 14(7): 4417-4435, 2024 Jul 01.
Article in En | MEDLINE | ID: mdl-39022266
ABSTRACT

Background:

With better visual contrast and the ability to quantify magnetic susceptibility, quantitative susceptibility mapping (QSM) has emerged as an important magnetic resonance imaging (MRI) method for basal ganglia studies. Precise segmentation of the basal ganglia is a prerequisite for quantitative analysis of tissue magnetic susceptibility, which is crucial for subsequent disease diagnosis and surgical planning. The conventional approach to localizing and segmenting the basal ganglia relies heavily on slice-by-slice manual annotation by experts, resulting in a heavy workload. Although several morphology registration and deep learning based methods have been developed to automate segmentation, voxels around nuclei boundaries remain difficult to distinguish due to insufficient tissue contrast. This paper proposes AGSeg, an active gradient guidance based network with magnitude information completion (MIC) for real-time and accurate basal ganglia segmentation.

Methods:

Various datasets, including clinical scans and data from healthy volunteers, were collected across multiple centers at different magnetic field strengths (3T/5T/7T), yielding a total of 210 three-dimensional (3D) susceptibility measurements. Manual segmentations annotated by experts, following fixed rules for anatomical borders, served as ground-truth labels. The proposed network takes QSM maps and magnitude images as two separate inputs, whose features are selectively enhanced in the proposed magnitude information complete (MIC) module. AGSeg uses a dual-branch architecture: the Seg-branch generates the segmentation map, while the Grad-branch reconstructs the gradient map of the regions of interest (ROIs). Supported by the newly designed active gradient module (AGM) and gradient guiding module (GGM), the Grad-branch provides attention guidance to the Seg-branch, helping it focus on the boundaries of the target nuclei.
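The abstract does not specify how the gradient-map target of the Grad-branch is formed. A minimal sketch of one plausible formulation, assuming a finite-difference gradient magnitude of the binary ROI label map (the operator choice is an assumption, not the authors' stated method):

```python
import numpy as np

def roi_gradient_map(label: np.ndarray) -> np.ndarray:
    """Finite-difference gradient magnitude of a binary ROI mask.

    A plausible target for a boundary-focused auxiliary branch:
    nonzero only near ROI edges, zero in homogeneous regions.
    """
    mask = label.astype(float)
    # Central differences along each spatial axis.
    grads = np.gradient(mask)
    return np.sqrt(sum(g ** 2 for g in grads))

# A toy 2D "nucleus": a filled square inside a 6x6 field.
toy = np.zeros((6, 6))
toy[2:4, 2:4] = 1.0
gmap = roi_gradient_map(toy)
# Far background stays at zero; only voxels near the boundary respond.
```

Such a target concentrates supervision signal on exactly the boundary voxels the abstract identifies as hardest to classify.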

Results:

Ablation studies were conducted to assess the contribution of the proposed modules; significant performance decrements were observed when the corresponding modules were removed. AGSeg was evaluated against several existing methods on both healthy and clinical data, achieving an average Dice similarity coefficient (DSC) of 0.874 and an average 95% Hausdorff distance (HD95) of 2.009. Comparison experiments indicated that our model achieved superior performance on basal ganglia segmentation and better generalization than existing methods. AGSeg outperformed all implemented comparison deep learning algorithms, with average DSC gains ranging from 0.036 to 0.074.
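The two reported metrics, DSC and HD95, can be sketched for binary masks as follows. This is an illustrative NumPy implementation, not the authors' evaluation code; for simplicity, HD95 is computed here between full voxel sets by brute force rather than between extracted surfaces, which is only practical for small volumes:

```python
import numpy as np

def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def hd95(pred: np.ndarray, gt: np.ndarray, spacing: float = 1.0) -> float:
    """95th-percentile symmetric Hausdorff distance between two masks.

    Brute-force pairwise distances over all foreground voxels
    (a simplification; surface extraction is omitted).
    """
    p = np.argwhere(pred.astype(bool))
    g = np.argwhere(gt.astype(bool))
    d = np.linalg.norm(p[:, None, :] - g[None, :, :], axis=-1) * spacing
    # Symmetric: take the worse of the two directed 95th percentiles.
    return max(np.percentile(d.min(axis=1), 95),
               np.percentile(d.min(axis=0), 95))

# Toy example: a 3x3 square versus the same square shifted by one voxel.
pred = np.zeros((8, 8)); pred[2:5, 2:5] = 1
gt = np.zeros((8, 8)); gt[3:6, 2:5] = 1
```

On this toy pair, `dice` yields 2/3 (6 overlapping voxels out of 9 + 9) and `hd95` yields 1.0, since every mismatched voxel sits one voxel away from the other mask.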

Conclusions:

The current work integrates a deep learning based method into automated basal ganglia segmentation. The high processing speed and segmentation robustness of AGSeg support its feasibility for future surgical planning and intraoperative navigation. Experiments show that leveraging an active gradient guidance mechanism and magnitude information completion facilitates the segmentation process. The approach also offers a portable solution for other multi-modality medical image segmentation tasks.
Keywords

Full text: 1 Collection: 01-international Database: MEDLINE Language: En Journal: Quant Imaging Med Surg Year: 2024 Document type: Article Country of affiliation: China
