A subject-specific unsupervised deep learning method for quantitative susceptibility mapping using implicit neural representation.
Zhang, Ming; Feng, Ruimin; Li, Zhenghao; Feng, Jie; Wu, Qing; Zhang, Zhiyong; Ma, Chengxin; Wu, Jinsong; Yan, Fuhua; Liu, Chunlei; Zhang, Yuyao; Wei, Hongjiang.
Affiliation
  • Zhang M; School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China.
  • Feng R; School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China.
  • Li Z; School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China.
  • Feng J; School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China.
  • Wu Q; School of Information Science and Technology, ShanghaiTech University, Shanghai, China.
  • Zhang Z; School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China.
  • Ma C; Department of Neurosurgery, Huashan Hospital, Shanghai Medical College, Fudan University, Shanghai, China.
  • Wu J; Department of Neurosurgery, Huashan Hospital, Shanghai Medical College, Fudan University, Shanghai, China.
  • Yan F; Department of Radiology, Ruijin Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China.
  • Liu C; Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, CA, USA.
  • Zhang Y; School of Information Science and Technology, ShanghaiTech University, Shanghai, China.
  • Wei H; School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China; Department of Radiology, Ruijin Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai, China; National Engineering Research Center of Advanced Magnetic Resonance Technologies for Diagnosis and Therapy (
Med Image Anal ; 95: 103173, 2024 Jul.
Article in En | MEDLINE | ID: mdl-38657424
ABSTRACT
Quantitative susceptibility mapping (QSM) is an MRI-based technique that estimates the underlying tissue magnetic susceptibility from the phase signal. Deep learning (DL)-based methods have shown promise in handling the challenging ill-posed inverse problem of QSM reconstruction. However, they require extensive paired training data that are typically unavailable, and they suffer from generalization problems. Recent model-incorporated DL approaches also overlook the non-local effect of the tissue phase when applying the source-to-field forward model, due to patch-based training constraints, resulting in a discrepancy between the prediction and the measurement and, subsequently, suboptimal QSM reconstruction. This study proposes an unsupervised, subject-specific DL method for QSM reconstruction based on implicit neural representation (INR), referred to as INR-QSM. INR has emerged as a powerful framework for learning a high-quality continuous representation of a signal (image) by exploiting its internal information without training labels. In INR-QSM, the desired susceptibility map is represented as a continuous function of the spatial coordinates, parameterized by a fully connected neural network. The weights are learned by minimizing a loss function that includes a data fidelity term given by the physical model and regularization terms. Additionally, a novel phase compensation strategy is proposed for the first time to account for the non-local effect of tissue phase in the data consistency calculation, making the physical model more accurate. Our experiments show that INR-QSM outperforms established traditional QSM reconstruction methods and a competing unsupervised DL method both qualitatively and quantitatively, and is competitive against supervised DL methods under data perturbations.
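The core idea described in the abstract can be sketched as follows: a coordinate-based fully connected network represents the susceptibility map as a continuous function χ(x, y, z), and its output is penalized against the measured phase through the standard susceptibility-to-field forward model (the dipole kernel in k-space). This is a minimal illustrative sketch with hypothetical names and random weights, not the authors' implementation; the regularization terms and the proposed phase compensation strategy are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights for a small fully connected network (illustration only)."""
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_forward(params, coords):
    """coords: (N, 3) spatial coordinates in [-1, 1]; returns (N,) chi values."""
    h = coords
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)  # ReLU hidden layers
    W, b = params[-1]
    return (h @ W + b).ravel()

def dipole_kernel(shape):
    """Standard QSM dipole kernel D(k) = 1/3 - kz^2 / |k|^2, B0 along z."""
    kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(s) for s in shape], indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(invalid="ignore", divide="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0  # remove the undefined DC term
    return D

def data_fidelity(chi_vol, phase_vol, D):
    """|| F^-1 D F chi - phase ||^2: susceptibility-to-field forward model."""
    pred_phase = np.real(np.fft.ifftn(D * np.fft.fftn(chi_vol)))
    return float(np.mean((pred_phase - phase_vol) ** 2))

# Tiny 8^3 volume: query the network on a coordinate grid, then evaluate
# the data-fidelity loss against a stand-in (zero) phase measurement.
shape = (8, 8, 8)
axes = [np.linspace(-1, 1, s) for s in shape]
grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, 3)

params = init_mlp([3, 32, 32, 1])
chi = mlp_forward(params, grid).reshape(shape)
D = dipole_kernel(shape)
loss = data_fidelity(chi, np.zeros(shape), D)
```

In the actual method, the network weights would be optimized by gradient descent on this data-fidelity term plus regularization, for each subject individually, which is what makes the approach unsupervised and subject-specific.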
Subjects
Keywords

Full text: 1 Database: MEDLINE Main subject: Magnetic Resonance Imaging / Unsupervised Machine Learning / Deep Learning Language: En Year of publication: 2024 Document type: Article
