CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
Yang, Yanwu; Guo, Xutao; Ye, Chenfei; Xiang, Yang; Ma, Ting.
Affiliation
  • Yang Y; Electronic & Information Engineering School, Harbin Institute of Technology (Shenzhen), Shenzhen, China; Peng Cheng Laboratory, Shenzhen, China. Electronic address: 20b952019@stu.hit.edu.cn.
  • Guo X; Electronic & Information Engineering School, Harbin Institute of Technology (Shenzhen), Shenzhen, China; Peng Cheng Laboratory, Shenzhen, China. Electronic address: 18B952052@stu.hit.edu.cn.
  • Ye C; Peng Cheng Laboratory, Shenzhen, China; International Research Institute for Artificial Intelligence, Harbin Institute of Technology (Shenzhen), Shenzhen, China. Electronic address: chenfei.ye@foxmail.com.
  • Xiang Y; Peng Cheng Laboratory, Shenzhen, China. Electronic address: xiangy@pcl.ac.cn.
  • Ma T; Electronic & Information Engineering School, Harbin Institute of Technology (Shenzhen), Shenzhen, China; Peng Cheng Laboratory, Shenzhen, China; Guangdong Provincial Key Laboratory of Aerospace Communication and Networking Technology, Harbin Institute of Technology (Shenzhen), Shenzhen, China; I
Med Image Anal; 89: 102916, 2023 Oct.
Article in En | MEDLINE | ID: mdl-37549611
ABSTRACT
One of the core challenges of deep learning in medical image analysis is data insufficiency, especially for 3D brain imaging, which may lead to model over-fitting and poor generalization. Regularization strategies such as knowledge distillation are powerful tools to mitigate this issue by penalizing predictive distributions and introducing additional knowledge to reinforce the training process. In this paper, we revisit knowledge distillation as a regularization paradigm that penalizes attentive output distributions and intermediate representations. In particular, we propose a Confidence Regularized Knowledge Distillation (CReg-KD) framework, which adaptively transfers knowledge for distillation in light of knowledge confidence. Two strategies are advocated to regularize the global and local dependencies between teacher and student knowledge. In detail, a gated distillation mechanism is proposed to soften the transferred knowledge globally by utilizing the teacher loss as a confidence score. Moreover, the intermediate representations are attentively and locally refined with key semantic context to mimic meaningful features. To demonstrate the superiority of our proposed framework, we evaluated it on two brain imaging analysis tasks (i.e., Alzheimer's Disease classification and brain age estimation based on T1-weighted MRI) on the Alzheimer's Disease Neuroimaging Initiative dataset including 902 subjects and a cohort of 3655 subjects from 4 public datasets. Extensive experimental results show that CReg-KD achieves consistent improvements over the baseline teacher model and outperforms other state-of-the-art knowledge distillation approaches, demonstrating that CReg-KD is a powerful medical image analysis tool in terms of both promising prediction performance and generalizability.
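The gated distillation idea described above, using the teacher's own loss as a confidence score that scales the distillation term, can be sketched in a minimal form. The sketch below is an illustrative reconstruction from the abstract only, not the authors' implementation: the gate `exp(-teacher_loss)`, the temperature `T`, and the mixing weight `alpha` are all assumed placeholders.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; subtract max for numerical stability.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def cross_entropy(probs, label):
    # Negative log-likelihood of the true class.
    return float(-np.log(probs[label] + 1e-12))

def kl_div(p, q):
    # KL(p || q) between two discrete distributions.
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

def gated_kd_loss(student_logits, teacher_logits, label, T=2.0, alpha=0.5):
    """Confidence-gated knowledge distillation loss (illustrative sketch).

    The teacher's cross-entropy loss on the sample is mapped to a gate in
    (0, 1]; a confident teacher (low loss) passes its soft targets through
    almost unchanged, while an unsure teacher's contribution is softened.
    """
    teacher_loss = cross_entropy(softmax(teacher_logits), label)
    gate = np.exp(-teacher_loss)  # low teacher loss -> gate near 1

    ce = cross_entropy(softmax(student_logits), label)
    kd = kl_div(softmax(teacher_logits, T), softmax(student_logits, T)) * T**2
    return (1 - alpha) * ce + alpha * gate * kd
```

For example, a teacher that is nearly certain of the correct class yields a gate close to 1, so the student mimics its soft targets; a teacher that is wrong or uncertain incurs a high loss, shrinking the gate and letting the ground-truth cross-entropy term dominate.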
Subject(s)
Keywords

Full text: 1 Database: MEDLINE Main subject: Alzheimer Disease Study type: Prognostic_studies Limit: Humans Language: En Journal: Med Image Anal Journal subject: DIAGNOSTIC IMAGING Year: 2023 Document type: Article