CFINet: Cross-Modality MRI Feature Interaction Network for Pseudoprogression Prediction of Glioblastoma.
Lv, Ya; Liu, Jin; Tian, Xu; Yang, Pei; Pan, Yi.
Affiliations
  • Lv Y; Xinjiang Engineering Research Center of Big Data and Intelligent Software, School of Software, Xinjiang University, Wulumuqi, China.
  • Liu J; Xinjiang Engineering Research Center of Big Data and Intelligent Software, School of Software, Xinjiang University, Wulumuqi, China.
  • Tian X; Hunan Provincial Key Lab on Bioinformatics, School of Computer Science and Engineering, Central South University, Changsha, China.
  • Yang P; Hunan Provincial Key Lab on Bioinformatics, School of Computer Science and Engineering, Central South University, Changsha, China.
  • Pan Y; Radiation Oncology Department, Hunan Cancer Hospital, Changsha, China.
J Comput Biol ; 2024 Jul 08.
Article in En | MEDLINE | ID: mdl-38975725
ABSTRACT
Pseudoprogression (PSP) is a treatment-related reaction in glioblastoma, and its misdiagnosis can lead to unnecessary intervention. Magnetic resonance imaging (MRI) provides cross-modality images for PSP prediction studies. However, effectively using the complementary information between cross-modality MRI scans to improve PSP prediction remains a challenging task. To address this challenge, we propose a cross-modality feature interaction network for PSP prediction. First, we propose a triple-branch multi-scale module to extract low-order feature representations and a skip-connection multi-scale module to extract high-order feature representations. Then, a cross-modality interaction module based on an attention mechanism is designed so that the complementary information between cross-modality MRI can fully interact. Finally, the high-order cross-modality interaction information is fed into a multi-layer perceptron to perform the PSP prediction task. We evaluate the proposed network on a private dataset of 52 subjects from Hunan Cancer Hospital and validate it on a private dataset of 30 subjects from Xiangya Hospital. The accuracy of the proposed network on these datasets is 0.954 and 0.929, respectively, which is better than that of most typical convolutional neural network and interaction methods.
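The abstract describes an attention-based cross-modality interaction module followed by an MLP classifier. The sketch below illustrates one plausible reading of that design; it is not the authors' code, and the module names, feature dimensions, and two-modality setup (e.g., two MRI sequences reduced to token-sequence features) are assumptions for illustration only.

```python
# A minimal sketch of cross-modality attention interaction plus an MLP prediction head.
# NOT the CFINet implementation; all names and dimensions are illustrative assumptions.
import torch
import torch.nn as nn


class CrossModalityInteraction(nn.Module):
    """Let features from one MRI modality attend to features from the other."""

    def __init__(self, dim: int = 256, num_heads: int = 4):
        super().__init__()
        # One attention block per direction: modality A -> B and B -> A.
        self.attn_a = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.attn_b = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_a = nn.LayerNorm(dim)
        self.norm_b = nn.LayerNorm(dim)

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor):
        # feat_a, feat_b: (batch, tokens, dim) feature maps flattened into token sequences.
        # Queries come from one modality, keys/values from the other, so complementary
        # information flows in both directions.
        a2b, _ = self.attn_a(feat_a, feat_b, feat_b)
        b2a, _ = self.attn_b(feat_b, feat_a, feat_a)
        return self.norm_a(feat_a + a2b), self.norm_b(feat_b + b2a)


class PSPHead(nn.Module):
    """Fuse the interacted features and predict PSP vs. true progression with an MLP."""

    def __init__(self, dim: int = 256, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.ReLU(), nn.Linear(hidden, 2)
        )

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor):
        # Average over tokens, concatenate the two modalities, then classify.
        pooled = torch.cat([feat_a.mean(dim=1), feat_b.mean(dim=1)], dim=-1)
        return self.mlp(pooled)


if __name__ == "__main__":
    interact, head = CrossModalityInteraction(), PSPHead()
    fa, fb = torch.randn(2, 64, 256), torch.randn(2, 64, 256)  # dummy modality features
    logits = head(*interact(fa, fb))
    print(logits.shape)  # torch.Size([2, 2])
```

In this reading, the bidirectional attention plays the role of the "full interaction" between modalities described in the abstract, while the pooled, concatenated features stand in for the high-order interaction information passed to the multi-layer perceptron.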
Full text: 1 | Database: MEDLINE | Language: English | Year of publication: 2024 | Document type: Article