Contrastive Multi-View Composite Graph Convolutional Networks Based on Contribution Learning for Autism Spectrum Disorder Classification.
IEEE Trans Biomed Eng; 70(6): 1943-1954, 2023 Jun.
Article in En | MEDLINE | ID: mdl-37015677
Resting-state functional magnetic resonance imaging (rs-fMRI) faithfully reflects brain activity and thus provides a promising tool for autism spectrum disorder (ASD) classification. To date, graph convolutional networks (GCNs) have been successfully applied to rs-fMRI-based ASD classification. However, most of these methods were built on functional connectivities (FCs), which only reflect low-level correlations between brain regions, without integrating high-level discriminative knowledge and phenotypic information into the classification. In addition, they suffer from overfitting caused by insufficient training samples. To this end, we propose a novel contrastive multi-view composite GCN (CMV-CGCN) for ASD classification using both FCs and high-order FCs (HOFCs). Specifically, a pair of graphs is constructed from the FC and HOFC features of the subjects, respectively, and the two graphs share phenotypic information in their edges. A novel contrastive multi-view learning method is proposed based on consistent representations across the two views. A contribution learning mechanism is further incorporated, encouraging the FC and HOFC features of different subjects to make varying contributions to the contrastive multi-view learning. The proposed CMV-CGCN is evaluated on 613 subjects (286 ASD patients and 327 normal controls) from the Autism Brain Imaging Data Exchange (ABIDE), where it yields an accuracy of 75.20% and an area under the curve (AUC) of 0.7338 for ASD classification. Experimental results show that the proposed method outperforms state-of-the-art methods on the ABIDE database.
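To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of a contrastive multi-view GCN with learned per-subject contribution weights. It is not the authors' CMV-CGCN implementation: the layer sizes, the contribution network, the InfoNCE-style cross-view loss, and the toy phenotype graph are illustrative assumptions only.

# Minimal, illustrative sketch of a contrastive multi-view GCN with learned
# per-subject contribution weights, loosely following the abstract above.
# NOT the authors' CMV-CGCN code: layer sizes, the contribution network, the
# InfoNCE-style cross-view loss, and the toy data are assumptions for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    # Symmetric normalization with self-loops: D^-1/2 (A + I) D^-1/2.
    adj = adj + torch.eye(adj.size(0))
    d_inv_sqrt = adj.sum(dim=1).clamp(min=1e-12).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)


class GCNEncoder(nn.Module):
    # Two-layer dense GCN over a population graph (one node per subject).
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, out_dim)

    def forward(self, x, adj_norm):
        h = F.relu(adj_norm @ self.w1(x))
        return adj_norm @ self.w2(h)


class MultiViewContrastiveGCN(nn.Module):
    # One encoder per view (FC and HOFC); both views share the phenotype-based edges.
    def __init__(self, fc_dim, hofc_dim, hid_dim=64, emb_dim=32, n_classes=2):
        super().__init__()
        self.enc_fc = GCNEncoder(fc_dim, hid_dim, emb_dim)
        self.enc_hofc = GCNEncoder(hofc_dim, hid_dim, emb_dim)
        # Small head predicting each subject's contribution weights for the two views.
        self.contrib = nn.Sequential(nn.Linear(2 * emb_dim, 2), nn.Softmax(dim=-1))
        self.classifier = nn.Linear(2 * emb_dim, n_classes)

    def forward(self, x_fc, x_hofc, adj_norm):
        z_fc = self.enc_fc(x_fc, adj_norm)
        z_hofc = self.enc_hofc(x_hofc, adj_norm)
        z_cat = torch.cat([z_fc, z_hofc], dim=-1)
        return z_fc, z_hofc, self.contrib(z_cat), self.classifier(z_cat)


def weighted_contrastive_loss(z_fc, z_hofc, w, temperature=0.5):
    # InfoNCE-style loss pulling the two views of each subject together,
    # with each direction scaled by that subject's learned contribution weight.
    z1 = F.normalize(z_fc, dim=-1)
    z2 = F.normalize(z_hofc, dim=-1)
    sim = z1 @ z2.t() / temperature                   # (N, N) cross-view similarities
    targets = torch.arange(z1.size(0))
    loss_fc = F.cross_entropy(sim, targets, reduction="none")        # FC -> HOFC
    loss_hofc = F.cross_entropy(sim.t(), targets, reduction="none")  # HOFC -> FC
    return (w[:, 0] * loss_fc + w[:, 1] * loss_hofc).mean()


if __name__ == "__main__":
    n_subjects, fc_dim, hofc_dim = 16, 200, 200       # toy sizes, not the ABIDE setup
    x_fc = torch.randn(n_subjects, fc_dim)            # stand-in FC feature vectors
    x_hofc = torch.randn(n_subjects, hofc_dim)        # stand-in HOFC feature vectors
    pheno = (torch.rand(n_subjects, n_subjects) > 0.7).float()  # toy phenotypic edges
    adj_norm = normalize_adj(((pheno + pheno.t()) > 0).float())
    labels = torch.randint(0, 2, (n_subjects,))       # 0 = control, 1 = ASD (toy labels)

    model = MultiViewContrastiveGCN(fc_dim, hofc_dim)
    z_fc, z_hofc, w, logits = model(x_fc, x_hofc, adj_norm)
    loss = F.cross_entropy(logits, labels) + weighted_contrastive_loss(z_fc, z_hofc, w)
    loss.backward()
    print(f"total loss: {loss.item():.4f}")

The sketch keeps the two key ideas from the abstract: both view-specific encoders propagate over the same phenotype-derived population graph, and the contribution head lets each subject weight the FC and HOFC branches differently inside the contrastive objective.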
Full text: 1
Database: MEDLINE
Main subject: Autism Spectrum Disorder
Language: En
Publication year: 2023
Document type: Article