Assembling spatial clustering framework for heterogeneous spatial transcriptomics data with GRAPHDeep.
Bioinformatics; 40(1), 2024 Jan 02.
Article in English
| MEDLINE
| ID: mdl-38243703
ABSTRACT
MOTIVATION: Spatial clustering is essential, yet challenging, for spatial transcriptomics data analysis aimed at unraveling the tissue microenvironment and biological function. Graph neural networks are promising tools for combining gene expression profiles with spatial location information to generate latent representations. However, choosing an appropriate graph deep learning module and graph neural network requires further exploration and investigation. RESULTS:
In this article, we present GRAPHDeep, an assembled spatial clustering framework for heterogeneous spatial transcriptomics data. By integrating 2 graph deep learning modules and 20 graph neural networks, the most appropriate combination is identified for each dataset. The constructed spatial clustering method is compared with state-of-the-art algorithms to demonstrate its effectiveness and superiority. The significant new findings include: (i) the number of genes or proteins in spatial omics data is crucial for spatial clustering algorithms; (ii) the variational graph autoencoder is more suitable for spatial clustering tasks than the deep graph infomax module; (iii) UniMP, SAGE, SuperGAT, GATv2, GCN, and TAG are the recommended graph neural networks for spatial clustering tasks; and (iv) the graph neural networks used in existing spatial clustering frameworks are not the best candidates. This study can serve as guidance for choosing an appropriate graph neural network for spatial clustering. AVAILABILITY AND IMPLEMENTATION: The source code of GRAPHDeep is available at https://github.com/narutoten520/GRAPHDeep. The studied spatial omics data are available at https://zenodo.org/record/8141084.
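The per-dataset model selection the abstract describes (every pairing of 2 graph deep learning modules with 20 graph neural networks, keeping the best combination) can be sketched in outline. The module and GNN names below come from the abstract; `evaluate` is a hypothetical placeholder standing in for training a module+GNN pair and scoring its clustering (e.g., with ARI), not the authors' implementation.

```python
from itertools import product

# Names taken from the abstract; GNNS lists only the six recommended
# architectures, not the full set of 20 evaluated in the paper.
MODULES = ["VGAE", "DGI"]
GNNS = ["UniMP", "SAGE", "SuperGAT", "GATv2", "GCN", "TAG"]

def evaluate(module, gnn, dataset):
    """Hypothetical placeholder: a real implementation would train the
    module+GNN pair on the dataset and return a clustering metric."""
    # Toy deterministic score so the sketch runs end-to-end.
    return (len(module) * 7 + len(gnn) * 3 + len(dataset)) % 13

def best_combination(dataset):
    # Exhaustively score every module x GNN pairing and keep the highest,
    # mirroring GRAPHDeep's per-dataset selection of the best combination.
    return max(product(MODULES, GNNS),
               key=lambda pair: evaluate(*pair, dataset))
```

Swapping `evaluate` for a real train-and-score routine turns this exhaustive grid search into the selection strategy the abstract outlines.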
Full text:
1
Database:
MEDLINE
Main subject:
Algorithms
/
Gene Expression Profiling
Study type:
Prognostic_studies
Language:
En
Publication year:
2024
Document type:
Article