Large-scale foundation model on single-cell transcriptomics.
Nat Methods; 21(8): 1481-1491, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38844628
ABSTRACT
Large pretrained models have become foundation models, leading to breakthroughs in natural language processing and related fields. Developing foundation models for deciphering the 'languages' of cells and facilitating biomedical research is promising yet challenging. Here we developed a large pretrained model, scFoundation, also named 'xTrimoscFoundationα', with 100 million parameters covering about 20,000 genes, pretrained on over 50 million human single-cell transcriptomic profiles. scFoundation is a large-scale model in terms of the size of trainable parameters, the dimensionality of genes and the volume of training data. Its asymmetric transformer-like architecture and pretraining task design empower it to effectively capture complex contextual relations among genes across a variety of cell types and states. Experiments showed its merit as a foundation model that achieved state-of-the-art performance in a diverse array of single-cell analysis tasks, such as gene expression enhancement, tissue drug response prediction, single-cell drug response classification, single-cell perturbation prediction, cell type annotation and gene module inference.
Full text:
1
Collection:
01-internacional
Database:
MEDLINE
Main subject:
Gene Expression Profiling
/
Single-Cell Analysis
/
Transcriptome
Limits:
Humans
Language:
En
Journal:
Nat Methods
Journal subject:
LABORATORY TECHNIQUES AND PROCEDURES
Year:
2024
Document type:
Article
Affiliation country:
China