Self-supervised learning of T cell receptor sequences exposes core properties for T cell membership.
Sci Adv; 10(17): eadk4670, 2024 Apr 26.
Article in English | MEDLINE | ID: mdl-38669334
ABSTRACT
The T cell receptor (TCR) repertoire is an extraordinarily diverse collection of TCRs essential for maintaining the body's homeostasis and response to threats. In this study, we compiled an extensive dataset of more than 4200 bulk TCR repertoire samples, encompassing 221,176,713 sequences, alongside 6,159,652 single-cell TCR sequences from over 400 samples. From this dataset, we then selected a representative subset of 5 million bulk sequences and 4.2 million single-cell sequences to train two specialized Transformer-based language models for bulk (CVC) and single-cell (scCVC) TCR repertoires, respectively. We show that these models successfully capture TCR core qualities, such as sharing, gene composition, and single-cell properties. These qualities are emergent in the encoded TCR latent space and enable classification into TCR-based qualities such as public sequences. These models demonstrate the potential of Transformer-based language models in TCR downstream applications.
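The abstract describes encoding TCR sequences with Transformer-based language models and then classifying TCRs (for example, public versus private sequences) from the learned latent space. As a rough illustration only, the sketch below shows what such a pipeline can look like in PyTorch; the amino-acid tokenization, model dimensions, mean pooling, and the public/private head are assumptions made for this example and are not the authors' CVC/scCVC implementation.

```python
# Minimal sketch (not the authors' CVC/scCVC code): a Transformer encoder over
# CDR3 amino-acid tokens that yields a fixed-size TCR embedding, plus a linear
# head illustrating how such embeddings could feed a public/private classifier.
# Vocabulary, dimensions, and the pooling choice are illustrative assumptions.
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PAD = 0
TOKENS = {aa: i + 1 for i, aa in enumerate(AMINO_ACIDS)}  # index 0 reserved for padding

def encode_batch(cdr3s, max_len=30):
    """Map CDR3 strings to padded integer tensors and a boolean padding mask."""
    ids = torch.full((len(cdr3s), max_len), PAD, dtype=torch.long)
    for r, seq in enumerate(cdr3s):
        for c, aa in enumerate(seq[:max_len]):
            ids[r, c] = TOKENS[aa]
    return ids, ids.eq(PAD)  # mask is True at padded positions

class TCREncoder(nn.Module):
    def __init__(self, d_model=128, nhead=4, num_layers=4, max_len=30):
        super().__init__()
        self.tok = nn.Embedding(len(AMINO_ACIDS) + 1, d_model, padding_idx=PAD)
        self.pos = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.public_head = nn.Linear(d_model, 2)  # hypothetical public/private classifier

    def forward(self, ids, pad_mask):
        pos = torch.arange(ids.size(1), device=ids.device)
        h = self.encoder(self.tok(ids) + self.pos(pos), src_key_padding_mask=pad_mask)
        keep = (~pad_mask).unsqueeze(-1).float()
        emb = (h * keep).sum(1) / keep.sum(1).clamp(min=1)  # mean-pool over real tokens
        return emb, self.public_head(emb)

ids, mask = encode_batch(["CASSLGTDTQYF", "CASSIRSSYEQYF"])
embeddings, logits = TCREncoder()(ids, mask)
print(embeddings.shape, logits.shape)  # torch.Size([2, 128]) torch.Size([2, 2])
```

In this sketch the pooled embedding stands in for the "encoded TCR latent space" mentioned in the abstract; any downstream property (sharing, gene composition, public status) would be predicted from that vector by a separate head or classifier.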
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Main subject: Receptors, Antigen, T-Cell / T-Lymphocytes
Limit: Humans
Language: English
Journal: Sci Adv
Publication year: 2024
Document type: Article