Counting is almost all you need.
Front Immunol; 13: 1031011, 2022.
Article | En | MEDLINE | ID: mdl-36741395
ABSTRACT
The immune memory repertoire encodes the history of present and past infections and immunological attributes of the individual. As such, multiple methods have been proposed to use T-cell receptor (TCR) repertoires to detect disease history. We show here that a simple counting method outperforms two leading algorithms. We then show that counting can be further improved using a novel attention model to weigh the different TCRs. The attention model is based on a projection of TCRs obtained with a Variational AutoEncoder (VAE). Both the counting and attention algorithms outperform current leading algorithms in predicting whether the host had CMV and in predicting its HLA alleles. As an intermediate solution between the complex attention model and the very simple counting model, we propose a new Graph Convolutional Network approach that combines the accuracy of the attention model with the simplicity of the counting model. The code for the models used in the paper is provided at https://github.com/louzounlab/CountingIsAlmostAllYouNeed.
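The counting baseline described in the abstract can be illustrated with a minimal sketch: classify a repertoire by counting how many of its TCR sequences appear in a set of disease-associated TCRs and thresholding that count. This is an assumption-laden toy, not the paper's implementation; all function names, the CDR3-like strings, and the threshold are hypothetical.

```python
# Minimal sketch of a counting-style TCR classifier (illustrative only;
# not the authors' code). All names and toy data below are hypothetical.

def count_matches(repertoire, associated_tcrs):
    """Count how many TCRs in a repertoire belong to a set of
    disease-associated sequences."""
    return sum(1 for tcr in repertoire if tcr in associated_tcrs)

def classify(repertoire, associated_tcrs, threshold):
    """Predict positive disease status when the match count reaches a
    threshold that would be chosen on training data."""
    return count_matches(repertoire, associated_tcrs) >= threshold

# Toy example: three CDR3-like strings assumed to be CMV-associated.
cmv_associated = {"CASSLGTDTQYF", "CASSIRSSYEQYF", "CASSPGQGDNEQFF"}
sample = ["CASSLGTDTQYF", "CASSAAAAF", "CASSIRSSYEQYF"]
print(count_matches(sample, cmv_associated))  # prints 2
print(classify(sample, cmv_associated, threshold=2))  # prints True
```

The attention and GCN variants in the paper can be seen as replacing this uniform count with learned per-TCR weights.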
Keywords
Full text: 1
Database: MEDLINE
Main subject: Algorithms / Receptors, Antigen, T-Cell
Study type: Prognostic studies
Language: English
Journal: Front Immunol
Year: 2022
Document type: Article
Country of affiliation: Israel