Protein function prediction through multi-view multi-label latent tensor reconstruction.
BMC Bioinformatics; 25(1): 174, 2024 May 02.
Article in English | MEDLINE | ID: mdl-38698340
ABSTRACT
BACKGROUND:
In the last two decades, high-throughput sequencing technologies have accelerated the pace of protein discovery. However, due to the time and resource demands of rigorous experimental functional characterization, the functions of the vast majority of these proteins remain unknown. As a result, computational methods that offer accurate, fast and large-scale assignment of functions to new and previously unannotated proteins are sought after. Leveraging the underlying associations between the multiplicity of features that describe proteins could reveal functional insights into their diverse roles and improve performance on the automatic function prediction task.
RESULTS:
We present GO-LTR, a multi-view multi-label prediction model that relies on a high-order tensor approximation of the model weights combined with non-linear activation functions. The model is capable of learning high-order relationships between multiple input views representing the proteins and predicting a high-dimensional multi-label output consisting of protein functional categories. We demonstrate the competitiveness of our method on various performance measures. Experiments show that GO-LTR learns polynomial combinations between different protein features, resulting in improved performance. Additional investigations establish GO-LTR's practical potential in assigning functions to proteins under diverse challenging scenarios: very low sequence similarity to previously observed sequences, and rarely observed and highly specific terms in the Gene Ontology.
IMPLEMENTATION:
The code and data used for training GO-LTR are available at https://github.com/aalto-ics-kepaco/GO-LTR-prediction.
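As a rough illustration of the multiplicative multi-view structure described in the abstract, the NumPy sketch below scores multi-label outputs by projecting each input view onto a shared rank-R latent space, multiplying the projections elementwise to capture polynomial interactions between views, and mapping the result to per-label probabilities. The rank, view dimensions, factor names (P, Q) and sigmoid output layer are illustrative assumptions and do not reproduce GO-LTR's exact formulation or training procedure.

# Minimal sketch of a rank-R multi-view multi-label predictor in the spirit of
# latent tensor reconstruction; shapes and factor names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n, R, T = 8, 16, 5                      # examples, CP rank, GO-term labels
d_views = [128, 64, 32]                 # feature dimensions of three protein views

# One factor matrix per view, plus an output factor mapping latent components to labels.
P = [rng.standard_normal((d, R)) * 0.01 for d in d_views]
Q = rng.standard_normal((R, T)) * 0.01

def predict(views, P, Q):
    """Score labels from a list of per-view feature matrices (each n x d_v).

    Each view is projected to R latent components; the elementwise product
    across views models polynomial (high-order) interactions between views,
    and Q maps the shared components to the multi-label output.
    """
    Z = np.ones((views[0].shape[0], R))
    for X_v, P_v in zip(views, P):
        Z *= X_v @ P_v                        # multiplicative interaction across views
    return 1.0 / (1.0 + np.exp(-(Z @ Q)))     # sigmoid for multi-label probabilities

views = [rng.standard_normal((n, d)) for d in d_views]
probs = predict(views, P, Q)                  # shape (n, T), one probability per GO term
print(probs.shape)

In this sketch, stacking more views simply multiplies in additional projections, which is one way to read the "high-order relationships between multiple input views" claimed for the model.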
Full text: 1
Collections: 01-internacional
Database: MEDLINE
Main subject: Proteins / Computational Biology
Language: En
Journal: BMC Bioinformatics
Year of publication: 2024
Document type: Article