Chemical-Protein Relation Extraction with Pre-trained Prompt Tuning.
IEEE Int Conf Healthc Inform
2022: 608-609, Jun 2022.
Article
in English
| MEDLINE
| ID: mdl-37664001
ABSTRACT
Biomedical relation extraction plays a critical role in constructing high-quality knowledge graphs and databases, which in turn support many downstream applications. Pre-trained prompt tuning, a new paradigm, has shown great potential on many natural language processing (NLP) tasks. By inserting a piece of text into the original input, prompting converts NLP tasks into masked language problems, which pre-trained language models (PLMs) can address more effectively. In this study, we applied pre-trained prompt tuning to chemical-protein relation extraction using the BioCreative VI CHEMPROT dataset. The experimental results showed that pre-trained prompt tuning outperformed the baseline approach on chemical-protein interaction classification. We conclude that prompt tuning can improve the efficiency of PLMs on chemical-protein relation extraction tasks.
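The core idea the abstract describes, inserting a cloze-style template into the input so a masked language model predicts a relation word, can be sketched as follows. This is a minimal illustration, not the paper's method: the template wording, the `build_prompt` helper, and the verbalizer word choices are all hypothetical, though the `CPR:*` labels are the real CHEMPROT relation classes.

```python
# Hypothetical sketch of prompt construction for chemical-protein relation
# extraction. The template and verbalizer words are illustrative only; the
# paper does not publish these exact choices here.

def build_prompt(sentence: str, chemical: str, protein: str) -> str:
    """Append a cloze template so a masked language model can fill in a
    relation word at the [MASK] position."""
    template = f"The relation between {chemical} and {protein} is [MASK]."
    return f"{sentence} {template}"

# Verbalizer: maps candidate words a PLM might predict at [MASK] onto
# CHEMPROT relation classes (CPR:3 upregulator, CPR:4 downregulator,
# CPR:5 agonist, CPR:6 antagonist, CPR:9 substrate).
VERBALIZER = {
    "activation": "CPR:3",
    "inhibition": "CPR:4",
    "agonism": "CPR:5",
    "antagonism": "CPR:6",
    "substrate": "CPR:9",
}

def label_for(predicted_word: str) -> str:
    """Translate the PLM's predicted mask word into a relation label;
    unknown words fall back to 'false' (no relation)."""
    return VERBALIZER.get(predicted_word, "false")

prompt = build_prompt("Aspirin irreversibly inhibits COX-1.",
                      "Aspirin", "COX-1")
```

In practice the prompted input would be scored by a PLM (e.g. a BERT-style model) and the highest-probability verbalizer word at `[MASK]` decides the class, so classification reuses the masked-language-modeling head rather than a new task-specific layer.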
Full text:
1
Collections:
01-internacional
Database:
MEDLINE
Language:
En
Journal:
IEEE Int Conf Healthc Inform
Year of publication:
2022
Document type:
Article
Country of affiliation:
United States