Pretrained Quantum-Inspired Deep Neural Network for Natural Language Processing.
IEEE Trans Cybern; PP, 2024 May 29.
Article in English | MEDLINE | ID: mdl-38809747
ABSTRACT
Natural language processing (NLP) models may suffer from the inexplicable "black-box" problem of their parameters and from unreasonable modeling when characteristics of natural language are not embedded, while quantum-inspired models based on quantum theory may provide a potential solution. However, essential prior knowledge and pretrained text features are often ignored at this early stage of the development of quantum-inspired models. To address these challenges, a pretrained quantum-inspired deep neural network is proposed in this work; it is constructed based on quantum theory to achieve strong performance and interpretability in related NLP fields. Concretely, a quantum-inspired pretrained feature embedding (QPFE) method is first developed to model superposition states for words so as to embed more textual features. Then, a QPFE-ERNIE model is designed by merging the semantic features learned from the prevalent pretrained model ERNIE, and it is verified on two NLP downstream tasks: 1) sentiment classification and 2) word sense disambiguation (WSD). In addition, schematic quantum circuit diagrams are provided, which may offer impetus for the future realization of quantum NLP on quantum devices. Finally, the experimental results demonstrate that QPFE-ERNIE is significantly better for sentiment classification than gated recurrent unit (GRU), BiLSTM, and TextCNN on five datasets in all metrics, and achieves better results than ERNIE in accuracy, F1-score, and precision on two datasets (CR and SST). It also has an advantage for WSD over the classical models, improving the F1-score over BERT by 5.2 on average and over ERNIE by 4.2 on average, and improving the F1-score by 8.7 on average compared with a previous quantum-inspired model, QWSD. QPFE-ERNIE provides a novel pretrained quantum-inspired model for solving NLP problems, and it lays a foundation for exploring more quantum-inspired models in the future.
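To make the superposition-state idea concrete, the sketch below illustrates the general notion of a quantum-inspired word embedding: a word is represented as a unit-norm complex amplitude vector over a basis of latent sense states, and a Born-rule readout turns the amplitudes into a probability distribution over senses. This is a minimal illustration of the concept only, not the paper's QPFE or QPFE-ERNIE implementation; the function names, the number of sense states, and the hash-based pseudo-features are assumptions introduced here for demonstration.

```python
# Minimal sketch of a quantum-inspired "superposition" word embedding.
# Assumed names (embed_word, sense_probabilities, n_senses) are illustrative,
# not taken from the QPFE paper; a trained model would learn the amplitudes.
import numpy as np

def embed_word(word: str, n_senses: int = 8) -> np.ndarray:
    """Map a word to |w> = sum_i a_i |s_i> with sum_i |a_i|^2 = 1."""
    # Hypothetical stand-in for learned features: derive deterministic
    # pseudo-random amplitudes from the word itself.
    seed = abs(hash(word)) % (2**32)
    rng = np.random.default_rng(seed)
    amplitudes = rng.normal(size=n_senses) + 1j * rng.normal(size=n_senses)
    return amplitudes / np.linalg.norm(amplitudes)  # normalize to a valid state

def sense_probabilities(state: np.ndarray) -> np.ndarray:
    """Born-rule readout: probability of basis sense i is |a_i|^2."""
    return np.abs(state) ** 2

word_state = embed_word("bank")
print(sense_probabilities(word_state))        # distribution over latent senses
print(sense_probabilities(word_state).sum())  # ~1.0, as required of a state
```

In a full model along the lines described in the abstract, such amplitude vectors would be learned rather than hashed and then combined with pretrained ERNIE features before the downstream classifier; the sketch only shows the state representation and its probabilistic readout.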

Full text: 1 Database: MEDLINE Language: En Journal: IEEE Trans Cybern Publication year: 2024 Document type: Article
