Transfer Learning for Sentiment Analysis Using BERT Based Supervised Fine-Tuning.
Sensors (Basel); 22(11), 2022 May 30.
Article
MEDLINE | ID: mdl-35684778
The growth of the Internet has expanded the amount of data expressed by users across multiple platforms. The availability of these diverse worldviews and individual emotions empowers sentiment analysis. However, sentiment analysis becomes even more challenging due to the scarcity of standardized labeled data in the Bangla NLP domain. The majority of existing Bangla research has relied on deep learning models that focus heavily on context-independent word embeddings, such as Word2Vec, GloVe, and fastText, in which each word has a fixed representation irrespective of its context. Meanwhile, context-based pre-trained language models such as BERT have recently revolutionized the state of natural language processing. In this work, we leveraged BERT's transfer learning ability in a deep integrated CNN-BiLSTM model to enhance decision-making performance in sentiment analysis. In addition, we applied transfer learning to classical machine learning algorithms to compare their performance against the CNN-BiLSTM model. We also explored various word embedding techniques, such as Word2Vec, GloVe, and fastText, and compared their performance to the BERT transfer learning strategy. As a result, we demonstrate state-of-the-art binary classification performance for Bangla sentiment analysis that significantly outperforms all other embeddings and algorithms.
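The distinction the abstract draws between context-independent embeddings (Word2Vec, GloVe, fastText) and contextual representations like BERT's can be sketched with a toy example. This is an illustrative sketch, not the paper's code: the vocabulary, vectors, and the simple neighbour-averaging "contextual" encoder below are all invented for illustration, standing in for a real pre-trained model.

```python
# Illustrative sketch (not the paper's implementation): contrasts the
# fixed, context-independent lookup of Word2Vec/GloVe/fastText with a
# toy context-sensitive encoder in the spirit of BERT. All words and
# vectors here are invented placeholders.

# Static embedding table: one fixed vector per word.
static_table = {
    "bank": [0.9, 0.1],
    "river": [0.2, 0.8],
    "money": [0.7, 0.3],
}

def static_embed(word, sentence):
    # The sentence is ignored: the word gets the same vector
    # in every context, as with Word2Vec/GloVe/fastText.
    return static_table[word]

def contextual_embed(word, sentence):
    # Toy "contextual" encoder: mixes the word's own vector with the
    # average of its neighbours' vectors, so the representation
    # shifts with the surrounding words (the property BERT provides).
    base = static_table[word]
    neighbours = [static_table[w] for w in sentence
                  if w != word and w in static_table]
    if not neighbours:
        return base
    avg = [sum(v[i] for v in neighbours) / len(neighbours)
           for i in range(len(base))]
    return [(b + a) / 2 for b, a in zip(base, avg)]

s1 = ["river", "bank"]
s2 = ["money", "bank"]
# Static: identical vector for "bank" in both sentences.
print(static_embed("bank", s1) == static_embed("bank", s2))      # True
# Contextual: the vector for "bank" differs between the sentences.
print(contextual_embed("bank", s1) == contextual_embed("bank", s2))  # False
```

In the paper's pipeline, such contextual vectors (from BERT) are fed into the CNN-BiLSTM classifier instead of fixed embeddings, which is what the transfer learning comparison measures.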
Database: MEDLINE
Main subject: Natural Language Processing / Sentiment Analysis
Study type: Prognostic studies
Limit: Humans
Language: English
Year: 2022
Document type: Article