Building a Korean morphological analyzer using two Korean BERT models.
Choi, Yong-Seok; Park, Yo-Han; Lee, Kong Joo.
Affiliation
  • Choi YS; Department of Radio and Information Communications Engineering, Chungnam National University, Daejeon, South Korea.
  • Park YH; Department of Radio and Information Communications Engineering, Chungnam National University, Daejeon, South Korea.
  • Lee KJ; Department of Radio and Information Communications Engineering, Chungnam National University, Daejeon, South Korea.
PeerJ Comput Sci ; 8: e968, 2022.
Article in En | MEDLINE | ID: mdl-35634098
A morphological analyzer plays an essential role in identifying the functional suffixes of Korean words. The analyzer's input and output differ in both length and surface strings, which makes the task a natural fit for an encoder-decoder architecture. We adopt the Transformer, an encoder-decoder architecture that uses self-attention rather than recurrent connections, to implement a Korean morphological analyzer. Bidirectional Encoder Representations from Transformers (BERT) is one of the most popular pretrained representation models; it encodes a sequence of input words while taking contextual information into account. We initialize the Transformer encoder and decoder with two different Korean BERT models: one pretrained on a raw corpus and the other pretrained on a morphologically analyzed dataset. Implementing a Transformer-based Korean morphological analyzer therefore reduces to a fine-tuning process on a relatively small corpus. A series of experiments showed that parameter initialization from pretrained models can alleviate the chronic shortage of training data and reduce training time. In addition, the experiments let us determine the number of encoder and decoder layers needed to optimize the analyzer's performance.
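The initialization scheme described above can be sketched in PyTorch. The snippet below is purely illustrative and is not the authors' implementation: the two "BERT" models are toy stand-ins with made-up sizes, and the shape-matched weight copying is one common heuristic for warm-starting a decoder from an encoder-only model (the decoder's cross-attention block has no pretrained counterpart, so it stays randomly initialized).

```python
import torch
import torch.nn as nn

# Toy stand-ins for two pretrained Korean BERT models: one trained on a raw
# corpus (used for the encoder) and one trained on a morphologically analyzed
# corpus (used for the decoder). Sizes are illustrative only.
VOCAB, DIM, HEADS, LAYERS = 1000, 64, 4, 2

def make_bert_like():
    # A minimal BERT-like stack: token embeddings + Transformer encoder layers.
    return nn.ModuleDict({
        "embed": nn.Embedding(VOCAB, DIM),
        "layers": nn.ModuleList(
            nn.TransformerEncoderLayer(DIM, HEADS, batch_first=True)
            for _ in range(LAYERS)
        ),
    })

raw_bert = make_bert_like()    # stands in for BERT pretrained on raw text
morph_bert = make_bert_like()  # stands in for BERT pretrained on analyzed text

# Target seq2seq analyzer: encoder initialized from raw_bert, decoder
# initialized from morph_bert wherever parameter names and shapes match.
enc_layers = nn.ModuleList(
    nn.TransformerEncoderLayer(DIM, HEADS, batch_first=True)
    for _ in range(LAYERS)
)
dec_layers = nn.ModuleList(
    nn.TransformerDecoderLayer(DIM, HEADS, batch_first=True)
    for _ in range(LAYERS)
)

# Encoder: every parameter has a pretrained counterpart, so copy all of them.
for tgt, src in zip(enc_layers, raw_bert["layers"]):
    tgt.load_state_dict(src.state_dict())

# Decoder: copy only parameters that exist in both models with the same shape
# (self-attention, feed-forward, and their norms); cross-attention is left as is.
for tgt, src in zip(dec_layers, morph_bert["layers"]):
    tgt_sd, src_sd = tgt.state_dict(), src.state_dict()
    copied = {k: v for k, v in src_sd.items()
              if k in tgt_sd and tgt_sd[k].shape == v.shape}
    tgt_sd.update(copied)
    tgt.load_state_dict(tgt_sd)
```

After this warm start, only the fine-tuning step on the (much smaller) morphologically annotated corpus remains, which is the setup whose benefits the paper's experiments measure.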
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: PeerJ Comput Sci Publication year: 2022 Document type: Article Country of affiliation: South Korea