LAD: Layer-Wise Adaptive Distillation for BERT Model Compression.
Lin, Ying-Jia; Chen, Kuan-Yu; Kao, Hung-Yu.
Affiliation
  • Lin YJ; Department of Computer Science and Information Engineering, National Cheng Kung University, Tainan 70101, Taiwan.
  • Chen KY; Department of Computer Science and Information Engineering, National Cheng Kung University, Tainan 70101, Taiwan.
  • Kao HY; Department of Computer Science and Information Engineering, National Cheng Kung University, Tainan 70101, Taiwan.
Sensors (Basel); 23(3), 2023 Jan 28.
Article in English | MEDLINE | ID: mdl-36772523
ABSTRACT
Recent advances with large-scale pre-trained language models (e.g., BERT) have brought significant potential to natural language processing. However, the large model size hinders their use on IoT and edge devices. Several studies have utilized task-specific knowledge distillation to compress the pre-trained language models. However, to reduce the number of layers in a large model, a sound strategy for distilling knowledge to a student model with fewer layers than the teacher model is lacking. In this work, we present Layer-wise Adaptive Distillation (LAD), a task-specific distillation framework that can be used to reduce the model size of BERT. We design an iterative aggregation mechanism with multiple gate blocks in LAD to adaptively distill layer-wise internal knowledge from the teacher model to the student model. The proposed method enables an effective knowledge transfer process for a student model, without skipping any teacher layers. The experimental results show that both the six-layer and four-layer LAD student models outperform previous task-specific distillation approaches on the GLUE tasks.
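To illustrate the general idea of gated, layer-wise aggregation described in the abstract, the sketch below shows one plausible form of a gate block that mixes teacher-layer hidden states into a target for each student layer. This is a minimal, hypothetical Python/PyTorch sketch; the class and function names (GateBlock, distillation_loss) and the simple softmax gating are assumptions for illustration, not the paper's actual implementation of its iterative aggregation mechanism.

```python
# Hypothetical sketch of layer-wise adaptive distillation with gate blocks.
# Assumes softmax gating over teacher layers; the paper's mechanism may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GateBlock(nn.Module):
    """Learns how much of each teacher layer to mix into one aggregated
    hidden state that a single student layer is trained to match."""
    def __init__(self, num_teacher_layers: int, hidden_size: int):
        super().__init__()
        # One learnable scalar per teacher layer (illustrative choice).
        self.gate_logits = nn.Parameter(torch.zeros(num_teacher_layers))
        self.proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, teacher_hiddens):
        # teacher_hiddens: list of [batch, seq, hidden] tensors, one per teacher layer
        weights = F.softmax(self.gate_logits, dim=0)              # [num_layers]
        stacked = torch.stack(teacher_hiddens, dim=0)             # [L, B, S, H]
        mixed = (weights.view(-1, 1, 1, 1) * stacked).sum(dim=0)  # [B, S, H]
        return self.proj(mixed)

def distillation_loss(student_hiddens, teacher_hiddens, gate_blocks):
    """MSE between each student layer and its gated aggregation of all
    teacher layers, so no teacher layer is skipped outright."""
    loss = 0.0
    for student_h, gate in zip(student_hiddens, gate_blocks):
        target = gate(teacher_hiddens)
        loss = loss + F.mse_loss(student_h, target)
    return loss / len(student_hiddens)
```

In practice, a loss of this kind would be added to the task-specific objective (e.g., cross-entropy on GLUE labels), letting the gates decide, per student layer, how strongly each teacher layer contributes.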
Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: Sensors (Basel) Year: 2023 Document type: Article Country of affiliation: Taiwan