Complementary label learning based on knowledge distillation.
Ying, Peng; Li, Zhongnian; Sun, Renke; Xu, Xinzheng.
Affiliation
  • Ying P; School of Computer Science and Technology, China University of Mining and Technology, Xuzhou 221116, China.
  • Li Z; School of Computer Science and Technology, China University of Mining and Technology, Xuzhou 221116, China.
  • Sun R; School of Computer Science and Technology, China University of Mining and Technology, Xuzhou 221116, China.
  • Xu X; School of Computer Science and Technology, China University of Mining and Technology, Xuzhou 221116, China.
Math Biosci Eng; 20(10): 17905-17918, 2023 Sep 19.
Article in En | MEDLINE | ID: mdl-38052542
ABSTRACT
Complementary label learning (CLL) is a weakly supervised learning method in which each sample is annotated with a class it does not belong to, and the learner must infer its true category from these complementary labels. However, current CLL methods mainly rely on rewriting classification losses and do not fully exploit the supervisory information contained in complementary labels. Enhancing this supervised information is therefore a promising way to improve the performance of CLL. In this paper, we propose a novel framework called Complementary Label Enhancement based on Knowledge Distillation (KDCL) to address the limited attention given to complementary labels. KDCL consists of two deep neural networks: a teacher model and a student model. The teacher model softens the complementary labels to enrich the supervisory information they carry, while the student model learns from the complementary labels softened by the teacher. Both the teacher and the student are trained on data containing only complementary labels. To evaluate the effectiveness of KDCL, we conducted experiments on four datasets (MNIST, F-MNIST, K-MNIST and CIFAR-10) using two teacher-student pairs (LeNet-5+MLP and DenseNet-121+ResNet-18) and three CLL algorithms (PC, FWD and SCL-NL). The experimental results demonstrate that models optimized with KDCL achieve higher accuracy than models trained with complementary labels alone.
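The abstract only sketches the framework, but the teacher-student pipeline it describes can be illustrated with a short, hypothetical PyTorch-style snippet. The complementary-label loss below is the standard forward-correction (FWD) loss for uniformly drawn complementary labels, one of the three CLL algorithms the authors evaluate; the distillation term, the temperature T and the mixing weight alpha are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def fwd_complementary_loss(logits, comp_labels, num_classes):
    """Forward-correction (FWD) loss for uniformly sampled complementary labels.

    Under the uniform assumption, the complementary-label distribution is the
    true-label distribution pushed through a transition matrix Q with zeros on
    the diagonal and 1/(K-1) elsewhere; we apply cross-entropy against the
    complementary label after this transformation.
    """
    K = num_classes
    Q = (torch.ones(K, K) - torch.eye(K)) / (K - 1)
    Q = Q.to(logits.device)
    p = F.softmax(logits, dim=1)   # predicted class probabilities
    p_comp = p @ Q                 # probability assigned to each complementary label
    return F.nll_loss(torch.log(p_comp + 1e-12), comp_labels)

def kdcl_student_loss(student_logits, teacher_logits, comp_labels,
                      num_classes, T=2.0, alpha=0.5):
    """Hypothetical KDCL-style objective: a complementary-label loss on the
    student plus a distillation term matching the teacher's softened outputs.
    T (temperature) and alpha (mixing weight) are assumed hyperparameters."""
    cll = fwd_complementary_loss(student_logits, comp_labels, num_classes)
    # Soft targets from the (frozen) teacher, softened by temperature T.
    soft_targets = F.softmax(teacher_logits.detach() / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    return alpha * cll + (1.0 - alpha) * kd
```

In the setup the abstract describes, the teacher (e.g., DenseNet-121 or LeNet-5) would first be trained on the complementary-label data with a CLL loss, and its softened outputs would then supervise the student (ResNet-18 or MLP) together with the same complementary labels.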
Full text: 1 Collection: 01-internacional Database: MEDLINE Language: En Journal: Math Biosci Eng Year: 2023 Document type: Article Affiliation country: China