MV-MR: Multi-Views and Multi-Representations for Self-Supervised Learning and Knowledge Distillation.
Kinakh, Vitaliy; Drozdova, Mariia; Voloshynovskiy, Slava.
Affiliation
  • Kinakh V; Department of Computer Science, University of Geneva, 1227 Carouge, Switzerland.
  • Drozdova M; Department of Computer Science, University of Geneva, 1227 Carouge, Switzerland.
  • Voloshynovskiy S; Department of Computer Science, University of Geneva, 1227 Carouge, Switzerland.
Entropy (Basel); 26(6); 2024 May 29.
Article in En | MEDLINE | ID: mdl-38920475
ABSTRACT
We present a new method for self-supervised learning and knowledge distillation based on multi-views and multi-representations (MV-MR). MV-MR maximizes the dependence between learnable embeddings from the augmented and non-augmented views, jointly with the dependence between the learnable embedding from the augmented view and multiple non-learnable representations of the non-augmented view. We show that the proposed method enables efficient self-supervised classification and model-agnostic knowledge distillation. Unlike other self-supervised techniques, our approach uses no contrastive learning, clustering, or stop gradients. MV-MR is a generic framework that allows constraints to be imposed on the learnable embeddings by using image multi-representations as regularizers; the same mechanism is used for knowledge distillation. MV-MR provides state-of-the-art self-supervised performance on the STL10 and CIFAR20 datasets in a linear evaluation setup. We also show that a low-complexity ResNet50 model, pretrained with the proposed knowledge distillation from a CLIP ViT model, achieves state-of-the-art performance on the STL10 and CIFAR100 datasets.
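To make the objective described in the abstract concrete, the following is a minimal sketch of an MV-MR-style loss, not the authors' reference implementation. It assumes PyTorch, a single encoder, a list of fixed (non-learnable) image representations such as hand-crafted descriptors, and (squared) distance correlation as the dependence measure; the function names (`mvmr_loss`, `distance_correlation`) and the weighting scheme are hypothetical.

```python
import torch


def distance_correlation(x: torch.Tensor, y: torch.Tensor, eps: float = 1e-9) -> torch.Tensor:
    """Differentiable (biased) squared distance correlation between two batches."""
    def centered_dist(a: torch.Tensor) -> torch.Tensor:
        d = torch.cdist(a, a)  # pairwise Euclidean distances within the batch
        return d - d.mean(dim=0, keepdim=True) - d.mean(dim=1, keepdim=True) + d.mean()

    A, B = centered_dist(x), centered_dist(y)
    dcov2_xy = (A * B).mean()
    dcov2_xx = (A * A).mean()
    dcov2_yy = (B * B).mean()
    return dcov2_xy / (torch.sqrt(dcov2_xx * dcov2_yy) + eps)


def mvmr_loss(encoder, x, augment, phis, lam: float = 1.0) -> torch.Tensor:
    """Maximize dependence (i) between embeddings of the augmented and
    non-augmented views and (ii) between the augmented-view embedding and
    each fixed representation phi(x) of the non-augmented view."""
    z_aug = encoder(augment(x))   # learnable embedding of the augmented view
    z_raw = encoder(x)            # learnable embedding of the non-augmented view

    loss = -distance_correlation(z_aug, z_raw)
    for phi in phis:              # non-learnable multi-representations (regularizers)
        with torch.no_grad():
            r = phi(x).flatten(1)
        loss = loss - lam * distance_correlation(z_aug, r)
    return loss
```

Under the same assumptions, knowledge distillation is obtained by adding a teacher model (e.g., a CLIP ViT image encoder) to `phis`, so the student embedding is regularized toward dependence on the teacher's representation without requiring matching architectures.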
Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Entropy (Basel) Publication year: 2024 Document type: Article Country of affiliation: Switzerland