Pixel Distillation: Cost-flexible Distillation across Image Sizes and Heterogeneous Networks.
Article in English | MEDLINE | ID: mdl-38949946
ABSTRACT
Previous knowledge distillation (KD) methods mostly focus on compressing network architectures, which is insufficient for deployment, since costs such as transmission bandwidth and imaging-equipment requirements also depend on the image size. We therefore propose Pixel Distillation, which extends knowledge distillation to the input level while simultaneously breaking architecture constraints. This scheme enables flexible cost control at deployment, as the system can adjust both the network architecture and the image quality according to the overall resource budget. Specifically, we first propose an input spatial representation distillation (ISRD) mechanism that transfers spatial knowledge from large images to the student's input module, which facilitates stable knowledge transfer between CNNs and ViTs. A Teacher-Assistant-Student (TAS) framework is then established to disentangle pixel distillation into a model compression stage and an input compression stage, which significantly reduces the overall complexity of pixel distillation and the difficulty of distilling intermediate knowledge. Finally, we adapt pixel distillation to object detection via an aligned feature for preservation (AFP) strategy for TAS, which aligns the output dimensions of the detectors at each stage by manipulating the features and anchors of the assistant. Comprehensive experiments on image classification and object detection demonstrate the effectiveness of our method.
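For intuition, a minimal sketch of the input-level distillation idea follows, written in PyTorch. The toy networks, the 1x1 projection, the temperature, and the loss weighting are illustrative assumptions for exposition, not the authors' released implementation: a frozen teacher sees the high-resolution image, the student sees a downsampled copy, and the student's input-module features are aligned to the teacher's spatial features alongside a standard logit-distillation term.

```python
# Hypothetical sketch of input-level (pixel) distillation, assuming PyTorch.
# TinyCNN, the 1x1 projection, and all hyperparameters are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCNN(nn.Module):
    """Toy backbone standing in for a teacher or student network."""
    def __init__(self, width):
        super().__init__()
        self.stem = nn.Conv2d(3, width, kernel_size=3, stride=2, padding=1)  # "input module"
        self.head = nn.Sequential(nn.ReLU(), nn.AdaptiveAvgPool2d(1),
                                  nn.Flatten(), nn.Linear(width, 10))

    def forward(self, x):
        feat = self.stem(x)        # early spatial representation
        return feat, self.head(feat)

teacher = TinyCNN(width=64).eval()               # stands in for a model trained on large images (frozen)
student = TinyCNN(width=32)                      # to be trained on small images

large = torch.randn(4, 3, 224, 224)              # high-resolution input for the teacher
small = F.interpolate(large, size=(112, 112))    # low-resolution input for the student

with torch.no_grad():
    t_feat, t_logits = teacher(large)
s_feat, s_logits = student(small)

# Spatially align the student's input-module features with the teacher's:
# a 1x1 projection matches channels, interpolation matches resolution.
proj = nn.Conv2d(s_feat.shape[1], t_feat.shape[1], kernel_size=1)
s_aligned = F.interpolate(proj(s_feat), size=t_feat.shape[-2:])

spatial_loss = F.mse_loss(s_aligned, t_feat)           # input-level spatial distillation term
kd_loss = F.kl_div(F.log_softmax(s_logits / 4.0, dim=1),
                   F.softmax(t_logits / 4.0, dim=1),
                   reduction="batchmean") * 4.0 ** 2    # classic logit KD term
(spatial_loss + kd_loss).backward()
```

In practice the cited TAS framework splits this into two stages (architecture compression, then input compression); the single-stage loss above only illustrates the cross-resolution feature alignment that the abstract describes.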

Full text: 1 | Database: MEDLINE | Language: English | Journal: IEEE Trans Pattern Anal Mach Intell | Journal subject: Medical Informatics | Year: 2024 | Document type: Article