Evolutionary Multi-Objective One-Shot Filter Pruning for Designing Lightweight Convolutional Neural Network.
Wu, Tao; Shi, Jiao; Zhou, Deyun; Zheng, Xiaolong; Li, Na.
Affiliation
  • Wu T; School of Electronics and Information, Northwestern Polytechnical University, 127 West Youyi Road, Xi'an 710072, China.
  • Shi J; School of Electronics and Information, Northwestern Polytechnical University, 127 West Youyi Road, Xi'an 710072, China.
  • Zhou D; School of Electronics and Information, Northwestern Polytechnical University, 127 West Youyi Road, Xi'an 710072, China.
  • Zheng X; School of Electronics and Information, Northwestern Polytechnical University, 127 West Youyi Road, Xi'an 710072, China.
  • Li N; School of Electronics and Information, Northwestern Polytechnical University, 127 West Youyi Road, Xi'an 710072, China.
Sensors (Basel); 21(17), 2021 Sep 02.
Article in English | MEDLINE | ID: mdl-34502792
ABSTRACT
Deep neural networks have seen rapid development and wide application owing to their impressive performance. However, their complex structure and high computational and storage costs limit their deployment on mobile or embedded devices such as sensor platforms. Neural network pruning is an efficient way to derive a lightweight model from a well-trained, complex deep neural network. In this paper, we propose an evolutionary multi-objective one-shot filter pruning method for designing lightweight convolutional neural networks. First, unlike well-known iterative pruning methods, the one-shot pruning framework performs filter pruning and model fine-tuning only once. Moreover, we formulate a constrained multi-objective filter pruning problem whose two objectives represent the filter pruning ratio and the accuracy of the pruned convolutional neural network, respectively. A non-dominated sorting-based evolutionary multi-objective algorithm is used to solve this problem, yielding a set of Pareto solutions, i.e., a series of pruned models with different trade-offs between compression and accuracy. Finally, several models are uniformly selected from the Pareto set and fine-tuned as the output of our method. The effectiveness of our method was demonstrated in experiments on four designed models, LeNet and AlexNet, where it pruned over 85%, 82%, 75%, 65%, 91% and 68% of the filters, respectively, with little accuracy loss.
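To make the search procedure concrete, below is a minimal, self-contained Python sketch of the bi-objective idea described in the abstract: binary masks over a layer's filters are evolved, two objectives (pruning ratio and an accuracy surrogate) are maximized, and the first non-dominated front plays the role of the Pareto set. This is an illustrative assumption, not the authors' implementation: the per-filter importance scores, the toy accuracy proxy, and all names and hyperparameters (N_FILTERS, POP_SIZE, GENERATIONS) are invented for demonstration; a real setup would evaluate each candidate pruned CNN on validation data.

```python
# Illustrative sketch only -- NOT the paper's implementation. The importance
# scores and the accuracy surrogate below are hypothetical stand-ins.
import random

N_FILTERS = 64       # hypothetical number of filters in one conv layer
POP_SIZE = 20
GENERATIONS = 30

# Hypothetical per-filter importance scores (e.g., L1-norms of filter weights).
random.seed(0)
importance = [random.random() for _ in range(N_FILTERS)]

def objectives(mask):
    """Return (pruning_ratio, accuracy_proxy); both are maximized.

    mask[i] == 0 means filter i is pruned. The 'accuracy' here is a toy
    surrogate: the fraction of total importance that survives pruning.
    """
    ratio = mask.count(0) / N_FILTERS
    kept = sum(s for s, m in zip(importance, mask) if m)
    return ratio, kept / sum(importance)

def dominates(a, b):
    """Pareto dominance for maximization of both objectives."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def non_dominated(pop):
    """First non-dominated front (the core step of non-dominated sorting)."""
    scored = [(ind, objectives(ind)) for ind in pop]
    return [(ind, f) for ind, f in scored
            if not any(dominates(g, f) for _, g in scored if g != f)]

def mutate(mask, p=0.05):
    return [1 - m if random.random() < p else m for m in mask]

def crossover(a, b):
    cut = random.randrange(1, N_FILTERS)
    return a[:cut] + b[cut:]

# One-shot evolutionary search over pruning masks.
population = [[random.randint(0, 1) for _ in range(N_FILTERS)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    offspring = [mutate(crossover(random.choice(population),
                                  random.choice(population)))
                 for _ in range(POP_SIZE)]
    combined = population + offspring
    # Environmental selection: keep the first front, fill up randomly.
    survivors = [ind for ind, _ in non_dominated(combined)][:POP_SIZE]
    while len(survivors) < POP_SIZE:
        survivors.append(random.choice(combined))
    population = survivors

# The final front is a set of trade-off pruned models; in the paper's
# one-shot framework, a few of these would be uniformly selected and
# fine-tuned once, rather than retraining after every pruning step.
for mask, (ratio, acc) in sorted(non_dominated(population),
                                 key=lambda t: t[1][0]):
    print(f"pruning ratio {ratio:.2f}  accuracy proxy {acc:.3f}")
```

Printing the final front shows the characteristic trade-off curve: masks with higher pruning ratios retain less total filter importance, mirroring the compression-versus-accuracy Pareto set the method hands to the fine-tuning stage.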
Full text: 1 Collection: 01-international Database: MEDLINE Main subject: Algorithms / Neural Networks, Computer Language: English Journal: Sensors (Basel) Year: 2021 Document type: Article Country of affiliation: China