UPANets: Learning from the Universal Pixel Attention Networks.
Tseng, Ching-Hsun; Lee, Shin-Jye; Feng, Jianan; Mao, Shengzhong; Wu, Yu-Ping; Shang, Jia-Yu; Zeng, Xiao-Jun.
Affiliation
  • Tseng CH; Department of Computer Science, The University of Manchester, Manchester M13 9PL, UK.
  • Lee SJ; Institute of Management of Technology, National Chiao Tung University, Hsinchu 300, Taiwan.
  • Feng J; School of Software, Yunnan University, Kunming 650504, China.
  • Mao S; Department of Computer Science, The University of Manchester, Manchester M13 9PL, UK.
  • Wu YP; Department of Computer Science, The University of Manchester, Manchester M13 9PL, UK.
  • Shang JY; Department of Computer Science, The University of Manchester, Manchester M13 9PL, UK.
  • Zeng XJ; Department of Computer Science, The University of Manchester, Manchester M13 9PL, UK.
Entropy (Basel) ; 24(9)2022 Sep 04.
Article in En | MEDLINE | ID: mdl-36141129
ABSTRACT
With the successful development of computer vision, building deep convolutional neural networks (CNNs) has become mainstream, given the parameter sharing of convolutional layers. Stacking convolutional layers into a deep structure improves performance, but over-stacking also drives up the GPU resources required. With the recent surge of Transformers in computer vision, this issue has become more severe: a resource-hungry model is hard to deploy on limited hardware or a single consumer-grade GPU. This work therefore addresses these concerns and proposes an efficient yet robust backbone equipped with channel-direction and spatial-direction attention, so that the attention expands the receptive fields of shallow convolutional layers and passes the information to every layer. An attention-boosted network built on already efficient CNNs, Universal Pixel Attention Networks (UPANets), is proposed. Through a series of experiments, UPANets fulfil the purpose of learning global information with fewer resources and outperform many existing SOTAs on CIFAR-{10, 100}.
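The abstract describes two attention directions, one over channels and one over spatial positions, used to widen the effective receptive field of shallow convolutional layers. The sketch below is a simplified NumPy illustration of that general idea, not the authors' UPANets implementation: the function names, shapes, and the specific choice of softmax-weighted channel mixing and per-pixel spatial weighting are assumptions made for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention(x, w):
    # x: (C, H, W) feature map; w: (C, C) learned channel-mixing weights.
    # Each output channel is a softmax-weighted combination of ALL input
    # channels, so even shallow layers see global channel context.
    a = softmax(w, axis=-1)                # (C, C) attention over channels
    c, h, wd = x.shape
    return (a @ x.reshape(c, -1)).reshape(c, h, wd)

def spatial_attention(x, q):
    # x: (C, H, W); q: (C,) query vector (hypothetical, for illustration).
    # A softmax over all H*W positions yields per-pixel weights, which is
    # one way an attention map can expand the receptive field.
    c, h, w = x.shape
    flat = x.reshape(c, -1)                # (C, H*W)
    scores = q @ flat                      # (H*W,) per-pixel relevance
    weights = softmax(scores)              # attention over spatial positions
    return x * weights.reshape(1, h, w)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))
y = channel_attention(x, rng.standard_normal((8, 8)))
z = spatial_attention(y, rng.standard_normal(8))
print(y.shape, z.shape)  # (8, 4, 4) (8, 4, 4)
```

Stacking such lightweight attention on top of convolutional blocks is cheaper than full self-attention over all pixel pairs, which is consistent with the abstract's goal of learning global information with fewer resources.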
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Language: En Year of publication: 2022 Document type: Article