1.
Sensors (Basel) ; 21(19)2021 Oct 02.
Article in English | MEDLINE | ID: mdl-34640921

ABSTRACT

Neural network pruning, an important method for reducing the computational complexity of deep models, is well suited to devices with limited resources. However, most current methods focus on information about the filter itself to prune the network and rarely explore the relationship between feature maps and filters. In this paper, two novel pruning methods are proposed. First, a new pruning method is proposed that reflects the importance of filters by exploring the information in the feature maps. Based on the premise that the more information a feature map carries, the more important it is, the information entropy of the feature maps is used to measure information and thereby evaluate the importance of each filter in the current layer. Furthermore, normalization is used to enable cross-layer comparison. As a result, the network structure is efficiently pruned while its performance is well preserved. Second, we propose a parallel pruning method that combines the entropy-based method above with the slimming pruning method, which yields better results in terms of computational cost. Our methods outperform most state-of-the-art methods in terms of accuracy, parameters, and FLOPs. On ImageNet, we achieve 72.02% top-1 accuracy for ResNet50 with merely 11.41M parameters and 1.12B FLOPs. For DenseNet40 on CIFAR10, we obtain 94.04% accuracy with only 0.38M parameters and 110.72M FLOPs, and our parallel pruning method reduces the parameters and FLOPs to just 0.37M and 100.12M, respectively, with little loss of accuracy.
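The abstract only sketches the scoring rule (per-filter feature-map entropy, normalized for cross-layer comparison). The snippet below is a minimal, illustrative Python/PyTorch sketch of that idea, not the authors' implementation: function names, the histogram bin count, and the min-max normalization are assumptions made here for clarity.

```python
import torch

def filter_importance_by_entropy(feature_maps, num_bins=64, eps=1e-12):
    """Illustrative sketch: score each filter by the Shannon entropy of its
    feature-map activations (higher entropy = more information = more important).

    feature_maps: tensor of shape (N, C, H, W) collected from one conv layer.
    Returns a per-filter importance score of shape (C,).
    """
    n, c, h, w = feature_maps.shape
    # Flatten batch and spatial dims per channel: (C, N*H*W)
    acts = feature_maps.permute(1, 0, 2, 3).reshape(c, -1)
    scores = torch.empty(c)
    for i in range(c):
        # Histogram the activations of filter i and normalize to a probability mass
        hist = torch.histc(acts[i], bins=num_bins)
        p = hist / (hist.sum() + eps)
        scores[i] = -(p * torch.log(p + eps)).sum()  # Shannon entropy
    return scores

def normalize_across_layers(scores, eps=1e-12):
    """Min-max normalization so entropy scores from different layers are comparable."""
    return (scores - scores.min()) / (scores.max() - scores.min() + eps)
```

In such a scheme, scores would typically be computed from activations captured with forward hooks on a calibration batch, and the filters with the lowest normalized scores would be pruned first.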

2.
Sci Rep ; 14(1): 14936, 2024 Jun 28.
Article in English | MEDLINE | ID: mdl-38942894

ABSTRACT

Deep convolutional neural network approaches often assume that the feature response has a Gaussian distribution with a target-centered peak, which can be used to guide target localization and classification. However, this assumption breaks down under progressive interference from other targets and/or background noise, which produces sub-peaks on the tracking response map and causes model drift. In this paper, we propose a feature response regularization approach for sub-peak response suppression and peak response enforcement, aiming to handle progressive interference systematically. Our approach, referred to as Peak Response Regularization (PRR), applies a simple yet efficient method to aggregate and align discriminative features, converting local extremal responses in the discrete feature space into extremal responses in a continuous space and thereby strengthening the localization and representation capability of convolutional features. Experiments on human pose detection, object detection, object tracking, and image classification demonstrate that PRR improves performance on these image tasks with negligible computational cost.
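The abstract describes the goal (suppress sub-peaks, enforce the main peak) without giving the loss. The following Python/PyTorch sketch shows one plausible form such a regularizer could take for a (B, H, W) response map; the window-based masking, parameter names, and weighting are assumptions for illustration, not the paper's formulation.

```python
import torch

def peak_response_regularizer(response, window=5, alpha=1.0):
    """Illustrative sketch: penalize sub-peak energy far from the global peak
    of each response map while rewarding a strong, single peak.

    response: tensor of shape (B, H, W), e.g. a correlation/response map.
    Returns a scalar regularization term to add to the main loss.
    """
    b, h, w = response.shape
    flat = response.view(b, -1)
    peak_val, peak_idx = flat.max(dim=1)
    # Peak coordinates per sample
    ys = torch.div(peak_idx, w, rounding_mode="floor").float()
    xs = (peak_idx % w).float()
    # Distance of every location to its sample's peak
    grid_y = torch.arange(h, device=response.device).view(1, h, 1).float()
    grid_x = torch.arange(w, device=response.device).view(1, 1, w).float()
    dist = torch.sqrt((grid_y - ys.view(b, 1, 1)) ** 2 +
                      (grid_x - xs.view(b, 1, 1)) ** 2)
    outside = (dist > window).float()
    # Penalize positive response energy outside the peak window (sub-peaks),
    # and encourage a large peak value.
    sub_peak_energy = (response.clamp(min=0) * outside).mean()
    return alpha * sub_peak_energy - peak_val.mean()
```

A term like this would be added to the tracking or classification loss so that gradients both flatten spurious sub-peaks and sharpen the target-centered response.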
