Accurate few-shot object counting with Hough matching feature enhancement.
He, Zhiquan; Zheng, Donghong; Wang, Hengyou.
Affiliation
  • He Z; Guangdong Key Laboratory of Intelligent Information Processing, Shenzhen, China.
  • Zheng D; Guangdong Multimedia Information Service Engineering Technology Research Center, Shenzhen University, Shenzhen, China.
  • Wang H; Guangdong Multimedia Information Service Engineering Technology Research Center, Shenzhen University, Shenzhen, China.
Front Comput Neurosci ; 17: 1145219, 2023.
Article in En | MEDLINE | ID: mdl-37065544
ABSTRACT

Introduction:

Given a few exemplars, few-shot object counting aims to count the objects of the corresponding class in a query image. However, when the query image contains many target objects or background interference, occlusion and overlap among targets reduce counting accuracy.

Methods:

To overcome this problem, we propose a novel Hough matching feature enhancement network. First, we extract image features with a fixed convolutional network and refine them through local self-attention; we also design an exemplar feature aggregation module to enhance the commonality of the exemplar features. Then, we build a Hough space to vote for candidate object regions, and the Hough matching outputs reliable similarity maps between the exemplars and the query image. Finally, we augment the query feature with the exemplar features according to the similarity maps, using a cascade structure to enhance the query feature further.
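The final augmentation step can be illustrated with a minimal NumPy sketch: a pooled exemplar feature is matched against every query location, and the resulting map weights how strongly the exemplar feature is injected back into the query feature. This is only a schematic assumption; the cosine similarity here stands in for the paper's Hough-matched similarity maps, and all function names are illustrative.

```python
import numpy as np

def similarity_map(query_feat, exemplar_feat):
    """Cosine similarity between each query location and a pooled exemplar.

    query_feat:    (C, H, W) feature map of the query image.
    exemplar_feat: (C,) pooled feature vector of one exemplar.
    Returns an (H, W) map with values in [-1, 1].
    NOTE: a stand-in for the paper's Hough matching, not its implementation.
    """
    C, H, W = query_feat.shape
    q = query_feat.reshape(C, -1)
    q = q / (np.linalg.norm(q, axis=0, keepdims=True) + 1e-8)
    e = exemplar_feat / (np.linalg.norm(exemplar_feat) + 1e-8)
    return (e @ q).reshape(H, W)

def enhance_query_feature(query_feat, exemplar_feat):
    """Augment the query feature with the exemplar feature, weighted by
    the (clipped) similarity map, so matched regions are strengthened."""
    sim = similarity_map(query_feat, exemplar_feat)
    weight = np.clip(sim, 0.0, None)  # keep only positive matches as votes
    return query_feat + weight[None, :, :] * exemplar_feat[:, None, None]
```

In the paper this augmentation is applied in a cascade, i.e., the enhanced query feature is fed through the matching-and-augmentation stage again to sharpen the response around target objects.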

Results:

Experiments on FSC-147 show that our network outperforms existing methods, improving the mean absolute counting error on the test set from 14.32 to 12.74.

Discussion:

Ablation experiments demonstrate that Hough matching yields more accurate counting than previous matching methods.

Full text: 1 Collections: 01-international Database: MEDLINE Language: En Journal: Front Comput Neurosci Publication year: 2023 Document type: Article