IEEE Trans Neural Netw Learn Syst ; 34(4): 1958-1971, 2023 Apr.
Article in English | MEDLINE | ID: mdl-34464275

ABSTRACT

Visible-infrared person re-identification (VI-ReID) is a challenging matching problem due to the large modality variations between visible and infrared images. Existing approaches usually bridge the modality gap with feature-level constraints alone, ignoring pixel-level variations. Some methods employ a generative adversarial network (GAN) to generate style-consistent images, but this destroys structural information and introduces considerable noise. In this article, we explicitly consider these challenges and formulate a novel spectrum-aware feature augmentation network, named SFANet, for the cross-modality matching problem. Specifically, we propose to fully replace RGB images with grayscale-spectrum images for feature learning. Trained on grayscale-spectrum images, our model markedly reduces the modality discrepancy and captures inner structural relations across the two modalities, making it robust to color variations. At the feature level, we improve the conventional two-stream network by balancing the number of modality-specific and shared convolutional blocks, which preserves the spatial structure of the features. Additionally, a bidirectional tri-constrained top-push ranking (BTTR) loss is embedded in the proposed network to improve discriminability, further boosting matching accuracy. We also introduce an effective dual-linear identification (ID) embedding with batch normalization to model identity-specific information and to stabilize the magnitude of the BTTR loss. Extensive experiments on the SYSU-MM01 and RegDB datasets demonstrate that each component of our framework contributes indispensably and that it achieves highly competitive VI-ReID performance.
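The core preprocessing idea above (feeding the network grayscale-spectrum images in place of RGB, so both visible and infrared inputs share a single spectrum) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a standard ITU-R BT.601 luminance transform with the gray channel replicated three times so the image still fits a backbone that expects 3-channel input; SFANet's exact transform may differ.

```python
import numpy as np

def to_grayscale_spectrum(rgb: np.ndarray) -> np.ndarray:
    """Map an HxWx3 RGB image to a 3-channel grayscale-spectrum image.

    Hypothetical sketch: luminance is computed with BT.601 luma
    weights and replicated across three channels, so visible images
    lose their color information and become spectrally closer to
    (channel-replicated) infrared images.
    """
    weights = np.array([0.299, 0.587, 0.114])   # BT.601 luma coefficients
    gray = rgb @ weights                         # H x W luminance map
    return np.repeat(gray[..., None], 3, axis=2)  # back to H x W x 3

# Usage: a toy "visible" image becomes a modality-neutral input.
img = np.random.rand(4, 4, 3)
gs = to_grayscale_spectrum(img)
```

After this transform, all three channels are identical, so any color-specific cue the matcher might latch onto is removed before feature learning.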
