Asymmetric Transfer Hashing With Adaptive Bipartite Graph Learning.
IEEE Trans Cybern ; 54(1): 533-545, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37018706
ABSTRACT
Thanks to their efficient retrieval speed and low storage consumption, learning-to-hash methods have been widely used in visual retrieval tasks. However, existing hashing methods assume that the query and retrieval samples lie in a homogeneous feature space within the same domain. As a result, they cannot be directly applied to heterogeneous cross-domain retrieval. In this article, we propose a generalized image transfer retrieval (GITR) problem, which encounters two crucial bottlenecks: 1) the query and retrieval samples may come from different domains, leading to an inevitable domain distribution gap; and 2) the features of the two domains may be heterogeneous or misaligned, introducing an additional feature gap. To address the GITR problem, we propose an asymmetric transfer hashing (ATH) framework with unsupervised, semisupervised, and supervised realizations. Specifically, ATH characterizes the domain distribution gap by the discrepancy between two asymmetric hash functions, and minimizes the feature gap with the help of a novel adaptive bipartite graph constructed on cross-domain data. By jointly optimizing the asymmetric hash functions and the bipartite graph, not only can knowledge transfer be achieved, but information loss caused by feature alignment can also be avoided. Meanwhile, to alleviate negative transfer, the intrinsic geometric structure of single-domain data is preserved by incorporating a domain affinity graph. Extensive experiments on both single-domain and cross-domain benchmarks under different GITR subtasks demonstrate the superiority of our ATH method over state-of-the-art hashing methods.
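For intuition, the following is a minimal Python sketch of the asymmetric idea the abstract describes: each domain gets its own hash function, and a bipartite graph over cross-domain codes is re-estimated and used to couple the two. The linear hash projections, the k-nearest-neighbor graph construction, and the alternating least-squares updates are illustrative assumptions for exposition, not the paper's actual objective or optimization procedure.

    import numpy as np

    def hash_codes(X, W):
        # Map features X (n x d) to binary codes via a linear projection W (d x r).
        return np.sign(X @ W)

    # Hypothetical toy setup: query and retrieval domains with heterogeneous
    # feature dimensions (d_q != d_r), as in the GITR problem.
    rng = np.random.default_rng(0)
    n_q, n_r, d_q, d_r, r = 50, 80, 32, 48, 16   # sample counts, feature dims, code length
    Xq = rng.standard_normal((n_q, d_q))          # query-domain features
    Xr = rng.standard_normal((n_r, d_r))          # retrieval-domain features

    # Two asymmetric hash functions, one per domain (Wq/Wr are illustrative names).
    Wq = rng.standard_normal((d_q, r))
    Wr = rng.standard_normal((d_r, r))

    for it in range(10):
        Bq, Br = hash_codes(Xq, Wq), hash_codes(Xr, Wr)

        # Adaptive bipartite graph S (n_q x n_r): link each query point to its
        # k nearest retrieval points in code space, re-estimated every iteration.
        D = ((Bq[:, None, :] - Br[None, :, :]) ** 2).sum(-1)
        S = np.zeros((n_q, n_r))
        knn = np.argsort(D, axis=1)[:, :5]
        np.put_along_axis(S, knn, 1.0 / 5, axis=1)

        # Alternating least-squares-style updates: fit each projection so its
        # codes agree with the graph-weighted codes of the other domain.
        Wq = np.linalg.lstsq(Xq, S @ Br, rcond=None)[0]
        Wr = np.linalg.lstsq(Xr, S.T @ Bq, rcond=None)[0]

Because the two hash functions are optimized jointly through the graph rather than by projecting both domains into one shared space first, no explicit feature alignment step is needed, which is the information-loss point the abstract makes.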

Full text: 1 Databases: MEDLINE Language: English Journal: IEEE Trans Cybern Year: 2024 Document type: Article