Cross-attention-map-based regularization for adversarial domain adaptation.
Neural Netw; 145: 128-138, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34735891
In unsupervised domain adaptation (UDA), many efforts have been made to pull the source domain and the target domain closer through adversarial training. Most methods focus on aligning distributions or features between the source and target domains. However, little attention is paid to interactions at finer-grained levels, such as between classes or samples of the two domains. In contrast to UDA, another transfer learning task, i.e., few-shot learning (FSL), takes full advantage of such finer-grained alignment: many FSL methods model the interaction between samples of support sets and query sets, leading to significant improvements. This motivates us to draw inspiration from these methods and bring such ideas from FSL to UDA. To this end, we first take a closer look at the differences between FSL and UDA and bridge the gap between them through high-confidence sample selection (HCSS). We then propose a cross-attention map generation module (CAMGM) to model interactions between the samples selected by HCSS. Moreover, we propose a simple but efficient method called cross-attention-map-based regularization (CAMR) to regularize the feature maps produced by the feature extractor. Experiments on three challenging datasets demonstrate that CAMR brings solid improvements when added to the original objective; specifically, the proposed CAMR outperforms the original methods by 1% to 2% in most tasks without bells and whistles.
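A minimal sketch of the core idea, assuming a PyTorch setup: cross-attention maps are computed between source and target feature maps (here with plain scaled dot-product similarity over spatial locations) and used as a regularization term added to the adversarial UDA objective. The function names (cross_attention_map, camr_loss), the symmetric MSE form of the penalty, and the tensor shapes are illustrative assumptions, not the paper's exact HCSS/CAMGM/CAMR formulation.

```python
# Illustrative sketch only: the exact CAMGM/CAMR formulation in the paper may differ.
import torch
import torch.nn.functional as F

def cross_attention_map(feat_a, feat_b):
    """Cross-attention map between two (B, C, H, W) feature maps.

    Entry (i, j) scores how strongly spatial location i of feat_a
    attends to spatial location j of feat_b (scaled dot product).
    """
    c = feat_a.size(1)
    qa = feat_a.flatten(2).transpose(1, 2)   # (B, H*W, C) queries from sample A
    kb = feat_b.flatten(2)                   # (B, C, H*W) keys from sample B
    attn = torch.bmm(qa, kb) / (c ** 0.5)    # (B, H*W, H*W) similarity scores
    return F.softmax(attn, dim=-1)

def camr_loss(src_feat, tgt_feat):
    """Cross-attention-map-based regularization (assumed form):
    encourage source->target and target->source attention maps to agree,
    tying the feature maps of paired high-confidence samples together.
    """
    a_st = cross_attention_map(src_feat, tgt_feat)
    a_ts = cross_attention_map(tgt_feat, src_feat)
    return F.mse_loss(a_st, a_ts.transpose(1, 2))

# Toy usage: features of paired high-confidence source/target samples.
if __name__ == "__main__":
    src = torch.randn(4, 256, 7, 7)
    tgt = torch.randn(4, 256, 7, 7)
    print(camr_loss(src, tgt).item())
```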
Full text: 1
Database: MEDLINE
Main subject: Attention / Learning
Language: En
Year of publication: 2022
Document type: Article