Source-Free Progressive Graph Learning for Open-Set Domain Adaptation.
IEEE Trans Pattern Anal Mach Intell; 45(9): 11240-11255, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37097801
Open-set domain adaptation (OSDA) aims to transfer knowledge from a label-rich source domain to a label-scarce target domain while addressing disturbances from irrelevant target classes not present in the source data. However, most OSDA approaches are limited by the lack of a theoretical analysis of the generalization bound, by reliance on the coexistence of source and target data during adaptation, and by failure to accurately estimate the uncertainty of model predictions. To address these limitations, the Progressive Graph Learning (PGL) framework is proposed. PGL decomposes the target hypothesis space into shared and unknown subspaces and progressively pseudo-labels the most confident known samples from the target domain for hypothesis adaptation. PGL guarantees a tight upper bound on the target error by integrating a graph neural network with episodic training and leveraging adversarial learning to close the gap between the source and target distributions. The proposed approach also tackles the more realistic source-free open-set domain adaptation (SF-OSDA) setting, which does not assume the coexistence of source and target domains. In a two-stage framework, the SF-PGL model uniformly selects the most confident target instances from each category at a fixed ratio, and the resulting per-class confidence thresholds weight the classification loss in the adaptation step. The proposed methods are evaluated on benchmark image classification and action recognition datasets, where they demonstrate superiority and flexibility in recognizing both shared and unknown categories. Additionally, balanced pseudo-labeling plays a significant role in improving calibration, making the trained model less prone to over- or under-confident predictions on the target data.
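The balanced pseudo-labeling step described in the abstract (selecting a fixed ratio of the most confident target instances per class, then using the per-class confidence thresholds to weight the adaptation loss) can be sketched as follows. This is a minimal illustration based only on the abstract, not the authors' released implementation: the function names (balanced_pseudo_label, weighted_adaptation_loss), the ratio parameter, and the exact threshold-as-weight scheme are all assumptions.

```python
# Minimal sketch of balanced pseudo-labeling, assuming a PyTorch model.
# Names and the threshold-as-weight scheme are illustrative assumptions.
import torch
import torch.nn.functional as F

def balanced_pseudo_label(logits: torch.Tensor, ratio: float):
    """Select the same fixed ratio of most-confident samples per class.

    logits: (N, C) target-domain predictions over the C shared classes.
    Returns indices of selected samples, their pseudo-labels, and a
    per-class confidence threshold (lowest confidence admitted).
    """
    probs = torch.softmax(logits, dim=1)
    conf, pred = probs.max(dim=1)              # confidence and hard label
    num_classes = logits.size(1)

    sel_idx, sel_lbl = [], []
    thresholds = torch.zeros(num_classes)
    for c in range(num_classes):
        cls_idx = (pred == c).nonzero(as_tuple=True)[0]
        if cls_idx.numel() == 0:
            continue
        k = max(1, int(ratio * cls_idx.numel()))   # fixed per-class ratio
        top_conf, top_pos = conf[cls_idx].topk(k)
        sel_idx.append(cls_idx[top_pos])
        sel_lbl.append(torch.full((k,), c, dtype=torch.long))
        thresholds[c] = top_conf.min()         # least confident admitted sample
    if not sel_idx:                            # nothing selected (e.g., empty batch)
        empty = torch.empty(0, dtype=torch.long)
        return empty, empty.clone(), thresholds
    return torch.cat(sel_idx), torch.cat(sel_lbl), thresholds

def weighted_adaptation_loss(logits, pseudo_labels, thresholds):
    """Cross-entropy where each sample is weighted by its class threshold,
    one plausible reading of 'confidence thresholds ... weight the loss'."""
    per_sample = F.cross_entropy(logits, pseudo_labels, reduction="none")
    return (thresholds[pseudo_labels] * per_sample).mean()
```

Selecting the same fixed ratio per class, rather than applying a single global confidence cutoff, keeps the pseudo-labeled set class-balanced, which is one plausible reason the abstract reports improved calibration and fewer over- or under-confident predictions on the target data.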
Full text: 1
Database: MEDLINE
Study type: Prognostic_studies
Language: En
Publication year: 2023
Document type: Article