IEEE Trans Neural Netw Learn Syst ; 34(10): 6813-6823, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37071516

ABSTRACT

Distribution drift is an important issue for practical applications of machine learning (ML). In particular, in streaming ML, the data distribution may change over time, yielding the problem of concept drift, which degrades the performance of learners trained on outdated data. In this article, we focus on supervised problems in an online nonstationary setting and introduce a novel learner-agnostic algorithm for drift adaptation, namely importance weighting for drift adaptation (IWDA), with the goal of performing efficient retraining of the learner when drift is detected. IWDA incrementally estimates the joint probability density of input and target for the incoming data and, as soon as drift is detected, retrains the learner using importance-weighted empirical risk minimization. The importance weights are computed for all the samples observed so far using the estimated densities, thus exploiting all available information efficiently. After presenting our approach, we provide a theoretical analysis in the abrupt drift setting. Finally, we present numerical simulations illustrating that IWDA competes with, and often outperforms, state-of-the-art stream learning techniques, including adaptive ensemble methods, on both synthetic and real-world data benchmarks.
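
The abstract above outlines the core mechanism: estimate the joint density of inputs and targets before and after a detected drift, weight every sample observed so far by the corresponding density ratio, and retrain with importance-weighted empirical risk minimization. The snippet below is a minimal illustrative sketch of that idea, not the authors' IWDA implementation: the Gaussian KDE density estimator, the Ridge learner, the synthetic data, and all variable names are assumptions made for the example.

```python
# Minimal sketch (assumed setup, not the published IWDA algorithm) of
# importance-weighted retraining after a detected drift.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Data observed before the drift (old concept).
X_old = rng.normal(0.0, 1.0, size=(500, 1))
y_old = 2.0 * X_old[:, 0] + rng.normal(0.0, 0.1, size=500)

# Data observed after the drift: both the input distribution
# and the input-target relationship have changed.
X_new = rng.normal(1.0, 1.0, size=(100, 1))
y_new = -1.0 * X_new[:, 0] + rng.normal(0.0, 0.1, size=100)

# Estimate the joint densities p_old(x, y) and p_new(x, y) with KDE
# (an assumed choice of density estimator for this sketch).
kde_old = gaussian_kde(np.vstack([X_old[:, 0], y_old]))
kde_new = gaussian_kde(np.vstack([X_new[:, 0], y_new]))

# Importance weights for *all* samples seen so far: w_i = p_new / p_old,
# so old samples still consistent with the new concept keep high weight.
X_all = np.vstack([X_old, X_new])
y_all = np.concatenate([y_old, y_new])
joint = np.vstack([X_all[:, 0], y_all])
weights = kde_new(joint) / np.maximum(kde_old(joint), 1e-12)

# Retrain the learner via importance-weighted empirical risk minimization.
model = Ridge(alpha=1.0).fit(X_all, y_all, sample_weight=weights)
print("coefficient after weighted retraining:", model.coef_)
```

In this toy setting the weighted fit is pulled toward the post-drift concept while still reusing any pre-drift samples that remain plausible under the new joint density, which is the intuition behind computing weights over all observed data rather than discarding the old stream.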
