Kullback-Leibler Divergence Metric Learning.
IEEE Trans Cybern; 52(4): 2047-2058, 2022 Apr.
Article in English | MEDLINE | ID: mdl-32721911
ABSTRACT
The Kullback-Leibler divergence (KLD), which is widely used to measure the similarity between two distributions, plays an important role in many applications. In this article, we address the KLD metric-learning task, which aims to learn the best KLD-type metric from the distributions of datasets. Concretely, we first extend the conventional KLD by introducing a linear mapping, and we obtain the best KLD for expressing the similarity of data distributions by optimizing this linear mapping. This improves the expressiveness of the data distributions: distributions in the same class become close, while those in different classes move far apart. The KLD metric learning is then formulated as a minimization problem on the manifold of all positive-definite matrices. To solve this optimization task, we develop an intrinsic steepest-descent method that preserves the manifold structure of the metric at each iteration. Finally, we compare the proposed method with ten popular metric-learning approaches on 3-D object classification and document classification tasks. The experimental results show that the proposed method outperforms all of the other methods.
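The record contains only the abstract, not the paper's formulas or code. As a rough illustration of the quantity the abstract describes, the sketch below computes the KLD between two Gaussian-summarized distributions after applying a linear mapping parameterized by a positive-definite matrix M; the function names, the Gaussian assumption, and the square-root parameterization A = M^(1/2) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_kld(mu1, sigma1, mu2, sigma2):
    """KL( N(mu1, sigma1) || N(mu2, sigma2) ) for multivariate Gaussians."""
    d = mu1.shape[0]
    sigma2_inv = np.linalg.inv(sigma2)
    diff = mu2 - mu1
    return 0.5 * (
        np.trace(sigma2_inv @ sigma1)
        + diff @ sigma2_inv @ diff
        - d
        + np.log(np.linalg.det(sigma2) / np.linalg.det(sigma1))
    )

def mapped_kld(mu1, sigma1, mu2, sigma2, M):
    """KLD after applying a linear map A to both distributions, where
    M = A.T @ A is positive definite; x -> A x sends N(mu, sigma) to
    N(A mu, A sigma A.T). Here A is taken as the symmetric square root
    of M (an illustrative choice, not the paper's parameterization)."""
    w, V = np.linalg.eigh(M)
    A = V @ np.diag(np.sqrt(w)) @ V.T
    return gaussian_kld(A @ mu1, A @ sigma1 @ A.T, A @ mu2, A @ sigma2 @ A.T)

# Toy usage: two 3-D Gaussians; with M = I the mapped KLD reduces to the plain KLD.
rng = np.random.default_rng(0)
mu1, mu2 = rng.normal(size=3), rng.normal(size=3)
B1, B2 = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
sigma1, sigma2 = B1 @ B1.T + np.eye(3), B2 @ B2.T + np.eye(3)
print(mapped_kld(mu1, sigma1, mu2, sigma2, np.eye(3)))
```

In the paper's setting, M itself is the object being learned: it is optimized over the manifold of positive-definite matrices (via the intrinsic steepest-descent method mentioned in the abstract) so that same-class distributions yield small mapped KLD values and different-class distributions yield large ones.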
Subjects

Full text: 1 Database: MEDLINE Main subject: Research Design Language: English Year of publication: 2022 Document type: Article