Minimising the Kullback-Leibler Divergence for Model Selection in Distributed Nonlinear Systems.
Entropy (Basel); 20(2), 2018 Jan 23.
Article in English | MEDLINE | ID: mdl-33265171
The Kullback-Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theoretic measures, transfer entropy and stochastic interaction. More specifically, these measures are applicable when selecting a candidate model for a distributed system, where individual subsystems are coupled via latent variables and observed through a filter. We represent this model as a directed acyclic graph (DAG) that characterises the unidirectional coupling between subsystems. Standard approaches to structure learning are not applicable in this framework due to the hidden variables; however, we can exploit the properties of certain dynamical systems to formulate exact methods based on differential topology. We approach the problem by using reconstruction theorems to derive an analytical expression for the KL divergence of a candidate DAG from the observed dataset. Using this result, we present a scoring function based on transfer entropy to be used as a subroutine in a structure learning algorithm. We then demonstrate its use in recovering the structure of coupled Lorenz and Rössler systems.
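The abstract describes a transfer-entropy-based scoring function used to rank candidate DAGs over coupled subsystems. The sketch below is purely illustrative and is not the authors' method: it assumes a linear-Gaussian transfer entropy estimator (a ratio of residual variances from autoregressive fits), an Euler-integrated pair of Lorenz systems with a hypothetical coupling strength `eps`, and hypothetical helper names (`coupled_lorenz`, `gaussian_te`, `score_dag`). It only shows how summed transfer entropy along candidate edges can be used to compare coupling directions.

```python
# Minimal sketch: scoring candidate coupling directions for two Lorenz
# systems by summed (linear-Gaussian) transfer entropy along DAG edges.
# Estimator, parameters, and helper names are illustrative assumptions,
# not taken from the paper.
import numpy as np

def coupled_lorenz(n_steps=20000, dt=0.01, eps=2.0, seed=0):
    """Euler-integrate two Lorenz systems, with X driving Y via eps."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(3)
    y = rng.standard_normal(3)
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    xs, ys = np.empty(n_steps), np.empty(n_steps)
    for t in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        dy = np.array([sigma * (y[1] - y[0]) + eps * (x[0] - y[0]),
                       y[0] * (rho - y[2]) - y[1],
                       y[0] * y[1] - beta * y[2]])
        x += dt * dx
        y += dt * dy
        xs[t], ys[t] = x[0], y[0]  # observe only the first coordinate
    return xs, ys

def residual_var(target, predictors):
    """Variance of least-squares residuals of target on the predictors."""
    A = np.column_stack([np.ones_like(target)] + predictors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ coef)

def gaussian_te(source, target, k=2):
    """Linear-Gaussian transfer entropy source -> target, history length k."""
    tgt = target[k:]
    tgt_past = [target[k - i - 1:-i - 1] for i in range(k)]
    src_past = [source[k - i - 1:-i - 1] for i in range(k)]
    v_reduced = residual_var(tgt, tgt_past)              # own past only
    v_full = residual_var(tgt, tgt_past + src_past)      # plus source past
    return 0.5 * np.log(v_reduced / v_full)

def score_dag(edges, series):
    """Score a candidate DAG by summed transfer entropy along its edges."""
    return sum(gaussian_te(series[src], series[dst]) for src, dst in edges)

if __name__ == "__main__":
    xs, ys = coupled_lorenz()
    series = {"X": xs, "Y": ys}
    candidates = {"X->Y": [("X", "Y")], "Y->X": [("Y", "X")]}
    scores = {name: score_dag(e, series) for name, e in candidates.items()}
    print(scores)
    print("preferred structure:", max(scores, key=scores.get))
```

Under these assumptions the X->Y candidate should score higher, since X drives Y in the simulated system; the paper's actual scoring function is instead derived analytically from reconstruction theorems for the latent dynamics.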
Full text: 1
Collection: 01-international
Database: MEDLINE
Study type: Prognostic_studies
Language: En
Journal: Entropy (Basel)
Year: 2018
Document type: Article
Country of affiliation: Australia