Results 1 - 4 of 4
1.
Entropy (Basel) ; 26(4)2024 Apr 10.
Article in English | MEDLINE | ID: mdl-38667877

ABSTRACT

Controlling the time evolution of a probability distribution that describes the dynamics of a given complex system is a challenging problem. Success in this endeavour would benefit multiple practical scenarios, e.g., the control of mesoscopic systems. Here, we propose a control approach that blends the model predictive control technique with insights from information geometry theory. Focusing on linear Langevin systems, we use the online optimisation capabilities of model predictive control to determine the system inputs that minimise deviations from the geodesic of the information length over time, ensuring dynamics with minimum "geometric information variability". We validate our methodology through numerical experiments on the Ornstein-Uhlenbeck process and the Kramers equation, demonstrating its feasibility. Furthermore, for the Ornstein-Uhlenbeck process, we analyse the impact on the entropy production and the entropy rate, providing a physical understanding of the effects of minimum-information-variability control.
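To make the idea of tracking the information-length geodesic concrete, the sketch below uses a one-step-horizon, grid-search stand-in for the paper's MPC online optimisation, applied to the mean of a controlled Ornstein-Uhlenbeck process. The parameter values, the reduction to a fixed variance, and the constant reference information rate are all assumptions made for illustration; for a Gaussian PDF with fixed variance the information rate reduces to |dμ/dt|/σ.

```python
import numpy as np

# One-step-horizon, grid-search stand-in for MPC (parameter values are illustrative assumptions).
# Controlled OU mean dynamics: d(mu)/dt = -gamma*mu + u, with the variance held fixed for simplicity.
gamma, sigma = 1.0, 0.5
dt, n_steps = 0.01, 500
Gamma_ref = 0.8                            # constant target information rate (geodesic-like motion)

mu = 2.0
candidates = np.linspace(-5.0, 5.0, 201)   # admissible control inputs searched by brute force

for _ in range(n_steps):
    # For a Gaussian PDF with fixed variance, the information rate reduces to |d(mu)/dt| / sigma.
    dmu = -gamma * mu + candidates
    Gamma = np.abs(dmu) / sigma
    # Cost: squared deviation of the information rate from the geodesic-like reference.
    u = candidates[np.argmin((Gamma - Gamma_ref) ** 2)]
    mu += (-gamma * mu + u) * dt

print(f"mean after {n_steps} steps: {mu:.3f}, last input: {u:.3f}")
```

A full implementation would optimise over a multi-step horizon and include the variance dynamics; this fragment only illustrates how the deviation of the information rate from a constant reference can serve as the cost to be minimised online.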

2.
Entropy (Basel) ; 23(8)2021 Aug 21.
Article in English | MEDLINE | ID: mdl-34441227

ABSTRACT

Information processing is common in complex systems, and information geometric theory provides a useful tool for elucidating the characteristics of non-equilibrium processes, such as rare, extreme events, from the perspective of geometry. In particular, their time evolution can be characterised by the rate (information rate) at which new information is revealed (a new statistical state is accessed). In this paper, we extend this concept and develop a new information-geometric measure of causality by calculating the effect of one variable on the information rate of another variable. We apply the proposed causal information rate to the Kramers equation and compare it with the entropy-based causality measure (information flow). Overall, the causal information rate is a sensitive method for identifying causal relations.
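As a rough illustration of the underlying quantity, the sketch below estimates the information rate Γ² = ∫ dx (∂_t p)²/p of one variable from Monte Carlo histograms, once with and once without its coupling to a second variable. The coupled linear SDE, its parameters, and the "coupling removed" comparison are illustrative assumptions, not the paper's exact construction of the causal information rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def information_rate_sq(samples_t, samples_tdt, dt, bins=60, span=(-5.0, 5.0)):
    """Crude histogram estimate of Gamma^2 = int dx (p(x,t+dt) - p(x,t))^2 / (p(x,t) dt^2)."""
    p1, edges = np.histogram(samples_t, bins=bins, range=span, density=True)
    p2, _ = np.histogram(samples_tdt, bins=bins, range=span, density=True)
    dx = edges[1] - edges[0]
    mask = p1 > 0
    return np.sum((p2[mask] - p1[mask]) ** 2 / p1[mask]) * dx / dt ** 2

def simulate(a, n=200_000, dt=0.01, steps=20, D=0.5):
    """Coupled linear SDE: dx = (-x + a*y) dt + noise, dy = -y dt + noise; returns two consecutive snapshots of x."""
    x = rng.normal(2.0, 0.1, n)
    y = rng.normal(-1.0, 0.1, n)
    snaps = []
    for k in range(steps):
        if k >= steps - 2:
            snaps.append(x.copy())
        x = x + (-x + a * y) * dt + np.sqrt(2 * D * dt) * rng.normal(size=n)
        y = y + (-y) * dt + np.sqrt(2 * D * dt) * rng.normal(size=n)
    return snaps, dt

(s1, s2), dt = simulate(a=1.0)   # y drives x
(r1, r2), _ = simulate(a=0.0)    # same dynamics with the y -> x coupling removed
print("Gamma_x^2 with coupling   :", information_rate_sq(s1, s2, dt))
print("Gamma_x^2 without coupling:", information_rate_sq(r1, r2, dt))
```

The histogram estimate is noisy and only meant to convey how the influence of one variable shows up in the information rate of another; the paper works with the governing Kramers equation directly.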

3.
Entropy (Basel) ; 23(6)2021 May 31.
Article in English | MEDLINE | ID: mdl-34073076

ABSTRACT

Detection and measurement of abrupt changes in a process can provide important tools for decision making in systems management. In particular, they can be used to predict the onset of a sudden event, such as a rare, extreme event, which causes an abrupt dynamical change in the system. Here, we investigate this predictive capability by examining how sensitive an information-geometric method (information length diagnostics) and an entropy-based information-theoretic method (information flow) are to abrupt changes. To this end, we use a non-autonomous Kramers equation with a sudden perturbation added to the system to mimic the onset of a sudden event, and we calculate time-dependent probability density functions (PDFs) and various statistical quantities with the help of numerical simulations. We show that the information length diagnostics predict the onset of a sudden event better than the information flow. Furthermore, we show explicitly that the information flow, like other entropy-based measures, has limitations in detecting perturbations that do not affect the entropy.
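The sketch below illustrates the kind of distinction described here, using a simpler stand-in: a non-autonomous Ornstein-Uhlenbeck model (rather than the Kramers equation) whose drift fixed point jumps at a time t_c. The mean shift leaves the Gaussian differential entropy essentially untouched, while the information rate responds immediately; all parameter values are assumptions for illustration.

```python
import numpy as np

# Non-autonomous OU sketch: at time t_c the drift's fixed point jumps from 0 to x_star,
# mimicking a sudden perturbation (parameter values are illustrative assumptions).
gamma, D = 1.0, 0.5
t_c, x_star = 5.0, 3.0
t = np.linspace(0.0, 10.0, 20001)
dt = t[1] - t[0]

# Gaussian moment equations: d(mu)/dt = -gamma*(mu - target(t)), d(var)/dt = -2*gamma*var + 2*D
mu = np.empty_like(t)
var = np.empty_like(t)
mu[0], var[0] = 1.0, 0.2
for k in range(len(t) - 1):
    target = x_star if t[k] >= t_c else 0.0
    mu[k + 1] = mu[k] - gamma * (mu[k] - target) * dt
    var[k + 1] = var[k] + (-2.0 * gamma * var[k] + 2.0 * D) * dt

sigma = np.sqrt(var)
Gamma = np.sqrt(np.gradient(mu, t) ** 2 / var + 2.0 * np.gradient(sigma, t) ** 2 / var)
entropy = 0.5 * np.log(2.0 * np.pi * np.e * var)   # Gaussian entropy depends on the variance only

k_c = np.searchsorted(t, t_c)
print(f"information rate before / after the jump: {Gamma[k_c - 5]:.3f} / {Gamma[k_c + 5]:.3f}")
print(f"entropy before / after the jump:          {entropy[k_c - 5]:.3f} / {entropy[k_c + 5]:.3f}")
```

Because the Gaussian entropy depends only on the variance, an entropy-based measure is blind to this particular perturbation, which is the limitation highlighted above.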

4.
Entropy (Basel) ; 22(11)2020 Nov 07.
Article in English | MEDLINE | ID: mdl-33287033

ABSTRACT

When studying the behaviour of complex dynamical systems, a statistical formulation can provide useful insights. In particular, information geometry is a promising tool for this purpose. In this paper, we investigate the information length for n-dimensional linear autonomous stochastic processes, providing a basic theoretical framework that can be applied to a large set of problems in engineering and physics. A specific application is made to a harmonically bound particle system with natural oscillation frequency ω, subject to damping γ and Gaussian white noise. We explore how the information length depends on ω and γ, elucidating the role of the critical damping γ = 2ω in information geometry. Furthermore, in the long-time limit, we show that the information length reflects the linear geometry associated with Gaussian statistics in a linear stochastic process.
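A minimal sketch of how the information length can be computed for such a linear process: evolve the Gaussian mean and covariance of the harmonically bound particle and integrate the multivariate-Gaussian information rate Γ² = μ̇ᵀΣ⁻¹μ̇ + ½ Tr[(Σ⁻¹Σ̇)²]. The chosen ω, γ, noise amplitude, and initial condition are illustrative assumptions; γ = 2ω corresponds to the critical damping mentioned above.

```python
import numpy as np

# Information length sketch for a harmonically bound particle (linear Langevin model).
# State z = (x, v); parameter values and the initial Gaussian are illustrative assumptions.
omega, gamma, D = 1.0, 2.0, 0.5      # gamma = 2*omega is the critical damping discussed above
A = np.array([[0.0, 1.0], [-omega ** 2, -gamma]])
Dmat = np.array([[0.0, 0.0], [0.0, D]])

dt, T = 1e-3, 20.0
mu = np.array([2.0, 0.0])
Sigma = np.diag([0.05, 0.05])

L = 0.0
for _ in range(int(T / dt)):
    dmu = A @ mu
    dSigma = A @ Sigma + Sigma @ A.T + 2.0 * Dmat
    Sinv = np.linalg.inv(Sigma)
    # Information rate of a multivariate Gaussian:
    # Gamma^2 = dmu^T Sigma^{-1} dmu + (1/2) Tr[(Sigma^{-1} dSigma)^2]
    Gamma2 = dmu @ Sinv @ dmu + 0.5 * np.trace((Sinv @ dSigma) @ (Sinv @ dSigma))
    L += np.sqrt(Gamma2) * dt
    mu = mu + dmu * dt
    Sigma = Sigma + dSigma * dt

print(f"information length L(T={T}) ~ {L:.3f} for gamma = {gamma}, omega = {omega}")
```

Sweeping γ while holding ω fixed and repeating this integration is one way to probe how the information length depends on the damping.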
