Results 1 - 14 of 14
1.
Chaos ; 28(7): 075306, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30070515

ABSTRACT

A rotor, the rotation center of spiral waves, has been proposed as a causal mechanism that maintains atrial fibrillation (AF) in humans. However, our current understanding of the causality between rotors and spiral waves remains incomplete. One approach to improving our understanding is to determine the relationship between rotors and downward causation from the macro-scale collective behavior of spiral waves to the micro-scale behavior of individual components in a cardiac system. This downward causation is quantifiable as inter-scale information flow that can be used as a surrogate for the mechanism that maintains spiral waves. We used a numerical model of a cardiac system and generated a renormalization group with system descriptions at multiple scales. We found that transfer entropy quantified the upward and downward inter-scale information flow between micro- and macro-scale descriptions of the cardiac system with spiral waves. In addition, because the spatial profiles of transfer entropy and intrinsic transfer entropy were identical, there were no synergistic effects in the system. Furthermore, inter-scale information flow significantly decreased as the description of the system became more macro-scale. Finally, downward information flow was significantly correlated with the number of rotors, but higher numbers of rotors were not necessarily associated with higher downward information flow. This finding contradicts the concept that rotors are the causal mechanism that maintains spiral waves, and may account for the conflicting evidence from clinical studies targeting rotors to eliminate AF.
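The inter-scale information flow above is measured with transfer entropy. A minimal plug-in estimator with history length 1 can be sketched as follows; the toy data (a lagged-copy pair of binary series) and all names are illustrative assumptions, not the study's cardiac model:

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in estimate (bits) of TE(source -> target) with history length 1,
    i.e. I(target_{t+1}; source_t | target_t)."""
    triples = list(zip(target[1:], target[:-1], source[:-1]))
    n = len(triples)
    n_ynyx = Counter(triples)                          # counts of (y_next, y, x)
    n_yx = Counter((y, x) for _, y, x in triples)      # counts of (y, x)
    n_yny = Counter((yn, y) for yn, y, _ in triples)   # counts of (y_next, y)
    n_y = Counter(y for _, y, _ in triples)            # counts of y
    te = 0.0
    for (yn, y, x), k in n_ynyx.items():
        # p(yn|y,x) / p(yn|y) expressed in raw counts
        te += (k / n) * math.log2(k * n_y[y] / (n_yx[(y, x)] * n_yny[(yn, y)]))
    return te

# Toy system: y copies x with a one-step lag, so information flows x -> y only.
rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(10000)]
y = [0] + x[:-1]

te_xy = transfer_entropy(x, y)  # near 1 bit: x fully determines the next y
te_yx = transfer_entropy(y, x)  # near 0 bits: x is i.i.d.
```

On this toy pair the estimator recovers the intended asymmetry, which is the basic property the abstract's inter-scale analysis builds on.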

2.
Chaos ; 28(1): 013109, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29390624

ABSTRACT

Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed.
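The class of processes in question, binary Markov processes, is fully specified by two transition probabilities. A small sampler makes the parameter space concrete; the parameter names (p01, p10) are hypothetical conventions, not the paper's notation:

```python
import random

def sample_binary_markov(p01, p10, n, seed=0):
    """Sample a stationary binary Markov chain.
    p01 = P(X_{t+1}=1 | X_t=0), p10 = P(X_{t+1}=0 | X_t=1)."""
    rng = random.Random(seed)
    # Start from the stationary distribution: pi(1) = p01 / (p01 + p10).
    x = 1 if rng.random() < p01 / (p01 + p10) else 0
    out = [x]
    for _ in range(n - 1):
        if x == 0:
            x = 1 if rng.random() < p01 else 0
        else:
            x = 0 if rng.random() < p10 else 1
        out.append(x)
    return out

chain = sample_binary_markov(0.3, 0.6, 50000)
freq1 = sum(chain) / len(chain)  # close to pi(1) = 0.3 / 0.9 = 1/3
```

The near-i.i.d. regime the abstract highlights corresponds to p01 + p10 close to 1, where the next symbol barely depends on the current one.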

3.
Entropy (Basel) ; 21(1)2018 Dec 24.
Article in English | MEDLINE | ID: mdl-33266728

ABSTRACT

The partial information decomposition (PID) is a promising framework for decomposing a joint random variable into the amount of influence each source variable X_i has on a target variable Y, relative to the other sources. For two sources, influence breaks down into the information that both X_0 and X_1 redundantly share with Y, what X_0 uniquely shares with Y, what X_1 uniquely shares with Y, and finally what X_0 and X_1 synergistically share with Y. Unfortunately, considerable disagreement has arisen as to how these four components should be quantified. Drawing from cryptography, we consider the secret key agreement rate as an operational method of quantifying unique information. Secret key agreement rate comes in several forms, depending upon which parties are permitted to communicate. We demonstrate that three of these four forms are inconsistent with the PID. The remaining form implies certain interpretations as to the PID's meaning; these interpretations are not present in the PID's definition but, we argue, need to be made explicit. Specifically, the use of a consistent PID quantified using a secret key agreement rate naturally induces a directional interpretation of the PID. We further reveal a surprising connection between third-order connected information, two-way secret key agreement rate, and synergy. We also consider difficulties that arise with a popular PID measure in light of the results here as well as from a maximum entropy viewpoint. We close by reviewing the challenges facing the PID.
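The synergy component of the PID can be seen in the canonical XOR example: each source alone carries no information about the target, yet together they determine it completely. A short exact calculation over the four equiprobable outcomes (a standard textbook construction, not taken from this paper) shows this:

```python
import math
from collections import Counter

# Uniform inputs, Y = X0 XOR X1: each (x0, x1) pair has probability 1/4.
events = [(x0, x1, x0 ^ x1) for x0 in (0, 1) for x1 in (0, 1)]

def mutual_info(pairs):
    """I(A;B) in bits from a list of equiprobable (a, b) outcomes."""
    n = len(pairs)
    nab = Counter(pairs)
    na = Counter(a for a, _ in pairs)
    nb = Counter(b for _, b in pairs)
    return sum((k / n) * math.log2(k * n / (na[a] * nb[b]))
               for (a, b), k in nab.items())

i_x0_y = mutual_info([(x0, y) for x0, _, y in events])          # 0 bits
i_x1_y = mutual_info([(x1, y) for _, x1, y in events])          # 0 bits
i_joint = mutual_info([((x0, x1), y) for x0, x1, y in events])  # 1 bit
```

The full bit that appears only jointly is pure synergy; any candidate PID quantification must assign it to the synergistic component.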

4.
Chaos ; 27(1): 013106, 2017 01.
Article in English | MEDLINE | ID: mdl-28147497

ABSTRACT

A spiral wave is a macroscopic dynamical pattern of excitable media that plays an important role in several distinct systems, including the Belousov-Zhabotinsky reaction, seizures in the brain, and lethal arrhythmia in the heart. Because spiral wave dynamics can exhibit a wide spectrum of behaviors, their precise quantification can be challenging. Here we present a hybrid geometric and information-theoretic approach to quantifying spiral wave dynamics. We demonstrate the effectiveness of our approach by applying it to numerical simulations of a two-dimensional excitable medium with different numbers and spatial patterns of spiral waves. We show that, by defining the information flow over the excitable medium, hidden coherent structures emerge that effectively quantify the information transport underlying the spiral wave dynamics. Most importantly, we find that some coherent structures become more clearly defined over a longer observation period. These findings support the validity of our approach to quantitatively characterizing spiral wave dynamics by focusing on information transport. Our approach is computationally efficient and is applicable to many excitable media of interest in distinct physical, chemical, and biological systems. It could ultimately contribute to improved therapy for clinical conditions such as seizures and cardiac arrhythmia by identifying potential targets of interventional therapies.


Subject(s)
Arrhythmias, Cardiac , Computer Simulation , Seizures , Animals , Heart Conduction System , Humans
5.
Phys Rev Lett ; 116(23): 238701, 2016 Jun 10.
Article in English | MEDLINE | ID: mdl-27341264

ABSTRACT

A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time, they can overestimate flow and underestimate influence. We isolate why this is the case and propose several avenues toward alternate measures of information flow. We also address an auxiliary consequence: the proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transfer-like entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.

6.
Sci Adv ; 8(6): eabj1720, 2022 Feb 11.
Article in English | MEDLINE | ID: mdl-35138896

ABSTRACT

Pairwise interactions are fundamental drivers of collective behavior, responsible for group cohesion. The abiding question is how each individual influences the collective. However, time-delayed mutual information and transfer entropy, commonly used to quantify mutual influence in aggregated individuals, can result in misleading interpretations. Here, we show that these information measures have substantial pitfalls in measuring information flow between agents from their trajectories. We decompose the information measures into three distinct modes of information flow to expose the role of individual and group memory in collective behavior. We find that the decomposed information modes between a single pair of agents reveal the nature of mutual influence involving many-body nonadditive interactions, without conditioning on additional agents. The pairwise decomposed modes of information flow facilitate an improved diagnosis of mutual influence in collectives.

7.
Chaos ; 21(3): 037109, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21974672

ABSTRACT

Appealing to several multivariate information measures, some familiar and some new here, we analyze the information embedded in discrete-valued stochastic time series. We dissect the uncertainty of a single observation to demonstrate how the measures' asymptotic behavior sheds structural and semantic light on the generating process's internal information dynamics. The measures scale with the length of the time window, capturing both intensive (rates of growth) and subextensive components. We provide interpretations for these components, developing explicit relationships between them. We also identify the informational component shared between the past and the future that is not contained in a single observation. The existence of this component directly motivates the notion of a process's effective (internal) states and indicates why one must build models.
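The scaling behavior described above starts from the block entropy H(L), whose successive differences estimate the entropy rate. A plug-in sketch on the golden mean process (the no-consecutive-1s process, a standard example in this literature; the sample and names here are illustrative, not from the paper) shows the convergence:

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Plug-in Shannon entropy (bits) of length-L blocks in seq."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((k / n) * math.log2(k / n) for k in Counter(blocks).values())

# Sample the golden mean process: after a 1 the next symbol is forced to 0;
# after a 0 it is 0 or 1 with equal probability. Its entropy rate is 2/3 bit.
rng = random.Random(1)
seq, x = [], 0
for _ in range(200000):
    x = 0 if x == 1 else rng.randint(0, 1)
    seq.append(x)

hmu_est = block_entropy(seq, 4) - block_entropy(seq, 3)  # entropy-rate estimate
```

The difference H(4) - H(3) already sits at the true rate because the process is Markov order 1; for longer-memory processes the slow approach of these differences is exactly the subextensive structure the abstract dissects.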

8.
Chaos ; 21(3): 037107, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21974670

ABSTRACT

We study dynamical reversibility in stationary stochastic processes from an information-theoretic perspective. Extending earlier work on the reversibility of Markov chains, we focus on finitary processes with arbitrarily long conditional correlations. In particular, we examine stationary processes represented or generated by edge-emitting, finite-state hidden Markov models. Surprisingly, we find pervasive temporal asymmetries in the statistics of such stationary processes. As a consequence, the computational resources necessary to generate a process in the forward and reverse temporal directions are generally not the same. In fact, an exhaustive survey indicates that most stationary processes are irreversible. We study the ensuing relations between model topology in different representations, the process's statistical properties, and its reversibility in detail. A process's temporal asymmetry is efficiently captured using two canonical unifilar representations of the generating model, the forward-time and reverse-time ε-machines. We analyze example irreversible processes whose ε-machine representations change size under time reversal, including one which has a finite number of recurrent causal states in one direction, but an infinite number in the opposite direction. From the forward-time and reverse-time ε-machines, we are able to construct a symmetrized, but nonunifilar, generator of a process, the bidirectional machine. Using the bidirectional machine, we show how to directly calculate a process's fundamental information properties, many of which are otherwise only poorly approximated via process samples. The tools we introduce and the insights we offer provide a better understanding of the many facets of reversibility and irreversibility in stochastic processes.
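The Markov-chain baseline this work extends is the detailed-balance test: a stationary chain is reversible exactly when pi_i P_ij = pi_j P_ji for all state pairs. A minimal sketch (the example matrices are hypothetical; power iteration is used only because it settles for these particular examples):

```python
def stationary(P, iters=2000):
    """Stationary distribution of row-stochastic P by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def is_reversible(P, tol=1e-9):
    """Detailed balance: pi_i * P[i][j] == pi_j * P[j][i] for all i, j."""
    pi = stationary(P)
    n = len(P)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < tol
               for i in range(n) for j in range(n))

# A birth-death chain (always reversible) vs. a deterministic 3-cycle,
# whose probability current around the loop breaks detailed balance.
P_rev = [[0.5, 0.5, 0.0],
         [0.25, 0.5, 0.25],
         [0.0, 0.5, 0.5]]
P_cyc = [[0.0, 1.0, 0.0],
         [0.0, 0.0, 1.0],
         [1.0, 0.0, 0.0]]
```

For hidden Markov generators the abstract's point is precisely that no such simple matrix test exists; the forward- and reverse-time ε-machines play the role that pi and P play here.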

9.
Chaos ; 21(3): 037112, 2011 Sep.
Article in English | MEDLINE | ID: mdl-21974675

ABSTRACT

We investigate a stationary process's crypticity, a measure of the difference between its hidden-state information and its observed information, using the causal states of computational mechanics. Here, we motivate crypticity and cryptic order as physically meaningful quantities that monitor how hidden a hidden process is. This is done by recasting previous results on the convergence of block entropy and block-state entropy in a geometric setting, one that is more intuitive and that leads to a number of new results. For example, we connect crypticity to how an observer synchronizes to a process. We show that the block-causal-state entropy is a convex function of block length. We give a complete analysis of spin chains. We present a classification scheme that surveys stationary processes in terms of their possible cryptic and Markov orders. We illustrate related entropy convergence behaviors using a new form of foliated information diagram. Finally, along the way, we provide a variety of interpretations of crypticity and cryptic order to establish their naturalness and pervasiveness. This is also a first step in developing applications in spatially extended and network dynamical systems.

10.
Chaos ; 20(3): 037105, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20887071

ABSTRACT

We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined by both the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. These tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.

11.
Phys Rev E ; 95(6-1): 060102, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28709305

ABSTRACT

One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process's past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
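The one-sided fact the argument builds on, that replacing X by a sufficient statistic about Y loses no mutual information, can be checked exactly on a small example. Here X is a 3-bit string whose parity is sent through a noisy channel to Y, so the parity is a sufficient statistic; the construction and names are illustrative assumptions, not the paper's:

```python
import math
from itertools import product
from collections import defaultdict

eps = 0.1  # channel noise probability (hypothetical)

def mutual_info(joint):
    """I(A;B) in bits from a dict {(a, b): probability}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# X uniform over 3-bit strings; Y is X's parity flipped with probability eps,
# so S = parity(X) is a sufficient statistic of X about Y.
joint_xy, joint_sy = defaultdict(float), defaultdict(float)
for x in product((0, 1), repeat=3):
    s = sum(x) % 2
    for y in (0, 1):
        p = (1 / 8) * ((1 - eps) if y == s else eps)
        joint_xy[(x, y)] += p
        joint_sy[(s, y)] += p

i_xy = mutual_info(joint_xy)
i_sy = mutual_info(joint_sy)  # equal to i_xy: trimming X to S loses nothing
```

The 8-valued X collapses to a single bit S with identical mutual information about Y; the paper's contribution is that the analogous collapse can be done on both sides at once.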

12.
Phys Rev E ; 93(2): 022143, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26986324

ABSTRACT

Modeling a temporal process as if it were Markovian assumes that the present encodes all of a process's history. When this occurs, the present captures all of the dependency between past and future. We recently showed that if one randomly samples in the space of structured processes, this is almost never the case. So, how does the Markov failure come about? That is, how do individual measurements fail to encode the past, and how many are needed to capture dependencies between the past and future? Here, we investigate how much information can be shared between the past and the future but not reflected in the present. We quantify this elusive information, give explicit calculational methods, and outline the consequences, the most important of which is that when the present hides past-future correlation or dependency, we must move beyond sequence-based statistics and build state-based models.
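Information shared by past and future but hidden from the present is a conditional mutual information, I(X_{t-1}; X_{t+1} | X_t). A sketch on a toy order-2 process, where each symbol is a noisy echo of the symbol two steps back so the immediate present carries none of the dependency (the process and names are illustrative assumptions):

```python
import math
import random
from collections import Counter

def cond_mutual_info(triples):
    """Plug-in I(A; C | B) in bits from a list of (a, b, c) samples."""
    n = len(triples)
    nabc = Counter(triples)
    nab = Counter((a, b) for a, b, _ in triples)
    nbc = Counter((b, c) for _, b, c in triples)
    nb = Counter(b for _, b, _ in triples)
    total = 0.0
    for (a, b, c), k in nabc.items():
        total += (k / n) * math.log2(k * nb[b] / (nab[(a, b)] * nbc[(b, c)]))
    return total

# Order-2 toy process: x[t+2] = x[t] flipped with probability 0.1, so the
# even- and odd-index subsequences carry the memory past the present symbol.
rng = random.Random(2)
x = [rng.randint(0, 1), rng.randint(0, 1)]
for _ in range(100000):
    x.append(x[-2] ^ (1 if rng.random() < 0.1 else 0))

sigma = cond_mutual_info(list(zip(x[:-2], x[1:-1], x[2:])))  # about 0.53 bit
```

Half a bit of past-future dependency passes over the single-symbol present entirely, which is the situation in which sequence-based statistics fail and state-based models become necessary.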

13.
Phys Rev E ; 93(2): 022221, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26986345

ABSTRACT

Delay-coordinate reconstruction is a proven modeling strategy for building effective forecasts of nonlinear time series. The first step in this process is the estimation of good values for two parameters, the time delay and the embedding dimension. Many heuristics and strategies have been proposed in the literature for estimating these values. Few, if any, of these methods were developed with forecasting in mind, however, and their results are not optimal for that purpose. Even so, these heuristics, intended for other applications, are routinely used when building delay-coordinate reconstruction-based forecast models. In this paper, we propose an alternate strategy for choosing optimal parameter values for forecast methods that are based on delay-coordinate reconstructions. The basic calculation involves maximizing the shared information between each delay vector and the future state of the system. We illustrate the effectiveness of this method on several synthetic and experimental systems, showing that this metric can be calculated quickly and reliably from a relatively short time series, and that it provides a direct indication of how well a near-neighbor-based forecasting method will work on a given delay reconstruction of that time series. This allows a practitioner to choose reconstruction parameters that avoid any pathologies, regardless of the underlying mechanism, and maximize the predictive information contained in the reconstruction.
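The quantity underlying delay-selection heuristics of this kind is the auto mutual information of the series at increasing lags. A binned plug-in sketch on a hypothetical AR(1) series (not one of the paper's test systems) shows the characteristic decay a practitioner inspects when choosing the time delay:

```python
import math
import random
from collections import Counter

def ami(series, tau, bins=16):
    """Binned plug-in estimate (bits) of auto MI, I(x_t; x_{t+tau})."""
    lo, hi = min(series), max(series)
    w = (hi - lo) / bins or 1.0
    q = [min(int((v - lo) / w), bins - 1) for v in series]  # quantize
    pairs = list(zip(q[:-tau], q[tau:]))
    n = len(pairs)
    nab = Counter(pairs)
    na = Counter(a for a, _ in pairs)
    nb = Counter(b for _, b in pairs)
    return sum((k / n) * math.log2(k * n / (na[a] * nb[b]))
               for (a, b), k in nab.items())

# Toy series: AR(1) with coefficient 0.9, so auto MI decays with the lag.
rng = random.Random(3)
x, v = [], 0.0
for _ in range(50000):
    v = 0.9 * v + rng.gauss(0.0, 1.0)
    x.append(v)

curve = [ami(x, tau) for tau in (1, 2, 5, 10, 20)]  # decreasing in tau
```

For this linear example the curve decays monotonically; classic heuristics take the first local minimum of such a curve as the delay, whereas the paper argues for selecting parameters by the information the full delay vector shares with the future state.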

14.
Article in English | MEDLINE | ID: mdl-24827220

ABSTRACT

We consider two important time scales, the Markov and cryptic orders, that monitor how an observer synchronizes to a finitary stochastic process. We show how to compute these orders exactly and that they are most efficiently calculated from the ε-machine, a process's minimal unifilar model. Surprisingly, though the Markov order is a basic concept from stochastic process theory, it is not a probabilistic property of a process. Rather, it is a topological property and, moreover, it is not computable from any finite-state model other than the ε-machine. Via an exhaustive survey, we close by demonstrating that infinite Markov and infinite cryptic orders are a dominant feature in the space of finite-memory processes. We draw out the roles played in statistical mechanical spin systems by these two complementary length scales.
