Results 1 - 20 of 65
1.
Entropy (Basel) ; 26(3), 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38539737

ABSTRACT

Any given density matrix can be represented as an infinite number of ensembles of pure states. This leads to the natural question of how to uniquely select one out of the many, apparently equally suitable, possibilities. Following Jaynes' information-theoretic perspective, this can be framed as an inference problem. We propose the Maximum Geometric Quantum Entropy Principle to exploit the notions of Quantum Information Dimension and Geometric Quantum Entropy. These allow us to quantify the entropy of fully arbitrary ensembles and select the one that maximizes it. After formulating the principle mathematically, we give the analytical solution to the maximization problem in a number of cases and discuss the physical mechanism behind the emergence of such maximum entropy ensembles.
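
A minimal numerical sketch of the non-uniqueness that motivates the principle: via the Schrodinger/HJW construction, many pure-state ensembles realize the same density matrix, and the Shannon entropies of their weights differ. The paper's geometric quantum entropy and information dimension are not computed here; the qubit state and the random isometry below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit density matrix (diagonal in the computational basis for simplicity).
rho = np.diag([0.7, 0.3]).astype(complex)
evals, evecs = np.linalg.eigh(rho)

def ensemble_from_isometry(U):
    """Unnormalized states |phi_j> = sum_i U_ji sqrt(lam_i) |v_i>; any isometry U works."""
    tilde = (U * np.sqrt(evals)) @ evecs.T      # row j is the j-th unnormalized state
    weights = np.einsum('ji,ji->j', tilde.conj(), tilde).real
    states = tilde / np.sqrt(weights)[:, None]
    return weights, states

def shannon(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Ensemble 1: the eigen-ensemble (U = identity).
w1, s1 = ensemble_from_isometry(np.eye(2, dtype=complex))
# Ensemble 2: a three-element ensemble built from a random 3x2 isometry.
U, _ = np.linalg.qr(rng.normal(size=(3, 2)) + 1j * rng.normal(size=(3, 2)))
w2, s2 = ensemble_from_isometry(U)

for w, s in [(w1, s1), (w2, s2)]:
    rebuilt = sum(p * np.outer(v, v.conj()) for p, v in zip(w, s))
    assert np.allclose(rebuilt, rho)            # every ensemble reproduces rho exactly
    print(f"weights {np.round(w, 3)}  ensemble entropy {shannon(w):.3f} bits")
```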

2.
Chaos ; 32(2): 023103, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35232043

ABSTRACT

We merge computational mechanics' definition of causal states (predictively equivalent histories) with reproducing-kernel Hilbert space (RKHS) representation inference. The result is a widely applicable method that infers causal structure directly from observations of a system's behaviors whether they are over discrete or continuous events or time. A structural representation-a finite- or infinite-state kernel ϵ-machine-is extracted by a reduced-dimension transform that gives an efficient representation of causal states and their topology. In this way, the system dynamics are represented by a stochastic (ordinary or partial) differential equation that acts on causal states. We introduce an algorithm to estimate the associated evolution operator. Paralleling the Fokker-Planck equation, it efficiently evolves causal-state distributions and makes predictions in the original data space via an RKHS functional mapping. We demonstrate these techniques, together with their predictive abilities, on discrete-time, discrete-value infinite Markov-order processes generated by finite-state hidden Markov models with (i) finite or (ii) uncountably infinite causal states and (iii) continuous-time, continuous-value processes generated by thermally driven chaotic flows. The method robustly estimates causal structure in the presence of varying external and measurement noise levels and for very high-dimensional data.
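
A stripped-down sketch of the underlying idea, predictively equivalent histories, for a toy binary process. The paper's RKHS embedding, reduced-dimension transform, and continuous-time machinery are not reproduced; plug-in conditional distributions over the next symbol stand in for the kernel representation, and the golden mean process is an assumed example.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)

# Golden mean process: after a 0 the next symbol is always 1; after a 1 it is 0 or 1 equally.
def sample_golden_mean(n):
    x, prev = [], 1
    for _ in range(n):
        s = 1 if prev == 0 else rng.integers(2)
        x.append(s)
        prev = s
    return np.array(x)

x = sample_golden_mean(200_000)
L = 3  # history length

# Estimate P(next symbol = 1 | last L symbols).
counts = defaultdict(lambda: [0, 0])
for t in range(L, len(x)):
    counts[tuple(x[t - L:t])][x[t]] += 1
preds = {past: c[1] / (c[0] + c[1]) for past, c in counts.items()}

# Group histories whose predictive distributions agree (within sampling tolerance):
# these groups are the estimated causal states.
states = defaultdict(list)
for past, p1 in preds.items():
    states[round(p1, 1)].append(past)

for key, members in sorted(states.items()):
    print(f"P(next=1) ~ {key}: histories {members}")
```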

3.
Chaos ; 32(12): 123115, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36587324

ABSTRACT

Predictive states for stochastic processes are a nonparametric and interpretable construct with relevance across a multitude of modeling paradigms. Recent progress on the self-supervised reconstruction of predictive states from time-series data focused on the use of reproducing kernel Hilbert spaces. Here, we examine how Wasserstein distances may be used to detect predictive equivalences in symbolic data. We compute Wasserstein distances between distributions over sequences ("predictions") using a finite-dimensional embedding of sequences based on the Cantor set for the underlying geometry. We show that exploratory data analysis using the resulting geometry via hierarchical clustering and dimension reduction provides insight into the temporal structure of processes ranging from the relatively simple (e.g., generated by finite-state hidden Markov models) to the very complex (e.g., generated by infinite-state indexed grammars).
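
A minimal sketch of two ingredients named above: a Cantor-set embedding of symbol sequences and the 1-D Wasserstein distance between predictive distributions. The two distributions over length-6 futures below are assumed stand-ins; the paper's hierarchical clustering and dimension-reduction pipeline is not reproduced.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def cantor_embed(seq):
    """Map a binary sequence to a point of the middle-thirds Cantor set."""
    return sum(2 * s / 3 ** (k + 1) for k, s in enumerate(seq))

L = 6
futures = [tuple((i >> k) & 1 for k in range(L)) for i in range(2 ** L)]
points = np.array([cantor_embed(f) for f in futures])

def future_dist(p_one):
    """i.i.d. Bernoulli(p_one) distribution over length-L futures (a stand-in prediction)."""
    probs = np.array([p_one ** sum(f) * (1 - p_one) ** (L - sum(f)) for f in futures])
    return probs / probs.sum()

# Two "predictions": futures dominated by 1s versus balanced futures.
P, Q = future_dist(0.9), future_dist(0.5)
d = wasserstein_distance(points, points, u_weights=P, v_weights=Q)
print(f"Wasserstein distance between the two predictive distributions: {d:.4f}")
```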

4.
Entropy (Basel) ; 24(1), 2022 Jan 06.
Article in English | MEDLINE | ID: mdl-35052116

ABSTRACT

Reservoir computers (RCs) and recurrent neural networks (RNNs) can mimic any finite-state automaton in theory, and some workers demonstrated that this can hold in practice. We test the capability of generalized linear models, RCs, and Long Short-Term Memory (LSTM) RNN architectures to predict the stochastic processes generated by a large suite of probabilistic deterministic finite-state automata (PDFA) in the small-data limit according to two metrics: predictive accuracy and distance to a predictive rate-distortion curve. The latter provides a sense of whether or not the RNN is a lossy predictive feature extractor in the information-theoretic sense. PDFAs provide an excellent performance benchmark in that they can be systematically enumerated, the randomness and correlation structure of their generated processes are exactly known, and their optimal memory-limited predictors are easily computed. With less data than is needed to make a good prediction, LSTMs surprisingly lose at predictive accuracy, but win at lossy predictive feature extraction. These results highlight the utility of causal states in understanding the capabilities of RNNs to predict.
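
A sketch of the benchmarking logic only: a PDFA-generated process (here the golden mean process, an assumed example) has an exactly known optimal predictor, so any learned model can be scored against it across training-set sizes. The GLM, RC, and LSTM models and the rate-distortion analysis are not reproduced; the naive frequency predictor below is just a placeholder competitor.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(2)

def sample_golden_mean(n):
    """Golden mean PDFA: a 0 is always followed by a 1; a 1 is followed by 0 or 1 equally."""
    x, prev = [], 1
    for _ in range(n):
        s = 1 if prev == 0 else rng.integers(2)
        x.append(s)
        prev = s
    return np.array(x)

def optimal_predict(prev):
    # After a 0 the machine forces a 1; after a 1 both symbols are equally likely,
    # so always guessing 1 is an optimal strategy (accuracy 2/3).
    return 1

def empirical_predictor(train, L=2):
    table = defaultdict(lambda: [0, 0])
    for t in range(L, len(train)):
        table[tuple(train[t - L:t])][train[t]] += 1
    return lambda ctx: int(np.argmax(table[tuple(ctx)])) if tuple(ctx) in table else 1

for n_train in [20, 200, 20_000]:                # small-data to large-data regimes
    train, test = sample_golden_mean(n_train), sample_golden_mean(50_000)
    emp = empirical_predictor(train)
    acc_opt = np.mean([optimal_predict(test[t - 1]) == test[t] for t in range(1, len(test))])
    acc_emp = np.mean([emp(test[t - 2:t]) == test[t] for t in range(2, len(test))])
    print(f"n_train={n_train:6d}  optimal={acc_opt:.3f}  empirical={acc_emp:.3f}")
```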

5.
Entropy (Basel) ; 24(11), 2022 Nov 17.
Article in English | MEDLINE | ID: mdl-36421529

ABSTRACT

Inferring models, predicting the future, and estimating the entropy rate of discrete-time, discrete-event processes is well-worn ground. However, a much broader class of discrete-event processes operates in continuous time. Here, we provide new methods for inferring, predicting, and estimating them. The methods rely on an extension of Bayesian structural inference that takes advantage of neural networks' universal approximation power. Based on experiments with complex synthetic data, the methods are competitive with the state-of-the-art for prediction and entropy-rate estimation.

6.
Entropy (Basel) ; 24(9), 2022 Sep 11.
Article in English | MEDLINE | ID: mdl-36141168

ABSTRACT

We compare and contrast three different, but complementary views of "structure" and "pattern" in spatial processes. For definiteness and analytical clarity, we apply all three approaches to the simplest class of spatial processes: one-dimensional Ising spin systems with finite-range interactions. These noncritical systems are well-suited for this study since the change in structure as a function of system parameters is more subtle than that found in critical systems where, at a phase transition, many observables diverge, thereby making the detection of change in structure obvious. This survey demonstrates that the measures of pattern from information theory and computational mechanics differ from known thermodynamic and statistical mechanical functions. Moreover, they capture important structural features that are otherwise missed. In particular, a type of mutual information called the excess entropy-an information theoretic measure of memory-serves to detect ordered, low entropy density patterns. It is superior in several respects to other functions used to probe structure, such as magnetization and structure factors. ϵ-Machines-the main objects of computational mechanics-are seen to be the most direct approach to revealing the (group and semigroup) symmetries possessed by the spatial patterns and to estimating the minimum amount of memory required to reproduce the configuration ensemble, a quantity known as the statistical complexity. Finally, we argue that the information theoretic and computational mechanical analyses of spatial patterns capture the intrinsic computational capabilities embedded in spin systems-how they store, transmit, and manipulate configurational information to produce spatial structure.
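
A toy instance of the information-theoretic measures discussed above: for the zero-field nearest-neighbor Ising chain, configurations form a simple binary Markov chain, so the entropy density and excess entropy estimated from block entropies can be checked against exact values. The coupling value is an arbitrary choice and none of the paper's ϵ-machine analysis is reproduced.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
beta_J = 0.8                                                    # coupling in units of k_B T
p_same = np.exp(beta_J) / (np.exp(beta_J) + np.exp(-beta_J))    # P(s_{i+1} = s_i)

def H2(p):
    q = 1 - p
    return -sum(x * np.log2(x) for x in (p, q) if x > 0)

# Sample a long configuration via the chain's Markov representation.
n = 200_000
spins = np.empty(n, dtype=int)
spins[0] = 1
for i in range(1, n):
    spins[i] = spins[i - 1] if rng.random() < p_same else 1 - spins[i - 1]

def block_entropy(x, L):
    c = Counter(tuple(x[t:t + L]) for t in range(len(x) - L + 1))
    p = np.array(list(c.values()), dtype=float)
    p /= p.sum()
    return -(p * np.log2(p)).sum()

L = 8
H_L, H_Lm1 = block_entropy(spins, L), block_entropy(spins, L - 1)
h_est = H_L - H_Lm1                      # entropy density estimate
E_est = H_L - L * h_est                  # excess entropy estimate
print(f"entropy density: est {h_est:.4f}  exact {H2(1 - p_same):.4f} bits/spin")
print(f"excess entropy : est {E_est:.4f}  exact {1 - H2(1 - p_same):.4f} bits")
```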

7.
Chaos ; 31(8): 083114, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34470245

ABSTRACT

Even simply defined, finite-state generators produce stochastic processes that require tracking an uncountable infinity of probabilistic features for optimal prediction. For processes generated by hidden Markov chains, the consequences are dramatic. Their predictive models are generically infinite state. Until recently, one could determine neither their intrinsic randomness nor structural complexity. The prequel to this work introduced methods to accurately calculate the Shannon entropy rate (randomness) and to constructively determine their minimal (though, infinite) set of predictive features. Leveraging this, we address the complementary challenge of determining how structured hidden Markov processes are by calculating their statistical complexity dimension-the information dimension of the minimal set of predictive features. This tracks the divergence rate of the minimal memory resources required to optimally predict a broad class of truly complex processes.


Subject(s)
Algorithms; Entropy; Markov Chains; Stochastic Processes
8.
Phys Rev Lett ; 125(2): 020601, 2020 Jul 10.
Article in English | MEDLINE | ID: mdl-32701316

ABSTRACT

Quantum coherence allows for reduced-memory simulators of classical processes. Using recent results in single-shot quantum thermodynamics, we derive a minimal work cost rate for quantum simulators that is quasistatically attainable in the limit of asymptotically infinite parallel simulation. Comparing this cost with the classical regime reveals that quantizing classical simulators not only results in memory compression but also in reduced dissipation. We explore this advantage across a suite of representative examples.

9.
Bull Math Biol ; 82(2): 25, 2020 Jan 28.
Article in English | MEDLINE | ID: mdl-31993762

ABSTRACT

Biological sensors must often predict their input while operating under metabolic constraints. However, determining whether a particular sensor has evolved or been designed to be accurate and efficient is challenging. This arises partly because the functional constraints are at cross purposes and partly because quantifying the prediction performance of even in silico sensors can require prohibitively long simulations, especially when highly complex environments drive sensors out of equilibrium. To circumvent these difficulties, we develop new expressions for the prediction accuracy and thermodynamic costs of the broad class of conditionally Markovian sensors subject to complex, correlated (unifilar hidden semi-Markov) environmental inputs in nonequilibrium steady state. Predictive metrics include the instantaneous memory and the total predictable information (the mutual information between present sensor state and input future), while dissipation metrics include power extracted from the environment and the nonpredictive information rate. Success in deriving these formulae relies on identifying the environment's causal states, the input's minimal sufficient statistics for prediction. Using these formulae, we study large random channels and the simplest nontrivial biological sensor model-that of a Hill molecule, characterized by the number of ligands that bind simultaneously-the sensor's cooperativity. We find that the seemingly impoverished Hill molecule can capture an order of magnitude more predictable information than large random channels.


Subject(s)
Models, Biological; Biosensing Techniques/statistics & numerical data; Computational Biology; Computer Simulation; Ion Channels/metabolism; Kinetics; Markov Chains; Mathematical Concepts; Synthetic Biology; Systems Biology; Thermodynamics
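
A static toy version of one quantity from the preceding abstract, the instantaneous memory of a Hill-molecule sensor, computed as the mutual information between a two-level ligand environment and the bound/unbound sensor state. The dynamical, nonequilibrium analysis is not reproduced, and the concentrations and dissociation constant below are illustrative assumptions.

```python
import numpy as np

def H2(p):
    return 0.0 if p in (0.0, 1.0) else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def hill_binding(L, K, n):
    return L ** n / (K ** n + L ** n)

L_low, L_high, K = 0.5, 2.0, 1.0       # ligand concentrations and dissociation constant
for n in [1, 2, 4, 8]:                 # Hill coefficient = cooperativity
    p_low, p_high = hill_binding(L_low, K, n), hill_binding(L_high, K, n)
    p_bound = 0.5 * (p_low + p_high)                       # marginal P(bound)
    # I[sensor; environment] = H(sensor) - H(sensor | environment)
    info = H2(p_bound) - 0.5 * (H2(p_low) + H2(p_high))
    print(f"n={n}: P(bound|low)={p_low:.3f}  P(bound|high)={p_high:.3f}  I = {info:.3f} bits")
```
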
10.
Chaos ; 30(9): 093105, 2020 Sep.
Article in English | MEDLINE | ID: mdl-33003907

ABSTRACT

Szilard's now-famous single-molecule engine was only the first of three constructions he introduced in 1929 to resolve several challenges arising from Maxwell's demon paradox. Given that it has been thoroughly analyzed, we analyze Szilard's remaining two demon models. We show that the second one, though a markedly different implementation employing a population of distinct molecular species and semipermeable membranes, is informationally and thermodynamically equivalent to an ideal gas of the single-molecule engines. One concludes that (i) it reduces to a chaotic dynamical system-called the Szilard Map, a composite of three piecewise linear maps and associated thermodynamic transformations that implement measurement, control, and erasure; (ii) its transitory functioning as an engine that converts disorganized heat energy to work is governed by the Kolmogorov-Sinai entropy rate; (iii) the demon's minimum necessary "intelligence" for optimal functioning is given by the engine's statistical complexity; and (iv) its functioning saturates thermodynamic bounds and so it is a minimal, optimal implementation. We show that Szilard's third construction is rather different and addresses the fundamental issue raised by the first two: the link between entropy production and the measurement task required to implement either of his engines. The analysis gives insight into designing and implementing novel nanoscale information engines by investigating the relationships between the demon's memory, the nature of the "working fluid," and the thermodynamic costs of erasure and measurement.

11.
Chaos ; 29(9): 093128, 2019 Sep.
Article in English | MEDLINE | ID: mdl-31575142

ABSTRACT

Nonlinear dynamical systems with symmetries exhibit a rich variety of behaviors, often described by complex attractor-basin portraits and enhanced and suppressed bifurcations. Symmetry arguments provide a way to study these collective behaviors and to simplify their analysis. The Koopman operator is an infinite dimensional linear operator that fully captures a system's nonlinear dynamics through the linear evolution of functions of the state space. Importantly, in contrast with local linearization, it preserves a system's global nonlinear features. We demonstrate how the presence of symmetries affects the Koopman operator structure and its spectral properties. In fact, we show that symmetry considerations can also simplify finding the Koopman operator approximations using the extended and kernel dynamic mode decomposition methods (EDMD and kernel DMD). Specifically, representation theory allows us to demonstrate that an isotypic component basis induces a block diagonal structure in operator approximations, revealing hidden organization. Practically, if the symmetries are known, the EDMD and kernel DMD methods can be modified to give more efficient computation of the Koopman operator approximation and its eigenvalues, eigenfunctions, and eigenmodes. Rounding out the development, we discuss the effect of measurement noise.
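
A plain EDMD sketch without the symmetry adaptation described above: for a toy map whose dictionary {x1, x2, x1^2} spans an exact Koopman-invariant subspace, least squares on snapshot pairs recovers the Koopman eigenvalues a, b, a^2. The isotypic-component block diagonalization is not implemented, and the map and its parameters are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
a, b, c = 0.9, 0.5, 0.3

def step(x):
    x1, x2 = x
    return np.array([a * x1, b * x2 + c * x1 ** 2])

def psi(x):
    x1, x2 = x
    return np.array([x1, x2, x1 ** 2])        # dictionary of observables

# Snapshot pairs from random initial conditions.
X = rng.uniform(-1, 1, size=(500, 2))
Y = np.array([step(x) for x in X])
Psi_X = np.array([psi(x) for x in X])
Psi_Y = np.array([psi(y) for y in Y])

# EDMD: least-squares solution of Psi_Y = Psi_X @ K.
K, *_ = np.linalg.lstsq(Psi_X, Psi_Y, rcond=None)
print("EDMD eigenvalues:", np.round(np.sort(np.linalg.eigvals(K).real), 4))
print("expected        :", np.sort([a, b, a ** 2]))
```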

12.
Chaos ; 28(7): 075312, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30070532

ABSTRACT

Coherent structures form spontaneously in nonlinear spatiotemporal systems and are found at all spatial scales in natural phenomena from laboratory hydrodynamic flows and chemical reactions to ocean, atmosphere, and planetary climate dynamics. Phenomenologically, they appear as key components that organize the macroscopic behaviors in such systems. Despite a century of effort, they have eluded rigorous analysis and empirical prediction, with progress being made only recently. As a step in this direction, we present a formal theory of coherent structures in fully discrete dynamical field theories. It builds on the notion of structure introduced by computational mechanics, generalizing it to a local spatiotemporal setting. The analysis' main tool employs the local causal states, which are used to uncover a system's hidden spatiotemporal symmetries and which identify coherent structures as spatially localized deviations from those symmetries. The approach is behavior-driven in the sense that it does not rely on directly analyzing spatiotemporal equations of motion, rather it considers only the spatiotemporal fields a system generates. As such, it offers an unsupervised approach to discover and describe coherent structures. We illustrate the approach by analyzing coherent structures generated by elementary cellular automata, comparing the results with an earlier, dynamic-invariant-set approach that decomposes fields into domains, particles, and particle interactions.

13.
Chaos ; 28(3): 033115, 2018 Mar.
Article in English | MEDLINE | ID: mdl-29604656

ABSTRACT

Virtually all questions that one can ask about the behavioral and structural complexity of a stochastic process reduce to a linear algebraic framing of a time evolution governed by an appropriate hidden-Markov process generator. Each type of question-correlation, predictability, predictive cost, observer synchronization, and the like-induces a distinct generator class. Answers are then functions of the class-appropriate transition dynamic. Unfortunately, these dynamics are generically nonnormal, nondiagonalizable, singular, and so on. Tractably analyzing these dynamics relies on adapting the recently introduced meromorphic functional calculus, which specifies the spectral decomposition of functions of nondiagonalizable linear operators, even when the function poles and zeros coincide with the operator's spectrum. Along the way, we establish special properties of the spectral projection operators that demonstrate how they capture the organization of subprocesses within a complex system. Circumventing the spurious infinities of alternative calculi, this leads in the sequel, Part II [P. M. Riechers and J. P. Crutchfield, Chaos 28, 033116 (2018)], to the first closed-form expressions for complexity measures, couched either in terms of the Drazin inverse (negative-one power of a singular operator) or the eigenvalues and projection operators of the appropriate transition dynamic.
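
A small illustration of the object named at the end of the abstract, the Drazin inverse: for an irreducible stochastic matrix T, the singular operator A = I - T has a group (Drazin) inverse computable in closed form as (A + P)^{-1} - P, where P is the rank-one projector onto the stationary distribution (Meyer's construction). The example chain is arbitrary and the paper's complexity-measure formulas are not reproduced.

```python
import numpy as np

T = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])               # an irreducible Markov chain

# Stationary distribution pi (left eigenvector of T for eigenvalue 1).
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
pi /= pi.sum()

A = np.eye(3) - T
P = np.outer(np.ones(3), pi)                  # rank-one projector onto the stationary state
A_drazin = np.linalg.inv(A + P) - P

# Defining properties of the group inverse: A A# A = A, A# A A# = A#, and A A# = A# A.
assert np.allclose(A @ A_drazin @ A, A)
assert np.allclose(A_drazin @ A @ A_drazin, A_drazin)
assert np.allclose(A @ A_drazin, A_drazin @ A)
print("Drazin (group) inverse of I - T:\n", np.round(A_drazin, 4))
```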

14.
Chaos ; 28(3): 033116, 2018 Mar.
Article in English | MEDLINE | ID: mdl-29604661

ABSTRACT

The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.

15.
Chaos ; 28(1): 013109, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29390624

ABSTRACT

Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed.

16.
Nano Lett ; 17(10): 5977-5983, 2017 Oct 11.
Article in English | MEDLINE | ID: mdl-28884582

ABSTRACT

Control of the global parameters of complex networks has been explored experimentally in a variety of contexts. Yet, the more difficult prospect of realizing arbitrary network architectures, especially analog physical networks that provide dynamical control of individual nodes and edges, has remained elusive. Given the vast hierarchy of time scales involved, it also proves challenging to measure a complex network's full internal dynamics. These span from the fastest nodal dynamics to very slow epochs over which emergent global phenomena, including network synchronization and the manifestation of exotic steady states, eventually emerge. Here, we demonstrate an experimental system that satisfies these requirements. It is based upon modular, fully controllable, nonlinear radio frequency nanomechanical oscillators, designed to form the nodes of complex dynamical networks with edges of arbitrary topology. The dynamics of these oscillators and their surrounding network are analog and continuous-valued and can be fully interrogated in real time. They comprise a piezoelectric nanomechanical membrane resonator, which serves as the frequency-determining element within an electrical feedback circuit. This embodiment permits network interconnections entirely within the electrical domain and provides unprecedented node and edge control over a vast region of parameter space. Continuous measurement of the instantaneous amplitudes and phases of every constituent oscillator node are enabled, yielding full and detailed network data without reliance upon statistical quantities. We demonstrate the operation of this platform through the real-time capture of the dynamics of a three-node ring network as it evolves from the uncoupled state to full synchronization.
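
A rough numerical analog, not a model of the device: three phase oscillators on a ring, integrated from an uncoupled epoch into a coupled one, with the Kuramoto order parameter tracking the approach to synchronization. The natural frequencies and coupling strength are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
omega = np.array([1.00, 1.05, 0.95])          # natural frequencies (rad per unit time)
theta = rng.uniform(0, 2 * np.pi, size=3)     # initial phases
adj = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])   # three-node ring
dt, steps = 0.01, 5000

def order_parameter(phases):
    return np.abs(np.exp(1j * phases).mean())

for step in range(steps):
    K = 0.0 if step < steps // 4 else 0.5     # switch coupling on after an uncoupled epoch
    coupling = (adj * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta = theta + dt * (omega + K * coupling)
    if step % 1000 == 0:
        print(f"t={step * dt:6.1f}  K={K:.1f}  order parameter R = {order_parameter(theta):.3f}")
```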

17.
Entropy (Basel) ; 21(1), 2018 Dec 24.
Article in English | MEDLINE | ID: mdl-33266728

ABSTRACT

The partial information decomposition (PID) is a promising framework for decomposing a joint random variable into the amount of influence each source variable X_i has on a target variable Y, relative to the other sources. For two sources, influence breaks down into the information that both X_0 and X_1 redundantly share with Y, what X_0 uniquely shares with Y, what X_1 uniquely shares with Y, and finally what X_0 and X_1 synergistically share with Y. Unfortunately, considerable disagreement has arisen as to how these four components should be quantified. Drawing from cryptography, we consider the secret key agreement rate as an operational method of quantifying unique information. Secret key agreement rate comes in several forms, depending upon which parties are permitted to communicate. We demonstrate that three of these four forms are inconsistent with the PID. The remaining form implies certain interpretations as to the PID's meaning-interpretations not present in PID's definition but that, we argue, need to be explicit. Specifically, the use of a consistent PID quantified using a secret key agreement rate naturally induces a directional interpretation of the PID. We further reveal a surprising connection between third-order connected information, two-way secret key agreement rate, and synergy. We also consider difficulties which arise with a popular PID measure in light of the results here as well as from a maximum entropy viewpoint. We close by reviewing the challenges facing the PID.
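
A sketch of the bookkeeping any PID must respect, on two canonical gates: for XOR the single-source informations vanish while the joint information is one bit (pure synergy), and for duplicated sources both single-source informations equal the joint information (pure redundancy). No particular PID measure, secret-key based or otherwise, is implemented.

```python
import numpy as np

def mi(pxy):
    """Mutual information in bits from a 2-D joint probability table."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return (pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum()

def report(name, p):          # p[x0, x1, y] is the joint distribution
    i0 = mi(p.sum(axis=1))                      # I(X_0; Y)
    i1 = mi(p.sum(axis=0))                      # I(X_1; Y)
    i_joint = mi(p.reshape(4, 2))               # I(X_0, X_1; Y)
    print(f"{name}: I(X0;Y)={i0:.2f}  I(X1;Y)={i1:.2f}  I(X0,X1;Y)={i_joint:.2f} bits")

# XOR: X_0, X_1 uniform i.i.d., Y = X_0 xor X_1  -> purely synergistic.
p_xor = np.zeros((2, 2, 2))
for x0 in (0, 1):
    for x1 in (0, 1):
        p_xor[x0, x1, x0 ^ x1] = 0.25

# RDN: X_0 = X_1 uniform, Y = X_0  -> purely redundant.
p_rdn = np.zeros((2, 2, 2))
for x in (0, 1):
    p_rdn[x, x, x] = 0.5

report("XOR", p_xor)
report("RDN", p_rdn)
```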

18.
Phys Rev Lett ; 118(22): 220602, 2017 Jun 02.
Article in English | MEDLINE | ID: mdl-28621996

ABSTRACT

A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the information-processing second law (IPSL): the physical entropy of the Universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? We identify a minimal, and thus inescapable, transient dissipation of physical information processing, which is not captured by asymptotic rates, but is critical to adaptive thermodynamic processes such as those found in biological systems. As a component of transient dissipation, we also identify an implementation-dependent cost that varies from one physical substrate to another for the same information processing task. Applying these results to producing structured patterns from a structureless information reservoir, we show that "retrodictive" generators achieve the minimal costs. The results establish the thermodynamic toll imposed by a physical system's structure as it comes to optimally transduce information.

19.
Phys Rev Lett ; 116(19): 190601, 2016 May 13.
Article in English | MEDLINE | ID: mdl-27232011

ABSTRACT

We introduce a deterministic chaotic system-the Szilard map-that encapsulates the measurement, control, and erasure protocol by which Maxwellian demons extract work from a heat reservoir. Implementing the demon's control function in a dynamical embodiment, our construction symmetrizes the demon and the thermodynamic system, allowing one to explore their functionality and recover the fundamental trade-off between the thermodynamic costs of dissipation due to measurement and those due to erasure. The map's degree of chaos-captured by the Kolmogorov-Sinai entropy-is the rate of energy extraction from the heat bath. Moreover, an engine's statistical complexity quantifies the minimum necessary system memory for it to function. In this way, dynamical instability in the control protocol plays an essential and constructive role in intelligent thermodynamic systems.

20.
Phys Rev Lett ; 116(23): 238701, 2016 Jun 10.
Article in English | MEDLINE | ID: mdl-27341264

ABSTRACT

A central task in analyzing complex dynamics is to determine the loci of information storage and the communication topology of information flows within a system. Over the last decade and a half, diagnostics for the latter have come to be dominated by the transfer entropy. Via straightforward examples, we show that it and a derivative quantity, the causation entropy, do not, in fact, quantify the flow of information. At one and the same time they can overestimate flow or underestimate influence. We isolate why this is the case and propose several avenues to alternate measures for information flow. We also address an auxiliary consequence: The proliferation of networks as a now-common theoretical model for large-scale systems, in concert with the use of transferlike entropies, has shoehorned dyadic relationships into our structural interpretation of the organization and behavior of complex systems. This interpretation thus fails to include the effects of polyadic dependencies. The net result is that much of the sophisticated organization of complex systems may go undetected.
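
A plug-in estimator of the quantity under critique, the transfer entropy T_{X->Y} = I(Y_{t+1}; X_t | Y_t), on an assumed toy coupled binary chain. It shows only how the measure is computed; the paper's counterexamples involving polyadic dependencies are not reproduced here.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(6)
n = 100_000
x = rng.integers(2, size=n)
y = np.empty(n, dtype=int)
y[0] = 0
for t in range(n - 1):
    # With prob 0.8, Y copies X_t (flipped with prob 0.1); otherwise Y repeats itself.
    y[t + 1] = (x[t] ^ int(rng.random() < 0.1)) if rng.random() < 0.8 else y[t]

# Empirical joint distribution over (Y_{t+1}, X_t, Y_t).
triples = Counter(zip(y[1:], x[:-1], y[:-1]))
p3 = {k: v / (n - 1) for k, v in triples.items()}

def marg(dist, axes):
    out = Counter()
    for k, v in dist.items():
        out[tuple(k[i] for i in axes)] += v
    return out

p_xy = marg(p3, (1, 2))     # p(x_t, y_t)
p_yy = marg(p3, (0, 2))     # p(y_{t+1}, y_t)
p_y = marg(p3, (2,))        # p(y_t)

te = 0.0
for (yp, xt, yt), p in p3.items():
    cond_full = p / p_xy[(xt, yt)]               # p(y_{t+1} | x_t, y_t)
    cond_self = p_yy[(yp, yt)] / p_y[(yt,)]      # p(y_{t+1} | y_t)
    te += p * np.log2(cond_full / cond_self)
print(f"estimated transfer entropy T_X->Y = {te:.3f} bits")
```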
