Results 1 - 4 of 4
1.
Entropy (Basel) ; 22(2)2020 Feb 14.
Article in English | MEDLINE | ID: mdl-33285991

ABSTRACT

The entropy of a pair of random variables is commonly depicted using a Venn diagram. This representation is potentially misleading, however, since the multivariate mutual information can be negative. This paper presents new measures of multivariate information content that can be accurately depicted using Venn diagrams for any number of random variables. These measures complement the existing measures of multivariate mutual information and are constructed by considering the algebraic structure of information sharing. It is shown that the distinct ways in which a set of marginal observers can share their information with a non-observing third party correspond to the elements of a free distributive lattice. The redundancy lattice from partial information decomposition is then independently derived by combining the algebraic structures of joint and shared information content.
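To see concretely why the Venn-diagram picture can mislead, note that the co-information (multivariate mutual information) of three variables can be negative. A minimal sketch in Python using the standard XOR example (the distribution and helper names are illustrative, not from the paper):

```python
from collections import Counter
from itertools import product
from math import log2

# Joint distribution for XOR: X, Y uniform bits, Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def H(indices):
    """Entropy (bits) of the marginal over the given variable indices."""
    marg = Counter()
    for outcome, p in joint.items():
        marg[tuple(outcome[i] for i in indices)] += p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

# Co-information via inclusion-exclusion over the entropies.
co_info = (H([0]) + H([1]) + H([2])
           - H([0, 1]) - H([0, 2]) - H([1, 2])
           + H([0, 1, 2]))
print(co_info)  # -1.0: the "central" Venn region would need negative area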

2.
Entropy (Basel) ; 20(4)2018 Apr 18.
Article in English | MEDLINE | ID: mdl-33265388

ABSTRACT

What are the distinct ways in which a set of predictor variables can provide information about a target variable? When does a variable provide unique information, when do variables share redundant information, and when do variables combine synergistically to provide complementary information? The redundancy lattice from the partial information decomposition of Williams and Beer provided a promising glimpse at the answer to these questions. However, this structure was constructed using a much-criticised measure of redundant information, and despite sustained research, no completely satisfactory replacement measure has been proposed. In this paper, we take a different approach, applying the axiomatic derivation of the redundancy lattice to a single realisation from a set of discrete variables. To overcome the difficulty associated with signed pointwise mutual information, we apply this decomposition separately to the unsigned entropic components of pointwise mutual information, which we refer to as the specificity and ambiguity. This yields a separate redundancy lattice for each component. Then, based upon an operational interpretation of redundancy, we define measures of redundant specificity and ambiguity, enabling us to evaluate the partial information atoms in each lattice. These atoms can be recombined to yield the sought-after multivariate information decomposition. We apply this framework to canonical examples from the literature and discuss the results and the various properties of the decomposition. In particular, the pointwise decomposition using specificity and ambiguity satisfies a chain rule over target variables, which provides new insights into the so-called two-bit-copy example.
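The split the abstract describes follows from the identity i(s;t) = h(t) - h(t|s): the signed pointwise mutual information is a difference of two non-negative surprisals, the specificity h(t) and the ambiguity h(t|s). A short sketch on a toy channel (the joint distribution and names are illustrative, not taken from the paper):

```python
from math import log2

# Joint distribution p(s, t) for a toy noisy channel (illustrative numbers).
p_joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_t = {t: sum(p for (_, tt), p in p_joint.items() if tt == t) for t in (0, 1)}
p_s = {s: sum(p for (ss, _), p in p_joint.items() if ss == s) for s in (0, 1)}

results = {}
for (s, t), p in sorted(p_joint.items()):
    specificity = -log2(p_t[t])        # h(t): surprisal of the target outcome
    ambiguity = -log2(p / p_s[s])      # h(t|s): surprisal remaining given s
    results[(s, t)] = specificity - ambiguity   # pointwise mutual information
    print(f"s={s} t={t}: i(s;t) = {specificity:.3f} - {ambiguity:.3f}"
          f" = {results[(s, t)]:+.3f}")
```

The mismatched realisations (s=0, t=1) yield negative i(s;t) even though both components are non-negative, which is exactly the signedness problem that motivates decomposing the two components separately.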

3.
Entropy (Basel) ; 20(11)2018 Oct 28.
Article in English | MEDLINE | ID: mdl-33266550

ABSTRACT

Information is often described as a reduction of uncertainty associated with a restriction of possible choices. Despite appearing in Hartley's foundational work on information theory, there is a surprising lack of a formal treatment of this interpretation in terms of exclusions. This paper addresses the gap by providing an explicit characterisation of information in terms of probability mass exclusions. It then demonstrates that different exclusions can yield the same amount of information and discusses the insight this provides about how information is shared amongst random variables; the lack of progress in this area is a key barrier preventing us from understanding how information is distributed in complex systems. The paper closes by deriving a decomposition of the mutual information which can distinguish between differing exclusions; this provides surprising insight into the nature of directed information.
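The central observation, that different probability mass exclusions can carry the same amount of information, can be sketched numerically. Below, an observation is modelled simply as a set of excluded outcomes of a uniform target; the setup and function names are illustrative, not the paper's formalism:

```python
from math import log2

# Target T uniform over {'a', 'b', 'c', 'd'}.
prior = {t: 0.25 for t in 'abcd'}

def pointwise_info(excluded, t):
    """Pointwise info about outcome t from an observation that rules out
    a set of outcomes: log of posterior over prior after renormalising."""
    remaining = sum(p for u, p in prior.items() if u not in excluded)
    posterior = prior[t] / remaining
    return log2(posterior / prior[t])

# Different exclusions, same information about T = 'a' (1 bit each):
print(pointwise_info({'c', 'd'}, 'a'))  # excludes c and d -> 1.0
print(pointwise_info({'b', 'd'}, 'a'))  # excludes b and d -> 1.0
```

Both observations exclude half the probability mass, but different halves, yet each provides exactly one bit about the outcome 'a'.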

4.
Cell Rep Phys Sci ; 5(4): 101892, 2024 Apr 17.
Article in English | MEDLINE | ID: mdl-38720789

ABSTRACT

Understanding how different networks relate to each other is key for understanding complex systems. We introduce an intuitive yet powerful framework to disentangle the different ways in which networks can be (dis)similar and complementary to each other. We decompose the shortest paths between nodes into contributions made uniquely by one source network, redundantly by either, or synergistically by both together. Our approach considers the networks' full topology, providing insights at multiple levels of resolution: from global statistics to individual paths. Our framework is widely applicable across scientific domains, from public transport to brain networks. In humans and 124 other species, we demonstrate the prevalence of unique contributions by long-range white-matter fibers in structural brain networks. Across species, efficient communication also relies on significantly greater synergy between long-range and short-range fibers than expected by chance. Our framework could find applications for designing network systems or evaluating existing ones.
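The unique/redundant/synergistic split of shortest paths can be sketched with plain BFS: compare each pair's distance in network A alone, network B alone, and their union. This is an illustrative toy convention, not the paper's exact framework, and the two networks are hypothetical:

```python
from collections import deque
from itertools import combinations

INF = float('inf')

def to_adj(edges, nodes):
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def bfs_dist(adj, src):
    """Unweighted shortest-path distances from src (BFS)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def classify(da, db, du):
    """Illustrative labels only; the published framework is finer-grained."""
    if du < min(da, db):
        return 'synergistic'        # only the combined network is this short
    if da == db == du:
        return 'redundant'          # either network alone suffices
    return 'unique to A' if da == du else 'unique to B'

nodes = [0, 1, 2, 3, 4]
net_a = [(0, 1), (3, 4)]            # hypothetical toy network A
net_b = [(1, 2), (3, 4)]            # hypothetical toy network B
adj_a, adj_b = to_adj(net_a, nodes), to_adj(net_b, nodes)
adj_u = to_adj(net_a + net_b, nodes)

labels = {}
for s, t in combinations(nodes, 2):
    da = bfs_dist(adj_a, s).get(t, INF)
    db = bfs_dist(adj_b, s).get(t, INF)
    du = bfs_dist(adj_u, s).get(t, INF)
    if du == INF:
        continue                    # disconnected even in the combined network
    labels[(s, t)] = classify(da, db, du)
    print(s, t, labels[(s, t)])
```

Here the pair (0, 2) is reachable only when both networks are combined (synergy), (3, 4) is served equally by either (redundancy), and the remaining pairs depend on one network alone.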
