Revealing the Dynamics of Neural Information Processing with Multivariate Information Decomposition.
Newman, Ehren L; Varley, Thomas F; Parakkattu, Vibin K; Sherrill, Samantha P; Beggs, John M.
Affiliation
  • Newman EL; Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA.
  • Varley TF; Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA.
  • Parakkattu VK; Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN 47405, USA.
  • Sherrill SP; Brighton and Sussex Medical School, University of Sussex, Brighton BN1 9RH, UK.
  • Beggs JM; Department of Physics, Indiana University, Bloomington, IN 47405, USA.
Entropy (Basel) ; 24(7)2022 Jul 05.
Article in English | MEDLINE | ID: mdl-35885153
ABSTRACT
The varied cognitive abilities and rich adaptive behaviors enabled by the animal nervous system are often described in terms of information processing. This framing raises the issue of how biological neural circuits actually process information, and some of the most fundamental outstanding questions in neuroscience center on understanding the mechanisms of neural information processing. Classical information theory has long been recognized as a natural framework for studying information processing, and recent advances in the field of multivariate information theory offer new insights into the structure of computation in complex systems. In this review, we provide an introduction to the conceptual and practical issues associated with using multivariate information theory to analyze information processing in neural circuits, and discuss recent empirical work in this vein. Specifically, we provide an accessible introduction to the partial information decomposition (PID) framework. PID reveals redundant, unique, and synergistic modes by which neurons integrate information from multiple sources. We focus particularly on the synergistic mode, which quantifies the "higher-order" information carried in the patterns of multiple inputs and is not reducible to input from any single source. Recent work in a variety of model systems has revealed that synergistic dynamics are ubiquitous in neural circuitry and show reliable structure-function relationships, emerging disproportionately in neuronal rich clubs, downstream of recurrent connectivity, and in the convergence of correlated activity. We draw on the existing literature on higher-order information dynamics in neuronal networks to illustrate the insights that have been gained by taking an information decomposition perspective on neural activity.
Finally, we briefly discuss future promising directions for information decomposition approaches to neuroscience, such as work on behaving animals, multi-target generalizations of PID, and time-resolved local analyses.
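As a concrete illustration of the decomposition the abstract describes, the sketch below computes a two-source PID for a toy XOR "circuit" using the Williams-Beer I_min redundancy measure (one of several redundancy measures in the PID literature; the paper itself may use other measures). All function names here are illustrative, not taken from the article.

```python
import math
from collections import defaultdict

def marginal(p, idx):
    """Marginalize a joint distribution p over the variable positions in idx."""
    m = defaultdict(float)
    for outcome, prob in p.items():
        m[tuple(outcome[i] for i in idx)] += prob
    return dict(m)

def mi(p, a_idx, y_idx):
    """Mutual information I(A; Y) in bits; A and Y are index lists into outcomes."""
    pa, py = marginal(p, a_idx), marginal(p, y_idx)
    pay = marginal(p, a_idx + y_idx)
    total = 0.0
    for key, prob in pay.items():
        a, y = key[:len(a_idx)], key[len(a_idx):]
        total += prob * math.log2(prob / (pa[a] * py[y]))
    return total

def i_min(p, src_idxs, y_idx=(2,)):
    """Williams-Beer redundancy: sum_y p(y) * min_i I_spec(Y=y; X_i)."""
    y_idx = list(y_idx)
    py = marginal(p, y_idx)
    red = 0.0
    for y, py_v in py.items():
        specs = []
        for s in src_idxs:
            px = marginal(p, s)
            pxy = marginal(p, s + y_idx)
            spec = 0.0
            for x, px_v in px.items():
                pj = pxy.get(x + y, 0.0)
                if pj > 0.0:
                    # Specific-information term: p(x|y) * log2( p(y|x) / p(y) )
                    spec += (pj / py_v) * math.log2((pj / px_v) / py_v)
            specs.append(spec)
        red += py_v * min(specs)
    return red

# Toy system: target Y = X1 XOR X2, with uniform independent binary inputs.
p_xor = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

red = i_min(p_xor, [[0], [1]])
unq1 = mi(p_xor, [0], [2]) - red
unq2 = mi(p_xor, [1], [2]) - red
syn = mi(p_xor, [0, 1], [2]) - red - unq1 - unq2
# For XOR, the full bit is synergistic: red = unq1 = unq2 = 0, syn = 1 bit.
```

Neither input alone carries any information about the XOR output, so the entire 1 bit of I(Y; X1, X2) lands in the synergy term, the "higher-order" mode the review emphasizes.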
Full text: 1 Database: MEDLINE Language: English Year of publication: 2022 Document type: Article