ABSTRACT
The relation between electroencephalography (EEG) rhythms, brain functions, and behavioral correlates is well established. Some physiological mechanisms underlying rhythm generation are understood, enabling the replication of brain rhythms in silico. This offers a pathway to explore connections between neural oscillations and specific neuronal circuits, potentially yielding fundamental insights into the functional properties of brain waves. Information-theoretic frameworks, such as Integrated Information Decomposition (Φ-ID), relate dynamical regimes to informational properties, providing deeper insight into the functions of neuronal dynamics. Here, we investigate wave emergence in an excitatory/inhibitory (E/I) balanced network of integrate-and-fire neurons with short-term synaptic plasticity. This model produces a diverse range of EEG-like rhythms, from low δ waves to high-frequency oscillations. Through Φ-ID, we analyze the network's information dynamics and their relation to the different emergent rhythms, elucidating the system's suitability for functions such as robust information transfer, storage, and parallel operation. Furthermore, our study helps to identify regimes that may resemble pathological states due to poor informational properties and high randomness. We find, for example, that in silico β and δ waves are associated with maximum information transfer in inhibitory and excitatory neuron populations, respectively, and that the coexistence of excitatory θ, α, and β waves is associated with information storage. Additionally, we observe that high-frequency oscillations can exhibit either rich or poor informational properties, potentially shedding light on ongoing discussions regarding physiological versus pathological high-frequency oscillations. In summary, our study demonstrates that dynamical regimes with similar oscillations may exhibit vastly different information dynamics.
Characterizing information dynamics within these regimes serves as a potent tool for gaining insight into the functions of complex neuronal networks. Finally, our findings suggest that using information dynamics in the analysis of both model and experimental data could help discriminate between oscillations associated with cognitive functions and those linked to neuronal disorders.
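The abstract classifies simulated rhythms into the conventional EEG bands (δ, θ, α, β, and faster oscillations). A minimal, stdlib-only sketch of how a simulated population signal could be assigned to a band is given below; the band boundaries are the usual textbook conventions, not values taken from the study, and the plain DFT is chosen for self-containedness rather than speed.

```python
import math

# Conventional EEG frequency bands in Hz (textbook boundaries, illustrative).
BANDS = [("delta", 0.5, 4.0), ("theta", 4.0, 8.0), ("alpha", 8.0, 13.0),
         ("beta", 13.0, 30.0), ("gamma", 30.0, 100.0)]

def dominant_band(signal, fs):
    """Return (band_name, peak_frequency) for the strongest spectral
    component of `signal`, sampled at `fs` Hz, using a plain DFT."""
    n = len(signal)
    best_f, best_p = 0.0, -1.0
    for k in range(1, n // 2):                     # skip the DC component
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        if power > best_p:
            best_p, best_f = power, k * fs / n     # bin k maps to k*fs/n Hz
    for name, lo, hi in BANDS:
        if lo <= best_f < hi:
            return name, best_f
    return "unclassified", best_f
```

For instance, a pure 10 Hz sine sampled at 256 Hz is classified as an α-band signal. A production analysis would use an FFT and a proper spectral estimator (e.g., Welch's method) instead of this O(n²) loop.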
Subjects
Computer Simulation, Electroencephalography, Neurological Models, Humans, Brain/physiology, Computational Biology, Brain Waves/physiology, Neurons/physiology, Neuronal Plasticity/physiology, Nerve Net/physiology, Information Theory
ABSTRACT
The properties of complex networked systems arise from the interplay between the dynamics of their elements and the underlying topology. Thus, to understand their behavior, it is crucial to gather as much information as possible about their topological organization. However, in large systems such as neuronal networks, this topology is usually reconstructed from the information encoded in the dynamics on the network, such as spike-train time series, for example by measuring the transfer entropy between system elements. The topological information recovered by these methods does not necessarily capture the connectivity layout, but rather the causal flow of information between elements. New theoretical frameworks, such as Integrated Information Decomposition (Φ-ID), allow one to explore the modes in which information can flow between parts of a system, opening a rich landscape of interactions between network topology, dynamics, and information. Here, we apply Φ-ID to in silico and in vitro data to decompose the usual transfer entropy measure into different modes of information transfer, namely synergistic, redundant, and unique. We demonstrate that unique information transfer is the most relevant measure for uncovering structural topological details from network activity data, whereas redundant information only contributes residual information for this application. Although the retrieved connectivity is still functional, it captures more details of the underlying structural topology because it avoids taking into account emergent higher-order interactions and information redundancy between elements; these are important for functional behavior, but they mask the detection of the direct pairwise interactions that constitute the structural network topology.
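Transfer entropy is the quantity that Φ-ID decomposes. A minimal plug-in estimator for binary spike trains might look like the following sketch; the history length of 1 and binary states are simplifying assumptions of this example, and all names are mine, not the paper's.

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) in bits for two equal-length binary
    time series, using history length 1:
    TE = sum p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) ]."""
    n = len(x) - 1
    c_xyz = Counter((y[t + 1], y[t], x[t]) for t in range(n))  # joint counts
    c_yz = Counter((y[t], x[t]) for t in range(n))             # conditioning counts
    c_yy = Counter((y[t + 1], y[t]) for t in range(n))         # target transition counts
    c_y = Counter(y[t] for t in range(n))                      # target history counts
    te = 0.0
    for (y1, y0, x0), c in c_xyz.items():
        p_joint = c / n
        p_cond_full = c / c_yz[(y0, x0)]          # p(y1 | y0, x0)
        p_cond_self = c_yy[(y1, y0)] / c_y[y0]    # p(y1 | y0)
        te += p_joint * math.log2(p_cond_full / p_cond_self)
    return te
```

If y simply copies x with a one-step delay, the estimate approaches 1 bit; for independent series it stays near zero (up to a small positive plug-in bias). Real spike-train analyses would use longer histories and bias correction.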
Subjects
Computer Simulation, Neurological Models, Nerve Net, Neurons, Nerve Net/physiology, Neurons/physiology, Animals, Entropy, Action Potentials/physiology
ABSTRACT
The higher-order interactions of complex systems, such as the brain, are captured by their simplicial complex structure and have a significant effect on dynamics. However, the existing dynamical models defined on simplicial complexes make the strong assumption that the dynamics resides exclusively on the nodes. Here we formulate the higher-order Kuramoto model which describes the interactions between oscillators placed not only on nodes but also on links, triangles, and so on. We show that higher-order Kuramoto dynamics can lead to an explosive synchronization transition by using an adaptive coupling dependent on the solenoidal and the irrotational component of the dynamics.
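For orientation, the higher-order model generalizes the standard node-based Kuramoto model. The sketch below integrates the classical mean-field (all-to-all) node version only, not the simplicial model introduced in the abstract; the parameter choices (N = 100, Gaussian natural frequencies) are illustrative.

```python
import math
import random

def kuramoto(N=100, K=4.0, T=20.0, dt=0.01, seed=1):
    """Euler-integrate the classical all-to-all Kuramoto model
    d(theta_i)/dt = omega_i + K * r * sin(psi - theta_i)
    and return the final order parameter r in [0, 1]."""
    rng = random.Random(seed)
    theta = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(N)]
    omega = [rng.gauss(0.0, 0.5) for _ in range(N)]   # natural frequencies
    for _ in range(int(T / dt)):
        # mean field: r * e^{i psi} = (1/N) sum_j e^{i theta_j}
        cx = sum(math.cos(t) for t in theta) / N
        sx = sum(math.sin(t) for t in theta) / N
        r, psi = math.hypot(cx, sx), math.atan2(sx, cx)
        theta = [t + dt * (w + K * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    cx = sum(math.cos(t) for t in theta) / N
    sx = sum(math.sin(t) for t in theta) / N
    return math.hypot(cx, sx)
```

Above the critical coupling the population locks (r near 1); for K = 0 the phases stay incoherent (r of order 1/√N). The higher-order version replaces node phases with phases on links and triangles and couples them through boundary operators, which this sketch does not attempt.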
ABSTRACT
Inverse Stochastic Resonance (ISR) is a phenomenon in which the average spiking rate of a neuron exhibits a minimum with respect to noise. ISR has been studied in individual neurons, but here, we investigate ISR in scale-free networks, where the average spiking rate is calculated over the neuronal population. We use Hodgkin-Huxley model neurons with channel noise (i.e., stochastic gating variable dynamics), and the network connectivity is implemented via electrical or chemical connections (i.e., gap junctions or excitatory/inhibitory synapses). We find that the emergence of ISR depends on the interplay between each neuron's intrinsic dynamical structure, channel noise, and network inputs, where the latter in turn depend on network structure parameters. We observe that with weak gap junction or excitatory synaptic coupling, network heterogeneity and sparseness tend to favor the emergence of ISR. With inhibitory coupling, ISR is quite robust. We also identify dynamical mechanisms that underlie various features of this ISR behavior. Our results suggest possible ways of experimentally observing ISR in actual neuronal systems.
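The measurement protocol behind ISR is a sweep of the noise amplitude against the average spiking rate, looking for an interior minimum. The sketch below implements that protocol with a stand-in noisy leaky integrate-and-fire neuron; note that a plain LIF does *not* show ISR (its rate grows monotonically with noise), so to reproduce the minimum one would plug in a model with coexisting resting and spiking states, such as the Hodgkin-Huxley setup used in the study. All parameters here are illustrative.

```python
import math
import random

def lif_rate(sigma, T=2000.0, dt=0.1, tau=20.0, mu=0.8, v_th=1.0, seed=0):
    """Mean firing rate (Hz) of a subthreshold noisy LIF neuron,
    dv = (mu - v)/tau * dt + sigma * sqrt(dt) * xi.
    Stand-in model only: it does NOT exhibit ISR."""
    rng = random.Random(seed)
    v, spikes = 0.0, 0
    for _ in range(int(T / dt)):
        v += dt * (mu - v) / tau + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if v >= v_th:           # threshold crossing -> spike and reset
            spikes += 1
            v = 0.0
    return 1000.0 * spikes / T  # T is in ms

def rate_noise_curve(sigmas, model=lif_rate):
    """ISR detection protocol: sweep the noise amplitude, estimate the
    average spiking rate, and report the noise level that minimizes it."""
    rates = [model(s) for s in sigmas]
    return rates, min(zip(rates, sigmas))[1]
```

With the LIF stand-in, the minimum sits at the boundary of the sweep (smallest noise); an interior minimum at intermediate noise is the signature of ISR.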
Subjects
Action Potentials/physiology, Neurological Models, Nerve Net/physiology, Neurons/physiology, Algorithms, Computational Biology, Humans, Stochastic Processes
ABSTRACT
In recent years, network scientists have directed their interest to the multi-layer character of real-world systems, explicitly considering the structural and dynamical organization of graphs made of diverse layers of connections among their constituents. Most complex systems include multiple subsystems and layers of connectivity and, in many cases, the interdependent components of a system interact through many different channels. This new perspective has indeed proven to be the adequate representation for a wealth of features exhibited by networked systems in the real world. The contributions presented in this Focus Issue cover, from different points of view, the many achievements and still open questions in the field of multi-layer networks, such as: new frameworks and structures to represent and analyze heterogeneous complex systems, different aspects related to synchronization and centrality of complex networks, the interplay between layers, and applications to logistics, biological, social, and technological fields.
Subjects
Theoretical Models, Algorithms
ABSTRACT
Triadic interactions are higher-order interactions which occur when a set of nodes affects the interaction between two other nodes. Examples of triadic interactions are present in the brain when glia modulate the synaptic signals among neuron pairs or when interneuron axo-axonic synapses enable presynaptic inhibition and facilitation, and in ecosystems when one or more species can affect the interaction among two other species. On random graphs, triadic percolation has recently been shown to turn percolation into a fully fledged dynamical process in which the size of the giant component undergoes a route to chaos. However, in many real cases, triadic interactions are local and occur on spatially embedded networks. Here, we show that triadic interactions in spatial networks induce a very complex spatio-temporal modulation of the giant component which gives rise to triadic percolation patterns with significantly different topologies. We classify the observed patterns (stripes, octopus, and small clusters) with topological data analysis and assess their information content (entropy and complexity). Moreover, we illustrate the multistability of the dynamics of the triadic percolation patterns, and we provide a comprehensive phase diagram of the model. These results open new perspectives in percolation, as they demonstrate that in the presence of spatial triadic interactions the giant component can acquire a time-varying topology. Hence, this work provides a theoretical framework that can be applied to model realistic scenarios in which the giant component is time dependent, as in neuroscience.
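At the core of any percolation analysis is measuring the giant component over the currently active edges. A minimal union-find sketch is given below; the triadic regulation rule that flips the active set at each time step is deliberately left out, so this is only the per-step measurement, not the full dynamics.

```python
from collections import Counter

def giant_component_size(n, edges, active):
    """Size of the largest connected component of an n-node graph,
    counting only edges whose flag in `active` is True (union-find
    with path halving)."""
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    for k, (u, v) in enumerate(edges):
        if active[k]:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv             # union the two components
    return max(Counter(find(i) for i in range(n)).values())
```

In a triadic-percolation simulation one would recompute this size after each update of the active set, producing the time series of giant-component sizes whose patterns the abstract classifies.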
ABSTRACT
The last decade has witnessed remarkable progress in our understanding of the brain, mainly based on the scrutiny and modeling of the transmission of activity among neurons across living synapses. A main conclusion, thus far, is that essential features of the mind rely on collective phenomena that emerge from the interaction of many neurons which, together with other mediating cells, form a complex network whose details constantly adapt to its activity and surroundings. In parallel, theoretical and computational studies developed to understand many natural and artificial complex systems have faithfully explained their remarkable emergent features and made precise the role that interaction dynamics and other conditions play in the different collective phenomena they display. Focusing on promising ideas that arise when comparing these neurobiology and physics studies, this perspective article briefly reviews such fascinating scenarios in search of clues about how high-level cognitive processes such as consciousness, intelligence, and identity can emerge. We thus show that basic concepts of physics, such as dynamical phases and non-equilibrium phase transitions, become quite relevant to brain activity, which is determined by factors at the subcellular, cellular, and network levels. We also show how these transitions depend on details of the mechanism processing stimuli in a noisy background and, most importantly, that one may detect them in familiar electroencephalogram (EEG) recordings. Thus, we associate the existence of such phases, which reveal a brain operating at (non-equilibrium) criticality, with the emergence of the most interesting phenomena during memory tasks.
ABSTRACT
We study a network of synaptic relations mingling excitatory and inhibitory neuron nodes that displays oscillations quite similar to electroencephalogram (EEG) brain waves, and identify abrupt variations brought about by swift synaptic mediations. We conclude that corresponding changes in EEG series likely come from the slowdown of activity in neuron populations due to synaptic restrictions. The latter generates an imbalance between excitation and inhibition, causing a quick, explosive increase of excitatory activity, which turns out to be a (first-order) transition among dynamic mental phases. Moreover, near this phase transition, our model system exhibits waves with a strong component in the so-called delta-theta domain that coexist with fast oscillations. These findings provide a simple explanation for the delta-gamma and theta-gamma modulation observed in actual brains, and open a versatile path toward a deeper understanding of large amounts of apparently erratic, easily accessible brain data.
ABSTRACT
The interplay between structure and function affects the emerging properties of many natural systems. Here we use an adaptive neural network model that couples activity and topological dynamics and reproduces the experimental temporal profiles of synaptic density observed in the brain. We prove that the existence of a transient period of relatively high synaptic connectivity is critical for the development of the system in the presence of noise, such that the resulting network can recover stored memories. Moreover, we show that intermediate synaptic densities provide optimal developmental paths with minimum energy consumption, and that ultimately it is the transient heterogeneity in the network that determines its evolution. These results could explain why the pruning curves observed in actual brain areas present their characteristic temporal profiles, and they also suggest new design strategies for building biologically inspired neural networks with particular information-processing capabilities.
Subjects
Brain, Neural Networks (Computer)
ABSTRACT
Why are most empirical networks, with the prominent exception of social ones, generically degree-degree anticorrelated? To answer this long-standing question, we define the ensemble of correlated networks and obtain the associated Shannon entropy. Maximum entropy can correspond to either assortative (correlated) or disassortative (anticorrelated) configurations, but in the case of highly heterogeneous, scale-free networks a certain disassortativity is predicted--offering a parsimonious explanation for the question above. Our approach provides a neutral model from which, in the absence of further knowledge regarding network evolution, one can obtain the expected value of correlations. When empirical observations deviate from the neutral predictions--as happens for social networks--one can then infer that there are specific correlating mechanisms at work.
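Degree-degree correlations of an empirical network are usually quantified with Newman's assortativity coefficient: the Pearson correlation of the degrees found at the two ends of an edge. A stdlib-only sketch follows; it correlates plain degrees rather than excess (remaining) degrees, which gives the same coefficient because Pearson correlation is invariant under subtracting a constant.

```python
from collections import Counter

def degree_assortativity(edges):
    """Pearson correlation of the degrees at either end of an edge
    (Newman's assortativity coefficient), for an undirected edge list."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    xs, ys = [], []
    for u, v in edges:            # count each edge in both directions
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5
```

A star graph is maximally disassortative (coefficient -1), and a four-node path gives -0.5; negative values of this kind are exactly the generic disassortativity the abstract seeks to explain.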
ABSTRACT
Here we study the emergence of chimera states, a recently reported phenomenon referring to the coexistence of synchronized and unsynchronized dynamical units, in a population of Morris-Lecar neurons which are coupled by both electrical and chemical synapses, constituting a hybrid synaptic architecture, as in actual brain connectivity. This scheme consists of a nonlocal network where the nearest-neighbor neurons are coupled by electrical synapses, while the synapses from more distant neurons are of the chemical type. We demonstrate that peculiar dynamical behaviors, including chimera states and traveling waves, exist in such a hybrid coupled neural system, and analyze how the relative abundance of chemical and electrical synapses affects the features of chimera and different synchrony states (i.e., incoherent, traveling wave, and coherent) and the regions in the space of relevant parameters for their emergence. Additionally, we show that, when the relative population of chemical synapses increases further, a new intriguing chaotic dynamical behavior appears above the region for chimera states. This is characterized by the coexistence of two distinct synchronized states with different amplitudes, and an unsynchronized state, which we denote as a chaotic amplitude chimera. We also discuss the computational implications of such a state.
Subjects
Electrical Synapses/physiology, Neurological Models, Neurons/physiology, Animals, Brain/physiology, Connectome, Humans
ABSTRACT
We present an evolving network model in which the total numbers of nodes and edges are conserved, but in which edges are continuously rewired according to nonlinear preferential detachment and reattachment. Assuming power-law kernels with exponents α and β, the stationary states toward which the degree distributions evolve exhibit a second-order phase transition, from relatively homogeneous to highly heterogeneous (with the emergence of starlike structures), at α = β. Temporal evolution of the distribution in this critical regime is shown to follow a nonlinear diffusion equation, arriving at either pure or mixed power laws of exponents −α and 1 − α.
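A degree-sequence-level sketch of the conserved rewiring dynamics is shown below. This is a mean-field simplification (the actual model rewires concrete edges between endpoints), and the parameter values are illustrative only; what the sketch preserves is the essential mechanism of detachment with probability proportional to k^α and reattachment with probability proportional to k^β, with node and edge totals conserved by construction.

```python
import random

def rewire(deg, alpha, beta, steps, seed=0):
    """Nonlinear preferential detachment/reattachment on a degree
    sequence `deg` (dict node -> degree): at each step an edge end
    leaves node i with probability ~ deg[i]**alpha and reattaches to
    node j with probability ~ deg[j]**beta.  Total degree is conserved."""
    rng = random.Random(seed)
    nodes = list(deg)
    for _ in range(steps):
        # degree-0 nodes get zero weight (0**alpha == 0 for alpha > 0),
        # so degrees never go negative
        i = rng.choices(nodes, weights=[deg[v] ** alpha for v in nodes])[0]
        j = rng.choices(nodes, weights=[deg[v] ** beta for v in nodes])[0]
        deg[i] -= 1
        deg[j] += 1
    return deg
```

Running this with α > β tends to concentrate degree on few nodes (the heterogeneous, starlike side of the transition), while α < β homogenizes the sequence; the transition the abstract reports sits at α = β.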
ABSTRACT
Recently there has been a surge of interest in network geometry and topology. Here we show that the spectral dimension plays a fundamental role in establishing a clear relation between the topological and geometrical properties of a network and its dynamics. Specifically, we explore the role of the spectral dimension in determining the synchronization properties of the Kuramoto model. We show that the synchronized phase can only be thermodynamically stable for spectral dimensions above four, and that phase entrainment of the oscillators can only be found for spectral dimensions greater than two. We numerically test our analytical predictions on the recently introduced model of network geometry called complex network manifolds, which displays a tunable spectral dimension.
ABSTRACT
Nature exhibits countless examples of adaptive networks, whose topology evolves constantly, coupled to the activity that serves their function. The brain is an illustrative example of a system in which a dynamic complex network develops by the generation and pruning of synaptic contacts between neurons while memories are acquired and consolidated. Here, we consider a recently proposed brain-development model to study how the mechanisms responsible for the evolution of brain structure affect and are affected by memory storage processes. Following recent experimental observations, we assume that the basic rules for adding and removing synapses depend on local synaptic currents at the respective neurons, in addition to global mechanisms depending on the mean connectivity. In this way, a feedback loop between "form" and "function" spontaneously emerges that influences the ability of the system to optimally store and retrieve sensory information in patterns of brain activity, or memories. In particular, we report here that, as a consequence of such a feedback loop, oscillations in the activity of the system among the memorized patterns can occur, depending on parameters, reminiscent of mental dynamical processes. Such oscillations have their origin in the destabilization of memory attractors due to the pruning dynamics, which induces a kind of structural disorder or noise in the system on a long-term scale. This constantly modifies the synaptic disorder induced by the interference among the many patterns of activity memorized in the system. This intriguing new oscillatory behavior is to be associated only with long-term synaptic mechanisms during the network evolution dynamics; it does not depend on short-term synaptic processes, assumed in other studies, which are not present in our model.
ABSTRACT
We observe and study a self-organized phenomenon whereby the activity in a network of spiking neurons spontaneously terminates. We consider different types of populations, consisting of bistable model neurons connected electrically by gap junctions, or by either excitatory or inhibitory synapses, in a scale-free connection topology. We find that strongly synchronized population spiking events lead to complete cessation of activity in excitatory networks, but not in gap junction or inhibitory networks. We identify the underlying mechanism responsible for this phenomenon by examining the particular shape of the excitatory postsynaptic currents that arise in the neurons. We also examine the effects of the synaptic time constant, coupling strength, and channel noise on the occurrence of the phenomenon.
Subjects
Action Potentials, Neural Networks (Computer), Neurons, Action Potentials/physiology, Cortical Synchronization/physiology, Gap Junctions/physiology, Humans, Neurological Models, Nerve Net/physiology, Neurons/physiology, Synapses/physiology
ABSTRACT
The dynamics of networks of neuronal cultures has been recently shown to be strongly dependent on the network geometry and in particular on their dimensionality. However, this phenomenon has been so far mostly unexplored from the theoretical point of view. Here we reveal the rich interplay between network geometry and synchronization of coupled oscillators in the context of a simplicial complex model of manifolds called Complex Network Manifold. The networks generated by this model combine small world properties (infinite Hausdorff dimension) and a high modular structure with finite and tunable spectral dimension. We show that the networks display frustrated synchronization for a wide range of the coupling strength of the oscillators, and that the synchronization properties are directly affected by the spectral dimension of the network.
ABSTRACT
We present a neurobiologically-inspired stochastic cellular automaton whose state jumps with time between the attractors corresponding to a series of stored patterns. The jumping varies from regular to chaotic as the model parameters are modified. The resulting irregular behavior, which mimics the state of attention in which a system shows a great adaptability to changing stimulus, is a consequence in the model of short-time presynaptic noise which induces synaptic depression. We discuss results from both a mean-field analysis and Monte Carlo simulations.
Subjects
Brain/physiology, Nerve Net/physiology, Neural Networks (Computer), Nonlinear Dynamics, Synapses/physiology, Animals, Neurological Models, Monte Carlo Method, Synaptic Transmission/physiology, Time Factors
ABSTRACT
We investigate the behavior of a model neuron that receives a biophysically realistic noisy postsynaptic current based on uncorrelated spiking activity from a large number of afferents. We show that, with static synapses, such noise can give rise to inverse stochastic resonance (ISR) as a function of the presynaptic firing rate. We compare this to the case with dynamic synapses that feature short-term synaptic plasticity and show that the interval of presynaptic firing rate over which ISR exists can be extended or diminished. We consider both short-term depression and facilitation. Interestingly, we find that a double inverse stochastic resonance (DISR), with two distinct wells centered at different presynaptic firing rates, can appear.
Subjects
Neurological Models, Neurons/physiology, Synapses/physiology, Action Potentials, Animals, Neuronal Plasticity/physiology, Stochastic Processes
ABSTRACT
In this paper we analyze the interplay between the subthreshold oscillations of a single neuron conductance-based model and the short-term plasticity of a dynamic synapse with a depressing mechanism. In previous research, the computational properties of subthreshold oscillations and dynamic synapses have been studied separately. Our results show that dynamic synapses can influence different aspects of the dynamics of neuronal subthreshold oscillations. Factors such as maximum hyperpolarization level, oscillation amplitude and frequency or the resulting firing threshold are modulated by synaptic depression, which can even make subthreshold oscillations disappear. This influence reshapes the postsynaptic neuron's resonant properties arising from subthreshold oscillations and leads to specific input/output relations. We also study the neuron's response to another simultaneous input in the context of this modulation, and show a distinct contextual processing as a function of the depression, in particular for detection of signals through weak synapses. Intrinsic oscillations dynamics can be combined with the characteristic time scale of the modulatory input received by a dynamic synapse to build cost-effective cell/channel-specific information discrimination mechanisms, beyond simple resonances. In this regard, we discuss the functional implications of synaptic depression modulation on intrinsic subthreshold dynamics.
Subjects
Inhibitory Postsynaptic Potentials, Neurons/physiology, Synapses/physiology, Computer Simulation, Humans, Neurological Models, Neurons/cytology
ABSTRACT
We illustrate here how a well-founded study of the brain may originate in assuming analogies with phase-transition phenomena. Analyzing to what extent a weak signal endures in noisy environments, we identify the underlying mechanisms, and the result is a description of how the excitability associated with (non-equilibrium) phase changes and criticality optimizes the processing of the signal. Our setting is a network of integrate-and-fire nodes in which connections are heterogeneous, with rapidly time-varying intensities mimicking fatigue and potentiation. The emergent behavior then proves quite robust against modifications of the wiring topology (in fact, we considered cases ranging from a fully connected network to the Homo sapiens connectome), showing the essential role of synaptic flickering in computations. We also suggest how to experimentally disclose significant changes during actual brain operation.