ABSTRACT
The relation between electroencephalography (EEG) rhythms, brain functions, and behavioral correlates is well-established. Some physiological mechanisms underlying rhythm generation are understood, enabling the replication of brain rhythms in silico. This offers a pathway to explore connections between neural oscillations and specific neuronal circuits, potentially yielding fundamental insights into the functional properties of brain waves. Information theory frameworks, such as Integrated Information Decomposition (Φ-ID), relate dynamical regimes with informational properties, providing deeper insights into neuronal dynamic functions. Here, we investigate wave emergence in an excitatory/inhibitory (E/I) balanced network of integrate-and-fire neurons with short-term synaptic plasticity. This model produces a diverse range of EEG-like rhythms, from low δ waves to high-frequency oscillations. Through Φ-ID, we analyze the network's information dynamics and its relation with different emergent rhythms, elucidating the system's suitability for functions such as robust information transfer, storage, and parallel operation. Furthermore, our study helps to identify regimes that may resemble pathological states due to poor informational properties and high randomness. We found, e.g., that in silico β and δ waves are associated with maximum information transfer in inhibitory and excitatory neuron populations, respectively, and that the coexistence of excitatory θ, α, and β waves is associated with information storage. Additionally, we observed that high-frequency oscillations can exhibit either high or poor informational properties, potentially shedding light on ongoing discussions regarding physiological versus pathological high-frequency oscillations. In summary, our study demonstrates that dynamical regimes with similar oscillations may exhibit vastly different information dynamics. Characterizing information dynamics within these regimes serves as a potent tool for gaining insights into the functions of complex neuronal networks. Finally, our findings suggest that the use of information dynamics in both model and experimental data analysis could help discriminate between oscillations associated with cognitive functions and those linked to neuronal disorders.
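The abstract gives no equations, but the class of model it describes is standard; below is a minimal, illustrative sketch of an E/I balanced leaky integrate-and-fire network with Tsodyks-Markram short-term depression. All names and parameters (population sizes, time constants, efficacies, drive) are placeholders, not the paper's values.

```python
import numpy as np

# Minimal sketch: E/I balanced LIF network with Tsodyks-Markram
# short-term synaptic depression. Parameters are illustrative only.
np.random.seed(0)
N_e, N_i = 800, 200
N = N_e + N_i
dt, T = 0.1, 1000.0              # ms
tau_m, v_th, v_reset = 20.0, 1.0, 0.0
tau_rec, U = 500.0, 0.5          # recovery time and release fraction (TM model)
g, p, J = 4.0, 0.1, 0.05         # relative inhibition, connectivity, efficacy

conn = np.random.rand(N, N) < p
np.fill_diagonal(conn, False)
W = J * conn
W[:, N_e:] *= -g                 # columns of inhibitory presynaptic neurons

v = np.random.rand(N) * v_th
x = np.ones(N)                   # fraction of available synaptic resources
I_ext = 1.05                     # constant suprathreshold drive
rate_trace = []

for step in range(int(T / dt)):
    spikes = v >= v_th
    v[spikes] = v_reset
    # synaptic drive scaled by presynaptic available resources
    I_syn = W @ (spikes * U * x)
    # TM depression: resources recover slowly, are consumed by spikes
    x += dt * (1.0 - x) / tau_rec
    x[spikes] *= (1.0 - U)
    v += dt * (-v + I_ext) / tau_m + I_syn
    rate_trace.append(spikes.mean() / (dt * 1e-3))  # population rate in Hz

print("mean population rate: %.1f Hz" % np.mean(rate_trace))
```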
Subject(s)
Computer Simulation, Electroencephalography, Neurological Models, Humans, Brain/physiology, Computational Biology, Brain Waves/physiology, Neurons/physiology, Neuronal Plasticity/physiology, Nerve Net/physiology, Information Theory
ABSTRACT
Triadic interactions are higher-order interactions which occur when a set of nodes affects the interaction between two other nodes. Examples of triadic interactions are present in the brain when glia modulate the synaptic signals among neuron pairs or when interneuron axo-axonic synapses enable presynaptic inhibition and facilitation, and in ecosystems when one or more species can affect the interaction between two other species. On random graphs, triadic percolation has recently been shown to turn percolation into a fully fledged dynamical process in which the size of the giant component undergoes a route to chaos. However, in many real cases, triadic interactions are local and occur on spatially embedded networks. Here, we show that triadic interactions in spatial networks induce a very complex spatio-temporal modulation of the giant component which gives rise to triadic percolation patterns with significantly different topologies. We classify the observed patterns (stripes, octopus, and small clusters) with topological data analysis and we assess their information content (entropy and complexity). Moreover, we illustrate the multistability of the dynamics of the triadic percolation patterns, and we provide a comprehensive phase diagram of the model. These results open new perspectives in percolation as they demonstrate that in the presence of spatial triadic interactions, the giant component can acquire a time-varying topology. Hence, this work provides a theoretical framework that can be applied to model realistic scenarios in which the giant component is time dependent, as in neuroscience.
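For readers unfamiliar with triadic percolation, the following is a minimal sketch of the kind of map it iterates: node states regulate edges, and the giant component of the regulated graph becomes the next node state. The one-regulator-per-edge rule, the sign probabilities, and all parameters are illustrative simplifications, not the exact model of the paper.

```python
import numpy as np
import networkx as nx

# Sketch of a triadic-percolation step on a spatially embedded graph:
# each edge has one regulator node; a positive regulator must lie in
# the current giant component (a negative one must not) for the edge
# to be retained, with bond occupation probability p_retain.
rng = np.random.default_rng(1)
G = nx.random_geometric_graph(500, 0.08, seed=1)   # spatial substrate
edges = list(G.edges())
regulator = {e: int(rng.integers(0, 500)) for e in edges}
sign = {e: rng.choice([+1, -1], p=[0.8, 0.2]) for e in edges}
p_retain = 0.7

active_nodes = set(G.nodes())                      # initial condition
for t in range(30):
    H = nx.Graph()
    H.add_nodes_from(G.nodes())
    for e in edges:
        ok = (regulator[e] in active_nodes) == (sign[e] > 0)
        if ok and rng.random() < p_retain:
            H.add_edge(*e)
    giant = max(nx.connected_components(H), key=len)
    active_nodes = giant
    print(t, len(giant) / G.number_of_nodes())     # giant-component fraction
```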
ABSTRACT
The properties of complex networked systems arise from the interplay between the dynamics of their elements and the underlying topology. Thus, to understand their behavior, it is crucial to gather as much information as possible about their topological organization. However, in large systems such as neuronal networks, the reconstruction of such topology is usually carried out from the information encoded in the dynamics on the network, such as spike train time series, and by measuring the transfer entropy between system elements. The topological information recovered by these methods does not necessarily capture the connectivity layout, but rather the causal flow of information between elements. New theoretical frameworks, such as Integrated Information Decomposition (Φ-ID), allow one to explore the modes in which information can flow between parts of a system, opening a rich landscape of interactions between network topology, dynamics, and information. Here, we apply Φ-ID on in silico and in vitro data to decompose the usual transfer entropy measure into different modes of information transfer, namely, synergistic, redundant, or unique. We demonstrate that unique information transfer is the most relevant measure to uncover structural topological details from network activity data, while redundant information only introduces residual information for this application. Although the retrieved network connectivity is still functional, it captures more details of the underlying structural topology because it avoids taking into account emergent higher-order interactions and information redundancy between elements, which are important for the functional behavior but mask the detection of the direct pairwise interactions that constitute the structural network topology.
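As a point of reference for the decomposition discussed here, a plug-in estimator of the plain one-step transfer entropy between binary spike trains can be written in a few lines; this is the quantity that Φ-ID further splits into synergistic, redundant, and unique atoms (the splitting itself is not shown). The toy series below are synthetic.

```python
import numpy as np

# Plug-in estimate of transfer entropy TE(X -> Y) for binary series with
# one-step histories: TE = I(Y_t ; X_{t-1} | Y_{t-1}).
def transfer_entropy(x, y):
    xp, yp, yf = x[:-1], y[:-1], y[1:]     # past of X, past and future of Y
    te = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                p_abc = np.mean((yf == a) & (xp == b) & (yp == c))
                if p_abc == 0:
                    continue
                p_bc = np.mean((xp == b) & (yp == c))
                p_ac = np.mean((yf == a) & (yp == c))
                p_c = np.mean(yp == c)
                te += p_abc * np.log2(p_abc * p_c / (p_bc * p_ac))
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)                          # y copies x with a one-step lag
y[rng.random(10000) < 0.1] ^= 1            # corrupt 10% of the bits
print("TE(x->y) = %.3f bits" % transfer_entropy(x, y))
```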
Subject(s)
Computer Simulation, Neurological Models, Nerve Net, Neurons, Nerve Net/physiology, Neurons/physiology, Animals, Entropy, Action Potentials/physiology
ABSTRACT
The last decade has witnessed remarkable progress in our understanding of the brain. This has mainly been based on the scrutiny and modeling of the transmission of activity among neurons across dynamic synapses. A main conclusion, thus far, is that essential features of the mind rely on collective phenomena that emerge from the interaction of many neurons that, mediated by other cells, form a complex network whose details constantly adapt to their activity and surroundings. In parallel, theoretical and computational studies have been developed to understand many natural and artificial complex systems, faithfully explaining their emergent features and making precise the role of the interaction dynamics and other conditions behind the different collective phenomena they display. Focusing on promising ideas that arise when comparing these neurobiology and physics studies, the present perspective article briefly reviews such fascinating scenarios, looking for clues about how high-level cognitive processes such as consciousness, intelligence, and identity can emerge. We thus show that basic concepts of physics, such as dynamical phases and non-equilibrium phase transitions, become quite relevant to brain activity, as determined by factors at the subcellular, cellular, and network levels. We also show how these transitions depend on details of the mechanism processing stimuli in a noisy background and, most importantly, that one may detect them in familiar electroencephalogram (EEG) recordings. Thus, we associate the existence of such phases, which reveal a brain operating at (non-equilibrium) criticality, with the emergence of most interesting phenomena during memory tasks.
ABSTRACT
We here study a network of synaptic relations mingling excitatory and inhibitory neuron nodes that displays oscillations quite similar to electroencephalogram (EEG) brain waves, and identify abrupt variations brought about by swift synaptic mediations. We thus conclude that corresponding changes in EEG series likely stem from the slowdown of the activity in neuron populations due to synaptic restrictions. The latter generates an imbalance between excitation and inhibition causing a quick, explosive increase of excitatory activity, which turns out to be a (first-order) transition among dynamic mental phases. Moreover, near this phase transition, our model system exhibits waves with a strong component in the so-called delta-theta domain that coexist with fast oscillations. These findings provide a simple explanation for the delta-gamma and theta-gamma modulation observed in actual brains, and open a versatile path toward a deeper understanding of large amounts of apparently erratic, easily accessible brain data.
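A standard way to check for the kind of slow/fast coexistence reported here is to compute band-resolved power from the population signal; below is a minimal sketch using scipy's Welch estimator, with a synthetic delta-plus-gamma signal standing in for the model's activity (band edges are the conventional ones, not taken from the paper).

```python
import numpy as np
from scipy.signal import welch

# Band-resolved power of a population signal; the synthetic signal
# (3 Hz delta plus 40 Hz gamma in noise) is only a stand-in.
fs = 1000.0                                  # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
sig = (np.sin(2 * np.pi * 3 * t) + 0.4 * np.sin(2 * np.pi * 40 * t)
       + 0.5 * np.random.randn(t.size))

f, pxx = welch(sig, fs=fs, nperseg=2048)
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (12, 30), "gamma": (30, 100)}
for name, (lo, hi) in bands.items():
    mask = (f >= lo) & (f < hi)
    power = pxx[mask].sum() * (f[1] - f[0])  # integrated band power
    print("%5s band power: %.3f" % (name, power))
```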
ABSTRACT
The interplay between structure and function affects the emerging properties of many natural systems. Here we use an adaptive neural network model that couples activity and topological dynamics and reproduces the experimental temporal profiles of synaptic density observed in the brain. We prove that the existence of a transient period of relatively high synaptic connectivity is critical for the development of the system under noisy conditions, such that the resulting network can recover stored memories. Moreover, we show that intermediate synaptic densities provide optimal developmental paths with minimum energy consumption, and that ultimately it is the transient heterogeneity in the network that determines its evolution. These results could explain why the pruning curves observed in actual brain areas present their characteristic temporal profiles, and they also suggest new design strategies to build biologically inspired neural networks with particular information processing capabilities.
Subject(s)
Brain, Neural Networks (Computer)
ABSTRACT
The higher-order interactions of complex systems, such as the brain, are captured by their simplicial complex structure and have a significant effect on dynamics. However, the existing dynamical models defined on simplicial complexes make the strong assumption that the dynamics resides exclusively on the nodes. Here we formulate the higher-order Kuramoto model, which describes the interactions between oscillators placed not only on nodes but also on links, triangles, and so on. We show that higher-order Kuramoto dynamics can lead to an explosive synchronization transition by using an adaptive coupling dependent on the solenoidal and irrotational components of the dynamics.
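In the usual formulation of this model (sketched here from the standard presentation, with notation assumed rather than quoted from the paper), the phases live on the links of a simplicial complex with incidence matrices relating nodes to links and links to triangles:

```latex
% Phases \theta on the links; B_{[1]} is the node-link incidence matrix
% and B_{[2]} the link-triangle one:
\dot{\theta} \;=\; \hat{\omega}
  \;-\; \sigma\, B_{[1]}^{\top} \sin\!\bigl(B_{[1]}\,\theta\bigr)
  \;-\; \sigma\, B_{[2]} \sin\!\bigl(B_{[2]}^{\top}\,\theta\bigr),
% and the Hodge decomposition splits the link signal into irrotational,
% solenoidal, and harmonic parts, the first two of which the adaptive
% (explosive) variant couples to each other:
\theta \;=\; B_{[1]}^{\top}\alpha \;+\; B_{[2]}\,\beta \;+\; \theta^{\mathrm{harm}}.
```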
ABSTRACT
Here we study the emergence of chimera states, a recently reported phenomenon referring to the coexistence of synchronized and unsynchronized dynamical units, in a population of Morris-Lecar neurons which are coupled by both electrical and chemical synapses, constituting a hybrid synaptic architecture, as in actual brain connectivity. This scheme consists of a nonlocal network where the nearest-neighbor neurons are coupled by electrical synapses, while the synapses from more distant neurons are of the chemical type. We demonstrate that peculiar dynamical behaviors, including chimera states and traveling waves, exist in such a hybrid coupled neural system, and analyze how the relative abundance of chemical and electrical synapses affects the features of chimera and different synchrony states (i.e., incoherent, traveling wave, and coherent) and the regions in the space of relevant parameters for their emergence. Additionally, we show that, when the relative population of chemical synapses increases further, a new intriguing chaotic dynamical behavior appears above the region for chimera states. This is characterized by the coexistence of two distinct synchronized states with different amplitudes and an unsynchronized state, which we denote as a chaotic amplitude chimera. We also discuss the computational implications of such a state.
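A minimal sketch of the hybrid coupling layout described above: on a ring, each neuron couples electrically to its R nearest neighbors per side and chemically to the remaining nonlocal neighbors. N, R, the conductances, and the sigmoid gate are illustrative placeholders, and the Morris-Lecar neuron dynamics itself is omitted.

```python
import numpy as np

# Build ring adjacency matrices for electrical (local) and chemical
# (nonlocal) coupling; values here are placeholders.
N, R = 100, 5
g_gap, g_chem = 0.1, 0.05

A_elec = np.zeros((N, N))
A_chem = np.zeros((N, N))
for i in range(N):
    for d in range(1, N // 2 + 1):
        j, k = (i + d) % N, (i - d) % N
        target = A_elec if d <= R else A_chem
        target[i, j] = target[i, k] = 1.0

# Electrical coupling is diffusive: I_i = g * sum_j A[i,j] (v_j - v_i)
def gap_current(v, A=A_elec, g=g_gap):
    return g * (A @ v - A.sum(1) * v)

# Chemical coupling is gated by a sigmoid of the presynaptic voltage
def chem_current(v, v_syn=1.0, theta=0.0, k=10.0, A=A_chem, g=g_chem):
    s = 1.0 / (1.0 + np.exp(-k * (v - theta)))   # presynaptic activation
    return g * (A @ s) * (v_syn - v)

v = np.random.randn(N) * 0.1
print(gap_current(v)[:3], chem_current(v)[:3])
```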
Subject(s)
Electrical Synapses/physiology, Neurological Models, Neurons/physiology, Animals, Brain/physiology, Connectome, Humans
ABSTRACT
Nature exhibits countless examples of adaptive networks, whose topology constantly evolves coupled with the activity it supports. The brain is an illustrative example of a system in which a dynamic complex network develops by the generation and pruning of synaptic contacts between neurons while memories are acquired and consolidated. Here, we consider a recently proposed brain development model to study how the mechanisms responsible for the evolution of brain structure affect and are affected by memory storage processes. Following recent experimental observations, we assume that the basic rules for adding and removing synapses depend on local synaptic currents at the respective neurons in addition to global mechanisms depending on the mean connectivity. In this way a feedback loop between "form" and "function" spontaneously emerges that influences the ability of the system to optimally store and retrieve sensory information in patterns of brain activity or memories. In particular, we report here that, as a consequence of such a feedback loop, oscillations in the activity of the system among the memorized patterns can occur, depending on parameters, reminiscent of dynamical processes of the mind. Such oscillations have their origin in the destabilization of memory attractors due to the pruning dynamics, which induces a kind of structural disorder or noise in the system on a long-term scale. This constantly modifies the synaptic disorder induced by the interference among the many patterns of activity memorized in the system. This intriguing oscillatory behavior is associated only with long-term synaptic mechanisms during the network evolution dynamics; it does not depend on short-term synaptic processes, as assumed in other studies, which are not present in our model.
ABSTRACT
Recently there has been a surge of interest in network geometry and topology. Here we show that the spectral dimension plays a fundamental role in establishing a clear relation between the topological and geometrical properties of a network and its dynamics. Specifically, we explore the role of the spectral dimension in determining the synchronization properties of the Kuramoto model. We show that the synchronized phase can only be thermodynamically stable for spectral dimensions above four, and that phase entrainment of the oscillators can only be found for spectral dimensions greater than two. We numerically test our analytical predictions on the recently introduced model of network geometry called complex network manifolds, which displays a tunable spectral dimension.
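The thresholds quoted here can be read off from the linearized dynamics: with a Laplacian eigenvalue density behaving as ρ(λ) ~ λ^{d_S/2-1} for λ → 0, the standard convergence arguments give the following (a sketch of the reasoning, not the paper's full derivation):

```latex
% Phase fluctuations of the linearized Kuramoto dynamics must be finite
% for a thermodynamically stable synchronized phase:
\langle \delta\theta^{2} \rangle \;\sim\;
  \int_{0}^{\Lambda} d\lambda \, \frac{\rho(\lambda)}{\lambda^{2}}
  \;\sim\; \int_{0}^{\Lambda} d\lambda \, \lambda^{d_S/2 - 3}
  \;<\; \infty \quad \Longleftrightarrow \quad d_S > 4,
% while entrainment (well-defined average frequencies) requires only
\int_{0}^{\Lambda} d\lambda \, \frac{\rho(\lambda)}{\lambda}
  \;\sim\; \int_{0}^{\Lambda} d\lambda \, \lambda^{d_S/2 - 2}
  \;<\; \infty \quad \Longleftrightarrow \quad d_S > 2.
```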
ABSTRACT
We observe and study a self-organized phenomenon whereby the activity in a network of spiking neurons spontaneously terminates. We consider different types of populations, consisting of bistable model neurons connected electrically by gap junctions, or by either excitatory or inhibitory synapses, in a scale-free connection topology. We find that strongly synchronized population spiking events lead to complete cessation of activity in excitatory networks, but not in gap junction or inhibitory networks. We identify the underlying mechanism responsible for this phenomenon by examining the particular shape of the excitatory postsynaptic currents that arise in the neurons. We also examine the effects of the synaptic time constant, coupling strength, and channel noise on the occurrence of the phenomenon.
Subject(s)
Action Potentials, Neural Networks (Computer), Neurons, Action Potentials/physiology, Cortical Synchronization/physiology, Gap Junctions/physiology, Humans, Neurological Models, Nerve Net/physiology, Neurons/physiology, Synapses/physiology
ABSTRACT
The dynamics of networks of neuronal cultures has recently been shown to be strongly dependent on the network geometry and, in particular, on their dimensionality. However, this phenomenon has so far been mostly unexplored from the theoretical point of view. Here we reveal the rich interplay between network geometry and synchronization of coupled oscillators in the context of a simplicial complex model of manifolds called the Complex Network Manifold. The networks generated by this model combine small-world properties (infinite Hausdorff dimension) and a highly modular structure with finite and tunable spectral dimension. We show that the networks display frustrated synchronization for a wide range of coupling strengths of the oscillators, and that the synchronization properties are directly affected by the spectral dimension of the network.
ABSTRACT
Inverse Stochastic Resonance (ISR) is a phenomenon in which the average spiking rate of a neuron exhibits a minimum with respect to noise. ISR has been studied in individual neurons, but here, we investigate ISR in scale-free networks, where the average spiking rate is calculated over the neuronal population. We use Hodgkin-Huxley model neurons with channel noise (i.e., stochastic gating variable dynamics), and the network connectivity is implemented via electrical or chemical connections (i.e., gap junctions or excitatory/inhibitory synapses). We find that the emergence of ISR depends on the interplay between each neuron's intrinsic dynamical structure, channel noise, and network inputs, where the latter in turn depend on network structure parameters. We observe that with weak gap junction or excitatory synaptic coupling, network heterogeneity and sparseness tend to favor the emergence of ISR. With inhibitory coupling, ISR is quite robust. We also identify dynamical mechanisms that underlie various features of this ISR behavior. Our results suggest possible ways of experimentally observing ISR in actual neuronal systems.
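Channel noise of the kind used here (stochastic gating variable dynamics) is commonly implemented through a Langevin approximation of the gating kinetics, with noise variance inversely proportional to the channel count; below is a minimal voltage-clamped sketch for the Hodgkin-Huxley potassium gate n. The channel number N_K and the clamp voltage are illustrative, not the paper's values.

```python
import numpy as np

# Langevin (Fox) approximation of stochastic K+ channel gating: the
# deterministic HH kinetics for n plus a noise term whose variance
# scales as 1/N_K, so fewer channels means stronger channel noise.
def alpha_n(v):
    return 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))

def beta_n(v):
    return 0.125 * np.exp(-(v + 65.0) / 80.0)

rng = np.random.default_rng(0)
dt, N_K = 0.01, 1000               # ms, number of K+ channels (illustrative)
v = -65.0                          # clamp voltage for this demo, mV
n = alpha_n(v) / (alpha_n(v) + beta_n(v))
trace = []
for _ in range(100000):
    a, b = alpha_n(v), beta_n(v)
    drift = a * (1.0 - n) - b * n
    diff = np.sqrt((a * (1.0 - n) + b * n) / N_K)
    n += dt * drift + np.sqrt(dt) * diff * rng.standard_normal()
    n = min(max(n, 0.0), 1.0)      # keep the gating variable in [0, 1]
    trace.append(n)
print("mean n: %.3f, std: %.4f" % (np.mean(trace), np.std(trace)))
```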
Subject(s)
Action Potentials/physiology, Neurological Models, Nerve Net/physiology, Neurons/physiology, Algorithms, Computational Biology, Humans, Stochastic Processes
ABSTRACT
We investigate the behavior of a model neuron that receives a biophysically realistic noisy postsynaptic current based on uncorrelated spiking activity from a large number of afferents. We show that, with static synapses, such noise can give rise to inverse stochastic resonance (ISR) as a function of the presynaptic firing rate. We compare this to the case with dynamic synapses that feature short-term synaptic plasticity and show that the interval of presynaptic firing rates over which ISR exists can be extended or diminished. We consider both short-term depression and facilitation. Interestingly, we find that a double inverse stochastic resonance (DISR), with two distinct wells centered at different presynaptic firing rates, can appear.
Subject(s)
Neurological Models, Neurons/physiology, Synapses/physiology, Action Potentials, Animals, Neuronal Plasticity/physiology, Stochastic Processes
ABSTRACT
In recent years, network scientists have directed their interest to the multi-layer character of real-world systems, explicitly considering the structural and dynamical organization of graphs made of diverse layers between their constituents. Most complex systems include multiple subsystems and layers of connectivity and, in many cases, the interdependent components of systems interact through many different channels. Such a new perspective is indeed found to be an adequate representation for a wealth of features exhibited by networked systems in the real world. The contributions presented in this Focus Issue cover, from different points of view, the many achievements and still open questions in the field of multi-layer networks, such as: new frameworks and structures to represent and analyze heterogeneous complex systems, different aspects related to synchronization and centrality of complex networks, the interplay between layers, and applications to logistics, biological, social, and technological fields.
Subject(s)
Theoretical Models, Algorithms
ABSTRACT
In this paper we analyze the interplay between the subthreshold oscillations of a single-neuron conductance-based model and the short-term plasticity of a dynamic synapse with a depressing mechanism. In previous research, the computational properties of subthreshold oscillations and dynamic synapses have been studied separately. Our results show that dynamic synapses can influence different aspects of the dynamics of neuronal subthreshold oscillations. Factors such as the maximum hyperpolarization level, the oscillation amplitude and frequency, or the resulting firing threshold are modulated by synaptic depression, which can even make subthreshold oscillations disappear. This influence reshapes the postsynaptic neuron's resonant properties arising from subthreshold oscillations and leads to specific input/output relations. We also study the neuron's response to another simultaneous input in the context of this modulation, and show a distinct contextual processing as a function of the depression, in particular for the detection of signals through weak synapses. Intrinsic oscillation dynamics can be combined with the characteristic time scale of the modulatory input received by a dynamic synapse to build cost-effective cell/channel-specific information discrimination mechanisms, beyond simple resonances. In this regard, we discuss the functional implications of synaptic depression modulation on intrinsic subthreshold dynamics.
Subject(s)
Inhibitory Postsynaptic Potentials, Neurons/physiology, Synapses/physiology, Computer Simulation, Humans, Neurological Models, Neurons/cytology
ABSTRACT
We here illustrate how a well-founded study of the brain may originate in assuming analogies with phase-transition phenomena. Analyzing to what extent a weak signal endures in noisy environments, we identify the underlying mechanisms, and the result is a description of how the excitability associated with (non-equilibrium) phase changes and criticality optimizes the processing of the signal. Our setting is a network of integrate-and-fire nodes in which connections are heterogeneous with rapidly time-varying intensities mimicking fatigue and potentiation. Emergence then becomes quite robust against modifications of the wiring topology (in fact, we considered networks ranging from fully connected to the Homo sapiens connectome), showing the essential role of synaptic flickering in computations. We also suggest how to experimentally disclose significant changes during actual brain operation.
Subject(s)
Brain/physiology, Nervous System Physiological Phenomena, Computer Simulation, Humans, Neurological Models, Time Factors
ABSTRACT
We investigate the efficient transmission and processing of weak, subthreshold signals in a realistic neural medium in the presence of different levels of underlying noise. Assuming Hebbian weights for the maximal synaptic conductances (which naturally balances the network between excitatory and inhibitory synapses) and considering short-term synaptic plasticity affecting such conductances, we find different dynamic phases in the system. These include a memory phase in which populations of neurons remain synchronized, an oscillatory phase in which transitions between different synchronized populations appear, and an asynchronous or noisy phase. When a weak stimulus is applied to each neuron, increasing the level of noise in the medium yields efficient transmission of such stimuli around the transition and critical points separating the different phases, at well-defined levels of stochasticity in the system. We show that this intriguing phenomenon is quite robust, as it occurs in different situations including several types of synaptic plasticity, different types and numbers of stored patterns, and diverse network topologies, namely diluted networks and complex topologies such as scale-free and small-world networks. We conclude that the robustness of the phenomenon in different realistic scenarios, including spiking neurons, short-term synaptic plasticity, and complex network topologies, makes it very likely that it also occurs in actual neural systems, as recent psychophysical experiments suggest.
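The Hebbian prescription mentioned here is presumably of the standard covariance (Hopfield-like) form; the following sketch is written under that assumption, with notation introduced for illustration rather than taken from the paper:

```latex
% Covariance rule for P binary patterns \xi^{\mu}_i \in \{0,1\} with
% mean activity f; weights of both signs arise, which is what balances
% excitation and inhibition in the network:
w_{ij} \;=\; \frac{1}{N} \sum_{\mu=1}^{P}
  \bigl(\xi^{\mu}_i - f\bigr)\bigl(\xi^{\mu}_j - f\bigr).
```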
Subject(s)
Neurons/physiology, Synaptic Transmission, Computational Biology/methods, Neurological Models, Neuronal Plasticity
ABSTRACT
In this paper we review our research on the effect and computational role of dynamical synapses in feed-forward and recurrent neural networks. Among other results, we report the appearance of a new class of dynamical memories resulting from the destabilization of learned memory attractors. This has important consequences for dynamic information processing, allowing the system to sequentially access the information stored in the memories under changing stimuli. Although the storage capacity of stable memories also decreases, our study demonstrated the positive effect of synaptic facilitation in recovering maximum storage capacity and enlarging the system's capacity for memory recall under noisy conditions. Possibly, the new dynamical behavior can be associated with the voltage transitions between up and down states observed in cortical areas of the brain. We investigated the conditions under which the permanence times in the up state are power-law distributed, a signature of criticality, and concluded that the experimentally observed large variability of permanence times could be explained as the result of noisy dynamic synapses with large recovery times. Finally, we report how short-term synaptic processes can transmit weak signals across more than one frequency range in noisy neural networks, displaying a kind of stochastic multi-resonance. This effect is due to the competition between activity-dependent synaptic fluctuations (due to dynamic synapses) and the existence of a neuron firing threshold which adapts to the incoming mean synaptic input.
ABSTRACT
Short-term memory in the brain cannot in general be explained the way long-term memory can, as a gradual modification of synaptic weights, since it takes place too quickly. Theories based on some form of cellular bistability, however, do not seem able to account for the fact that noisy neurons can collectively store information in a robust manner. We show how a sufficiently clustered network of simple model neurons can be instantly induced into metastable states capable of retaining information for a short time (a few seconds). The mechanism is robust to different network topologies and kinds of neural model. This could constitute a viable means available to the brain for sensory and/or short-term memory with no need of synaptic learning. Relevant phenomena described by neurobiology and psychology, such as local synchronization of synaptic inputs and power-law statistics of forgetting avalanches, emerge naturally from this mechanism, and we suggest possible experiments to test its viability in more biological settings.