Results 1 - 20 of 31
1.
Chaos; 34(5), 2024 May 01.
Article in English | MEDLINE | ID: mdl-38809907

ABSTRACT

The properties of complex networked systems arise from the interplay between the dynamics of their elements and the underlying topology. Thus, to understand their behavior, it is crucial to gather as much information as possible about their topological organization. However, in large systems such as neuronal networks, the reconstruction of such topology is usually carried out from the information encoded in the dynamics on the network, such as spike train time series, by measuring the transfer entropy between system elements. The topological information recovered by these methods does not necessarily capture the connectivity layout, but rather the causal flow of information between elements. New theoretical frameworks, such as Integrated Information Decomposition (Φ-ID), allow one to explore the modes in which information can flow between parts of a system, opening a rich landscape of interactions between network topology, dynamics, and information. Here, we apply Φ-ID to in silico and in vitro data to decompose the usual transfer entropy measure into different modes of information transfer, namely, synergistic, redundant, or unique. We demonstrate that unique information transfer is the most relevant measure for uncovering structural topological details from network activity data, while redundant information only adds residual information for this application. Although the retrieved network connectivity is still functional, it captures more details of the underlying structural topology because it does not take into account emergent high-order interactions and information redundancy between elements, which are important for functional behavior but mask the detection of the direct, simple interactions between elements that constitute the structural network topology.
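As a rough illustration of the quantity being decomposed here, the sketch below computes a plug-in estimate of the pairwise transfer entropy between two binarized spike trains with one time bin of history. It is a minimal, generic estimator written for this summary (the function name and the toy data are illustrative assumptions, not from the paper); the Φ-ID decomposition into synergistic, redundant, and unique components would require a dedicated implementation on top of such estimates.

import numpy as np

def transfer_entropy_bits(x, y):
    """Plug-in estimate of TE(x -> y), in bits, for binary time series x, y
    with one bin of history:
    TE = sum p(y_t+1, y_t, x_t) * log2[ p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) ]."""
    x, y = np.asarray(x, int), np.asarray(y, int)
    y_next, y_past, x_past = y[1:], y[:-1], x[:-1]
    te = 0.0
    for a in (0, 1):            # y_{t+1}
        for b in (0, 1):        # y_t
            for c in (0, 1):    # x_t
                p_abc = np.mean((y_next == a) & (y_past == b) & (x_past == c))
                p_bc = np.mean((y_past == b) & (x_past == c))
                p_ab = np.mean((y_next == a) & (y_past == b))
                p_b = np.mean(y_past == b)
                if min(p_abc, p_bc, p_ab, p_b) > 0:
                    te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

# toy check: y is a noisy copy of x delayed by one bin, so TE(x -> y) >> TE(y -> x)
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1) ^ (rng.random(10_000) < 0.05)
print(transfer_entropy_bits(x, y), transfer_entropy_bits(y, x))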


Subjects
Computer Simulation , Neurological Models , Nerve Net , Neurons , Nerve Net/physiology , Neurons/physiology , Animals , Entropy , Action Potentials/physiology
2.
Phys Rev Lett; 124(21): 218301, 2020 May 29.
Article in English | MEDLINE | ID: mdl-32530670

ABSTRACT

The higher-order interactions of complex systems, such as the brain, are captured by their simplicial complex structure and have a significant effect on dynamics. However, existing dynamical models defined on simplicial complexes make the strong assumption that the dynamics resides exclusively on the nodes. Here we formulate the higher-order Kuramoto model, which describes the interactions between oscillators placed not only on nodes but also on links, triangles, and so on. We show that higher-order Kuramoto dynamics can lead to an explosive synchronization transition by using an adaptive coupling that depends on the solenoidal and irrotational components of the dynamics.
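For reference, the link-level dynamics of this kind of model are commonly written in terms of the incidence (boundary) matrices of the simplicial complex; the convention below is one standard presentation, stated here as background rather than as a transcription of the paper's equations:

\dot{\theta} = \omega - \sigma\, B_1^{\top} \sin\!\left(B_1 \theta\right) - \sigma\, B_2 \sin\!\left(B_2^{\top} \theta\right),

where \theta and \omega collect the phases and natural frequencies attached to the links, B_1 is the node-link incidence matrix and B_2 the link-triangle one. Projecting the dynamics with B_1 and with B_2^{\top} separates an irrotational component (defined on the nodes) from a solenoidal one (defined on the triangles), and it is these two components that the adaptive coupling mentioned above acts on.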

3.
PLoS Comput Biol; 13(7): e1005646, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28692643

ABSTRACT

Inverse Stochastic Resonance (ISR) is a phenomenon in which the average spiking rate of a neuron exhibits a minimum with respect to noise. ISR has been studied in individual neurons, but here, we investigate ISR in scale-free networks, where the average spiking rate is calculated over the neuronal population. We use Hodgkin-Huxley model neurons with channel noise (i.e., stochastic gating variable dynamics), and the network connectivity is implemented via electrical or chemical connections (i.e., gap junctions or excitatory/inhibitory synapses). We find that the emergence of ISR depends on the interplay between each neuron's intrinsic dynamical structure, channel noise, and network inputs, where the latter in turn depend on network structure parameters. We observe that with weak gap junction or excitatory synaptic coupling, network heterogeneity and sparseness tend to favor the emergence of ISR. With inhibitory coupling, ISR is quite robust. We also identify dynamical mechanisms that underlie various features of this ISR behavior. Our results suggest possible ways of experimentally observing ISR in actual neuronal systems.


Subjects
Action Potentials/physiology , Neurological Models , Nerve Net/physiology , Neurons/physiology , Algorithms , Computational Biology , Humans , Stochastic Processes
4.
Chaos; 26(6): 065101, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27368790

ABSTRACT

In recent years, network scientists have directed their interest to the multi-layer character of real-world systems, explicitly considering the structural and dynamical organization of graphs made of diverse layers of connections between their constituents. Most complex systems include multiple subsystems and layers of connectivity and, in many cases, the interdependent components of systems interact through many different channels. This new perspective has indeed proven to be an adequate representation of a wealth of features exhibited by networked systems in the real world. The contributions presented in this Focus Issue cover, from different points of view, the many achievements and still open questions in the field of multi-layer networks, such as: new frameworks and structures to represent and analyze heterogeneous complex systems, different aspects related to synchronization and centrality of complex networks, the interplay between layers, and applications to logistics, biological, social, and technological fields.


Subjects
Theoretical Models , Algorithms
5.
Front Comput Neurosci; 16: 836532, 2022.
Article in English | MEDLINE | ID: mdl-35465268

ABSTRACT

The last decade has witnessed remarkable progress in our understanding of the brain. This has mainly been based on the scrutiny and modeling of the transmission of activity among neurons across lively synapses. A main conclusion, thus far, is that essential features of the mind rely on collective phenomena that emerge from the interaction of many neurons which, mediated by other cells, form a complex network whose details keep constantly adapting to their activity and surroundings. In parallel, theoretical and computational studies have been developed to understand many natural and artificial complex systems; these have faithfully explained their striking emergent features and made precise the role of the interaction dynamics and other conditions behind the different collective phenomena they display. Focusing on promising ideas that arise when comparing these neurobiology and physics studies, the present perspective article briefly reviews such scenarios, looking for clues about how high-level cognitive processes such as consciousness, intelligence, and identity can emerge. We thus show that basic concepts of physics, such as dynamical phases and non-equilibrium phase transitions, become quite relevant to brain activity, while being determined by factors at the subcellular, cellular, and network levels. We also show how these transitions depend on details of the mechanism by which stimuli are processed in a noisy background and, most importantly, that one may detect them in familiar electroencephalogram (EEG) recordings. Thus, we associate the existence of such phases, which reveal a brain operating at (non-equilibrium) criticality, with the emergence of the most interesting phenomena during memory tasks.

6.
Biology (Basel); 10(7), 2021 Jul 11.
Article in English | MEDLINE | ID: mdl-34356502

ABSTRACT

We here study a network of synaptic relations mingling excitatory and inhibitory neuron nodes that displays oscillations quite similar to electroencephalogram (EEG) brain waves, and identify abrupt variations brought about by swift synaptic mediations. We thus conclude that corresponding changes in EEG series come from the slowdown of the activity in neuron populations due to synaptic restrictions. The latter generates an imbalance between excitation and inhibition that causes a quick, explosive increase of excitatory activity, which turns out to be a (first-order) transition among dynamic mental phases. Moreover, near this phase transition, our model system exhibits waves with a strong component in the so-called delta-theta domain that coexist with fast oscillations. These findings provide a simple explanation for the delta-gamma and theta-gamma modulation observed in actual brains, and open a versatile path toward a deeper understanding of the large amounts of apparently erratic, easily accessible brain data.

7.
Neural Netw; 142: 44-56, 2021 Oct.
Article in English | MEDLINE | ID: mdl-33984735

ABSTRACT

The interplay between structure and function affects the emerging properties of many natural systems. Here we use an adaptive neural network model that couples activity and topological dynamics and reproduces the experimental temporal profiles of synaptic density observed in the brain. We prove that the existence of a transient period of relatively high synaptic connectivity is critical for the development of the system under noisy conditions, such that the resulting network can recover stored memories. Moreover, we show that intermediate synaptic densities provide optimal developmental paths with minimum energy consumption, and that ultimately it is the transient heterogeneity in the network that determines its evolution. These results could explain why the pruning curves observed in actual brain areas present their characteristic temporal profiles, and they also suggest new design strategies to build biologically inspired neural networks with particular information processing capabilities.


Subjects
Brain , Neural Networks (Computer)
8.
Phys Rev Lett; 104(10): 108702, 2010 Mar 12.
Article in English | MEDLINE | ID: mdl-20366458

ABSTRACT

Why are most empirical networks, with the prominent exception of social ones, generically degree-degree anticorrelated? To answer this long-standing question, we define the ensemble of correlated networks and obtain the associated Shannon entropy. Maximum entropy can correspond to either assortative (correlated) or disassortative (anticorrelated) configurations, but in the case of highly heterogeneous, scale-free networks a certain disassortativity is predicted, offering a parsimonious explanation for the question above. Our approach provides a neutral model from which, in the absence of further knowledge regarding network evolution, one can obtain the expected value of correlations. When empirical observations deviate from the neutral predictions, as happens for social networks, one can then infer that specific correlating mechanisms are at work.
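As a reminder of the kind of quantity involved, for an ensemble in which each edge (i, j) is present independently with probability p_{ij}, the Shannon entropy takes the simple form

S = - \sum_{i<j} \left[ p_{ij}\ln p_{ij} + \left(1 - p_{ij}\right)\ln\!\left(1 - p_{ij}\right) \right],

and the neutral model corresponds to maximizing S subject to whatever constraints (e.g., the degree sequence) one wishes to preserve. This independent-edge expression is only an illustration of the approach; the correlated ensemble analyzed in the paper carries additional structure.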

9.
Neural Netw; 126: 108-117, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32208304

ABSTRACT

Here we study the emergence of chimera states, a recently reported phenomenon referring to the coexistence of synchronized and unsynchronized dynamical units, in a population of Morris-Lecar neurons coupled by both electrical and chemical synapses, constituting a hybrid synaptic architecture as in actual brain connectivity. This scheme consists of a nonlocal network where nearest-neighbor neurons are coupled by electrical synapses, while the synapses from more distant neurons are of the chemical type. We demonstrate that peculiar dynamical behaviors, including chimera states and traveling waves, exist in such a hybrid coupled neural system, and analyze how the relative abundance of chemical and electrical synapses affects the features of the chimera and of the different synchrony states (i.e., incoherent, traveling wave, and coherent), as well as the regions in the space of relevant parameters for their emergence. Additionally, we show that, when the relative population of chemical synapses increases further, a new intriguing chaotic dynamical behavior appears above the region for chimera states. This is characterized by the coexistence of two distinct synchronized states with different amplitudes and an unsynchronized state, which we denote as a chaotic amplitude chimera. We also discuss the computational implications of such a state.


Subjects
Electrical Synapses/physiology , Neurological Models , Neurons/physiology , Animals , Brain/physiology , Connectome , Humans
10.
Phys Rev E Stat Nonlin Soft Matter Phys; 79(5 Pt 1): 050104, 2009 May.
Article in English | MEDLINE | ID: mdl-19518399

ABSTRACT

We present an evolving network model in which the total numbers of nodes and edges are conserved, but in which edges are continuously rewired according to nonlinear preferential detachment and reattachment. Assuming power-law kernels with exponents α and β, the stationary states toward which the degree distributions evolve exhibit a second-order phase transition, from relatively homogeneous to highly heterogeneous (with the emergence of star-like structures), at α = β. Temporal evolution of the distribution in this critical regime is shown to follow a nonlinear diffusion equation, arriving at either pure or mixed power laws of exponents −α and 1−α.
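A minimal sketch of this kind of edge-conserving rewiring, under the assumption that the detachment and reattachment kernels are proportional to k^α and k^β respectively (the graph size, parameter values, and function name are illustrative, not taken from the paper):

import numpy as np
import networkx as nx

def rewire_step(G, alpha, beta, rng):
    """One move: an edge end detaches from a node chosen with probability
    ~ k**alpha and reattaches to a node chosen with probability ~ k**beta,
    keeping the total numbers of nodes and edges fixed."""
    nodes = np.array(G.nodes())
    deg = np.array([G.degree(n) for n in nodes], dtype=float)

    p_detach = deg**alpha
    p_detach /= p_detach.sum()
    u = rng.choice(nodes, p=p_detach)            # node that loses an edge end
    v = rng.choice(list(G.neighbors(u)))         # the edge (u, v) is rewired

    p_attach = deg**beta
    p_attach /= p_attach.sum()
    w = rng.choice(nodes, p=p_attach)            # node that gains the edge end
    if w != v and not G.has_edge(v, w):          # skip self-loops / multi-edges
        G.remove_edge(u, v)
        G.add_edge(v, w)

G = nx.gnm_random_graph(200, 600, seed=1)        # fixed numbers of nodes and edges
rng = np.random.default_rng(1)
for _ in range(20_000):
    rewire_step(G, alpha=1.5, beta=1.5, rng=rng)
print(sorted(d for _, d in G.degree())[-5:])     # largest degrees after the rewiring dynamics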

11.
Phys Rev E; 99(2-1): 022307, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30934278

ABSTRACT

Recently there has been a surge of interest in network geometry and topology. Here we show that the spectral dimension plays a fundamental role in establishing a clear relation between the topological and geometrical properties of a network and its dynamics. Specifically, we explore the role of the spectral dimension in determining the synchronization properties of the Kuramoto model. We show that the synchronized phase can only be thermodynamically stable for spectral dimensions above four and that phase entrainment of the oscillators can only be found for spectral dimensions greater than two. We numerically test our analytical predictions on the recently introduced model of network geometry called complex network manifolds, which displays a tunable spectral dimension.
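For context, the spectral dimension d_S used here is commonly defined from the small-eigenvalue behavior of the density of graph-Laplacian eigenvalues, or equivalently from the long-time return probability of a random walk on the network; as standard background (not quoted from the paper),

\rho(\lambda) \sim \lambda^{\,d_S/2 - 1} \ \text{as}\ \lambda \to 0,
\qquad
P_{\mathrm{return}}(t) \sim t^{-d_S/2} \ \text{as}\ t \to \infty,

so that d_S plays, for diffusive and oscillatory dynamics on the network, the role that the Euclidean dimension plays on a regular lattice.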

12.
Neural Netw; 110: 131-140, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30550865

ABSTRACT

We observe and study a self-organized phenomenon whereby the activity in a network of spiking neurons spontaneously terminates. We consider different types of populations, consisting of bistable model neurons connected electrically by gap junctions, or by either excitatory or inhibitory synapses, in a scale-free connection topology. We find that strongly synchronized population spiking events lead to complete cessation of activity in excitatory networks, but not in gap junction or inhibitory networks. We identify the underlying mechanism responsible for this phenomenon by examining the particular shape of the excitatory postsynaptic currents that arise in the neurons. We also examine the effects of the synaptic time constant, coupling strength, and channel noise on the occurrence of the phenomenon.


Subjects
Action Potentials , Neural Networks (Computer) , Neurons , Action Potentials/physiology , Cortical Synchronization/physiology , Gap Junctions/physiology , Humans , Neurological Models , Nerve Net/physiology , Neurons/physiology , Synapses/physiology
13.
Front Comput Neurosci; 13: 22, 2019.
Article in English | MEDLINE | ID: mdl-31057385

ABSTRACT

Nature exhibits countless examples of adaptive networks, whose topology evolves in constant coupling with the activity associated with their function. The brain is an illustrative example of a system in which a dynamic complex network develops by the generation and pruning of synaptic contacts between neurons while memories are acquired and consolidated. Here, we consider a recently proposed brain development model to study how the mechanisms responsible for the evolution of brain structure affect, and are affected by, memory storage processes. Following recent experimental observations, we assume that the basic rules for adding and removing synapses depend on local synaptic currents at the respective neurons, in addition to global mechanisms depending on the mean connectivity. In this way a feedback loop between "form" and "function" spontaneously emerges that influences the ability of the system to optimally store and retrieve sensory information in patterns of brain activity, or memories. In particular, we report here that, as a consequence of such a feedback loop, oscillations of the system's activity among the memorized patterns can occur, depending on parameters, reminiscent of the dynamics of the mind. Such oscillations have their origin in the destabilization of memory attractors due to the pruning dynamics, which induces a kind of structural disorder or noise in the system on a long-term scale. This constantly modifies the synaptic disorder induced by the interference among the many patterns of activity memorized in the system. This new and intriguing oscillatory behavior is associated only with long-term synaptic mechanisms acting during the network evolution dynamics; it does not depend on short-term synaptic processes, which are assumed in other studies but are not present in our model.

14.
Sci Rep; 8(1): 9910, 2018 Jul 02.
Article in English | MEDLINE | ID: mdl-29967410

ABSTRACT

The dynamics of networks of neuronal cultures has recently been shown to be strongly dependent on the network geometry and, in particular, on their dimensionality. However, this phenomenon has so far been mostly unexplored from the theoretical point of view. Here we reveal the rich interplay between network geometry and synchronization of coupled oscillators in the context of a simplicial complex model of manifolds called Complex Network Manifold. The networks generated by this model combine small-world properties (infinite Hausdorff dimension) and a highly modular structure with finite and tunable spectral dimension. We show that the networks display frustrated synchronization for a wide range of the coupling strength of the oscillators, and that the synchronization properties are directly affected by the spectral dimension of the network.

15.
Neural Netw; 20(2): 230-5, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17196366

ABSTRACT

We present a neurobiologically inspired stochastic cellular automaton whose state jumps with time between the attractors corresponding to a series of stored patterns. The jumping varies from regular to chaotic as the model parameters are modified. The resulting irregular behavior, which mimics states of attention in which a system shows great adaptability to changing stimuli, is a consequence, in the model, of short-time presynaptic noise that induces synaptic depression. We discuss results from both a mean-field analysis and Monte Carlo simulations.


Subjects
Brain/physiology , Nerve Net/physiology , Neural Networks (Computer) , Nonlinear Dynamics , Synapses/physiology , Animals , Neurological Models , Monte Carlo Method , Synaptic Transmission/physiology , Time Factors
16.
Phys Rev E; 95(1-1): 012404, 2017 Jan.
Article in English | MEDLINE | ID: mdl-28208458

ABSTRACT

We investigate the behavior of a model neuron that receives a biophysically realistic noisy postsynaptic current based on uncorrelated spiking activity from a large number of afferents. We show that, with static synapses, such noise can give rise to inverse stochastic resonance (ISR) as a function of the presynaptic firing rate. We compare this to the case with dynamic synapses that feature short-term synaptic plasticity and show that the interval of presynaptic firing rate over which ISR exists can be extended or diminished. We consider both short-term depression and facilitation. Interestingly, we find that a double inverse stochastic resonance (DISR), with two distinct wells centered at different presynaptic firing rates, can appear.
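The short-term plasticity referred to here is often modeled phenomenologically in the Tsodyks-Markram style. The sketch below implements one common form of that model (the parameter values and function name are illustrative assumptions, not taken from the paper): a facilitation variable u and a resource variable x jointly set the relative amplitude of each postsynaptic current, and their time constants control whether facilitation or depression dominates.

import numpy as np

def tm_psc_amplitudes(spike_times, U=0.1, tau_d=0.2, tau_f=1.0):
    """Relative PSC amplitudes at each presynaptic spike for a common form of
    the Tsodyks-Markram short-term plasticity model. u relaxes back to U with
    time constant tau_f (facilitation), x recovers toward 1 with tau_d
    (depression); the released fraction at each spike is u * x."""
    u, x, t_prev = U, 1.0, None
    amps = []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            u = U + (u - U) * np.exp(-dt / tau_f)       # facilitation decays
            x = 1.0 + (x - 1.0) * np.exp(-dt / tau_d)   # resources recover
        u = u + U * (1.0 - u)      # facilitation jump at the spike
        amps.append(u * x)         # released fraction ~ PSC amplitude
        x = x - u * x              # depression: resources are consumed
        t_prev = t
    return np.array(amps)

# regular 20 Hz presynaptic train: with these values the amplitude first
# facilitates and then slowly depresses toward a steady value; shortening
# tau_f or raising U makes depression dominate instead
train = np.arange(0.0, 0.5, 0.05)
print(np.round(tm_psc_amplitudes(train), 3))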


Subjects
Neurological Models , Neurons/physiology , Synapses/physiology , Action Potentials , Animals , Neuronal Plasticity/physiology , Stochastic Processes
17.
PLoS One; 11(1): e0145830, 2016.
Article in English | MEDLINE | ID: mdl-26730737

ABSTRACT

In this paper we analyze the interplay between the subthreshold oscillations of a single-neuron conductance-based model and the short-term plasticity of a dynamic synapse with a depressing mechanism. In previous research, the computational properties of subthreshold oscillations and dynamic synapses have been studied separately. Our results show that dynamic synapses can influence different aspects of the dynamics of neuronal subthreshold oscillations. Factors such as the maximum hyperpolarization level, the oscillation amplitude and frequency, or the resulting firing threshold are modulated by synaptic depression, which can even make subthreshold oscillations disappear. This influence reshapes the postsynaptic neuron's resonant properties arising from subthreshold oscillations and leads to specific input/output relations. We also study the neuron's response to another simultaneous input in the context of this modulation, and show distinct contextual processing as a function of the depression, in particular for the detection of signals arriving through weak synapses. The dynamics of intrinsic oscillations can be combined with the characteristic time scale of the modulatory input received by a dynamic synapse to build cost-effective cell- and channel-specific information discrimination mechanisms, beyond simple resonances. In this regard, we discuss the functional implications of the modulation of intrinsic subthreshold dynamics by synaptic depression.


Subjects
Inhibitory Postsynaptic Potentials , Neurons/physiology , Synapses/physiology , Computer Simulation , Humans , Neurological Models , Neurons/cytology
18.
Sci Rep; 5: 12216, 2015 Jul 20.
Article in English | MEDLINE | ID: mdl-26193453

ABSTRACT

We here illustrate how a well-founded study of the brain may originate in assuming analogies with phase-transition phenomena. Analyzing to what extent a weak signal endures in noisy environments, we identify the underlying mechanisms; the result is a description of how the excitability associated with (non-equilibrium) phase changes and criticality optimizes the processing of the signal. Our setting is a network of integrate-and-fire nodes in which connections are heterogeneous, with rapidly time-varying intensities mimicking fatigue and potentiation. The emergent behavior then becomes quite robust against modifications of the wiring topology (we considered networks ranging from fully connected graphs to the Homo sapiens connectome), showing the essential role of synaptic flickering in computation. We also suggest how to experimentally disclose significant changes during actual brain operation.


Subjects
Brain/physiology , Nervous System Physiological Phenomena , Computer Simulation , Humans , Neurological Models , Time Factors
19.
PLoS One; 10(3): e0121156, 2015.
Article in English | MEDLINE | ID: mdl-25799449

ABSTRACT

We investigate the efficient transmission and processing of weak, subthreshold signals in a realistic neural medium in the presence of different levels of underlying noise. Assuming Hebbian weights for the maximal synaptic conductances (which naturally balance the network between excitatory and inhibitory synapses) and considering short-term synaptic plasticity affecting such conductances, we found different dynamic phases in the system. These include a memory phase in which populations of neurons remain synchronized, an oscillatory phase in which transitions between different synchronized populations of neurons appear, and an asynchronous or noisy phase. When a weak stimulus is applied to each neuron and the level of noise in the medium is increased, we found efficient transmission of such stimuli around the transition and critical points separating the different phases, at well-defined levels of stochasticity in the system. We proved that this intriguing phenomenon is quite robust, as it occurs in different situations including several types of synaptic plasticity, different types and numbers of stored patterns, and diverse network topologies, namely diluted networks and complex topologies such as scale-free and small-world networks. We conclude that the robustness of the phenomenon in different realistic scenarios, including spiking neurons, short-term synaptic plasticity, and complex network topologies, makes it very likely that it could also occur in actual neural systems, as recent psychophysical experiments suggest.


Subjects
Neurons/physiology , Synaptic Transmission , Computational Biology/methods , Neurological Models , Neuronal Plasticity
20.
Phys Rev E Stat Nonlin Soft Matter Phys; 66(6 Pt 1): 061910, 2002 Dec.
Article in English | MEDLINE | ID: mdl-12513321

ABSTRACT

We compute the capacity of a binary neural network with dynamic depressing synapses to store and retrieve an infinite number of patterns. We use a biologically motivated model of synaptic depression and a standard mean-field approach. We find that at T=0 the critical storage capacity decreases with the degree of the depression. We confirm the validity of our main mean-field results with numerical simulations.
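For background, the standard reference point for this calculation is the Hopfield-type network with static Hebbian couplings, stated here from well-known results rather than from this paper: P binary patterns ξ^μ are stored as

w_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu},
\qquad
\alpha_c = \frac{P_{\max}}{N} \approx 0.138 \quad (\text{static synapses},\ T = 0);

the mean-field result summarized above is that activity-dependent synaptic depression lowers the critical capacity below this static-synapse value, the more so the stronger the depression.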
