Results 1 - 11 of 11
1.
PLoS Comput Biol ; 18(9): e1010086, 2022 09.
Article in English | MEDLINE | ID: mdl-36074778

ABSTRACT

Sustainable research on computational models of neuronal networks requires published models to be understandable, reproducible, and extendable. Missing details or ambiguities about mathematical concepts and assumptions, algorithmic implementations, or parameterizations hinder progress. Such flaws are unfortunately frequent, and one reason is a lack of readily applicable standards and tools for model description. Our work aims not only to advance complete and concise descriptions of network connectivity but also to guide the implementation of connection routines in simulation software and neuromorphic hardware systems. We first review models made available by the computational neuroscience community in the repositories ModelDB and Open Source Brain, and investigate the corresponding connectivity structures and their descriptions in both manuscript and code. The review comprises the connectivity of networks with diverse levels of neuroanatomical detail and exposes how connectivity is abstracted in existing description languages and simulator interfaces. We find that a substantial proportion of the published descriptions of connectivity is ambiguous. Based on this review, we derive a set of connectivity concepts for deterministically and probabilistically connected networks and also address networks embedded in metric space. Besides these mathematical and textual guidelines, we propose a unified graphical notation for network diagrams to facilitate an intuitive understanding of network properties. Examples of representative network models demonstrate the practical use of the ideas. We hope that the proposed standardizations will contribute to unambiguous descriptions and reproducible implementations of neuronal network connectivity in computational neuroscience.
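One of the probabilistic connectivity concepts such guidelines cover, pairwise Bernoulli connectivity (each source-target pair is connected independently with probability p), can be sketched as follows. The function name, parameters, and NumPy-based implementation are illustrative assumptions, not code from the paper:

```python
import numpy as np

def connect_pairwise_bernoulli(n_source, n_target, p, seed=None,
                               allow_autapses=True):
    """Illustrative sketch: draw each source-target connection
    independently with probability p (pairwise Bernoulli)."""
    rng = np.random.default_rng(seed)
    mask = rng.random((n_source, n_target)) < p
    if not allow_autapses and n_source == n_target:
        np.fill_diagonal(mask, False)  # forbid self-connections
    sources, targets = np.nonzero(mask)
    return list(zip(sources.tolist(), targets.tolist()))

edges = connect_pairwise_bernoulli(100, 100, 0.1, seed=42,
                                   allow_autapses=False)
```

Note that even this simple routine has choices (e.g., whether autapses or multapses are permitted) that published descriptions frequently leave implicit, which is exactly the kind of ambiguity the paper addresses.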


Subject(s)
Models, Neurological , Neurosciences , Computer Simulation , Neurons/physiology , Software
2.
J Comput Neurosci ; 45(2): 103-132, 2018 10.
Article in English | MEDLINE | ID: mdl-30146661

ABSTRACT

Capturing the response behavior of spiking neuron models with rate-based models facilitates the investigation of neuronal networks using powerful methods for rate-based network dynamics. To this end, we investigate the responses of two widely used neuron model types, the Izhikevich and augmented multi-adaptive threshold (AMAT) models, to spiking inputs ranging from step responses to natural spike data. We find (i) that linear-nonlinear firing rate models fitted to test data can be used to describe the firing-rate responses of AMAT and Izhikevich spiking neuron models in many cases; (ii) that firing-rate responses are generally too complex to be captured by first-order low-pass filters but require bandpass filters instead; (iii) that linear-nonlinear models capture the response of AMAT models better than of Izhikevich models; (iv) that the wide range of response types evoked by current-injection experiments collapses to few response types when neurons are driven by stationary or sinusoidally modulated Poisson input; and (v) that AMAT and Izhikevich models show different responses to spike input despite identical responses to current injections. Together, these findings suggest that rate-based models of network dynamics may capture a wider range of neuronal response properties by incorporating second-order bandpass filters fitted to responses of spiking model neurons. These models may contribute to bringing rate-based network modeling closer to the reality of biological neuronal networks.
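Finding (ii), that a bandpass rather than a low-pass filter is needed, can be illustrated with a minimal linear-nonlinear sketch using a biphasic kernel. The kernel shape, time constants, and DC gain below are assumptions for illustration, not the filters fitted in the paper:

```python
import numpy as np

def bandpass_ln_response(rate_in, dt, tau_fast=2.0, tau_slow=20.0,
                         dc_gain=0.5, threshold=0.0):
    """Linear-nonlinear sketch: a biphasic (bandpass-like) kernel,
    fast excitation minus slower suppression, followed by a static
    rectifying nonlinearity. All parameter values are illustrative."""
    t = np.arange(0.0, 10.0 * tau_slow, dt)
    kernel = (np.exp(-t / tau_fast) / tau_fast
              - (1.0 - dc_gain) * np.exp(-t / tau_slow) / tau_slow)
    linear = np.convolve(rate_in, kernel, mode="full")[: len(rate_in)] * dt
    return np.maximum(linear - threshold, 0.0)  # rectification

dt = 0.1                                   # ms
t = np.arange(0.0, 500.0, dt)
rate_in = np.where(t < 100.0, 5.0, 20.0)   # step from 5 to 20 spikes/s
rate_out = bandpass_ln_response(rate_in, dt)
```

A step in input rate then produces a transient overshoot that relaxes toward an attenuated steady state, the qualitative signature that distinguishes a bandpass response from a first-order low-pass one.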


Subject(s)
Action Potentials/physiology , Models, Neurological , Neurons/physiology , Animals , Computer Simulation , Electric Stimulation , Linear Models , Nerve Net , Nonlinear Dynamics
3.
J Comput Neurosci ; 35(3): 359-75, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23783890

ABSTRACT

Firing-rate models provide a practical tool for studying signal processing in the early visual system, permitting more thorough mathematical analysis than spike-based models. We show here that essential response properties of relay cells in the lateral geniculate nucleus (LGN) can be captured by surprisingly simple firing-rate models consisting of a low-pass filter and a nonlinear activation function. Our analysis starts from two spiking neuron models based on experimental data: a spike-response model fitted to data from macaque (Carandini et al. J. Vis., 20(14), 1-2011, 2007), and a model with conductance-based synapses and afterhyperpolarizing currents fitted to data from cat (Casti et al. J. Comput. Neurosci., 24(2), 235-252, 2008). We obtained the nonlinear activation function by stimulating the model neurons with stationary stochastic spike trains, while we characterized the linear filter by fitting a low-pass filter to responses to sinusoidally modulated stochastic spike trains. To account for the non-Poisson nature of retinal spike trains, we performed all analyses with spike trains with higher-order gamma statistics in addition to Poissonian spike trains. Interestingly, the properties of the low-pass filter depend only on the average input rate, but not on the modulation depth of sinusoidally modulated input. Thus, the response properties of our model are fully specified by just three parameters (low-frequency gain, cutoff frequency, and delay) for a given mean input rate and input regularity. This simple firing-rate model reproduces the response of spiking neurons to a step in input rate very well for Poissonian as well as for non-Poissonian input. We also found that the cutoff frequencies, and thus the filter time constants, of the rate-based model are unrelated to the membrane time constants of the underlying spiking models, in agreement with similar observations for simpler models.
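The three-parameter model described above (low-frequency gain, cutoff frequency, delay) can be sketched as a first-order low-pass filter with a pure delay. The implementation and all numerical values are illustrative, not the fitted parameters from the paper:

```python
import numpy as np

def lowpass_rate_model(rate_in, dt, gain, f_cutoff, delay):
    """Sketch of a three-parameter firing-rate model: first-order
    low-pass filter with low-frequency gain, cutoff frequency, and
    a pure delay. Parameter values are illustrative."""
    tau = 1.0 / (2.0 * np.pi * f_cutoff)  # filter time constant from cutoff
    shift = int(round(delay / dt))
    out = np.zeros_like(rate_in)
    r = 0.0
    for i, x in enumerate(rate_in):
        r += dt / tau * (gain * x - r)    # exponential relaxation to gain * input
        if i + shift < len(out):
            out[i + shift] = r            # delayed output
    return out

dt = 0.001                                # s
t = np.arange(0.0, 1.0, dt)
rate_in = np.where(t < 0.2, 0.0, 10.0)    # step from 0 to 10 spikes/s
out = lowpass_rate_model(rate_in, dt, gain=2.0, f_cutoff=10.0, delay=0.005)
```

For a step in input rate, the output relaxes exponentially toward gain times the input after the delay, which is the step-response behavior the abstract reports the model reproducing.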


Subject(s)
Geniculate Bodies/physiology , Neurons/physiology , Algorithms , Animals , Computer Simulation , Electric Stimulation , Electrophysiological Phenomena/physiology , Excitatory Postsynaptic Potentials/physiology , Membrane Potentials/physiology , Models, Neurological , Nonlinear Dynamics , Synaptic Transmission/physiology
4.
Network ; 23(4): 131-49, 2012.
Article in English | MEDLINE | ID: mdl-22994683

ABSTRACT

As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions and explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.


Subject(s)
Computer Simulation , Documentation/methods , Information Dissemination/methods , Models, Neurological , Nerve Net/physiology , Software , Terminology as Topic , Animals , Humans , Programming Languages
5.
Neuron ; 102(4): 735-744, 2019 05 22.
Article in English | MEDLINE | ID: mdl-31121126

ABSTRACT

A key element of the European Union's Human Brain Project (HBP) and other large-scale brain research projects is the simulation of large-scale model networks of neurons. Here, we argue why such simulations will likely be indispensable for bridging the scales between the neuron and system levels in the brain, and why a set of brain simulators based on neuron models at different levels of biological detail should therefore be developed. To allow for systematic refinement of candidate network models by comparison with experiments, the simulations should be multimodal in the sense that they should predict not only action potentials, but also electric, magnetic, and optical signals measured at the population and system levels.


Subject(s)
Brain/physiology , Computer Simulation , Models, Neurological , Neurons/physiology , Humans , Neural Networks, Computer , Neurosciences
6.
Front Neuroinform ; 11: 30, 2017.
Article in English | MEDLINE | ID: mdl-28559808

ABSTRACT

Recent advances in the development of data structures to represent spiking neuron network models enable us to exploit the complete memory of petascale computers for a single brain-scale network simulation. In this work, we investigate how well we can exploit the computing power of such supercomputers for the creation of neuronal networks. Using an established benchmark, we divide the runtime of simulation code into the phase of network construction and the phase during which the dynamical state is advanced in time. We find that on multi-core compute nodes network creation scales well with process-parallel code but exhibits a prohibitively large memory consumption. Thread-parallel network creation, in contrast, exhibits speedup only up to a small number of threads but has little overhead in terms of memory. We further observe that the algorithms creating instances of model neurons and their connections scale well for networks of ten thousand neurons, but do not show the same speedup for networks of millions of neurons. Our analysis reveals that the lack of scaling of thread-parallel network creation is due to inadequate memory allocation strategies and demonstrates that thread-optimized memory allocators recover excellent scaling. An analysis of the loop order used for network construction reveals that more complex tests on the locality of operations significantly improve scaling and reduce runtime by allowing construction algorithms to step through large networks more efficiently than in existing code. The combination of these techniques increases performance by an order of magnitude and harnesses the increasingly parallel compute power of the nodes in high-performance clusters and supercomputers.

7.
Front Comput Neurosci ; 8: 136, 2014.
Article in English | MEDLINE | ID: mdl-25400575

ABSTRACT

Random networks of integrate-and-fire neurons with strong current-based synapses can, contrary to previous belief, assume stable states of sustained asynchronous and irregular firing, even without external random background or pacemaker neurons. We analyze the mechanisms underlying the emergence, lifetime and irregularity of such self-sustained activity states. We first demonstrate how the competition between the mean and the variance of the synaptic input leads to a non-monotonic firing-rate transfer in the network. Thus, by increasing the synaptic coupling strength, the system can become bistable: In addition to the quiescent state, a second stable fixed-point at moderate firing rates can emerge by a saddle-node bifurcation. Inherently generated fluctuations of the population firing rate around this non-trivial fixed-point can trigger transitions into the quiescent state. Hence, the trade-off between the magnitude of the population-rate fluctuations and the size of the basin of attraction of the non-trivial rate fixed-point determines the onset and the lifetime of self-sustained activity states. During self-sustained activity, individual neuronal activity is moreover highly irregular, switching between long periods of low firing rate and short burst-like states. We show that this is an effect of the strong synaptic weights and the finite time constant of synaptic and neuronal integration, and can actually serve to stabilize the self-sustained state.
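The bistability described above can be illustrated with a toy one-dimensional rate model: a quiescent fixed point at zero plus a second stable fixed point at a moderate rate, separated by an unstable one. The specific transfer function and all parameter values below are illustrative assumptions, not the network-derived transfer function from the paper:

```python
def transfer(r, r_max=40.0, r_half=10.0):
    """Illustrative supralinear-saturating transfer function (spikes/s):
    flat near zero, so the quiescent state r = 0 is stable."""
    return r_max * r**2 / (r_half**2 + r**2)

def simulate(r0, tau=0.01, dt=0.001, t_max=1.0):
    """Forward-Euler integration of dr/dt = (-r + transfer(r)) / tau."""
    r = r0
    for _ in range(int(t_max / dt)):
        r += dt / tau * (-r + transfer(r))
    return r

# initial rates on either side of the unstable fixed point (~2.7 spikes/s):
low = simulate(2.0)   # decays toward the quiescent state
high = simulate(5.0)  # settles at the self-sustained state (~37.3 spikes/s)
```

Starting just below the unstable fixed point, the rate decays to quiescence; starting just above it, the rate settles at the non-trivial stable fixed point. This mirrors how population-rate fluctuations across the basin boundary can terminate self-sustained activity.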

8.
Front Neuroinform ; 8: 78, 2014.
Article in English | MEDLINE | ID: mdl-25346682

ABSTRACT

Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses, and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Early parallel simulation codes already stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.

10.
Cogn Neurodyn ; 6(4): 307-24, 2012 Aug.
Article in English | MEDLINE | ID: mdl-24995047

ABSTRACT

A striking feature of the organization of the early visual pathway is the significant feedback from primary visual cortex to cells in the dorsal lateral geniculate nucleus (LGN). Despite numerous experimental and modeling studies, the functional role for this feedback remains elusive. We present a new firing-rate-based model for LGN relay cells in cat, explicitly accounting for thalamocortical loop effects. The established DOG model, here assumed to account for the spatial aspects of the feedforward processing of visual stimuli, is extended to incorporate the influence of thalamocortical loops including a full set of orientation-selective cortical cell populations. Assuming a phase-reversed push-pull arrangement of ON and OFF cortical feedback as seen experimentally, this extended DOG (eDOG) model exhibits linear firing properties despite non-linear firing characteristics of the corticothalamic cells. The spatiotemporal receptive field of the eDOG model has a simple algebraic structure in Fourier space, while the real-space receptive field, as well as responses to visual stimuli, are found by evaluation of an integral. As an example application, we use the eDOG model to study effects of cortical feedback on responses to flashing circular spots and patch-grating stimuli and find that the eDOG model can qualitatively account for experimental findings.
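The feedforward DOG component that the eDOG model extends is a standard difference-of-Gaussians receptive field: a narrow excitatory center minus a broader inhibitory surround. A minimal sketch follows; the parameter values are illustrative, not the fitted values from the paper:

```python
import numpy as np

def dog_kernel(x, y, A_c=1.0, a_c=0.3, A_s=0.85, a_s=0.9):
    """Difference-of-Gaussians (DOG) spatial receptive field:
    narrow excitatory center minus broad inhibitory surround.
    A_c, A_s are the integrated strengths and a_c, a_s the widths
    (degrees of visual angle); all values are illustrative."""
    r2 = x**2 + y**2
    center = A_c / (np.pi * a_c**2) * np.exp(-r2 / a_c**2)
    surround = A_s / (np.pi * a_s**2) * np.exp(-r2 / a_s**2)
    return center - surround

on_center = dog_kernel(0.0, 0.0)   # positive: excitatory center
off_flank = dog_kernel(0.8, 0.0)   # negative: inhibitory surround
```

Because each term is a Gaussian, the kernel has a simple closed form in Fourier space as well, which is the property the eDOG extension exploits when the corticothalamic loop terms are added.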

11.
Neural Comput ; 21(2): 353-9, 2009 Feb.
Article in English | MEDLINE | ID: mdl-19431263

ABSTRACT

Lovelace and Cios (2008) recently proposed a very simple spiking neuron (VSSN) model for simulations of large neuronal networks as an efficient replacement for the integrate-and-fire neuron model. We argue that the VSSN model falls behind key advances in neuronal network modeling over the past 20 years, in particular, techniques that permit simulators to compute the state of the neuron without repeated summation over the history of input spikes and to integrate the subthreshold dynamics exactly. State-of-the-art solvers for networks of integrate-and-fire model neurons are substantially more efficient than the VSSN simulator and allow routine simulations of networks of some 10^5 neurons and 10^9 connections on moderate computer clusters.
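The exact-integration technique referred to above exploits the linearity of the subthreshold dynamics: the membrane equation is advanced with an exponential propagator instead of resumming the input-spike history each step. A minimal sketch for a leaky integrate-and-fire membrane with piecewise-constant input follows; parameter values are illustrative:

```python
import math

def lif_exact_step(V, I, dt, tau_m=0.01, R=1e8, E_L=-0.07):
    """Advance the subthreshold LIF membrane potential by one step of
    length dt using the exact solution of the linear ODE
    tau_m * dV/dt = -(V - E_L) + R * I, with I held constant over the
    step, instead of resumming the history of input spikes."""
    prop = math.exp(-dt / tau_m)  # propagator of the homogeneous solution
    V_inf = E_L + R * I           # steady state for constant input current
    return V_inf + (V - V_inf) * prop

# exactness: two half steps give the same result as one full step
V_full = lif_exact_step(-0.07, 1e-10, 0.01)
V_half = lif_exact_step(lif_exact_step(-0.07, 1e-10, 0.005), 1e-10, 0.005)
```

Because the update is exact for any step size, the result is independent of how the interval is subdivided, a property no fixed-order explicit scheme shares.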


Subject(s)
Action Potentials/physiology , Models, Neurological , Neural Networks, Computer , Neurons/physiology , Animals , Computer Simulation , Nonlinear Dynamics