Results 1 - 20 of 124
1.
Phys Rev E ; 109(5-1): 054203, 2024 May.
Article in English | MEDLINE | ID: mdl-38907463

ABSTRACT

Time delays play a significant role in dynamical systems, as they affect their transient behavior and the dimensionality of their attractors. The number, values, and spacing of these time delays influence the eigenvalues of a nonlinear delay-differential system at its fixed point. Here we explore a multidelay system as the core computational element of a reservoir computer making predictions on its input in the usual regime close to fixed point instability. Variations in the number and separation of time delays are first examined to determine how these parameters of the delay distribution affect the effectiveness of time-delay reservoirs for nonlinear time series prediction. We demonstrate computationally that an optoelectronic device with multiple different delays can improve the mapping of scalar input into higher-dimensional dynamics, and thus its memory and prediction capabilities for input time series generated by low- and high-dimensional dynamical systems. In particular, this enhances the suitability of such reservoir computers for predicting input data with temporal correlations. Additionally, we highlight a pronounced harmful resonance condition for reservoir computing when using an electro-optic oscillator model with multiple delays. We illustrate that the resonance point may shift depending on the task at hand, such as cross prediction or multistep-ahead prediction, in both the single-delay and multiple-delay cases.
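A minimal sketch of the multi-delay, time-multiplexed reservoir idea, assuming an Ikeda-type sin² nonlinearity, three arbitrary feedback delays, a random input mask defining virtual nodes, and a ridge-regression readout; none of the parameter values or readout details below are taken from the paper.

```python
# Hedged sketch of a time-delay reservoir with multiple feedback delays (all values assumed).
import numpy as np

rng = np.random.default_rng(0)
N = 200                                # virtual nodes per delay interval
delays = [200, 311, 487]               # multiple feedback delays, in integration steps (assumed)
eps, beta, phi = 0.05, 0.4, 0.2        # time scale, feedback gain, operating point (assumed)

def run_reservoir(u):
    """Drive x' = -x + (beta/K) * sum_k sin^2(x(t - tau_k) + mask*input + phi)."""
    mask = rng.uniform(-1, 1, N)                   # input mask defining the virtual nodes
    buf = np.zeros(len(u) * N + max(delays) + 1)   # full state history
    states = np.zeros((len(u), N))
    for t in range(len(u)):
        for n in range(N):
            i = t * N + n + max(delays)
            drive = sum(np.sin(buf[i - d] + mask[n] * u[t] + phi) ** 2 for d in delays)
            buf[i + 1] = buf[i] + eps * (-buf[i] + beta * drive / len(delays))
            states[t, n] = buf[i + 1]
    return states

# one-step-ahead prediction of a noisy scalar series with a ridge-regression readout
u = np.sin(0.3 * np.arange(400)) + 0.1 * rng.standard_normal(400)
X = run_reservoir(u[:-1])
W = np.linalg.solve(X.T @ X + 1e-2 * np.eye(N), X.T @ u[1:])
print("training MSE:", np.mean((X @ W - u[1:]) ** 2))
```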

2.
PLoS Comput Biol ; 20(3): e1011846, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38489374

ABSTRACT

In a variety of neurons, action potentials (APs) initiate at the proximal axon, within a region called the axon initial segment (AIS), which has a high density of voltage-gated sodium channels (NaVs) on its membrane. In pyramidal neurons, the proximal AIS has been reported to exhibit a higher proportion of NaVs with gating properties that are "right-shifted" to more depolarized voltages, compared to the distal AIS. Further, recent experiments have revealed that as neurons develop, the spatial distribution of NaV subtypes along the AIS can change substantially, suggesting that neurons tune their excitability by modifying said distribution. When neurons are stimulated axonally, computational modelling has shown that this spatial separation of gating properties in the AIS enhances the backpropagation of APs into the dendrites. In contrast, in the more natural scenario of somatic stimulation, our simulations show that the same distribution can impede backpropagation, suggesting that the choice of orthodromic versus antidromic stimulation can bias or even invert experimental findings regarding the role of NaV subtypes in the AIS. We implemented a range of hypothetical NaV distributions in the AIS of three multicompartmental pyramidal cell models and investigated the precise kinetic mechanisms underlying such effects, as the spatial distribution of NaV subtypes is varied. With axonal stimulation, proximal NaV availability dominates, such that concentrating right-shifted NaVs in the proximal AIS promotes backpropagation. However, with somatic stimulation, the models are insensitive to availability kinetics. Instead, the higher activation threshold of right-shifted NaVs in the AIS impedes backpropagation. Therefore, recently observed developmental changes to the spatial separation and relative proportions of NaV1.2 and NaV1.6 in the AIS differentially impact activation and availability. The observed effects on backpropagation, and potentially learning via its putative role in synaptic plasticity (e.g. through spike-timing-dependent plasticity), are opposite for orthodromic versus antidromic stimulation, which should inform hypotheses about the impact of the developmentally regulated subcellular localization of these NaV subtypes.
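The "right shift" of gating properties can be pictured as a Boltzmann activation curve with a more depolarized half-activation voltage; the sketch below uses illustrative values only, not fitted NaV1.2/NaV1.6 parameters.

```python
# Sketch: distal vs "right-shifted" proximal NaV steady-state activation (illustrative values only).
import numpy as np

def m_inf(v, v_half, k=6.0):
    """Steady-state activation of a NaV-like channel (Boltzmann function)."""
    return 1.0 / (1.0 + np.exp(-(v - v_half) / k))

v = np.linspace(-70, 0, 15)
distal   = m_inf(v, v_half=-43.0)           # lower-threshold subtype (assumed V1/2)
proximal = m_inf(v, v_half=-43.0 + 13.0)    # "right-shifted" by ~13 mV (assumed)

for vi, d, p in zip(v, distal, proximal):
    print(f"V={vi:6.1f} mV  distal m_inf={d:.3f}  right-shifted m_inf={p:.3f}")
```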


Subject(s)
Axon Initial Segment , Voltage-Gated Sodium Channels , Axon Initial Segment/physiology , NAV1.6 Voltage-Gated Sodium Channel/ultrastructure , Axons/physiology , Neurons/physiology , Action Potentials/physiology
3.
J Physiol ; 602(3): 417-420, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38071740
4.
J Physiol ; 601(19): 4397-4422, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37676904

ABSTRACT

Hilar mossy cells (hMCs) in the dentate gyrus (DG) receive inputs from DG granule cells (GCs), CA3 pyramidal cells and inhibitory interneurons, and provide feedback input to GCs. Behavioural and in vivo recording experiments implicate hMCs in pattern separation, navigation and spatial learning. Our experiments link hMC intrinsic excitability to their synaptically evoked in vivo spiking outputs. We performed electrophysiological recordings from DG neurons and found that hMCs displayed an adaptive spike threshold that increased both in proportion to the intensity of injected currents and in response to spiking itself, returning to baseline over a long time scale, thereby instantaneously limiting their firing rate responses. hMC activity is additionally limited by a prominent medium after-hyperpolarizing potential (AHP) generated by small-conductance K+ channels. We hypothesize that these intrinsic hMC properties are responsible for their low in vivo firing rates. Our findings extend previous studies that compare hMCs, CA3 pyramidal cells and hilar inhibitory cells and provide novel quantitative data that contrast the intrinsic properties of these cell types. We developed a phenomenological exponential integrate-and-fire model that closely reproduces the hMC adaptive threshold nonlinearities with respect to their threshold dependence on input current intensity, evoked spike latency and long-lasting spike-induced increase in spike threshold. Our robust and computationally efficient model is amenable to incorporation into large network models of the DG that will deepen our understanding of the neural bases of pattern separation, spatial navigation and learning. KEY POINTS: Previous studies have shown that hilar mossy cells (hMCs) are implicated in pattern separation and the formation of spatial memory, but how their intrinsic properties relate to their in vivo spiking patterns is still unknown. Here we show that hMCs display electrophysiological properties that distinguish them from the other hilar cell types, including a highly adaptive spike threshold that decays slowly. The spike-dependent increase in threshold, combined with an after-hyperpolarizing potential mediated by a slow K+ conductance, is hypothesized to be responsible for the low firing rate of hMCs observed in vivo. The hMCs' features are well captured by a modified stochastic exponential integrate-and-fire model that has the unique feature of a threshold intrinsically dependent on both the stimulus intensity and the spiking history. This computational model will allow future work to study how hMCs can contribute to spatial memory formation and navigation.
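A toy sketch of an exponential integrate-and-fire neuron whose spike threshold both tracks the injected current and jumps after each spike before decaying slowly, in the spirit of the model described above; the threshold equation and all parameter values are assumptions, not the fitted hMC model.

```python
# Hedged sketch: exponential integrate-and-fire with a current- and spike-dependent threshold.
import numpy as np

dt, T = 0.1, 2000.0                                   # ms
C, gL, EL, DeltaT = 200.0, 10.0, -65.0, 2.0           # pF, nS, mV, mV (assumed)
vt0, alpha, dvt, tau_vt = -50.0, 0.2, 8.0, 800.0      # baseline thr, current gain, spike jump, decay (assumed)

def simulate(I):
    v, vt, spikes = EL, vt0, []
    for i in range(int(T / dt)):
        vt += dt * (-(vt - (vt0 + alpha * I / 100.0)) / tau_vt)   # threshold tracks current, decays slowly
        dv = (-gL * (v - EL) + gL * DeltaT * np.exp((v - vt) / DeltaT) + I) / C
        v += dt * dv
        if v >= vt + 5.0:            # spike detected above the dynamic threshold
            spikes.append(i * dt)
            v = EL                    # reset
            vt += dvt                 # spike-triggered threshold increase
    return spikes

for I in (200.0, 400.0, 600.0):       # pA
    print(f"I = {I:.0f} pA -> {len(simulate(I))} spikes in {T:.0f} ms")
```

The slowly decaying, spike-triggered threshold keeps the firing rate low despite sustained drive, which is the qualitative point made above.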

5.
Entropy (Basel) ; 25(9), 2023 Sep 03.
Article in English | MEDLINE | ID: mdl-37761590

ABSTRACT

Complex living systems, such as the human organism, are characterized by their self-organized and dissipative behaviors, where irreversible processes continuously produce entropy internally and export it to the environment; however, means of measuring human entropy production and entropy flow over time have not been well studied. In this article, we leverage prior experimental data to introduce an approach for the continuous measurement of external entropy flow (released to the environment) and internal entropy production (within the body), using direct and indirect calorimetry, respectively, for humans exercising under heat stress. Direct calorimetry, performed with a whole-body modified Snellen calorimeter, was used to measure the external heat dissipation from the change in temperature and relative humidity between the air outflow and inflow, from which the body's rates of entropy flow were derived. Indirect calorimetry, which measures oxygen consumption and carbon dioxide production from inspired and expired gases, was used to monitor internal entropy production. A two-compartment entropy flow model was used to calculate the rates of internal entropy production and external entropy flow for 11 middle-aged men during a schedule of alternating exercise and resting bouts at a fixed metabolic heat production rate. We measured a resting internal entropy production rate of (0.18 ± 0.01) W/(K·m²) during heat stress only, which is in agreement with published measurements. This research introduces an approach for the real-time monitoring of entropy production and entropy flow in humans, and aims for an improved understanding of human health and illness based on non-equilibrium thermodynamics.
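A back-of-the-envelope sketch of the two quantities, treating internal entropy production as metabolic heat generation divided by body temperature and external entropy flow as heat dissipated to the environment divided by air temperature, each normalized by body surface area; the numbers and the simplification are illustrative, not the study's two-compartment model.

```python
# Hedged order-of-magnitude sketch (illustrative numbers only).
M      = 120.0   # metabolic heat production rate, W (assumed)
Q_out  = 110.0   # heat dissipated to the environment, W (assumed, as from direct calorimetry)
T_body = 310.0   # mean body temperature, K
T_env  = 303.0   # calorimeter air temperature under heat stress, K (assumed)
A_body = 1.9     # body surface area, m^2 (assumed)

# internal entropy production: irreversible conversion of chemical energy to heat
sigma_internal = M / T_body / A_body          # W/(K*m^2)
# external entropy flow: heat exported to the environment at its temperature
flow_external = Q_out / T_env / A_body        # W/(K*m^2)

print(f"internal entropy production ~ {sigma_internal:.3f} W/(K*m^2)")
print(f"external entropy flow       ~ {flow_external:.3f} W/(K*m^2)")
```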

6.
Elife ; 12, 2023 Jan 19.
Article in English | MEDLINE | ID: mdl-36655738

ABSTRACT

By means of an expansive innervation, the serotonin (5-HT) neurons of the dorsal raphe nucleus (DRN) are positioned to enact coordinated modulation of circuits distributed across the entire brain in order to adaptively regulate behavior. Yet the network computations that emerge from the excitability and connectivity features of the DRN are still poorly understood. To gain insight into these computations, we began by carrying out a detailed electrophysiological characterization of genetically identified mouse 5-HT and somatostatin (SOM) neurons. We next developed a single-neuron modeling framework that combines the realism of Hodgkin-Huxley models with the simplicity and predictive power of generalized integrate-and-fire models. We found that feedforward inhibition of 5-HT neurons by heterogeneous SOM neurons implemented divisive inhibition, while endocannabinoid-mediated modulation of excitatory drive to the DRN increased the gain of 5-HT output. Our most striking finding was that the output of the DRN encodes a mixture of the intensity and temporal derivative of its input, and that the temporal derivative component dominates this mixture precisely when the input is increasing rapidly. This network computation primarily emerged from prominent adaptation mechanisms found in 5-HT neurons, including a previously undescribed dynamic threshold. By applying a bottom-up neural network modeling approach, our results suggest that the DRN is particularly apt to encode input changes over short timescales, reflecting one of the salient emerging computations that dominate its output to regulate behavior.
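A toy linear rate model with a slow adaptation variable illustrates how adaptation yields a mixed intensity/derivative code: the response to an input step overshoots (derivative-like) before settling to a level set by the intensity. Parameters are arbitrary and this is not the fitted 5-HT neuron model described above.

```python
# Hedged sketch: adaptation produces a transient (derivative-like) plus sustained (intensity) response.
import numpy as np

dt, T = 1.0, 4000.0                        # ms
tau_r, tau_a, b = 20.0, 800.0, 1.5         # fast rate, slow adaptation, adaptation strength (assumed)
steps = int(T / dt)
I = np.where(np.arange(steps) * dt > 1000.0, 1.0, 0.2)   # input step at t = 1 s

r = np.zeros(steps); a = np.zeros(steps)
for n in range(steps - 1):
    r[n + 1] = r[n] + dt / tau_r * (-r[n] + I[n] - a[n])   # rate driven by input minus adaptation
    a[n + 1] = a[n] + dt / tau_a * (-a[n] + b * r[n])       # adaptation slowly tracks the rate

peak = r[int(1000 / dt):int(1500 / dt)].max()
print(f"peak response {peak:.2f} vs steady state {r[-1]:.2f} (overshoot = derivative-like component)")
```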


Subject(s)
Dorsal Raphe Nucleus , Serotonin , Mice , Animals , Dorsal Raphe Nucleus/physiology , Serotonin/physiology , Neurons/physiology , Neural Networks, Computer
7.
Chaos ; 32(5): 051101, 2022 May.
Article in English | MEDLINE | ID: mdl-35649970

ABSTRACT

Mounting evidence in recent years suggests that astrocytes, a sub-type of glial cells, not only provide metabolic and structural support for neurons and synapses but also play critical roles in the regulation of proper functioning of the nervous system. In this work, we investigate the effect of astrocytes on the spontaneous firing activity of a neuron through a combined model that includes a neuron-astrocyte pair. First, we show that an astrocyte may provide a kind of multistability in neuron dynamics by inducing different firing modes such as random and bursty spiking. Then, we identify the underlying mechanism of this behavior and search for the astrocytic factors that may have regulatory roles in different firing regimes. More specifically, we explore how an astrocyte can participate in the occurrence and control of spontaneous irregular spiking activity of a neuron in random spiking mode. Additionally, we systematically investigate the bursty firing regime dynamics of the neuron under the variation of biophysical factors related to the intracellular environment of the astrocyte. It is found that an astrocyte coupled to a neuron can provide a control mechanism for both spontaneous firing irregularity and burst firing statistics, i.e., burst regularity and size.


Subject(s)
Astrocytes , Models, Neurological , Neurons/physiology , Synapses/physiology
8.
Biol Cybern ; 116(2): 129-146, 2022 04.
Article in English | MEDLINE | ID: mdl-35486195

ABSTRACT

We elucidate how coupling delays and noise impact phase and mutual information relationships between two stochastic brain rhythms. This impact depends on the dynamical regime of each PING-based rhythm, as well as on network heterogeneity and coupling asymmetry. The number of peaks at positive and negative time lags in the delayed mutual information between the two bi-directionally communicating rhythms defines our measure of flexibility of information sharing and reflects the number of ways in which the two networks can alternately lead one another. We identify two distinct mechanisms for the appearance of qualitatively similar flexible information sharing. The flexibility in the quasi-cycle regime arises from the coupling delay-induced bimodality of the phase difference distribution, and the related bimodal mutual information. It persists in the presence of asymmetric coupling and heterogeneity but is limited to two routes of information sharing. The second mechanism, found in the noisy limit cycle regime, is not induced by the delay. However, delay-coupling and heterogeneity enable communication routes at multiple time lags. Noise disrupts the shared compromise frequency, allowing the expression of individual network frequencies, which leads to a slow beating pattern. Simulations of an envelope-phase description for delay-coupled quasi-cycles yield qualitatively similar properties as for the full system. Near the bifurcation from in-phase to out-of-phase behaviour, a single preferred phase difference can coexist with two information sharing routes; further, the phase laggard can be the mutual information leader, or vice versa. Overall, the coupling delay endows a two-rhythm system with an array of lead-lag relationships and mutual information resonances that exist in spite of the noise and across the Hopf bifurcation. These beg to be mapped out experimentally with the help of our predictions.
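A sketch of the delayed mutual information measure underlying the flexibility count, using a plug-in histogram MI estimator on two synthetic signals standing in for the rhythms' phases; the bin count, lag range and the signals themselves are illustrative assumptions, not the paper's networks.

```python
# Hedged sketch: delayed mutual information and its local maxima (the flexibility count).
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in MI estimate (bits) from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz]))

def delayed_mi(x, y, max_lag):
    """MI between x(t) and y(t + lag) for lags in [-max_lag, max_lag]."""
    lags = np.arange(-max_lag, max_lag + 1)
    mi = [mutual_information(x[: len(x) - lag], y[lag:]) if lag >= 0
          else mutual_information(x[-lag:], y[:lag]) for lag in lags]
    return lags, np.array(mi)

rng = np.random.default_rng(1)
t = np.arange(5000)
x = np.sin(0.2 * t) + 0.3 * rng.standard_normal(t.size)
y = np.sin(0.2 * (t - 12)) + 0.3 * rng.standard_normal(t.size)   # y lags x by 12 steps
lags, mi = delayed_mi(x, y, max_lag=40)
peaks = [int(lags[i]) for i in range(1, len(mi) - 1) if mi[i] > mi[i - 1] and mi[i] > mi[i + 1]]
print("local maxima of delayed MI at lags:", peaks)
```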


Subject(s)
Brain
9.
Cogn Neurodyn ; 16(1): 117-133, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35116084

ABSTRACT

Human brain imaging has revealed that stimulus-induced activity does not generally simply add to the pre-stimulus activity, but rather builds in a non-additive way on this activity. Here we investigate this subject at the single neuron level and address the question of whether, and to what extent, a strong form of non-additivity where activity drops post-cue is present in different areas of monkey cortex, including prefrontal and agranular frontal areas, during a perceptual decision making task involving action and tactic selection. Specifically, we analyze spike train data recorded in vivo from the posterior dorsomedial prefrontal cortex (pmPFC), the supplementary motor area (SMA) and the presupplementary motor area (pre-SMA). For each neuron, we compute the ratio of the trial-averaged pre-stimulus spike count to the trial-averaged post-stimulus count. We also perform the ratio and averaging procedures in reverse order. We find that the statistics of these quantities behave differently across areas. The pmPFC, which is involved in tactic selection, shows stronger non-additivity compared to the two other areas, which more generically increase their firing rate post-stimulus. The pmPFC behaved more similarly to pre-SMA, a likely consequence of the reciprocal connections between these areas. The trial-averaged ratio statistic was reproduced by a surrogate inhomogeneous Poisson process in which the measured trial-averaged firing rate for a given neuron is used as its time-dependent rate. Principal component analysis (PCA) of the trial-averaged firing rates of neuronal ensembles further reveals area-specific time courses of response to the stimulus, including latency to peak neural response, for the typical population activity. Our work demonstrates subtle forms of area-specific non-additivity based on the fine variability structure of pre- and post-stimulus spiking activity at the single neuron level. It also reveals significant differences between areas for PCA and surrogate analysis, complementing previous observations of regional differences based solely on post-stimulus responses. Moreover, we observe regional differences in non-additivity which are related to the monkey's successful tactic selection and decision making. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s11571-021-09702-0.
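A sketch of the two ratio statistics (ratio of trial-averaged counts versus trial-averaged ratio) and of a Poisson-style surrogate built from the measured means, applied to synthetic spike counts; window lengths, rates and the post-cue rate drop are assumed purely for illustration.

```python
# Hedged sketch of the pre/post ratio statistics and a Poisson surrogate (synthetic counts).
import numpy as np

rng = np.random.default_rng(2)
n_trials = 200
rate_pre, rate_post = 8.0, 5.0         # Hz; a post-cue drop mimicking strong non-additivity (assumed)
win = 0.5                               # s per window (assumed)

# synthetic per-trial spike counts standing in for recorded data
pre  = rng.poisson(rate_pre * win, n_trials)
post = rng.poisson(rate_post * win, n_trials)

ratio_of_means = pre.mean() / post.mean()                 # average first, then take the ratio
mean_of_ratios = np.mean(pre / np.maximum(post, 1))       # ratio per trial, then average

# surrogate: redraw counts from a Poisson process with the measured mean rates
sur_pre, sur_post = rng.poisson(pre.mean(), n_trials), rng.poisson(post.mean(), n_trials)

print(f"ratio of trial-averaged counts  : {ratio_of_means:.2f}")
print(f"trial-averaged ratio of counts  : {mean_of_ratios:.2f}")
print(f"Poisson surrogate ratio of means: {sur_pre.mean() / sur_post.mean():.2f}")
```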

10.
Curr Biol ; 32(1): 51-63.e3, 2022 01 10.
Article in English | MEDLINE | ID: mdl-34741807

ABSTRACT

High-level neural activity often exhibits mixed selectivity to multivariate signals. How such representations arise and modulate natural behavior is poorly understood. We addressed this question in weakly electric fish, whose social behavior is relatively low dimensional and can be easily reproduced in the laboratory. We report that the preglomerular complex, a thalamic region exclusively connecting midbrain with pallium, implements a mixed selectivity strategy to encode interactions related to courtship and rivalry. We discuss how this code enables the pallial recurrent networks to control social behavior, including dominance in male-male competition and female mate selection. Notably, response latency analysis and computational modeling suggest that corollary discharge from premotor regions is implicated in flagging outgoing communications and thereby disambiguating self- versus non-self-generated signals. These findings provide new insights into the neural substrates of social behavior, multi-dimensional neural representation, and its role in perception and decision making.


Subject(s)
Electric Fish , Animals , Electric Fish/physiology , Electric Organ/physiology , Female , Male , Mesencephalon , Reaction Time , Thalamus
11.
Front Syst Neurosci ; 15: 720744, 2021.
Article in English | MEDLINE | ID: mdl-34867219

ABSTRACT

Neural circuits operate with delays over a range of time scales, from a few milliseconds in recurrent local circuitry to tens of milliseconds or more for communication between populations. Modeling usually incorporates single fixed delays, meant to represent the mean conduction delay between neurons making up the circuit. We explore conditions under which the inclusion of more delays in a high-dimensional chaotic neural network leads to a reduction in dynamical complexity, a phenomenon recently described as multi-delay complexity collapse (CC) in delay-differential equations with one to three variables. We consider a recurrent local network of 80% excitatory and 20% inhibitory rate model neurons with 10% connection probability. An increase in the width of the distribution of local delays, even to unrealistically large values, does not cause CC, nor does adding more local delays. Interestingly, multiple small local delays can cause CC provided there is a moderate global delayed inhibitory feedback and random initial conditions. CC then occurs through the settling of transient chaos onto a limit cycle. In this regime, there is a form of noise-induced order in which the mean activity variance decreases as the noise increases and disrupts the synchrony. Another novel form of CC is seen where global delayed feedback causes "dropouts," i.e., epochs of low firing rate network synchrony. Their alternation with epochs of higher firing rate asynchrony closely follows Poisson statistics. Such dropouts are promoted by larger global feedback strength and delay. Finally, periodic driving of the chaotic regime with global feedback can cause CC; the extinction of chaos can outlast the forcing, sometimes permanently. Our results suggest a wealth of phenomena that remain to be discovered in networks with clusters of delays.
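A minimal sketch of the kind of setup described above: sparse tanh rate units, 80% excitatory and 20% inhibitory, a heterogeneous distribution of local delays, and a global delayed inhibitory feedback term, integrated with the Euler method; all parameter values are assumptions and the sketch is not tuned to reproduce complexity collapse.

```python
# Hedged sketch of a delayed E/I rate network with local delay heterogeneity and global delayed feedback.
import numpy as np

rng = np.random.default_rng(3)
N, p, g = 100, 0.1, 2.5                  # neurons, connection probability, coupling gain (assumed)
dt, T = 0.1, 200.0                       # integration step and duration (a.u.)
steps = int(T / dt)

signs = np.ones(N); signs[int(0.8 * N):] = -1.0           # 80% excitatory, 20% inhibitory
W = (rng.random((N, N)) < p) * rng.standard_normal((N, N)) * g / np.sqrt(p * N)
W = np.abs(W) * signs[None, :]                            # columns carry the E/I sign

local_delays = rng.integers(1, 30, size=(N, N))           # heterogeneous local delays (in steps)
D_global, K_global = 150, 1.5                              # global feedback delay and strength (assumed)

x = np.zeros((steps, N))
x[0] = rng.standard_normal(N)                              # random initial condition
for t in range(1, steps):
    past = x[np.maximum(t - local_delays, 0), np.arange(N)[None, :]]   # delayed presynaptic rates
    recurrent = np.sum(W * np.tanh(past), axis=1)
    feedback = -K_global * np.tanh(x[max(t - D_global, 0)].mean())     # delayed global inhibition
    x[t] = x[t - 1] + dt * (-x[t - 1] + recurrent + feedback)

print("late-time population rate variance:", np.var(np.tanh(x[steps // 2:])))
```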

12.
Chaos ; 31(10): 103129, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34717310

ABSTRACT

We investigate transitions to simple dynamics in first-order nonlinear differential equations with multiple delays. With a proper choice of parameters, a single delay can destabilize a fixed point. In contrast, multiple delays can both destabilize fixed points and promote high-dimensional chaos but also induce stabilization onto simpler dynamics. We show that the dynamics of these systems depend on the precise distribution of the delays. Narrow spacing between individual delays induces chaotic behavior, while a lower density of delays enables stable periodic or fixed point behavior. As the dynamics become simpler, the number of unstable roots of the characteristic equation around the fixed point decreases. In fact, the behavior of these roots exhibits an astonishing parallel with that of the Lyapunov exponents and the Kolmogorov-Sinai entropy for these multi-delay systems. A theoretical analysis shows how these roots move back toward stability as the number of delays increases. Our results are based on numerical determination of the Lyapunov spectrum for these multi-delay systems as well as on permutation entropy computations. Finally, we report how complexity reduction upon adding more delays can occur through an inverse period-doubling sequence.
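A sketch of the permutation (ordinal-pattern) entropy used here as a complexity measure, applied to a periodic versus a white-noise series; the order and lag are typical choices, not the paper's settings.

```python
# Hedged sketch: normalized Bandt-Pompe permutation entropy of two toy series.
import numpy as np
from itertools import permutations
from math import log2, factorial

def permutation_entropy(x, order=4, lag=1):
    """Normalized permutation entropy in [0, 1] from ordinal-pattern frequencies."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - (order - 1) * lag):
        window = x[i : i + order * lag : lag]
        counts[tuple(int(k) for k in np.argsort(window))] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p)) / log2(factorial(order))

rng = np.random.default_rng(4)
t = np.arange(5000)
print("periodic signal PE:", round(permutation_entropy(np.sin(0.05 * t)), 3))
print("white noise PE    :", round(permutation_entropy(rng.standard_normal(t.size)), 3))
```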

13.
Biology (Basel) ; 10(8), 2021 Aug 10.
Article in English | MEDLINE | ID: mdl-34439996

ABSTRACT

Brain areas must be able to interact and share information in a time-varying, dynamic manner on a fast timescale. Such flexibility in information sharing has been linked to the synchronization of rhythm phases between areas. One definition of flexibility is the number of local maxima in the delayed mutual information curve between two connected areas. However, the precise relationship between phase synchronization and information sharing is not clear, nor is it clear how flexible the sharing remains in the face of fixed structural connectivity and noise. Here, we consider two coupled oscillatory excitatory-inhibitory networks connected through zero-delay excitatory connections, each of which mimics a rhythmic brain area. We numerically compute phase-locking and delayed mutual information between the phases of the excitatory local field potentials (LFPs) of the two networks, which measures the shared information and its direction. The flexibility in information sharing is shown to depend on the dynamical origin of oscillations, and its properties in different regimes are found to persist in the presence of asymmetry in the connectivity as well as system heterogeneity. For coupled noise-induced rhythms (quasi-cycles), phase synchronization is robust even in the presence of asymmetry and heterogeneity. However, they do not show flexibility, in contrast to noise-perturbed rhythms (noisy limit cycles), which are shown here to exhibit two local information maxima, i.e., flexibility. For quasi-cycles, phase difference and information measures for the envelope-phase dynamics obtained from previous analytical work using the Stochastic Averaging Method (SAM) are found to be in good qualitative agreement with those obtained from the original dynamics. The relation between phase synchronization and communication patterns is not trivial, particularly in the noisy limit cycle regime. There, complex patterns of information sharing can be observed for a single value of the phase difference. The mechanisms reported here can be extended to I-I networks since their phase synchronizations are similar. Our results set the stage for investigating information sharing between several connected noisy rhythms in neural and other complex biological networks.
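A sketch of the phase-locking computation: phases are extracted with the Hilbert transform and the phase-locking value (PLV) and mean phase difference are estimated; the two signals below are synthetic LFP stand-ins with assumed frequency, offset and noise levels, not the model networks.

```python
# Hedged sketch: phase-locking value between two synthetic LFP-like signals.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(5)
fs, dur, f0 = 1000.0, 10.0, 40.0                     # Hz, s, rhythm frequency (assumed)
t = np.arange(0, dur, 1 / fs)
lfp1 = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(t.size)
lfp2 = np.sin(2 * np.pi * f0 * t - 0.8) + 0.5 * rng.standard_normal(t.size)   # phase-lagged copy

phi1, phi2 = np.angle(hilbert(lfp1)), np.angle(hilbert(lfp2))
order = np.mean(np.exp(1j * (phi1 - phi2)))           # complex order parameter of the phase difference
print(f"PLV = {np.abs(order):.2f}, mean phase difference = {np.angle(order):.2f} rad")
```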

14.
Neuroimage ; 238: 118160, 2021 09.
Article in English | MEDLINE | ID: mdl-34058331

ABSTRACT

Neural responses to the same stimulus show significant variability over trials, with this variability typically reduced (quenched) after a stimulus is presented. This trial-to-trial variability (TTV) has been much studied; however, how this neural variability quenching is influenced by the ongoing dynamics of the prestimulus period is unknown. Utilizing a human intracranial stereo-electroencephalography (sEEG) data set, we investigate how prestimulus dynamics, as operationalized by standard deviation (SD), shapes poststimulus activity through TTV. We first observed greater poststimulus variability quenching in those real trials exhibiting high prestimulus variability, an effect observed in all frequency bands. Next, we found that the relative effect of the stimulus was higher in the later (300-600 ms) than the earlier (0-300 ms) poststimulus period. Lastly, we replicate our findings in a separate EEG dataset and extend them by finding that trials with high prestimulus variability in the theta and alpha bands had faster reaction times. Together, our results demonstrate that stimulus-related activity, including its variability, is a blend of two factors: 1) the effects of the external stimulus itself, and 2) the effects of the ongoing dynamics spilling over from the prestimulus period (the state at stimulus onset), with the second dwarfing the influence of the first.
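A sketch of the across-trial SD (TTV) computation before and after stimulus onset on synthetic trials; the quenching factor, evoked response and time windows are assumptions chosen only to illustrate the measure.

```python
# Hedged sketch: trial-to-trial variability (across-trial SD) before and after a stimulus.
import numpy as np

rng = np.random.default_rng(6)
n_trials, fs = 100, 1000                          # trials, sampling rate (Hz)
t = np.arange(-0.3, 0.6, 1 / fs)                  # -300 ms to +600 ms around stimulus onset
onset = np.searchsorted(t, 0.0)

# ongoing activity whose across-trial spread shrinks after the stimulus (quenching)
trials = rng.standard_normal((n_trials, t.size))
trials[:, onset:] *= 0.6                          # post-stimulus variability reduction (assumed)
trials[:, onset:] += 1.0                          # plus an evoked (additive) response (assumed)

ttv = trials.std(axis=0)                          # across-trial SD at each time point
sd_pre   = ttv[:onset].mean()
sd_early = ttv[onset:onset + int(0.3 * fs)].mean()      # 0-300 ms
sd_late  = ttv[onset + int(0.3 * fs):].mean()           # 300-600 ms
print(f"pre SD = {sd_pre:.2f}, early post SD = {sd_early:.2f}, late post SD = {sd_late:.2f}")
```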


Subject(s)
Brain/physiopathology , Drug Resistant Epilepsy/physiopathology , Evoked Potentials, Auditory/physiology , Acoustic Stimulation , Adult , Brain Mapping , Electroencephalography , Female , Humans , Male , Reaction Time/physiology , Young Adult
15.
Philos Trans A Math Phys Eng Sci ; 379(2198): 20200267, 2021 May 31.
Article in English | MEDLINE | ID: mdl-33840211

ABSTRACT

Recent findings have revealed that not only neurons but also astrocytes, a special type of glial cell, are major players in neuronal information processing. It is now widely accepted that they contribute to the regulation of their microenvironment by cross-talking with neurons via gliotransmitters. In this context, we here study the phenomenon of vibrational resonance in neurons by considering their interaction with astrocytes. Our analysis of a neuron-astrocyte pair reveals that the intracellular dynamics of astrocytes can induce a double vibrational resonance effect in the weak signal detection performance of a neuron, exhibiting two distinct wells centred at different high-frequency driving amplitudes. We also identify the underlying mechanism of this behaviour, showing that the interaction of the widely separated time scales of neurons, astrocytes and driving signals is the key factor for the emergence and control of double vibrational resonance. This article is part of the theme issue 'Vibrational and stochastic resonance in driven nonlinear systems (part 2)'.
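Vibrational resonance itself can be illustrated in a generic overdamped bistable element driven by a weak low-frequency signal plus a strong high-frequency one: the response at the low frequency peaks at an intermediate high-frequency amplitude. This is a textbook-style sketch under assumed parameters, not the neuron-astrocyte model studied above.

```python
# Hedged sketch of vibrational resonance in a generic bistable system x' = x - x^3 + drive.
import numpy as np

def response_Q(B, A=0.1, w=0.05, Om=5.0, dt=0.01, T=2000.0):
    """Amplitude of the output's Fourier component at the weak signal frequency w."""
    steps = int(T / dt)
    t = np.arange(steps) * dt
    x = np.empty(steps)
    x[0] = 1.0                                    # start in one well of the bistable potential
    for n in range(steps - 1):
        drive = A * np.cos(w * t[n]) + B * np.cos(Om * t[n])
        x[n + 1] = x[n] + dt * (x[n] - x[n] ** 3 + drive)
    return np.abs(np.sum(x * np.exp(-1j * w * t)) * dt * 2.0 / T)

for B in (0.0, 2.0, 4.0, 6.0, 8.0):               # high-frequency drive amplitudes (assumed range)
    print(f"B = {B:.1f} -> Q = {response_Q(B):.3f}")
```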

16.
Chaos ; 31(1): 013117, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33754759

ABSTRACT

Many healthy and pathological brain rhythms, including beta and gamma rhythms and essential tremor, are suspected to be induced by noise. This yields randomly occurring, brief epochs of higher amplitude oscillatory activity known as "bursts," the statistics of which are important for proper neural function. Here, we consider a more realistic model with both multiplicative and additive noise instead of only additive noise, to understand how state-dependent fluctuations further affect rhythm induction. For illustrative purposes, we calibrate the model at the lower end of the beta band that relates to movement; parameter tuning can extend the relevance of our analysis to the higher frequency gamma band or to lower frequency essential tremors. A stochastic Wilson-Cowan model for reciprocally as well as self-coupled excitatory (E) and inhibitory (I) populations is analyzed in the parameter regime where the noise-free dynamics spiral in to a fixed point. Noisy oscillations known as quasi-cycles are then generated by stochastic synaptic inputs. The corresponding dynamics of E and I local field potentials can be studied using linear stochastic differential equations subject to both additive and multiplicative noises. As the prevalence of bursts is proportional to the slow envelope of the E and I firing activities, we perform an envelope-phase decomposition using the stochastic averaging method. The resulting envelope dynamics are uni-directionally coupled to the phase dynamics as in the case of additive noise alone but both dynamics involve new noise-dependent terms. We derive the stationary probability and compute power spectral densities of envelope fluctuations. We find that multiplicative noise can enhance network synchronization by reducing the magnitude of the negative real part of the complex conjugate eigenvalues. Higher noise can lead to a "virtual limit cycle," where the deterministically stable eigenvalues around the fixed point acquire a positive real part, making the system act more like a noisy limit cycle rather than a quasi-cycle. Multiplicative noise can thus exacerbate synchronization and possibly contribute to the onset of symptoms in certain motor diseases.
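A sketch of the quasi-cycle picture: linearized E/I deviations around a stable focus (damped rotation at a lower-beta frequency) driven by both additive and state-dependent (multiplicative) noise, integrated with Euler-Maruyama; the damping, frequency and noise strengths are illustrative assumptions, not the calibrated model.

```python
# Hedged Euler-Maruyama sketch of a quasi-cycle with additive plus multiplicative noise.
import numpy as np

rng = np.random.default_rng(7)
dt, T = 1e-4, 20.0                        # s
steps = int(T / dt)
lam, omega = 5.0, 2 * np.pi * 18.0        # damping (1/s) and rotation (~lower beta, rad/s) (assumed)
sig_add, sig_mult = 0.05, 0.3             # additive / multiplicative noise strengths (assumed)

e = np.zeros(steps); i = np.zeros(steps)  # E and I deviations from the fixed point
for n in range(steps - 1):
    amp = sig_add + sig_mult * abs(e[n])                  # state-dependent noise amplitude
    e[n + 1] = e[n] + dt * (-lam * e[n] - omega * i[n]) + np.sqrt(dt) * amp * rng.standard_normal()
    i[n + 1] = i[n] + dt * ( omega * e[n] - lam * i[n]) + np.sqrt(dt) * amp * rng.standard_normal()

spec = np.abs(np.fft.rfft(e)) ** 2
freqs = np.fft.rfftfreq(steps, d=dt)
print(f"quasi-cycle spectral peak near {freqs[np.argmax(spec[1:]) + 1]:.1f} Hz")
```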


Subject(s)
Brain , Gamma Rhythm , Humans , Noise
17.
Neural Comput ; 33(2): 341-375, 2021 02.
Article in English | MEDLINE | ID: mdl-33253034

ABSTRACT

Spike trains with negative interspike interval (ISI) correlations, in which long/short ISIs are more likely followed by short/long ISIs, are common in many neurons. They can be described by stochastic models with a spike-triggered adaptation variable. We analyze a phenomenon in these models where such statistically dependent ISI sequences arise in tandem with quasi-statistically independent and identically distributed (quasi-IID) adaptation variable sequences. The sequences of adaptation states and resulting ISIs are linked by a nonlinear decorrelating transformation. We establish general conditions on a family of stochastic spiking models that guarantee this quasi-IID property and establish bounds on the resulting baseline ISI correlations. Inputs that elicit weak firing rate changes in samples with many spikes are known to be more detectable when negative ISI correlations are present because they reduce spike count variance; this defines a variance-reduced firing rate coding benchmark. We performed a Fisher information analysis on these adapting models exhibiting ISI correlations to show that a spike pattern code based on the quasi-IID property achieves the upper bound of detection performance, surpassing rate codes with the same mean rate, including the variance-reduced rate code benchmark, by 20% to 30%. The information loss in rate codes arises because the benefits of reduced spike count variance cannot compensate for the lower firing rate gain due to adaptation. Since adaptation states have dynamics similar to synaptic responses, the quasi-IID decorrelation transformation of the spike train is plausibly implemented by downstream neurons through matched postsynaptic kinetics. This provides an explanation for observed coding performance in sensory systems that cannot be accounted for by rate coding, for example, at the detection threshold where rate changes can be insignificant.
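A toy spike-triggered-adaptation generator shows how negative adjacent-ISI correlations arise; the update rule and parameters are assumptions for illustration, and the serial correlation coefficient is simply the lagged Pearson correlation of the ISI sequence.

```python
# Hedged sketch: negative ISI correlations from spike-triggered adaptation (toy generator).
import numpy as np

rng = np.random.default_rng(8)
n_spikes, mu, sigma = 5000, 1.5, 0.3     # drive and ISI noise (assumed)
tau_a, jump = 2.0, 0.3                   # adaptation time constant and per-spike jump (assumed)

isis, a = [], 0.0
for _ in range(n_spikes):
    drive = max(mu - a, 0.05)                                # effective drive reduced by adaptation
    isi = max(1.0 / drive + sigma * rng.standard_normal(), 0.01)
    isis.append(isi)
    a = a * np.exp(-isi / tau_a) + jump                      # adaptation decays over the ISI, then jumps
isis = np.array(isis)

def serial_corr(x, lag=1):
    """Serial correlation coefficient rho_lag of the ISI sequence."""
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

print("rho_1 =", round(serial_corr(isis, 1), 3), " rho_2 =", round(serial_corr(isis, 2), 3))
```

A long ISI lets the adaptation variable decay further, raising the next effective drive and shortening the next ISI, which is the origin of the negative adjacent-interval correlation.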

18.
Neural Comput ; 32(8): 1448-1498, 2020 08.
Article in English | MEDLINE | ID: mdl-32521212

ABSTRACT

Understanding how rich dynamics emerge in neural populations requires models exhibiting a wide range of behaviors while remaining interpretable in terms of connectivity and single-neuron dynamics. However, it has been challenging to fit such mechanistic spiking networks at the single-neuron scale to empirical population data. To close this gap, we propose to fit such data at a mesoscale, using a mechanistic but low-dimensional and, hence, statistically tractable model. The mesoscopic representation is obtained by approximating a population of neurons as multiple homogeneous pools of neurons and modeling the dynamics of the aggregate population activity within each pool. We derive the likelihood of both single-neuron and connectivity parameters given this activity, which can then be used to optimize parameters by gradient ascent on the log likelihood or perform Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. We illustrate this approach using a model of generalized integrate-and-fire neurons for which mesoscopic dynamics have been previously derived and show that both single-neuron and connectivity parameters can be recovered from simulated data. In particular, our inference method extracts posterior correlations between model parameters, which define parameter subsets able to reproduce the data. We compute the Bayesian posterior for combinations of parameters using MCMC sampling and investigate how the approximations inherent in a mesoscopic population model affect the accuracy of the inferred single-neuron parameters.
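A toy sketch of the Bayesian side of this pipeline: Metropolis-Hastings sampling of a single rate parameter given Poisson-distributed aggregate activity, standing in for the mesoscopic likelihood over single-neuron and connectivity parameters derived in the paper; everything here is a simplified stand-in.

```python
# Hedged sketch: Metropolis-Hastings recovery of one parameter from Poisson "population activity".
import numpy as np

rng = np.random.default_rng(10)
true_rate = 7.0
counts = rng.poisson(true_rate, size=200)            # simulated aggregate activity

def log_likelihood(rate):
    if rate <= 0:
        return -np.inf
    return np.sum(counts * np.log(rate) - rate)      # Poisson log likelihood (up to a constant)

samples, rate = [], 5.0
for _ in range(5000):
    proposal = rate + 0.3 * rng.standard_normal()     # Gaussian random-walk proposal
    if np.log(rng.random()) < log_likelihood(proposal) - log_likelihood(rate):
        rate = proposal                               # accept
    samples.append(rate)

post = np.array(samples[1000:])                       # discard burn-in
print(f"posterior mean {post.mean():.2f} (true rate {true_rate}), posterior sd {post.std():.2f}")
```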

19.
J Math Neurosci ; 10(1): 6, 2020 Apr 20.
Article in English | MEDLINE | ID: mdl-32314104

ABSTRACT

Following publication of the original article (Naud and Longtin in J Math Neurosci 9:3, 2019), the authors noticed a mistake in the first paragraph within "Altered propagation".

20.
Sci Rep ; 9(1): 18335, 2019 12 04.
Article in English | MEDLINE | ID: mdl-31797877

ABSTRACT

Brain rhythms recorded in vivo, such as gamma oscillations, are notoriously variable both in amplitude and frequency. They are characterized by transient epochs of higher amplitude known as bursts. It has been suggested that, despite their short life and random occurrence, bursts in gamma and other rhythms can efficiently contribute to working memory or communication tasks. Abnormalities in bursts have also been associated with, e.g., motor and psychiatric disorders. It is thus crucial to understand how single cell and connectivity parameters influence burst statistics and the corresponding brain states. To address this problem, we consider a generic stochastic recurrent network of Pyramidal Interneuron Network Gamma (PING) type. Using the stochastic averaging method, we derive dynamics for the phase and envelope of the amplitude process, and find that they depend on only two meta-parameters that combine all the model parameters. This allows us to identify an optimal parameter regime of healthy variability with statistics similar to those seen in vivo; in this regime, oscillations and bursts are supported by synaptic noise. The probability density for the rhythm's envelope as well as the mean burst duration are then derived using first passage time analysis. Our analysis enables us to link burst attributes, such as duration and frequency content, to system parameters. Our general approach can be extended to different frequency bands, network topologies and extra populations. It provides much-needed insight into the biophysical determinants of rhythm burst statistics, and into what needs to be changed to correct rhythms with pathological statistics.
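A sketch of burst extraction from a rhythm's envelope: the Hilbert envelope is thresholded and contiguous supra-threshold epochs give burst durations; the synthetic gamma-band signal and the threshold choice are assumptions for illustration, not the paper's calibrated PING model or its first-passage-time results.

```python
# Hedged sketch: burst detection by thresholding the Hilbert envelope of a noisy rhythm.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(9)
fs, dur, f0 = 1000.0, 30.0, 40.0                       # Hz, s, gamma-band frequency (assumed)
t = np.arange(0, dur, 1 / fs)
slow = np.convolve(rng.standard_normal(t.size), np.ones(500) / np.sqrt(500), mode="same")
signal = np.abs(slow) * np.sin(2 * np.pi * f0 * t) + 0.1 * rng.standard_normal(t.size)

envelope = np.abs(hilbert(signal))                     # slow amplitude of the rhythm
threshold = np.median(envelope) + np.std(envelope)     # burst criterion (assumed)
above = envelope > threshold

# burst durations = lengths of contiguous supra-threshold epochs
edges = np.diff(above.astype(int))
starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
if above[0]:
    starts = np.r_[0, starts]
if above[-1]:
    ends = np.r_[ends, above.size - 1]
durations = (ends - starts) / fs * 1000.0              # ms
print(f"{durations.size} bursts, mean duration {durations.mean():.0f} ms")
```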


Subject(s)
Brain/physiology , Electroencephalography , Models, Theoretical , Respiratory Burst/physiology , Action Potentials/physiology , Animals , Gamma Rhythm , Humans , Interneurons/pathology , Interneurons/physiology , Memory, Short-Term/physiology , Models, Neurological , Pyramidal Cells/pathology , Pyramidal Cells/physiology