1 - 20 of 55
1.
Sci Adv ; 10(18): eadk7257, 2024 May 03.
Article En | MEDLINE | ID: mdl-38701208

Neuromodulators have been shown to alter the temporal profile of short-term synaptic plasticity (STP); however, the computational function of this neuromodulation remains unexplored. Here, we propose that the neuromodulation of STP provides a general mechanism to scale neural dynamics and motor outputs in time and space. We trained recurrent neural networks that incorporated STP to produce complex motor trajectories-handwritten digits-with different temporal (speed) and spatial (size) scales. Neuromodulation of STP produced temporal and spatial scaling of the learned dynamics and enhanced temporal or spatial generalization compared to standard training of the synaptic weights in the absence of STP. The model also accounted for the results of two experimental studies involving flexible sensorimotor timing. Neuromodulation of STP provides a unified and biologically plausible mechanism to control the temporal and spatial scales of neural dynamics and sensorimotor behaviors.
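A minimal sketch of the core idea, assuming a Tsodyks-Markram form of STP (an illustration, not the authors' implementation): a hypothetical neuromodulatory factor that rescales the STP time constants stretches the synapse's dynamics in time, the kind of scaling the trained networks exploit. All parameter values below are assumptions.

```python
import numpy as np

def tm_synapse(spike_times, tau_d=0.2, tau_f=0.6, U=0.2, mod=1.0, dt=0.001, T=3.0):
    """Tsodyks-Markram style short-term plasticity. `mod` is a hypothetical
    neuromodulatory factor that rescales both STP time constants, stretching
    the synapse's dynamics in time (all values here are illustrative)."""
    tau_d, tau_f = tau_d * mod, tau_f * mod
    n_steps = int(T / dt)
    spikes = np.zeros(n_steps, dtype=bool)
    spikes[np.round(np.asarray(spike_times) / dt).astype(int)] = True
    x, u = 1.0, U                                # available resources, utilization
    efficacy, resources = np.zeros(n_steps), np.zeros(n_steps)
    for t in range(n_steps):
        x += dt * (1.0 - x) / tau_d              # recovery of synaptic resources
        u += dt * (U - u) / tau_f                # decay of facilitation
        if spikes[t]:
            u += U * (1.0 - u)                   # facilitation jump on a spike
            efficacy[t] = u * x                  # effective synaptic strength
            x -= u * x                           # depletion of resources
        resources[t] = x
    return efficacy, resources

train = np.arange(0.1, 1.1, 0.05)                # a 20-Hz input train
last = int(round(train[-1] / 0.001))             # index of the last spike
for mod in (1.0, 2.0):                           # baseline vs. "neuromodulated" STP
    eff, res = tm_synapse(train, mod=mod)
    recovery = np.argmax(res[last:] > 0.9) * 0.001
    print(f"mod={mod}: first pulse efficacies {np.round(eff[eff > 0][:3], 3)}, "
          f"resources back to 90% about {recovery:.2f} s after the train")
```

Doubling the factor roughly doubles how long the synaptic variables take to relax, illustrating temporal scaling at the level of a single synapse.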


Neuronal Plasticity; Neuronal Plasticity/physiology; Humans; Models, Neurological; Neurotransmitter Agents/metabolism; Animals; Learning/physiology; Neural Networks, Computer
2.
J Neurosci ; 43(45): 7565-7574, 2023 11 08.
Article En | MEDLINE | ID: mdl-37940593

The ability to store information about the past to dynamically predict and prepare for the future is among the most fundamental tasks the brain performs. To date, the problems of understanding how the brain stores and organizes information about the past (memory) and how the brain represents and processes temporal information for adaptive behavior have generally been studied as distinct cognitive functions. This Symposium explores the inherent link between memory and temporal cognition, as well as the potential shared neural mechanisms between them. We suggest that working memory and implicit timing are interconnected and may share overlapping neural mechanisms. Additionally, we explore how temporal structure is encoded in associative and episodic memory and, conversely, the influences of episodic memory on subsequent temporal anticipation and the perception of time. We suggest that neural sequences provide a general computational motif that contributes to timing and working memory, as well as the spatiotemporal coding and recall of episodes.


Brain; Memory, Episodic; Mental Recall; Cognition; Memory, Short-Term
3.
Neuron ; 111(18): 2863-2880.e6, 2023 09 20.
Article En | MEDLINE | ID: mdl-37451263

Changes in the function of inhibitory interneurons (INs) during cortical development could contribute to the pathophysiology of neurodevelopmental disorders. Using all-optical in vivo approaches, we find that parvalbumin interneurons (PV-INs) and their immature precursors are hypoactive and transiently decoupled from excitatory neurons in the postnatal somatosensory cortex (S1) of Fmr1 KO mice, a model of fragile X syndrome (FXS). This leads to a loss of PV-INs in both mice and humans with FXS. Increasing the activity of future PV-INs in neonatal Fmr1 KO mice restores PV-IN density and ameliorates transcriptional dysregulation in S1, but not circuit dysfunction. Critically, administering an allosteric modulator of Kv3.1 channels after the S1 critical period does rescue circuit dynamics and tactile defensiveness. Symptoms in FXS and related disorders could thus be mitigated by targeting PV-INs.


Fragile X Syndrome; Parvalbumins; Humans; Mice; Animals; Parvalbumins/genetics; Parvalbumins/metabolism; Fragile X Mental Retardation Protein/genetics; Interneurons/physiology; Neurons/metabolism; Touch; Fragile X Syndrome/genetics; Mice, Knockout; Disease Models, Animal
4.
Nat Hum Behav ; 7(7): 1170-1184, 2023 07.
Article En | MEDLINE | ID: mdl-37081099

Working memory (WM) and timing are generally considered distinct cognitive functions, but similar neural signatures have been implicated in both. To explore the hypothesis that WM and timing may rely on shared neural mechanisms, we used psychophysical tasks that contained either task-irrelevant timing or WM components. In both cases, the task-irrelevant component influenced performance. We then developed recurrent neural network (RNN) simulations that revealed that cue-specific neural sequences, which multiplexed WM and time, emerged as the dominant regime that captured the behavioural findings. During training, RNN dynamics transitioned from low-dimensional ramps to high-dimensional neural sequences, and depending on task requirements, steady-state or ramping activity was also observed. Analysis of RNN structure revealed that neural sequences relied primarily on inhibitory connections, and could survive the deletion of all excitatory-to-excitatory connections. Our results indicate that in some instances WM is encoded in time-varying neural activity because of the importance of predicting when WM will be used.
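The transition from low-dimensional ramps to high-dimensional neural sequences can be made concrete with a dimensionality measure. The sketch below uses the participation ratio of the population covariance on synthetic data; this is a standard measure chosen for illustration and is not necessarily the analysis used in the paper.

```python
import numpy as np

def participation_ratio(X):
    """Effective dimensionality of a neurons x time activity matrix:
    (sum of covariance eigenvalues)^2 / sum of squared eigenvalues."""
    eig = np.clip(np.linalg.eigvalsh(np.cov(X)), 0, None)
    return eig.sum() ** 2 / (eig ** 2).sum()

rng = np.random.default_rng(0)
n_neurons, n_time = 50, 200
t = np.linspace(0, 1, n_time)

# Ramping population: every neuron is a scaled copy of the same ramp.
ramps = np.outer(rng.random(n_neurons), t)

# Sequential population: Gaussian bumps tiling the interval.
centers = np.linspace(0, 1, n_neurons)
sequence = np.exp(-((t[None, :] - centers[:, None]) ** 2) / (2 * 0.02 ** 2))

print("ramp dimensionality:    ", round(participation_ratio(ramps), 2))      # ~1
print("sequence dimensionality:", round(participation_ratio(sequence), 2))   # much larger
```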


Cognition; Memory, Short-Term; Humans; Neural Networks, Computer
5.
J Neurosci ; 43(1): 82-92, 2023 01 04.
Article En | MEDLINE | ID: mdl-36400529

Cortical computations emerge from the dynamics of neurons embedded in complex cortical circuits. Within these circuits, neuronal ensembles, which represent subnetworks with shared functional connectivity, emerge in an experience-dependent manner. Here we induced ensembles in ex vivo cortical circuits from mice of either sex by differentially activating subpopulations through chronic optogenetic stimulation. We observed a decrease in voltage correlation and, importantly, a synaptic decoupling between the stimulated and nonstimulated populations. We also observed a decrease in firing rate during Up-states in the stimulated population. These ensemble-specific changes were accompanied by decreases in intrinsic excitability in the stimulated population, and a decrease in connectivity between stimulated and nonstimulated pyramidal neurons. By incorporating the empirically observed changes in intrinsic excitability and connectivity into a spiking neural network model, we were able to demonstrate that changes in both intrinsic excitability and connectivity accounted for the decreased firing rate, but only changes in connectivity accounted for the observed decorrelation. Our findings help ascertain the mechanisms underlying the ability of chronic patterned stimulation to create ensembles within cortical circuits and, importantly, show that while Up-states are a global network-wide phenomenon, functionally distinct ensembles can preserve their identity during Up-states through differential firing rates and correlations. SIGNIFICANCE STATEMENT: The connectivity and activity patterns of local cortical circuits are shaped by experience. This experience-dependent reorganization of cortical circuits is driven by complex interactions between different local learning rules, external input, and reciprocal feedback between many distinct brain areas. Here we used an ex vivo approach to demonstrate how simple forms of chronic external stimulation can shape local cortical circuits in terms of their correlated activity and functional connectivity. The absence of feedback between different brain areas and full control of external input allowed for a tractable system in which to study the underlying mechanisms and to develop a computational model. Results show that differential stimulation of subpopulations of neurons significantly reshapes cortical circuits and forms subnetworks referred to as neuronal ensembles.
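To see why connectivity changes, rather than reduced intrinsic excitability, can account for the decorrelation, consider a toy model of two reciprocally coupled rate units driven by shared and private noise (illustrative parameters only, not the spiking model used in the study): lowering the coupling reduces the cross-population correlation, whereas lowering the gain mainly reduces the firing rate.

```python
import numpy as np

def simulate(coupling, gain, T=200.0, dt=0.01, tau=0.05, seed=0):
    """Two reciprocally coupled rate units driven by shared + private noise.
    A toy stand-in for the stimulated/nonstimulated populations; parameters
    are illustrative and not fit to the data."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    r1, r2 = np.zeros(n), np.zeros(n)
    for t in range(1, n):
        shared = rng.normal(0.0, 1.0)            # common input fluctuation
        in1 = coupling * r2[t - 1] + 2.0 + shared + rng.normal(0.0, 1.0)
        in2 = coupling * r1[t - 1] + 2.0 + shared + rng.normal(0.0, 1.0)
        r1[t] = r1[t - 1] + dt / tau * (-r1[t - 1] + gain * max(in1, 0.0))
        r2[t] = r2[t - 1] + dt / tau * (-r2[t - 1] + gain * max(in2, 0.0))
    return r1, r2

for label, coupling, gain in [("baseline            ", 0.5, 1.0),
                              ("reduced coupling    ", 0.1, 1.0),
                              ("reduced excitability", 0.5, 0.7)]:
    r1, r2 = simulate(coupling, gain)
    print(f"{label}: mean rate = {np.mean(r1 + r2) / 2:.2f}, "
          f"correlation = {np.corrcoef(r1, r2)[0, 1]:.2f}")
```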


Neuronal Plasticity; Optogenetics; Mice; Animals; Neuronal Plasticity/physiology; Neurons/physiology; Pyramidal Cells/physiology; Homeostasis/physiology
6.
Proc Natl Acad Sci U S A ; 119(43): e2200621119, 2022 10 25.
Article En | MEDLINE | ID: mdl-36251988

Self-sustained neural activity maintained through local recurrent connections is of fundamental importance to cortical function. Converging theoretical and experimental evidence indicates that cortical circuits generating self-sustained dynamics operate in an inhibition-stabilized regime. Theoretical work has established that four sets of weights (W_E←E, W_E←I, W_I←E, and W_I←I) must obey specific relationships to produce inhibition-stabilized dynamics, but it is not known how the brain can appropriately set the values of all four weight classes in an unsupervised manner to be in the inhibition-stabilized regime. We prove that standard homeostatic plasticity rules are generally unable to generate inhibition-stabilized dynamics and that their instability is caused by a signature property of inhibition-stabilized networks: the paradoxical effect. In contrast, we show that a family of "cross-homeostatic" rules overcomes the paradoxical effect and robustly leads to the emergence of stable dynamics. This work provides a model of how-beginning from a silent network-self-sustained inhibition-stabilized dynamics can emerge from learning rules governing all four synaptic weight classes in an orchestrated manner.
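The paradoxical effect referred to above can be reproduced in a minimal threshold-linear E-I rate model: when recurrent excitation is strong enough that the excitatory subnetwork alone is unstable (the inhibition-stabilized regime), adding drive to the inhibitory population lowers its steady-state rate. The weights and inputs below are hand-picked assumptions, not values from the paper.

```python
import numpy as np

def steady_state(g_E, g_I, W_EE=2.0, W_EI=1.0, W_IE=2.0, W_II=0.5,
                 tau_E=0.02, tau_I=0.01, dt=5e-4, T=1.0):
    """Threshold-linear E-I rate model integrated to steady state. With
    W_EE > 1 the excitatory subnetwork alone is unstable, so the network is
    inhibition-stabilized (weights and inputs are illustrative assumptions)."""
    E = I = 0.0
    relu = lambda v: max(v, 0.0)
    for _ in range(int(T / dt)):
        dE = (-E + relu(W_EE * E - W_EI * I + g_E)) / tau_E
        dI = (-I + relu(W_IE * E - W_II * I + g_I)) / tau_I
        E, I = E + dt * dE, I + dt * dI
    return E, I

E0, I0 = steady_state(g_E=1.0, g_I=1.0)
E1, I1 = steady_state(g_E=1.0, g_I=1.2)     # extra drive to the inhibitory population
print(f"baseline drive: E = {E0:.2f}, I = {I0:.2f}")
print(f"more I drive:   E = {E1:.2f}, I = {I1:.2f}   <- I decreases: paradoxical effect")
```

For the same reason, a naive homeostatic rule that raises inhibitory input when inhibitory rates are too high pushes the network in the wrong direction in this regime, which is the instability the paper analyzes.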


Nerve Net; Neuronal Plasticity; Brain; Homeostasis; Learning; Models, Neurological
7.
Behav Neurosci ; 136(5): 374-382, 2022 Oct.
Article En | MEDLINE | ID: mdl-35446093

The ability to predict and prepare for near- and far-future events is among the most fundamental computations the brain performs. Because of the importance of time for prediction and sensorimotor processing, the brain has evolved multiple mechanisms to tell and encode time across scales ranging from microseconds to days and beyond. Converging experimental and computational data indicate that, on the scale of seconds, timing relies on diverse neural mechanisms distributed across different brain areas. Among the different encoding mechanisms on the scale of seconds, we distinguish between neural population clocks and ramping activity as distinct strategies to encode time. One instance of neural population clocks, neural sequences, represents in some ways an optimal and flexible dynamic regime for the encoding of time. Specifically, neural sequences comprise a high-dimensional representation that can be used by downstream areas to flexibly generate arbitrarily simple and complex output patterns using biologically plausible learning rules. We propose that high-level integration areas may use high-dimensional dynamics such as neural sequences to encode time, providing downstream areas information to build low-dimensional ramp-like activity that can drive movements and temporal expectation.


Brain; Time Perception; Learning; Models, Neurological
8.
PLoS Comput Biol ; 18(3): e1009271, 2022 03.
Article En | MEDLINE | ID: mdl-35239644

Converging evidence suggests the brain encodes time in dynamic patterns of neural activity, including neural sequences, ramping activity, and complex dynamics. Most temporal tasks, however, require more than just encoding time, and can have distinct computational requirements including the need to exhibit temporal scaling, generalize to novel contexts, or be robust to noise. It is not known how neural circuits can encode time and satisfy distinct computational requirements, nor is it known whether similar patterns of neural activity at the population level can exhibit dramatically different computational or generalization properties. To begin to answer these questions, we trained recurrent neural networks (RNNs) on two timing tasks based on behavioral studies. The tasks had different input structures but required producing identically timed output patterns. Using a novel framework, we quantified whether RNNs encoded two intervals using one of three timing strategies: scaling, absolute, or stimulus-specific dynamics. We found that similar neural dynamic patterns at the level of single intervals could exhibit fundamentally different properties, including generalization, the connectivity structure of the trained networks, and the contribution of excitatory and inhibitory neurons. Critically, depending on the task structure, RNNs were better suited for either generalization or robustness to noise. Further analysis revealed different connection patterns underlying the different regimes. Our results predict that apparently similar neural dynamic patterns at the population level (e.g., neural sequences) can exhibit fundamentally different computational properties with regard to their ability to generalize to novel stimuli and their robustness to noise-and that these differences are associated with differences in network connectivity and distinct contributions of excitatory and inhibitory neurons. We also predict that the task structure used in different experimental studies accounts for some of the experimentally observed variability in how networks encode time.
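One simple way to operationalize the distinction between scaling, absolute, and stimulus-specific dynamics (a heuristic stand-in, not the framework developed in the paper) is to compare the trajectories produced for a short and a long interval under two alignments: stretched to a common duration (scaling) or truncated to a common absolute duration (absolute).

```python
import numpy as np

def strategy_indices(traj_short, traj_long):
    """Compare a short-interval trajectory (neurons x T1) with a longer one
    (neurons x T2) under two alignments and return correlation-based indices
    for 'scaling' and 'absolute' timing. This is an illustrative heuristic,
    not the framework developed in the paper."""
    n, T1 = traj_short.shape
    T2 = traj_long.shape[1]
    # Scaling alignment: stretch the short trajectory to the long duration.
    stretched = np.stack([np.interp(np.linspace(0, T1 - 1, T2), np.arange(T1), traj_short[i])
                          for i in range(n)])
    scaling = np.corrcoef(stretched.ravel(), traj_long.ravel())[0, 1]
    # Absolute alignment: compare with the first T1 bins of the long trajectory.
    absolute = np.corrcoef(traj_short.ravel(), traj_long[:, :T1].ravel())[0, 1]
    return scaling, absolute

def bump_sequence(n_neurons, n_bins):
    """Gaussian-bump sequence whose dynamics stretch with interval duration."""
    t = np.linspace(0, 1, n_bins)
    c = np.linspace(0, 1, n_neurons)
    return np.exp(-((t[None] - c[:, None]) ** 2) / (2 * 0.05 ** 2))

x_short, x_long = bump_sequence(40, 100), bump_sequence(40, 200)   # perfectly scaled dynamics
s, a = strategy_indices(x_short, x_long)
print(f"scaling index = {s:.2f}, absolute index = {a:.2f}")         # scaling >> absolute
```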


Models, Neurological; Neurons; Brain/physiology; Neurons/physiology
9.
J Neurosci ; 41(34): 7182-7196, 2021 08 25.
Article En | MEDLINE | ID: mdl-34253625

Up states are the best studied example of an emergent neural dynamic regime. Computational models based on a single class of inhibitory neurons indicate that Up states reflect bistable dynamic systems in which positive feedback is stabilized by strong inhibition and predict a paradoxical effect in which increased drive to inhibitory neurons results in decreased inhibitory activity. To date, however, computational models have not incorporated empirically defined properties of parvalbumin (PV) and somatostatin (SST) neurons. Here we first experimentally characterized the frequency-current (F-I) curves of pyramidal (Pyr), PV, and SST neurons from mice of either sex, and confirmed a sharp difference between the thresholds and slopes of PV and SST neurons. The empirically defined F-I curves were incorporated into a three-population computational model that simulated the empirically derived firing rates of Pyr, PV, and SST neurons. Simulations revealed that the intrinsic properties were sufficient to predict that PV neurons are primarily responsible for generating the nontrivial fixed points representing Up states. Simulations and analytical methods demonstrated that while the paradoxical effect is not obligatory in a model with two classes of inhibitory neurons, it is present in most regimes. Finally, experimental tests validated predictions of the model that the Pyr ↔ PV inhibitory loop is stronger than the Pyr ↔ SST loop. SIGNIFICANCE STATEMENT: Many cortical computations, such as working memory, rely on the local recurrent excitatory connections that define cortical circuit motifs. Up states are among the best studied examples of neural dynamic regimes that rely on recurrent excitation. However, this positive feedback must be held in check by inhibition. To address the relative contribution of PV and SST neurons, we characterized the intrinsic input-output differences between these classes of inhibitory neurons and, using experimental and theoretical methods, show that the higher threshold and gain of PV neurons lead to a dominant role in network stabilization.
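The first, experimental step described above amounts to summarizing each cell class by a threshold-linear F-I curve, i.e., a rheobase and a gain. A sketch of that characterization on made-up example data is shown below (the fitting procedure and numbers are assumptions, not the paper's measurements).

```python
import numpy as np

def fit_threshold_linear(I, F):
    """Fit F = gain * max(I - rheobase, 0) by scanning candidate rheobases and
    doing least squares on the rectified input (an illustrative procedure; the
    example data below are made up, not the recorded F-I curves)."""
    best = (np.inf, None, None)
    for rheo in np.linspace(I.min(), I.max(), 200):
        x = np.clip(I - rheo, 0, None)
        gain = (x @ F) / (x @ x + 1e-12)
        err = np.sum((F - gain * x) ** 2)
        if err < best[0]:
            best = (err, rheo, gain)
    return best[1], best[2]                     # rheobase (pA), gain (Hz/pA)

rng = np.random.default_rng(3)
I = np.arange(0.0, 400.0, 25.0)                 # injected current, pA
pv = np.clip(0.8 * (I - 150), 0, None) + rng.normal(0, 2, I.size)    # high threshold, steep
sst = np.clip(0.3 * (I - 60), 0, None) + rng.normal(0, 2, I.size)    # low threshold, shallow

for name, F in [("PV ", pv), ("SST", sst)]:
    rheo, gain = fit_threshold_linear(I, F)
    print(f"{name}: rheobase ~ {rheo:.0f} pA, gain ~ {gain:.2f} Hz/pA")
```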


Neurons/physiology; Action Potentials; Animals; Computer Simulation; Feedback, Physiological; Mice; Models, Neurological; Neurons/chemistry; Neurons/classification; Optogenetics; Parvalbumins/analysis; Pyramidal Cells/chemistry; Pyramidal Cells/physiology; Somatostatin/analysis; Transfection
10.
J Neurosci ; 40(48): 9224-9235, 2020 11 25.
Article En | MEDLINE | ID: mdl-33097639

Cortical responses to sensory stimuli are strongly modulated by temporal context. One of the best studied examples of such modulation is sensory adaptation. We first show that, in response to repeated tones, pyramidal (Pyr) neurons in male mouse auditory cortex (A1) exhibit facilitating and stable responses, in addition to adapting responses. To examine the potential mechanisms underlying these distinct temporal profiles, we developed a reduced spiking model of sensory cortical circuits that incorporated the signature short-term synaptic plasticity (STP) profiles of the inhibitory parvalbumin (PV) and somatostatin (SST) interneurons. The model accounted for all three temporal response profiles as the result of dynamic changes in excitatory/inhibitory balance produced by STP, primarily through shifts in the relative latency of Pyr and inhibitory neurons. Transition between the three response profiles was possible by changing the strength of the inhibitory PV→Pyr and SST→Pyr synapses. The model predicted that a unit's latency would be related to its temporal profile. Consistent with this prediction, the latency of stable units was significantly shorter than that of adapting and facilitating units. Furthermore, because of the history dependence of STP, the model generated a paradoxical prediction: that inactivation of inhibitory neurons during one tone would decrease the response of A1 neurons to a subsequent tone. Indeed, we observed that optogenetic inactivation of PV neurons during one tone counterintuitively decreased the spiking of Pyr neurons to a subsequent tone 400 ms later. These results provide evidence that STP is critical to temporal context-dependent responses in the sensory cortex. SIGNIFICANCE STATEMENT: Our perception of speech and music depends strongly on temporal context, i.e., the significance of a stimulus depends on the preceding stimuli. Complementary neural mechanisms are needed to sometimes ignore repetitive stimuli (e.g., the tick of a clock) or detect meaningful repetition (e.g., consecutive tones in Morse code). We modeled a neural circuit that accounts for diverse experimentally observed response profiles in auditory cortex (A1) neurons, based on known forms of short-term synaptic plasticity (STP). Whether the simulated circuit reduced, maintained, or enhanced its response to repeated tones depended on the relative dominance of two different types of inhibitory cells. The model made novel predictions that were experimentally validated. Results define an important role for STP in temporal context-dependent perception.


Acoustic Stimulation; Auditory Cortex/physiology; Neuronal Plasticity/physiology; Neurons/physiology; Parvalbumins/physiology; Somatostatin/physiology; Algorithms; Animals; Auditory Cortex/cytology; Computer Simulation; Male; Mice; Optogenetics; Pyramidal Cells/physiology
11.
Neuron ; 108(4): 651-658.e5, 2020 11 25.
Article En | MEDLINE | ID: mdl-32946745

Converging evidence suggests that the brain encodes time through dynamically changing patterns of neural activity, including neural sequences, ramping activity, and complex spatiotemporal dynamics. However, the potential computational significance and advantage of these different regimes have remained unaddressed. We combined large-scale recordings and modeling to compare population dynamics between premotor cortex and striatum in mice performing a two-interval timing task. Conventional decoders revealed that the dynamics within each area encoded time equally well; however, the dynamics in striatum exhibited a higher degree of sequentiality. Analysis of premotor and striatal dynamics, together with a large set of simulated prototypical dynamical regimes, revealed that regimes with higher sequentiality allowed a biologically constrained artificial downstream network to better read out time. These results suggest that, although different strategies exist for encoding time in the brain, neural sequences represent an ideal and flexible dynamical regime for enabling downstream areas to read out this information.
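Both analyses mentioned above, decoding elapsed time and quantifying sequentiality, can be illustrated on synthetic data. The sketch below uses a cross-validated ridge decoder of elapsed time and a peak-time-entropy index of sequentiality; both are common choices used here for illustration and may differ from the paper's exact methods.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_neurons, n_time, n_trials = 60, 100, 20
t = np.linspace(0, 1, n_time)

ramp = np.outer(rng.random(n_neurons), t)                              # ramping regime
centers = np.linspace(0, 1, n_neurons)
seq = np.exp(-((t[None] - centers[:, None]) ** 2) / (2 * 0.03 ** 2))   # neural sequence

def noisy_trials(template):
    """Stack noisy single trials: rows = trial x time bin, columns = neurons."""
    X = np.concatenate([template.T + rng.normal(0, 0.1, (n_time, n_neurons))
                        for _ in range(n_trials)])
    return X, np.tile(t, n_trials)

def sequentiality(template):
    """Normalized entropy of the distribution of peak times (a simple proxy;
    the paper's index may differ). 0 = all peaks coincide, 1 = evenly tiled."""
    p = np.bincount(template.argmax(axis=1), minlength=n_time) / template.shape[0]
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(n_time)

for name, tpl in [("ramps   ", ramp), ("sequence", seq)]:
    X, y = noisy_trials(tpl)
    r2 = cross_val_score(Ridge(alpha=1.0), X, y, cv=5).mean()          # decode elapsed time
    print(f"{name}: time-decoding R^2 = {r2:.2f}, sequentiality = {sequentiality(tpl):.2f}")
```

Both synthetic regimes support accurate time decoding, while only the sequence scores high on sequentiality, mirroring the dissociation described in the abstract.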


Corpus Striatum/physiology; Models, Neurological; Motor Cortex/physiology; Time Perception/physiology; Action Potentials/physiology; Animals; Computer Simulation; Male; Mice; Neurons/physiology
12.
PLoS One ; 15(1): e0221000, 2020.
Article En | MEDLINE | ID: mdl-31905200

A key feature of the brain's ability to tell time and generate complex temporal patterns is its capacity to produce similar temporal patterns at different speeds. For example, humans can tie a shoe, type, or play an instrument at different speeds or tempi-a phenomenon referred to as temporal scaling. While it is well established that training improves timing precision and accuracy, it is not known whether expertise improves temporal scaling, and if so, whether it generalizes across skill domains. We quantified temporal scaling and timing precision in musicians and non-musicians as they learned to tap a Morse code sequence. We found that non-musicians improved significantly over the course of days of training at the standard speed. In contrast, musicians exhibited a high level of temporal precision on the first day, which did not improve significantly with training. Although there was no significant difference in performance at the end of training at the standard speed, musicians were significantly better at temporal scaling-i.e., at reproducing the learned Morse code pattern at faster and slower speeds. Interestingly, both musicians and non-musicians exhibited a Weber-speed effect, where temporal precision at the same absolute time was higher when producing patterns at the faster speed. These results are the first to establish that the ability to generate the same motor patterns at different speeds improves with extensive training and generalizes to non-musical domains.


Auditory Perception/physiology; Brain/physiology; Music; Psychomotor Performance/physiology; Acoustic Stimulation; Acoustics; Adult; Female; Humans; Learning/physiology; Linear Models; Male
13.
Nat Commun ; 9(1): 4732, 2018 11 09.
Article En | MEDLINE | ID: mdl-30413692

Timing is fundamental to complex motor behaviors: from tying a knot to playing the piano. A general feature of motor timing is temporal scaling: the ability to produce motor patterns at different speeds. One theory of temporal processing proposes that the brain encodes time in dynamic patterns of neural activity (population clocks). Here, we first examine whether recurrent neural network (RNN) models can account for temporal scaling. Appropriately trained RNNs exhibit temporal scaling over a range similar to that of humans and capture a signature of motor timing, Weber's law, but predict that temporal precision improves at faster speeds. Human psychophysics experiments confirm this prediction: the variability of responses in absolute time is lower at faster speeds. These results establish that RNNs can account for temporal scaling and suggest a novel psychophysical principle: the Weber-Speed effect.
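Weber's law and the Weber-speed effect can be summarized by how the standard deviation of produced times grows with the target time at each speed. The sketch below computes a Weber coefficient from synthetic response times generated to mimic the reported effect; the data and numbers are illustrative assumptions, not the experimental results.

```python
import numpy as np

rng = np.random.default_rng(1)
targets = np.array([0.5, 1.0, 1.5, 2.0])           # event times within the pattern, s

def simulate_presses(targets, weber, n=200):
    """Synthetic response times with scalar variability (SD = weber * target).
    Values chosen to mimic the reported effect, purely for illustration."""
    return {float(T): T + rng.normal(0, weber * T, n) for T in targets}

def weber_coefficient(presses):
    """Slope of SD versus mean produced time (zero-intercept least squares)."""
    means = np.array([v.mean() for v in presses.values()])
    sds = np.array([v.std() for v in presses.values()])
    return (means @ sds) / (means @ means)

slow = simulate_presses(targets, weber=0.10)       # 1x speed
fast = simulate_presses(targets / 2, weber=0.07)   # 2x speed, lower Weber coefficient

print(f"Weber coefficient at 1x: {weber_coefficient(slow):.3f}")
print(f"Weber coefficient at 2x: {weber_coefficient(fast):.3f}")
# Weber-speed effect: at the same absolute time (0.5 s here), variability is
# lower when that time is produced within the faster version of the pattern.
print(f"SD at 0.5 s: 1x = {slow[0.5].std():.3f} s, 2x = {fast[0.5].std():.3f} s")
```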


Models, Biological; Motor Activity/physiology; Adolescent; Humans; Neural Networks, Computer; Time Factors; Young Adult
14.
Trends Neurosci ; 41(10): 701-711, 2018 10.
Article En | MEDLINE | ID: mdl-30274605

The ability to detect time intervals and temporal patterns is critical to some of the most fundamental computations the brain performs, including the ability to communicate and appraise a dynamically changing environment. Many of these computations take place on the scale of tens to hundreds of milliseconds. Electrophysiological evidence shows that some neurons respond selectively to duration, interval, rate, or order. Because the time constants of many time-varying neural and synaptic properties, including short-term synaptic plasticity (STP), are also in the range of tens to hundreds of milliseconds, they are strong candidates to underlie the formation of temporally selective neurons. Neurophysiological studies indicate that STP is indeed one of the mechanisms that contributes to temporal selectivity, and computational models demonstrate that neurons embedded in local microcircuits exhibit temporal selectivity if their synapses undergo STP. Converging evidence suggests that some forms of temporal selectivity emerge from the dynamic changes in the balance of excitation and inhibition imposed by STP.


Action Potentials/physiology; Neuronal Plasticity/physiology; Neurons/physiology; Synapses/physiology; Animals; Humans; Models, Neurological; Nerve Net/physiology
15.
Neuron ; 98(4): 687-705, 2018 05 16.
Article En | MEDLINE | ID: mdl-29772201

Timing is critical to most forms of learning, behavior, and sensory-motor processing. Converging evidence supports the notion that, precisely because of its importance across a wide range of brain functions, timing relies on intrinsic and general properties of neurons and neural circuits; that is, the brain uses its natural cellular and network dynamics to solve a diversity of temporal computations. Many circuits have been shown to encode elapsed time in dynamically changing patterns of neural activity-so-called population clocks. But temporal processing encompasses a wide range of different computations, and just as there are different circuits and mechanisms underlying computations about space, there are a multitude of circuits and mechanisms underlying the ability to tell time and generate temporal patterns.


Biological Clocks/physiology; Neurons/physiology; Time Perception; Animals; Anticipation, Psychological; Behavior; Brain; Cognition; Humans; Learning; Models, Neurological; Time
16.
Elife ; 7, 2018 03 14.
Article En | MEDLINE | ID: mdl-29537963

Much of the information the brain processes and stores is temporal in nature-a spoken word or a handwritten signature, for example, is defined by how it unfolds in time. However, it remains unclear how neural circuits encode complex time-varying patterns. We show that, by tuning its weights, a recurrent neural network (RNN) can recognize and then transcribe spoken digits. The model elucidates how neural dynamics in cortical networks may resolve three fundamental challenges: first, encoding multiple time-varying sensory and motor patterns as stable neural trajectories; second, generalizing across relevant spatial features; third, identifying the same stimuli played at different speeds-we show that this temporal invariance emerges because the recurrent dynamics generate neural trajectories with appropriately modulated angular velocities. Together, our results generate testable predictions as to how recurrent networks may use different mechanisms to generalize across the relevant spatial and temporal features of complex time-varying stimuli.
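The angular-velocity idea can be made concrete by measuring how quickly the population state vector rotates along a trajectory. The sketch below applies one simple definition (the angle between successive state vectors per unit time, an assumption rather than the paper's exact measure) to a toy trajectory traversed at two speeds.

```python
import numpy as np

def angular_velocity(X, dt):
    """Angle (rad/s) between successive population state vectors of a trajectory
    X (time x neurons): one simple way to quantify how fast a neural trajectory
    rotates through state space (an assumption, not the paper's exact measure)."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    cos = np.clip(np.sum(Xn[:-1] * Xn[1:], axis=1), -1.0, 1.0)
    return np.arccos(cos) / dt

def loop(duration, dt=0.01):
    """Toy trajectory that traverses the same closed path in `duration` seconds."""
    t = np.arange(0, duration, dt)
    phase = 2 * np.pi * t / duration
    return np.stack([np.cos(phase), np.sin(phase), 0.2 + 0 * phase], axis=1)

dt = 0.01
slow, fast = loop(2.0, dt), loop(1.0, dt)        # same path at 1x and 2x playback speed
print(f"mean angular velocity, slow: {angular_velocity(slow, dt).mean():.2f} rad/s")
print(f"mean angular velocity, fast: {angular_velocity(fast, dt).mean():.2f} rad/s")
```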


Brain/physiology; Cerebellar Cortex/physiology; Nerve Net/physiology; Neurons/physiology; Computer Simulation; Humans; Models, Neurological; Time
17.
Neural Comput ; 30(2): 378-396, 2018 02.
Article En | MEDLINE | ID: mdl-29162002

Brain activity evolves through time, creating trajectories of activity that underlie sensorimotor processing, behavior, and learning and memory. Therefore, understanding the temporal nature of neural dynamics is essential to understanding brain function and behavior. In vivo studies have demonstrated that sequential transient activation of neurons can encode time. However, it remains unclear whether these patterns emerge from feedforward network architectures or from recurrent networks and, furthermore, what role network structure plays in timing. We address these issues using a recurrent neural network (RNN) model with distinct populations of excitatory and inhibitory units. Consistent with experimental data, a single RNN could autonomously produce multiple functionally feedforward trajectories, thus potentially encoding multiple timed motor patterns lasting up to several seconds. Importantly, the model accounted for Weber's law, a hallmark of timing behavior. Analysis of network connectivity revealed that efficiency-a measure of network interconnectedness-decreased as the number of stored trajectories increased. Additionally, the balance of excitation (E) and inhibition (I) shifted toward excitation during each unit's activation time, generating the prediction that observed sequential activity relies on dynamic control of the E/I balance. Our results establish for the first time that the same RNN can generate multiple functionally feedforward patterns of activity as a result of dynamic shifts in the E/I balance imposed by the connectome of the RNN. We conclude that recurrent network architectures account for sequential neural activity, as well as for a fundamental signature of timing behavior: Weber's law.


Models, Neurological; Neurons/physiology; Animals; Behavior/physiology; Brain/physiology; Neural Inhibition/physiology; Neural Pathways/physiology; Synapses/physiology; Time Factors
18.
J Neurosci ; 37(4): 854-870, 2017 01 25.
Article En | MEDLINE | ID: mdl-28123021

Telling time is fundamental to many forms of learning and behavior, including the anticipation of rewarding events. Although the neural mechanisms underlying timing remain unknown, computational models have proposed that the brain represents time in the dynamics of neural networks. Consistent with this hypothesis, dynamically changing patterns of neural activity in a number of brain areas-including the striatum and cortex-have been shown to encode elapsed time. To date, however, no studies have explicitly quantified and contrasted how well different areas encode time by recording large numbers of units simultaneously from more than one area. Here, we performed large-scale extracellular recordings in the striatum and orbitofrontal cortex of mice that learned the temporal relationship between a stimulus and a reward and reported their response with anticipatory licking. We used a machine-learning algorithm to quantify how well populations of neurons encoded elapsed time from stimulus onset. Both the striatal and cortical networks encoded time, but the striatal network outperformed the orbitofrontal cortex, a finding replicated in both simultaneously and nonsimultaneously recorded corticostriatal datasets. The striatal network was also more reliable in predicting when the animals would lick, up to ∼1 s before the actual lick occurred. Our results are consistent with the hypothesis that temporal information is encoded in a widely distributed manner throughout multiple brain areas, but that the striatum may have a privileged role in timing because it has a more accurate "clock" as it integrates information across multiple cortical areas. SIGNIFICANCE STATEMENT: The neural representation of time is thought to be distributed across multiple functionally specialized brain structures, including the striatum and cortex. However, until now, the neural code for time has not been compared quantitatively between these areas. Here, we performed large-scale recordings in the striatum and orbitofrontal cortex of mice trained on a stimulus-reward association task involving a delay period and used a machine-learning algorithm to quantify how well populations of simultaneously recorded neurons encoded elapsed time from stimulus onset. We found that, although both areas encoded time, the striatum consistently outperformed the orbitofrontal cortex. These results suggest that the striatum may refine the code for time by integrating information from multiple inputs.
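The decoding comparison can be illustrated with a generic pipeline: build pseudo-populations with matched unit counts for each area, bin spike counts, and compare cross-validated classification of the elapsed-time bin. The sketch below runs that pipeline on synthetic stand-in data (the areas, rates, and counting window are assumptions, not the recorded datasets).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_bins, n_trials, n_units = 10, 40, 30        # time bins, trials, units per pseudo-population

def pseudo_population(window):
    """Binned spike counts (rows = trial x time bin, columns = units) from
    time-tuned units. Synthetic stand-in for recorded data; `window` (s) sets
    how reliable the counts are."""
    prefs = rng.integers(0, n_bins, n_units)          # preferred time bin of each unit
    X, y = [], []
    for _ in range(n_trials):
        for b in range(n_bins):
            rate = 2.0 + 8.0 * np.exp(-0.5 * ((b - prefs) / 1.5) ** 2)   # Hz
            X.append(rng.poisson(rate * window))       # spike counts in this bin
            y.append(b)
    return np.array(X), np.array(y)

# Two areas with matched unit counts, decoded with the same cross-validated
# classifier; here "area 1" is simply given more reliable counts than "area 2".
for name, window in [("area 1", 1.0), ("area 2", 0.2)]:
    X, y = pseudo_population(window)
    acc = cross_val_score(LogisticRegression(max_iter=5000), X, y, cv=5).mean()
    print(f"{name}: elapsed-time decoding accuracy = {acc:.2f} (chance = {1 / n_bins:.2f})")
```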


Anticipation, Psychological/physiology; Corpus Striatum/physiology; Nerve Net/physiology; Prefrontal Cortex/physiology; Time Perception/physiology; Animals; Conditioning, Psychological/physiology; Male; Mice; Mice, Inbred C57BL
19.
Curr Opin Behav Sci ; 8: 250-257, 2016 Apr.
Article En | MEDLINE | ID: mdl-27790629

Most of the computations and tasks performed by the brain require the ability to tell time, and to process and generate temporal patterns. Thus, there is a diverse set of neural mechanisms in place to allow the brain to tell time across a wide range of scales: from interaural delays on the order of microseconds to circadian rhythms and beyond. Temporal processing is most sophisticated on the scale of tens of milliseconds to a few seconds, because it is within this range that the brain must recognize and produce complex temporal patterns-such as those that characterize speech and music. Most models of timing, however, have focused primarily on simple intervals and durations; thus, it is not clear whether they will generalize to complex pattern-based temporal tasks. Here, we review neurobiologically based models of timing in the subsecond range, focusing on whether they generalize to tasks that require placing consecutive intervals in the context of an overall pattern, that is, pattern timing.

20.
Neuron ; 91(2): 320-7, 2016 07 20.
Article En | MEDLINE | ID: mdl-27346530

Telling time and anticipating when external events will happen are among the most important tasks the brain performs. Yet the neural mechanisms underlying timing remain elusive. One theory proposes that timing is a general and intrinsic computation of cortical circuits. We tested this hypothesis using electrical and optogenetic stimulation to determine if brain slices could "learn" temporal intervals. Presentation of intervals between 100 and 500 ms altered the temporal profile of evoked network activity in an interval- and pathway-specific manner-suggesting that the network learned to anticipate an expected stimulus. Recordings performed during training revealed a progressive increase in evoked network activity, followed by subsequent refinement of temporal dynamics, which was related to a time-window-specific increase in the excitatory-inhibitory balance. These results support the hypothesis that subsecond timing is an intrinsic computation and that timing emerges from network-wide, yet pathway-specific, changes in evoked neural dynamics.


Brain/physiology; Learning/physiology; Nerve Net/physiology; Neuronal Plasticity/physiology; Neurons/physiology; Animals; Models, Neurological; Optogenetics/methods; Patch-Clamp Techniques/methods; Tissue Culture Techniques/methods
...