1.
Proc Natl Acad Sci U S A; 117(34): 20881-20889, 2020 Aug 25.
Article in English | MEDLINE | ID: mdl-32788365

ABSTRACT

Language processing involves the ability to store and integrate pieces of information in working memory over short periods of time. According to the dominant view, information is maintained through sustained, elevated neural activity. Other work has argued that short-term synaptic facilitation can serve as a substrate of memory. Here we propose an account where memory is supported by intrinsic plasticity that downregulates neuronal firing rates. Single-neuron responses are dependent on experience, and we show through simulations that these adaptive changes in excitability provide memory on timescales ranging from milliseconds to seconds. On this account, spiking activity writes information into coupled dynamic variables that control adaptation and move at slower timescales than the membrane potential. From these variables, information is continuously read back into the active membrane state for processing. This neuronal memory mechanism does not rely on persistent activity, excitatory feedback, or synaptic plasticity for storage. Instead, information is maintained in adaptive conductances that reduce firing rates and can be accessed directly without cued retrieval. Memory span is systematically related to both the time constant of adaptation and baseline levels of neuronal excitability. Interference effects within memory arise when adaptation is long-lasting. We demonstrate that this mechanism is sensitive to context and serial order, which makes it suitable for temporal integration in sequence processing within the language domain. We also show that it enables the binding of linguistic features over time within dynamic memory registers. This work provides a step toward a computational neurobiology of language.


Subject(s)
Memory, Short-Term/physiology; Neuronal Plasticity/physiology; Neurons/physiology; Animals; Humans; Language; Models, Neurological; Neural Networks, Computer; Neurons/metabolism; Synapses/physiology
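
The adaptation-based memory mechanism described in this abstract can be illustrated with a small simulation. The sketch below is not the authors' code; it assumes a generic leaky integrate-and-fire neuron with a spike-triggered adaptation current, and all parameter values are illustrative. Spiking during a stimulus "writes" into the slow adaptation variable, and the reduced response to an identical probe a couple of seconds later "reads" that history back out, without persistent activity or synaptic change.

```python
import numpy as np

# Illustrative sketch (not the paper's model): leaky integrate-and-fire neuron
# with a spike-triggered adaptation current w. Spikes "write" into w; the
# lowered firing rate to a later, identical probe "reads" the history back out.

dt       = 0.1     # ms, integration step
tau_m    = 20.0    # ms, membrane time constant
tau_w    = 1000.0  # ms, adaptation time constant (sets the memory span)
v_rest   = -70.0   # mV
v_thresh = -50.0   # mV
v_reset  = -65.0   # mV
b        = 2.0     # adaptation increment per spike (arbitrary units)

def run(input_current):
    """Simulate the adaptive LIF neuron; return spike times and the adaptation trace."""
    v, w = v_rest, 0.0
    spikes, w_trace = [], []
    for i, I in enumerate(input_current):
        # membrane dynamics: leak + input - adaptation
        v += dt / tau_m * (-(v - v_rest) + I - w)
        # slow decay of the adaptation variable back to zero
        w += dt / tau_w * (-w)
        if v >= v_thresh:
            spikes.append(i * dt)
            v = v_reset
            w += b          # spike-triggered increase in adaptation
        w_trace.append(w)
    return np.array(spikes), np.array(w_trace)

T = 3000.0                               # ms
t = np.arange(0.0, T, dt)
I = np.zeros_like(t)
I[(t > 200) & (t < 400)] = 40.0          # "write" stimulus
I[(t > 2200) & (t < 2400)] = 40.0        # identical probe ~2 s later

spikes, w_trace = run(I)
rate_write = np.sum((spikes > 200) & (spikes < 400)) / 0.2     # Hz
rate_probe = np.sum((spikes > 2200) & (spikes < 2400)) / 0.2   # Hz
print(f"firing rate during stimulus: {rate_write:.1f} Hz")
print(f"firing rate during probe:    {rate_probe:.1f} Hz (reduced by lingering adaptation)")
print(f"adaptation at probe onset:   {w_trace[int(2200 / dt)]:.1f} "
      f"(vs {w_trace[int(200 / dt)]:.1f} at stimulus onset)")
```

In this toy model the memory span is governed by tau_w (here 1 s) and by how close the baseline drive sits to threshold, mirroring the abstract's point that memory span depends on both the adaptation time constant and baseline excitability.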
2.
PLoS Comput Biol; 12(6): e1004895, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27309381

ABSTRACT

As the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g., for locomotion. Many such computations require the prior construction of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. First, we derive constraints under which classes of spiking neural networks lend themselves as substrates for powerful general-purpose computing. The networks contain dendritic or synaptic nonlinearities and have constrained connectivity. We then combine such networks with learning rules for the outputs or the recurrent connections. We show that this enables the learning of even difficult benchmark tasks, such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them.


Subject(s)
Action Potentials/physiology; Learning/physiology; Models, Neurological; Animals; Computational Biology; Humans; Memory, Long-Term/physiology; Nerve Net/physiology; Neural Networks, Computer; Neurons/physiology; Nonlinear Dynamics; Synaptic Transmission/physiology
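
To make the output-learning idea in this abstract concrete, here is a minimal reservoir-style sketch, again not the paper's implementation: a fixed random network of leaky integrate-and-fire neurons is driven by a signal, and a linear readout is fitted by batch least squares on the filtered spike trains to reproduce a delayed copy of the input, a simple memory-dependent computation. The network size, neuron model, connectivity scaling, and task here are all illustrative assumptions.

```python
import numpy as np

# Illustrative reservoir-style sketch (not the paper's model): fixed random LIF
# network driven by an input, linear readout trained by batch least squares on
# filtered spike trains to output a delayed copy of the input.

rng = np.random.default_rng(0)

N     = 300      # neurons
dt    = 1.0      # ms
T     = 20000    # time steps
tau_m = 20.0     # ms, membrane time constant
tau_s = 30.0     # ms, synaptic/readout filter time constant
v_thresh, v_reset = 1.0, 0.0

W_rec = rng.normal(0, 1.0 / np.sqrt(N), size=(N, N))   # fixed recurrent weights
W_in  = rng.normal(0, 1.0, size=N)                      # fixed input weights

u = np.sin(2 * np.pi * np.arange(T) * dt / 500.0)       # input signal
delay = 50                                               # target: input delayed by 50 ms
target = np.roll(u, delay)

v = np.zeros(N)
r = np.zeros(N)              # filtered spike trains (readout basis)
R = np.zeros((T, N))         # collected states for the regression

for i in range(T):
    spikes = (v >= v_thresh).astype(float)
    v = np.where(spikes > 0, v_reset, v)
    # leaky integration of recurrent + input drive
    v += dt / tau_m * (-v + W_rec @ r + W_in * u[i])
    # exponential filtering of spikes; this is what the readout sees
    r += dt / tau_s * (-r) + spikes
    R[i] = r

# train the linear readout on the first half of the run, test on the second half
half = T // 2
w_out, *_ = np.linalg.lstsq(R[:half], target[:half], rcond=None)
pred = R[half:] @ w_out
mse = np.mean((pred - target[half:]) ** 2)
print(f"test MSE of delayed-signal readout: {mse:.4f}")
```

Training only the readout keeps the recurrent weights fixed, which is the simplest of the output-learning schemes mentioned in the abstract; learning the recurrent connections themselves, which the paper also considers, could instead use an online rule such as recursive least squares rather than the one-shot regression used here.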