1.
Neuron; 107(4): 703-716.e4, 2020 Aug 19.
Article in English | MEDLINE | ID: mdl-32521223

ABSTRACT

Neurons are often considered specialized functional units that encode a single variable. However, many neurons are observed to respond to a mix of disparate sensory, cognitive, and behavioral variables. For such representations, information is distributed across multiple neurons. Here we find this distributed code in the dentate gyrus and CA1 subregions of the hippocampus. Using calcium imaging in freely moving mice, we decoded an animal's position, direction of motion, and speed from the activity of hundreds of cells. The response properties of individual neurons were only partially predictive of their importance for encoding position. Non-place cells also carried information about position and contributed to its encoding when combined with other cells. Indeed, disrupting the correlations between neural activities decreased decoding performance, mostly in CA1. Our analysis indicates that population methods, rather than classical analyses based on single-cell response properties, may more accurately characterize the neural code in the hippocampus.


Subject(s)
Action Potentials/physiology; CA1 Region, Hippocampal/physiology; Calcium/metabolism; Dentate Gyrus/physiology; Neurons/physiology; Spatial Behavior/physiology; Animals; Mice
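
To make the population-decoding approach described in this abstract concrete, here is a minimal, hypothetical Python sketch (not the authors' pipeline, data, or parameters): it simulates noisy, place-tuned "calcium" signals for a population of cells and compares decoding position from the full population against the best single cell, using a ridge-regression decoder from scikit-learn. All sizes, noise levels, and the choice of decoder are assumptions made for illustration.

```python
# Minimal sketch (hypothetical, not the authors' pipeline): decode 1-D position
# from a simulated population of noisy, place-tuned "calcium" signals and
# compare population decoding with the best single cell.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cells, n_samples = 200, 3000
position = rng.uniform(0.0, 1.0, size=n_samples)      # position on a 1-m track

# Each cell gets a Gaussian place field plus strong noise, so single cells are
# only weakly informative while the population as a whole is not.
centers = rng.uniform(0.0, 1.0, size=n_cells)
widths = rng.uniform(0.05, 0.20, size=n_cells)
activity = np.exp(-((position[:, None] - centers[None, :]) ** 2)
                  / (2.0 * widths[None, :] ** 2))
activity += rng.normal(scale=0.5, size=activity.shape)  # calcium-like noise

X_tr, X_te, y_tr, y_te = train_test_split(activity, position, random_state=0)

# Population decoder: ridge regression on all cells at once.
pop_err = np.abs(Ridge(alpha=1.0).fit(X_tr, y_tr).predict(X_te) - y_te).mean()

# Best single-cell decoder, for comparison.
single_errs = [
    np.abs(Ridge(alpha=1.0).fit(X_tr[:, [i]], y_tr).predict(X_te[:, [i]]) - y_te).mean()
    for i in range(n_cells)
]

print(f"population decoding error: {pop_err:.3f} m")
print(f"best single-cell error:    {min(single_errs):.3f} m")
```

On simulated data of this kind, the population decoder typically far outperforms any single cell, which is the qualitative point the abstract makes about distributed codes.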
2.
J Neurosci; 38(46): 9900-9924, 2018 Nov 14.
Article in English | MEDLINE | ID: mdl-30249794

ABSTRACT

For many neural network models in which neurons are trained to classify inputs like perceptrons, the number of inputs that can be classified is limited by the connectivity of each neuron, even when the total number of neurons is very large. This poses the problem of how the biological brain can take advantage of its huge number of neurons given that the connectivity is sparse. One solution is to combine multiple perceptrons together, as in committee machines. The number of classifiable random patterns would then grow linearly with the number of perceptrons, even when each perceptron has limited connectivity. However, the problem is moved to the downstream readout neurons, which would need a number of connections as large as the number of perceptrons. Here we propose a different approach in which the readout is implemented by connecting multiple perceptrons in a recurrent attractor neural network. We prove analytically that the number of classifiable random patterns can grow unboundedly with the number of perceptrons, even when the connectivity of each perceptron remains finite. Most importantly, both the recurrent connectivity and the connectivity of downstream readouts also remain finite. Our study shows that feedforward neural classifiers with numerous long-range afferent connections can be replaced by recurrent networks with sparse long-range connectivity without sacrificing the classification performance. Our strategy could be used to design more general scalable network architectures with limited connectivity, which resemble more closely the brain neural circuits that are dominated by recurrent connectivity.

SIGNIFICANCE STATEMENT The mammalian brain has a huge number of neurons, but the connectivity is rather sparse. This observation seems to contrast with the theoretical studies showing that for many neural network models the performance scales with the number of connections per neuron and not with the total number of neurons. To solve this dilemma, we propose a model in which a recurrent network reads out multiple neural classifiers. Its performance scales with the total number of neurons even when each neuron of the network has limited connectivity. Our study reveals an important role of recurrent connections in neural systems like the hippocampus, in which the computational limitations due to sparse long-range feedforward connectivity might be compensated by local recurrent connections.


Subject(s)
Models, Neurological; Neural Networks, Computer; Neurons/physiology; Action Potentials/physiology; Animals; Brain/physiology
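
The committee-machine idea that this abstract uses as a point of comparison can be illustrated with a toy Python sketch. This is not the authors' recurrent-readout model (their result concerns a recurrent attractor network and an analytical capacity proof); it only shows a committee of perceptrons, each wired to a sparse random subset of the inputs, whose outputs are pooled by a simple majority readout over random patterns. All sizes and training settings are assumptions chosen for the example.

```python
# Toy committee machine (hypothetical example, not the authors' model):
# each perceptron sees only a sparse random subset of the inputs and is
# trained on the same random patterns; a majority vote pools their outputs.
import numpy as np

rng = np.random.default_rng(1)
n_inputs, connectivity = 500, 50      # each perceptron connects to 50 of 500 inputs
n_patterns = 300                      # random +/-1 patterns with random +/-1 labels
X = rng.choice([-1.0, 1.0], size=(n_patterns, n_inputs))
y = rng.choice([-1.0, 1.0], size=n_patterns)

def train_perceptron(X_sub, labels, epochs=30):
    """Classic perceptron learning rule on a subset of the inputs."""
    w = np.zeros(X_sub.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X_sub, labels):
            if yi * (xi @ w) <= 0:    # misclassified (or on the boundary): update
                w += yi * xi
    return w

def committee_accuracy(n_perceptrons):
    """Train perceptrons on random input subsets and pool them by majority vote."""
    votes = np.zeros(n_patterns)
    for _ in range(n_perceptrons):
        idx = rng.choice(n_inputs, size=connectivity, replace=False)
        w = train_perceptron(X[:, idx], y)
        votes += np.sign(X[:, idx] @ w)
    return np.mean(np.sign(votes) == y)

for k in (1, 9, 33, 99):
    print(f"{k:3d} perceptrons -> training accuracy {committee_accuracy(k):.2f}")
```

Training accuracy on the random labels should rise as perceptrons are added, even though each one keeps the same limited connectivity; note that the readout here is a dense vote over all perceptrons, which is exactly the bottleneck the paper's recurrent readout is designed to avoid.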