eLife. 2022 Apr 25; 11.
Article in English | MEDLINE | ID: mdl-35467527

ABSTRACT

In many normative theories of synaptic plasticity, weight updates implicitly depend on the chosen parametrization of the weights. This problem relates, for example, to neuronal morphology: synapses which are functionally equivalent in terms of their impact on somatic firing can differ substantially in spine size due to their different positions along the dendritic tree. Classical theories based on Euclidean-gradient descent can easily lead to inconsistencies due to such parametrization dependence. The issues are solved in the framework of Riemannian geometry, in which we propose that plasticity instead follows natural-gradient descent. Under this hypothesis, we derive a synaptic learning rule for spiking neurons that couples functional efficiency with the explanation of several well-documented biological phenomena such as dendritic democracy, multiplicative scaling, and heterosynaptic plasticity. We therefore suggest that in its search for functional synaptic plasticity, evolution might have come up with its own version of natural-gradient descent.
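The parametrization invariance claimed above can be illustrated with a minimal numerical sketch, not the paper's spiking-neuron rule: a toy linear neuron with a squared-error loss, where a hypothetical reparametrization `w = theta**3` stands in for, e.g., dendritic attenuation. A Euclidean-gradient step in the `theta` coordinates produces a different effective weight change than a step taken directly in `w`, whereas the natural-gradient step (gradient preconditioned by the inverse Fisher metric) yields the same weight change in either parametrization. All variable names and the cubic reparametrization are illustrative assumptions.

```python
import numpy as np

# Toy linear neuron: output y = w * x, squared-error loss L = 0.5 * (w*x - t)**2.
# Under a Gaussian output likelihood with unit variance, the Fisher
# information of the output with respect to w is F_w = x**2.

x, t = 2.0, 3.0       # input and target (illustrative values)
w = 0.5               # current weight in the "functional" coordinate
theta = w ** (1 / 3)  # the same synapse in the alternative parametrization w = theta**3
eta = 1e-3            # learning rate

dL_dw = (w * x - t) * x  # Euclidean gradient in w
F_w = x**2               # Fisher metric in w-coordinates

# Natural-gradient step computed directly in w-coordinates
dw_natural = -eta * dL_dw / F_w

# The same step computed in theta-coordinates, then mapped back to w
J = 3 * theta**2          # Jacobian dw/dtheta of the reparametrization
dL_dtheta = dL_dw * J     # chain rule
F_theta = J**2 * F_w      # the metric transforms with the squared Jacobian
dtheta = -eta * dL_dtheta / F_theta
dw_via_theta = J * dtheta  # effect of the theta-step on the weight w

# Euclidean descent in theta-coordinates gives a *different* weight change:
dw_euclid_theta = J * (-eta * dL_dtheta)

print(dw_natural, dw_via_theta, dw_euclid_theta)
```

The Jacobian cancels exactly in the natural-gradient update (`dw_natural == dw_via_theta`), while the Euclidean update picks up a factor of the squared Jacobian and so depends on the chosen coordinates; this is the inconsistency the abstract attributes to Euclidean-gradient theories of plasticity.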


Subject(s)
Neurons , Synapses , Learning/physiology , Neuronal Plasticity/physiology , Neurons/physiology , Synapses/physiology