A scalable implementation of the recursive least-squares algorithm for training spiking neural networks.
Arthur, Benjamin J; Kim, Christopher M; Chen, Susu; Preibisch, Stephan; Darshan, Ran.
Affiliation
  • Arthur BJ; Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, United States.
  • Kim CM; Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, United States.
  • Chen S; Laboratory of Biological Modeling, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD, United States.
  • Preibisch S; Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, United States.
  • Darshan R; Janelia Research Campus, Howard Hughes Medical Institute, Ashburn, VA, United States.
Front Neuroinform ; 17: 1099510, 2023.
Article in English | MEDLINE | ID: mdl-37441157
ABSTRACT
Training spiking recurrent neural networks on neuronal recordings or behavioral tasks has become a popular way to study computations performed by the nervous system. As the size and complexity of neural recordings increase, there is a need for efficient algorithms that can train models in a short period of time using minimal resources. We present optimized CPU and GPU implementations of the recursive least-squares algorithm in spiking neural networks. The GPU implementation can train networks of one million neurons, with 100 million plastic synapses and a billion static synapses, about 1,000 times faster than an unoptimized reference CPU implementation. We demonstrate the code's utility by training a network, in less than an hour, to reproduce the activity of >66,000 recorded neurons of a mouse performing a decision-making task. The fast implementation enables a more interactive in-silico study of the dynamics and connectivity underlying multi-area computations. It also opens the possibility of training models while in-vivo experiments are being conducted, thus closing the loop between modeling and experiments.
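For readers unfamiliar with the algorithm named in the title, the core recursive least-squares (FORCE-style) update can be sketched as below. This is an illustrative NumPy sketch of the generic rank-one RLS update, not the paper's optimized CPU/GPU implementation; the function name and toy setup are assumptions.

```python
import numpy as np

def rls_step(w, P, r, target):
    """One recursive least-squares update of readout weights w.

    w      : (N,) current weight vector
    P      : (N, N) running inverse-correlation estimate of the rates r
    r      : (N,) activity (e.g., filtered spike rates) at this step
    target : scalar desired output at this step
    Returns the updated (w, P) and the pre-update error e.
    """
    Pr = P @ r
    denom = 1.0 + r @ Pr          # scalar gain normalizer
    P -= np.outer(Pr, Pr) / denom # rank-one Sherman-Morrison update of P
    e = w @ r - target            # error before the weight change
    w -= e * (Pr / denom)         # Pr/denom equals the updated P @ r
    return w, P, e
```

Each step costs O(N^2) per plastic row, which is why scaling this update to 100 million plastic synapses is the crux of the implementation the abstract describes.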
Full text: 1 Database: MEDLINE Language: English Year of publication: 2023 Document type: Article