An exact mapping from ReLU networks to spiking neural networks.
Stanojevic, Ana; Wozniak, Stanislaw; Bellec, Guillaume; Cherubini, Giovanni; Pantazi, Angeliki; Gerstner, Wulfram.
Affiliation
  • Stanojevic A; IBM Research Europe - Zurich, Rüschlikon, Switzerland; École polytechnique fédérale de Lausanne (EPFL), School of Life Sciences and School of Computer and Communication Sciences, Lausanne, Switzerland. Electronic address: ans@zurich.ibm.com.
  • Wozniak S; IBM Research Europe - Zurich, Rüschlikon, Switzerland.
  • Bellec G; École polytechnique fédérale de Lausanne (EPFL), School of Life Sciences and School of Computer and Communication Sciences, Lausanne, Switzerland.
  • Cherubini G; IBM Research Europe - Zurich, Rüschlikon, Switzerland.
  • Pantazi A; IBM Research Europe - Zurich, Rüschlikon, Switzerland.
  • Gerstner W; École polytechnique fédérale de Lausanne (EPFL), School of Life Sciences and School of Computer and Communication Sciences, Lausanne, Switzerland.
Neural Netw; 168: 74-88, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37742533
Deep spiking neural networks (SNNs) offer the promise of low-power artificial intelligence. However, training deep SNNs from scratch or converting deep artificial neural networks to SNNs without loss of performance has been a challenge. Here we propose an exact mapping from a network with Rectified Linear Units (ReLUs) to an SNN that fires exactly one spike per neuron. For our constructive proof, we assume that an arbitrary multi-layer ReLU network, with or without convolutional layers, batch normalization and max pooling layers, was trained to high performance on some training set. Furthermore, we assume that we have access to a representative example of input data used during training and to the exact parameters (weights and biases) of the trained ReLU network. The mapping from deep ReLU networks to SNNs causes no drop in accuracy on CIFAR10, CIFAR100 and the ImageNet-like data sets Places365 and PASS. More generally, our work shows that an arbitrary deep ReLU network can be replaced by an energy-efficient single-spike neural network without any loss of performance.
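The core idea of such a mapping can be illustrated with a toy single-neuron sketch (this is an illustration of the time-to-first-spike principle, not the paper's exact multi-layer construction): each input value is encoded as the timing of one spike, the membrane potential ramps up with a slope equal to the sum of the weights of the spikes received so far, and a firing deadline implements the ReLU cutoff. The auxiliary spike, threshold choice, and decoding convention below are assumptions made for this sketch.

```python
import numpy as np

def relu(x, w, b):
    """Reference ReLU unit: max(0, w.x + b)."""
    return max(0.0, float(np.dot(w, x) + b))

def ttfs_neuron(x, w, b, t_max=1.0):
    """Toy time-to-first-spike neuron. Each input x_i in [0, 1] is encoded
    as a single spike at t_i = t_max - x_i (larger value -> earlier spike).
    Illustration only; the paper derives exact per-layer parameters."""
    spikes = [(t_max - xi, wi) for xi, wi in zip(x, w)]
    # Auxiliary spike at t_max normalizes the total slope to 1, so the
    # threshold crossing time becomes linear in w.x (a simplifying trick).
    spikes.append((t_max, 1.0 - sum(w)))
    theta = 1.0 - b          # threshold shifted so timing decodes to w.x + b
    spikes.sort()
    deadline = 2.0 * t_max   # spiking at the deadline decodes to 0 -> ReLU
    t, v, slope = 0.0, 0.0, 0.0
    for t_s, w_s in spikes + [(deadline, 0.0)]:
        if slope > 0 and v + slope * (t_s - t) >= theta:
            t_fire = t + (theta - v) / slope  # crossing within this segment
            return deadline - t_fire          # decode activation from timing
        v += slope * (t_s - t)
        t, slope = t_s, slope + w_s
    return 0.0                                # no spike before the deadline
```

On inputs where the weighted sum is positive, the decoded firing time reproduces the ReLU output exactly (up to floating point); when it is negative, the neuron misses the deadline and decodes to zero, matching ReLU's clipping.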

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Artificial Intelligence / Neural Networks, Computer Language: English Journal: Neural Netw Journal subject: Neurology Year of publication: 2023 Document type: Article