Perceptrons from memristors.
Silva, Francisco; Sanz, Mikel; Seixas, João; Solano, Enrique; Omar, Yasser.
Affiliation
  • Silva F; Instituto de Telecomunicações, Physics of Information and Quantum Technologies Group, Portugal. Electronic address: francisco.horta.ferreira.da.silva@tecnico.ulisboa.pt.
  • Sanz M; Department of Physical Chemistry, University of the Basque Country UPV/EHU, Apartado 644, E-48080 Bilbao, Spain. Electronic address: mikel.sanz@ehu.eus.
  • Seixas J; Instituto de Telecomunicações, Physics of Information and Quantum Technologies Group, Portugal; Instituto Superior Técnico, Universidade de Lisboa, Portugal; CeFEMA, Instituto Superior Técnico, Universidade de Lisboa, Portugal; Laboratório de Instrumentação e Física Experimental de Partículas (LIP),
  • Solano E; Department of Physical Chemistry, University of the Basque Country UPV/EHU, Apartado 644, E-48080 Bilbao, Spain; IKERBASQUE, Basque Foundation for Science, Maria Diaz de Haro 3, 48013 Bilbao, Spain; International Center of Quantum Artificial Intelligence for Science and Technology (QuArtist), and De
  • Omar Y; Instituto de Telecomunicações, Physics of Information and Quantum Technologies Group, Portugal; Instituto Superior Técnico, Universidade de Lisboa, Portugal. Electronic address: yasser.omar@lx.it.pt.
Neural Netw ; 122: 273-278, 2020 Feb.
Article in En | MEDLINE | ID: mdl-31731044
ABSTRACT
Memristors, resistors with memory whose outputs depend on the history of their inputs, have been used with success in neuromorphic architectures, particularly as synapses and non-volatile memories. However, to the best of our knowledge, no model for a network in which both the synapses and the neurons are implemented using memristors has been proposed so far. In the present work we introduce models for single and multilayer perceptrons based exclusively on memristors. We adapt the delta rule to the memristor-based single-layer perceptron and the backpropagation algorithm to the memristor-based multilayer perceptron. Our results show that both perform as expected for perceptrons, including satisfying Minsky-Papert's theorem. As a consequence of the Universal Approximation Theorem, they also show that memristors are universal function approximators. By using memristors for both the neurons and the synapses, our models pave the way for novel memristor-based neural network architectures and algorithms. A neural network based on memristors could show advantages in terms of energy conservation and open up possibilities for other learning systems to be adapted to a memristor-based paradigm, both in the classical and quantum learning realms.
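For illustration, the sketch below mimics in simulation the kind of delta-rule training described in the abstract, treating each memristor's conductance as an adjustable synaptic weight whose update depends on the error signal driven through it. The idealized device model, the AND-gate task, and the learning rate are assumptions made for this example only; they are not the device equations or training setup used in the paper.

```python
import numpy as np

# Minimal sketch of a single-layer "memristive" perceptron trained with the
# delta rule. Each memristor is modelled here as an idealized device whose
# conductance acts as a synaptic weight (an illustrative assumption, not the
# paper's device model).

rng = np.random.default_rng(0)

def activation(z):
    # Hard threshold, as in a classical perceptron.
    return (z >= 0.0).astype(float)

# AND gate: linearly separable, so a single-layer perceptron can learn it,
# consistent with the Minsky-Papert theorem cited in the abstract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

g = rng.normal(scale=0.1, size=2)  # memristor conductances acting as weights
b = 0.0                            # bias term
eta = 0.1                          # learning rate (assumed value)

for epoch in range(50):
    for x, t in zip(X, y):
        out = activation(g @ x + b)
        # Delta rule: the conductance change depends on the history of
        # applied inputs through the error signal, loosely mirroring how a
        # memristor's state depends on the charge previously driven through it.
        g += eta * (t - out) * x
        b += eta * (t - out)

print("learned conductances:", g, "bias:", b)
print("predictions:", activation(X @ g + b))
```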
Subjects
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Synapses / Neural Networks, Computer / Neurons Study type: Prognostic_studies Language: En Journal: Neural Netw Journal subject: NEUROLOGY Publication year: 2020 Document type: Article
