Overcoming catastrophic forgetting in neural networks.
Kirkpatrick, James; Pascanu, Razvan; Rabinowitz, Neil; Veness, Joel; Desjardins, Guillaume; Rusu, Andrei A; Milan, Kieran; Quan, John; Ramalho, Tiago; Grabska-Barwinska, Agnieszka; Hassabis, Demis; Clopath, Claudia; Kumaran, Dharshan; Hadsell, Raia.
Affiliation
  • Kirkpatrick J; DeepMind, London EC4 5TW, United Kingdom; kirkpatrick@google.com.
  • Pascanu R; DeepMind, London EC4 5TW, United Kingdom.
  • Rabinowitz N; DeepMind, London EC4 5TW, United Kingdom.
  • Veness J; DeepMind, London EC4 5TW, United Kingdom.
  • Desjardins G; DeepMind, London EC4 5TW, United Kingdom.
  • Rusu AA; DeepMind, London EC4 5TW, United Kingdom.
  • Milan K; DeepMind, London EC4 5TW, United Kingdom.
  • Quan J; DeepMind, London EC4 5TW, United Kingdom.
  • Ramalho T; DeepMind, London EC4 5TW, United Kingdom.
  • Grabska-Barwinska A; DeepMind, London EC4 5TW, United Kingdom.
  • Hassabis D; DeepMind, London EC4 5TW, United Kingdom.
  • Clopath C; Bioengineering Department, Imperial College London, London SW7 2AZ, United Kingdom.
  • Kumaran D; DeepMind, London EC4 5TW, United Kingdom.
  • Hadsell R; DeepMind, London EC4 5TW, United Kingdom.
Proc Natl Acad Sci U S A; 114(13): 3521-3526, 2017 Mar 28.
Article in English | MEDLINE | ID: mdl-28292907
ABSTRACT
The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence. Until now, neural networks have not been capable of this, and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models. We show that it is possible to overcome this limitation and train networks that can maintain expertise on tasks that they have not experienced for a long time. Our approach remembers old tasks by selectively slowing down learning on the weights important for those tasks. We demonstrate that our approach is scalable and effective by solving a set of classification tasks based on a hand-written digit dataset and by learning several Atari 2600 games sequentially.
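The mechanism the abstract describes, slowing learning on weights important to earlier tasks, is what the paper calls elastic weight consolidation (EWC): a quadratic penalty anchored at the task-A parameters and weighted by a diagonal Fisher information estimate. The sketch below is a minimal illustration of that idea, assuming a PyTorch classifier; the function names, the single-batch stand-in loader, and the lambda value are illustrative assumptions, not details from the paper.

```python
# Minimal EWC-style sketch (illustrative, not the paper's implementation).
import torch
import torch.nn.functional as F


def diagonal_fisher(model, data_loader, device="cpu"):
    """Estimate a diagonal Fisher information term per parameter on task A."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, y in data_loader:
        x, y = x.to(device), y.to(device)
        model.zero_grad()
        log_probs = F.log_softmax(model(x), dim=1)
        # Squared gradients of the log-likelihood approximate the Fisher diagonal.
        F.nll_loss(log_probs, y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    for n in fisher:
        fisher[n] /= len(data_loader)
    return fisher


def ewc_penalty(model, fisher, old_params, lam=400.0):
    """Quadratic penalty that slows learning on weights important for task A."""
    loss = 0.0
    for n, p in model.named_parameters():
        loss = loss + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return 0.5 * lam * loss


if __name__ == "__main__":
    # Toy demonstration with random data standing in for "task A".
    model = torch.nn.Linear(10, 3)
    xs = torch.randn(32, 10)
    ys = torch.randint(0, 3, (32,))
    loader = [(xs, ys)]  # single-batch stand-in for a DataLoader
    fisher = diagonal_fisher(model, loader)
    old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
    # Training on task B would then minimize: task_B_loss + ewc_penalty(...)
    print(ewc_penalty(model, fisher, old_params).item())  # 0.0 right after task A
```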
Full text: 1 | Database: MEDLINE | Main subject: Neural Networks, Computer | Limits: Humans | Language: English | Journal: Proc Natl Acad Sci U S A | Year: 2017 | Document type: Article