Compositional memory in attractor neural networks with one-step learning.
Davis, Gregory P; Katz, Garrett E; Gentili, Rodolphe J; Reggia, James A.
  • Davis GP; Department of Computer Science, University of Maryland, College Park, MD, USA. Electronic address: grpdavis@umd.edu.
  • Katz GE; Department of Elec. Engr. and Comp. Sci., Syracuse University, Syracuse, NY, USA. Electronic address: gkatz01@syr.edu.
  • Gentili RJ; Department of Kinesiology, University of Maryland, College Park, MD, USA. Electronic address: rodolphe@umd.edu.
  • Reggia JA; Department of Computer Science, University of Maryland, College Park, MD, USA. Electronic address: reggia@umd.edu.
Neural Netw; 138: 78-97, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33631609
Compositionality refers to the ability of an intelligent system to construct models out of reusable parts. This is critical for the productivity and generalization of human reasoning, and is considered a necessary ingredient for human-level artificial intelligence. While traditional symbolic methods have proven effective for modeling compositionality, artificial neural networks struggle to learn systematic rules for encoding generalizable structured models. We suggest that this is due in part to short-term memory that is based on persistent maintenance of activity patterns without fast weight changes. We present a recurrent neural network that encodes structured representations as systems of contextually-gated dynamical attractors called attractor graphs. This network implements a functionally compositional working memory that is manipulated using top-down gating and fast local learning. We evaluate this approach with empirical experiments on storage and retrieval of graph-based data structures, as well as an automated hierarchical planning task. Our results demonstrate that compositional structures can be stored in and retrieved from neural working memory without persistent maintenance of multiple activity patterns. Further, memory capacity is improved by the use of a fast store-erase learning rule that permits controlled erasure and mutation of previously learned associations. We conclude that the combination of top-down gating and fast associative learning provides recurrent neural networks with a robust functional mechanism for compositional working memory.
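As a rough illustration of the kind of mechanism the abstract describes, the sketch below shows one-step Hebbian storage and erasure of activity patterns in a classical Hopfield-style attractor network. It is an illustrative assumption only: the network, the function names (store, erase, recall), and the update rule are conventional textbook constructions and do not reproduce the paper's contextually-gated attractor-graph architecture or its store-erase learning rule.

    # Illustrative sketch (not the authors' implementation): one-step Hebbian
    # storage and erasure of binary patterns in a Hopfield-style attractor network.
    import numpy as np

    def store(W, p):
        """Add pattern p (a +1/-1 vector) to the weights with one Hebbian step."""
        W = W + np.outer(p, p) / len(p)
        np.fill_diagonal(W, 0.0)
        return W

    def erase(W, p):
        """Remove a previously stored pattern by subtracting its outer product."""
        W = W - np.outer(p, p) / len(p)
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, x, steps=20):
        """Iterate the recurrent dynamics until the state settles into an attractor."""
        for _ in range(steps):
            x = np.sign(W @ x)
            x[x == 0] = 1
        return x

    # Usage: store two random patterns, retrieve one from a noisy cue, then erase it.
    rng = np.random.default_rng(0)
    n = 100
    W = np.zeros((n, n))
    p1, p2 = rng.choice([-1, 1], size=(2, n))
    W = store(store(W, p1), p2)
    cue = p1.copy()
    cue[:10] *= -1                               # corrupt 10% of the cue
    print(np.mean(recall(W, cue) == p1))         # should be 1.0 with only two stored patterns
    W = erase(W, p1)                             # controlled erasure of the stored association

Because storage and erasure are each a single additive weight update, associations can be written and removed without retraining, which is the property the fast store-erase rule in the paper is designed to exploit in a more controlled, gated form.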

Full text: 1 Database: MEDLINE Main subject: Machine Learning Limits: Humans Language: English Year: 2021 Document type: Article