ABSTRACT
We consider a coupled map lattice defined on a hypercube in M dimensions, taken here as the information space, to model memory retrieval and information association by a neural network. We assume that both neuronal activity and spike timing may carry information. In this model the state of the network at a given time t is completely determined by the intensity y(sigma,t) with which the information pattern represented by the integer sigma is being expressed by the network. Logistic maps, coupled in the information space, describe the evolution of the intensity function y(sigma,t), with the intent of modeling memory retrieval in neural systems. We calculate the phase diagram of the system with respect to its ability to work as an associative memory. We show that this model is capable of retrieving a correlated set of memories simultaneously, after a relatively long transient that may be associated with the retrieval of concatenated memorized patterns leading to a final attractor.
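The dynamics described above can be sketched as logistic maps placed on the vertices of the M-dimensional hypercube and diffusively coupled to their Hamming-distance-one neighbours. The diffusive coupling form, the parameter values, and the function name `simulate_cml` are assumptions for illustration; the abstract does not specify the coupling.

```python
import numpy as np

def simulate_cml(M=4, a=3.8, eps=0.1, steps=200, seed=0):
    """Sketch of a coupled map lattice on the M-dimensional hypercube.

    Sites are bit strings sigma in {0,1}^M, indexed 0..2^M - 1. Each site
    holds an intensity y(sigma, t) updated by the logistic map
    f(y) = a*y*(1 - y), diffusively coupled (strength eps, an assumed
    coupling form) to the M neighbours at Hamming distance one.
    """
    rng = np.random.default_rng(seed)
    n = 2 ** M
    y = rng.random(n)                    # random initial intensities in [0, 1)
    # neighbours[s] lists the M sites differing from s in exactly one bit
    neighbours = np.array([[s ^ (1 << i) for i in range(M)] for s in range(n)])
    for _ in range(steps):
        fy = a * y * (1.0 - y)           # local logistic update at every site
        # diffusive coupling: mix each site with the mean of its neighbours
        y = (1.0 - eps) * fy + eps * fy[neighbours].mean(axis=1)
    return y

y = simulate_cml()
```

The XOR trick `s ^ (1 << i)` flips bit i of the site index, which is exactly a step along one edge of the hypercube, so neighbourhood lookup needs no explicit graph structure.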
Subject(s)
Action Potentials/physiology , Information Storage and Retrieval/methods , Long-Term Potentiation/physiology , Memory/physiology , Models, Neurological , Nerve Net/physiology , Neurons/physiology , Computer Simulation , Logistic Models , Statistics as Topic

ABSTRACT
We propose a coupled map lattice defined on a hypercube in M dimensions, the information space, to model memory retrieval by a neural network. We consider that both neuronal activity and the spiking phase may carry information. In this model the state of the network at a given time t is completely determined by a function y(sigma,t) of the bit strings sigma = (sigma_1, sigma_2, ..., sigma_M), where sigma_i = +/-1 for i = 1, 2, ..., M, that gives the intensity with which the information sigma is being expressed by the network. As an example, we consider logistic maps, coupled in the information space, to describe the evolution of the intensity function y(sigma,t). We propose an interpretation of the maps in terms of the physiological state of the neurons and of the coupling between them, obtain Hebb-like learning rules, show that the model works as an associative memory, numerically investigate the capacity of the network and the size of the basins of attraction, and estimate finite-size effects. Finally, we show that the model, when exposed to sequences of uncorrelated stimuli, exhibits recency and latency effects that depend on the noise level, the delay time of measurement, and the stimulus intensity.
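The Hebb-like learning rules mentioned above can be illustrated, for comparison only, by the classical Hebbian outer-product rule over patterns with sigma_i = +/-1, together with the usual overlap order parameter used to quantify retrieval and basin sizes. The specific rule derived for the coupled-map model may differ; the function names here are hypothetical.

```python
import numpy as np

def hebb_weights(patterns):
    """Classical Hebbian outer-product rule (shown for comparison; the
    model's Hebb-like rule is derived in the paper and may differ).

    patterns: array of shape (P, M) with entries +/-1, one row per memory.
    Returns an M x M symmetric weight matrix with zero diagonal.
    """
    P, M = patterns.shape
    W = patterns.T @ patterns / M       # sum over memories of xi_i * xi_j / M
    np.fill_diagonal(W, 0.0)            # no self-coupling
    return W

def overlap(state, pattern):
    """Retrieval order parameter m = (1/M) sum_i state_i * pattern_i."""
    return float(state @ pattern) / len(pattern)

# Usage: a single stored pattern is a fixed point of sign(W @ state)
xi = np.array([1, -1, 1, 1, -1, -1, 1, -1], dtype=float)
W = hebb_weights(xi[None, :])
```

Measuring `overlap` between the network state and each stored pattern, after initializing at increasing Hamming distance from a memory, is one standard way to probe basin-of-attraction sizes numerically.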