Results 1 - 4 of 4
1.
Phys Rev Lett ; 115(10): 107002, 2015 Sep 04.
Article in English | MEDLINE | ID: mdl-26382697

ABSTRACT

We report on microwave emission from linear parallel arrays of underdamped Josephson junctions, which are described by the Frenkel-Kontorova (FK) model. Electromagnetic radiation is detected from the arrays when biased on current singularities (steps) appearing at voltages Vₙ = Φ₀(nc̅/L), where Φ₀ = 2.07×10⁻¹⁵ Wb is the magnetic flux quantum, c̅ is the speed of light in the transmission line embedding the array, L is the array's physical length, and n is an integer. The radiation, detected at the fundamental frequency c̅/2L when biased on different singularities, indicates shuttling of bunched 2π kinks (magnetic flux quanta). Resonance of flux-quantum motion with the small-amplitude oscillations induced in the arrays gives rise to fine structures in the radiation spectrum, which are interpreted on the basis of the FK model describing the resonance. The impact of our results on the design and performance of new digital circuit families is discussed.

2.
IEEE Trans Neural Netw ; 14(5): 1297-307, 2003.
Article in English | MEDLINE | ID: mdl-18244578

ABSTRACT

Electronic neuromorphic devices with on-chip, on-line learning should be able to modify quickly the synaptic couplings to acquire information about new patterns to be stored (synaptic plasticity) and, at the same time, preserve this information on very long time scales (synaptic stability). Here, we illustrate the electronic implementation of a simple solution to this stability-plasticity problem, recently proposed and studied in various contexts. It is based on the observation that reducing the analog depth of the synapses to the extreme (bistable synapses) does not necessarily disrupt the performance of the device as an associative memory, provided that 1) the number of neurons is large enough; 2) the transitions between stable synaptic states are stochastic; and 3) learning is slow. The drastic reduction of the analog depth of the synaptic variable also makes this solution appealing from the point of view of electronic implementation and offers a simple methodological alternative to the technological solution based on floating gates. We describe the full custom analog very large-scale integration (VLSI) realization of a small network of integrate-and-fire neurons connected by bistable deterministic plastic synapses which can implement the idea of stochastic learning. In the absence of stimuli, the memory is preserved indefinitely. During the stimulation the synapse undergoes quick temporary changes through the activities of the pre- and postsynaptic neurons; those changes stochastically result in a long-term modification of the synaptic efficacy. The intentionally disordered pattern of connectivity allows the system to generate a randomness suited to drive the stochastic selection mechanism. We check by a suitable stimulation protocol that the stochastic synaptic plasticity produces the expected pattern of potentiation and depression in the electronic network.
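The core mechanism the abstract describes, bistable synapses whose transitions are stochastic and slow, can be sketched in a few lines. This is an assumption-laden toy model, not the chip's analog circuit; the transition probability `p_trans` is a free parameter standing in for the randomness the network's disordered connectivity generates:

```python
import random

# Toy sketch of a bistable stochastic synapse (0 = depressed, 1 = potentiated).
# Coincident pre/post activity attempts potentiation; pre-only activity attempts
# depression; each attempt succeeds only with small probability p_trans, so
# learning is slow and, absent stimuli, memory is preserved indefinitely.

def update_synapse(state, pre_active, post_active, p_trans=0.01, rng=random):
    if pre_active and post_active and state == 0 and rng.random() < p_trans:
        return 1  # stochastic long-term potentiation
    if pre_active and not post_active and state == 1 and rng.random() < p_trans:
        return 0  # stochastic long-term depression
    return state  # no stimulus (or failed attempt): state is preserved
```

The point of the low `p_trans` is exactly the stability-plasticity trade-off in the abstract: with many neurons, slow stochastic transitions let new patterns be absorbed without erasing old ones.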

3.
Neural Comput ; 12(10): 2227-58, 2000 Oct.
Article in English | MEDLINE | ID: mdl-11032032

ABSTRACT

We present a model for spike-driven dynamics of a plastic synapse, suited for aVLSI implementation. The synaptic device behaves as a capacitor on short timescales and preserves the memory of two stable states (efficacies) on long timescales. The transitions (LTP/LTD) are stochastic because both the number and the distribution of neural spikes in any finite (stimulation) interval fluctuate, even at fixed pre- and postsynaptic spike rates. The dynamics of the single synapse is studied analytically by extending the solution to a classic problem in queuing theory (Takács process). The model of the synapse is implemented in aVLSI and consists of only 18 transistors. It is also directly simulated. The simulations indicate that LTP/LTD probabilities versus rates are robust to fluctuations of the electronic parameters in a wide range of rates. The solutions for these probabilities are in very good agreement with both the simulations and measurements. Moreover, the probabilities are readily manipulable by variations of the chip's parameters, even in ranges where they are very small. The tests of the electronic device cover the range from spontaneous activity (3-4 Hz) to stimulus-driven rates (50 Hz). Low transition probabilities can be maintained in all ranges, even though the intrinsic time constants of the device are short (approximately 100 ms). Synaptic transitions are triggered by elevated presynaptic rates: for low presynaptic rates, there are essentially no transitions. The synaptic device can preserve its memory for years in the absence of stimulation. Stochasticity of learning is a result of the variability of interspike intervals; noise is a feature of the distributed dynamics of the network. The fact that the synapse is binary on long timescales solves the stability problem of synaptic efficacies in the absence of stimulation. Yet stochastic learning theory ensures that it does not affect the collective behavior of the network, if the transition probabilities are low and LTP is balanced against LTD.


Subject(s)
Computer Simulation, Neural Networks (Computer), Neuronal Plasticity/physiology, Action Potentials/physiology, Artificial Intelligence, Computers, Electric Conductivity, Long-Term Potentiation/physiology, Models, Neurological, Neural Inhibition/physiology, Probability, Stochastic Processes, Synapses/physiology
4.
Network ; 9(2): 183-205, 1998 May.
Article in English | MEDLINE | ID: mdl-9861985

ABSTRACT

LANN27 is an electronic device implementing in discrete electronics a fully connected (full feedback) network of 27 neurons and 351 plastic synapses with stochastic Hebbian learning. Both neurons and synapses are dynamic elements, with two time constants: fast for neurons and slow for synapses. Learning, i.e. the synaptic dynamics, is analogue and is driven in a Hebbian way by neural activities. Long-term memorization takes place on a discrete set of synaptic efficacies and is effected in a stochastic manner. The intense feedback between the nonlinear neural elements, via the learned synaptic structure, creates in an organic way a set of attractors for the collective retrieval dynamics of the neural system, akin to Hebbian learned reverberations. The resulting structure of the attractors is a record of the large-scale statistics in the uncontrolled, incoming flow of stimuli. As the statistics in the stimulus flow changes significantly, the attractors slowly follow it and the network behaves as a palimpsest: old is gradually replaced by new. Moreover, the slow learning creates attractors which render the network a prototype extractor: entire clouds of stimuli, noisy versions of a prototype, used in training, all retrieve the attractor corresponding to the prototype upon retrieval. Here we describe the process of studying the collective dynamics of the network, before, during and following learning, which is rendered complex by the richness of the possible stimulus streams and the large dimensionality of the space of states of the network. We propose sampling techniques and modes of representation for the outcome.
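The prototype-extraction behavior described above can be illustrated with a software toy: a 27-neuron attractor network whose discrete synapses are stochastically driven toward the Hebbian sign of each noisy stimulus, so that retrieval from a corrupted cue converges on the prototype. This is an assumption-laden illustration of the principle, not a model of LANN27's circuitry; all parameters are invented:

```python
import numpy as np

# Toy attractor network: 27 neurons, discrete synapses (initialised at 0,
# driven toward +-1), slow stochastic Hebbian learning from noisy versions
# of a prototype, then deterministic recurrent retrieval from a noisy cue.

rng = np.random.default_rng(0)
N, P_TRANS = 27, 0.2  # network size; slow stochastic transition probability

def learn(W, pattern, p=P_TRANS):
    """Stochastically align each synapse with the Hebbian sign of `pattern`."""
    target = np.outer(pattern, pattern)
    flip = (W != target) & (rng.random((N, N)) < p)
    W[flip] = target[flip]
    np.fill_diagonal(W, 0)  # no self-coupling
    return W

def retrieve(W, state, steps=20):
    """Recurrent retrieval dynamics: iterate the sign of the local field."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

prototype = rng.choice([-1, 1], size=N)
W = np.zeros((N, N), dtype=int)
for _ in range(30):  # a "cloud" of noisy versions of the prototype
    noisy = prototype * rng.choice([1, -1], size=N, p=[0.9, 0.1])
    W = learn(W, noisy)

cue = prototype * rng.choice([1, -1], size=N, p=[0.85, 0.15])
overlap = (retrieve(W, cue) == prototype).mean()
print(f"overlap of retrieved state with prototype: {overlap:.2f}")
```

Because each presentation only nudges a random subset of synapses, the learned couplings record the statistics of the stimulus cloud rather than any single noisy exemplar, which is the palimpsest/prototype-extractor behavior the abstract describes.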


Subject(s)
Learning/physiology, Neural Networks (Computer), Stochastic Processes, Animals, Artifacts, Humans, Neurons/physiology, Synapses/physiology, Time Factors