Results 1 - 8 of 8
1.
Nat Commun ; 12(1): 2325, 2021 04 23.
Article in English | MEDLINE | ID: mdl-33893296

ABSTRACT

Nonlinear dynamics of spiking neural networks have recently attracted much interest as an approach to understanding possible information processing in the brain and applying it to artificial intelligence. Since information can be processed by the collective spiking dynamics of neurons, fine control of spiking dynamics is desirable for neuromorphic devices. Here we show that photonic spiking neurons implemented with paired nonlinear optical oscillators can be controlled to generate two modes of bio-realistic spiking dynamics by changing the optical-pump amplitude. When the photonic neurons are coupled in a network, the interaction between them induces an effective change in the pump amplitude that depends on the order parameter characterizing synchronization. The experimental results show that this effective change causes spontaneous modification of the spiking modes and firing rates of clustered neurons, and that such collective dynamics can be utilized to realize efficient heuristics for solving NP-hard combinatorial optimization problems.


Subject(s)
Action Potentials/physiology; Algorithms; Models, Neurological; Neural Networks, Computer; Neurons/physiology; Animals; Computer Simulation; Humans; Nonlinear Dynamics; Photons
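For orientation, the synchronization order parameter referred to in this abstract is, in its standard Kuramoto form, the magnitude of the population-averaged phase vector. The short Python sketch below computes it for a set of made-up oscillator phases; it is a generic illustration of the quantity, not the paper's experimental analysis.

import numpy as np

rng = np.random.default_rng(0)
# Illustrative phases: a partially synchronized cluster plus incoherent background oscillators
phases = np.concatenate([rng.normal(0.0, 0.3, 60), rng.uniform(0.0, 2.0 * np.pi, 40)])

# Kuramoto order parameter: r = |<exp(i*theta)>|, close to 1 when synchronized, close to 0 when incoherent
r = np.abs(np.mean(np.exp(1j * phases)))
print(f"order parameter r = {r:.2f}")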
2.
Sci Rep ; 10(1): 21794, 2020 Dec 11.
Article in English | MEDLINE | ID: mdl-33311595

ABSTRACT

Reservoir computing (RC) is a machine learning algorithm that can learn complex time series from data very rapidly, based on the use of high-dimensional dynamical systems, such as random networks of neurons, called "reservoirs." To implement RC in edge computing, it is highly important to reduce the amount of computational resources that RC requires. In this study, we propose methods that reduce the size of the reservoir by feeding past or drifting states of the reservoir into the output layer at the current time step. To elucidate the mechanism of model-size reduction, the proposed methods are analyzed using the information processing capacity proposed by Dambre et al. (Sci Rep 2:514, 2012). In addition, we evaluate the effectiveness of the proposed methods on time-series prediction tasks: the generalized Hénon map and NARMA. On these tasks, we found that the proposed methods can reduce the size of the reservoir to as little as one tenth without a substantial increase in regression error.

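A minimal sketch of the model-size-reduction idea described above, assuming a standard echo state network: the readout is trained on the concatenation of the current reservoir state and the state D steps in the past, so the feature dimension doubles without enlarging the reservoir. The reservoir size, the delay D, and the toy target below are illustrative assumptions, not the paper's settings or benchmark tasks.

import numpy as np

rng = np.random.default_rng(0)
N, D, washout, T = 50, 5, 100, 2000          # reservoir size, readout delay, discarded steps, series length

u = rng.uniform(0.0, 0.5, T)                 # scalar input series (noise, as a stand-in for a real task)
y = np.convolve(u, np.ones(10) / 10, mode="same") ** 2   # toy nonlinear target

Win = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))          # keep the spectral radius below 1

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + Win * u[t])                      # standard ESN state update
    states[t] = x

# Readout sees the current state AND the state D steps in the past
X = np.hstack([states[washout:], states[washout - D:T - D]])
Y = y[washout:]
Wout = np.linalg.lstsq(X, Y, rcond=None)[0]              # plain least squares for brevity
print("training NMSE:", np.mean((X @ Wout - Y) ** 2) / np.var(Y))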
3.
Phys Rev Lett ; 122(4): 040607, 2019 Feb 01.
Article in English | MEDLINE | ID: mdl-30768355

ABSTRACT

The relaxation of binary spins to analog values has been the subject of much debate in the fields of statistical physics, neural networks, and, more recently, quantum computing, notably because the benefits of using an analog state for finding lower-energy spin configurations are usually offset by the negative impact of the improper mapping of the energy function that results from the relaxation. We show that it is possible to destabilize trapping sets of analog states that correspond to local minima of the binary spin Hamiltonian by extending the phase space to include error signals that correct amplitude inhomogeneity of the analog spin states and by controlling the divergence of their velocity. The performance of the proposed analog spin system in finding lower-energy states is competitive with state-of-the-art heuristics.

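A hedged numerical sketch in the spirit of the dynamics described above: analog spins follow pitchfork-type dynamics driven by the Ising couplings, while per-spin error signals push every squared amplitude toward a common target, suppressing the amplitude inhomogeneity that stabilizes local minima. The coupling matrix, the coefficients p, a, and beta, the time step, and the clipping bounds are illustrative choices, not the paper's.

import numpy as np

rng = np.random.default_rng(1)
n = 20
J = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
J = J + J.T                                        # symmetric random +/-1 couplings, zero diagonal

x = 1e-3 * rng.standard_normal(n)                  # analog spin amplitudes
e = np.ones(n)                                     # auxiliary error signals, one per spin
p, a, beta, dt = -0.5, 1.0, 0.2, 0.01

def ising_energy(s):
    return -0.5 * s @ J @ s

best = np.inf
for _ in range(50_000):
    x += dt * ((p - 1.0) * x - x**3 + e * (J @ x))   # pitchfork-type spin dynamics
    e += dt * (-beta * e * (x**2 - a))               # drive every x_i**2 toward the common target a
    x = np.clip(x, -2.0, 2.0)                        # crude numerical safeguards for this sketch
    e = np.clip(e, 0.0, 10.0)
    best = min(best, ising_energy(np.sign(x)))
print("best Ising energy found:", best)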
4.
Phys Rev E ; 95(2-1): 022118, 2017 Feb.
Article in English | MEDLINE | ID: mdl-28297856

ABSTRACT

The dynamics of driven-dissipative systems is shown to be well suited to achieving efficient combinatorial optimization. The proposed method can be applied to solve any combinatorial optimization problem that is equivalent to minimizing an Ising Hamiltonian. Moreover, the dynamics considered can be implemented using various physical systems, as it is based on generic dynamics: the normal form of the supercritical pitchfork bifurcation. The computational principle of the proposed method relies on a hybrid analog-digital representation of the binary Ising spins, obtained by considering gradient descent of a Lyapunov function that is the sum of an analog Ising Hamiltonian and archetypal single- or double-well potentials. By gradually changing the shape of the latter potentials from a single- to a double-well shape, it can be shown that the first nonzero steady states to become stable are associated with global minima of the Ising Hamiltonian, under the approximation that all analog spins have the same amplitude. In the more general case, the heterogeneity in amplitude between analog spins induces the stabilization of local minima, which reduces the quality of solutions to combinatorial optimization problems. However, we show that this heterogeneity can be reduced by setting the parameters of the driving signal near a regime, called the dynamic phase transition, where the analog spins' DC components map the global minima of the Ising Hamiltonian more accurately, which in turn increases the quality of the solutions found. Finally, we discuss the possibility of a physical implementation of the proposed method using networks of degenerate optical parametric oscillators.

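One hedged way to write down the kind of Lyapunov function sketched above (illustrative notation, not necessarily the paper's exact form) is

V(x) = sum_i ( x_i^4 / 4 - (p/2) x_i^2 ) - (eps/2) sum_{i!=j} J_ij x_i x_j,

whose gradient descent gives pitchfork normal forms coupled through the Ising terms:

dx_i/dt = -dV/dx_i = p x_i - x_i^3 + eps sum_j J_ij x_j.

For p < 0 each single-site potential has a single well at x_i = 0; as p is increased through zero (the supercritical pitchfork bifurcation) it becomes a double well with minima near +/- sqrt(p), and if all |x_i| take the same value, the sign configuration that minimizes V also minimizes the Ising Hamiltonian -sum_{i<j} J_ij s_i s_j.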
5.
Neural Comput ; 29(5): 1263-1292, 2017 05.
Article in English | MEDLINE | ID: mdl-28333586

ABSTRACT

Recent experiments have shown that stereotypical spatiotemporal patterns occur during brief packets of spiking activity in the cortex, and it has been suggested that top-down inputs can modulate these patterns according to the context. We propose a simple, analytically tractable model that may explain important features of these experimental observations. The key mechanism underlying this model is that context-dependent top-down inputs can modulate the effective connection strengths between neurons because of short-term synaptic depression. As a result, the degree of synchrony and, in turn, the spatiotemporal patterns of spiking activity that occur during packets are modulated by the top-down inputs. This is shown using numerical simulations and an analytical framework, based on avalanche dynamics, that allows calculating the probability that a given neuron spikes during a packet. Finally, we show that spatiotemporal patterns that replay previously experienced sequential stimuli, and their binding with the corresponding context, can be learned through spike-timing-dependent plasticity.

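A minimal numerical illustration of the key mechanism named above, using the textbook Tsodyks-Markram form of short-term depression rather than the paper's specific model: under sustained presynaptic firing, the steady-state fraction of available synaptic resources decreases with the firing rate, so a context-dependent change in top-down drive effectively changes the connection strength. The parameter values are illustrative.

w, u, tau_rec = 1.0, 0.4, 0.5        # absolute efficacy, release fraction, recovery time constant (s)

def effective_weight(rate_hz):
    """Steady-state effective weight w*u*x under presynaptic firing at rate_hz.

    With depression dx/dt = (1 - x)/tau_rec - u*x*r, the steady state is
    x_ss = 1 / (1 + u * r * tau_rec).
    """
    x_ss = 1.0 / (1.0 + u * rate_hz * tau_rec)
    return w * u * x_ss

for r in (1.0, 5.0, 20.0):           # e.g. different levels of context-dependent drive
    print(f"presynaptic rate {r:4.1f} Hz -> effective weight {effective_weight(r):.3f}")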
6.
Article in English | MEDLINE | ID: mdl-25768549

ABSTRACT

A robust method for inferring the structure of networks is presented, based on the one-to-one correspondence between the expected composition of cascades of bursts of activity, called crackling noise or avalanches, and the weight matrix. Using a model of neuronal avalanches as a paradigmatic example, we derive this correspondence exactly by calculating the closed-form expression of the joint probability distribution of avalanche sizes, obtained by counting separately the number of elements active in each subnetwork during avalanches.

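As a toy forward model of the quantity being counted (not the paper's derivation or its inference procedure), the sketch below runs avalanches on a small random weight matrix whose nodes are split into two subnetworks, and records for each avalanche how many distinct elements became active in each subnetwork; the joint histogram of these per-subnetwork sizes is the kind of statistic the correspondence above is built on. The network size, weights, and activation rule are illustrative assumptions.

import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
n = 20
labels = np.array([0] * 10 + [1] * 10)            # two subnetworks of 10 nodes each
W = rng.uniform(0.0, 0.08, (n, n))                # weights read as activation probabilities (subcritical on average)
np.fill_diagonal(W, 0.0)

def run_avalanche():
    """Seed one avalanche at a random node; return (size in subnet 0, size in subnet 1)."""
    seed = int(rng.integers(n))
    activated = {seed}
    frontier = [seed]
    while frontier:
        nxt = []
        for i in frontier:
            for j in np.flatnonzero(rng.random(n) < W[i]):   # node i activates node j with probability W[i, j]
                if j not in activated:
                    activated.add(j)
                    nxt.append(j)
        frontier = nxt
    sizes = np.bincount(labels[list(activated)], minlength=2)
    return (int(sizes[0]), int(sizes[1]))

joint = Counter(run_avalanche() for _ in range(5000))
print("most common (subnet 0 size, subnet 1 size):", joint.most_common(5))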
7.
Neural Comput ; 25(12): 3131-82, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24001341

ABSTRACT

We study a realistic model of a cortical column that takes into account short-term plasticity between pyramidal cells and interneurons. Simulation of leaky integrate-and-fire neurons shows that low-frequency oscillations emerge spontaneously as a result of intrinsic network properties. These oscillations are composed of prolonged phases of high and low activity reminiscent of cortical up and down states, respectively. We simplify the description of the network activity by using a mean-field approximation and reduce the system to two slow variables exhibiting relaxation oscillations. We identify two types of slow oscillations. When the combination of dynamic synapses between pyramidal cells and those between interneurons accounts for the generation of these slow oscillations, the end of the up phase is characterized by asynchronous fluctuations of the membrane potentials. When the slow oscillations are mainly driven by the dynamic synapses between interneurons, the network exhibits fluctuations of membrane potentials that are more synchronous at the end than at the beginning of the up phase. Additionally, finite-size effects and slow synaptic currents can modify the irregularity and frequency, respectively, of these oscillations. Finally, we consider possible roles of a slow oscillatory input modeling long-range interactions in the brain. Spontaneous slow oscillations of local networks are modulated by the oscillatory input, which notably induces synchronization, subharmonic synchronization, and chaotic relaxation oscillations in the mean-field approximation. In the case of forced oscillations, the slow population-averaged activity of leaky integrate-and-fire neurons can have both deterministic and stochastic temporal features. We discuss the possibility that long-range connectivity controls the emergence of slow sequential patterns in local populations due to the tendency of a cortical column to oscillate at low frequency.


Subject(s)
Algorithms; Cerebral Cortex/physiology; Neural Networks, Computer; Neuronal Plasticity/physiology; Neurons/physiology; Animals; Humans
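For orientation, a generic form of the kind of two-slow-variable reduction described above (a Wilson-Cowan-type rate equation with Tsodyks-Markram-style synaptic depression, not the paper's exact reduced system) is

tau_E dE/dt = -E + f( J u x E + I ),
dx/dt = (1 - x)/tau_rec - u x E,

where E is the population rate, x the fraction of available synaptic resources, and f a sigmoidal gain function. The resource x slowly depletes during the high-activity (up) phase and recovers during the low-activity (down) phase, which is what gives the reduced system its relaxation-oscillation structure.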
8.
Cogn Neurodyn ; 6(6): 499-524, 2012 Dec.
Article in English | MEDLINE | ID: mdl-24294335

ABSTRACT

In this article, we analyze the combined effects of LTP/LTD and synaptic scaling and study the creation of persistent activity from a periodic or chaotic baseline attractor. The bifurcations leading to the creation of new attractors are detailed using a mean-field approximation. Attractors encoding persistent activity can notably appear via generalized period-doubling bifurcations, tangent bifurcations of the second iterates, or boundary crises, after which the basins of attraction become irregular. Synaptic scaling is shown to maintain the coexistence of a state of persistent activity and the baseline. Depending on the rate of change of the external inputs, different types of attractors can form: line attractors for rapidly changing external inputs and discrete attractors for constant external inputs.

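For orientation only, the sketch below combines textbook forms of the two ingredients named in this abstract: a covariance-style Hebbian LTP/LTD step and a multiplicative synaptic-scaling step that pulls each neuron toward a homeostatic target rate. The network, rates, and learning constants are illustrative, and this is not the paper's mean-field model.

import numpy as np

rng = np.random.default_rng(4)
n = 10
W = rng.uniform(0.0, 0.1, (n, n))
np.fill_diagonal(W, 0.0)
rates = rng.uniform(0.0, 5.0, n)         # illustrative firing rates (Hz)
target, eta, gamma = 2.0, 0.01, 0.05     # homeostatic set point, Hebbian rate, scaling rate

# LTP/LTD: potentiate synapses between neurons firing above the mean rate, depress otherwise
dev = rates - rates.mean()
W += eta * np.outer(dev, dev)
np.fill_diagonal(W, 0.0)
W = np.clip(W, 0.0, None)

# Synaptic scaling: each neuron multiplicatively rescales its incoming weights toward its target rate
W *= (1.0 + gamma * (target - rates) / target)[:, None]

print("total incoming weight per neuron:", W.sum(axis=1).round(3))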