1.
Curr Issues Mol Biol ; 44(2): 817-832, 2022 Feb 07.
Article in English | MEDLINE | ID: mdl-35723341

ABSTRACT

Large-scale artificial neural networks contain many redundant structures, which can trap training in local optima and extend training time. Moreover, existing neural network topology optimization algorithms suffer from heavy computation and complex network structure modeling. We propose a Dynamic Node-based neural network Structure optimization algorithm (DNS) to address these issues. DNS consists of two steps: a generation step and a pruning step. In the generation step, the network adds hidden layers one by one until accuracy reaches a threshold. In the pruning step, the network applies a pruning algorithm based on Hebb's rule or Pearson's correlation for adaptation. In addition, we combine a genetic algorithm with DNS (GA-DNS). Experimental results show that, compared with traditional neural network topology optimization algorithms, GA-DNS generates neural networks with higher construction efficiency, lower structural complexity, and higher classification accuracy.
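A correlation-based pruning step of the kind the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the activation matrix, the greedy keep-first policy, and the `threshold` value are all assumptions.

```python
import numpy as np

def prune_correlated_units(activations, threshold=0.95):
    """Pearson-correlation pruning sketch.

    activations: (samples x hidden_units) matrix of recorded activations.
    A unit is marked redundant (pruned) if its activations correlate
    strongly with those of an already-kept unit. The greedy policy and
    the 0.95 threshold are illustrative assumptions, not the paper's.
    Returns the indices of units to keep.
    """
    corr = np.corrcoef(activations, rowvar=False)  # units x units
    keep = []
    for j in range(corr.shape[0]):
        # keep unit j only if it is not strongly correlated with any kept unit
        if all(abs(corr[j, k]) < threshold for k in keep):
            keep.append(j)
    return keep
```

A unit that exactly duplicates an earlier one is always pruned, while weakly correlated units survive.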

2.
Inorg Chem ; 59(7): 4349-4356, 2020 Apr 06.
Article in English | MEDLINE | ID: mdl-32208647

ABSTRACT

Low-dimensional lead-free organic-inorganic hybrid perovskites have gained increasing attention for their low toxicity, ease of processing, and good optoelectronic properties. The search for lead-free, narrow-band-gap organic-inorganic hybrid perovskites is of great importance for the development and application of photoelectric materials. Here, we report a Sb-based organic-inorganic hybrid perovskite, (MV)[SbI3Cl2], which has a one-dimensional inorganic framework of I-sharing double octahedra. (MV)[SbI3Cl2] shows a narrow direct band gap of 1.5 eV and displays a clear photoresponse to 532 nm light, with rapid response speeds of trise = 0.69 s and tdecay = 0.28 s. Under 532 nm laser illumination at 5 mW power and a 50 V bias, the responsivity (R) and external quantum efficiency (EQE) of the (MV)[SbI3Cl2] photodetector are 29.75 mA/W and 6.69%, respectively. This Sb-based halide double perovskite provides an alternative material for photodetector devices.
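The two figures of merit quoted above are related by standard textbook definitions, which can be checked numerically. The formulas below are the generic definitions, not taken from the paper; plugging in the quoted R = 29.75 mA/W at 532 nm gives roughly 6.9%, consistent with the reported 6.69% allowing for rounding of the quoted responsivity.

```python
# Standard photodetector figures of merit (generic definitions).
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
E = 1.602176634e-19  # elementary charge, C

def responsivity(photocurrent_a, power_w):
    """R = I_ph / P, in A/W."""
    return photocurrent_a / power_w

def eqe_percent(r_a_per_w, wavelength_m):
    """External quantum efficiency in percent: EQE = R * h * c / (e * lambda)."""
    return 100.0 * r_a_per_w * H * C / (E * wavelength_m)

eqe = eqe_percent(29.75e-3, 532e-9)  # roughly 6.9 %, near the reported 6.69 %
```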

3.
Brain Sci ; 12(2)2022 Jan 21.
Article in English | MEDLINE | ID: mdl-35203904

ABSTRACT

Small-sample learning is one of the most significant capabilities of the human brain, yet its mechanism has not been fully unveiled. In recent years, brain-inspired artificial intelligence has become a very active research domain, with researchers exploring brain-inspired technologies and architectures to construct neural networks that achieve human-like intelligence. In this work, we present our evaluation of the effect of dynamic-behavior and topology co-learning of neurons and synapses on the small-sample learning ability of a spiking neural network. Results show that this co-learning mechanism significantly reduces the number of required training samples while maintaining reasonable performance on the MNIST dataset, yielding a very lightweight neural network structure.
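The abstract does not specify the co-learning mechanism, but the general idea of updating synaptic weights and network topology in the same loop can be sketched as below. Everything here is an illustrative assumption: the Hebbian weight rule, the prune/grow conditions, and all constants.

```python
import numpy as np

def co_learn_step(weights, mask, pre_spikes, post_spikes,
                  lr=0.01, grow_p=0.001, prune_eps=1e-3, rng=None):
    """One illustrative co-learning step (not the paper's mechanism).

    weights: (pre x post) float array; mask: same-shape 0/1 connectivity.
    Behavioral learning: Hebbian co-activity update on existing synapses.
    Topology learning: prune synapses whose weight is near zero, and
    occasionally grow new candidate synapses at a tiny initial weight.
    """
    rng = rng or np.random.default_rng(0)
    # synaptic update: strengthen co-active pre/post pairs on live synapses
    weights += lr * np.outer(pre_spikes, post_spikes) * mask
    # topology update: prune synapses that have decayed to ~0
    mask[np.abs(weights) < prune_eps] = 0
    # topology update: grow rare new synapses where none exist
    grow = (rng.random(mask.shape) < grow_p) & (mask == 0)
    weights[grow] = prune_eps
    mask[grow] = 1
    return weights, mask
```

Running such a step repeatedly lets connectivity and weights adapt together, which is the "co-learning" notion the abstract refers to.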

4.
Brain Sci ; 11(2)2021 Jan 25.
Article in English | MEDLINE | ID: mdl-33503833

ABSTRACT

In neuroscience, the Default Mode Network (DMN), also known as the default network or the default-state network, is a large-scale brain network whose highly correlated activity is distinct from that of other brain networks. Many studies have shown that the DMN can influence other cognitive functions to some extent. Motivated by this idea, this paper experimentally explores how a DMN could help Spiking Neural Networks (SNNs) on image classification problems. The approach emphasizes the bionic meaning of model selection and parameter settings. For modeling, we select Leaky Integrate-and-Fire (LIF) as the neuron model, use Additive White Gaussian Noise (AWGN) as the input DMN, and design the learning algorithm based on Spike-Timing-Dependent Plasticity (STDP). We then experiment on a two-layer SNN to evaluate the influence of the DMN on classification accuracy, and on a three-layer SNN to examine its influence on structure evolution; both results are positive. Finally, we discuss possible directions for future work.
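Two of the named ingredients have standard textbook forms that can be sketched directly; the code below is a generic pair-based STDP rule plus an AWGN perturbation of an input, with all constants chosen for illustration rather than taken from the paper.

```python
import numpy as np

def stdp_update(w, dt_spike, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP (standard form; constants are illustrative).

    dt_spike = t_post - t_pre in ms. Pre-before-post (dt > 0) potentiates
    the synapse; post-before-pre depresses it. Weight is clipped to bounds.
    """
    if dt_spike > 0:
        w += a_plus * np.exp(-dt_spike / tau_plus)
    else:
        w -= a_minus * np.exp(dt_spike / tau_minus)
    return float(np.clip(w, w_min, w_max))

def add_awgn(signal, sigma=0.1, rng=None):
    """Perturb an input with Additive White Gaussian Noise, the role the
    paper assigns to the DMN-like input."""
    rng = rng or np.random.default_rng(0)
    return signal + sigma * rng.standard_normal(np.shape(signal))
```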

5.
Front Neurosci ; 13: 650, 2019.
Article in English | MEDLINE | ID: mdl-31333397

ABSTRACT

The development of computer science has led to the blooming of artificial intelligence (AI), and neural networks are the core of AI research. Although mainstream neural networks perform well in image processing and speech recognition, they do not perform well in models aimed at understanding contextual information. In our opinion, the reason is that building a neural network through parameter training essentially fits the data to a statistical law; a network built this way does not possess memory ability and therefore cannot reflect causal relationships between data. Biological memory is fundamentally different from current mainstream digital memory in its storage method: information in digital memory is converted to binary code and written to separate storage units, and this physical isolation destroys the correlations within the information. Therefore, digital memory lacks the recall and association functions of biological memory, which can represent causality. In this paper, we present the results of our preliminary effort to construct an associative memory system based on a spiking neural network. We divide the network-building process into two phases: the Structure Formation Phase and the Parameter Training Phase. The Structure Formation Phase applies a learning method based on Hebb's rule to prompt neurons in the memory layer to grow new synapses connecting to neighboring neurons in response to the specific input spiking sequences fed to the network; the aim of this phase is to train the network to memorize those sequences. During the Parameter Training Phase, STDP and reinforcement learning are employed to optimize synaptic weights so that the network can recall the memorized input spiking sequences. The results show that our memory neural network could memorize different targets and recall the images it had memorized.
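A Hebb-style Structure Formation Phase of the kind described can be sketched as a co-firing rule over recorded spike trains. The co-activity threshold and the exact growth condition below are assumptions for illustration; the abstract only states that synapse growth follows Hebb's rule.

```python
import numpy as np

def structure_formation(spike_trains, coactivity_th=3):
    """Structure Formation Phase sketch (illustrative, not the paper's rule).

    spike_trains: (neurons x timesteps) binary array of memory-layer spikes
    recorded while a specific input spiking sequence is presented.
    A synapse is grown between two neurons when they co-fire in at least
    `coactivity_th` time steps, a Hebb-style "fire together, wire together"
    condition. Returns a binary adjacency matrix of grown synapses.
    """
    coactive = spike_trains @ spike_trains.T  # pairwise co-firing counts
    adj = (coactive >= coactivity_th).astype(int)
    np.fill_diagonal(adj, 0)  # no self-synapses
    return adj
```

The Parameter Training Phase would then tune the weights of exactly these grown synapses, e.g. with an STDP-style rule, so that replaying part of a memorized sequence recalls the rest.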
