Results 1 - 2 of 2
1.
Front Artif Intell ; 7: 1397915, 2024.
Article in English | MEDLINE | ID: mdl-39081931

ABSTRACT

Introduction: The deep echo state network (Deep-ESN) architecture, which stacks multiple reservoir layers, outperforms conventional echo state networks (ESNs) because each layer responds on a different time scale. Although researchers have tuned the hyperparameters with trial-and-error grid searches and Bayesian optimization, no guideline grounded in dynamical characteristics has been established for setting the hyperparameters that control the time scale of each layer's dynamics. We therefore hypothesized that evaluating how the multi-time-scale dynamical response depends on the leaking rate, a typical per-neuron time-scale hyperparameter, would yield such a guideline for optimizing Deep-ESN hyperparameters.

Method: First, we assigned several leaking rates to each layer of the Deep-ESN and performed multi-scale entropy (MSCE) analysis to assess how the leaking rate affects the dynamics in each layer. Second, we performed cross-correlation analysis between adjacent layers to elucidate the structural mechanisms that enhance performance.

Results: An optimal, task-specific leaking rate that produces layer-specific multi-time-scale responses, together with a queue-like structure whose layer-to-layer transmission delays retain past inputs, enhances the Deep-ESN's prediction performance.

Discussion: These findings can help establish design guidelines for setting the hyperparameters of Deep-ESNs.
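The mechanism underlying the analysis above is the leaky-integrator reservoir: each layer's state mixes its previous state with a nonlinear response to the layer below, and the leaking rate sets how quickly that mixture evolves. The following Python sketch is illustrative only, assuming tanh leaky-integrator reservoirs driven layer by layer; the class name DeepESN, the reservoir size, and the example leaking rates are assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """One reservoir layer: random input weights plus a recurrent matrix
    rescaled to the requested spectral radius."""
    W_in = rng.uniform(-input_scale, input_scale, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

class DeepESN:
    """Stack of leaky-integrator reservoirs. Layer l is driven by the state
    of layer l-1, and each layer has its own leaking rate, which sets its
    characteristic time scale (a small rate gives slow dynamics)."""

    def __init__(self, n_in, n_res, leaking_rates):
        self.alphas = list(leaking_rates)
        self.layers, size_in = [], n_in
        for _ in self.alphas:
            self.layers.append(make_reservoir(size_in, n_res))
            size_in = n_res                   # next layer reads this layer's state
        self.states = [np.zeros(n_res) for _ in self.alphas]

    def step(self, u):
        """Advance one time step and return the concatenated layer states."""
        drive = u
        for l, (W_in, W) in enumerate(self.layers):
            a = self.alphas[l]
            pre = np.tanh(W_in @ drive + W @ self.states[l])
            # leaky-integrator update: x <- (1 - a) * x + a * tanh(W_in u + W x)
            self.states[l] = (1.0 - a) * self.states[l] + a * pre
            drive = self.states[l]
        return np.concatenate(self.states)

# Three layers with progressively slower time scales (illustrative values).
esn = DeepESN(n_in=1, n_res=100, leaking_rates=[0.9, 0.5, 0.1])
signal = np.sin(0.2 * np.arange(500))[:, None]
states = np.stack([esn.step(u) for u in signal])   # shape (500, 300)
```

States collected this way would normally feed a linear (e.g., ridge-regression) readout; the point here is only that a small leaking rate in a deeper layer makes its state drift more slowly than the layers above it, which is the layer-specific multi-time-scale behavior the abstract analyzes.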

2.
Int J Neural Syst ; 18(2): 135-45, 2008 Apr.
Article in English | MEDLINE | ID: mdl-18452247

ABSTRACT

Associative memory networks based on the quaternionic Hopfield neural network are investigated in this paper. These networks are composed of quaternionic neurons, and the inputs, outputs, thresholds, and connection weights are all represented as quaternions, a class of hypercomplex numbers. The energy function of the network and the Hebbian rule for embedding patterns are introduced. The stable states and their basins are explored for networks with three and four neurons. It is shown that at most 16 stable states, called multiplet components, exist as degenerate stored patterns, and that each of these states has its own basin of attraction in the quaternionic networks.
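As a concrete illustration of the ingredients named above (quaternionic neurons, a Hebbian rule built from quaternion conjugates, and recall toward stored patterns), here is a small Python sketch. It is an assumption-laden toy, not the paper's formulation: quaternions are stored as 4-component arrays, and the activation quantizes each component of the local field to +/-1, which is one common choice rather than necessarily the one used in the paper.

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions stored as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qconj(q):
    """Quaternion conjugate: negate the imaginary parts."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def hebbian(patterns):
    """Quaternionic Hebbian rule: W[i, j] = sum_p x_i^p * conj(x_j^p), W[i, i] = 0."""
    n = patterns.shape[1]
    W = np.zeros((n, n, 4))
    for pat in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i, j] += qmul(pat[i], qconj(pat[j]))
    return W

def recall(W, state, sweeps=5):
    """Sequential updates: each neuron becomes the component-wise sign of its
    quaternionic local field (an illustrative quantization, not necessarily
    the activation used in the paper)."""
    state = state.copy()
    n = state.shape[0]
    for _ in range(sweeps):
        for i in range(n):
            field = sum(qmul(W[i, j], state[j]) for j in range(n) if j != i)
            state[i] = np.where(field >= 0, 1.0, -1.0)
    return state

# Toy usage: store one 3-neuron pattern with +/-1 components, corrupt one
# neuron, and check that recall restores the stored pattern.
stored = np.array([[[1, 1, -1, 1], [-1, 1, 1, -1], [1, -1, 1, 1]]], dtype=float)
W = hebbian(stored)
probe = stored[0].copy()
probe[0] = -probe[0]            # flip every component of the first neuron
print(recall(W, probe))         # should print the stored pattern
```

In this toy run the corrupted neuron is pulled back to the stored pattern because, with a single stored pattern, each local field reduces to the stored quaternion scaled by a positive real number; the paper's analysis of multiplet components and their basins concerns the richer structure that appears with three- and four-neuron quaternionic networks.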


Subjects
Association Learning/physiology, Memory/physiology, Neural Networks, Computer, Computer Simulation, Humans, Models, Neurological, Nerve Net/physiology, Space Perception