1.
IEEE Trans Cybern ; 51(3): 1613-1625, 2021 Mar.
Article in English | MEDLINE | ID: mdl-31217137

ABSTRACT

As efficient recurrent neural network (RNN) models, echo state networks (ESNs) have attracted widespread attention and been applied in many domains over the last decade. Although they have achieved great success in modeling time series, a single ESN may have difficulty capturing the multitimescale structures that naturally exist in temporal data. In this paper, we propose the convolutional multitimescale ESN (ConvMESN), a novel, training-efficient model for capturing the multitimescale structures and multiscale temporal dependencies of temporal data. In particular, a multitimescale memory encoder is constructed with a multireservoir structure, in which different reservoirs have recurrent connections with different skip lengths (or time spans). By collecting all past echo states in each reservoir, this multireservoir structure encodes the history of a time series as nonlinear multitimescale echo state representations (MESRs). Our visualization analysis verifies that the MESRs provide better discriminative features for time series. Finally, multiscale temporal dependencies among the MESRs are learned by a convolutional layer. By leveraging multitimescale reservoirs followed by a convolutional learner, the ConvMESN offers not only efficient memory encoding for temporal data with multitimescale structures but also strong learning ability for complex temporal dependencies. Furthermore, the training-free reservoirs and the single convolutional layer give the ConvMESN high computational efficiency when modeling complex temporal data. Extensive experiments on 18 multivariate time series (MTS) benchmark datasets and 3 skeleton-based action recognition datasets demonstrate that the ConvMESN captures multitimescale dynamics and outperforms existing methods.
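The skip-length mechanism described in the abstract can be illustrated with a short sketch: each reservoir feeds its state back from step t - k rather than t - 1, so reservoirs with larger k evolve on coarser timescales, and stacking their echo states side by side yields the MESR. The NumPy code below is a minimal illustration under stated assumptions, not the authors' implementation; the tanh activation, spectral-radius rescaling, reservoir size (50), and skip lengths (1, 2, 4) are all assumed for the example.

import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9):
    # Random input and recurrent weights; the recurrent matrix is
    # rescaled to a target spectral radius (standard ESN practice,
    # assumed here rather than taken from the paper).
    W_in = rng.uniform(-0.1, 0.1, size=(n_res, n_in))
    W = rng.uniform(-1.0, 1.0, size=(n_res, n_res))
    W *= spectral_radius / np.abs(np.linalg.eigvals(W)).max()
    return W_in, W

def run_reservoir(u, W_in, W, skip):
    # Skip-length update: the state at step t feeds back from step
    # t - skip instead of t - 1, giving a coarser timescale.
    T = u.shape[0]
    n_res = W.shape[0]
    X = np.zeros((T, n_res))
    for t in range(T):
        prev = X[t - skip] if t >= skip else np.zeros(n_res)
        X[t] = np.tanh(W_in @ u[t] + W @ prev)
    return X  # all past echo states are kept, not just the final one

def encode_mesr(u, reservoirs, skips):
    # Concatenate echo states from reservoirs with different skip
    # lengths into multitimescale echo state representations (MESRs).
    return np.concatenate(
        [run_reservoir(u, W_in, W, k)
         for (W_in, W), k in zip(reservoirs, skips)],
        axis=1)

# Example: a 3-channel series of length 100, three timescales.
skips = [1, 2, 4]                                # illustrative values
reservoirs = [make_reservoir(3, 50) for _ in skips]
u = rng.standard_normal((100, 3))
mesr = encode_mesr(u, reservoirs, skips)
print(mesr.shape)                                # (100, 150)

In the full model, a trainable convolutional layer is then applied over the time axis of the MESRs to learn multiscale temporal dependencies; the reservoirs themselves stay fixed, which is why only the single convolutional learner (plus the output layer) needs training and where the reported computational efficiency comes from.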
