Comput Intell Neurosci; 2022: 4725639, 2022.
Article in English | MEDLINE | ID: mdl-35237308

ABSTRACT

Recently, long short-term memory (LSTM) networks have been widely used for text classification. Unlike feed-forward neural networks, they have feedback connections and can therefore learn long-term dependencies. However, LSTM networks suffer from a parameter-tuning problem: the initial and control parameters of an LSTM are generally chosen by trial and error. Therefore, this paper proposes an evolving LSTM (ELSTM) network in which a multiobjective genetic algorithm (MOGA) is used to optimize the architecture and weights of the LSTM. The proposed model is tested on a well-known factory-reports dataset, and extensive analyses are performed to evaluate its performance. The comparative analysis shows that the ELSTM network outperforms the competing models.
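
The abstract does not describe the search procedure in detail; the following is a minimal sketch, not the paper's implementation, of the general idea of multiobjective evolutionary tuning of an LSTM. It assumes a small search space over hidden size and learning rate, two objectives (a validation-error proxy and parameter count), and NSGA-II selection via the DEAP library; the evaluate() function is a placeholder where a real run would train and validate an LSTM text classifier for each candidate.

```python
import random
from deap import base, creator, tools

# Two objectives, both minimized: validation error and model size.
creator.create("FitnessMin2", base.Fitness, weights=(-1.0, -1.0))
creator.create("Individual", list, fitness=creator.FitnessMin2)

HIDDEN_CHOICES = [32, 64, 128, 256]   # assumed search space for hidden size
LR_RANGE = (1e-4, 1e-2)               # assumed search space for learning rate

def random_individual():
    return creator.Individual(
        [random.choice(HIDDEN_CHOICES), random.uniform(*LR_RANGE)])

def evaluate(ind):
    hidden, lr = ind
    # Placeholder objectives: replace with real training/validation of an LSTM.
    val_error = abs(lr - 1e-3) * 100 + 10.0 / hidden  # dummy error proxy
    n_params = 4 * (hidden * hidden + 2 * hidden)     # rough LSTM weight count
    return val_error, n_params

def mutate(ind, indpb=0.5):
    # Resample each gene with probability indpb.
    if random.random() < indpb:
        ind[0] = random.choice(HIDDEN_CHOICES)
    if random.random() < indpb:
        ind[1] = random.uniform(*LR_RANGE)
    return (ind,)

toolbox = base.Toolbox()
toolbox.register("individual", random_individual)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxUniform, indpb=0.5)
toolbox.register("mutate", mutate)
toolbox.register("select", tools.selNSGA2)  # Pareto-based survivor selection

pop = toolbox.population(n=20)
for ind in pop:
    ind.fitness.values = toolbox.evaluate(ind)

for gen in range(10):
    # (mu + lambda) scheme: vary cloned parents, keep the best Pareto set.
    offspring = [toolbox.clone(random.choice(pop)) for _ in range(len(pop))]
    for a, b in zip(offspring[::2], offspring[1::2]):
        toolbox.mate(a, b)
        toolbox.mutate(a)
        toolbox.mutate(b)
    for ind in offspring:
        ind.fitness.values = toolbox.evaluate(ind)
    pop = toolbox.select(pop + offspring, k=len(pop))

front = tools.sortNondominated(pop, len(pop), first_front_only=True)[0]
print("Pareto-optimal (hidden size, learning rate) candidates:", front)
```

In this sketch the result is a Pareto front of hyperparameter settings rather than a single winner; the paper's MOGA additionally evolves the network weights, which is omitted here for brevity.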


Subject(s)
Memory, Short-Term; Neural Networks, Computer; Learning; Memory, Long-Term