An optimization methodology for neural network weights and architectures.
IEEE Trans Neural Netw; 17(6): 1452-9, 2006 Nov.
Article in En | MEDLINE | ID: mdl-17131660
ABSTRACT
This paper introduces a methodology for the global optimization of neural networks. The aim is to optimize multilayer perceptron (MLP) weights and architectures simultaneously, in order to generate topologies with few connections and high classification performance across data sets. The approach combines the advantages of simulated annealing, tabu search, and the backpropagation training algorithm in an automatic process for producing networks with high classification performance and low complexity. Experimental results obtained on four classification problems and one prediction problem were better than those obtained by the most commonly used optimization techniques.
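The hybrid search the abstract describes can be sketched as a simulated-annealing loop with a tabu list over candidate topologies. This is a minimal illustration, not the authors' implementation: the function names (`sa_tabu_search`, `cost`, `step`), the toy cost surface, and all parameter values are assumptions chosen for demonstration. A faithful version would train each candidate MLP with backpropagation and use its validation error plus a connection-count penalty as the cost.

```python
import math
import random

def sa_tabu_search(evaluate, neighbors, init, t0=1.0, cooling=0.9,
                   iters=200, tabu_size=10, seed=0):
    """Hybrid simulated annealing + tabu search over discrete solutions.

    evaluate(sol) -> cost (lower is better); neighbors(sol, rng) -> candidate.
    Recently visited solutions sit in a tabu list and are skipped.
    """
    rng = random.Random(seed)
    current = best = init
    cur_cost = best_cost = evaluate(init)
    tabu = [init]
    for i in range(iters):
        t = t0 * cooling ** i          # geometric cooling schedule
        cand = neighbors(current, rng)
        if cand in tabu:               # tabu move: skip this candidate
            continue
        cost_c = evaluate(cand)
        delta = cost_c - cur_cost
        # always accept improvements; accept worse moves with
        # Boltzmann probability exp(-delta / t)
        if delta < 0 or rng.random() < math.exp(-delta / max(t, 1e-9)):
            current, cur_cost = cand, cost_c
            tabu.append(cand)
            if len(tabu) > tabu_size:  # forget the oldest tabu entry
                tabu.pop(0)
            if cost_c < best_cost:
                best, best_cost = cand, cost_c
    return best, best_cost

# Toy stand-in for "validation error + complexity": here a solution is
# just a hidden-layer size, and 7 units is assumed to minimise error.
def cost(h):
    error = (h - 7) ** 2 / 50.0   # hypothetical error term
    complexity = 0.01 * h         # penalise larger topologies
    return error + complexity

def step(h, rng):
    # perturb the architecture by adding/removing one or two units
    return max(1, h + rng.choice([-2, -1, 1, 2]))

best, best_cost = sa_tabu_search(cost, step, init=20)
```

The search starts from an oversized topology (20 hidden units) and is driven toward a smaller, lower-cost one; in the paper's setting the same loop would explore connection masks and weight initializations rather than a single integer.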
Database: MEDLINE
Main subject: Algorithms / Signal Processing, Computer-Assisted / Pattern Recognition, Automated / Information Storage and Retrieval / Neural Networks, Computer
Study type: Evaluation_studies / Prognostic_studies
Language: En
Journal: IEEE Trans Neural Netw
Journal subject: MEDICAL INFORMATICS
Publication year: 2006
Document type: Article
Affiliation country: Brazil