ABSTRACT
We propose a constructive approach to building single-hidden-layer neural networks for nonlinear function approximation using frequency domain analysis. We introduce a spectrum-based learning procedure that minimizes the difference between the spectrum of the training data and the spectrum of the network's estimates. The network is built up incrementally during training and automatically determines the appropriate number of hidden units. This technique achieves comparable or better approximation accuracy with faster convergence than traditional techniques such as backpropagation.
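The sketch below illustrates one way such a spectrum-based constructive procedure could look. It is a hypothetical simplification, not the paper's exact algorithm: hidden units with random input weights and biases are added one at a time, and the output weights are fit by least squares so that the spectrum of the network's estimate matches the spectrum of the training targets. The function name `spectral_fit`, the tanh hidden units, and the tolerance-based stopping rule are all assumptions made for illustration.

```python
import numpy as np

def spectral_fit(x, y, max_hidden=20, tol=1e-3, seed=0):
    """Grow a single-hidden-layer tanh network one unit at a time, fitting
    the output weights so the spectrum of the network's estimate matches the
    spectrum of the training data (illustrative sketch, not the paper's
    exact procedure)."""
    rng = np.random.default_rng(seed)
    Fy = np.fft.rfft(y)                      # spectrum of the training data
    W, b = [], []
    for _ in range(max_hidden):
        # Add one hidden unit with a random input weight and bias.
        W.append(rng.normal(scale=2.0))
        b.append(rng.normal(scale=2.0))
        H = np.tanh(np.outer(x, W) + b)      # (N, n_hidden) hidden activations
        FH = np.fft.rfft(H, axis=0)          # spectrum of each hidden unit's output
        # Real-valued output weights minimizing the spectral difference:
        # stack real and imaginary parts and solve ordinary least squares.
        A = np.vstack([FH.real, FH.imag])
        rhs = np.concatenate([Fy.real, Fy.imag])
        v, *_ = np.linalg.lstsq(A, rhs, rcond=None)
        err = np.mean(np.abs(Fy - FH @ v) ** 2)
        if err < tol:                        # stop growing once spectra agree
            break
    return np.array(W), np.array(b), v

# Usage: approximate a nonlinear target and report how many units were needed.
x = np.linspace(-1, 1, 256)
y = np.sin(3 * np.pi * x) * np.exp(-x ** 2)
W, b, v = spectral_fit(x, y)
print(f"hidden units used: {len(W)}")
```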
ABSTRACT
Neural network architecture optimization is often a critical issue, particularly when VLSI implementation is considered. This paper proposes a new minimization method for multilayered feedforward ANNs and an original approach to their synthesis, both based on the analysis of the quantity of information (entropy) flowing through the network. Each layer is described as an information filter that selects the relevant characteristics until the classification is complete. The basic incremental synthesis method, including the supervised training procedure, is derived to design application-tailored neural paradigms with good generalization capability.
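A minimal sketch of the entropy-based view is given below, under stated assumptions: each layer's activations are binarized at a fixed threshold so every input pattern maps to a discrete code word, and the Shannon entropy of those code words is compared with the entropy of the class labels. The helper names (`entropy`, `layer_entropy`) and the binarization rule are hypothetical; the paper's entropy estimate and pruning criterion may be defined differently.

```python
import numpy as np
from collections import Counter

def entropy(symbols):
    """Shannon entropy (in bits) of a sequence of discrete symbols."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def layer_entropy(activations, threshold=0.5):
    """Entropy of a layer's output, with each neuron binarized at `threshold`
    so every input pattern becomes a discrete code word (assumed
    discretization, for illustration only)."""
    codes = (activations >= threshold).astype(int)
    return entropy(tuple(row) for row in codes)

# Usage: a layer whose code-word entropy has fallen to roughly the label
# entropy is passing little irrelevant detail; in an incremental synthesis
# this could signal that later layers may be made smaller or omitted.
labels = np.array([0, 1, 1, 0, 2, 2])
acts = np.array([[0.9, 0.1, 0.7],
                 [0.2, 0.8, 0.6],
                 [0.1, 0.9, 0.4],
                 [0.8, 0.2, 0.3],
                 [0.3, 0.7, 0.9],
                 [0.4, 0.6, 0.8]])
print(entropy(labels.tolist()), layer_entropy(acts))
```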