ABSTRACT
The focus of this paper is a convergence study of the frequency-sensitive competitive learning (FSCL) algorithm. We approximate the final phase of FSCL learning by a diffusion process described by the Fokker-Planck equation. Necessary and sufficient conditions are presented for the convergence of the diffusion process to a local equilibrium. The analysis parallels that of Ritter and Schulten (1988) for Kohonen's self-organizing map. We show that the convergence conditions involve only the learning rate and that they are the same as the conditions for weak convergence described previously. Our analysis thus broadens the class of algorithms that have been shown to have these types of convergence characteristics.
ABSTRACT
We study the codeword distribution for a conscience-type competitive learning algorithm, frequency-sensitive competitive learning (FSCL), using one-dimensional input data. We prove that the asymptotic codeword density in the limit of a large number of codewords is given by a power law of the form Q(x) = C · P(x)^α, where P(x) is the input data density and α depends on the algorithm and on the form of the distortion measure to be minimized. We further show that the algorithm can be adjusted to minimize any L_p distortion measure with p in (0, 2].
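The power-law behavior described above can be illustrated numerically. The following is a minimal 1-D sketch of a conscience-type FSCL rule: each unit keeps a win count, and the winner for an input x minimizes the count-weighted distortion, so rarely-winning units are favored. The toy density P(x) = 2x on [0, 1], the number of codewords, and the learning rate are illustrative assumptions, not values from the paper, and no particular L_p exponent is reproduced.

```python
import random
import math

random.seed(0)

def fscl(data, k=16, lr=0.05, epochs=20):
    """Frequency-sensitive competitive learning in 1-D (minimal sketch).

    Each unit i keeps a win count n_i; the winner for input x minimizes
    n_i * |x - w_i|, so units that win rarely are favored ("conscience").
    """
    codewords = [random.random() for _ in range(k)]
    counts = [1] * k
    for _ in range(epochs):
        for x in data:
            # count-weighted distortion selects the winner
            j = min(range(k), key=lambda i: counts[i] * abs(x - codewords[i]))
            codewords[j] += lr * (x - codewords[j])  # move winner toward x
            counts[j] += 1
    return sorted(codewords)

# Toy input density P(x) = 2x on [0, 1], sampled via inverse CDF: sqrt(u).
data = [math.sqrt(random.random()) for _ in range(4000)]
w = fscl(data)
print(w)
```

With the win counts roughly equalized by the conscience mechanism, the codewords concentrate where P(x) is larger, consistent with a density of the form C · P(x)^α for some α > 0.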