1.
Sensors (Basel); 13(3): 3848-77, 2013 Mar 19.
Article in English | MEDLINE | ID: mdl-23519346

ABSTRACT

This paper presents a novel VLSI architecture for the training of radial basis function (RBF) networks. The architecture contains the circuits for fuzzy C-means (FCM) and recursive least mean square (LMS) operations. The FCM circuit is designed for the training of the centers in the hidden layer of the RBF network, and the recursive LMS circuit is adopted for the training of the connecting weights in the output layer. The architecture is implemented on a field-programmable gate array (FPGA) and used as a hardware accelerator in a system on programmable chip (SOPC) for real-time training and classification. Experimental results reveal that the proposed RBF architecture is an effective alternative for applications where fast and efficient RBF training is desired.
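For orientation, the sketch below reproduces the training flow the abstract describes in plain software: fuzzy C-means to place the hidden-layer centers, Gaussian RBF activations, and LMS updates for the output weights. It is a minimal illustration, not the paper's FPGA circuits; the hyperparameters (m, sigma, learning rate) and the use of a plain rather than recursive LMS update are my assumptions.

```python
# Software sketch of the RBF training flow described in the abstract:
# FCM for the hidden-layer centers, then LMS for the output weights.
# All hyperparameter values here are illustrative assumptions.
import numpy as np

def fcm_centers(X, n_centers, m=2.0, iters=50, seed=0):
    """Fuzzy C-means: returns cluster centers of X (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_centers))
    U /= U.sum(axis=1, keepdims=True)                         # fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        C = (Um.T @ X) / Um.sum(axis=0)[:, None]              # update centers
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)                     # update memberships
    return C

def rbf_features(X, C, sigma=1.0):
    """Gaussian hidden-layer activations for each sample."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def train_lms(H, y, lr=0.05, epochs=20):
    """LMS (stochastic gradient) training of the output-layer weights."""
    w = np.zeros(H.shape[1])
    for _ in range(epochs):
        for h, t in zip(H, y):
            w += lr * (t - h @ w) * h                         # LMS update
    return w

# Toy usage: two Gaussian blobs, binary targets.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]
C = fcm_centers(X, n_centers=4)
w = train_lms(rbf_features(X, C), y)
pred = (rbf_features(X, C) @ w > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```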


Subject(s)
Algorithms; Computers; Neural Networks, Computer; Fuzzy Logic; Humans; Least-Squares Analysis
2.
IEEE Trans Neural Netw Learn Syst; 31(7): 2638-2652, 2020 Jul.
Article in English | MEDLINE | ID: mdl-31502991

ABSTRACT

Vector-valued neural learning has recently emerged as a promising direction in deep learning. Traditionally, training data for neural networks (NNs) are formulated as a vector of scalars; however, performance may not be optimal because associations among adjacent scalars are not modeled. In this article, we propose a new vector neural architecture called the Arbitrary BIlinear Product NN (ABIPNN), which processes information as vectors in each neuron and defines the feedforward projections using arbitrary bilinear products. Such bilinear products can include circular convolution, 7-D vector product, skew circular convolution, reversed-time circular convolution, or other new products not seen in previous work. As a proof of concept, we apply the proposed network to multispectral image denoising and singing voice separation. Experimental results show that ABIPNN obtains substantial improvements over conventional NNs, suggesting that these associations are learned during training.
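The sketch below illustrates the core idea of such a layer: every connection carries a small weight vector, and the feedforward projection is a bilinear product, here circular convolution computed via the FFT. The layer sizes, the tanh nonlinearity, and the loop-based implementation are my assumptions for illustration, not the paper's exact ABIPNN setup.

```python
# Sketch of one vector-neuron layer whose bilinear product is circular
# convolution (via the FFT convolution theorem). Shapes and the activation
# are illustrative assumptions.
import numpy as np

def circular_conv(a, b):
    """Circular convolution of two equal-length real vectors via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def abip_layer(X, W):
    """
    X: (n_in, d)        -- each of the n_in inputs is a d-dimensional vector
    W: (n_out, n_in, d) -- one d-dimensional weight vector per connection
    Returns (n_out, d): each output neuron sums the circular convolutions
    of its incoming vectors with the corresponding weight vectors.
    """
    n_out, n_in, d = W.shape
    Y = np.zeros((n_out, d))
    for j in range(n_out):
        for i in range(n_in):
            Y[j] += circular_conv(X[i], W[j, i])
    return np.tanh(Y)            # elementwise nonlinearity (assumed)

# Toy forward pass: 3 input vector-neurons, 2 output vector-neurons, d = 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 8))
W = rng.normal(scale=0.1, size=(2, 3, 8))
print(abip_layer(X, W).shape)    # (2, 8)
```

Other bilinear products named in the abstract (skew or reversed-time circular convolution, 7-D vector product) would slot in by swapping out circular_conv while keeping the same layer structure.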


Subject(s)
Deep Learning; Neural Networks, Computer; Neurons; Support Vector Machine; Humans; Neurons/physiology