A distributed semi-supervised learning algorithm based on manifold regularization using wavelet neural network.
Xie, Jin; Liu, Sanyang; Dai, Hao.
Affiliation
  • Xie J; School of Mathematics and Statistics, Xidian University, Xi'an 710071, PR China. Electronic address: xj6417@126.com.
  • Liu S; School of Mathematics and Statistics, Xidian University, Xi'an 710071, PR China. Electronic address: liusanyang@126.com.
  • Dai H; School of Aerospace Science and Technology, Xidian University, Xi'an 710071, PR China. Electronic address: dai0519hao@163.com.
Neural Netw ; 118: 300-309, 2019 Oct.
Article in En | MEDLINE | ID: mdl-31330270
ABSTRACT
This paper proposes a distributed semi-supervised learning (D-SSL) algorithm for problems in which training samples are extremely large-scale and located on distributed nodes over communication networks. Each node's training data consist of labeled samples and unlabeled samples whose output values or labels are unknown. The nodes communicate in a distributed way: each node has access only to its own data and can exchange local information only with its neighboring nodes. In some scenarios these distributed data cannot be processed centrally, so D-SSL problems cannot be solved centrally by traditional semi-supervised learning (SSL) algorithms. The state-of-the-art D-SSL algorithm, Distributed Laplacian Regularized Least Squares (D-LapRLS), is a kernel-based algorithm: it must estimate the global Euclidean Distance Matrix (EDM) over all samples, which is time-consuming, especially when the training set is large. To solve D-SSL problems and overcome this common drawback of kernel-based D-SSL algorithms, we propose a novel Manifold Regularization (MR) based D-SSL algorithm using a Wavelet Neural Network (WNN) and the Zero-Gradient-Sum (ZGS) distributed optimization strategy. Each node is assigned an individual WNN with the same basis functions. To initialize the proposed D-SSL algorithm, we also propose a centralized MR-based SSL algorithm using a WNN. The proposed SSL and D-SSL algorithms are denoted Laplacian WNN (LapWNN) and distributed LapWNN (D-LapWNN), respectively. The D-LapWNN algorithm works in a fully distributed fashion via the ZGS strategy, and its convergence is guaranteed by the Lyapunov method. During the learning process, each node exchanges only local coefficients, not raw data, with its neighbors, which makes D-LapWNN a privacy-preserving method. Finally, several illustrative simulations show the efficiency and advantages of the proposed algorithm.
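
To make the mechanism concrete, below is a minimal Python sketch of the kind of scheme the abstract describes; it is not the authors' implementation. Each node builds a local quadratic LapWNN-style objective (here plain Gaussian features stand in for the shared wavelet basis), initializes at its own minimizer, and then runs Zero-Gradient-Sum-style consensus rounds in which only coefficient vectors, never raw samples, are exchanged with neighbors. All concrete choices (the Gaussian feature map, the k-NN Laplacian, the regularization weights lam_a and lam_i, the step size, and the ring topology) are illustrative assumptions.

import numpy as np

def features(X, centers, width=1.0):
    # Stand-in for the shared wavelet basis: fixed Gaussian bumps (illustrative).
    diff = X[:, None, :] - centers[None, :, :]
    return np.exp(-np.sum(diff ** 2, axis=2) / (2 * width ** 2))

def knn_laplacian(X, k=5):
    # Unnormalized graph Laplacian of a symmetrized k-NN graph on one node's data.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    W = np.zeros_like(d2)
    for i in range(X.shape[0]):
        for j in np.argsort(d2[i])[1:k + 1]:
            W[i, j] = W[j, i] = np.exp(-d2[i, j])
    return np.diag(W.sum(axis=1)) - W

class Node:
    """One node's local LapWNN-style problem, quadratic in the coefficients w."""
    def __init__(self, X, y_labeled, n_labeled, centers, lam_a=1.0, lam_i=1e-2):
        Phi = features(X, centers)              # labeled + unlabeled samples
        Phi_l = Phi[:n_labeled]                 # labeled rows only
        L = knn_laplacian(X)                    # local manifold regularizer
        m = Phi.shape[1]
        # Hessian and linear term of the local regularized least-squares cost
        # (constant factors dropped; the minimizer is unchanged).
        self.H = Phi_l.T @ Phi_l + lam_a * np.eye(m) + lam_i * Phi.T @ L @ Phi
        self.b = Phi_l.T @ y_labeled
        self.w = np.linalg.solve(self.H, self.b)  # ZGS init: local minimizer

def zgs_round(nodes, neighbors, step=0.2):
    # One synchronous round: each node pulls its coefficients toward its
    # neighbors', scaled by its local inverse Hessian. For quadratic costs this
    # keeps the sum of local gradients at zero, so consensus = global optimum.
    new_w = []
    for i, node in enumerate(nodes):
        disagreement = sum(nodes[j].w - node.w for j in neighbors[i])
        new_w.append(node.w + step * np.linalg.solve(node.H, disagreement))
    for node, w in zip(nodes, new_w):
        node.w = w

# Toy usage: 4 nodes on a ring, each with 10 labeled and 50 unlabeled samples.
rng = np.random.default_rng(0)
centers = rng.uniform(-1, 1, size=(20, 1))      # shared basis across all nodes
nodes = []
for _ in range(4):
    X = rng.uniform(-1, 1, size=(60, 1))
    y = np.sin(3 * X[:10, 0]) + 0.05 * rng.normal(size=10)
    nodes.append(Node(X, y, n_labeled=10, centers=centers))
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
for _ in range(1000):
    zgs_round(nodes, ring)
# Coefficient disagreement across nodes shrinks as the rounds proceed.
print(max(np.linalg.norm(n.w - nodes[0].w) for n in nodes))

Because each local cost is quadratic, the update preserves a zero sum of local gradients, so when the coefficient vectors reach consensus they coincide with the minimizer of the summed objective; this conservation property is what the ZGS strategy exploits, and it is why only coefficients, never raw data, need to cross the network.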

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Neural Networks, Computer / Supervised Machine Learning Study type: Prognostic_studies Language: En Journal: Neural Netw Journal subject: NEUROLOGY Publication year: 2019 Document type: Article
