Generalization analysis of deep CNNs under maximum correntropy criterion.
Zhang, Yingqiao; Fang, Zhiying; Fan, Jun.
Affiliation
  • Zhang Y; Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong, China. Electronic address: 20482655@life.hkbu.edu.hk.
  • Fang Z; Institute of Applied Mathematics, Shenzhen Polytechnic University, Shahexi Road 4089, Shenzhen, 518000, Guangdong, China. Electronic address: fangzhiying@szpu.edu.cn.
  • Fan J; Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong, China. Electronic address: junfan@hkbu.edu.hk.
Neural Netw. 2024 Jun;174:106226.
Article in English | MEDLINE | ID: mdl-38490117
ABSTRACT
Convolutional neural networks (CNNs) have gained immense popularity in recent years, finding utility in diverse fields such as image recognition, natural language processing, and bioinformatics. Despite the remarkable progress in deep learning theory, most studies on CNNs, especially for regression tasks, rely heavily on the least squares loss function. However, such learning algorithms may not suffice in the presence of heavy-tailed noise or outliers. This limitation underscores the need to explore alternative loss functions that handle such scenarios more effectively, thereby unleashing the true potential of CNNs. In this paper, we investigate the generalization error of deep CNNs with the rectified linear unit (ReLU) activation function for robust regression problems within an information-theoretic learning framework. Our study demonstrates that when the regression function exhibits an additive ridge structure and the noise possesses a finite p-th moment, the empirical risk minimization scheme generated by the maximum correntropy criterion and deep CNNs achieves fast convergence rates. Notably, these rates match, up to a logarithmic factor, the minimax optimal convergence rates attained by fully connected neural network models with the Huber loss function. Additionally, we establish the convergence rates of deep CNNs under the maximum correntropy criterion when the regression function resides in a Sobolev space on the sphere.
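For readers unfamiliar with the maximum correntropy criterion, the sketch below illustrates the general idea in PyTorch: maximizing correntropy between predictions and targets is equivalent to minimizing the correntropy-induced (Welsch-type) loss sigma^2 * (1 - exp(-(y - f(x))^2 / (2 sigma^2))), which saturates for large residuals and so downweights heavy-tailed noise. This is only a minimal illustration under assumed settings; the network architecture (SimpleCNN), the scale parameter sigma, and the training loop are hypothetical and do not reproduce the paper's construction or analysis.

```python
import torch
import torch.nn as nn

def correntropy_loss(pred, target, sigma=1.0):
    # Correntropy-induced loss: sigma^2 * (1 - exp(-residual^2 / (2 sigma^2))).
    # Minimizing this empirical risk corresponds to the maximum correntropy criterion.
    residual = target - pred
    return torch.mean(sigma**2 * (1.0 - torch.exp(-residual**2 / (2.0 * sigma**2))))

# A small 1-D convolutional network with ReLU activations (illustrative only;
# the depth, widths, and filter structure studied in the paper are not reproduced here).
class SimpleCNN(nn.Module):
    def __init__(self, input_dim):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(8, 8, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(8 * input_dim, 1)

    def forward(self, x):                    # x: (batch, input_dim)
        h = self.features(x.unsqueeze(1))    # -> (batch, 8, input_dim)
        return self.head(h.flatten(1)).squeeze(-1)

# Empirical risk minimization under the correntropy loss, with heavy-tailed noise
# simulated by a Student-t distribution (hypothetical data, for illustration only).
model = SimpleCNN(input_dim=16)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(128, 16)
y = torch.sin(x[:, 0]) + 0.1 * torch.distributions.StudentT(df=2.5).sample((128,))
for _ in range(100):
    opt.zero_grad()
    loss = correntropy_loss(model(x), y, sigma=1.0)
    loss.backward()
    opt.step()
```

In this sketch the bounded loss plays the role that the paper assigns to the maximum correntropy criterion: unlike least squares, a single large outlier contributes at most sigma^2 to the empirical risk.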

Full text: 1 | Database: MEDLINE | Main subject: Algorithms / Neural Networks, Computer | Language: English | Publication year: 2024 | Document type: Article