Fast Rates of Gaussian Empirical Gain Maximization With Heavy-Tailed Noise.
IEEE Trans Neural Netw Learn Syst ; 33(10): 6038-6043, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35560074
In a regression setup, this brief studies the performance of Gaussian empirical gain maximization (EGM), a framework that encompasses a broad variety of well-established robust estimation approaches. In particular, we conduct a refined learning theory analysis for Gaussian EGM, investigate its regression calibration properties, and establish improved convergence rates in the presence of heavy-tailed noise. To this end, we first introduce a new weak moment condition that accommodates heavy-tailed noise distributions. Based on this moment condition, we then develop a novel comparison theorem that characterizes the regression calibration properties of Gaussian EGM and plays an essential role in deriving the improved convergence rates. The present study thus broadens our theoretical understanding of Gaussian EGM.
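To make the setting concrete, here is a minimal illustrative sketch of Gaussian EGM for robust linear regression. The toy data, the choice of bandwidth `sigma`, the gradient-ascent fitting routine `egm_fit`, and all parameter values are illustrative assumptions, not the paper's formulation or rates: the idea is simply that maximizing an empirical average of a bounded Gaussian gain of the residuals downweights outliers produced by heavy-tailed noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D linear data with heavy-tailed (Student-t, df=2) noise,
# so the noise has infinite variance.
n = 200
x = rng.uniform(-2, 2, n)
y = 1.5 * x + 0.5 + rng.standard_t(df=2, size=n)

def gaussian_gain(residual, sigma):
    # Gaussian gain: bounded in (0, 1], so large residuals
    # (outliers from the heavy tails) contribute almost nothing.
    return np.exp(-residual**2 / (2 * sigma**2))

def egm_fit(x, y, sigma=1.0, lr=0.05, steps=2000):
    # Gradient ascent on the empirical gain
    #   G(w, b) = (1/n) * sum_i exp(-(y_i - w*x_i - b)^2 / (2*sigma^2)),
    # i.e. empirical gain maximization with a Gaussian gain function.
    w, b = 0.0, 0.0
    for _ in range(steps):
        r = y - w * x - b
        g = gaussian_gain(r, sigma)
        # dG/dw = mean(g_i * r_i * x_i) / sigma^2, dG/db = mean(g_i * r_i) / sigma^2
        w += lr * np.mean(g * r * x) / sigma**2
        b += lr * np.mean(g * r) / sigma**2
    return w, b

w_hat, b_hat = egm_fit(x, y)
```

Despite the infinite-variance noise, the bounded gain keeps the estimate close to the true parameters (w = 1.5, b = 0.5), whereas ordinary least squares can be pulled far off by a single extreme observation.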

Full text: 1 Collection: 01-international Database: MEDLINE Language: En Journal: IEEE Trans Neural Netw Learn Syst Year: 2022 Document type: Article Country of publication: United States
