Sharp Guarantees and Optimal Performance for Inference in Binary and Gaussian-Mixture Models.
Taheri, Hossein; Pedarsani, Ramtin; Thrampoulidis, Christos.
Affiliation
  • Taheri H; Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA 93106, USA.
  • Pedarsani R; Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA 93106, USA.
  • Thrampoulidis C; Department of Electrical and Computer Engineering, University of California, Santa Barbara, CA 93106, USA.
Entropy (Basel); 23(2), 2021 Jan 30.
Article in English | MEDLINE | ID: mdl-33573327
ABSTRACT
We study convex empirical risk minimization for high-dimensional inference in binary linear classification under both discriminative binary linear models and generative Gaussian-mixture models. Our first result sharply predicts the statistical performance of such estimators in the proportional asymptotic regime under isotropic Gaussian features. Importantly, the predictions hold for a wide class of convex loss functions, which we exploit to prove bounds on the best achievable performance. Notably, we show that the proposed bounds are tight for popular binary models (such as signed and logistic) and for the Gaussian-mixture model by constructing appropriate loss functions that achieve them. Our numerical simulations suggest that the theory is accurate even for relatively small problem dimensions and that it enjoys a certain universality property.
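The setting described in the abstract (convex empirical risk minimization for binary linear classification with isotropic Gaussian features, in the proportional regime where n/d is held fixed) can be illustrated with a small simulation. The sketch below is not the authors' code: the logistic loss, the plain gradient-descent solver, and all parameter values (d, the ratio n/d, and the mixture signal strength r) are assumptions chosen only to demonstrate the two data models the abstract mentions, a signed discriminative model and a Gaussian mixture.

# Illustrative simulation (assumed parameters, not the paper's code): convex ERM
# for binary linear classification with isotropic Gaussian features.
import numpy as np

rng = np.random.default_rng(0)

d, delta = 400, 4.0                # dimension and oversampling ratio n/d (assumed)
n = int(delta * d)

# Ground-truth direction (unit norm) and isotropic Gaussian features.
beta_star = rng.standard_normal(d)
beta_star /= np.linalg.norm(beta_star)
X = rng.standard_normal((n, d))

# Two generative settings mentioned in the abstract:
# (a) discriminative "signed" model: y = sign(x^T beta*)
# (b) Gaussian mixture: y uniform in {-1, +1}, x = y * r * beta* + standard Gaussian noise
y_signed = np.sign(X @ beta_star)
r = 1.5                                        # assumed signal strength for the mixture
y_mix = rng.choice([-1.0, 1.0], size=n)
X_mix = y_mix[:, None] * (r * beta_star) + rng.standard_normal((n, d))

def logistic_erm(X, y, steps=2000, lr=0.5):
    """Unregularized ERM with the logistic loss, solved by plain gradient descent."""
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        margins = y * (X @ beta)
        # Gradient of the average logistic loss log(1 + exp(-margin)).
        grad = -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
        beta -= lr * grad
    return beta

for name, (A, b) in {"signed": (X, y_signed), "mixture": (X_mix, y_mix)}.items():
    beta_hat = logistic_erm(A, b)
    corr = abs(beta_hat @ beta_star) / np.linalg.norm(beta_hat)   # cosine similarity with beta*
    err = np.mean(np.sign(A @ beta_hat) != b)                     # empirical 0-1 error
    print(f"{name:8s}  correlation with beta*: {corr:.3f}   0-1 error: {err:.3f}")

Swapping the loss inside logistic_erm is where the paper's analysis becomes relevant: the abstract's results characterize the performance of a wide class of convex losses in this proportional regime and identify losses that attain the best achievable performance.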
Keywords

Full text: 1 Database: MEDLINE Study type: Prognostic_studies Language: English Journal: Entropy (Basel) Year: 2021 Document type: Article Country of affiliation: United States
