L1-Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs.
Neykov, Matey; Liu, Jun S; Cai, Tianxi.
Affiliation
  • Neykov M; Department of Operations Research and Financial Engineering, Princeton University, Princeton, NJ 08544, USA.
  • Liu JS; Department of Statistics, Harvard University, Cambridge, MA 02138, USA.
  • Cai T; Department of Biostatistics, Harvard University, Boston, MA 02115, USA.
J Mach Learn Res; 17(1): 2976-3012, 2016 May.
Article in English | MEDLINE | ID: mdl-28503101
ABSTRACT
It is known that for a certain class of single index models (SIMs) Y = f(X⊤β₀, ε), support recovery is impossible when X ~ 𝒩(0, 𝕀_{p×p}) and a model complexity adjusted sample size is below a critical threshold. Recently, optimal algorithms based on Sliced Inverse Regression (SIR) were suggested. These algorithms work provably under the assumption that the design X comes from an i.i.d. Gaussian distribution. In the present paper we analyze algorithms based on covariance screening and least squares with L1 penalization (i.e., LASSO) and demonstrate that they can also enjoy optimal (up to a scalar) rescaled sample size in terms of support recovery, albeit under slightly different assumptions on f and ε compared to the SIR-based algorithms. Furthermore, we show more generally that LASSO succeeds in recovering the signed support of β₀ if X ~ 𝒩(0, Σ) and the covariance Σ satisfies the irrepresentable condition. Our work extends existing results on the support recovery of LASSO for the linear model to a more general class of SIMs.
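The following is a minimal synthetic sketch, not the authors' implementation, illustrating the two procedures the abstract refers to: covariance screening and L1-penalized least squares (LASSO) applied to data from a single index model with an identity-covariance Gaussian design. The link f(u) = u + u³, the noise level, the penalty level, and all dimensions below are illustrative assumptions; the link is chosen so that Cov(f(X⊤β₀), X⊤β₀) > 0, which is the kind of condition under which such methods can recover the support.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 500, 1000, 5                        # sample size, dimension, sparsity (illustrative)
beta0 = np.zeros(p)
beta0[:s] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)   # unit-norm sparse index vector

X = rng.standard_normal((n, p))               # X ~ N(0, I_{p x p})
z = X @ beta0
y = z + z**3 + 0.1 * rng.standard_normal(n)   # Y = f(X^T beta0) + noise, f(u) = u + u^3 (assumed link)

# 1) Covariance screening: keep the s coordinates with the largest empirical covariance |X_j^T y| / n.
cov_scores = np.abs(X.T @ y) / n
screened = np.sort(np.argsort(cov_scores)[-s:])

# 2) L1-penalized least squares (LASSO); the penalty level is an illustrative choice.
lasso = Lasso(alpha=0.5).fit(X, y)
lasso_support = np.flatnonzero(lasso.coef_)

print("true support:      ", np.flatnonzero(beta0))
print("screening support: ", screened)
print("lasso support:     ", lasso_support)
print("lasso signs agree: ", np.array_equal(np.sign(lasso.coef_[:s]), np.sign(beta0[:s])))

In this toy setting both procedures typically return the true support, even though the response is a nonlinear function of X⊤β₀; this mirrors, in spirit, the paper's point that covariance screening and the LASSO can recover the (signed) support of β₀ beyond the linear model.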
Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: J Mach Learn Res Year: 2016 Document type: Article Country of affiliation: United States