Open Access
Selection of variables and dimension reduction in high-dimensional non-parametric regression
Karine Bertin, Guillaume Lecué
Electron. J. Statist. 2: 1224-1241 (2008). DOI: 10.1214/08-EJS327

Abstract

We consider an ℓ1-penalization procedure in the non-parametric Gaussian regression model. In many concrete examples, the dimension d of the input variable X is very large (sometimes growing with the number of observations). Estimation of a β-regular regression function f cannot be faster than the slow rate n^{-2β/(2β+d)}. Fortunately, in some situations, f depends only on a few of the coordinates of X. In this paper, we construct two procedures. The first one selects, with high probability, these coordinates. Then, using this subset selection method, we run a local polynomial estimator (on the set of interesting coordinates) to estimate the regression function at the rate n^{-2β/(2β+d*)}, where d*, the "real" dimension of the problem (the exact number of variables on which f depends), replaces the dimension d of the design. To achieve this result, we use an ℓ1-penalization method in this non-parametric setup.
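The abstract describes a two-step "select, then estimate locally" strategy. The Python sketch below illustrates that idea on simulated data: an ℓ1-penalized (Lasso) fit flags the coordinates the regression function actually depends on, and a local linear estimator is then run on those coordinates only, so the effective dimension drops from d to d*. This is only a loose analogue of the paper's procedure (the paper builds the ℓ1 penalization into the non-parametric setup itself); the penalty level, bandwidth, test point, and the simulated regression function are arbitrary illustrative choices, not values from the paper.

```python
# Illustrative "select then estimate" sketch, not the authors' exact procedure.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulated design: d = 50 covariates, but f depends only on the first 2 (d* = 2).
n, d = 500, 50
X = rng.uniform(-1.0, 1.0, size=(n, d))
f = lambda x: np.sin(np.pi * x[:, 0]) + x[:, 1] ** 3
y = f(X) + 0.1 * rng.standard_normal(n)

# Step 1: l1-penalized least squares selects the relevant coordinates
# (penalty level 0.05 is an illustrative choice).
lasso = Lasso(alpha=0.05).fit(X, y)
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-8)
print("selected coordinates:", selected)

# Step 2: local linear estimator restricted to the selected coordinates.
def local_linear(X_sel, y, x0, h=0.3):
    """Local linear fit at x0 with a Gaussian kernel of bandwidth h."""
    diffs = X_sel - x0                              # centred design, shape (n, d*)
    w = np.exp(-0.5 * np.sum((diffs / h) ** 2, axis=1))
    Z = np.hstack([np.ones((len(y), 1)), diffs])    # intercept + linear terms
    beta = np.linalg.solve(Z.T @ (w[:, None] * Z), Z.T @ (w * y))
    return beta[0]                                  # fitted value at x0

x0_full = np.zeros(d)
x0_full[0], x0_full[1] = 0.3, -0.2                  # arbitrary evaluation point
estimate = local_linear(X[:, selected], y, x0_full[selected])
print("estimate at x0:", estimate, "true value:", np.sin(np.pi * 0.3) + (-0.2) ** 3)
```

The point of the sketch is only that the second-stage smoother sees a d*-dimensional problem, which is what allows the faster rate n^{-2β/(2β+d*)} quoted in the abstract.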

Citation


Karine Bertin, Guillaume Lecué. "Selection of variables and dimension reduction in high-dimensional non-parametric regression." Electron. J. Statist. 2: 1224-1241, 2008. https://doi.org/10.1214/08-EJS327

Information

Published: 2008
First available in Project Euclid: 16 December 2008

zbMATH: 1320.62085
MathSciNet: MR2461900
Digital Object Identifier: 10.1214/08-EJS327

Subjects:
Primary: 62G08

Keywords: Dimension reduction, high dimension, Lasso

Rights: Copyright © 2008 The Institute of Mathematical Statistics and the Bernoulli Society
