Electronic Journal of Statistics (Electron. J. Statist.), Volume 2 (2008), 1129-1152.
LASSO, Iterative Feature Selection and the Correlation Selector: Oracle inequalities and numerical performances
We propose a general family of algorithms for regression estimation with quadratic loss, based on geometrical considerations. These algorithms are able to select relevant functions from a large dictionary. We prove that many methods already studied for this task (LASSO, Dantzig selector, and Iterative Feature Selection (IFS), among others) belong to our family, and we exhibit another particular member of this family, which we call the Correlation Selector. Using general properties of our family of algorithms, we prove oracle inequalities for IFS, the LASSO, and the Correlation Selector, and compare the numerical performances of these estimators on a toy example.
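The estimators compared in the paper all perform sparse selection from a dictionary of candidate functions under quadratic loss. As an illustrative sketch only (not the paper's algorithm or code), the following implements the LASSO by plain coordinate descent with soft thresholding on a synthetic sparse-regression problem; the function names, penalty level, and data are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: shrink z toward zero by t."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate descent for min_b (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove all contributions except coordinate j.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta

# Toy problem: 20 candidate regressors, only two of them relevant.
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[[0, 3]] = [2.0, -1.5]
y = X @ beta_true + 0.1 * rng.normal(size=n)

beta_hat = lasso_cd(X, y, lam=0.1)
print("selected coordinates:", np.flatnonzero(np.abs(beta_hat) > 0.1))
```

With a well-conditioned design and a signal well above the noise level, the fitted coefficients recover the true support, with the usual shrinkage bias of order `lam` on the nonzero entries.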
First available in Project Euclid: 21 November 2008
Primary: 62G08: Nonparametric regression
Secondary:
- 62J07: Ridge regression; shrinkage estimators
- 62G15: Tolerance and confidence regions
- 68T05: Learning and adaptive systems [See also 68Q32, 91E40]
Alquier, Pierre. LASSO, Iterative Feature Selection and the Correlation Selector: Oracle inequalities and numerical performances. Electron. J. Statist. 2 (2008), 1129--1152. doi:10.1214/08-EJS288. https://projecteuclid.org/euclid.ejs/1227287695