Open Access
Exponential Screening and optimal rates of sparse estimation
Philippe Rigollet, Alexandre Tsybakov
Ann. Statist. 39(2): 731-771 (April 2011). DOI: 10.1214/10-AOS854

Abstract

In high-dimensional linear regression, the goal pursued here is to estimate an unknown regression function using linear combinations of a suitable set of covariates. A key assumption for the success of any statistical procedure in this setup is that the linear combination is sparse in some sense, for example, that it involves only a few covariates. We consider a general, not necessarily linear, regression with Gaussian noise and study a related question: to find a linear combination of approximating functions which is at the same time sparse and has small mean squared error (MSE). We introduce a new estimation procedure, called Exponential Screening, that shows remarkable adaptation properties. It adapts to the linear combination that optimally balances MSE and sparsity, whether the latter is measured by the number of nonzero entries in the combination (ℓ0 norm) or by the global weight of the combination (ℓ1 norm). The power of this adaptation result is illustrated by showing that Exponential Screening solves optimally and simultaneously all the problems of aggregation in Gaussian regression that have been discussed in the literature. Moreover, we show that the performance of the Exponential Screening estimator cannot be improved in a minimax sense, even if the optimal sparsity is known in advance. The theoretical and numerical superiority of Exponential Screening over state-of-the-art sparse procedures is also discussed.
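Schematically, the ℓ0 part of the adaptation result described above takes the form of a sparsity oracle inequality; the display below is only a rough sketch (constants, low-order remainder terms, and the precise conditions are given in the paper, and the notation η for the regression function, f_θ for a linear combination with coefficient vector θ, M for the number of approximating functions, and σ² for the noise variance is used here for illustration):

\[
\mathbb{E}\,\|\hat f_{\mathrm{ES}} - \eta\|^{2}
\;\lesssim\;
\min_{\theta \in \mathbb{R}^{M}}
\Big\{ \|f_{\theta} - \eta\|^{2}
+ \frac{\sigma^{2}\,\|\theta\|_{0}}{n}\,
\log\Big(1 + \frac{eM}{\|\theta\|_{0} \vee 1}\Big) \Big\},
\]

with an analogous bound holding when sparsity is measured by the ℓ1 norm \(\|\theta\|_{1}\) instead of \(\|\theta\|_{0}\).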

Citation


Philippe Rigollet, Alexandre Tsybakov. "Exponential Screening and optimal rates of sparse estimation." Ann. Statist. 39 (2): 731-771, April 2011. https://doi.org/10.1214/10-AOS854

Information

Published: April 2011
First available in Project Euclid: 9 March 2011

zbMATH: 1215.62043
MathSciNet: MR2816337
Digital Object Identifier: 10.1214/10-AOS854

Subjects:
Primary: 62G08
Secondary: 62C20, 62G05, 62G20, 62J05

Keywords: adaptation, aggregation, BIC, high-dimensional regression, Lasso, minimax rates, sparsity, sparsity oracle inequalities

Rights: Copyright © 2011 Institute of Mathematical Statistics
