Electronic Journal of Statistics

Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators

Karim Lounici

Abstract

We derive the ℓ∞ convergence rate simultaneously for the Lasso and Dantzig estimators in a high-dimensional linear regression model, under a mutual coherence assumption on the Gram matrix of the design and two different assumptions on the noise: Gaussian noise and general noise with finite variance. We then prove that, with a proper choice of the threshold, the thresholded Lasso and Dantzig estimators simultaneously enjoy a sign concentration property, provided that the non-zero components of the target vector are not too small.
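To make the two objects in the abstract concrete, here is a minimal numerical sketch, not taken from the paper: it computes the mutual coherence of a synthetic Gaussian design, fits the Lasso with scikit-learn, and thresholds the estimate before reading off signs. The tuning parameter and the threshold level 2λ are heuristic stand-ins of the usual order σ√(log p / n), assumed here purely for illustration; the paper prescribes its own constants.

```python
# Minimal sketch (illustrative only): mutual coherence of the design and
# sign recovery by a thresholded Lasso estimate on synthetic data.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 200, 500, 5                        # samples, dimension, sparsity

X = rng.standard_normal((n, p))
X *= np.sqrt(n) / np.linalg.norm(X, axis=0)  # unit diagonal for Gram = X'X / n

# Mutual coherence: largest off-diagonal entry (in absolute value) of the
# Gram matrix. The paper's assumption bounds it in terms of the sparsity s.
gram = X.T @ X / n
coherence = np.max(np.abs(gram - np.eye(p)))

theta = np.zeros(p)
theta[:s] = 2.0 * rng.choice([-1.0, 1.0], size=s)  # non-zero entries well above the threshold
sigma = 0.1
y = X @ theta + sigma * rng.standard_normal(n)

lam = 2.0 * sigma * np.sqrt(np.log(p) / n)   # tuning parameter of the usual order (heuristic)
theta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

tau = 2.0 * lam                              # hypothetical threshold level
theta_thr = np.where(np.abs(theta_hat) > tau, theta_hat, 0.0)

print(f"mutual coherence: {coherence:.3f}")
print("signs recovered:", np.array_equal(np.sign(theta_thr), np.sign(theta)))
```

On a random design of this size the coherence is far from negligible, so the sketch illustrates the thresholding procedure rather than verifying the paper's coherence assumption.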

Article information

Source
Electron. J. Statist., Volume 2 (2008), 90-102.

Dates
First available in Project Euclid: 12 February 2008

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1202844625

Digital Object Identifier
doi:10.1214/08-EJS177

Mathematical Reviews number (MathSciNet)
MR2386087

Zentralblatt MATH identifier
1306.62155

Subjects
Primary: 62J05: Linear regression
Secondary: 62F12: Asymptotic properties of estimators

Keywords
Linear model; Lasso; Dantzig; Sparsity; Model selection; Sign consistency

Citation

Lounici, Karim. Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators. Electron. J. Statist. 2 (2008), 90--102. doi:10.1214/08-EJS177. https://projecteuclid.org/euclid.ejs/1202844625


References

  • P.J. Bickel, Y. Ritov and A.B. Tsybakov (2007). Simultaneous analysis of Lasso and Dantzig selector. Submitted to Ann. Statist. Available at http://www.proba.jussieu.fr/pageperso/tsybakov/.
  • F. Bunea (2007). Consistent selection via the Lasso for high dimensional approximating regression models. IMS Lecture Notes-Monograph Series, to appear.
  • F. Bunea, A.B. Tsybakov and M.H. Wegkamp (2007). Sparsity oracle inequalities for the Lasso. Electronic Journal of Statistics 1, 169-194.
  • F. Bunea, A.B. Tsybakov and M.H. Wegkamp (2007). Aggregation for Gaussian regression. Ann. Statist. 35(4), 1674-1697.
  • S.S. Chen, D.L. Donoho and M.A. Saunders (1999). Atomic decomposition by basis pursuit. SIAM Journal on Scientific Computing 20, 33-61.
  • E. Candes and T. Tao (2007). The Dantzig selector: statistical estimation when p is much larger than n. Ann. Statist., to appear.
  • D.L. Donoho, M. Elad and V. Temlyakov (2006). Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Trans. on Information Theory 52, 6-18.
  • B. Efron, T. Hastie, I. Johnstone and R. Tibshirani (2004). Least angle regression. Ann. Statist. 32, 402-451.
  • E. Greenshtein and Y. Ritov (2004). Persistence in high-dimensional linear predictor selection and the virtue of overparametrization. Bernoulli 10(6), 971-988.
  • K. Knight and W.J. Fu (2000). Asymptotics for lasso-type estimators. Ann. Statist. 28, 1356-1378.
  • V. Koltchinskii (2006). Sparsity in penalized empirical risk minimization. Manuscript.
  • V. Koltchinskii (2007). Dantzig selector and sparsity oracle inequalities. Manuscript.
  • N. Meinshausen and P. Bühlmann (2006). High dimensional graphs and variable selection with the Lasso. Ann. Statist. 34, 1436-1462.
  • N. Meinshausen and B. Yu (2006). Lasso-type recovery of sparse representations for high-dimensional data. Ann. Statist., to appear.
  • A. Nemirovski (2000). Topics in nonparametric statistics. In Lectures on Probability Theory and Statistics (Saint-Flour, 1998), Lecture Notes in Math., vol. 1738, Springer, Berlin, 85-277.
  • M.R. Osborne, B. Presnell and B.A. Turlach (2000). On the Lasso and its dual. Journal of Computational and Graphical Statistics 9, 319-337.
  • R. Tibshirani (1996). Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society, Series B 58, 267-288.
  • S.A. van de Geer (2007). High dimensional generalized linear models and the Lasso. Ann. Statist., to appear.
  • S.A. van de Geer (2007). The deterministic Lasso. Technical Report 140, Seminar für Statistik, ETH Zürich.
  • M.J. Wainwright (2006). Sharp thresholds for noisy and high-dimensional recovery of sparsity using l1-constrained quadratic programming. Technical Report 709, Department of Statistics, UC Berkeley.
  • C.H. Zhang and J. Huang (2007). The sparsity and bias of the Lasso selection in high-dimensional linear regression. Ann. Statist., to appear.
  • P. Zhao and B. Yu (2007). On model selection consistency of Lasso. Journal of Machine Learning Research 7, 2541-2567.
  • H. Zou (2006). The adaptive Lasso and its oracle properties. Journal of the American Statistical Association 101(476), 1418-1429.