The Annals of Statistics

Rates of convergence of the Adaptive LASSO estimators to the Oracle distribution and higher order refinements by the bootstrap

A. Chatterjee and S. N. Lahiri

Abstract

Zou [J. Amer. Statist. Assoc. 101 (2006) 1418–1429] proposed the Adaptive LASSO (ALASSO) method for simultaneous variable selection and estimation of the regression parameters, and established its oracle property. In this paper, we investigate the rate of convergence of the ALASSO estimator to the oracle distribution when the dimension of the regression parameters may grow to infinity with the sample size. It is shown that the rate critically depends on the choices of the penalty parameter and the initial estimator, among other factors, and that confidence intervals (CIs) based on the oracle limit law often have poor coverage accuracy. As an alternative, we consider the residual bootstrap method for the ALASSO estimators that has been recently shown to be consistent; cf. Chatterjee and Lahiri [J. Amer. Statist. Assoc. 106 (2011a) 608–625]. We show that the bootstrap applied to a suitable studentized version of the ALASSO estimator achieves second-order correctness, even when the dimension of the regression parameters is unbounded. Results from a moderately large simulation study show marked improvement in coverage accuracy for the bootstrap CIs over the oracle based CIs.
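
For reference, the ALASSO criterion of Zou (2006) is, up to notational differences with the paper,

$$
\hat{\beta}_n \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^{p}} \Big\{ \|y - X\beta\|_2^2 \;+\; \lambda_n \sum_{j=1}^{p} \frac{|\beta_j|}{|\tilde{\beta}_j|^{\gamma}} \Big\},
$$

where $\tilde{\beta}$ is an initial estimator (e.g., least squares), $\lambda_n > 0$ is the penalty parameter, and $\gamma > 0$; as the abstract notes, the rate of convergence to the oracle distribution depends critically on the choices of $\lambda_n$ and $\tilde{\beta}$.

The residual bootstrap referred to in the abstract resamples centered ALASSO residuals and recomputes the estimator on each resampled data set. The sketch below illustrates the basic (non-studentized) version in Python; the function names, the OLS initial estimator, the percentile-type interval, and the scikit-learn penalty scaling are illustrative choices of ours, not the authors' studentized construction (which is what attains second-order correctness).

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def alasso(X, y, lam, gamma=1.0):
    """Adaptive LASSO via the reweighting trick: with weights
    w_j = |beta_init_j|**gamma, substituting beta_j = w_j * theta_j
    turns the ALASSO criterion into an ordinary LASSO in theta."""
    n, p = X.shape
    # OLS initial estimator (requires n > p; a shrinkage initial
    # estimator would be used when p is comparable to n).
    beta_init = LinearRegression(fit_intercept=False).fit(X, y).coef_
    w = np.abs(beta_init) ** gamma      # adaptive weights; a zero weight drops a column
    # sklearn's Lasso minimizes (1/(2n))||y - Xb||^2 + alpha*||b||_1,
    # so alpha = lam/(2n) matches the unscaled criterion above.
    fit = Lasso(alpha=lam / (2 * n), fit_intercept=False).fit(X * w, y)
    return fit.coef_ * w                # map back to the original scale

def residual_bootstrap_ci(X, y, lam, j, B=999, level=0.95, seed=0):
    """Basic bootstrap CI for beta_j from centered ALASSO residuals."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    beta_hat = alasso(X, y, lam)
    resid = y - X @ beta_hat
    resid -= resid.mean()               # center residuals before resampling
    boot = np.empty(B)
    for b in range(B):
        y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
        boot[b] = alasso(X, y_star, lam)[j]
    a = (1 - level) / 2
    lo, hi = np.quantile(boot - beta_hat[j], [a, 1 - a])
    return beta_hat[j] - hi, beta_hat[j] - lo
```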

Article information

Source
Ann. Statist., Volume 41, Number 3 (2013), 1232–1259.

Dates
First available in Project Euclid: 13 June 2013

Permanent link to this document
https://projecteuclid.org/euclid.aos/1371150899

Digital Object Identifier
doi:10.1214/13-AOS1106

Mathematical Reviews number (MathSciNet)
MR3113809

Zentralblatt MATH identifier
1293.62153

Subjects
Primary: 62J07: Ridge regression; shrinkage estimators
Secondary: 62G09: Resampling methods; 62E20: Asymptotic distribution theory

Keywords
Bootstrap; Edgeworth expansion; penalized regression

Citation

Chatterjee, A.; Lahiri, S. N. Rates of convergence of the Adaptive LASSO estimators to the Oracle distribution and higher order refinements by the bootstrap. Ann. Statist. 41 (2013), no. 3, 1232–1259. doi:10.1214/13-AOS1106. https://projecteuclid.org/euclid.aos/1371150899


References

  • Bach, F. (2009). Model-consistent sparse estimation through the bootstrap. Preprint. Available at http://arxiv.org/abs/0901.3202.
  • Berk, R. A., Brown, L. D., Buja, A., Zhang, K. and Zhao, L. (2013). Valid post-selection inference. Ann. Statist. 41 802–837.
  • Bhattacharya, R. N. and Ghosh, J. K. (1978). On the validity of the formal Edgeworth expansion. Ann. Statist. 6 434–451.
  • Bickel, P. J., Ritov, Y. and Tsybakov, A. B. (2009). Simultaneous analysis of lasso and Dantzig selector. Ann. Statist. 37 1705–1732.
  • Bunea, F., Tsybakov, A. and Wegkamp, M. (2007). Sparsity oracle inequalities for the Lasso. Electron. J. Stat. 1 169–194.
  • Candès, E. and Tao, T. (2007). The Dantzig selector: Statistical estimation when $p$ is much larger than $n$. Ann. Statist. 35 2313–2351.
  • Chatterjee, A. and Lahiri, S. N. (2010). Asymptotic properties of the residual bootstrap for Lasso estimators. Proc. Amer. Math. Soc. 138 4497–4509.
  • Chatterjee, A. and Lahiri, S. N. (2011a). Bootstrapping lasso estimators. J. Amer. Statist. Assoc. 106 608–625.
  • Chatterjee, A. and Lahiri, S. N. (2011b). Strong consistency of Lasso estimators. Sankhyā A 73 55–78.
  • Chatterjee, A. and Lahiri, S. N. (2013). Supplement to “Rates of convergence of the adaptive LASSO estimators to the Oracle distribution and higher order refinements by the bootstrap.” DOI:10.1214/13-AOS1106SUPP.
  • Efron, B. (1979). Bootstrap methods: Another look at the jackknife. Ann. Statist. 7 1–26.
  • Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
  • Freedman, D. A. (1981). Bootstrapping regression models. Ann. Statist. 9 1218–1228.
  • Götze, F. (1987). Approximations for multivariate $U$-statistics. J. Multivariate Anal. 22 212–229.
  • Gupta, S. (2012). A note on the asymptotic distribution of LASSO estimator for correlated data. Sankhyā A 74 10–28.
  • Hall, P. (1992). The Bootstrap and Edgeworth Expansion. Springer, New York.
  • Hall, P. and Miller, H. (2009). Using generalized correlation to effect variable selection in very high dimensional problems. J. Comput. Graph. Statist. 18 533–550.
  • Huang, J., Horowitz, J. L. and Ma, S. (2008). Asymptotic properties of bridge estimators in sparse high-dimensional regression models. Ann. Statist. 36 587–613.
  • Huang, J., Ma, S. and Zhang, C.-H. (2008). Adaptive Lasso for sparse high-dimensional regression models. Statist. Sinica 18 1603–1618.
  • Knight, K. and Fu, W. (2000). Asymptotics for lasso-type estimators. Ann. Statist. 28 1356–1378.
  • Lahiri, S. N. (1994). On two-term Edgeworth expansions and bootstrap approximations for Studentized multivariate $M$-estimators. Sankhyā A 56 201–226.
  • Meinshausen, N. and Bühlmann, P. (2006). High-dimensional graphs and variable selection with the lasso. Ann. Statist. 34 1436–1462.
  • Meinshausen, N. and Yu, B. (2009). Lasso-type recovery of sparse representations for high-dimensional data. Ann. Statist. 37 246–270.
  • Minnier, J., Tian, L. and Cai, T. (2011). A perturbation method for inference on regularized regression estimates. J. Amer. Statist. Assoc. 106 1371–1382.
  • Pötscher, B. M. and Schneider, U. (2009). On the distribution of the adaptive LASSO estimator. J. Statist. Plann. Inference 139 2775–2790.
  • Segal, M., Dahlquist, K. and Conklin, B. (2003). Regression approaches for microarray data analysis. J. Comput. Biol. 10 961–980.
  • Stamey, T. A., Kabalin, J. N., McNeal, J. E., Johnstone, I. M., Freiha, F., Redwine, E. A. and Yang, N. (1989). Prostate specific antigen in the diagnosis and treatment of adenocarcinoma of the prostate. II. Radical prostatectomy treated patients. J. Urol. 141 1076–1083.
  • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267–288.
  • Wainwright, M. J. (2006). Sharp thresholds for high-dimensional and noisy recovery of sparsity. Technical report, Dept. of Statistics, Univ. California, Berkeley. Available at http://arxiv.org/abs/math/0605740.
  • Yuan, M. and Lin, Y. (2007). Model selection and estimation in the Gaussian graphical model. Biometrika 94 19–35.
  • Zhang, C.-H. and Huang, J. (2008). The sparsity and bias of the LASSO selection in high-dimensional linear regression. Ann. Statist. 36 1567–1594.
  • Zhao, P. and Yu, B. (2006). On model selection consistency of Lasso. J. Mach. Learn. Res. 7 2541–2563.
  • Zou, H. (2006). The adaptive lasso and its oracle properties. J. Amer. Statist. Assoc. 101 1418–1429.

Supplemental materials

  • Supplementary material: Supplement to “Rates of convergence of the Adaptive LASSO estimators to the Oracle distribution and higher order refinements by the bootstrap”. Detailed proofs of all results.