The Annals of Statistics

Bootstrap tuning in Gaussian ordered model selection

Vladimir Spokoiny and Niklas Willrich



The paper focuses on the problem of model selection in linear Gaussian regression with unknown, possibly inhomogeneous noise. For a given family of linear estimators $\{\widetilde{\boldsymbol{{\theta}}}_{m},m\in\mathscr{M}\}$, ordered by their variance, we offer a new “smallest accepted” approach motivated by Lepski’s device and the multiple testing idea. The procedure selects the smallest model that satisfies the acceptance rule based on comparison with all larger models. The method is completely data-driven and does not use any prior information about the variance structure of the noise: its parameters are adjusted to the underlying, possibly heterogeneous noise by the so-called “propagation condition” using a wild bootstrap method. The validity of the bootstrap calibration is proved for finite samples with an explicit error bound. We provide a comprehensive theoretical study of the method, describe in detail the set of possible values of the selected model $\widehat{m}\in\mathscr{M}$ and establish oracle error bounds for the corresponding estimator $\widehat{\boldsymbol{{\theta}}}=\widetilde{\boldsymbol{{\theta}}}_{\widehat{m}}$.
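The selection rule described in the abstract can be sketched in a few lines. The following is a toy illustration only, not the authors' exact procedure: the sequence-space model, the grid of projection estimators, the difference-based noise proxy, and the norm-based acceptance statistic are all simplifying assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sequence-space model Y = theta + noise with inhomogeneous noise levels.
n = 200
theta = 1.0 / (1.0 + np.arange(n)) ** 1.5      # smooth signal coefficients
sigma = 0.5 + 0.5 * rng.random(n)              # unknown, varying noise levels
Y = theta + sigma * rng.standard_normal(n)

# Ordered family of linear (projection) estimators: keep the first m coefficients.
models = [5, 10, 20, 40, 80, 160]

def estimate(y, m):
    out = np.zeros_like(y)
    out[:m] = y[:m]
    return out

# Crude noise proxy from first differences (valid because the signal is smooth).
resid = np.empty(n)
resid[1:] = (Y[1:] - Y[:-1]) / np.sqrt(2.0)
resid[0] = resid[1]

# Wild-bootstrap critical values z(m, m'): multiply the noise proxy by
# independent Gaussian weights and take a high quantile of the norm of the
# difference between the two projection estimators under pure noise.
B, alpha = 500, 0.05

def critical_value(m, m2):
    stats = np.empty(B)
    for b in range(B):
        yb = resid * rng.standard_normal(n)    # wild-bootstrap sample
        stats[b] = np.linalg.norm(estimate(yb, m2) - estimate(yb, m))
    return np.quantile(stats, 1.0 - alpha)

# "Smallest accepted" rule: accept m if its estimator is close to the
# estimator of every larger model; select the smallest accepted m.
def select(y):
    for i, m in enumerate(models):
        if all(
            np.linalg.norm(estimate(y, m2) - estimate(y, m))
            <= critical_value(m, m2)
            for m2 in models[i + 1:]
        ):
            return m
    return models[-1]

m_hat = select(Y)
theta_hat = estimate(Y, m_hat)
print("selected model:", m_hat)
```

Since every smaller model is compared against all larger ones, a small model is chosen only when the extra coefficients of the larger models look like pure noise, which is exactly the propagation idea behind the calibration.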

Article information

Ann. Statist., Volume 47, Number 3 (2019), 1351-1380.

Received: July 2015
Revised: April 2018
First available in Project Euclid: 13 February 2019


Primary: 62G05: Estimation
Secondary: 62G09: Resampling methods; 62J15: Paired and multiple comparisons

Keywords: smallest accepted; oracle; propagation condition


Spokoiny, Vladimir; Willrich, Niklas. Bootstrap tuning in Gaussian ordered model selection. Ann. Statist. 47 (2019), no. 3, 1351--1380. doi:10.1214/18-AOS1717.



References

  • Arlot, S. (2009). Model selection by resampling penalization. Electron. J. Stat. 3 557–624.
  • Baraud, Y., Huet, S. and Laurent, B. (2003). Adaptive tests of linear hypotheses by model selection. Ann. Statist. 31 225–251.
  • Barron, A., Birgé, L. and Massart, P. (1999). Risk bounds for model selection via penalization. Probab. Theory Related Fields 113 301–413.
  • Beran, R. (1986). Discussion: Jackknife, bootstrap and other resampling methods in regression analysis. Ann. Statist. 14 1295–1298.
  • Birgé, L. (2001). An alternative point of view on Lepski’s method. In State of the Art in Probability and Statistics (Leiden, 1999). Institute of Mathematical Statistics Lecture Notes—Monograph Series 36 113–133. IMS, Beachwood, OH.
  • Birgé, L. and Massart, P. (2007). Minimal penalties for Gaussian model selection. Probab. Theory Related Fields 138 33–73.
  • Cavalier, L. and Golubev, Y. (2006). Risk hull method and regularization by projections of ill-posed inverse problems. Ann. Statist. 34 1653–1677.
  • Chernozhukov, V., Chetverikov, D. and Kato, K. (2014). Anti-concentration and honest, adaptive confidence bands. Ann. Statist. 42 1787–1818.
  • Dalalyan, A. S. and Salmon, J. (2012). Sharp oracle inequalities for aggregation of affine estimators. Ann. Statist. 40 2327–2355.
  • Gach, F., Nickl, R. and Spokoiny, V. (2013). Spatially adaptive density estimation by localised Haar projections. Ann. Inst. Henri Poincaré Probab. Stat. 49 900–914.
  • Giné, E. and Nickl, R. (2010). Confidence bands in density estimation. Ann. Statist. 38 1122–1170.
  • Goeman, J. J. and Solari, A. (2010). The sequential rejection principle of familywise error control. Ann. Statist. 38 3782–3810.
  • Goldenshluger, A. (2009). A universal procedure for aggregating estimators. Ann. Statist. 37 542–568.
  • Härdle, W. and Mammen, E. (1993). Comparing nonparametric versus parametric regression fits. Ann. Statist. 21 1926–1947.
  • Ibragimov, I. A. and Has’minskiĭ, R. Z. (1981). Statistical Estimation: Asymptotic Theory. Applications of Mathematics 16. Springer, New York.
  • Kneip, A. (1994). Ordered linear smoothers. Ann. Statist. 22 835–866.
  • Lepski, O. V., Mammen, E. and Spokoiny, V. G. (1997). Optimal spatial adaptation to inhomogeneous smoothness: An approach based on kernel estimates with variable bandwidth selectors. Ann. Statist. 25 929–947.
  • Lepski, O. V. and Spokoiny, V. G. (1997). Optimal pointwise adaptive methods in nonparametric estimation. Ann. Statist. 25 2512–2546.
  • Lepskiĭ, O. V. (1990). A problem of adaptive estimation in Gaussian white noise. Teor. Veroyatn. Primen. 35 459–470.
  • Lepskiĭ, O. V. (1991). Asymptotically minimax adaptive estimation. I. Upper bounds. Optimally adaptive estimates. Teor. Veroyatn. Primen. 36 645–659.
  • Lepskiĭ, O. V. (1992). Asymptotically minimax adaptive estimation. II. Schemes without optimal adaptation. Adaptive estimates. Teor. Veroyatn. Primen. 37 468–481.
  • Mammen, E. (1993). Bootstrap and wild bootstrap for high-dimensional linear models. Ann. Statist. 21 255–285.
  • Marcus, R., Peritz, E. and Gabriel, K. R. (1976). On closed testing procedures with special reference to ordered analysis of variance. Biometrika 63 655–660.
  • Massart, P. (2007). Concentration Inequalities and Model Selection. Lecture Notes in Math. 1896. Springer, Berlin.
  • Pinsker, M. S. (1980). Optimal filtration of square-integrable signals in Gaussian noise. Probl. Inf. Transm. 16 52–68.
  • Romano, J. P. and Wolf, M. (2005). Stepwise multiple testing as formalized data snooping. Econometrica 73 1237–1282.
  • Spokoiny, V. G. (1996). Adaptive hypothesis testing using wavelets. Ann. Statist. 24 2477–2498.
  • Spokoiny, V. (2012). Parametric estimation. Finite sample theory. Ann. Statist. 40 2877–2909.
  • Spokoiny, V. and Vial, C. (2009). Parameter tuning in pointwise adaptation using a propagation approach. Ann. Statist. 37 2783–2807.
  • Spokoiny, V., Wang, W. and Härdle, W. K. (2013). Local quantile regression. J. Statist. Plann. Inference 143 1109–1129.
  • Spokoiny, V. and Willrich, N. (2019). Supplement to “Bootstrap tuning in Gaussian ordered model selection.” DOI:10.1214/18-AOS1717SUPP.
  • Spokoiny, V. and Zhilova, M. (2015). Bootstrap confidence sets under model misspecification. Ann. Statist. 43 2653–2675.
  • Wu, C.-F. J. (1986). Jackknife, bootstrap and other resampling methods in regression analysis. Ann. Statist. 14 1261–1350.

Supplemental materials

  • Some auxiliary results. The supplement collects some useful technical facts and extensions.