Open Access
On Smoothing and the Bootstrap
Peter Hall, Thomas J. DiCiccio, Joseph P. Romano
Ann. Statist. 17(2): 692-704 (June, 1989). DOI: 10.1214/aos/1176347135

Abstract

Recent attention has focussed on possible improvements in performance of estimators which might flow from using the smoothed bootstrap. We point out that in a great many problems, such as those involving functions of vector means, any such improvements will be only second-order effects. However, we argue that substantial and significant improvements can occur in problems where local properties of underlying distributions play a decisive role. This situation often occurs in estimating the variance of an estimator defined in an $L^1$ setting; we illustrate in the special case of the variance of a quantile estimator. There we show that smoothing appropriately can improve estimator convergence rate from $n^{-1/4}$ for the unsmoothed bootstrap to $n^{-(1/2) + \varepsilon}$, for arbitrary $\varepsilon > 0$. We provide a concise description of the smoothing parameter which optimizes the convergence rate.
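To make the contrast concrete, here is a minimal sketch (not the paper's construction) of the two variance estimators the abstract compares, applied to the sample median of a normal sample: the ordinary bootstrap resamples the data directly, while the smoothed bootstrap draws from a Gaussian-kernel-smoothed empirical distribution by adding N(0, h^2) noise to each resampled point. The function name bootstrap_quantile_variance and the bandwidth h = n^(-1/4) are illustrative assumptions, not the rate-optimizing choice derived in the paper.

```python
import numpy as np

def bootstrap_quantile_variance(x, q=0.5, n_boot=2000, h=0.0, rng=None):
    """Bootstrap estimate of the variance of the sample q-quantile.

    h = 0 gives the ordinary (unsmoothed) bootstrap; h > 0 resamples from a
    Gaussian-kernel-smoothed empirical distribution, i.e. adds N(0, h^2)
    noise to every resampled observation.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=n, replace=True)   # resample with replacement
        if h > 0.0:
            xb = xb + h * rng.standard_normal(n)   # kernel smoothing step
        stats[b] = np.quantile(xb, q)
    return stats.var(ddof=1)

rng = np.random.default_rng(0)
n = 200
x = rng.standard_normal(n)

# Asymptotic variance of the sample median of N(0,1) data:
# 1 / (4 n f(0)^2) with f(0) = 1/sqrt(2*pi), i.e. pi / (2 n).
asymptotic = np.pi / (2 * n)

v_plain = bootstrap_quantile_variance(x, rng=rng)                   # unsmoothed bootstrap
v_smooth = bootstrap_quantile_variance(x, h=n ** (-0.25), rng=rng)  # illustrative bandwidth only

print(f"asymptotic variance : {asymptotic:.5f}")
print(f"unsmoothed bootstrap: {v_plain:.5f}")
print(f"smoothed bootstrap  : {v_smooth:.5f}")
```

The only difference between the two estimators is the smoothing step; choosing h well is exactly the bandwidth question the paper addresses.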

Citation

Peter Hall, Thomas J. DiCiccio, Joseph P. Romano. "On Smoothing and the Bootstrap." Ann. Statist. 17 (2): 692-704, June, 1989. https://doi.org/10.1214/aos/1176347135

Information

Published: June, 1989
First available in Project Euclid: 12 April 2007

zbMATH: 0672.62051
MathSciNet: MR994260
Digital Object Identifier: 10.1214/aos/1176347135

Subjects:
Primary: 62G05
Secondary: 62G30

Keywords: $L^1$ regression, bandwidth, bootstrap, kernel, mean squared error, nonparametric density estimation, quantile, smoothing, variance estimation

Rights: Copyright © 1989 Institute of Mathematical Statistics
