
Matched-block bootstrap for dependent data

Edward Carlstein, Kim-Anh Do, Peter Hall, Tim Hesterberg, and Hans R. Künsch


Abstract

The block bootstrap for time series consists of randomly resampling blocks of consecutive values of the given data and aligning these blocks into a bootstrap sample. Here we suggest improving the performance of this method by aligning with higher likelihood those blocks which match at their ends. This is achieved by resampling the blocks according to a Markov chain whose transitions depend on the data. The matching algorithms that we propose take some of the dependence structure of the data into account. They are based on a kernel estimate of the conditional lag-one distribution or on a fitted autoregression of small order. Numerical and theoretical analyses in the case of estimating the variance of the sample mean show that matching reduces bias and, perhaps unexpectedly, has relatively little effect on variance. Our theory extends to the case of smooth functions of a vector mean.
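
To make the kernel-matched variant described above concrete, the following is a minimal Python sketch, not the authors' implementation. It assumes a Gaussian kernel, a rule-of-thumb bandwidth, and uniform selection of the first block; the function name matched_block_bootstrap and its defaults are ours, chosen only for illustration.

    import numpy as np

    def matched_block_bootstrap(x, block_len, boot_len=None, h=None, seed=None):
        """Generate one matched-block bootstrap series (illustrative sketch).

        Blocks of length block_len are chained together: the next block is
        drawn with probability proportional to a Gaussian kernel weight
        comparing the observation just before that block's start with the
        last value of the current block, mimicking a kernel estimate of the
        conditional lag-one distribution.
        """
        rng = np.random.default_rng(seed)
        x = np.asarray(x, dtype=float)
        n = len(x)
        boot_len = n if boot_len is None else boot_len
        if h is None:
            h = 1.06 * x.std(ddof=1) * n ** (-0.2)  # rule-of-thumb bandwidth (an assumption)

        starts = np.arange(1, n - block_len + 1)    # start at 1 so x[start - 1] exists
        s = rng.choice(starts)                      # first block drawn uniformly
        out = list(x[s:s + block_len])
        while len(out) < boot_len:
            # Favour blocks whose preceding observation is close to the current end value.
            w = np.exp(-0.5 * ((x[starts - 1] - out[-1]) / h) ** 2)
            s = rng.choice(starts, p=w / w.sum())
            out.extend(x[s:s + block_len])
        return np.asarray(out[:boot_len])

    # Example: bootstrap estimate of the variance of the sample mean for an AR(1) series.
    rng = np.random.default_rng(0)
    x = np.empty(200)
    x[0] = 0.0
    for t in range(1, 200):
        x[t] = 0.6 * x[t - 1] + rng.standard_normal()
    means = [matched_block_bootstrap(x, block_len=10, seed=b).mean() for b in range(200)]
    var_hat = len(x) * np.var(means, ddof=1)        # estimate of n * Var(sample mean)

Repeating the resampling and taking the sample variance of the bootstrap means yields the kind of variance estimate studied in the paper; the choices of block length and bandwidth govern the bias-variance trade-off and are not addressed by this sketch.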

Article information

Source
Bernoulli, Volume 4, Number 3 (1998), 305-328.

Dates
First available in Project Euclid: 19 March 2007

Permanent link to this document
https://projecteuclid.org/euclid.bj/1174324983

Mathematical Reviews number (MathSciNet)
MR1653268

Zentralblatt MATH identifier
0920.62106

Keywords
blocking methods; bootstrap; kernel methods; resampling; time series; variance estimation

Citation

Carlstein, Edward; Do, Kim-Anh; Hall, Peter; Hesterberg, Tim; Künsch, Hans R. Matched-block bootstrap for dependent data. Bernoulli 4 (1998), no. 3, 305--328. https://projecteuclid.org/euclid.bj/1174324983


References

  • [1] Bühlmann, P. (1994) Blockwise bootstrapped empirical processes for stationary sequences. Ann. Statist., 22, 995-1012.
  • [2] Bühlmann, P. (1995) The blockwise bootstrap for general empirical processes of stationary sequences. Stoch. Processes Applic., 58, 247-265.
  • [3] Bühlmann, P. and Künsch, H.R. (1994) Block length selection in the bootstrap for time series. Research Report 72, Seminar für Statistik, Eidgenössische Technische Hochschule Zurich.
  • [4] Bühlmann, P. and Künsch, H.R. (1995) The blockwise bootstrap for general parameters of a stationary time series. Scand. J. Statist., 22, 35-54.
  • [5] Carlstein, E. (1986) The use of subseries values for estimating the variance of a general statistic from a stationary sequence. Ann. Statist., 14, 1171-1179.
  • [6] Davison, A.C. and Hall, P. (1993) On Studentizing and blocking methods for implementing the bootstrap with dependent data. Aust. J. Statist., 35, 215-224.
  • [7] Efron, B. (1979) Bootstrap methods: another look at the jackknife (With discussion). Ann. Statist., 7, 1-26.
  • [8] Efron, B. and Tibshirani, R.J. (1993) An Introduction to the Bootstrap. London: Chapman & Hall.
  • [9] Götze, F. and Künsch, H.R. (1996) Second order correctness of the blockwise bootstrap for stationary observations. Ann. Statist., 24, 1914-1933.
  • [10] Hall, P. (1985) Resampling a coverage pattern. Stoch. Processes Applic., 20, 231-246.
  • [11] Hall, P. and Jing, B. (1996) On sample reuse methods for dependent data. J. Roy. Statist. Soc. Ser. B, 58, 727-737.
  • [12] Hall, P., Horowitz, J.L. and Jing, B. (1995) On blocking rules for the block bootstrap with dependent data. Biometrika, 82, 561-574.
  • [13] Künsch, H.R. (1989) The jackknife and the bootstrap for general stationary observations. Ann. Statist., 17, 1217-1241.
  • [14] Lahiri, S.N. (1991) Second-order optimality of stationary bootstrap. Statist. Probab. Lett., 11, 335-341.
  • [15] Lahiri, S.N. (1996) On Edgeworth expansion and moving block bootstrap for Studentized M-estimators in multiple linear regression models. J. Multivar. Anal., 56, 42-59.
  • [16] Liu, R. and Singh, K. (1992) Moving blocks jackknife and bootstrap capture weak dependence. In R. LePage and L. Billard (eds), Exploring the Limits of the Bootstrap, pp. 225-248. New York: Wiley.
  • [17] Naik-Nimbalkar, U.V. and Rajarshi, M.B. (1994) Validity of blockwise bootstrap for empirical processes with stationary observations. Ann. Statist., 22, 980-994.
  • [18] Politis, D.N. and Romano, J.P. (1992) A general resampling scheme for triangular arrays of α-mixing random variables with application to the problem of spectral density estimation. Ann. Statist., 20, 1985-2007.
  • [19] Politis, D.N. and Romano, J.P. (1994) The stationary bootstrap. J. Amer. Statist. Assoc., 89, 1303-1313.
  • [20] Politis, D.N. and Romano, J.P. (1995) Bias-corrected nonparametric spectral estimation. J. Time Series Anal., 16, 67-103.
  • [21] Radulovic, D. (1995) The bootstrap of empirical processes for α-mixing sequences. Preprint, University of Connecticut.
  • [22] Radulovic, D. (1996a) The bootstrap of the mean for strong mixing sequences under minimal conditions. Statist. Probab. Lett., 28, 65-72.
  • [23] Radulovic, D. (1996b) The bootstrap for empirical processes based on stationary observations. Stoch. Processes Applic., 65, 259-279.
  • [24] Shao, Q.M. and Yu, H. (1993) Bootstrapping the sample means for stationary mixing sequences. Stoch. Processes Applic., 48, 175-190.
  • [25] Silverman, B.W. (1986) Density Estimation for Statistics and Data Analysis. London: Chapman & Hall.
  • [26] Wood, A.T.A. and Chan, G. (1994) Simulation of stationary Gaussian processes. J. Comput. Graph. Statist., 3, 409-432.