Statistical Science

The Impact of Bootstrap Methods on Time Series Analysis

Dimitris N. Politis


Abstract

Sparked by Efron's seminal paper, the decade of the 1980s was a period of active research on bootstrap methods for independent data--mainly i.i.d. or regression set-ups. By contrast, in the 1990s much research was directed towards resampling dependent data, for example, time series and random fields. Consequently, the availability of valid nonparametric inference procedures based on resampling and/or subsampling has freed practitioners from the necessity of resorting to simplifying assumptions such as normality or linearity that may be misleading.
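As a concrete illustration of the kind of block-resampling scheme surveyed in the article, the sketch below (not taken from the paper) implements the moving-block bootstrap of Künsch (1989) and Liu and Singh (1992) for the sample mean of a stationary series. The function name, the AR(1) toy data and the fixed block length of 12 are illustrative assumptions; in practice the block length would be chosen by a data-driven rule such as the one in Politis and White (2001).

    import numpy as np

    def moving_block_bootstrap_mean(x, block_length, n_boot=1000, seed=None):
        """Moving-block bootstrap for the sample mean of a stationary series.

        Overlapping blocks of length `block_length` are resampled with
        replacement and glued together until a pseudo-series of the original
        length is obtained; the mean of each pseudo-series is recorded.
        """
        rng = np.random.default_rng(seed)
        x = np.asarray(x, dtype=float)
        n = len(x)
        starts = np.arange(n - block_length + 1)   # all overlapping block starts
        k = int(np.ceil(n / block_length))         # blocks needed per pseudo-series
        boot_means = np.empty(n_boot)
        for b in range(n_boot):
            chosen = rng.choice(starts, size=k, replace=True)
            pseudo = np.concatenate([x[s:s + block_length] for s in chosen])[:n]
            boot_means[b] = pseudo.mean()
        return boot_means

    # Toy example: AR(1) data with positive dependence; the block-bootstrap
    # standard error of the mean should exceed the naive i.i.d. formula.
    rng = np.random.default_rng(0)
    n = 500
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(1, n):
        x[t] = 0.6 * x[t - 1] + rng.standard_normal()

    boot = moving_block_bootstrap_mean(x, block_length=12, n_boot=2000, seed=1)
    print("moving-block bootstrap SE of the mean:", boot.std(ddof=1))
    print("naive i.i.d. SE:                      ", x.std(ddof=1) / np.sqrt(n))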

Article information

Source
Statist. Sci., Volume 18, Issue 2 (2003), 219--230.

Dates
First available in Project Euclid: 19 September 2003

Permanent link to this document
https://projecteuclid.org/euclid.ss/1063994977

Digital Object Identifier
doi:10.1214/ss/1063994977

Mathematical Reviews number (MathSciNet)
MR2026081

Zentralblatt MATH identifier
1332.62340

Keywords
Block bootstrap; confidence intervals; linear models; resampling; large sample inference; nonparametric estimation; subsampling

Citation

Politis, Dimitris N. The Impact of Bootstrap Methods on Time Series Analysis. Statist. Sci. 18 (2003), no. 2, 219--230. doi:10.1214/ss/1063994977. https://projecteuclid.org/euclid.ss/1063994977


References

  • Arcones, M. A. (2001). On the asymptotic accuracy of the bootstrap under arbitrary resampling size. Ann. Inst. Statist. Math. To appear.
  • Babu, G. J. and Singh, K. (1983). Inference on means using the bootstrap. Ann. Statist. 11 999--1003.
  • Bartlett, M. S. (1946). On the theoretical specification and sampling properties of autocorrelated time-series. Suppl. J. Roy. Statist. Soc. 8 27--41.
  • Bertail, P. and Politis, D. N. (2001). Extrapolation of subsampling distribution estimators: The i.i.d. and strong mixing cases. Canad. J. Statist. 29 667--680.
  • Bickel, P. and Freedman, D. A. (1981). Some asymptotic theory for the bootstrap. Ann. Statist. 9 1196--1217.
  • Bickel, P., Götze, F. and van Zwet, W. R. (1997). Resampling fewer than $n$ observations: Gains, losses, and remedies for losses. Statist. Sinica 7 1--32.
  • Bollerslev, T., Chou, R. and Kroner, K. (1992). ARCH modelling in finance: A review of the theory and empirical evidence. J. Econometrics 52 5--59.
  • Booth, J. G. and Hall, P. (1993). An improvement of the jackknife distribution function estimator. Ann. Statist. 21 1476--1485.
  • Bose, A. (1988). Edgeworth correction by bootstrap in autoregressions. Ann. Statist. 16 1709--1722.
  • Brockwell, P. and Davis, R. (1991). Time Series: Theory and Methods, 2nd ed. Springer, New York.
  • Bühlmann, P. (1997). Sieve bootstrap for time series. Bernoulli 3 123--148.
  • Bühlmann, P. (2002). Bootstraps for time series. Statist. Sci. 17 52--72.
  • Carlstein, E. (1986). The use of subseries values for estimating the variance of a general statistic from a stationary time series. Ann. Statist. 14 1171--1179.
  • Choi, E. and Hall, P. (2000). Bootstrap confidence regions computed from autoregressions of arbitrary order. J. R. Stat. Soc. Ser. B Stat. Methodol. 62 461--477.
  • Dahlhaus, R. (1997). Fitting time series models to nonstationary processes. Ann. Statist. 25 1--37.
  • Davison, A. C. and Hall, P. (1993). On Studentizing and blocking methods for implementing the bootstrap with dependent data. Austral. J. Statist. 35 215--224.
  • Efron, B. (1979). Bootstrap methods: Another look at the jackknife. Ann. Statist. 7 1--26.
  • Efron, B. and Tibshirani, R. J. (1986). Bootstrap methods for standard errors, confidence intervals, and other measures of statistical accuracy (with discussion). Statist. Sci. 1 54--77.
  • Efron, B. and Tibshirani, R. J. (1993). An Introduction to the Bootstrap. Chapman and Hall, New York.
  • Engle, R., ed. (1995). ARCH: Selected Readings. Oxford Univ. Press.
  • Franke, J., Kreiss, J.-P. and Mammen, E. (2002). Bootstrap of kernel smoothing in nonlinear time series. Bernoulli 8 1--37.
  • Freedman, D. A. (1981). Bootstrapping regression models. Ann. Statist. 9 1218--1228.
  • Freedman, D. A. (1984). On bootstrapping two-stage least-squares estimates in stationary linear models. Ann. Statist. 12 827--842.
  • Fuller, W. A. (1996). Introduction to Statistical Time Series, 2nd ed. Wiley, New York.
  • Giné, E. and Zinn, J. (1989). Necessary conditions for the bootstrap of the mean. Ann. Statist. 17 684--691.
  • Götze, F. and Künsch, H. (1996). Second-order correctness of the blockwise bootstrap for stationary observations. Ann. Statist. 24 1914--1933.
  • Granger, C. and Andersen, A. (1978). An Introduction to Bilinear Time Series Models. Vandenhoeck und Ruprecht, Göttingen.
  • Grenander, U. and Rosenblatt, M. (1957). Statistical Analysis of Stationary Time Series. Wiley, New York.
  • Hall, P. (1985). Resampling a coverage pattern. Stochastic Process. Appl. 20 231--246.
  • Hall, P. (1992). The Bootstrap and Edgeworth Expansion. Springer, New York.
  • Hall, P., DiCiccio, T. J. and Romano, J. P. (1989). On smoothing and the bootstrap. Ann. Statist. 17 692--704.
  • Hall, P., Horowitz, J. L. and Jing, B.-Y. (1995). On blocking rules for the bootstrap with dependent data. Biometrika 82 561--574.
  • Hamilton, J. D. (1994). Time Series Analysis. Princeton Univ. Press.
  • Härdle, W. and Bowman, A. (1988). Bootstrapping in nonparametric regression: Local adaptive smoothing and confidence bands. J. Amer. Statist. Assoc. 83 102--110.
  • Horowitz, J. L. (2003). Bootstrap methods for Markov processes. Econometrica 71 1049--1082.
  • Kreiss, J.-P. (1988). Asymptotic statistical inference for a class of stochastic processes. Habilitationsschrift, Faculty of Mathematics, Univ. Hamburg, Germany.
  • Kreiss, J.-P. (1992). Bootstrap procedures for AR($\infty$) processes. In Bootstrapping and Related Techniques (K. H. Jöckel, G. Rothe and W. Sendler, eds.) 107--113. Springer, Berlin.
  • Künsch, H. R. (1989). The jackknife and the bootstrap for general stationary observations. Ann. Statist. 17 1217--1241.
  • Lahiri, S. N. (1991). Second order optimality of stationary bootstrap. Statist. Probab. Lett. 11 335--341.
  • Lahiri, S. N. (1999). Theoretical comparisons of block bootstrap methods. Ann. Statist. 27 386--404.
  • Liu, R. Y. and Singh, K. (1992). Moving blocks jackknife and bootstrap capture weak dependence. In Exploring the Limits of Bootstrap (R. LePage and L. Billard, eds.) 225--248. Wiley, New York.
  • Masry, E. and Tjøstheim, D. (1995). Nonparametric estimation and identification of nonlinear ARCH time series. Econometric Theory 11 258--289.
  • Neumann, M. and Kreiss, J.-P. (1998). Regression-type inference in nonparametric autoregression. Ann. Statist. 26 1570--1613.
  • Paparoditis, E. (1992). Bootstrapping some statistics useful in identifying ARMA models. In Bootstrapping and Related Techniques (K. H. Jöckel, G. Rothe and W. Sendler, eds.) 115--119. Springer, Berlin.
  • Paparoditis, E. and Politis, D. N. (2000). The local bootstrap for kernel estimators under general dependence conditions. Ann. Inst. Statist. Math. 52 139--159.
  • Paparoditis, E. and Politis, D. N. (2001a). Tapered block bootstrap. Biometrika 88 1105--1119.
  • Paparoditis, E. and Politis, D. N. (2001b). A Markovian local resampling scheme for nonparametric estimators in time series analysis. Econometric Theory 17 540--566.
  • Paparoditis, E. and Politis, D. N. (2001c). The continuous-path block-bootstrap. In Asymptotics in Statistics and Probability (M. Puri, ed.) 305--320. VSP Publications, Zeist, The Netherlands.
  • Paparoditis, E. and Politis, D. N. (2002a). The local bootstrap for Markov processes. J. Statist. Plann. Inference 108 301--328.
  • Paparoditis, E. and Politis, D. N. (2002b). The tapered block bootstrap for general statistics from stationary sequences. Econom. J. 5 131--148.
  • Paparoditis, E. and Politis, D. N. (2002c). Local block bootstrap. C. R. Math. Acad. Sci. Paris 335 959--962.
  • Paparoditis, E. and Politis, D. N. (2003). Residual-based block bootstrap for unit root testing. Econometrica 71 813--855.
  • Politis, D. N. (2001a). Resampling time series with seasonal components. In Frontiers in Data Mining and Bioinformatics: Proceedings of the 33rd Symposium on the Interface of Computing Science and Statistics.
  • Politis, D. N. (2001b). Adaptive bandwidth choice. J. Nonparametr. Statist. To appear.
  • Politis, D. N. and Romano, J. P. (1992a). A general resampling scheme for triangular arrays of $\alpha$-mixing random variables with application to the problem of spectral density estimation. Ann. Statist. 20 1985--2007.
  • Politis, D. N. and Romano, J. P. (1992b). A circular block-resampling procedure for stationary data. In Exploring the Limits of Bootstrap (R. LePage and L. Billard, eds.) 263--270. Wiley, New York.
  • Politis, D. N. and Romano, J. P. (1992c). A general theory for large sample confidence regions based on subsamples under minimal assumptions. Technical Report 399, Dept. Statistics, Stanford Univ.
  • Politis, D. N. and Romano, J. P. (1993). Estimating the distribution of a Studentized statistic by subsampling. Bull. Internat. Statist. Inst. 2 315--316.
  • Politis, D. N. and Romano, J. P. (1994a). The stationary bootstrap. J. Amer. Statist. Assoc. 89 1303--1313.
  • Politis, D. N. and Romano, J. P. (1994b). Large sample confidence regions based on subsamples under minimal assumptions. Ann. Statist. 22 2031--2050.
  • Politis, D. N. and Romano, J. P. (1995). Bias-corrected nonparametric spectral estimation. J. Time Ser. Anal. 16 67--103.
  • Politis, D. N., Romano, J. P. and Wolf, M. (1999). Subsampling. Springer, New York.
  • Politis, D. N. and White, H. (2001). Automatic block-length selection for the dependent bootstrap. Econometric Rev. To appear.
  • Quenouille, M. (1949). Approximate tests of correlation in time-series. J. Roy. Statist. Soc. Ser. B 11 68--84.
  • Quenouille, M. (1956). Notes on bias in estimation. Biometrika 43 353--360.
  • Radulovic, D. (1996). The bootstrap of the mean for strong mixing sequences under minimal conditions. Statist. Probab. Lett. 28 65--72.
  • Rajarshi, M. B. (1990). Bootstrap in Markov sequences based on estimates of transition density. Ann. Inst. Statist. Math. 42 253--268.
  • Romano, J. P. and Thombs, L. (1996). Inference for autocorrelations under weak assumptions. J. Amer. Statist. Assoc. 91 590--600.
  • Sakov, A. and Bickel, P. (2000). An Edgeworth expansion for the $m$ out of $n$ bootstrapped median. Statist. Probab. Lett. 49 217--223.
  • Shao, J. and Wu, C.-F. J. (1989). A general theory for jackknife variance estimation. Ann. Statist. 17 1176--1197.
  • Sherman, M. and Carlstein, E. (1996). Replicate histograms. J. Amer. Statist. Assoc. 91 566--576.
  • Shi, S. G. (1991). Local bootstrap. Ann. Inst. Statist. Math. 43 667--676.
  • Shibata, R. (1976). Selection of the order of an autoregressive model by Akaike's information criterion. Biometrika 63 117--126.
  • Singh, K. (1981). On the asymptotic accuracy of Efron's bootstrap. Ann. Statist. 9 1187--1195.
  • Subba Rao, T. and Gabr, M. (1984). An Introduction to Bispectral Analysis and Bilinear Time Series Models. Lecture Notes in Statist. 24. Springer, New York.
  • Swanepoel, J. W. H. (1986). A note on proving that the (modified) bootstrap works. Comm. Statist. Theory Methods 15 3193--3203.
  • Swanepoel, J. W. H. and van Wyk, J. W. J. (1986). The bootstrap applied to power spectral density function estimation. Biometrika 73 135--141.
  • Tong, H. (1990). Non-linear Time Series: A Dynamical Systems Approach. Oxford Univ. Press.
  • Tukey, J. W. (1958). Bias and confidence in not-quite large samples (abstract). Ann. Math. Statist. 29 614.
  • Wu, C.-F. J. (1986). Jackknife, bootstrap and other resampling methods in regression analysis (with discussion). Ann. Statist. 14 1261--1350.