Annals of Statistics

Generalized bootstrap for estimating equations

Snigdhansu Chatterjee and Arup Bose

Full-text: Open access


We introduce a generalized bootstrap technique for estimators obtained by solving estimating equations. Some special cases of this generalized bootstrap are the classical bootstrap of Efron, the delete-d jackknife and variations of the Bayesian bootstrap. The use of the proposed technique is discussed in some examples. Distributional consistency of the method is established and an asymptotic representation of the resampling variance estimator is obtained.
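The weighted-equation idea behind the paper can be sketched numerically. The sketch below is illustrative only and uses my own toy example and variable names, not the paper's notation: for data $x_1,\dots,x_n$ and the estimating equation $\sum_i w_i(x_i - \theta) = 0$, redrawing random exchangeable weights $w_i$ gives a generalized bootstrap, with multinomial weights recovering Efron's classical bootstrap and Dirichlet$(1,\dots,1)$ weights giving a version of the Bayesian bootstrap.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: estimate the mean theta by solving the estimating
# equation  sum_i w_i * (x_i - theta) = 0  (solvable in closed form).
x = rng.normal(loc=2.0, scale=1.0, size=200)
n = len(x)
B = 2000  # number of resamples

def solve_weighted(w, x):
    """Root of sum_i w_i * (x_i - theta) = 0, i.e. the weighted mean."""
    return np.sum(w * x) / np.sum(w)

theta_hat = solve_weighted(np.ones(n), x)

# Generalized bootstrap: perturb the estimating equation with random
# exchangeable weights.  Two classical special cases:
#   Efron's bootstrap  -> multinomial(n, 1/n) resampling counts as weights
#   Bayesian bootstrap -> Dirichlet(1, ..., 1) weights
efron = np.array([solve_weighted(rng.multinomial(n, np.full(n, 1.0 / n)), x)
                  for _ in range(B)])
bayes = np.array([solve_weighted(rng.dirichlet(np.ones(n)), x)
                  for _ in range(B)])

# Resampling variance estimators for theta_hat (on the n * Var scale);
# both should be close to the sample variance of x for this toy model.
var_efron = n * np.var(efron)
var_bayes = n * np.var(bayes)
print(theta_hat, var_efron, var_bayes)
```

In this closed-form example both weight schemes give nearly identical variance estimates; the point of the general theory is that the same weighted-equation device, under moment conditions on the weights, yields consistent distributional approximations for estimators that have no closed form.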

Article information

Ann. Statist., Volume 33, Number 1 (2005), 414-436.

First available in Project Euclid: 8 April 2005

Subject classifications:
Primary: 62G09 (resampling methods); 62E20 (asymptotic distribution theory)
Secondary: 62G05 (estimation); 62F12 (asymptotic properties of estimators); 62F40 (bootstrap, jackknife and other resampling methods); 62M99 (none of the above, but in this section)

Keywords: Estimating equations; resampling; generalized bootstrap; jackknife; Bayesian bootstrap; wild bootstrap; paired bootstrap; M-estimation; nonlinear regression; generalized linear models; dimension asymptotics


Chatterjee, Snigdhansu; Bose, Arup. Generalized bootstrap for estimating equations. Ann. Statist. 33 (2005), no. 1, 414–436. doi:10.1214/009053604000000904.

References


  • Basawa, I. V., Godambe, V. P. and Taylor, R. L., eds. (1997). Selected Proceedings of the Symposium on Estimating Functions. IMS, Hayward, CA.
  • Bickel, P. J. and Freedman, D. A. (1983). Bootstrapping regression models with many parameters. In A Festschrift for Erich L. Lehmann (P. J. Bickel, K. A. Doksum and J. L. Hodges, Jr., eds.) 28–48. Wadsworth, Belmont, CA.
  • Borovskikh, Yu. V. and Korolyuk, V. S. (1997). Martingale Approximation. VSP, Utrecht.
  • Bose, A. and Chatterjee, S. (2002). Comparison of bootstrap and jackknife variance estimators in linear regression: Second order results. Statist. Sinica 12 575–598.
  • Bose, A. and Kushary, D. (1996). Jackknife and weighted jackknife estimation of the variance of $M$-estimators in linear regression. Technical Report 96-12, Dept. Statistics, Purdue Univ.
  • Chatterjee, S. (1999). Generalised bootstrap techniques. Ph.D. dissertation, Indian Statistical Institute, Calcutta.
  • Efron, B. (1979). Bootstrap methods: Another look at the jackknife. Ann. Statist. 7 1–26.
  • Ferguson, T. S. (1996). A Course in Large Sample Theory. Chapman and Hall, London.
  • Freedman, D. A. and Peters, S. C. (1984). Bootstrapping a regression equation: Some empirical results. J. Amer. Statist. Assoc. 79 97–106.
  • Godambe, V. P., ed. (1991). Estimating Functions. Clarendon, Oxford.
  • Hu, F. (2001). Efficiency and robustness of a resampling $M$-estimator in the linear model. J. Multivariate Anal. 78 252–271.
  • Hu, F. and Kalbfleisch, J. D. (2000). The estimating function bootstrap (with discussion). Canad. J. Statist. 28 449–499.
  • Huet, S., Bouvier, A., Gruet, M. and Jolivet, E. (1996). Statistical Tools for Nonlinear Regression. Springer, New York.
  • Lahiri, S. N. (1992). Bootstrapping $M$-estimators of a multiple linear regression parameter. Ann. Statist. 20 1548–1570.
  • Lele, S. (1991). Resampling using estimating functions. In Estimating Functions (V. P. Godambe, ed.) 295–304. Clarendon, Oxford.
  • Liu, R. Y. and Singh, K. (1992). Efficiency and robustness in resampling. Ann. Statist. 20 370–384.
  • Lo, A. Y. (1991). Bayesian bootstrap clones and a biometry function. Sankhyā Ser. A 53 320–333.
  • Mammen, E. (1989). Asymptotics with increasing dimension for robust regression with applications to the bootstrap. Ann. Statist. 17 382–400.
  • Mammen, E. (1992). When Does Bootstrap Work? Asymptotic Results and Simulations. Lecture Notes in Statist. 77. Springer, New York.
  • Mammen, E. (1993). Bootstrap and wild bootstrap for high-dimensional linear models. Ann. Statist. 21 255–285.
  • Myers, R. H., Montgomery, D. C. and Vining, G. G. (2002). Generalized Linear Models. Wiley, New York.
  • Newton, M. A. and Raftery, A. E. (1994). Approximate Bayesian inference with the weighted likelihood bootstrap (with discussion). J. Roy. Statist. Soc. Ser. B 56 3–48.
  • Ortega, J. M. and Rheinboldt, W. C. (1970). Iterative Solution of Nonlinear Equations in Several Variables. Academic Press, New York.
  • Præstgaard, J. and Wellner, J. A. (1993). Exchangeably weighted bootstraps of the general empirical process. Ann. Probab. 21 2053–2086.
  • Rao, C. R. and Zhao, L. C. (1992). Approximation to the distribution of $M$-estimates in linear models by randomly weighted bootstrap. Sankhyā Ser. A 54 323–331.
  • Rubin, D. B. (1981). The Bayesian bootstrap. Ann. Statist. 9 130–134.
  • Serfling, R. J. (1980). Approximation Theorems of Mathematical Statistics. Wiley, New York.
  • Wu, C.-F. J. (1986). Jackknife, bootstrap and other resampling methods in regression analysis (with discussion). Ann. Statist. 14 1261–1350.
  • Zheng, Z. and Tu, D. (1988). Random weighting method in regression models. Sci. Sinica Ser. A 31 1442–1459.