Open Access
Jackknife, Bootstrap and Other Resampling Methods in Regression Analysis
C. F. J. Wu
Ann. Statist. 14(4): 1261-1295 (December, 1986). DOI: 10.1214/aos/1176350142


Motivated by a representation for the least squares estimator, we propose a class of weighted jackknife variance estimators for the least squares estimator by deleting any fixed number of observations at a time. They are unbiased for homoscedastic errors and a special case, the delete-one jackknife, is almost unbiased for heteroscedastic errors. The method is extended to cover nonlinear parameters, regression $M$-estimators, nonlinear regression and generalized linear models. Interval estimators can be constructed from the jackknife histogram. Three bootstrap methods are considered. Two are shown to give biased variance estimators and one does not have the bias-robustness property enjoyed by the weighted delete-one jackknife. A general method for resampling residuals is proposed. It gives variance estimators that are bias-robust. Several bias-reducing estimators are proposed. Some simulation results are reported.
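As a concrete illustration of the estimators the abstract describes, the following is a minimal NumPy sketch of the delete-one jackknife variance estimator for the least squares estimator, with the leverage-based weighting `1 - h_i` used as one plausible form of the weighted variant, alongside a simple residual-resampling bootstrap. The simulated data, sample sizes, and weighting choice are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Simulated linear-model data (illustrative, not from the paper)
rng = np.random.default_rng(0)
n, p = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def ols(X, y):
    """Least squares estimator."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

beta_hat = ols(X, y)

# Leverages h_i = x_i' (X'X)^{-1} x_i
h = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)

# Delete-one jackknife: refit with each observation removed in turn
deltas = np.empty((n, p))
for i in range(n):
    keep = np.arange(n) != i
    deltas[i] = ols(X[keep], y[keep]) - beta_hat

# Weighted delete-one jackknife variance estimator
# (weighting each term by 1 - h_i; an assumed, simplified form)
v_jack = sum((1 - h[i]) * np.outer(deltas[i], deltas[i]) for i in range(n))

# Residual bootstrap: resample centered residuals onto fitted values
# (one of several resampling schemes the paper compares)
B = 200
resid = y - X @ beta_hat
boot = np.empty((B, p))
for b in range(B):
    y_star = X @ beta_hat + rng.choice(resid - resid.mean(), size=n, replace=True)
    boot[b] = ols(X, y_star)
v_boot = np.cov(boot, rowvar=False)
```

Both `v_jack` and `v_boot` are `p × p` variance estimates for the coefficient vector; the paper's comparison concerns their bias under homoscedastic versus heteroscedastic errors, which a sketch like this can be used to explore by simulation.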


Citation

C. F. J. Wu. "Jackknife, Bootstrap and Other Resampling Methods in Regression Analysis." Ann. Statist. 14(4): 1261-1295, December 1986.


Published: December, 1986
First available in Project Euclid: 12 April 2007

zbMATH: 0618.62072
MathSciNet: MR868303
Digital Object Identifier: 10.1214/aos/1176350142

Primary: 62J05
Secondary: 62G05 , 62J02

Keywords: $M$-regression, balanced residuals, bias reduction, bias-robustness, bootstrap, Fieller's interval, generalized linear models, jackknife percentile, linear regression, nonlinear regression, representation of the least squares estimator, variable jackknife, weighted jackknife

Rights: Copyright © 1986 Institute of Mathematical Statistics
