The Annals of Statistics

Bootstrap Methods: Another Look at the Jackknife

B. Efron

Full-text: Open access

Abstract

We discuss the following problem: given a random sample $\mathbf{X} = (X_1, X_2, \cdots, X_n)$ from an unknown probability distribution $F$, estimate the sampling distribution of some prespecified random variable $R(\mathbf{X}, F)$, on the basis of the observed data $\mathbf{x}$. (Standard jackknife theory gives an approximate mean and variance in the case $R(\mathbf{X}, F) = \theta(\hat{F}) - \theta(F)$, $\theta$ some parameter of interest.) A general method, called the "bootstrap," is introduced, and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.
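The bootstrap idea described in the abstract can be sketched in a few lines of code: draw repeated resamples of size $n$ with replacement from the observed data, recompute the statistic on each resample, and use the empirical variance of the replicates as an estimate of the sampling variance. The snippet below is an illustrative sketch of this resampling scheme (the function name `bootstrap_variance`, the number of replications, and the example data are not from the paper); it applies the method to the variance of the sample median, one of the paper's examples.

```python
import random
import statistics

def bootstrap_variance(sample, statistic, n_boot=2000, seed=0):
    """Bootstrap estimate of the sampling variance of `statistic`:
    resample the data with replacement, recompute the statistic on
    each resample, and return the variance of the replicates."""
    rng = random.Random(seed)
    n = len(sample)
    replicates = []
    for _ in range(n_boot):
        # Draw n observations with replacement from the observed data.
        resample = [sample[rng.randrange(n)] for _ in range(n)]
        replicates.append(statistic(resample))
    return statistics.pvariance(replicates)

# Hypothetical data for illustration only.
data = [1.2, 0.8, 2.5, 1.9, 0.4, 3.1, 2.2, 1.5, 0.9, 2.8]
var_median = bootstrap_variance(data, statistics.median)
```

Because the statistic is recomputed from scratch on each resample, the same routine works unchanged for medians, ratios, error rates, or regression coefficients, which is the generality the abstract emphasizes over the jackknife's linear approximation.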

Article information

Source
Ann. Statist. Volume 7, Number 1 (1979), 1-26.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
http://projecteuclid.org/euclid.aos/1176344552


Digital Object Identifier
doi:10.1214/aos/1176344552

Mathematical Reviews number (MathSciNet)
MR515681

Zentralblatt MATH identifier
0406.62024

Subjects
Primary: 62G05: Estimation
Secondary: 62G15: Tolerance and confidence regions; 62H30: Classification and discrimination; cluster analysis [See also 68T10, 91C20]; 62J05: Linear regression

Keywords
Jackknife, bootstrap, resampling, subsample values, nonparametric variance estimation, error rate estimation, discriminant analysis, nonlinear regression

Citation

Efron, B. Bootstrap Methods: Another Look at the Jackknife. Ann. Statist. 7 (1979), no. 1, 1--26. doi:10.1214/aos/1176344552. http://projecteuclid.org/euclid.aos/1176344552.
