The Annals of Statistics

Bootstrap and Cross-Validation Estimates of the Prediction Error for Linear Regression Models

Olaf Bunke and Bernd Droge


Abstract

Different estimates of the mean squared error of prediction for linear regression models are derived by the bootstrap and cross-validation approaches. These estimators are compared under normal error distributions, especially with respect to their biases and mean squared errors. The results indicate that the bias-corrected bootstrap estimator is best unbiased and should be the first choice, while its simulated variant has approximately the same behaviour. On the other hand, if only a comparison between uncorrected estimators is made (with implications for nonlinear regression models in mind), then other variants of bootstrap estimates are preferable for a large or a small dimension of the model parameter. For a small dimension, the cross-validation estimate, and sometimes grouped variants of it, also seem acceptable if the model error is known to be small.
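A minimal numerical sketch of the two approaches compared in the paper: the apparent (resubstitution) error of a least-squares fit, an optimism-corrected bootstrap estimate, and leave-one-out cross-validation. This is an illustrative simulation, not the paper's exact estimators; the sample size, dimension, and error variance below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear model y = X beta + eps with normal errors (illustrative sizes).
n, p, sigma = 50, 3, 1.0
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(scale=sigma, size=n)

def fit_predict(Xtr, ytr, Xte):
    """Least-squares fit on (Xtr, ytr), predictions at Xte."""
    coef, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
    return Xte @ coef

# Apparent error: mean squared residual of the fit on its own data
# (downward biased as an estimate of the prediction error).
apparent = np.mean((y - fit_predict(X, y, X)) ** 2)

# Bootstrap estimate of the optimism, added back to the apparent error.
B = 200
optimism = 0.0
for _ in range(B):
    idx = rng.integers(0, n, size=n)           # resample cases with replacement
    err_orig = np.mean((y - fit_predict(X[idx], y[idx], X)) ** 2)
    err_boot = np.mean((y[idx] - fit_predict(X[idx], y[idx], X[idx])) ** 2)
    optimism += err_orig - err_boot
boot_est = apparent + optimism / B

# Leave-one-out cross-validation estimate.
cv_errs = []
for i in range(n):
    mask = np.arange(n) != i
    yhat_i = fit_predict(X[mask], y[mask], X[i:i + 1])
    cv_errs.append((y[i] - yhat_i[0]) ** 2)
cv_est = float(np.mean(cv_errs))

print(f"apparent={apparent:.3f} bootstrap={boot_est:.3f} loocv={cv_est:.3f}")
```

For least squares, each leave-one-out residual equals the ordinary residual inflated by 1/(1 - h_ii), so the cross-validation estimate always exceeds the apparent error, consistent with the optimism the bootstrap correction targets.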

Article information

Source
Ann. Statist., Volume 12, Number 4 (1984), 1400-1424.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176346800

Digital Object Identifier
doi:10.1214/aos/1176346800

Mathematical Reviews number (MathSciNet)
MR760696

Zentralblatt MATH identifier
0557.62039

JSTOR
links.jstor.org

Subjects
Primary: 62G05: Estimation
Secondary: 62J05: Linear regression

Keywords
Linear regression, model selection, prediction error, bootstrap approach, cross-validation

Citation

Bunke, Olaf; Droge, Bernd. Bootstrap and Cross-Validation Estimates of the Prediction Error for Linear Regression Models. Ann. Statist. 12 (1984), no. 4, 1400--1424. doi:10.1214/aos/1176346800. https://projecteuclid.org/euclid.aos/1176346800
