Open Access
April 2005 Nonparametric bootstrap prediction
Tadayoshi Fushiki, Fumiyasu Komaki, Kazuyuki Aihara
Bernoulli 11(2): 293-307 (April 2005). DOI: 10.3150/bj/1116340296

Abstract

Ensemble learning has recently been intensively studied in the field of machine learning. 'Bagging' is an ensemble learning method that uses bootstrap data to construct various predictors; the required prediction is then obtained by averaging these predictors. Harris proposed constructing predictive distributions by this technique with the parametric bootstrap, and showed that the parametric bootstrap predictive distribution gives asymptotically better predictions than a plug-in distribution based on the maximum likelihood estimator. In this paper, we investigate nonparametric bootstrap predictive distributions. The nonparametric bootstrap predictive distribution is precisely the one obtained by applying bagging to the statistical prediction problem. We show that the nonparametric bootstrap predictive distribution gives predictions asymptotically as good as those of the parametric bootstrap predictive distribution.
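The construction described in the abstract can be sketched for a simple normal model. This is a minimal illustration, not code from the paper: the toy data, the function names, and the choice of a Gaussian family are all assumptions made for the example. The plug-in predictive density fixes the parameters at their MLE; the nonparametric bootstrap (bagging) predictive density averages plug-in densities fitted to resamples of the data.

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(y, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at y."""
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def plug_in_density(y, x):
    """Plug-in predictive density: a normal with the MLE plugged in."""
    mu_hat, sigma_hat = x.mean(), x.std()  # MLEs for a normal model
    return normal_pdf(y, mu_hat, sigma_hat)

def bootstrap_predictive_density(y, x, B=2000):
    """Nonparametric bootstrap predictive density: the average of the
    plug-in densities fitted to B bootstrap resamples of the data
    (i.e., bagging applied to the prediction problem)."""
    n = len(x)
    dens = np.zeros_like(np.asarray(y, dtype=float))
    for _ in range(B):
        xb = rng.choice(x, size=n, replace=True)  # resample with replacement
        dens += normal_pdf(y, xb.mean(), xb.std())
    return dens / B

x = rng.normal(loc=1.0, scale=2.0, size=30)  # toy observed sample
y = np.linspace(-6.0, 8.0, 201)              # grid of future values
p_plug = plug_in_density(y, x)
p_boot = bootstrap_predictive_density(y, x)
```

Averaging over resamples lets the bootstrap predictive density reflect parameter uncertainty, which typically gives it slightly heavier tails than the plug-in density; the paper's asymptotic analysis quantifies the resulting improvement in Kullback-Leibler risk.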

Citation


Tadayoshi Fushiki, Fumiyasu Komaki, Kazuyuki Aihara. "Nonparametric bootstrap prediction." Bernoulli 11(2): 293-307, April 2005. https://doi.org/10.3150/bj/1116340296

Information

Published: April 2005
First available in Project Euclid: 17 May 2005

zbMATH: 1063.62062
MathSciNet: MR2132728
Digital Object Identifier: 10.3150/bj/1116340296

Keywords: asymptotic theory, bagging, bootstrap predictive distribution, information geometry, Kullback-Leibler divergence

Rights: Copyright © 2005 Bernoulli Society for Mathematical Statistics and Probability
