Bernoulli, Volume 11, Number 4 (2005), 747-758.

Bootstrap prediction and Bayesian prediction under misspecified models

Tadayoshi Fushiki



We consider a statistical prediction problem under misspecified models. In a certain sense, Bayesian prediction is the optimal prediction method when the assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction, and it can be regarded as an approximation to Bayesian prediction when the model is true. In applications, however, the data frequently deviate from the assumed model. In this paper, the two prediction methods are compared in terms of the Kullback-Leibler loss under the assumption that the model does not contain the true distribution. We show that bootstrap prediction is asymptotically more effective than Bayesian prediction under misspecified models.
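The construction described above — bootstrap prediction as Breiman's bagging applied to a plug-in predictive density — can be illustrated with a minimal sketch. This is not the paper's derivation; the Gaussian working model, the mixture data-generating distribution, and all function names here are illustrative assumptions. The misspecification arises because the assumed Gaussian family does not contain the true mixture distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_true(n):
    # True distribution: a two-component Gaussian mixture, so the
    # assumed single-Gaussian model below is misspecified.
    comp = rng.random(n) < 0.7
    return np.where(comp, rng.normal(-1.0, 0.5, n), rng.normal(2.0, 1.0, n))

def gauss_pdf(y, mu, sigma):
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def plug_in_density(x, y):
    # Plug-in predictive: the assumed N(mu, sigma^2) density with the
    # maximum-likelihood estimates substituted for the parameters.
    return gauss_pdf(y, x.mean(), x.std())

def bootstrap_density(x, y, b=200):
    # Bagged plug-in predictive: average the plug-in density over
    # nonparametric bootstrap resamples of the observed data.
    dens = [plug_in_density(rng.choice(x, size=x.size, replace=True), y)
            for _ in range(b)]
    return np.mean(dens, axis=0)

x = sample_true(50)          # observed data
y = np.linspace(-4, 6, 201)  # grid of future observations
p_plug = plug_in_density(x, y)
p_boot = bootstrap_density(x, y)
```

Both predictive densities could then be scored against fresh draws from the true mixture via the average log-density (the empirical counterpart of the Kullback-Leibler loss used in the paper).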

Article information


First available in Project Euclid: 7 September 2005

Keywords: bagging; Bayesian prediction; bootstrap; Kullback-Leibler divergence; misspecification; prediction


Fushiki, Tadayoshi. Bootstrap prediction and Bayesian prediction under misspecified models. Bernoulli 11 (2005), no. 4, 747--758. doi:10.3150/bj/1126126768.



  • [1] Aitchison, J. (1975) Goodness of predictive fit. Biometrika, 62, 547-554.
  • [2] Akaike, H. (1973) Information theory and an extension of the maximum likelihood principle. In B.N. Petrov and F. Csáki (eds), Proceedings of the Second International Symposium on Information Theory, pp. 267-281. Budapest: Akadémiai Kiadó.
  • [3] Amari, S. and Nagaoka, H. (2000) Methods of Information Geometry. New York: AMS and Oxford University Press.
  • [4] Breiman, L. (1996) Bagging predictors. Machine Learning, 24, 123-140.
  • [5] Fushiki, T., Komaki, F. and Aihara, K. (2004) On parametric bootstrapping and Bayesian prediction. Scand. J. Statist., 31, 403-416.
  • [6] Fushiki, T., Komaki, F. and Aihara, K. (2005) Nonparametric bootstrap prediction. Bernoulli, 11, 293-307.
  • [7] Geisser, S. (1993) Predictive Inference: An Introduction. New York: Chapman & Hall.
  • [8] Harris, I.R. (1989) Predictive fit for natural exponential families. Biometrika, 76, 675-684.
  • [9] Komaki, F. (1996) On asymptotic properties of predictive distributions. Biometrika, 83, 299-313.
  • [10] McCullagh, P. (1987) Tensor Methods in Statistics. London: Chapman & Hall.
  • [11] Shimodaira, H. (2000) Improving predictive inference under covariate shift by weighting the log-likelihood function. J. Statist. Plann. Inference, 90, 227-240.
  • [12] Takeuchi, K. (1976) Distributions of information statistics and criteria for adequacy of models (in Japanese). Math. Sci., 153, 12-18.
  • [13] White, H. (1982) Maximum likelihood estimation of misspecified models. Econometrica, 50, 1-26.