The Annals of Statistics

Admissible predictive density estimation

Lawrence D. Brown, Edward I. George, and Xinyi Xu

Let X|μ ∼ Np(μ, vxI) and Y|μ ∼ Np(μ, vyI) be independent p-dimensional multivariate normal vectors with common unknown mean μ. Based on observing X=x, we consider the problem of estimating the true predictive density p(y|μ) of Y under expected Kullback–Leibler loss. Our focus here is the characterization of admissible procedures for this problem. We show that the class of all generalized Bayes rules is a complete class, and that the easily interpretable conditions of Brown and Hwang [Statistical Decision Theory and Related Topics (1982) III 205–230] are sufficient for a formal Bayes rule to be admissible.
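The setup above admits a concrete illustration. Under a conjugate normal prior μ ∼ N(0, τ²I) (the prior here is an assumption for illustration, not a choice made in the paper), the Bayes predictive density of Y given X=x is itself normal, and the expected Kullback–Leibler loss can be approximated by Monte Carlo. A minimal sketch, assuming this conjugate prior:

```python
import numpy as np

def normal_pdf(y, mean, var):
    """Density of N(mean, var*I) at y (spherical covariance)."""
    p = y.size
    return np.exp(-np.sum((y - mean) ** 2) / (2 * var)) / (2 * np.pi * var) ** (p / 2)

def bayes_predictive_density(y, x, vx, vy, tau2):
    """Bayes predictive density of Y given X = x under the assumed prior N(0, tau2*I).

    Posterior: mu | x ~ N(s*x, s*vx*I) with s = tau2 / (tau2 + vx),
    so the predictive distribution of Y is N(s*x, (vy + s*vx)*I).
    """
    s = tau2 / (tau2 + vx)
    return normal_pdf(y, s * x, vy + s * vx)

def kl_risk_mc(mu, vx, vy, tau2, n=20000, seed=0):
    """Monte Carlo estimate of the expected Kullback-Leibler loss
    E_X E_{Y|mu} log[ p(Y|mu) / phat(Y|X) ] at a fixed mu."""
    rng = np.random.default_rng(seed)
    p = mu.size
    total = 0.0
    for _ in range(n):
        x = mu + rng.normal(scale=np.sqrt(vx), size=p)  # observed data
        y = mu + rng.normal(scale=np.sqrt(vy), size=p)  # future observation
        total += np.log(normal_pdf(y, mu, vy)
                        / bayes_predictive_density(y, x, vx, vy, tau2))
    return total / n
```

For example, `kl_risk_mc(np.zeros(3), vx=1.0, vy=1.0, tau2=2.0)` estimates the risk of this proper Bayes rule at μ=0; the estimate is strictly positive, reflecting that the expected KL loss is an average of Kullback–Leibler divergences and hence nonnegative.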

Article information

Ann. Statist., Volume 36, Number 3 (2008), 1156-1170.

First available in Project Euclid: 26 May 2008


Primary: 62C15 (Admissibility)
Secondary: 62C07 (Complete class results); 62C10 (Bayesian problems; characterization of Bayes procedures); 62C20 (Minimax procedures)

Keywords: Admissibility; Bayesian predictive distribution; complete class; prior distributions


Brown, Lawrence D.; George, Edward I.; Xu, Xinyi. Admissible predictive density estimation. Ann. Statist. 36 (2008), no. 3, 1156--1170. doi:10.1214/07-AOS506.



  • Aitchison, J. (1975). Goodness of prediction fit. Biometrika 62 547–554.
  • Berger, J. O. (1985). Statistical Decision Theory and Bayesian Analysis, 2nd ed. Springer, New York.
  • Brown, L. D. (1971). Admissible estimators, recurrent diffusions, and insoluble boundary value problems. Ann. Math. Statist. 42 855–903.
  • Brown, L. D. (1986). Fundamentals of Statistical Exponential Families with Applications in Statistical Decision Theory. IMS, Hayward, CA.
  • Brown, L. D. and Hwang, J. (1982). A unified admissibility proof. In Statistical Decision Theory and Related Topics III (S. S. Gupta and J. O. Berger, eds.) 1 205–230. Academic Press, New York.
  • Eaton, M. L. (1982). A method for evaluating improper prior distributions. In Statistical Decision Theory and Related Topics III (S. S. Gupta and J. O. Berger, eds.) 1 329–352. Academic Press, New York.
  • Eaton, M. L. (1992). A statistical diptych: Admissible inferences–recurrence of symmetric Markov chains. Ann. Statist. 20 1147–1179.
  • Eaton, M. L., Hobert, J. P., Jones, G. L. and Lai, W.-L. (2007). Evaluation of formal posterior distributions via Markov chain arguments. Preprint.
  • Gatsonis, C. A. (1984). Deriving posterior distributions for a location parameter: A decision theoretic approach. Ann. Statist. 12 958–970.
  • George, E. I., Liang, F. and Xu, X. (2006). Improved minimax prediction under Kullback–Leibler loss. Ann. Statist. 34 78–91.
  • Komaki, F. (2001). A shrinkage predictive distribution for multivariate normal observations. Biometrika 88 859–864.
  • Liang, F. (2002). Exact minimax procedures for predictive density estimation and data compression. Ph.D. dissertation, Dept. Statistics, Yale Univ.
  • Liang, F. and Barron, A. (2004). Exact minimax strategies for predictive density estimation, data compression and model selection. IEEE Trans. Inform. Theory 50 2708–2726.
  • Murray, G. D. (1977). A note on the estimation of probability density functions. Biometrika 64 150–152.
  • Ng, V. M. (1980). On the estimation of parametric density functions. Biometrika 67 505–506.
  • Stein, C. (1974). Estimation of the mean of a multivariate normal distribution. In Proceedings of the Prague Symposium on Asymptotic Statistics (J. Hajek, ed.) 345–381. Univ. Karlova, Prague.
  • Stein, C. (1981). Estimation of a multivariate normal mean. Ann. Statist. 9 1135–1151.
  • Strawderman, W. E. (1971). Proper Bayes minimax estimators of the multivariate normal mean. Ann. Math. Statist. 42 385–388.