Annals of Statistics

Asymptotically minimax Bayes predictive densities

Mihaela Aslan



Given a random sample from a distribution with density function that depends on an unknown parameter θ, we are interested in accurately estimating the true parametric density function at a future observation from the same distribution. The asymptotic risk of Bayes predictive density estimates with Kullback–Leibler loss function D(fθ‖f̂)=∫fθ log(fθ/f̂) is used to examine various ways of choosing prior distributions; the principal type of choice studied is minimax. We seek asymptotically least favorable predictive densities for which the corresponding asymptotic risk is minimax. A result resembling Stein’s paradox for estimating normal means by maximum likelihood holds for the uniform prior in the multivariate location family case: when the dimensionality of the model is at least three, the Jeffreys prior is minimax, though inadmissible. The Jeffreys prior is both admissible and minimax for one- and two-dimensional location problems.
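The setting above can be illustrated in the simplest location model. The following sketch (not from the paper; a hypothetical Monte Carlo check under assumed N(θ, 1) data) compares the Kullback–Leibler risk of the plug-in density N(x̄, 1) with the Bayes predictive density under the uniform (Jeffreys) prior, which in this model is N(x̄, 1 + 1/n); the predictive density's wider variance yields strictly smaller KL risk.

```python
import numpy as np

def kl_normal(mu1, var1, mu2, var2):
    """KL divergence D( N(mu1,var1) || N(mu2,var2) ) in closed form."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

rng = np.random.default_rng(0)
theta, n, reps = 0.0, 10, 200_000

# Sampling distribution of the sample mean: x̄ ~ N(theta, 1/n).
xbar = rng.normal(theta, 1.0 / np.sqrt(n), size=reps)

# Plug-in estimate N(x̄, 1) versus the Bayes predictive density under the
# uniform prior on the location parameter, which is N(x̄, 1 + 1/n).
risk_plugin = kl_normal(theta, 1.0, xbar, 1.0).mean()
risk_bayes = kl_normal(theta, 1.0, xbar, 1.0 + 1.0 / n).mean()

# Closed-form risks: 1/(2n) for the plug-in, (1/2) log(1 + 1/n) for the
# predictive density; since log(1 + x) < x, the Bayes predictive wins.
print(risk_plugin, risk_bayes)
```

Since log(1 + 1/n) < 1/n, the simulation confirms that averaging the density over the posterior, rather than plugging in a point estimate, strictly reduces KL risk even in one dimension.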

Article information

Ann. Statist., Volume 34, Number 6 (2006), 2921-2938.

First available in Project Euclid: 23 May 2007


Primary: 62G07: Density estimation
Secondary: 62C20: Minimax procedures

Keywords: Bayes predictive density; Kullback–Leibler loss; the Jeffreys prior; asymptotically least favorable priors; minimax risk


Aslan, Mihaela. Asymptotically minimax Bayes predictive densities. Ann. Statist. 34 (2006), no. 6, 2921--2938. doi:10.1214/009053606000000885.
