The Annals of Statistics

Asymptotically minimax Bayes predictive densities

Mihaela Aslan

Abstract

Given a random sample from a distribution with a density function depending on an unknown parameter θ, we are interested in accurately estimating the true parametric density of a future observation from the same distribution. The asymptotic risk of Bayes predictive density estimates under the Kullback–Leibler loss function D(fθ‖f̂) = ∫ fθ log(fθ/f̂) is used to examine various ways of choosing prior distributions; the principal type of choice studied is minimax. We seek asymptotically least favorable predictive densities, those for which the corresponding asymptotic risk is minimax. A result resembling Stein’s paradox for estimating normal means by maximum likelihood holds for the uniform prior in the multivariate location family case: when the dimensionality of the model is at least three, the Jeffreys prior is minimax, though inadmissible. The Jeffreys prior is both admissible and minimax for one- and two-dimensional location problems.
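As a small illustration of the loss function above (a sketch, not taken from the paper): in the d-dimensional normal location family N(θ, σ²I), the Bayes predictive density under the uniform (Jeffreys) prior is the well-known N(x̄, σ²(1 + 1/n)I), and a standard computation gives its Kullback–Leibler risk the closed form (d/2) log(1 + 1/n), constant in θ. The Monte Carlo check below assumes this setup; all variable names are illustrative.

```python
import numpy as np

# Sketch (assumed setup, not from the paper): normal location family
# N(theta, sigma2 * I) in d dimensions. Under the uniform prior, the
# Bayes predictive density for a future observation is
# N(xbar, sigma2 * (1 + 1/n) * I), and its KL risk
# E_theta D(f_theta || f_hat) equals (d/2) * log(1 + 1/n).

rng = np.random.default_rng(0)
d, n, sigma2 = 3, 10, 1.0
theta = np.zeros(d)

def kl_gaussian(mu1, s2_1, mu2, s2_2, d):
    """KL(N(mu1, s2_1 I) || N(mu2, s2_2 I)) in d dimensions."""
    return 0.5 * (d * s2_1 / s2_2
                  + np.sum((mu2 - mu1) ** 2) / s2_2
                  - d + d * np.log(s2_2 / s2_1))

reps = 100_000
losses = np.empty(reps)
for r in range(reps):
    x = rng.normal(theta, np.sqrt(sigma2), size=(n, d))
    xbar = x.mean(axis=0)
    # KL loss of the uniform-prior predictive density at this sample
    losses[r] = kl_gaussian(theta, sigma2, xbar, sigma2 * (1 + 1 / n), d)

print("Monte Carlo risk:          ", losses.mean())
print("Closed form (d/2)log(1+1/n):", 0.5 * d * np.log(1 + 1 / n))
```

The risk does not depend on θ, which is what makes the uniform prior a natural candidate for an asymptotically least favorable (minimax) choice in the location-family setting studied in the paper.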

Article information

Source
Ann. Statist., Volume 34, Number 6 (2006), 2921–2938.

Dates
First available in Project Euclid: 23 May 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1179935070

Digital Object Identifier
doi:10.1214/009053606000000885

Mathematical Reviews number (MathSciNet)
MR2329473

Zentralblatt MATH identifier
1114.62039

Subjects
Primary: 62G07: Density estimation
Secondary: 62C20: Minimax procedures

Keywords
Bayes predictive density; Kullback–Leibler loss; the Jeffreys prior; asymptotically least favorable priors; minimax risk

Citation

Aslan, Mihaela. Asymptotically minimax Bayes predictive densities. Ann. Statist. 34 (2006), no. 6, 2921--2938. doi:10.1214/009053606000000885. https://projecteuclid.org/euclid.aos/1179935070

