Open Access
December 2006
Asymptotically minimax Bayes predictive densities
Mihaela Aslan
Ann. Statist. 34(6): 2921-2938 (December 2006). DOI: 10.1214/009053606000000885

Abstract

Given a random sample from a distribution whose density depends on an unknown parameter θ, we are interested in accurately estimating the true parametric density at a future observation from the same distribution. The asymptotic risk of Bayes predictive density estimates f̂ under the Kullback–Leibler loss D(f_θ ‖ f̂) = ∫ f_θ log(f_θ / f̂) is used to examine various ways of choosing prior distributions; the principal type of choice studied is minimax. We seek asymptotically least favorable predictive densities for which the corresponding asymptotic risk is minimax. A result resembling Stein’s paradox for estimating normal means by maximum likelihood holds for the uniform prior in the multivariate location family case: when the dimensionality of the model is at least three, the Jeffreys prior is minimax, though inadmissible. The Jeffreys prior is both admissible and minimax for one- and two-dimensional location problems.
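The risk comparison behind the abstract can be illustrated numerically in a simple special case. The sketch below is not taken from the paper's proofs: it assumes a one-dimensional normal location family X_i ~ N(θ, σ²), where the Bayes predictive density under the uniform (Jeffreys) prior is N(x̄, σ²(1 + 1/n)) and the plug-in estimate is N(x̄, σ²), and it compares their Kullback–Leibler risks E_θ D(f_θ ‖ f̂) by Monte Carlo over the sampling distribution of x̄.

```python
import numpy as np

def kl_normal(mu0, var0, mu1, var1):
    """Closed-form KL divergence D(N(mu0, var0) || N(mu1, var1))."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def kl_risks(theta=0.0, sigma2=1.0, n=5, reps=200_000, seed=0):
    """Monte Carlo KL risks of the plug-in and flat-prior predictive densities.

    Illustrative setup (an assumption for this sketch, not the paper's general
    model): sample of size n from N(theta, sigma2), one future observation.
    """
    rng = np.random.default_rng(seed)
    # Sampling distribution of the sample mean: N(theta, sigma^2 / n).
    xbar = rng.normal(theta, np.sqrt(sigma2 / n), size=reps)
    # Plug-in estimate N(xbar, sigma^2): risk averages to 1 / (2n).
    plugin = kl_normal(theta, sigma2, xbar, sigma2).mean()
    # Flat-prior Bayes predictive N(xbar, sigma^2 * (1 + 1/n)):
    # risk averages to (1/2) * log(1 + 1/n), which is strictly smaller.
    bayes = kl_normal(theta, sigma2, xbar, sigma2 * (1.0 + 1.0 / n)).mean()
    return plugin, bayes

plugin_risk, bayes_risk = kl_risks()
print(plugin_risk, bayes_risk)
```

With n = 5 the analytic values are 1/(2n) = 0.1 for the plug-in density and (1/2)log(1 + 1/n) ≈ 0.0912 for the flat-prior predictive density, so the Bayes predictive density dominates the plug-in in this one-dimensional case, consistent with its admissibility there.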

Citation


Mihaela Aslan. "Asymptotically minimax Bayes predictive densities." Ann. Statist. 34 (6) 2921 - 2938, December 2006. https://doi.org/10.1214/009053606000000885

Information

Published: December 2006
First available in Project Euclid: 23 May 2007

zbMATH: 1114.62039
MathSciNet: MR2329473
Digital Object Identifier: 10.1214/009053606000000885

Subjects:
Primary: 62G07
Secondary: 62C20

Keywords: asymptotically least favorable priors , Bayes predictive density , Kullback–Leibler loss , minimax risk , the Jeffreys prior

Rights: Copyright © 2006 Institute of Mathematical Statistics

Vol. 34 • No. 6 • December 2006