Annals of Statistics (Ann. Statist.), Volume 34, Number 1 (2006), 78–91.
Improved minimax predictive densities under Kullback–Leibler loss
Let X|μ∼Np(μ,vxI) and Y|μ∼Np(μ,vyI) be independent p-dimensional multivariate normal vectors with common unknown mean μ. Based on only observing X=x, we consider the problem of obtaining a predictive density p̂(y|x) for Y that is close to p(y|μ) as measured by expected Kullback–Leibler loss. A natural procedure for this problem is the (formal) Bayes predictive density p̂U(y|x) under the uniform prior πU(μ)≡1, which is best invariant and minimax. We show that any Bayes predictive density will be minimax if it is obtained by a prior yielding a marginal that is superharmonic or whose square root is superharmonic. This yields wide classes of minimax procedures that dominate p̂U(y|x), including Bayes predictive densities under superharmonic priors. Fundamental similarities and differences with the parallel theory of estimating a multivariate normal mean under quadratic loss are described.
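For reference, the two central objects in the abstract have standard closed forms in this normal setting; the expressions below are well-known facts about the setup, written out here rather than quoted from the paper:

```latex
% Kullback–Leibler loss of a predictive density \hat p(\cdot \mid x)
% relative to the true density p(\cdot \mid \mu):
L\bigl(\mu, \hat p(\cdot \mid x)\bigr)
  = \int p(y \mid \mu)\,
    \log \frac{p(y \mid \mu)}{\hat p(y \mid x)}\, dy ,

% and the risk is its expectation over X \sim N_p(\mu, v_x I).
% Under the uniform prior \pi_U(\mu) \equiv 1, the (formal) Bayes
% predictive density is itself normal, centered at x with the
% variances added:
\hat p_U(y \mid x)
  = \frac{1}{\{2\pi (v_x + v_y)\}^{p/2}}
    \exp\!\left( - \frac{\lVert y - x \rVert^2}{2(v_x + v_y)} \right),
```

that is, p̂U(y|x) is the N(x, (vx+vy)I) density, the baseline that the superharmonic-prior procedures in the paper dominate.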
First available in Project Euclid: 2 May 2006
Primary: 62C20: Minimax procedures
Secondary: 62C10: Bayesian problems; characterization of Bayes procedures; 62F15: Bayesian inference
Keywords: Bayes rules; heat equation; inadmissibility; multiple shrinkage; multivariate normal; prior distributions; shrinkage estimation; superharmonic marginals; superharmonic priors; unbiased estimate of risk
George, Edward I.; Liang, Feng; Xu, Xinyi. Improved minimax predictive densities under Kullback–Leibler loss. Ann. Statist. 34 (2006), no. 1, 78--91. doi:10.1214/009053606000000155. https://projecteuclid.org/euclid.aos/1146576256