Abstract
This paper considers estimation of the predictive density for a normal linear model with unknown variance under α-divergence loss for −1 ≤ α ≤ 1. We first give a general canonical form for the problem and then derive general expressions for the generalized Bayes solution under this loss for each α. For a particular class of hierarchical generalized priors studied in Maruyama and Strawderman (2005, 2006) for the problems of estimating the mean vector and the variance, respectively, we obtain the generalized Bayes predictive density. Additionally, we show that, for a subclass of these priors, the resulting estimator dominates the generalized Bayes estimator with respect to the right invariant prior, i.e., the best (fully) equivariant minimax estimator when α = 1.
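For reference, a commonly used parameterization of the α-divergence between a candidate predictive density p̂ and the true density p (the precise convention adopted in the paper may differ in sign or scaling) is

\[
D_\alpha(\hat{p}, p) \;=\; \frac{4}{1-\alpha^{2}}\left(1 - \int \hat{p}(y)^{\frac{1+\alpha}{2}}\, p(y)^{\frac{1-\\alpha}{2}}\, \mathrm{d}y\right), \qquad |\alpha| < 1,
\]

with the Kullback–Leibler divergences (in the two directions) arising as the limiting cases α → ±1.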
Information
Digital Object Identifier: 10.1214/11-IMSCOLL803