Abstract
We consider the problem of estimating the predictive density in a heteroskedastic Gaussian model under general divergence loss. Based on a conjugate hierarchical set-up, we consider generic classes of shrinkage predictive densities that are governed by location and scale hyper-parameters. For any α-divergence loss, we propose a risk-estimation-based methodology for tuning these shrinkage hyper-parameters. Our proposed predictive density estimators enjoy optimal asymptotic risk properties that are in concordance with the optimal shrinkage calibration point estimation results established by Xie, Kou, and Brown (2012) for heteroskedastic hierarchical models. These α-divergence risk optimality properties of our proposed predictors are not shared by empirical Bayes predictive density estimators that are calibrated by traditional methods such as maximum likelihood and the method of moments. We conduct several numerical studies to compare the non-asymptotic performance of our proposed predictive density estimators with that of competing methods and obtain encouraging results.
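The abstract's central loss is the α-divergence between densities. As a point of reference, a minimal Monte Carlo sketch of one common (Amari-type) parameterization is given below, D_α(p, q) = (4 / (1 − α²)) (1 − ∫ p^{(1−α)/2} q^{(1+α)/2} dx) for α ∈ (−1, 1); conventions differ across the literature, and the function and density names here are illustrative, not taken from the paper.

```python
import numpy as np

def alpha_divergence_mc(p_pdf, q_pdf, sample_p, alpha, n=100_000, seed=0):
    """Monte Carlo estimate of an Amari-type alpha-divergence D_alpha(p, q),
    alpha in (-1, 1). Uses the identity
    int p^{(1-a)/2} q^{(1+a)/2} dx = E_p[(q/p)^{(1+a)/2}]."""
    rng = np.random.default_rng(seed)
    x = sample_p(rng, n)                      # draws from p
    ratio = q_pdf(x) / p_pdf(x)               # likelihood ratio q/p at each draw
    affinity = np.mean(ratio ** ((1.0 + alpha) / 2.0))
    return 4.0 / (1.0 - alpha**2) * (1.0 - affinity)

def gauss_pdf(mu, sigma):
    """Univariate Gaussian density as a closure (illustrative helper)."""
    return lambda x: np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Sanity checks: zero divergence for identical densities, positive otherwise.
p = gauss_pdf(0.0, 1.0)
d0 = alpha_divergence_mc(p, p, lambda rng, n: rng.normal(0.0, 1.0, n), alpha=0.0)
d1 = alpha_divergence_mc(p, gauss_pdf(1.0, 1.0),
                         lambda rng, n: rng.normal(0.0, 1.0, n), alpha=0.0)
```

At α = 0 this reduces (up to the factor 4) to the squared Hellinger distance; the limits α → ±1 recover the two Kullback–Leibler directions.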
Funding Statement
The research was partly supported by NSF grants DMS-1811866 and DMS-1916245, and by JSPS KAKENHI grant 19K20222.
Citation
Edward George, Gourab Mukherjee, Keisuke Yano. "Optimal Shrinkage Estimation of Predictive Densities Under α-Divergences." Bayesian Anal. 16(4): 1139-1155, December 2021. https://doi.org/10.1214/21-BA1264