Optimal Shrinkage Estimation of Predictive Densities Under α-Divergences
Edward George, Gourab Mukherjee, Keisuke Yano
Bayesian Anal. Advance Publication 1-17 (2021). DOI: 10.1214/21-BA1264


We consider the problem of estimating the predictive density in a heteroskedastic Gaussian model under general divergence loss. Based on a conjugate hierarchical setup, we consider generic classes of shrinkage predictive densities governed by location and scale hyperparameters. For any α-divergence loss, we propose a risk-estimation-based methodology for tuning these shrinkage hyperparameters. Our proposed predictive density estimators enjoy optimal asymptotic risk properties in concordance with the optimal shrinkage-calibration point-estimation results established by Xie, Kou, and Brown (2012) for heteroskedastic hierarchical models. These α-divergence risk-optimality properties are not shared by empirical Bayes predictive density estimators calibrated by traditional methods such as maximum likelihood and the method of moments. We conduct several numerical studies comparing the non-asymptotic performance of our proposed predictive density estimators with competing methods and obtain encouraging results.
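For readers unfamiliar with the loss family in the title: between two univariate Gaussian densities p = N(μ₁, σ₁²) and q = N(μ₂, σ₂²), the α-divergence D_α(p‖q) = (1 − ∫ p^α q^{1−α} dx) / (α(1−α)) has a standard closed form for 0 < α < 1. The sketch below (purely illustrative; it is not the paper's estimator or tuning procedure, and the function name is our own) evaluates that closed form.

```python
import math

def gaussian_alpha_divergence(mu1, s1, mu2, s2, alpha):
    """Closed-form alpha-divergence D_alpha(p || q) between univariate
    Gaussians p = N(mu1, s1^2) and q = N(mu2, s2^2), for 0 < alpha < 1:

        D_alpha = (1 - I) / (alpha * (1 - alpha)),
        I = \int p(x)^alpha q(x)^(1-alpha) dx
          = s1^(1-alpha) * s2^alpha / s_star
            * exp(-alpha*(1-alpha)*(mu1-mu2)^2 / (2*s_star^2)),

    where s_star^2 = alpha*s2^2 + (1-alpha)*s1^2 (always positive here).
    """
    s_star2 = alpha * s2**2 + (1 - alpha) * s1**2  # interpolated variance
    integral = (s1 ** (1 - alpha) * s2 ** alpha / math.sqrt(s_star2)
                * math.exp(-alpha * (1 - alpha) * (mu1 - mu2) ** 2
                           / (2 * s_star2)))
    return (1 - integral) / (alpha * (1 - alpha))

# The divergence vanishes iff p = q, and at alpha = 1/2 it is symmetric
# in its arguments (it equals 4x the squared Hellinger distance).
print(gaussian_alpha_divergence(0.0, 1.0, 0.0, 1.0, 0.5))  # 0 for p = q
print(gaussian_alpha_divergence(1.0, 1.0, 0.0, 2.0, 0.5))
```

The limits α → 0 and α → 1 recover the two Kullback–Leibler divergences, which is why a single α-divergence criterion interpolates between the familiar predictive-density losses.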

Funding Statement

The research was partly supported by NSF grants DMS-1811866 and DMS-1916245, and by JSPS KAKENHI grant 19K20222.




Published: 2021
First available in Project Euclid: 16 April 2021


Primary: 62L20
Secondary: 60F15, 60G42

