2021 On discrete priors and sparse minimax optimal predictive densities
Ujan Gangopadhyay, Gourab Mukherjee
Electron. J. Statist. 15(1): 1636-1660 (2021). DOI: 10.1214/21-EJS1818


We consider the problem of predictive density estimation under Kullback-Leibler loss in a high-dimensional Gaussian model with exact sparsity constraints on the location parameters. For non-asymptotic sparsity levels, the least favorable prior is discrete. Here, we study the first-order asymptotic minimax risk of Bayes predictive density estimates in the regime where the proportion of non-zero coordinates converges to zero as the dimension increases. Motivated by an optimal thresholding rule in Mukherjee and Johnstone (2015), we propose a discrete prior and show that its Bayes predictive density estimate is minimax optimal. This produces a nonsubjective discrete prior distribution that minimizes the maximum posterior predictive relative entropy regret. We discuss the decision-theoretic implications and the structural differences between our proposed prior and its closest predecessor, the geometrically decaying discrete prior of Johnstone (1994a), which produced minimax optimal point estimators under quadratic loss. Through numerical experiments, we present the non-asymptotic worst-case risk of our proposed estimator across different sparsity levels.
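As a rough illustration of the setup (not the paper's construction), the sketch below estimates the Kullback-Leibler risk of a Bayes predictive density under a discrete prior in a single Gaussian coordinate: observe X ~ N(theta, v_x) and predict the density of a future Y ~ N(theta, v_y). The three-atom prior, its weights, and the variances are illustrative assumptions only; they are not the minimax-optimal prior proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(z, mu, v):
    """Gaussian density N(mu, v) evaluated at z."""
    return np.exp(-(z - mu) ** 2 / (2.0 * v)) / np.sqrt(2.0 * np.pi * v)

v_x, v_y = 1.0, 1.0        # variances of the observed and future observations
theta = 2.0                # true location of one non-zero coordinate

# Hypothetical discrete prior: a heavy atom at 0 (reflecting sparsity) plus
# two symmetric atoms -- purely illustrative, not the prior from the paper.
atoms = np.array([0.0, -2.5, 2.5])
weights = np.array([0.90, 0.05, 0.05])

def bayes_predictive(y, x):
    """Predictive density sum_j w_j(x) N(y; mu_j, v_y), w_j(x) = posterior weights.
    For a discrete prior this mixture form is exact."""
    post = weights * phi(x, atoms, v_x)
    post = post / post.sum()
    return (post[None, :] * phi(y[:, None], atoms[None, :], v_y)).sum(axis=1)

# Inner KL integral over y approximated by a Riemann sum on a wide grid.
y = np.linspace(theta - 8.0, theta + 8.0, 4001)
dy = y[1] - y[0]
p_true = phi(y, theta, v_y)

def kl_loss(x):
    """KL( N(theta, v_y) || p_hat(. | x) )."""
    return dy * np.sum(p_true * (np.log(p_true) - np.log(bayes_predictive(y, x))))

# Average over draws of X ~ N(theta, v_x) to estimate the KL risk at theta.
xs = rng.normal(theta, np.sqrt(v_x), size=2000)
risk_bayes = np.mean([kl_loss(x) for x in xs])
risk_plugin = v_x / (2.0 * v_y)   # exact KL risk of the plug-in density N(x, v_y)
print(f"Bayes predictive KL risk ~ {risk_bayes:.3f}; plug-in risk = {risk_plugin:.3f}")
```

Averaging such per-coordinate risks over a sparse parameter vector, and maximizing over the sparse parameter set, gives the worst-case risk that the paper's numerical experiments report for its proposed prior.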

Funding Statement

The research here was partially supported by NSF DMS-1811866.


GM is indebted to Professor Iain Johnstone for numerous stimulating discussions which led to many of the ideas in this paper.




Received: 1 September 2020; Published: 2021
First available in Project Euclid: 26 March 2021

Digital Object Identifier: 10.1214/21-EJS1818

Subject classifications: Primary 62L20; Secondary 60F15, 60G42

