We study predictive density estimation under Kullback–Leibler loss in ℓ0-sparse Gaussian sequence models. We propose proper Bayes predictive density estimates and establish their asymptotic minimaxity in sparse models. Central to this is a new risk decomposition for sparse, or spike-and-slab, priors.
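As a brief sketch of the setting (the notation here is illustrative and not taken verbatim from the paper): one observes a past vector X and wishes to estimate the density of an independent future vector Y sharing the same sparse mean.

```latex
% Illustrative notation: past observation X, future observation Y,
% shared sparse mean theta, and future-to-past variance ratio r.
X \sim N_n(\theta, v_x I), \qquad Y \sim N_n(\theta, v_y I), \qquad r = v_y / v_x .
% Kullback--Leibler loss of a predictive density \hat p for the true
% future density p(\cdot \mid \theta):
L(\theta, \hat p) = \int p(y \mid \theta) \, \log \frac{p(y \mid \theta)}{\hat p(y)} \, dy .
```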
A surprise is the existence of a phase transition in the future-to-past variance ratio r. For subcritical r, the natural discrete prior ceases to be asymptotically optimal; instead, a ‘bi-grid’ prior with a central region of reduced grid spacing recovers asymptotic minimaxity. This phenomenon appears to have no analog in the otherwise parallel theory of point estimation of a multivariate normal mean under quadratic loss.
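As a toy illustration only (the ranges and spacings below are hypothetical placeholders, not the paper's calibrated choices), a bi-grid support combines a coarse outer grid with a finer grid in a central region:

```python
def bi_grid(outer_range=10.0, inner_range=2.0, coarse=1.0, fine=0.25):
    """Support of a toy 'bi-grid' discrete prior: coarse spacing on
    [-outer_range, outer_range], reduced (fine) spacing on the central
    region [-inner_range, inner_range]. Parameters are illustrative."""
    points = set()
    x = -outer_range
    while x <= outer_range + 1e-12:        # coarse outer grid
        points.add(round(x, 10))
        x += coarse
    x = -inner_range
    while x <= inner_range + 1e-12:        # finer central grid
        points.add(round(x, 10))
        x += fine
    return sorted(points)

support = bi_grid()
```

The central refinement is what distinguishes a bi-grid from a single uniform grid; the actual spacings in the paper are tuned to the sparsity level and the variance ratio r.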
For spike-and-uniform-slab priors to have any prospect of minimaxity, we show that the sparse parameter space must also be magnitude constrained. Within a substantial range of magnitudes, such spike-and-slab priors can attain asymptotic minimaxity.
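The spike-and-uniform-slab form referred to above can be sketched as follows (the mixing weight w and magnitude bound M are illustrative symbols, not the paper's calibrated quantities):

```latex
% Coordinatewise spike-and-uniform-slab prior: a point mass at zero
% mixed with a uniform slab on the magnitude-constrained interval [-M, M].
\pi(\theta_i) = (1 - w)\, \delta_0(\theta_i)
              + \frac{w}{2M}\, \mathbf{1}\{ |\theta_i| \le M \} .
```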
GM was supported in part by the Zumberge individual award from the University of Southern California’s James H. Zumberge faculty research and innovation fund and by NSF Grant DMS-1811866.
IMJ was supported in part by NSF Grants DMS-1407813, 1418362 and 1811614 and thanks the Australian National University for hospitality while working on this paper.
The authors thank the Associate Editor and three referees for especially stimulating comments that improved the presentation.
"On minimax optimality of sparse Bayes predictive density estimates." Ann. Statist. 50(1): 81–106, February 2022. https://doi.org/10.1214/21-AOS2086