Use of continuous shrinkage priors — with a “spike” near zero and heavy tails towards infinity — is an increasingly popular approach to induce sparsity in parameter estimates. When the parameters are only weakly identified by the likelihood, however, the posterior may end up with tails as heavy as the prior, jeopardizing robustness of inference. A natural solution is to “shrink the shoulders” of a shrinkage prior by lightening up its tails beyond a reasonable parameter range, yielding a regularized version of the prior. We develop a regularization approach which, unlike previous proposals, preserves computationally attractive structures of original shrinkage priors. We study theoretical properties of the Gibbs sampler on resulting posterior distributions, with emphasis on convergence rates of the Pólya-Gamma Gibbs sampler for sparse logistic regression. Our analysis shows that the proposed regularization leads to geometric ergodicity under a broad range of global-local shrinkage priors. Essentially, the only requirement is for the prior π_local(·) on the local scale λ to satisfy π_local(0) < ∞. If π_local(·) further satisfies lim_{λ → 0} π_local(λ) / λ^a < ∞ for a > 0, as in the case of Bayesian bridge priors, we show the sampler to be uniformly ergodic.
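To make the setting concrete, the following is a minimal sketch of the Pólya-Gamma data-augmentation Gibbs sampler for Bayesian logistic regression, with the local scales held fixed for simplicity (the paper's regularized priors would additionally update the global and local scales). All function names are hypothetical, and the PG(1, c) draw uses a truncated version of the infinite convolution-of-gammas representation rather than an exact sampler:

```python
import numpy as np

def sample_pg(b, c, rng, n_terms=100):
    """Approximate draw from PG(b, c), one per entry of the vector c,
    via the truncated series  omega = (1/(2*pi^2)) * sum_k g_k / ((k - 1/2)^2 + (c/(2*pi))^2),
    where g_k ~ Gamma(b, 1).  Truncation slightly underestimates omega."""
    k = np.arange(1, n_terms + 1)
    g = rng.gamma(b, 1.0, size=(c.size, n_terms))
    denom = (k - 0.5) ** 2 + (c[:, None] / (2 * np.pi)) ** 2
    return (g / denom).sum(axis=1) / (2 * np.pi ** 2)

def gibbs_logistic(X, y, scales, n_iter=300, rng=None):
    """Two-block Gibbs sampler for y_i ~ Bernoulli(logistic(x_i' beta))
    with beta_j ~ N(0, scales_j^2) conditionally on the (fixed) scales."""
    rng = rng or np.random.default_rng()
    n, p = X.shape
    kappa = y - 0.5                      # PG augmentation pseudo-response
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # 1. Sample auxiliary variables omega_i | beta ~ PG(1, x_i' beta).
        omega = sample_pg(1.0, X @ beta, rng)
        # 2. Sample beta | omega from its conditional Gaussian.
        prec = X.T @ (omega[:, None] * X) + np.diag(1.0 / scales ** 2)
        V = np.linalg.inv(prec)
        V = (V + V.T) / 2                # symmetrize against round-off
        m = V @ (X.T @ kappa)
        beta = rng.multivariate_normal(m, V)
        draws[t] = beta
    return draws
```

The geometric and uniform ergodicity results in the abstract concern the convergence rate of exactly this kind of alternating omega/beta (and scale) update.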
This work was partially supported through National Institutes of Health grants R01 AI107034, U19 AI135995 and R01 AI153044 and through Food and Drug Administration grant HHS 75F40120D00039.
We are indebted to Andrew Holbrook for the alliteration in the article title.
Akihiko Nishimura and Marc A. Suchard. "Shrinkage with Shrunken Shoulders: Gibbs Sampling Shrinkage Model Posteriors with Guaranteed Convergence Rates." Bayesian Anal. 18(2): 367-390, June 2023. https://doi.org/10.1214/22-BA1308