Open Access
The Horseshoe+ Estimator of Ultra-Sparse Signals
Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon Willard
Bayesian Anal. 12(4): 1105-1131 (December 2017). DOI: 10.1214/16-BA1028

Abstract

We propose a new prior for ultra-sparse signal detection that we term the “horseshoe+ prior.” The horseshoe+ prior is a natural extension of the horseshoe prior that has achieved success in the estimation and detection of sparse signals and has been shown to possess a number of desirable theoretical properties while enjoying computational feasibility in high dimensions. The horseshoe+ prior builds upon these advantages. Our work proves that the horseshoe+ posterior concentrates at a rate faster than that of the horseshoe in the Kullback–Leibler (K-L) sense. We also establish theoretically that the proposed estimator has lower posterior mean squared error in estimating signals compared to the horseshoe and achieves the optimal Bayes risk in testing up to a constant. For one-group global–local scale mixture priors, we develop a new technique for analyzing the marginal sparse prior densities using the class of Meijer-G functions. In simulations, the horseshoe+ estimator demonstrates superior performance in a standard design setting against competing methods, including the horseshoe and Dirichlet–Laplace estimators. We conclude with an illustration on a prostate cancer data set and by pointing out some directions for future research.
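As a companion to the abstract, the horseshoe+ prior described here is the hierarchy θ_i | λ_i, τ ~ N(0, λ_i²τ²), λ_i | η_i ~ C⁺(0, η_i), η_i ~ C⁺(0, 1), which adds one extra half-Cauchy mixing layer to the horseshoe. The following NumPy sketch (illustrative only, not the authors' code; all function names are ours) draws from both priors and compares how much mass each places near the origin:

```python
import numpy as np

rng = np.random.default_rng(0)

def half_cauchy(rng, scale, size):
    """Half-Cauchy(0, scale) via the absolute value of a Cauchy draw."""
    return np.abs(scale * rng.standard_cauchy(size))

def sample_horseshoe_plus(rng, n, tau=1.0):
    # Horseshoe+ hierarchy: eta_i ~ C+(0,1), lambda_i | eta_i ~ C+(0, eta_i),
    # theta_i | lambda_i ~ N(0, lambda_i^2 * tau^2).
    eta = half_cauchy(rng, 1.0, n)
    lam = half_cauchy(rng, eta, n)
    return rng.normal(0.0, lam * tau, n)

def sample_horseshoe(rng, n, tau=1.0):
    # Plain horseshoe: lambda_i ~ C+(0,1), no extra mixing layer.
    lam = half_cauchy(rng, 1.0, n)
    return rng.normal(0.0, lam * tau, n)

theta_plus = sample_horseshoe_plus(rng, 100_000)
theta_hs = sample_horseshoe(rng, 100_000)

# The extra layer sharpens the spike at zero (and fattens the tails),
# which is what drives the faster posterior concentration for sparse signals.
print("mass near 0, horseshoe+:", np.mean(np.abs(theta_plus) < 0.1))
print("mass near 0, horseshoe :", np.mean(np.abs(theta_hs) < 0.1))
```

With a large sample, the horseshoe+ draws show visibly more mass in a small neighborhood of zero than the horseshoe draws, matching the "ultra-sparse" motivation in the abstract.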

Citation


Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon Willard. "The Horseshoe+ Estimator of Ultra-Sparse Signals." Bayesian Anal. 12(4): 1105-1131, December 2017. https://doi.org/10.1214/16-BA1028

Information

Published: December 2017
First available in Project Euclid: 22 September 2016

zbMATH: 1384.62079
MathSciNet: MR3724980
Digital Object Identifier: 10.1214/16-BA1028

Subjects:
Primary: 62F15
Secondary: 62C10 , 62F12

Keywords: Bayesian, global–local shrinkage, horseshoe, horseshoe+, normal means, sparsity
