Open Access
Lasso Meets Horseshoe: A Survey
Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon Willard
Statist. Sci. 34(3): 405-427 (August 2019). DOI: 10.1214/19-STS700

Abstract

The goal of this paper is to contrast and survey the major advances in two of the most commonly used high-dimensional techniques, namely, the Lasso and horseshoe regularization. The Lasso is a gold standard for predictor selection, while the horseshoe is a state-of-the-art Bayesian estimator for sparse signals. The Lasso is fast and scalable, relying on convex optimization, whereas the horseshoe corresponds to a nonconvex penalty. Our novel perspective focuses on three aspects: (i) theoretical optimality in high-dimensional inference for the Gaussian sparse model and beyond, (ii) efficiency and scalability of computation and (iii) methodological development and performance.
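The contrast drawn in the abstract can be illustrated with a minimal sketch: the Lasso's convex penalty yields a closed-form soft-thresholding update inside coordinate descent, while the horseshoe's posterior mean under the normal-means model shrinks adaptively via a half-Cauchy local scale. All data, hyperparameter values (e.g., the penalty `lam=0.1` and global scale `tau=1.0`), and function names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Illustrative toy data: sparse linear model (assumed dimensions and signal).
rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]                  # three nonzero signals
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def soft_threshold(z, lam):
    """Closed-form solution of the univariate Lasso problem (convex L1 penalty)."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual excluding j
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta

beta_hat = lasso_cd(X, y, lam=0.1)

def horseshoe_postmean(y_i, tau=1.0, n_mc=20000, rng=rng):
    """Monte Carlo posterior mean for one normal-means observation:
    y_i ~ N(theta, 1), theta ~ N(0, (tau*lam)^2), lam ~ half-Cauchy(0, 1).
    Importance sampling from the prior on lam, weighted by the marginal
    likelihood N(y_i; 0, 1 + (tau*lam)^2)."""
    lam = np.abs(rng.standard_cauchy(n_mc))        # half-Cauchy prior draws
    s2 = 1.0 + (tau * lam) ** 2                    # marginal variance given lam
    w = np.exp(-0.5 * y_i ** 2 / s2) / np.sqrt(s2) # marginal likelihood weights
    kappa = 1.0 / s2                               # shrinkage factor in [0, 1]
    return np.sum(w * (1.0 - kappa)) / np.sum(w) * y_i
```

The two estimators behave differently at the extremes: the Lasso shrinks every coefficient by a constant amount (soft thresholding), while the horseshoe leaves large observations nearly unshrunk and pulls small ones aggressively toward zero, reflecting the global-local prior structure surveyed in the paper.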

Citation


Anindya Bhadra, Jyotishka Datta, Nicholas G. Polson, Brandon Willard. "Lasso Meets Horseshoe: A Survey." Statist. Sci. 34 (3): 405-427, August 2019. https://doi.org/10.1214/19-STS700

Information

Published: August 2019
First available in Project Euclid: 11 October 2019

zbMATH: 07162130
MathSciNet: MR4017521
Digital Object Identifier: 10.1214/19-STS700

Keywords: global-local priors, horseshoe, horseshoe+, hyperparameter tuning, Lasso, regression, regularization, sparsity

Rights: Copyright © 2019 Institute of Mathematical Statistics
