Bayesian Anal., Volume 5, Number 3 (2010), 533-556.
Bayesian regularized quantile regression
Regularization, e.g., the lasso, has been shown to be effective in improving the prediction accuracy of quantile regression (Li and Zhu, 2008; Wu and Liu, 2009). This paper studies regularization in quantile regression from a Bayesian perspective. By proposing a hierarchical model framework, we give a generic treatment to a set of regularization approaches, including the lasso, group lasso, and elastic net penalties. Gibbs samplers are derived for all cases. This is the first work to discuss regularized quantile regression with the group lasso penalty and the elastic net penalty. Both simulated and real data examples show that Bayesian regularized quantile regression methods often outperform quantile regression without regularization as well as their non-Bayesian counterparts with regularization.
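As background for the abstract, the non-Bayesian counterpart discussed above (lasso-penalized quantile regression as in Li and Zhu, 2008) minimizes the check loss plus an L1 penalty. This is not code from the paper; it is a minimal illustrative sketch that solves that optimization exactly by recasting it as a linear program, assuming a design matrix `X` and response `y` (the function name `lasso_qr` and parameter names are placeholders, not from the source):

```python
import numpy as np
from scipy.optimize import linprog

def lasso_qr(X, y, tau=0.5, lam=1.0):
    """Lasso-penalized quantile regression via linear programming.

    Minimizes  sum_i rho_tau(y_i - x_i' beta) + lam * ||beta||_1,
    where rho_tau(u) = u * (tau - 1{u < 0}) is the check loss.

    LP variables (all constrained nonnegative):
      beta+ , beta-  : positive/negative parts of the coefficients
      r+   , r-      : positive/negative parts of the residuals
    subject to  X (beta+ - beta-) + r+ - r- = y.
    """
    n, p = X.shape
    # Objective: lam on both coefficient parts (gives the L1 norm),
    # tau on positive residuals, (1 - tau) on negative residuals.
    c = np.concatenate([lam * np.ones(p), lam * np.ones(p),
                        tau * np.ones(n), (1 - tau) * np.ones(n)])
    # Equality constraint encoding y - X beta = r+ - r-.
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * p + 2 * n), method="highs")
    # Recover signed coefficients from their positive/negative parts.
    return res.x[:p] - res.x[p:2 * p]

# Usage sketch on synthetic data: median regression (tau = 0.5)
# with a light lasso penalty.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
beta_true = np.array([2.0, 0.0, -1.5])
y = X @ beta_true + 0.1 * rng.standard_normal(200)
beta_hat = lasso_qr(X, y, tau=0.5, lam=0.1)
```

The Bayesian formulation studied in the paper instead places an asymmetric Laplace likelihood on the residuals and priors inducing the penalties, with inference by the derived Gibbs samplers; the LP above only sketches the penalized-loss view being compared against.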
First available in Project Euclid: 22 June 2012
Li, Qing; Xi, Ruibin; Lin, Nan. Bayesian regularized quantile regression. Bayesian Anal. 5 (2010), no. 3, 533-556. doi:10.1214/10-BA521. https://projecteuclid.org/euclid.ba/1340380540