The Annals of Statistics

The solution path of the generalized lasso

Ryan J. Tibshirani and Jonathan Taylor

Full-text: Open access


We present a path algorithm for the generalized lasso problem. This problem penalizes the ℓ1 norm of a matrix D times the coefficient vector, and has a wide range of applications, dictated by the choice of D. Our algorithm is based on solving the dual of the generalized lasso, which greatly facilitates computation of the path. For D = I (the usual lasso), we draw a connection between our approach and the well-known LARS algorithm. For an arbitrary D, we derive an unbiased estimate of the degrees of freedom of the generalized lasso fit. This estimate turns out to be quite intuitive in many applications.
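The objective described in the abstract is minimize (1/2)‖y − Xβ‖² + λ‖Dβ‖₁. The sketch below is my own illustration of that objective, not the authors' dual path algorithm: it solves the problem at a single fixed λ by ADMM (splitting z = Dβ), with the penalty parameter `rho` and iteration count chosen arbitrarily, and a dense matrix inverse that is only sensible for small problems.

```python
import numpy as np

def soft_threshold(a, k):
    """Elementwise soft-thresholding, the prox operator of k*||.||_1."""
    return np.sign(a) * np.maximum(np.abs(a) - k, 0.0)

def generalized_lasso(y, X, D, lam, rho=1.0, n_iter=500):
    """Minimize 0.5*||y - X b||^2 + lam*||D b||_1 via ADMM with the
    split z = D b.  Illustrative only: not the paper's path algorithm,
    which computes solutions for all lambda at once."""
    Xty = X.T @ y
    A_inv = np.linalg.inv(X.T @ X + rho * D.T @ D)  # small problems only
    z = np.zeros(D.shape[0])
    u = np.zeros(D.shape[0])
    for _ in range(n_iter):
        beta = A_inv @ (Xty + rho * D.T @ (z - u))  # quadratic beta-step
        Db = D @ beta
        z = soft_threshold(Db + u, lam / rho)       # prox z-step
        u += Db - z                                 # scaled dual update
    return beta

def first_difference_matrix(p):
    """D for the 1d fused lasso: row i is (-1 at i, +1 at i+1)."""
    return np.eye(p - 1, p, k=1) - np.eye(p - 1, p)
```

As the abstract notes, the choice of D dictates the application: D = I recovers the usual lasso (with X = I the minimizer is simply soft_threshold(y, λ)), while the first-difference matrix above gives the 1d fused lasso.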

Article information

Ann. Statist. Volume 39, Number 3 (2011), 1335-1371.

First available in Project Euclid: 4 May 2011


Subjects: Primary 62-XX: Statistics

Keywords: lasso; path algorithm; Lagrange dual; LARS; degrees of freedom


Tibshirani, Ryan J.; Taylor, Jonathan. The solution path of the generalized lasso. Ann. Statist. 39 (2011), no. 3, 1335--1371. doi:10.1214/11-AOS878.



  • [1] Becker, S., Bobin, J. and Candès, E. J. (2011). NESTA: A fast and accurate first-order method for sparse recovery. SIAM Journal on Imaging Sciences 4 1–39.
  • [2] Bertsekas, D. P. (1999). Nonlinear Programming. Athena Scientific, Nashua, NH.
  • [3] Best, M. J. (1982). An algorithm for the solution of the parametric quadratic programming problem. CORR Report 82–84, Univ. Waterloo.
  • [4] Boyd, S. and Vandenberghe, L. (2004). Convex Optimization. Cambridge Univ. Press, Cambridge.
  • [5] Bredel, M., Bredel, C., Juric, D., Harsh, G. R., Vogel, H., Recht, L. D. and Sikic, B. I. (2005). High-resolution genome-wide mapping of genetic alterations in human glial brain tumors. Cancer Res. 65 4088–4096.
  • [6] Centers for Disease Control and Prevention. (2009). “Novel H1N1 flu situation update.” Available at
  • [7] Chen, S. S., Donoho, D. L. and Saunders, M. A. (1998). Atomic decomposition by basis pursuit. SIAM J. Sci. Comput. 20 33–61.
  • [8] Cleveland, W., Grosse, E., Shyu, W. and Terpenning, I. (1991). Local regression models. In Statistical Models in S (J. Chambers and T. Hastie, eds.). Wadsworth, Belmont, CA.
  • [9] Donoho, D. L. and Johnstone, I. M. (1995). Adapting to unknown smoothness via wavelet shrinkage. J. Amer. Statist. Assoc. 90 1200–1224.
  • [10] Donoho, D. L. and Tanner, J. (2010). Counting the faces of randomly-projected hypercubes and orthants, with applications. Discrete Comput. Geom. 43 522–541.
  • [11] Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression. Ann. Statist. 32 407–499.
  • [12] Elad, M., Milanfar, P. and Rubinstein, R. (2007). Analysis versus synthesis in signal priors. Inverse Problems 23 947–968.
  • [13] Evans, L. C. and Gariepy, R. F. (1992). Measure Theory and Fine Properties of Functions. CRC Press, Boca Raton, FL.
  • [14] Friedman, J., Hastie, T., Höfling, H. and Tibshirani, R. (2007). Pathwise coordinate optimization. Ann. Appl. Statist. 1 302–332.
  • [15] Golub, G. H. and Van Loan, C. F. (1996). Matrix Computations, 3rd ed. Johns Hopkins Univ. Press, Baltimore, MD.
  • [16] Hastie, T., Rosset, S., Tibshirani, R. and Zhu, J. (2003/04). The entire regularization path for the support vector machine. J. Mach. Learn. Res. 5 1391–1415 (electronic).
  • [17] Hastie, T. and Tibshirani, R. (1990). Generalized Additive Models. Monographs on Statistics and Applied Probability 43. Chapman & Hall, London.
  • [18] Hastie, T. and Tibshirani, R. (1993). Varying-coefficient models. J. Roy. Statist. Soc. Ser. B 55 757–796.
  • [19] Hoefling, H. (2009). A path algorithm for the fused lasso signal approximator. Unpublished manuscript. Available at
  • [20] James, G. M., Radchenko, P. and Lv, J. (2009). DASSO: Connections between the Dantzig selector and lasso. J. R. Stat. Soc. Ser. B Stat. Methodol. 71 127–142.
  • [21] Kim, S.-J., Koh, K., Boyd, S. and Gorinevsky, D. (2009). ℓ1 trend filtering. SIAM Rev. 51 339–360.
  • [22] Osborne, M. R., Presnell, B. and Turlach, B. A. (2000). On the LASSO and its dual. J. Comput. Graph. Statist. 9 319–337.
  • [23] R Development Core Team (2008). R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. Available at
  • [24] Rosset, S. and Zhu, J. (2007). Piecewise linear regularized solution paths. Ann. Statist. 35 1012–1030.
  • [25] Rudin, L. I., Osher, S. and Fatemi, E. (1992). Nonlinear total variation based noise removal algorithms. Phys. D 60 259–268.
  • [26] Schneider, R. (1993). Convex Bodies: The Brunn–Minkowski Theory. Encyclopedia of Mathematics and Its Applications 44. Cambridge Univ. Press, Cambridge.
  • [27] She, Y. (2010). Sparse regression with exact clustering. Electron. J. Stat. 4 1055–1096.
  • [28] She, Y. and Owen, A. B. (2010). Outlier detection using nonconvex penalized regression. Unpublished manuscript. Available at owen/reports/theta-ipod.pdf.
  • [29] Stein, C. M. (1981). Estimation of the mean of a multivariate normal distribution. Ann. Statist. 9 1135–1151.
  • [30] Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267–288.
  • [31] Tibshirani, R., Saunders, M., Rosset, S., Zhu, J. and Knight, K. (2005). Sparsity and smoothness via the fused lasso. J. R. Stat. Soc. Ser. B Stat. Methodol. 67 91–108.
  • [32] Tibshirani, R. J. and Taylor, J. (2011). Supplement to “The solution path of the generalized lasso.” DOI:10.1214/11-AOS878SUPP.
  • [33] Tseng, P. (2001). Convergence of a block coordinate descent method for nondifferentiable minimization. J. Optim. Theory Appl. 109 475–494.
  • [34] Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. J. R. Stat. Soc. Ser. B Stat. Methodol. 67 301–320.
  • [35] Zou, H., Hastie, T. and Tibshirani, R. (2007). On the “degrees of freedom” of the lasso. Ann. Statist. 35 2173–2192.

Supplemental materials

  • Supplementary material: Proofs and technical details. A supplementary document containing proofs and technical details for “The solution path of the generalized lasso”.