The Annals of Statistics

Degrees of freedom in lasso problems

Ryan J. Tibshirani and Jonathan Taylor



We derive the degrees of freedom of the lasso fit, placing no assumptions on the predictor matrix $X$. Like the well-known result of Zou, Hastie and Tibshirani [Ann. Statist. 35 (2007) 2173–2192], which gives the degrees of freedom of the lasso fit when $X$ has full column rank, we express our result in terms of the active set of a lasso solution. We extend this result to cover the degrees of freedom of the generalized lasso fit for an arbitrary predictor matrix $X$ (and an arbitrary penalty matrix $D$). Though our focus is degrees of freedom, we establish some intermediate results on the lasso and generalized lasso that may be interesting in their own right.
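As a concrete illustration of the active-set characterization in the abstract, the sketch below fits a lasso by coordinate descent on synthetic data and uses the size of the active set (the number of nonzero coefficients) as an estimate of the degrees of freedom, as in the full-column-rank result of Zou, Hastie and Tibshirani. The solver, data, and regularization level are illustrative assumptions, not code from the paper.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """Solve min_b 0.5*||y - X b||^2 + lam*||b||_1 by cyclic coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)  # squared column norms
    r = y.copy()                   # running residual y - X b
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            # correlation of column j with the partial residual (b_j removed)
            rho = X[:, j] @ r + col_sq[j] * b[j]
            # soft-thresholding update for coordinate j
            b_new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r += X[:, j] * (b[j] - b_new)
            b[j] = b_new
    return b

# Synthetic example (hypothetical data, chosen for illustration only)
rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]          # three truly active predictors
y = X @ beta + rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=5.0)
# Size of the active set: an unbiased estimate of the lasso fit's
# degrees of freedom when X has full column rank (as it does here, n > p).
df_estimate = np.count_nonzero(np.abs(b_hat) > 1e-10)
```

In practice one would average this count over many draws of $y$ (or apply Stein's formula directly) to estimate risk via $C_p$-type criteria; the point here is only that the count of nonzero coefficients is the quantity the theorem identifies.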

Article information

Ann. Statist. Volume 40, Number 2 (2012), 1198–1232.

First available in Project Euclid: 18 July 2012

Primary: 62J07 (Ridge regression; shrinkage estimators); 90C46 (Optimality conditions, duality [see also 49N15])

Keywords: lasso; generalized lasso; degrees of freedom; high-dimensional


Tibshirani, Ryan J.; Taylor, Jonathan. Degrees of freedom in lasso problems. Ann. Statist. 40 (2012), no. 2, 1198–1232. doi:10.1214/12-AOS1003.


  • Chen, S. S., Donoho, D. L. and Saunders, M. A. (1998). Atomic decomposition by basis pursuit. SIAM J. Sci. Comput. 20 33–61.
  • Dossal, C., Kachour, M., Fadili, J., Peyré, G. and Chesneau, C. (2011). The degrees of freedom of the lasso for general design matrix. Available at arXiv:1111.1162.
  • Efron, B. (1986). How biased is the apparent error rate of a prediction rule? J. Amer. Statist. Assoc. 81 461–470.
  • Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression (with discussion, and a rejoinder by the authors). Ann. Statist. 32 407–499.
  • Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96 1348–1360.
  • Grünbaum, B. (2003). Convex Polytopes, 2nd ed. Graduate Texts in Mathematics 221. Springer, New York.
  • Hastie, T. J. and Tibshirani, R. J. (1990). Generalized Additive Models. Monographs on Statistics and Applied Probability 43. Chapman & Hall, London.
  • Loubes, J. M. and Massart, P. (2004). Discussion of “Least angle regression.” Ann. Statist. 32 460–465.
  • Mallows, C. (1973). Some comments on $C_p$. Technometrics 15 661–675.
  • Meyer, M. and Woodroofe, M. (2000). On the degrees of freedom in shape-restricted regression. Ann. Statist. 28 1083–1104.
  • Osborne, M. R., Presnell, B. and Turlach, B. A. (2000). On the LASSO and its dual. J. Comput. Graph. Statist. 9 319–337.
  • Rosset, S., Zhu, J. and Hastie, T. (2004). Boosting as a regularized path to a maximum margin classifier. J. Mach. Learn. Res. 5 941–973.
  • Schneider, R. (1993). Convex Bodies: The Brunn–Minkowski Theory. Encyclopedia of Mathematics and Its Applications 44. Cambridge Univ. Press, Cambridge.
  • Stein, C. M. (1981). Estimation of the mean of a multivariate normal distribution. Ann. Statist. 9 1135–1151.
  • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267–288.
  • Tibshirani, R. J. (2011). The solution path of the generalized lasso. Ph.D. thesis, Dept. Statistics, Stanford Univ.
  • Tibshirani, R. J. and Taylor, J. (2011). The solution path of the generalized lasso. Ann. Statist. 39 1335–1371.
  • Vaiter, S., Peyré, G., Dossal, C. and Fadili, J. (2011). Robust sparse analysis regularization. Available at arXiv:1109.6222.
  • Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. J. R. Stat. Soc. Ser. B Stat. Methodol. 67 301–320.
  • Zou, H., Hastie, T. and Tibshirani, R. (2007). On the “degrees of freedom” of the lasso. Ann. Statist. 35 2173–2192.