Statistical Science

Comment: Boosting Algorithms: Regularization, Prediction and Model Fitting

Trevor Hastie

Article information

Source
Statist. Sci., Volume 22, Number 4 (2007), 513–515.

Dates
First available in Project Euclid: 7 April 2008

Permanent link to this document
https://projecteuclid.org/euclid.ss/1207580165

Digital Object Identifier
doi:10.1214/07-STS242A

Mathematical Reviews number (MathSciNet)
MR2420456

Zentralblatt MATH identifier
1246.62167

Citation

Hastie, Trevor. Comment: Boosting Algorithms: Regularization, Prediction and Model Fitting. Statist. Sci. 22 (2007), no. 4, 513–515. doi:10.1214/07-STS242A. https://projecteuclid.org/euclid.ss/1207580165

