Statistical Science

Rejoinder: Best Subset, Forward Stepwise or Lasso? Analysis and Recommendations Based on Extensive Comparisons

Trevor Hastie, Robert Tibshirani, and Ryan J. Tibshirani


Article information

Source
Statist. Sci., Volume 35, Number 4 (2020), 625-626.

Dates
First available in Project Euclid: 17 November 2020

Permanent link to this document
https://projecteuclid.org/euclid.ss/1605603637

Digital Object Identifier
doi:10.1214/20-STS733REJ

Mathematical Reviews number (MathSciNet)
MR4175388

Citation

Hastie, Trevor; Tibshirani, Robert; Tibshirani, Ryan J. Rejoinder: Best Subset, Forward Stepwise or Lasso? Analysis and Recommendations Based on Extensive Comparisons. Statist. Sci. 35 (2020), no. 4, 625–626. doi:10.1214/20-STS733REJ. https://projecteuclid.org/euclid.ss/1605603637


See also

  • Main article: Sparse Regression: Scalable Algorithms and Empirical Performance.
  • Main article: Best Subset, Forward Stepwise or Lasso? Analysis and Recommendations Based on Extensive Comparisons.