Statist. Sci. Volume 30, Number 4 (2015), 559-581.
Proximal Algorithms in Statistics and Machine Learning
Proximal algorithms are useful for obtaining solutions to difficult optimization problems, especially those involving nonsmooth or composite objective functions. A proximal algorithm is one whose basic iterations involve the proximal operator of some function, whose evaluation requires solving a specific optimization problem that is typically easier than the original problem. Many familiar algorithms can be cast in this form, and this “proximal view” turns out to provide a set of broad organizing principles for many algorithms useful in statistics and machine learning. In this paper, we show how a number of recent advances in this area can inform modern statistical practice. We focus on several main themes: (1) variable splitting strategies and the augmented Lagrangian; (2) the broad utility of envelope (or variational) representations of objective functions; (3) proximal algorithms for composite objective functions; and (4) the surprisingly large number of functions whose proximal operators have closed-form solutions. We illustrate our methodology with regularized logistic and Poisson regression incorporating a nonconvex bridge penalty and a fused lasso penalty. We also discuss several related issues, including the convergence of nondescent algorithms, acceleration, and optimization for nonconvex functions. Finally, we provide directions for future research in this exciting area at the intersection of statistics and optimization.
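As a concrete illustration of the kind of method the abstract describes (not code from the paper itself), the sketch below shows a proximal gradient iteration for the lasso. The ℓ1 penalty is one of the classic cases where the proximal operator has a closed form, namely elementwise soft-thresholding; the problem setup, function names, and step-size choice here are illustrative assumptions.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1.

    Closed form: elementwise soft-thresholding,
    prox(v) = sign(v) * max(|v| - t, 0).
    """
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, n_iter=500):
    """Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by iterating

        x <- prox_{step * lam * ||.||_1}(x - step * grad f(x)),

    i.e. a gradient step on the smooth least-squares part followed by
    the proximal operator of the nonsmooth penalty.  `step` should be
    at most 1 / ||A^T A|| for convergence.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)            # gradient of the smooth part
        x = prox_l1(x - step * grad, step * lam)
    return x
```

For example, with `A` the identity the problem decouples and the solution is simply the soft-thresholded data: `proximal_gradient_lasso(np.eye(3), np.array([3.0, 0.5, -2.0]), lam=1.0, step=1.0)` returns approximately `[2, 0, -1]`, with the small coefficient shrunk exactly to zero.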
First available in Project Euclid: 9 December 2015
Polson, Nicholas G.; Scott, James G.; Willard, Brandon T. Proximal Algorithms in Statistics and Machine Learning. Statist. Sci. 30 (2015), no. 4, 559--581. doi:10.1214/15-STS530. https://projecteuclid.org/euclid.ss/1449670858.