Open Access
November 2015 Proximal Algorithms in Statistics and Machine Learning
Nicholas G. Polson, James G. Scott, Brandon T. Willard
Statist. Sci. 30(4): 559-581 (November 2015). DOI: 10.1214/15-STS530


Proximal algorithms are useful for obtaining solutions to difficult optimization problems, especially those involving nonsmooth or composite objective functions. A proximal algorithm is one whose basic iterations involve the proximal operator of some function, whose evaluation requires solving a specific optimization problem that is typically easier than the original problem. Many familiar algorithms can be cast in this form, and this “proximal view” turns out to provide a set of broad organizing principles for many algorithms useful in statistics and machine learning. In this paper, we show how a number of recent advances in this area can inform modern statistical practice. We focus on several main themes: (1) variable splitting strategies and the augmented Lagrangian; (2) the broad utility of envelope (or variational) representations of objective functions; (3) proximal algorithms for composite objective functions; and (4) the surprisingly large number of functions for which there are closed-form solutions of proximal operators. We illustrate our methodology with regularized logistic and Poisson regression incorporating a nonconvex bridge penalty and a fused lasso penalty. We also discuss several related issues, including the convergence of nondescent algorithms, acceleration, and optimization for nonconvex functions. Finally, we provide directions for future research in this exciting area at the intersection of statistics and optimization.
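As a concrete illustration of the abstract's point about closed-form proximal operators (this sketch is not drawn from the paper itself), the proximal operator of the scaled L1 norm is the well-known soft-thresholding map, and iterating a gradient step followed by this operator gives a basic proximal gradient method for the lasso. The names `prox_l1` and `prox_gradient_step` below are illustrative, not from the paper:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||x||_1 (soft thresholding):
    argmin_x  lam * ||x||_1 + 0.5 * ||x - v||^2,
    which has the closed-form solution sign(v) * max(|v| - lam, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_gradient_step(x, A, y, lam, step):
    """One proximal gradient iteration for the lasso objective
    0.5 * ||y - A x||^2 + lam * ||x||_1: take a gradient step on the
    smooth part, then apply the proximal operator of the nonsmooth part."""
    grad = A.T @ (A @ x - y)          # gradient of the smooth least-squares term
    return prox_l1(x - step * grad, step * lam)
```

The split into a smooth gradient step and a cheap closed-form prox is the pattern behind many of the composite-objective algorithms the paper surveys.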




Published: November 2015
First available in Project Euclid: 9 December 2015

zbMATH: 06946202
MathSciNet: MR3432841
Digital Object Identifier: 10.1214/15-STS530

Keywords: ADMM, Bayes MAP, Divide and Concur, envelopes, Kurdyka–Łojasiewicz, nonconvex, optimization, regularization, shrinkage, sparsity, splitting

Rights: Copyright © 2015 Institute of Mathematical Statistics

