Annals of Statistics (Ann. Statist.), Volume 46, Number 2 (2018), 814-841.
I-LAMM for sparse learning: Simultaneous control of algorithmic complexity and statistical error
Jianqing Fan, Han Liu, Qiang Sun, and Tong Zhang
Abstract
We propose a computational framework named iterative local adaptive majorize-minimization (I-LAMM) to simultaneously control algorithmic complexity and statistical error when fitting high-dimensional models. I-LAMM is a two-stage algorithmic implementation of the local linear approximation to a family of folded concave penalized quasi-likelihoods. The first stage solves a convex program with a crude precision tolerance to obtain a coarse initial estimator, which is further refined in the second stage by iteratively solving a sequence of convex programs with smaller precision tolerances. Theoretically, we establish a phase transition: the first stage has a sublinear iteration complexity, while the second stage achieves an improved linear rate of convergence. Though this framework is completely algorithmic, it provides solutions with optimal statistical performance and controlled algorithmic complexity for a large family of nonconvex optimization problems. The effects of the iterations on the statistical error are demonstrated explicitly via a contraction property. Our theory relies on a localized version of the sparse/restricted eigenvalue condition, which allows us to analyze a large family of loss and penalty functions and provide optimality guarantees under very weak assumptions (e.g., I-LAMM requires a much weaker minimal signal strength than other procedures). Thorough numerical results are provided to support the theory.
Article information
Source
Ann. Statist., Volume 46, Number 2 (2018), 814-841.
Dates
Received: July 2015
Revised: March 2017
First available in Project Euclid: 3 April 2018
Permanent link to this document
https://projecteuclid.org/euclid.aos/1522742437
Digital Object Identifier
doi:10.1214/17-AOS1568
Mathematical Reviews number (MathSciNet)
MR3782385
Zentralblatt MATH identifier
06870280
Subjects
Primary: 62J07: Ridge regression; shrinkage estimators
Secondary: 62C20: Minimax procedures; 62H35: Image analysis
Keywords
Algorithmic statistics; iteration complexity; local adaptive MM; nonconvex statistical optimization; optimal rate of convergence
Citation
Fan, Jianqing; Liu, Han; Sun, Qiang; Zhang, Tong. I-LAMM for sparse learning: Simultaneous control of algorithmic complexity and statistical error. Ann. Statist. 46 (2018), no. 2, 814--841. doi:10.1214/17-AOS1568. https://projecteuclid.org/euclid.aos/1522742437
Supplemental materials
- Supplement to “I-LAMM for sparse learning: Simultaneous control of algorithmic complexity and statistical error”. The Supplementary Material [Fan et al. (2018)] contains proofs of Corollary 4.3, Theorem 4.4, Proposition 4.5, Proposition 4.6 and Theorem 4.7 in Section 4, and collects proofs of the lemmas presented in Section 5. An application to robust linear regression is given in Appendix D. Other technical lemmas are collected in Appendices E and F. Digital Object Identifier: doi:10.1214/17-AOS1568SUPP