Open Access
April 2018
Regularization and the small-ball method I: Sparse recovery
Guillaume Lecué, Shahar Mendelson
Ann. Statist. 46(2): 611-641 (April 2018). DOI: 10.1214/17-AOS1562

Abstract

We obtain bounds on estimation error rates for regularization procedures of the form \begin{equation*}\hat{f}\in\operatorname*{argmin}_{f\in F}\Big(\frac{1}{N}\sum_{i=1}^{N}(Y_{i}-f(X_{i}))^{2}+\lambda \Psi(f)\Big)\end{equation*} when $\Psi$ is a norm and $F$ is convex.

Our approach gives a common framework that may be used in the analysis of learning problems and regularization problems alike. In particular, it sheds some light on the role various notions of sparsity have in regularization and on their connection with the size of subdifferentials of $\Psi$ in a neighborhood of the true minimizer.
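For intuition on the subdifferential connection, recall the standard form of the $\ell_1$ subdifferential (a textbook fact, not a result of this paper): at a point $\beta\in\mathbb{R}^{p}$,
\[
\partial\|\cdot\|_{1}(\beta)=\bigl\{z\in\mathbb{R}^{p}:\ z_{i}=\operatorname{sign}(\beta_{i})\ \text{if}\ \beta_{i}\neq 0,\ \ |z_{i}|\le 1\ \text{if}\ \beta_{i}=0\bigr\}.
\]
The sparser $\beta$ is, the more coordinates of $z$ are unconstrained beyond $|z_{i}|\le 1$, so the subdifferential is larger at sparse points; relating the size of such sets near the true minimizer to estimation rates is the subject of the paper.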

As a “proof of concept,” we extend the known estimates for the LASSO, SLOPE and trace-norm regularization.
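For $\Psi=\|\cdot\|_{1}$ the procedure above is the LASSO. A minimal sketch of that special case, using a generic proximal-gradient (ISTA) solver rather than anything from the paper (the function name and parameters are illustrative assumptions):

```python
import numpy as np

def ista_lasso(X, y, lam, n_iter=500):
    """Minimize (1/N)||y - X b||^2 + lam * ||b||_1 by proximal gradient (ISTA).

    This is a generic solver for the l1-regularized least-squares objective;
    it is NOT the estimator analysis from the paper, only an illustration of
    the regularization procedure with Psi the l1 norm.
    """
    N, p = X.shape
    # Step size 1/L, where L = (2/N) * sigma_max(X)^2 is the Lipschitz
    # constant of the gradient of the smooth squared-loss term.
    step = N / (2.0 * np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = -2.0 / N * X.T @ (y - X @ beta)      # gradient of the loss
        z = beta - step * grad                       # gradient step
        # Proximal step: soft-thresholding at level step * lam.
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta
```

On a random Gaussian design with an $s$-sparse target, the returned coefficient vector is close to the truth and nearly zero off the true support, which is the sparse-recovery behavior the paper's bounds quantify.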

Citation


Guillaume Lecué, Shahar Mendelson. "Regularization and the small-ball method I: Sparse recovery." Ann. Statist. 46 (2), 611–641, April 2018. https://doi.org/10.1214/17-AOS1562

Information

Received: 1 February 2016; Revised: 1 January 2017; Published: April 2018
First available in Project Euclid: 3 April 2018

zbMATH: 06870274
MathSciNet: MR3782379
Digital Object Identifier: 10.1214/17-AOS1562

Subjects:
Primary: 60K35, 62G08
Secondary: 62C20, 62G05, 62G20

Keywords: Empirical processes, High-dimensional statistics

Rights: Copyright © 2018 Institute of Mathematical Statistics
