Fundamental barriers to high-dimensional regression with convex penalties
February 2022
Michael Celentano, Andrea Montanari
Ann. Statist. 50(1): 170-196 (February 2022). DOI: 10.1214/21-AOS2100


In high-dimensional regression, we attempt to estimate a parameter vector $\beta_0 \in \mathbb{R}^p$ from $n \lesssim p$ observations $\{(y_i, x_i)\}_{i \le n}$, where $x_i \in \mathbb{R}^p$ is a vector of predictors and $y_i$ is a response variable. A well-established approach uses convex regularizers to promote specific structures (e.g., sparsity) of the estimate $\hat{\beta}$ while allowing for practical algorithms. Theoretical analysis implies that convex penalization schemes have nearly optimal estimation properties in certain settings. However, in general the gaps between statistically optimal estimation (with unbounded computational resources) and convex methods are poorly understood.
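The setup above can be made concrete with a small numerical sketch. This is not code from the paper; it is a hypothetical illustration of convex-penalized regression (here the Lasso, solved by proximal gradient descent) on a standard Gaussian design, with made-up dimensions `n`, `p`, sparsity `k`, and penalty level `lam`:

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard Gaussian design with n < p (illustrative dimensions only).
n, p, k = 100, 200, 10
X = rng.standard_normal((n, p)) / np.sqrt(n)
beta0 = np.zeros(p)
beta0[:k] = 3.0                        # sparse signal: k nonzero coordinates
y = X @ beta0 + 0.1 * rng.standard_normal(n)

def soft_threshold(v, t):
    """Proximal operator of the L1 penalty t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the smooth part
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        b = soft_threshold(b - grad / L, lam / L)
    return b

beta_hat = lasso_ista(X, y, lam=0.05)
```

The paper's question is how the estimation error of such convex procedures, optimized over the choice of penalty, compares to the statistically optimal error in this Gaussian-design regime.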

We show that when the statistician has very simple structural information about the distribution of the entries of $\beta_0$, a large gap frequently exists between the best performance achieved by any convex regularizer satisfying a mild technical condition and either: (i) the optimal statistical error or (ii) the statistical error achieved by optimal approximate message passing algorithms. Remarkably, a gap occurs at high enough signal-to-noise ratio if and only if the distribution of the coordinates of $\beta_0$ is not log-concave. These conclusions follow from an analysis of standard Gaussian designs. Our lower bounds are expected to be generally tight, and we prove tightness under certain conditions.
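The log-concavity dichotomy can be illustrated with a toy numerical check. The example below is hypothetical and not taken from the paper: a single Gaussian density is log-concave, while a well-separated symmetric two-component Gaussian mixture (a natural model for coordinates of $\beta_0$ clustered around two values) is not, since its log-density has a local minimum between the modes:

```python
import numpy as np

def log_density_mixture(x, mu=3.0, sigma=1.0):
    """Log-density of the mixture 0.5*N(-mu, sigma^2) + 0.5*N(mu, sigma^2)."""
    c = -0.5 * np.log(2 * np.pi * sigma**2)
    a = c - 0.5 * ((x - mu) / sigma) ** 2
    b = c - 0.5 * ((x + mu) / sigma) ** 2
    return np.logaddexp(a, b) + np.log(0.5)

def is_log_concave_on_grid(logf, lo=-6.0, hi=6.0, m=2001):
    """Test concavity of logf via second differences on a uniform grid."""
    x = np.linspace(lo, hi, m)
    v = logf(x)
    second_diff = v[:-2] - 2 * v[1:-1] + v[2:]
    return bool(np.all(second_diff <= 1e-9))

# Standard Gaussian log-density: concave everywhere.
gaussian = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

print(is_log_concave_on_grid(gaussian))             # True
print(is_log_concave_on_grid(log_density_mixture))  # False
```

Under the paper's result, a coordinate distribution of the second kind is exactly the situation where, at high enough signal-to-noise ratio, no convex penalty (satisfying the mild technical condition) can match the optimal statistical error.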

Funding Statement

The first author was supported in part by NSF Grants DGE-1656518, CCF-1714305, IIS-1741162, and ONR Grant N00014-18-1-2729.


Download Citation

Michael Celentano, Andrea Montanari. "Fundamental barriers to high-dimensional regression with convex penalties." Ann. Statist. 50 (1): 170-196, February 2022.


Received: 1 April 2019; Revised: 1 June 2021; Published: February 2022
First available in Project Euclid: 16 February 2022

Digital Object Identifier: 10.1214/21-AOS2100

Primary: 62J05, 62J07
Secondary: 62F12

Keywords: approximate message passing, computational to statistical gaps, convexity, high-dimensional regression, M-estimation, penalty

Rights: Copyright © 2022 Institute of Mathematical Statistics


