Abstract
New upper bounds are developed for the distance between ξ/Var[ξ]^{1/2} and linear and quadratic functions of z ∼ N(0, Iₙ) for random variables of the form ξ = z⊤f(z) − div f(z), where f : ℝⁿ → ℝⁿ. The linear approximation yields a central limit theorem when the squared norm of f(z) dominates the squared Frobenius norm of ∇f(z) in expectation.
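As a rough numerical illustration (not taken from the paper), the following Python sketch draws Monte Carlo samples of ξ = z⊤f(z) − div f(z) for a hand-picked vector field f whose norm dominates its gradient, and checks that the standardized samples look approximately Gaussian. The specific f, the constants, and the moment checks are all illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K, n_sim = 50, 20.0, 20000          # dimension, scale, Monte Carlo size (arbitrary)
u = np.ones(n) / np.sqrt(n)            # fixed unit direction (illustrative choice)

# Vector field f(z) = (K + sin(z_1)) * u, chosen so that
#   E||f(z)||^2 ~ K^2   dominates   E||grad f(z)||_F^2 = E[cos^2(z_1)] <= 1.
z = rng.standard_normal((n_sim, n))
s = z @ u                              # u^T z for each sample
# div f(z) = cos(z_1) * u_1, since only f_1 depends on z_1 through sin(z_1)
xi = (K + np.sin(z[:, 0])) * s - u[0] * np.cos(z[:, 0])

# Standardize and inspect Gaussian-like moments; E[xi] = 0 by Stein's identity.
t = (xi - xi.mean()) / xi.std()
skew = float(np.mean(t**3))
ex_kurt = float(np.mean(t**4) - 3.0)
frac_1sd = float(np.mean(np.abs(t) < 1.0))
print(f"mean={xi.mean():.3f} skew={skew:.3f} ex_kurt={ex_kurt:.3f} P(|t|<1)={frac_1sd:.3f}")
```

With K large the ξ samples are close to the linear statistic K·u⊤z, so the empirical skewness and excess kurtosis are near zero and about 68% of standardized samples fall within one standard deviation, as a Gaussian limit predicts.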
Applications of this normal approximation are given for the asymptotic normality of debiased estimators in linear regression with correlated design and convex penalty in the regime p/n ≤ γ for constant γ ∈ (0, ∞). For the estimation of linear functions ⟨a₀, β⟩ of the unknown coefficient vector β, this analysis leads to asymptotic normality of the debiased estimate for most normalized directions a₀, where “most” is quantified in a precise sense. This asymptotic normality holds for any convex penalty if γ < 1 and for any strongly convex penalty if γ ≥ 1. In particular, the penalty need not be separable or permutation invariant. By allowing arbitrary regularizers, the results vastly broaden the scope of applicability of debiasing methodologies to obtain confidence intervals in high dimensions. In the absence of strong convexity for γ ≥ 1, asymptotic normality of the debiased estimate is obtained for the Lasso and the group Lasso under additional conditions. For general convex penalties, our analysis also provides prediction and estimation error bounds of independent interest.
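To make the debiasing idea concrete, here is a self-contained Python toy example with isotropic Gaussian design: a Lasso fit by proximal gradient descent (ISTA), followed by a degrees-of-freedom-adjusted one-step correction for the linear functional ⟨a₀, β⟩ with a₀ = e₁. This is one common debiasing construction sketched under simplifying assumptions (known identity covariance, hand-picked sample sizes and tuning parameter), not a faithful implementation of the paper's general results.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, s0, sigma = 500, 250, 10, 1.0       # illustrative sizes (p/n = 0.5)
beta = np.zeros(p); beta[:s0] = 1.0       # sparse ground truth
X = rng.standard_normal((n, p))           # isotropic design, Sigma = I
y = X @ beta + sigma * rng.standard_normal(n)

# Lasso via ISTA: minimize (1/2n)||y - Xb||^2 + lam * ||b||_1
lam = 1.5 * sigma * np.sqrt(2 * np.log(p) / n)
L = np.linalg.norm(X, 2) ** 2 / n         # Lipschitz constant of the smooth part
b = np.zeros(p)
for _ in range(1000):
    step = b - X.T @ (X @ b - y) / (n * L)
    b = np.sign(step) * np.maximum(np.abs(step) - lam / L, 0.0)  # soft-threshold

# Degrees-of-freedom adjusted debiasing for theta = <e1, beta>:
# add back the residual correlation, rescaled by n minus the Lasso support size.
df = int(np.sum(np.abs(b) > 1e-8))
resid = y - X @ b
theta_hat = b[0] + X[:, 0] @ resid / (n - df)
print(f"lasso[0]={b[0]:.3f}  debiased={theta_hat:.3f}  truth={beta[0]:.3f}  df={df}")
```

The raw Lasso coordinate b[0] is shrunk toward zero by roughly the order of lam, while the corrected estimate theta_hat fluctuates around the truth at the parametric rate, which is what makes interval estimation possible.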
Funding Statement
P.C. Bellec’s research was partially supported by the NSF Grants DMS-1811976 and DMS-1945428.
C.-H. Zhang’s research was partially supported by the NSF Grants DMS-1721495, IIS-1741390, CCF-1934924, DMS-2052949 and DMS-2210850.
Citation
Pierre C. Bellec, Cun-Hui Zhang. "Debiasing convex regularized estimators and interval estimation in linear models." Ann. Statist. 51 (2), 391–436, April 2023. https://doi.org/10.1214/22-AOS2243