Open Access
Convergence rates of least squares regression estimators with heavy-tailed errors
Qiyang Han, Jon A. Wellner
Ann. Statist. 47(4): 2286-2319 (August 2019). DOI: 10.1214/18-AOS1748

Abstract

We study the performance of the least squares estimator (LSE) in a general nonparametric regression model, when the errors are independent of the covariates but may only have a $p$th moment ($p\geq1$). In such a heavy-tailed regression setting, we show that if the model satisfies a standard “entropy condition” with exponent $\alpha\in(0,2)$, then the $L_{2}$ loss of the LSE converges at a rate

\[\mathcal{O}_{\mathbf{P}}\bigl(n^{-\frac{1}{2+\alpha}}\vee n^{-\frac{1}{2}+\frac{1}{2p}}\bigr).\] Such a rate cannot be improved under the entropy condition alone.

This rate quantifies both positive and negative aspects of the LSE in a heavy-tailed regression setting. On the positive side, as long as the errors have $p\geq1+2/\alpha$ moments, the $L_{2}$ loss of the LSE converges at the same rate as if the errors were Gaussian. On the negative side, if $p<1+2/\alpha$, there are (many) hard models at any entropy level $\alpha$ for which the $L_{2}$ loss of the LSE converges at a strictly slower rate than that of other robust estimators.
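The threshold $p=1+2/\alpha$ is exactly where the two terms in the rate balance: setting $n^{-1/(2+\alpha)} = n^{-1/2+1/(2p)}$ gives $1/(2p) = 1/2 - 1/(2+\alpha) = \alpha/(2(2+\alpha))$, i.e. $p = (2+\alpha)/\alpha = 1 + 2/\alpha$. A minimal numerical sketch of this crossover (illustrative only; not code from the paper, and `alpha = 1.0` is an arbitrary choice of entropy exponent):

```python
# Compare the two exponents in the rate O_P(n^{-1/(2+alpha)} v n^{-1/2 + 1/(2p)}).
# The slower term is the one with the SMALLER exponent of n^{-1}.

def rate_exponents(p, alpha):
    """Return (entropy-term exponent, heavy-tail-term exponent) of n^{-(.)}."""
    return 1 / (2 + alpha), 0.5 - 1 / (2 * p)

alpha = 1.0                # example entropy exponent in (0, 2)
p_crit = 1 + 2 / alpha     # crossover moment level: here p_crit = 3

# At p = 1 + 2/alpha the two exponents coincide at 1/(2 + alpha):
e_entropy, e_tail = rate_exponents(p_crit, alpha)
print(e_entropy, e_tail)   # both equal 1/3 when alpha = 1

# Below the threshold the heavy-tail term dominates (strictly slower rate);
# at or above it, the Gaussian-like entropy rate n^{-1/(2+alpha)} prevails.
for p in (1.5, p_crit, 6.0):
    e1, e2 = rate_exponents(p, alpha)
    print(f"p = {p}: effective rate exponent = {min(e1, e2):.4f}")
```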

The validity of the above rate relies crucially on the independence of the covariates and the errors. In fact, the $L_{2}$ loss of the LSE can converge arbitrarily slowly when the independence fails.

The key technical ingredient is a new multiplier inequality that gives sharp bounds for the “multiplier empirical process” associated with the LSE. We further give an application to the sparse linear regression model with heavy-tailed covariates and errors to demonstrate the scope of this new inequality.

Citation

Qiyang Han, Jon A. Wellner. "Convergence rates of least squares regression estimators with heavy-tailed errors." Ann. Statist. 47(4): 2286–2319, August 2019. https://doi.org/10.1214/18-AOS1748

Information

Received: 1 February 2018; Revised: 1 May 2018; Published: August 2019
First available in Project Euclid: 21 May 2019

zbMATH: 07082287
MathSciNet: MR3953452
Digital Object Identifier: 10.1214/18-AOS1748

Subjects:
Primary: 60E15
Secondary: 62G05

Keywords: heavy-tailed errors, least squares estimation, multiplier empirical process, multiplier inequality, nonparametric regression, sparse linear regression

Rights: Copyright © 2019 Institute of Mathematical Statistics
