On least squares estimation under heteroscedastic and heavy-tailed errors
Arun K. Kuchibhotla, Rohit K. Patra
Ann. Statist. 50(1): 277-302 (February 2022). DOI: 10.1214/21-AOS2105

Abstract

We consider least squares estimation in a general nonparametric regression model where the error is allowed to depend on the covariates. The rate of convergence of the least squares estimator (LSE) for the unknown regression function is well studied when the errors are sub-Gaussian. We find upper bounds on the rates of convergence of the LSE when the error has a uniformly bounded conditional variance and only finitely many moments. Our upper bound on the rate of convergence of the LSE depends on the moment assumptions on the error, the metric entropy of the class of functions involved, and the “local” structure of the function class around the truth. We find sufficient conditions on the error distribution under which the rate of the LSE matches the rate of the LSE under sub-Gaussian errors. Our results are finite-sample and allow for heteroscedastic and heavy-tailed errors.
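To fix ideas, here is a minimal sketch of the setup the abstract describes, in generic notation of our own (the symbols $f_0$, $\mathcal{F}$, and $p$ are illustrative, not taken from the paper). The observations follow a heteroscedastic regression model with a uniformly bounded conditional variance,
\[
  Y_i = f_0(X_i) + \epsilon_i, \qquad
  \mathbb{E}[\epsilon_i \mid X_i] = 0, \qquad
  \sup_{x}\, \operatorname{Var}(\epsilon_i \mid X_i = x) \le \sigma^2 < \infty,
\]
where the errors are assumed to have only finitely many moments, e.g. $\mathbb{E}\bigl[|\epsilon_i|^p\bigr] < \infty$ for some $p \ge 2$, in place of sub-Gaussian tails. The LSE over a function class $\mathcal{F}$ is
\[
  \hat{f}_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}}\,
  \frac{1}{n} \sum_{i=1}^{n} \bigl(Y_i - f(X_i)\bigr)^2,
\]
and the paper's bounds control how fast $\hat{f}_n$ approaches $f_0$ in terms of the moment exponent, the metric entropy of $\mathcal{F}$, and the local structure of $\mathcal{F}$ around $f_0$.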

Citation


Arun K. Kuchibhotla, Rohit K. Patra. "On least squares estimation under heteroscedastic and heavy-tailed errors." Ann. Statist. 50(1): 277-302, February 2022. https://doi.org/10.1214/21-AOS2105

Information

Received: 1 September 2019; Revised: 1 April 2021; Published: February 2022
First available in Project Euclid: 16 February 2022

MathSciNet: MR4382017
zbMATH: 1486.62113
Digital Object Identifier: 10.1214/21-AOS2105

Subjects:
Primary: 62E17, 62G08

Keywords: Dyadic peeling, finite sample tail probability bounds, heavy tails, interpolation inequality, local envelopes, maximal inequality

Rights: Copyright © 2022 Institute of Mathematical Statistics

JOURNAL ARTICLE
26 PAGES

