February 2024 Non-asymptotic bounds for the $\ell_\infty$ estimator in linear regression with uniform noise
Yufei Yi, Matey Neykov
Bernoulli 30(1): 534-553 (February 2024). DOI: 10.3150/23-BEJ1607

Abstract

The Chebyshev or $\ell_\infty$ estimator is an unconventional alternative to ordinary least squares for solving linear regressions. It is defined as the minimizer of the objective function

$$\hat{\beta} := \operatorname*{arg\,min}_{\beta} \|Y - X\beta\|_{\infty}.$$

The asymptotic distribution of the Chebyshev estimator under a fixed number of covariates was recently studied (Knight (2020)), yet finite-sample guarantees and generalizations to high-dimensional settings remain open. In this paper, we develop non-asymptotic upper bounds on the estimation error $\|\hat{\beta} - \beta^*\|_2$ for a Chebyshev estimator $\hat{\beta}$, in a regression setting with uniformly distributed noise $\varepsilon_i \sim U([-a, a])$, where $a$ is either known or unknown. Under relatively mild assumptions on the (random) design matrix $X$, we can bound the error rate by $C_p/n$ with high probability, for some constant $C_p$ depending on the dimension $p$ and the law of the design. Furthermore, we illustrate that there exist designs for which the Chebyshev estimator is (nearly) minimax optimal. On the other hand, we also argue that there exist designs for which this estimator behaves sub-optimally in terms of the constant $C_p$'s dependence on $p$. Finally, we show that “Chebyshev’s LASSO” has advantages over the regular LASSO in high-dimensional situations, provided that the noise is uniform. Specifically, we argue that it achieves a much faster rate of estimation under certain assumptions on the growth rate of the sparsity level and the ambient dimension with respect to the sample size.
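To make the two objectives above concrete, here is a minimal computational sketch: both the Chebyshev estimator and a penalized “Chebyshev's LASSO”-style variant can be computed exactly by recasting the sup-norm minimization as a linear program. The function names, the SciPy-based solver, and the exact penalized form $\|Y - X\beta\|_\infty + \lambda\|\beta\|_1$ are illustrative assumptions of this sketch, not details taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog


def chebyshev_estimator(X, y):
    """Chebyshev / sup-norm regression: argmin_beta ||y - X beta||_inf.

    Cast as an LP in (beta, t): minimize t subject to |y_i - x_i' beta| <= t.
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), [1.0]])      # cost on t only
    A_ub = np.vstack([
        np.hstack([X, -np.ones((n, 1))]),         #  x_i' beta - t <=  y_i
        np.hstack([-X, -np.ones((n, 1))]),        # -x_i' beta - t <= -y_i
    ])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * p + [(0, None)]     # beta free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]


def chebyshev_lasso(X, y, lam):
    """Penalized variant: argmin_beta ||y - X beta||_inf + lam * ||beta||_1.

    LP in (beta, t, u) with u_j >= |beta_j|; whether this matches the paper's
    exact definition of "Chebyshev's LASSO" is an assumption of this sketch.
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), [1.0], lam * np.ones(p)])
    A_ub = np.vstack([
        np.hstack([X, -np.ones((n, 1)), np.zeros((n, p))]),
        np.hstack([-X, -np.ones((n, 1)), np.zeros((n, p))]),
        np.hstack([np.eye(p), np.zeros((p, 1)), -np.eye(p)]),   #  beta_j - u_j <= 0
        np.hstack([-np.eye(p), np.zeros((p, 1)), -np.eye(p)]),  # -beta_j - u_j <= 0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(2 * p)])
    bounds = [(None, None)] * p + [(0, None)] * (p + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]


if __name__ == "__main__":
    # Toy check with uniform noise, the setting of the paper's upper bounds.
    rng = np.random.default_rng(0)
    n, p, a = 500, 5, 1.0
    X = rng.standard_normal((n, p))
    beta_star = rng.standard_normal(p)
    y = X @ beta_star + rng.uniform(-a, a, size=n)
    print(np.linalg.norm(chebyshev_estimator(X, y) - beta_star))
```

Both problems remain linear programs of modest size, so any off-the-shelf LP solver handles them; the toy check at the bottom simulates uniform $U([-a,a])$ noise, as in the paper's setting, and reports the resulting estimation error.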

Acknowledgements

The authors would like to thank Sivaraman Balakrishnan for inspiring discussions on the topic, and in particular the lower bounds and his advice on the presentation of this work. The second author is also indebted to Alexandre Tsybakov and Tony Cai for communicating to him their belief that the lower bound is tight, and one should try to improve the upper bound in the Gaussian case. Finally the authors would like to express their gratitude to the AE and two anonymous referees for their insightful suggestions which led to substantial improvements of the manuscript.

Citation


Yufei Yi. Matey Neykov. "Non-asymptotic bounds for the $\ell_\infty$ estimator in linear regression with uniform noise." Bernoulli 30 (1) 534 - 553, February 2024. https://doi.org/10.3150/23-BEJ1607

Information

Received: 1 October 2022; Published: February 2024
First available in Project Euclid: 8 November 2023

MathSciNet: MR4665588
zbMATH: 07788894
Digital Object Identifier: 10.3150/23-BEJ1607

Keywords: Chebyshev estimator, Chebyshev’s LASSO, linear model, uniform distribution

Journal article, 20 pages

