On lower bounds for the bias-variance trade-off
Alexis Derumigny, Johannes Schmidt-Hieber
Ann. Statist. 51(4): 1510-1533 (August 2023). DOI: 10.1214/23-AOS2279

Abstract

It is a common phenomenon that for high-dimensional and nonparametric statistical models, rate-optimal estimators balance squared bias and variance. Although this balancing is widely observed, little is known about whether methods exist that can avoid the trade-off between bias and variance. We propose a general strategy to obtain lower bounds on the variance of any estimator with bias smaller than a prespecified bound. This shows to what extent the bias-variance trade-off is unavoidable and allows us to quantify the loss of performance for methods that do not obey it. The approach is based on a number of abstract lower bounds for the variance involving the change of expectation with respect to different probability measures as well as information measures such as the Kullback–Leibler or χ²-divergence. Some of these inequalities rely on a new concept of information matrices. In the second part of the article, the abstract lower bounds are applied to several statistical models including the Gaussian white noise model, a boundary estimation problem, the Gaussian sequence model and the high-dimensional linear regression model. For these specific statistical applications, different types of bias-variance trade-offs occur, varying considerably in their strength. For the trade-off between integrated squared bias and integrated variance in the Gaussian white noise model, we propose to combine the general strategy for lower bounds with a reduction technique. This allows us to reduce the original problem to a lower bound on the bias-variance trade-off for estimators with additional symmetry properties in a simpler statistical model. In the Gaussian sequence model, different phase transitions of the bias-variance trade-off occur. Although there is a nontrivial interplay between bias and variance, the rates of the squared bias and the variance do not have to be balanced in order to achieve the minimax estimation rate.
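To give a concrete sense of the mechanism, the following LaTeX sketch states a Hammersley–Chapman–Robbins-type inequality and the variance lower bound it implies under a uniform bias constraint. This is only an illustrative special case of the change-of-expectation and χ²-divergence bounds described above, not the paper's general result, and the notation (f, B, θ, θ′, P_θ) is chosen here for illustration.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Illustrative only: a Hammersley--Chapman--Robbins-type bound and its
% consequence under a uniform bias constraint (notation chosen here,
% not taken from the paper).
For probability measures $P$ and $Q$ with $\chi^2(Q,P)<\infty$ and any
estimator $\widehat T$ with finite variance,
\begin{equation*}
\operatorname{Var}_P\bigl(\widehat T\bigr)
\;\ge\;
\frac{\bigl(\mathbb{E}_Q[\widehat T]-\mathbb{E}_P[\widehat T]\bigr)^2}{\chi^2(Q,P)}.
\end{equation*}
If $\widehat T$ estimates a functional $\theta\mapsto f(\theta)$ with
$|\mathbb{E}_\theta[\widehat T]-f(\theta)|\le B$ for all $\theta$, then taking
$P=P_\theta$ and $Q=P_{\theta'}$ yields
\begin{equation*}
\operatorname{Var}_\theta\bigl(\widehat T\bigr)
\;\ge\;
\frac{\bigl(|f(\theta')-f(\theta)|-2B\bigr)_+^2}{\chi^2(P_{\theta'},P_\theta)},
\end{equation*}
so a smaller worst-case bias $B$ forces a larger variance whenever the
separation $|f(\theta')-f(\theta)|$ exceeds $2B$.
\end{document}

Choosing θ′ close to θ, so that the χ²-divergence stays bounded while the separation |f(θ′) − f(θ)| remains as large as possible, gives the strongest such bound; this is the usual heuristic behind change-of-measure lower bounds and is in the spirit of, though much simpler than, the bounds developed in the article.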

Funding Statement

The project has received funding from the Dutch Science Foundation (NWO) via the Vidi grant VI.Vidi.192.021.

Acknowledgments

We are grateful to Ming Yuan for helpful discussions during an early stage of the project, to Zijian Guo for pointing us to the article [11], and to Tomohiro Nishiyama for mentioning a typo in an earlier version. We thank two reviewers and an Associate Editor for many helpful comments and suggestions that significantly improved the manuscript.

Citation


Alexis Derumigny, Johannes Schmidt-Hieber. "On lower bounds for the bias-variance trade-off." Ann. Statist. 51(4): 1510-1533, August 2023. https://doi.org/10.1214/23-AOS2279

Information

Received: 1 June 2020; Revised: 1 March 2023; Published: August 2023
First available in Project Euclid: 19 October 2023

Digital Object Identifier: 10.1214/23-AOS2279

Subjects:
Primary: 62C05, 62C20, 62G05

Keywords: Bias-variance decomposition, Cramér–Rao inequality, high-dimensional statistics, minimax estimation, nonparametric estimation

Rights: Copyright © 2023 Institute of Mathematical Statistics


