Open Access
Oracle inequalities for high-dimensional prediction
Johannes Lederer, Lu Yu, Irina Gaynanova
Bernoulli 25(2): 1225-1255 (May 2019). DOI: 10.3150/18-BEJ1019

Abstract

The abundance of high-dimensional data in the modern sciences has generated tremendous interest in penalized estimators such as the lasso, scaled lasso, square-root lasso, elastic net, and many others. In this paper, we establish a general oracle inequality for prediction in high-dimensional linear regression with such methods. Since the proof relies only on convexity and continuity arguments, the result holds irrespective of the design matrix and applies to a wide range of penalized estimators. Overall, the bound demonstrates that generic estimators can provide consistent prediction with any design matrix. From a practical point of view, the bound can help to identify the potential of specific estimators, and it can give a sense of the prediction accuracy in a given application.
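To illustrate the setting the abstract describes, here is a minimal sketch (not taken from the paper) of lasso prediction in a high-dimensional linear model with more predictors than samples. The coordinate-descent solver, the simulated design, and the tuning parameter `lam` proportional to sqrt(2 log(p) / n) are illustrative assumptions; the in-sample prediction error computed at the end is the quantity that oracle inequalities of this kind bound.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso: minimize ||y - X b||^2 / (2n) + lam ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_norms = (X ** 2).sum(axis=0) / n
    r = y - X @ b  # residual
    for _ in range(n_iter):
        for j in range(p):
            if col_norms[j] == 0:
                continue
            r += X[:, j] * b[j]            # remove coordinate j from residual
            rho = X[:, j] @ r / n
            # soft-thresholding update for coordinate j
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_norms[j]
            r -= X[:, j] * b[j]
    return b

# Simulate a sparse high-dimensional linear model with n < p (illustrative values).
rng = np.random.default_rng(0)
n, p, s = 100, 300, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0
y = X @ beta + 0.5 * rng.standard_normal(n)

lam = 0.5 * np.sqrt(2 * np.log(p) / n)   # standard lasso tuning rate (assumed constant)
b_hat = lasso_cd(X, y, lam)

# In-sample prediction error ||X(b_hat - beta)||^2 / n, the object oracle
# inequalities control.
pred_err = np.mean((X @ (b_hat - beta)) ** 2)
```

Despite p = 300 exceeding n = 100, the prediction error stays small here, matching the abstract's point that penalized estimators can predict consistently without conditions on the design matrix.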

Citation


Johannes Lederer. Lu Yu. Irina Gaynanova. "Oracle inequalities for high-dimensional prediction." Bernoulli 25 (2) 1225-1255, May 2019. https://doi.org/10.3150/18-BEJ1019

Information

Received: 1 April 2017; Revised: 1 December 2017; Published: May 2019
First available in Project Euclid: 6 March 2019

zbMATH: 07049405
MathSciNet: MR3920371
Digital Object Identifier: 10.3150/18-BEJ1019

Keywords: high-dimensional regression, oracle inequalities, prediction

Rights: Copyright © 2019 Bernoulli Society for Mathematical Statistics and Probability
