Open Access
August 2020
Model selection for high-dimensional linear regression with dependent observations
Ching-Kang Ing
Ann. Statist. 48(4): 1959-1980 (August 2020). DOI: 10.1214/19-AOS1872

Abstract

We investigate the prediction capability of the orthogonal greedy algorithm (OGA) in high-dimensional regression models with dependent observations. The rates of convergence of the prediction error of OGA are obtained under a variety of sparsity conditions. To prevent OGA from overfitting, we introduce a high-dimensional Akaike’s information criterion (HDAIC) to determine the number of OGA iterations. A key contribution of this work is to show that OGA, used in conjunction with HDAIC, can achieve the optimal convergence rate without knowledge of how sparse the underlying high-dimensional model is.

Citation


Ching-Kang Ing. "Model selection for high-dimensional linear regression with dependent observations." Ann. Statist. 48(4): 1959-1980, August 2020. https://doi.org/10.1214/19-AOS1872

Information

Received: 1 November 2018; Revised: 1 May 2019; Published: August 2020
First available in Project Euclid: 14 August 2020

MathSciNet: MR4134782
Digital Object Identifier: 10.1214/19-AOS1872

Subjects:
Primary: 63M30
Secondary: 62F07 , 62F12

Keywords: Best $m$-term approximations , high-dimensional Akaike’s information criterion , orthogonal greedy algorithm , sparsity conditions , time series

Rights: Copyright © 2020 Institute of Mathematical Statistics
