Open Access
Factor models and variable selection in high-dimensional regression analysis
Alois Kneip, Pascal Sarda
Ann. Statist. 39(5): 2410-2447 (October 2011). DOI: 10.1214/11-AOS905

Abstract

The paper considers linear regression problems where the number of predictor variables is possibly larger than the sample size. The basic motivation of the study is to combine the points of view of model selection and functional regression by using a factor approach: it is assumed that the predictor vector can be decomposed into a sum of two uncorrelated random components reflecting common factors and specific variabilities of the explanatory variables. It is shown that the traditional assumption of a sparse vector of parameters is restrictive in this context. Common factors may exert a significant influence on the response variable that cannot be captured by the specific effects of a small number of individual variables. We therefore propose to include principal components as additional explanatory variables in an augmented regression model. We give finite-sample inequalities for estimates of these components. It is then shown that model selection procedures can be used to estimate the parameters of the augmented model, and we derive theoretical properties of the estimators. Finite-sample performance is illustrated by a simulation study.
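The augmented-model idea described in the abstract can be summarized in two steps: estimate the common components by principal component analysis of the predictors, then apply a sparse model selection procedure to the design matrix augmented with the estimated component scores. The sketch below is a minimal illustration of this idea and not the authors' implementation; the number of factors k, the Lasso penalty, and the simulated data are illustrative assumptions.

# Minimal sketch (not the authors' code) of an augmented regression:
# PCA scores of the predictors are added as extra regressors, and a
# sparse regression (here the Lasso) is fit to the augmented design.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k = 100, 500, 3                        # sample size, predictors, assumed number of factors

# Simulate predictors driven by k common factors plus specific noise
F = rng.standard_normal((n, k))              # common factors
L = rng.standard_normal((k, p))              # factor loadings
X = F @ L + 0.5 * rng.standard_normal((n, p))

# Response depends on the common factors and on a few specific variables
beta = np.zeros(p)
beta[:5] = 1.0
y = F @ np.array([2.0, -1.0, 0.5]) + X @ beta + 0.1 * rng.standard_normal(n)

# Step 1: estimate the common components by principal components of X
scores = PCA(n_components=k).fit_transform(X)    # n x k matrix of estimated scores

# Step 2: augment the design with the estimated scores and run a sparse fit
Z = np.hstack([scores, X])
fit = Lasso(alpha=0.1, max_iter=10000).fit(Z, y)
print("nonzero coefficients in augmented model:", np.sum(fit.coef_ != 0))

The first k columns of Z capture the common-factor effects, so the sparsity assumption on the remaining coefficients is more plausible than for the original predictors alone.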

Citation


Alois Kneip, Pascal Sarda. "Factor models and variable selection in high-dimensional regression analysis." Ann. Statist. 39(5): 2410-2447, October 2011. https://doi.org/10.1214/11-AOS905

Information

Published: October 2011
First available in Project Euclid: 30 November 2011

zbMATH: 1231.62131
MathSciNet: MR2906873
Digital Object Identifier: 10.1214/11-AOS905

Subjects:
Primary: 62J05
Secondary: 62F12, 62H25

Keywords: factor models, functional regression, linear regression, model selection

Rights: Copyright © 2011 Institute of Mathematical Statistics
