The Annals of Statistics
Ann. Statist. 40 (2012), no. 5, 2359-2388.
Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
We propose dimension reduction methods for sparse, high-dimensional multivariate response regression models. Both the number of responses and that of the predictors may exceed the sample size. Sometimes viewed as complementary, predictor selection and rank reduction are the most popular strategies for obtaining lower-dimensional approximations of the parameter matrix in such models. We show in this article that important gains in prediction accuracy can be obtained by considering them jointly. We motivate a new class of sparse multivariate regression models, in which the coefficient matrix has low rank and zero rows or can be well approximated by such a matrix. Next, we introduce estimators that are based on penalized least squares, with novel penalties that impose simultaneous row and rank restrictions on the coefficient matrix. We prove that these estimators indeed adapt to the unknown matrix sparsity and have fast rates of convergence. We support our theoretical results with an extensive simulation study and two data analyses.
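The estimators described in the abstract combine predictor (row) selection with rank reduction of the coefficient matrix. A minimal sketch of that idea, in Python with NumPy, is below. It is an illustration only, not the paper's penalized least-squares estimator: it screens rows of an ordinary least-squares fit by their l2 norms, refits on the selected predictors, and truncates the refit to a target rank via the SVD. The function name `row_and_rank_select` and the screening/refit/truncate recipe are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivariate response regression: Y = X B + E, where the true
# coefficient matrix B is row-sparse (s active rows) and has rank r.
n, p, q, s, r = 100, 30, 10, 5, 2
B = np.zeros((p, q))
B[:s] = rng.normal(size=(s, r)) @ rng.normal(size=(r, q))  # low-rank active block
X = rng.normal(size=(n, p))
Y = X @ B + 0.1 * rng.normal(size=(n, q))

def row_and_rank_select(X, Y, k, r):
    """Hedged sketch of joint row and rank selection:
    (1) screen rows of the OLS estimate by their l2 norms,
    (2) refit least squares on the k selected predictors,
    (3) truncate the refit to rank r via the SVD.
    Illustrates the idea of simultaneous row and rank restrictions,
    not the paper's exact penalized estimator."""
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rows = np.sort(np.argsort(np.linalg.norm(B_ols, axis=1))[-k:])  # k largest rows
    B_sub, *_ = np.linalg.lstsq(X[:, rows], Y, rcond=None)          # restricted refit
    U, d, Vt = np.linalg.svd(B_sub, full_matrices=False)
    B_hat = np.zeros_like(B_ols)
    B_hat[rows] = (U[:, :r] * d[:r]) @ Vt[:r]                       # rank-r truncation
    return B_hat, rows

B_hat, selected = row_and_rank_select(X, Y, k=s, r=r)
print(selected)                         # indices of the retained predictors
print(np.linalg.matrix_rank(B_hat))    # 2: the target rank
```

In this well-separated toy setting the screening step recovers the active rows and the final estimate has exactly the target rank; the paper's estimators achieve this adaptively, via penalties, without knowing `s` or `r` in advance.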
First available in Project Euclid: 4 February 2013
Bunea, Florentina; She, Yiyuan; Wegkamp, Marten H. Joint variable and rank selection for parsimonious estimation of high-dimensional matrices. Ann. Statist. 40 (2012), no. 5, 2359--2388. doi:10.1214/12-AOS1039. https://projecteuclid.org/euclid.aos/1359987524