Open Access
October 2012
Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
Florentina Bunea, Yiyuan She, Marten H. Wegkamp
Ann. Statist. 40(5): 2359-2388 (October 2012). DOI: 10.1214/12-AOS1039


We propose dimension reduction methods for sparse, high-dimensional multivariate response regression models. Both the number of responses and that of the predictors may exceed the sample size. Sometimes viewed as complementary, predictor selection and rank reduction are the most popular strategies for obtaining lower-dimensional approximations of the parameter matrix in such models. We show in this article that important gains in prediction accuracy can be obtained by considering them jointly. We motivate a new class of sparse multivariate regression models, in which the coefficient matrix has low rank and zero rows or can be well approximated by such a matrix. Next, we introduce estimators that are based on penalized least squares, with novel penalties that impose simultaneous row and rank restrictions on the coefficient matrix. We prove that these estimators indeed adapt to the unknown matrix sparsity and have fast rates of convergence. We support our theoretical results with an extensive simulation study and two data analyses.
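To make the idea of combining row selection with rank reduction concrete, here is a minimal numerical sketch. It is not the paper's penalized estimator: it is a hypothetical two-step simplification in which rows of the coefficient matrix are screened by the Euclidean norm of a ridge fit, and a classical reduced-rank projection is then applied to the least-squares fit on the retained predictors. The function name, the ridge regularizer, and the threshold are all illustrative choices, not from the article.

```python
import numpy as np

def row_and_rank_reduce(X, Y, rank, row_thresh):
    """Two-step sketch of joint row and rank reduction (illustrative
    simplification, not the paper's penalized least-squares method).

    Step 1: screen rows of the coefficient matrix by the Euclidean
            norm of a ridge estimate.
    Step 2: refit by least squares on the selected predictors and
            project onto a rank-`rank` approximation (classical
            reduced-rank regression form).
    """
    n, p = X.shape
    m = Y.shape[1]
    # Ridge fit for an initial estimate; the small regularizer keeps the
    # system well-conditioned when p is comparable to n.
    B_init = np.linalg.solve(X.T @ X + 1e-2 * np.eye(p), X.T @ Y)
    # Row selection: keep predictors whose coefficient row is large.
    keep = np.linalg.norm(B_init, axis=1) > row_thresh
    B = np.zeros((p, m))
    if keep.any():
        Xk = X[:, keep]
        # Least-squares refit on the selected predictors only.
        B_ls = np.linalg.lstsq(Xk, Y, rcond=None)[0]
        # Rank reduction: project onto the top right singular vectors
        # of the fitted values, as in classical reduced-rank regression.
        _, _, Vt = np.linalg.svd(Xk @ B_ls, full_matrices=False)
        B[keep] = B_ls @ Vt[:rank].T @ Vt[:rank]
    return B
```

The returned matrix has at most `rank` nonzero singular values and zero rows outside the selected set, mimicking the row-and-rank sparse structure the article studies; the article's estimators achieve this instead through a single penalized criterion with simultaneous row and rank penalties.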




Published: October 2012
First available in Project Euclid: 4 February 2013

zbMATH: 1373.62246
MathSciNet: MR3097606
Digital Object Identifier: 10.1214/12-AOS1039

Primary: 62H15, 62J07

Keywords: adaptive estimation, dimension reduction, group lasso, multivariate response regression, oracle inequalities, rank constrained minimization, reduced rank estimators, row and rank sparse models

Rights: Copyright © 2012 Institute of Mathematical Statistics
