Electronic Journal of Statistics

Rank penalized estimators for high-dimensional matrices

Olga Klopp

Full-text: Open access

Abstract

In this paper we consider the trace regression model. Assume that we observe a small set of entries, or linear combinations of entries, of an unknown matrix A0 corrupted by noise. We propose a new rank penalized estimator of A0. For this estimator we establish a general oracle inequality for the prediction error, both in probability and in expectation. We also prove upper bounds for the rank of our estimator. We then apply our general results to the problems of matrix completion and matrix regression. In these cases our estimator has a particularly simple form: it is obtained by hard thresholding of the singular values of a matrix constructed from the observations.
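The core operation the abstract describes for matrix completion and matrix regression — hard thresholding of singular values — can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the paper's exact estimator: the function name, the input matrix, and the choice of threshold are assumptions for demonstration; in the paper the matrix is constructed from the observations and the threshold is tied to the penalty level.

```python
import numpy as np

def hard_threshold_svd(M, tau):
    """Zero out all singular values of M that do not exceed tau.

    Illustrative sketch of singular-value hard thresholding; `M` stands in
    for a matrix built from the observations, and `tau` for a threshold
    derived from the rank penalty (both hypothetical here).
    """
    # Thin SVD: M = U diag(s) Vt with singular values s in decreasing order.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    # Hard thresholding: keep a singular value only if it exceeds tau.
    s_kept = np.where(s > tau, s, 0.0)
    # Reassemble; the result has rank equal to the number of kept values.
    return U @ np.diag(s_kept) @ Vt
```

Because small singular values are set exactly to zero rather than shrunk, the output is a genuinely low-rank matrix, which is what yields the rank bounds the abstract mentions.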

Article information

Source
Electron. J. Statist., Volume 5 (2011), 1161-1183.

Dates
First available in Project Euclid: 6 October 2011

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1317906992

Digital Object Identifier
doi:10.1214/11-EJS637

Mathematical Reviews number (MathSciNet)
MR2842903

Zentralblatt MATH identifier
1274.62489

Subjects
Primary: 62J99 (None of the above, but in this section)
Secondary: 62H12 (Estimation); 60B20 (Random matrices, probabilistic aspects; for algebraic aspects see 15B52); 60G15 (Gaussian processes)

Keywords
Matrix completion; low rank matrix estimation; recovery of the rank; statistical learning

Citation

Klopp, Olga. Rank penalized estimators for high-dimensional matrices. Electron. J. Statist. 5 (2011), 1161–1183. doi:10.1214/11-EJS637. https://projecteuclid.org/euclid.ejs/1317906992


References

  • [1] Argyriou, A., Evgeniou, T. and Pontil, M. (2008) Convex multi-task feature learning. Mach. Learn., 73, 243–272.
  • [2] Argyriou, A., Micchelli, C.A. and Pontil, M. (2010) On spectral learning. J. Mach. Learn. Res., 11, 935–953.
  • [3] Argyriou, A., Micchelli, C.A., Pontil, M. and Ying, Y. (2007) A spectral regularization framework for multi-task structure learning. In NIPS.
  • [4] Bach, F.R. (2008) Consistency of trace norm minimization. J. Mach. Learn. Res., 9, 1019–1048.
  • [5] Bunea, F., She, Y. and Wegkamp, M. (2011) Optimal selection of reduced rank estimators of high-dimensional matrices. Annals of Statistics, 39, 1282–1309.
  • [6] Candès, E.J. and Plan, Y. (2009) Matrix completion with noise. Proceedings of the IEEE.
  • [7] Candès, E.J. and Plan, Y. (2010) Tight oracle bounds for low-rank matrix recovery from a minimal number of random measurements. arXiv:1001.0339.
  • [8] Candès, E.J. and Recht, B. (2009) Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 9(6), 717–772.
  • [9] Gaïffas, S. and Lecué, G. (2010) Sharp oracle inequalities for the prediction of a high-dimensional matrix. IEEE Transactions on Information Theory, to appear.
  • [10] Giraud, C. (2011) Low rank multivariate regression. Electronic Journal of Statistics, 5, 775–799.
  • [11] Gross, D. (2011) Recovering low-rank matrices from few coefficients in any basis. IEEE Transactions on Information Theory, 57(3), 1548–1566.
  • [12] Keshavan, R.H., Montanari, A. and Oh, S. (2010) Matrix completion from noisy entries. Journal of Machine Learning Research, 11, 2057–2078.
  • [13] Koltchinskii, V., Lounici, K. and Tsybakov, A. Nuclear norm penalization and optimal rates for noisy low rank matrix completion. Annals of Statistics, to appear.
  • [14] Negahban, S. and Wainwright, M.J. (2011) Estimation of (near) low-rank matrices with noise and high-dimensional scaling. Annals of Statistics, 39, 1069–1097.
  • [15] Recht, B. (2009) A simpler approach to matrix completion. Journal of Machine Learning Research, to appear.
  • [16] Reinsel, G.C. and Velu, R.P. (1998) Multivariate Reduced-Rank Regression: Theory and Applications. Springer.
  • [17] Rohde, A. and Tsybakov, A. (2011) Estimation of high-dimensional low-rank matrices. Annals of Statistics, 39, 887–930.