Bernoulli, Volume 20, Number 1 (2014), 282-303.

Noisy low-rank matrix completion with general sampling distribution

Olga Klopp

Full-text: Open access

Abstract

In the present paper, we consider the problem of matrix completion with noise. Unlike previous works, we consider a quite general sampling distribution, and we do not need to know or estimate the variance of the noise. Two new nuclear-norm penalized estimators are proposed, one of them of “square-root” type. We analyse their performance under high-dimensional scaling and provide non-asymptotic bounds on the Frobenius norm error. Up to a logarithmic factor, these performance guarantees are minimax optimal in a number of circumstances.
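The nuclear-norm penalized estimators studied in the paper are, in spirit, least-squares fits over the observed entries with a nuclear-norm penalty promoting low rank. The sketch below is not the paper's exact estimator; it is a generic SoftImpute-style proximal iteration for the penalized objective, with illustrative names (`svd_soft_threshold`, `complete_matrix`) chosen here for clarity:

```python
import numpy as np

def svd_soft_threshold(X, tau):
    """Singular-value soft-thresholding: the prox operator of tau * ||.||_*."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt  # scale columns of U by shrunk singular values

def complete_matrix(Y, mask, lam, n_iters=200):
    """SoftImpute-style iterations for
    minimize_A  0.5 * ||mask * (A - Y)||_F^2 + lam * ||A||_*."""
    A = np.zeros_like(Y)
    for _ in range(n_iters):
        # Keep observed entries of Y, fill the rest with the current
        # estimate, then shrink singular values.
        A = svd_soft_threshold(mask * Y + (1 - mask) * A, lam)
    return A

# Tiny demo: a noisy rank-1 matrix observed on roughly half its entries.
rng = np.random.default_rng(0)
u, v = rng.standard_normal(20), rng.standard_normal(15)
M = np.outer(u, v)                              # true low-rank matrix
mask = (rng.random(M.shape) < 0.5).astype(float)
Y = mask * (M + 0.1 * rng.standard_normal(M.shape))
A_hat = complete_matrix(Y, mask, lam=0.5)
err = np.linalg.norm(A_hat - M) / np.linalg.norm(M)
```

Note that, unlike the paper's square-root estimator, this sketch still requires tuning `lam` by hand; removing the dependence of that tuning on the (unknown) noise variance is precisely one contribution of the paper.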

Article information

Source
Bernoulli, Volume 20, Number 1 (2014), 282-303.

Dates
First available in Project Euclid: 22 January 2014

Permanent link to this document
https://projecteuclid.org/euclid.bj/1390407290

Digital Object Identifier
doi:10.3150/12-BEJ486

Mathematical Reviews number (MathSciNet)
MR3160583

Zentralblatt MATH identifier
06282552

Keywords
high-dimensional sparse model; low-rank matrix estimation; matrix completion; unknown variance

Citation

Klopp, Olga. Noisy low-rank matrix completion with general sampling distribution. Bernoulli 20 (2014), no. 1, 282--303. doi:10.3150/12-BEJ486. https://projecteuclid.org/euclid.bj/1390407290


