Electronic Journal of Statistics

Adaptive multinomial matrix completion

Olga Klopp, Jean Lafond, Éric Moulines, and Joseph Salmon


Abstract

The task of estimating a matrix given a sample of observed entries is known as the matrix completion problem. Most works on matrix completion have focused on recovering an unknown real-valued low-rank matrix from a random sample of its entries. Here, we investigate the case of highly quantized observations, in which the measurements can take only a small number of values. These quantized outputs are generated according to a probability distribution parametrized by the unknown matrix of interest. This model corresponds, for example, to ratings in recommender systems or labels in multi-class classification. We consider a general, non-uniform sampling scheme and give theoretical guarantees on the performance of a constrained, nuclear norm penalized maximum likelihood estimator. One important advantage of this estimator is that it does not require knowledge of the rank or an upper bound on the nuclear norm of the unknown matrix, and it is therefore adaptive. We provide lower bounds showing that our estimator is minimax optimal. An efficient algorithm based on lifted coordinate gradient descent is proposed to compute the estimator. A limited Monte Carlo experiment, using both simulated and real data, is provided to support our claims.
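
As a rough illustration of the object studied (the notation below is a sketch of ours, not quoted from the paper): write Y for the sample of observed quantized entries, \Phi_Y(X) for the associated negative log-likelihood of a candidate matrix X, \|X\|_* for its nuclear norm, and \gamma for a bound on the magnitude of its entries. A constrained, nuclear norm penalized maximum likelihood estimator of the kind described above then takes the form

    \hat{X} \in \operatorname{argmin}_{\|X\|_\infty \le \gamma} \; \Phi_Y(X) + \lambda \|X\|_* ,

where \lambda > 0 is a regularization parameter. Adaptivity refers to the fact that \lambda can be chosen without knowledge of the rank or of the nuclear norm of the unknown matrix.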

Article information

Source
Electron. J. Statist. Volume 9, Number 2 (2015), 2950-2975.

Dates
Received: July 2014
First available in Project Euclid: 5 January 2016

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1452004956

Digital Object Identifier
doi:10.1214/15-EJS1093

Mathematical Reviews number (MathSciNet)
MR3439190

Zentralblatt MATH identifier
1329.62304

Subjects
Primary: 62J02 (General nonlinear regression); 62J99 (None of the above, but in this section)
Secondary: 62H12; 60B20

Keywords
Low rank matrix estimation; matrix completion; multinomial model

Citation

Klopp, Olga; Lafond, Jean; Moulines, Éric; Salmon, Joseph. Adaptive multinomial matrix completion. Electron. J. Statist. 9 (2015), no. 2, 2950–2975. doi:10.1214/15-EJS1093. https://projecteuclid.org/euclid.ejs/1452004956

