Open Access
Adaptive multinomial matrix completion
Olga Klopp, Jean Lafond, Éric Moulines, Joseph Salmon
Electron. J. Statist. 9(2): 2950-2975 (2015). DOI: 10.1214/15-EJS1093
Abstract

The task of estimating a matrix given a sample of observed entries is known as the matrix completion problem. Most works on matrix completion have focused on recovering an unknown real-valued low-rank matrix from a random sample of its entries. Here, we investigate the case of highly quantized observations, where the measurements can take only a small number of values. These quantized outputs are generated according to a probability distribution parametrized by the unknown matrix of interest. This model corresponds, for example, to ratings in recommender systems or labels in multi-class classification. We consider a general, non-uniform sampling scheme and give theoretical guarantees on the performance of a constrained, nuclear norm penalized maximum likelihood estimator. One important advantage of this estimator is that it does not require knowledge of the rank or an upper bound on the nuclear norm of the unknown matrix and, thus, it is adaptive. We provide lower bounds showing that our estimator is minimax optimal. An efficient algorithm based on lifted coordinate gradient descent is proposed to compute the estimator. Limited Monte Carlo experiments, using both simulated and real data, are provided to support our claims.
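
The estimator described in the abstract is a constrained, nuclear norm penalized maximum likelihood estimator, which the paper computes with a lifted coordinate gradient descent algorithm. The sketch below is not that algorithm; it is a minimal proximal-gradient illustration of the same kind of penalized-likelihood objective, specialized to the binary (1-bit) case with a logistic link. The function names, step size, penalty level, and constants are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nuclear_prox(X, tau):
    """Singular value soft-thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def penalized_mle(Y, mask, lam, gamma, step=0.1, n_iter=500):
    """Proximal-gradient sketch of a nuclear-norm penalized maximum likelihood
    estimator for the binary (1-bit) special case: P(Y_ij = 1) = sigmoid(X_ij),
    with observed entries flagged by `mask` and |X_ij| constrained to be <= gamma."""
    X = np.zeros_like(Y, dtype=float)
    for _ in range(n_iter):
        grad = mask * (sigmoid(X) - Y)                 # gradient of the negative log-likelihood
        X = nuclear_prox(X - step * grad, step * lam)  # soft-threshold the singular values
        X = np.clip(X, -gamma, gamma)                  # enforce the sup-norm constraint
    return X

# Toy usage: rank-1 ground truth, roughly half the entries observed.
rng = np.random.default_rng(0)
M = np.outer(rng.normal(size=30), rng.normal(size=20))
mask = (rng.random(M.shape) < 0.5).astype(float)
Y = (rng.random(M.shape) < sigmoid(M)).astype(float) * mask
X_hat = penalized_mle(Y, mask, lam=1.0, gamma=5.0)
```

The singular value soft-thresholding step is the proximal operator of the nuclear norm penalty, and the clipping step enforces an entrywise bound analogous to the constraint in the paper; multi-class observations would replace the logistic likelihood with a multinomial one.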

References

[1] R. Bhatia. Matrix analysis, volume 169 of Graduate Texts in Mathematics. Springer-Verlag, New York, 1997.

[2] J. Bobadilla, F. Ortega, A. Hernando, and A. Gutiérrez. Recommender systems survey. Knowledge-Based Systems, 46:109–132, 2013.

[3] J.-F. Cai, E. J. Candès, and Z. Shen. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization, 20(4):1956–1982, 2010. DOI: 10.1137/080738970

[4] T. T. Cai and W.-X. Zhou. Matrix completion via max-norm constrained optimization. CoRR, abs/1303.0341, 2013.

[5] T. T. Cai and W.-X. Zhou. A max-norm constrained minimization approach to 1-bit matrix completion. J. Mach. Learn. Res., 14:3619–3647, 2013.

[6] E. J. Candès and Y. Plan. Matrix completion with noise. Proceedings of the IEEE, 98(6):925–936, 2010.

[7] M. A. Davenport, Y. Plan, E. van den Berg, and M. Wootters. 1-bit matrix completion. Information and Inference, 3(3):189–223, 2014. DOI: 10.1093/imaiai/iau006

[8] M. Dudík, Z. Harchaoui, and J. Malick. Lifted coordinate descent for learning with trace-norm regularization. In AISTATS, 2012.

[9] M. Fazel. Matrix rank minimization with applications. PhD thesis, Stanford University, 2002.

[10] R. Foygel, R. Salakhutdinov, O. Shamir, and N. Srebro. Learning with the weighted trace-norm under arbitrary sampling distributions. In NIPS, pages 2133–2141, 2011.

[11] G. H. Golub and C. F. van Loan. Matrix computations. Johns Hopkins University Press, Baltimore, MD, fourth edition, 2013.

[12] D. Gross. Recovering low-rank matrices from few coefficients in any basis. IEEE Transactions on Information Theory, 57(3):1548–1566, 2011. DOI: 10.1109/TIT.2011.2104999

[13] S. Gunasekar, P. Ravikumar, and J. Ghosh. Exponential family matrix completion under structural constraints. In ICML, 2014.

[14] J. Hui, L. Chaoqiang, S. Zuowei, and X. Yuhong. Robust video denoising using low rank matrix completion. In CVPR, pages 1791–1798, 2010.

[15] L. Ji, P. Musialski, P. Wonka, and Y. Jieping. Tensor completion for estimating missing values in visual data. IEEE Trans. Pattern Anal. Mach. Intell., 35(1):208–220, 2013.

[16] R. H. Keshavan, A. Montanari, and S. Oh. Matrix completion from noisy entries. J. Mach. Learn. Res., 11:2057–2078, 2010.

[17] O. Klopp. Rank penalized estimators for high-dimensional matrices. Electron. J. Stat., 5:1161–1183, 2011. DOI: 10.1214/11-EJS637

[18] O. Klopp. Noisy low-rank matrix completion with general sampling distribution. Bernoulli, 20(1):282–303, 2014. DOI: 10.3150/12-BEJ486

[19] V. Koltchinskii, A. B. Tsybakov, and K. Lounici. Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion. Ann. Statist., 39(5):2302–2329, 2011. DOI: 10.1214/11-AOS894

[20] Y. Koren, R. Bell, and C. Volinsky. Matrix factorization techniques for recommender systems. Computer, 42(8):30–37, 2009.

[21] M. Ledoux and M. Talagrand. Probability in Banach spaces, volume 23 of Ergebnisse der Mathematik und ihrer Grenzgebiete (3) [Results in Mathematics and Related Areas (3)]. Springer-Verlag, Berlin, 1991. Isoperimetry and processes.

[22] P. Massart. About the constants in Talagrand's concentration inequalities for empirical processes. Ann. Probab., 28(2):863–884, 2000. DOI: 10.1214/aop/1019160263

[23] R. Mazumder, T. Hastie, and R. Tibshirani. Spectral regularization algorithms for learning large incomplete matrices. J. Mach. Learn. Res., 11:2287–2322, 2010.

[24] S. Negahban and M. J. Wainwright. Restricted strong convexity and weighted matrix completion: optimal bounds with noise. J. Mach. Learn. Res., 13:1665–1697, 2012.

[25] J. A. Tropp. User-friendly tail bounds for sums of random matrices. Found. Comput. Math., 12(4):389–434, 2012. DOI: 10.1007/s10208-011-9099-z

[26] A. B. Tsybakov. Introduction to nonparametric estimation. Springer Series in Statistics. Springer, New York, 2009.

[27] H. Xu, W. Jiasong, W. Lu, C. Yang, L. Senhadji, and H. Shu. Linear total variation approximate regularized nuclear norm optimization for matrix completion. Abstr. Appl. Anal., Art. ID 765782, 8 pages, 2014.

[28] Y. Yang, J. Ma, and S. Osher. Seismic data reconstruction via matrix completion. Inverse Probl. Imaging, 7(4):1379–1392, 2013. DOI: 10.3934/ipi.2013.7.1379

[29] Y. Koren and J. Sill. OrdRec: an ordinal model for predicting personalized item rating distributions. In Proceedings of the Fifth ACM Conference on Recommender Systems, RecSys '11, pages 117–124, New York, NY, USA, 2011. ACM.
Copyright © 2015 The Institute of Mathematical Statistics and the Bernoulli Society
Olga Klopp, Jean Lafond, Éric Moulines, and Joseph Salmon "Adaptive multinomial matrix completion," Electronic Journal of Statistics 9(2), 2950-2975, (2015). https://doi.org/10.1214/15-EJS1093
Received: 1 July 2014; Published: 2015