Open Access
Asymmetry helps: Eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
Yuxin Chen, Chen Cheng, Jianqing Fan
Ann. Statist. 49(1): 435-458 (February 2021). DOI: 10.1214/20-AOS1963
Abstract

This paper is concerned with the interplay between statistical asymmetry and spectral methods. Suppose we are interested in estimating a rank-1 symmetric matrix $\boldsymbol{M}^{\star }\in \mathbb{R}^{n\times n}$, yet only a randomly perturbed version $\boldsymbol{M}$ is observed. The noise matrix $\boldsymbol{M}-\boldsymbol{M}^{\star }$ is composed of independent (but not necessarily homoscedastic) entries and is, therefore, not symmetric in general. This might arise, for example, when we have two independent samples for each entry of $\boldsymbol{M}^{\star }$ and arrange them in an asymmetric fashion. The aim is to estimate the leading eigenvalue and the leading eigenvector of $\boldsymbol{M}^{\star }$.

We demonstrate that, for estimating the leading eigenvalue, the leading eigenvalue of the data matrix $\boldsymbol{M}$ can be $O(\sqrt{n})$ times more accurate (up to some log factor) than the (unadjusted) leading singular value of $\boldsymbol{M}$. Moreover, the eigen-decomposition approach is fully adaptive to heteroscedasticity of the noise, without requiring any prior knowledge of the noise distributions. In a nutshell, this curious phenomenon arises because the statistical asymmetry automatically mitigates the bias of the eigenvalue approach, thus eliminating the need for careful bias correction. Additionally, we develop appealing nonasymptotic eigenvector perturbation bounds; in particular, we are able to bound the perturbation of any linear function of the leading eigenvector of $\boldsymbol{M}$ (e.g., entrywise eigenvector perturbation). We also provide partial theory for the more general rank-$r$ case. The takeaway message is this: arranging the data samples in an asymmetric manner and performing eigendecomposition could sometimes be quite beneficial.

Copyright © 2021 Institute of Mathematical Statistics
Received: 1 July 2019; Published: February 2021