Abstract
This paper is concerned with the interplay between statistical asymmetry and spectral methods. Suppose we are interested in estimating a rank-1 and symmetric matrix $\boldsymbol{M}^{\star }\in \mathbb{R}^{n\times n}$, yet only a randomly perturbed version $\boldsymbol{M}$ is observed. The noise matrix $\boldsymbol{M}-\boldsymbol{M}^{\star }$ is composed of independent (but not necessarily homoscedastic) entries and is, therefore, not symmetric in general. This might arise, for example, when we have two independent samples for each entry of $\boldsymbol{M}^{\star }$ and arrange them in an asymmetric fashion. The aim is to estimate the leading eigenvalue and the leading eigenvector of $\boldsymbol{M}^{\star }$.
We demonstrate that the leading eigenvalue of the data matrix $\boldsymbol{M}$ can be $O(\sqrt{n})$ times more accurate (up to some log factor) than the (unadjusted) leading singular value of $\boldsymbol{M}$ in eigenvalue estimation. Moreover, the eigendecomposition approach is fully adaptive to heteroscedasticity of noise, without the need for any prior knowledge about the noise distributions. In a nutshell, this curious phenomenon arises since the statistical asymmetry automatically mitigates the bias of the eigenvalue approach, thus eliminating the need for careful bias correction. Additionally, we develop appealing nonasymptotic eigenvector perturbation bounds; in particular, we are able to bound the perturbation of any linear function of the leading eigenvector of $\boldsymbol{M}$ (e.g., entrywise eigenvector perturbation). We also provide partial theory for the more general rank-$r$ case. The takeaway message is this: arranging the data samples in an asymmetric manner and performing eigendecomposition could sometimes be quite beneficial.
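To make the phenomenon concrete, the following minimal sketch (Python/NumPy) simulates the setting above; the dimension n, noise level sigma and signal strength lam are illustrative assumptions, not values taken from the paper. It arranges two independent noisy samples of $\boldsymbol{M}^{\star }$ asymmetrically, then compares the leading eigenvalue of the resulting matrix $\boldsymbol{M}$ with its (unadjusted) leading singular value as estimators of the true leading eigenvalue.

import numpy as np

rng = np.random.default_rng(0)
n = 1000
sigma = 1.0                         # noise level (illustrative)

# Ground truth: rank-1 symmetric M* = lam * u u^T.
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
lam = 10.0 * np.sqrt(n)             # true leading eigenvalue (illustrative SNR)
M_star = lam * np.outer(u, u)

# Two independent noisy samples of every entry of M*.
A = M_star + sigma * rng.standard_normal((n, n))
B = M_star + sigma * rng.standard_normal((n, n))

# Asymmetric arrangement: upper triangle (incl. diagonal) from sample A,
# strict lower triangle from sample B, so the noise M - M* has
# independent entries and is not symmetric.
M = np.triu(A) + np.tril(B, k=-1)

# Estimator 1: leading eigenvalue of the asymmetric matrix M.
w, V = np.linalg.eig(M)
k = np.argmax(np.abs(w))
eig_est = w[k].real

# Estimator 2: (unadjusted) leading singular value of M.
sv_est = np.linalg.svd(M, compute_uv=False)[0]

print(f"eigenvalue error:     {abs(eig_est - lam):.4f}")
print(f"singular-value error: {abs(sv_est - lam):.4f}")

# Leading eigenvector of M, compared with u up to a global sign flip.
v = V[:, k].real
v /= np.linalg.norm(v)
if v @ u < 0:
    v = -v
print(f"eigenvector l2 error: {np.linalg.norm(v - u):.4f}")

In line with the claim above, the eigenvalue estimate typically incurs a far smaller error than the singular-value estimate, whose upward bias is roughly of order $\sigma ^{2}n/\lambda$; note that no bias correction or knowledge of the noise variances is needed in the eigendecomposition approach.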
Citation
Yuxin Chen, Chen Cheng, Jianqing Fan. "Asymmetry helps: Eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices." Ann. Statist. 49 (1), 435–458, February 2021. https://doi.org/10.1214/20-AOS1963