Abstract
Principal Component Analysis (PCA) is a powerful tool in statistics and machine learning. While existing studies of PCA focus on the recovery of principal components and their associated eigenvalues, there are few precise characterizations of individual principal component scores that yield low-dimensional embeddings of samples. This hinders the analysis of various spectral methods. In this paper, we first develop an ℓp perturbation theory for a hollowed version of PCA in Hilbert spaces which provably improves upon vanilla PCA in the presence of heteroscedastic noise. Through a novel analysis of eigenvectors, we investigate entrywise behaviors of principal component score vectors and show that they can be approximated by linear functionals of the Gram matrix in ℓp norm, which includes ℓ2 and ℓ∞ as special cases. For sub-Gaussian mixture models, the choice of p giving optimal bounds depends on the signal-to-noise ratio, which further yields optimality guarantees for spectral clustering. For contextual community detection, the theory leads to simple spectral algorithms that achieve the information threshold for exact recovery and the optimal misclassification rate.
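The hollowed PCA procedure described above admits a minimal sketch: form the Gram matrix of the samples, delete its diagonal (the "hollowing" step, which mitigates heteroscedastic noise on the diagonal), and embed samples via the leading eigenvectors. The sketch below is illustrative, not the paper's exact estimator; the function name and the rank parameter `r` are ours, and k-means on the embedding stands in for the spectral clustering step.

```python
import numpy as np

def hollowed_pca_embedding(X, r):
    """Illustrative sketch of hollowed PCA.

    X : (n, d) array of samples (rows).
    r : number of leading eigenvectors to keep.
    Returns an (n, r) matrix of principal component score vectors.
    """
    G = X @ X.T                       # Gram matrix of the samples
    H = G - np.diag(np.diag(G))       # hollowing: zero out the diagonal
    vals, vecs = np.linalg.eigh(H)    # eigendecomposition of symmetric H
    top = np.argsort(np.abs(vals))[::-1][:r]  # top-r eigenvalues by magnitude
    return vecs[:, top]
```

For spectral clustering one would then run k-means on the rows of the returned embedding; the diagonal deletion is what allows the entrywise (ℓ∞-type) control of the score vectors discussed in the abstract.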
Acknowledgments
E. Abbe was supported by the NSF CAREER Award CCF-1552131.
J. Fan was supported by ONR Grant N00014-19-1-2120 and NSF Grants DMS-2052926, DMS-1712591, and DMS-2053832.
K. Wang was supported by a startup fund from Columbia University and NIH Grant 2R01-GM072611-15 when he was a student at Princeton University.
Citation
Emmanuel Abbe, Jianqing Fan, Kaizheng Wang. "An ℓp theory of PCA and spectral clustering." Ann. Statist. 50 (4) 2359–2385, August 2022. https://doi.org/10.1214/22-AOS2196