This paper is on the normal approximation of singular subspaces when the noise matrix has i.i.d. entries. Our contributions are three-fold. First, we derive an explicit representation formula for the empirical spectral projectors. The formula is neat and holds for deterministic matrix perturbations. Second, we calculate the expected projection distance between the empirical singular subspaces and the true singular subspaces. Our method allows obtaining an arbitrary k-th order approximation of the expected projection distance. Third, we prove the non-asymptotic normal approximation of the projection distance with different levels of bias correction. With bias corrections of sufficiently high order, the asymptotic normality holds under the optimal signal-to-noise ratio (SNR) condition, where d_1 and d_2 denote the matrix sizes. In addition, we show that higher-order approximations are unnecessary when the matrix sizes d_1 and d_2 are sufficiently close. Finally, we provide comprehensive simulation results to support our theoretical discoveries.
Unlike the existing results, our approach is non-asymptotic and comes with explicit convergence rates. Our method allows the rank r to diverge with the matrix sizes. Moreover, our method requires no eigen-gap condition (beyond the SNR condition) and no constraints between d_1 and d_2.
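To make the central quantity concrete, the following sketch runs a small Monte Carlo simulation of the projection distance between the empirical and true left singular subspaces under an i.i.d. Gaussian noise model. This is an illustrative setup, not the paper's construction: the sizes d1, d2, the rank r, the signal strength, and the number of replications are placeholder choices, and the Frobenius norm of the difference of projectors is used as the projection distance.

```python
# Illustrative sketch (placeholder parameters, not the paper's setup):
# Monte Carlo estimate of the projection distance between the empirical
# and true left singular subspaces under i.i.d. Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r = 60, 50, 3  # matrix sizes and rank (hypothetical demo values)

# Low-rank signal M = U diag(s) V^T with singular values well above
# the noise level sqrt(d1 + d2), so the SNR condition plausibly holds.
U, _ = np.linalg.qr(rng.standard_normal((d1, r)))
V, _ = np.linalg.qr(rng.standard_normal((d2, r)))
s = 5.0 * np.sqrt(d1 + d2) * np.ones(r)
M = U @ np.diag(s) @ V.T

def projection_distance(A, B):
    """Frobenius distance between the orthogonal projectors onto the
    column spans of A and B (both assumed to have orthonormal columns)."""
    return np.linalg.norm(A @ A.T - B @ B.T, ord="fro")

dists = []
for _ in range(200):
    Z = rng.standard_normal((d1, d2))          # i.i.d. noise matrix
    Uh, _, _ = np.linalg.svd(M + Z)            # empirical SVD
    dists.append(projection_distance(Uh[:, :r], U))

print(f"mean projection distance: {np.mean(dists):.4f}")
print(f"std  projection distance: {np.std(dists):.4f}")
```

The empirical histogram of such distances (after centering by the mean and scaling by the standard deviation) is the kind of object whose closeness to a Gaussian the paper quantifies non-asymptotically.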
This research is partially supported by Hong Kong RGC Grant ECS 26302019 and GRF 16303320.
The author would like to thank Yik-Man Chiang for insightful recommendations on applying the residue theorem, Jeff Yao for encouragement on improving earlier results, and an anonymous referee for pointing out the reference Kato (2013).
"Normal approximation and confidence region of singular subspaces." Electron. J. Statist. 15 (2) 3798 - 3851, 2021. https://doi.org/10.1214/21-EJS1876