Annals of Statistics
Distributed estimation of principal eigenspaces
Jianqing Fan, Dong Wang, Kaizheng Wang, and Ziwei Zhu
Abstract
Principal component analysis (PCA) is fundamental to statistical machine learning. It extracts latent principal factors that contribute the most to the variation of the data. When data are stored across multiple machines, however, communication cost can prohibit the computation of PCA in a central location, and distributed algorithms for PCA are thus needed. This paper proposes and studies a distributed PCA algorithm: each node machine computes the top $K$ eigenvectors and transmits them to the central server; the central server then aggregates the information from all the node machines and conducts a PCA based on the aggregated information. We investigate the bias and variance of the resulting distributed estimator of the top $K$ eigenvectors. In particular, we show that for distributions with symmetric innovation, the empirical top eigenspaces are unbiased, and hence the distributed PCA is “unbiased.” We derive the rate of convergence for distributed PCA estimators, which depends explicitly on the effective rank of the covariance matrix, the eigengap, and the number of machines. We show that when the number of machines is not unreasonably large, the distributed PCA performs as well as the whole-sample PCA, even without full access to the whole data. The theoretical results are verified by an extensive simulation study. We also extend our analysis to the heterogeneous case, where the population covariance matrices differ across local machines but share similar top eigenstructures.
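The one-shot scheme described in the abstract can be sketched as follows: each machine computes the top $K$ eigenvectors of its local sample covariance, the server averages the corresponding projection matrices $V_\ell V_\ell^\top$, and the final estimate is the top $K$ eigenvectors of that average. This is a minimal NumPy sketch, not the authors' code; the function names and the synthetic-data setup in the usage example are illustrative assumptions.

```python
import numpy as np

def local_top_k(X, K):
    """Top-K eigenvectors of the local sample covariance (columns = eigenvectors)."""
    n = X.shape[0]
    S = X.T @ X / n                 # local sample covariance (mean assumed zero)
    # np.linalg.eigh returns eigenvalues in ascending order; take the last K columns
    _, V = np.linalg.eigh(S)
    return V[:, -K:]

def distributed_pca(blocks, K):
    """One-shot aggregation: average the local projection matrices V_l @ V_l.T,
    then return the top-K eigenvectors of the average."""
    d = blocks[0].shape[1]
    P = np.zeros((d, d))
    for X in blocks:                # in practice each V_l is computed on its own machine
        V = local_top_k(X, K)
        P += V @ V.T                # only d*K numbers per machine need to be communicated
    P /= len(blocks)
    _, U = np.linalg.eigh(P)
    return U[:, -K:]                # estimated top-K eigenspace
```

A usage sketch with a spiked covariance, where the true top eigenspace is spanned by the first two coordinate axes:

```python
rng = np.random.default_rng(0)
d, K, m, n = 8, 2, 4, 2000
evals = np.array([20.0, 15.0] + [1.0] * (d - 2))
X_full = rng.standard_normal((m * n, d)) * np.sqrt(evals)
blocks = [X_full[i * n:(i + 1) * n] for i in range(m)]
U = distributed_pca(blocks, K)      # columns span an estimate of span(e1, e2)
```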
Article information
Source
Ann. Statist., Volume 47, Number 6 (2019), 3009-3031.
Dates
Received: February 2017
Revised: January 2018
First available in Project Euclid: 31 October 2019
Permanent link to this document
https://projecteuclid.org/euclid.aos/1572487381
Digital Object Identifier
doi:10.1214/18-AOS1713
Mathematical Reviews number (MathSciNet)
MR4025733
Zentralblatt MATH identifier
07151052
Subjects
Primary: 62H25: Factor analysis and principal components; correspondence analysis
Secondary: 62E10: Characterization and structure theory
Keywords
Distributed learning; PCA; one-shot approach; communication efficiency; unbiasedness of empirical eigenspaces; heterogeneity
Citation
Fan, Jianqing; Wang, Dong; Wang, Kaizheng; Zhu, Ziwei. Distributed estimation of principal eigenspaces. Ann. Statist. 47 (2019), no. 6, 3009--3031. doi:10.1214/18-AOS1713. https://projecteuclid.org/euclid.aos/1572487381
Supplemental materials
- Supplement to “Distributed estimation of principal eigenspaces”. Proofs of the results in the paper can be found in the Supplementary Material. Digital Object Identifier: doi:10.1214/18-AOS1713SUPP