Approximate kernel PCA: Computational versus statistical trade-off
Bharath K. Sriperumbudur, Nicholas Sterge
Ann. Statist. 50(5): 2713-2736 (October 2022). DOI: 10.1214/22-AOS2204


Kernel methods are powerful learning methodologies that allow one to perform nonlinear data analysis. Despite their popularity, they suffer from poor scalability in big data scenarios. Various approximation methods, including random feature approximation, have been proposed to alleviate the problem. However, the statistical consistency of most of these approximate kernel methods is not well understood, except for kernel ridge regression, wherein the random feature approximation has been shown to be not only computationally efficient but also statistically consistent, with a minimax optimal rate of convergence. In this paper, we investigate the efficacy of random feature approximation in the context of kernel principal component analysis (KPCA) by studying the trade-off between the computational and statistical behaviors of approximate KPCA. We show that approximate KPCA is both computationally and statistically efficient compared to KPCA in terms of the error associated with reconstructing a kernel function based on its projection onto the corresponding eigenspaces. The analysis hinges on Bernstein-type inequalities for the operator and Hilbert–Schmidt norms of self-adjoint Hilbert–Schmidt operator-valued U-statistics, which are of independent interest.
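The idea behind random feature approximation, as used in approximate KPCA, can be illustrated with a minimal sketch: data are mapped into a finite-dimensional random feature space whose inner product approximates the kernel, and PCA is then performed there, so only an m × m eigenproblem is solved rather than the n × n one of exact KPCA. The sketch below uses random Fourier features for a Gaussian kernel; the function names, the choice of kernel, and the centering details are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def random_fourier_features(X, num_features, gamma, rng):
    """Map X to random Fourier features whose inner product approximates
    the Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Frequencies sampled from the kernel's spectral measure N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

def approximate_kpca(X, num_features=100, n_components=2, gamma=1.0, seed=0):
    """PCA in the random feature space: an m x m eigenproblem
    instead of the n x n eigenproblem of exact KPCA."""
    rng = np.random.default_rng(seed)
    Z = random_fourier_features(X, num_features, gamma, rng)
    Zc = Z - Z.mean(axis=0)            # center in the feature space
    C = Zc.T @ Zc / Z.shape[0]         # m x m empirical covariance
    eigvals, eigvecs = np.linalg.eigh(C)
    idx = np.argsort(eigvals)[::-1][:n_components]
    return Zc @ eigvecs[:, idx]        # projections onto top eigenvectors

X = np.random.default_rng(1).normal(size=(500, 3))
scores = approximate_kpca(X, num_features=200, n_components=2)
print(scores.shape)  # (500, 2)
```

With m random features and n samples, the eigendecomposition costs O(m^3) instead of the O(n^3) required by exact KPCA on the kernel Gram matrix, which is the computational side of the trade-off the paper analyzes against statistical accuracy.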

Funding Statement

BKS is supported by National Science Foundation (NSF) award DMS-1713011 and CAREER award DMS-1945396.


The authors thank the Editor, Associate Editor and reviewers for their valuable comments. The authors especially thank the Associate Editor who handled an earlier version of this paper for some key suggestions, which significantly improved the paper.




Received: 1 November 2020; Revised: 1 December 2021; Published: October 2022
First available in Project Euclid: 27 October 2022

MathSciNet: MR4500622
zbMATH: 07628838
Digital Object Identifier: 10.1214/22-AOS2204

Mathematics Subject Classification: Primary 62H25; secondary 62G05

Keywords: Bernstein’s inequality, covariance operator, kernel PCA, principal component analysis, random feature approximation, reproducing kernel Hilbert space

Rights: Copyright © 2022 Institute of Mathematical Statistics


