Principal component analysis for multivariate extremes
Holger Drees, Anne Sabourin
Electron. J. Statist. 15(1): 908-943 (2021). DOI: 10.1214/21-EJS1803


In the probabilistic framework of multivariate regular variation, the first-order behavior of heavy-tailed random vectors above large radial thresholds is governed by a homogeneous limit measure. For a high-dimensional vector, a reasonable assumption is that the support of this measure is concentrated on a lower-dimensional subspace, meaning that certain linear combinations of the components are far more likely to be large than others. Identifying this subspace, and thus reducing the dimension, facilitates a refined statistical analysis. In this work we apply Principal Component Analysis (PCA) to a re-scaled version of radially thresholded observations.
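The procedure described above — keep the observations whose radius exceeds a high threshold, re-scale them, and run PCA — can be sketched in a few lines. The sketch below is illustrative only, not the authors' code: the simulated data, the dimensions, the 95% radial threshold, and the uncentered (second-moment) variant of PCA are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical heavy-tailed sample in R^5 whose extremes concentrate
# near a 2-dimensional subspace (illustrative simulation, not the
# paper's experimental setup).
n, d = 5000, 5
basis = rng.standard_normal((d, 2))              # spans the low-dim subspace
heavy = rng.pareto(2.0, size=(n, 2)) + 1.0       # heavy-tailed factors
X = heavy @ basis.T + rng.standard_normal((n, d))  # light-tailed noise everywhere

# Keep exceedances over a large radial threshold and re-scale each
# one to the unit sphere (self-normalization).
r = np.linalg.norm(X, axis=1)
t = np.quantile(r, 0.95)                         # assumed radial threshold
theta = X[r > t] / r[r > t, None]                # angular components

# PCA of the re-scaled exceedances: eigendecomposition of the
# (uncentered) second moment matrix, sorted by decreasing eigenvalue.
M = theta.T @ theta / len(theta)
eigvals, eigvecs = np.linalg.eigh(M)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print(eigvals)  # a sharp drop after the 2nd value suggests a 2-dim support
```

Because the re-scaled points lie on the unit sphere, the eigenvalues sum to one, and the share captured by the leading eigenvectors indicates how concentrated the angular measure is on a low-dimensional subspace.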

Within the statistical learning framework of empirical risk minimization, our main focus is the squared reconstruction error for the exceedances over large radial thresholds. We prove that the empirical risk converges to the true risk, uniformly over all projection subspaces. As a consequence, the empirically optimal projection subspace converges in probability to the truly optimal one, in terms of the Hausdorff distance between their intersections with the unit sphere. In addition, if the exceedances are re-scaled to the unit ball, we obtain finite-sample uniform guarantees for the reconstruction error associated with the estimated projection subspace. Numerical experiments illustrate the capability of the proposed framework to improve estimators of extreme value parameters.
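The risk studied here is, for a candidate subspace with orthonormal basis V, the expected squared reconstruction error E‖Θ − VVᵀΘ‖²; its empirical counterpart averages over the re-scaled exceedances. The sketch below illustrates this quantity on hypothetical unit-sphere data (the data, dimensions, and candidate subspaces are assumptions for the example; the uncentered-PCA reading is as in the previous sketch):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical re-scaled exceedances on the unit sphere in R^4, with most
# angular mass near the span of the first two coordinates (illustrative only).
n, d, k = 2000, 4, 2
raw = np.c_[rng.standard_normal((n, 2)) * 3.0, rng.standard_normal((n, 2)) * 0.3]
theta = raw / np.linalg.norm(raw, axis=1, keepdims=True)

def empirical_risk(theta, V):
    """Mean squared reconstruction error ||theta - V V^T theta||^2."""
    resid = theta - theta @ V @ V.T
    return np.mean(np.sum(resid ** 2, axis=1))

# PCA subspace: top-k eigenvectors of the uncentered second moment matrix.
eigvals, eigvecs = np.linalg.eigh(theta.T @ theta / n)
V_pca = eigvecs[:, np.argsort(eigvals)[::-1][:k]]

# A mismatched candidate: the span of the last two coordinates.
V_bad = np.eye(d)[:, 2:]

print(empirical_risk(theta, V_pca), empirical_risk(theta, V_bad))
```

By the Eckart–Young theorem, the top-k PCA subspace minimizes the empirical reconstruction error over all k-dimensional subspaces, so on this data `V_pca` attains a strictly smaller risk than the mismatched candidate.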



Received: 1 October 2019; Published: 2021
First available in Project Euclid: 16 March 2021

Digital Object Identifier: 10.1214/21-EJS1803

Subjects: Primary 62G32; Secondary 62H25

