Abstract
We develop in this work a new dimension reduction method for high-dimensional settings. The proposed procedure builds on a principal support vector machine framework in which principal projections are used to overcome the non-invertibility of the covariance matrix. Through a series of equivalences we show that the central subspace can be accurately recovered by first projecting onto a lower-dimensional subspace and then applying an ℓ1 penalization strategy to obtain sparse estimators of the sufficient directions. Building on a desparsified estimator, we then provide an inferential procedure for high-dimensional models that allows testing for the importance of variables in determining the sufficient direction. Theoretical properties of the methodology are illustrated, and computational advantages are demonstrated with simulated and real data experiments.
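The high-level recipe described in the abstract can be illustrated with a rough numerical sketch. The snippet below is not the authors' estimator: it only mimics the pipeline (project the predictors onto leading principal components to sidestep a singular covariance, fit SVM-type classifiers across slices of the response, sparsify the resulting directions, and aggregate them). All sizes, tuning parameters, and the soft-thresholding stand-in for the ℓ1 penalization step are illustrative assumptions, and the desparsified inference step is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-index model (all sizes illustrative): y depends on X only
# through b'X, with b sparse and n < p so the sample covariance is singular.
n, p, k = 200, 300, 20
b = np.zeros(p)
b[:3] = [1.0, -1.0, 0.5]          # sparse sufficient direction
X = rng.standard_normal((n, p))
X[:, :10] *= 3.0                  # put the signal in high-variance coordinates
y = X @ b + 0.2 * rng.standard_normal(n)

# Step 1: principal projection -- work with the top-k principal component
# scores instead of inverting the p x p covariance matrix.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
V = Vt[:k].T                      # p x k loading matrix
Z = Xc @ V                        # n x k projected predictors

def svm_direction(Z, labels, lam=0.1, lr=0.05, iters=500):
    """Normal vector of a squared-hinge (L2) SVM, fit by gradient descent."""
    w, t = np.zeros(Z.shape[1]), 0.0
    for _ in range(iters):
        margin = labels * (Z @ w - t)
        slack = np.maximum(0.0, 1.0 - margin)
        w -= lr * (-2.0 * (slack * labels) @ Z / len(labels) + 2.0 * lam * w)
        t -= lr * (2.0 * (slack * labels).mean())
    return w

# Step 2: slice the response, fit one SVM per slice, map each normal vector
# back to predictor space, and sparsify (a crude stand-in for l1 penalization).
M = np.zeros((p, p))
for q in (0.25, 0.5, 0.75):
    labels = np.where(y > np.quantile(y, q), 1.0, -1.0)
    beta = V @ svm_direction(Z, labels)
    beta /= np.linalg.norm(beta)
    tau = 0.1 * np.abs(beta).max()
    beta = np.sign(beta) * np.maximum(np.abs(beta) - tau, 0.0)
    M += np.outer(beta, beta)

# The leading eigenvector of the aggregated matrix estimates the direction.
b_hat = np.linalg.eigh(M)[1][:, -1]
align = abs(b_hat @ b) / np.linalg.norm(b)
print("alignment with true direction:", round(align, 2))
```

On this easy synthetic example the recovered direction aligns closely with the true sparse direction; the actual paper replaces the ad hoc pieces above with a principled penalized PSVM estimator and adds valid inference via desparsification.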
Funding Statement
The authors gratefully acknowledge the computational resources provided by the supercomputing facilities of the Université catholique de Louvain (CISM/UCL) and the Consortium des Équipements de Calcul Intensif en Fédération Wallonie-Bruxelles (CÉCI), funded by the Fonds de la Recherche Scientifique de Belgique (F.R.S.-FNRS) under convention 2.5020.11 and by the Walloon Region. The authors would also like to thank the Data Innovation Research Institute at Cardiff University for partially funding the project.
Citation
Eugen Pircalabelu, Andreas Artemiou. "High-dimensional sufficient dimension reduction through principal projections." Electron. J. Statist. 16(1): 1804–1830, 2022. https://doi.org/10.1214/22-EJS1988