Abstract
We place ourselves in the setting of high-dimensional statistical inference where the number of variables p in a dataset of interest is of the same order of magnitude as the number of observations n.
We consider the spectrum of certain kernel random matrices, in particular n×n matrices whose (i, j)th entry is f(Xi′Xj/p) or f(‖Xi − Xj‖²/p), where p is the dimension of the data and the Xi are independent data vectors. Here f is assumed to be a locally smooth function.
The study is motivated by questions arising in statistics and computer science, where these matrices are used to perform, among other things, nonlinear versions of principal component analysis. Surprisingly, we show that in high dimensions, and for the models we analyze, the problem becomes essentially linear, which is at odds with heuristics sometimes used to justify the use of these methods. The analysis also highlights certain peculiarities of models widely studied in random matrix theory and raises some questions about their relevance as tools to model high-dimensional data encountered in practice.
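To make the objects in the abstract concrete, the following minimal sketch (not from the paper) constructs the two kinds of kernel random matrices for simulated data and computes their spectra. The Gaussian data model, the sizes n and p, and the particular choices of the locally smooth function f are illustrative assumptions only.

```python
import numpy as np

# Illustrative sketch only: build the two kernel random matrices described in
# the abstract and compute their eigenvalues. Data model, n, p and f are
# assumptions made purely for illustration.
rng = np.random.default_rng(0)
n, p = 400, 400                          # n observations, p variables, with n and p comparable
X = rng.standard_normal((n, p))          # rows X_i are independent data vectors

G = X @ X.T / p                          # scaled inner products X_i' X_j / p
sq_norms = np.diag(G)
D = sq_norms[:, None] + sq_norms[None, :] - 2.0 * G   # scaled squared distances ||X_i - X_j||^2 / p

K_inner = np.exp(G)                      # entries f(X_i' X_j / p) with f = exp
K_dist = np.exp(-D)                      # entries f(||X_i - X_j||^2 / p) with f(t) = exp(-t), a Gaussian-type kernel

eigs_inner = np.linalg.eigvalsh(K_inner) # spectrum of the inner-product kernel matrix
eigs_dist = np.linalg.eigvalsh(K_dist)   # spectrum of the distance-based kernel matrix
print("largest eigenvalues:", eigs_inner[-3:], eigs_dist[-3:])
```

Under the paper's asymptotics (n and p of the same order), the spectra of such matrices are the object of study; the sketch above is only meant to show how the matrices are formed from the data.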
Citation
Noureddine El Karoui. "The spectrum of kernel random matrices." Ann. Statist. 38(1): 1–50, February 2010. https://doi.org/10.1214/08-AOS648