Abstract
In this work we show that, using the eigen-decomposition of the adjacency matrix, we can consistently estimate feature maps for latent position graphs with positive definite link function $\kappa$, provided that the latent positions are i.i.d. from some distribution $F$. We then consider the exploitation task of vertex classification, where the link function $\kappa$ belongs to the class of universal kernels, class labels are observed for a number of vertices tending to infinity, and the remaining vertices are to be classified. We show that minimization of the empirical $\varphi$-risk, for some convex surrogate $\varphi$ of the 0–1 loss, over a class of linear classifiers with increasing complexities yields a universally consistent classifier, that is, a classification rule whose error converges to the Bayes optimal error for any distribution $F$.
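The following is a minimal sketch, not the authors' code, of the pipeline the abstract describes: embed the vertices via the eigen-decomposition of the adjacency matrix, then train a linear classifier on the embedded, labeled vertices by minimizing a convex surrogate of the 0–1 loss (hinge loss here). The inner-product link function, Dirichlet latent-position distribution, embedding dimension, and train/test split are all illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from scipy.sparse.linalg import eigsh
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Simulate a latent position graph: latent positions X_i are i.i.d. from a
# class-dependent distribution F, and edges occur independently with
# probability kappa(X_i, X_j) = <X_i, X_j> (an inner-product link; illustrative).
n, d = 1000, 2
labels = rng.integers(0, 2, size=n)                 # class label of each vertex
X = np.where(labels[:, None] == 1,
             rng.dirichlet([5, 2], size=n),
             rng.dirichlet([2, 5], size=n))         # latent positions depend on class
P = X @ X.T                                         # edge probability matrix
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                         # symmetric, hollow adjacency matrix

# Adjacency spectral embedding: the top-d eigenpairs of A yield estimated
# feature maps (latent positions, up to an orthogonal transformation).
vals, vecs = eigsh(A, k=d, which='LA')
Z = vecs * np.sqrt(np.abs(vals))

# Vertex classification: labels are observed on a training subset; a linear
# classifier minimizing the hinge loss (a convex surrogate of 0-1 loss) is
# trained on the embedded vertices and evaluated on the held-out vertices.
train = rng.random(n) < 0.5
clf = LinearSVC(C=1.0).fit(Z[train], labels[train])
print("held-out accuracy:", clf.score(Z[~train], labels[~train]))
```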
Citation
Minh Tang, Daniel L. Sussman, and Carey E. Priebe. "Universally consistent vertex classification for latent positions graphs." Ann. Statist. 41 (3): 1406–1430, June 2013. https://doi.org/10.1214/13-AOS1112