Open Access
June 2013
Universally consistent vertex classification for latent positions graphs
Minh Tang, Daniel L. Sussman, Carey E. Priebe
Ann. Statist. 41(3): 1406-1430 (June 2013). DOI: 10.1214/13-AOS1112

Abstract

In this work we show that, using the eigen-decomposition of the adjacency matrix, we can consistently estimate feature maps for latent position graphs with positive definite link function $\kappa$, provided that the latent positions are i.i.d. from some distribution $F$. We then consider the exploitation task of vertex classification in the setting where the link function $\kappa$ belongs to the class of universal kernels, class labels are observed for a number of vertices tending to infinity, and the remaining vertices are to be classified. We show that minimizing the empirical $\varphi$-risk, for some convex surrogate $\varphi$ of the 0–1 loss, over a class of linear classifiers of increasing complexity yields a universally consistent classifier, that is, a classification rule whose error converges to the Bayes optimal error for any distribution $F$.
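
The following is a minimal sketch, not the paper's construction, of the two-stage procedure the abstract describes: estimate feature maps from the eigen-decomposition of the adjacency matrix, then classify the unlabeled vertices with a linear classifier fit by minimizing a convex surrogate of the 0–1 loss. The simulated random dot product graph, the embedding dimension d, the threshold labeling rule, and the use of scikit-learn's LinearSVC (hinge loss as the surrogate) are all illustrative assumptions made here for the demo.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Simulate a latent position graph: latent positions X_i drawn i.i.d. from a
# distribution F, with positive definite link kappa(x, y) = <x, y>
# (a random dot product graph, one instance of the model in the abstract).
n, d = 1000, 2
X = rng.dirichlet([2, 2, 2], size=n)[:, :d]   # latent positions
P = X @ X.T                                   # edge probabilities kappa(X_i, X_j)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                   # symmetric, hollow adjacency matrix

# Step 1: estimated feature maps from the eigen-decomposition of A
# (top-d eigenpairs by magnitude, scaled by the square roots of the eigenvalues).
vals, vecs = np.linalg.eigh(A)
top = np.argsort(np.abs(vals))[::-1][:d]
Z = vecs[:, top] * np.sqrt(np.abs(vals[top]))

# Step 2: class labels depend on the latent positions (a hypothetical rule for
# this demo); they are observed for m vertices, and the remaining vertices are
# classified by a linear classifier trained via empirical hinge-risk minimization.
y = (X[:, 0] > X[:, 1]).astype(int)
m = 500
clf = LinearSVC(max_iter=10000).fit(Z[:m], y[:m])
err = np.mean(clf.predict(Z[m:]) != y[m:])
print(f"held-out classification error: {err:.3f}")
```

In the paper the class of linear classifiers has complexity growing with the number of labeled vertices and the surrogate loss is a general convex cost; the fixed-dimension hinge-loss classifier above is only a simplified stand-in for that scheme.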

Citation

Minh Tang, Daniel L. Sussman, Carey E. Priebe. "Universally consistent vertex classification for latent positions graphs." Ann. Statist. 41(3): 1406-1430, June 2013. https://doi.org/10.1214/13-AOS1112

Information

Published: June 2013
First available in Project Euclid: 1 August 2013

zbMATH: 1273.62147
MathSciNet: MR3113816
Digital Object Identifier: 10.1214/13-AOS1112

Subjects:
Primary: 62H30
Secondary: 62C12, 62G20

Keywords: Bayes-risk consistency, classification, convergence of eigenvectors, convex cost function, latent space model

Rights: Copyright © 2013 Institute of Mathematical Statistics
