Open Access
Prediction models for network-linked data
Tianxi Li, Elizaveta Levina, Ji Zhu
Ann. Appl. Stat. 13(1): 132-164 (March 2019). DOI: 10.1214/18-AOAS1205

Abstract

Prediction algorithms typically assume the training data are independent samples, but in many modern applications samples come from individuals connected by a network. For example, in adolescent health studies of risk-taking behaviors, information on the subjects’ social network is often available and plays an important role through network cohesion, the empirically observed phenomenon of friends behaving similarly. Taking cohesion into account in prediction models should allow us to improve their performance. Here we propose a network-based penalty on individual node effects to encourage similarity between predictions for linked nodes, and show that incorporating it into prediction leads to improvement over traditional models both theoretically and empirically when network cohesion is present. The penalty can be used with many loss-based prediction methods, such as regression, generalized linear models, and Cox’s proportional hazards model. Applications to predicting levels of recreational activity and marijuana usage among teenagers from the AddHealth study based on both demographic covariates and friendship networks are discussed in detail and show that our approach to taking friendships into account can significantly improve predictions of behavior while providing interpretable estimates of covariate effects.
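As a concrete illustration of the penalty described in the abstract (a minimal sketch in illustrative notation, not necessarily the paper's exact formulation): for a linear model with response $Y \in \mathbb{R}^n$, covariate matrix $X \in \mathbb{R}^{n \times p}$, individual node effects $\alpha \in \mathbb{R}^n$, and the Laplacian $L$ of the observed network, one natural form of the penalized criterion is

\[
(\hat{\alpha}, \hat{\beta}) = \arg\min_{\alpha,\, \beta} \; \| Y - \alpha - X\beta \|_2^2 + \lambda\, \alpha^{\top} L \alpha ,
\]

where $\alpha^{\top} L \alpha = \sum_{(i,j) \in E} (\alpha_i - \alpha_j)^2$ shrinks the effects of linked nodes toward each other and $\lambda \ge 0$ controls the strength of the network cohesion penalty. The same quadratic penalty can in principle be added to other losses, for example the negative log-likelihood of a generalized linear model or the partial likelihood in Cox regression, consistent with the loss-based methods listed in the abstract.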

Citation


Tianxi Li, Elizaveta Levina, Ji Zhu. "Prediction models for network-linked data." Ann. Appl. Stat. 13(1): 132-164, March 2019. https://doi.org/10.1214/18-AOAS1205

Information

Received: 1 May 2017; Revised: 1 June 2018; Published: March 2019
First available in Project Euclid: 10 April 2019

zbMATH: 07057423
MathSciNet: MR3937424
Digital Object Identifier: 10.1214/18-AOAS1205

Keywords: Network cohesion, prediction, regression

Rights: Copyright © 2019 Institute of Mathematical Statistics
