Open Access
April 2020
Learning a tree-structured Ising model in order to make predictions
Guy Bresler, Mina Karzand
Ann. Statist. 48(2): 713-737 (April 2020). DOI: 10.1214/19-AOS1808

Abstract

We study the problem of learning a tree Ising model from samples such that subsequent predictions made using the model are accurate. The prediction task considered in this paper is that of predicting the values of a subset of variables given values of some other subset of variables. Virtually all previous work on graphical model learning has focused on recovering the true underlying graph. We define a distance (“small set TV” or ssTV) between distributions $P$ and $Q$ by taking the maximum, over all subsets $\mathcal{S}$ of a given size, of the total variation between the marginals of $P$ and $Q$ on $\mathcal{S}$; this distance captures the accuracy of the prediction task of interest. We derive nonasymptotic bounds on the number of samples needed to get a distribution (from the same class) with small ssTV relative to the one generating the samples. One of the main messages of this paper is that far fewer samples are needed than for recovering the underlying tree, which means that accurate predictions are possible using the wrong tree.
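As a reading aid, the ssTV distance described in words above can be sketched in the abstract's own notation, writing $k$ for the given subset size (the symbol $k$ and the exact normalization are taken from the paper's setup and are not restated here authoritatively):

$$\operatorname{ssTV}_{k}(P, Q) \;=\; \max_{\mathcal{S}\,:\,|\mathcal{S}| = k} \; d_{\mathrm{TV}}\bigl(P_{\mathcal{S}}, Q_{\mathcal{S}}\bigr),$$

where $P_{\mathcal{S}}$ and $Q_{\mathcal{S}}$ denote the marginal distributions of $P$ and $Q$ on the variables in $\mathcal{S}$, and $d_{\mathrm{TV}}$ is the total variation distance.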

Citation


Guy Bresler, Mina Karzand. "Learning a tree-structured Ising model in order to make predictions." Ann. Statist. 48(2): 713–737, April 2020. https://doi.org/10.1214/19-AOS1808

Information

Received: 1 November 2016; Revised: 1 April 2018; Published: April 2020
First available in Project Euclid: 26 May 2020

zbMATH: 07241566
MathSciNet: MR4102673
Digital Object Identifier: 10.1214/19-AOS1808

Subjects:
Primary: 62F12, 62H12

Keywords: high-dimensional statistics, Ising model, Markov random fields, model selection, prediction, tree model

Rights: Copyright © 2020 Institute of Mathematical Statistics
