Bernoulli, Volume 15, Number 4 (2009), 1179–1189.

Determining full conditional independence by low-order conditioning

Dhafer Malouche

Full-text: Open access


A concentration graph associated with a random vector is an undirected graph where each vertex corresponds to one random variable in the vector. The absence of an edge between any pair of vertices (or variables) is equivalent to full conditional independence between these two variables given all the other variables. In the multivariate Gaussian case, the absence of an edge corresponds to a zero coefficient in the precision matrix, which is the inverse of the covariance matrix.
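
In the Gaussian case this correspondence is easy to check numerically. A minimal NumPy sketch (the precision matrix below is an arbitrary illustrative choice, not taken from the paper):

```python
import numpy as np

# Precision matrix of a 3-variable Gaussian chain X1 - X2 - X3
# (illustrative values; any positive-definite matrix with this
# zero pattern would serve).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

Sigma = np.linalg.inv(K)  # the covariance matrix

# K[0, 2] == 0  <=>  no edge between X1 and X3 in the
# concentration graph  <=>  X1 is independent of X3 given X2.
print("precision entry (1,3):", K[0, 2])

# The corresponding covariance entry is nonzero: X1 and X3 are
# marginally dependent despite being conditionally independent.
print("covariance entry (1,3):", Sigma[0, 2])
```

Note that the zero lives in the precision matrix, not the covariance matrix: marginal and conditional independence differ in general.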

It is well known that the concentration graph represents some of the conditional independencies in the distribution of the associated random vector; these independencies correspond to the “separations”, i.e., the absences of edges, in the graph. In this paper we assume that the probability distribution contains no conditional independencies other than those represented by the graph. This property is called perfect Markovianity of the probability distribution with respect to its concentration graph. We prove that the concentration graph of a perfect Markov distribution can be determined by conditioning on only a limited number of variables, and we show that this number equals the maximum size of the minimal separators in the concentration graph.
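
For intuition, the bound can be checked on toy graphs by brute force. The following sketch is ours, not the paper's algorithm: it enumerates all separators for each non-adjacent pair (exponential search, so only viable for very small graphs) and reports the maximum size of an inclusion-minimal separator.

```python
from itertools import combinations

def is_separator(adj, s, a, b):
    """True if removing the vertex set s disconnects a from b in adj."""
    seen, stack = {a}, [a]
    while stack:
        v = stack.pop()
        for w in adj[v]:
            if w == b:
                return False          # a still reaches b
            if w not in s and w not in seen:
                seen.add(w)
                stack.append(w)
    return True

def max_minimal_separator_size(adj):
    """Maximum size of an inclusion-minimal separator over all
    non-adjacent vertex pairs: under perfect Markovianity, this is
    the conditioning order that suffices to recover the graph."""
    best = 0
    vertices = sorted(adj)
    for a, b in combinations(vertices, 2):
        if b in adj[a]:
            continue                  # adjacent pairs have no separator
        others = [v for v in vertices if v not in (a, b)]
        seps = [set(s)
                for k in range(len(others) + 1)
                for s in combinations(others, k)
                if is_separator(adj, set(s), a, b)]
        # keep only the inclusion-minimal separators for this pair
        minimal = [s for s in seps if not any(t < s for t in seps)]
        best = max(best, max(len(s) for s in minimal))
    return best

# 4-cycle 1-2-3-4-1: the minimal separators are {2, 4} and {1, 3},
# so conditioning on sets of size 2 determines the whole graph.
cycle = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(max_minimal_separator_size(cycle))  # -> 2
```

On the 4-cycle, pairwise or first-order conditioning would miss the separations, while second-order conditioning recovers them all, matching the bound.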

Article information


First available in Project Euclid: 8 January 2010


Keywords: conditional independence; graphical models; Markov properties; separability in graphs; undirected graphs


Malouche, Dhafer. Determining full conditional independence by low-order conditioning. Bernoulli 15 (2009), no. 4, 1179–1189. doi:10.3150/09-BEJ193.



  • Castelo, R. and Roverato, A. (2006). A robust procedure for Gaussian graphical model search from microarray data with p larger than n. J. Mach. Learn. Res. 7 2621–2650.
  • Chaudhuri, S., Drton, M. and Richardson, T.S. (2007). Estimation of a covariance matrix with zeros. Biometrika 94 199–216.
  • Cox, D.R. and Wermuth, N. (1996). Multivariate Dependencies: Models, Analysis and Interpretation. London: Chapman & Hall.
  • Dempster, A. (1972). Covariance selection. Biometrics 28 157–175.
  • Drton, M. and Richardson, T. (2008). Graphical methods for efficient likelihood inference in Gaussian covariance models. J. Mach. Learn. Res. 9 893–914.
  • Edwards, D. (2000). Introduction to Graphical Modelling. New York: Springer.
  • Friedman, N., Linial, M., Nachman, I. and Pe’er, D. (2000). Using Bayesian networks to analyse expression data. J. Comput. Biol. 7(3–4) 601–620.
  • Geiger, D. and Pearl, J. (1993). Logical and algorithmic properties of conditional independence and graphical models. Ann. Statist. 21 2001–2021.
  • Kalisch, M. and Bühlmann, P. (2007). Estimating high-dimensional directed acyclic graphs with the PC-algorithm. J. Mach. Learn. Res. 8 613–636.
  • Kjærulff, U.B. and Madsen, A.L. (2007). Bayesian Networks and Influence Diagrams. New York: Springer.
  • Lauritzen, S.L. (1996). Graphical Models. New York: Oxford Univ. Press.
  • Letac, G. and Massam, H. (2007). Wishart distributions on decomposable graphs. Ann. Statist. 35 1278–1323.
  • Rajaratnam, B., Massam, H. and Carvalho, C. (2008). Flexible covariance estimation in graphical models. Ann. Statist. 36 2818–2849.
  • Spirtes, P., Glymour, C. and Scheines, R. (2000). Causation, Prediction and Search, 2nd ed. Cambridge, MA: MIT Press.
  • Whittaker, J. (1990). Graphical Models in Applied Multivariate Statistics. Chichester: Wiley.
  • Wille, A. and Bühlmann, P. (2006). Low-order conditional independence graphs for inferring genetic networks. Stat. Appl. Genet. Mol. Biol. 5 1–32.