Open Access
Posterior graph selection and estimation consistency for high-dimensional Bayesian DAG models
Xuan Cao, Kshitij Khare, Malay Ghosh
Ann. Statist. 47(1): 319-348 (February 2019). DOI: 10.1214/18-AOS1689


Covariance estimation and selection for high-dimensional multivariate datasets is a fundamental problem in modern statistics. Gaussian directed acyclic graph (DAG) models are a popular class of models used for this purpose. Gaussian DAG models induce sparsity in the Cholesky factor of the inverse covariance matrix, and the sparsity pattern in turn corresponds to specific conditional independence assumptions on the underlying variables. A variety of priors, most of them adaptations or generalizations of the Wishart distribution to the DAG context, have been developed in recent years for Bayesian inference in DAG models, yet crucial convergence and sparsity selection properties for these models have not been thoroughly investigated. In this paper, we consider a flexible and general class of these “DAG-Wishart” priors with multiple shape parameters. Under mild regularity assumptions, we establish strong graph selection consistency and posterior convergence rates for estimation when the number of variables $p$ is allowed to grow at an appropriate subexponential rate with the sample size $n$.
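The link between a DAG and sparsity in the Cholesky factor can be illustrated with a small numerical sketch. This is not the paper's construction; it uses one common ordering convention in which the precision matrix factors as $\Omega = L D^{-1} L^{T}$, with $L$ unit lower triangular and $L_{ij} \neq 0$ (for $i > j$) only when $j$ is a parent of $i$. The DAG, edge weights, and conditional variances below are arbitrary illustrative choices.

```python
import numpy as np

# Hypothetical 4-node DAG with edges 0 -> 1, 0 -> 2, 2 -> 3.
# L is unit lower triangular; its off-diagonal support encodes the DAG.
p = 4
L = np.eye(p)
L[1, 0] = 0.8    # edge 0 -> 1
L[2, 0] = -0.5   # edge 0 -> 2
L[3, 2] = 0.6    # edge 2 -> 3
D = np.diag([1.0, 0.5, 2.0, 1.5])   # conditional variances (arbitrary)

# Precision (inverse covariance) matrix under this convention.
Omega = L @ np.linalg.inv(D) @ L.T

# The Cholesky factor of Omega inherits the zero pattern of L, even though
# Omega itself can have fill-in: here Omega[2, 1] != 0 because nodes 1 and 2
# share parent 0, yet the (2, 1) entry of the Cholesky factor is zero.
chol = np.linalg.cholesky(Omega)
print(np.abs(chol) > 1e-10)
```

This is why conditional independence assumptions are naturally read off the Cholesky factor rather than off $\Omega$ itself: marginal dependence induced by shared parents fills in $\Omega$ but leaves the factor sparse.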



Xuan Cao. Kshitij Khare. Malay Ghosh. "Posterior graph selection and estimation consistency for high-dimensional Bayesian DAG models." Ann. Statist. 47 (1) 319 - 348, February 2019.


Received: 1 May 2017; Revised: 1 January 2018; Published: February 2019
First available in Project Euclid: 30 November 2018

zbMATH: 07036203
MathSciNet: MR3909935
Digital Object Identifier: 10.1214/18-AOS1689

Primary: 62F15
Secondary: 62G20

Keywords: Bayesian DAG models, covariance estimation, graph selection, high-dimensional data, posterior consistency

Rights: Copyright © 2019 Institute of Mathematical Statistics

