Probability Based Independence Sampler for Bayesian Quantitative Learning in Graphical Log-Linear Marginal Models
Ioannis Ntzoufras, Claudia Tarantola, Monia Lupparelli
Bayesian Anal. 14(3): 777-803 (September 2019). DOI: 10.1214/18-BA1128


We introduce a novel Bayesian approach to quantitative learning for graphical log-linear marginal models. These models belong to curved exponential families, which are difficult to handle from a Bayesian perspective. The likelihood cannot be expressed analytically as a function of the marginal log-linear interactions, but only in terms of cell counts or probabilities. Posterior distributions cannot be obtained in closed form, so Markov chain Monte Carlo (MCMC) methods are needed. Moreover, a well-defined model requires parameter values that lead to compatible marginal probabilities, and any MCMC scheme must account for this important restriction. We construct a fully automatic and efficient MCMC strategy for quantitative learning in such models that handles these problems. While the prior is expressed in terms of the marginal log-linear interactions, the algorithm employs a proposal on the probability parameter space; the corresponding proposal on the marginal log-linear interactions is obtained via parameter transformation. We exploit a conditional conjugate setup to build an efficient proposal on the probability parameters. The proposed methodology is illustrated by a simulation study and a real dataset.
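The abstract's core idea can be sketched as an independence Metropolis-Hastings sampler that proposes cell probabilities from a conditionally conjugate Dirichlet distribution and maps each proposal to log-linear interactions via a contrast of log-probabilities. The sketch below is illustrative only, not the authors' implementation: the 2x2 table, the counts, the normal prior on the interaction, and the contrast matrix are all assumptions, and the Jacobian of the probability-to-interaction reparameterization is omitted for brevity (a full implementation of a prior placed on the interactions must include it).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed cell counts for a 2x2 contingency table.
counts = np.array([25.0, 10.0, 12.0, 30.0])

# Conditionally conjugate Dirichlet independence proposal, e.g.
# Dirichlet(counts + 1), so proposals already resemble the posterior.
prop_alpha = counts + 1.0

# Assumed normal prior on the log-linear interaction.
prior_mean, prior_sd = 0.0, 2.0

# Contrast mapping log cell probabilities to the two-way log-linear
# interaction of a 2x2 table (up to scaling; illustrative only).
C = np.array([[-0.25, 0.25, 0.25, -0.25]])

def to_interactions(p):
    """Parameter transformation: probabilities -> log-linear interaction."""
    return C @ np.log(p)

def log_prior(lam):
    return float(np.sum(-0.5 * ((lam - prior_mean) / prior_sd) ** 2))

def log_lik(p):
    return float(np.sum(counts * np.log(p)))

def log_proposal(p):
    # Dirichlet log-density up to its normalizing constant.
    return float(np.sum((prop_alpha - 1.0) * np.log(p)))

# Independence Metropolis-Hastings in probability space.
# NOTE: the Jacobian of the reparameterization is omitted here.
p_cur = counts / counts.sum()
samples = []
for _ in range(5000):
    p_new = rng.dirichlet(prop_alpha)
    log_r = (log_lik(p_new) + log_prior(to_interactions(p_new)) - log_proposal(p_new)
             - log_lik(p_cur) - log_prior(to_interactions(p_cur)) + log_proposal(p_cur))
    if np.log(rng.uniform()) < log_r:
        p_cur = p_new
    samples.append(float(to_interactions(p_cur)[0]))

print(len(samples))
```

Because the proposal does not depend on the current state, each draw is an independent candidate; efficiency hinges on the Dirichlet proposal tracking the posterior closely, which is the role of the conditional conjugate setup described in the abstract.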


Download Citation

Ioannis Ntzoufras, Claudia Tarantola, Monia Lupparelli. "Probability Based Independence Sampler for Bayesian Quantitative Learning in Graphical Log-Linear Marginal Models." Bayesian Anal. 14(3): 777-803, September 2019.


Published: September 2019
First available in Project Euclid: 11 June 2019

zbMATH: 1421.62029
MathSciNet: MR3960771
Digital Object Identifier: 10.1214/18-BA1128
