Open Access
Compressing parameters in Bayesian high-order models with application to logistic sequence models
Longhai Li, Radford M. Neal
Bayesian Anal. 3(4): 793-821 (December 2008). DOI: 10.1214/08-BA330

Abstract

Bayesian classification and regression with high-order interactions is largely infeasible because Markov chain Monte Carlo (MCMC) would need to be applied to a great many parameters, whose number increases rapidly with the order. In this paper we show how to make such models feasible by effectively reducing the number of parameters, exploiting the fact that many interaction patterns have the same values in all training cases. Our method uses a single "compressed" parameter to represent the sum of all parameters associated with a set of patterns that have the same value for every training case. Using symmetric stable distributions as the priors for the original parameters, we can easily find the priors of these compressed parameters. We therefore need to deal only with a much smaller number of compressed parameters when training the model with MCMC. After training, we can split the compressed parameters back into the original ones as needed to make predictions for test cases. We show in detail how to compress parameters for logistic sequence prediction models. Experiments on both simulated and real data demonstrate that our compression method can indeed reduce the number of parameters enormously.
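The key fact the abstract relies on is that a sum of independent symmetric stable random variables is again symmetric stable, with a width parameter determined in closed form, so the prior of a compressed parameter (the sum over a group of original parameters) is known exactly. The sketch below, which is not the authors' code, checks this numerically for the Cauchy case (stable index α = 1): if β₁, …, β_k are i.i.d. Cauchy(0, s), their sum is Cauchy(0, k·s). The group size `k` and scale `s` here are illustrative choices, not values from the paper.

```python
import numpy as np

# Illustrative check of the compression property for the Cauchy case:
# the sum of k independent Cauchy(0, s) variables is Cauchy(0, k*s),
# so one "compressed" parameter with a Cauchy(0, k*s) prior stands in
# for k original parameters during MCMC.
rng = np.random.default_rng(0)
k, s, n = 7, 0.5, 200_000        # group size, prior scale, Monte Carlo draws

betas = s * rng.standard_cauchy(size=(n, k))   # draws of the original parameters
compressed = betas.sum(axis=1)                 # one compressed draw per row

# For Cauchy(0, gamma) the quartiles sit at +/- gamma, so the IQR is 2*gamma.
q1, q3 = np.percentile(compressed, [25, 75])
print(f"empirical IQR = {q3 - q1:.3f}, predicted 2*k*s = {2 * k * s:.3f}")
```

The same closed-form scaling holds for any symmetric stable index α (with width k^{1/α}·s in general), which is what lets the priors of the compressed parameters be written down without integration.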

Citation


Longhai Li, Radford M. Neal. "Compressing parameters in Bayesian high-order models with application to logistic sequence models." Bayesian Anal. 3(4): 793-821, December 2008. https://doi.org/10.1214/08-BA330

Information

Published: December 2008
First available in Project Euclid: 22 June 2012

zbMATH: 1330.62142
MathSciNet: MR2469800
Digital Object Identifier: 10.1214/08-BA330

Keywords: compressing parameters, high-order models, interaction, logistic models, Markov chain Monte Carlo

Rights: Copyright © 2008 International Society for Bayesian Analysis
