Brazilian Journal of Probability and Statistics

Bayesian statistics with a smile: A resampling–sampling perspective

Hedibert F. Lopes, Nicholas G. Polson, and Carlos M. Carvalho

Full-text: Open access

Abstract

In this paper we develop a simulation-based approach to sequential inference in Bayesian statistics. Our resampling–sampling perspective provides draws from posterior distributions of interest by exploiting the sequential nature of Bayes theorem. Predictive inferences are a direct byproduct of our analysis as are marginal likelihoods for model assessment. We illustrate our approach in a hierarchical normal-means model and in a sequential version of Bayesian lasso. This approach provides a simple yet powerful framework for the construction of alternative posterior sampling strategies for a variety of commonly used models.
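The resampling–sampling idea can be illustrated with a minimal sketch. This is not the paper's algorithm, only an assumed toy setup: observations y_t ~ N(theta, 1) with prior theta ~ N(0, 1), where each new observation first resamples particles by their one-step predictive weight and then refreshes draws from the updated posterior, which conjugacy makes exact here.

```python
import numpy as np

rng = np.random.default_rng(0)

def resample_sample(ys, n_particles=5000):
    """Toy resampling-sampling update for y_t ~ N(theta, 1), theta ~ N(0, 1)."""
    theta = rng.normal(0.0, 1.0, n_particles)    # draws from the prior
    s, n = 0.0, 0                                # sufficient statistics
    for y in ys:
        # Resample: weights proportional to the predictive p(y | theta)
        w = np.exp(-0.5 * (y - theta) ** 2)
        w /= w.sum()
        theta = theta[rng.choice(n_particles, n_particles, p=w)]
        # Sample: refresh draws from p(theta | y_{1:t}); exact here by
        # conjugacy, so no particle degeneracy accumulates. In general
        # one would propagate via a conditional posterior given
        # particle-specific sufficient statistics.
        s, n = s + y, n + 1
        post_var = 1.0 / (1.0 + n)               # prior precision 1 plus n
        theta = rng.normal(post_var * s, np.sqrt(post_var), n_particles)
    return theta

ys = rng.normal(2.0, 1.0, 200)
draws = resample_sample(ys)
print(draws.mean())  # concentrates near the data mean
```

The unnormalized resampling weights also yield the one-step predictive densities, which is how marginal likelihoods for model assessment fall out of the same recursion.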

Article information

Source
Braz. J. Probab. Stat. Volume 26, Number 4 (2012), 358-371.

Dates
First available in Project Euclid: 3 July 2012

Permanent link to this document
https://projecteuclid.org/euclid.bjps/1341320248

Digital Object Identifier
doi:10.1214/11-BJPS144

Mathematical Reviews number (MathSciNet)
MR2949084

Zentralblatt MATH identifier
1319.62062

Keywords
Hierarchical models; MCMC; Gibbs sampling; Bayesian lasso; ANOVA

Citation

Lopes, Hedibert F.; Polson, Nicholas G.; Carvalho, Carlos M. Bayesian statistics with a smile: A resampling–sampling perspective. Braz. J. Probab. Stat. 26 (2012), no. 4, 358--371. doi:10.1214/11-BJPS144. https://projecteuclid.org/euclid.bjps/1341320248


References

  • Brockwell, A., Del Moral, P. and Doucet, A. (2010). Sequentially interacting Markov chain Monte Carlo. The Annals of Statistics 38, 3387–3411.
  • Carlin, B. P. and Polson, N. G. (1991). Inference for nonconjugate Bayesian models using the Gibbs sampler. The Canadian Journal of Statistics 19, 399–405.
  • Carpenter, J., Clifford, P. and Fearnhead, P. (1999). An improved particle filter for non-linear problems. IEE Proceedings—Radar, Sonar and Navigation 146, 2–7.
  • Carvalho, C. M., Johannes, M., Lopes, H. F. and Polson, N. G. (2010a). Particle learning and smoothing. Statistical Science 25, 88–106.
  • Carvalho, C. M., Lopes, H. F., Polson, N. G. and Taddy, M. (2010b). Particle learning for general mixtures. Bayesian Analysis 5, 709–740.
  • Chen, R. and Liu, J. S. (2000). Mixture Kalman filters. Journal of the Royal Statistical Society, Series B 62, 493–508.
  • Chopin, N. (2002). A sequential particle filter method for static models. Biometrika 89, 539–551.
  • Fearnhead, P. (2002). Markov chain Monte Carlo, sufficient statistics, and particle filters. Journal of Computational and Graphical Statistics 11, 848–862.
  • Gelfand, A. and Smith, A. F. M. (1990). Sampling-based approaches to calculating marginal densities. Journal of the American Statistical Association 85, 398–409.
  • Gelman, A., van Dyk, D. A., Huang, Z. and Boscardin, W. J. (2008). Using redundant parameterizations to fit hierarchical models. Journal of Computational and Graphical Statistics 17, 95–122.
  • Gilks, W. and Berzuini, C. (2001). Following a moving target: Monte Carlo inference for dynamic Bayesian models. Journal of the Royal Statistical Society, Series B 63, 127–146.
  • Hans, C. (2009). Bayesian lasso regression. Biometrika 96, 835–845.
  • Kitagawa, G. (1996). Monte Carlo filter and smoother for non-Gaussian non-linear state space models. Journal of Computational and Graphical Statistics 5, 1–25.
  • Kong, A., Liu, J. S. and Wong, W. H. (1994). Sequential imputations and Bayesian missing data problems. Journal of the American Statistical Association 89, 278–288.
  • Liu, J. (1994). The collapsed Gibbs sampler with applications to a gene regulation problem. Journal of the American Statistical Association 89, 958–966.
  • Liu, J. and West, M. (2001). Combined parameters and state estimation in simulation-based filtering. In Sequential Monte Carlo Methods in Practice (A. Doucet, N. de Freitas and N. Gordon, eds.). New York: Springer.
  • Lopes, H. F., Carvalho, C. M., Johannes, M. S. and Polson, N. G. (2011). Particle learning for sequential Bayesian computation (with discussion). In Bayesian Statistics 9 (J. M. Bernardo, M. J. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman, A. F. M. Smith and M. West, eds.) 317–360. Oxford: Oxford Univ. Press.
  • Lopes, H. F. and Tsay, R. S. (2011). Particle filters and Bayesian inference in financial econometrics. Journal of Forecasting 30, 168–209.
  • Pitt, M. and Shephard, N. (1999). Filtering via simulation: Auxiliary particle filters. Journal of the American Statistical Association 94, 590–599.
  • Rubin, D. B. (1987). A noniterative sampling-importance resampling alternative to the data augmentation algorithm for creating a few imputations when fractions of missing information are modest: The SIR algorithm. Journal of the American Statistical Association 82, 543–546.
  • Smith, A. F. M. and Gelfand, A. (1992). Bayesian statistics without tears: A sampling–resampling perspective. The American Statistician 46, 84–88.
  • Storvik, G. (2002). Particle filters in state space models with the presence of unknown static parameters. IEEE Transactions on Signal Processing 50, 281–289.
  • Tiao, G. C. and Tan, Y. (1965). Bayesian analysis of random-effect models in the analysis of variance. Biometrika 52, 37–54.
  • West, M. (1993). Approximating posterior distributions by mixtures. Journal of the Royal Statistical Society, Series B 55, 409–422.