Statistical Science

Bayes Model Selection with Path Sampling: Factor Models and Other Examples

Ritabrata Dutta and Jayanta K. Ghosh

Full-text: Open access

Abstract

We prove a theorem justifying the regularity conditions needed for Path Sampling in Factor Models. We then show that the remaining ingredient, namely, MCMC for calculating the integrand at each point in the path, may be seriously flawed, leading to wrong estimates of Bayes factors. We provide a new method, Path Sampling with Small Change (PS-SC), that works much better than standard Path Sampling in the sense of estimating the Bayes factor better and choosing the correct model more often. When the more complex factor model is true, PS-SC is substantially more accurate. New MCMC diagnostics are provided for these problems in support of our conclusions and recommendations. Some of our ideas for diagnostics and improvement in computation through small changes should apply to other methods of computing the Bayes factor for model selection.
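To fix ideas, path sampling estimates a log ratio of normalizing constants by averaging the derivative of the log unnormalized density along a path of bridging distributions and integrating over the path parameter. The following is a minimal sketch, not the paper's method: it uses a toy pair of one-dimensional Gaussian "models" whose geometric path can be sampled exactly (standing in for the MCMC draws at each grid point), so the estimate can be checked against the closed-form answer. All names and the grid/sample sizes are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized densities: q0(x) = exp(-x^2 / (2 s0^2)), q1(x) = exp(-x^2 / (2 s1^2)).
# The geometric path q_t = q0^(1-t) q1^t is Gaussian with precision
# tau(t) = (1-t)/s0^2 + t/s1^2, so p_t can be sampled directly here.
s0, s1 = 1.0, 3.0

def potential(x):
    # d/dt log q_t(x) = log q1(x) - log q0(x)
    return -0.5 * x**2 * (1.0 / s1**2 - 1.0 / s0**2)

ts = np.linspace(0.0, 1.0, 21)             # grid along the path t in [0, 1]
means = []
for t in ts:
    tau = (1.0 - t) / s0**2 + t / s1**2    # precision of p_t
    x = rng.normal(0.0, 1.0 / np.sqrt(tau), size=20000)
    means.append(potential(x).mean())      # Monte Carlo estimate of E_{p_t}[U]

# Trapezoidal rule over t gives the path-sampling estimate of log(Z1/Z0).
m = np.asarray(means)
log_bf_hat = float(np.sum((m[:-1] + m[1:]) / 2.0 * np.diff(ts)))

# Closed-form truth for this toy pair: log(Z1/Z0) = 0.5 * log(s1^2 / s0^2).
log_bf_true = 0.5 * np.log(s1**2 / s0**2)
print(log_bf_hat, log_bf_true)
```

In realistic factor models the draws from each p_t come from MCMC rather than exact sampling, which is precisely where the paper shows the computation can fail and where its diagnostics and small-change strategy apply.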

Article information

Source
Statist. Sci. Volume 28, Number 1 (2013), 95-115.

Dates
First available in Project Euclid: 29 January 2013

Permanent link to this document
https://projecteuclid.org/euclid.ss/1359468410

Digital Object Identifier
doi:10.1214/12-STS403

Mathematical Reviews number (MathSciNet)
MR3075340

Zentralblatt MATH identifier
1332.62089

Keywords
Bayes model selection; covariance models; path sampling; Laplace approximation

Citation

Dutta, Ritabrata; Ghosh, Jayanta K. Bayes Model Selection with Path Sampling: Factor Models and Other Examples. Statist. Sci. 28 (2013), no. 1, 95--115. doi:10.1214/12-STS403. https://projecteuclid.org/euclid.ss/1359468410
