Abstract
Power-expected-posterior (PEP) methodology, which borrows ideas from the literature on power priors, expected-posterior priors and unit information priors, provides a systematic way to construct objective priors. The basic idea is to use imaginary training samples to update a (possibly improper) prior into a proper but minimally informative one. In this work, we develop a novel definition of PEP priors for logistic regression models that relies on a Laplace expansion of the likelihood of the imaginary training sample. This approach has various advantages over previous proposals for non-informative priors in logistic regression, and can be easily extended to other generalized linear models. We study theoretical properties of the prior and provide a number of empirical studies that demonstrate superior performance both in terms of model selection and parameter estimation, especially for heavy-tailed versions of the prior.
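To make the central idea concrete, the sketch below illustrates the kind of Laplace (Gaussian) expansion the abstract refers to: approximating the logistic-regression likelihood of an imaginary training sample, raised to a power 1/delta as in power-prior constructions, by a Gaussian centered at its mode. This is only a minimal illustration under assumed inputs (the names X_star, y_star, delta, and prior_prec are illustrative, not the authors' notation), not the paper's actual construction of the Laplace PEP prior.

```python
# Minimal sketch: Laplace approximation to the powered logistic likelihood
# of an imaginary training sample. Not the authors' implementation; all
# argument names are illustrative assumptions.
import numpy as np

def laplace_approx_power_likelihood(X_star, y_star, delta=1.0,
                                     prior_prec=1e-6, n_iter=50, tol=1e-8):
    """Return (beta_hat, Sigma) so that the powered likelihood
    L(beta; y_star)^(1/delta) is approximated by N(beta | beta_hat, Sigma),
    with Sigma = delta * (curvature at the mode)^{-1}."""
    n, p = X_star.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X_star @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))           # logistic mean
        W = mu * (1.0 - mu)                       # IRLS weights
        grad = X_star.T @ (y_star - mu)           # score of log-likelihood
        hess = X_star.T @ (W[:, None] * X_star)   # observed information
        # Small ridge (prior_prec) keeps the Newton step well defined when
        # the imaginary data are separable; it mimics a weak baseline prior.
        step = np.linalg.solve(hess + prior_prec * np.eye(p), grad)
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    # Recompute curvature at the mode; powering the likelihood by 1/delta
    # scales the curvature by 1/delta, i.e. the covariance by delta.
    mu = 1.0 / (1.0 + np.exp(-(X_star @ beta)))
    W = mu * (1.0 - mu)
    hess = X_star.T @ (W[:, None] * X_star) + prior_prec * np.eye(p)
    Sigma = delta * np.linalg.inv(hess)
    return beta, Sigma
```

Once the imaginary-sample likelihood is replaced by such a Gaussian term, combining it with a baseline prior and averaging over imaginary responses becomes analytically tractable, which appears to be the practical payoff of the Laplace-based definition highlighted in the abstract.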
Funding Statement
A. Rodríguez’s research was supported by NSF grant 2023495 and NSF grant 2114727.
Acknowledgments
We would like to thank Dimitris Fouskakis and Ioannis Ntzoufras for sharing their code, and Merlise Clyde for answering our queries related to the BAS package.
Citation
Anupreet Porwal, Abel Rodríguez. "Laplace Power-Expected-Posterior Priors for Logistic Regression." Bayesian Anal. 19(4): 1163–1186, December 2024. https://doi.org/10.1214/23-BA1389