Abstract
Consider a Bayesian situation in which we observe $Y\sim p_{\theta}$, where $\theta\in\Theta$, and we have a family $\{\nu_{h},h\in\mathcal{H}\}$ of potential prior distributions on $\Theta$. Let $g$ be a real-valued function of $\theta$, and let $I_{g}(h)$ be the posterior expectation of $g(\theta)$ when the prior is $\nu_{h}$. We are interested in two problems: (i) selecting a particular value of $h$, and (ii) estimating the family of posterior expectations $\{I_{g}(h),h\in\mathcal{H}\}$. Let $m_{y}(h)$ be the marginal likelihood of the hyperparameter $h$: $m_{y}(h)=\int p_{\theta}(y)\nu_{h}(d\theta)$. The empirical Bayes estimate of $h$ is, by definition, the value of $h$ that maximizes $m_{y}(h)$. It turns out that it is typically possible to use Markov chain Monte Carlo to form point estimates for $m_{y}(h)$ and $I_{g}(h)$ for each individual $h$ in a continuum, and also confidence intervals for $m_{y}(h)$ and $I_{g}(h)$ that are valid pointwise. However, we are interested in forming estimates, with confidence statements, of the entire families of integrals $\{m_{y}(h),h\in\mathcal{H}\}$ and $\{I_{g}(h),h\in\mathcal{H}\}$: we need estimates of the first family in order to carry out empirical Bayes inference, and we need estimates of the second family in order to do Bayesian sensitivity analysis. We establish strong consistency and functional central limit theorems for estimates of these families by using tools from empirical process theory. We give two applications, one to latent Dirichlet allocation, which is used in topic modeling, and the other to a model for Bayesian variable selection in linear regression.
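To make the idea concrete, here is a minimal sketch of how a single set of posterior draws at one reference hyperparameter $h_1$ can estimate the entire families $\{m_y(h)\}$ and $\{I_g(h)\}$ at once, via the identity $m_y(h)/m_y(h_1) = E[\nu_h(\theta)/\nu_{h_1}(\theta)\mid y, h_1]$ and self-normalized importance reweighting. The toy conjugate normal model below is an illustrative assumption, not one of the paper's applications, and this sketch omits the paper's confidence statements and empirical-process theory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate setup (an illustrative assumption, not one of the paper's
# applications): prior nu_h = N(h, 1), likelihood y | theta ~ N(theta, 1),
# so m_y(h) = N(y; h, 2) and the empirical Bayes estimate of h is exactly y.
y = 1.7

# Draws from the posterior at a fixed reference hyperparameter h1.
# Here the posterior is N((y + h1)/2, 1/2) in closed form; in general these
# draws would come from a single MCMC chain run at h1.
h1 = 0.0
theta = rng.normal((y + h1) / 2.0, np.sqrt(0.5), size=100_000)

def prior_pdf(t, h):
    """Density of the prior nu_h = N(h, 1) at t."""
    return np.exp(-0.5 * (t - h) ** 2) / np.sqrt(2.0 * np.pi)

# One set of posterior draws estimates the whole families over a grid of h:
#   B(h)   = m_y(h) / m_y(h1), estimated by the posterior average of the
#            prior ratio nu_h(theta) / nu_{h1}(theta);
#   I_g(h) = posterior expectation of g(theta) under prior nu_h, estimated
#            by self-normalized importance reweighting (g(theta) = theta).
hs = np.linspace(-2.0, 4.0, 301)
B, Ig = [], []
for h in hs:
    w = prior_pdf(theta, h) / prior_pdf(theta, h1)
    B.append(w.mean())
    Ig.append(np.sum(w * theta) / np.sum(w))
B, Ig = np.array(B), np.array(Ig)

# Empirical Bayes estimate: the maximizer of the (estimated) marginal
# likelihood; the curve Ig over hs is the sensitivity analysis.
h_eb = hs[np.argmax(B)]
print(round(h_eb, 2))  # close to y = 1.7 in this conjugate example
```

In practice the reweighting becomes unstable for $h$ far from $h_1$, which is one reason the uniform (rather than pointwise) error control studied in the paper matters.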
Citation
Hani Doss, Yeonhee Park. "An MCMC approach to empirical Bayes inference and Bayesian sensitivity analysis via empirical processes." Ann. Statist. 46(4), 1630–1663, August 2018. https://doi.org/10.1214/17-AOS1597