A $k$-sided die is thrown $n$ times in order to estimate the probabilities $\theta_1, \ldots, \theta_k$ of landing on the various sides. The MLE of $\theta$ is the vector of empirical proportions $p = (p_1, \ldots, p_k)$. Consider a set of Bayesians who put uniformly positive prior mass on all reasonable subsets of the parameter space. Their posterior distributions will be uniformly concentrated near $p$. Sharp bounds, expressed in terms of entropy, are given. These bounds apply to all sample sequences: there are no exceptional null sets.
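The phenomenon described above can be sketched numerically. The snippet below is an illustration, not the paper's construction: it picks one particular Bayesian, with a uniform Dirichlet$(1, \ldots, 1)$ prior (a choice assumed here for convenience, since it gives a closed-form Dirichlet posterior), and a hypothetical 6-sided die with made-up probabilities `theta`. It then checks that the average total-variation distance between posterior draws and the MLE $p$ shrinks as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true die probabilities (illustrative values, not from the paper).
theta = np.array([0.1, 0.1, 0.2, 0.2, 0.2, 0.2])


def mean_posterior_tv(n, n_draws=5000):
    """Average total-variation distance between posterior draws and the MLE."""
    counts = rng.multinomial(n, theta)
    p = counts / n  # MLE: vector of empirical proportions

    # Under a uniform Dirichlet(1, ..., 1) prior, the posterior given the
    # counts is Dirichlet(counts + 1) -- one specific Bayesian from the class.
    post = rng.dirichlet(counts + 1, size=n_draws)

    # Total-variation distance of each posterior draw from p, then averaged.
    return 0.5 * np.abs(post - p).sum(axis=1).mean()


small_n, large_n = mean_posterior_tv(100), mean_posterior_tv(10_000)
print(small_n, large_n)  # the second value is much smaller: concentration near p
```

The point of the sketch is only the qualitative behavior: the posterior mass piles up near the empirical proportions at every sample size, with no dependence on which sequence of throws was observed.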
"On the Uniform Consistency of Bayes Estimates for Multinomial Probabilities." Ann. Statist. 18 (3) 1317 - 1327, September, 1990. https://doi.org/10.1214/aos/1176347751