We have a population composed of two subpopulations whose probabilistic properties are described by known univariate distribution functions, $G(x)$ and $H(x)$, respectively. The probability of observing an individual from the first subpopulation is $\theta$, and from the second is $1 - \theta$. We assume $\theta$ is a random variable with a prior distribution on $(0, 1)$ and find the Bayes rule for classifying $n$ observations as from $G$ or from $H$ when the loss function equals the number of misclassifications. The main results of the paper give the asymptotic properties of the Bayes rule and of several proposed approximations.
"Bayesian Classification: Asymptotic Results." Ann. Statist. 2 (4) 763 - 774, July, 1974. https://doi.org/10.1214/aos/1176342763