By introducing suitable loss random variables for detection, we obtain tests that are optimal with respect to the stopping time (alarm time) for Bayesian change-point detection, not only for a general prior distribution of the change-point but also when the observations form a Markov process. Moreover, when the number of possible change-points is finite, the optimal (minimal) average detection delay is proved to be equal to $1$ for any (possibly large) average run length to false alarm.
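To make the setting concrete, the classical Bayesian change-point procedure in this line of work is the Shiryaev rule: track the posterior probability that a change has already occurred and raise an alarm when it crosses a threshold. The sketch below is a minimal illustration of that standard rule under assumptions not taken from the abstract (a geometric prior for the change-point, independent Gaussian observations with a known mean shift, and arbitrarily chosen parameter values); it is not the paper's general construction for Markov observations or general priors.

```python
import math
import random

def shiryaev_alarm(xs, p, mu0, mu1, sigma, threshold):
    """Shiryaev rule: alarm at the first n with
    P(change-point <= n | x_1..x_n) >= threshold.

    Assumes a geometric(p) prior on the change-point and i.i.d.
    Gaussian observations whose mean shifts from mu0 to mu1.
    Returns (alarm_time, posterior_at_alarm), or (None, posterior)
    if the threshold is never crossed.
    """
    def pdf(x, mu):
        # Gaussian density with standard deviation sigma
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    pi = 0.0  # posterior probability that the change has already occurred
    for n, x in enumerate(xs, start=1):
        prior = pi + (1.0 - pi) * p        # prob. the change occurred by time n
        num = prior * pdf(x, mu1)          # weight on the post-change density
        den = num + (1.0 - prior) * pdf(x, mu0)
        pi = num / den                     # Bayes update of the posterior
        if pi >= threshold:
            return n, pi
    return None, pi

# Simulated data: mean 0 before the change at time 50, mean 2 after.
rng = random.Random(0)
change = 50
xs = [rng.gauss(0.0, 1.0) if t < change else rng.gauss(2.0, 1.0) for t in range(200)]
alarm, post = shiryaev_alarm(xs, p=0.01, mu0=0.0, mu1=2.0, sigma=1.0, threshold=0.99)
```

With a mean shift this large, the posterior typically crosses the threshold within a handful of observations after the change, which is the kind of short detection delay the abstract's optimality result concerns.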
"On the optimality of Bayesian change-point detection." Ann. Statist. 45 (4) 1375 - 1402, August 2017. https://doi.org/10.1214/16-AOS1479