Detection of an image boundary when the pixel intensities are measured with noise is an important problem in image segmentation. From a statistical point of view, a challenge is that likelihood-based methods require modeling the pixel intensities inside and outside the image boundary, even though these distributions are typically not of interest. Since misspecification of the pixel intensity distributions can negatively affect inference on the image boundary, it would be desirable to avoid this modeling step altogether. To this end, we develop a robust Gibbsian approach that constructs a posterior distribution for the image boundary directly, without modeling the pixel intensities. We prove that the Gibbs posterior concentrates asymptotically at the minimax optimal rate, adaptive to the boundary smoothness. Monte Carlo computation of the Gibbs posterior is straightforward, and simulation results show that the corresponding inference is more accurate than that based on existing Bayesian methodology.
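To make the Gibbs-posterior idea concrete, here is a minimal, hypothetical sketch in a one-dimensional analogue of the boundary problem: pixel intensities jump at an unknown location, and a Gibbs posterior proportional to exp(-omega * n * R_n(theta)) times a prior is sampled by random-walk Metropolis. The squared-error risk, the learning rate omega, and the toy data are illustrative assumptions, not the paper's actual loss, tuning, or boundary model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D analogue of a noisy image: mean intensity jumps at the true
# "boundary" theta_true = 0.6; intensities observed with Gaussian noise.
# (Illustrative data; the paper treats 2D image boundaries.)
n = 500
x = np.sort(rng.uniform(0.0, 1.0, n))
theta_true = 0.6
y = (x < theta_true).astype(float) + rng.normal(0.0, 0.5, n)

def empirical_risk(theta):
    # Within-segment squared error on either side of the candidate
    # boundary theta. This is an illustrative loss choice, not the
    # loss function used in the paper.
    inside = x < theta
    r = 0.0
    for mask in (inside, ~inside):
        if mask.any():
            r += np.sum((y[mask] - y[mask].mean()) ** 2)
    return r / n

def gibbs_posterior_draws(omega=5.0, n_iter=5000, step=0.05):
    # Random-walk Metropolis targeting the Gibbs posterior
    #   pi_n(theta) ∝ exp(-omega * n * R_n(theta)) * pi(theta),
    # with a flat prior pi on (0, 1). No likelihood for the pixel
    # intensities is ever specified -- only the empirical risk.
    theta = 0.5
    log_post = -omega * n * empirical_risk(theta)
    draws = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        if 0.0 < prop < 1.0:
            lp = -omega * n * empirical_risk(prop)
            if np.log(rng.uniform()) < lp - log_post:
                theta, log_post = prop, lp
        draws.append(theta)
    return np.array(draws)

draws = gibbs_posterior_draws()
print(round(draws[1000:].mean(), 2))  # posterior mean, near theta_true
```

After discarding burn-in, the draws concentrate near the risk minimizer, which here coincides with the true jump location; the learning rate omega controls how sharply the Gibbs posterior concentrates and in practice must be calibrated.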
"Robust and rate-optimal Gibbs posterior inference on the boundary of a noisy image." Ann. Statist. 48 (3) 1498 - 1513, June 2020. https://doi.org/10.1214/19-AOS1856