Abstract
We propose a robust mean change-point estimation algorithm for linear regression under the assumption that the errors follow a Laplace distribution. By representing the Laplace distribution as a scale mixture of normal distributions, we develop an expectation-maximization (EM) algorithm to estimate the position of the mean change-point. Simulation studies show that our method is robust to the error distribution and effective at estimating the change-point position. Finally, we apply the method to the classical Holbert data and detect a change-point.
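To illustrate the core idea, here is a minimal sketch of robust change-point estimation under Laplace errors, simplified to a pure mean-shift model (no regression covariates, and a profile-likelihood search rather than the paper's EM formulation). It relies on the fact that the Laplace maximum-likelihood estimate of a segment mean is the segment median, so maximizing the Laplace likelihood over the split point is equivalent to minimizing the total absolute deviation of the two segments. The function name and data are illustrative, not from the paper.

```python
import numpy as np

def laplace_changepoint(y):
    """Estimate a single mean change-point assuming Laplace errors.

    Under Laplace errors the MLE of each segment's mean is its median,
    so the best split k minimizes the summed absolute deviations of the
    two segments (equivalently, maximizes the Laplace log-likelihood).
    """
    n = len(y)
    best_k, best_cost = None, np.inf
    for k in range(1, n):  # candidate split: y[:k] vs y[k:]
        left, right = y[:k], y[k:]
        cost = (np.abs(left - np.median(left)).sum()
                + np.abs(right - np.median(right)).sum())
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Synthetic example: mean shifts from 0 to 4 after observation 60.
rng = np.random.default_rng(0)
y = np.concatenate([rng.laplace(0.0, 1.0, 60),
                    rng.laplace(4.0, 1.0, 40)])
print(laplace_changepoint(y))
```

The L1 objective is what makes the estimate robust: unlike a least-squares split criterion, a few heavy-tailed outliers barely move the segment medians or the estimated change-point.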
Citation
Fengkai Yang. "Robust Mean Change-Point Detecting through Laplace Linear Regression Using EM Algorithm." J. Appl. Math., vol. 2014, pp. 1-9, 2014. https://doi.org/10.1155/2014/856350