Abstract
Since the time of Gauss, it has been generally accepted that $\ell_2$-methods of combining observations by minimizing sums of squared errors have significant computational advantages over earlier $\ell_1$-methods based on minimization of absolute errors advocated by Boscovich, Laplace and others. However, $\ell_1$-methods are known to have significant robustness advantages over $\ell_2$-methods in many applications, and related quantile regression methods provide a useful, complementary approach to classical least-squares estimation of statistical models. Combining recent advances in interior point methods for solving linear programs with a new statistical preprocessing approach for $\ell_1$-type problems, we obtain a 10- to 100-fold improvement in computational speeds over current (simplex-based) $\ell_1$-algorithms in large problems, demonstrating that $\ell_1$-methods can be made competitive with $\ell_2$-methods in terms of computational speed throughout the entire range of problem sizes. Formal complexity results suggest that $\ell_1$-regression can be made faster than least-squares regression for $n$ sufficiently large and $p$ modest.
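As a minimal sketch of the connection the abstract draws between $\ell_1$-estimation and linear programming: the median-regression problem $\min_b \sum_i |y_i - x_i^\top b|$ can be written as an LP by splitting each residual into positive and negative parts, and handed to an interior point solver. The code below (illustrative only, not the authors' algorithm or preprocessing scheme; the helper `l1_fit` and all parameters are assumptions of this sketch) uses SciPy's HiGHS interior point method and contrasts the fit with ordinary least squares on heavy-tailed data.

```python
import numpy as np
from scipy.optimize import linprog

def l1_fit(X, y):
    """Median (L1) regression via the standard LP formulation:
        min sum(u + v)  s.t.  X b + u - v = y,  u >= 0, v >= 0,
    where u and v are the positive and negative parts of the residuals.
    Solved here with an interior point method (HiGHS IPM)."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])   # cost: sum of |residuals|
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])        # X b + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n) # b free; u, v nonnegative
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs-ipm")
    return res.x[:p]

# Compare L1 and L2 fits under heavy-tailed (Cauchy) noise.
rng = np.random.default_rng(0)
n, p = 500, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.standard_cauchy(n)

b_l1 = l1_fit(X, y)                           # robust L1 estimate
b_l2, *_ = np.linalg.lstsq(X, y, rcond=None)  # classical least squares
print("L1 :", b_l1)
print("L2 :", b_l2)
```

On such data the $\ell_1$ estimate typically stays near the true coefficients while least squares is pulled around by outliers, which is the robustness advantage the abstract refers to.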
Citation
Stephen Portnoy and Roger Koenker. "The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators." Statist. Sci. 12(4): 279–300, November 1997. https://doi.org/10.1214/ss/1030037960