## The Annals of Statistics

### Least quantile regression via modern optimization

#### Abstract

We address the Least Quantile of Squares (LQS) (and in particular the Least Median of Squares) regression problem using modern optimization methods. We propose a Mixed Integer Optimization (MIO) formulation of the LQS problem that allows us to find a provably globally optimal solution. Our MIO framework has the appealing characteristic that if we terminate the algorithm early, we obtain a solution with a guarantee on its suboptimality. We also propose continuous optimization methods based on first-order subdifferential methods, sequential linear optimization and hybrid combinations of them to obtain near-optimal solutions to the LQS problem. The MIO algorithm is found to benefit significantly from high-quality solutions delivered by our continuous optimization based methods. We further show that the MIO approach leads to (a) an optimal solution for any dataset, where the data points $(y_{i},\mathbf{x}_{i})$ are not necessarily in general position, (b) a simple proof of the breakdown point of the LQS objective value that holds for any dataset and (c) an extension to situations where there are polyhedral constraints on the regression coefficient vector. We report computational results with both synthetic and real-world datasets showing that the MIO algorithm with warm starts from the continuous optimization methods solves small ($n=100$) and medium ($n=500$) size problems to provable optimality in under two hours, and outperforms all publicly available methods for large-scale ($n=10,000$) LQS problems.
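For concreteness, the LQS objective described in the abstract is the $q$-th smallest absolute residual, $|y - \mathbf{x}^{T}\boldsymbol{\beta}|_{(q)}$ (the median for Least Median of Squares). The sketch below, assuming NumPy, evaluates this objective and implements a simple random elemental-subset heuristic in the spirit of PROGRESS-type methods; it is an illustrative baseline only, not the authors' MIO formulation or their continuous optimization algorithms.

```python
import numpy as np

def lqs_objective(beta, X, y, q):
    """LQS objective: the q-th smallest absolute residual (q is 1-indexed)."""
    r = np.abs(y - X @ beta)
    return np.sort(r)[q - 1]

def lqs_random_search(X, y, q, n_trials=500, seed=0):
    """Elemental-subset heuristic (hypothetical baseline, not the paper's MIO):
    fit least squares on random subsets of p observations and keep the fit
    with the smallest q-th absolute residual over the full dataset."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_val = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(n, size=p, replace=False)
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        val = lqs_objective(beta, X, y, q)
        if val < best_val:
            best_beta, best_val = beta, val
    return best_beta, best_val
```

Because the objective depends only on the $q$-th order statistic, a fit that passes exactly through $q$ clean points is unaffected by arbitrarily gross outliers among the remaining $n-q$ observations, which is the robustness property the breakdown-point result in the paper formalizes.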

#### Article information

Source
Ann. Statist. Volume 42, Number 6 (2014), 2494-2525.

Dates
First available in Project Euclid: 12 November 2014

Permanent link to this document
https://projecteuclid.org/euclid.aos/1415801781

Digital Object Identifier
doi:10.1214/14-AOS1223

Mathematical Reviews number (MathSciNet)
MR3277669

Zentralblatt MATH identifier
1302.62154

#### Citation

Bertsimas, Dimitris; Mazumder, Rahul. Least quantile regression via modern optimization. Ann. Statist. 42 (2014), no. 6, 2494--2525. doi:10.1214/14-AOS1223. https://projecteuclid.org/euclid.aos/1415801781

#### References

• Agullo, J. (1997). Exact algorithms for computing the least median of squares estimate in multiple linear regression. In $L_1$-Statistical Procedures and Related Topics. Lecture Notes Monogr. Ser. 31 133–146. IMS, Hayward, CA.
• Barreto, H. and Maharry, D. (2006). Least median of squares and regression through the origin. Comput. Statist. Data Anal. 50 1391–1397.
• Bernholt, T. (2005a). Robust estimators are hard to compute. Technical Report 52/2005, Univ. Dortmund.
• Bernholt, T. (2005b). Computing the least median of squares estimator in time $O(n^d)$. In Computational Science and Its Applications, ICCSA 2005. Lecture Notes in Computer Science 3480 697–706. Springer, Berlin.
• Bertsimas, D. and Weismantel, R. (2005). Optimization over Integers. Dynamic Ideas, Belmont, MA.
• Bickel, P. J. (1975). One-step Huber estimates in the linear model. J. Amer. Statist. Assoc. 70 428–434.
• Boyd, S. and Vandenberghe, L. (2004). Convex Optimization. Cambridge Univ. Press, Cambridge.
• Chakraborty, B. and Chaudhuri, P. (2008). On an optimization problem in robust statistics. J. Comput. Graph. Statist. 17 683–702.
• Clarke, F. H. (1990). Optimization and Nonsmooth Analysis, 2nd ed. SIAM, Philadelphia, PA.
• Donoho, D. and Huber, P. J. (1983). The notion of breakdown point. In A Festschrift for Erich L. Lehmann 157–184. Wadsworth, Belmont, CA.
• Erickson, J., Har-Peled, S. and Mount, D. M. (2006). On the least median square problem. Discrete Comput. Geom. 36 593–607.
• Giloni, A. and Padberg, M. (2002). Least trimmed squares regression, least median squares regression, and mathematical programming. Math. Comput. Modelling 35 1043–1060.
• Gurobi Optimization, Inc. (2013). Gurobi Optimizer Reference Manual.
• Hampel, F. R. (1971). A general qualitative definition of robustness. Ann. Math. Statist. 42 1887–1896.
• Hampel, F. R. (1975). Beyond location parameters: Robust concepts and methods. Bull. Int. Stat. Inst. 46 375–382.
• Hawkins, D. M. (1993). The feasible set algorithm for least median of squares regression. Comput. Statist. Data Anal. 16 81–101.
• Hawkins, D. M., Bradu, D. and Kass, G. V. (1984). Location of several outliers in multiple-regression data using elemental sets. Technometrics 26 197–208.
• Huber, P. J. (1973). Robust regression: Asymptotics, conjectures and Monte Carlo. Ann. Statist. 1 799–821.
• Huber, P. J. (2011). Robust Statistics. Springer, Berlin.
• Hubert, M., Rousseeuw, P. J. and Van Aelst, S. (2008). High-breakdown robust multivariate methods. Statist. Sci. 23 92–119.
• Meer, P., Mintz, D., Rosenfeld, A. and Kim, D. Y. (1991). Robust regression methods for computer vision: A review. Int. J. Comput. Vis. 6 59–70.
• Mount, D. M., Netanyahu, N. S., Piatko, C. D., Silverman, R. and Wu, A. Y. (2000). Quantile approximation for robust statistical estimation and $k$-enclosing problems. Internat. J. Comput. Geom. Appl. 10 593–608.
• Mount, D. M., Netanyahu, N. S., Romanik, K., Silverman, R. and Wu, A. Y. (2007). A practical approximation algorithm for the LMS line estimator. Comput. Statist. Data Anal. 51 2461–2486.
• Nesterov, Y. (2004). Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic, Boston, MA.
• Nunkesser, R. and Morell, O. (2010). An evolutionary algorithm for robust regression. Comput. Statist. Data Anal. 54 3242–3248.
• Olson, C. F. (1997). An approximation algorithm for least median of squares regression. Inform. Process. Lett. 63 237–241.
• Rockafellar, R. T. (1996). Convex Analysis. Princeton Univ. Press, Princeton, NJ.
• Rousseeuw, P. J. (1984). Least median of squares regression. J. Amer. Statist. Assoc. 79 871–880.
• Rousseeuw, P. J. and Driessen, K. V. (2006). Computing LTS regression for large data sets. Data Min. Knowl. Discov. 12 29–45.
• Rousseeuw, P. and Hubert, M. (1997). Recent developments in PROGRESS. In $L_1$-Statistical Procedures and Related Topics 201–214. IMS, Hayward, CA.
• Rousseeuw, P. J. and Leroy, A. M. (1987). Robust Regression and Outlier Detection. Wiley, New York.
• Rousseeuw, P. J., Debruyne, M., Engelen, S. and Hubert, M. (2006). Robustness and outlier detection in chemometrics. Crit. Rev. Anal. Chem. 36 221–242.
• Rousseeuw, P., Croux, C., Todorov, V., Ruckstuhl, A., Salibian-Barrera, M., Verbeke, T., Koller, M. and Maechler, M. (2013). robustbase: Basic Robust Statistics. R package version 0.9-10.
• Shor, N. Z. (1985). Minimization Methods for Nondifferentiable Functions. Springer, Berlin. Translated from the Russian by K. C. Kiwiel and A. Ruszczyński.
• Siegel, A. F. (1982). Robust regression using repeated medians. Biometrika 69 242–244.
• Souvaine, D. L. and Steele, J. M. (1987). Time- and space-efficient algorithms for least median of squares regression. J. Amer. Statist. Assoc. 82 794–801.
• Steele, J. M. and Steiger, W. L. (1986). Algorithms and complexity for least median of squares regression. Discrete Appl. Math. 14 93–100.
• Stewart, C. V. (1999). Robust parameter estimation in computer vision. SIAM Rev. 41 513–537.
• Stromberg, A. J. (1993). Computing the exact least median of squares estimate and stability diagnostics in multiple linear regression. SIAM J. Sci. Comput. 14 1289–1299.
• Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. J. Roy. Statist. Soc. Ser. B 58 267–288.
• Todorov, V. and Filzmoser, P. (2009). An object-oriented framework for robust multivariate analysis. J. Stat. Softw. 32 1–47.