Statistics Surveys

Fundamentals of cone regression

Mariella Dimiccoli


Cone regression is a particular case of quadratic programming that minimizes a weighted sum of squared residuals under a set of linear inequality constraints. Several important statistical problems, such as isotonic regression, concave regression, and ANOVA under partial orderings, can be viewed as particular instances of the cone regression problem. Given its relevance in statistics, this paper addresses the fundamentals of cone regression from both a theoretical and a practical point of view. Several formulations of the cone regression problem are considered and, focusing on the particular case of concave regression as an example, several algorithms are analyzed and compared both qualitatively and quantitatively through numerical simulations. Several improvements to enhance numerical stability and to bound the computational cost are proposed. For each analyzed algorithm, the pseudo-code and its corresponding Matlab code are provided. The results of this study demonstrate that the choice of the optimization approach strongly impacts numerical performance. It is also shown that no method is currently available to efficiently solve cone regression problems of large dimension (more than several thousand points). We suggest further research to fill this gap by exploiting and adapting classical multi-scale strategies to compute an approximate solution.
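As a minimal illustration of the class of problems described in the abstract (not code from the paper), consider isotonic regression: the special case of cone regression with the monotonicity constraints x_1 ≤ x_2 ≤ … ≤ x_n. This instance admits an exact O(n) solution via the classical pool-adjacent-violators algorithm (PAVA). The sketch below assumes unit weights by default; the function name `pava` and its interface are our own choices:

```python
def pava(y, w=None):
    """Minimize sum_i w_i * (x_i - y_i)^2 subject to x_1 <= x_2 <= ... <= x_n
    using the pool-adjacent-violators algorithm."""
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    # Each block stores [weighted mean, total weight, number of points].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([float(yi), float(wi), 1])
        # Merge adjacent blocks while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, c1 + c2])
    # Expand the blocks back into a full solution vector.
    x = []
    for m, _, c in blocks:
        x.extend([m] * c)
    return x
```

For example, `pava([3, 1, 2, 4])` pools the first three points into their mean and returns `[2.0, 2.0, 2.0, 4.0]`. General cone regression replaces the monotone cone with an arbitrary polyhedral cone (e.g., the concavity constraints studied in the paper), for which no comparably fast specialized algorithm exists.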

Article information

Statist. Surv., Volume 10 (2016), 53-99.

Received: March 2015
First available in Project Euclid: 19 May 2016


Subject classification: Primary 62; Secondary 90

Keywords: cone regression; linear complementarity problems; proximal operators


Dimiccoli, Mariella. Fundamentals of cone regression. Statist. Surv. 10 (2016), 53–99. doi:10.1214/16-SS114.



