Convergence rates of oblique regression trees for flexible function libraries
Matias D. Cattaneo, Rajita Chandak, Jason M. Klusowski
Ann. Statist. 52(2): 466-490 (April 2024). DOI: 10.1214/24-AOS2354


Abstract

We develop a theoretical framework for the analysis of oblique decision trees, where the splits at each decision node occur at linear combinations of the covariates (as opposed to conventional tree constructions that force axis-aligned splits involving only a single covariate). While this methodology has garnered significant attention from the computer science and optimization communities since the mid-1980s, the advantages that oblique trees offer over their axis-aligned counterparts remain only empirically justified, and explanations for their success are largely based on heuristics. Filling this long-standing gap between theory and practice, we show that oblique regression trees (constructed by recursively minimizing squared error) satisfy a type of oracle inequality and can adapt to a rich library of regression models consisting of linear combinations of ridge functions and their limit points. This provides a quantitative baseline for comparing and contrasting decision trees with other, less interpretable methods, such as projection pursuit regression and neural networks, which target similar model forms. Contrary to popular belief, one need not always trade off interpretability for accuracy. Specifically, we show that, under suitable conditions, oblique decision trees achieve predictive accuracy similar to that of neural networks for the same library of regression models. To address the combinatorial complexity of finding the optimal splitting hyperplane at each decision node, our proposed theoretical framework accommodates many existing computational tools from the literature. Our results rely on (arguably surprising) connections between recursive adaptive partitioning and sequential greedy approximation algorithms for convex optimization problems (e.g., orthogonal greedy algorithms), which may be of independent theoretical interest. Using our theory and methods, we also study oblique random forests.
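To make the contrast concrete, the following is a minimal illustrative sketch (not the paper's algorithm): a two-leaf, squared-error split along a single projected coordinate, comparing the best axis-aligned split against an oblique split. The projection direction `w` is fixed by hand for illustration rather than optimized, and the data-generating model is a hypothetical single ridge function chosen so the oblique direction is favorable:

```python
import numpy as np

def split_sse(z, y, t):
    """Sum of squared errors of a two-leaf constant fit splitting at z <= t."""
    sse = 0.0
    for part in (y[z <= t], y[z > t]):
        if part.size:
            sse += np.sum((part - part.mean()) ** 2)
    return sse

def best_threshold(z, y):
    """Best split threshold along the (possibly projected) coordinate z."""
    ts = np.unique(z)[:-1]  # candidate thresholds; last value leaves a leaf empty
    sses = [split_sse(z, y, t) for t in ts]
    i = int(np.argmin(sses))
    return ts[i], sses[i]

rng = np.random.default_rng(0)
n, d = 200, 2
X = rng.normal(size=(n, d))
# hypothetical response: a step in the oblique direction x1 + x2 (one ridge function)
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0) + 0.1 * rng.normal(size=n)

# best axis-aligned split: search each coordinate separately
axis_sse = min(best_threshold(X[:, j], y)[1] for j in range(d))

# oblique split along the (here, known) direction w = (1, 1) / sqrt(2)
w = np.ones(d) / np.sqrt(d)
_, oblique_sse = best_threshold(X @ w, y)

print(axis_sse, oblique_sse)  # the oblique split fits this model far better
```

In practice the direction `w` is unknown and must be searched over, which is the combinatorial difficulty the abstract refers to; the sketch only shows why a well-chosen linear combination can capture in one split what axis-aligned splits approximate only with many.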

Funding Statement

MDC was supported in part by the National Science Foundation through SES-2019432 and SES-2241575.
JMK was supported in part by the National Science Foundation through CAREER DMS-2239448, DMS-2054808 and HDR TRIPODS CCF-1934924.


Acknowledgments

The authors would like to thank Florentina Bunea, Sameer Deshpande, Jianqing Fan, Yingying Fan, Jonathan Siegel, Bartolomeo Stellato and William Underwood for insightful discussions. The authors are particularly grateful to two anonymous reviewers whose comments improved the quality of the paper.


Download Citation

Matias D. Cattaneo, Rajita Chandak, Jason M. Klusowski. "Convergence rates of oblique regression trees for flexible function libraries." Ann. Statist. 52 (2) 466 - 490, April 2024. https://doi.org/10.1214/24-AOS2354


Received: 1 October 2022; Revised: 1 September 2023; Published: April 2024
First available in Project Euclid: 9 May 2024

Digital Object Identifier: 10.1214/24-AOS2354

Primary: 62G08
Secondary: 62L12

Keywords: CART, decision trees, neural networks, projection pursuit regression, random forests

Rights: Copyright © 2024 Institute of Mathematical Statistics


