Conditional regression for single-index models
Alessandro Lanteri, Mauro Maggioni, Stefano Vigogna
Bernoulli 28(4): 3051-3078 (November 2022). DOI: 10.3150/22-BEJ1482

Abstract

The single-index model is a statistical model for intrinsic regression where responses are assumed to depend on a single yet unknown linear combination of the predictors, so that the regression function can be expressed as E[Y|X] = f(⟨v,X⟩) for some unknown index vector v and link function f. Conditional methods provide a simple and effective approach to estimate v by averaging moments of X conditioned on Y, but they depend on parameters whose optimal choice is unknown and they do not provide generalization bounds on f. In this paper we propose a new conditional method converging at √n rate under an explicit parameter characterization. Moreover, we prove that polynomial partitioning estimates achieve the one-dimensional minimax rate for regression of Hölder functions when combined with any √n-convergent index estimator. Overall, this yields an estimator for dimension reduction and regression of single-index models that attains statistical optimality in quasilinear time.
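
To make the two-step idea concrete, below is a minimal, self-contained Python sketch. It is not the estimator or the parameter choices analyzed in the paper: the index direction is estimated by a generic slice-based conditional-mean average in the spirit of sliced inverse regression, and the link function is then fitted by a simple piecewise-polynomial (partitioning) regression of Y on the estimated one-dimensional projection. The synthetic data, the tanh link, the number of slices and cells, and the polynomial degree are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-index data: Y = f(<v, X>) + noise, with an illustrative monotone link.
n, d = 2000, 10
v_true = np.zeros(d)
v_true[0] = 1.0
X = rng.standard_normal((n, d))
Y = np.tanh(X @ v_true) + 0.1 * rng.standard_normal(n)

# Step 1: estimate the index vector by averaging X conditioned on slices of Y
# (a generic conditional-moment scheme, not the paper's specific estimator).
n_slices = 20
edges = np.quantile(Y, np.linspace(0, 1, n_slices + 1))
slice_id = np.clip(np.searchsorted(edges, Y, side="right") - 1, 0, n_slices - 1)

X_centered = X - X.mean(axis=0)
slice_means = np.array([X_centered[slice_id == s].mean(axis=0)
                        for s in range(n_slices) if np.any(slice_id == s)])

# The leading right singular vector of the conditional means estimates span{v}
# (up to sign), since each conditional mean concentrates along the index direction.
_, _, Vt = np.linalg.svd(slice_means, full_matrices=False)
v_hat = Vt[0]
if v_hat @ v_true < 0:  # sign is not identifiable; align for comparison only
    v_hat = -v_hat

# Step 2: piecewise-polynomial ("partitioning") regression of Y on <v_hat, X>.
t = X @ v_hat
n_cells, degree = 10, 2
cell_edges = np.quantile(t, np.linspace(0, 1, n_cells + 1))
cell_id = np.clip(np.searchsorted(cell_edges, t, side="right") - 1, 0, n_cells - 1)

coefs = {}
for c in range(n_cells):
    mask = cell_id == c
    if mask.sum() > degree:
        coefs[c] = np.polyfit(t[mask], Y[mask], degree)  # local polynomial fit per cell

def predict(x_new):
    """Project onto the estimated index and evaluate the local polynomial."""
    t_new = float(x_new @ v_hat)
    c = int(np.clip(np.searchsorted(cell_edges, t_new, side="right") - 1, 0, n_cells - 1))
    return np.polyval(coefs[c], t_new) if c in coefs else float(Y.mean())

print("index estimation error:", np.linalg.norm(v_hat - v_true))
print("prediction at X[0]:", predict(X[0]), "vs. Y[0]:", Y[0])
```

The paper's contribution concerns how to choose the conditioning parameters so that the index estimate converges at √n rate, and how the partitioning regression then inherits the one-dimensional minimax rate; the sketch above only illustrates the overall pipeline.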

Funding Statement

This research was partially supported by AFOSR FA9550-17-1-0280, NSF-DMS-1821211, NSF-ATD-1737984.

Acknowledgements

S.V. thanks Timo Klock for the discussion and the useful exchange of views about this and related problems.

Citation

Alessandro Lanteri, Mauro Maggioni, Stefano Vigogna. "Conditional regression for single-index models." Bernoulli 28(4): 3051-3078, November 2022. https://doi.org/10.3150/22-BEJ1482

Information

Received: 1 December 2020; Published: November 2022
First available in Project Euclid: 17 August 2022

zbMATH: 07594088
MathSciNet: MR4474572
Digital Object Identifier: 10.3150/22-BEJ1482

Keywords: dimension reduction, finite-sample bounds, nonparametric regression, single-index model

Journal article • 28 pages
