Open Access
December 1997 Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
Theodoros Nicoleris, Yannis G. Yatracos
Ann. Statist. 25(6): 2493-2511 (December 1997). DOI: 10.1214/aos/1030741082

Abstract

$L_1$-optimal minimum distance estimators are provided for a projection pursuit regression type function with smooth functional components that are either additive or multiplicative, with or without interactions. The obtained rates of convergence of the estimate to the true parameter depend on Kolmogorov's entropy of the assumed model and confirm Stone's heuristic dimensionality reduction principle. Rates of convergence are also obtained for the error in estimating the derivatives of a regression type function.
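For context, the dimensionality reduction principle referred to above can be sketched as follows (this is the standard formulation going back to Stone's work on additive regression, given here as background rather than as the paper's exact statement):

```latex
% Full-dimensional nonparametric regression: for a regression function
% f : [0,1]^d \to \mathbb{R} with smoothness index p, the minimax rate is
%   n^{-p/(2p + d)},
% which deteriorates rapidly as the dimension d grows (curse of dimensionality).
%
% Dimensionality reduction principle (heuristic): if f has additive structure,
%   f(x_1, \dots, x_d) = \sum_{j=1}^{d} f_j(x_j),
% with each component f_j of smoothness p, then f can be estimated at the
% one-dimensional rate
%   n^{-p/(2p + 1)},
% i.e., the effective dimension is that of the components, not of the ambient
% space. Analogous statements hold for multiplicative structure and for models
% with low-order interaction terms, where d is replaced by the maximal
% interaction order.
```

The paper's contribution, as the abstract states, is to confirm this heuristic for minimum distance estimators, with rates governed by Kolmogorov's entropy of the assumed model class.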

Citation


Theodoros Nicoleris, Yannis G. Yatracos. "Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression." Ann. Statist. 25(6): 2493-2511, December 1997. https://doi.org/10.1214/aos/1030741082

Information

Published: December 1997
First available in Project Euclid: 30 August 2002

zbMATH: 0909.62063
MathSciNet: MR1604404
Digital Object Identifier: 10.1214/aos/1030741082

Subjects:
Primary: 62G20, 62J02
Secondary: 62G05, 62G30

Keywords: additive and multiplicative regression, dimensionality reduction principle, Hoeffding's inequality, interactions, Kolmogorov's entropy, model selection, nonparametric regression, optimal rates of convergence, projection pursuit

Rights: Copyright © 1997 Institute of Mathematical Statistics

Vol. 25 • No. 6 • December 1997