The Annals of Statistics
- Ann. Statist.
- Volume 29, Issue 3 (2001), 624-647.
Nonparametric kernel regression subject to monotonicity constraints
We suggest a method for monotonizing general kernel-type estimators, for example local linear estimators and Nadaraya–Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, indeed with the same smoothness as the unconstrained estimate. The method is applicable to a particularly wide range of estimator types, it can be trivially modified to render an estimator strictly monotone, and it can be employed after the smoothing step has been implemented. Therefore, an experimenter may use his or her favorite kernel estimator, and his or her favorite bandwidth selector, to construct the basic nonparametric smoother, and then use our technique to render it monotone in a smooth way. Implementation involves only an off-the-shelf programming routine. The method is based on maximizing fidelity to the conventional empirical approach, subject to monotonicity. We adjust the unconstrained estimator by tilting the empirical distribution so as to make the least possible change, in the sense of a distance measure, subject to imposing the constraint of monotonicity.
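The tilting idea in the abstract can be sketched as follows: replace the uniform weights 1/n in a Nadaraya–Watson estimator with probability weights p, and choose p as close as possible to uniform subject to the weighted estimate being nondecreasing on a grid. This is an illustrative sketch only: the paper works with power-divergence distance measures, whereas here a simple squared Euclidean distance and a generic constrained optimizer stand in; the function names `monotonize` and `nw_estimate` and all tuning values are this sketch's own.

```python
import numpy as np
from scipy.optimize import minimize

def nw_estimate(x, data_x, data_y, p, h):
    """Weighted Nadaraya-Watson estimate at points x, using probability
    weights p in place of the usual uniform weights 1/n."""
    # Gaussian kernel matrix, shape (len(x), n)
    K = np.exp(-0.5 * ((x[:, None] - data_x[None, :]) / h) ** 2)
    W = K * p[None, :]
    return (W @ data_y) / W.sum(axis=1)

def monotonize(data_x, data_y, h, grid):
    """Tilt the empirical weights away from uniform, making the smallest
    change (here: in squared distance, a stand-in for the paper's
    power-divergence family) such that the weighted estimate is
    nondecreasing on `grid`."""
    n = len(data_x)
    p0 = np.full(n, 1.0 / n)

    def distance(p):
        # distance of tilted weights from the uniform weights
        return np.sum((p - p0) ** 2)

    def mono_gaps(p):
        # successive differences of the fitted curve; all must be >= 0
        return np.diff(nw_estimate(grid, data_x, data_y, p, h))

    cons = [
        {"type": "eq", "fun": lambda p: p.sum() - 1.0},  # weights sum to 1
        {"type": "ineq", "fun": mono_gaps},              # monotonicity
    ]
    res = minimize(distance, p0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x
```

Note how the smoothing step (kernel, bandwidth h) is untouched: only the empirical weights are adjusted afterwards, which is what lets the constrained estimate inherit the smoothness of the unconstrained one.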
First available in Project Euclid: 24 December 2001
Keywords: bandwidth; biased bootstrap; Gasser–Müller estimator; isotonic regression; local linear estimator; Nadaraya–Watson estimator; order-restricted inference; power divergence; Priestley–Chao estimator; weighted bootstrap
Hall, Peter; Huang, Li-Shan. Nonparametric kernel regression subject to monotonicity constraints. Ann. Statist. 29 (2001), no. 3, 624--647. doi:10.1214/aos/1009210683. https://projecteuclid.org/euclid.aos/1009210683