Open Access
Nonparametric kernel regression subject to monotonicity constraints
Peter Hall, Li-Shan Huang
Ann. Statist. 29(3): 624-647 (June 2001). DOI: 10.1214/aos/1009210683


We suggest a method for monotonizing general kernel-type estimators, for example local linear estimators and Nadaraya-Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, indeed with the same smoothness as the unconstrained estimate. The method is applicable to a particularly wide range of estimator types, it can be trivially modified to render an estimator strictly monotone, and it can be employed after the smoothing step has been implemented. Therefore, an experimenter may use their favorite kernel estimator, and their favorite bandwidth selector, to construct the basic nonparametric smoother and then use our technique to render it monotone in a smooth way. Implementation involves only an off-the-shelf programming routine. The method is based on maximizing fidelity to the conventional empirical approach, subject to monotonicity. We adjust the unconstrained estimator by tilting the empirical distribution so as to make the least possible change, in the sense of a distance measure, subject to imposing the constraint of monotonicity.
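The tilting idea in the abstract can be sketched numerically. The toy code below is an illustration, not the authors' exact algorithm: it replaces the paper's power-divergence distance with a simple squared distance from the uniform weights 1/n, uses a Gaussian kernel Nadaraya-Watson smoother, and imposes monotonicity only on a finite grid via an off-the-shelf constrained optimizer (`scipy.optimize.minimize` with SLSQP). All data values, the bandwidth `h`, and the grid are made up for the example.

```python
# Sketch of weight-tilting monotonization (assumed simplifications:
# squared distance instead of power divergence, Gaussian kernel,
# monotonicity enforced on a finite grid only).
import numpy as np
from scipy.optimize import minimize


def nw(x_grid, X, Y, p, h):
    """Weighted Nadaraya-Watson estimate at each point of x_grid."""
    K = np.exp(-0.5 * ((x_grid[:, None] - X[None, :]) / h) ** 2)
    W = K * p[None, :]                      # tilted kernel weights
    return (W @ Y) / W.sum(axis=1)


def monotonize(X, Y, h, grid):
    """Tilt empirical weights as little as possible (in squared
    distance from uniform) so the fit is nondecreasing on `grid`."""
    n = len(X)
    p0 = np.full(n, 1.0 / n)                # start from the uniform weights

    def obj(p):                             # distance from uniform weights
        return np.sum((p - 1.0 / n) ** 2)

    cons = [
        {"type": "eq", "fun": lambda p: p.sum() - 1.0},
        {"type": "ineq",
         "fun": lambda p: np.diff(nw(grid, X, Y, p, h))},  # nondecreasing fit
    ]
    res = minimize(obj, p0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x


# Toy data whose unconstrained fit has a non-monotone wiggle
X = np.linspace(0.0, 1.0, 12)
Y = X + 0.25 * np.sin(6.0 * X)
h = 0.08
grid = np.linspace(0.0, 1.0, 40)

p_hat = monotonize(X, Y, h, grid)
fit = nw(grid, X, Y, p_hat, h)
```

Note that the smoothing step (choice of kernel and bandwidth) is untouched; only the empirical weights are adjusted afterwards, which is the point emphasized in the abstract.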


Citation

Peter Hall, Li-Shan Huang. "Nonparametric kernel regression subject to monotonicity constraints." Ann. Statist. 29(3): 624-647, June 2001.


Published: June 2001
First available in Project Euclid: 24 December 2001

zbMATH: 1012.62030
MathSciNet: MR1865334
Digital Object Identifier: 10.1214/aos/1009210683

Primary: 62G07
Secondary: 62G20

Keywords: bandwidth, biased bootstrap, Gasser–Müller estimator, isotonic regression, local linear estimator, Nadaraya-Watson estimator, order restricted inference, power divergence, Priestley–Chao estimator, weighted bootstrap

Rights: Copyright © 2001 Institute of Mathematical Statistics
