Bernoulli, Volume 22, Number 4 (2016), 2177-2208.

Integral approximation by kernel smoothing

Bernard Delyon and François Portier



Let $(X_{1},\ldots,X_{n})$ be an i.i.d. sequence of random variables in $\mathbb{R}^{d}$, $d\geq 1$. We show that, for any function $\varphi:\mathbb{R}^{d}\rightarrow\mathbb{R}$, under regularity conditions,

\[n^{1/2}\Bigl(n^{-1}\sum_{i=1}^{n}\frac{\varphi(X_{i})}{\widehat{f}(X_{i})}-\int \varphi(x)\,dx\Bigr)\stackrel{\mathbb{P}}{\longrightarrow}0,\]

where $\widehat{f}$ is the classical kernel estimator of the density of $X_{1}$. This result is striking because it improves on the traditional root-$n$ rate given by the central limit theorem when $\widehat{f}=f$. Although this paper highlights some applications, we mainly address theoretical issues related to the latter result. We derive upper bounds for the rate of convergence in probability. These bounds depend on the regularity of the functions $\varphi$ and $f$, the dimension $d$, and the bandwidth of the kernel estimator $\widehat{f}$. Moreover, they are shown to be accurate, since they serve as renormalizing sequences in two central limit theorems, each reflecting a different degree of smoothness of $\varphi$. As an application to regression modelling with random design, we establish the asymptotic normality of estimators of linear functionals of a regression function. As a consequence of the above result, the asymptotic variance does not depend on the regression function. Finally, we discuss the choice of the bandwidth for integral approximation and illustrate the good behavior of our procedure through simulations.
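A minimal numerical sketch of the estimator described above (not the authors' code; the integrand, bandwidth, and sample size are illustrative choices). It compares the kernel-smoothing estimator of $\int\varphi(x)\,dx$, which plugs in the kernel density estimate $\widehat{f}$, with the classical Monte Carlo estimator that uses the true density $f$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
X = rng.standard_normal(n)           # i.i.d. sample with density f = N(0, 1)

# Classical Gaussian kernel density estimator f_hat, evaluated at the sample
h = n ** (-1 / 5)                    # illustrative bandwidth choice
diffs = (X[:, None] - X[None, :]) / h
f_hat = np.exp(-0.5 * diffs ** 2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

def phi(x):                          # integrand; its integral over R is sqrt(pi)
    return np.exp(-x ** 2)

# Kernel-smoothing estimator of the integral of phi
estimate = np.mean(phi(X) / f_hat)

# Classical Monte Carlo estimator using the true density f
f_true = np.exp(-0.5 * X ** 2) / np.sqrt(2 * np.pi)
naive = np.mean(phi(X) / f_true)

print(estimate, naive, np.sqrt(np.pi))
```

Both estimates should be close to the true value $\sqrt{\pi}\approx 1.77$; the point of the paper is that the version dividing by $\widehat{f}$ concentrates around the integral faster than the root-$n$ rate governing the version that divides by $f$.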

Article information

Bernoulli, Volume 22, Number 4 (2016), 2177-2208.

Received: September 2014
Revised: March 2015
First available in Project Euclid: 3 May 2016


Keywords: central limit theorem; integral approximation; kernel smoothing; nonparametric regression


Delyon, Bernard; Portier, François. Integral approximation by kernel smoothing. Bernoulli 22 (2016), no. 4, 2177--2208. doi:10.3150/15-BEJ725.


