Bernoulli
Volume 22, Number 2 (2016), 857-900.

Consistency, efficiency and robustness of conditional disparity methods

Giles Hooker

Abstract

This paper considers extensions of minimum-disparity estimators to the problem of estimating parameters in a regression model that is conditionally specified; that is, where a parametric model describes the distribution of a response $y$ conditional on covariates $x$ but does not specify the distribution of $x$. We define these estimators by constructing a non-parametric conditional density estimate and minimizing a disparity between this estimate and the parametric model, averaged over values of $x$. The consistency and asymptotic normality of such estimators are demonstrated for a broad class of models in which response and covariate vectors can take both discrete and continuous values, incorporating a wide set of choices for kernel-based conditional density estimation. We also establish the robustness of these estimators for a broad class of disparities. As was observed in Tamura and Boos (J. Amer. Statist. Assoc. 81 (1986) 223–229), minimum-disparity estimators incorporating kernel density estimates of more than one dimension can exhibit an asymptotic bias that is larger than $n^{-1/2}$; we characterize a similar bias in our results and show that in specialized cases it can be eliminated by appropriately centering the kernel density estimate. We also demonstrate empirically that bootstrap methods can be employed to reduce this bias and to provide robust confidence intervals. In order to establish these results, we develop a set of $L_{1}$-consistency results for kernel-based estimates of centered conditional densities.
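The construction in the abstract — estimate $f(y\mid x)$ by a kernel method, then choose parameters minimizing a disparity between this estimate and the parametric conditional model averaged over the observed $x$ — can be illustrated with a minimal numerical sketch. This is not the paper's implementation: the Gaussian kernels, bandwidths, integration grid, and Nelder-Mead optimizer below are illustrative choices, and the example uses the squared Hellinger disparity for a normal linear model $y = \beta_0 + \beta_1 x + \varepsilon$. Note that, without the centering or bootstrap corrections the paper develops, kernel smoothing inflates the fitted scale and can shrink the slope slightly — the smoothing bias the abstract refers to.

```python
# Minimum conditional Hellinger-distance estimation: a hypothetical sketch.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 400
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)   # true (b0, b1, sigma) = (1, 2, 0.5)

h, g = 0.25, 0.25                                   # covariate / response bandwidths
ygrid = np.linspace(y.min() - 1.0, y.max() + 1.0, 200)
dy = ygrid[1] - ygrid[0]

def cond_density(xj):
    """Kernel (Nadaraya-Watson type) estimate of f(y | x = xj) on ygrid."""
    w = norm.pdf((xj - x) / h)
    w = w / w.sum()
    return (w[:, None] * norm.pdf((ygrid[None, :] - y[:, None]) / g) / g).sum(axis=0)

fhat = np.array([cond_density(xj) for xj in x])      # n x len(ygrid)

def disparity(theta):
    """Squared Hellinger disparity between f-hat and the parametric model,
    averaged over the observed covariate values."""
    b0, b1, log_s = theta
    mu = b0 + b1 * x
    fmod = norm.pdf(ygrid[None, :], loc=mu[:, None], scale=np.exp(log_s))
    hd = ((np.sqrt(fhat) - np.sqrt(fmod)) ** 2).sum(axis=1) * dy
    return hd.mean()

res = minimize(disparity, x0=[0.0, 0.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 2000})
b0_hat, b1_hat, s_hat = res.x[0], res.x[1], np.exp(res.x[2])
print(b0_hat, b1_hat, s_hat)   # regression parameters recovered; scale inflated by smoothing
```

Because the disparity is evaluated against a smoothed density, `s_hat` exceeds the true residual scale of 0.5 here; the paper's centering of the kernel density estimate and the bootstrap procedures are aimed at removing exactly this kind of bias.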

Article information

Source
Bernoulli Volume 22, Number 2 (2016), 857-900.

Dates
Received: April 2014
Revised: August 2014
First available in Project Euclid: 9 November 2015

Permanent link to this document
http://projecteuclid.org/euclid.bj/1447077763

Digital Object Identifier
doi:10.3150/14-BEJ678

Mathematical Reviews number (MathSciNet)
MR3449802

Zentralblatt MATH identifier
06562299

Keywords
bootstrap; density estimation; disparity; regression; robust inference

Citation

Hooker, Giles. Consistency, efficiency and robustness of conditional disparity methods. Bernoulli 22 (2016), no. 2, 857--900. doi:10.3150/14-BEJ678. http://projecteuclid.org/euclid.bj/1447077763.



References

  • [1] Basu, A. and Lindsay, B.G. (1994). Minimum disparity estimation for continuous models: Efficiency, distributions and robustness. Ann. Inst. Statist. Math. 46 683–705.
  • [2] Basu, A., Sarkar, S. and Vidyashankar, A.N. (1997). Minimum negative exponential disparity estimation in parametric models. J. Statist. Plann. Inference 58 349–370.
  • [3] Beran, R. (1977). Minimum Hellinger distance estimates for parametric models. Ann. Statist. 5 445–463.
  • [4] Cheng, A.-L. and Vidyashankar, A.N. (2006). Minimum Hellinger distance estimation for randomized play the winner design. J. Statist. Plann. Inference 136 1875–1910.
  • [5] Devroye, L. and Györfi, L. (1985). Nonparametric Density Estimation: The $L_{1}$ View. New York: Wiley.
  • [6] Gervini, D. and Yohai, V.J. (2002). A class of robust and fully efficient regression estimators. Ann. Statist. 30 583–616.
  • [7] Hansen, B.E. (2004). Nonparametric conditional density estimation. Available at http://www.ssc.wisc.edu/~bhansen/papers/ncde.pdf.
  • [8] Hooker, G. (2014). Supplement to “Consistency, efficiency and robustness of conditional disparity methods.” DOI:10.3150/14-BEJ678SUPP.
  • [9] Hooker, G. and Vidyashankar, A.N. (2014). Bayesian model robustness via disparities. TEST 23 556–584.
  • [10] Li, Q. and Racine, J.S. (2007). Nonparametric Econometrics: Theory and Practice. Princeton, NJ: Princeton Univ. Press.
  • [11] Lindsay, B.G. (1994). Efficiency versus robustness: The case for minimum Hellinger distance and related methods. Ann. Statist. 22 1081–1114.
  • [12] Pak, R.J. and Basu, A. (1998). Minimum disparity estimation in linear regression models: Distribution and efficiency. Ann. Inst. Statist. Math. 50 503–521.
  • [13] Park, C. and Basu, A. (2004). Minimum disparity estimation: Asymptotic normality and breakdown point results. Bull. Inform. Cybernet. 36 19–33.
  • [14] Rousseeuw, P.J. and Leroy, A.M. (2005). Robust Regression and Outlier Detection. New York: Wiley.
  • [15] Simpson, D.G. (1987). Minimum Hellinger distance estimation for the analysis of count data. J. Amer. Statist. Assoc. 82 802–807.
  • [16] Tamura, R.N. and Boos, D.D. (1986). Minimum Hellinger distance estimation for multivariate location and covariance. J. Amer. Statist. Assoc. 81 223–229.
  • [17] Wu, Y. and Hooker, G. (2013). Hellinger distance and Bayesian non-parametrics: Hierarchical models for robust and efficient Bayesian inference. Under review.

Supplemental materials

  • Proofs and simulations for consistency, efficiency and robustness of conditional disparity methods. We provide additional supporting simulations of the efficiency and robustness of the conditional disparity methods along with proofs of the results stated above.