Abstract
This paper considers extensions of minimum-disparity estimators to the problem of estimating parameters in a regression model that is conditionally specified; that is, a parametric model describes the distribution of a response $y$ conditional on covariates $x$ but does not specify the distribution of $x$. We define these estimators by forming a non-parametric conditional density estimate and minimizing a disparity between this estimate and the parametric model, averaged over values of $x$. The consistency and asymptotic normality of these estimators are demonstrated for a broad class of models in which the response and covariate vectors can take both discrete and continuous values, and for a wide set of choices of kernel-based conditional density estimate. We also establish the robustness of these estimators for a broad class of disparities. As observed in Tamura and Boos (J. Amer. Statist. Assoc. 81 (1986) 223–229), minimum-disparity estimators that incorporate kernel density estimates in more than one dimension can incur an asymptotic bias of order larger than $n^{-1/2}$; we characterize a similar bias in our results and show that in specialized cases it can be eliminated by appropriately centering the kernel density estimate. We also demonstrate empirically that bootstrap methods can be employed to reduce this bias and to provide robust confidence intervals. To prove these results, we establish a set of $L_{1}$-consistency results for kernel-based estimates of centered conditional densities.
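As a sketch of the construction (the notation below is ours, summarizing the abstract rather than quoting the paper): writing $\hat{f}_n(y \mid x)$ for the kernel-based conditional density estimate, $f_{\theta}(y \mid x)$ for the parametric model, and $D$ for a disparity such as the squared Hellinger distance, the estimator takes the form
$$\hat{\theta}_n = \arg\min_{\theta} \frac{1}{n} \sum_{i=1}^{n} D\big(\hat{f}_n(\cdot \mid x_i),\, f_{\theta}(\cdot \mid x_i)\big),$$
so that the disparity between the two conditional densities is averaged over the observed covariate values $x_1, \dots, x_n$.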
Citation
Giles Hooker. "Consistency, efficiency and robustness of conditional disparity methods." Bernoulli 22(2), 857–900, May 2016. https://doi.org/10.3150/14-BEJ678