Open Access
Min-Max Bias Robust Regression
R. D. Martin, V. J. Yohai, R. H. Zamar
Ann. Statist. 17(4): 1608-1630 (December, 1989). DOI: 10.1214/aos/1176347384

Abstract

This paper considers the problem of minimizing the maximum asymptotic bias of regression estimates over $\varepsilon$-contamination neighborhoods for the joint distribution of the response and carriers. Two classes of estimates are treated: (i) $M$-estimates with bounded function $\rho$ applied to the scaled residuals, using a very general class of scale estimates, and (ii) bounded influence function type generalized $M$-estimates. Estimates in the first class are obtained as the solution of a minimization problem, while estimates in the second class are specified by an estimating equation. The first class of $M$-estimates is sufficiently general to include both Huber Proposal 2 simultaneous estimates of regression coefficients and residual scale, and Rousseeuw-Yohai $S$-estimates of regression. It is shown that an $S$-estimate based on a jump-function type $\rho$ solves the min-max bias problem for the class of $M$-estimates with very general scale. This estimate is obtained by minimizing the $\alpha$-quantile of the squared residuals, where $\alpha = \alpha(\varepsilon)$ depends on the fraction of contamination $\varepsilon$. As $\varepsilon \rightarrow 0.5$, $\alpha(\varepsilon) \rightarrow 0.5$ and the min-max estimator approaches the least median of squared residuals estimator introduced by Rousseeuw. For the bounded influence class of $GM$-estimates, it is shown that the "sign"-type nonlinearity yields the min-max estimate. This estimate coincides with the minimum gross-error sensitivity $GM$-estimate. For $p = 1$, the min-max $GM$-estimate is optimal among the class of all equivariant regression estimates. The min-max $S$-estimator has a breakdown point which is independent of the number of carriers $p$ and tends to 0.5 as $\varepsilon$ increases to 0.5, but has a slow rate of convergence. The min-max $GM$-estimate has the usual rate of convergence, but a breakdown point which decreases to zero with increasing $p$. Finally, we compare the min-max biases for both types of estimates for the case where the nominal model is multivariate normal.
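The min-max $S$-estimate described in the abstract minimizes the $\alpha$-quantile of the squared residuals. The sketch below is an illustration only, not the authors' construction: it approximates such a least $\alpha$-quantile-of-squares fit with the standard random elemental-subset heuristic used for LMS-type criteria. The function name lqs_regression and parameters such as n_subsets are hypothetical, and the choice of $\alpha$ as a function of $\varepsilon$ is left unspecified here (the paper derives $\alpha(\varepsilon)$; with $\alpha = 0.5$ the criterion approximates Rousseeuw's least median of squares).

```python
import numpy as np

def lqs_regression(X, y, alpha=0.5, n_subsets=500, seed=None):
    """Approximate the least alpha-quantile-of-squares fit (illustrative sketch).

    Searches random elemental subsets of p observations (the usual heuristic
    for LMS-type criteria) and keeps the coefficient vector whose alpha-quantile
    of squared residuals is smallest.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_theta, best_crit = None, np.inf
    for _ in range(n_subsets):
        idx = rng.choice(n, size=p, replace=False)       # elemental subset
        try:
            theta = np.linalg.solve(X[idx], y[idx])      # exact fit to p points
        except np.linalg.LinAlgError:                    # singular subset: skip
            continue
        crit = np.quantile((y - X @ theta) ** 2, alpha)  # alpha-quantile of r^2
        if crit < best_crit:
            best_theta, best_crit = theta, crit
    return best_theta, best_crit

# Toy example: linear model with intercept, 20% gross outliers in y.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=n)
y[:40] += 50.0                                           # contamination
theta_hat, crit = lqs_regression(X, y, alpha=0.5, seed=1)
print(theta_hat)   # stays close to [1, 2] despite the outliers
```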

Citation


R. D. Martin, V. J. Yohai, R. H. Zamar. "Min-Max Bias Robust Regression." Ann. Statist. 17(4): 1608-1630, December, 1989. https://doi.org/10.1214/aos/1176347384

Information

Published: December, 1989
First available in Project Euclid: 12 April 2007

zbMATH: 0713.62068
MathSciNet: MR1026302
Digital Object Identifier: 10.1214/aos/1176347384

Subjects:
Primary: 62J02
Secondary: 62J05

Keywords: min-max bias, regression, robust estimates

Rights: Copyright © 1989 Institute of Mathematical Statistics
