The Annals of Statistics

Bootstrapping robust estimates of regression

Matias Salibian-Barrera and Ruben H. Zamar



We introduce a new computer-intensive method to estimate the distribution of robust regression estimates. The basic idea behind our method is to bootstrap a reweighted representation of the estimates. To obtain a bootstrap method that is asymptotically correct, we include the auxiliary scale estimate in our reweighted representation. Our method is computationally simple because for each bootstrap sample we only have to solve a linear system of equations. The weights we use are decreasing functions of the absolute value of the residuals, so outlying observations receive small weights. This makes the bootstrap method resistant to the presence of outliers in the data: the breakdown points of the quantile estimates derived with this method are higher than those obtained with the classical bootstrap. We illustrate our method on two datasets and report the results of a Monte Carlo experiment on confidence intervals for the parameters of the linear model.
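The core computational idea above — resample, then solve one weighted linear system per bootstrap sample instead of re-running the full robust fit — can be illustrated with a minimal sketch. This is an assumption-laden simplification, not the authors' exact procedure: it uses Tukey-bisquare weights from a given robust fit (`beta_hat`, `sigma_hat`) and omits the scale and linear correction terms of the full method.

```python
import numpy as np

def tukey_weight(u, c=4.685):
    # Tukey bisquare weight w(u) = psi(u)/u; zero outside [-c, c],
    # so large standardized residuals (outliers) get weight ~ 0.
    w = np.zeros_like(u, dtype=float)
    inside = np.abs(u) <= c
    w[inside] = (1.0 - (u[inside] / c) ** 2) ** 2
    return w

def reweighted_bootstrap(X, y, beta_hat, sigma_hat, B=1000, seed=0):
    """Simplified reweighted bootstrap: weights are fixed from the
    original robust fit; each replicate only solves a linear system."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    r = y - X @ beta_hat                  # residuals from the robust fit
    w = tukey_weight(r / sigma_hat)       # downweight outlying observations
    betas = np.empty((B, p))
    for b in range(B):
        idx = rng.integers(0, n, size=n)  # resample observations
        Xb, yb, wb = X[idx], y[idx], w[idx]
        A = Xb.T @ (wb[:, None] * Xb)     # weighted normal equations
        betas[b] = np.linalg.solve(A, Xb.T @ (wb * yb))
    return betas
```

Because the weights are computed once from the original fit, each bootstrap replicate costs only a p-by-p linear solve, which is what makes the method computationally attractive compared with refitting the robust estimator on every resample.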

Article information

Ann. Statist., Volume 30, Number 2 (2002), 556-582.

First available in Project Euclid: 14 May 2002


Primary: 62F35 (Robustness and adaptive procedures); 62F40 (Bootstrap, jackknife and other resampling methods); 62G09 (Resampling methods); 62G20 (Asymptotic properties); 62G35 (Robustness); 62J05 (Linear regression)

Keywords: regression; breakdown point; confidence intervals


Salibian-Barrera, Matias; Zamar, Ruben H. Bootstrapping robust estimates of regression. Ann. Statist. 30 (2002), no. 2, 556--582. doi:10.1214/aos/1021379865.


