Open Access
November 2017
A generalized divergence for statistical inference
Abhik Ghosh, Ian R. Harris, Avijit Maji, Ayanendranath Basu, Leandro Pardo
Bernoulli 23(4A): 2746-2783 (November 2017). DOI: 10.3150/16-BEJ826

Abstract

The power divergence (PD) and the density power divergence (DPD) families have proven to be useful tools in the area of robust inference. In this paper, we consider a superfamily of divergences that contains both of these families as special cases. The role of this superfamily is studied in several statistical applications, and desirable properties are identified and discussed. In many cases, it is observed that the most preferred minimum divergence estimator within the above collection lies outside the class of minimum PD or minimum DPD estimators, indicating that this superfamily has real utility, rather than just being a routine generalization. The limitation of the usual first-order influence function as an effective descriptor of the robustness of the estimator is also demonstrated in this connection.
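For context: the superfamily studied in the paper is the $S$-divergence family named in the keywords below. The abstract itself does not state its form; the display that follows is a reconstruction from the standard definition used in this line of work, not a quotation from the paper. For densities $g$ (data) and $f$ (model),

$$ S_{(\alpha,\lambda)}(g,f) = \frac{1}{A}\int f^{1+\alpha} - \frac{1+\alpha}{AB}\int f^{B} g^{A} + \frac{1}{B}\int g^{1+\alpha}, $$

where $A = 1 + \lambda(1-\alpha)$ and $B = \alpha - \lambda(1-\alpha)$, so that $A + B = 1 + \alpha$. Taking $\lambda = 0$ recovers the DPD family with tuning parameter $\alpha$, while $\alpha = 0$ recovers the PD family with parameter $\lambda$; the cases $A = 0$ or $B = 0$ are understood as continuous limits.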
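As a small illustration of minimum divergence estimation within this family, the sketch below fits a normal model by minimizing the empirical objective of the $\lambda = 0$ (DPD) member, which needs no nonparametric density estimate. This is a minimal sketch under that choice, not the authors' implementation; the function and variable names are ours.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def dpd_objective(params, x, alpha):
        # Empirical DPD objective for N(mu, sigma^2), alpha > 0:
        #   H_n = int f^(1+alpha) dx - ((1+alpha)/alpha) * mean(f(X_i)^alpha),
        # dropping the parameter-free constant 1/alpha.
        mu, log_sigma = params
        sigma = np.exp(log_sigma)  # keeps sigma > 0 during optimization
        # Closed form for the normal model:
        #   int phi^(1+alpha) dx = (2*pi*sigma^2)^(-alpha/2) / sqrt(1+alpha)
        integral_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
        data_term = np.mean(norm.pdf(x, loc=mu, scale=sigma) ** alpha)
        return integral_term - (1 + alpha) / alpha * data_term

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0.0, 1.0, 95),
                        rng.normal(10.0, 1.0, 5)])  # 5% gross outliers

    res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5),
                   method="Nelder-Mead")
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    print(mu_hat, sigma_hat)  # should stay near (0, 1) despite the contamination

Members with $\lambda \neq 0$ raise $g$ to a power other than one, so in continuous models their empirical versions generally require a nonparametric estimate of $g$; this is one practical reason the $\lambda = 0$ slice is the easiest entry point.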

Citation

Abhik Ghosh, Ian R. Harris, Avijit Maji, Ayanendranath Basu, Leandro Pardo. "A generalized divergence for statistical inference." Bernoulli 23(4A): 2746-2783, November 2017. https://doi.org/10.3150/16-BEJ826

Information

Received: 1 December 2014; Revised: 1 February 2016; Published: November 2017
First available in Project Euclid: 9 May 2017

zbMATH: 06778255
MathSciNet: MR3648044
Digital Object Identifier: 10.3150/16-BEJ826

Keywords: $S$-divergence, breakdown point, divergence measure, influence function, robust estimation

Rights: Copyright © 2017 Bernoulli Society for Mathematical Statistics and Probability
