Journal of Applied Probability

Minimum dynamic discrimination information models

Majid Asadi, Nader Ebrahimi, G. G. Hamedani, and Ehsan S. Soofi


Abstract

In this paper, we introduce the minimum dynamic discrimination information (MDDI) approach to probability modeling. The MDDI model relative to a given distribution G is the distribution with the least Kullback-Leibler information discrepancy from G among all distributions satisfying information constraints given in terms of residual moment inequalities, residual moment growth inequalities, or hazard rate growth inequalities. Our results lead to MDDI characterizations of many well-known lifetime models and to the development of some new models. Dynamic information constraints that characterize these models are tabulated. A result for characterizing distributions based on dynamic Rényi information divergence is also given.
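As an illustration of the quantity being minimized, the following stdlib-only Python sketch (not from the paper) numerically evaluates the dynamic Kullback-Leibler discrimination at age t, i.e. the KL divergence between the residual-life densities f(x)/S_F(t) and g(x)/S_G(t) on (t, ∞). The function names and the exponential example are assumptions for illustration; for exponential lifetimes the memoryless property makes the residual discrimination independent of t, with closed form log(λ_f/λ_g) + λ_g/λ_f − 1.

```python
import math

def residual_kl(pdf_f, surv_f, pdf_g, surv_g, t, upper, n=50000):
    """Trapezoidal approximation of the dynamic KL discrimination at age t:
    the KL divergence between the residual-life densities of F and G,
    integrating over (t, upper) with the tail beyond `upper` assumed negligible."""
    sf, sg = surv_f(t), surv_g(t)   # survival functions at age t
    h = (upper - t) / n
    total = 0.0
    for i in range(n + 1):
        x = t + i * h
        rf = pdf_f(x) / sf          # residual density of F at x
        rg = pdf_g(x) / sg          # residual density of G at x
        term = rf * math.log(rf / rg) if rf > 0 else 0.0
        total += term if 0 < i < n else term / 2.0   # trapezoid endpoints
    return total * h

# Example: F = Exp(2), G = Exp(1).
pdf_f = lambda x: 2.0 * math.exp(-2.0 * x)
surv_f = lambda x: math.exp(-2.0 * x)
pdf_g = lambda x: math.exp(-x)
surv_g = lambda x: math.exp(-x)

# Closed form log(2) + 1/2 - 1 = log(2) - 0.5, the same at every age t.
print(residual_kl(pdf_f, surv_f, pdf_g, surv_g, 0.0, 20.0))  # ≈ 0.1931
print(residual_kl(pdf_f, surv_f, pdf_g, surv_g, 3.0, 23.0))  # ≈ 0.1931
```

The age-invariance shown here is special to the exponential family; for other lifetime models the residual discrimination genuinely varies with t, which is what the dynamic constraints in the paper exploit.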

Article information

Source
J. Appl. Probab. Volume 42, Number 3 (2005), 643-660.

Dates
First available in Project Euclid: 21 September 2005

Permanent link to this document
http://projecteuclid.org/euclid.jap/1127322018

Digital Object Identifier
doi:10.1239/jap/1127322018

Mathematical Reviews number (MathSciNet)
MR2157511

Zentralblatt MATH identifier
1094.94013

Subjects
Primary: 94A15: Information theory, general [See also 62B10, 81P94]
Secondary: 60E15: Inequalities; stochastic orderings 60B05: Probability measures on topological spaces

Keywords
Lifetime distribution; residual life distribution; monotone density; failure rate dominance; uncertainty ordering

Citation

Asadi, Majid; Ebrahimi, Nader; Hamedani, G. G.; Soofi, Ehsan S. Minimum dynamic discrimination information models. J. Appl. Probab. 42 (2005), no. 3, 643--660. doi:10.1239/jap/1127322018. http://projecteuclid.org/euclid.jap/1127322018.


References

  • Asadi, M., Ebrahimi, N., Hamedani, G. G. and Soofi, E. S. (2004). Maximum dynamic entropy models. J. Appl. Prob. 41, 379--390.
  • Belzunce, F., Navarro, J., Ruiz, J. M. and del Aguila, Y. (2004). Some results on residual entropy functions. Metrika 59, 147--161.
  • Di Crescenzo, A. and Longobardi, M. (2002). Entropy-based measure of uncertainty in past lifetime distributions. J. Appl. Prob. 39, 434--440.
  • Ebrahimi, N. (1996). How to measure uncertainty in the residual lifetime distributions. Sankhyā A 58, 48--57.
  • Ebrahimi, N. (1998). Testing for exponentiality of the residual lifetime based on dynamic Kullback--Leibler information. IEEE Trans. Reliab. 47, 197--201.
  • Ebrahimi, N. (2001). Testing for uniformity of the residual lifetime based on dynamic Kullback--Leibler information. Ann. Inst. Statist. Math. 53, 325--337.
  • Ebrahimi, N. and Kirmani, S. N. U. A. (1996a). A characterization of the proportional hazards model through a measure of discrimination between two residual life distributions. Biometrika 83, 233--235.
  • Ebrahimi, N. and Kirmani, S. N. U. A. (1996b). Some results on ordering of survival functions through uncertainty. Statist. Prob. Lett. 29, 167--176.
  • Hamedani, G. G. (2005). Characterizations of univariate continuous distributions based on hazard functions. To appear in J. Appl. Statist. Sci.
  • Jaynes, E. T. (1957). Information theory and statistical mechanics. Phys. Rev. 106, 620--630.
  • Jaynes, E. T. (1982). On the rationale of maximum entropy methods. Proc. IEEE 70, 939--952.
  • Kullback, S. (1959). Information Theory and Statistics. John Wiley, New York.
  • Rényi, A. (1961). On measures of entropy and information. In Proc. 4th Berkeley Symp. Math. Statist. Prob., Vol. 1, University of California Press, Berkeley, CA, pp. 547--561.
  • Shaked, M. and Shanthikumar, J. G. (1994). Stochastic Orders and Their Applications. Academic Press, Boston, MA.
  • Shannon, C. E. (1948). A mathematical theory of communication. Bell System Tech. J. 27, 379--423, 623--656.
  • Shore, J. E. and Johnson, R. W. (1980). Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inf. Theory 26, 26--37.