Brazilian Journal of Probability and Statistics

On the time-dependent Fisher information of a density function

Omid Kharazmi and Majid Asadi

Abstract

Fisher information is a fundamental criterion in statistical inference, especially in optimality and large-sample studies in estimation theory. It also plays a key role in physics, thermodynamics, information theory and other applications. Two forms of Fisher information have been defined in the literature: one for the parameters of a distribution function and one for the density function of a distribution. In this paper, we consider a nonnegative continuous random (lifetime) variable $X$ and define a time-dependent Fisher information for the density function of the residual random variable associated with $X$. We also propose a time-dependent version of the Fisher information distance (relative Fisher information) between the densities of two nonnegative random variables. Several properties of the proposed measures and their relations to other statistical measures are investigated. Various examples are provided to illustrate the results.
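
The full text is behind a subscription wall, so the displays below record only the standard textbook forms of the quantities the abstract names; the notation $I(X;t)$ and the convention used for the residual variable are assumptions made here for illustration, and the paper's own definitions may differ. For an absolutely continuous density $f$ with survival function $\bar{F} = 1 - F$, the two classical forms of Fisher information are

\[ I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right] \qquad\text{and}\qquad I(X) = \int \frac{[f'(x)]^{2}}{f(x)}\,dx. \]

The residual random variable $X_t = [X - t \mid X > t]$ has density $f(x+t)/\bar{F}(t)$ for $x > 0$, which suggests a time-dependent version of the form

\[ I(X;t) = \int_{t}^{\infty} \frac{f(x)}{\bar{F}(t)} \left(\frac{f'(x)}{f(x)}\right)^{2} dx, \]

while a time-dependent Fisher information distance would restrict the relative Fisher information

\[ I(f\,\|\,g) = \int f(x) \left(\frac{d}{dx}\log\frac{f(x)}{g(x)}\right)^{2} dx \]

to the corresponding residual densities. As a sanity check under these assumed forms: if $X$ is exponential with rate $\lambda$, then $f'(x)/f(x) \equiv -\lambda$, so $I(X) = \lambda^{2}$; by the memoryless property $X_t$ has the same distribution as $X$, so $I(X;t) = \lambda^{2}$ for every $t \geq 0$, and for two exponential densities with rates $\lambda$ and $\mu$ the distance is the constant $(\lambda-\mu)^{2}$.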

Article information

Source
Braz. J. Probab. Stat., Volume 32, Number 4 (2018), 795-814.

Dates
Received: March 2017
Accepted: May 2017
First available in Project Euclid: 17 August 2018

Permanent link to this document
https://projecteuclid.org/euclid.bjps/1534492902

Digital Object Identifier
doi:10.1214/17-BJPS366

Mathematical Reviews number (MathSciNet)
MR3845030

Zentralblatt MATH identifier
06979601

Keywords
Fisher information distance; escort distributions; score function; residual random variable; equilibrium distribution; likelihood ratio order

Citation

Kharazmi, Omid; Asadi, Majid. On the time-dependent Fisher information of a density function. Braz. J. Probab. Stat. 32 (2018), no. 4, 795–814. doi:10.1214/17-BJPS366. https://projecteuclid.org/euclid.bjps/1534492902


References

  • Barlow, R. E. and Proschan, F. (1981). Statistical Theory of Reliability and Life Testing: Probability Models. New York-Montreal, Que.-London: Holt, Rinehart and Winston, Inc.
  • Beck, C. and Schlögl, F. (1993). Thermodynamics of Chaotic Systems: An Introduction. Cambridge Nonlinear Science Series 4. Cambridge: Cambridge University Press.
  • Bercher, J. F. (2015). Entropies and entropic criteria. In Regularization and Bayesian Methods for Inverse Problems in Signal and Image Processing (J. F. Giovannelli and J. Idier, eds.). New York: John Wiley.
  • Bercher, J. F. and Vignat, C. (2009). On minimum Fisher information distributions with restricted support and fixed variance. Information Sciences 179, 3832–3842.
  • Bobkov, S. G., Chistyakov, G. P. and Götze, F. (2014). Fisher information and the central limit theorem. Probability Theory and Related Fields 159, 1–59.
  • Brown, L. D. (1982). A proof of the central limit theorem motivated by the Cramér–Rao inequality. In Statistics and Probability: Essays in Honor of C. R. Rao, 141–148.
  • Frieden, B. R. (2004). Science from Fisher Information: A Unification. Cambridge: Cambridge University Press.
  • Johnson, O. (2004). Information Theory and the Central Limit Theorem. London: Imperial College Press.
  • Johnson, O. and Barron, A. (2004). Fisher information inequalities and the central limit theorem. Probability Theory and Related Fields 129, 391–409.
  • Kostal, L., Lansky, P. and Pokora, O. (2013). Measures of statistical dispersion based on Shannon and Fisher information concepts. Information Sciences 235, 214–223.
  • Lehmann, E. L. and Casella, G. (1998). Theory of Point Estimation, 2nd ed. New York: Springer.
  • Otto, F. and Villani, C. (2000). Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality. Journal of Functional Analysis 173, 361–400.
  • Papaioannou, T. and Ferentinos, K. (2005). On two forms of Fisher’s measure of information. Communications in Statistics - Theory and Methods 34, 1461–1470.
  • Pardo, L., Morales, D. and Taneja, I. J. (1995). Generalized Jensen difference divergence measures and Fisher measure of information. Kybernetes 24, 15–28.
  • Rao, C. R. (1997). Statistics and Truth: Putting Chance to Work. River Edge: World Scientific.
  • Salicrú, M. and Taneja, I. J. (1993). Connections of generalized divergence measures with Fisher information matrix. Information Sciences 72, 251–269.
  • Shaked, M. and Shanthikumar, J. G. (2007). Stochastic Orders. New York: Springer Science and Business Media.
  • Shao, J. (2003). Mathematical Statistics. New York: Springer.
  • Venkatesan, R. C. and Plastino, A. (2014). Legendre transform structure and extremal properties of the relative Fisher information. Physics Letters A 378, 1341–1345.
  • Walker, S. G. (2016). Bayesian information in an experiment and the Fisher information distance. Statistics & Probability Letters 112, 5–9.
  • Yamano, T. (2013). Phase space gradient of dissipated work and information: A role of relative Fisher information. Journal of Mathematical Physics 54, 113301.
  • Yáñez, R. J., Sánchez-Moreno, P., Zarzo, A. and Dehesa, J. S. (2008). Fisher information of special functions and second-order differential equations. Journal of Mathematical Physics 49, 082104.
  • Zegers, P. (2002). Some new results on the architecture, training process, and estimation error bounds for learning machines. Ph.D. dissertation, University of Arizona.