A Characterization of the Inverse Gaussian Distribution
C. G. Khatri
Ann. Math. Statist. 33(2): 800–803 (June 1962). DOI: 10.1214/aoms/1177704599

Abstract

M. C. K. Tweedie [2] defined the inverse Gaussian distributions via the density functions \begin{equation*}\tag{1}f(x; m, \lambda) = \begin{cases}\lbrack\lambda/(2\pi x^3)\rbrack^{\frac{1}{2}} \exp\lbrack -\lambda(x - m)^2/(2m^2x)\rbrack & \text{for } x > 0,\\ 0 & \text{for } x \leqq 0,\end{cases}\end{equation*} where the parameters $\lambda$ and $m$ are positive. The corresponding densities reflected about the origin, with $\lambda$ and $m$ negative, may also be regarded as members of the inverse Gaussian family. The characteristic function of the inverse Gaussian distribution with parameters $\lambda, m$ is \begin{equation*}\tag{2}\phi(t) = \exp \lbrack\lambda\{1 - (1 - 2im^2t\lambda^{-1})^{\frac{1}{2}}\}/m\rbrack,\quad i = \sqrt{-1},\end{equation*} for all real values of $t$. If $x_1, x_2, \cdots, x_n$ are $n$ independent observations from (1), then $y = \sum^n_{j=1} x_j$ and $z = \sum^n_{j=1} x^{-1}_j - n^2y^{-1}$ are independently distributed; the distribution of $y$ is $f(y; nm, n^2\lambda)$, and that of $\lambda z$ is chi-square with $(n - 1)$ degrees of freedom. In this note we prove the converse: if $x_1, x_2, \cdots, x_n$ are independently and identically distributed variates whose relevant moments exist and are nonzero, and if $y$ and $z$ are independently distributed, then the distribution of each $x_j$ is inverse Gaussian.
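The direct result quoted in the abstract can be checked numerically. Note that the characteristic function (2) gives $\phi(t)^n = \exp\lbrack n\lambda\{1 - (1 - 2im^2t\lambda^{-1})^{\frac{1}{2}}\}/m\rbrack$, which, since $(nm)^2/(n^2\lambda) = m^2/\lambda$, is exactly the characteristic function of the inverse Gaussian with parameters $nm$ and $n^2\lambda$. The following minimal Monte Carlo sketch (not part of the paper; the parameter values m, lam, n, n_reps are illustrative choices, and it assumes NumPy's wald sampler and SciPy's invgauss parameterization, in which IG(mean = mu·scale, shape = scale)) verifies that $\lambda z$ is chi-square with $n-1$ degrees of freedom, that $y$ follows $f(y; nm, n^2\lambda)$, and that $y$ and $z$ are uncorrelated:

```python
# Monte Carlo check of Tweedie's result quoted in the abstract:
# for i.i.d. x_1, ..., x_n from IG(m, lambda),
#   y = sum_j x_j            is IG(n*m, n^2*lambda),
#   z = sum_j 1/x_j - n^2/y  satisfies lambda*z ~ chi-square(n - 1),
# and y, z are independent.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
m, lam = 2.0, 3.0        # illustrative mean and shape parameters of IG(m, lambda)
n, n_reps = 5, 100_000   # sample size per replicate, number of replicates

# numpy's wald(mean, scale) samples IG(mean=m, shape=lam)
x = rng.wald(m, lam, size=(n_reps, n))
y = x.sum(axis=1)
z = (1.0 / x).sum(axis=1) - n**2 / y

# lambda*z should be chi-square with n - 1 degrees of freedom
ks_z = stats.kstest(lam * z, stats.chi2(df=n - 1).cdf)
print(f"KS test of lam*z vs chi2({n - 1}): p = {ks_z.pvalue:.3f}")

# y should follow IG(n*m, n^2*lam); scipy's invgauss(mu, scale) has
# mean = mu*scale and shape = scale, so mu = (n*m)/(n^2*lam)
ks_y = stats.kstest(y, stats.invgauss(mu=(n * m) / (n**2 * lam),
                                      scale=n**2 * lam).cdf)
print(f"KS test of y vs IG({n * m}, {n**2 * lam}): p = {ks_y.pvalue:.3f}")

# independence of y and z: sample correlation should be near zero
print(f"corr(y, z) = {np.corrcoef(y, z)[0, 1]:+.4f}")
```

Both Kolmogorov–Smirnov p-values should be non-significant and the correlation near zero; this mirrors the normal-theory analogue in which the sample mean and sample variance are independent, which is what makes the converse characterization proved in the note of interest.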

Citation


C. G. Khatri. "A Characterization of the Inverse Gaussian Distribution." Ann. Math. Statist. 33(2): 800–803, June 1962. https://doi.org/10.1214/aoms/1177704599

Information

Published: June 1962
First available in Project Euclid: 27 April 2007

zbMATH: 0109.13402
MathSciNet: MR137197
Digital Object Identifier: 10.1214/aoms/1177704599

Rights: Copyright © 1962 Institute of Mathematical Statistics
