Open Access
A One-Sided Inequality of the Chebyshev Type
Albert W. Marshall, Ingram Olkin
Ann. Math. Statist. 31(2): 488-491 (June, 1960). DOI: 10.1214/aoms/1177705913

Abstract

If $x$ is a random variable with mean zero and variance $\sigma^2$, then, according to Chebyshev's inequality, $P\{|x| \geqq 1\} \leqq \sigma^2$. The corresponding one-sided inequality $P\{x \geqq 1\} \leqq \sigma^2/(\sigma^2 + 1)$ is also known (see e.g. [2, p. 198]). Both inequalities are sharp. A generalization of Chebyshev's inequality was obtained by Olkin and Pratt [1] for $P\{|x_1| \geqq 1 \text{ or } \cdots \text{ or } |x_k| \geqq 1\}$, where $Ex_i = 0$, $Ex_i^2 = \sigma^2$, $Ex_ix_j = \sigma^2\rho$ $(i \neq j)$, $i, j = 1, \cdots, k$; we give here the corresponding generalization of the one-sided inequality, and we also consider the case where only means and variances are known. To obtain an upper bound for $P\{x \in T\} \equiv P\{x_1 \geqq 1 \text{ or } \cdots \text{ or } x_k \geqq 1\}$, we consider a nonnegative function $f(x) \equiv f(x_1, \cdots, x_k)$ such that $f(x) \geqq 1$ for $x \in T$. Then $Ef(x) \geqq \int_{\{x \in T\}} f(x)\, dP \geqq P\{x \in T\}$. Since the bound is to be a function of the covariance matrix $\Sigma$, $f(x)$ must be of the form $(x - a)A(x - a)'$, where $a = (a_1, \cdots, a_k)$ and $A = (a_{ij})$ is $k \times k$. A "best" bound is one which minimizes $Ef(x) = \operatorname{tr} A(\Sigma + a'a)$, subject to $f(x) \geqq 0$ for all $x$ and $f(x) \geqq 1$ on $T$.
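For $k = 1$ the construction above reduces to the classical proof of the one-sided bound: taking the quadratic $f(x) = ((x + a)/(1 + a))^2$ with $a \geqq 0$ (a standard specialization of the family $(x - a)A(x - a)'$; this particular parametrization is an illustrative assumption, not quoted from the paper) is nonnegative and at least $1$ on $T = \{x \geqq 1\}$, and gives $Ef(x) = (\sigma^2 + a^2)/(1 + a)^2$, minimized at $a = \sigma^2$, which recovers $\sigma^2/(\sigma^2 + 1)$. A minimal Python sketch checking this numerically, under those assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    sigma2 = 0.5  # variance of x (mean zero); value chosen only for illustration

    def quadratic_bound(sigma2, a):
        # E f(x) for f(x) = ((x + a)/(1 + a))**2 when Ex = 0, Ex^2 = sigma2.
        # f is nonnegative and f(x) >= 1 on T = {x >= 1} whenever a >= 0,
        # so P{x >= 1} <= E f(x) = (sigma2 + a^2)/(1 + a)^2.
        return (sigma2 + a**2) / (1 + a) ** 2

    # The minimizing choice a = sigma2 recovers the sharp one-sided bound.
    assert np.isclose(quadratic_bound(sigma2, sigma2), sigma2 / (sigma2 + 1))

    # Monte Carlo check with one distribution having mean 0 and variance sigma2.
    x = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)
    print("empirical P{x >= 1}:", (x >= 1).mean())
    print("one-sided bound    :", sigma2 / (sigma2 + 1))

The normal law here is only one instance; the bound itself is attained by the two-point distribution with $P\{x = 1\} = \sigma^2/(\sigma^2 + 1)$ and $P\{x = -\sigma^2\} = 1/(\sigma^2 + 1)$, which is what makes the inequality sharp.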

Citation


Albert W. Marshall, Ingram Olkin. "A One-Sided Inequality of the Chebyshev Type." Ann. Math. Statist. 31(2): 488-491, June 1960. https://doi.org/10.1214/aoms/1177705913

Information

Published: June, 1960
First available in Project Euclid: 27 April 2007

zbMATH: 0237.60007
MathSciNet: MR119230
Digital Object Identifier: 10.1214/aoms/1177705913

Rights: Copyright © 1960 Institute of Mathematical Statistics
