Open Access
December, 1966
On Moments of Cumulative Sums
Ragnar Ericson
Ann. Math. Statist. 37(6): 1803-1805 (December, 1966). DOI: 10.1214/aoms/1177699170

Abstract

In the theory of sequential analysis developed by Wald [6] there appear sums of the form $X = \sum^N_1 x_i$, where both the $x_i$ and $N$ are random variables. In this note we consider conditions for the existence of $E(X^k)$ when the $x_i$ are independent random variables and the event $N \geqq i$ is independent of $x_i, x_{i + 1}, \cdots$. Let $y_i = |x_i|$ and $Y = \sum^N_1 y_i$. We show that $E(y^k_i) \leqq \beta_k < \infty$ and $E(N^k) < \infty$, $k = 1, 2, \cdots$, are sufficient for $E(Y^k) < \infty$ (proved for $k = 1$ in [7]), and that if there is a constant $c < \infty$ such that $P(y_i \leqq c)E(y_i \mid y_i \leqq c) \geqq \alpha > 0$ for $i = 1, 2, \cdots$, then $E(N^k) < \infty$ is necessary for $E(Y^k) < \infty$. We also show that $E(x_i) = 0$, $E(x^{2k}_i) \leqq \beta_{2k} < \infty$, and $E(N^k) < \infty$ imply $E(X^{2k}) = \lim_{m \rightarrow \infty} E(X^{2k}_m) < \infty$ for $k = 1, 2, \cdots$ (proved for $k = 1, 2$ in [1]; for $k \geqq 3$ proved independently in [5]).
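As an informal illustration of the sufficiency claim in the case $k = 1$ (essentially Wald's equation for the absolute sums), the following sketch uses only the stated assumptions that the event $N \geqq i$ is independent of $y_i$ and that $E(y_i) \leqq \beta_1 < \infty$; the interchange of expectation and summation is justified by monotone convergence, the $y_i$ being nonnegative:

$$E(Y) = E\Bigl(\sum_{i=1}^{N} y_i\Bigr) = E\Bigl(\sum_{i=1}^{\infty} y_i \mathbf{1}\{N \geqq i\}\Bigr) = \sum_{i=1}^{\infty} E\bigl(y_i \mathbf{1}\{N \geqq i\}\bigr) = \sum_{i=1}^{\infty} E(y_i)\, P(N \geqq i) \leqq \beta_1 \sum_{i=1}^{\infty} P(N \geqq i) = \beta_1\, E(N) < \infty.$$

The fourth step uses the assumed independence of the event $\{N \geqq i\}$ from $y_i$, and the final equality is the standard tail-sum formula for $E(N)$.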

Citation

Ragnar Ericson. "On Moments of Cumulative Sums." Ann. Math. Statist. 37 (6) 1803 - 1805, December, 1966. https://doi.org/10.1214/aoms/1177699170

Information

Published: December, 1966
First available in Project Euclid: 27 April 2007

zbMATH: 0168.17903
MathSciNet: MR202227
Digital Object Identifier: 10.1214/aoms/1177699170

Rights: Copyright © 1966 Institute of Mathematical Statistics
