# On Moments of Cumulative Sums
Ragnar Ericson
Ann. Math. Statist. 37(6): 1803-1805 (December, 1966). DOI: 10.1214/aoms/1177699170

## Abstract

In the theory of sequential analysis developed by Wald there appear sums of the form $X = \sum^N_1 x_i$, where both the $x_i$ and $N$ are random variables. In this note we consider conditions for the existence of $E(X^k)$ when the $x_i$ are independent random variables and the event $N \geqq i$ is independent of $x_i, x_{i+1}, \cdots$. Let $|x_i| = y_i$ and $Y = \sum^N_1 y_i$. We show that sufficient conditions for $E(Y^k) < \infty$ are that $E(y^k_i) \leqq \beta_k < \infty$ and $E(N^k) < \infty$, $k = 1, 2, \cdots$ (proved for $k = 1$ in ), and that if we can find a constant $c < \infty$ such that $P(y_i \leqq c)E(y_i \mid y_i \leqq c) \geqq \alpha > 0$ for $i = 1, 2, \cdots$, then a necessary condition for $E(Y^k) < \infty$ is $E(N^k) < \infty$. We also show that $E(x_i) = 0$, $E(x^{2k}_i) \leqq \beta_{2k} < \infty$, and $E(N^k) < \infty$ imply that $E(X^{2k}) = \lim_{m \rightarrow \infty} E(X^{2k}_m) < \infty$ for $k = 1, 2, \cdots$ (proved for $k = 1, 2$ in , for $k \geqq 3$ proved independently in ).
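The $k = 1$ case of the last claim is the classical Wald setting: with $E(x_i) = 0$, $E(x_i^2) = \sigma^2$, and $N$ independent of the $x_i$ (so that $N \geqq i$ is trivially independent of $x_i, x_{i+1}, \cdots$), the second-moment identity $E(X^2) = \sigma^2 E(N)$ holds. The Monte Carlo sketch below (not part of the paper; the geometric stopping rule and Rademacher increments are illustrative choices) checks this numerically:

```python
import random

# Illustrative check of E(X^2) = sigma^2 * E(N) when E(x_i) = 0 and
# N is drawn independently of the x_i, so {N >= i} is independent of
# x_i, x_{i+1}, ... as required in the abstract.
random.seed(0)

def sample_X(p=0.25):
    # N ~ geometric with success probability p, independent of the x_i;
    # hence E(N) = 1/p and E(N^k) < infinity for every k.
    n = 1
    while random.random() > p:
        n += 1
    # x_i = +/-1 with equal probability: E(x_i) = 0, E(x_i^2) = 1.
    return sum(random.choice((-1, 1)) for _ in range(n))

trials = 200_000
mean_X2 = sum(sample_X() ** 2 for _ in range(trials)) / trials
# With sigma^2 = 1 and E(N) = 1/0.25 = 4, mean_X2 should be close to 4.
print(mean_X2)
```

Because the geometric $N$ has finite moments of all orders and the increments are bounded, every hypothesis in the abstract is satisfied, so all moments $E(X^{2k})$ are finite in this toy example.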

## Citation

Ragnar Ericson. "On Moments of Cumulative Sums." Ann. Math. Statist. 37 (6): 1803-1805, December, 1966. https://doi.org/10.1214/aoms/1177699170

## Information

Published: December, 1966
First available in Project Euclid: 27 April 2007

zbMATH: 0168.17903
MathSciNet: MR202227
Digital Object Identifier: 10.1214/aoms/1177699170