CONVERGENCE OF STOPPED SUMS OF WEAKLY DEPENDENT RANDOM VARIABLES

In this paper we investigate stopped partial sums of weakly dependent sequences. In particular, the results are used to obtain new maximal inequalities for strongly mixing sequences and related almost sure results.


Introduction
A random walk with weakly dependent increments is a sequence {S_n ; n ≥ 0} of random variables with S_0 = 0 whose increments {X_k ; k ≥ 1} are weakly dependent in some sense. The applications of these random walks in renewal theory were studied by Berbee (1979).
Motivated by applications to sequential analysis, we shall investigate stopped random walks {S_{τ_n} ; n ≥ 1}, where the τ_n are stopping times.
For the case when the increments are independent, a variety of convergence results and applications are surveyed in Gut (1986).
In this paper we assume the increments are weakly dependent. Mixing-type dependence provides general models for which we can prove convergence theorems of both weak and strong type, with wide applicability to Markov chains, Gaussian processes, time series, number theory, etc. Various examples are contained in Bradley (1986) and Doukhan (1994).
The problem we investigate here also reveals the relation of mixing sequences to some generalized notions of martingales, such as amarts (Austin, Edgar and Ionescu Tulcea (1974)) and semiamarts (Edgar and Sucheston (1976), Krengel and Sucheston (1978)). Generalized martingales are important for obtaining martingale-like maximal inequalities, tightness of the stochastic processes associated to partial sums, and various strong and weak limit theorems. Semiamarts have proven useful in connection with optimal stopping rules, and the reversed semiamart property is useful in proving uniform integrability of partial sums. In addition, amarts admit the Riesz decomposition into a martingale and a potential, which makes it easy to apply martingale tools to all the examples of amarts we shall give in this paper. The properties of amarts and semiamarts are surveyed in Gut and Schmidt (1983) and Edgar and Sucheston (1992).
Let (Ω, K, P) be a probability space. We shall introduce some measures of dependence between two sub-σ-algebras A and B of K.
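The displayed definitions of these dependence coefficients did not survive in this copy. The standard coefficients, which are presumably the ones intended here (a reconstruction, not the author's exact display), are:

```latex
% Standard measures of dependence between sub-sigma-algebras A and B of K
% (assumed definitions; the original displays are missing from this copy).
\alpha(\mathcal{A},\mathcal{B})
  = \sup_{A\in\mathcal{A},\,B\in\mathcal{B}} \bigl|P(A\cap B)-P(A)P(B)\bigr|,
\qquad
\varphi(\mathcal{A},\mathcal{B})
  = \sup_{\substack{A\in\mathcal{A},\,P(A)>0\\ B\in\mathcal{B}}} \bigl|P(B\mid A)-P(B)\bigr|,
\qquad
\rho(\mathcal{A},\mathcal{B})
  = \sup_{\substack{X\in L^2(\mathcal{A})\\ Y\in L^2(\mathcal{B})}} \bigl|\operatorname{Corr}(X,Y)\bigr| .
```

The sequence coefficients α(n), ρ(n), ϕ(n) are then obtained in the usual way, by taking A = σ(X_i, i ≤ k), B = σ(X_i, i ≥ k + n) and the supremum over k.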
Let now (X_n)_{n≥1} be a sequence of random variables on (Ω, K, P), and denote by S_n = X_1 + · · · + X_n the partial sums. We shall consider in this paper sums of mixing sequences of random variables and study the convergence of the stopped sums. We shall give sufficient conditions for the convergence of the stopped net, conditions which relate the concept of mixing sequences to those of amarts and semiamarts. In particular, if the stopping time is defined as the first crossing of an interval by the sums, we get maximal inequalities for partial sums. Some applications of the results to the convergence of series of weakly dependent random variables will be given, and some of these results can be formulated in the context of Banach space valued random variables. The maximal inequalities can be used to obtain invariance principles for the random elements associated to partial sums of weakly dependent random variables, by using Theorem (8.3) in Billingsley (1968). Other inequalities, for rank orders of partial sums, can be obtained by defining other stopping times (as in Newman and Wright (1982)), and generalizations of these results to random fields can also be considered. However, we shall not pursue all these implications and directions in this paper; we leave them for further research.
In the following text, [x] denotes the integer part of x, and α(n), ρ(n) and ϕ(n) denote the mixing coefficients of the sequence (X_n)_{n≥1}.

The results
Our first theorem gives an upper bound for the second moment of stopped random sums in terms of the ρ-mixing coefficients and the second moments of the individual summands. It is well known that if the sequence is ρ-mixing then c(ρ, n), defined in the following theorem, is a slowly varying function of n.

THEOREM 2.1 Let τ be a stopping time. Then there is an absolute constant K such that for every n ≥ 1:

As a first corollary of Theorem 2.1 we can formulate the following maximal inequality.

COROLLARY 2.1 (Bradley and Utev (1994))
Under the conditions of Theorem 2.1 there is a constant K such that for every n ≥ 1 and every λ > 0:

where c(ρ, n) is defined by (2.1).
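The display of Corollary 2.1 is missing from this copy. Since the corollary follows from Theorem 2.1 via a first-crossing stopping time, a bound of the following shape is presumably intended (the exact form of the right-hand side is our assumption, not the author's statement):

```latex
% Conjectured shape of the maximal inequality of Corollary 2.1,
% with c(\rho,n) the slowly varying factor defined in (2.1).
P\Bigl(\max_{1\le i\le n}|S_i|\ge\lambda\Bigr)
  \;\le\; \frac{K\,c(\rho,n)}{\lambda^{2}} \sum_{i=1}^{n} E X_i^{2} .
```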
Another consequence of Theorem 2.1 is the following result, which gives a sufficient condition for the almost sure convergence of series in terms of ρ-mixing coefficients and provides a class of examples of amarts, including Markov processes satisfying an L^2 operator condition and functions of some Gaussian sequences (see Bradley (1986)).

We denote by Q_X(u) = inf{t : P(|X| > t) ≤ u} the quantile function of X. We shall now estimate the second moment of stopped partial sums in terms of the strong mixing coefficients and moments of the individual summands. Notice that if

then c(α, n, δ) in the next theorem is a numerical constant which does not depend on n. This summability condition imposed on the strong mixing coefficients is implied by the condition Σ_k α(k)^{δ/(2+δ)} < ∞, which is widely used to bound the variance of S_n. This improvement is possible due to a recent result by Rio (1993). The next theorem gives an estimate for the second moment of S_{τ∧n} for strong mixing sequences which is, up to a multiplicative constant, the same as for Var(S_n).
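The improvement attributed to Rio (1993) rests on his covariance inequality in terms of quantile functions; for the reader's orientation (quoted from memory, so the constant should be checked against Rio (1993)):

```latex
% Rio's (1993) covariance inequality for strongly mixing variables,
% with Q_X, Q_Y the quantile functions defined above.
\bigl|\operatorname{Cov}(X,Y)\bigr|
  \;\le\; 2\int_{0}^{2\alpha} Q_X(u)\,Q_Y(u)\,du,
\qquad
\alpha=\alpha\bigl(\sigma(X),\sigma(Y)\bigr),
```

which is sharper than the classical Davydov-type bound in terms of α^{δ/(2+δ)} and the (2+δ)-norms of X and Y.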
THEOREM 2.2 Let (X_i)_{i≥1} be a centered sequence of random variables such that, for a certain δ > 0, E|X_i|^{2+δ} < ∞ for every i ≥ 1. Let τ be a stopping time. Denote

Then for every n ≥ 1 we have

With a proof similar to that of Corollary 2.2, we obtain the following class of convergent amarts, which is a subclass of strongly mixing sequences.
As an important consequence of Proposition 2.1 and of Theorem 2.2, we obtain the following maximal inequalities for partial sums of a strongly mixing sequence. Related results were obtained by Shao (1993) and Rio (1995).

COROLLARY 2.4 Let (X_i)_{i≥1} be a centered sequence of random variables such that E|X_i|^2 < ∞ for every i ≥ 1. Then, for every λ > 0 and every n ≥ 1, we have:

If E|X_i|^{2+δ} < ∞ for every i ≥ 1, we have:

where c(α, n, δ) is defined in Theorem 2.2.
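The displays of Corollary 2.4 are likewise missing. By analogy with Corollary 2.1 and the second-moment bound of Theorem 2.2, the second inequality plausibly has the following shape (a conjecture as to form only; the constant and the choice of norms are our assumptions):

```latex
% Conjectured shape of the strong-mixing maximal inequality of Corollary 2.4,
% with c(\alpha,n,\delta) as defined in Theorem 2.2.
P\Bigl(\max_{1\le i\le n}|S_i|\ge\lambda\Bigr)
  \;\le\; \frac{K\,c(\alpha,n,\delta)}{\lambda^{2}}
  \sum_{i=1}^{n} \|X_i\|_{2+\delta}^{2} .
```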
For the case of ϕ-mixing coefficients we shall establish:

Proofs
The proofs of Theorems 2.1 and 2.2 are based on the following lemma, which is inspired by Garsia's (1973) version of Doob's maximal inequality.
LEMMA 3.1 Let (X_k)_{k≥1} be a sequence of random variables and let τ be a stopping time. Then for every n ≥ 1,
(i)

(ii)

and
(iii)

PROOF.
In order to prove (i) we just observe that

Now (iii) results from (i) by trivial computations and the fact that (τ ≤ i) ⊂ (τ ≤ j) for every i ≤ j.
In order to prove (ii) we have only to remark that (i) also implies

□

In order to prove Theorem 2.1 we also need the following analytical result, which is a reformulation of Theorem 7 in Bradley and Utev (1994), incorporating Remark 1 of the same paper.
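Although the displays of Lemma 3.1 are not reproduced in this copy, the identity underlying (i) is the usual decomposition of a stopped sum over the events {τ ≥ i} (our reconstruction, not the author's exact display):

```latex
% Decomposition of the stopped sum over the events {tau >= i};
% squaring and taking expectations yields the double-sum bound.
S_{\tau\wedge n} \;=\; \sum_{i=1}^{n} X_i\,\mathbf{1}(\tau\ge i),
\qquad\text{hence}\qquad
E\,S_{\tau\wedge n}^{2}
  \;=\; \sum_{i=1}^{n}\sum_{j=1}^{n}
  E\bigl[X_i X_j\,\mathbf{1}(\tau\ge i)\,\mathbf{1}(\tau\ge j)\bigr],
```

where 1(τ ≥ i)1(τ ≥ j) = 1(τ ≥ max(i, j)), which is the monotonicity fact (τ ≤ i) ⊂ (τ ≤ j) invoked in the proof of (iii).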

LEMMA 3.2 For any two

Then there is an absolute constant K such that for every a ≥ 1 and n ≥ 1:

PROOF OF THEOREM 2.1
We shall apply inequality (ii) of Lemma 3.1 to the sequence (X_i)_{i≥1} and get:

By Theorem 1.1 in Utev (1991), we can find a constant C_1 such that

We now apply Lemma 3.2 to the sequences

and observe that r_1(u) = 0 for every u ≥ 1 and that, since the Z_k are all centered, r_2(u) ≤ ρ(u) and r_3(u) ≤ ρ(u) for every u ≥ 1. Therefore we can find a positive constant K such that:

PROOF OF COROLLARY 2.1
Let λ > 0 and denote τ = inf{i ≥ 1 : |S_i| ≥ λ}. According to Theorem 2.1, there is a constant K such that for every n ≥ 1:

We have only to remark that, by the definition of τ,

Let τ be a finite stopping time, τ ≥ m. By Theorem 2.1 we can find a constant K such that:

It is easy to see that

(3.5)

where α = α(σ(Y), σ(Z)).
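The first-crossing argument used for Corollary 2.1 can be spelled out as follows (a standard reconstruction of the missing displays):

```latex
% First-crossing stopping time: the maximum exceeds lambda iff tau <= n,
% and on that event |S_{tau ^ n}| = |S_tau| >= lambda.
\tau=\inf\{i\ge 1:\ |S_i|\ge\lambda\}
\ \Longrightarrow\
\Bigl\{\max_{1\le i\le n}|S_i|\ge\lambda\Bigr\}=\{\tau\le n\},
\qquad
\lambda^{2}\,P(\tau\le n)
\;\le\; E\bigl[S_{\tau\wedge n}^{2}\,\mathbf{1}(\tau\le n)\bigr]
\;\le\; E\,S_{\tau\wedge n}^{2},
```

and Theorem 2.1 then bounds E S_{τ∧n}^2, which yields the maximal inequality.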

PROOF OF PROPOSITION 2.1
In order to establish this result, we shall use relation (iii) of Lemma 3.1, where we expand ES_n^2. We have

We now apply Lemma 3.3 twice, taking into account that

We obtain

whence the result follows by easy computations involving a change in the order of summation.

PROOF OF THEOREM 2.2
A careful reader can easily obtain this theorem from Proposition 2.1 and the proof of Theorem 1.2 and its consequences in Rio (1993). For convenience we shall sketch the proof. By the Cauchy-Schwarz inequality, for all 1 ≤ k ≤ n,

In order to estimate the first integral on the right-hand side of the inequality, we make the change of variable ν = Σ_{j=1}^n I(2α(j) > u) and observe that ν = j iff 2α(j + 1) ≤ u < 2α(j). After an integration by parts, we obtain

where c(δ) = (4δ^{-1} + 2)^{δ/(2+δ)}. In order to establish Theorem 2.2 we just combine this last inequality with Proposition 2.1.

PROOF OF COROLLARY 2.4
This corollary follows immediately from Proposition 2.1, Theorem 2.2 and the proof of Corollary 2.1.

PROOF OF THEOREM 2.3
Let τ be a finite stopping time such that τ ≤ n a.e. for some n ≥ 1. Then

which is uniformly bounded over all finite stopping times. This proves that (S_n)_{n≥1} is a semiamart.
Assume now that (S_n)_{n≥1} is convergent in L^1. Fix ε > 0 with ε < 1 − ϕ(1). We can find an integer m ≥ 1 such that for every k ≥ m and every x > ε we have

Let τ be a finite stopping time, m ≤ τ ≤ n a.e. By (3.6) and (3.7), for every x > ε we have

Therefore (S_τ)_{τ∈T} is Cauchy in L^1 and, as a consequence, (E|S_τ|)_{τ∈T} is convergent. This proves that (S_n)_{n≥1} is an amart bounded in L^1, and the result follows as in Corollary 2.2.
A net (a_τ)_{τ∈T} is said to converge to a if and only if for every ε > 0 there exists τ_0 ∈ T such that |a_τ − a| < ε for all τ ∈ T with τ ≥ τ_0. We say that (S_n)_{n≥1} is an amart if and only if the net (ES_τ)_{τ∈T} is convergent. We say that (S_n)_{n≥1} is a semiamart if and only if the net (ES_τ)_{τ∈T} is bounded.

COROLLARY 2.2 Let (X_i)_{i≥1} be a centered sequence of random variables satisfying Σ_i ρ(2^i) < ∞ and

Then (S_τ)_{τ∈T} is convergent in L^2 and therefore (S_n)_{n≥1} is an amart which is convergent a.s. and in L^2.

PROOF. The hypothesis implies that (S_τ)_{τ∈T} is a Cauchy sequence in L^2 and therefore (ES_τ)_{τ∈T} is convergent; by Definition 1.2, (S_n)_{n≥1} is an amart. We also remark that, by (3.5), S_n is bounded in L^2 and therefore, by Theorem 2 in Austin, Edgar and Ionescu Tulcea (1974), S_n is convergent a.e. Using (3.5) once again, we see that S_n is also convergent in L^2. □

As a consequence of Corollaries 2.1 and 2.2, we can easily see that the condition Σ_n ρ(2^n) < ∞ can replace the condition of independence in many almost sure results for sums of random variables. For instance, most of the laws of large numbers of Chapter IX in Petrov (1975) hold with essentially the same proofs, including the sufficiency part of the three series theorem, the sufficiency parts of the Kolmogorov and Feller strong laws of large numbers, and the almost sure convergence of kernel-based recursive procedures. See, for instance, Roussas (1991), (1992).
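As a quick sanity check on these notions: every martingale is an amart, since for any bounded stopping time τ the optional stopping theorem gives

```latex
% For a martingale (S_n) and any bounded stopping time tau in T,
% optional stopping makes the net (E S_tau) constant, hence convergent.
E\,S_{\tau} \;=\; E\,S_{1}
\qquad\text{for every bounded stopping time }\tau\in T,
```

so the net (ES_τ)_{τ∈T} is constant, hence convergent; in particular every martingale is also a semiamart.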