Large deviation exponential inequalities for supermartingales

Let $(X_{i}, \mathcal{F}_{i})_{i\geq 1}$ be a sequence of supermartingale differences and let $S_k=\sum_{i=1}^k X_i$. We give an exponential moment condition under which $P(\max_{1\leq k \leq n} S_k \geq n)=O(\exp\{-C_1 n^{\alpha}\}),$ $n\rightarrow \infty,$ where $\alpha \in (0, 1)$ is given and $C_{1}>0$ is a constant. We also show that the power $\alpha$ is optimal under the given condition. In particular, when $\alpha=1/3$, we recover an inequality of Lesigne and Volný.


Introduction
Let $(X_i, \mathcal{F}_i)_{i\geq 1}$ be a sequence of martingale differences and let $S_k=\sum_{i=1}^k X_i$, $k \geq 1$. Under the Cramér condition $\sup_i E e^{|X_i|} < \infty$, Lesigne and Volný [9] proved that
$$P(S_n \geq n) = O(\exp\{-C_1 n^{1/3}\}), \quad n \to \infty, \qquad (1)$$
for some constant $C_1 > 0$. Here and throughout the paper, for two functions $f$ and $g$, we write $f(n)=O(g(n))$ if there exists a constant $C>0$ such that $|f(n)| \leq C|g(n)|$ for all $n \geq 1$. Lesigne and Volný [9] also showed that the power $1/3$ in (1) is optimal, in the sense that there exists a sequence of martingale differences $(\widetilde{X}_i, \widetilde{\mathcal{F}}_i)_{i\geq 1}$ such that $\sup_i E e^{|\widetilde{X}_i|} < \infty$ and $P(\widetilde{S}_n \geq n) > \exp\{-C_2 n^{1/3}\}$ for some constant $C_2>0$ and infinitely many $n$. Liu and Watbled [10] proved that the power $1/3$ in (1) can be improved to $1$ under the conditional Cramér condition $\sup_i E(e^{|X_i|}\,|\,\mathcal{F}_{i-1}) \leq C_3$ for some constant $C_3$. It is therefore natural to ask under which condition it holds that
$$P(S_n \geq n) = O(\exp\{-C_1 n^{\alpha}\}), \quad n \to \infty, \qquad (2)$$
where $\alpha \in (0, 1)$ is given and $C_1>0$ is a constant. In this paper, we give sufficient conditions under which (2) holds for supermartingales $(S_k, \mathcal{F}_k)_{k\geq 1}$.
The paper is organized as follows. In Section 2, we present the main results. In Sections 3-5, we give the proofs of the main results.
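To make the Cramér condition concrete, here is a small numerical sketch (ours, not part of the paper; the function name and the Laplace example are our own illustration): for a two-sided exponential variable with density $(\lambda/2)e^{-\lambda|x|}$, one has $E e^{|X|} = \lambda/(\lambda-1)$ when $\lambda > 1$, while the condition fails for $\lambda \leq 1$.

```python
import math

def cramer_moment(lam: float, t_max: float = 60.0, steps: int = 200_000) -> float:
    """Approximate E[e^{|X|}] for X with density (lam/2) e^{-lam |x|},
    i.e. lam * integral_0^inf e^{(1 - lam) t} dt, by midpoint quadrature.
    The integral is finite only when lam > 1 (the Cramer condition)."""
    h = t_max / steps
    total = 0.0
    for k in range(steps):
        t = (k + 0.5) * h
        total += math.exp((1.0 - lam) * t)
    return lam * total * h

# Closed form for lam > 1 is lam / (lam - 1).
print(cramer_moment(2.0))  # close to 2.0
print(cramer_moment(1.5))  # close to 3.0
```

For $\lambda$ close to $1$ the quadrature value grows without bound as `t_max` increases, reflecting the divergence of the exponential moment.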

Main Results
Our first result is an extension of the bound (1) of Lesigne and Volný.
Theorem 2.1. Let $\alpha \in (0, 1)$ and assume that $\sup_i E\exp\{|X_i|^{\frac{2\alpha}{1-\alpha}}\} < \infty$. Then, for all $x > 0$,
$$P\Big(\max_{1\leq k \leq n} S_k \geq nx\Big) \leq C(\alpha, x)\exp\big\{-c(\alpha)\, x^{2\alpha} n^{\alpha}\big\}, \qquad (3)$$
where the constant $C(\alpha, x) > 0$ does not depend on $n$. In particular, with $x = 1$, it holds
$$P\Big(\max_{1\leq k \leq n} S_k \geq n\Big) = O\big(\exp\{-c(\alpha)\, n^{\alpha}\}\big), \quad n \to \infty. \qquad (4)$$
Moreover, the power $\alpha$ in (3) is optimal even for the class of stationary martingale differences: for each $\alpha \in (0, 1)$, there exists a stationary sequence of martingale differences $(\widetilde{X}_i, \widetilde{\mathcal{F}}_i)_{i\geq 1}$ satisfying $\sup_i E\exp\{|\widetilde{X}_i|^{\frac{2\alpha}{1-\alpha}}\} < \infty$ and $P(\widetilde{S}_n \geq n) \geq \exp\{-C_2 n^{\alpha}\}$ for some constant $C_2 > 0$ and all $n$ large enough.
It is clear that when $\alpha = \frac{1}{3}$, the bound (3) implies the bound (1) of Lesigne and Volný. In our second result, we replace the condition $\sup_i E\exp\{|X_i|^{\frac{2\alpha}{1-\alpha}}\} < \infty$ of Theorem 2.1 by the weaker condition $\sup_i E\exp\{(X_i^+)^{\frac{\alpha}{1-\alpha}}\} < \infty$, where $X_i^+ = \max\{X_i, 0\}$; the price to pay is that the resulting bound involves the sum of conditional variances $\langle S\rangle_n = \sum_{i=1}^n E(X_i^2 \mid \mathcal{F}_{i-1})$.
Adding a hypothesis on $\langle S\rangle_n$ to Theorem 2.2, we can easily obtain the following Bernstein-type inequality, which is similar to an inequality of Merlevède, Peligrad and Rio [11] for weakly dependent sequences.
In particular, with $x = 1$, it holds
$$P\Big(\max_{1\leq k \leq n} S_k \geq n\Big) \leq (1 + nC_1 + C_2)\exp\{-C n^{\alpha}\}, \qquad (7)$$
where $C > 0$ is an absolute constant. Moreover, the power $\alpha$ in (7) is optimal even for the class of stationary martingale differences: for each $\alpha \in (0, 1)$, there exists a stationary sequence of martingale differences satisfying the conditions of Corollary 2.1 such that $P(S_n \geq n) \geq \exp\{-c\, n^{\alpha}\}$ for some constant $c > 0$ and all $n$ large enough.
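As a quick sanity check of maximal inequalities of this Hoeffding/Bernstein type, the following simulation (our own sketch; all names and parameters are ours, not from the paper) compares the empirical tail of $\max_{1\leq k\leq n} S_k$ for i.i.d. Rademacher steps with the classical Azuma-Hoeffding maximal bound $\exp\{-nx^2/2\}$, a bounded special case of the regime covered here.

```python
import math
import random

def empirical_max_tail(n: int, x: float, trials: int, seed: int = 12345) -> float:
    """Monte Carlo estimate of P(max_{1<=k<=n} S_k >= n*x)
    for S_k a simple random walk with Rademacher (+1/-1) steps."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s, m = 0, 0
        for _ in range(n):
            s += 1 if rng.random() < 0.5 else -1
            if s > m:
                m = s
        if m >= n * x:
            hits += 1
    return hits / trials

n, x = 100, 0.3
estimate = empirical_max_tail(n, x, trials=20_000)
bound = math.exp(-n * x * x / 2)  # Azuma-Hoeffding maximal bound, exp{-n x^2 / 2}
print(estimate <= bound)  # prints True: the empirical tail respects the bound
```

With these parameters the bound is roughly $e^{-4.5} \approx 0.011$, while the empirical frequency is a few times smaller, as expected from the slack in the exponential bound.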
In the i.i.d. case, the conditions of Corollary 2.1 can be weakened considerably; see Lanzinger and Stadtmüller [8], where a bound of this type is obtained under an exponential moment condition on the positive part of $X_1$ alone.

Proof of Theorem 2.1

We need the following refined version of the Azuma-Hoeffding inequality; a proof can be found in Laib [7].

Lemma 3.1. Let $(\eta_i, \mathcal{F}_i)_{i\geq 1}$ be a sequence of supermartingale differences satisfying $|\eta_i| \leq a$ for some constant $a > 0$. Then, for all $x > 0$,
$$P\Big(\max_{1\leq k\leq n} \sum_{i=1}^k \eta_i \geq x\Big) \leq \exp\Big\{-\frac{x^2}{2na^2}\Big\}. \qquad (10)$$

Now, we are ready to prove Theorem 2.1. We start as in Lesigne and Volný [9] and push a step further by using the martingale maximal inequality (10). We end by giving a simple example to show that the power $\alpha$ in (3) is optimal.
Let $(X_i, \mathcal{F}_i)_{i\geq 1}$ be a sequence of supermartingale differences. Given $u > 0$, define
$$X_i' = X_i \mathbf{1}_{\{|X_i| \leq u\}} - E\big(X_i \mathbf{1}_{\{|X_i| \leq u\}} \,\big|\, \mathcal{F}_{i-1}\big), \qquad X_i'' = X_i - X_i',$$
and set $S_k' = \sum_{i=1}^k X_i'$ and $S_k'' = \sum_{i=1}^k X_i''$, so that $S_k = S_k' + S_k''$. Then $(X_i', \mathcal{F}_i)_{i\geq 1}$ is a sequence of martingale differences with $|X_i'| \leq 2u$.
Using Lemma 3.1 and $|X_i'| \leq 2u$, we have, for all $x, u > 0$,
$$P\Big(\max_{1\leq k\leq n} S_k' \geq nx\Big) \leq \exp\Big\{-\frac{(nx)^2}{2n(2u)^2}\Big\} = \exp\Big\{-\frac{nx^2}{8u^2}\Big\}. \qquad (11)$$
Using the martingale maximal inequality p. 14 in [6], we get
$$P\Big(\max_{1\leq k\leq n} S_k'' \geq nx\Big) \leq \frac{1}{nx}\sum_{i=1}^n E(X_i'')^+. \qquad (12)$$
It is easy to see that, since $E(X_i \mid \mathcal{F}_{i-1}) \leq 0$,
$$E(X_i'')^+ \leq 2\,E\big(|X_i|\mathbf{1}_{\{|X_i| > u\}}\big) \qquad (13)$$
and
$$E\big(|X_i|\mathbf{1}_{\{|X_i| > u\}}\big) \leq \frac{1}{u^2}\,E\big(|X_i|^3\mathbf{1}_{\{|X_i| > u\}}\big). \qquad (14)$$
Notice that the function $g(t) = t^3 \exp\{-t^{\frac{2\alpha}{1-\alpha}}\}$ is decreasing in $[\beta, +\infty)$ and is increasing in $[0, \beta]$, where $\beta = \big(\frac{3(1-\alpha)}{2\alpha}\big)^{\frac{1-\alpha}{2\alpha}}$. Hence, for $u \geq \beta$,
$$|X_i|^3\mathbf{1}_{\{|X_i| > u\}} \leq g(u)\exp\big\{|X_i|^{\frac{2\alpha}{1-\alpha}}\big\}, \qquad (15)$$
while, by the condition of Theorem 2.1,
$$\sup_i E\exp\big\{|X_i|^{\frac{2\alpha}{1-\alpha}}\big\} \leq C_1. \qquad (16)$$
Returning to (14), by (15) and (16), we get
$$E\big(|X_i|\mathbf{1}_{\{|X_i| > u\}}\big) \leq C_1 u \exp\big\{-u^{\frac{2\alpha}{1-\alpha}}\big\}. \qquad (17)$$
From (13), it follows that
$$E(X_i'')^+ \leq 2C_1 u \exp\big\{-u^{\frac{2\alpha}{1-\alpha}}\big\}. \qquad (18)$$
Combining (11), (12) and (18), we obtain, for all $x > 0$ and all $u \geq \beta$,
$$P\Big(\max_{1\leq k\leq n} S_k \geq 2nx\Big) \leq \exp\Big\{-\frac{nx^2}{8u^2}\Big\} + \frac{2C_1 u}{x}\exp\big\{-u^{\frac{2\alpha}{1-\alpha}}\big\}.$$
Hence, taking $u = (nx^2)^{\frac{1-\alpha}{2}}$ and then replacing $x$ by $x/2$, we obtain, for all $x > 0$,
$$P\Big(\max_{1\leq k\leq n} S_k \geq nx\Big) \leq C(\alpha, x)\exp\big\{-c(\alpha)\, x^{2\alpha} n^{\alpha}\big\}.$$
This completes the proof of the first assertion of Theorem 2.1. Next, we prove that the power $\alpha$ in (3) is optimal. We take a positive random variable $X$ such that, for all $x > 1$,
$$P(X \geq x) = \exp\big\{-2x^{\frac{2\alpha}{1-\alpha}}\big\}. \qquad (19)$$
It is easy to verify that $E\exp\{X^{\frac{2\alpha}{1-\alpha}}\} < \infty$. Assume that $(\xi_i)_{i\geq 1}$ are Rademacher random variables independent of $X$, i.e. $P(\xi_i = 1) = P(\xi_i = -1) = 1/2$, and set $\widetilde{X}_i = X\xi_i$ and $\widetilde{\mathcal{F}}_i = \sigma(X, \xi_1, \ldots, \xi_i)$; then $(\widetilde{X}_i, \widetilde{\mathcal{F}}_i)_{i\geq 1}$ is a stationary sequence of martingale differences. For $\beta \in (1/2, 1)$,
$$P(\widetilde{S}_n \geq n) \geq P\big(X \geq n^{1-\beta}\big)\, P\Big(\sum_{i=1}^n \xi_i \geq n^{\beta}\Big).$$
Since, for $n$ large enough, $P\big(\sum_{i=1}^n \xi_i \geq n^{\beta}\big) \geq \exp\{-n^{2\beta - 1}\}$ (cf. Corollary 3.5 in Lesigne and Volný [9]), we get, for $n$ large enough,
$$P(\widetilde{S}_n \geq n) \geq \exp\big\{-2n^{(1-\beta)\frac{2\alpha}{1-\alpha}} - n^{2\beta - 1}\big\}.$$
Setting $2\beta - 1 = \alpha$, we obtain, for $n$ large enough, $P(\widetilde{S}_n \geq n) \geq \exp\{-C_2 n^{\alpha}\}$, which proves that the power $\alpha$ in (3) is optimal.
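The truncation argument balances two exponents: an Azuma-type term of order $nx^2/(8u^2)$, decreasing in the truncation level $u$, against a tail term of order $u^{2\alpha/(1-\alpha)}$ coming from the moment condition, increasing in $u$. A small numeric sketch (our own, not from the paper; the function and grid are our illustration) confirms that the optimized exponent grows like $n^{\alpha}$; for $\alpha = 1/2$ the balance point gives exponent $\sqrt{n/8}$.

```python
import math

def best_exponent(n: float, x: float = 1.0, alpha: float = 0.5) -> float:
    """Maximize min(n x^2 / (8 u^2), u^{2a/(1-a)}) over the truncation level u."""
    p = 2 * alpha / (1 - alpha)
    best = 0.0
    # grid search over u on a log scale
    for k in range(2000):
        u = math.exp(-5 + 12 * k / 1999)  # u ranges over [e^-5, e^7]
        e = min(n * x * x / (8 * u * u), u ** p)
        best = max(best, e)
    return best

alpha = 0.5
ratios = [best_exponent(n, alpha=alpha) / n ** alpha for n in (10**3, 10**4, 10**5)]
print(ratios)  # approximately constant across n, reflecting the n^alpha rate
```

For $\alpha = 1/2$ the common value of the ratios is close to $1/\sqrt{8} \approx 0.354$, the constant obtained by solving $n/(8u^2) = u^2$ exactly.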

Proof of Theorem 2.2
To prove Theorem 2.2, we need the following inequality whose proof can be found in Fan, Grama and Liu [4].
Lemma 4.1. Let $(X_i, \mathcal{F}_i)_{i\geq 1}$ be a sequence of supermartingale differences satisfying $X_i \leq 1$ for all $i$, and let $\langle S\rangle_k = \sum_{i=1}^k E(X_i^2 \mid \mathcal{F}_{i-1})$. Then, for all $x, v > 0$,
$$P\big(S_k \geq x \text{ and } \langle S\rangle_k \leq v^2 \text{ for some } k \in [1, n]\big) \leq \exp\Big\{-\frac{x^2}{2(v^2 + x/3)}\Big\}.$$
Assume that $(X_i, \mathcal{F}_i)_{i\geq 1}$ are supermartingale differences. Given $u > 0$, set
$$X_i' = X_i\mathbf{1}_{\{X_i \leq u\}}, \qquad X_i'' = X_i\mathbf{1}_{\{X_i > u\}},$$
so that $(X_i', \mathcal{F}_i)_{i\geq 1}$ is also a sequence of supermartingale differences and $S_k = S_k' + S_k''$. Since $S_k = S_k'$ on the event $\{\max_{1\leq i\leq n} X_i \leq u\}$ and $\langle S'\rangle_k \leq \langle S\rangle_k$, we deduce, for any $x, u, v > 0$,
$$P\Big(\max_{1\leq k\leq n} S_k \geq x\Big) \leq P\big(S_k' \geq x \text{ and } \langle S'\rangle_k \leq v^2 \text{ for some } k \in [1, n]\big) + P(\langle S\rangle_n > v^2) + \sum_{i=1}^n P(X_i > u). \qquad (22)$$
Applying Lemma 4.1 to the supermartingale differences $(X_i'/u, \mathcal{F}_i)_{i\geq 1}$, we have
$$P\big(S_k' \geq x \text{ and } \langle S'\rangle_k \leq v^2 \text{ for some } k \in [1, n]\big) \leq \exp\Big\{-\frac{x^2}{2(v^2 + xu/3)}\Big\}. \qquad (23)$$
Using the exponential Markov inequality and the condition $\sup_i E\exp\{(X_i^+)^{\frac{\alpha}{1-\alpha}}\} \leq C_1$, we get
$$\sum_{i=1}^n P(X_i > u) \leq nC_1\exp\big\{-u^{\frac{\alpha}{1-\alpha}}\big\}. \qquad (24)$$
Combining the inequalities (22), (23) and (24) together, we obtain, for all $x, u, v > 0$,
$$P\Big(\max_{1\leq k\leq n} S_k \geq x\Big) \leq \exp\Big\{-\frac{x^2}{2(v^2 + xu/3)}\Big\} + P(\langle S\rangle_n > v^2) + nC_1\exp\big\{-u^{\frac{\alpha}{1-\alpha}}\big\}.$$
Taking $u = x^{1-\alpha}$, we get, for all $x, v > 0$,
$$P\Big(\max_{1\leq k\leq n} S_k \geq x\Big) \leq \exp\Big\{-\frac{x^2}{2(v^2 + x^{2-\alpha}/3)}\Big\} + P(\langle S\rangle_n > v^2) + nC_1\exp\{-x^{\alpha}\}.$$
This completes the proof of Theorem 2.2.

Proof of Corollary 2.1
To prove Corollary 2.1, we make use of Theorem 2.2. By Theorem 2.2 applied with the level $nx$, it follows that, for all $x, v > 0$,
$$P\Big(\max_{1\leq k\leq n} S_k \geq nx\Big) \leq \exp\Big\{-\frac{(nx)^2}{2(v^2 + (nx)^{2-\alpha}/3)}\Big\} + P(\langle S\rangle_n > v^2) + nC_1\exp\{-x^{\alpha}n^{\alpha}\}.$$
Using the exponential Markov inequality and the condition $E\exp\{(\langle S\rangle_n/n)^{\frac{\alpha}{1-\alpha}}\} \leq C_2$, we get, for all $v > 0$,
$$P(\langle S\rangle_n > v^2) \leq C_2\exp\big\{-(v^2/n)^{\frac{\alpha}{1-\alpha}}\big\}.$$
Taking $v^2 = (nx)^{1-\alpha}\, n$, we obtain, for all $x > 0$,
$$P\Big(\max_{1\leq k\leq n} S_k \geq nx\Big) \leq \exp\Big\{-\frac{x^{1+\alpha}}{2\big(1 + \frac{1}{3}x\big)}\, n^{\alpha}\Big\} + (nC_1 + C_2)\exp\{-x^{\alpha}n^{\alpha}\},$$
which gives inequality (6). Next, we prove that the power $\alpha$ in (7) is optimal even for the class of stationary martingale differences. Let $X$ be the positive random variable defined in (19). Let $X_i = X\xi_i$ and $\mathcal{F}_i = \sigma(X, (\xi_k)_{k=1,\ldots,i})$, where $(\xi_i)_{i\geq 1}$ are Rademacher random variables independent of $X$. Note that $\langle S\rangle_n/n = X^2$ satisfies the condition $E\exp\{(\langle S\rangle_n/n)^{\frac{\alpha}{1-\alpha}}\} < \infty$. Using the same argument as in the proof of Theorem 2.1, we obtain, for $n$ large enough,
$$P(S_n \geq n) \geq \exp\{-c\, n^{\alpha}\}$$
for some constant $c > 0$, which shows that the power $\alpha$ in (7) is optimal.
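The substitution in the proof above can be checked mechanically. The following sketch is our own (we assume the Theorem 2.2 exponent has the Bernstein form $x^2/(2(v^2 + x^{2-\alpha}/3))$, and the helper names are ours): plugging in the level $nx$ and $v^2 = (nx)^{1-\alpha} n$ reproduces the closed form $x^{1+\alpha} n^{\alpha}/(2(1 + x/3))$ appearing in (6).

```python
def thm22_exponent(x: float, v2: float, alpha: float) -> float:
    """Bernstein-type exponent x^2 / (2 (v^2 + x^{2-alpha}/3)) (assumed form)."""
    return x * x / (2.0 * (v2 + x ** (2.0 - alpha) / 3.0))

def corollary_exponent(x: float, n: float, alpha: float) -> float:
    """Closed form x^{1+alpha} n^alpha / (2 (1 + x/3)) appearing in (6)."""
    return x ** (1.0 + alpha) * n ** alpha / (2.0 * (1.0 + x / 3.0))

# Substituting level n*x and v^2 = (n x)^{1-alpha} * n should give the closed form.
for x, n, alpha in [(1.0, 100.0, 0.3), (2.5, 10_000.0, 0.6), (0.4, 1e6, 0.8)]:
    lhs = thm22_exponent(n * x, (n * x) ** (1.0 - alpha) * n, alpha)
    rhs = corollary_exponent(x, n, alpha)
    assert abs(lhs - rhs) < 1e-9 * rhs
print("substitution verified")
```

The identity is exact: factoring $(nx)^{1-\alpha}$ out of the denominator gives $(nx)^{1+\alpha}/(2n(1 + x/3)) = x^{1+\alpha}n^{\alpha}/(2(1 + x/3))$.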