MARTINGALES ON RANDOM SETS AND THE STRONG MARTINGALE PROPERTY

Let $X$ be a process defined on an optional random set. The paper develops two different conditions on $X$ guaranteeing that it is the restriction of a uniformly integrable martingale. In each case, it is supposed that $X$ is the restriction of some special semimartingale $Z$ with canonical decomposition $Z=M+A$. The first condition, which is both necessary and sufficient, is an absolute continuity condition on $A$. Under additional hypotheses, the existence of a martingale extension can be characterized by a strong martingale property of $X$. Uniqueness of the extension is also considered.


Introduction
Let Λ be an optional random set and let X_t(ω) be defined for (t, ω) ∈ Λ. We consider the following problem and some of its extensions.
(0.1) Problem. Find necessary and sufficient conditions on X guaranteeing that it is the restriction to Λ of a globally defined, right continuous, uniformly integrable martingale.
For an example where this formulation may be natural, consider a process (Y_t)_{t≥0} with values in a manifold. Given a coordinate patch V, let Λ := {(t, ω) : Y_t(ω) ∈ V} and let X_t(ω) denote a real component of Y_t(ω) for (t, ω) ∈ Λ. A second natural example is provided by X = f∘W, where W is a Markov process in a state space E and f is a function defined on a subset S of E, Λ denoting in this case {(t, ω) : W_t(ω) ∈ S}.
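The second example can be sketched concretely. In the following minimal sketch, the particular chain, the patch S and the function f are all illustrative assumptions, not objects from the text; the point is only that X is defined exactly on the random set Λ:

```python
import random

random.seed(0)

# toy Markov chain W on E = {0,1,2,3}: a reflected nearest-neighbour walk;
# the chain, the subset S and the function f are illustrative assumptions
def step(x):
    return min(3, max(0, x + random.choice([-1, 1])))

T = 20
w = [2]
for _ in range(T):
    w.append(step(w[-1]))

S = {1, 2}            # the part of the state space on which f is defined
def f(x):
    return x * x

# Λ(ω) = {t : W_t(ω) ∈ S}; X = f∘W is defined only on Λ
Lam = [t for t in range(T + 1) if w[t] in S]
X = {t: f(w[t]) for t in Lam}

assert set(X) == set(Lam)          # X lives exactly on Λ
assert all(w[t] in S for t in X)   # and only where W is in S
```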
The solution is obvious if there is an increasing sequence of stopping times T_n which are complete sections of Λ (i.e., ⟦T_n⟧ ⊂ Λ and P{T_n < ∞} = 1) such that Λ ⊂ ∪_n ⟦0, T_n⟧. A number of other cases may now be found in the literature. The case of an optional right-open interval of the form ⟦0, ζ⟦ was discussed first by Maisonneuve [Ma77] for continuous martingales, and by Sharpe [Sh92] in the general case. See also [Ya82], [Zh82].
In sections 2 and 3, we give a complete solution for the discrete parameter problem under mild conditions on X. The continuous parameter case, treated in sections 4 and 5, involves considerable additional complication. Roughly speaking, in the discrete parameter case the condition is that X have a "strong martingale property" on Λ (defined in (3.4)), but in the continuous parameter context, an example is given in section 5 to show that this condition is not sufficient. Theorem (4.1), one of the main results of the paper, assumes X is the restriction of a special semimartingale Z = M + A (with A predictable and of integrable total variation) and gives a necessary and sufficient condition in terms of absolute continuity of A with respect to C, the predictable compensator of unit mass at the part of the end of Λ not contained in Λ. Theorems (5.3) and (5.10) give some conditions under which the strong martingale property is sufficient to imply existence of a martingale extension. The proofs are based on a reduction to the special case treated in [Sh92], which is discussed, along with a number of extensions, in section 1.
In view of the obvious case discussed in the first paragraph, it should be kept in mind that these results are primarily of interest in cases where Λ is "sectionally challenged."

Setup
Throughout this paper, we suppose given a probability space (Ω, F, P) and a filtration (F_t), with t either a positive integer index or a positive real index. In the latter case, (F_t) is assumed to satisfy the usual right continuity and completeness hypotheses. Expectation of a random variable X with respect to P is denoted by PX rather than EX.
The optional (resp., predictable) σ-algebra O (resp., P) with respect to (F_t) is that generated by the right (resp., left) continuous processes adapted to (F_t). We assume given a random set Λ ∈ O and a process X defined on Λ (i.e., X_t(ω) is defined only for (t, ω) ∈ Λ) satisfying at minimum:
(1.1) X is the restriction to Λ of some (right continuous) special semimartingale.
Recall that a semimartingale Z is special in case it admits a decomposition Z = Z_0 + M + A with M_0 = A_0 = 0, M a local martingale and A predictable and of locally finite variation, or, equivalently, Z*_t := sup_{s≤t} |Z_s| is locally integrable. See [DM80, VII.23]. The special semimartingale in (1.1) is of course not unique, but we reserve the notation Z = M + A for the canonical decomposition of a suitably chosen special semimartingale Z extending X, with M a martingale and A predictable, A_0 = 0 and of locally integrable variation. We denote by S^1(F.) the class of special semimartingales Z = M + A over the filtration (F_t) such that M is a uniformly integrable martingale and A is of integrable total variation.
The reader is referred to [DM75] and [DM80] for a detailed discussion of the definitions and results used below, but a brief review and clarification of notation may be in order.
• Given a random time ζ, let ε_ζ denote the random measure putting unit mass at ζ on {ζ < ∞}.
• Given a random measure ν on R_+ and a positive measurable process W, W ∗ ν denotes the random measure having density W with respect to ν.
• The left limit X_{0−} is defined to be 0 in all cases, even if X is not defined at 0.
• We use the term predictable compensator instead of dual predictable projection.
The predictable compensator B^p of a process B of integrable total variation is the unique predictable process such that P∫H_t dB_t = P∫H_t dB^p_t for every bounded predictable process H. If B is adapted, this is to say that B − B^p is a martingale.
• By the optional projection ᵒW of a positive measurable process W we mean the unique optional process satisfying P ᵒW_T 1_{{T<∞}} = P W_T 1_{{T<∞}} for all stopping times T.

The fundamental martingale extension result, to which all other cases will be reduced, is the following extension of [Sh92, (4.8)].
(1.2) Proposition. Let ζ denote a stopping time over (F_t), let X be defined on Λ, and suppose X satisfies (1.1). Suppose the associated process Z is a semimartingale in S^1(F.), having canonical decomposition Z = M + A. Then X is the restriction to Λ of a uniformly integrable martingale if and only if A ≪ C. In this case, let H denote a predictable version of dA/dC; the extension is then given explicitly in terms of H.

Proof. The case proved in [Sh92] assumed the stronger hypotheses (i) ζ > 0 a.s. and Λ = ⟦0, ζ⟦; and (ii) M ∈ H^1. We first show how to relax condition (ii) to get the same result under the weaker condition (ii') M is uniformly integrable. (We continue to assume (i) for the moment, so that the corresponding simplifications are in force.) The theorem applied to X^n (as the restriction of Z^n) shows that X^n extends to the martingale X̃^n of class H^1 determined by its final value F^n. We may therefore choose a sequence n_k tending to infinity so rapidly that (F^{n_k}) is a uniformly integrable family. It follows immediately that the uniformly integrable martingale X̃ with final value F satisfies X̃_t = lim_k X̃^{n_k}_t a.s. and in L^1, and so X̃ extends X. This shows that the condition M ∈ H^1 in [Sh92, Theorem (4.8)] may be replaced by the weaker condition (ii') on M.
We now reduce the rest of the problem by a simple artifice to the known case where Λ = ⟦0, ζ⟦, ζ > 0 a.s., and M is uniformly integrable. We first extend Λ if necessary so that (0, ω) ∈ Λ for all ω, defining X_0(ω) := 0 on the extension. In this way, we may assume {ζ = 0} ⊂ Ω_0^c while not affecting the definitions of Z and F. Define the stopping time ζ′ accordingly, and note that since {ζ = 0} ⊂ Ω_0^c, ζ′ > 0 everywhere. Extend X on Λ to X′ on Λ′ := ⟦0, ζ′⟦; the primed objects X′, Z′, A′, C′ then satisfy the hypotheses of the case already treated, and we may conclude that X′ has a martingale extension X̃′, whose restriction is the desired extension of X. Conversely, if X has a martingale extension, then so does X′, and an application of the converse direction of [Sh92, (4.8)] shows that A′ ≪ C′, which, as we showed above, is equivalent to A ≪ C.
(1.3) Remark. In the statement and proof of Proposition (1.2), we took the simplest path to extending Λ and X so that ⟦0⟧ ⊂ Λ, by giving X the value 0 on the part of ⟦0⟧ not in Λ.
In fact, we could have chosen any integrable F_0-measurable random variable J by making simple changes to the definitions of F and Z. We allow for such a modification in (1.4) below.
It was shown in [Sh92] that under the hypotheses ζ > 0 a.s., Λ = ⟦0, ζ⟦ and Z ∈ H^1, the extension X̃ of X is unique among extensions which stop at ζ and satisfy X̃_ζ ∈ F_{ζ−}. We adapt the proof of this result to get a uniqueness result under the broader conditions of the preceding theorem as modified by (1.3).
(1.4) Proposition. Let ζ, Λ, Ω_0, X be as in (1.2) and fix J ∈ L^1(F_0) extending X_0 on Λ_0 as in (1.3). Then the process X̃ constructed in (1.2) is the unique uniformly integrable martingale extending X and satisfying conditions (i) and (ii).

Proof. By taking differences, we may assume X = 0 on Λ ∪ ⟦0⟧, with uniqueness equivalent to showing X̃_∞ = 0. We may assume by (ii) that Λ_0 = Ω. Hypothesis (i) implies of course that X̃ stops at ζ.
a.e. with respect to the P-measure on P given by U. The stopping argument employed in the first paragraph of the proof of (1.2) can be modified to give the following local version of (1.2).
(1.5) Proposition. Assume the same general hypotheses as (1.2), but relaxed so that the canonical decomposition of the special semimartingale Z = Z_0 + M + A has the properties: (a) M is a local martingale; (b) there is an increasing sequence T_n of stopping times such that for each n, M^{T_n} is a uniformly integrable martingale; (c) A ≪ C. Then X extends to a local martingale X̃, and the stopping times T_n reduce X̃ to a uniformly integrable martingale.
Proof. By the same artifice employed in the proof of (1.2), we may reduce the problem to the case Ω_0 = Ω. Let X^n denote the restriction to Λ of Z^{T_n}, which satisfies the conditions of (1.2) with respect to X^n on Λ. Let H^n be a predictable version of dA^{T_n}/dC, which may be assumed to vanish outside the predictable interval ⟦0, T_n⟧. The corresponding expansion of X̃^n_t shows that, in all cases, X̃^{n+1}_t = X̃^n_t for t ≤ T_n. This consistency condition means there is a local martingale X̃ such that X̃^{T_n} = X̃^n, as claimed.

A special discrete parameter case
We begin with a special case, essentially the discrete parameter version of (1.2) but with features of (1.5).
Fix a filtration (F_n)_{n≥0} and let Λ be a discrete parameter random set satisfying
(2.1) (i) for every n ≥ 0, Λ_n (the section of Λ at time n) is in F_n; (ii) Λ_0 ⊃ Λ_1 ⊃ ⋯.
Suppose given a process X defined on Λ such that for every n, X_n is integrable on its domain of definition. Let J ∈ L^1(F_0). We shall first extend Λ so that Λ_0 = Ω, setting X_0(ω) := J(ω) for ω ∈ Λ_0^c. Thus we shall always assume throughout this section that
(2.2) Λ_0 = Ω.
Let G(ω) := sup{n : (n, ω) ∈ Λ} and L := G + 1; in particular, under (2.2), G is truly the end of Λ. We define the σ-algebra F_G of events prior to G by its collection of measurable functions, so that F_G may be identified with the usual (discrete parameter) σ-algebra F_{L−} by its random variables. Define processes Z, A, C and D by Z_n := X_{n∧G} (that is, Z extends X by stopping at the end of Λ), D_n := 1_{{L≤n}}, C := D^p, and A_n := Σ_{k=1}^n P{Z_k − Z_{k−1} | F_{k−1}}. (The expectations in the preceding expression are taken only over the domains of definition of the X_k.) The conditional expectation defining A is therefore meaningful, and A is the unique predictable (i.e., A_n ∈ F_{n−1} for n ≥ 1) process with A_0 = 0 such that Z − A is a martingale. The adapted process D starts at 0 and jumps up by 1 at L (≥ 1), and C is its predictable compensator, so that D − C is a martingale.
Note that since Z and D both stop at the stopping time L, so do A and C.
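The objects G, L, D and C, and the two martingale properties just noted, can be checked exactly in a small model. In the sketch below, the three-toss coin model and the particular decreasing set Λ are illustrative assumptions:

```python
import itertools

# toy decreasing optional set in a 3-toss coin model (illustrative assumption):
# Λ_n = {first n tosses are all heads}, so Λ_0 = Ω ⊃ Λ_1 ⊃ Λ_2 ⊃ Λ_3
N = 3
omegas = list(itertools.product([0, 1], repeat=N))  # uniform; 1 = heads

def in_Lam(n, w):
    return all(w[k] == 1 for k in range(n))

def G(w):                    # end of Λ
    return max(n for n in range(N + 1) if in_Lam(n, w))

def L(w):
    return G(w) + 1

def cond_exp(f, w, k):       # E[f | F_k](w): average over paths agreeing up to k
    paths = [v for v in omegas if v[:k] == w[:k]]
    return sum(f(v) for v in paths) / len(paths)

def D(n, w):                 # D_n = 1_{L <= n}: jumps from 0 to 1 at L
    return 1 if L(w) <= n else 0

def C(n, w):                 # compensator: C_n = sum_{k<=n} E[D_k - D_{k-1} | F_{k-1}]
    return sum(cond_exp(lambda v, k=k: D(k, v) - D(k - 1, v), w, k - 1)
               for k in range(1, n + 1))

for w in omegas:
    # D - C is a martingale: E[(D - C)_n | F_{n-1}] = (D - C)_{n-1} at every atom
    for n in range(1, N + 1):
        lhs = cond_exp(lambda v, n=n: D(n, v) - C(n, v), w, n - 1)
        assert abs(lhs - (D(n - 1, w) - C(n - 1, w))) < 1e-9
    # C stops at L, as noted above: C_n = C_{n ∧ L}
    for n in range(N + 1):
        assert abs(C(n, w) - C(min(n, L(w)), w)) < 1e-9
```

In this particular model ΔC_k = P{ΔD_k | F_{k−1}} equals 1/2 on {L > k − 1} and 0 afterwards, which is what the stopping assertion checks.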

Assuming A ≪ C, one may define a predictable process H unambiguously by taking a predictable version of dA/dC; the resulting process (2.5) defines a martingale extension X̃ of X. It is the unique martingale extending X, stopping at L, and with X̃_L ∈ F_G.
Proof. We reduce this to (1.2) by making the obvious extension to continuous time, replacing the integer index n with the interval [n, n + 1[. More specifically, define X′ on the extended time scale, and similarly extend Z, D, A and B to give step processes Z′, D′, A′ and B′. The stopping times T_n := n reduce Z, M and A in the sense of (1.5). Then (1.2) and (1.5) applied to the primed processes show at once that X̃ is a martingale, and the extension is unique by (1.4) because F_G may be identified with F_{L−}.
(2.6) Remark. Proposition (1.2) implies that the condition A ≪ C is also necessary in order that X have a martingale extension. Thus, any conditions equivalent to X having a martingale extension are equivalent to the condition A ≪ C.

(2.7) Remark. If we had imposed a stronger hypothesis on X, requiring that Z ∈ S^1(F.) (i.e., M is uniformly integrable and A has integrable variation), then the triple of conditions (X̃ is uniformly integrable, X̃_L ∈ F_G, X̃ stops at L) is equivalent to the pair of conditions (X̃ is uniformly integrable, X̃_∞ ∈ F_G), for under the latter pair, X̃_∞ ∈ F_L, so X̃ necessarily stops at L.

(2.8) Remark. The extension of X defined by (2.5) actually stops at G in a number of important cases. Let L_p denote the predictable part of L, defined in [Sh92, §2] as the largest predictable stopping time with graph contained in ⟦L⟧. It is easy to see, for example, that L_p = 1 on {L = 1} = {G = 0}. The details are given in greater generality in (4.8) and the surrounding discussion.

The general discrete parameter case
Let Λ denote an arbitrary optional random set. That is, Λ ⊂ {0, 1, . . .} × Ω satisfies (2.1i) but not necessarily (2.1ii). We suppose also that X is defined on Λ and optional in the sense that for each n, X_n is measurable with respect to the trace of F_n on Λ_n.
For m ≤ n, let W_{m,n} := P{Λ_n^c | F_m} and Γ_{m,n} := {W_{m,n} = 0}. It is easy to see that Γ_{m,n} is the largest (modulo null sets) F_m-measurable subset of Λ_n. Note that m → 1 − W_{m,n} is a positive martingale, and therefore Λ_m ∩ Γ_{m,n} ⊂ Λ_n.

(3.1) Definition. X has the simple martingale property on Λ provided, for every pair m < n, P{X_n | F_m} = X_m on Λ_m ∩ Γ_{m,n}.

Note that the conditional expectation in the line above makes sense, for, as we pointed out above, Λ_m ∩ Γ_{m,n} ⊂ Λ_n, the domain of definition of X_n.
A stronger version of (3.1) will be required. Given stopping times S ≤ T, define Λ_S := {ω : (S(ω), ω) ∈ Λ} and Γ_{S,T} := {P{Λ_T | F_S} = 1}. Then Λ_S ∩ Γ_{S,T} determines the part of the graph of S which is in Λ and on which it is almost certain that T is in Λ. The set Γ_{S,T} may also be described as the largest F_S-measurable set contained in Λ_T. These definitions apply equally in discrete and continuous parameter cases. The following is phrased in continuous parameter terms for later use. Note that the conditional expectation in (3.5) makes sense because of (3.3). Unlike the case of ordinary uniformly integrable martingales, where the simple and strong martingale properties are equivalent (optional sampling theorem), the same is not true for general Λ. For example, in a coin tossing model, define Λ by Λ_0 := Ω, Λ_1 := {heads on first toss}, Λ_2 := {tails on first toss}. It is easy to check that the sets Λ_m ∩ Γ_{m,n} are empty for m < n, and consequently an arbitrary adapted X defined on Λ has the simple martingale property. However, if we define stopping times D_0 := 0 and D_1(ω) := inf{n ≥ 1 : (n, ω) ∈ Λ} (which takes values either 1 or 2), then the strong martingale property is plainly not valid for every adapted X on Λ: take, for example, X_1 = X_2 = 1, X_0 = 0. The following result indicates an important special case where the simple and strong martingale properties are equivalent.
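The coin-tossing example admits a fully explicit check. The sketch below verifies that the simple martingale property is vacuous (the relevant sets are empty) while the strong property fails for the indicated choice of X:

```python
# the coin-tossing example above, made explicit; only the first toss matters
omegas = ['H', 'T']
Lam = {0: {'H', 'T'}, 1: {'H'}, 2: {'T'}}   # Λ_0 = Ω, Λ_1 = heads, Λ_2 = tails

# F_0 is trivial, so for n = 1, 2: P(Λ_n | F_0) = 1/2 < 1, hence Γ_{0,n} = ∅;
# for m = 1, n = 2: Γ_{1,2} = Λ_2 = {'T'}, and Λ_1 ∩ Γ_{1,2} = ∅
assert Lam[1] & Lam[2] == set()     # so the simple property holds vacuously

# D_1(ω) := inf{n >= 1 : (n, ω) ∈ Λ} takes the value 1 on heads, 2 on tails
D1 = {'H': 1, 'T': 2}
X = {(0, 'H'): 0, (0, 'T'): 0, (1, 'H'): 1, (2, 'T'): 1}  # X_0 = 0, X_1 = X_2 = 1

# the strong property at S = D_0 = 0, T = D_1 would force E[X_{D_1} | F_0] = X_0 = 0
# (the graph of D_1 lies in Λ, so the qualifying set is all of Ω), but:
lhs = sum(X[(D1[w], w)] for w in omegas) / len(omegas)
assert lhs == 1.0 and X[(0, 'H')] == 0   # 1 ≠ 0: the strong property fails
```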
(3.6) Theorem. Let Λ be an optional random set satisfying (2.1ii) (i.e., Λ_0 ⊃ Λ_1 ⊃ ⋯), let X be defined on Λ, adapted and with X_n integrable on Λ_n for every n. Let Z, A and C be defined as in (2.3). Then the following are equivalent.
(vi) X satisfies (3.1) for pairs of the form (n − 1, n);
(vii) X has the simple martingale property on Λ.
If, in addition, Z ∈ S^1(F.) (or equivalently, X is the restriction to Λ of a uniformly integrable martingale), then each of the conditions above is equivalent to
(viii) X has the strong martingale property on Λ.
Proof. Properties (i) and (ii) are equivalent by (2.4), and (ii) is clearly equivalent to (iii) since A and C stop at L. The equivalence of (iii) and (iv) then follows by definition of Z, A and C and the fact that {L ≥ n} ∈ F_{n−1}. For the equivalence of (iv) and (v), note that Z_n = Z_{n−1} on {L = n}. It is clear that (i) ⟹ (vii) ⟹ (vi), so the proof of the first assertion will be complete once we prove (vi) ⟹ (v). Assume (vi) holds. Under (2.1ii), Λ_k = {L > k}, and on the latter set (∈ F_{n−1}), L > n a.s., from which (v) follows. Finally, if X is the restriction of a uniformly integrable martingale, it suffices to observe that (i) ⟹ (viii) ⟹ (vii).
We turn now to the case of a general optional (discrete parameter) random set Λ. The idea here is to reduce the problem to (3.6) by a time change argument. For n ≥ 0, let D_n := inf{k ≥ n : (k, ω) ∈ Λ}. Then D_n is an increasing sequence of stopping times tending to infinity, and the graph ⟦D_n⟧ of D_n is a subset of Λ. As in the special case, let G(ω) := sup{n : (n, ω) ∈ Λ} denote the end of Λ, and L := G + 1.
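The basic properties of the time change can be illustrated on a toy optional set (the set below is an illustrative assumption): the graph of each D_n lies in Λ, D_n is increasing in n, and the sections {D_n < ∞} decrease in n, which is what makes the monotone case (3.6) applicable after the time change:

```python
import math

# a toy discrete optional set, one row per sample point (illustrative assumption):
# Lam[w] = set of times n with (n, w) ∈ Λ
Lam = {'w1': {0, 2, 3}, 'w2': {0, 1}, 'w3': {1, 4}}
N = 6

def D(n, w):
    """D_n := inf{k >= n : (k, w) ∈ Λ} (with inf over the empty set = +∞)."""
    hits = [k for k in range(n, N) if k in Lam[w]]
    return hits[0] if hits else math.inf

for w in Lam:
    for n in range(N):
        if D(n, w) < math.inf:
            assert D(n, w) in Lam[w]          # graph of D_n is a subset of Λ
        if n + 1 < N:
            assert D(n, w) <= D(n + 1, w)     # D_n increases with n
            # sections {w : D_n(w) < ∞} are decreasing in n
            assert not (D(n, w) == math.inf and D(n + 1, w) < math.inf)
```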
We assume given a process X defined on Λ. We shall generally be assuming X is the restriction of a semimartingale Z ∈ S^1(F.). Define F̂_n := F_{D_n} and define the random set Λ̂ := {(n, ω) : D_n(ω) < ∞}. Clearly Λ̂ is optional with respect to (F̂_n). Note that Λ̂_0 ⊃ Λ̂_1 ⊃ ⋯, so Λ̂ satisfies (2.1ii) with respect to (F̂_n). Observe too that G is still the end of Λ̂. Next define a process X̂_n adapted to (F̂_n) on the random set Λ̂ by X̂_n := X(D_n).

(3.8) Proposition. Let X be defined on Λ and satisfy: (i) X is the restriction to Λ of a semimartingale Z ∈ S^1(F.); (ii) X has the strong martingale property on Λ.
Then X̂ is the restriction to Λ̂ of a semimartingale Ẑ ∈ S^1(F̂.), and X̂ has the strong martingale property on Λ̂.
Proof. Write as usual Z = M + A, with M a uniformly integrable martingale over (F.) and A predictable and of integrable total variation. We have then Ẑ := Z(D_·) = M̂ + Â, with M̂ := M(D_·) and Â := A(D_·), and clearly M̂ is a uniformly integrable martingale over (F̂.). Though Â is not in general predictable over (F̂.), it is adapted and clearly has integrable total variation. Therefore Ẑ ∈ S^1(F̂.). Then X̂ is the restriction of Ẑ ∈ S^1(F̂.) to Λ̂, and in particular X̂_n is integrable on Λ̂_n = {n < L}. By (3.6), to complete the proof it will suffice to prove a certain identity for each fixed n ≥ 1; the strong martingale property of X then shows that the last term of that identity vanishes.
(3.9) Corollary. Let X and Λ satisfy the hypotheses of (3.8). Then X extends to a uniformly integrable martingale X̃.
Proof. The process X̂ constructed in (3.8) satisfies the conditions of (3.6) relative to Λ̂ and the filtration (F̂_n). Let X̂_∞ ∈ L^1 denote its final value, and let X̃ be the uniformly integrable martingale over (F_n) with final value X̂_∞. Once we show that X̃_n = X_n on Λ_n, X̃ will be the desired extension of X. But, on Λ_n, D_n = n, so X̂_n = X_n there; and for any S ∈ F_n with S ⊂ Λ_n, P{X̃_n; S} = P{X̂_∞; S} = P{X̂_n; S} = P{X_n; S}, since F_n ⊂ F̂_n. As S ∈ F_n is an arbitrary subset of Λ_n, and X_n is F_n-measurable on Λ_n, this proves X_n = X̃_n on Λ_n.
(3.10) Corollary. Under the hypotheses of (3.8) and (3.9), if Y is a uniformly integrable martingale extending X and if Y_∞ ∈ F_G, then Y = X̃, as constructed in (3.9).
Proof. With notation as in the proof of (3.8), Y(D_n) is a uniformly integrable martingale over (F̂_n) defined for all n (not just n < L), and as Y_∞ ∈ F_G = F̂_G, (2.4) shows that Y_∞ = X̂_∞.

Continuous parameter case
Fix an optional set Λ ⊂ R_+ × Ω and suppose X is defined on Λ and satisfies (1.1). The main result of this section is the following.
(4.1) Theorem. Suppose X is the restriction to Λ of a semimartingale Z ∈ S^1(F.) with canonical decomposition Z = M + A, and suppose A ≪ C. Then X extends to a unique uniformly integrable martingale X̃ such that X̃_∞ ∈ F_L, X̃_∞ 1_{Ω_0} is measurable with respect to the trace of F_{L−} on Ω_0, and X̃_t = 0 for all t ≥ 0 on {L = 0} ∩ Λ_0^c.

Before beginning the proof of (4.1), we make some preliminary reductions that will simplify the proof.

Reduction 1. Λ may be assumed right closed. Indeed, if Λ is not right closed and we let Λ̄ denote its closure from the right, then Λ̄ is also optional, and if we define X̄ on Λ̄ as (say) the lim sup of X values from the right, then we have Z = X̄ on Λ̄, and as the end of Λ̄ is L, the condition A ≪ C has the same force whether we deal with X̄ on Λ̄ or X on Λ. From now until the end of the proof, Λ will be assumed right closed.

Reduction 2. We may assume L(ω) = 0 if and only if Λ(ω) = ∅. Extend the original Λ to be a subset of [−1, ∞[ × Ω, adjoining {(t, ω) : −1 ≤ t < 0, ω ∈ Λ_0} to the original Λ. Let F_t := F_0 for t ∈ [−1, 0[, and define X on the extended Λ by X_t(ω) = X_0(ω) for (t, ω) ∈ Λ ∩ ⟦−1, 0⟦. The existence of a martingale extension of the original X is clearly not affected by this extension. In addition, if we extend M and A back to time −1 by setting A_t := 0 and M_t := M_0 for −1 ≤ t < 0, then the new X continues to be the restriction to the new Λ of the new Z. The new random measure C is carried by ⟦0, ∞⟦, as is the new A. Thus we affect neither the hypotheses nor the conclusions of the theorem by changing the time domain in this way. However, in the proofs it is awkward to have a time index starting at −1, so we relabel the time axis to start at 0. Shifting time by 1 affects neither the hypotheses nor the conclusions. The net effect is that Λ may be assumed to satisfy L(ω) = 0 if and only if Λ(ω) = ∅.
We follow the development of the discrete case as far as possible. Let D_t := inf{s > t : (s, ω) ∈ Λ} and g_t := sup{s < t : (s, ω) ∈ Λ}. Because Λ is right closed, D_t ∈ Λ for every t < L. The process t → D_t is right continuous, and its left continuous inverse is given by s → g_s. Each D_t is a stopping time over (F_t). Let F̂_t := F_{D_t}, and note that L is a stopping time over (F̂_t), for {L > t} = {D_t < ∞} ∈ F̂_t. Define the random set Λ̂ := ⟦0, L⟦ ∪ (⟦L⟧ ∩ Λ), so that Λ̂ is optional relative to (F̂_t). Define X̂ on Λ̂ by X̂_t := X(D_t) for t < L, and X̂_L := X_L on ⟦L⟧ ∩ Λ. The following result is evident.
(4.2) Lemma. With Λ modified in accordance with reductions 1 and 2, the range of the map t → D_t contains Λ \ Λ_i, where Λ_i := {t ∈ Λ : t > 0, t = g_t < D_t}, i.e., the points in Λ which are accumulation points of Λ from the left but not from the right.
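For a single fixed sample point, the behaviour of the debut map t → D_t can be sketched on a deterministic set of times. The set Λ below is an illustrative stand-in sampled on a grid, so the accumulation-point subtleties behind the set Λ_i of (4.2) are invisible here; the sketch only shows the elementary properties of D:

```python
import math

# debut map t -> D_t for one fixed, right closed set of times, on a grid;
# Λ = [1,2] ∪ {3} ∪ [4,5] is an illustrative stand-in for a random section
pts = [x / 100 for x in range(0, 601)]
Lam = [s for s in pts if 1 <= s <= 2 or s == 3.0 or 4 <= s <= 5]

def D(t):
    """D_t := inf{s > t : s ∈ Λ} (with inf over the empty set = +∞)."""
    later = [s for s in Lam if s > t + 1e-12]
    return later[0] if later else math.inf

assert D(0) == 1.0          # debut of the first interval
assert D(2.5) == 3.0        # jumps across the gap between 2 and 3
assert D(3.0) == 4.0        # strictly after the isolated point 3
assert D(5.5) == math.inf   # nothing after the end of Λ

# t -> D_t is nondecreasing and takes values in Λ when finite
grid = [x / 10 for x in range(0, 60)]
assert all(D(a) <= D(b) for a, b in zip(grid, grid[1:]))
assert all(D(t) in Lam for t in grid if D(t) < math.inf)
```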
In outline, the proof will involve showing first that X̂_t extends to a martingale with final value X̂_∞. We will then let X̃ be a right continuous version of the martingale X̃_t := P{X̂_∞ | F_t}, and show that the hypotheses imply that X̃ extends X. In preparation for these arguments we need some results which use ideas surrounding the change of variable formula in the form given, say, in [Sh88, p. 379].
and for every fixed r < t, we have

(4.4) Lemma. Let Ĥ be predictable over (F̂_t) with Ĥ_0 = 0. Then Ĥ(g_s) is predictable over (F_s).
Proof. It suffices to check this in case Ĥ = 1_{⟧T̂,∞⟦} with T̂ a stopping time over (F̂_t), as such processes generate the predictable processes over (F̂_t) vanishing at 0. For Ĥ of this form, Ĥ∘g_s = 1_{⟧T̂,∞⟦}(g_s) = 1_{{T̂ < g_s}}, and the latter is left continuous in s and adapted to (F_s) by the preceding lemma.

Now suppose in addition that
The hypothesis A ≪ C then implies that the right side of (4.5) vanishes, and consequently P∫_{⟦0,L⟧} Ĥ_t dÂ^p_t = P∫_{⟦0,L⟧} Ĥ_t dÂ_t = 0. This proves that Â^p ≪ Ĉ. Observe now that Ẑ_t = M̂_t + Â_t is a semimartingale over (F̂_t), and it is in S^1(F̂.) because M̂ is a uniformly integrable martingale over (F̂_t) and Â is optional over (F̂_t) and of integrable total variation. The canonical decomposition of Ẑ is then (M̂_t + Â_t − Â^p_t) + Â^p_t. Observe too that Ẑ_t = Z_∞ for t ≥ L, and that on Ω_0, D_t < L for all t < L, so that Ẑ_{L−} = lim_{t↑↑L} Ẑ_t = Z_{L−} = Z_∞ by hypothesis. Therefore Ẑ_t = X_{L−} for all t ≥ L on Ω_0. Clearly X̂_t = X(D_t) for t < L, and on Ω_0^c ∩ {0 < L < ∞}, X̂_L = Ẑ_L = Z_∞ = X_L by the hypotheses on Z. We have now shown that X̂ is the restriction to Λ̂ of Ẑ, and that the conditions of (1.2) with respect to the process X̂ on Λ̂ are satisfied. Therefore, by (1.2), X̂ extends to a uniformly integrable martingale (with respect to (F̂_t)) whose final value we shall denote by X̂_∞. In fact, by (1.2), if we let Ĥ ∈ P̂ be a version of dÂ^p/dĈ, then we may take the extension given explicitly there. Let X̃_t be a right continuous version of P{X̂_∞ | F_t}. Clearly X̃_{D_t} = X̂_t a.s. on {t < L} for every t ≥ 0.
It follows that X_s = X̃_s for all s ∈ Λ in the range of the map t → D_t, so by (4.2), X̃ = X on Λ \ Λ_i. Let Λ′ := {X̃ ≠ X} ⊂ Λ_i. Clearly Λ′ is optional, though Λ_i need not be. As Λ′ has countable sections, we may express Λ′ = ∪_n ⟦T_n⟧, where the stopping times T_n have disjoint graphs. In order to prove Λ′ is evanescent (which implies that X̃ extends X), it suffices to show T_n = ∞ a.s. for every n. Fix n and let T denote T_n, so that T is a stopping time with ⟦T⟧ ⊂ Λ_i. In particular, since L = D_T ∈ Λ is not possible on Ω_0, D_T < L on {T < L} ∩ Ω_0. Let K_t := 1_{⟧T,D_T⟧}(t) Y_t, where Y is an arbitrary bounded predictable process. In view of the hypothesis A ≪ C, we have P∫K_t dA_t = 0, and as Y ∈ bP is arbitrary, this shows that dA does not charge the interval ⟧T, D_T⟧, so A(D_T) = A_T. It follows that Z_T = P{Z(D_T) | F_T}. Putting this together with (4.6) and the fact that X̃ is a uniformly integrable martingale, we find that on Ω_0, if T < ∞ then T < L and so D_T < ∞, and consequently X̃_T = X_T a.s. on {T < ∞}. This proves T = ∞ a.s., finishing the existence part of the theorem.
For uniqueness, suppose Y is another uniformly integrable martingale extending X and satisfying (a) Y_∞ ∈ F_L; (b) Y_∞ 1_{Ω_0 ∩ {0<L<∞}} is measurable with respect to the trace of F_{L−} on Ω_0; (c) Y_∞ = 0 on {L = 0}. Subtracting Y from X̃, we see that uniqueness is equivalent to showing that X̃ = 0 if X = 0 on Λ. Let X̂_t := X̃(D_t). Then X̂_t is a uniformly integrable martingale over (F̂.) extending 0 on Λ̂, stopping at L, and satisfying X̂_∞ ∈ F_L, with X̂_∞ 1_{Ω_0 ∩ {0<L<∞}} measurable with respect to the trace of F_{L−} on Ω_0 ∩ {0 < L < ∞}.
The condition X̂_∞ ∈ F_L implies X̂_∞ ∈ F̂_L, for L is a stopping time over (F̂.) and so the test is X̂_∞ 1_{{L≤t}} ∈ F̂_t for every t ≥ 0. By definition of F̂_t, this is the same as X̂_∞ 1_{{L≤t}∩{D_t≤s}} ∈ F_s for all t, s ≥ 0. However, on {L ≤ t}, D_t = ∞, so X̂_∞ ∈ F̂_L. The σ-algebra F_{L−} is generated by events of the form W ∩ {L > t} with W ∈ F_t and t ≥ 0. As F_t ⊂ F̂_t, this proves F_{L−} ⊂ F̂_{L−}. Now apply the uniqueness result (1.4) to X̂ to see X̂_∞ = 0.
The extension of X defined by (4.1) takes a simpler form in some particular cases, which we now describe. We work under the hypotheses of (4.1), together with reductions 1 and 2 and the notation developed in its proof. The end L of Λ is a stopping time over (F̂_t). Decompose ⟦L⟧ = ⟦L_0⟧ ∪ ⟦L_1⟧, where ⟦L_0⟧ := ⟦L⟧ ∩ ⟧0, ∞⟦ ∩ Λ^c and ⟦L_1⟧ := ⟦L⟧ \ ⟦L_0⟧. It is clear that L_0 and L_1 are stopping times over (F̂_t). It is also clear by definition of Ω_0 that Ω_0 = {L_0 < ∞}. Let L_p denote the predictable part of L_0, defined in [Sh92, §2] as the largest predictable stopping time (over (F̂_t)) with graph contained in ⟦L_0⟧. Choose stopping times T_n announcing L_p, and observe that for every bounded left continuous predictable (with respect to (F̂_t)) process Ŷ, a chain of equalities holds, making use of the fact that Â^p is carried by ⟦0, L⟧ in the third equality. Now write the last term as P Ŷ_{L_p}(Ẑ_∞ − Ẑ_{L_p−}), and note that on {L_p < ∞} ⊂ Ω_0, Ẑ_∞ = Ẑ_{L−} = Ẑ_{L_p−}, so the second term vanishes. On the other hand, Ω_0^c ∩ {L_p < ∞} = ∅, so the first term also vanishes, and we are led to the identity (4.7), valid for every Ŷ ∈ bP̂. This proves Ĥ_{L_p} = 0 on {L_p < ∞}. This may be restated as follows.

(4.9) Remark. Theorem (4.1) is utterly worthless if Λ contains its end ⟦L⟧, for in this case C = 0, and the condition A ≪ C implies A = 0, so the theorem amounts to: "X has an extension to a uniformly integrable martingale if it is the restriction to Λ of a uniformly integrable martingale."

The strong martingale property, continuous parameter case
Given stopping times S ≤ T, define Λ_T and Γ_{S,T} as in section 3, and define the strong martingale property of X on Λ as in (3.4). Under the hypotheses of Theorem (4.1), X may be regarded as the restriction to Λ of a uniformly integrable martingale, and consequently X has the strong martingale property on Λ.
We investigate in this section conditions under which the strong martingale property implies the existence of a martingale extension of X on Λ. We give first an example to show that in general there can be no equivalence between the absolute continuity condition A ≪ C of (4.1) and a strong martingale property such as holds in the discrete parameter case, (3.6). Consider the following example from [Sh92]. Let B denote linear Brownian motion, let ζ be an exponential time with parameter 1 independent of B, and let Λ := ⟦0, ζ⟦. Let S ≤ T be stopping times over (F_t), the natural filtration (suitably completed) for (B_t)_{t<ζ}. We show Γ_{S,T} = {S = T ∈ Λ}. Because of this, the strong martingale property holds trivially for every X on Λ satisfying obvious integrability conditions. However, as shown in [Sh92], B_t^2 1_{{t<ζ}} has a martingale extension, while |B_t| 1_{{t<ζ}} does not. In other words, the strong martingale property need not by itself imply the existence of a martingale extension.
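A discrete analogue of the contrast between B² and |B| can be checked exactly. The finite model below is an illustrative assumption, not the example from [Sh92] itself: for the simple random walk S_n, the process S_n² − n is a martingale, while |S_n| is only a submartingale, strict at 0:

```python
import itertools

# simple random walk on ±1 steps over a 4-step horizon (illustrative assumption)
N = 4
omegas = list(itertools.product([-1, 1], repeat=N))

def S(n, w):
    return sum(w[:n])

def cond_exp(f, w, k):
    """E[f | F_k](w): average over the paths agreeing with w on the first k steps."""
    paths = [v for v in omegas if v[:k] == w[:k]]
    return sum(f(v) for v in paths) / len(paths)

for w in omegas:
    for n in range(1, N + 1):
        # E[S_n^2 - n | F_{n-1}] = S_{n-1}^2 - (n - 1) at every atom
        got = cond_exp(lambda v, n=n: S(n, v) ** 2 - n, w, n - 1)
        assert abs(got - (S(n - 1, w) ** 2 - (n - 1))) < 1e-9

# |S| fails to be a martingale exactly at 0: E[|S_1|] = 1 > 0 = |S_0|
mean_abs = sum(abs(S(1, w)) for w in omegas) / len(omegas)
assert abs(mean_abs - 1.0) < 1e-9
```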
Before beginning a discussion of sufficiency of the strong martingale property, here is a preliminary result which reduces the work needed to verify it. Recall the notation of (3.3) and (3.4).
(5.3) Theorem. Let X be defined on Λ, with ⟦0, ζ⟦ ⊂ Λ ⊂ ⟦0, ζ⟧, and suppose ⟦ζ⟧ \ Λ is contained in a predictable set K with countable sections, and having no finite limit points other than ζ. Assume also that X is the restriction to Λ of a semimartingale Z ∈ S^1(F.). Then X extends to a uniformly integrable martingale if and only if X has the strong martingale property on Λ.
Proof. Assume X has the strong martingale property on Λ. Let Ω_0 := {ω : ζ(ω) ∉ Λ(ω)}, and let C denote the predictable compensator of 1_{Ω_0} ∗ ε_ζ, as usual. With the conditions imposed on K, the predictable compensator of 1_{Ω_0} ∗ ε_ζ is carried by K. Let T_n denote the time of the n-th jump of C, so that the predictable stopping times T_n have disjoint graphs and increase with n to a limit T_∞ ≥ ζ. Denote by K_1 := ∪_n ⟦T_n⟧ ∈ P the (discrete) support of C, so K_1 ⊂ K ⊂ ⟦0, ζ⟧. Define d_t(ω) := inf{s > t : s ∈ K_1(ω)}, so that for every t ≥ 0, d_t > t. We shall prove that (5.5) implies that A is carried by K_1, which will show (by Theorem (4.1)) that X extends to a uniformly integrable martingale.

(3.2) Lemma. Let T be a stopping time, let Y^T_t be a right continuous version of the martingale P{Λ^c_T | F_t}, and let ζ^T := inf{t : Y^T_t = 0}. Then for every stopping time S ≤ T, P{Λ_T | F_S} = 1 if and only if S ≥ ζ^T, and hence Γ_{S,T} = {S = T ∈ Λ} ∪ {ζ^T ≤ S < T}.

Proof. Since Y^T is a right continuous, positive martingale, Y^T_S = 0 if and only if S ≥ ζ^T.

(3.3) Lemma. For S ≤ T stopping times, Λ_S ∩ Γ_{S,T} ⊂ Λ_T.

Proof. By definition of Γ_{S,T}, P(Λ_S ∩ Γ_{S,T} ∩ Λ^c_T) = P(P{Λ^c_T | F_S}; Λ_S ∩ Γ_{S,T}) = 0.

(3.4) Definition. X has the strong martingale property on Λ provided, for every pair S ≤ T of stopping times,
(3.5) P{X_T | F_S} = X_S on Λ_S ∩ Γ_{S,T}.
Here, for a given stopping time T, Y^T_t denotes a right continuous version of P{Λ^c_T | F_t} and ζ^T := inf{t : Y^T_t = 0}.

(5.1) Proposition. X has the strong martingale property on Λ provided (3.5) holds for all pairs of stopping times S ≤ T such that ⟦T⟧ ⊂ Λ and ⟦S⟧ ⊂ ⟦ζ^T, T⟧ ∩ Λ.

Proof. We must verify (3.5) for an arbitrary pair S ≤ T of stopping times. Define the stopping time T′ by ⟦T′⟧ = ⟦T⟧ ∩ Λ, so that T′ ≥ S and ⟦T′⟧ ⊂ Λ. Clearly Λ_T = Λ_{T′}, so Γ_{S,T} = Γ_{S,T′}. It follows that (3.5) holds for the pair S, T if and only if it holds for the pair S, T′, and it therefore suffices to verify (3.5) assuming ⟦T⟧ ⊂ Λ. Define the stopping time S₀ by ⟦S₀⟧ = ⟦S⟧ ∩ ⟦ζ^T, T⟧ ∩ Λ and let S′ := S₀ ∧ T. Then the stopping time S′ ≤ T and ⟦S′⟧ ⊂ Λ. By hypothesis, (3.5) holds for the pair S′, T. This property is equivalent to the equality, for every bounded right continuous adapted process W,
(5.2) P{X_T W_{S′}; Λ_{S′} ∩ Γ_{S′,T}} = P{X_{S′} W_{S′}; Λ_{S′} ∩ Γ_{S′,T}}.
For ω ∈ Λ_{S′} ∩ Γ_{S′,T}, either ω ∈ {S ≥ ζ^T}, in which case S(ω) = S′(ω) and ω ∈ Λ_S ∩ Γ_{S,T}; or ω ∈ {S < ζ^T}, in which case S′(ω) = T(ω) and, by (3.2), ω ∉ Λ_S ∩ Γ_{S,T}.

Considering (5.2) separately on the sets {S ≥ ζ^T} and {S < ζ^T} (both in F_S, since S ≤ S′) gives

P{X_T W_{S′}; Λ_{S′} ∩ Γ_{S′,T}, S ≥ ζ^T} = P{X_{S′} W_{S′}; Λ_{S′} ∩ Γ_{S′,T}, S ≥ ζ^T},
P{X_T W_{S′}; Λ_{S′} ∩ Γ_{S′,T}, S < ζ^T} = P{X_{S′} W_{S′}; Λ_{S′} ∩ Γ_{S′,T}, S < ζ^T}.

Making the reductions from the previous display makes the second equation a triviality, and the first becomes

P{X_T W_S; Λ_S ∩ Γ_{S,T}, S ≥ ζ^T} = P{X_S W_S; Λ_S ∩ Γ_{S,T}, S ≥ ζ^T}.

By (3.2), the restriction S ≥ ζ^T may now be omitted, giving P{X_T W_S; Λ_S ∩ Γ_{S,T}} = P{X_S W_S; Λ_S ∩ Γ_{S,T}}. This proves that the pair S, T satisfies (3.5).