THE NOISE MADE BY A POISSON SNAKE

Abstract: The purpose of this article is to study a coalescing flow of sticky Brownian motions on [0, ∞). Sticky Brownian motion arises as the weak solution of a stochastic differential equation, and the study of the flow reveals the nature of the extra randomness that must be added to the driving Brownian motion. This can be represented in terms of Poissonian marking of the trees associated with the excursions of Brownian motion. We also study the noise, in the sense of Tsirelson, generated by the flow. It is shown that this noise is not generated by any Brownian motion, even though it is predictable.


INTRODUCTION
Suppose that (Ω, (F_t)_{t≥0}, P) is a filtered probability space satisfying the usual conditions, and that (Z_t; t ≥ 0) is a continuous, adapted process taking values in [0, ∞) which satisfies the stochastic differential equation

Z_t = x + ∫_0^t 1{Z_s > 0} dW_s + θ ∫_0^t 1{Z_s = 0} ds,    (1.1)

where (W_t; t ≥ 0) is a real-valued F_t-Brownian motion and x ≥ 0 and θ ∈ (0, ∞) are constants. We say that Z is sticky Brownian motion with parameter θ started from x. Sticky Brownian motion arose in the work of Feller [6] on strong Markov processes taking values in [0, ∞) that behave like Brownian motion away from 0. The parameter θ determines the stickiness of zero: the cases (which we usually exclude) θ = 0 and θ = ∞ correspond respectively to Brownian motion absorbed or instantaneously reflected on hitting zero. For θ ∈ (0, ∞) sticky Brownian motion can be constructed quite simply as a time change of reflected Brownian motion so that the resulting process is slowed down at zero, and so spends a non-zero amount of (real) time there. Here, however, our interest is focused on its arising as a solution of the stochastic differential equation (1.1). This equation does not admit a strong solution: in order to construct Z it is necessary to add to the driving Brownian motion W some extra 'randomness'. The nature of this randomness was first investigated in [15]. More recently it has been shown (Warren, [16], and Watanabe, [17], following Tsirelson, [13]) that the filtration (F_t)_{t≥0} cannot be generated by any Brownian motion.
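The time-change construction just mentioned can be illustrated with a toy discretization (not from the paper; the function sticky_walk and the holding probability hold_p are our own illustrative choices): a random walk reflected at 0 is held there for a geometric number of extra ticks, so that the discretized process spends a positive fraction of real time at zero.

```python
import random

def sticky_walk(n_steps, hold_p, seed=0):
    """Toy discretization of sticky Brownian motion: a walk reflected
    at 0 which, on each tick spent at 0, escapes only with probability
    hold_p -- a crude stand-in for the slowing-down of the time change."""
    rng = random.Random(seed)
    z, path = 0, [0]
    while len(path) <= n_steps:
        if z == 0 and rng.random() > hold_p:
            path.append(0)                    # held at the sticky boundary
            continue
        z = abs(z + rng.choice((-1, 1)))      # reflected walk step
        path.append(z)
    return path

path = sticky_walk(10000, hold_p=0.3, seed=1)
frac_zero = path.count(0) / len(path)         # strictly positive fraction of time at 0
```

Unlike absorbed (θ = 0) or instantaneously reflected (θ = ∞) behaviour, the occupation time of zero here is neither everything nor negligible, mirroring the role of θ ∈ (0, ∞).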
The main object of study in this paper is a coalescing flow on R_+ which we denote by (Z_{s,t}; 0 ≤ s ≤ t < ∞), where each Z_{s,t} is an increasing function from R_+ to itself. We may describe the flow rather informally in terms of the motion of particles. For any t and x the trajectory h → Z_{t,t+h}(x) describes the motion of a particle which starts at time t from a position x. We shall be concerned with a flow in which this motion is determined by a sticky Brownian motion. Away from zero, particles move parallel to each other, which means that the SDEs (similar to equation (1.1)) which describe their trajectories are all driven by the same Brownian motion W. Particles collide while visiting zero and thereafter they move together, and the map Z_{s,t} is typically neither injective nor surjective.
The stochastic differential equation (1.1) was studied in [15]. There it was shown that the equation admits only weak solutions, and that for each t, assuming Z_0 = 0, the conditional distribution of Z_t given the entire path of the driving Brownian motion W is given by:

P(Z_t ≤ z | W) = exp(−2θ(W_t + L_t − z)),  0 ≤ z ≤ W_t + L_t,    (1.2)

where L_t = sup_{s≤t}(−W_s).
An interpretation of (1.2) was given in [15] which we can summarise loosely as follows. Define a process ξ via ξ_t = W_t + L_t; ξ is then a Brownian motion reflected from zero. Each excursion of ξ from zero gives rise to a rooted tree. Each time t determines a path in the tree corresponding to the excursion of ξ which straddles t. This path starts from the root of the tree and is of length ξ_t. If times s and t are both straddled by the same excursion then the paths they determine coincide from the root for a length I_{s,t} = inf_{h∈[s,t]} ξ_h, and thereafter differ. The reader not familiar with the correspondence between trees and excursions may consult Aldous [1] or [2].

Figure 1: The trajectories of three particles in a typical realization of the flow. The upper particle never hits zero and its trajectory is a translate of the driving Brownian motion. The middle particle has started from a critical height: it just hits zero and from then on follows the same trajectory as the lowest particle, which was started from zero. The dotted line shows the path that would be taken by the middle particle if the boundary were not sticky.

Next
we add marks to these trees according to a Poisson point process of intensity 2θ. We can then construct the sticky Brownian motion Z from these marked trees via, for each t ≥ 0,

Z_t = 0 if the path corresponding to t is not marked, and Z_t = ξ_t − h_t otherwise,    (1.4)

where h_t is the distance along the path corresponding to t from the root to the first mark. Notice that not all the marks carried by the trees are used in constructing the process Z in this way.
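Since the marks form a Poisson process of intensity 2θ along the path, the distance h_t from the root to the first mark is exponential with rate 2θ, and (1.4) gives P(Z_t ≤ z | W) = P(no mark within distance ξ_t − z of the root) = exp(−2θ(ξ_t − z)). A small Monte Carlo sketch (the parameter values and the helper sample_Z_given_path are arbitrary illustrative choices) checks this:

```python
import math
import random

def sample_Z_given_path(xi_t, theta, rng):
    """Sample Z_t conditionally on the driving path, following (1.4):
    the first mark sits at an Exp(2*theta) distance h_t from the root,
    and Z_t = xi_t - h_t if the path of length xi_t carries a mark,
    and Z_t = 0 if it does not."""
    h_t = rng.expovariate(2 * theta)
    return xi_t - h_t if h_t <= xi_t else 0.0

rng = random.Random(42)
theta, xi_t, z = 0.75, 1.3, 0.5
n = 200_000
empirical = sum(sample_Z_given_path(xi_t, theta, rng) <= z for _ in range(n)) / n
# probability of no mark within distance xi_t - z of the root:
exact = math.exp(-2 * theta * (xi_t - z))
```

The empirical conditional distribution function agrees with the exponential formula to within Monte Carlo error.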
In this paper the description of the previous paragraph is extended to the flow Z s,t ; s ≤ t of sticky Brownian motions. By replacing the single sticky Brownian motion by an entire flow we make natural use of all the marks on the trees. A discrete version of the correspondence between a flow and a family of marked trees is illustrated in Figures 2 and 3.
Section 2 of the paper details the construction of the flow. We begin with a process X t ; t ≥ 0 , taking values in R ∞ which solves an infinite family of stochastic differential equations. From X we are able to construct the flow in a pathwise manner.
In Section 3 we show that a generalization of (1.4) holds for the process X. There exists a family of marked trees such that the components X^{(1)}, X^{(2)}, . . . , X^{(k)}, . . . of X satisfy, for each t ≥ 0,

X^{(k)}_t = 0 if the path corresponding to t carries fewer than k marks, and X^{(k)}_t = ξ_t − h^{(k)}_t otherwise,    (1.5)

where h^{(k)}_t is the distance along the path corresponding to t from the root to the kth mark. A consequence of this description of the process X is that it may be identified with a simple path-valued Markov process similar in nature to the Brownian snake of Le Gall [8]; the difference being that the white noise along the branches of the tree is replaced with Poisson noise. For an interpretation in terms of killing in superprocesses, see Watanabe [18]. A closely related construction involving Poisson marking of trees is described by Aldous and Pitman [3].

Figure 2: The driving random walk determines a family of trees. The vertical edges of the trees are marked by independently tossing a coin for each edge. This carries the information about the stickiness of the boundary. A particle at the origin at time n will move to level 1 at time n + 1 if the driving walk makes an upward step between times n and n + 1 and the corresponding edge carries a mark.
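The discrete model described in the caption of Figure 2 can be sketched directly (a hypothetical implementation of that verbal rule; the function discrete_flow and the fair-coin marking probability are our own choices): away from 0 a particle moves parallel to the driving walk, while at 0 it moves up only along a marked edge, so that all particles at 0 coalesce.

```python
import random

def discrete_flow(steps, marks, x):
    """Discrete analogue of the flow: follow the driving walk away from
    0; at 0, move to level 1 on an upward step only if the corresponding
    edge carries a mark (all particles at 0 see the same edge, hence
    coalesce)."""
    for s, m in zip(steps, marks):
        if x > 0:
            x += s            # move parallel to the driving walk
        elif s == 1 and m:
            x = 1             # leave 0 only along a marked edge
    return x

rng = random.Random(7)
steps = [rng.choice((-1, 1)) for _ in range(200)]
marks = [rng.random() < 0.5 for _ in range(200)]    # fair-coin marking
```

Because the update at each tick depends only on the current position and the shared pair (step, mark), the maps compose as a flow and are increasing in the starting point; both properties can be checked on a sample.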
In the final section we consider the flow as a noise, in the sense of Tsirelson [11] and [12]. The flow Z has independent stationary increments, in that, for any 0 ≤ t 0 ≤ t 1 ≤ . . . ≤ t n the random maps Z t 0 ,t 1 , Z t 1 ,t 2 , . . . Z t n−1 ,tn are independent and for s ≤ t and h > 0 the maps Z s,t and Z s+h,t+h have the same distribution. We will write F s,t for the σ-algebra generated by Z u,v ; s ≤ u ≤ v ≤ t . Following Tsirelson the doubly indexed family F s,t ; s ≤ t is called a noise. It is a non-white noise because, as we shall see, there is no Brownian motion whose increments generate it.

CONSTRUCTING THE FLOW
Let R^∞ denote the space of real-valued sequences with the product topology. We will denote (for reasons shortly to become clear) the canonical process on the space C(R_+, R^∞) by (W_t, X^{(1)}_t, X^{(2)}_t, . . .). Often we use the notation X_t for this whole sequence.

Theorem 1. Let θ ∈ (0, ∞). There exists a unique probability measure P on C(R_+, R^∞) such that the canonical process (X_t; t ≥ 0) satisfies the following.
• The first component, W , is a Brownian motion starting from 0 with respect to the filtration generated by X.
• For k ≥ 1, the processes X^{(k)} are non-negative and the following SDEs are satisfied:

X^{(1)}_t = ∫_0^t 1{X^{(1)}_s > 0} dW_s + θ ∫_0^t 1{X^{(1)}_s = 0} ds,

X^{(k)}_t = ∫_0^t 1{X^{(k)}_s > 0} dW_s + θ ∫_0^t 1{X^{(k)}_s = 0, X^{(k−1)}_s > 0} ds,  k ≥ 2.
Proof. We begin by proving that P is unique. Let P (n) denote the restriction of P to the σ-algebra F (n) generated by W, X (1) , . . . , X (n) . Since the union of such σ-algebras is an algebra generating the entire Borel σ-algebra, it suffices to prove that each P (n) is unique.
Let, for any k ≥ 1,

A^{(k)}_t = ∫_0^t 1{X^{(k)}_s = 0, X^{(k−1)}_s > 0} ds,  A^{(k+)}_t = ∫_0^t 1{X^{(k)}_s > 0} ds

(the condition X^{(k−1)}_s > 0 being omitted when k = 1), and denote their right-continuous inverses by α^{(k)} and α^{(k+)} respectively. We shall see shortly that A^{(k)}_∞ = ∞ with probability one, and so we may define, for any k ≥ 1 and for 0 ≤ u < ∞,

B^{(k)}_u = ∫_0^{α^{(k)}_u} 1{X^{(k)}_s = 0, X^{(k−1)}_s > 0} dW_s,  B^{(k+)}_u = ∫_0^{α^{(k+)}_u} 1{X^{(k)}_s > 0} dW_s.
For each n, under P^{(n)}, the processes B^{(1)}, . . . , B^{(n)}, B^{(n+)} must be independent Brownian motions, as a consequence of Knight's theorem on orthogonal martingales (see [9], Chapter V, Theorem 1.9). Hence if we can recover W, X^{(1)}, . . . , X^{(n)} from them, the measure P^{(n)} will be determined uniquely. Now, we may write, Since the left-hand side is non-negative, and A^{(k)} grows only when it is zero, we recall that the Lemma of Skorokhod (see [9], Chapter VI, Lemma 2.1) tells us that, where we denote, Note that, When k = 1 we may take the left-hand side to be given by A^{(1)}. We can now check the earlier claim that A^{(k)}_∞ = ∞. Denote by σ^{(k)} the inverse of the continuous, strictly increasing function: Then, we find that, These two relations hold even for k = 1 provided that we take B^{(0+)} = W. Now for any fixed n we can use (*) to recover B^{(0+)}, . . . , B^{(n+)} from B^{(1)}, . . . , B^{(n)}, B^{(n+)}. Moreover we can then use (**) to obtain A^{(1+)}, . . . , A^{(n+)}, and then, finally, we obtain X^{(1)}, . . . , X^{(n)}, since More precisely we are able to write, for any finite set of times t_1, t_2, . . . , t_m, the m(n+1)-tuple of variables W_{t_i}, X^{(1)}_{t_i}, . . . , X^{(n)}_{t_i}, 1 ≤ i ≤ m, as a jointly measurable function of B^{(1)}, . . . , B^{(n)}, B^{(n+)}, and this determines P^{(n)} on a generating π-system. This completes the proof of uniqueness.
To prove existence we construct, for each n, the probability measure P^{(n)} on (C(R_+, R^∞), F^{(n)}) from the law of an (n+1)-tuple of processes W, X^{(1)}, . . . , X^{(n)}. This sequence of measures is consistent (by virtue of the uniqueness property just obtained), and we may take P to be its projective limit.
Suppose for some k we have established that for r = 1, 2, . . . , k−1 the process X^{(r)} satisfies the appropriate SDE and that A^{(r)}_t = ∫_0^t 1{X^{(r)}_s = 0, X^{(r−1)}_s > 0} ds. Then, using (**) and (***), the two expressions agree; call the common value A^{(k)}_t and note that this identifies the second term on the right-hand side of (***). It remains to identify the first term as a stochastic integral against W.
The first integral must be identically zero because the zero set of the integrand supports the measure d⟨M⟩_t, while the second integral is simply B^{(k+)} suitably time-changed. Applying the above arguments for k = 1, 2, . . . , n we deduce that all the necessary SDEs are satisfied.
The remaining issue is to verify that W is a Brownian motion with respect to the filtration generated by W, X^{(1)}, . . . , X^{(n)}. Suppose that B^{(k+)} is a Brownian motion with respect to some filtration G^{(k)} and that G^{(k)}_∞ is independent of the Brownian motion B^{(k)}. Let G^{(k−1)} be the filtration generated by Then it is easy to check that both of the time-changed Brownian motions B But B^{((k−1)+)}, having quadratic variation process t, is thus a G^{(k−1)}_t-Brownian motion by Lévy's characterization. If we apply this argument successively, firstly for k = n with G^{(n)} the natural filtration of B^{(n+)}, and then for k = n − 1, . . . , 2, 1, we find that W = B^{(0+)} is a G^{(0)}-Brownian motion. But the processes W, X^{(1)}, . . . , X^{(n)} are all G^{(0)}-adapted and we are finished.
In the following we will work with the probability space given by this theorem, and with respect to the filtration denoted by (F_t)_{t≥0}, the smallest right-continuous and P-complete filtration to which X is adapted. Since we want to construct the flow Z from X in a pathwise manner we need to restrict ourselves to a subset of C(R_+, R^∞) on which X is sufficiently well behaved. Our first task is to identify such a subset.
For each t let N_t denote the smallest k ≥ 0 such that X^{(k+1)}_t is zero, or infinity if no such k exists. As we shall see, the process (N_t; t ≥ 0) must be treated with respect; it is very singular. For any s ≤ t let N_{s,t} = inf{N_h : s ≤ h ≤ t}.
Lemma 2. The following hold with probability one.
• N_t < ∞ except for a set of t having Lebesgue measure zero, and N_{s,t} < ∞ for all s < t.
Proof. We continue to use notation as introduced in the proof of the preceding theorem. We will show that, for all u and k, with equality only if both sides are 0. Then, by taking u = A^{(k)}_t, for all t, and with equality only if both sides are 0.
Recall that In view of the definition of σ t , and hence the second bracketed term above, is constant on each component of the set and σ (k) the probability of this occurring for t belonging to S is zero.
Let us prove the second assertion of the lemma: that N_{s,t} is finite for all s < t with probability one. On applying Itô's formula we find that, for any λ > 0 and any k ≥ 1, is a bounded martingale. Taking expectations, and letting t → ∞, we calculate that: we deduce that, with probability one, for all t, But the measure of the set {t : for 0 ≤ k < ∞, and so the set {t : N_t = ∞} has measure zero, and a complement dense in R_+.
It is not true that with probability one N_t is finite for all t. For each k, the set {t : N_t > k} is open and dense, and thus by virtue of Baire's Category Theorem the set {t : N_t = ∞} = ∩_k {t : N_t > k} is dense.
We now describe a result that will eventually lead to the independent increments property of the flow. If W(1) and W(2) are independent Brownian motions starting from zero and t_0 is some fixed time, then the process W defined by

W_t = W_t(1) for t ≤ t_0,  W_t = W_{t_0}(1) + W_{t−t_0}(2) for t ≥ t_0,

and obtained by splicing (W_t(2); t ≥ 0) onto the end of the path (W_t(1); 0 ≤ t ≤ t_0), is itself a Brownian motion. A more complicated procedure is given in the following lemma which generalizes this construction.
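The basic splicing operation can be written out explicitly (a small helper sketch; the names walk and splice and the Gaussian steps are illustrative only): the spliced path agrees with the first path up to t_0 and thereafter moves with the increments of the second, from which the second path can be recovered.

```python
import random

def walk(n, rng):
    """A discrete Gaussian random walk, standing in for Brownian motion."""
    w = [0.0]
    for _ in range(n):
        w.append(w[-1] + rng.gauss(0.0, 1.0))
    return w

def splice(w1, w2, t0):
    """Follow w1 up to time t0, then continue with the increments of w2."""
    head = w1[: t0 + 1]
    return head + [head[-1] + v for v in w2[1:]]

rng = random.Random(3)
w1, w2, t0 = walk(50, rng), walk(50, rng), 30
w = splice(w1, w2, t0)
```

By construction w restricted to times up to t_0 is w1, while the shifted increments w[t_0 + i] − w[t_0] recover w2, mirroring the recovery described after Lemma 3.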
Lemma 3. Suppose we are given two independent processes X(1) and X(2) each distributed according to the law given by Theorem 1. Fix some t 0 , and construct a process X as follows.
For each x ≥ 0 let Continue by letting Then the process X is also distributed according to the law determined by Theorem 1.
Proof. As remarked previously W is a Brownian motion, and indeed it is a Brownian motion with respect to the filtration generated by X since any increment W u − W t is independent of X s ; 0 ≤ s ≤ t . Note also that the times T k are stopping times with respect to this filtration.
Observe that t → X^{(k)}_t is continuous; in fact at the times T_k, T_{k−1}, . . . , T_1, at which continuity is in doubt, it is zero. This is because, for any r, we have X (2)>0}, and likewise, Now making use of the SDE satisfied by X^{(r)}(2) we obtain, s >0} ds.
Putting together the intervals (T_{m+1}, T_m) for m = k − 1, . . . , 0 we find that X^{(k)} satisfies the appropriate SDE and the result follows from the uniqueness assertion of Theorem 1.
Notice how to recover (X_t(1); 0 ≤ t ≤ t_0) and (X_t(2); t ≥ 0) from X. The former is simply the restriction of X to times t ≤ t_0, while W(2) is given by W_t(2) = W_{t+t_0} − W_{t_0}; the recovery of X^{(r)}(2) for r ≥ 1 is more involved. Since T_m = inf{t ≥ t_0 : X Consequently we have The special case r = 1 deserves attention: which as t ≥ t_0 varies is a sticky Brownian motion starting from zero at time t = t_0. This is the key to the construction of the flow from X.
Theorem 4. Define, for 0 ≤ s < t < ∞ and x ≥ 0,

Z_{s,t}(x) = x + W_t − W_s if x > U_{s,t},  Z_{s,t}(x) = X^{(N_{s,t}+1)}_t if x ≤ U_{s,t},

where sup_{s≤h≤t}(W_s − W_h) is denoted by U_{s,t}. We complete the definition of Z by taking Z_{t,t} to be the identity for all t.
Then with probability one Z possesses the properties of a flow: for all 0 ≤ s ≤ t ≤ u < ∞, Z_{t,u} ∘ Z_{s,t} = Z_{s,u}. For all x ∈ R_+ and any s ≥ 0, the trajectory t → Z_{s,t}(x) is, for t ≥ s, a sticky Brownian motion started at time s from x. Moreover this trajectory satisfies the stochastic differential equation

Z_{s,t}(x) = x + ∫_s^t 1{Z_{s,u}(x) > 0} dW_u + θ ∫_s^t 1{Z_{s,u}(x) = 0} du.

The doubly indexed process (Z_{s,t}; 0 ≤ s ≤ t < ∞) has independent and stationary increments: for any 0 ≤ t_0 ≤ t_1 ≤ . . . ≤ t_n the random maps Z_{t_0,t_1}, Z_{t_1,t_2}, . . . , Z_{t_{n−1},t_n} are independent, and for s ≤ t and h > 0 the maps Z_{s,t} and Z_{s+h,t+h} have the same distribution.
Proof. Fix s < t < u. We evaluate the composition Z_{t,u} ∘ Z_{s,t}(x) in four distinct and exhaustive cases. First note the following estimate, which we will verify below. Note also the following property of N which is an easy consequence of its definition:

CASE 1. Suppose that x > U_{s,t} and W_t − W_s + x > U_{t,u}. These conditions may be interpreted as saying that a particle started from x at time s does not reach zero by time u. Then,

CASE 2. Suppose that x ≤ U_{s,t} and X^{(N_{s,t}+1)}_t > U_{t,u}. This time the particle starting from x at time s hits zero before time t but then does not visit zero during the interval [t, u]. We have T > u, and so for 1 ≤ k ≤ N_{s,t} + 1 and h ∈ [t, u] we have X^{(k)}_h > 0, so (recall always Lemma 2) N_{t,u} ≥ N_{s,t} + 1, whence by virtue of (**) N_{s,u} = N_{s,t} and,

CASE 3. Suppose that x ≤ U_{s,t} and X^{(N_{s,t}+1)}_t ≤ U_{t,u}. The particle starting from x at time s visits 0 during both intervals [s, t] and [t, u]. Then t ≤ T ≤ u, and so N_{t,u} ≤ N_{s,t}, whence, using (**) again, N_{t,u} = N_{s,u}. Thus,

CASE 4. Suppose that x > U_{s,t} and W_t − W_s + x ≤ U_{t,u}, so that the particle starting from x at time s visits 0 during the interval [t, u] but not during [s, t]. Combining these inequalities with that denoted by (*) above, X^{(N_{s,t}+1)}_t ≤ U_{t,u}, so t ≤ T ≤ u, and we may argue as in the preceding case that N_{t,u} = N_{s,u}. Thus

Finally note that x > U_{s,u} if and only if CASE 1 holds, and so we have demonstrated the required composition property.
Construct X as in the preceding lemma from two independent copies X(1) and X(2), taking the splicing time t_0 to be equal to s; as we commented following Lemma 3, and it is also the case that By virtue of Lemma 2 we can now deduce equation (*) above since We can also prove that Z has stationary, independent increments from this splicing. The map Z_{s,t} can be written in terms of X(2) as Thus Z_{s,t} is determined from X(2) in the same way that Z_{0,t−s} is determined from X and so has the same law as Z_{0,t−s}. The construction also shows that Z_{s,t} is independent of (Z_{u,v}; 0 ≤ u ≤ v ≤ s), this being determined by X(1).

Finally note that t → X^{(1)}_{t−s}(2) is, for t ≥ s, a sticky Brownian motion starting from zero, and this is the same as t → Z_{s,t}(0). More generally is, for t ≥ s, a sticky Brownian motion starting at time s from x, and this trajectory satisfies the SDE as claimed.
Lemma 5. We may recover the process X from the flow: for each t,

Proof. To see this note first that the inclusion follows from the very definition of Z_{s,t}(0) as X^{(k)}_t for some k. This inclusion is in fact an equality as the following argument shows. For each k ≥ 1 let t , t]. Now, using the fact that the X^{(k)}_t take distinct values for different k unless equal to zero, we see that: Finally note that if, for some k, we have X^{(k)}_t = 0 then Z_{s,t}(0) = 0 also for s sufficiently close to t. Thus regardless of the value of X

The maps Z_{s,t} were expected to have a simple form on the basis of the verbal description given in the introduction. For an initial position x sufficiently large the motion is simply a 'translate' of W, whereas for all values x less than or equal to some critical level, the corresponding particles have reached zero between times s and t and coalesced. We can identify the map Z_{s,t} with a point (X_{s,t}, U_{s,t}, V_{s,t}) ∈ R^3_+, where X_{s,t} = Z_{s,t}(0) and V_{s,t} = U_{s,t} + W_t − W_s. Notice that equation (*) from the proof of Theorem 4 can be written X_{s,t} ≤ V_{s,t}, and in fact this holds simultaneously for all s ≤ t with probability one. When we compose such maps we obtain via this identification a semigroup on {(x, u, v) ∈ R^3_+ : x ≤ v}; the map z associated with (x, u, v) being

z(w) = x for w ≤ u,  z(w) = w − u + v for w > u.

Thus when we compose the maps associated with z_1 = (x_1, u_1, v_1) and z_2 = (x_2, u_2, v_2) we obtain the map associated with

(z_2(x_1), u_1 + max(u_2 − v_1, 0), u_1 + max(u_2 − v_1, 0) + (v_1 − u_1) + (v_2 − u_2)).

Using this identification of each map Z_{s,t} with a point in R^3_+ we can investigate the continuity of (s, t) → Z_{s,t}. Discontinuities are caused by jumps in X_{s,t}, which in turn are caused by jumps in N_{s,t}. For a pair of times s_0 < t_0 let k = N_{s_0,t_0} + 1. If there exists h ∈ (s_0, t_0) with X^{(k)}_h = 0 then N_{s,t} = N_{s_0,t_0} for all pairs s, t sufficiently close to s_0, t_0. The alternative is that the only h ∈ [s_0, t_0] such that X^{(k)}_h = 0 are h = s_0 or h = t_0 or both. In these cases (s, t) → N_{s,t} has a simple discontinuity at (s_0, t_0).
However not all such discontinuities in N cause discontinuities in (s, t) → X_{s,t} = Z_{s,t}(0). Another place to worry about the continuity of (s, t) → X_{s,t} is on the diagonal s = t. Here the behaviour of (s, t) → N_{s,t} is particularly bad. But the inequality X_{s,t} ≤ V_{s,t} assuages our concerns.
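The identification of each map Z_{s,t} with a triple (x, u, v), x ≤ v, can be made concrete. On our reading of the construction the map associated with (x, u, v) sends w to x for w ≤ u and to w − u + v for w > u, and the induced composition rule on triples (a reconstruction, not taken verbatim from the text) can be checked numerically against direct composition of the maps:

```python
def as_map(x, u, v):
    """The increasing map associated with (x, u, v): everything at or
    below level u has coalesced at the single value x, while higher
    particles are translated by v - u."""
    return lambda w: x if w <= u else w - u + v

def compose(z1, z2):
    """Triple associated with 'z2 after z1' (verified below against
    direct composition of the two maps)."""
    x1, u1, v1 = z1
    x2, u2, v2 = z2
    u = u1 + max(u2 - v1, 0.0)              # lowest level escaping both maps
    v = u + (v1 - u1) + (v2 - u2)           # translations add up
    return (as_map(*z2)(x1), u, v)

z1, z2 = (0.4, 1.0, 1.5), (0.1, 2.0, 2.5)
z12 = compose(z1, z2)
```

The semigroup on {(x, u, v) : x ≤ v} is then just this composition of triples.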

LOOKING AT THE FLOW BACKWARDS
In this section we study the flow (Z_{s,t}; s ≤ t) by fixing t and letting s decrease. This reveals certain random variables which have a Poisson distribution, and leads to a verification of the description of the R^∞-valued process X in terms of marked trees which was given in the introduction. Recently Watanabe [19] has also made a similar study of the flow Z, in which the dual flow is described in terms of elastic Brownian motions. Let us define, for s < t, a random set Ξ_{s,t} via

Ξ_{s,t} = {Z_{u,t}(0) : s ≤ u < t}.

Recall from Theorem 4 or Lemma 5 that the flow Z is constructed from the R^∞-valued process X in such a way that Z_{s,t}(0) is equal to X^{(N_{s,t}+1)}_t, and in view of Lemma 2 we deduce that, except for an exceptional set of t having Lebesgue measure zero, Ξ_{s,t} contains only a finite number of points. From this, and the stationarity of Z, it follows that for any fixed t, the probability that Ξ_{s,t} is finite for all s is one.
The principal result of this section is the description of the law of Ξ s,t conditional on the driving Brownian motion W t ; t ≥ 0 .
Theorem 6. For fixed s < t, the law of Ξ_{s,t} conditional on (W_t; t ≥ 0) is given by the following.
• 0 ∈ Ξ_{s,t} with probability one;
• Ξ_{s,t} \ {0} is distributed as a Poisson point process of intensity 2θ on (0, V_{s,t}].

In proving this theorem it is helpful to extend the flow to negative time. Thus suppose (Z_{s,t}; −∞ < s ≤ t < ∞) is a flow defined for all time with stationary independent increments, and whose restriction to positive time is the flow constructed in Theorem 4. Suppose also that Z possesses the regularity properties (2.5). As in the introduction let F_{s,t} be the σ-algebra generated by (Z_{u,v}; s ≤ u ≤ v ≤ t), which we complete to contain all events with zero probability.
Lemma 7. Suppose that T is a finite stopping time with respect to the filtration F −t,0 t≥0 then and is moreover independent of F −T,0 .
Proof. We mimic the usual proof of the strong Markov property. Suppose the stopping time T only takes values in a discrete set of times; then, by virtue of the stationarity of the flow and the fact that F_{−∞,−t_i} and F_{−t_i,0} are independent, the result holds for T. An arbitrary stopping time is the limit of a decreasing sequence of such elementary stopping times. Using this fact we deduce the result for an arbitrary finite T, noting that as h ↓ 0, for any s ≤ t ≤ 0,

For a flow which is defined for all negative time we extend the driving Brownian motion W to negative time by the device of considering the doubly indexed process (W_{s,t}; −∞ < s ≤ t < ∞) defined by For non-negative s and t this is just the increment W_t − W_s of the Brownian motion W. The definitions of U_{s,t} and V_{s,t} can be extended to negative times via: We may also define a random set Ξ_{−∞,0} to be {Z_{s,0}(0) : −∞ < s < 0}.

Lemma 8. The set Ξ_{−∞,0} \ {0} is distributed as a Poisson point process of intensity 2θ on (0, ∞), and is independent of (W_{s,t}; −∞ < s ≤ t ≤ 0).
Proof. For x ≥ 0 let −T_x = sup{s ≤ 0 : V_{s,0} > x}. Then x → T_x is right-continuous, and each T_x is an (F_{−t,0})_{t≥0}-stopping time by virtue of F_{−t,0} being complete.
We define a right-continuous counting process (N(x); x ≥ 0) via N(x) = n(Ξ_{−∞,0} ∩ (0, x]), where nS counts the number of points belonging to the set S.

Now fix some x and notice that
On the other hand if s < −T_x then Z_{s,0}(0) = Z_{−T_x,0} ∘ Z_{s,−T_x}(0), which is either equal to Z_{−T_x,0}(0) or strictly greater than x. From this we deduce two facts. Firstly, (N(h); h ≤ x) is determined by the set {Z_{s,0}(0) : −T_x ≤ s < 0} and hence is measurable with respect to F_{−T_x,0}. Secondly, for y ≥ 0, On applying the previous lemma we deduce that N is a process with independent identically distributed increments, and since it is constant except for positive jumps of size one, it must be a Poisson process with some rate that remains to be determined.
Before discussing the rate of N we first prove that it is independent of (W_{s,t}; −∞ < s ≤ t < ∞). In fact, because of the independence of the flow before and after time 0, it is enough to show that N is independent of (W_{s,t}; −∞ < s ≤ t ≤ 0). It follows from Lemma 2 and stationarity that for each fixed s < 0, with probability one, the strict inequality Z_{s,0}(0) < V_{s,0} holds. Nevertheless there exist random s at which equality holds: the times −T_x for x ∈ Ξ_{−∞,0}, in fact. By Fubini's Theorem, with probability one, the Lebesgue measure of the set of these exceptional s for which equality holds is zero. Now consider an interval of the form This latter value cannot be equal to x, for if it were then Z_{s,0}(0) = x = V_{s,0} for all s in the interval, and the interval would then have to have length zero. Notice that if s < −T_x or −T_{x−} < s then Z_{s,0}(0) cannot be equal to x either, and so we have shown that any x such that T_{x−} < T_x cannot belong to Ξ_{−∞,0} and is not the time of a jump of the process (N(x); x ≥ 0). But the x such that T_{x−} < T_x are exactly the local times at which the reflecting Brownian motion (W_{−t,0} + V_{−t,0}; t ≥ 0) makes excursions from zero. It follows from a well-known property of Poisson processes (see [9], Chapter XII, Proposition 1.7) that N is independent of the Poisson point process of excursions from zero made by this reflecting Brownian motion, and hence of the reflecting Brownian motion itself. Finally we note that we can recover (W_{s,t}; −∞ < s ≤ t ≤ 0) from this reflecting Brownian motion and the proof of independence is complete.
To determine the rate of N we may calculate, using Lemma 2, On the other hand, where we have used the resolvent of Brownian motion and the fact that V_{0,t} has the same law as |B_t|. This shows that β = 2θ.
Proof of Theorem 6. We have 0 ∈ Ξ_{s,t} unless N_t = ∞, which according to Lemma 2 occurs only for a set of t having Lebesgue measure zero. Thus by stationarity, for any fixed s and t, the probability that 0 ∈ Ξ_{s,t} is one. By Fubini's Theorem it follows that 0 ∈ Ξ_{s,t} with probability one having conditioned on W (except for a null set of possible values of W). We consider the extended flow defined for negative time. By stationarity and Lemma 8, the set {Z_{u,t}(0) : −∞ < u < t} \ {0} is distributed as a Poisson point process with rate 2θ on (0, ∞) and is independent of the white noise (W_{s,t}; s ≤ t), and hence of the Brownian motion (W_t; t ≥ 0) = (W_{0,t}; t ≥ 0). Finally, Ξ_{s,t} \ {0} is the intersection of this set with (0, V_{s,t}], and V_{s,t} is measurable with respect to W.
Corollary 9. The law of Z_{s,t} depends only on h = t − s and is given by

(X_{s,t}, U_{s,t}, V_{s,t}) = (law) = (max(R_h(1 − U) − T, 0), R_h U, R_h(1 − U)),

where U is a variable uniformly distributed on [0, 1], T is an exponential random variable with mean 1/(2θ), R_h is distributed as √h R_1 for R_1 the modulus of a standard Gaussian variable in R^3, and the three variables U, T and R_h are independent.
Proof. The marginal law of (U_{s,t}, V_{s,t}), which depends just on the Brownian motion W, is very well known; see for example [20]. The exponentially distributed contribution describing the conditional law of Z_{s,t}(0) is a consequence of Theorem 6 above, since Z_{s,t}(0) > x if and only if Ξ_{s,t} contains a point in (x, V_{s,t}].
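The exponential contribution can be illustrated numerically (a Monte Carlo sketch with arbitrary parameter values and our own helper sample_Xi): sampling Ξ_{s,t} \ {0} as a Poisson process of rate 2θ on (0, V] as in Theorem 6, the probability that some point exceeds x reproduces 1 − exp(−2θ(V − x)).

```python
import math
import random

def sample_Xi(theta, V, rng):
    """Points of Xi_{s,t} minus {0}: a Poisson point process of rate
    2*theta on (0, V], generated via successive exponential gaps."""
    pts, x = [], rng.expovariate(2 * theta)
    while x <= V:
        pts.append(x)
        x += rng.expovariate(2 * theta)
    return pts

rng = random.Random(11)
theta, V, x0 = 0.6, 1.2, 0.4
n = 100_000
# Z_{s,t}(0) > x0 iff some point of Xi lies in (x0, V]
hit = sum(any(p > x0 for p in sample_Xi(theta, V, rng)) for _ in range(n)) / n
exact = 1 - math.exp(-2 * theta * (V - x0))
```

Here V plays the role of V_{s,t}, which in the corollary is itself randomised as R_h(1 − U).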
We turn now to an examination of the joint law of Ξ 0,t 1 and Ξ 0,t 2 when t 1 < t 2 . It is at this point that the tree structure discussed in the introduction becomes evident.
Proof. First observe that Ξ_{0,t_1} \ {0} and Ξ_{t_1,t_2} \ {0} are, by virtue of Theorem 6, conditionally on W, distributed as Poisson point processes of rate 2θ on (0, V_{0,t_1}] and (0, V_{t_1,t_2}] respectively. Moreover they are conditionally independent, since they are measurable with respect to the independent σ-algebras σ(Z_{u,v}; 0 ≤ u ≤ v ≤ t_1) and σ(Z_{u,v}; t_1 ≤ u ≤ v ≤ t_2) respectively. Consider s < t_1 and suppose first that Z_{s,t_1}(0) > U_{t_1,t_2}; then On the other hand if Z_{s,t_1}(0) ≤ U_{t_1,t_2} then In view of our opening observation that Ξ_{0,t_1} and Ξ_{t_1,t_2} are independent, statement 1 of the proposition is proved.
We will now re-interpret Proposition 10 in terms of the marked trees as described in the introduction. Recall that the excursions of the reflecting Brownian motion ξ from zero determine a family of rooted trees, to which we add marks according to a Poisson point process of intensity 2θ. The random set Ξ_{0,t} \ {0} determines the position of the marks on the path corresponding to time t. Recall that this path has length ξ_t = W_t + sup_{s≤t}(−W_s). If x ∈ Ξ_{0,t} \ {0} then there is a mark on the path at a distance ξ_t − x from the root. Part 1 of the proposition covers the case when the two times t_1 and t_2 are straddled by different excursions of ξ, and the paths corresponding to the times are disjoint. Part 2 of the proposition covers the case when the two times t_1 and t_2 are straddled by a single excursion of ξ. In this case the two corresponding paths coincide from the root for a distance I_{t_1,t_2} = inf_{h∈[t_1,t_2]} ξ_h, and part 2(a) of the proposition guarantees that Ξ_{0,t_1} and Ξ_{0,t_2} determine marks in the same places along this common part of the paths. Then after a distance I_{t_1,t_2} from the root the paths separate, and the location of marks on the two paths is independent, as covered by part 2(b) of the proposition. Note that equation (1.5), in which the positions of the marks were determined from the R^∞-valued process X, is equivalent to the rule we have just given for determining the positions from the sets Ξ_{0,t}, by virtue of Lemma 5, with X^{(k)}_t determining the position of the kth mark along the path counting from the origin.
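The structure described by parts 2(a) and 2(b) can be mimicked in a few lines (an illustrative sketch; the helper names and parameter values are ours): marks on the shared segment of the two paths are generated once and reused for both, while marks beyond the branch point are drawn independently.

```python
import random

def poisson_points(rate, lo, hi, rng):
    """A Poisson point process of the given rate on (lo, hi]."""
    pts, x = [], lo + rng.expovariate(rate)
    while x <= hi:
        pts.append(x)
        x += rng.expovariate(rate)
    return pts

def marks_on_two_paths(theta, len1, len2, common, rng):
    """Marks on two tree paths that coincide from the root for a
    distance `common`: shared marks below `common` (part 2(a)),
    independent marks beyond it (part 2(b))."""
    shared = poisson_points(2 * theta, 0.0, common, rng)
    m1 = shared + poisson_points(2 * theta, common, len1, rng)
    m2 = shared + poisson_points(2 * theta, common, len2, rng)
    return m1, m2

rng = random.Random(5)
m1, m2 = marks_on_two_paths(0.8, 2.0, 1.5, 0.9, rng)
```

By construction the two mark sets agree below the branch distance and are independent beyond it, which is exactly the joint law the proposition attributes to Ξ_{0,t_1} and Ξ_{0,t_2}.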
We can re-cast these results in terms of a snake process of the type considered by Le Gall [8], with the motion process being a Poisson counting process. As a consequence the SDEs displayed in the statement of Theorem 1 specify the generator of this snake process. Compare with the work of Dhersin and Serlet [4].
where nS counts the number of elements of a set S. The following proposition is a consequence of Proposition 10.

THE FLOW AS A NOISE
A probability space (Ω, F, P) together with a doubly indexed family of sub-σ-algebras (F_{s,t}; −∞ ≤ s ≤ t ≤ ∞) and a group of measure-preserving shifts (θ_t; t ∈ R) satisfying:
• For any h and any s < t, we have θ_h^{−1} F_{s,t} = F_{s+h,t+h}.
• For any s < t and u < v with the intervals (s, t) and (u, v) disjoint, the two σ-algebras F s,t and F u,v are independent.
• For any s < t < u the σ-algebra F s,u is generated by F s,t and F t,u .
is called a noise by Tsirelson, [11] and [10]. The concept has been around in various forms for some time; for example see Feldman [5]. The most familiar example is that of white noise, obtained by considering the increments of a Brownian motion. Any stochastic flow with independent stationary increments gives rise to such an object, but often the corresponding noise is going to be the white noise associated with some driving Brownian motion.
We now consider the flow of sticky Brownian motions we have constructed as a noise.
Recall that for each s ≤ t the σ-algebra F_{s,t} is generated by the random maps Z_{u,v} for s ≤ u ≤ v ≤ t. We refer to a process B as an F_{s,t}-Brownian motion if it is a Brownian motion in law, and if for all 0 ≤ s ≤ t the increment B_t − B_s is measurable with respect to F_{s,t}. Of course the driving Brownian motion W is an F_{s,t}-Brownian motion. Now we define the linear component of the noise to be the family of P-complete σ-algebras, denoted by (F^lin_{s,t}; s ≤ t), generated by the increments of all F_{s,t}-Brownian motions; thus F^lin_{s,t} = σ(B_v − B_u; s ≤ u ≤ v ≤ t, B an F_{s,t}-Brownian motion). In particular F^lin_{s,t} contains the σ-algebra F^W_{s,t} generated by increments of the driving Brownian motion W between times s and t.

Proposition 12. For all s ≤ t, F^lin_{s,t} = F^W_{s,t}.

Proof. Suppose B is an F_{s,t}-Brownian motion. Fix s < t; then F^W_{s,t}, being contained in F_{s,t}, is independent of F_{0,s}, which contains σ(B_s) and F^W_{0,s}. Thus Similarly It follows that Using this we deduce that B_t − E[B_t | F^W_{0,t}] is an (F_{0,t})_{t≥0}-martingale which is orthogonal to the Brownian motion W, and, by virtue of the following proposition, identically zero. But this means that the increment B_t − B_s is F^W_{s,t}-measurable, and we are done.
The following proposition is a consequence of the uniqueness result of Theorem 1 and the general theory of martingale representations, presented, for example, in Chapter V of [9].
Proposition 13. W has the F t -predictable representation property.
Thus we have obtained a noise with a nonlinear component: for we know from Theorem 6 that F^W_{s,t} is strictly contained in F_{s,t}, and thus, in light of the above, F_{s,t} ≠ F^lin_{s,t}. This example of a noise also provides an affirmative answer to a question asked by Tsirelson in [10]: can a predictable noise generate a non-cosy filtration? This is because we know that the filtration (F_{0,t}; t ≥ 0) is non-cosy from [16].
Finally we note that this example of a noise is also time-asymmetric: in view of the martingale representation property just given, all martingales in the filtration (F_{0,t})_{t≥0} are continuous, whereas in the reverse filtration (F_{−t,0})_{t≥0} there are many discontinuous martingales.