THE MEAN OF A MAXIMUM LIKELIHOOD ESTIMATOR ASSOCIATED WITH THE BROWNIAN BRIDGE

A closed formula for the mean of a maximum likelihood estimator associated with the Brownian bridge is obtained, and its exact relation with the corresponding quantity for Brownian motion is established.


Introduction
Let $X_t$, $t \in T$, be a Gaussian process, and let $\mu$ be a probability measure on $T$. To avoid measurability questions, we take $T$ to be finite. Denote $X_\mu = \int_T X_t \,\mu(dt)$. Consider the following maximum likelihood estimator

$$\sup_{\mu}\Big(\lambda X_\mu - \frac{\lambda^2}{2}\,\mathbb{E}|X_\mu|^2\Big), \qquad (1)$$

where the supremum is taken over all probability measures on $T$. This maximum likelihood estimator (introduced in a slightly different way) has been studied extensively (cf. [3]-[5]). By taking the $\delta$ measure at $t \in T$, it is easy to see that (1) dominates

$$\lambda X_t - \frac{\lambda^2}{2}\,\mathbb{E}|X_t|^2. \qquad (2)$$

Thus, there is a close connection between (1) and the supremum of the Gaussian process. However, because of the quadratic term $\frac{\lambda^2}{2}\mathbb{E}|X_\mu|^2$ in (1), there seems to be no direct derivation of one from the other. In particular, what is the mean

$$W_X(\lambda) = \mathbb{E}\exp\Big(\sup_{\mu}\big(\lambda X_\mu - \frac{\lambda^2}{2}\,\mathbb{E}|X_\mu|^2\big)\Big) \qquad (3)$$

for a given Gaussian process? (The use of the letter "W" will become clear later, once the connection with the Wills functional is established.) By omitting the quadratic term, it is clear that (3) is bounded above by the moment generating function of $\sup_{t\in T} X_t$. Note that the exact expression for this moment generating function is hard to obtain (in fact, it is known only for six special Gaussian processes). Thus, (3) is an alternative quantity to study. We believe that (3) contains rich information about a Gaussian process. For example, by studying its upper bound, [6] obtained an exponential inequality that is stronger than the well-known deviation inequality for the supremum of a Gaussian process.

Motivated by this, we consider (3) for the Brownian bridge on $[0,1]$. We choose this process because of its nice geometric structure and, more importantly, because the moment generating function of its supremum is known; this allows one to study the connections between the two. It turns out that for the Brownian bridge, (3) has a very nice closed form, in which the supremum is over all probability measures on $[0,1]$ and $\omega_k = \pi^{k/2}/\Gamma(k/2 + 1)$ is the volume of the $k$-dimensional unit ball.
It is interesting that there is an extra factor 2 in M B .
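To make the quantities (1) and (3) concrete, they can be approximated numerically for a finite index set. The following is a Monte Carlo sketch, assuming (1) has the form $\sup_\mu(\lambda X_\mu - \frac{\lambda^2}{2}\mathbb{E}|X_\mu|^2)$ and (3) is $\mathbb{E}\exp$ of (1); the two-point covariance matrix and the value of $\lambda$ are arbitrary illustrative choices, not part of the original argument:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-point Gaussian process (X_1, X_2) with an assumed covariance matrix.
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
L = np.linalg.cholesky(Sigma)
lam = 0.5  # the lambda appearing in (1) and (3); arbitrary choice

# Probability measures on a two-point set are mu = (p, 1 - p).
p = np.linspace(0.0, 1.0, 101)
M = np.stack([p, 1.0 - p])                     # shape (2, 101); columns are measures
quad = np.einsum('ij,ik,kj->j', M, Sigma, M)   # E|X_mu|^2 for each measure

n_samples = 50_000
X = rng.standard_normal((n_samples, 2)) @ L.T  # samples of the process

# (1): sup over mu of lam*X_mu - (lam^2/2) E|X_mu|^2, evaluated per sample path.
vals = lam * (X @ M) - 0.5 * lam**2 * quad
mle = vals.max(axis=1)

# Pathwise check: (1) dominates the delta-measure quantity (2) at every t.
delta_vals = lam * X - 0.5 * lam**2 * np.diag(Sigma)
assert np.all(mle >= delta_vals.max(axis=1) - 1e-12)

# (3): W_X(lam) as the Monte Carlo mean of exp of (1).
W = np.exp(mle).mean()
print(f"W_X({lam}) is approximately {W:.4f}")
```

For a two-point index set the measures $\mu = (p, 1-p)$ sweep the whole simplex, so a grid over $p$ suffices; for larger $T$ one would replace the grid by a concave-maximization step, since the objective in (1) is concave in $\mu$.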

Connecting to geometry
By approximation and continuity, we may assume $T$ consists of $n+1$ elements. Equipped with the distance $\mathrm{dist}(X_t, X_s) = (\mathbb{E}|X_t - X_s|^2)^{1/2}$, the set $\{X_t : t \in T\}$ can be viewed as a subset of $\mathbb{R}^n$. Let $K$ be the convex hull of this set. Then for any probability measure $\mu$ on $T$, $X_\mu$ is a point in $K$. It is then not hard to derive that

$$W_X(\lambda) = \mathbb{E}\exp\Big(\sup_{x\in K}\big(\lambda\langle x, g\rangle - \frac{\lambda^2}{2}\,|x|^2\big)\Big),$$

where $g$ is a standard Gaussian vector in $\mathbb{R}^n$ (see [6]). The right-hand side is usually called the Wills functional of $K$, and can be expressed as

$$W_X(\lambda) = \sum_{k=0}^{n} \frac{\lambda^k}{(2\pi)^{k/2}}\, V_k(K),$$

where $V_k(K)$ is the $k$-th intrinsic volume of $K$ (see [1] or [7]). It is known that $V_k(K)$ can be evaluated by

$$V_k(K) = \sum_{J} |F_J|\, \gamma(N(F_J, K)),$$

where $|F_J|$ is the Lebesgue measure of the $k$-dimensional face $F_J$ of $K$, $\gamma(N(F_J, K))$ is the Gaussian measure of the solid angle of the normal cone $N(F_J, K)$ at $F_J$, and the sum is over all $k$-dimensional faces of $K$.
Thus the evaluation of W X (λ) becomes a problem of computing solid angles.
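The expansion of the Wills functional into intrinsic volumes can be sanity-checked numerically in dimension one, assuming the normalization $\mathbb{E}\exp(\sup_{x\in K}(\langle x,g\rangle - |x|^2/2)) = \sum_k (2\pi)^{-k/2} V_k(K)$: for a segment $K = [0, a]$ one has $V_0 = 1$ and $V_1 = a$, so the predicted value is $1 + a/\sqrt{2\pi}$. The segment length below is an arbitrary illustrative choice:

```python
import math
import numpy as np

rng = np.random.default_rng(1)
a = 2.0                      # assumed segment length, K = [0, a] in R^1
g = rng.standard_normal(1_000_000)

# sup over x in [0, a] of x*g - x^2/2: the unconstrained maximizer is x = g,
# clipped to the segment [0, a].
x_star = np.clip(g, 0.0, a)
sup_val = x_star * g - 0.5 * x_star**2

mc = np.exp(sup_val).mean()
exact = 1.0 + a / math.sqrt(2.0 * math.pi)   # V_0 + V_1 / sqrt(2*pi)
print(f"Monte Carlo: {mc:.4f}, intrinsic-volume expansion: {exact:.4f}")
```

The two numbers agree up to Monte Carlo error, illustrating on the simplest convex body how the expectation on the left of the identity matches the intrinsic-volume sum on the right.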

Proof of the theorem
Because there is no general formula available for the evaluation of higher-dimensional solid angles, from now on we restrict ourselves to the Brownian bridge setting.
Lemma 1. Proof: Let $D_J$ be the volume of the parallelepiped generated by the vectors $\overrightarrow{P_{i_0}P_j}$, $j \in J$, $j \neq i_0$. Then $|F_J| = D_J/k!$. Note that $D_J$ is just the square root of the determinant of the $k \times k$ Gram matrix of these vectors. Subtracting the $\frac{n-d_k}{n-d_{k-1}}$ multiple of row $k-1$ from row $k$, and expanding along row $k$, we obtain the reduced determinant. Continuing this procedure, we finally obtain the claimed formula.

Lemma 1 follows.
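The identity $|F_J| = D_J/k!$ used in the proof can be checked directly: the $k$-dimensional volume of a simplex is the square root of the Gram determinant of its edge vectors, divided by $k!$. A small sketch with arbitrary example simplices (not the Brownian-bridge points $P_i$):

```python
import math
import numpy as np

def simplex_volume(vertices):
    """k-dimensional volume of the simplex on the given (k+1) vertices,
    via the Gram determinant of its edge vectors: sqrt(det(V V^T)) / k!."""
    vertices = np.asarray(vertices, dtype=float)
    V = vertices[1:] - vertices[0]          # k edge vectors, as rows
    gram = V @ V.T                          # k x k Gram matrix
    k = V.shape[0]
    return math.sqrt(np.linalg.det(gram)) / math.factorial(k)

# Right triangle with legs of length 1: area 1/2.
print(simplex_volume([[0, 0], [1, 0], [0, 1]]))   # 0.5

# Triangle on e_1, e_2, e_3 in R^3: side length sqrt(2), area sqrt(3)/2.
print(simplex_volume([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))
```

The Gram-determinant form is convenient here because the faces $F_J$ live in a lower-dimensional subspace of the ambient space, where a plain determinant of coordinates is not available.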
In order to evaluate the $k$-th intrinsic volume of $K$, we need to compute the Gaussian measure of the normal cone $N(F_J, K)$ at $F_J$. For $1 \le j \le m+1$, denote by $e_j$ the unit vector in $\mathbb{R}^{m+1}$ whose $j$-th coordinate is 1. Let $u_0 = e_1 - e_{m+1}$, and $u_j = e_{j+1} - e_j$ for $1 \le j \le m$.
It can be checked that the extreme rays of the normal cone at $F_J$ are $u_j$, $0 \le j \le m$, $j \notin J$. These vectors can be separated into $k+1$ groups: $B_j = \{u_l : i_{j-1} < l < i_j\}$, $j = 1, 2, \ldots, k$, and $C = \{u_l : l < i_0 \text{ or } l > i_k\}$. The vectors from different groups are mutually orthogonal. Thus, $\gamma(N(F_J, K))$ can be evaluated as

$$\gamma(N(F_J, K)) = \gamma(C)\prod_{j=1}^{k}\gamma(B_j),$$

where $\gamma(B_j)$ and $\gamma(C)$ mean the Gaussian measures of the cones generated by the vectors in $B_j$ and in $C$, respectively.
Proof: Add the vector $x := e_{i_{j-1}+1} - e_{i_j}$ to $B_j$. The expanded group contains $i_j - i_{j-1}$ vectors, and the sum of these vectors is $0$. Thus any linear combination of these vectors can be expressed as a convex combination of $i_j - i_{j-1} - 1$ of them: subtracting a suitable multiple of the zero sum makes all coefficients nonnegative, with at least one of them vanishing. Because any $i_j - i_{j-1} - 1$ vectors from this extended group are linearly independent, such a convex-combination expression is unique (except on a lower-dimensional subset). Consider the $i_j - i_{j-1}$ cones formed by choosing $i_j - i_{j-1} - 1$ vectors from the extended group. The argument above implies that these cones form a partition of an $(i_j - i_{j-1} - 1)$-dimensional space. By looking at the inner products of the vectors, we notice that all these cones are reflections of one another; therefore, they have the same measure. Hence each cone has Gaussian measure $1/(i_j - i_{j-1})$. To compute $\gamma(C)$, we notice that the inner-product structure of the vectors in $C$ is the same as that of the vectors $u_1, u_2, \ldots, u_{m-i_k+i_0}$, so the two cones have the same measure. The latter has measure $1/(m - i_k + i_0)$ by the previous argument.
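The partition-by-reflections argument can be checked numerically in the smallest case. Take $u_1 = e_2 - e_1$ and $u_2 = e_3 - e_2$ in $\mathbb{R}^3$, with $x = e_1 - e_3$, so the three vectors sum to zero and span the plane $\{y : y_1 + y_2 + y_3 = 0\}$; the three resulting cones partition that plane, so the cone generated by $u_1, u_2$ should have relative Gaussian measure $1/3$. A Monte Carlo sketch (the sampling scheme is an illustration, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(2)

u1 = np.array([-1.0, 1.0, 0.0])   # e_2 - e_1
u2 = np.array([0.0, -1.0, 1.0])   # e_3 - e_2

# Orthonormal basis of the plane {y : y_1 + y_2 + y_3 = 0} spanned by u1, u2.
q, _ = np.linalg.qr(np.stack([u1, u2], axis=1))
n = 200_000
g = rng.standard_normal((n, 2)) @ q.T     # standard Gaussian within the plane

# g lies in cone(u1, u2) iff g = a*u1 + b*u2 with a, b >= 0; since g is in
# the span of u1, u2, least squares recovers the coefficients exactly.
A = np.stack([u1, u2], axis=1)            # 3 x 2
coeffs, *_ = np.linalg.lstsq(A, g.T, rcond=None)
inside = np.all(coeffs >= 0, axis=0)

print(f"estimated Gaussian measure: {inside.mean():.4f} (expected 1/3)")
```

Equivalently, the angle between $u_1$ and $u_2$ is $120°$ (their normalized inner product is $-1/2$), so the cone occupies one third of the plane, matching the $1/(i_j - i_{j-1})$ count with $i_j - i_{j-1} = 3$.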
Proof: Applying Lemma 2 to (9), together with (8) and (7), and changing variables via $l_k = i_k - i_{k-1}$, the lemma follows.

Proof of Theorem 1: As $n \to \infty$, $m/n \to T$. By applying Lemma 3, we obtain the limit of the intrinsic volumes. By a change of variables, the right-hand side can be expressed in terms of $\beta_T$, the incomplete Beta function. In particular, if $T = 1$, we have

$$V_k(K_n/\sqrt{n}) \to \frac{\omega_{k+1}}{2\,k!}.$$

Together with (6), this implies (4). The proof of (5) follows from (4) and the result in [2], which states that the corresponding $k$-th intrinsic volume for Brownian motion approaches $\omega_k/k!$.
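For reference, the constants appearing in these limits can be tabulated from the formula $\omega_k = \pi^{k/2}/\Gamma(k/2+1)$ stated in the introduction, which gives $\omega_0 = 1$, $\omega_1 = 2$, $\omega_2 = \pi$, $\omega_3 = 4\pi/3$. The small table below, comparing the bridge limit $\omega_{k+1}/(2\,k!)$ with the Brownian-motion limit $\omega_k/k!$, is illustrative only:

```python
import math

def omega(k):
    """Volume of the k-dimensional unit ball: pi^(k/2) / Gamma(k/2 + 1)."""
    return math.pi ** (k / 2) / math.gamma(k / 2 + 1)

# Sanity checks against known values.
assert abs(omega(1) - 2.0) < 1e-12              # the interval [-1, 1]
assert abs(omega(2) - math.pi) < 1e-12          # the unit disk
assert abs(omega(3) - 4 * math.pi / 3) < 1e-12  # the unit ball

for k in range(5):
    bridge = omega(k + 1) / (2 * math.factorial(k))  # limit of V_k, Brownian bridge
    motion = omega(k) / math.factorial(k)            # limit of V_k, Brownian motion
    print(f"k={k}: bridge {bridge:.4f}  motion {motion:.4f}")
```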