NSF-CBMS Regional Conference Series in Probability and Statistics

Empirical Processes: Theory and Applications

David Pollard

Book information

Publication information
Regional Conference Series in Probability and Statistics, Volume 2
Hayward, CA and Alexandria, VA: Institute of Mathematical Statistics and American Statistical Association, 1990
86 pp.

Publication date: 1990
First available in Project Euclid: 1 May 2016



David Pollard, Empirical Processes: Theory and Applications (Hayward, CA: Institute of Mathematical Statistics; Alexandria, VA: American Statistical Association, 1990)


These notes grew from lectures I gave at the University of Iowa in July of 1988, as part of the NSF-CBMS Regional Conference Series. The conference was ably organized by Tim Robertson and Richard Dykstra. I am most grateful to them for giving me the opportunity to experiment on a live and receptive audience with material not entirely polished. I also appreciate the suggestions and comments of Richard Dudley. Much of the lecture material was repackaging of ideas originally due to him.

In reworking the lecture notes I have tried (not always successfully) to resist the urge to push the presentation to ever higher levels of generality. My aim has been to introduce just enough technique to handle typical nontrivial asymptotic problems in statistics and econometrics. Of course the four substantial examples that represent the applications part of the lectures do not exhaust the possible uses for the theory. I have chosen them because they cleanly illustrate specific aspects of the theory, and also because I admire the original papers.

To anyone who is acquainted with the empirical process literature these notes might appear misleadingly titled. Empirical process theory usually deals with sums of independent, identically distributed random variables $f(\xi_i(\omega))$, with $f$ running over a class of functions $\mathcal{F}$. However, I have chosen to present results for sums of independent stochastic processes $f_i(\omega,t)$ indexed by a set $T$. Such a setting accommodates not only the relatively straightforward generalization to nonidentically distributed $\{\xi_i\}$ but also such simple modifications as a rescaling of the summands by a factor that depends on $i$ and $\omega$. It has often irked me that the traditional notation cannot handle summands such as $f(\xi_i)/i$, even though the basic probabilistic method is unaffected.
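To make the point concrete (a sketch in the notation just described, not a passage from the lectures themselves): writing $\{f_t : t \in T\}$ for the class of functions, the rescaled summands fit the more general framework by taking
\[
f_i(\omega, t) = \frac{f_t(\xi_i(\omega))}{i}, \qquad i = 1, 2, \dots,
\]
so that $\sum_{i \le n} f_i(\omega, t)$ is a sum of independent stochastic processes indexed by $T$, even though it is not of the classical form $\sum_{i \le n} f_t(\xi_i(\omega))$.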

The cost of the modified notation appears in two ways. Some familiar-looking objects no longer have their usual meanings. For example, $\mathcal{F}$ will now stand for a subset of $\mathbb{R}^n$ rather than for a class of functions. Also, some results, such as the analogues in Section 4 of the standard Vapnik-Červonenkis theory, become a trifle less general than in the traditional setting. The benefits include the natural reinterpretation of the Vapnik-Červonenkis property as a sort of dimensionality concept, and the transformation of $\mathcal{L}^{2}(P_n)$ pseudometrics on classes of functions into the usual ($\ell_2$) Euclidean distances in $\mathbb{R}^n$.
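The last correspondence is just a standard identity (stated here for illustration, with $\xi_1, \dots, \xi_n$ the sample points): for functions $f$ and $g$,
\[
\Big( \frac{1}{n} \sum_{i=1}^{n} \big( f(\xi_i) - g(\xi_i) \big)^2 \Big)^{1/2}
= n^{-1/2} \, \big\| \big(f(\xi_1), \dots, f(\xi_n)\big) - \big(g(\xi_1), \dots, g(\xi_n)\big) \big\|_2 ,
\]
so the $\mathcal{L}^{2}(P_n)$ distance between two functions is, up to the factor $n^{-1/2}$, the Euclidean distance between their vectors of sample values in $\mathbb{R}^n$.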