Abstract
A basic result of large deviations theory is Sanov’s theorem, which states that the sequence of empirical measures of independent and identically distributed samples satisfies the large deviation principle with rate function given by relative entropy with respect to the common distribution. Large deviation principles for the empirical measures are also known to hold for broad classes of weakly interacting systems. When the interaction through the empirical measure corresponds to an absolutely continuous change of measure, the rate function can be expressed as relative entropy of a distribution with respect to the law of the McKean–Vlasov limit with measure-variable frozen at that distribution. We discuss situations, beyond that of tilted distributions, in which a large deviation principle holds with rate function in relative entropy form.
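To fix ideas, here is a minimal sketch of the two rate function forms alluded to above; the notation (S, \mu, \nu, \theta, P_\theta) is ours and not taken from the paper.

% Sanov's theorem: for i.i.d. samples X_1, ..., X_n with common law \mu on a Polish space S,
% the empirical measures satisfy the LDP on \mathcal{P}(S) with rate function the relative entropy.
\[
  \mu^{n} \;=\; \frac{1}{n}\sum_{i=1}^{n} \delta_{X_i},
  \qquad
  I(\nu) \;=\; R(\nu \,\|\, \mu)
  \;=\;
  \begin{cases}
    \displaystyle\int_{S} \log\!\Big(\tfrac{d\nu}{d\mu}\Big)\, d\nu, & \nu \ll \mu,\\[4pt]
    +\infty, & \text{otherwise.}
  \end{cases}
\]
% Relative entropy form for weakly interacting systems: writing P_\theta for the law of the
% McKean--Vlasov limit dynamics with the measure variable frozen at \theta, the rate function
% discussed in the abstract takes the form
\[
  I(\theta) \;=\; R\big(\theta \,\|\, P_{\theta}\big).
\]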
Citation
Markus Fischer. "On the form of the large deviation rate function for the empirical measures of weakly interacting systems." Bernoulli 20(4), 1765–1801, November 2014. https://doi.org/10.3150/13-BEJ540
Information