Divergence (statistics)


In statistics and information geometry, a divergence or a contrast function is a function which establishes the "distance" of one probability distribution from another on a statistical manifold. A divergence is a weaker notion than a distance: in particular, a divergence need not be symmetric and need not satisfy the triangle inequality.

Definition

Suppose S is a space of all probability distributions with common support. Then a divergence on S is a function D(⋅ ∥ ⋅) : S × S → ℝ satisfying
  1. D(p ∥ q) ≥ 0 for all p, q ∈ S,
  2. D(p ∥ q) = 0 if and only if p = q.
The dual divergence D* is defined as D*(p ∥ q) = D(q ∥ p).
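For instance, the Kullback–Leibler divergence (given in the Examples section below) satisfies both conditions but is not symmetric, which is why a divergence is weaker than a metric. The following minimal Python sketch, added here for illustration with arbitrary example distributions, checks the two conditions and the dual numerically:

  import numpy as np

  def kl(p, q):
      # Kullback-Leibler divergence D(p || q) for strictly positive discrete distributions
      return np.sum(p * np.log(p / q))

  def kl_dual(p, q):
      # dual divergence D*(p || q) = D(q || p)
      return kl(q, p)

  p = np.array([0.5, 0.3, 0.2])
  q = np.array([0.2, 0.5, 0.3])

  print(kl(p, q) >= 0 and kl(q, p) >= 0)      # True: non-negativity
  print(np.isclose(kl(p, p), 0.0))            # True: D(p || p) = 0
  print(np.isclose(kl(p, q), kl_dual(p, q)))  # False: KL is not symmetric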

Geometrical properties

Many properties of divergences can be derived if we restrict S to be a statistical manifold, meaning that it can be parametrized with a finite-dimensional coordinate system θ, so that for a distribution p ∈ S we can write p = p(θ).

For a pair of points p, q ∈ S with coordinates θ_p and θ_q, denote the partial derivatives of D(p, q) as

  D((\partial_i)_p, q) = \frac{\partial}{\partial\theta_p^i} D(p, q), \qquad
  D((\partial_i \partial_j)_p, (\partial_k)_q) = \frac{\partial}{\partial\theta_p^i} \frac{\partial}{\partial\theta_p^j} \frac{\partial}{\partial\theta_q^k} D(p, q), \quad \text{etc.}

Now we restrict these functions to the diagonal p = q, and denote

  D[\partial_i, \cdot] : p \mapsto D((\partial_i)_p, p), \qquad
  D[\partial_i, \partial_j] : p \mapsto D((\partial_i)_p, (\partial_j)_p), \quad \text{etc.}

By definition, the function D(p, q) is minimized at p = q, and therefore

  D[\partial_i, \cdot] = D[\cdot, \partial_i] = 0,
  D[\partial_i \partial_j, \cdot] = D[\cdot, \partial_i \partial_j] = -D[\partial_i, \partial_j] \equiv g^{(D)}_{ij},

where the matrix g^(D) is positive semi-definite and defines a unique Riemannian metric on the manifold S.

The divergence D(⋅, ⋅) also defines a unique torsion-free affine connection ∇^(D) with coefficients

  \Gamma^{(D)}_{ij,k} = -D[\partial_i \partial_j, \partial_k],

and the dual to this connection, ∇*, is generated by the dual divergence D*.

Thus, a divergence D(⋅, ⋅) generates on a statistical manifold a unique dualistic structure (g^(D), ∇^(D), ∇^(D*)). The converse is also true: every torsion-free dualistic structure on a statistical manifold is induced from some globally defined divergence function (which, however, need not be unique).

For example, when D is an f-divergence for some function f, it generates the metric g^(D_f) = c·g and the connection ∇^(D_f) = ∇^(α), where g is the canonical Fisher information metric, ∇^(α) is the α-connection, c = f″(1), and α = 3 + 2f‴(1)/f″(1).
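As a worked check of these formulas, added here for illustration and not part of the original text, take the squared Euclidean divergence on the coordinates, D(p, q) = ½ Σ_i (θ_p^i − θ_q^i)². Then

  D[\partial_i, \partial_j] = \left.\frac{\partial^2 D}{\partial\theta_p^i \, \partial\theta_q^j}\right|_{q=p} = -\delta_{ij},
  \qquad g^{(D)}_{ij} = \delta_{ij},
  \qquad \Gamma^{(D)}_{ij,k} = -\left.\frac{\partial^3 D}{\partial\theta_p^i \, \partial\theta_p^j \, \partial\theta_q^k}\right|_{q=p} = 0,

so the induced structure is the flat Euclidean metric with the flat affine connection; since this divergence is symmetric, D* = D and the structure is self-dual.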

Examples

The two most important divergences are the relative entropy, which is central to information theory and statistics, and the squared Euclidean distance. Minimizing these two divergences is the main way that linear inverse problems are solved, via the principle of maximum entropy and least squares, notably in logistic regression and linear regression.
The two most important classes of divergences are the f-divergences and Bregman divergences; however, other types of divergence functions are also encountered in the literature. The only divergence that is both an f-divergence and a Bregman divergence is the Kullback–Leibler divergence; the squared Euclidean divergence is a Bregman divergence, but not an f-divergence.

f-divergences

This family of divergences is generated through functions f(t), convex on (0, +∞) and such that f(1) = 0. Then an f-divergence is defined as

  D_f(p, q) = \int p(x) \, f\!\left(\frac{q(x)}{p(x)}\right) dx.

Kullback–Leibler divergence: D_{\mathrm{KL}}(p, q) = \int p(x) \ln\frac{p(x)}{q(x)} \, dx
squared Hellinger distance: H^2(p, q) = \int \left(\sqrt{p(x)} - \sqrt{q(x)}\right)^{\!2} dx
Jeffreys divergence: \int \left(p(x) - q(x)\right) \ln\frac{p(x)}{q(x)} \, dx
Chernoff's α-divergence: \frac{4}{1-\alpha^2}\left(1 - \int p(x)^{\frac{1-\alpha}{2}} q(x)^{\frac{1+\alpha}{2}} \, dx\right)
exponential divergence: \int p(x) \left(\ln\frac{p(x)}{q(x)}\right)^{\!2} dx
Kagan's divergence: \frac{1}{2}\int \frac{\left(p(x) - q(x)\right)^2}{p(x)} \, dx
(α, β)-product divergence: \frac{2}{(1-\alpha)(1-\beta)} \int \left(1 - \left(\frac{q(x)}{p(x)}\right)^{\!\frac{1-\alpha}{2}}\right) \left(1 - \left(\frac{q(x)}{p(x)}\right)^{\!\frac{1-\beta}{2}}\right) p(x) \, dx
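The following minimal Python sketch, added here for illustration and not from the original article, evaluates the generic discrete formula D_f(p, q) = Σ_x p(x) f(q(x)/p(x)) with the generator functions for three of the divergences listed above: f(t) = −ln t for the Kullback–Leibler divergence, f(t) = (√t − 1)² for the squared Hellinger distance, and f(t) = (t − 1) ln t for the Jeffreys divergence. The example distributions are arbitrary.

  import numpy as np

  def f_divergence(f, p, q):
      # discrete f-divergence: sum_x p(x) f(q(x)/p(x))
      return np.sum(p * f(q / p))

  f_kl        = lambda t: -np.log(t)           # Kullback-Leibler divergence
  f_hellinger = lambda t: (np.sqrt(t) - 1)**2  # squared Hellinger distance
  f_jeffreys  = lambda t: (t - 1) * np.log(t)  # Jeffreys divergence

  p = np.array([0.5, 0.3, 0.2])   # arbitrary strictly positive distributions
  q = np.array([0.2, 0.5, 0.3])

  print(f_divergence(f_kl, p, q), np.sum(p * np.log(p / q)))  # both equal KL(p || q)
  print(f_divergence(f_hellinger, p, q))
  print(f_divergence(f_jeffreys, p, q))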

If a Markov process has a positive equilibrium probability distribution p*, then the divergence D_f(p(t) ∥ p*) is a monotonic (non-increasing) function of time, where the probability distribution p(t) is a solution of the Kolmogorov forward equations, used to describe the time evolution of the probability distribution in the Markov process. This means that all f-divergences D_f(p(t) ∥ p*) are Lyapunov functions of the Kolmogorov forward equations. The converse statement is also true: if H(p) is a Lyapunov function for all Markov chains with positive equilibrium p* and is of the trace form

  H(p) = \sum_i h(p_i, p^*_i),

then H(p) = \sum_i p^*_i \, f\!\left(\frac{p_i}{p^*_i}\right) for some convex function f, i.e. H is an f-divergence between p and the equilibrium p*. Bregman divergences in general do not have this property and can increase during the evolution of a Markov process.
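This monotonicity can be illustrated numerically with a minimal Python sketch (added for illustration, using an arbitrary, hypothetical transition matrix and a discrete-time chain rather than the continuous-time forward equations): the Kullback–Leibler divergence from the current distribution to the equilibrium never increases from one step to the next.

  import numpy as np

  # hypothetical row-stochastic transition matrix with a strictly positive equilibrium
  P = np.array([[0.9, 0.1, 0.0],
                [0.2, 0.7, 0.1],
                [0.1, 0.3, 0.6]])

  # equilibrium distribution: left eigenvector of P for eigenvalue 1, normalized to sum to 1
  evals, evecs = np.linalg.eig(P.T)
  pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
  pi = pi / pi.sum()

  def kl(p, q):
      return np.sum(p * np.log(p / q))

  p = np.array([0.8, 0.15, 0.05])   # arbitrary initial distribution
  divergences = []
  for t in range(10):
      divergences.append(kl(p, pi))
      p = p @ P                     # one step of the chain

  print(np.all(np.diff(divergences) <= 1e-12))  # True: KL(p(t) || pi) is non-increasing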

Bregman divergences

Bregman divergences correspond to convex functions on convex sets. Given a strictly convex, continuously differentiable function F on a convex set, known as the Bregman generator, the Bregman divergence measures the convexity of F: the error of the linear approximation of F from q as an approximation of the value at p:

  B_F(p, q) = F(p) - F(q) - \langle \nabla F(q), \, p - q \rangle.
The dual divergence to a Bregman divergence is the divergence generated by the convex conjugate F* of the Bregman generator of the original divergence. For example, for the squared Euclidean distance the generator is the squared norm x ↦ ‖x‖², while for the relative entropy the generator is the negative entropy x ↦ Σ_i x_i ln x_i.
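As a minimal Python sketch, added here for illustration with arbitrary example vectors, the following evaluates a Bregman divergence from a generator and its gradient, recovering the squared Euclidean distance from the squared norm and, on probability vectors, the relative entropy from the negative entropy.

  import numpy as np

  def bregman(F, grad_F, p, q):
      # Bregman divergence B_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
      return F(p) - F(q) - np.dot(grad_F(q), p - q)

  # squared Euclidean distance from the generator F(x) = ||x||^2
  sq_norm      = lambda x: np.dot(x, x)
  grad_sq_norm = lambda x: 2 * x

  # relative entropy from the negative-entropy generator F(x) = sum_i x_i ln x_i
  neg_entropy      = lambda x: np.sum(x * np.log(x))
  grad_neg_entropy = lambda x: np.log(x) + 1

  p = np.array([0.5, 0.3, 0.2])
  q = np.array([0.2, 0.5, 0.3])

  print(bregman(sq_norm, grad_sq_norm, p, q), np.sum((p - q)**2))          # squared Euclidean distance
  print(bregman(neg_entropy, grad_neg_entropy, p, q), np.sum(p * np.log(p / q)))  # KL divergence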

History

The term "divergence" for a statistical distance was used informally in various contexts from c. 1910 to c. 1940. Its formal use dates at least to, entitled "On a measure of divergence between two statistical populations defined by their probability distributions", which defined the Bhattacharyya distance, and, entitled "On a Measure of Divergence between Two Multinomial Populations", which defined the Bhattacharyya angle. The term was popularized by its use for the Kullback–Leibler divergence in, its use in the textbook, and then by generally, for the class of f-divergences. The term "Bregman distance" is still found, but "Bregman divergence" is now preferred. In information geometry, alternative terms were initially used, including "quasi-distance" and "contrast function", though "divergence" was used in for the -divergence, and has become standard.