For random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, we define the conditional mutual information as
$$I(X;Y|Z) = \int_{\mathcal{Z}} D_{\mathrm{KL}}\!\left( P_{(X,Y)|Z} \,\big\|\, P_{X|Z} \otimes P_{Y|Z} \right) dP_{Z}.$$
This may be written in terms of the expectation operator:
$$I(X;Y|Z) = \mathbb{E}_{Z}\!\left[ D_{\mathrm{KL}}\!\left( P_{(X,Y)|Z} \,\big\|\, P_{X|Z} \otimes P_{Y|Z} \right) \right].$$
Thus $I(X;Y|Z)$ is the expected (with respect to $Z$) Kullback–Leibler divergence from the conditional joint distribution $P_{(X,Y)|Z}$ to the product of the conditional marginals $P_{X|Z}$ and $P_{Y|Z}$. Compare with the definition of mutual information.
In terms of pmf's for discrete distributions
For discrete random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, the conditional mutual information $I(X;Y|Z)$ is as follows
$$I(X;Y|Z) = \sum_{z\in\mathcal{Z}} p_Z(z) \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y|Z}(x,y|z) \log \frac{p_{X,Y|Z}(x,y|z)}{p_{X|Z}(x|z)\,p_{Y|Z}(y|z)}$$
where the marginal, joint, and/or conditional probability mass functions are denoted by $p$ with the appropriate subscript. This can be simplified as
$$I(X;Y|Z) = \sum_{z\in\mathcal{Z}} \sum_{y\in\mathcal{Y}} \sum_{x\in\mathcal{X}} p_{X,Y,Z}(x,y,z) \log \frac{p_Z(z)\,p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\,p_{Y,Z}(y,z)}.$$
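The simplified sum above translates directly into code. The following is a minimal sketch (the function name and the toy distribution are illustrative, not from the source) that evaluates $I(X;Y|Z)$ in bits from a joint pmf stored as a 3-D array:

```python
import numpy as np

def conditional_mutual_information(p_xyz):
    """I(X;Y|Z) in bits from a joint pmf given as a 3-D array p_xyz[x, y, z]."""
    p_xyz = np.asarray(p_xyz, dtype=float)
    p_z = p_xyz.sum(axis=(0, 1))   # p_Z(z)
    p_xz = p_xyz.sum(axis=1)       # p_{X,Z}(x, z)
    p_yz = p_xyz.sum(axis=0)       # p_{Y,Z}(y, z)
    cmi = 0.0
    for x in range(p_xyz.shape[0]):
        for y in range(p_xyz.shape[1]):
            for z in range(p_xyz.shape[2]):
                p = p_xyz[x, y, z]
                if p > 0:
                    # log of p_Z(z) p_{X,Y,Z}(x,y,z) / (p_{X,Z}(x,z) p_{Y,Z}(y,z))
                    cmi += p * np.log2(p_z[z] * p / (p_xz[x, z] * p_yz[y, z]))
    return cmi

# Toy example: X a fair bit, Y = X, and Z an independent fair bit.
p = np.zeros((2, 2, 2))
for x in range(2):
    for z in range(2):
        p[x, x, z] = 0.25
print(conditional_mutual_information(p))   # → 1.0 (one bit: Y is determined by X given Z)
```

Summing only over atoms with $p>0$ mirrors the usual convention $0\log 0 = 0$.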
In terms of pdf's for continuous distributions
For continuous random variables $X$, $Y$, and $Z$ with support sets $\mathcal{X}$, $\mathcal{Y}$ and $\mathcal{Z}$, the conditional mutual information $I(X;Y|Z)$ is as follows
$$I(X;Y|Z) = \int_{\mathcal{Z}} \left( \int_{\mathcal{Y}} \int_{\mathcal{X}} \log\!\left( \frac{p_{X,Y|Z}(x,y|z)}{p_{X|Z}(x|z)\,p_{Y|Z}(y|z)} \right) p_{X,Y|Z}(x,y|z)\, dx\, dy \right) p_Z(z)\, dz$$
where the marginal, joint, and/or conditional probability density functions are denoted by $p$ with the appropriate subscript. This can be simplified as
$$I(X;Y|Z) = \int_{\mathcal{Z}} \int_{\mathcal{Y}} \int_{\mathcal{X}} \log\!\left( \frac{p_Z(z)\,p_{X,Y,Z}(x,y,z)}{p_{X,Z}(x,z)\,p_{Y,Z}(y,z)} \right) p_{X,Y,Z}(x,y,z)\, dx\, dy\, dz.$$
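For jointly Gaussian variables this integral has a closed form: each differential entropy in $I(X;Y|Z) = h(X,Z) + h(Y,Z) - h(X,Y,Z) - h(Z)$ reduces to $\tfrac{1}{2}\log\det$ of a sub-covariance (the $2\pi e$ factors cancel). A sketch under that assumption, with an illustrative function name:

```python
import numpy as np

def gaussian_cmi(cov, ix, iy, iz):
    """I(X;Y|Z) in nats for jointly Gaussian variables, given the full
    covariance matrix and index lists for the X, Y, Z blocks.
    Uses I(X;Y|Z) = h(X,Z) + h(Y,Z) - h(X,Y,Z) - h(Z)."""
    def logdet(idx):
        # log-det of the empty (0x0) sub-matrix is 0, so an empty Z
        # recovers the unconditional mutual information I(X;Y)
        return np.linalg.slogdet(cov[np.ix_(idx, idx)])[1]
    return 0.5 * (logdet(ix + iz) + logdet(iy + iz)
                  - logdet(ix + iy + iz) - logdet(iz))

# Common-cause example: Z ~ N(0,1), X = Z + noise, Y = Z + noise
cov = np.array([[2.0, 1.0, 1.0],
                [1.0, 2.0, 1.0],
                [1.0, 1.0, 1.0]])   # order: X, Y, Z
print(gaussian_cmi(cov, [0], [1], [2]))  # ≈ 0: X and Y are independent given Z
print(gaussian_cmi(cov, [0], [1], []))   # > 0: unconditionally they are correlated
```

Here conditioning on the common cause $Z$ removes all dependence between $X$ and $Y$, so the conditional mutual information vanishes while the unconditional one does not.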
Some identities
Alternatively, we may write $I(X;Y|Z)$ in terms of joint and conditional entropies as
$$I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z) = H(X|Z) - H(X|Y,Z) = H(X|Z) + H(Y|Z) - H(X,Y|Z).$$
This can be rewritten to show its relationship to mutual information
$$I(X;Y|Z) = I(X;Y,Z) - I(X;Z),$$
usually rearranged as the chain rule for mutual information
$$I(X;Y,Z) = I(X;Z) + I(X;Y|Z).$$
Another equivalent form of the above is
$$I(X;Y|Z) = H(Z|X) + H(X) + H(Z|Y) + H(Y) - H(Z|X,Y) - H(X,Y) - H(Z) = I(X;Y) + H(Z|X) + H(Z|Y) - H(Z|X,Y) - H(Z).$$
Like mutual information, conditional mutual information can be expressed as a Kullback–Leibler divergence:
$$I(X;Y|Z) = D_{\mathrm{KL}}\!\left[ p(X,Y,Z) \,\big\|\, p(X|Z)\,p(Y|Z)\,p(Z) \right].$$
Or as an expected value of simpler Kullback–Leibler divergences:
$$I(X;Y|Z) = \sum_{z\in\mathcal{Z}} p(z)\, D_{\mathrm{KL}}\!\left[ p(X,Y|z) \,\big\|\, p(X|z)\,p(Y|z) \right] = \sum_{y\in\mathcal{Y}} p(y)\, D_{\mathrm{KL}}\!\left[ p(X,Z|y) \,\big\|\, p(X|Z)\,p(Z|y) \right].$$
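The entropy identity and the chain rule can be checked numerically on an arbitrary discrete distribution. A minimal sketch (all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.random((2, 3, 2))
p /= p.sum()                 # random joint pmf p(x, y, z)

def H(q):
    """Shannon entropy in bits of a pmf given as an array of any shape."""
    q = q[q > 0]
    return -(q * np.log2(q)).sum()

# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
cmi = H(p.sum(axis=1)) + H(p.sum(axis=0)) - H(p) - H(p.sum(axis=(0, 1)))

def I(pab):
    """Mutual information from a 2-D joint pmf."""
    return H(pab.sum(axis=1)) + H(pab.sum(axis=0)) - H(pab)

# Chain rule: I(X;Y,Z) = I(X;Z) + I(X;Y|Z)
i_x_yz = I(p.reshape(2, -1))   # treat (Y, Z) as a single variable
i_x_z = I(p.sum(axis=1))
print(np.isclose(i_x_yz - i_x_z, cmi))   # → True
```

The same marginalization pattern (summing out axes of the joint array) gives every entropy and mutual-information term appearing in the identities above.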
More general definition
A more general definition of conditional mutual information, applicable to random variables with continuous or other arbitrary distributions, will depend on the concept of regular conditional probability.

Let $(\Omega, \mathcal{F}, \mathfrak{P})$ be a probability space, and let the random variables $X$, $Y$, and $Z$ each be defined as a Borel-measurable function from $\Omega$ to some state space endowed with a topological structure.

Consider the Borel measure in the state space of each random variable defined by assigning each Borel set the $\mathfrak{P}$-measure of its preimage in $\mathcal{F}$. This is called the pushforward measure $X_{*}\mathfrak{P} = \mathfrak{P}\big(X^{-1}(\cdot)\big)$. The support of a random variable is defined to be the topological support of this measure, i.e. $\operatorname{supp} X = \operatorname{supp} X_{*}\mathfrak{P}$.

Now we can formally define the conditional probability measure given the value of one of the random variables. Let $M$ be a measurable subset of $\Omega$ (i.e. $M \in \mathcal{F}$), and let $x \in \operatorname{supp} X$. Then, using the disintegration theorem:
$$\mathfrak{P}(M \mid X=x) = \lim_{U \ni x} \frac{\mathfrak{P}\big(M \cap \{X \in U\}\big)}{\mathfrak{P}\big(\{X \in U\}\big)},$$
where the limit is taken over the open neighborhoods $U$ of $x$, as they are allowed to become arbitrarily smaller with respect to set inclusion.

Finally we can define the conditional mutual information via Lebesgue integration:
$$I(X;Y|Z) = \int_{\Omega} \log \frac{d\mathfrak{P}(\omega \mid X,Z)\; d\mathfrak{P}(\omega \mid Y,Z)}{d\mathfrak{P}(\omega \mid Z)\; d\mathfrak{P}(\omega \mid X,Y,Z)}\, d\mathfrak{P}(\omega),$$
where the integrand is the logarithm of a Radon–Nikodym derivative involving some of the conditional probability measures we have just defined.
Note on notation
In an expression such as $I(A;B|C)$, the symbols $A$, $B$, and $C$ need not necessarily be restricted to representing individual random variables, but could also represent the joint distribution of any collection of random variables defined on the same probability space. As is common in probability theory, we may use the comma to denote such a joint distribution, e.g. $I(A_0,A_1;B_1,B_2,B_3|C_0,C_1)$. Hence the use of the semicolon to separate the principal arguments of the mutual information symbol.
Properties
Nonnegativity
It is always true that
$$I(X;Y|Z) \ge 0$$
for discrete, jointly distributed random variables $X$, $Y$ and $Z$. This result has been used as a basic building block for proving other inequalities in information theory, in particular, those known as Shannon-type inequalities.
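Nonnegativity is easy to spot-check empirically. The sketch below (names illustrative) draws random joint pmfs and verifies that the computed conditional mutual information never dips below zero, up to floating-point rounding:

```python
import numpy as np

rng = np.random.default_rng(1)

def cmi_bits(p):
    """I(X;Y|Z) in bits from a 3-D joint pmf, via the entropy identity."""
    def H(q):
        q = q[q > 0]
        return -(q * np.log2(q)).sum()
    return H(p.sum(axis=1)) + H(p.sum(axis=0)) - H(p) - H(p.sum(axis=(0, 1)))

# I(X;Y|Z) >= 0 for every discrete joint distribution; check 1000 random pmfs
# (a tolerance absorbs tiny negative values from rounding).
for _ in range(1000):
    p = rng.random((3, 3, 3))
    p /= p.sum()
    assert cmi_bits(p) >= -1e-12
print("all nonnegative")
```

A random search of course proves nothing, but it is a useful sanity check on any implementation of the formulas above.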
Interaction information
Conditioning on a third random variable may either increase or decrease the mutual information: that is, the difference $I(X;Y) - I(X;Y|Z)$, called the interaction information, may be positive, negative, or zero. This is the case even when the random variables are pairwise independent. Such is the case when $X \sim \mathrm{Bernoulli}(0.5)$ and $Y \sim \mathrm{Bernoulli}(0.5)$ are independent and $Z = X \oplus Y$ (exclusive or): in this case $X$, $Y$ and $Z$ are pairwise independent and in particular $I(X;Y)=0$, but $I(X;Y|Z)=1$ bit, since knowing $Z$ together with either of $X$ or $Y$ determines the other.
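The XOR triple can be verified directly by enumerating its eight atoms (the helper names below are illustrative):

```python
import numpy as np

# X, Y independent fair bits, Z = X XOR Y: pairwise independent,
# yet conditioning on Z creates one full bit of dependence.
p = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        p[x, y, x ^ y] = 0.25

def H(q):
    """Shannon entropy in bits."""
    q = q[q > 0]
    return -(q * np.log2(q)).sum()

# I(X;Y) = H(X) + H(Y) - H(X,Y)
i_xy = H(p.sum(axis=(1, 2))) + H(p.sum(axis=(0, 2))) - H(p.sum(axis=2))
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
i_xy_given_z = H(p.sum(axis=1)) + H(p.sum(axis=0)) - H(p) - H(p.sum(axis=(0, 1)))
print(i_xy, i_xy_given_z)   # → 0.0 1.0
```

So the interaction information $I(X;Y) - I(X;Y|Z)$ equals $-1$ bit here: conditioning strictly increased the mutual information.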
Multivariate mutual information

The conditional mutual information can be used to inductively define a multivariate mutual information in a set- or measure-theoretic sense in the context of information diagrams. In this sense we define the multivariate mutual information as follows:
$$I(X_1;\cdots;X_{n+1}) = I(X_1;\cdots;X_n) - I(X_1;\cdots;X_n \mid X_{n+1}),$$
where $I(X_1;\cdots;X_n \mid X_{n+1})$ is the multivariate mutual information of $X_1,\ldots,X_n$ with all distributions conditioned on $X_{n+1}$, averaged over $X_{n+1}$. This definition is identical to that of interaction information except for a change in sign in the case of an odd number of random variables. A complication is that this multivariate mutual information can be positive, negative, or zero, which makes this quantity difficult to interpret intuitively. In fact, for $n$ random variables, there are $2^n - n - 1$ degrees of freedom for how they might be correlated in an information-theoretic sense, corresponding to each subset of two or more of these variables. These degrees of freedom are bounded by various Shannon- and non-Shannon-type inequalities in information theory.
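For three variables the recursion gives $I(X;Y;Z) = I(X;Y) - I(X;Y|Z)$, and its sign distinguishes synergy from redundancy. A small sketch (function names illustrative) evaluating it for two extreme cases:

```python
import numpy as np

def H(q):
    """Shannon entropy in bits."""
    q = np.asarray(q, dtype=float)
    q = q[q > 0]
    return -(q * np.log2(q)).sum()

def I3(p):
    """Multivariate mutual information I(X;Y;Z) = I(X;Y) - I(X;Y|Z)
    from a 3-D joint pmf, using the sign convention of the text."""
    pxy = p.sum(axis=2)
    i_xy = H(pxy.sum(axis=1)) + H(pxy.sum(axis=0)) - H(pxy)
    i_xy_z = H(p.sum(axis=1)) + H(p.sum(axis=0)) - H(p) - H(p.sum(axis=(0, 1)))
    return i_xy - i_xy_z

# Synergistic triple: Z = X XOR Y for independent fair bits X, Y
xor = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        xor[x, y, x ^ y] = 0.25
print(I3(xor))   # → -1.0 (purely synergistic)

# Redundant triple: X = Y = Z, a single shared fair bit
eq = np.zeros((2, 2, 2))
eq[0, 0, 0] = eq[1, 1, 1] = 0.5
print(I3(eq))    # → 1.0 (purely redundant)
```

The two cases illustrate why the quantity is hard to interpret: a negative value signals information available only jointly, a positive value signals information shared by all three variables.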