Stein's lemma


Stein's lemma, named in honor of Charles Stein, is a theorem of probability theory that is of interest primarily because of its applications to statistical inference — in particular, to James–Stein estimation and empirical Bayes methods — and its applications to portfolio choice theory. The theorem gives a formula for the covariance of one random variable with the value of a function of another, when the two random variables are jointly normally distributed.

Statement of the lemma

Suppose X is a normally distributed random variable with expectation μ and variance σ². Further suppose g is a differentiable function for which the two expectations E[g(X)(X − μ)] and E[g′(X)] both exist. Then

    E[g(X)(X − μ)] = σ² E[g′(X)].
In general, suppose X and Y are jointly normally distributed. Then

    Cov(g(X), Y) = Cov(X, Y) E[g′(X)].
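As a quick numerical illustration (a minimal sketch, not part of the lemma; the seed, sample size, covariance value 1.2, and the test function g(x) = x³ are all arbitrary choices for this example), both identities can be checked by Monte Carlo simulation in Python with NumPy:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 1.5, 2.0
    x = rng.normal(mu, sigma, size=1_000_000)

    # Univariate form with g(x) = x**3, so g'(x) = 3*x**2.
    lhs = np.mean(x**3 * (x - mu))        # E[g(X)(X - mu)]
    rhs = sigma**2 * np.mean(3 * x**2)    # sigma^2 * E[g'(X)]
    print(lhs, rhs)                       # agree up to Monte Carlo error

    # Joint form: Cov(g(X), Y) = Cov(X, Y) * E[g'(X)].
    cov_xy = 1.2
    xy = rng.multivariate_normal([mu, 0.0],
                                 [[sigma**2, cov_xy], [cov_xy, 1.0]],
                                 size=1_000_000)
    gx, y = xy[:, 0]**3, xy[:, 1]
    print(np.cov(gx, y)[0, 1], cov_xy * np.mean(3 * xy[:, 0]**2))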

Proof

The probability density function for the univariate normal distribution with expectation 0 and variance 1 is

    φ(x) = (1/√(2π)) e^(−x²/2),

and the density for a normal distribution with expectation μ and variance σ² is

    f(x) = (1/σ) φ((x − μ)/σ).

Since φ′(x) = −x φ(x), the density satisfies f′(x) = −((x − μ)/σ²) f(x), that is, (x − μ) f(x) = −σ² f′(x). Then use integration by parts:

    E[g(X)(X − μ)] = ∫ g(x)(x − μ) f(x) dx = −σ² ∫ g(x) f′(x) dx = σ² ∫ g′(x) f(x) dx = σ² E[g′(X)],

where the boundary term g(x) f(x) vanishes as x → ±∞ under the stated integrability assumptions.
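The computation can also be checked symbolically; the following sketch (assuming the SymPy library with its sympy.stats module, and using g(x) = x² as an arbitrary test function) verifies the identity with μ and σ left symbolic:

    import sympy as sp
    from sympy.stats import Normal, E

    x = sp.Symbol('x', real=True)
    mu = sp.Symbol('mu', real=True)
    sigma = sp.Symbol('sigma', positive=True)
    X = Normal('X', mu, sigma)

    g = x**2  # arbitrary test function; g'(x) = 2*x

    # E[g(X)(X - mu)] and sigma^2 * E[g'(X)] should coincide.
    lhs = E(g.subs(x, X) * (X - mu))
    rhs = sigma**2 * E(sp.diff(g, x).subs(x, X))
    print(sp.simplify(lhs - rhs))  # prints 0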

More general statement

Suppose X is in an exponential family, that is, X has the density

    f_η(x) = exp(∑ᵢ ηᵢ Tᵢ(x) − Ψ(η)) h(x).

Suppose this density has support (a, b), where a and b may be −∞ or ∞, and suppose that exp(∑ᵢ ηᵢ Tᵢ(x)) h(x) g(x) → 0 as x approaches a or b, where g is any differentiable function such that E|g′(X)| < ∞ (if a or b is finite, the same limit condition is required there). Then

    E[(h′(X)/h(X) + ∑ᵢ ηᵢ Tᵢ′(X)) g(X)] = −E[g′(X)].
The derivation is the same as in the special case, namely, integration by parts.
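As an illustration of the general statement (again a hedged numerical sketch, with the distribution and test function chosen for convenience), take X ~ Exponential(1), which fits the exponential-family form with h(x) = 1, T(x) = x, η = −1 and support (0, ∞). For g(x) = x² the boundary condition holds (e^(−x) x² → 0 at both endpoints), and the identity reduces to E[g(X)] = E[g′(X)]:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.exponential(scale=1.0, size=1_000_000)

    # With h = 1, T(x) = x, eta = -1: E[(-1) * g(X)] = -E[g'(X)],
    # i.e. E[g(X)] = E[g'(X)]. For g(x) = x**2 both sides equal 2.
    print(np.mean(x**2), np.mean(2 * x))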
If we only know that X has support ℝ, then it could be the case that E|g(X)| < ∞ and E|g′(X)| < ∞ but g(x) f(x) does not tend to 0 as x → ∞, so the identity fails. To see this, simply put g(x) = 1 and take f(x) with infinitely many spikes towards infinity while remaining integrable. One such example could be adapted from a density with height-1 bumps on the intervals [n, n + 2⁻ⁿ), mollified so that f is smooth.
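A one-line arithmetic check of this counterexample (illustrative only; the bump placement follows the construction above):

    # Height-1 bumps on [n, n + 2**-n) carry total mass sum_{n>=1} 2**-n = 1,
    # so f is integrable even though f(n) = 1 for every integer n,
    # i.e. f does not vanish at infinity.
    print(sum(2.0**-n for n in range(1, 60)))  # ~1.0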
Extensions to elliptically-contoured distributions also exist.