Principle of indifference


The principle of indifference is a rule for assigning epistemic probabilities. The principle of indifference states that in the absence of any relevant evidence, agents should distribute their credence equally among all the possible outcomes under consideration.
In Bayesian probability, this is the simplest non-informative prior. The principle of indifference is meaningless under the frequency interpretation of probability, in which probabilities are relative frequencies rather than degrees of belief in uncertain propositions, conditional upon state information.

Examples

The textbook examples for the application of the principle of indifference are coins, dice, and cards.
In a macroscopic system, at least, it must be assumed that the physical laws which govern the system are not known well enough to predict the outcome. As John Arbuthnot observed some centuries ago, we call the fall of a die "chance" only because we do not know the force and direction that make it fall on a particular side.
Given enough time and resources, there is no fundamental reason to suppose that suitably precise measurements could not be made, which would enable the prediction of the outcome of coins, dice, and cards with high accuracy: Persi Diaconis's work with coin-flipping machines is a practical example of this.

Coins

A symmetric coin has two sides, arbitrarily labeled heads and tails. Assuming that the coin must land on one side or the other, the outcomes of a coin toss are mutually exclusive, exhaustive, and interchangeable. According to the principle of indifference, we assign each of the possible outcomes a probability of 1/2.
It is implicit in this analysis that the forces acting on the coin are not known with any precision. If the momentum imparted to the coin as it is launched were known with sufficient accuracy, the flight of the coin could be predicted according to the laws of mechanics. Thus the uncertainty in the outcome of a coin toss is derived from the uncertainty with respect to initial conditions. This point is discussed at greater length in the article on coin flipping.

Dice

A symmetric die has n faces, arbitrarily labeled from 1 to n. An ordinary cubical die has n = 6 faces, although symmetric dice with other numbers of faces can be constructed; see Dice. We assume that the die will land with one face or another upward, and that there are no other possible outcomes. Applying the principle of indifference, we assign each of the possible outcomes a probability of 1/n. As with coins, it is assumed that the initial conditions of throwing the dice are not known with enough precision to predict the outcome according to the laws of mechanics. Dice are typically thrown so as to bounce on a table or other surface; this interaction makes prediction of the outcome much more difficult.
The assumption of symmetry is crucial here. Suppose that we are asked to bet for or against the outcome "6". We might reason that there are two relevant outcomes here, "6" or "not 6", and that these are mutually exclusive and exhaustive. This suggests assigning the probability 1/2 to each of the two outcomes. However, the two outcomes are not related by any symmetry of the die: "not 6" lumps together five faces, so the correct application of the principle is to the six interchangeable faces, which gives "6" a probability of 1/6.
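As a minimal illustration (a Python sketch, not part of the original analysis), the following assigns equal credence to the six faces and recovers the probability of "6" by summation; applying indifference directly to the coarse partition {"6", "not 6"} would instead give 1/2.

```python
from fractions import Fraction

# The six interchangeable faces of a symmetric die.
faces = [1, 2, 3, 4, 5, 6]

# Principle of indifference applied to the symmetric partition:
# every face receives the same credence.
p_face = {f: Fraction(1, len(faces)) for f in faces}

# The coarse event "6" gets its probability by summation, not by
# applying indifference directly to the partition {"6", "not 6"}.
p_six = p_face[6]
p_not_six = sum(p for f, p in p_face.items() if f != 6)

print(p_six, p_not_six)  # 1/6 5/6
```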

Cards

A standard deck contains 52 cards, each given a unique label in an arbitrary fashion, i.e. arbitrarily ordered. We draw a card from the deck; applying the principle of indifference, we assign each of the possible outcomes a probability of 1/52.
This example, more than the others, shows the difficulty of actually applying the principle of indifference in real situations. What we really mean by the phrase "arbitrarily ordered" is simply that we don't have any information that would lead us to favor a particular card. In actual practice, this is rarely the case: a new deck of cards is certainly not in arbitrary order, and neither is a deck immediately after a hand of cards. In practice, we therefore shuffle the cards; this does not destroy the information we have, but instead renders our information practically unusable, although it is still usable in principle. In fact, some expert blackjack players can track aces through the deck; for them, the condition for applying the principle of indifference is not satisfied.

Application to continuous variables

Applying the principle of indifference incorrectly can easily lead to nonsensical results, especially in the case of multivariate, continuous variables. A typical case of misuse is the following example. Suppose a cube is hidden in a box, and a label on the box says only that its side length lies between 3 and 5 cm. Treating every side length as equally likely suggests an expected side length of 4 cm. The label also implies that the surface area lies between 54 and 150 cm²; treating every surface area as equally likely suggests an expected value of 102 cm², which corresponds to a side length of about 4.12 cm. Likewise, the volume lies between 27 and 125 cm³; treating every volume as equally likely suggests an expected value of 76 cm³, corresponding to a side length of about 4.24 cm.
In this example, mutually contradictory estimates of the length, surface area, and volume of the cube arise because we have assumed three mutually contradictory distributions for these parameters: a uniform distribution for any one of the variables implies a non-uniform distribution for the other two. In general, the principle of indifference does not indicate which variable is to have a uniform epistemic probability distribution.
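A minimal Python sketch of this point, assuming the 3–5 cm side-length range from the example above: drawing side lengths uniformly and cubing them shows that the induced distribution of volumes is not uniform on its 27–125 cm³ range.

```python
import random

random.seed(0)

# Side lengths drawn uniformly from the assumed 3-5 cm range.
lengths = [random.uniform(3.0, 5.0) for _ in range(100_000)]
volumes = [L ** 3 for L in lengths]

# A uniform distribution of lengths induces a non-uniform distribution
# of volumes on [27, 125]: the mean is about 68 cm^3, not the 76 cm^3
# mid-value that indifference applied directly to the volume suggests.
mean_volume = sum(volumes) / len(volumes)
print(f"mean volume under uniform length: {mean_volume:.1f} cm^3")

# A uniform volume would put exactly half its mass below 76 cm^3;
# the induced distribution puts roughly 62% of it there instead.
frac_below = sum(v < 76 for v in volumes) / len(volumes)
print(f"P(V < 76) under uniform length:   {frac_below:.2f}")
```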
Another classic example of this kind of misuse is the Bertrand paradox. Edwin T. Jaynes introduced the principle of transformation groups, which can yield an epistemic probability distribution for this problem. This generalises the principle of indifference, by saying that one is indifferent between equivalent problems rather than between individual propositions. This still reduces to the ordinary principle of indifference when one considers a permutation of the labels as generating equivalent problems. To apply this to the above box example, we have three random variables related by geometric equations. If we have no reason to favour one trio of values over another, then our prior probabilities must be related by the rule for changing variables in continuous distributions. Let L be the length, and V be the volume. Then we must have
$$f_L(L)\,dL = f_V(V)\,dV, \qquad V = L^3,$$
where $f_L$ and $f_V$ are the probability density functions of the stated variables; equivalently, $f_L(L) = 3L^2\,f_V(L^3)$. Together with the requirement that the same rule apply whichever of the related variables is used, this equation has the general solution $f_L(L) = K/L$, where $K$ is a normalization constant, determined by the range of $L$, in this case equal to
$$K = \left(\int_3^5 \frac{dL}{L}\right)^{-1} = \frac{1}{\ln(5/3)}.$$
To put this "to the test", we ask for the probability that the length is less than 4. This has probability
$$P(L < 4) = \int_3^4 \frac{K}{L}\,dL = \frac{\ln(4/3)}{\ln(5/3)} \approx 0.56.$$
For the volume, this should be equal to the probability that the volume is less than $4^3 = 64$. The pdf of the volume is
$$f_V(V) = \frac{K}{3V}, \qquad 27 \le V \le 125.$$
The probability of the volume being less than 64 is then
$$P(V < 64) = \int_{27}^{64} \frac{K}{3V}\,dV = \frac{\ln(64/27)}{3\ln(5/3)} = \frac{\ln(4/3)}{\ln(5/3)} \approx 0.56.$$
Thus we have achieved invariance with respect to volume and length. One can also show the same invariance with respect to the surface area being less than $6(4^2) = 96$. However, this probability assignment is not necessarily a "correct" one, for the exact distribution of lengths, volumes, or surface areas will depend on how the "experiment" is conducted.
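As a numerical sanity check (a sketch assuming the 3–5 cm range above; the surface-area density $K/(2S)$ follows from the same change-of-variables rule), integrating the three induced densities gives the same probability of about 0.56 for the length, volume, and surface-area versions of the event.

```python
import math

K = 1.0 / math.log(5.0 / 3.0)   # normalization from the 3-5 cm length range

def integrate(f, a, b, n=100_000):
    """Midpoint-rule integration; accurate enough for a sanity check."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Densities induced by f_L(L) = K/L under V = L^3 and S = 6*L^2.
f_L = lambda L: K / L                 # length:       3  <= L <= 5
f_V = lambda V: K / (3.0 * V)         # volume:       27 <= V <= 125
f_S = lambda S: K / (2.0 * S)         # surface area: 54 <= S <= 150

print(integrate(f_L, 3, 4))               # P(L < 4)  ~ 0.5596
print(integrate(f_V, 27, 64))             # P(V < 64) ~ 0.5596
print(integrate(f_S, 54, 96))             # P(S < 96) ~ 0.5596
print(math.log(4 / 3) / math.log(5 / 3))  # exact value
```

The agreement of the three integrals is exactly the invariance claimed above.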
The fundamental hypothesis of statistical physics, that any two microstates of a system with the same total energy are equally probable at equilibrium, is in a sense an example of the principle of indifference. However, when the microstates are described by continuous variables, an additional physical basis is needed in order to explain under which parameterization the probability density will be uniform. Liouville's theorem justifies the use of canonically conjugate variables, such as positions and their conjugate momenta.
The wine/water paradox illustrates the same dilemma with linked variables: it is unclear which of them should receive the uniform distribution.

History

The original writers on probability, primarily Jacob Bernoulli and Pierre Simon Laplace, considered the principle of indifference to be intuitively obvious and did not even bother to give it a name. Laplace, for example, wrote that the theory of chance consists in reducing all events of the same kind to a certain number of equally possible cases, that is, cases about whose existence we are equally undecided, and in taking the ratio of the number of cases favourable to the event to the number of all possible cases as the measure of its probability.
These earlier writers, Laplace in particular, naively generalized the principle of indifference to the case of continuous parameters, giving the so-called "uniform prior probability distribution", a function which is constant over all real numbers. He used this function to express a complete lack of knowledge as to the value of a parameter. According to Stigler, Laplace's assumption of uniform prior probabilities was not a metaphysical assumption but an implicit one made for ease of analysis.
The principle of insufficient reason was its first name, given to it by later writers, possibly as a play on Leibniz's principle of sufficient reason. These later writers objected to the use of the uniform prior for two reasons. The first is that the constant function is not normalizable, and thus is not a proper probability distribution. The second is its inapplicability to continuous variables, as described above; see also the Bertrand paradox.
The "principle of insufficient reason" was renamed the "principle of Indifference" by the economist, who was careful to note that it applies only when there is no knowledge indicating unequal probabilities.
Attempts to put the notion on firmer philosophical ground have generally begun with the concept of equipossibility and progressed from it to equiprobability.
The principle of indifference can be given a deeper logical justification by noting that equivalent states of knowledge should be assigned equivalent epistemic probabilities. This argument was propounded by E.T. Jaynes: it leads to two generalizations, namely the principle of transformation groups as in the Jeffreys prior, and the principle of maximum entropy.
More generally, one speaks of uninformative priors.
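As a brief sketch of the maximum-entropy route mentioned above: maximizing the entropy of a distribution over n outcomes subject only to normalization recovers the indifferent assignment, as the following LaTeX fragment works out.

```latex
% Maximizing H(p) = -\sum_i p_i \log p_i over n outcomes, subject only to
% normalization, recovers the indifferent (uniform) assignment.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
With a Lagrange multiplier $\lambda$ for the constraint $\sum_{i=1}^{n} p_i = 1$:
\[
  \frac{\partial}{\partial p_i}\Bigl(-\sum_j p_j \log p_j
      + \lambda\bigl(\textstyle\sum_j p_j - 1\bigr)\Bigr)
  = -\log p_i - 1 + \lambda = 0
  \quad\Longrightarrow\quad p_i = e^{\lambda - 1},
\]
so every $p_i$ takes the same value, and normalization forces $p_i = 1/n$.
\end{document}
```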