The propensity theory of probability is one interpretation of the concept of probability. Theorists who adopt this interpretation think of probability as a physical propensity, disposition, or tendency of a given type of physical situation to yield an outcome of a certain kind, or to yield a long-run relative frequency of such an outcome. Propensities are not relative frequencies, but purported causes of the observed stable relative frequencies, and they are invoked to explain why repeating a certain kind of experiment will generate a given outcome type at a persistent rate.

A central part of this explanation is the law of large numbers. This law, a consequence of the axioms of probability, says that if a coin is tossed repeatedly, in such a way that its probability of landing heads is the same on each toss and the outcomes are probabilistically independent, then the relative frequency of heads will, with high probability, be close to the probability of heads on each single toss (see the formal statement below). The law thus suggests that stable long-run frequencies are a manifestation of invariant single-case probabilities. Frequentists cannot take this approach, since relative frequencies do not exist for single tosses of a coin, but only for large ensembles or collectives; hence these single-case probabilities are instead known as propensities or chances.

In addition to explaining the emergence of stable relative frequencies, the idea of propensity is motivated by the desire to make sense of single-case probability attributions in quantum mechanics, such as the probability that a particular atom decays at a particular moment. The main challenge facing propensity theories is to say exactly what propensity means, and to show that propensity thus defined has the required properties.
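The law of large numbers invoked above can be stated formally as follows (in the standard weak-law form found in probability textbooks, not in any particular propensity theorist's notation): if X_1, X_2, ... are independent trials, each taking the value 1 (heads) with probability p and 0 (tails) otherwise, then for every ε > 0,

\[
\lim_{n \to \infty} \Pr\!\left( \left| \frac{X_1 + \cdots + X_n}{n} - p \right| < \varepsilon \right) = 1.
\]

That is, the probability that the relative frequency of heads in n tosses deviates from p by more than any fixed margin shrinks to zero as n grows.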
A later propensity theory was proposed by the philosopher Karl Popper, who, however, had only slight acquaintance with the writings of Charles S. Peirce. Popper noted that the outcome of a physical experiment is produced by a certain set of "generating conditions". When we repeat an experiment, as the saying goes, we really perform another experiment with a similar set of generating conditions. To say that a set of generating conditions has propensity p of producing the outcome E means that those exact conditions, if repeated indefinitely, would produce an outcome sequence in which E occurred with limiting relative frequency p (formalized below). For Popper, then, a deterministic experiment would have propensity 0 or 1 for each outcome, since the generating conditions would produce the same outcome on each trial; non-trivial propensities exist only for genuinely indeterministic experiments.

Popper's propensities, while not themselves relative frequencies, are nevertheless defined in terms of relative frequency, and as a result they face many of the serious problems that plague frequency theories. First, propensities so defined cannot be empirically ascertained, since the limit of a sequence is a tail event and is thus independent of its finite initial segments: seeing a coin land heads on each of the first million tosses, for example, tells one nothing about the limiting proportion of heads on Popper's view. Moreover, the use of relative frequency to define propensity presupposes the existence of stable relative frequencies, so one cannot then use propensity to explain the existence of stable relative frequencies via the law of large numbers.
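Popper's definition can be put in symbols as follows (a reconstruction for illustration, not Popper's own notation). Writing N_n(E) for the number of occurrences of the outcome E in the first n repetitions of the generating conditions, those conditions have propensity p for E just in case

\[
\lim_{n \to \infty} \frac{N_n(E)}{n} = p.
\]

Since the value of such a limit is unchanged by altering any finite initial segment of the sequence, no finite run of observations can by itself establish the propensity, which is the tail-event objection just described.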
Recent work
A number of other philosophers, including David Miller and Donald A. Gillies, have proposed propensity theories somewhat similar to Popper's, in that propensities are defined in terms of either long-run or infinitely long-run relative frequencies. Other propensity theorists do not explicitly define propensities at all, but instead take propensity to be defined by the theoretical role it plays in science. They argue, for example, that physical magnitudes such as electrical charge likewise cannot be explicitly defined in terms of more basic things, but only in terms of what they do; in the same way, propensity is whatever fills the various roles that physical probability plays in science. Other theories have been offered by D. H. Mellor and Ian Hacking.
What roles does physical probability play in science, and what are its properties? One central property of chance is that, once its value is known, it constrains rational degrees of belief to take the same numerical value. David Lewis called this the Principal Principle, a term that philosophers have mostly adopted. For example, suppose you are certain that a particular biased coin has propensity 0.32 to land heads every time it is tossed. What, then, is the fair price for a gamble that pays $1 if the coin lands heads and nothing otherwise? According to the Principal Principle, the fair price is 32 cents. It has been argued that propensity theories fail to satisfy the Principal Principle.
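In its simplest form (omitting the admissibility clause in Lewis's own statement), the Principal Principle says that a rational agent's credence Cr in a proposition A, conditional on the chance of A being x, should equal x:

\[
Cr\big(A \mid \mathrm{ch}(A) = x\big) = x.
\]

The fair price in the coin example is then simply the expected payoff under a credence of 0.32: 0.32 × $1 + 0.68 × $0 = $0.32.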