History of probability


Probability has a dual aspect: on the one hand the likelihood of hypotheses given the evidence for them, and on the other hand the behavior of stochastic processes such as the throwing of dice or coins. The study of the former is historically older, appearing for example in the law of evidence, while the mathematical treatment of dice began with the work of Cardano, Pascal and Fermat in the 16th and 17th centuries.
Probability is distinguished from statistics; see history of statistics. While statistics deals with data and the inferences drawn from them, probability deals with the stochastic processes that lie behind data or outcomes.

Etymology

Probable and probability and their cognates in other modern languages derive from medieval learned Latin probabilis, a term deriving from Cicero and generally applied to an opinion to mean plausible or generally approved. The form probability is from Old French probabilite and directly from Latin probabilitatem "credibility, probability," from probabilis.
The mathematical sense of the term is from 1718. In the 18th century, the term chance was also used in the mathematical sense of "probability". This word is ultimately from Latin cadentia, i.e. "a fall, case".
The English adjective likely is of Germanic origin, probably from Old Norse likligr, originally meaning "having the appearance of being strong or able" or "having a similar appearance or qualities"; a meaning of "probably" is recorded from the mid-15th century. The derived noun likelihood originally meant "similarity, resemblance" but took on the meaning "probability" from the mid-15th century. The meaning "something likely to be true" is from the 1570s.

Origins

Forms of probability and statistics were developed by Arab mathematicians studying cryptology between the 8th and 13th centuries. Al-Khalil wrote the Book of Cryptographic Messages, which contains the first use of permutations and combinations to list all possible Arabic words with and without vowels. Al-Kindi was the first to use statistics to decipher encrypted messages and developed the first code-breaking algorithm, based on frequency analysis, in the House of Wisdom in Baghdad. He wrote a book entitled Manuscript on Deciphering Cryptographic Messages, containing detailed discussions on statistics, and made the earliest known use of statistical inference in his work on cryptanalysis and frequency analysis. An important contribution of Ibn Adlan concerned the sample size needed for frequency analysis.
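As a minimal modern sketch of the frequency-counting idea behind such codebreaking (an illustration in Python, not Al-Kindi's own method):

from collections import Counter

def letter_frequencies(ciphertext: str) -> list[tuple[str, float]]:
    # Relative frequency of each letter in the ciphertext, most common first.
    letters = [c for c in ciphertext.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return [(ch, n / total) for ch, n in counts.most_common()]

# Matching these frequencies against the known letter frequencies of the
# target language suggests likely substitutions for a simple cipher.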
Ancient and medieval law of evidence developed a grading of degrees of proof, probabilities, presumptions and half-proof to deal with the uncertainties of evidence in court.
In Renaissance times, betting was discussed in terms of odds such as "ten to one" and maritime insurance premiums were estimated based on intuitive risks, but there was no theory on how to calculate such odds or premiums.
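In modern terms, odds of a to b against an event correspond to a probability of

\[
P = \frac{b}{a+b},
\]

so "ten to one" against corresponds to P = 1/11, about 0.09; Renaissance bettors quoted such odds without any such rule for computing them.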
The mathematical methods of probability arose in the investigations first of Gerolamo Cardano in the 1560s, and then in the correspondence between Pierre de Fermat and Blaise Pascal (1654) on such questions as the fair division of the stake in an interrupted game of chance. Christiaan Huygens gave a comprehensive treatment of the subject in 1657.
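In the standard modern rendering of this problem of points (an illustrative instance, not the numbers of the original correspondence): if a fair game is interrupted when player A needs one more win and player B needs two, A fails to win the match only if B takes the next two games, so

\[
P(\text{A wins}) = 1 - \left(\tfrac{1}{2}\right)^{2} = \tfrac{3}{4},
\]

and a fair division gives A three quarters of the stake.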
The early history of these ideas is recounted in Games, Gods and Gambling by F. N. David.

Eighteenth century

Jacob Bernoulli's Ars Conjectandi (published posthumously in 1713) and Abraham de Moivre's The Doctrine of Chances (1718) put probability on a sound mathematical footing, showing how to calculate a wide range of complex probabilities. Bernoulli proved a version of the fundamental law of large numbers, which states that in a large number of trials the average of the outcomes is likely to be very close to the expected value; for example, in 1000 throws of a fair coin it is likely that there are close to 500 heads.
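In modern notation (a standard calculation, not Bernoulli's own), the number of heads H in n = 1000 tosses of a fair coin has

\[
\mathbb{E}[H] = np = 500, \qquad \sigma_H = \sqrt{np(1-p)} = \sqrt{250} \approx 15.8,
\]

so by the normal approximation roughly 95% of such runs yield between about 468 and 532 heads, and the fraction of heads concentrates ever more tightly around 1/2 as n grows.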

Nineteenth century

The power of probabilistic methods in dealing with uncertainty was shown by Gauss's determination of the orbit of Ceres from a few observations. The theory of errors used the method of least squares to correct error-prone observations, especially in astronomy, based on the assumption of a normal distribution of errors to determine the most likely true value. In 1812, Laplace published his Théorie analytique des probabilités, in which he consolidated and laid down many fundamental results in probability and statistics, such as the moment-generating function, the method of least squares, inductive probability, and hypothesis testing.
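In modern matrix notation (a standard formulation rather than Gauss's original), least squares fits observations y to a linear model with design matrix X by minimizing the sum of squared residuals:

\[
\hat{\beta} = \arg\min_{\beta} \, \lVert y - X\beta \rVert^{2} = (X^{\top}X)^{-1}X^{\top}y,
\]

which coincides with the most likely parameter value when the observational errors are independent and normally distributed.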
Towards the end of the nineteenth century, a major success of explanation in terms of probabilities was the statistical mechanics of Ludwig Boltzmann and J. Willard Gibbs, which explained properties of gases, such as temperature, in terms of the random motions of large numbers of particles.
The field of the history of probability itself was established by Isaac Todhunter's monumental A History of the Mathematical Theory of Probability from the Time of Pascal to that of Laplace (1865).

Twentieth century

Probability and statistics became closely connected through the work on hypothesis testing of R. A. Fisher and Jerzy Neyman, which is now widely applied in biological and psychological experiments and in clinical trials of drugs, as well as in economics and elsewhere. A hypothesis, for example that a drug is usually effective, gives rise to a probability distribution of the outcomes that would be observed if the hypothesis were true. If the observations approximately agree with that distribution, the hypothesis is retained; if not, it is rejected.
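A schematic example of the logic (the numbers are illustrative, not drawn from any historical trial): under the null hypothesis that a drug has no effect, suppose each of 20 patients improves with probability 1/2. If 16 actually improve, the probability of a result at least that extreme is

\[
P(X \ge 16) = \sum_{k=16}^{20} \binom{20}{k} \left(\tfrac{1}{2}\right)^{20} = \frac{6196}{1048576} \approx 0.006,
\]

far below the conventional 0.05 significance level, so the no-effect hypothesis would be rejected.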
The theory of stochastic processes broadened into such areas as Markov processes and Brownian motion, the random movement of tiny particles suspended in a fluid. That provided a model for the study of random fluctuations in stock markets, leading to the use of sophisticated probability models in mathematical finance, including such successes as the widely used Black–Scholes formula for the valuation of options.
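For reference, the Black–Scholes price of a European call option, in the now-standard notation (spot price S, strike K, risk-free rate r, volatility σ, time to expiry T, and standard normal distribution function N):

\[
C = S\,N(d_1) - K e^{-rT} N(d_2), \qquad
d_1 = \frac{\ln(S/K) + (r + \sigma^{2}/2)\,T}{\sigma\sqrt{T}}, \qquad
d_2 = d_1 - \sigma\sqrt{T}.
\]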
The twentieth century also saw long-running disputes over the interpretation of probability. In mid-century, frequentism was dominant, holding that probability means long-run relative frequency in a large number of trials. At the end of the century there was some revival of the Bayesian view, according to which the fundamental notion of probability is how well a proposition is supported by the evidence for it.
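The rule at the heart of the Bayesian view is Bayes' theorem, which updates the probability of a hypothesis H in the light of evidence E:

\[
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}.
\]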
The mathematical treatment of probabilities, especially when there are infinitely many possible outcomes, was facilitated by Kolmogorov's axioms, set out in 1933.
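In modern form, the axioms require a probability measure P on a collection of events in a sample space Ω to satisfy

\[
P(A) \ge 0, \qquad P(\Omega) = 1, \qquad
P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i)
\]

for any sequence of pairwise disjoint events A_1, A_2, ….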