Radical probabilism is a doctrine in philosophy, in particular in epistemology and probability theory, that holds that no facts are known for certain. This view has profound implications for statistical inference. The philosophy is particularly associated with Richard Jeffrey, who wittily characterised it with the dictum "It's probabilities all the way down."
Background
In frequentist statistics, Bayes' theorem provides a useful rule for updating a probability when new frequency data becomes available. In Bayesian statistics, the theorem itself plays a more limited role. Bayes' theorem connects probabilities that are held simultaneously; it does not tell the learner how to update probabilities when new evidence becomes available over time. This subtlety was first pointed out explicitly by Ian Hacking in 1967. Nevertheless, it is tempting to adopt Bayes' theorem as an updating rule. Suppose that a learner forms probabilities P_old(A & B) = p and P_old(B) = q. If the learner subsequently learns that B is true, nothing in the axioms of probability or the results derived therefrom tells him how to behave. He might be tempted to adopt Bayes' theorem by analogy and set P_new(A) = P_old(A | B) = p/q. In fact, that step, Bayes' rule of updating, can be justified as necessary and sufficient through a dynamic Dutch book argument that is additional to the arguments used to justify the probability axioms. This argument was first put forward by David Lewis in the 1970s, though he never published it. The dynamic Dutch book argument for Bayesian updating has been criticised by Hacking, H. Kyburg, D. Christensen and P. Maher. It was defended by Brian Skyrms.
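As a minimal illustration of Bayes' rule of updating, the following sketch (in Python, with made-up values for p and q chosen purely for illustration) computes P_new(A) from the previously held joint and marginal probabilities.

```python
# Minimal sketch of Bayes' rule of updating (illustrative numbers only).
# The learner holds P_old(A & B) = p and P_old(B) = q; on learning that B
# is true, the rule sets P_new(A) = P_old(A | B) = p / q.

p_old_A_and_B = 0.15   # hypothetical P_old(A & B) = p
p_old_B = 0.30         # hypothetical P_old(B) = q

p_new_A = p_old_A_and_B / p_old_B   # Bayes' rule of updating: p / q
print(f"P_new(A) = {p_new_A}")      # 0.5
```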
Certain and uncertain knowledge
Updating by conditioning in this way works when the new data is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain". There must, on Lewis' account, be some certain facts on which probabilities are conditioned. However, the principle known as Cromwell's rule declares that nothing, apart from a logical law, if that, can ever be known for certain. Jeffrey famously rejected Lewis' dictum. He later quipped, "It's probabilities all the way down," a reference to the "turtles all the way down" metaphor for the infinite regress problem. He called this position radical probabilism.
Conditioning on an uncertainty – probability kinematics
When the new evidence is itself uncertain, Bayes' rule cannot capture a merely subjective change in the probability of some critical fact. The new evidence may not have been anticipated, or may not even be capable of being articulated after the event. It seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating in much the same way as Bayes' theorem was extended. Adopting such a rule is sufficient to avoid a Dutch book but not necessary. Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning; a sketch of the rule follows.
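Jeffrey conditioning replaces certainty about B with a new probability P_new(B): the learner keeps the old conditional probabilities given B and given not-B, and reweights them by the new probability of B. A minimal sketch, assuming a single binary proposition B and using illustrative numbers, is given below; ordinary Bayesian conditioning is recovered as the special case P_new(B) = 1.

```python
# Sketch of Jeffrey conditioning (probability kinematics) for a single
# binary proposition B. Numbers are illustrative only.
#
#   P_new(A) = P_old(A | B) * P_new(B) + P_old(A | not-B) * P_new(not-B)

def jeffrey_update(p_A_given_B, p_A_given_notB, p_new_B):
    """Return P_new(A) after the probability of B shifts to p_new_B."""
    return p_A_given_B * p_new_B + p_A_given_notB * (1.0 - p_new_B)

# Old conditional beliefs (hypothetical values)
p_A_given_B = 0.9
p_A_given_notB = 0.2

# Experience shifts the probability of B to 0.7 without making it certain.
p_new_A = jeffrey_update(p_A_given_B, p_A_given_notB, p_new_B=0.7)
print(f"P_new(A) = {p_new_A}")   # 0.9*0.7 + 0.2*0.3 = 0.69
```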
Alternatives to probability kinematics
Probability kinematics is not the only sufficient updating rule for radical probabilism. Others have been advocated, including E. T. Jaynes' maximum entropy principle and Skyrms' principle of reflection. It turns out that probability kinematics is a special case of maximum entropy inference; a numerical check of this relationship is sketched below. However, maximum entropy is not a generalisation of all such sufficient updating rules.
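One way to see the special-case relationship numerically: the distribution that minimises relative entropy (KL divergence) from the old distribution, subject only to the constraint that B receive its new probability, rescales the old probabilities uniformly within B and within not-B, which is exactly what Jeffrey conditioning does. A minimal sketch, with illustrative numbers, follows.

```python
# Numerical check that minimum-relative-entropy updating under the single
# constraint P_new(B) = b coincides with Jeffrey conditioning.
# Illustrative numbers only.

# Old joint distribution over (A, B)
p_old = {
    ("A", "B"): 0.45,
    ("A", "notB"): 0.10,
    ("notA", "B"): 0.05,
    ("notA", "notB"): 0.40,
}

b_new = 0.7  # new probability assigned to B by experience
p_old_B = p_old[("A", "B")] + p_old[("notA", "B")]

# Minimum-KL update under the constraint P(B) = b_new: probabilities are
# rescaled uniformly within B and within not-B.
p_new = {}
for (a, b), prob in p_old.items():
    scale = b_new / p_old_B if b == "B" else (1 - b_new) / (1 - p_old_B)
    p_new[(a, b)] = prob * scale

p_new_A_minkl = p_new[("A", "B")] + p_new[("A", "notB")]

# Jeffrey conditioning on the same shift in the probability of B
p_A_given_B = p_old[("A", "B")] / p_old_B
p_A_given_notB = p_old[("A", "notB")] / (1 - p_old_B)
p_new_A_jeffrey = p_A_given_B * b_new + p_A_given_notB * (1 - b_new)

print(p_new_A_minkl, p_new_A_jeffrey)  # both 0.69
```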