Causal decision theory
Causal decision theory is a mathematical theory intended to determine the set of rational choices in a given situation. It is a school of thought in decision theory. In informal terms, it maintains that the rational choice is that with the best expected causal consequences. This theory is often contrasted with evidential decision theory, which recommends those actions that provide the best expected outcome conditional on one’s best evidence about the world.
Informal description
Informally, causal decision theory recommends that the agent make the decision with the best expected causal consequences. For example: if eating an apple will cause you to be happy and eating an orange will cause you to be sad, then you would be rational to eat the apple. One complication is the notion of expected causal consequences. Imagine that eating a good apple will cause you to be happy and eating a bad apple will cause you to be sad, but you aren't sure whether the apple is good or bad. In this case you don't know the causal effects of eating the apple. Instead, you work from the expected causal effects, which depend on three things: how likely you think the apple is to be good and how likely you think it is to be bad; how happy eating a good apple makes you; and how sad eating a bad apple makes you. In informal terms, causal decision theory advises the agent to make the decision with the best expected causal effects.
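As a rough numerical illustration, the expected causal effect of eating the apple is a probability-weighted average of the two possible outcomes. The sketch below uses invented probabilities and utilities purely for the sake of the example:

```python
# Illustrative only: invented numbers for the apple example above.
p_good = 0.7          # assumed probability that the apple is good
p_bad = 1 - p_good    # probability that the apple is bad
u_happy = 10          # assumed utility of being happy (good apple)
u_sad = -5            # assumed utility of being sad (bad apple)

# Expected causal effect of eating the apple:
expected_utility_apple = p_good * u_happy + p_bad * u_sad
print(expected_utility_apple)  # 0.7*10 + 0.3*(-5) = 5.5
```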
Formal description
In a 1981 article, Allan Gibbard and William Harper explained causal decision theory as maximization of the expected utility $U(A)$ of an action $A$ "calculated from probabilities of counterfactuals":

$$U(A) = \sum_j P(A > O_j)\, D(O_j),$$

where $D(O_j)$ is the desirability of outcome $O_j$ and $P(A > O_j)$ is the counterfactual probability that, if $A$ were done, then $O_j$ would hold.
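A minimal sketch of this calculation, assuming the agent can already supply the counterfactual probabilities; the function and variable names are illustrative, not Gibbard and Harper's:

```python
def causal_expected_utility(counterfactual_probs, desirabilities):
    """Gibbard-Harper style utility: U(A) = sum_j P(A > O_j) * D(O_j).

    counterfactual_probs: outcome -> P(A > O_j), the probability that
        outcome O_j would hold if A were done.
    desirabilities: outcome -> D(O_j).
    """
    return sum(counterfactual_probs[o] * desirabilities[o]
               for o in counterfactual_probs)

# Two-outcome example with invented numbers:
probs = {"happy": 0.7, "sad": 0.3}
desire = {"happy": 10, "sad": -5}
print(causal_expected_utility(probs, desire))  # 5.5
```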
Difference from evidential decision theory
David Lewis proved that the probability of a conditional $P(A > S)$ does not always equal the conditional probability $P(S \mid A)$. If it did, causal decision theory would be equivalent to evidential decision theory, which uses conditional probabilities.

Gibbard and Harper showed that if we accept two axioms, then the statistical independence of $A$ and $A > S$ suffices to guarantee that $P(A > S) = P(S \mid A)$. However, there are cases in which actions and conditionals are not independent. Gibbard and Harper give an example in which King David wants Bathsheba but fears that summoning her would provoke a revolt.
Further, David has studied works on psychology and political science which teach him the following: Kings have two personality types, charismatic and uncharismatic. A king's degree of charisma depends on his genetic make-up and early childhood experiences, and cannot be changed in adulthood. Now, charismatic kings tend to act justly and uncharismatic kings unjustly. Successful revolts against charismatic kings are rare, whereas successful revolts against uncharismatic kings are frequent. Unjust acts themselves, though, do not cause successful revolts; the reason uncharismatic kings are prone to successful revolts is that they have a sneaky, ignoble bearing. David does not know whether or not he is charismatic; he does know that it is unjust to send for another man's wife.
In this case, evidential decision theory recommends that David abstain from Bathsheba, while causal decision theory—noting that whether David is charismatic or uncharismatic cannot be changed—recommends sending for her.
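The divergence can be made concrete with invented numbers. In the sketch below (a toy model, not from Gibbard and Harper), revolt is driven only by the king's type, evidential reasoning conditions the credence about the type on the act, and causal reasoning holds the prior credence fixed because the act cannot change the type:

```python
# Toy model of the King David example; all numbers are invented.
p_revolt_by_type = {"charismatic": 0.1, "uncharismatic": 0.9}
u_bathsheba = 10      # assumed value of having Bathsheba
u_revolt = -100       # assumed cost of a successful revolt

prior_charismatic = 0.5                   # David's prior about his own type
p_charismatic_given = {"summon": 0.1,     # summoning is taken as evidence of
                       "abstain": 0.9}    # being uncharismatic (illustrative)

def expected_utility(p_charismatic, summons):
    p_revolt = (p_charismatic * p_revolt_by_type["charismatic"]
                + (1 - p_charismatic) * p_revolt_by_type["uncharismatic"])
    return (u_bathsheba if summons else 0) + p_revolt * u_revolt

# CDT holds the prior fixed: the act cannot change David's type.
cdt = {a: expected_utility(prior_charismatic, a == "summon")
       for a in ("summon", "abstain")}
# EDT conditions on the act, treating it as evidence about the type.
edt = {a: expected_utility(p_charismatic_given[a], a == "summon")
       for a in ("summon", "abstain")}

print(cdt)  # summon ≈ -40, abstain = -50  -> CDT: send for her
print(edt)  # summon ≈ -72, abstain ≈ -18  -> EDT: abstain
```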
When required to choose between causal decision theory and evidential decision theory, philosophers usually prefer causal decision theory.
Criticism
Vagueness
Causal decision theory does not itself specify what algorithm to use to calculate the counterfactual probabilities. One proposal is the "imaging" technique suggested by Lewis: to evaluate $P(A > S)$, move probability mass from each possible world $W$ to the closest possible world $W_A$ in which $A$ holds, assuming $A$ is possible. However, this procedure requires that we know what we would believe if we were certain of $A$; this is itself a conditional to which we might assign probability less than 1, leading to regress.
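A toy sketch of imaging over a finite set of possible worlds, assuming the closeness relation is simply given (the worlds, probabilities, and closeness mapping are all illustrative):

```python
# Toy sketch of Lewis-style imaging on a proposition A over three worlds.
prior = {"w1": 0.5, "w2": 0.3, "w3": 0.2}   # prior probability of each world
a_worlds = {"w2", "w3"}                      # worlds in which A holds

def closest_a_world(world):
    """Assumed closeness relation: each world's nearest A-world."""
    return {"w1": "w2", "w2": "w2", "w3": "w3"}[world]

def image_on_a(prior):
    """Move each world's probability mass to its closest A-world."""
    imaged = {w: 0.0 for w in prior}
    for w, p in prior.items():
        imaged[closest_a_world(w)] += p
    return imaged

# Imaging always lands on worlds where A holds.
assert all(closest_a_world(w) in a_worlds for w in prior)
print(image_on_a(prior))   # {'w1': 0.0, 'w2': 0.8, 'w3': 0.2}
```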
Counterexamples
There are innumerable "counterexamples" where, it is argued, a straightforward application of CDT fails to produce a defensibly "sane" decision. Philosopher Andy Egan argues this is due to a fundamental disconnect between the intuitive rational rule, "do what you expect will bring about the best results", and CDT's algorithm of "do whatever has the best expected outcome, holding fixed our initial views about the likely causal structure of the world". In this view, it is CDT's requirement to "hold fixed the agent's unconditional credences in dependency hypotheses" that leads to irrational decisions.

An early alleged counterexample is Newcomb's problem. Because your choice of one or two boxes can't causally affect the Predictor's guess, causal decision theory recommends the two-boxing strategy. However, this results in getting only $1,000, not $1,000,000. Philosophers disagree about whether one-boxing or two-boxing is the "rational" strategy. Similar concerns may arise even in seemingly straightforward problems like the prisoner's dilemma, especially when playing opposite your "twin" whose choice to cooperate or defect correlates strongly with, but is not caused by, your own choice.
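In Newcomb's problem the two recommendations can be compared directly using the standard payoffs; the Predictor's 99% reliability and the CDT agent's credence about the opaque box are illustrative assumptions:

```python
# Standard Newcomb payoffs; the 0.99 reliability figure is illustrative.
p_correct = 0.99   # probability the Predictor guessed your choice correctly

# EDT: conditioning on your choice, a reliable Predictor probably matched it.
edt_one_box = p_correct * 1_000_000 + (1 - p_correct) * 0
edt_two_box = p_correct * 1_000 + (1 - p_correct) * 1_001_000

# CDT: the prediction is already fixed; let p be the unconditional credence
# that the opaque box contains the million. Two-boxing gains $1,000 for any p.
p = 0.5
cdt_one_box = p * 1_000_000
cdt_two_box = p * 1_001_000 + (1 - p) * 1_000

print(edt_one_box, edt_two_box)   # 990000 vs 11000   -> EDT: one-box
print(cdt_one_box, cdt_two_box)   # 500000 vs 501000  -> CDT: two-box
```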
In the "Death in Damascus" scenario, an anthropomorphic "Death" predicts where you will be tomorrow, and goes to wait for you there. As in Newcomb's problem, we postulate that Death is a reliable predictor. A CDT agent would be unable to process the correlation, and may as a consequence make irrational decisions: "You should rather play hide-and-seek against someone who cannot predict where you hide than against someone who can. Causal Decision Theory denies this. So Causal Decision Theory is false."
Another recent counterexample is the "Psychopath Button":
Paul is debating whether to press the ‘kill all psychopaths’ button. It would, he thinks, be much better to live in a world with no psychopaths. Unfortunately, Paul is quite confident that only a psychopath would press such a button. Paul very strongly prefers living in a world with psychopaths to dying. Should Paul press the button?
According to Egan, "pretty much everyone" agrees that Paul should not press the button, yet CDT endorses pressing the button.
Philosopher Jim Joyce, perhaps the most prominent modern defender of CDT, argues that CDT is naturally capable of taking into account any "information about what one is inclined or likely to do as evidence". This interpretation of CDT would require solving additional issues: How can a CDT agent avoid stumbling into having beliefs related to its own future acts, and thus becoming provably inconsistent via Gödelian incompleteness and Löb's theorem? How does the agent standing on a cliff avoid inferring that if he were to jump, he would probably have a parachute to break his fall?
Alternatives to causal and evidential decision theory
Some scholars believe that a new decision theory needs to be built from the ground up. Philosopher Christopher Meacham proposes "Cohesive Expected Utility Maximization": an agent "should perform the act picked out by a comprehensive strategy which maximizes cohesive expected utility". Meacham also proposes this can be extended to "Global Cohesive Expected Utility Maximization" to enable superrationality-style cooperation between agents. In the context of AI, bitcoin pioneer Wei Dai proposes "updateless decision theory", which adds to globally cohesive mechanisms the admittedly difficult concept of "logical counterfactuals" to avoid being blackmailed:

Consider an agent that would pay up in response to a counterfactual blackmail. The blackmailer would predict this and blackmail the agent. Now, instead, consider an agent that would refuse to pay up in response to a counterfactual blackmail... The blackmailer would predict this too, and so would not blackmail the agent. Therefore, if we are constructing an agent that might encounter counterfactual blackmail, then it is a better overall policy to construct an agent that would refuse to pay up when blackmailed in this way.
It is an open question whether a satisfactory formalization of logical counterfactuals exists.