Deductive-nomological model


The deductive-nomological model of scientific explanation, also known as Hempel's model, the Hempel–Oppenheim model, the Popper–Hempel model, or the covering law model, is a formal view of scientifically answering questions asking, "Why...?". The DN model poses scientific explanation as a deductive structure—that is, one where truth of its premises entails truth of its conclusion—hinged on accurate prediction or postdiction of the phenomenon to be explained.
Because of problems concerning humans' ability to define, discover, and know causality, causality was omitted in initial formulations of the DN model. Causality was thought to be incidentally approximated by realistic selection of premises that derive the phenomenon of interest from observed starting conditions plus general laws. Still, the DN model formally permitted causally irrelevant factors. Also, derivability from observations and laws sometimes yielded absurd answers.
When logical empiricism fell out of favor in the 1960s, the DN model was widely seen as a flawed or greatly incomplete model of scientific explanation. Nonetheless, it remained an idealized version of scientific explanation, and one that was rather accurate when applied to modern physics. In the early 1980s, a revision to the DN model emphasized maximal specificity for relevance of the conditions and axioms stated. Together with Hempel's inductive-statistical model, the DN model forms scientific explanation's covering law model, which critics have also termed the subsumption theory.

Form

The term deductive distinguishes the DN model's intended determinism from the probabilism of inductive inferences. The term nomological is derived from the Greek word νόμος (nomos), meaning "law". The DN model holds to a view of scientific explanation whose conditions of adequacy (CA)—semiformal but stated classically—are derivability, lawlikeness, empirical content, and truth.
In the DN model, a law axiomatizes an unrestricted generalization from antecedent A to consequent B by conditional proposition—If A, then B—and has empirically testable content. A law differs from a mere true regularity—for instance, George always carries only $1 bills in his wallet—by supporting counterfactual claims and thus suggesting what must be true, while following from a scientific theory's axiomatic structure.
The phenomenon to be explained is the explanandum—an event, law, or theory—whereas the premises that explain it are the explanans: true or highly confirmed, containing at least one universal law, and entailing the explanandum. Thus, given the explanans as initial, specific conditions C1, C2 ... Cn plus general laws L1, L2 ... Ln, the phenomenon E as explanandum is a deductive consequence, thereby scientifically explained.
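Schematically, under the reading just given, the explanation takes the following form, with the horizontal line marking the deductive step (a sketch of the standard schema):

```latex
% DN schema (sketch): premises above the line, explanandum below.
\[
\begin{array}{ll}
C_1, C_2, \ldots, C_n & \text{(statements of initial, specific conditions)} \\
L_1, L_2, \ldots, L_n & \text{(general laws)} \\
\hline
E & \text{(explanandum: deduced, hence explained)}
\end{array}
\]
```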

Roots

Aristotle's scientific explanation in Physics resembles the DN model, an idealized form of scientific explanation. The framework of Aristotelian physics—Aristotelian metaphysics—reflected the perspective of Aristotle, principally a biologist, who, amid living entities' undeniable purposiveness, formalized vitalism and teleology, an intrinsic morality in nature. With the emergence of Copernicanism, however, Descartes introduced mechanical philosophy, then Newton rigorously posed lawlike explanation, both Descartes and especially Newton shunning teleology within natural philosophy. Around 1740, David Hume staked Hume's fork, highlighted the problem of induction, and found humans ignorant of either necessary or sufficient causality. Hume also highlighted the fact/value gap, as what is does not itself reveal what ought to be.
Around 1780, countering Hume's ostensibly radical empiricism, Immanuel Kant highlighted extreme rationalism—as by Descartes or Spinoza—and sought middle ground. Inferring the mind to arrange experience of the world into substance, space, and time, Kant placed the mind as part of the causal constellation of experience and thereby found Newton's theory of motion universally true, yet knowledge of things in themselves impossible. Safeguarding science, then, Kant paradoxically stripped it of scientific realism. Aborting Francis Bacon's inductivist mission to dissolve the veil of appearance to uncover the noumena—the metaphysical view of nature's ultimate truths—Kant's transcendental idealism tasked science with simply modeling patterns of phenomena. Safeguarding metaphysics, too, it found the mind's constants to hold also universal moral truths, and launched German idealism, increasingly speculative.
Auguste Comte found the problem of induction rather irrelevant since enumerative induction is grounded on the empiricism available, while science's point is not metaphysical truth. Comte found human knowledge had evolved from theological to metaphysical to scientific—the ultimate stage—rejecting both theology and metaphysics as asking questions unanswerable and posing answers unverifiable. Comte in the 1830s expounded positivism—the first modern philosophy of science and simultaneously a political philosophy—rejecting conjectures about unobservables, thus rejecting search for causes. Positivism predicts observations, confirms the predictions, and states a law, thereupon applied to benefit human society. From late 19th century into the early 20th century, the influence of positivism spanned the globe. Meanwhile, evolutionary theory's natural selection brought the Copernican Revolution into biology and eventuated in the first conceptual alternative to vitalism and teleology.

Growth

Whereas Comtean positivism posed science as description, logical positivism emerged in the late 1920s and posed science as explanation, perhaps to better unify empirical sciences by covering not only fundamental science—that is, fundamental physics—but the special sciences, too, such as biology, psychology, economics, and anthropology. After the defeat of National Socialism with World War II's close in 1945, logical positivism shifted to a milder variant, logical empiricism. All variants of the movement, which lasted until 1965, are termed neopositivism, sharing the quest of verificationism.
Neopositivists led the emergence of the philosophy subdiscipline philosophy of science, researching such questions and aspects of scientific theory and knowledge. Scientific realism takes scientific theory's statements at face value, thus accorded either falsity or truth—probable or approximate or actual. Neopositivists held scientific antirealism as instrumentalism, holding scientific theory as simply a device to predict observations and their course, while statements on nature's unobservable aspects are elliptical for, or metaphorical of, its observable aspects.
The DN model received its most detailed, influential statement from Carl G Hempel, first in his 1942 article "The function of general laws in history", and more explicitly with Paul Oppenheim in their 1948 article "Studies in the logic of explanation". A leading logical empiricist, Hempel embraced the Humean empiricist view that humans observe sequences of sensory events, not cause and effect, as causal relations and causal mechanisms are unobservables. The DN model bypasses causality beyond mere constant conjunction: first an event like A, then always an event like B.
Hempel held natural laws—empirically confirmed regularities—as satisfactory and, if included realistically, as approximating causality. In later articles, Hempel defended the DN model and proposed probabilistic explanation by the inductive-statistical model (IS model). The DN model and the IS model—whereby the probability must be high, such as at least 50%—together form the covering law model, as named by a critic, William Dray. Derivation of statistical laws from other statistical laws goes to the deductive-statistical model (DS model). Georg Henrik von Wright, another critic, named the totality subsumption theory.
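A minimal sketch of the IS form, under the high-probability reading just described (the notation here is an assumption for illustration, not Hempel's verbatim):

```latex
% IS schema (sketch): a statistical law plus a particular case make the
% explanandum highly probable rather than deductively certain.
\[
\begin{array}{ll}
P(B \mid A) = r, \quad r \geq 0.5 & \text{(statistical law)} \\
A(a) & \text{(case } a \text{ satisfies } A\text{)} \\
\hline
B(a) & [r] \quad \text{(explanandum, made probable to degree } r\text{)}
\end{array}
\]
```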

Decline

Amid the failure of neopositivism's fundamental tenets, Hempel in 1965 abandoned verificationism, signaling neopositivism's demise. From 1930 onward, Karl Popper had refuted any positivism by asserting falsificationism, which Popper claimed had killed positivism, although, paradoxically, Popper was commonly mistaken for a positivist. Even Popper's 1934 book embraces the DN model, widely accepted as the model of scientific explanation for as long as physics remained the model of science examined by philosophers of science.
In the 1940s, filling the vast observational gap between cytology and biochemistry, cell biology arose and established the existence of cell organelles besides the nucleus. Launched in the late 1930s, the molecular biology research program cracked a genetic code in the early 1960s and then converged with cell biology as cell and molecular biology, its breakthroughs and discoveries defying the DN model by arriving in quest not of lawlike explanation but of causal mechanisms. Biology became a new model of science, while the special sciences were no longer thought defective for lacking universal laws, as borne by physics.
In 1948, when explicating the DN model and stating scientific explanation's semiformal conditions of adequacy, Hempel and Oppenheim acknowledged the redundancy of the third, empirical content, implied by the other three—derivability, lawlikeness, and truth. In the early 1980s, upon the widespread view that causality ensures the explanans' relevance, Wesley Salmon called for returning cause to because, and along with James Fetzer helped replace CA3 empirical content with CA3' strict maximal specificity.
Salmon introduced causal mechanical explanation, never clarifying how it proceeds, yet reviving philosophers' interest in such explanation. Via shortcomings of Hempel's inductive-statistical model, Salmon introduced the statistical-relevance model. Although the DN model remained an idealized form of scientific explanation, especially in applied sciences, most philosophers of science consider the DN model flawed by excluding many types of explanations generally accepted as scientific.

Strengths

As the theory of knowledge, epistemology differs from ontology, which is a subbranch of metaphysics, the theory of reality. Ontology poses which categories of being—what sorts of things exist—and so, although a scientific theory's ontological commitment can be modified in light of experience, an ontological commitment inevitably precedes empirical inquiry.
Natural laws, so called, are statements of humans' observations, thus are epistemological—concerning human knowledge—the epistemic. Causal mechanisms and structures existing putatively independently of minds exist, or would exist, in the natural world's structure itself, and thus are ontological, the ontic. Blurring epistemic with ontic—as by incautiously presuming a natural law to refer to a causal mechanism, or to trace structures realistically during unobserved transitions, or to be true regularities always unvarying—tends to generate a category mistake.
Discarding ontic commitments, including causality per se, the DN model permits a theory's laws to be reduced to—that is, subsumed by—a more fundamental theory's laws. The higher theory's laws are explained in the DN model by the lower theory's laws. Thus, the epistemic success of Newtonian theory's law of universal gravitation is reduced to—thus explained by—Albert Einstein's general theory of relativity, although Einstein's theory discards Newton's ontic claim that universal gravitation's epistemic success in predicting Kepler's laws of planetary motion operates through a causal mechanism of a straightly attractive force instantly traversing absolute space despite absolute time.
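A textbook illustration of such reduction, not part of Hempel's or Salmon's presentations: in the weak-field, low-velocity limit, general relativity recovers the Newtonian description, so Newton's inverse-square law appears as a deductive consequence of the more fundamental theory.

```latex
% Weak-field, low-velocity limit of general relativity (standard textbook result):
% the metric component reduces to the Newtonian potential, whose field equation
% yields the familiar inverse-square force law.
\[
g_{00} \approx -\Bigl(1 + \frac{2\Phi}{c^{2}}\Bigr), \qquad
\nabla^{2}\Phi = 4\pi G \rho
\quad\Longrightarrow\quad
\mathbf{F} = -\frac{G M m}{r^{2}}\,\hat{\mathbf{r}} .
\]
```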
The covering law model reflects neopositivism's vision of empirical science, a vision interpreting or presuming the unity of science, whereby all empirical sciences are either fundamental science—that is, fundamental physics—or are special sciences, such as astrophysics, chemistry, biology, geology, psychology, economics, and so on. All special sciences would network via the covering law model. And by stating boundary conditions while supplying bridge laws, any special law would reduce to a lower special law, ultimately reducing—theoretically although generally not practically—to fundamental science.

Weaknesses

By the DN model, if one asks, "Why is that shadow 20 feet long?", another can answer, "Because that flagpole is 15 feet tall, the Sun is at x angle, and the laws of electromagnetism hold". Yet by the problem of symmetry, if one instead asked, "Why is that flagpole 15 feet tall?", another could answer, "Because that shadow is 20 feet long, the Sun is at x angle, and the laws of electromagnetism hold", likewise a deduction from observed conditions and scientific laws, but an answer clearly incorrect. By the problem of irrelevance, if one asks, "Why did that man not get pregnant?", one could in part answer, among the explanans, "Because he took birth control pills"—if he factually took them, together with the law of their preventing pregnancy—as the covering law model poses no restriction barring that observation from the explanans.
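The symmetry arises because both derivations run through the same elementary geometry; with illustrative numbers (assumed here, since the example leaves the angle unspecified):

```latex
% Shadow length s from flagpole height h and Sun elevation angle theta:
\[
s = \frac{h}{\tan\theta}, \qquad
h = 15\,\text{ft},\ \theta \approx 36.9^{\circ}
\;\Rightarrow\;
s \approx \frac{15}{0.75} = 20\,\text{ft}.
\]
```

Rearranged as h = s tan θ, the same equation "derives" the flagpole's height from its shadow, yet intuitively only the first direction explains.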
Many philosophers have concluded that causality is integral to scientific explanation. The DN model offers a necessary condition of a causal explanation—successful prediction—but not sufficient conditions of causal explanation, as a universal regularity can include spurious relations or simple correlations, for instance Z always following Y, yet not Z because of Y, but rather Y and then Z both as effects of X. By relating the temperature, pressure, and volume of a gas within a container, Boyle's law permits prediction of an unknown variable—volume, pressure, or temperature—but does not explain why to expect that prediction unless one adds, perhaps, the kinetic theory of gases.
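For instance, with assumed figures, Boyle's law lets one compute the unknown volume, yet the computation carries no account of why the gas obeys the law:

```latex
% Boyle's law for a fixed amount of gas at constant temperature (assumed numbers):
\[
P_1 V_1 = P_2 V_2
\quad\Rightarrow\quad
V_2 = \frac{P_1 V_1}{P_2}
    = \frac{(100\,\text{kPa})(2.0\,\text{L})}{200\,\text{kPa}} = 1.0\,\text{L}.
\]
```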
Scientific explanations increasingly pose not determinism's universal laws, but probabilism's chance, ceteris paribus laws. Smoking's contribution to lung cancer fails even the inductive-statistical model, which requires a probability over 0.5 (up to 1). An applied science that applies statistics to seek associations between events, epidemiology cannot show causality, but it consistently found higher incidence of lung cancer in smokers versus otherwise similar nonsmokers, although the proportion of smokers who develop lung cancer is modest. Versus nonsmokers, however, smokers as a group showed over 20 times the risk of lung cancer, and in conjunction with basic research, consensus followed that smoking had been scientifically explained as a cause of lung cancer, responsible for some cases that without smoking would not have occurred, a probabilistic counterfactual causality.
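The contrast can be put in numbers (illustrative figures, not the historical data): the relative risk compares incidences between groups and can be large even while the probability that any given smoker develops lung cancer stays far below the IS model's threshold.

```latex
% Relative risk with illustrative annual incidences per 100,000 people:
\[
RR = \frac{\text{incidence among smokers}}{\text{incidence among nonsmokers}}
   = \frac{200/100\,000}{10/100\,000} = 20,
\qquad
P(\text{lung cancer} \mid \text{smoker}) \ll 0.5 .
\]
```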

Covering action

Through lawlike explanation, fundamental physics—often perceived as fundamental science—has proceeded by intertheory relation and theory reduction, thereby resolving experimental paradoxes to great historical success, resembling the covering law model. In the early 20th century, Ernst Mach as well as Wilhelm Ostwald had resisted Ludwig Boltzmann's reduction of thermodynamics—and thereby Boyle's law—to statistical mechanics partly because it rested on the kinetic theory of gases, which hinges on the atomic/molecular theory of matter. Mach as well as Ostwald viewed matter as a variant of energy, and molecules as mathematical illusions, as even Boltzmann thought possible.
In 1905, via statistical mechanics, Albert Einstein predicted the phenomenon of Brownian motion—unexplained since being reported in 1827 by botanist Robert Brown. Soon, most physicists accepted that atoms and molecules were unobservable yet real. Also in 1905, Einstein explained the electromagnetic field's energy as distributed in particles, doubted until this helped resolve atomic theory in the 1910s and 1920s. Meanwhile, all known physical phenomena were gravitational or electromagnetic, whose two theories misaligned. Yet belief in aether as the source of all physical phenomena was virtually unanimous. Faced with experimental paradoxes, physicists modified the aether's hypothetical properties.
Finding the luminiferous aether a useless hypothesis, Einstein in 1905 a priori unified all inertial reference frames to state the special principle of relativity, which, by omitting aether, converted space and time into relative phenomena whose relativity aligned electrodynamics with the Newtonian principle of Galilean relativity, or invariance. Originally epistemic or instrumental, this was interpreted as ontic or realist—that is, a causal mechanical explanation—and the principle became a theory, refuting Newtonian gravitation. By predictive success in 1919, general relativity apparently overthrew Newton's theory, a revolution in science resisted by many yet fulfilled around 1930.
In 1925, Werner Heisenberg as well as Erwin Schrödinger independently formalized quantum mechanics (QM). Despite clashing explanations, the two theories made identical predictions. Paul Dirac's 1928 model of the electron was set to special relativity, launching QM into the first quantum field theory, quantum electrodynamics (QED). From it, Dirac interpreted and predicted the electron's antiparticle, soon discovered and termed the positron, but QED failed electrodynamics at high energies. Elsewhere and otherwise, the strong nuclear force and the weak nuclear force were discovered.
In 1941, Richard Feynman introduced QM's path integral formalism, which if taken toward interpretation as a causal mechanical model clashes with Heisenberg's matrix formalism and with Schrödinger's wave formalism, although all three are empirically identical, sharing predictions. Next, working on QED, Feynman sought to model particles without fields and find the vacuum truly empty. As each known fundamental force is apparently an effect of a field, Feynman failed. Louis de Broglie's wave–particle duality had rendered atomism—indivisible particles in a void—untenable, and highlighted the very notion of discontinuous particles as self-contradictory.
Meeting in 1947, Freeman Dyson, Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga soon introduced renormalization, a procedure converting QED to physics' most predictively precise theory, subsuming chemistry, optics, and statistical mechanics. QED thus won physicists' general acceptance. Paul Dirac criticized its need for renormalization as showing its unnaturalness, and called for an aether. In 1947, Willis Lamb had found unexpected motion of electron orbitals, shifted since the vacuum is not truly empty. Yet emptiness was catchy, abolishing aether conceptually, and physics proceeded ostensibly without it, even suppressing it. Meanwhile, "sickened by untidy math, most philosophers of physics tend to neglect QED".
Physicists have feared even mentioning aether, renamed vacuum, which—as such—is nonexistent. General philosophers of science commonly believe that aether, rather, is fictitious, "relegated to the dustbin of scientific history ever since" 1905 brought special relativity. Einstein was noncommittal to aether's nonexistence; he simply said it was superfluous. Abolishing Newtonian motion for electrodynamic primacy, however, Einstein inadvertently reinforced aether, and to explain motion was led back to aether in general relativity. Yet resistance to relativity theory became associated with earlier theories of aether, whose word and concept became taboo. Einstein explained special relativity's compatibility with an aether, but the Einstein aether, too, was opposed. Objects became conceived as pinned directly on space and time by abstract geometric relations lacking a ghostly or fluid medium.
By 1970, QED along with the weak nuclear field was reduced to electroweak theory (EWT), and the strong nuclear field was modeled as quantum chromodynamics (QCD). Comprising EWT, QCD, and the Higgs field, this Standard Model of particle physics is an "effective theory", not truly fundamental. As QCD's particles are considered nonexistent in the everyday world, QCD especially suggests an aether, routinely found by physics experiments to exist and to exhibit relativistic symmetry. Confirmation of the Higgs particle, modeled as a condensation within the Higgs field, corroborates aether, although physics need not state or even include aether. Organizing regularities of observations—as in the covering law model—physicists find superfluous the quest to discover aether.
In 1905, from special relativity, Einstein deduced mass–energy equivalence, particles being variant forms of distributed energy, whereby particles colliding at vast speeds experience transformation of that energy into mass, producing heavier particles, although physicists' talk promotes confusion. As "the contemporary locus of metaphysical research", QFTs pose particles not as existing individually but as excitation modes of fields, the particles and their masses being states of aether, apparently unifying all physical phenomena as the more fundamental causal reality, as long ago foreseen. Yet a quantum field is an intricate abstraction—a mathematical field—virtually inconceivable as a classical field's physical properties. Nature's deeper aspects, still unknown, might elude any possible field theory.
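The deduction referred to, and the collision energetics it licenses, can be written compactly (a sketch; the symbols for the products' total mass and the center-of-momentum energy are labels introduced here, not the article's):

```latex
% Mass-energy equivalence; in a collision, rest plus kinetic energy in the
% center-of-momentum frame bounds the total rest mass of the products.
\[
E = mc^{2}, \qquad
M_{\text{products}}\,c^{2} \;\le\; E_{\text{cm}}
  = \sum_i \bigl(m_i c^{2} + T_i^{\text{cm}}\bigr).
\]
```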
Though discovery of causality is popularly thought science's aim, search for it was shunned by the Newtonian research program, even more Newtonian than was Isaac Newton. By now, most theoretical physicists infer that the four known fundamental interactions would reduce to superstring theory, whereby atoms and molecules, after all, are energy vibrations holding mathematical, geometric forms. Given uncertainties of scientific realism, some conclude that the concept of causality raises comprehensibility of scientific explanation and thus is key folk science, but compromises precision of scientific explanation and is dropped as a science matures. Even epidemiology is maturing to heed the severe difficulties with presumptions about causality. The covering law model is among Carl G Hempel's admired contributions to philosophy of science.