History of malaria


The history of malaria stretches from its prehistoric origin as a zoonotic disease in the primates of Africa through to the 21st century. A widespread and potentially lethal human infectious disease, at its peak malaria was present on every continent except Antarctica. Its prevention and treatment have been targeted in science and medicine for hundreds of years. Since the discovery of the Plasmodium parasites which cause it, research attention has focused on their biology, as well as that of the mosquitoes which transmit the parasites.
References to its unique, periodic fevers are found throughout recorded history beginning in the first millennium BC in Greece and China.
For thousands of years, traditional herbal remedies have been used to treat malaria. The first effective treatment for malaria came from the bark of the cinchona tree, which contains quinine. After the link to mosquitoes and their parasites was identified in the early twentieth century, mosquito control measures such as widespread use of the insecticide DDT, swamp drainage, covering or oiling the surface of open water sources, indoor residual spraying and use of insecticide-treated nets were initiated. Prophylactic quinine was prescribed in malaria-endemic areas, and new therapeutic drugs, including chloroquine and artemisinins, were deployed against the disease. Today, artemisinin derivatives are a component of the standard treatments for malaria. After artemisinin was introduced in combination with other drugs, malaria mortality in Africa fell by about half, though it later partially rebounded.
Malaria researchers have won multiple Nobel Prizes for their achievements, although the disease continues to afflict some 200 million patients each year, killing more than 600,000.
Malaria was the most important health hazard encountered by U.S. troops in the South Pacific during World War II, where about 500,000 men were infected. According to Joseph Patrick Byrne, "Sixty thousand American soldiers died of malaria during the African and South Pacific campaigns."
At the close of the 20th century, malaria remained endemic in more than 100 countries throughout the tropical and subtropical zones, including large areas of Central and South America, Hispaniola, Africa, the Middle East, the Indian subcontinent, Southeast Asia, and Oceania. Resistance of Plasmodium to antimalarial drugs, as well as resistance of mosquitoes to insecticides and the discovery of zoonotic species of the parasite, have complicated control measures.

Origin and prehistoric period

The first evidence of malaria parasites was found in mosquitoes preserved in amber from the Palaeogene period that are approximately 30 million years old. Human malaria likely originated in Africa and coevolved with its hosts, mosquitoes and non-human primates. Malaria protozoa are diversified into primate, rodent, bird, and reptile host lineages. Humans may have originally caught Plasmodium falciparum from gorillas. P. vivax, another malarial Plasmodium species among the six that infect humans, also likely originated in African gorillas and chimpanzees. Another malarial species recently discovered to be transmissible to humans, P. knowlesi, originated in Asian macaque monkeys. While P. malariae is highly host specific to humans, there is some evidence that low level non-symptomatic infection persists among wild chimpanzees.
About 10,000 years ago, malaria started having a major impact on human survival, coinciding with the start of agriculture in the Neolithic revolution. Consequences included natural selection for sickle-cell disease, thalassaemias, glucose-6-phosphate dehydrogenase deficiency, Southeast Asian ovalocytosis, elliptocytosis and loss of the Gerbich antigen and the Duffy antigen on the erythrocytes, because such blood disorders confer a selective advantage against malaria infection. The three major types of inherited genetic resistance were present in the Mediterranean world by the time of the Roman Empire, about 2000 years ago.
Molecular methods have confirmed the high prevalence of P. falciparum malaria in ancient Egypt. The Ancient Greek historian Herodotus wrote that the builders of the Egyptian pyramids were given large amounts of garlic, probably to protect them against malaria. The Pharaoh Sneferu, the founder of the Fourth dynasty of Egypt, who reigned from around 2613–2589 BCE, used bed-nets as protection against mosquitoes. Cleopatra VII, the last Pharaoh of Ancient Egypt, similarly slept under a mosquito net. However, whether the mosquito nets were used for the purpose of malaria prevention, or for more mundane purpose of avoiding the discomfort of mosquito bites, is unknown. The presence of malaria in Egypt from circa 800 BCE onwards has been confirmed using DNA-based methods.

Classical period

Malaria became widely recognized in ancient Greece by the 4th century BC, and is implicated in the decline of many city-state populations. The term μίασμα ("stain, pollution") was coined by Hippocrates of Kos, who used it to describe dangerous fumes from the ground that are transported by winds and can cause serious illnesses. Hippocrates, the "father of medicine", related the presence of intermittent fevers to climatic and environmental conditions and classified the fevers according to periodicity: Greek tritaios pyretos (Latin febris tertiana) for tertian fever, and tetartaios pyretos (febris quartana) for quartan fever.
The Chinese Huangdi Neijing dating from ~300 BC – 200 AD apparently refers to repeated paroxysmal fevers associated with enlarged spleens and a tendency to epidemic occurrence.
Around 168 BC, the herbal remedy Qing-hao came into use in China to treat female hemorrhoids.
Qing-hao was first recommended for acute intermittent fever episodes by Ge Hong as an effective medication in the 4th-century Chinese manuscript Zhou hou bei ji fang, usually translated as "Emergency Prescriptions Kept in One's Sleeve". His recommendation was to soak fresh plants of the artemisia herb in cold water, wring them out and ingest the expressed bitter juice in its raw state.
'Roman fever' refers to a particularly deadly strain of malaria that affected the Roman Campagna and the city of Rome throughout various epochs in history. An epidemic of Roman fever during the fifth century AD may have contributed to the fall of the Roman empire. The many remedies to reduce the spleen in Pedanius Dioscorides's De Materia Medica have been suggested to have been a response to chronic malaria in the Roman empire. Some so-called "vampire burials" in late antiquity may have been performed in response to malaria epidemics. For example, some children who died of malaria were buried in the necropolis at Lugnano in Teverina using rituals meant to prevent them from returning from the dead. Modern scholars hypothesize that communities feared that the dead would return and spread disease.
In 835, the celebration of Hallowmas was moved from May to November at the behest of Pope Gregory IV, on the "practical grounds that Rome in summer could not accommodate the great number of pilgrims who flocked to it", and perhaps because of public health considerations regarding Roman Fever, which claimed a number of lives of pilgrims during the sultry summers of the region.

Middle Ages

During the Middle Ages, treatments for malaria included blood-letting, inducing vomiting, limb amputations and trepanning. Physicians and surgeons in the period used herbal medicines like belladonna to bring about pain relief in afflicted patients.

European Renaissance

The name malaria derives from the Medieval Italian mala aria, "bad air". The idea came from the Ancient Romans, who thought that the disease arose from pestilential fumes in the swamps. The word malaria has its roots in the miasma theory, as described by historian and chancellor of Florence Leonardo Bruni in his Historiarum Florentini populi libri XII, which was the first major example of Renaissance historical writing:

Avuto i Fiorentini questo fortissimo castello e fornitolo di buone guardie, consigliavano fra loro medesimi fosse da fare. Erano alcuni a' quali pareva sommamente utile e necessario a ridurre lo esercito, e massimamente essendo affaticato per la infermità e per la mala aria e per lungo e difficile campeggiare nel tempo dell'autunno e in luoghi infermi, e vedendo ancora ch'egli era diminuito assai per la licenza conceduta a molti pel capitano di potersi partire: perocchè, nel tempo che eglino erano stati lungamente a quello assedio, molti, o per disagio del campo o per paura d'infermità, avevano domandato e ottenuto licenza da lui.
After the Florentines had conquered this stronghold and furnished it with good guards, they discussed among themselves how to proceed. To some of them it seemed most useful and necessary to reduce the army, the more so as it was exhausted by disease and by the bad air, and by the long and difficult encampment in unhealthy places during the autumn. They further considered that the army had been much diminished by the leave permits granted to many soldiers by the captain: for during the long siege, many soldiers, whether because of the hardships of the camp or for fear of illness, had asked for and obtained leave from him.

The coastal plains of southern Italy fell from international prominence when malaria expanded in the sixteenth century. At roughly the same time, in the coastal marshes of England, mortality from "marsh fever" or "tertian ague" was comparable to that in sub-Saharan Africa today. William Shakespeare was born at the start of the especially cold period that climatologists call the "Little Ice Age", yet he was aware enough of the ravages of the disease to mention it in eight of his plays.
Medical accounts and ancient autopsy reports state that tertian malarial fevers caused the death of four members of the prominent Medici family of Florence. These claims have been confirmed with more modern methodologies.

Spread to the Americas

Malaria was not referenced in the "medical books" of the Mayans or Aztecs. European settlers and the West Africans they enslaved likely brought malaria to the Americas in the 16th century.
The author Charles Mann cites sources that speculate that African slaves were brought to the British Americas because of their resistance to malaria. The colonies needed low-paid agricultural labor, and large numbers of poor British were ready to emigrate. North of the Mason–Dixon line, where malaria-transmitting mosquitoes did not fare well, British indentured servants proved more profitable, as they would work diligently toward their freedom. However, as malaria spread to places such as the tidewater of Virginia and South Carolina, the owners of large plantations came to rely on the enslavement of more malaria-resistant West Africans, while white small landholders risked ruin whenever they got sick. The disease also helped weaken the Native American population and make them more susceptible to other diseases.
Malaria caused huge losses to British forces in the South during the Revolutionary War, as well as to Union forces during the Civil War.

Cinchona tree

Spanish missionaries found that Amerindians near Loxa treated fever with powder from Peruvian bark. The Quechua Indians of Ecuador used it to reduce the shaking caused by severe chills. Jesuit Brother Agostino Salumbrino, who lived in Lima and was an apothecary by training, observed the Quechua using the bark of the cinchona tree for that purpose. While its effect in treating malaria was unrelated to its effect in controlling shivering from cold, it was nevertheless effective for malaria. The use of the "fever tree" bark was introduced into European medicine by Jesuit missionaries. Jesuit Bernabé de Cobo, who explored Mexico and Peru, is credited with taking cinchona bark to Europe. He brought the bark from Lima to Spain, and then to Rome and other parts of Italy, in 1632. Francesco Torti wrote in 1712 that only "intermittent fever" was amenable to the fever tree bark. This work finally established the specific nature of cinchona bark and brought about its general use in medicine.
It would be nearly 200 years before the active principles, quinine and other alkaloids, of cinchona bark were isolated. Quinine, a toxic plant alkaloid, is, in addition to its anti-malarial properties, moderately effective against nocturnal leg cramps.

Clinical indications

In 1717, the epidemiologist Giovanni Maria Lancisi described the dark pigmentation of a postmortem spleen and brain in his malaria textbook De noxiis paludum effluviis eorumque remediis. This was one of the earliest reports of the characteristic enlargement of the spleen and the dark color of the spleen and brain, which are the most constant post-mortem indications of chronic malaria infection. He related the prevalence of malaria in swampy areas to the presence of flies and recommended swamp drainage to prevent it.

19th century

In the nineteenth century, the first drugs were developed to treat malaria and parasites were first identified as its source.

Antimalarial drugs

Quinine

In 1820, French chemist Pierre Joseph Pelletier and French pharmacist Joseph Bienaimé Caventou separated the alkaloids cinchonine and quinine from powdered fever tree bark, allowing for the creation of standardized doses of the active ingredients. Before then, the bark was simply dried, ground to a fine powder and mixed into a liquid for drinking.
An English trader, Charles Ledger, and his Amerindian servant spent four years in the Bolivian Andes collecting cinchona seeds, which were highly prized for their quinine but whose export was prohibited. Ledger managed to get seeds out; in 1865, the Dutch government cultivated 20,000 Cinchona ledgeriana trees in Java. By the end of the nineteenth century, the Dutch had established a world monopoly over the supply of quinine.

'Warburg's Tincture'

In 1834, in British Guiana, a German physician, Carl Warburg, invented an antipyretic medicine: 'Warburg's Tincture'. This secret, proprietary remedy contained quinine and other herbs. Trials were made in Europe in the 1840s and 1850s. It was officially adopted by the Austrian Empire in 1847. It was considered by many eminent medical professionals to be a more efficacious antimalarial than quinine. It was also more economical. The British Government supplied Warburg's Tincture to troops in India and other colonies.

Methylene blue

In 1876, methylene blue was synthesized by German chemist Heinrich Caro. In 1880, Paul Ehrlich described the use of "neutral" dyes – mixtures of acidic and basic dyes – for the differentiation of cells in peripheral blood smears. In 1891, Ernst Malachowski and Dmitri Leonidovich Romanowsky independently developed techniques using a mixture of Eosin Y and modified methylene blue that produced a surprising hue unattributable to either staining component: a shade of purple. Malachowski used alkali-treated methylene blue solutions, while Romanowsky used methylene blue solutions that were moldy or aged. This new method differentiated blood cells and demonstrated the nuclei of malarial parasites. Malachowski's staining technique was one of the most significant technical advances in the history of malaria.
In 1891, Paul Guttmann and Ehrlich noted that methylene blue had a high affinity for some tissues and that this dye had a slight antimalarial property. Methylene blue and its congeners may act by preventing the biocrystallization of heme.

Cause: Identification of Plasmodium and Anopheles

In 1848, German anatomist Johann Heinrich Meckel recorded black-brown pigment granules in the blood and spleen of a patient who had died in a mental hospital. Meckel was thought to have been looking at malaria parasites without realizing it; he did not mention malaria in his report. He hypothesized that the pigment was melanin. The causal relationship of pigment to the parasite was established in 1880, when French physician Charles Louis Alphonse Laveran, working in the military hospital of Constantine, Algeria, observed pigmented parasites inside the red blood cells of malaria sufferers. He witnessed the events of exflagellation and became convinced that the moving flagella were parasitic microorganisms. He noted that quinine removed the parasites from the blood. Laveran called this microscopic organism Oscillaria malariae and proposed that malaria was caused by this protozoan. This discovery remained controversial until the development of the oil immersion lens in 1884 and of superior staining methods in 1890–1891.
In 1885, Ettore Marchiafava, Angelo Celli and Camillo Golgi studied the reproduction cycles in human blood. Golgi observed that all parasites present in the blood divided almost simultaneously at regular intervals and that division coincided with attacks of fever. In 1886, Golgi described the morphological differences that are still used to distinguish two malaria parasite species, Plasmodium vivax and Plasmodium malariae. Shortly after this, Sakharov in 1889 and Marchiafava & Celli in 1890 independently identified Plasmodium falciparum as a species distinct from P. vivax and P. malariae. In 1890, Grassi and Feletti reviewed the available information and named both P. malariae and P. vivax, placing them in the genus Haemamoeba. By 1890, Laveran's germ was generally accepted, but most of his initial ideas had been discarded in favor of the taxonomic work and clinical pathology of the Italian school. Marchiafava and Celli called the new microorganism Plasmodium, and Haemamoeba vivax was soon renamed Plasmodium vivax. In 1892, Marchiafava and Bignami proved that the multiple forms seen by Laveran were from a single species. This species was eventually named P. falciparum. Laveran was awarded the 1907 Nobel Prize for Physiology or Medicine "in recognition of his work on the role played by protozoa in causing diseases".
Dutch physician Pieter Pel first proposed a tissue stage of the malaria parasite in 1886, presaging its discovery by over 50 years. This suggestion was reiterated in 1893 when Golgi suggested that the parasites might have an undiscovered tissue phase. Pel in 1896 supported Golgi's latent phase theory.
The establishment of the scientific method from about the mid-19th century on demanded testable hypotheses and verifiable phenomena for causation and transmission. Anecdotal reports, and the discovery in 1881 that mosquitoes were the vector of yellow fever, eventually led to the investigation of mosquitoes in connection with malaria.
An early effort at malaria prevention occurred in 1896 in Massachusetts. An Uxbridge outbreak prompted health officer Dr. Leonard White to write a report to the State Board of Health, which led to a study of mosquito-malaria links and the first efforts for malaria prevention. Massachusetts state pathologist, Theobald Smith, asked that White's son collect mosquito specimens for further analysis, and that citizens add screens to windows, and drain collections of water.
Britain's Sir Ronald Ross, an army surgeon working in Secunderabad India, proved in 1897 that malaria is transmitted by mosquitoes, an event now commemorated via World Mosquito Day. He was able to find pigmented malaria parasites in a mosquito that he artificially fed on a malaria patient who had crescents in his blood. He continued his research into malaria by showing that certain mosquito species transmit malaria to sparrows and he isolated malaria parasites from the salivary glands of mosquitoes that had fed on infected birds. He reported this to the British Medical Association in Edinburgh in 1898.
Giovanni Battista Grassi, professor of Comparative Anatomy at Rome University, showed that human malaria could only be transmitted by Anopheles mosquitoes. Grassi along with coworkers Amico Bignami, Giuseppe Bastianelli and Ettore Marchiafava announced at the session of the Accademia dei Lincei on 4 December 1898 that a healthy man in a non-malarial zone had contracted tertian malaria after being bitten by an experimentally infected Anopheles claviger specimen.
In 1898–1899, Bastianelli, Bignami and Grassi were the first to observe the complete transmission cycle of P. falciparum, P. vivax and P. malariae from mosquito to human and back in A. claviger.
A dispute broke out between the British and Italian schools of malariology over priority, but Ross received the 1902 Nobel Prize for Physiology or Medicine for "his work on malaria, by which he has shown how it enters the organism and thereby has laid the foundation for successful research on this disease and methods of combating it".

Synthesis of quinine

In the 1850s, William Henry Perkin, a student of August Wilhelm von Hofmann at the Royal College of Chemistry in London, unsuccessfully tried to synthesize quinine in a commercial process. The idea was to take two equivalents of N-allyltoluidine and three atoms of oxygen to produce quinine and water. Instead, Perkin's mauve was produced when attempting quinine total synthesis via the oxidation of N-allyltoluidine. Before Perkin's discovery, all dyes and pigments were derived from roots, leaves, insects, or, in the case of Tyrian purple, molluscs.
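The intended transformation can be written as a balanced equation. The molecular formulas below (N-allyltoluidine C10H13N, quinine C20H24N2O2) are standard values supplied here for illustration; they do not appear in the text above:

```latex
% Perkin's intended (and unrealized) oxidative route to quinine:
% two equivalents of N-allyltoluidine plus three atoms of oxygen
2\,\mathrm{C_{10}H_{13}N} \;+\; 3\,[\mathrm{O}] \;\longrightarrow\; \mathrm{C_{20}H_{24}N_{2}O_{2}} \;+\; \mathrm{H_{2}O}
```

The equation balances on paper (20 C, 26 H, 2 N and 3 O on each side), which is why the route looked plausible before the structural complexity of quinine was understood.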
Quinine was not successfully synthesized until 1918. Synthesis remains elaborate, expensive and low-yield, with the additional problem of separating the stereoisomers. Though quinine is not one of the major drugs used in treatment, modern production still relies on extraction from the cinchona tree.

20th century

Etiology: Plasmodium tissue stage and reproduction

Relapses were first noted in 1897 by William S. Thayer, who recounted the experiences of a physician who relapsed 21 months after leaving an endemic area. He proposed the existence of a tissue stage. Relapses were confirmed by Patrick Manson, who allowed infected Anopheles mosquitoes to feed on his eldest son. The younger Manson then described a relapse nine months after his apparent cure with quinine.
Also, in 1900 Amico Bignami and Giuseppe Bastianelli found that they could not infect an individual with blood containing only gametocytes. The possibility of the existence of a chronic blood stage infection was proposed by Ronald Ross and David Thompson in 1910.
The existence of asexually-reproducing avian malaria parasites in cells of the internal organs was first demonstrated by Henrique de Beaurepaire Aragão in 1908.
Three possible mechanisms of relapse were proposed by Marchoux in 1926: parthenogenesis of macrogametocytes; persistence of schizonts in small numbers in the blood, where immunity inhibits multiplication but later disappears; and/or reactivation of an encysted body in the blood. James in 1931 proposed that sporozoites are carried to internal organs, where they enter reticuloendothelial cells and undergo a cycle of development, based on quinine's lack of activity on them. Huff and Bloom in 1935 demonstrated stages of avian malaria that transpire outside blood cells. In 1945, Fairley et al. reported that inoculation of blood from a patient with P. vivax may fail to induce malaria, although the donor may subsequently exhibit the condition. Sporozoites disappeared from the bloodstream within one hour and reappeared eight days later. This suggested the presence of forms that persist in tissues. Using mosquitoes rather than blood, in 1946 Shute described a similar phenomenon and proposed the existence of an 'x-body' or resting form. The following year Sapero proposed a link between relapse and a tissue stage not yet discovered. Garnham in 1947 described exoerythrocytic schizogony in Hepatocystis kochi. In the following year, Shortt and Garnham described the liver stages of P. cynomolgi in monkeys. In the same year, a human volunteer consented to receive a massive dose of infected sporozoites of P. vivax and undergo a liver biopsy three months later, thus allowing Shortt et al. to demonstrate the tissue stage. The tissue form of Plasmodium ovale was described in 1954 and that of P. malariae in 1960 in experimentally infected chimpanzees.
The latent or dormant liver form of the parasite, apparently responsible for the relapses characteristic of P. vivax and P. ovale infections, was first observed in the 1980s. The term hypnozoite was coined by Miles B. Markus while a student. In 1976, he speculated: "If sporozoites of Isospora can behave in this fashion, then those of related Sporozoa, like malaria parasites, may have the ability to survive in the tissues in a similar way." In 1982, Krotoski et al. reported identification of P. vivax hypnozoites in liver cells of infected chimpanzees.

Malariotherapy

In the early twentieth century, before antibiotics, patients with tertiary syphilis were intentionally infected with malaria to induce a fever; this was called malariotherapy. In 1917, Julius Wagner-Jauregg, a Viennese psychiatrist, began to treat neurosyphilitics with induced Plasmodium vivax malaria. Three or four bouts of fever were enough to kill the temperature-sensitive syphilis bacterium. P. vivax infections were then terminated by quinine. By accurately controlling the fever with quinine, the effects of both syphilis and malaria could be minimized. While about 15% of patients died from malaria, this was preferable to the almost-certain death from syphilis. Therapeutic malaria opened up a wide field of chemotherapeutic research and was practised until 1950. Wagner-Jauregg was awarded the 1927 Nobel Prize in Physiology or Medicine for his discovery of the therapeutic value of malaria inoculation in the treatment of dementia paralytica.
Henry Heimlich advocated malariotherapy as a treatment for AIDS, and some studies of malariotherapy for HIV infection have been performed in China. The United States Centers for Disease Control and Prevention does not recommend the use of malariotherapy for HIV.

Panama Canal and vector control

In 1881, Dr. Carlos Finlay, a Cuban-born physician of Scottish ancestry, theorized that yellow fever was transmitted by a specific mosquito, later designated Aedes aegypti. The theory remained controversial for twenty years until confirmed in 1901 by Walter Reed. This was the first scientific proof of a disease being transmitted exclusively by an insect vector, and it demonstrated that control of such diseases necessarily entailed control or eradication of their insect vectors.
Yellow fever and malaria among workers had seriously delayed construction of the Panama Canal. Mosquito control instituted by William C. Gorgas dramatically reduced this problem.

Antimalarial drugs

Chloroquine

Researchers synthesized and tested some 12,000 compounds, eventually producing Resochin® as a substitute for quinine in the 1930s. It is chemically related to quinine through the possession of a quinoline nucleus and the dialkylaminoalkylamino side chain. Resochin and a similar compound, Sontochin, were synthesized in 1934. In March 1946, the drug was officially named chloroquine. Chloroquine is an inhibitor of hemozoin production through biocrystallization. Quinine and chloroquine affect malarial parasites only at life stages when the parasites are forming hematin pigment as a byproduct of hemoglobin degradation.
Chloroquine-resistant forms of P. falciparum emerged only 19 years later. The first resistant strains were detected around the Cambodia–Thailand border and in Colombia in the 1950s. In 1989, chloroquine resistance in P. vivax was reported in Papua New Guinea. These resistant strains spread rapidly, producing a large mortality increase, particularly in Africa during the 1990s.

Artemisinins

Systematic screening of traditional Chinese medical herbs was carried out by Chinese research teams, consisting of hundreds of scientists in the 1960s and 1970s. Qinghaosu, later named artemisinin, was cold-extracted in a neutral milieu from the dried leaves of Artemisia annua.
Artemisinin was isolated by pharmacologist Tu Youyou. Tu headed a team tasked by the Chinese government with finding a treatment for chloroquine-resistant malaria. Their work was known as Project 523, named after the date it was announced: 23 May 1967. The team investigated more than 2,000 Chinese herb preparations and by 1971 had made 380 extracts from 200 herbs. An extract from qinghao was effective, but the results were variable. Tu reviewed the literature, including the Zhou hou bei ji fang, written in AD 340 by Chinese physician Ge Hong. This book contained the only useful reference to the herb: "A handful of qinghao immersed with two litres of water, wring out the juice and drink it all." Tu's team subsequently isolated a nontoxic, neutral extract that was 100% effective against parasitemia in animals. The first successful trials of artemisinin were in 1979.
Artemisinin is a sesquiterpene lactone containing a peroxide group, which is believed to be essential for its anti-malarial activity. Its derivatives, artesunate and artemether, have been used in clinics since 1987 for the treatment of drug-resistant and drug-sensitive malaria, especially cerebral malaria. These drugs are characterized by fast action, high efficacy and good tolerance. They kill the asexual forms of P. berghei and P. cynomolgi and have transmission-blocking activity. In 1985, Zhou Yiqing and his team combined artemether and lumefantrine into a single tablet, which was registered as a medicine in China in 1992. Later it became known as "Coartem". Artemisinin-based combination treatments (ACTs) are now widely used to treat uncomplicated falciparum malaria, but access to ACTs is still limited in most malaria-endemic countries, and only a minority of the patients who need them receive ACTs.
In 2008 White predicted that improved agricultural practices, selection of high-yielding hybrids, microbial production, and the development of synthetic peroxides would lower prices.

Insecticides

Efforts to control the spread of malaria suffered a major setback in 1930: entomologist Raymond Corbett Shannon discovered imported disease-bearing Anopheles gambiae mosquitoes living in Brazil. This species of mosquito is a particularly efficient vector for malaria and is native to Africa. In 1938, the introduction of this vector caused the greatest epidemic of malaria ever seen in the New World. However, complete eradication of A. gambiae from northeast Brazil and thus from the New World was achieved in 1940 by the systematic application of the arsenic-containing compound Paris green to breeding places, and of pyrethrum spray-killing to adult resting places.

DDT

Austrian chemist Othmar Zeidler is credited with the first synthesis of DDT in 1874. The insecticidal properties of DDT were identified in 1939 by chemist Paul Hermann Müller of Geigy Pharmaceutical. For his discovery of DDT as a contact poison against several arthropods, he was awarded the 1948 Nobel Prize in Physiology or Medicine. In the fall of 1942, samples of the chemical were acquired by the United States, Britain and Germany. Laboratory tests demonstrated that it was highly effective against many insects.
Rockefeller Foundation studies showed in Mexico that DDT remained effective for six to eight weeks if sprayed on the inside walls and ceilings of houses and other buildings. The first field test in which residual DDT was applied to the interior surfaces of all habitations and outbuildings was carried out in central Italy in the spring of 1944. The objective was to determine the residual effect of the spray upon Anopheline density in the absence of other control measures. Spraying began in Castel Volturno and, after a few months, in the delta of the Tiber. The unprecedented effectiveness of the chemical was confirmed: the new insecticide was able to eradicate malaria by eradicating mosquitoes. At the end of World War II, a massive malaria control program based on DDT spraying was carried out in Italy. In Sardinia – the second largest island in the Mediterranean – between 1946 and 1951, the Rockefeller Foundation conducted a large-scale experiment to test the feasibility of the strategy of "species eradication" in an endemic malaria vector. Malaria was effectively eliminated in the United States by the use of DDT in the National Malaria Eradication Program. The concept of eradication prevailed in 1955 in the Eighth World Health Assembly: DDT was adopted as a primary tool in the fight against malaria.
In 1953, the World Health Organization launched an antimalarial program in parts of Liberia as a pilot project to determine the feasibility of malaria eradication in tropical Africa. However, these projects encountered difficulties that foreshadowed the general retreat from malaria eradication efforts across tropical Africa by the mid-1960s.
DDT was banned for agricultural use in the US in 1972, following a debate opened in 1962 by Silent Spring, a book by American biologist Rachel Carson that helped launch the environmental movement in the West. The book catalogued the environmental impacts of indiscriminate DDT spraying and suggested that DDT and other pesticides cause cancer and that their agricultural use was a threat to wildlife. Nevertheless, the U.S. Agency for International Development supports indoor DDT spraying as a vital component of malaria control programs and has initiated DDT and other insecticide spraying programs in tropical countries.
[Image caption: Pyrethrum field, Lari Hills, Nairobi, Kenya, 2010]

Pyrethrum

Other insecticides are available for mosquito control, as well as physical measures, such as draining the wetland breeding grounds and the provision of better sanitation. Pyrethrum is an economically important source of natural insecticide. Pyrethrins attack the nervous systems of all insects. A few minutes after application, the insect cannot move or fly, while female mosquitoes are inhibited from biting. The use of pyrethrum in insecticide preparations dates to about 400 BCE. Pyrethrins are biodegradable and break down easily on exposure to light. The majority of the world's supply of pyrethrin and Chrysanthemum cinerariaefolium comes from Kenya. The flower was first introduced into Kenya and the highlands of Eastern Africa during the late 1920s. The flowers of the plant are harvested shortly after blooming; they are either dried and powdered, or the oils within the flowers are extracted with solvents.

Research

Avian, mouse and monkey models

Until the 1950s, screening of antimalarial drugs was carried out on avian malaria. Avian malaria species differ from those that infect humans. The discovery in 1948 of Plasmodium berghei in wild rodents in the Congo, and later of other rodent malaria species that could infect laboratory rats, transformed drug development. The short hepatic phase and life cycle of these parasites made them useful as animal models, a status they still retain. Plasmodium cynomolgi in rhesus monkeys was used in the 1960s to test drugs active against P. vivax.
Growth of the liver stages in animal-free systems was achieved in the 1980s, when pre-erythrocytic P. berghei stages were grown in WI-38, a human embryonic lung cell line. This was followed by their growth in the human hepatoma line HepG2. Both P. falciparum and P. vivax have been grown in human liver cells; partial development of P. ovale in human liver cells was achieved; and P. malariae was grown in chimpanzee and monkey liver cells.
The first successful continuous in vitro culture of P. falciparum was established in 1976 by William Trager and James B. Jensen, which facilitated research into the molecular biology of the parasite and the development of new drugs. By using increasing volumes of culture medium, P. falciparum was grown to higher parasitemia levels.

Diagnostics

The use of antigen-based malaria rapid diagnostic tests (RDTs) emerged in the 1980s. In the twenty-first century, Giemsa microscopy and RDTs became the two preferred diagnostic techniques. Malaria RDTs do not require special equipment and offer the potential to extend accurate malaria diagnosis to areas lacking microscopy services.

A zoonotic malarial parasite

Plasmodium knowlesi has been known since the 1930s as a parasite of Asian macaque monkeys that is experimentally capable of infecting humans. In 1965, a natural human infection was reported in a U.S. soldier returning from the Pahang Jungle of the Malaysian peninsula.