The history of nuclear cardiology began in 1927, when Dr. Herrmann Blumgart developed the first method for measuring cardiac strength by injecting subjects with a radioactive compound known as Radium C. The substance was injected into the venous system, travelled through the right heart into the lungs, then into the left heart and out into the arterial system, where it was detected with a Wilson chamber. The Wilson chamber, an early cloud chamber, served as a primitive detector of radioactivity. Acquired sequentially over time, these measurements of radioactivity yielded what was known as the "circulation time": the longer the circulation time, the weaker the heart. Blumgart's emphasis was twofold. First, radioactive substances could be used to study cardiac physiology, and should be used with the least amount of radioactivity necessary for the task. Second, accomplishing this requires obtaining multiple counts over time.

Little substantial work followed for decades, until 1959, when Dr. Richard Gorlin's work on "resting" studies of the heart and nitroglycerin emphasized several points. Like Blumgart, he stressed that evaluating cardiac function requires multiple measurements of change over time, and that these measurements must be made under same-state conditions, without altering the function of the heart between measurements. To evaluate ischemia, individuals must be studied under "stress" conditions, and the comparisons must be "stress-stress" comparisons. Similarly, tissue damage is determined under "resting" conditions. Rest-stress comparisons do not adequately determine either ischemia or infarction.

By 1963, Dr. Robert Bruce, aware of the tendency of people with coronary artery disease to experience angina during exercise, developed the first standardized method of "stressing" the heart, under which serial changes in blood pressure, heart rate and the electrocardiogram could be measured under "stress-stress" conditions. By 1965, Dr. William Love demonstrated that the cumbersome cloud chamber could be replaced by a Geiger counter, which was more practical to use. However, Love shared the concern of many of his colleagues that no suitable radioisotopes were available for human use in the clinical setting.
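Blumgart's "circulation time" amounts to detecting when injected activity first appears at a downstream counting site. The following is a minimal sketch of that idea only; the threshold rule and all count values are invented for illustration and do not reflect Blumgart's actual apparatus or data.

```python
# Illustrative sketch of the "circulation time" concept: inject a tracer at
# t = 0, count radioactivity at an arterial site over time, and report the
# delay until counts rise clearly above background. All numbers are invented.

def circulation_time(times, counts, background, k=3.0):
    """Return the first time at which counts exceed background by k standard
    deviations (Poisson noise assumed, so sigma ~ sqrt(background))."""
    threshold = background + k * background ** 0.5
    for t, c in zip(times, counts):
        if c > threshold:
            return t
    return None  # tracer never clearly detected

# Hypothetical acquisition: counts per second, sampled once per second.
times = list(range(20))
counts = [10, 12, 9, 11, 10, 13, 11, 10, 35, 80,
          120, 90, 60, 40, 30, 25, 20, 18, 15, 14]

print(circulation_time(times, counts, background=10.0))  # -> 8 (seconds)
```

A weaker heart moves blood more slowly, so the arrival of the tracer, and hence the reported circulation time, comes later.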
By the mid-1970s, scientists and clinicians alike had begun using thallium-201 as the radioisotope of choice for human studies. Individuals could be "stressed" on a treadmill by the Bruce protocol and, near peak performance, injected with thallium-201. Exercise was continued for an additional minute after injection to enhance circulation of the isotope. Using the nuclear cameras of the day, and given the limitations of Tl-201, the first "stress" image could not be taken until one hour after stress. In keeping with the concept of comparison images, a second "stress" image was taken four hours after stress and compared with the first. The redistribution of Tl-201 between the two images reflected differences in tissue delivery and function. The relatively long physical half-life of Tl-201 (about 73 hours) forced clinicians to use small administered doses, which nevertheless produced comparatively large radiation exposure and tissue effects. The resulting poor image quality prompted a search for isotopes that would produce better results.
By the late 1980s, two compounds containing technetium-99m had been introduced: teboroxime and sestamibi. The use of Tc-99m allowed higher administered doses because of its much shorter physical half-life (about 6 hours). Higher doses meant more decays, more scintillation events and more information for the nuclear cameras to capture, yielding better images for the clinician to interpret.
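The dose argument is simple decay arithmetic: with a shorter physical half-life, the administered activity is spent during the imaging window rather than lingering in tissue for days. A rough sketch comparing the two isotopes' published half-lives follows; the 24-hour time point is chosen only for illustration.

```python
import math

def remaining_fraction(t_hours, half_life_hours):
    """Fraction of initial activity remaining after t hours of decay."""
    return math.exp(-math.log(2) * t_hours / half_life_hours)

# Approximate physical half-lives.
TC99M_HALF_LIFE = 6.0   # hours
TL201_HALF_LIFE = 73.0  # hours

# Fraction of activity still present 24 hours after injection: Tc-99m has
# largely decayed away, while Tl-201 continues irradiating tissue.
print(f"Tc-99m after 24 h: {remaining_fraction(24, TC99M_HALF_LIFE):.1%}")  # ~6.2%
print(f"Tl-201 after 24 h: {remaining_fraction(24, TL201_HALF_LIFE):.1%}")  # ~79.6%
```

For a given cumulative tissue exposure, the faster decay therefore permits a larger administered activity, and more decays during acquisition means more counts for the camera.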
Major indications
Diagnosis of CAD and various cardiac abnormalities.
Identifying the location and degree of CAD in patients with a history of the disease.
Prognosis for patients at risk of a myocardial or coronary event.
Assessment of viable myocardium in a particular coronary artery territory following heart attack, to justify revascularization.
Evaluation of the heart following revascularization intervention.
From 1993 to 2001, myocardial perfusion scans in the US increased by more than 6% per year with "no justification". Myocardial perfusion imaging scans are "powerful predictors of future clinical events", and in theory may identify patients for whom aggressive therapies should improve outcome; but this is "only a hypothesis, not a proof". However, several trials have indicated that the test's high sensitivity, regardless of tracer, outweighs any potential detrimental effect of the ionising radiation. In the UK, NICE guidance recommends myocardial perfusion scans following myocardial infarction or reperfusion interventions. The prognostic power of a myocardial perfusion scan is excellent and has been well tested; this is "perhaps the area of nuclear cardiology where the evidence is most strong". Many radionuclides used for myocardial perfusion imaging, including rubidium-82, technetium-99m and thallium-201, have similar typical effective doses. The positron emission tomography tracer nitrogen-13 ammonia, though less widely available, may offer significantly reduced doses. Stress-only protocols may also prove effective at reducing costs and patient exposure.