CHAPTER SIX: RADIATION AND HUMAN HEALTH
The Effects of Radiation
It is hardly necessary to emphasize that since the explosion of a nuclear weapon at Alamogordo, New Mexico, in July 1945 ionizing radiation has become the most serious threat to man’s survival. If nuclear weapons are further developed or increased in number, it is quite conceivable that a future war will virtually extinguish human life. Grave as this danger may be, ionizing radiation is associated with a number of subtle hazards that warrant equal public concern. “Today . . . sources of ionizing radiation are rapidly becoming more and more widely used,” observes a team of radiation specialists for the U. S. Public Health Service. “Each atomic reactor produces the radioactive equivalent of hundreds of pounds of radium. Radioactive
substances are used in increasing numbers of hospitals, industries, and research establishments. X-ray machines are encountered in tuberculosis surveys, in shoe stores, in physicians’ offices and in foundries. Potentially, ionizing radiation has come to affect the environment of every member of the community.”
Radioactive elements are among the most lethal toxicants in man’s environment. As little as four micrograms (one seven-millionth of an ounce) of radium is capable of producing cancer in man years after lodging in his bones. Not all radioactive elements are dangerous in the same way as radium. Some are distributed differently in the body and are not retained for long periods; others, although retained for years, emit different kinds of radiation. Strontium-90, for example, is rated about one tenth as lethal as radium, although it, too, lodges primarily in the bones. Attempts to compare the potency of the two elements in quantitative terms, however, can be somewhat misleading. Radium causes biological damage through the emission of alpha particles, whereas strontium-90 is hazardous to living things because it is an energetic emitter of beta particles. “Leukemia has been observed to occur in radium patients, though not as a prominent phenomenon,” observes W. F. Neuman, of the University of Rochester. “Leukemia is, however, a common finding in people who have suffered irradiation of the bone marrow or of the whole body. Since the beta rays from strontium-90 are more penetrating than the alpha rays from radium, a greater proportion of the rays from strontium-90 will penetrate into the marrow.” It is quite possible that, for individuals prone to leukemia, strontium-90 poses hazards in addition to those that would be expected on the basis of criteria derived from experiences with radium.
To gain a clear understanding of the various kinds of radiation and their biological effects, it will be useful to review briefly some elementary concepts of atomic physics. Radioactivity is produced by changes that occur within the nucleus of an atom. These changes either directly or indirectly involve two nuclear particles, the neutron and the proton. The number of neutrons (electrically neutral particles) and protons (positively charged particles) in an atomic nucleus determines the species to which the atom belongs. Carbon-12 and carbon-13, for example, have six and seven neutrons, respectively, and are called isotopes of the element carbon. Every carbon atom, however, has six protons in its nucleus and an equal number of external electrons (negatively charged particles), which can be thought of as “revolving” around the nucleus like planets in a miniature solar system. Similarly, all hydrogen atoms have one proton and one planetary electron, whereas hydrogen-2 and hydrogen-3 have one and two neutrons, respectively. When carbon and hydrogen combine to form hydrocarbons, the nuclear particles (protons and neutrons) of the two elements are not affected by the combination. The chemical reaction of carbon and hydrogen is determined by the arrangement of their planetary electrons. Each element can be separated from the other without any loss of its original identity and without producing ionizing radiation. On the other hand, if carbon were to gain or lose any protons, it would become an entirely different element. Gains or losses in the number of protons may occur when a neutron is converted into a proton, when a proton is converted into a neutron, and, of course, when a proton together with a neutron is emitted from the nucleus of an atom. These changes and the various energy adjustments that occur within the nucleus produce ionizing radiation.
The more familiar elements that make up the earth do not undergo nuclear changes spontaneously. Their nuclei are extremely stable. A few elements, however, such as uranium, radium, and polonium, have unstable nuclei. They emit particles and eventually break down into more stable elements. Radium, which is derived from the decay of uranium, disintegrates into radon by emitting two protons and two neutrons. Decay continues through a series of radioactive daughter products until the atom becomes lead. The two protons and two neutrons emitted by the radium atom remain together and constitute an alpha particle. Streams of these neutron-proton groups, or alpha “rays,” are emitted from radium nuclei at velocities of about 44,000 miles per second. In addition, radium emits gamma rays, and several daughter products of the element emit beta rays. Beta rays are streams of electrons and positrons (the positive counterparts of electrons); they travel at varying speeds, sometimes approaching the speed of light (186,000 miles per second). Gamma rays are physically the same as X rays, the difference in name arising from the difference in the sources of emission. (X radiation is produced when a fast-moving, free electron collides violently with an atomic nucleus and loses part of its surrounding electric field. Unlike gamma radiation, however, X radiation does not involve any energy adjustments within atomic nuclei.)
It takes 1,600 years for half a given quantity of radium to disintegrate into radon. Another 1,600 years must pass before the remaining quantity of radium is halved again. As this process goes on indefinitely, always leaving a remainder to be halved each 1,600 years, some radium will remain for an immensely long period. By the same token, by adding radioactive substances to his environment, man increases the amount that will remain indefinitely, whether the half-life of an element is 5,900 years (carbon-14) or 28 years (strontium-90). Although man can create radioactive elements such as strontium-90, there is nothing he can do to reduce their radioactivity. Radiation continues undisturbed, and is diminished only with the passage of time.
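The halving process described above is simple exponential decay. A brief numerical sketch, using only the half-lives quoted in this chapter, shows how much of a sample survives after a given interval:

```python
# Fraction of a radioactive sample remaining after t years,
# given its half-life: each half-life cuts the remainder in half.
def fraction_remaining(t_years, half_life_years):
    return 0.5 ** (t_years / half_life_years)

# Radium (half-life 1,600 years): half remains after 1,600 years,
# a quarter after 3,200 years -- a remainder always persists.
print(fraction_remaining(1600, 1600))   # 0.5
print(fraction_remaining(3200, 1600))   # 0.25

# Strontium-90 (half-life 28 years): after ten half-lives (280 years)
# roughly one part in a thousand of the original quantity is left.
print(fraction_remaining(280, 28))
```

However small the remainder becomes, it never reaches zero, which is the point of the passage: radioactivity added to the environment diminishes only with the passage of time.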
What effects do alpha, beta, and gamma rays produce in living tissues? Although the long-range effects of low-level radiation on animals and man are very much in dispute, there seems to be little doubt about the harmful changes radiation produces in cells. Radiation particles have been compared to bullets tearing into an organism, but the damage they cause depends upon the number of particles that reach the tissues. The particles interact with many of the atoms that make up our cells, either causing electrons to leave their orbits (ionization) or exciting combinations of atoms (molecules) to a point where they break apart. The resulting damage may be minor and easily repaired; on the other hand, it may be serious and permanent. Irradiated germ cells may one day give rise to harmful mutations, and irradiated body cells may take the first step toward the development of cancer. An individual exposed to large dosages of radiation may exhibit symptoms of “radiation sickness”: nausea, vomiting, loss of hair, progressive weakness, and hemorrhages. Severe irradiation of the entire body damages the blood-forming organs in man and higher animals, often claiming life within several weeks after exposure. If the whole body is exposed to very high levels of radiation, there will be widespread destruction of the intestinal tract and nerve cells. In such cases, death is certain to occur within a few days after exposure.
Quantities of radiation are usually expressed in terms of rads and roentgens. (These units are used in most of the works cited in this discussion. During the past few years, however, the rad has been steadily supplanted by the rem (roentgen equivalent man) in discussions of the effects of radiation on humans. The terms “rem” and “rad” signify identical values for exposure to beta and gamma radiation.)
A rad denotes the quantity of radiation an individual absorbs; a roentgen denotes the amount of radiation (ordinarily X radiation) to which he is exposed. For the purposes of this discussion, no significant distinction need be made between the two units. A single rad or roentgen can be thought of as giving rise to about 1,000 ionizations in a single cell. An individual normally receives a total dosage of 3 to 5 roentgens from natural background radiation during the first thirty years of his life. Background radiation comes from cosmic rays, radioactivity in rocks, and radioactive elements in food and water. It constitutes the radiation to which man has evidently adapted himself over long periods of biological evolution. Although background radiation will vary in amount from one locale to another, the absolute quantities involved are ordinarily very small. A man living at sea level usually absorbs about 30 millirads (30 thousandths of a rad) of cosmic radiation a year. A man living at 5,000 feet above sea level is likely to absorb about 70 millirads annually. An individual who moves from sea level to a height of 5,000 feet will thus raise his absorption of cosmic radiation by little more than a rad during a thirty-year period. A man who remains behind at sea level may well acquire more than that amount in certain parts of his body, by having a tooth X-rayed, for example. In fact, a series of dental X rays will supply his jaw and neck with a larger amount of radiation than they are likely to acquire from natural background sources in the greater part of a normal life span.
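The cosmic-ray figures above can be checked with a little arithmetic; this sketch uses only the annual doses quoted in the text:

```python
# Annual cosmic-ray dose in millirads, as quoted in the text.
SEA_LEVEL_MRAD_PER_YEAR = 30   # at sea level
ALTITUDE_MRAD_PER_YEAR = 70    # at 5,000 feet above sea level

years = 30
# Extra dose absorbed over thirty years by moving to 5,000 feet.
extra_mrad = (ALTITUDE_MRAD_PER_YEAR - SEA_LEVEL_MRAD_PER_YEAR) * years
print(extra_mrad / 1000, "rad")  # 1.2 rad: "little more than a rad"
```

The thirty-year difference works out to 1,200 millirads, confirming the text's figure of "little more than a rad" over a thirty-year period.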
When X rays are used for diagnostic or therapeutic purposes, the good they do outweighs the harm. Diagnostic X rays are usually directed at limited areas of the body, a form of exposure that is far less hazardous than irradiation of the whole body. However, despite recent claims to the contrary, there is no conclusive evidence that small doses of radiation are harmless. Although some research workers have reported that very low levels of radiation (5 roentgens or less per week) increase longevity in animals, others working with similar doses of radiation have found that the life span is shortened. In the cases in which increases in longevity were reported, the last animals to die were generally non-irradiated control animals. If radiation had any “beneficial” effect, it was that it seemed to reduce the death rate of irradiated animals in early age groups, but the maximum life span of the animals was shortened. (In the opinion of Alex Comfort, one of England’s foremost authorities on the aging process in animals and man, the increase in longevity produced by low-level radiation “is due partly to the slight physiological stimulus arising from ‘stress’ and partly to the fact that radiation at these low levels does more harm to parasites of various kinds than to the animals infected with them. In any case, the life extension is seen only in animal colonies where the survival curve is suboptimal. Among animals that are well cared for, radiation, if it has any visible effect, shortens life.”)
There seems to be very little ambiguity, however, about the genetic effects of radiation. All the facts at hand indicate that any increase in the amount of radiation received by a large population produces a rise in the occurrence of harmful mutations. Nor do we have to rely exclusively on experiments with animals to acquire statistically significant evidence to support this conclusion. A study made by John T. Gentry and his colleagues in the New York State Department of Health suggests that areas with relatively high concentrations of radioactive rock are correlated with a high incidence of congenitally malformed children, many, perhaps most, of whose defects are believed to be caused by defective genes. By working primarily with statistics compiled from a million and a quarter birth certificates in New York State (exclusive of New York City), Gentry found that the over-all incidence of congenital malformation amounted to 12.9 per 1000 live births in areas of the state which seemed to have the least amount of geologic radioactivity. The incidence of congenital malformation for areas which seemed to have the highest amount of geologic radioactivity was 17.5, a difference of nearly 40 per cent. New York State, it should be noted, is not particularly radioactive; field measurements of background radiation were found to lie mainly in the range of 2.1 to 3.2 roentgens per thirty-year period. Additional evidence that the rate of malformation is influenced by the radioactivity of the geologic environment appears in a preliminary nation-wide study by Jack Kratchman and Douglas Grahn, of the U. S. Atomic Energy Commission. By grouping county and state mortality data according to the geologic provinces of the United States, Kratchman and Grahn found that “mortality incidence from malformation may be higher in those geologic provinces . . . that contain major uranium ore deposits, uraniferous waters or helium concentrations.”
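The "nearly 40 per cent" figure is simply the relative increase between the two malformation rates; a quick check using the rates quoted above:

```python
# Congenital malformations per 1,000 live births, from the Gentry study.
low = 12.9    # areas with the least geologic radioactivity
high = 17.5   # areas with the highest geologic radioactivity

# Relative increase of the high-radioactivity rate over the low one.
increase_pct = (high - low) / low * 100
print(round(increase_pct, 1))  # about 35.7 per cent
```

The exact figure is about 36 per cent, which the text rounds to "nearly 40 per cent."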
These studies are a reminder that man is probably far more sensitive to radiation than has generally been supposed. Although the living matter that surrounds the nucleus of a cell can often absorb a great deal of radiation without being significantly harmed, the nucleus itself is highly vulnerable. Theodore T. Puck and his coworkers at the University of Colorado Medical Center have recently shown that visible damage can be inflicted on chromosomes (the strands in the nucleus of the cell that carry the genes) with as little as 20 to 25 roentgens. The genetic apparatus of an ordinary body cell “thus shows extraordinary sensitivity to radiation,” Puck observes. “Calculation indicates that the ionization of not more than a few hundred atoms within or around the space occupied by the chromosome may be sufficient to produce a visible break. It seems probable that an even greater number of lesions may occur and remain invisible, either because they are submicroscopic or because they become resealed before they are expanded.”
If radiation is increased to 50 roentgens, many cells soon lose the ability to reproduce, although they do not necessarily die. The nuclear material of the cell may be damaged beyond recovery, but the cytoplasm surrounding the nucleus survives and performs the essential functions of life. Instead of reproducing, the cells continue to absorb nutriment and grow until they reach enormous size. They become so-called “giant cells,” seven to ten times larger than a cell that has not been irradiated. Irradiation of the whole body with 400 to 500 roentgens is fatal to half the human beings so exposed, apparently because over 99 per cent of the body’s reproducing cells lose the ability to multiply normally. Individuals who receive lethal dosages of radiation may live for weeks after exposure, but the body’s inability to form a sufficient number of red and white blood cells eventually leads to death from anemia or infection.
Children are probably more sensitive to radiation than adults, and the fetus appears to be more sensitive than the child. Radiation not only has a selective effect upon different parts of a cell; it does the greatest damage to those cells that have a high rate of reproduction. It seems to be most harmful during those stages of life at which metabolism is at its peak, notably the years of growth. Sensitivity varies markedly from individual to individual and from one organ of the body to another. A large whole-body dose of radiation from which some individuals recover with a self-limiting case of “radiation sickness” may lead in others to permanent damage and death. If concentrated on the hand, the same dose may produce nothing more than a radiation burn. Directed at the abdominal area, however, it will often cause serious damage and marked physiological reactions.
Finally, there are different kinds of radiation and various emitters of radiation. The effects produced by all forms of radiation are ordinarily reducible to the amount of ionization they produce. But each form of radiation has distinctive features of its own. The alpha particle is the “heavy artillery shell” of ionizing radiation. As it is made up of two protons joined with two neutrons (actually a nucleus of the element helium), an alpha particle has about 7,300 times the mass of the lighter, faster beta particle. The penetrating power of alpha rays is very limited; they can be stopped by a thin sheet of paper. But although alpha particles do not penetrate very far into living tissue, they score a large number of “direct hits” on atoms. An irradiated cell is damaged more severely by alpha particles than by other forms of radiation, although fewer cells are irradiated by an alpha emitter. If the alpha particle had the penetrating power possessed by beta and gamma rays, it would be extremely destructive to living things.
Beta particles travel about 100 times farther than alpha particles. Whereas alpha particles seldom pass beyond the outer, dead layer of the skin, the free, fast-moving electrons and positrons that constitute beta radiation penetrate for about a quarter of an inch into living matter. Gamma rays and X rays will pass readily through a large organism; they reach the innermost recesses of the body and injure highly sensitive tissues, but they produce only about one twentieth of the damage inflicted on cells by alpha particles.
As gamma rays are physically the same as X rays, we might say that most of the effects produced in man by gamma radiation are due to the widespread use of X-ray equipment. Alpha and primary beta radiations are carried to man’s vital organs by radioactive elements. (The word “primary” is used because beta radiation induces gamma radiation in much the same way that a stream of electrons striking a metal target produces X rays. Gamma radiation, in turn, by imparting energy to the planetary electrons of atoms, causes them to leave their orbits. The free electrons behave like a beta ray. Thus, a short-lived series of radiations, alternating between beta and gamma rays, could extend the range of biological damage to tissues that might not ordinarily be reached by the primary beta ray.) X-ray equipment can be turned on and off; thus, exposure to X radiation need not be continuous or uncontrolled. But radioactive isotopes cannot be “turned off.” Once they combine with the skeleton and the soft tissues of the body, they are no longer within human control. Like the nutrients that sustain life and growth, many radioactive elements become part of the organism. The body tends to use radium and strontium-90, for example, the same way that it employs calcium, although it shows a preference for the latter. Radium and strontium-90 migrate to the mineral part of the skeleton, where the rate of metabolic activity is relatively low, and they bombard bone matter and highly sensitive bone marrow with alpha and beta particles for a long period of time. The body does not discriminate between radioactive and stable carbon isotopes. Carbon-14 appears in every kind of tissue, including germ cells, and irradiates the area in which it lodges with beta particles.
In the field of radiobiology, it is very imprudent to make ironclad generalizations that are likely to affect the well-being of large populations. The field is beset with so many uncertainties that scientific opinion can be expected to differ on nearly every issue but the “need for further research.” The public, in turn, would probably acknowledge this need with tranquillity if the problems of radiation were confined to the laboratory. But they are not. X-ray machines are now part of the equipment of every physician and dentist; the devices have become indispensable in medical practice. Radioactive isotopes are used widely in laboratories, hospitals, and industry. Nuclear weapons tests have discharged radioactive debris into the soil, water, and atmosphere, and nuclear reactors are annually producing huge quantities of highly radioactive wastes. The experimental animals needed for further research currently include all of humanity, and the modern radiation laboratory encompasses the entire earth. But many serious questions have been left unanswered. Are we using radiation with the caution and respect it deserves? To what extent have we contaminated the environment? In what direction are we going? Unless these questions are answered frankly and existing abuses corrected, the data acquired by further research may well prove to be of academic importance.
The Problems of X Radiation
X rays were the first form of ionizing radiation to be deliberately produced by man, and in the United States and western Europe they now constitute the radiation to which he is most heavily exposed. Many people of all ages and all walks of life are given routine X rays during periodic medical and dental examinations. Moreover, exposure to X rays has been increasing steadily over the past three decades. According to data gathered by the National Advisory Committee on Radiation, the annual X-ray dosage received by the average American increased 900 per cent between 1925 and 1955. The radiation he receives from X rays is 135 per cent greater than the radiation he acquires from natural sources. “The continued upward trend exhibited by X-ray data,” observes the committee, “suggests the likelihood that the current exposure of the population from X-ray apparatus may increase still further unless appropriate radiation control measures are systematically applied.”
Systematic control of X radiation is long overdue. Man-induced radiation has been in use since 1896, the year Roentgen’s discovery of X rays attracted world-wide attention, and the effective application of these rays to medical problems antedates World War I. Surprisingly, a good working knowledge of the hazards of X radiation and valuable recommendations for the proper use of X-ray equipment were available shortly after Roentgen’s work became well known. Elihu Thomson, after experimenting with the new rays on his little finger, reported a great deal of the data needed to enable radiologists to use X-ray equipment with a reasonable amount of safety. His suggestions were largely ignored. For many years X-ray tubes were employed with little or no shielding, while physicians, in blissful disregard of Thomson’s recommendations, often exposed themselves as well as their patients to heavy doses of radiation. By the early 1920’s, scores of radiologists had needlessly sacrificed their lives and, in all probability, inflicted a great deal of damage on many of their unsuspecting patients.
To make matters worse, X-ray equipment was rapidly debased into a cosmetic agent and, finally, into a sales promotion device. It was found that X rays could cause a loss of hair (epilation), an effect that suggested lucrative possibilities. By the 1920’s many physicians, beauticians, and self-appointed “epilation specialists” had begun to treat women with radiation for the removal of “superfluous hair.” One New York physician, Dr. Albert C. Geyser, developed a “harmless” method of hair removal that involved cumulative dosages of at least 500 roentgens over a twelve-week period of radiation treatment. The method, named the “Tricho System,” was very successful, and beauticians trained by Geyser’s “Tricho Institute” began operating in many parts of the United States and Canada. It soon became evident, however, that women treated according to the “Tricho System” lost substantially more than unwanted hair. Many individuals acquired radiodermatitis (skin inflammation), severe radiation burns, festering skin ulcers, and, in time, cancer. The “Tricho” story is one of the more tragic episodes in the history of radiation. It is believed that the victims of Geyser’s system numbered in the thousands; the exact number of those who suffered latent injury and premature death will never be known.
Although radiation is no longer employed in the American beauty parlor, the use of X-ray equipment to fit shoes still lingers in a number of communities. The equipment is used mainly on the feet of children. As of 1960, the use of the shoe-fitting fluoroscope had been banned in twenty-nine states. Some of the other states regulate the use of the machine, but in a few states there are no restrictions at all. A number of local surveys cited by Schubert and Lapp have shown that the machines are often defective, giving high doses of radiation to both the child and the salesman. The Michigan Department of Health, for example, found shoe-fitting machines that emitted as many as 65 roentgens (r) a minute. A survey in Boston showed that irradiation of the foot ranged from 0.5 to 5.8 r a second. (The use of shoe-fitting fluoroscopes has been banned in Boston by state law and is regulated in Michigan.) “For a 22-second exposure, which is commonly used, the feet receive from 10 to 116 r!” Schubert and Lapp write. “Remember, too, that one child may have his feet examined many times while trying on different shoes. Similar dosage measurements have been reported by the United States Public Health Service, which states that the average dosages to the children’s feet are between 7 r and 14 r per exposure.” The amount of scattered radiation that reaches the child’s pelvic region and gonads may run as high as 0.2 roentgens for a twenty-second exposure.
Society has found it very difficult to accept the fact that a valuable device can become extremely dangerous if it is used improperly. Schubert and Lapp observe that a major reason why Geyser was not stopped until his “system” had injured a large number of women is that the medical community generally accepted radiation as a form of treatment for minor skin disorders. To this day, many physicians are likely to underestimate the amount of latent damage caused by repeated exposure to X radiation. In some cases, physicians are inclined to use X-ray equipment as freely as they use the stethoscope, and often it is the cost of an X-ray picture rather than the risk of irradiation that keeps the two devices from occupying places of equal importance in routine medical examinations. The physician often feels that there is no danger involved in taking diagnostic X rays; the word “diagnostic” seems to impart benign qualities to radiation. But the need for an X ray does not in any way diminish the effect of exposure; it merely provides a scale on which the hazards of X radiation should be weighed against the hazards of an incomplete and faulty diagnosis.
This requires careful evaluation. A middle-aged man who refuses to be exposed to the small amount of radiation involved in an annual chest X ray carries his fear of radiation to extremes. The risk involved in having an X ray taken is much smaller than the risk involved in permitting a serious lung disease to remain undiagnosed. Schubert and Lapp estimate that sixty chest X rays taken on a 14-by-17-inch plate without accompanying fluoroscopies will give an individual a cumulative dose of 3 roentgens. With newly developed fast films, exposure is likely to be reduced appreciably. Similarly, a man with marked and prolonged gastrointestinal distress who fails to respond to treatment for benign stomach disorders would be foolhardy if he refused to have X rays taken of his abdominal region. The danger of serious disease would outweigh by far the amount of harm caused by a single series of X rays.
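Schubert and Lapp's estimate implies a per-film dose that is easy to work out; a sketch assuming only their figures:

```python
# Schubert and Lapp's estimate: sixty chest X rays on 14-by-17-inch
# plates, without fluoroscopies, give a cumulative dose of 3 roentgens.
cumulative_dose_r = 3.0
films = 60

per_film_r = cumulative_dose_r / films
print(per_film_r)  # 0.05 r per film, i.e. 50 milliroentgens
```

At a twentieth of a roentgen per film, a single chest X ray is small compared with the 3 to 5 roentgens of natural background radiation accumulated over thirty years, which is the basis of the text's judgment that refusing an annual chest film carries fear of radiation to extremes.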
But individuals without medical complaints who are exposed periodically to a large amount of diagnostic radiation might do well to entertain some second thoughts. For example, many large American corporations have established the practice of sending their executives to clinics for annual medical examinations. These examinations take several days to perform and normally include a program of X rays and fluoroscopies of important organs of the body. Schubert and Lapp found that an executive can expect to receive 35 roentgens with each check-up; frequently the dosages reach 50 roentgens. “We grant that the executive is not exposed to total body radiation, which is more serious than localized radiation,” they write. “Nevertheless, the most vital organs are exposed and at the very least one may expect a shortening of life span of the exposed individual, especially if the executive receives the x-ray bombardment year after year. We do not deny the valuable data which doctors may gain in the course of x-ray examination, which may lengthen the life span, but we do regard the annual irradiation of 50 r per executive with some degree of apprehension for the individual’s future welfare.”
A number of radiobiologists believe that certain X rays should not be undertaken unless they are absolutely necessary for the well-being of the patient. A case in point is the pelvic X ray (pelvimetry) of pregnant women. “X-ray pelvimetries have been a relatively common practice for many years, and still is as far as is known,” Schubert told a congressional committee holding hearings on radiation in 1959. An estimated 440,000 American women and 100,000 English women were given pelvic X rays annually during the late 1950’s. The practice had always aroused a certain amount of concern because of the harmful genetic effects it could produce in later generations, but recent studies show that the hazards may be more immediate. In an extensive survey of deaths from cancer among English and Welsh children, Alice Stewart and her co-workers at Oxford University found a statistical relationship between fetal irradiation and the incidence of cancer among children. The survey, perhaps the most extensive of its kind, covered most of the children who had died of leukemia and other cancers between 1953 and 1955. The data gathered by the investigators show that, among children under ten, the chances of dying from cancer are twice as great for those who were irradiated during the fetal stage.
Moderate enlargement of the thymus gland, a condition for which a large number of children have received radiotherapy, is now regarded as harmless by many medical authorities. Although thymic enlargement is no longer treated with radiation in England, the practice still persists in the United States. C. Lenore Simpson and her co-workers have carefully surveyed about 2000 such cases in New York State, mainly in Rochester and Buffalo. By comparing the incidence of cancer in treated cases with that in children in the general population, they found nearly a sevenfold increase in cancer among children who had been exposed to high dosages of radiation for thymic enlargement. The number of leukemia cases found by Simpson was nearly ten times higher than the number that would normally have been expected.
It remains to be seen whether most radiobiologists will decide that pelvimetries and thymic irradiation produce cancer in children. The evidence is not altogether clear-cut. Several later surveys of a more limited nature have reported findings that conflict with the results obtained by Stewart and Simpson, whereas other surveys support them. The results seem to vary with the methods used and the communities studied. The Stewart and Simpson surveys, however, have unearthed statistical probabilities that are far too high to be dismissed lightly. Although these probabilities can be expected to vary with the techniques of the radiologists in different communities and medical institutions, there seems to be little reason to doubt the validity of the findings of Stewart and Simpson.
Physicians have observed that many children with thyroid cancer received radiation around the head and neck during infancy and early childhood. George Crile, of the Cleveland Clinic Foundation, for example, reports that in 18 cases treated by his clinic, the parents of 14 were asked whether their children had been exposed to X rays in the general area of the thyroid gland. Only 3 had not received radiation. The remaining 11 had histories of radiotherapy for enlargement of the thymus gland and lymph nodes, for eczema, and for other disorders. “In the 15 years between 1924 and 1939,” Crile notes, “at a time when many more operations on the thyroid were being done at the Cleveland Clinic than are being done today, no children with cancers of the thyroid were seen. Only three were seen between 1939 and 1950, and between 1950 and 1958 there were 15 cases. The question immediately arises as to whether the increasing incidence of cancer of the thyroid in children is the result of an increase in the use of radiation therapy in infancy and early childhood.” Crile notes that factors other than radiation may account for the rise in thyroid cancer among children but adds that “the increasing incidence of cancer of the thyroid in recent years suggests that an environmental factor such as radiation is involved.”
Many individuals are still exposed to large dosages of X radiation for what are essentially cosmetic purposes. Localized eczemas, warts, and childhood birthmarks that are likely to disappear spontaneously after a few years have in many cases been heavily irradiated. This practice is declining owing to a greater awareness of radiation hazards, but it is still found too often to be overlooked.
Radiation, to be sure, is still the only effective method for treating certain irritating or disfiguring skin disorders, and the patient may be the first to demand radiotherapy in full awareness of the risk involved. Radiotherapy also reduces the severe pain that accompanies ankylosing spondylitis, an arthritic disease of the spine that afflicts thousands of men and often leads to a useless or impaired life. In such cases, where irritation or pain may be severe, it is unlikely that the hazards of radiation will deter a patient from accepting radiotherapy. He will probably take the risk, especially if the risk is relatively small. (In England the incidence of leukemia among men who have been treated with X radiation for ankylosing spondylitis “is about one third of one per cent; yet calculations based on the national death rates . . . show that even this low incidence is about ten times greater than would have been expected in the absence of irradiation.”)
The element of risk cannot be avoided, but the amount of risk can be reduced appreciably. X-ray equipment can be used with moderation and with good sense. Doctors should be made thoroughly aware of the hazards involved, and they should be taught the most advanced methods of reducing exposure. The physician should limit the use of his equipment to those aspects of radiology with which he is thoroughly familiar. A general practitioner is not a radiologist. Where complex techniques are required to diagnose a disease or treat a patient, the patient should have the benefit of the special training and experience that come from long service in the field of radiology. There is also an area which both layman and physician can enter with equal authority: that of radiation safety. Careful shielding from scattered X rays should always be provided for sensitive areas of the body, such as the gonads, neck, and abdominal region, particularly when children and young people are being irradiated. A patient has a right to insist upon protection whenever it is feasible, and in this he has the emphatic support of the most knowledgeable authorities in the field of radiology.
The use of X-ray equipment should be carefully regulated. These devices do not belong in the hands of quacks and shoe salesmen. Technicians should be licensed personnel who have given substantial evidence of their qualifications to operate X-ray equipment. Their work requires careful training that cannot be picked up through irregular, offhand instruction. There should be compulsory periodic inspections of X-ray equipment by competent agencies. Concerted efforts must be made to bring the latest advances in radiology into physicians’ offices and hospitals. Research has produced faster films, electronic devices to increase the brightness of fluoroscopic screens (with concomitant reductions in X-ray dosage), image intensifiers, and improved filters that eliminate diagnostically useless long-wave radiation. Some of these improvements are too costly for the ordinary physician; hence the need to make use of the services of a well-equipped radiologist or hospital when a program of irradiation is required. It is the responsibility of the community to see that outdated X-ray equipment is scrapped and that no one is exposed to defective and potentially harmful machines.
Unfortunately, the available evidence suggests that, in most cases, both the machines and the physicians’ techniques are unsatisfactory. A recent two-year survey of diagnostic X-ray equipment in New York City, for example, showed that 92 per cent of 3623 machines inspected by the Board of Health either were not being used properly or were defective. The survey disclosed that X-ray beams were very broad, needlessly irradiating parts of the body that were not under study. The majority of physicians who were not radiologists were unfamiliar with the safety recommendations of the National Committee on Radiation Protection. The inspectors did not find a single physician who had voluntarily followed earlier recommendations to switch from outmoded to new equipment. Although the survey covered only 35 per cent of the machines employed in the city, it included the most frequently used X-ray machines, notably those in hospitals and in the offices of radiologists.
This survey, it should be emphasized, took place in 1959 and 1960, not a half century ago, when research on radiation was still in its infancy. The survey followed a period of widespread public discussion on the hazards of radiation and nuclear fallout. It is now clear that the situation is much worse than had generally been supposed. Exposure to radiation is occurring on a scale that has no precedent in man’s natural history. Millions of people in all stages of life, from the fetal to the senile, are being irradiated every year. Clearly the problem of radiation control has reached serious proportions. From the various reports “on the influence of ionizing radiation on biological systems,” observes the National Advisory Committee on Radiation, “. . . it is evident that serious health problems may be created by undue exposure and that every practical means should be adopted to limit such exposure both to the individual and to the population at large.”
X-ray equipment, we noted earlier, can be turned on and off, but the radioactive wastes that enter man’s environment through nuclear weapons tests and the activity of nuclear reactors are essentially beyond human control. They contaminate air, water, and food, and they irradiate everyone, irrespective of age or health. Radioactive contaminants also create problems not encountered with conventional pollutants. Ordinary contaminants usually lose their toxic properties by undergoing chemical change, but there is no loss of radioactivity involved in the chemical reactions of radio-isotopes. When radiocarbon combines with oxygen to form carbon dioxide, the carbon in the compound continues to emit beta particles. The same is true for chemical compounds formed by strontium-90. Radioactivity persists in all radio-isotopes until unstable atoms decay into stable ones.
Until recently, the layman was given a highly misleading picture of the hazards created by nuclear weapons tests. This picture was largely created by the Atomic Energy Commission, the official agency that had been made chiefly responsible for furnishing the public with information in the field of nuclear energy. For many years the A.E.C. consistently minimized the danger posed by radioactive fallout produced by nuclear weapons tests. For example, it completely ignored the extent to which food had been contaminated with strontium-90 until the problem was raised by scientists who were critical of the agency’s public information policies. “In the 13th Semiannual Report of the AEC, published in 1953,” notes Barry Commoner, of Washington University, “the AEC stated that the only possible hazard to humans from strontium-90 would arise from ‘the ingestion of bone splinters which might be intermingled with muscle tissue during butchering and cutting of the meat.’ No mention of milk was made,” or, for that matter, of vegetables and cereals. Spokesmen for the A.E.C. predicted that fallout would be uniformly distributed over the earth, so that no areas need fear concentrations of debris from nuclear weapons tests. The public was assured that the greater part of the debris sent into the stratosphere would remain aloft for a period of five to ten years. As fallout occurs very slowly, it was said, the radioactivity of short-lived radio-isotopes would be almost entirely dissipated in the stratosphere.
Actually, the radioactive debris that soars into the stratosphere stays there, on an average, less than five years. According to General Herbert B. Loper, of the Department of Defense, half the stratospheric debris produced by a nuclear explosion returns to the earth within two years. Fallout occurs three and one half times faster than Willard F. Libby, former commissioner of the A.E.C., had estimated. A model of stratospheric air circulation developed by A. W. Brewer and G. M. B. Dobson indicates that the heaviest fallout in the Northern Hemisphere occurs in the temperate zone, reaching a peak between 40 and 50 degrees north latitude, or roughly between Madrid and London in Europe and between New York City and Winnipeg in North America. Measurements made during the 1958-61 nuclear weapons test moratorium indicate that the hazard from fallout in these latitudes is substantially greater than the world-wide average. (Let us grant that the A.E.C. had made an honest error, but how did the agency handle the facts when it became evident from classified data that its predictions were wrong? A chronological account prepared by the Joint Committee on Atomic Energy indicates that a restudy by the A.E.C., released early in 1959, “makes no mention of [the] Defense Department study” and “maintains [the] position of a residence time of 5 to 10 years, selecting 6 years as the mean residence time of stratospheric fallout. Results of another AEC analysis, Project Ash Can, which indicated a residence time of 3 years, was discounted as being doubtful. No mention was made that the Department of Defense conclusions of residence half life of 2 years tended to support results of Project Ash Can.”)
The rapidity with which radioactive debris descends to the earth places the danger presented by short-lived, supposedly harmless radioactive elements in a new perspective. Cesium-144 and strontium-89 have half-lives of only 290 and 56 days, respectively, but nuclear explosions produce these radioactive elements in such relatively large quantities that, if fallout is rapid, they become a serious hazard to public health. Cesium-144, like long-lived cesium-137 (another component of fallout), is an emitter of beta rays. When taken into the body, both cesium isotopes are handled metabolically like potassium; they migrate to all the soft tissues, including the reproductive organs. Strontium-89 possesses the characteristics of strontium-90; it, too, emits beta rays and tends to lodge in bone matter. Although a short-lived bone seeker like strontium-89 might seem to be relatively harmless, it should not be underestimated as a hazard to public health. “Since strontium-89 is produced more abundantly in fission than strontium-90 . . . ,” the Special Subcommittee on Radiation of the Joint Committee on Atomic Energy reported in 1959, “it is possible that comparable doses to the body from the two materials could occur.” The subcommittee added that “it would require 100 times more initial activity of strontium-89, whose half life is 56 days, to deliver the same dose to tissue that would be created by 1 unit of strontium-90. It is considered significant that transient levels of strontium-89 with approximately this ratio to strontium-90 have been observed in milk.”
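The half-life comparison above can be sketched crudely in a few lines. The calculation below assumes the modern value of about 28.8 years for the strontium-90 half-life, which the text does not state; it counts only cumulative decays and ignores beta energy per decay and biological retention, which is presumably why it comes out larger than the subcommittee's round factor of 100.

```python
import math

# Cumulative decays from an initial activity A0, integrated over all time,
# equal A0 times the mean life (half-life / ln 2).  So the initial-activity
# ratio needed for equal cumulative decays is simply the inverse ratio of
# the half-lives.  (Assumed value: Sr-90 half-life of ~28.8 years.)
T_SR90_DAYS = 28.8 * 365.25   # strontium-90 half-life in days (assumption)
T_SR89_DAYS = 56.0            # strontium-89 half-life, 56 days (from the text)

def total_decays(initial_activity, half_life):
    """Total decays = A0 * mean life = A0 * half_life / ln 2."""
    return initial_activity * half_life / math.log(2)

ratio = total_decays(1.0, T_SR90_DAYS) / total_decays(1.0, T_SR89_DAYS)
print(round(ratio))  # decay-count ratio alone, ~188
```

The gap between this crude ratio and the subcommittee's figure of 100 reflects the physical and biological factors the sketch leaves out.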
For a few weeks after a nuclear explosion, the wind-borne debris in the lower part of the atmosphere may contain appreciable amounts of iodine-131. Iodine-131 has a half-life of eight days. At the 1959 hearings of the Special Subcommittee on Radiation, E. B. Lewis, of the California Institute of Technology, observed that “the radioiodines in fallout are a special hazard to infants and children. This hazard arises for a variety of reasons. Radioiodine is a significant fraction of the fresh fission products released by nuclear weapons explosions. Grazing cattle ingest and inhale the radioiodines in fallout and then concentrate it in their milk. Infants and children are expected to ingest more of the isotope than will adults since fresh cow’s milk is the principal source of fallout radioiodine in the human diet and young people obviously drink more fresh milk than do adults. As has long been known, iodine isotopes, natural and radioactive, concentrate in the thyroid gland. Moreover, for the same amount of radioiodine orally ingested, the infant thyroid receives some 15 to 20 times the dose that the adult thyroid receives. (Briefly, this is because more radioiodine is taken up by the infant than by the adult thyroid; as a result many more of the short-ranged iodine-131 beta rays will be generated in a gram of infant than in a gram of adult thyroid tissue.) Finally, in spite of its small size, the infant thyroid may be more susceptible than the adult thyroid to cancer induction by ionizing radiation.”
No one denies that radio-isotopes produce damage when they are deposited in the human body. Controversy tends to center around the “maximum permissible concentrations” (MPC’s) that have been established for the quantities of various radioactive elements that the human body can be allowed to accumulate. (In the United States, the term “maximum permissible concentration” is being superseded by “radiation protection guide” (RPG), and the job of formulating “acceptable” values of exposure to radiation has been placed in the hands of the newly formed Federal Radiation Council. These changes, however, do not affect the substance of the discussion that follows.) There is nothing safe about an MPC. An MPC constitutes the amount of risk an official agency is prepared to inflict upon certain individuals and the general population in carrying out a nuclear-energy program. The U. S. Naval Radiological Laboratory points out that any degree of exposure to ionizing radiation produces “biological effects.” “Since we don’t know that these effects can be completely recovered from,” observes the laboratory’s report to the Special Subcommittee on Radiation, “we have to fall back on an arbitrary decision about how much we will put up with; i.e., what is ‘acceptable’ or ‘permissible’: not a scientific finding, but an administrative decision.”
Many scientists take a grim view of the “administrative decisions” that have established the permissible levels of strontium-90 for the general population. The MPC for strontium-90 is measured in strontium units (S.U.), formerly called “sunshine units.” A single S.U. is one micromicrocurie (μμc) of strontium-90 per gram of calcium (a μμc is equal to one millionth of a millionth of a curie; a curie essentially represents the amount of radioactivity associated with one gram of radium). For a number of years, the maximum permissible concentration for strontium-90 was established by a definite although largely unofficial procedure. The MPC originated as a recommendation by the International Commission on Radiological Protection (I.C.R.P.), an advisory body made up of scientists from all parts of the world. The International Commission’s recommendation, in turn, was usually adopted by the National Committee on Radiation Protection (N.C.R.P.), the American affiliate of the I.C.R.P. Finally, the National Committee’s recommendation was generally adopted by government agencies.
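The definitions in the paragraph above can be turned into concrete numbers. The only figure assumed beyond the text is the standard value of 3.7 × 10¹⁰ disintegrations per second for one curie (the activity of roughly one gram of radium, as the text notes).

```python
# What one strontium unit means in raw decay counts.
CURIE_DPS = 3.7e10     # disintegrations per second in one curie (standard value)
MICROMICRO = 1e-12     # "micromicro": one millionth of a millionth

# 1 S.U. = 1 micromicrocurie of strontium-90 per gram of calcium.
one_su_dps = MICROMICRO * CURIE_DPS      # decays per second per gram of calcium
per_day = one_su_dps * 86_400            # decays per day per gram of calcium

print(one_su_dps)       # 0.037 decays per second per gram of calcium
print(round(per_day))   # about 3,197 decays per day per gram of calcium
```

So even a single strontium unit represents some three thousand beta emissions per day within each gram of skeletal calcium, which is why the dispute over whether the population limit should be 1,000, 100, or 67 S.U. is far from academic.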
Until 1955, the recommendation of the International Commission dealt almost exclusively with problems of occupational exposure to radiation and radioactive isotopes. The problem of formulating an MPC for the general population was left in the hands of the commission’s national affiliates and official agencies. This created a highly unsatisfactory situation. It made it possible for official agencies to grossly understate the hazards of nuclear weapons tests; they proceeded to evaluate all the dangers that fallout presented to the general population in terms of the large MPC established for workers in atomic plants. “The maximum permissible level of strontium-90 in the human skeleton, accepted by the International Commission on Radiological Protection, corresponds to 1000 micromicrocuries per gramme of calcium [1,000 S.U.],” noted the British Medical Research Council in the mid-1950’s. “But this is the maximum permissible level for adults in special occupations and is not suitable for application to the population as a whole or to children with their greater sensitivity to radiation and greater expectation of life.” To cope with this problem, the International Commission decided in 1955 that the prolonged exposure of a large population to radiation should not exceed one tenth of the maximum permissible levels adopted for occupational exposures. For all practical purposes, the commission’s recommendation meant that the MPC for strontium-90 established for the general population should be reduced from 1,000 to 100 S.U.
It was not until a great deal of controversy had developed over the established MPC for strontium-90 that the National Committee, followed with undisguised reluctance by the A.E.C., adopted the “one-tenth rule” recommended by the International Commission. It soon became evident that even this 90 per cent reduction was inadequate. Finally, the International Commission made a curious, perhaps contradictory decision: It raised the MPC for workers in atomic plants from 1,000 to 2,000 S.U. but recommended that the MPC for the general population be reduced to one thirtieth of the occupational level. Thus, in a circuitous, often confusing manner, the permissible level of strontium-90 for the general population has been reduced to 67 S.U., or one fifteenth of the MPC (1,000 S.U.) that was in effect in 1954.
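The chain of revisions described above reduces to a few lines of arithmetic, all taken directly from the figures in the text:

```python
# The successive MPC figures for strontium-90, as given in the text.
occupational_1954 = 1000                          # S.U., original occupational MPC
one_tenth_rule = occupational_1954 / 10           # 100 S.U., 1955 population level
occupational_revised = 2000                       # S.U., later occupational MPC
one_thirtieth_rule = occupational_revised / 30    # 66.7, rounded to 67 S.U.

print(one_tenth_rule)                  # 100.0
print(round(one_thirtieth_rule))       # 67
# The final population level relative to the 1954 figure:
print(round(occupational_1954 / one_thirtieth_rule))  # 15, i.e. one fifteenth
```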
The problem of establishing a “suitable” MPC, however, is still unsettled. Strontium-90 is not uniformly distributed in bone matter. The element, like radium, tends to form “hot spots,” some of which may exceed the average skeletal distribution many times over. To estimate the damage which skeletal concentrations of strontium-90 are likely to produce, a factor, N, should be used to increase any average result based on a uniform distribution of the element in the bone, especially in the case of adults, who form these “hot spots” more readily than children. Two Swedish investigators, A. Engstrom and R. Bjornerstedt, who have pioneered in research on this problem, emphasize that N is not constant. The factor may vary from 6 to 60, depending upon individual metabolism and variations in the amount of strontium-90 that appear in food, air, and water. An individual who acquired an average, presumably “safe,” skeletal burden of 65 S.U., for example, might well have minute “hot spots” of nearly 4,000 S.U., enough to increase appreciably his chances of developing cancer.
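The hot-spot correction works by simple multiplication, and the range quoted above can be checked directly:

```python
# Hot-spot factor N applied to an average skeletal burden (figures from the text).
average_burden_su = 65            # presumably "safe" average skeletal burden, S.U.

for n in (6, 60):                 # Engstrom and Bjornerstedt's range for N
    local = average_burden_su * n # local concentration in a "hot spot"
    print(n, local)               # 6 -> 390 S.U.; 60 -> 3900 S.U.
```

At the upper end of the range, N = 60 yields 3,900 S.U., the "nearly 4,000 S.U." figure in the text, some 58 times the 67 S.U. population limit.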
“Meanwhile, what is the course of wisdom?” asks W. O. Caster, of the University of Minnesota. “Some claim that where there is honest doubt, public safety demands that the safety standards be adjusted to cover the worst possible contingency. But if one couples Dr. Engstrom’s estimate of 100 strontium units as the maximum permissible level for radiation workers with the International Commission’s suggestion that the permissible level for a population should be one thirtieth the occupational level, it would appear that the population limit should be only three strontium units. Some children have already passed this mark. The official agencies point out that, in the absence of proof that such a level is in any way deleterious, it would be the height of irresponsibility to raise a public alarm.” The proof required by official agencies, however, might not be forthcoming for ten or fifteen years.
Whatever the analytic method employed, there is no doubt that concentrations of strontium-90 in human bones have been rising steadily over the past decade. The principal source of American data on the amount of radio-strontium entering the human body is an A.E.C.-financed research project at the Lamont Geological Observatory of Columbia University under the direction of J. Laurence Kulp. Thus far, Kulp’s research group has issued four reports, covering the years 1954-9. The data show that for the “Western culture” area (Europe and the United States), situated between 30 and 70 degrees north latitude, the average amount of skeletal strontium-90 in adults increased from 0.07 S.U. in 1954 to 0.31 S.U. in 1959. The amount of radio-strontium in the bones of children up to four years of age rose from 0.5 S.U. in 1955 to 2.3 S.U. in 1959, seven times the amount in adults. Although the body discriminates in favor of dietary calcium against strontium, the discriminatory factor varies with age. As adults no longer undergo skeletal growth, they add only about one fourth of the strontium they ingest to their bones; children, however, add about half.
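The figures in the paragraph above are internally consistent, as a quick check shows; all values below come from the text.

```python
# Skeletal strontium-90 burdens reported by Kulp's group (S.U., from the text).
adult_1954, adult_1959 = 0.07, 0.31
child_1955, child_1959 = 0.5, 2.3

# Fraction of ingested strontium added to bone, by age (from the text).
retention = {"adult": 0.25, "child": 0.5}

print(round(child_1959 / adult_1959, 1))   # ~7.4: the "seven times" in the text
print(round(adult_1959 / adult_1954, 1))   # adult burden rose ~4.4-fold, 1954-9
print(round(child_1959 / child_1955, 1))   # child burden rose ~4.6-fold, 1955-9
```

The child-to-adult ratio is larger than the two-to-one difference in retention alone would suggest, reflecting the faster skeletal turnover of growing children.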
The value of averaging the amount of strontium-90 in bones at a given collection point, and then averaging the averages for an entire “culture area,” is highly questionable. Jack Schubert has pointed out that “when dealing with a potentially harmful agent involving the world’s population, it can be misleading to use average values of strontium-90 in the bones. It is important, especially in relation to the setting of permissible levels for a world population, that we have some basis for estimating the degree to which appreciable fractions of the population accumulate two and more times the average amount.” Using more suitable mathematical methods than those employed by Kulp and his colleagues, Schubert shows that 28.8 per cent of the skeletal specimens analyzed in the 1957-8 Kulp report “have three or more times the average (geometric mean) amount of strontium-90 in their bones. Over 4 per cent will have seven or more times the average. These values are appreciably greater than those of Kulp, who used incorrect averages and an incorrect distribution curve and hence underestimated the fraction of the population which would exceed the average values.” (For the benefit of readers who are familiar with statistical methods, it might be added that a proper evaluation of Kulp’s data requires a Poisson distribution, not the normal distribution employed by Kulp and his group.)
Submerged in the avalanche of averages, estimates, and statistical extrapolations are those communities that have been heavily irradiated by fallout debris. “Substantial areas in Nevada and Utah close to the bomb testing grounds had received ten roentgens of gamma radiation as early as 1955,” Caster observes. “Some 40 communities had had average doses between one and eight roentgen units. Such doses are substantially above allowable levels for the general population-or for professional personnel for that matter.” In May 1953, for example, a sudden change in weather caused the fallout “cloud” from a Nevada test series to move over well-traveled and inhabited areas near Yucca Flats. According to an account of the episode by Paul Jacobs, fairly high levels of air contamination were recorded in St. George (population 5,000) and Hurricane (population 1,375). (A similar incident occurred a year later, when a large group of Marshall Islanders received high doses of radiation as a result of a hydrogen-bomb test at the A.E.C.’s Eniwetok Proving Grounds in the Pacific Ocean. Exposure to fallout was sufficiently high to induce symptoms of “radiation sickness” in many of the natives. Had the wind veered thirty miles to the south, the entire population of the islands would have been exposed to lethal doses of radiation, about 1,000 roentgens.) Hundreds of motor vehicles traveling the highways near the test site required decontamination, while residents of endangered communities (when they could be reached) were warned to remain indoors. At St. George, reports Jacobs, a “high degree of contamination continued for sixteen days after the shots.”
Radioactive debris from the Nevada test explosions normally drifts in a northeasterly direction toward the grain and dairy states of the upper Midwest. It tends to settle out in areas where a large percentage of the American food supply is raised. As a result, high concentrations of strontium-90 have been found in food plants, especially cereals, grown in many parts of the Great Plains region. In 1957, for example, wheat samples collected from seven agricultural experiment stations in Minnesota registered concentrations of strontium-90 about 100 per cent higher than the current daily maximum permissible level (MPL) for humans over a lifetime. The largest number of S.U. found in one group of Minnesota samples was 200 per cent greater than the MPL.
Had a moratorium on nuclear weapons tests not been established in the autumn of 1958, the amount of strontium-90 in milk from certain collection points in the United States might well have approached or equaled the current MPL value of 33 μμc per liter. The A.E.C.’s “Hardtack” series of explosions in Nevada and high-yield Russian tests in Siberia, both conducted during 1958, resulted in heavy fallout. In 1959, the amount of strontium-90 in milk reached 14.6 μμc per liter in New York City (July 18), 18.2 in Cincinnati, Ohio (May 19), 22.6 in Spokane, Washington (May 5), and 22.8 in Atlanta, Georgia (May 5). Heavy concentrations of strontium-90 appeared in wheat and cereal products, particularly in whole-wheat bread. During February 1959, for example, the levels of strontium-90 in whole-wheat bread obtained by the A.E.C. from New York City retail markets generally exceeded the MPL value of 33 μμc per kilogram. (With the resumption of nuclear weapons tests, the amount of strontium-90 in food is expected to reach unprecedented levels. It is likely that the strontium-90 in the milk supply of a number of American communities will substantially exceed the MPL of 33 μμc per liter. Estimates based on the Federal Radiation Council “radiation protection guides” (RPG’s) place 1962 levels of strontium-90 in milk in the lower part of “Range II” (from 20 to 200 μμc of strontium-90 per liter). This range requires “active surveillance and routine control” by public health authorities. It is hardly necessary to emphasize that there will be a sharp increase in skeletal burdens of strontium-90, especially in children and young adults.)
Attempts have been made by highly responsible scientists to estimate the damage caused by the fallout debris produced to date. According to A. H. Sturtevant, professor of genetics at the California Institute of Technology, fallout will produce harmful genetic mutations in 4,000 people in the first generation and in 40,000 people in generations to come. Although these figures represent Sturtevant’s latest published estimates (1959), they are calculated on the basis of the radioactive debris produced up to 1956. Since then, the total amount of fallout has increased about 70 per cent. On the basis of more up-to-date material, James Crow, president of the Genetics Society of America, estimates that 20,000 mutations will be inherited by the next generation as a result of fallout (his estimate is based on the assumption that the present world population will produce a total of two billion children). The number would reach 200,000 if nuclear testing were resumed and continued for thirty years at the 1954-8 rate.
If E. B. Lewis’s “linear hypothesis” is correct, fallout may have contributed to the rising incidence of leukemia. According to Lewis’s theory, the chances of acquiring leukemia increase in direct (or linear) proportion to increased irradiation. The hypothesis “predicts that constant exposure to even one sunshine unit [strontium unit] would be capable of producing about 5 to 10 cases of leukemia annually in the United States population.” There seem to be very few scientists who deny that linearity exists in the middle dose range of radiation, but a great deal of controversy has arisen over Lewis’s hypothesis that linearity exists at all levels of radiation, including the very low dose range involved in diagnostic X-rays and fallout. Yet there is a substantial amount of data to support his theory. Aside from the basic evidence that Lewis presented when his views were first published (1957), the linear hypothesis has gained strong support from the work of Alice Stewart and others on cancer in children (see page 170). The pelvimetries studied by Stewart and her co-workers involved dosages of only 2 to 10 rads; nevertheless, this low level of whole-body radiation directed at the fetus was enough to double its chances of acquiring cancer.
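The defining feature of the linear hypothesis is that expected cases scale in strict proportion to dose, with no threshold below which the risk vanishes. A minimal sketch, using only the leukemia figure quoted above (5 to 10 cases per year per strontium unit of constant exposure in the United States population):

```python
# Linear (no-threshold) dose-response: cases are directly proportional to dose.
def linear_cases(dose_su, cases_per_su_per_year):
    """Expected annual cases under the linear hypothesis."""
    return dose_su * cases_per_su_per_year

# Lewis's quoted range: 5 to 10 U.S. leukemia cases per year per strontium unit.
for per_su in (5, 10):
    # Doubling the sustained burden doubles the expected cases; there is
    # no "safe" dose below which the predicted count drops to zero.
    print(per_su, linear_cases(1, per_su), linear_cases(2, per_su))
```

Under a threshold model, by contrast, doses below some cutoff would contribute nothing at all, which is precisely the point in dispute at the very low doses involved in fallout and diagnostic X rays.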
If Lewis’s hypothesis is also valid for other forms of radiation-induced cancer, fallout from nuclear weapons tests may be responsible for thousands of cases of bone cancer. By using a statistical approach developed by the British Atomic Scientists Association, it is possible to make a rough but reasonable estimate of the number of such cancers that would be produced by increases in the amount of strontium-90 in human bones. A long-term average of only one strontium unit in the bones of every individual in the world may well cause 12,500 cases of bone cancer; as little as two strontium units may result in 25,000 cases. The figure increases proportionately as strontium-90 becomes part of the human skeleton. It is sobering to note that if the world population were to acquire an average skeletal concentration