CHAPTER FOUR: The Problem of Chemicals in Food
The Consumer and Commercial Foods
With the rise of an urbanized society, the production of food becomes a complex industrial operation. In contrast with earlier times, when very few changes were made in the appearance or the constituents of food, much of the food consumed in the United States is highly processed. Allen B. Paul, of the Brookings Institution, and Lorenzo B. Mann, of the Farmer Cooperative Service, have summed up the change as follows: “Our grandparents used for baking about four-fifths of the flour milled in this country. They churned almost all the butter Americans ate. They killed and prepared much of the meat eaten. They made their own soups, sausage, salad dressing, clothing and countless other items. Such tasks, which a generation ago were part of farm and home life, have been taken over by commercial factories, 85,000 of them.” Of the 85,000 factories cited by Paul and Mann, nearly half were food-processing plants.
It is hardly necessary to emphasize the fact that advances in food technology and methods of transportation have given us many advantages. A large number of seasonal fruits and vegetables are now available throughout the year in canned and frozen form. Foods are generally cleaner than they were a half century ago. Processed foods have enormously reduced the homemaker’s burden. Many vegetables, fruits, and meats are shredded, eviscerated, mixed, creamed, cooked, baked, or fried before they reach the kitchen. The homemaker often needs little more than a can opener, a few utensils, and a stove to serve meals that required elaborate preparation and hours of work a generation or two ago.
But the shift from home-prepared to processed foods has not been an absolute gain. Although very little was known about human nutrition a few generations ago, our forebears could make use of what knowledge they had in growing and preparing their own foods. Today the situation is different. A great deal has been learned about the nutrients required for human health, but the cultivation, storage, refining, and, in large part, the preparation of food is no longer controlled by the consumer. These tasks have been taken over by the food industry. In the absence of a nation-wide food-grading program that would compel processors to disclose the nutritive values of food items, the housewife’s attempt to plan a healthful diet depends to a great extent upon chance. The nutritive quality of a given food may vary enormously. The variation may be due to the type of seed that the farmer plants, the methods employed in food cultivation and food processing, or the length of time the food is stored.
For example: Nutritionists, in analyzing thirty-one strains of cabbage, found that the ascorbic-acid, or vitamin-C, content varied as much as 350 per cent, while the amount of carotene in different varieties of sweet potatoes has been found to range from zero to 7.2 milligrams per cent. A threefold difference was discovered in the niacin content of forty-six strains of sweet corn. Large variations in nutritive value have been found among different varieties of apples, peas, wheat, onions, and many other crops. The fact that a given variety attains great size does not necessarily mean that it is nutritious. The very opposite may be true. Small cabbages, tomatoes, and onions, for example, contain more vitamin C than those of greater size. Size, appearance, color, and texture are often very poor criteria for judging the nutritive content of a food. Generally, plant breeders are interested in developing varieties of vegetables and fruits that have an attractive appearance, yield more bushels to the acre, are more resistant to disease, and withstand storage and shipment. “It is now realized that in the development of these commercially improved varieties by genetic selection the nutrient content is often decreased,” notes Robert S. Harris, of the Massachusetts Institute of Technology. “If the plant breeder were to collaborate with a food analysis laboratory, he could develop commercially improved varieties which are also superior in nutrient content.” (It is quite doubtful, however, whether improvements in the nutritive quality of vegetables and fruits depend exclusively on genetic selection, as Harris seems to believe. The plant environment, notably the condition and fertility of the soil, is a factor of decisive importance.)
Foods may lose much of their nutritive value when they are heated (blanched) prior to canning or freezing. If the processor follows a good blanching procedure, vegetables will retain most of their ascorbic acid and vitamin B. But if the procedure is poor, the losses may be enormous. Losses may run as high as 30 per cent of the riboflavin in green beans, 37 per cent of the niacin in spinach, 40 per cent of the ascorbic acid in peas, and 64 per cent of the thiamine in lima beans. “Blanching procedures are not standardized,” Mildred M. Boggs and Clyde L. Rasmussen observe. “Some processors have a leaching loss of 5 to 10 per cent of each water-soluble vitamin or other constituents. Others have losses of 40 to 50 per cent, or even higher, with the same food.”
A substantial loss of nutrients takes place when wheat is refined into flour for white bread. “Even with 4 per cent milk solids in the white enriched bread,” Bogert notes, “it is still not quite up to the content of whole wheat bread in protein, calcium, iron, thiamine and niacin, while the whole grain breads contain other B complex vitamins not added in the enrichment of white flour.” If a product is stored too long, it may lose a substantial amount of thiamine. Pork luncheon meat, for example, may lose 20 per cent of its thiamine in the can while standing on the retailer’s shelf for three months. After six months the loss may be as high as 30 per cent.
Many of these problems could be solved by establishing a compulsory system of grade labeling for processed foods based on their nutritive constituents. The grade markings that currently appear on labels are often meaningless from a nutritional point of view. For the most part, they indicate the food’s size, appearance, and taste, not its nutritive value. But the food industry has been adamantly opposed to any kind of compulsory grade labeling. Jesse V. Coles, professor of home economics at the University of California, observes that “practically all, if not all, manufacturers, processors, and distributors are opposed to compulsory grade labeling.” In the thirties the food industry defeated attempts to provide for grade labeling in the N.R.A. fair competition codes; it was bitterly opposed to the inclusion of standards of food quality in the food and drug legislation enacted by Congress in 1938; and it successfully blocked efforts to link price regulations with food quality in wartime anti-inflationary laws.
In the absence of a meaningful system of grade labeling, the homemaker is surrounded by nutritional anarchy. Her choice of a product is often governed by its appearance; if it is packaged, she is likely to choose a well-advertised brand or one that is attractively wrapped. Her difficulties are complicated by the fact that many processors color the food she sees and use chemicals to soften the refined products she touches. Preservatives are often added to foods to permit longer storage, and synthetic flavoring matter is put into many processed foods to enhance their taste. And even in those cases in which appearance, taste, and softness are true indications of superior nutritional quality, the processor frequently uses a large variety of chemicals to make an inferior, stale, or overly refined product seem high in quality.
Aside from the deception they practice on the consumer, many of the chemicals added to foods are known to be active toxicants. Presumably they are “safe under the conditions of their intended use,” to borrow a phrase formulated by the Food and Drug Administration, but the word “safe” does not mean that the chemicals do no damage. In view of the fact that they are harmful even in relatively small quantities, it is likely that toxicants in foods inflict a certain amount of damage on nearly all animal organisms, including man. Another group of food additives is enveloped in an air of biochemical mystery; very little is known about how they react during animal metabolism. Many additives are chemically unique or “exotic.” Until they were developed by chemists, the compounds never appeared in foods, and their long-range effects on the human body have yet to be determined.
The overwhelming majority of people in the United States derive their nourishment from commercial foods. If a harmful chemical finds its way into a popular food, many individuals are likely to be affected. These individuals differ in age, state of health, genetic endowment, environmental background, and susceptibility to the toxicity of the chemical. Some individuals may be more vulnerable to the toxicant than others, although they may never show any symptoms of acute illness. (There have been cases, however, in which hundreds and even thousands of people suffered acute disorders caused by chemical additives in commercial foods. According to newspaper accounts, as recently as 1960 nearly 100,000 people in the Netherlands were poisoned by a chemical emulsifier that had been added to a popular margarine. The margarine was manufactured by one of the most reputable food processors in Europe. The emulsifier produced an eczema-like rash on many parts of the body and temperatures as high as 106 degrees. At least two deaths were attributed to the emulsifier. About 5 million packages of margarine had been distributed to retailers before the effects of the additive were discovered.) The research devoted to chemicals in foods seldom goes beyond the effects they produce in laboratory animals. A chemical additive is tested on rats and dogs for about two years. If the chemical causes no apparent damage to tissues and organs, it may be used in food. If it is obviously toxic “under the conditions of its intended use,” it will be rejected. But laboratory animals are not men. Their physiology is different from that of human beings and they are unable to communicate important reactions that may well escape the notice of the researcher. Although experiments on rats, dogs, and even monkeys are useful in forewarning man against harmful substances, they do not supply conclusive evidence that chemicals which cause no apparent damage to animals are harmless to man.
Evidence that a chemical is harmful may well appear years after it has been permitted in foods. The use of additives regarded as safe only a decade or two ago has been forbidden because later, more exacting studies showed that these chemicals have toxic properties. According to Consumer Reports, coal-tar dyes were certified by the Food and Drug Administration for use as food colors “on the basis of hasty, old-fashioned, and otherwise inadequate testing, and recent advances in research methodology have led to the conclusion that the safety of most of them is unknown. Since 1945, when the FDA began to apply modern methods of study and research to certifiable dyes, 15 food dyes have been re-examined for toxic, carcinogenic or allergenic properties. Only one of these, Yellow No. 5 (used to color candies, icings, and pie-fillings, for example) has been conclusively shown to be harmless. Last year [1955], Orange No. 1, Orange No. 2, and Red No. 32 were decertified as too toxic for use in foods. The FDA announcement said that ‘while manufacturers will no longer label and sell these three colors for use, they may label and sell them for external drug and cosmetic use.’ Orange No. 1 had been widely used in candy, cakes, cookies, carbonated beverages, desserts, and such meat products as frankfurters. Orange No. 2 and Red No. 32 were used to color the outer skin of oranges, and during the Christmas season last year, some 150 children were made ill in California as a result of eating popcorn colored with Red No. 32.”
Whether a food additive is classified as harmful or not depends not only upon laboratory techniques but on the standards of the researcher. Every investigator is confronted with the following questions: How thoroughly should the chemical be studied? On how many species? What constitutes reasonable evidence that a chemical additive is harmless to man? In hearings before the Delaney Committee more than a decade ago, many conflicting answers were given to these questions. Some eminent researchers advanced the view that tests should be conducted through the whole life span of many animal species, while others seemed to be satisfied with limited experiments on rats. The generally accepted criteria for evaluating the toxicity of food additives represent an unstable compromise between widely divergent scientific views and are subject to serious challenge.
While disputes over the adequacy of current testing standards continue, the problem of chemical additives in food becomes greater with every passing year. Chemicals are used in nearly every phase of the production, storage, and merchandising of food. It is difficult to find a single commercial food that has not been exposed to a chemical compound at some stage of cultivation or processing. Many foods accumulate a large number of residues and additives as they travel from the farm through the processing plant to the retail outlet. Grain, for example, is dusted repeatedly with insecticides and fungicides and then treated with fumigants during storage. At the processing stage, chemicals are added to facilitate handling and to improve the consistency of the flour. The flour is bleached and treated with preservatives. The final product may be colored and wrapped in chemically treated paper that sometimes transfers undesirable substances to the food. Are all of these chemicals indispensable to an ample supply of food? Do any of them constitute hazards to public health? At a time when the number of chemicals in our food is reaching staggering proportions, these questions require considered replies.
Chemicals in Agriculture
The claim is often made that without chemicals to protect our orchards and fields from pests, we could not possibly have a modern, productive system of agriculture. “There is abundant evidence that our fruit, vegetables and many other staple crops cannot be produced economically, efficiently, and in an adequate volume without chemical protection from insects, plant diseases, weeds, and other pests,” observes George C. Decker, a noted American entomologist. “To deny agriculture the use of these chemical tools would be to jeopardize our agricultural economy and an adequate, well-balanced food supply for the American public. Uninhibited insects and plant diseases would largely eliminate the commercial production of such crops as apples, peaches, citrus fruits, tomatoes, and potatoes, to mention only a few, and would drastically curtail the production of many other major crops.”
Although no one would want to jeopardize our agricultural economy, still less deny the American people a well-balanced diet, a great deal of public and professional concern has been aroused by the presence of pesticidal residues in food. An insecticide should be “used as a stiletto rather than a scythe,” A. W. A. Brown writes. “Ideally it should be employed to punch a particular hole in the food chain, which, if it cannot be filled in by [other species in the] neighboring elements of the biota, at least should be done at such a time of season that it does not seriously disturb the biota.” In practice, the trend has gone in the opposite direction. Insecticides are not only becoming more lethal; they are being used with less and less discrimination. Aside from the ecological implications of this trend, the growing toxicity and variety of insecticidal residues in food have become one of the major health problems of our time.
The large-scale application of insecticides to field crops did not get under way until the 1860’s, when a large number of American farmers began to use paris green (copper acetoarsenite) against the Colorado potato beetle. The effectiveness of this insecticide encouraged chemists to develop a variety of other formulas. Most of the new insecticides were inorganic compounds, preparations of arsenic, mercury, and sulfur, which normally remained on the crop after harvesting. The residues ranged in amount from mere traces to quantities large enough to be hazardous to the consumer. As early as 1919 the Boston Board of Health condemned shipments of western pears because of excessive arsenic residue, and six years later British health authorities raised strong objections to the amount of arsenic found on imported American apples. Finally, federal officials placed limits on the amount of arsenic that could be allowed to remain on commercial apples and pears. Ironically, for a brief period the original limits, or tolerances, for food sold to domestic consumers were two and a half times higher than those established for apples exported to Europe.
Although the prewar era of inorganic insecticides has been described as pristine in comparison with the present era of organic preparations, the situation was far from ideal. In the 1930’s one of the most widely used inorganic compounds, lead arsenate, combined a suspected carcinogen (arsenic) with a cumulative poison (lead). Calcium arsenate seriously damaged the soil, especially soils low in humus. Many areas of Washington and South Carolina were rendered unproductive as a result of being dusted with large amounts of arsenicals. Harmful as they were, however, prior to 1945 inorganic insecticides were employed in relatively modest quantities and in a limited variety. The most lethal preparations were used primarily for agricultural purposes, and the greater part of the residue could be removed from the surface of the crops by careful washing.
After World War II, the insecticide problem acquired entirely new dimensions. The total quantity of insecticides used prior to the war was but a fraction of the amount employed in the mid-1950’s. Annual domestic sales of pesticides (primarily insecticides) increased more than sevenfold between 1940 and 1956. Inorganic insecticides, though still in use, declined markedly in importance; they were supplanted for the most part by a whole new group of organic preparations. Of these the best known is DDT, a chlorinated hydrocarbon; it was synthesized as early as 1874, but it remained a little-known chemical compound until the late 1930’s, when the Swiss chemist Paul Müller discovered its insecticidal properties. The American armed forces began using it in 1943, a year after the first small quantities had been shipped to the United States for experimental purposes. While the war was still in progress, A. P. Dupire in France and F. J. Thomas in England began to work with benzene hexachloride (BHC), another chlorinated hydrocarbon that had been synthesized in the last century. The research of Thomas and his colleagues led to the discovery of the insecticidal properties of lindane, a form of BHC. Significant work was also being done in Germany during this period. Shortly before the war, Gerhard Schrader, while engaged in research on poison gases, discovered a series of extremely lethal organic phosphates, from which parathion, malathion, and other now familiar preparations were developed. The list continued to grow until an impressive number of chlorinated hydrocarbons and organic phosphates were available for use on farms and in gardens, warehouses, stores, and homes.
In retrospect, the carelessness with which some of these insecticides were delivered to the public is incredible. The chemical behavior of the chlorinated hydrocarbons inside the human body is still unknown, although it has been determined that both the chlorinated hydrocarbons and the organic phosphates are powerful nerve toxicants. The organic phosphates are known to inactivate the enzyme cholinesterase, which participates in the chemical control of nerve impulses. (Until recently, virtually nothing was known about the effects of extended exposure to low doses of organic phosphates, and it was generally assumed that there were no long-term effects at all. In June 1961, however, S. Gershon and F. H. Shaw, of the University of Melbourne, reported that schizophrenic and depressive reactions, with severe impairment of memory and difficulty in concentrating, were found in sixteen individuals who had been exposed to organic phosphates for one and a half to ten years.) Solutions of organic insecticides are absorbed through the skin as well as the lungs and digestive tract. The skin does not offer an effective barrier to the entry of these toxicants when they are used in liquid form. The younger an organism, the more easily it is poisoned. Both groups of insecticides produce effects in warm-blooded animals very similar to those they produce in insects. Farm animals poisoned by DDT and other chlorinated hydrocarbons first become restless and excitable; the next stage is characterized by twitching, spasms, and, finally, convulsions. Animals poisoned by the organic phosphates usually die of respiratory failure, but in these cases, too, convulsions have been known to occur.
Fortunately, DDT, the first of the new organic insecticides to be widely used, is one of the milder chlorinated hydrocarbons. The compound began to reach American farms in 1945. As test samples of DDT had arrived in the United States only three years earlier, American research on the compound up to that time had been necessarily limited. Some of its more disquieting features were still under study when many Americans began using it with reckless abandon as a “magic bullet” against insect pests. At this stage, very little was known about the way the human body reacts to DDT, the possibility of chronic toxicity, the existence of the indirect routes by which DDT enters the human body, and DDT’s stability under varying agricultural conditions. By 1950, when a number of these questions had been answered, it had become evident that although the insecticide is certainly a “magic bullet,” the target toward which it is traveling is very much in doubt.
It was found that DDT tends to concentrate heavily in fat. A relatively low dietary intake of the insecticide is magnified from six to twenty-eight times in the fat deposits of the body. According to A. J. Lehman, of the F.D.A., the continual ingestion of as little as 5 parts per million (ppm) of DDT in the diet produces “definite though minimal liver damage” in rats. Such an intake is not likely to be uncommon in the United States. The F.D.A. has placed a residue tolerance of 7 parts per million of DDT on nearly every fruit, vegetable, and portion of animal fat that reaches the American dinner table. But the agency does not know how many farmers are honoring these tolerances. As recently as December 1959, F.D.A. Commissioner George P. Larrick confessed that economic incentives to use chemicals in agriculture are so compelling that his organization “can hardly be expected to control the problem adequately.”
The reputation of DDT suffered its first blow only two years after the end of the war. As early as 1947, Italian entomologists noted that DDT-resistant flies were appearing in many parts of Italy. Similar reports began to come from California and other areas in the United States. By 1950, American surveys had shown that previously susceptible strains of flies “were becoming resistant to DDT over a period of 2 or 3 years.” Entomologists and public health workers were still hopeful that the malaria-carrying mosquito would continue to succumb to DDT, but this insect also developed DDT-resistant strains. As the number of resistant insects began to multiply, attention shifted to more lethal preparations, such as chlordane, dieldrin, and the organic phosphates.
Chlordane and the dieldrin-aldrin group of preparations are highly toxic chlorinated hydrocarbons. A portion of the testimony and data that the Delaney Committee received on chlordane is summarized in its report as follows: “The Director of the Division of Pharmacology of the Food and Drug Administration [A. J. Lehman] testified that from a chronic viewpoint chlordane is four to five times more toxic than DDT, and that he would hesitate to eat food that had any [chlordane] residue on it whatever. He stated that chlordane has no place in the food industry where even the remotest opportunity for contamination exists, and that it should not be used even as a household spray or in floor waxes.” Later experimental work showed that chlordane, when fed to rats, has about twice the chronic toxicity of DDT; the insecticide produces detectable liver damage when ingested in amounts as low as 2 1/2 parts per million.
Dieldrin and aldrin, according to Lehman, create a danger of chronic toxicity that “may be similar” to that posed by chlordane. These insecticides have the “rather rare property of being more toxic by skin absorption than by oral ingestion, the ratio being about 10:1. To state this another way, dieldrin and aldrin are approximately 10 times more poisonous when applied to the skin than when swallowed…. Neither of these compounds is irritating to the skin in low concentrations so that the individual has no warning of skin contact.”
The F.D.A. permits residues of chlordane, dieldrin, and aldrin on nearly all the vegetables and fruits sold to American consumers. In view of current spraying practices, it is not unusual for foods to accumulate residues of several chlorinated hydrocarbons. The fact that the various insecticides have different chemical properties is small reason for comfort. “Studies in our laboratories have shown that when low concentrations of several insecticides are fed to rats the effect is additive,” Lehman writes. “For example, when 2 ppm each of DDT, chlordane, lindane, toxaphene, and methoxychlor, making a total of 10 ppm of chlorinated hydrocarbons, are added to the diet of rats, liver damage results. With the possible exception of chlordane, none of the insecticides listed, when fed individually at 2 ppm, would injure the liver, but in their combination their effect is additive.”
Where do we stand today in our chemical war against the insect world? Although partial control has been achieved over some carriers of disease, and other pests are no longer the nuisance they once were, the basic problems of insect infestation remain unsolved. Herbert F. Schoof observes that “man’s insect enemies are adapting to successful coexistence with pesticides almost as rapidly as new technics and toxicants are produced.” We are paying a heavy price for our questionable gains against agricultural pests. A larger quantity of insecticides is required almost every year. The production of DDT in the United States nearly doubled between 1950, when the compound was no longer a novelty, and 1958. Nearly 100 million pounds of aldrin, chlordane, dieldrin, endrin, heptachlor, and toxaphene, chemicals virtually unknown to farmers in the 1940’s, were produced in 1958. Approximately 3 billion pounds of pesticides, including herbicides, fungicides, and fumigants, are produced annually in the United States. “The pesticide industry is growing by leaps and bounds and entomologists predict, and chemical manufacturers hope for, a fourfold expansion in use of pesticides during the next ten to fifteen years,” observes Clarence Cottam, of the Welder Wildlife Foundation. “Today, well over 12,500 brand-name formulations and more than 200 basic control compounds are on the market. Most of the currently used pesticides were unknown even ten years ago. Furthermore, and contrary to the public interest, most new pesticides are decidedly more toxic, generally more stable, and less specific in effect than those of but a few years back.”
The postwar era also witnessed the introduction of highly potent chemical agents for increasing the weight of livestock. In 1947 the F.D.A. approved the use of stilbestrol pellets for “tenderizing” the meat of poultry. Stilbestrol, a synthetically prepared drug, causes bodily changes that are identical with those produced by certain female sex hormones (estrogens). The drug was approved for use with the understanding that poultrymen would implant a single pellet in the upper region of the chicken’s neck about a month or two before marketing. Presumably this period was sufficient for the complete absorption of the chemical, so that no residues would be left in the flesh of the bird. Male birds treated with stilbestrol undergo radical physiological changes; their reproductive organs, combs, and wattles shrivel, and they lose all inclination to crow or fight. The drug tends to cause substantial increases in weight in both sexes, producing plump birds with skin that is noticeably smooth in texture.
Careful investigation has shown that stilbestrol increases the over-all fat content of chickens by 40 per cent. In the hearings before the Delaney Committee, Robert K. Enders, professor of zoology at Swarthmore College, expressed his agreement “with those endocrinologists who say that the use of the drug to fatten poultry is an economic fraud. Chicken feed is not saved; it is merely turned into fat instead of protein. Fat is abundant in the American diet so more is undesirable. Protein is what one wants from poultry. By their own admission it is the improvement in appearance and increase in fat that makes it more profitable to the poultryman to use the drug. This fat is of very doubtful value and is in no way the dietary equal of the protein that the consumer thinks he is paying for.”
By the late 1950’s approximately 100 million hormone-treated fowl were being marketed annually to American consumers. In a number of limited surveys, the F.D.A. found that stilbestrol pellets often remained in the necks of poultry until after the birds had reached wholesale outlets; the pellets were not completely absorbed. Stilbestrol was detected in the edible tissues of hormone-treated fowl even after the heads and necks had been removed. Finally, in December 1959, the F.D.A. announced that, on its recommendation, the poultry industry had agreed to discontinue the use of the synthetic estrogen. It is doubtful whether either the F.D.A. or the poultry industry had very much choice. Stilbestrol is suspected of being a cancer-promoting drug, and recent federal legislation specifically prohibits the presence of carcinogenic additives in food. Pellets of stilbestrol are still implanted in the ears of cattle, however, and the drug is widely used as a supplement to the feed of steers and sheep. The official opinion today is that no residues of the drug will remain in large animals if breeders obey F.D.A. regulations, but the evidence to support this opinion is open to question (see pages 140-1).
The use of synthetic estrogens in livestock management was soon followed by the addition of antibiotics to feed, a technique that dramatically accelerates the growth of domestic animals. Although there is no evidence that antibiotics affect the nutritive quality of meat, research conducted by the F.D.A. showed that chlortetracycline and penicillin could be detected in the blood, tissues, and eggs of chickens given feed containing as little as 50 parts per million of the drugs. This quantity does not exceed the limits established by the F.D.A. for antibiotics in feed. According to William A. Randall, of the F.D.A.’s Division of Antibiotics, the “small amounts found in chicken tissue were destroyed following frying. Furthermore, when the antibiotic-containing feed was withdrawn and normal feeding begun the antibiotic disappeared from the tissue or eggs within a few days. After hard-boiling eggs containing antibiotics, no activity could be detected in the great majority of those tested.”
But we should place the emphasis where it properly belongs. Antibiotics could be detected even in eggs that were hard-boiled, and residues were found in the tissues and eggs of chickens after normal feeding was restored. Accordingly, current F.D.A. regulations prohibit the use of antibiotics in the feed of egg-laying hens and require that, in the case of poultry destined for human consumption, normal feeding be restored four days prior to slaughtering. “It may well be in the public interest to reserve this valuable class of drugs for the purpose for which they were originally developed,” Randall concludes, “that is, the prevention and treatment of disease.”
This conclusion is not without merit. Many farmers pay very little attention to the F.D.A.’s regulations, and nearly all farmers ignore them at one time or another. The situation is similar to that faced by the police in trying to keep motorists from speeding; almost everyone occasionally exceeds the speed limit. But in the case of the F.D.A. regulations, the violator’s chances of being caught are infinitesimal in comparison with those of the speeder. Although its inspection program is totally inadequate to cope with the problem of chemicals in food, the F.D.A. has already discovered violations of its regulations on feed supplemented with arsenic compounds, another widely used promoter of growth. “A very preliminary survey made some time ago by the Food and Drug Administration indicated that some poultry raisers are not withholding arsenic-containing feeds from their flocks 5 days before slaughter,” complained Arthur S. Flemming, former Secretary of Health, Education, and Welfare, “and we have information that in some parts of the country, hog raisers maintain their animals on arsenic-containing feed within the 5-day period that the arsenic is supposed to be withheld. It is evident that such disregard for safe use could lead to added arsenic residues in poultry and pork on the retail market and thus could frustrate the public health safeguards upon which the original approvals were granted.”
Antibiotics are dangerous food additives. The sensitivity of many persons to penicillin has reached a point where a therapeutic dose of the drug can be expected to claim their lives. Such individuals are increasing in number with every passing year. Cases have already been reported in which penicillin used to treat infections in dairy cattle entered the milk supply in sufficient quantities to cause severe reactions in human consumers. According to Murray C. Zimmerman, of the University of Southern California, data on milk contamination suggest that a person who drinks two quarts of milk every day “could ingest over 1,000 units of penicillin daily. The fact that penicillin ‘naturally’ present in milk has not previously been proved to cause allergic reactions is outweighed by the fact that this is almost a billion times more penicillin than has previously been shown necessary to provoke reactions. Coleman and Siegel reported a patient with a reaction on passive transfer to 0.00001 units of penicillin. Bierlein’s patient went into shock when skin-tested with 0.000003 units of penicillin.” Zimmerman then proceeds to discuss detailed case histories of four patients whose reaction to penicillin could be traced with reasonable certainty to the consumption of dairy products.
It is quite conceivable that the extensive use of antibiotics in agriculture will slowly turn our domestic animals into a major reservoir of antibiotic-resistant bacteria. This represents a problem fraught with incalculable dangers to public health. Feed supplemented with tetracycline, for example, has already produced resistant strains of Bacterium coli and Clostridium perfringens. Both bacteria typically inhabit man’s alimentary tract, but under certain circumstances they behave as disease-causing organisms. “Examination of pigs and poultry kept on many different farms showed that the B. coli in the feces of tetracycline-fed animals were predominantly tetracycline-resistant,” observes H. William Smith, of the Animal Health Trust in Sussex, England, “while those in the feces of animals on farms where tetracycline feeding was not practiced were predominantly tetracycline-sensitive. In some herds in which tetracycline feeding was just being introduced it was possible to trace the changes of the B. coli fecal flora from tetracycline-sensitive to tetracycline-resistant.” Smith notes that similar changes could be traced in the case of Clostridium perfringens.
The use of shotgun methods to treat or forestall udder infections in cattle with penicillin has almost certainly increased the propagation of resistant strains of Staphylococcus aureus, the principal culprit in the common “staph infection” and the direct cause of grave, often fatal cases of pneumonia. Smith notes that the proportion of penicillin-resistant Staphylococcus aureus cultures isolated from herd samples of milk in Britain “had increased from 11% in 1954 to 47% in 1956.” Two bacteriologists of the Canadian Food and Drug Directorate, F. S. Thatcher and W. Simon, in analyzing cheese that had been purchased in retail outlets, found substantially more penicillin-resistant staphylococci “than would be expected in a normal population not exposed to antibiotic therapy. . . . With regard to the streptococci recovered from cheese our results show a close approximation to the degree of resistance to penicillin found among comparable species isolated from hospital patients.”
Thatcher and Simon conclude their report with the warning that “where penicillin or other antibiotics are used with dairy cattle, the survival of resistant organisms may lead to widespread distribution of resistant strains into the homes of the general populace, since staphylococci and streptococci, often in large numbers, are of common occurrence in cheese. Foods so contaminated may well contribute to the severity of the problem arising from infections in man with resistant strains without the patients having received prior antibiotic therapy or without having been exposed to endemic infections within hospitals.”
But in the United States this is no longer merely a potential hazard. Already there have been two localized epidemics of “staph infections” that have been attributed to contact with cattle. During 1956 and 1957 about 36 per cent of the senior students and 26 per cent of the faculty at the Veterinary School of the University of Pennsylvania contracted “staph infections.” In 1958 another epidemic of the infections occurred among 44 per cent of the senior students at the Colorado State University College of Veterinary Medicine. There is good reason to believe that, in both cases, animals handled by the students had become a reservoir of antibiotic-resistant staphylococci.
The current trends in agriculture reflect a shocking indifference to the long-term health of our population. By recklessly overcommitting ourselves to the use of chemicals in the growing of our food, we have turned highly potent toxicants and drugs, which we would be justified in using only on a limited scale or in exceptional circumstances, into the foundations of our agriculture. These foundations are extremely unstable, and the edifice we are trying to construct may prove to be a trap rather than a habitable dwelling. Insecticides commonly aggravate the very problems they are meant to solve. The ecological difficulties they create lead to the use of increasingly toxic preparations, until a point is reached where the insecticides threaten to become more dangerous in the long run to the human species than to the insects for which they are intended. The need to rescue a crop from agricultural mismanagement gains priority over the health and well-being of the consumer. The use of drugs to fatten and accelerate the growth of domestic animals is playing havoc with the physiology of our livestock. The damage these drugs inflict on poultry and cattle necessitates the use of additional drugs, which, in turn, contaminate man’s food and reduce the value of these agents in combating disease. It would seem to be a form of ecological retribution that the very forces man has summoned against the living world around him are being redirected by a remorseless logic against the human organism itself.
Chemicals in Food Processing
The same overcommitment to potentially risky methods is found in other phases of food production. Complex problems of storage, cleansing, handling, refining, cooking, mixing, heating, and packaging are often solved by the use of chemical additives. A half century ago the yellowish tint in freshly milled flour was removed slowly by aging; today it is removed rapidly by oxidizing agents. The miller no longer has to store flour and wait for it to mature. Emulsifiers are employed, partly to “improve” the texture of a product, partly to speed up processing. Their “functional advantages” in ice cream, for example, are that the “mix can be whipped faster, the product from the freezer is dryer, and the mix has less tendency to ‘churn’ in the freezer.” Chemical antioxidants are used to help prevent oils and fats from becoming rancid, a problem that becomes especially pronounced when natural antioxidants are removed during large-scale processing. Chemical preservatives are added to food to permit longer storage in wholesale depots and to extend the “shelf life” of a product in retail outlets. In time chemicals are turned from mere adjuncts of food production into “technological necessities.” Their use may result in new machines and facilities, the abandonment of old processing methods, and a broad reorientation of technology to meet the new chemical requirements.
The fact that a chemical is technologically useful does not mean that it is biologically desirable. The two phrases are occasionally used interchangeably, as though they were synonymous. Some of the chemicals most widely used in the production of food have been found to be harmful to animals in laboratory tests. Nitrogen trichloride, or agene, was employed for decades as a bleaching and maturing agent in flour before it was discovered, in 1946, that the compound produces “running fits,” or hysteria, in dogs. The use of agene in the bread industry has since been discontinued. Although there is no evidence that agenized flour causes nerve disorders in man, no one can say with certainty that cereal products treated with agene are completely harmless to consumers, especially if eaten day after day for twenty years or more. The toxicant in agenized flour “produces nervous disturbances in monkeys, but not epilepsy in man,” observed the late Anton Carlson, one of the most distinguished physiologists of his day. “That is not sufficient for me, because there are many other types of nervous disturbances . . . that may be aggravated or introduced by this chemical.”
Another group of chemicals, notably certain types of polyoxyethylene derivatives, were widely used in the United States as emulsifiers before they were banned as potential hazards to consumers. Former F.D.A. Commissioner Charles W. Crawford is reported to have described the agents as “good paint removers.” According to data furnished to the Delaney Committee, rats that had been fed two commercial preparations of polyoxyethylene-derived emulsifiers showed blood in their feces and developed a variety of kidney, abdominal, cardiac, and liver abnormalities. These preparations were used primarily as bread softeners but also found their way into other foods. The polyoxyethylene derivatives constitute a highly suspect family of emulsifying agents, yet some of them are still added to cake mixes, cake icing, frozen desserts, ice cream, candies, dill pickles, fruit and vegetable coatings, vitamin preparations, and many food-packaging materials. (Several experiments indicate that an intake of large amounts of polyoxyethylene (8) stearate produces gall stones and bladder tumors in rats. At least two commercial preparations of polyoxyethylene derivatives are reported to be co-carcinogens; they promote the development of skin cancer in mice when used in conjunction with certain carcinogenic agents. For a number of years all three of these polyoxyethylene compounds were used as food additives in the United States. By the early 1950s, polyoxyethylene derivatives had also migrated to England, where they were used on a limited scale as bread softeners.)
Similar examples can be culled from other groups of chemical aids. In general, the toxic effects of many commonly used compounds are difficult to determine, but individually and in groups they have aroused suspicion and concern. “The fortification of oils and fats against oxidation and rancidity is another field in which there are serious questions of possible toxicity,” warns F. J. Schlink, technical director of Consumers’ Research and editor of Consumer Bulletin. “The natural antioxidants which delay rancidity are lost during the factory processing of refined oils and fats; then the attempt is made to restore the antioxidation qualities by addition of fat-stabilizing substances. Most of the materials that have antioxidation properties are known to be toxic to some degree.”
The term “technological aid” has been used loosely to include food additives that could be dispensed with entirely if a manufacturer improved his processing methods and the quality of his product. The question of whether a chemical, instead of being legitimately employed to furnish distant consumers with a perishable product, was being used rather as a device for masking inferior ingredients and unsanitary methods arose in sharp form early in the century, when sodium benzoate was employed extensively as a preservative in catsup. Harvey Wiley, the first administrator of the federal food and drug law, regarded sodium benzoate as a hazard to consumers. Supported by data from feeding experiments performed with human beings, Wiley claimed that the preservative causes injury to the digestive system. He was convinced that catsup need not deteriorate readily if clean facilities and proper ingredients are used in preparing it. Challenged on this point by Charles F. Loudon, a canner in Terre Haute, Indiana, Wiley appointed a bacteriologist, Arvill W. Bitting, to work at the Loudon factory and prepare a formula for a stable catsup preparation without artificial preservatives. Bitting was successful. Moreover, after inspecting numerous canning factories and examining many brands of catsup, Bitting concluded that manufacturers who needed sodium benzoate often used inferior products and were careless about sanitation.
“The whole spirit and tradition of the pure-food law is against the use of preservatives and substitutes,” declared Mrs. Harvey Wiley more than forty years after the Loudon episode. “The intent of the law is to encourage the manufacture of high-grade ingredients, not to try to hide inferiority and cheapness by the use of chemicals and preservatives.” If this is so, the spirit of the law has been ignored for decades. The most up-to-date surveys of chemical additives in foods show that preservatives are added to cheese, margarine, cereal products, jams, jelly, and many other processed foods. Uncooked poultry is dipped in solutions of antibiotics. (Notably the tetracyclines. Despite strong pressure from the food industry, the British Food Standards Committee has refused to recommend the use of tetracyclines as preservatives for poultry and fish.) Aside from the damage to public health that may arise from the use of questionable chemicals as preservatives, the longer “shelf life” acquired by some of these foods may cause a loss of valuable nutrients.
Many chemical additives in foods perform no technological or nutritional function whatever. They cannot be regarded as materially useful by any stretch of the imagination. These chemicals enhance the color of a food or make certain products feel soft and newly prepared; in some instances they have been used to replace costly but valuable nutrients with inferior ones. In the least objectionable cases, an additive will deceive the consumer without impairing his health. The application of an innocuous vegetable dye to some foods often leads the consumer to believe that he is acquiring a better, more wholesome, or tastier product than is actually the case. In the most objectionable cases, a chemical is toxic to a greater or lesser degree. This type of additive not only serves to change the appearance or conceal the nutritive deficiency of a food; it exposes the consumer to a certain amount of damage.
Nitrates and nitrites, for example, are used to impart a pink color to certain brands of processed meat. Ostensibly, this is their sole function. The addition of these chemicals to meat, especially frankfurters and hamburger meat, would be objectionable if only because of the deception which makes it difficult for the housewife to distinguish between high-quality and low-quality products. But this is not the only deception perpetrated on the consumer. Used together with salt, nitrite compounds definitely extend the “shelf life” of processed meat products.
When nitrites are ingested, they react with hemoglobin in the blood to form methemoglobin, and, like carbon monoxide, reduce the hemoglobin’s capacity to carry oxygen. An individual who consumes three to four ounces of processed meat containing 200 parts per million of sodium nitrite (a permissible residue) ingests enough of the compound to convert from 1.4 to 5.7 per cent of his hemoglobin to methemoglobin. Ordinarily, this percentage is insignificant. But if the same individual is a heavy smoker and lives in an air-polluted community, the cumulative loss in oxygen-carrying capacity produced by the nitrites in the food and by the carbon monoxide in tobacco smoke and motorcar exhaust can no longer be dismissed as trivial. Sodium nitrite is highly toxic in relatively small amounts. About four grams of the compound constitutes a lethal dose for adults. Although Lehman regards 200 parts per million of sodium nitrite as a safe residue, he notes that “only a small margin of safety exists between the amount of nitrite that is safe and that which may be dangerous. The margin of safety is even more reduced when the smaller blood volume and the corresponding smaller quantity of hemoglobin in children is taken into account. This has been emphasized in the recent cases of nitrite poisoning in children who consumed weiners and bologna containing nitrite greatly in excess of the 200 ppm permitted by Federal regulations. The application of nitrite to other foods is not to be encouraged.”
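The quantities involved can be checked with simple arithmetic. The sketch below uses only figures stated in the text (a 200 ppm residue, a three-to-four-ounce serving, a lethal dose of about four grams); the gram-per-ounce conversion is the one assumption supplied from outside it.

```python
OUNCE_G = 28.35  # grams per avoirdupois ounce (assumed conversion factor)

def nitrite_ingested_mg(meat_oz, residue_ppm=200):
    """Milligrams of sodium nitrite in a serving of processed meat.

    ppm means mg of nitrite per kg of meat, so:
    mg = (grams of meat / 1000) * ppm
    """
    meat_g = meat_oz * OUNCE_G
    return meat_g / 1000 * residue_ppm

# A three-to-four-ounce serving at the permissible 200 ppm residue:
low = nitrite_ingested_mg(3)    # about 17 mg
high = nitrite_ingested_mg(4)   # about 23 mg

# Against the roughly 4-gram (4,000 mg) adult lethal dose cited above,
# a single serving supplies only about half of one per cent of that dose;
# the point of the passage is that the margin narrows for children,
# heavy consumers, and products exceeding the permitted residue.
fraction_of_lethal = high / 4000
```

This confirms the scale of the figures rather than the physiology: the 1.4 to 5.7 per cent methemoglobin conversion depends on blood volume and hemoglobin mass, which the text does not quantify.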
Emulsifiers have been used in bread not only as softening agents, which can give stale bread the texture of newly baked bread, but as substitutes for nourishing ingredients. “The record of the bread-standards hearings contain evidence of distribution among bakers of advertising material advocating the replacement of fats, oils, eggs, and milk by emulsifiers,” George L. Prichard, of the Department of Agriculture, told the Delaney Committee. “The use of such products as components of food may work to the disadvantage of our farm economy by displacing farm products normally used. The record indicates that natural food constituents, such as fats and oils, probably will be reduced in many commercial bakery products if bakers are allowed to employ these emulsifiers.”
This substitution affects more than the farm economy, however; it also works to the disadvantage of the consumer. To illustrate this, the Delaney Committee report compared the ingredients in two cake batters prepared by the same company eleven years apart; during the interval emulsifiers were added to the product. The first batter, made in 1939, did not contain synthetic emulsifiers; the second, prepared in 1949, did. “On a percentage basis, the cake batter in 1939 contained 13 per cent eggs, and 8.6 per cent shortening. In 1949, the cake batter contained 6.3 per cent eggs, and 4.8 per cent shortening, with somewhat less than 0.3 per cent of synthetic emulsifier.” The report noted that a “synthetic yellow dye could be added to provide the color formerly obtainable through the use of eggs. The utilization of synthetic yellow dye in commercial cake was practiced before the war. There are indications that the use of artificial coloring matter is increased when quantities of whole eggs or egg yolks are reduced in commercial cake formulas.”
To heighten the insult, among the most commonly used emulsifiers in 1949 were the polyoxyethylene monostearates. The yellow dye referred to in the Delaney Committee report was probably FDC Yellow No. 3 (Yellow AB), which, until fairly recently, was added to many yellow cake mixes. As we shall see in the following chapter, the dye often contained impurities of a potent carcinogen and its use in foods was forbidden in 1959. For a number of years, however, both the emulsifier and the dye undoubtedly appeared in many brands of cake and reached large numbers of unsuspecting consumers.
Problems of this kind are not likely to disappear unless there is a basic change in the viewpoint of the F.D.A. “Inherited from the Wiley era is a too common misconception that all ‘chemicals’ are harmful, and the related idea that any amount of a ‘poison’ is harmful,” the F.D.A. observes in a brochure on food additives. “The fact is, of course, that chemical additives, or food additives as they are now being called, have brought about great improvements in the American food supply. Additives like potassium iodide in salt and vitamins in enriched food products are making an important contribution to the health of our people, and yet it is a fact that both iodine and some of the vitamins would be harmful if consumed in excessive amounts. Many similar examples could be given to refute these common misconceptions.”
This argument is grossly misleading. Iodine and vitamins are indispensable to human life, but coal-tar dyes and benzoic acid, for example, are not. If we adhered to a well-balanced diet of properly prepared natural foods, iodine and vitamins would never enter the body in toxic amounts. Coal-tar dyes, on the other hand, are suspected of being harmful to man in nearly any amount if consumed repeatedly, and the kindest statement that can be made for the presence of benzoic acid in food is that the compound is “safe under the conditions of its intended use.”
The formula “safe under the conditions of its intended use” exposes the consumer to serious risks when it opens the door to the use of food additives whose biochemical activity is not understood. Many unexpected problems may arise when such additives appear in food. A particular additive may seem to be relatively harmless to the organs of the body, but it may be carcinogenic on the cellular level of life. Another additive may produce insignificant or controllable effects when studied in isolation; brought into combination with various chemicals in food, however, it may give rise to toxic compounds. The body, in turn, may make a toxic additive more poisonous in the course of changing it during metabolism. “At one time it was generally believed that whenever a toxic substance was absorbed the body was capable of calling upon special mechanisms for detoxifying the toxicant,” Lehman observes. “It was believed also that the metabolic pathway that the toxicant followed always proceeded in the direction of the formation of a less toxic compound. Later work on the metabolism of drugs and toxic substances showed that special mechanisms did not exist, but that the toxicant was subject to the same metabolic influences as those which normally operate in the body. The assumption that the metabolic product was less poisonous than the parent substance from which it was derived is also unwarranted simply because in many instances the toxicity of either the original substance or its conversion product is unknown. In other instances the metabolic product is even more poisonous than its parent. The conversion product, heptachlor epoxide, which is two or three times more poisonous than the parent substance, heptachlor [a widely used insecticide], may be cited as an example of this.”
Finally, food additives may cause allergic reactions that are likely to go unnoticed for many years. The reactions need not be severe to be harmful. Otto Saphir and his colleagues at the Michael Reese Hospital in Chicago have recently suggested that the development of arteriosclerosis may be promoted by the sensitivity of the body to allergenic compounds, notably certain antibiotics. By using sulfa drugs to produce allergies in forty-two rabbits, the researchers were able in eight months to cause degenerative changes in the arteries of thirty-one of the animals. These changes closely resembled arteriosclerosis in man. According to a press account of Saphir’s report, the rabbits’ reactions “were not apparent on the surface.” It would be very imprudent to assume that such effects are produced only by chemicals that cause severe or noticeable allergies. If the data of Saphir and his colleagues are applicable to man, additives with even minor allergenic properties cannot be dismissed as harmless.
Today more than 3000 chemicals are used in the production and distribution of commercially prepared food. At least 1288 are purposely added as preservatives, buffers, emulsifiers, neutralizing agents, sequestrants, stabilizers, anticaking agents, flavoring agents, and coloring agents, while from 25 to 30 consist of nutritional supplements, such as potassium iodide and vitamins. The remaining chemicals are “indirect additives”: substances, such as detergents and germicides, that get into food by way of packaging materials and processing equipment. Many chemical additives are natural ingredients, but a large number are not. The artificial substances that appear in food range from simple inorganic chemicals to exotic compounds whose biochemical activity is still largely unknown.
At a time when almost every processed food contains chemical additives or residues, the “misconceptions” of the “Wiley era” must seem like bold anticipations. Sixty years ago the number of chemicals added to food was small. Public concern was aroused primarily by the gross adulteration of food and by the unsanitary practices followed in mills and slaughterhouses. These problems are still with us; as recently as 1958 over 5000 tons of food were seized by the F.D.A. because of filth and decomposition. But the problem of chemicals in food has reached proportions that would have appalled Wiley, and the incidence of diseases, such as cancer, that can be produced by chemicals in man’s environment has increased to an alarming extent. We sense that our food technology has taken a wrong turn. Having achieved the abundance of nutriment it once promised, it threatens to become another factor imperiling man’s health and well-being.