Who Should Decide What Goes into a Can of Tomatoes? Food Laws from a Voluntaryist Perspective
ABSTRACT: This paper recounts the history of food inspection from a voluntaryist perspective. In England and the United States, the efforts to achieve food safety have relied upon two main methods: education and legislation. Governments did nothing that could not be done on the free market (and in many cases was already being done). Books on how to test for adulterated products at home were published. Some manufacturers voluntarily observed the highest standards of sanitation and cleanliness in their manufacturing plants. Private commercial testing labs were established, and third-party certifications such as the Good Housekeeping Seal came into being. At the same time, we might ask: Why was strict liability for causing sickness or death not imposed upon the manufacturers and retailers that sold foods or drugs? Where were the insurance companies that might have provided product liability insurance? To answer these questions, this article looks at the historical evolution of negligence, product liability, and tort law.
Carl Watner ([email protected]) is an independent scholar. This article appeared in The Voluntaryist, digital issue 192, at voluntaryist.com.
When I took over the operation of Inman Feed Mill in late 1987, none of the animal products that we processed and bagged were tagged. Cracked corn, whole corn, sweet feed, and chicken scratch all went out the loading door in plain, unmarked bags. The feed mill had been started in the early 1950s in a very rural area of upstate South Carolina, and most of its customers had face-to-face contact with the various owners. Never was there a doubt in customers' minds about what they were getting. Feed bags were not sewn; pieces of string tied in the ubiquitous miller's knot secured their contents. If there was a question, we only had to untie the bag and show the contents to the customer, or place it on the scale if he doubted how many pounds he was buying. If there were federal feed and grain laws, there was no evidence of their enforcement. However, there were South Carolina Department of Agriculture regulations which mandated that statements of feed ingredients and analysis (of protein, fat, and fiber content) be placed on the bags. Due to very lax enforcement by state inspectors and the very local nature of our business, the tagging laws were not enforced until about 2015.
Why am I recounting this history? Because this was how most food and drugs for people were sold well into the late nineteenth and early twentieth centuries—no food labels; no statements of ingredients; no stated weight; no serving breakdowns of calories, fat, fiber, sugar, and protein; and no prescriptions required—not even for dangerous drugs. What first called my attention to this topic was a book by Deborah Blum titled The Poison Squad: One Chemist's Single-Minded Crusade for Food Safety at the Turn of the Twentieth Century. The Poison Squad consisted of young healthy men who volunteered as human guinea pigs to test the safety of additives, adulterants, and preservatives in foods sold for human consumption. It was an experimental program designed to test the toxicity of ingredients in food. It was begun in late 1902 by Harvey Wiley, who was chief chemist of the United States Department of Agriculture from 1882 to 1912. Wiley used the Poison Squad's results and the publicity surrounding the publication of Upton Sinclair's The Jungle to promote the Pure Food and Drug Act, which was passed in 1906.
The purpose of this paper is to recount the history of food inspection from a voluntaryist perspective. In England and the United States, the efforts to achieve food safety have relied upon two main methods: education and legislation (Whorton 2010, 156). I suppose one could argue that if education were sufficient and successful, legislation would be unnecessary, but we shall see how this argument worked out historically. But even if legislation were necessary, which I am not granting, governments did nothing that could not be done on the free market (and in many cases was already being done). Books on how to test for adulterated products at home were published. Some manufacturers voluntarily observed the highest standards of sanitation and cleanliness in their manufacturing plants and used only the best ingredients in their products. Private commercial testing labs were established, and third-party product certifications such as the Good Housekeeping Seal came into being. At the same time, we might ask: Why was strict liability for causing sickness or death not imposed upon the manufacturers and retailers that sold foods or drugs? Where were the insurance companies that might have provided product liability insurance? To answer these questions requires a look at the historical evolution of negligence, product liability, and tort law.
As B. I. Smith (2013) has noted, "Legislation designed to prevent the sale of unsafe or unwholesome food represents one of the oldest forms of government" intervention in the marketplace. The English Assize of Bread and Ale, enacted in the mid-1200s, contains one of the earliest references to food adulteration. Both in England and in British North America the establishment of public markets was usually a prerogative of city governments. The first meat inspection law in North America was enacted in New France (now Canada) in 1706 and required butchers to notify the authorities before animals were slaughtered (Institute of Medicine 1990). Municipal legislation covered everything from licensing vendors and mandating the use of just weights and measures to "prohibitions on buying and selling outside the public market, prohibition on reselling, forestalling, and engrossing." "In New York, unsound beef, pork, fish, or hides were to be destroyed by municipal officials by 'casting them into the streams of the East or Hudson rivers,'" and in New Orleans, officials were authorized to throw diseased meat into the Mississippi. In 1765, Lord Mansfield upheld the existence of public market regulations by referring to "the need for the 'preservation of order, and the prevention of irregular behavior'" (Novak 1996, 95–98).
In England, during much of the nineteenth century there were few regulations on the sale of adulterated foods and poisons. For example, arsenic—which is very similar in color and texture to white sugar, flour, or baking powder—was sold by grocers and an odd assortment of tradesmen and hucksters. “In short, anyone could sell” and anyone could buy. “Nothing more was expected of buyers than they must mind what they were buying.” The rule was caveat emptor. The burden of proof was on the buyer to be sure that his purchase caused him no harm. Since there was no statutory definition of a druggist or chemist, anyone could sell arsenic, and people commonly purchased it for use as a rat killer. The British Pharmaceutical Society, founded in 1841, devoted much of its activity to “achieving a Parliamentary definition of the title ‘Chemist and Druggist’” and agitated for a law that would permit only those vendors who met the legislative requirements to traffic in drugs and poisons (Whorton 2010, 113–14, 135).
As a result, arsenic was often implicated in both accidental and purposeful deaths. Unhappy wives often used arsenic to poison their husbands, and even if they were indicted for manslaughter, "juries were reluctant to convict unless it could be demonstrated that the suspect had actually bought some of the poison." People who caused accidental poisoning were usually not punished at all. Between 1837 and 1839, over five hundred cases of accidental poisoning by arsenic were reported. Deaths continued to mount during the 1840s. A classic example of a possible poisoning was that of a little girl in 1851 who was sent to a rural grocer to get "tea, sugar, flour, currants, red herrings, and two ounces of arsenic to deal with rats." Absent labeling, how was her mother to know which was arsenic? Parliament finally passed An Act to Regulate the Sale of Arsenic in June 1851, which required that a record be made of every sale and mandated that any quantity of less than ten pounds be colored so that it could not be confused with food ingredients (Whorton 2010, 114, 131, 133).
The law was often ignored by both buyers and sellers. Less than three months after its passage a woman used uncolored arsenic to kill her husband. She was executed, and the two pharmacists who sold her the arsenic were fined. Violations of the law continued, finally culminating in a ghastly tragedy in Bradford, Yorkshire, on October 25, 1858, when a confectioner's assistant requested a quantity of plaster of Paris, intended for use as an adulterant in the candy the shop was making, but was mistakenly sold uncolored arsenic. Despite the fact that one worker became sick while mixing the arsenic into the peppermint lozenges that he was preparing and that "the candies took an unusually long time to dry and were darker in color than usual," the confectioner did not realize there was a problem. He sold forty pounds of the lozenges to a vendor at Bradford's Saturday market and mixed the remainder into an assortment of other sweets, known as a Scotch mixture. In less than three days, twenty-one people had died from eating the candy and over seventy-eight were known to be seriously ill. The confectioner, along with the chemist and his apprentice who had sold the arsenic, was arrested and indicted for manslaughter. "When the trial was held…the jury could find no violation of the law. The episode was simply a highly regrettable accident" even though it was a case of gross negligence (Whorton 2010, 135–37, 139, 163).
Similar incidents of death and sickness due to food poisoning occurred in the United States. In his 1853 book The Milk Trade in New York and Vicinity, John Mullaly “included reports from frustrated physicians that thousands of children were killed in New York City every year by dirty (bacteria-laden) and deliberately tainted milk,” which was commonly known as “swill” milk.
Article from Mises Wire