Journal of Libertarian Studies

Who Should Decide What Goes into a Can of Tomatoes? Food Laws from a Voluntaryist Perspective

ABSTRACT: This paper recounts the history of food inspection from a voluntaryist perspective. In England and the United States, the efforts to achieve food safety have relied upon two main methods: education and legislation. Governments did nothing that could not be done on the free market (and in many cases was already being done). Books on how to test for adulterated products at home were published. Some manufacturers voluntarily observed the highest standards of sanitation and cleanliness in their manufacturing plants. Private commercial testing labs were established, and third-party certifications such as the Good Housekeeping Seal came into being. At the same time, we might ask: Why was strict liability for causing sickness or death not imposed upon manufacturers and retailers who sold foods or drugs? Where were the insurance companies that might have provided product liability insurance? To answer these questions, this article looks at the historical evolution of negligence, product liability, and tort law.

Carl Watner is an independent scholar. This article appeared in The Voluntaryist, digital issue 192.

When I took over the operation of Inman Feed Mill in late 1987, none of the animal products that we processed and bagged were tagged. Cracked corn, whole corn, sweet feed, and chicken scratch all went out the loading door in plain, unmarked bags. The feed mill had been started in the early 1950s in a very rural area of upstate South Carolina, and most of its customers had face-to-face contact with the various owners. Never was there a doubt in the customer’s mind about what he was getting. Feed bags were not sewn; pieces of string tied in the ubiquitous miller’s knot secured their contents. If there was a question, we only had to untie the bag, show the contents to the customer, or place it on the scale if somehow the customer doubted how many pounds he was buying. If there were federal feed and grain laws, there was no evidence of their enforcement. However, there were South Carolina Department of Agriculture regulations which mandated that statements of feed ingredients and analysis (of protein, fat, and fiber content) be placed on the bags. Due to very lax enforcement by state inspectors and the very local nature of our business, the tagging laws were not enforced until about 2015.

Why am I recounting this history? Because this was how most food and drugs for people were sold well into the late nineteenth and early twentieth centuries—no food labels; no statements of ingredients; no stated weight; no serving breakdowns of calories, fat, fiber, sugar, and protein; and no prescriptions required—not even for dangerous drugs. What first called my attention to this topic was a book by Deborah Blum titled The Poison Squad: One Chemist’s Single-Minded Crusade for Food Safety at the Turn of the Twentieth Century. The Poison Squad consisted of young healthy men who volunteered as human guinea pigs to test the safety of additives, adulterants, and preservatives in foods sold for human consumption. It was an experimental program designed to test the toxicity of ingredients in food. It was begun in late 1902 by Harvey Wiley, who was chief chemist of the United States Department of Agriculture from 1882 to 1912. Wiley used the Poison Squad’s results and the publicity surrounding the publication of Upton Sinclair’s The Jungle to promote the Pure Food and Drug Act, which was passed in 1906.

The purpose of this paper is to recount the history of food inspection from a voluntaryist perspective. In England and the United States, the efforts to achieve food safety have relied upon two main methods: education and legislation (Whorton 2010, 156). I suppose one could argue that if education were sufficient and successful, legislation would be unnecessary, but we shall see how this argument worked out historically. But even if legislation were necessary, which I am not granting, governments did nothing that could not be done on the free market (and in many cases was already being done). Books on how to test for adulterated products at home were published. Some manufacturers voluntarily observed the highest standards of sanitation and cleanliness in their manufacturing plants and used only the best ingredients in their products. Private commercial testing labs were established, and third-party product certifications such as the Good Housekeeping Seal came into being. At the same time, we might ask: Why was strict liability for causing sickness or death not imposed upon manufacturers and retailers who sold foods or drugs? Where were the insurance companies that might have provided product liability insurance? To answer these questions requires a look at the historical evolution of negligence, product liability, and tort law.

As B. I. Smith (2013) has noted, “Legislation designed to prevent the sale of unsafe or unwholesome food represents one of the oldest forms of government” intervention in the marketplace. The English Assize of Bread and Ale, enacted during the reign of King John in the early thirteenth century, contains one of the earliest references to food adulteration. Both in England and in British North America, the establishment of public markets was usually a prerogative of city governments. The first meat inspection law in North America was enacted in New France (now Canada) in 1706 and required butchers to notify the authorities before animals were slaughtered (Institute of Medicine 1990). Municipal legislation covered everything from licensing vendors and mandating the use of just weights and measures to “prohibitions on buying and selling outside the public market, prohibition on reselling, forestalling, and engrossing.” “In New York, unsound beef, pork, fish, or hides were to be destroyed by municipal officials by ‘casting them into the streams of the East or Hudson rivers,’” and in New Orleans, officials were authorized to throw diseased meat into the Mississippi. In 1765, Lord Mansfield upheld the existence of public market regulations by referring to “the need for the ‘preservation of order, and the prevention of irregular behavior’” (Novak 1996, 95–98).

In England, during much of the nineteenth century there were few regulations on the sale of adulterated foods and poisons. For example, arsenic—which is very similar in color and texture to white sugar, flour, or baking powder—was sold by grocers and an odd assortment of tradesmen and hucksters. “In short, anyone could sell” and anyone could buy. “Nothing more was expected of buyers than they must mind what they were buying.” The rule was caveat emptor. The burden of proof was on the buyer to be sure that his purchase caused him no harm. Since there was no statutory definition of a druggist or chemist, anyone could sell arsenic, and people commonly purchased it for use as a rat killer. The British Pharmaceutical Society, founded in 1841, devoted much of its activity to “achieving a Parliamentary definition of the title ‘Chemist and Druggist’” and agitated for a law that would permit only those vendors who met the legislative requirements to traffic in drugs and poisons (Whorton 2010, 113–14, 135).

As a result, arsenic was often implicated in both accidental and purposeful deaths. Unhappy wives often used arsenic to poison their husbands, and even if they were indicted for manslaughter, “juries were reluctant to convict unless it could be demonstrated that the suspect had actually bought some of the poison.” People who caused accidental poisoning were usually not punished at all. Between 1837 and 1839, over five hundred cases of accidental poisoning by arsenic were reported. Deaths continued to mount during the 1840s. A classic example of a possible poisoning was that of a little girl in 1851 who was sent to a rural grocer to get “tea, sugar, flour, currants, red herrings, and two ounces of arsenic to deal with rats.” Absent labeling, how was her mother to know which was arsenic? Parliament finally passed An Act to Regulate the Sale of Arsenic in June 1851, which required records be made of every sale and mandated that any quantity of less than ten pounds be colored so that it could not be confused with food ingredients (Whorton 2010, 114, 131, 133).

The law was often ignored by both buyers and sellers. Less than three months after its passage a woman used uncolored arsenic to kill her husband. She was executed, and the two pharmacists who sold her the arsenic were fined. Violations of the law continued, finally culminating in a ghastly tragedy in Bradford, Yorkshire, on October 25, 1858, when a confectioner’s assistant requested a quantity of plaster of Paris, which was supposed to be used as an adulterant in the candy they were making, but was mistakenly sold uncolored arsenic. Despite the fact that one worker became sick while mixing the arsenic into the peppermint lozenges that he was preparing and that “the candies took an unusually long time to dry and were darker in color than usual,” the confectioner did not realize there was a problem. He sold forty pounds of the lozenges to a vendor at Bradford’s Saturday market and mixed the remainder into an assortment of other sweets, known as a Scotch mixture. In less than three days, twenty-one people had died from eating the candy and over seventy-eight were known to be seriously ill. The confectioner and the chemist and his apprentice, who had sold the arsenic, were arrested and indicted for manslaughter. “When the trial was held…the jury could find no violation of the law. The episode was simply a highly regrettable accident” even though it was a case of gross negligence (Whorton 2010, 135–37, 139, 163).

Similar incidents of death and sickness due to food poisoning occurred in the United States. In his 1853 book The Milk Trade in New York and Vicinity, John Mullaly “included reports from frustrated physicians that thousands of children were killed in New York City every year by dirty (bacteria-laden) and deliberately tainted milk,” which was commonly known as “swill” milk. Thomas Hoskins, a Boston physician, published his What We Eat: An Account of the Most Common Adulterations of Food and Drink with Simple Tests by Which Many of Them May Be Detected in 1861. In an 1879 speech before the American Social Science Public Health Association, George Thorndike Angell “recited a disgusting list of commercially sold foods that included diseased and parasite-ridden meat…that poison and cheat the consumer.” Jesse Battershall, a New York chemist, published his book Food Adulteration and Its Detection in 1887, in which he decried “candy laced with poisonous metallic dyes, mostly arsenic and lead chromate,” and “warned of cyanide, indigo, soapstone, gypsum, sand, and turmeric in teas” (Blum 2018, 2, 15, 29).

During the Spanish-American War, the US Army contracted with Swift, Armour, and Morris, three of the biggest meat-packing companies in Chicago, to supply refrigerated and canned meat provisions to soldiers in Cuba and the Philippines. Much of the meat arriving in Cuba “was found to be so poorly preserved, chemically adulterated, and/or spoiled that it was toxic and dangerous to consume.” After the war a court of inquiry was held to investigate these problems, and Commanding General Nelson A. Miles of the American forces in Cuba referred to the refrigerated products provided to the army as “embalmed beef.” General Charles P. Eagan, commissary general, defended his procurement practices and in the end “there were no official findings of large-scale trouble with meat supplies” (“United States Army Beef Scandal” 2020).

The “embalmed beef scandal” was just one of many events that gave impetus to the passage of new federal laws. Muckrakers at the beginning of the twentieth century highlighted the problems they saw in the Chicago meat-packing industry. The publication of Upton Sinclair’s The Jungle as a magazine series in 1905, and then its publication as a book in early 1906, brought pressure to bear on President Theodore Roosevelt to push for the adoption of the Meat Inspection Act and the Pure Food and Drug Act. Prior to their passage on June 30, 1906, there had been a number of what can only be called “political inspections” of the meat processors in Chicago. One inspection supported the meat companies’ claims that their processing facilities and methods were up to industry standards, while another confirmed the descriptions in The Jungle. On March 10, 1906, investigators sent by Secretary of Agriculture James Wilson arrived in Chicago to report on the conditions in the packing houses. They held Sinclair responsible for “willful and deliberate misrepresentation of fact” (Schlosser 2006). In their initial report a month later, they “concluded that meat inspection could and should be improved, but (they) also refuted most of the charges made in…The Jungle” (Ogle 2013, 78). Finally, in a June 8, 1906, letter to the president transmitting the reports of the Agricultural Department’s committee’s inspection of the stock yards, the inspectors stated that they believed that Sinclair had “selected the worst possible conditions which could be found in any establishment as typical of the general conditions existing in the Chicago abattoirs, and…willfully closed his eyes to establishments where excellent conditions prevail” (US House of Representatives 1906, 349). By early May 1906, Roosevelt had already decided to dispatch Commissioner of Labor Charles P. Neill and Assistant Secretary of the Treasury James B. Reynolds to Chicago for further investigation. This time “Roosevelt’s inspectors found stockyard conditions comparable to those Sinclair had portrayed and told of rooms reeking with filth, of walls, floors, and pillars caked with offal, dried blood, and flesh of unspeakable uncleanliness” (Goodwin 2013, 462).

The Meat Inspection Act of 1906 amended the earlier Meat Inspection Acts of 1890, 1891, and 1895, which had provided for “inspection of slaughtered animals and meat products but (which) had proven ineffective in regulating many unsafe and unsanitary practices” (Rouse 2020). The new law provided for the inspection of “all cattle, swine, sheep, goats, and horses both before and after they were slaughtered for human consumption,” as well as establishing new sanitary standards and ongoing monitoring and inspection of all slaughter and processing operations (ibid.). The Pure Food and Drug Act of 1906, on the other hand, banned all “foreign and interstate traffic in adulterated or mislabeled food and drug products” (“Pure Food and Drug Act” 2020). It was primarily a “truth in labeling law” that for the first time in federal legislation defined “misbranding” and “adulteration” by referring to the standards set by the US Pharmacopoeia and the National Formulary. As Harvey Wiley, chief chemist and the chief proponent of the new law, put it, “The real evil of food adulteration (and mislabeling) was the deception of the consumer” (Blum 2018, 103).

Despite these laws a new tragedy occurred some three decades later. During September and October of 1937, more than one hundred people in fifteen states died after having taken Elixir Sulfanilamide, which had been formulated by the chief chemist of the S. E. Massengill Company of Bristol, Tennessee. Sulfanilamide had been used in powder and tablet form to treat streptococcal infections. When it was found that it could be dissolved in diethylene glycol, it was marketed in liquid form after being tested for flavor, appearance, and fragrance. It was not, however, tested for toxicity, and the formulating chemist failed to realize that diethylene glycol was a deadly poison. After the product had been distributed, reports came back of deaths and sickness. The Food and Drug Administration then attempted to retrieve all of the product that had been sold. “Although selling toxic drugs was undoubtedly bad for business and could damage a firm’s reputation, it was not illegal. In 1937 the law did not prohibit sale of dangerous, untested, or poisonous drugs.” The unsold and unused elixir was seized, because it was misbranded, not because it was poisonous. According to the FDA, “elixir” implied that the product was in an alcoholic solution, whereas diethylene glycol contained no alcohol. “If the product had been called a ‘solution’ instead of an ‘elixir’ no charge of violating the law could have been made.” Dr. Samuel Evans Massengill, the owner of the firm, refused to accept any responsibility: “My chemists and I deeply regret the fatal results, but there was no error in the manufacture of the product. We have been supplying a legitimate professional demand and not once could have foreseen the unlooked-for results. I do not feel there was any responsibility on our part.” The company paid a fine of $26,100 for mislabeling, and the commissioner of the FDA at that time, Walter Campbell, “pointed out how the inadequacy of the law had contributed to the disaster….[T]hen citing other harmful products, [he] announced that ‘The only remedy for such a situation is the enactment by Congress of an adequate and comprehensive national Food and Drug Act,’” which came about the following year (Ballentine 1981).

How would these tragedies have been handled on the free market? No one can say for sure that they could have been avoided, because there are no guarantees in this world. Would the free market provide more equitable, practical, and moral solutions to the problems of swindling and cheating that have been part of human history? We do not maintain that market solutions would solve all of humanity’s problems, but neither can we assume that, because markets and other social mechanisms produce imperfect results, a central monopolistic authority will produce better ones. “Markets are desirable not because they lead smoothly to improved knowledge and better coordination, but because they provide a process for learning from our mistakes and the incentives to correct them” (Knych and Horwitz 2011, 33). As voluntaryists, we conclude from examining human nature, human incentives, and human history that a stateless society would not be perfect but would be a more moral and practical way of dealing with human aggression than reliance on a centralized, monopolized institution. Governments require taxes; taxes require coercion; coercion necessitates the violation of persons and properties, hardly moral or practical alternatives. Furthermore, we can say that government regulation usually gives consumers a false sense of security and reduces their incentive to do their own checking and acquire information about what they are buying. Government inspection and meeting government standards tend to preempt nongovernmental forms of inspection, such as product testing by third parties.

It is safe to say that a thorough application of the libertarian common law legal code and common sense would go far in preventing the kinds of catastrophes described here. The first thing to recognize is that in the absence of the state every manufacturer and every retailer would have strict liability for the products they sold. This incentive would induce them to exercise extreme care. As we have seen, particularly in the Massengill episode, neither the manufacturer nor any officials in the government’s Food and Drug Administration recognized that they had any personal responsibility for what happened. So long as they met the technical requirements of the statutory law, they were not liable for the deaths caused by sulfanilamide. As Rothbard has pointed out in Power and Market, with government regulation and reliance on government experts there is not the same measure of success or failure as when the individual relies on competitive market experts. “On the market, individuals tend to patronize those experts whose advice proves most successful. Good doctors or lawyers reap rewards on the free market, while the poor ones fail; the privately hired expert tends to flourish in proportion to his demonstrated ability” (Rothbard 1970, 17).

Where governments exist and government regulations and government inspections fail to prevent something like the sulfanilamide tragedy, what do the government regulators do? They call for new and more encompassing regulations. It is comparable to a successful terrorist attack today being used to call for stricter gun regulations and new antiterrorist laws. This is a perfect example of one government intervention leading to another.

How would the disasters described here be handled under the libertarian legal code? As Rothbard has written, “The free-market method of dealing, say, with the collapse of a building killing several persons is to” hold the owner of the building responsible for manslaughter. Furthermore, “a mis-statement of ingredients is a breach of contract—the customer is not getting what the seller states in his product.” This is “taking someone else’s property under false pretenses,” and therefore “under…the legal code of the free society which would prohibit all invasions of persons and property” the perpetrator would become liable. If the adulterated product injures the health of the buyer by substituting a toxic ingredient, the seller is further liable for prosecution for injuring and assaulting the person of the buyer (Rothbard 1970, 34).

Even with the existence of government, meat packers and manufacturers such as Armour and Swift still had an incentive to maintain quality and prevent food poisonings and deaths caused by their products. But they also had an incentive to use the fact that their products met government minimum standards as a shield against potential liability. As one commentator put it, “the responsible packer cannot afford to put upon the market meat virulently diseased. Government inspection, however…permits the packer to sell under sanction of law questionable products as first class” (US House of Representatives 1906, 345). This confirms Rothbard’s analysis that setting quality standards has an injurious effect upon the market:

Thus, the government defines “bread” as being of a certain composition. This is supposed to be a safeguard against “adulteration,” but in fact it prohibits improvement. If the government defines a product in a certain way, it prohibits change. A change, to be accepted by consumers, has to be an improvement, either absolutely or in the form of a lower price. Yet it may take a long time, if not forever, to persuade the government bureaucracy to change the requirements. In the meantime, competition is injured, and technological improvements are blocked. “Quality” standards, by shifting decisions about quality from the consumers to arbitrary government boards, impose rigidities and monopolization on the economic system. (Rothbard 1970, 18, 34)

Even in the face of government inspection and regulation, there is nothing to keep reputable producers from trying to exceed government standards. In England, Crosse and Blackwell, purveyors of food to royalty, began using purity as a general marketing device in the mid-1850s (Wilson, 141–43). Henry J. Heinz’s company, which is still in existence today, is another example. “Between 1865 and 1880, the H. J. Heinz Company had established a reputation for high-quality condiments.” Heinz predicated his business upon his belief that a “wide market awaited the manufacturer of food products who set purity and quality above everything else.” All of the company’s marketing and advertising efforts were focused on “Pure Food for the Table” and maintaining an unblemished brand record. In 1890, Heinz opened his factories to the public and invited his customers to come and inspect his operation for themselves. “Within a decade, more than 20,000 people per year were touring (his) manufacturing facilities.” As early as 1901, Heinz became one of the first companies to hire chemists and establish a quality control department. Nevertheless, Heinz was one of the few large-scale producers that supported government legislation covering “food production, labeling, and sales” (Koehn 2001, 72–86). As one historian has noted:

Heinz’s involvement in the campaign for food regulation grew out of his commitment to producing safe, healthy food. But he also had strategic reasons for championing federal regulation. Heinz believed that such legislation would help increase consumers’ confidence in processed foods, legitimating the broader industry and guaranteeing its survival. Stringent guidelines for food manufacturing and labeling, he believed, would enhance the reputation of the overall (food processing) business. Such guidelines might also focus public attention on his brand’s core attributes of purity and quality. Heinz’s standards for ingredients, production processes, and cleanliness were among the highest in the industry. The entrepreneur welcomed another opportunity to promote his products and his company’s identity.

From Heinz’s perspective, there were other advantages to endorsing federal regulation. Government-imposed standards for food manufacturing, labeling, and distribution would alter the terms of competition in the industry, forcing some companies to change their operating policies, usually at higher cost. Other manufacturers would be driven out of business. Both possibilities, Heinz realized, would enhance the Heinz Company’s competitive position. (Koehn 2001, 86–87)

So, there were definitely mixed motives at work among those who supported or opposed the passage of government legislation governing food inspection. The problem is that, given the existence of government, opposition to specific legislation is just that: opposition to particular regulations, not to the regulator itself. One can support a law or call for its amendment, but in either case one is in effect legitimizing the government. True opposition on voluntaryist grounds would be to oppose the government itself, calling for its abandonment rather than challenging it on the grounds that certain of its regulations are too stringent or inadequate.

What historical elements can we discern at work that give us some idea of how a free market in food safety might work were there no government? As we have seen, there were books written about food adulteration and how to detect adulterants. The magazine What to Eat began publishing in August 1896 and made consumers aware of the importance of food safety. In England, the names of manufacturers and of their toxic food products were made known to the public via books and lectures (Whorton 2010, 148, 151). During the nineteenth century, “Canada’s Hiram Walker Company, producer of Canadian Club blended whiskey, reacted to fakery in the U.S. market by hiring detectives to hunt cheats. The company took out newspaper advertisements listing the perpetrators or had names listed on billboard posters proclaiming ‘A Swindle, These People Sell Bogus Liquors.’” From the company’s perspective this was more effective than instituting legal proceedings against those who copied its blend. Other nineteenth-century examples include a variety of clubs such as the General Federation of Women’s Clubs, the National Consumers League, and the Woman’s Christian Temperance Union (which opposed the use of cocaine in Coca-Cola), all of which could have mobilized consumer boycotts that would have pressured producers to change their ways (ibid., 148, 151, 157). Today, other professionals and their associations, such as the National Association of Nutrition Professionals, would certainly promote healthy foods. Health insurance companies, which have a proprietary interest in seeing that their customers come to no harm, would want to alert them to untested, potentially dangerous, and toxic foods and chemicals (Blum 2018, 50, 114, 299).

Good Housekeeping magazine was a commercial enterprise sustained by subscription and advertising revenues. It was first published in 1885, and by 1912, when Harvey Wiley (of Poison Squad notoriety) resigned his post at the Department of Agriculture and became director of the Good Housekeeping Bureau of Foods, Sanitation, and Health, it had over four hundred thousand subscribers (Blum 2018, 272). By 1925 it had over 1.5 million subscribers (Anderson 1958, 24). Its Experiment Station was started in 1900 and was the predecessor of the Good Housekeeping Research Institute, which was established in 1910. “In 1909, the magazine established the Good Housekeeping Seal of Approval,” which continues to this day. Consumers’ Research was started in 1929, and its spinoff, Consumers Union, was organized in 1936. Both were devoted to publishing “comparative test results on brand-name products and publicized deceptive advertising claims” (“Consumers’ Research” 2020). The principals involved in these organizations published a best-selling book in 1933 titled 100,000,000 Guinea Pigs in which they pointed out that “pure food laws do not protect you” (Blum 2018, 285). The Non-GMO Project is another example of a consumer education organization. Begun in 2007 by two food retailers who wanted consumers to know which products contained no genetically modified ingredients, its first official food label was applied to tea products in 2012. A more recent effort can be found in Moms Across America’s Gold Standard seal program, which began in late 2019. It “is a multi-tiered level of verification that can be achieved only by food and supplement brands that” meet the most stringent standards (Temple 2019). There can be problems with corruption and violation of trust within such private groups, but this same criticism applies equally to government organizations, which are supported by taxes and even more prone to be influenced by lobbyists.

As we ponder this history, several overriding questions remain. Whether we champion the free market or the state, why did these abuses happen? Why weren’t manufacturers and retailers held responsible? Where were the insurance companies that could have provided some measure of protection to both the consumers and manufacturers? It certainly is a criticism of both the common law and government legislation that people who were readily known and identified were not held responsible for their actions, which caused death and harm to others. The bottom-line answer is that “during the 19th century, manufacturers had no liability for the goods they made. The liability of manufacturers for the losses suffered by consumers took several centuries to be established” in both common law and statutory legislation (“Example of the Development of Court Made Law” n.d.).

There are two aspects of the common law with which we need to be concerned. The common law concerns itself with contracts, under which two parties engage in a transaction in which the terms are normally outlined in advance and evidenced by a written or oral agreement. Fraud, which is intentional deception, usually occurs within the context of a contract (“Fraud” 2020). Torts, which are “wrongdoings not arising out of contractual obligations,” evolved out of the common law of prosecutions in eighteenth-century England (“The Historical Development of Law of Torts in England” 2017, introduction). Negligence is a form of tort. “A person who is negligent does not intend to cause harm” but is still held responsible, “because their careless actions injured someone” (FindLaw 2018a). Most of the deaths we have discussed here are examples of torts. The people who died were not intentionally poisoned but rather died due to accidents caused by carelessness.

As Rothbard explains,

In the free economy, there would be ample means to obtain redress for direct injuries or fraudulent “adulteration.”…If a man is sold adulterated food, then clearly the seller has committed fraud, violating his contract to sell the food. Thus, if A sells B breakfast food, and it turns out to be straw, A has committed an illegal act of fraud by telling B he is selling him food, while actually selling straw….The legal code of the free society…would prohibit all invasions of persons and property….[I]f a man simply sells what he calls “bread,” it must meet the common definition of bread held by consumers, and not some arbitrary specification. However, if he specifies the composition on the loaf, he is liable for…breaching a contract—taking someone else’s property under false pretenses. (Rothbard 1970, 19)

Under the common law, as it was interpreted throughout most of the nineteenth century, “a plaintiff could not recover for a defendant’s negligent production or distribution of a harmful instrumentality unless the two were in privity of contract” (“Common Law” 2020). Under this doctrine, there was no privity between a consumer who bought a product from a retailer and the manufacturer that produced it. An 1837 case in England, well-known to law students, illustrates how privity was originally seen.

A man purchased a gun from a gun maker, warranted to be safe. The man’s son used the gun and one of the barrels exploded, resulting in the mutilation of the son’s hand. As the son did not buy the gun there was no remedy in contract law. The court was asked to consider if the son could sue the gun seller or manufacturer, and if so what for. The Court said he could not sue because 1) in contract the son did not buy the gun and 2) could not sue for negligence because negligence did not exist in law. (“Example of the Development of Court Made Law” n.d.)

In another English case five years later, the court “recognized that there would be ‘absurd and outrageous consequences’ if an injured person could sue any person peripherally involved, and knew it had to draw the line somewhere…. The Court looked to the contractual relationships, and held that liability would only flow as far as the person in immediate contract (‘privity’) with the negligent party.” An early exception to the privity rule is found in a New York State case of 1852. Here it was held that mislabeling a potentially poisonous herb which could “put human life in imminent danger” was reason enough to breach the privity rule, especially since the herb was intended to be sold through a dealer. In an English case of 1883, a ship’s painter was injured when the platform (slung over the side of the ship) on which he was standing collapsed. The platform was faulty but there was no contract between the injured painter and the company that built it. The court ruled that the builder of the platform owed a duty to whomsoever used it, regardless of whether there was privity between them. As the court opined, “It is undoubted, however, that there may be the obligation of such a duty from one person to another although there is no contract between them with regard to such duty” (“Common Law” 2020).

Nevertheless, the privity rule survived. In 1915, a federal appeals court for the New York region held that “a car owner could not recover for injuries (caused by) a defective wheel.” The car owner’s contract was with the automobile dealer, not with the manufacturer. The court concluded that manufacturers were “not liable to third parties for injuries caused by them, except in cases of willful injury or fraud” (“Common Law” 2020). Finally, in 1932, the English courts recognized that third parties had the right to seek damages even if they had no direct dealings with the manufacturer of defective goods. A new rule of law, known as the duty of care, was enunciated. “The new law placed on the manufacturer a direct duty of care (due) to the consumer, not just the purchaser.” The ultimate consumer—“the person for whom the goods were intended”—was now protected under the law of negligence even though there was no contract between the end user and the producer of the product (“Example of the Development of Court Made Law” n.d.). Thus the core concept of negligence as it has developed in English and American law is that “people should exercise reasonable care in their actions, by taking account of the potential harm they might foreseeably cause to other people or their property” (“Negligence” 2020).

So, to return to our question: where were the insurance companies? The answer must be that for the most part, until the development of product liability, implied warranty, and negligence laws, there was nothing for the insurance companies to insure. However, it is clear from the general role that insurance companies would play in a free society that they would have a very significant impact on assuring food safety and setting requirements which their insureds would have to meet in order to maintain product liability coverage.

It is interesting to see how recent federal laws were applied to those responsible for a deadly outbreak of salmonella poisoning that occurred in 2008 and 2009. Executives and owners of the Peanut Corporation of America knowingly ordered that tainted peanut butter be shipped out to their distributors, with the result that nine people died and at least 714 others were sickened. Here are excerpts from a CNN report: “Food safety advocates said the trial was groundbreaking because it’s so rare corporate executives are held accountable in court for bacteria in food. Never before had a jury heard a criminal case in which a corporate chief faced federal felony charges for knowingly shipping out food containing salmonella” (Basu 2014). “Stewart Parnell (one of the owners) and his co-defendants were not on trial for poisoning people or causing any deaths stemming from the outbreak, and prosecutors did not mention these deaths to the jury” (ibid.). In other words, the perpetrators were still not held responsible for the death and sickness caused by their bad product. This was little different from the Bradford, Yorkshire, case of 150 years earlier, where the claim was that “no law was violated,” or from the 1937 Massengill tragedy, where the most that could be charged was a case of mislabeling. Would the libertarian legal code be more robust in response to such events? All we can hope is that it would be so.

Who is responsible for the foods that consumers put into their mouths, the market or the government, the buyer or the seller? As one consumer advocate has concluded, “government intervention to stop bad food has always come later than it should; and it has never been adequate to the problem” (Wilson 2008, 326–27). “Who is right? Who can say?” (Wilson 2008, 247). Paraphrasing Ayn Rand: Who decides what is the right way to make an automobile? Her answer was “any man who cares to acquire the appropriate knowledge and to judge, at and for his own risk and sake” (Rand 1990). So, to return to the question posed in our title: Who should decide what goes into a can of tomatoes? (Ogle 2013, 67) The answer is relatively simple: the owner of the can, the owner of the tomatoes, the insurance company that insures them, and the person who acquires the appropriate knowledge as to what is safe and what is not, and who is willing to take responsibility for that decision. Additionally, it is up to us as individual consumers to “do what is in our power to prevent ourselves and our families” from being cheated and poisoned. “Buy fresh foods, in whole form. Buy organic, where possible. Buy food from someone you can trust…. Cook it yourself…. Above all, trust your own senses” (Wilson 2008, 326–27).


Carl Watner, “Who Should Decide What Goes into a Can of Tomatoes? Food Laws from a Voluntaryist Perspective,” Journal of Libertarian Studies 24 (2020): 188–205.
