The Costs of the Progressives

This historical series focuses on great leaps forward in Washington's consolidation of power on a variety of issues that affect our day-to-day lives, and the costs we pay with our taxes and our liberty.

Henry Adams, a direct descendant of two presidents, was one of the earliest proponents of what became known as “Progressivism.” In the late nineteenth century, Progressivism was taking hold of American elites, who hoped to perfect society by engineering social policies that would transform people into ideal citizens.

Henry believed America’s individualist ethos stood in the way of this vision, declaring that “the American people are obliged to choose between the principle of individualism and the principle of socialism.” As Henry explained it, individualists sought to strictly limit the powers of the state, while Progressives—following the socialist principle—“would merge the personality of the individual into that of the state.”

But what did this mean?

First, it meant viewing “society,” rather than the individual, as the basic unit of social organization. Progressives believed social order had to be imposed uniformly by a coercive government, and that “society”—an ambiguous concept—formed the state, which then established an administrative bureaucracy.

Second, it meant placing decision-making power in the hands of “experts”—that is, government-appointed bureaucrats who possessed the necessary wisdom to determine what was best for the American people as a whole. Essentially, Progressives contended that policies should prioritize the needs of “society” above the rights of individuals, whom they considered too ignorant and selfish to make their own decisions.

Writing sixty years after Henry Adams, economist Friedrich Hayek defended individualism against the Progressive assault. “True individualism,” Hayek argued, “affirms the value of the family, believes in local autonomy, and contends that much for which the coercive action of the state is usually invoked can be done better by voluntary collaboration.” He believed social order emerged spontaneously as individuals interacted with each other, forming families and communities.

To Hayek, the crucial difference between these philosophies was how they affected the decision-making process, which came down to questions of incentives, knowledge, and responsibility. In other words, who has the interests of your family in mind? Who best understands their unique needs and constraints? And who pays for the decisions people make?

The Progressive philosophy holds that the state should make these decisions, doing what’s best for society as a whole; that decision-making should be centralized, because only professional bureaucrats have the necessary expertise to know what’s truly best for people; and that costs should be socialized—in other words, that rural Floridians should be financially responsible for San Franciscans, and vice versa.

But the individualist philosophy recognizes that you care more about your family’s well-being and are more knowledgeable of their particular circumstances than Washington elites, and that the costs of any decisions should be privatized—you should be responsible for your own choices, not those made by distant strangers.

The choice between these two philosophies relates to all areas of life, but perhaps nowhere is the difference between them more pronounced than in the decisions we face regarding our children’s education.

In 1897, John Dewey—often hailed as the father of Progressive Education—published his “Pedagogical Creed,” in which he emphasized that one of the primary purposes of education was to teach children to “share in the social consciousness, and that the adjustment of individual activity on the basis of this social consciousness is the only sure method of social reconstruction.”


Dewey believed his philosophy embodied the ideals of both individualism and socialism. It was individualistic, he said, not because it encouraged the child to develop his own unique interests and abilities, but “because it recognizes the formation of a certain character as the only genuine basis of right living.” His philosophy was “socialistic” because “this right character is to be formed by the influence of a certain form of institutional or community life upon the individual, and that the social organism through the school, as its organ, may determine ethical results.”

Thirty years later, Dewey visited Stalin’s Soviet Union, and he wrote glowingly of the educational system he found there. “The Russian school children [are] much more democratically organized than our own, [and] are receiving through the system of school administration a training that fits them for later active participation in the self-direction of both local communities and industries.”

Dewey and other Progressives wanted to train children for social activism by perfecting factory-model education. Nineteenth-century educational reformers wanted schools as efficient and impersonal as America’s impressive manufacturing facilities, so they established a system that treats children like industrial workers. Under the watchful eye of an overseer, students toil silently until a bell signals their opportunity to eat and briefly socialize. Unlike factory workers, though, students take unfinished work home to complete before the following day.

Progressives, however, viewed children not as workers, but as commodities. In his 1916 treatise on Public School Administration, Ellwood Cubberley described schools as “factories in which the raw products (children) are to be shaped into products to meet the various demands of life. The specifications for manufacturing come from the demands of civilization, and it is the business of the school to build its pupils according to the specifications laid down.”

Cubberley dreamed of a national educational system modeled after giant corporations. He was inspired by Frederick Winslow Taylor’s theories of “Scientific Management,” which sought to enhance industrial output through centralized management, strictly regimented work, and regularized performance measurements. Following Taylor’s lead, Cubberley hoped to improve education through centralized administration, uniform curricula, and standardized testing.

Cubberley’s dream became a reality in the second half of the twentieth century, as the national government increasingly asserted its authority over America’s public schools. Progressives were confident that professional educators and government schools would produce better outcomes than parental education at home. The question for the twenty-first century, therefore, is which method of education has produced better results.

In 2019, Harvard Law Professor Elizabeth Bartholet published an article calling for a “presumptive ban” on homeschooling, which she believed conflicted with Progressive ideals. “Homeschooling presents both academic and democratic concerns,” she argued. Public education “makes children aware of important cultural values and provides skills enabling [them] to participate productively in their communities and the larger society through various forms of civic engagement. Even homeschooling parents capable of satisfying the academic function of education are not likely to be capable of satisfying the democratic function.”

Bartholet published her condemnation of homeschooling as faith in America’s public school system was plummeting, yet she neglected to provide any comparative analysis. So how do the two systems stack up?

Following Progressive educational theories, the federal government began exerting more authority over the educational system in the twentieth century, imposing national standardized testing in 1965 and establishing a federal educational bureaucracy, the Department of Education, in 1979. In the twenty-first century, both George W. Bush’s No Child Left Behind and Barack Obama’s Common Core demanded even more federal funding, bureaucratic oversight, and standardized testing for public schools.

The results have been dismal. Since 1970, the U.S. has massively increased educational expenditures, mostly to expand school administration, which has vastly outpaced the growth in the number of both students and teachers. Today, taxpayers spend more than $15,000 per public school student. Yet test scores have largely flatlined and, in some areas, even declined. These results became especially bleak after No Child Left Behind tied school funding to test scores, pressuring teachers to devote more time to “teaching to the test” at the expense of other subjects.

Homeschool students, by contrast, consistently outperform their public school counterparts by as much as 30 percentile points, even when comparing students from households with similar economic and education levels. These disparities should not be surprising when considering the incentives involved. When you spend your money on your children, you will likely be more attentive to both the cost and quality of education than bureaucrats spending somebody else’s money on somebody else’s children.

It is also puzzling how public schools are supposed to promote social development by grouping children according to age, creating an environment where students interact almost exclusively with peers at the same level of maturity. Despite “socialization” being the most commonly cited benefit of public education, homeschoolers significantly outperform their public-school peers when tested for social, emotional, and psychological development. Far from being isolated, they enjoy greater opportunity to socialize with people of all ages. Homeschool co-ops, for example, bring students together for group lessons, often taught by parents with expertise in the subject.

Unfortunately, many families that would prefer to homeschool simply can’t afford to withdraw their children from public schools, which raises the question: why does the government force them to continue financing the very schools they would like to flee?

As with education, Progressives continue to insist that only bureaucratic “experts” know what’s best for you and your family. Regulations are their key to "protecting" us all.

The Food and Drug Administration (FDA)—America's first consumer regulatory agency—was established in 1906 as part of the Progressive push to impose scientific expertise on society. In reality, the FDA was meant to protect everyone from the free market.

In July 2019, federal agents raided a warehouse in Philadelphia and seized $162,000 worth of "black market" goods. The illegal good in question was organic German baby formula, HiPP.

The reason for the confiscation of HiPP was that its label failed to include the words “low iron.” Baby formula sold in the United States is required to specify whether it is “with iron” or “low iron,” irrespective of whether the iron content is already included in the nutritional information.

Seizures of European baby formula are surprisingly common, because the FDA urges Americans to purchase formula that complies with its regulations, such as Similac PM 60/40, which includes the words “low iron” on its label. With FDA support, Similac became the formula of choice in the United States. Tragically, in 2021, it was this properly labeled low-iron formula that caused a string of infant illnesses due to factory contamination, leading to a nationwide shortage of domestic baby formula.

If you were among the millions of parents who struggled to find formula to feed your children during the shortage, you might have asked why so much of the domestic supply of formula came from one factory. Why was there a 17.5 percent tariff on foreign imports of baby formula during this shortage? Why weren't there more US suppliers of baby formula to fill the void?

To answer these questions, we need only look to the Progressive agenda of increasing reliance on government. Nearly two-thirds of America’s baby formula is purchased through the federal government’s Special Supplemental Nutrition Program for Women, Infants, and Children (WIC), and in 1989, Congress passed legislation requiring states to award single-purchase contracts to whoever offered the lowest bid, setting off a period of rapid consolidation in the formula industry.

Abbott Nutrition, which owns Similac, emerged as the largest of four formula manufacturers, and its massive government contracts allowed the company to spend millions lobbying for regulations that would deter more competitors from entering the market. Economists refer to this phenomenon as regulatory capture. In 2020, just a year before the contamination that led to the formula shortage, Abbott advertised its dedication to consumer safety, publicizing its involvement in securing labeling requirements, among other regulations.

The baby formula shortage illustrates the continued danger posed to us all by the bureaucratic "experts" and their regulations. Not only did regulators fail to prevent the baby formula contamination, but they directly encouraged the growth of the company responsible and, by virtually eliminating alternatives, created a situation in which millions of parents struggled to feed their infant children.

In 1992, the Department of Agriculture unveiled what would become the most recognizable image in nutrition: the Food Pyramid. In some respects, this image—which cost taxpayers nearly $1 million—is among the most successful government initiatives in history. A decade after it was unveiled, a Gallup survey found that 82% of Americans believed the Pyramid was the key to healthy eating. Parents across the country looked to it as a guide to feeding their children.

Yet obesity rates continued to climb. So where did the Food Pyramid go wrong?

In the 1960s, Senator George McGovern spearheaded the Committee on Nutrition and Human Needs to tackle America’s “hunger problem.” The committee’s original purpose was to expand food assistance programs, but in 1974, McGovern broadened its focus to cover not only malnutrition but also overeating.

Three years later, the committee published its “Dietary Goals for the United States.” At the time, obesity rates hovered around 10% among adults and 5% for children. McGovern hoped his report would help reduce these figures by encouraging low-fat, high-carbohydrate diets.

Over the next decade, even as the federal government continued promoting McGovern’s nutritional guidelines, obesity rates doubled.

This convinced Surgeon General C. Everett Koop to publish his own “Report on Nutrition and Health,” modeled after the Health Department’s 1964 study publicizing the dangers of cigarettes. The key to the anti-smoking campaign had been the agency’s marketing strategy, so Koop enlisted the Department of Agriculture to help distill McGovern’s recommendations into a simple graphic.

Unlike cigarettes, however, dietary guidelines could hardly follow the straightforward formula of telling people not to smoke, yet this is precisely what the Food Pyramid attempted to do with fat. Instead of advising people to substitute healthy unsaturated fats for unhealthy trans fats, the Pyramid grouped all fats together in the tip of the pyramid as foods to avoid.

The base of the pyramid—representing the largest portion of a healthy diet—was reserved for carb-heavy grains. As with fats, the Food Pyramid made no distinction between different types of carbohydrates, some of which are healthier than others.

So successful was the Food Pyramid campaign that it directly contributed to the “low fat” craze of the 1990s. Grocery stores became stuffed with “fat-free” foods, from potato chips to Devil’s Food Cakes. The 1988 report on nutrition, in fact, repeatedly encouraged food manufacturers to adopt this kind of labeling, which was added to the regulatory code in 1990.

Obesity rates have more than tripled since the McGovern Report was published, yet the government’s dietary guidelines continue to promote his basic recommendations despite mounting criticism from dieticians that the guidelines omit studies contradicting official advice. For families who want to practice healthy dietary habits, there may not be a simple formula, but the Food Pyramid serves as a testament to the follies of placing faith in the expertise of federal bureaucrats.

On May 9, 2016, Kate James and Tom Evans welcomed their first child, Alfie Evans, into the world. By the end of the year, however, their joy turned to horror as their infant son began suffering from seizures. Doctors diagnosed Alfie with a degenerative neurological condition similar to severe epilepsy. He remained on life support for a year, but doctors believed he would not recover and recommended ending his treatment.

Alfie’s parents refused to consent to ending his treatment, but Kate and Tom lived in Liverpool, where they were compelled to seek care exclusively through the British government’s socialized healthcare system, the National Health Service. Instead of respecting Kate and Tom’s wishes, as doctors in the United States would be expected to do, Alfie’s doctors applied for court approval to withdraw his life support without parental consent.

Both the U.S. and British common law systems recognize the “best-interest principle” as justification for overruling medical decisions parents make on their children’s behalf. In the United States, where healthcare is not a state-run enterprise, judges occasionally invoke the best-interest principle to overrule parents who refuse life-sustaining treatment for their children.

In Britain, where the State owns the healthcare system, the law follows different standards. As with many similar cases, the British courts sided with Alfie’s doctors, invoking the best-interest principle not to provide treatment, but to withhold it.

Kate and Tom were unwilling to give up, so they attempted to seek medical asylum in the United States and Rome. Their efforts won them an enormous following of supporters, nicknamed “Alfie’s Army,” and the Italian government even granted Alfie citizenship in an attempt to aid the parents. Yet the British courts still refused to let them take their child out of the country.

British officials deferred to medical experts, who were certain Alfie would be unable to breathe unassisted. Assured that Alfie would pass quickly, the judge ruled that it would be “inhumane” to keep him on life support.

But the experts were wrong. Alfie was able to breathe without life support, albeit with great difficulty. As Alfie gasped for breath, his parents begged the medical staff to give him oxygen, but they were now legally obligated to refuse. In the name of upholding Alfie’s “best interest,” British officials forced him to suffer five excruciating days off life support before he finally passed away.

Alfie’s story is but one example of the British government defying parental rights and deciding that a child’s “best interest” is to cease treatment. These tragedies reveal one of the oft-overlooked dangers of placing the government in charge of healthcare. When government bureaucrats are empowered to make medical decisions for your family, as they did for Alfie’s, whose interests do they really have in mind?

In December 2022, the scientific journal Biological Psychiatry published a study that compared adolescent brain scans before and after the COVID-19 pandemic. Teenagers scanned after the pandemic showed reduced cortical thickness, larger hippocampal and amygdala volume, and greater brain aging than those scanned before COVID.

This kind of adolescent brain development, the study noted, was typically associated with “exposure to early life adversity, including violence, neglect, and family dysfunction.” The post-COVID scans showed something new. “As a result of social isolation and distancing during the shutdown,” the study concludes, “virtually all youth experienced adversity in the form of significant departures from their normal routines.”

Early in the COVID-19 outbreak, it was clear that, unless somebody was immunocompromised, age was the primary determinant of the risk of severe health complications from the virus. Yet for many policymakers, the country needed a one-size-fits-all solution, with no regard for individual circumstances. Prudent policy for a seventy-year-old with respiratory problems, the thinking went, was equally necessary for a healthy family of eight.

The efficacy of masking and isolation mandates remains a matter of controversy, but their unintended consequences are becoming increasingly clear, as more studies come out documenting the harmful effects they had on childhood development.

In 2022, the journal Frontiers in Psychology published an article exploring the implications of masks on infants. “Human faces convey critical information for the development of social cognition,” the authors explained, but “with masks, the facial cues available to the infant are impoverished.” Other studies have documented the consequences. Columbia University researchers found that “babies born during the pandemic showed reduced motor and social-emotional development compared to pre-pandemic babies,” for example, and the medical journal Contemporary Pediatrics reported that pediatric speech disorders more than doubled.

The deleterious effects of isolation mandates on schoolchildren were made visible by a viral photograph of French preschoolers sitting alone in chalk squares during playtime. Numerous studies have since confirmed that children exhibited increased rates of depression, anxiety, and indignation due to the lockdowns. The CDC reported that mental-health-related emergency room visits for adolescents had increased by more than thirty percent between 2019 and 2020. Other studies have documented the dramatic spike in teenage substance abuse as a coping mechanism while in isolation.

We are still discovering the damage that masking and lockdown mandates subjected children to, which provides a bleak lesson about the dangers of replacing individual autonomy with one-size-fits-all policies in times of crisis. Can you trust even the most expert bureaucrat to design policies that are appropriate for the unique circumstances of your household, or does it make more sense to make your own decisions about what’s best for you and your family?

After Frank Potts molested an eleven-year-old girl in 1982, he was sentenced to fifteen years in prison. Six years later, he was granted an early release, despite his parole report warning that he was still dangerous. Sadly, the report proved prescient, as Potts was arrested a second time in 1994 after he molested another eleven-year-old girl. 

Once he was back behind bars, authorities began searching Potts’s forty-acre property on Alabama’s Garrett Mountain. Local rumors had long circulated that there were bodies buried around the property, and they were finally confirmed with the discovery of the remains of a nineteen-year-old man who had gone missing in 1989. The rough mountain terrain impeded their search for more bodies, but authorities believed Potts had committed as many as fifteen murders since his release from prison. 

Given that the parole report warned that Potts was high-risk, why was he freed?

The Anti-Drug Abuse Act, passed in 1986, dramatically expanded mandatory minimum sentences for drug offenses, including marijuana possession. The act was a gift to violent criminals and sex offenders, who were often granted early release to make room for convicted drug users. Frank Potts was one such beneficiary; his early release allowed him to enjoy a six-year killing spree.

From education to nutrition and healthcare, this series has looked at how individuals are better positioned to make decisions regarding their well-being than bureaucrats and politicians, but this doesn’t mean that people never make poor choices. Substance abuse certainly reflects poor personal choices, and many people look to the State to prevent it. But even policies meant to save people from their own reckless decisions, such as the War on Drugs, often cause more harm than they prevent. 

The fentanyl crisis reveals another way that prohibitionist policies have exacerbated the problems they were designed to prevent. Economist Mark Thornton has shown that one unintended consequence of drug prohibition is an increase in potency, because prohibition creates an incentive for smugglers to maximize the street value of their product while minimizing its size. This phenomenon, known as the Iron Law of Prohibition, played out during alcohol prohibition, as rumrunners found it more cost-effective to smuggle moonshine than beer.

Fentanyl, a synthetic opioid, is only the most recent—and most deadly—manifestation of the Iron Law of Prohibition. Fentanyl is now linked to nearly all overdose deaths, as drug dealers mix it with black market heroin, cocaine, and amphetamines. By contrast, legal intoxicants, such as alcohol and tobacco—though still dangerous—remain fentanyl free. 

We all want our loved ones to make good choices, but when they don’t, we hope they can recover and learn from their mistakes. The War on Drugs has failed in its efforts to protect people from making bad decisions, but it has made those mistakes far deadlier, and it has allowed innocent bystanders to become collateral damage along the way. 

San Francisco officials decided in 2022 that the city needed more trash cans. While they could have simply purchased ready-made bins, they decided instead to commission a local firm to create fifteen custom designs, costing between eleven and twenty-one thousand dollars each. They then placed QR codes on each receptacle so residents could vote for their favorite model, despite having no knowledge of the prices.

One member of the city’s Board of Supervisors, Democrat Matt Haney, criticized the plan, despite voting to approve it. “I think most people would say just replace the cans with cans that we know work in other cities,” he said. “A trash can is one of the most basic functions of city governance, and if the city can’t do something as simple as this, how can they solve the bigger issues?”

Haney’s criticism raises important questions about why bureaucrats spend money differently than private individuals do. Consider how you make spending decisions for your household. Most likely, you try to maximize quality while minimizing cost. But how would your priorities shift if you were purchasing something for a stranger? You would likely prioritize price well above quality.

Now imagine that you’re spending somebody else’s money. If you are purchasing items for yourself, you will probably prioritize quality over price. But if you were spending somebody else’s money for the benefit of strangers, you would likely pay far less attention to both price and quality than you would when spending your own money on yourself and your family. This is how bureaucrats make decisions about how to spend the money you pay in taxes.

Think about how much money your household pays in taxes each year. Let’s assume a family of four earning $70,000 a year, which is the national median household income. Most obvious are the taxes withheld from your paychecks, which include income and payroll taxes. Depending on the state you live in, you will have your salary reduced by roughly $21,000 each year.

But this is only the tip of the taxation iceberg. When accounting for property taxes, sales and excise taxes, and embedded taxes—the higher prices we pay for goods because of things like business taxes and tariffs—the total tax burden for a family earning $70,000 will be around $30,000 each year. And even this doesn’t include inflation, or the many fines and fees levied by government agencies.
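
To make that arithmetic explicit, here is a minimal sketch that simply combines the two estimates above. The $21,000 withholding figure comes from the text; the $9,000 figure for property, sales, excise, and embedded taxes is just the remainder implied by the $30,000 total, so both numbers should be read as illustrative assumptions rather than a real tax calculation.

```python
# Back-of-the-envelope estimate for a family of four at roughly the national
# median household income. The numbers mirror the estimates in the text;
# actual liabilities vary by state and household circumstances.

income = 70_000                 # annual household income
paycheck_withholding = 21_000   # income and payroll taxes withheld (estimate above)
other_taxes = 9_000             # assumed property, sales/excise, and embedded taxes

total_burden = paycheck_withholding + other_taxes
print(f"Estimated annual tax burden: ${total_burden:,}")   # ~$30,000
print(f"Share of income: {total_burden / income:.0%}")     # ~43%
```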

But it is enough to ask one simple question: how would an additional $30,000 per year change your family’s life? Would you voluntarily spend it on things like custom-made trash bins? Or would you make better spending decisions than bureaucrats who have little regard for price and quality when spending the money taken out of your hard-earned paycheck?