Power & Market

Why Didn't China Industrialize First?

06/17/2021Lipton Matthews

Historians still ponder why, despite its dominance in prior centuries, China failed to industrialize before Europe. Some contend that the culture of conformity engendered by Confucianism prevented the influx of disruptive ideas able to spark an economic revolution. Others posit that the Chinese preferred employment in government service to the pursuit of commercial activities. Although there is a kernel of truth in these arguments, such assumptions are insufficient to explain the sluggishness of China relative to Western Europe.

For example, notwithstanding the perceived conformity of medieval Europe, there were pockets of intellectual dissent. The eminent medievalist Edward Grant posits that, contrary to the stereotype of medieval Europe as mired in ignorance, the academy was characterized by lively debates. It is also largely unknown that during this era most students studied law and natural philosophy, since theology was a graduate degree requiring mature students. Moreover, Aristotle was the undisputed king of medieval philosophy, and his arguments were frequently applied to the study of religion. Medieval scholars were never slaves to the scriptures, as some would have us believe, nor were they unwilling to engage in scientific experimentation. Let us not forget that in the fourteenth century the monk Theodoric provided scientific explanations of rainbows, using experiments with a water-filled spherical flask designed to imitate a raindrop.

Compared to contemporary societies, medieval Europe may seem conformist, but critics forget that the character of conformity is equally important. Whereas students in China were encouraged to regurgitate classical philosophers, the medieval scholar Bernard of Chartres promoted the view that one ought to enhance learning by refining the ideas of one's intellectual ancestors, an outlook expressed by the metaphor of “standing on the shoulders of giants.” Hence the substance of conformity can positively impact creative output. By this account both medieval Europe and China were conformist, but they differed in their concept of conformity.

Furthermore, as in China, some scholars in Europe avoided technical professions, yet their ideas nonetheless revolutionized society. For example, the Protestant reformers placed a high value on literacy. As people became literate, they acquired an interest in books other than the Bible, so indirectly the Protestant Reformation resulted in the secularization of society. Essentially, the individualistic ethos of the Reformation implored people to seek knowledge on their own. As such, with the diminished importance of religion, men were primed to pursue science, economics, and other nonreligious affairs.

The Reformation therefore indicates that ideas are crucial to revolutionary changes. Hence the paucity of highly intelligent men working in technical professions is an inadequate explanation for China's failure to industrialize, because gifted people do not need to be industrialists for their ideas to promote growth. Having explored theories proffered by academics to describe China's inability to industrialize before Europe, we will now discuss possibilities offered by economic studies.

According to economist Mark Elvin, China suffered from a high-level equilibrium trap, meaning that the efficiency of existing production processes limited the demand for innovations. Yet Jan Luiten van Zanden and Bozhong Li, in a 2010 paper, note that lower labor costs in China did not stimulate the adoption of machines to minimize labor expenses. Based on these findings and those of Stephen Broadberry, it is apparent that economists have exaggerated the productivity of preindustrial China.

Another intriguing theory is proposed by scholars who argue that the clannish nature of Chinese society obstructed the formation of institutions facilitating large-scale partnerships. Avner Greif and Guido Tabellini write: “Clan loyalty and the absence of formal impartial enforcement limited inter-clan cooperation. There were, obviously, cities in China. Yet, intra-clan loyalty and interactions limited urbanization, city size and self-governance. Considering large cities, China’s urbanization rate remained between three and four percent from the eleventh to the nineteenth century, while the initially lower urbanization rate in Europe rose to about ten percent. Including small cities, urbanization rates were comparable, but China’s small cities were venues for cooperation among members of local clans rather than their melting pot. While the European cities gained self-governance, this did not happen in China until the modern period.”

Unfortunately, kinship structures in China hindered institutional transfers across cities, since they precluded the formation of associations independent of tribal ties. By impeding the creation of widescale trust, kinship groups deterred the formation of the networks required to create successful innovations and boost growth.

Evidently, examining the failure of China to industrialize is a complicated task; it appears to stem from an intricate interplay of factors ranging from economics to culture. It is therefore prudent for economists to adopt a multifaceted approach, exploring how dynamic interactions between cultural beliefs and institutions delayed the rise of an industrial China.


Wall Street Journal Remembers the Great Inflation

06/16/2021Robert Aro

The Wall Street Journal (WSJ) joined the chorus of headlines about rising prices by recounting past price inflation in the article “When Americans Took to the Street Over Inflation,” warning readers:

Today, after decades of nearly invisible inflation in the U.S., many Americans have little idea what it looks like... But America’s long inflation holiday shows signs of ending…

Ringing the alarm after the Labor Department’s Consumer Price Index (CPI) reading showed 5% in May, the author tells us:

History provides some useful lessons.

This is precisely the problem. How can any supposed solution of the 1970s be applied to today?

The author begins the narrative:

The nagging inflation of the late 1960s and 1970s didn’t happen overnight. It took root over years, building through a cascade of policy missteps and misfortunes… It would take two deep recessions and new ways of thinking about economics to tame the inflation of that period.

Going back to as early as 1966, protests began to sweep the nation:

Fed up with the increasing cost of living, they marched outside of supermarkets with placards demanding lower prices...

Ironically, the CPI data illustrates the limitation of trying to understand price increases through the use of CPI data. Per the chart below, the mid-1960s rise in the index doesn’t look any more remarkable than the periods before:

[Chart: Consumer Price Index]

The stated resolution of this inflation problem is the concerning point. It took two recessions and new economic ideas to somehow “tame” inflation.

This alleged taming remains stated but not proven. A suspension of disbelief is required to accept that (price) inflation was defeated without prices ever decreasing. Two recessions being the cure is a grandiose claim! However, it’s difficult to disprove, as this commonly accepted thesis cannot be proven to begin with. Like a nod to Keynes himself, victory is declared through the declaration of victory itself.

Tidbits of history are provided, explaining various government failures in fighting price increases. There were LBJ’s Vietnam War effort and “Great Society” social programs, which did nothing to lower prices. Nixon severed the international exchange of dollars for gold, sending the USD exchange rate tumbling, and also imposed price caps on meat. His administration famously urged housewives to try “shopping wisely.” Jimmy Carter had the Council on Wage and Price Stability, described by one director of the program as a “complete failure.”

The Fed gave in to government, acquiescing to LBJ’s and Nixon’s urges to keep pumping money into the system to maintain low rates, obliterating the strength of the dollar. As for mainstream economists, their Phillips curve models proved disastrous:

Some economists had thought that when unemployment rose, inflation would fall. Instead, both went up, giving rise to yet another new term, “stagflation.”

And to no surprise, women took some blame for the various missteps:

A flood of women into the labor force also made it harder to decipher a stable rate of unemployment. 

Perhaps when “too many” women go to work it ruins the Fed’s predictive models?

Last but not least, Fed chair (1979–1987) Paul Volcker continues to play the role of hero, the chair who raised interest rates and, per the author, restricted growth of the money supply. How higher rates fixed the problem is peculiar. The restrictive money supply idea is curious as well. The M2 money supply data from 1960 to 2000, shown below, reveal that under Volcker the money supply rose more steeply than in the decades before, never decreasing.

[Chart: M2 money supply, 1960–2000]

We should learn from the past. But this becomes difficult when the method by which these problems were resolved was never made clear. Between the government, the Fed, and economists, it seems no one had an adequate solution, save for the legend of Paul Volcker, who apparently fought inflation. At least that is the prevailing story.

Few people want to say the obvious: the Fed did nothing to stop price increases; society simply managed to live through hardship caused by government and central planners. Unfortunately, the most honest and consistent pattern over the decades is the perpetual decline of the dollar, the unaffordability of life for the masses, and ever-increasing debt levels showing no signs of letting up anytime soon. Society has not so much “tamed” inflation as found ways to “live through” inflation, which is made easier when we ascribe great feats to leaders said to have carried us through such trying times.


Will Special Interests Allow America’s "Longest War" to Finally End?

05/04/2021Ron Paul

Even if “won,” endless wars like our 20-year assault on Afghanistan would not benefit our actual national interest in the slightest. So why do these wars continue endlessly? Because they are so profitable to powerful and well-connected special interests. In fact, the worst news possible for the Beltway military contractor/think tank complex would be that the United States actually won a war. That would signal the end of the welfare-for-the-rich gravy train.

In contrast to the end of declared wars like World War II, when the entire country rejoiced as soldiers returned home where they belonged, an end to any of Washington’s global military deployments would result in wailing and gnashing of teeth among the military-industrial complex, which gets rich from other people’s misery and sacrifice.

Would a single American feel less safe if we brought home our thousands of troops currently bombing and shooting at Africans?

As Orwell famously said, “the war is not meant to be won, it is meant to be continuous.” Nowhere is this more true than among those whose living depends on the US military machine constantly bombing people overseas.

How many Americans, if asked, could answer the question, “why have we been bombing Afghanistan for an entire generation?” The Taliban never attacked the United States and Osama bin Laden, who temporarily called Afghanistan his home, is long dead and gone. The longest war in US history has dragged on because … it has just dragged on.

So why did we stay? As neocons like Max Boot tell it, we are still bombing and killing Afghans so that Afghan girls can go to school. It’s a pretty flimsy and cynical explanation. My guess is that if asked, most Afghan girls would prefer to not have their country bombed.

Indeed, war has made the Beltway bomb factories and think tanks rich. As Brown University’s Cost of War Project has detailed, the US has wasted $2.26 trillion on a generation of war on Afghanistan. Much of this money has been spent, according to the US government’s own Special Inspector General for Afghanistan Reconstruction, on useless “nation-building” exercises that have built nothing at all. Gold-plated roads to nowhere. Aircraft that cannot perform their intended functions but that have enriched contractors and lobbyists.

President Biden has announced that the US military would be out of Afghanistan by the 20th anniversary of the attacks of 9/11. But as always, the devil is in the details. It appears that US special forces, CIA paramilitaries, and the private contractors who have taken an increasing role in fighting Washington’s wars, will remain in-country. Bombing Afghans so that Max Boot and his neocons can pat themselves on the back.

But the fact is this: Afghanistan was a disaster for the United States. Only the corrupt benefitted from this 20 year highway robbery. Will we learn a lesson from wasting trillions and killing hundreds of thousands? It is not likely. But there will be an accounting. The piper will be paid. Printing mountains of money to pay the corrupt war profiteers will soon leave the working and middle classes in dire straits. It is up to non-interventionists like us to explain to them exactly who has robbed them of their future.

Reprinted with permission.


What Makes Western Civilization Different?

05/03/2021Lipton Matthews

Believing that Western civilization is not unique is a fashionable sentiment. Today many argue that the West has no distinctive traits, though critics of that view counter that individualism, freedom, and human rights are innately Western constructs. Yet there is more to the West than its history of freedom. Western civilization is readily rejuvenated by creative elements. Throughout history, other cultures have relied on new knowledge to justify old beliefs, but Westerners have consistently allowed foreign ideas to unleash their revolutionary potential.

During the Middle Ages, for example, the Latin West was mesmerized by the teachings of Islamic scholars. Such knowledge was appropriated to create novel intellectual inquiries. Historian Peter O’Brien offers a glimpse of this spectacular development: “The knowledge transmitted to Latin Christendom via Islamic civilization touched and upset virtually every discipline. Thomas Aquinas, for instance, devoted the lion's share of his scholarly attention to wrestling with theological and epistemological quandaries stemming from Arab philosophy…Lettered Europeans scrambled to absorb this torrent of new knowledge pouring in from their rivals. Those who could, journeyed to loci of Islamic erudition. "Since at present the instruction of the Arabs...is made available to all in Toledo," explained Daniel of Morley, "I hastened there to attend the lectures of the most learned philosophers in the world." Both Adelard of Bath and Ramón Llull travelled to the Levant to learn Arabic, study Arab texts and carry the newly acquired knowledge back to Europe.”

Earlier in his text, O’Brien recounts evidence that may suggest that Western civilization is not unusual in this regard: “Cultivated Muslims embraced ancient learning. Not only did they preserve and venerate the works of Greek masters such as Plato, Aristotle and Euclid that were lost to the Latins, Islamic and Jewish sages the likes of Musa al-Khwârizmî, al-Farabi, al-Ghazzali, Abu Ma'shar (Albumasar), Ibn Sina (Avicenna), Ibn Rushd (Averroës), and Maimonides augmented and improved the inherited storehouse of knowledge.”

But what O’Brien failed to say is that the Islamic Golden Age was inspired by a few dissident Muslims who were influenced by Hindu, Greek, and Persian learning. Furthermore, Christian intellectuals trained by the scholars at Jundi Shapur played a crucial role in translating ancient texts. It was only in the West that intellectual revolutions became a permanent fixture. Notwithstanding the brilliance of some Muslim scholars, in the Islamic faith reason is intertwined with revelation. Up until the nineteenth century, the principle of natural causality was denied by Muslim intellectuals. Although Christians believed that natural laws were instituted by God, there was the expectation that one would explore the natural world without resorting to religion.

The destruction of the Abbasid Caliphate by the Mongols negatively affected the course of science in the Islamic world, but a revival of traditional schools hostile to scientific inquiry was already under way prior to the invasion. Islam lacked a culture able to sustain the passionate debates that would lead to continuous revolutions. Ali A. Allawi, in The Crisis of Islamic Civilization, lucidly explains the tension between Islam and nontheological reasoning: “The Arabic word ‘individual’ – al-fard – does not have the commonly understood implication of a purposeful being, imbued with the power of rational choice…The power of choice and will granted to the individual is more to do with the fact of acquiring these from God, at the point of a specific action or decision – the so-called iktisab – rather than the powers themselves which are not innate to natural freedoms or rights…Therefore to claim the right and responsibility of autonomous action without reference to the source of these in God is an affront, and is discourteous to the terms of the relationship between human beings and God…None of the free thinking schools in classical Islam – such as the Mu’tazila – could ever entertain the idea of breaking the God-Man relationship and the validity of revelation, in spite of their espousal of a rationalist philosophy.”

Similarly, the Chinese are excessively praised for their successes during the Song Dynasty. Using Chinese history as a case study, multiculturalists often posit that the West is not peculiar. As David Landes informs readers, the Chinese did in fact build a great civilization, but under the spell of hubris they shunned foreign technologies, thinking that outsiders could not enrich a superior culture: “Along with Chinese indifference to technology went imperviousness to European science. The Jesuits and other Christian clerics brought in not only clocks but (sometimes obsolete) knowledge and ideas. Some of this was of interest to the court: in particular, astronomy and techniques of celestial observation were extremely valuable to a ruler who claimed a monopoly of the calendar and used his mastery of time to impose on the society as a whole…Little of this got beyond Peking, however, and the pride some took in the new learning was soon countered by a nativist reaction that reached back to long forgotten work of earlier periods. One leader of this return to the sources, Wen Ting (1635–1721), examined the texts of mathematicians who had worked under the Song dynasty (10th–13th centuries) and proclaimed that the Jesuits had not brought much in the way of innovation.”

Unlike countless societies, Western civilization is willing to admit when its culture requires regeneration from outside forces, and this is a major reason for its dynamism. Had the West not been a self-critical society, there is no doubt that it would have stagnated like other areas of the world. Another interesting point about the West is the centrality of the idea of progress. Because Western culture is self-reflective, it can objectively judge the true state of society. As such, innovation often trumps traditionalism. The Renaissance, for example, repudiated much of medieval scholasticism.

Yet one cannot discuss the concept of progress in Western civilization without examining its link to the Christian notion of linear time. Unlike the Greeks, the Chinese, and other civilizations, Christianity asserts that time will not revert to older cycles. Based on this reasoning, society can only go forward. Obviously, developments can be either progressive or regressive, but one must always strive to attain progressive ends and not return to the ignorance and superstition of the past. In short, despite the rantings of multiculturalists, Western civilization is indeed special.


What Clarence Thomas Gets Wrong about Big Tech

04/08/2021Jeff Deist


Supreme Court Justice Clarence Thomas's recent concurring opinion in the Biden v. Knight decision sent hopeful tremors across conservative legal circles and drew condemnation from libertarians. Was Thomas finally laying the groundwork for regulation of Big Tech, which conservatives correctly view as both deeply biased against them and actively biased in favor of left-wing causes?

At first blush, the case primarily concerned First Amendment questions about whether former president Donald Trump (while in office) could block certain individuals or groups from following his Twitter account.1 The Blockees argued that a sitting president should not be able to prevent access to "news" he creates on social media, especially when particular tweets relate to official government business. Yet if Twitter is indeed a constitutionally protected "public forum," how does the company get away with deplatforming the president of the United States?

No clear answers from the court were forthcoming: since Trump is no longer in office, the court remanded the case back to the Second Circuit to dismiss on grounds of mootness. But Thomas took the opportunity to move past any free speech questions and make a much broader case for Congress to radically rewrite regulations for the modern digital space. In his words, the "principal legal difficulty that surrounds digital platforms" is that "applying old doctrines to new digital platforms is rarely straightforward." But in the same concurrence, his language appears to argue simply for the application of existing legal doctrines, namely those surrounding antitrust, common carriers, and public utilities regulation. Thus there is a tension between his view that new thinking is required and his default to statutory or bureaucratic approaches to defeat what he sees as de facto tech monopolies:

The analogy to common carriers is even clearer for digital platforms that have dominant market share. Similar to utilities, today’s dominant digital platforms derive much of their value from network size. The Internet, of course, is a network. But these digital platforms are networks within that network. The Facebook suite of apps is valuable largely because 3 billion people use it. Google search—at 90% of the market share—is valuable relative to other search engines because more people use it, creating data that Google’s algorithm uses to refine and improve search results. These network effects entrench these companies. Ordinarily, the astronomical profit margins of these platforms—last year, Google brought in $182.5 billion total, $40.3 billion in net income—would induce new entrants into the market. That these companies have no comparable competitors highlights that the industries may have substantial barriers to entry.

Size and dominance in the provision of "essential services" are arguments we've heard against everything from railroad trusts to Ma Bell. Yet Thomas's common carrier analogy is far less accurate for today's search and social media platforms than it was for tech companies at the birth of widespread internet adoption. In the 1990s, when Congress passed the Communications Decency Act, telephony was the prevailing regulatory model. Internet service providers like AOL provided "pipe" in the form of fiber optic cable, akin to a utility providing water or electricity. Satellite and cellular ISPs would come only later. Search engines and browsers like AltaVista were the on-ramps to this information superhighway. Email companies like Hotmail provided instantaneous worldwide text communication across the old telephone networks. These early internet firms bridged the gap between old analog systems and the emerging digital networks we take for granted now.2 But unlike the AOLs of yesteryear, the biggest tech players today mostly own cloud servers and endless lines of software code, brought to life via websites or social media platforms. Yes, servers can crash due to traffic. But for the most part companies like Facebook and Twitter resemble neural networks more than pipelines. And who knows what the rapidly evolving technology landscape will look like even five or ten years from now?

This is precisely why a sclerotic federal bureaucracy ruling over Silicon Valley is the last thing we need, despite Thomas's valid concerns (in my view). Contra the CDA, and contra Justice Thomas, common law tort and contract actions are the pragmatic and just approach to address harms caused by tech companies. As I argued in "A Tort Law Approach to Fighting Big Tech," long-standing legal concepts like equitable estoppel, conversion, fraud, and waiver are available and malleable. Libertarian legal theory—rooted in natural law, property, and restitution—relies on common law "discovery" rather than positive law edicts. Ever-evolving common law, highly temporalized and localized, provides the best mechanism for determining what actions by tech companies should give rise to legal liability. Stealing a horse in 1800s Tombstone, Arizona, is different than stealing a horse in 2021 Middleburg, Virginia: the former may have left the victim dead in the desert and the perpetrator ordered whipped by an exceedingly unsympathetic jury. Today, deplatforming a celebrity from social media or unbanking a small business owner may leave them metaphorically stranded in the desert. In both cases societal and technological evolution should compel us to adjust our ideas of harm and proportionality. Shouldn't common law, rather than rigid and highly political statutory or regulatory frameworks, give us better outcomes?

The larger question for libertarians is whether their existing conceptions of property rights, harms, torts, and free speech still work in a thoroughly digital era. Principles may not change, but facts and circumstances certainly do. Rothbard's strict paradigm for what ought to constitute actionable force, especially as discussed in part II of The Ethics of Liberty, requires some kind of physical invasion of person or property. In doing so, Rothbard necessarily distinguishes between aggression (legally actionable) and the broader idea of "harm." The former gives rise to tort liability in Rothbardian/libertarian law; the latter is part of the vicissitudes of life and must be endured. Theorists like Professor Walter Block and Stephan Kinsella have expanded on this "physical invasion" rule, applying it to everything from blackmail to defamation to (so-called) intellectual property. Aggression against physical persons or property creates a legally actionable claim, mere harm does not. 

But Rothbard's bright-line rule seems unsatisfying in our digital age. If anything, the complexity of modern information technology and the pace of innovation make the case against bright-line tests. For one thing, the sheer scale of instantaneous information ought to inform our view of aggression vs. harm. A single (false) tweet stating "famous person X is a pedophile" could reach hundreds of millions of people in a day, ruining X's life forever. This is a bit worse than a punch to X's nose in a bar fight, to put it mildly. Moreover, physical trespass against property takes on an entirely different form when said property is intangible, e.g., Twitter's platform and servers. There is a difference, at least in scale, between Donald Trump occupying a tiny sliver of data storage (at almost no additional marginal cost to Twitter) and Donald Trump occupying the lobby at Twitter’s headquarters. Again, surely the best argument is to let naturally evolving common law grapple with these questions. Yes, we don't have private common law courts, and yes, we have a gargantuan statutory overlay at both the federal and state levels. But we ought to argue for the underlying principle of evolving, discovered law—and advocate for legislatures to get out of the way of private litigating parties and juries.

Common law tort and contract doctrines, not a hopelessly befuddled Congress or agency bureaucrats, can regulate Big Tech. But libertarians and conservatives should broaden their conceptions of tort and contract remedies, and support the evolution of what constitutes harm in a digital era. "Private companies" that openly deplatform, impoverish, and unperson dissident voices are waging a war of attrition. Those inclined to fight back should look to courts rather than legislatures, and they don't need novel legal theories to do so. Common law tort and contract will do just fine.

  • 1. In what universe does "Congress shall make no law" apply to presidents? Our universe, apparently. And is political speech really much of a virtue, in the sense of securing individual liberty? Property rights, to the extent they are respected, anyway, yield very tangible benefits for average people. It is less clear whether so-called political rights (voting, speech, petition) have done much good for the modern West at all—look at the people who keep getting elected, for starters!
  • 2. Since all of this was new, the authors of the Communications Decency Act reasonably decided these nascent companies should not be legally liable for the misdeeds or defamatory content produced by their users. After all, if two individuals enter into a criminal conspiracy over AT&T's phone network, we don't charge AT&T as a coconspirator. And in stark contrast to the social media companies of today, early ISPs and search engines exercised almost no oversight over content whatsoever, much less editorial oversight. They were truly neutral platforms.
    Still, the CDA's chief mechanism for promoting a largely unregulated internet—Section 230—not only provides certain classes of tech companies with immunity from federal suits, but also preempts certain kinds of cases from being heard in state courts. This was and is constitutionally shaky, as Congress has no business telling state courts what kind of lawsuits they may hear.

Want a Job? Get a Shot!

03/23/2021Ron Paul

Mask tyranny reached a new low recently when a family was kicked off a Spirit Airlines flight because their four-year-old autistic son was not wearing a mask. The family was removed from the plane even though the boy’s doctor had decided he should be exempted from mask mandates because he panics and engages in behavior that could pose a danger to himself when wearing a mask.

Besides, four-year-olds do not present much risk of spreading or contracting coronavirus. Even if masks did prevent infections among adults, there would be no reason to force children to wear masks.

Mask mandates have as much to do with healthcare as Transportation Security Administration (TSA) screenings have to do with stopping terrorism. Masks and TSA screenings are “security theater” done to reassure those frightened by government and media propaganda regarding coronavirus and terrorism that the government is protecting them.

Covid oppression will worsen if vaccine passports become more widely required. Vaccine passports are digital or physical proof a person has taken a coronavirus vaccine. New York is already requiring that individuals produce digital proof of taking a coronavirus vaccine before being admitted to sporting events.

Imagine if the zealous enforcers of mask mandates had the power to deny you access to public places because you have not “gotten your shot.” Even worse, what if a potential employer had to ensure you were “properly” vaccinated before hiring you? This could come to pass if proponents of mandatory E-Verify have their way.

E-Verify requires employers to submit personal identifying information — such as social security numbers and biometric data — to a government database to ensure job applicants have federal permission to hold jobs.

Currently, E-Verify is used only to confirm that a job applicant is a citizen or legal resident. However, its use could be expanded to serve other purposes, such as ensuring a potential new hire has taken all the recommended vaccines.

E-Verify could even be used to check if a job applicant has ever expressed, or associated with someone who has expressed, “hate speech,” “conspiracy theories,” or “Russian disinformation,” which is code for facts embarrassing to the political class.

Many employers will be reluctant to hire such an employee for fear their businesses will become the next targets of “cancel culture.” Those who doubt this should consider how many businesses have folded under pressure from the cultural Marxists and fired someone for expressing an “unapproved” thought.

Politicians and bureaucrats have used overblown fear of coronavirus to justify the largest infringement of individual liberty in modern times. Covid tyranny has been aided by many Americans who are not just willing to sacrifice their liberty for phony security, but who help government take away liberty from their fellow citizens.

The good news is that, as it becomes increasingly clear that there was no need to shut down the economy, throw millions out of work, subject children to the fraud of “virtual” learning, and force everyone to wear a mask, more people are turning against the politicians and “experts” behind the lockdowns and mandates. Hopefully, these Americans will realize that, in addition to coronavirus lockdowns and mandates, the entire welfare-warfare-fiat money system is built on a foundation of lies.


Will 2021 Be the Year of the Revolution?

03/03/2021Jeffrey Overall

On the heels of the covid-19 pandemic, unprecedented socioeconomic challenges have emerged on a global scale, including growing resentment toward government-enforced lockdowns, government corruption, inequality, and climate change. In the United States alone, the economic costs of the pandemic have been estimated at $16 trillion. With international socioeconomic systems in flux, policy makers, economists, and forecasters are left puzzled about what will happen next. In this article, I provide three predictions for 2021.

Over the past decade, there have been more social unrest and political uprisings than ever before. In 2019, there were protests in over 110 countries tied to inequality (through the #MeToo and Black Lives Matter (BLM) movements), climate change, and the further encroachment of the state into the lives of citizens, as occurred in Hong Kong. In 2020, mass antigovernment protests emerged throughout North America and Western Europe involving police brutality against racialized communities and government-enforced lockdowns. However, since the increase in government-enforced lockdowns, these movements of discontent have been suppressed, which is worrisome because, as Sigmund Freud famously argued, “unexpressed emotions will never die. They are buried alive and they will come forth later in uglier ways.” Indeed, unexpressed emotions fester and, as we saw with the storming of Capitol Hill, eventually come out, often in more extreme ways.

Prediction #1: 2021 will be the year of the revolution

Heavy-handed covid-19 government-enforced lockdowns will be matched with social unrest of greater magnitude until a G20 government is overthrown. Because social unrest is the greatest fear of any state, governments across the globe have strict measures in place to suppress it and protect their authority. In China, for example, the Financial Times reported that the Communist Party of China (CPC) censors the media and polices protests strategically. Similar approaches, to varying degrees, including mass government surveillance, have been and continue to be used in other countries. In the United States, the magnitude of the invasion of privacy rights was exemplified by the governmental surveillance revealed by the whistle-blowing activities of Edward Snowden, the former National Security Agency (NSA) analyst, in 2013. The Snowden documentation showed that the mass surveillance conducted by the US government was not only localized within the US but also extended abroad, involving foreign nationals and governments. Consistent with these privacy concerns, governments across the globe are releasing digital contact tracing technologies (in the form of mobile apps) not only to monitor the movements of citizens, but also to extract their mobile phone data. The New York Times has reported that several regimes are taking this opportunity to seize new powers that have little to do with the pandemic, with little likelihood of relinquishing them after it is resolved. To address the possibility of oppression, UN human rights experts have warned governments not to exploit emergency pandemic measures to erode the human rights of their citizens. As a result of these issues, trust in government is at an all-time low, and since the pandemic emerged, almost everyone you speak with has a theory of what they think is "really going on."

Prediction #2: A high-level government official will reveal additional information about the pandemic, leading to a formal accusation against a foreign government

To restore trust, it is expected that a high-level whistle-blower from a G20 government, similar to the Snowden case, will come forward to provide "evidence" that a foreign state manufactured the virus in an act of biochemical war.1 Biochemical warfare is not a new phenomenon. It was used by British colonialists when they gave the indigenous people of Canada blankets infected with smallpox, an act that caused many deaths in a localized epidemic.

With the international collusion that occurred during the 2016 US presidential election, the US government is sensitive to foreign entities interfering in its affairs. At the end of 2019, two years after the US-China trade war began, covid-19 emerged out of China and spread throughout the globe, causing mass socioeconomic devastation. Although there has been anti-China propaganda for nearly twenty years—due, in part, to the transference of manufacturing employment from industrialized nations to China and, now, the active trade war—the US president suggested that China manufactured the virus. In August 2020, Forbes published an interview with Dr. Mark Kortepeter, a physician and biodefense expert for the US Army, in which he concluded that covid-19 “has some ‘desirable’ properties as a bioweapon, but probably not enough to make it a good choice for military purposes.”

This formal accusation will likely trigger an "us versus them" mentality in the United States, centered on stimulating nationalism and reducing social unrest. This approach has been used by the US government for nearly a century. Specifically, the US government has consistently portrayed an "enemy figure" that citizens need to fear and be protected against. In the twentieth century, the enemies were the Germans, the Russians, the Communists, the Japanese, and Middle Easterners. In the early part of the twenty-first century, citizens were programmed to fear Islamic terrorists. In these instances, the government is positioned as the only protector of citizens against foreign enemies.

Although it is expected that China will deny these accusations, there are four possible motivations for using covid-19 as a biochemical weapon:

A. Retaliation for the US-China trade war, which caused the Chinese economy to slow to its lowest level in three decades while industrial output fell to a seventeen-year low. This is of strategic significance to the Chinese regime as it has predicted that a significant economic slowdown could be a potential cause of social unrest. Much to the chagrin of Beijing, there was mass social unrest throughout Hong Kong from 2019 to 2020, causing the regime significant domestic and international challenges that persist today.

B. Retaliation for the support given by the US government to the Hong Kong protest movement, the rights of the Tibetan people, and the international rights of Taiwan. All three locations are of strategic importance to Beijing.

C. The pandemic has caused economic devastation and mass social unrest in the US. Depending on the magnitude, this has the potential to cause the US government to redeploy portions of its military from key international areas, like the South China Sea, which is of utmost strategic importance to China, to domestic locations. Consistent with this prediction, to counter the Capitol Hill riots of 2021, the US government summoned the support of the National Guard.

D. In recent years, Beijing has focused resources on shifting its economy from manufacturing to services. With the pandemic forcing service-based employees to work remotely, we are seeing a surge in digital nomads and remote workers. This has created an international labor market whereby organizations are able to hire international employees and allow them to work remotely from anywhere in the world. Like the transference of manufacturing employment from North America and Western Europe to China that began twenty years ago, there are significant cost savings available to organizations that begin recruiting their managerial and staff positions from the international labor market and, in particular, China.

Prediction #3: Universal basic income

As we saw with the Hong Kong protests, people who have nothing to lose are more willing to revolt. With this in mind, to appease the population, curb social unrest, and restore trust in government, a universal basic income will be announced this year. By doing this, citizens would become increasingly reliant on the government for monetary support and, as a result of this dependence, less willing to revolt.

  • 1. It is important to note that questionable evidence was used by the Bush administration in its determination that Iraq had weapons of mass destruction (WMD). This evidence was the basis for the US government's decision to go to war with Iraq in 2003. It was later determined that these weapons were not in the possession of the Iraqi government.

Why Trust the Experts?

02/11/2021Lipton Matthews

It has now become commonplace to accuse anyone who opposes covid lockdowns of being “antiscience.” This sort of treatment persists even when published scientific studies support the antilockdown position.

There are sociological, economic, and cultural reasons why experts will take the politically popular position, even when the actual scientific evidence is weak or nonexistent.

Experts Are Biased and Are Self-Interested like Everyone Else

Though we are often encouraged to listen to experts because of their intelligence and expertise, there is a strong case for us to be skeptical of their pronouncements.

Beliefs serve a social function by indicating one’s position in society. Hence, to preserve their status in elite circles, highly educated experts may subscribe to incorrect positions, since doing so can confer benefits. Refusing to hold a politically popular viewpoint could damage one’s career. And since upper-class professionals are more invested in acquiring status than working people are, we should not expect them to jettison incorrect beliefs in the name of pursuing truth. Cancel culture has taught us that, to decision makers, promoting the worldview of the elite is more important than truth.

So why should we listen to experts when they give greater primacy to appeasing elites than to solving national problems? Contrary to what some would have you believe, revolting against experts is not an attack on science, considering that little evidence suggests that they care about scientific truth. Let us not fool ourselves. People occupying powerful offices are uninterested in being toppled from positions of influence, and as such, they will seek to minimize views that threaten their professional or intellectual authority. As a result, expecting influential bureaucrats to value truth is unwise. Truth to a bureaucrat is merely the consensus of the intelligentsia at any given time.

Also of note is the lesser ability of intelligent people to identify their own biases. Owing to their greater cognitive capacity, it is easier for intelligent people to rationalize nonsense. Justifying extreme assumptions requires a lot of brainpower, which could explain why highly intelligent people—specifically, people “higher in verbal ability”—are inclined to express more extreme opinions. Our culture has immense faith in expert opinion, although the evidence indicates that such confidence must be tempered by skepticism. Intelligent people, whether they be experts or politicians, do not have a monopoly on rationality.

Admittedly, intelligence may act as a barrier to objective thinking. Brilliant people are adept at forming arguments; therefore, even when confronted with compelling data, they are still able to offer equally riveting counterpoints. Smart people can engage opponents without resorting to a bevy of studies to buttress their conclusions. Thus, clearly, the proposals of experts ought to be held to a higher standard primarily because they are smarter than average.

The capacity of an intelligent person to provide coherent arguments in favor of his ideas can be impressive, and may only serve to entrench him in his conclusions. For instance, in the arena of climate change, experts have recommended policies on nothing but the claim that a consensus supports such proposals, rather than on the data. Promoting the wide-scale use of renewables, for example, is usually touted as a sustainable climate strategy despite the fact that studies argue the reverse.

Contrary to the rantings of the intelligentsia, we should encourage more people to express skepticism of experts. Due to their high intelligence, experts tend to be more inflexible and partisan than other people. This is solid justification for ordinary people to be skeptical of the intellectuals in charge of national affairs. Unlike wealthy bureaucrats, who are insulated from the economic fallout of their bad ideas, the poor usually bear the burden.


Wall Street Regulators Love Broken Windows

01/29/2021Pia Varma

The Joker was right about one thing…introduce a bit of chaos and everyone loses their mind. And that’s just what happened with the fervor created by a rogue group of investors on Reddit as they pushed the price of GameStop and other nearly defunct companies higher and higher. Regulators usually have no idea what to do with themselves during a black swan situation like this, but they know they need to do something. And it doesn’t matter what the unintended consequences might be. This rally has given them the perfect opportunity to crack down on Wall Street without actually looking at what caused the issues in the first place.

I am reminded of Frédéric Bastiat’s parable of the broken window. It goes something like this: a boy breaks a window at his father’s shop. His father hires the town glazier to replace the broken pane. The glazier then spends the money he earns on himself or his business, thus stimulating the economy. The townspeople decide that the boy has done the community a huge service and that everyone is better off as a result of this happy accident.

However, Bastiat points out that this is a fallacy. By forcing his father to pay for the window, the boy has reduced his father’s disposable income. His father will not be able to purchase new shoes he may have been wanting or invest in his own business; thus, other industries will experience losses. The time he spends dealing with the broken window could have been put to better, more productive use. Furthermore, replacing the window is a maintenance cost, not a purchase of new goods, and maintenance does not stimulate production.

This parable has typically been used to discredit the idea that going to war stimulates a country’s economy. But I think this parable perfectly highlights the general tendency of bureaucrats and policymakers toward implementing sweeping reforms without looking at the unintended consequences. Sound bites and feel-good-isms, not full analyses from all angles, rule the day.

For instance, there was a time in 2008 when regulators in the United States banned naked short selling without considering the unintended consequences. There was also a media war against futures trading, derivatives and securities (all legitimate and very important market mechanisms that actually help to lower volatility and spread wealth) without any acknowledgement of the unintended consequences. In other words, these regulations broke a lot of windows. But some people on Wall Street found a way to profit from these broken windows. This doesn't mean the regulations weren't damaging. But the regulators can see only those who benefited. Meanwhile, there has been very little mention of the continued unintended consequences of maintaining low interest rates, the real culprit of all these problems. (After all, it was zero-interest rates along with the Community Reinvestment Act, which meant everyone with a pulse could buy a house in the first place, that made the mortgage-backed securities so toxic.)

Again now, Wall Street regulators want to get involved over the Reddit/GameStop/AMC hype. Blame Wall Street, as usual, for treating the stock market as a personal casino and for the hubris that has ensued as the little guy suffers.

But what do you expect with near-negative interest rates? That hubris is the result of Band-Aid after Band-Aid after Band-Aid. Whether you call it ZIRP (zero interest rate policy), QE (quantitative easing) 1/2/3, bailouts, stimulus, or some other such name, a rose is still a rose. 

Can the Markets Rein in Wall Street?

And now a legitimate market solution has presented itself to combat the old boys' club of Wall Street. And this solution is doing something Elizabeth Warren had not been able to do…hold Wall Street accountable. This is partly because Warren never understood the problem. The problem was the regulation itself. 

Of course, there will be plenty of concern in Washington and at the SEC (Securities and Exchange Commission) about what happened with GameStop, AMC, and others, and the busybodies will want to get involved. The problem when legislators get involved, though, is that they don’t understand or are not honest about the full picture. For now, legislators like Warren and Alexandria Ocasio-Cortez seem to be on the side of small investors, criticizing the Robinhood app after it blocked them from buying the stock.

Online trading forums, on the other hand—like those at Reddit—allow smaller, amateur investors a seat at the table. This is the democratization of finance at its finest.

As this unfolded, I was instantly reminded of Thomas Friedman’s electronic herd metaphor:

The electronic herd cuts no one any slack….The herd is not infallible. It makes mistakes too. It overreacts and it overshoots. But if your fundamentals are basically sound, the herd will eventually recognize this and come back. The herd is never stupid for too long. In the end, it always responds to good governance and good economic management.

We also saw the herd in action after the Trump/Twitter/Parler debacle. Both Twitter and Facebook lost billions in market value overnight.

The move toward decentralization and democratization of technology and finance is the shining lifeboat in the storm of bureaucratic chaos. And forums like Reddit are our Galt's Gulches. Regulating these forums could have huge unintended consequences. But hey, if things go sour, they could always just hire a glazier to patch things up. 


When Fascism Comes, It Will Be Wearing a Mask

01/26/2021Ron Paul


Almost immediately after his inauguration, President Joe Biden began creating new government dictates via executive orders. Many of these executive orders concern coronavirus, fulfilling Biden’s promise to make ramping up a coronavirus-inspired attack on liberty a focus of his first one hundred days.

One of Biden’s executive orders imposes mask and social distancing mandates on anyone in a federal building or on federal land. The mandates also apply to federal employees when they are “on-duty” anywhere. Members of the military are included in the definition of federal employees. Will citizens of Afghanistan, Iraq, and other countries where US troops are or will be “spreading democracy” be happy to learn the troops shooting up their towns are wearing masks and practicing social distancing?

Another one of Biden’s executive orders forces passengers on airplanes, trains, and other public transportation to wear masks.

Biden’s mask mandates contradict his pledge to follow the science. Studies have not established that masks are effective at preventing the spread of coronavirus. Regularly wearing a mask, though, can cause health problems.

Biden’s mask mandates are also an unconstitutional power grab. Some say these mandates are an exercise of the federal government’s constitutional authority to regulate interstate commerce. However, the Constitution gives Congress, not the president, the power to regulate interstate commerce. The president does not have the authority to issue executive orders regulating interstate commerce absent authorization by a valid law passed by Congress. The Founders gave Congress sole law-making authority, and they would be horrified by the modern practice of presidents creating law with a “stroke of a pen.”

Just as important, the Commerce Clause was not intended to give the federal government vast regulatory power. Far from giving the US government powers such as the power to require people to wear masks, the Commerce Clause was simply intended to ensure Congress could protect free trade among the states.

Biden also signed an executive order supporting using the Defense Production Act to increase the supply of vaccines, testing supplies, and other items deemed essential to respond to coronavirus. The Defense Production Act is a Cold War relic that gives the president what can fairly be called dictatorial authority to order private businesses to alter their production plans, and violate existing contracts with private customers, in order to produce goods for the government.

Mask and social distancing mandates, government control of private industry, and some of Biden’s other executive actions, such as one creating a new “Public Health Jobs Corps” with responsibilities including performing “contact tracing” on American citizens, are the type of actions one would expect from a fascist government, not a constitutional republic.

Joe Biden, who is heralded by many of his supporters as saving democracy from fascist Trump, could not even wait one day before beginning to implement fascistic measures that are completely unnecessary to protect public health. Biden will no doubt use other manufactured crises, including “climate change” and “domestic terrorism,” to expand government power and further restrict our liberty. Under Biden, fascism will not just carry an American flag. It will also wear a mask.

Reprinted with permission. 
