Even Cows Understand the Problem with the Commons

09/05/2018 Jim Cox

A popular poster depicts four cows standing in the corners of their respective fields at the intersection of two barbed wire fences. Each of the four cows has stretched her neck through the wires to reach the grass in another cow’s field. The poster evokes a humorous reaction from most observers. To most it illustrates the phrase that “the grass is always greener on the other side,” or perhaps how silly we all are, pursuing distant pleasures when there is an abundance available to us where we are.

But the poster actually illustrates rational behavior and the importance of property rights in preserving resources! The rational behavior of the cows is that each is attempting to maximize its access to grass. The remaining non-fence line grass in each cow’s field is readily available to her since she is in that field and the other cows are fenced out.

But the grass on the perimeter of her field along the fence line is within reach of the adjoining cows. Therefore, each cow is faced with first eating the grass along the fence line or missing out on the same if the other cows get there first. The grass along the fence line is therefore effectively common property and such resulting behavior is often referred to as the tragedy of the commons.1

Unowned or collectively owned resources tend to be consumed and not conserved because no one has the right to the long-term value of that good—that is, no one has a property right in that good. It is in the self-interest of each cow (or person) to get what they can before it is gone. The cows are merely responding to the institutional setting in which they find themselves. If we want people or cows to do X we would be well advised to make it in their self-interest to do X. If the fences were so constructed to protect each cow from the incursion of the other there would be no rush to consume grass along the fence line. Under this alternate arrangement resources could be conserved since ownership is secured—that is, each would enjoy a property right in the good.

  • 1. Garrett Hardin, “The Tragedy of the Commons,” Science, December 13, 1968.

Secretary Yellen's Dream Date

02/16/2024 Douglas French

Bloomberg reports that Treasury Secretary Janet Yellen’s pick for a dream date lunch would be none other than John Maynard Keynes, who Bloomie reporter Christopher Condon describes as “the founding father of modern macroeconomics.” I thought John Law held that title. 

This pronouncement happened during a speed round of questions in Detroit while chatting with Michigan Governor Gretchen Whitmer. “I would choose John Maynard Keynes,” said Yellen. Keynes “changed the way all of us understand business cycles, public policy and financial markets.”

Murray Rothbard referred to Keynes in his History of Economic Thought class simply as “Maynard.” In his Foreword to Henry Hazlitt’s The Failure of the ‘New Economics,’ Rothbard called what Yellen so reveres a “Keynesian holocaust.” Yes, there have been bubbles, busts, and inflation ever since. About Keynes the man, Yellen’s dream date, Rothbard wrote,

John Maynard Keynes, the man — his character, his writings, and his actions throughout life — was composed of three guiding and interacting elements. The first was his overweening egotism, which assured him that he could handle all intellectual problems quickly and accurately and led him to scorn any general principles that might curb his unbridled ego. The second was his strong sense that he was born into, and destined to be a leader of, Great Britain’s ruling elite. Both of these traits led Keynes to deal with people as well as nations from a self-perceived position of power and dominance. The third element was his deep hatred and contempt for the values and virtues of the bourgeoisie, for conventional morality, for savings and thrift, and for the basic institutions of family life.

There “is really a bipartisan understanding that he really hit deep insights into how economies work,” the Treasury chief said. Macroeconomics “as a distinct discipline began with Keynes’s masterpiece, The General Theory of Employment, Interest and Money, in 1936,” according to an International Monetary Fund note.

Rothbard, in his Foreword to Hazlitt’s critique of Keynesianism, wrote that Hazlitt “in this vitally important and desperately needed book throws down the challenge in a detailed, thoroughgoing refutation of the General Theory.”

The General Theory was anything but a masterpiece. As Hazlitt explained,

Now though I have analyzed Keynes’s General Theory in the following pages theorem by theorem, chapter by chapter, and sometimes even sentence by sentence, to what to some readers may appear a tedious length, I have been unable to find in it a single important doctrine that is both true and original. What is original in the book is not true; and what is true is not original. In fact, as we shall find, even much that is fallacious in the book is not original, but can be found in a score of previous writers.

During her gushing the Treasury Secretary noted that President Richard Nixon famously said in the 1970s “we’re all Keynesians now.”

Not all of us.

Pre-order the 4th Expanded Edition of Early Speculative Bubbles & Increases In The Supply of Money today. 

Jesús Huerta de Soto's Commentary on Javier Milei's Davos Speech

This video is an English translation of Jesús Huerta de Soto's comments about President Javier Milei's address at Davos. This lecture was recorded during a session of the Master of Austrian Economics program at King Juan Carlos University:

Professor Huerta de Soto examines Javier Milei's speech in Davos

The Problem with the Arbitrary Line between Legal and Illegal Immigration

02/14/2024 Ryan McMaken

For at least twenty years, the term "illegal alien" (and similar variations like "illegal immigrant") has been the subject of a contentious debate between pro-immigration and anti-immigration activists. Over the past decade, the Left has increasingly managed to limit the use of the term "illegal" in reference to migrants. For example, until fairly recently, the Associated Press was still using the term "illegal immigrant," but as Foreign Policy magazine notes, under "pressure from immigration advocates, the Associated Press update[d] its stylebook" to change the term in 2013. Other media organizations have followed suit. Nonetheless, some government organizations occasionally continued to use the term "illegal alien" until 2021, when the Biden administration instructed the US executive branch to abandon the term "alien" altogether. "Illegal alien" is on its last legs.

The disagreement between Left and Right is largely over how to portray the immigrants themselves. The Left seeks to banish the term "illegal" so as to normalize undocumented migration and increase it in general. The Right, on the other hand, seeks to portray undocumented immigration as nefarious in order to further restrict overall immigration. 

For the sake of argument, however, let's say that we are agnostic on the matter of whether immigration should be increased or decreased. 

So let's ask: is the term "illegal immigrant" useful? The answer is: "it depends." Moreover, the designations of legal and illegal do little to tell us about the productivity of a migrant, or the demands he or she makes of the welfare state. In practice, legal immigrants have greater access to public funds than illegal immigrants, and it shows. 

How Bureaucrats Arbitrarily Decide What is Legal

The core of the problem lies in the fact that the designations of legal and illegal are not rooted in market transactions or voluntary exchange. Rather, the designations rely primarily on arbitrary bureaucratic criteria. For example: Congress has declared that if immigrant X has filled out the appropriate approved paperwork and has been given the green light by some federal agent, he is thus legal. Congress has also declared that if immigrant Y's paperwork does not receive the necessary rubber stamp from some bureaucrat, he is not legal. In this latter case, private employers are not legally permitted to hire the worker regardless of the worker's skills or the employer's needs. Even for those specific "illegal" immigrants who have been offered jobs and lodging in the private sector—and can pay their own bills—the lack of proper government paperwork precludes these potential workers from peaceful exchanges with employers and others.

We can see the arbitrariness of this sort of thing in a number of other examples. 

One example is the minimum wage: the government has declared that an employer cannot enter into a contract with employees at a wage level under the minimum wage. Thus, employment at or above the minimum wage is "legal." Employment below that level is "illegal." The contract remains illegal even if both parties are willing. Thus, the line between legal and illegal in this case is totally arbitrary and based on nothing more than the central-planning impulses of federal lawmakers.

We see a similar phenomenon in relation to drugs. At the federal level, Viagra is legal because Congress says so, and marijuana is illegal because Congress has decreed it. There’s certainly no objective standard determining why the federal government grants private citizens the freedom to choose one but not the other. There’s no clear difference between the two in terms of long-term health risks. In fact, Viagra is likely a bigger risk than marijuana. 

A final example can be found in the lockdowns that governments imposed on businesses and households during the covid panics of 2020. At that time, the government arbitrarily defined some businesses as essential while other businesses were deemed non-essential. In some jurisdictions, those workers deemed non-essential were even told not to leave their homes or risk prosecution. Thus, there were "legal" businesses and "illegal" businesses. This distinction was purely arbitrary, of course, and reflected nothing more than the biases of politicians and health officials.

In all four examples, the only real difference at work here is that a legislator or bureaucrat has decided that one drug/immigrant/employee/business fits a government-defined standard while another drug/immigrant/employee/business does not. 

Nonetheless, the distinction between legal and illegal is relevant in the context of law and public policy. Obviously, when an employee and an employer agree the employer will pay the employee less than the government's minimum wage, that can bring serious penalties. The same is true of using illegal drugs or hiring illegal immigrants. In all cases, the state has the de facto power to prosecute and sanction those who run afoul of the regime's arbitrary decrees. The regime violates property rights when it prosecutes people for these peaceful activities, of course, but states have never much troubled themselves about violating property rights. 

The use of the terms "legal" and "illegal" in these contexts also serve a rhetorical purpose. They tell us that government policymakers like the legal thing, and don't like the illegal thing. Skeptics of the state's "wisdom," however, have long understood that governments have never been reliable arbiters on what is good, moral, proper, or healthy. The legal status of an activity or person or product has never been a definitive criterion on which to base an opinion about much of anything. 

Using Legal Status to Expand the Welfare State

The arbitrariness of the legal/illegal distinction in immigration cuts both ways.

It's true that the designation of "illegal" can be used to reduce immigration by cutting off access to the legal marketplace for immigrants without the proper government approval. On the other hand, the designation of "legal" can be used to expand taxpayer-funded subsidies. Contrary to the widely held belief that legal immigrants are productive and illegal immigrants are unproductive, the reality is that legal immigration tends to be a larger overall drain on taxpayer resources than illegal immigration. This is partly because there are more legal immigrants than illegal ones. But it is also true because most legal immigrants have more access to the American welfare state than do illegal immigrants. Moreover, many legal immigrants avail themselves of the many generous American safety-net programs.

Indeed, a label of "legal" can often be applied to unproductive immigrants who take advantage of welfare programs and who may not even work for a living. Measures of legal immigrant use of welfare programs show robust levels of participation in these programs. The best that can be said of legal immigrants (on average) in this regard is that (according to some conservative measures) they collect welfare at slightly lower rates than the native population. But this should not be surprising. After a mere five years of residence in the United States, most immigrants with legal permanent resident status have access to the full array of welfare programs, including food stamps, Medicaid, CHIP, cash assistance, and more. Some US states (California, for example) offer taxpayer-funded benefits to immigrants without the five-year bar, including Medicaid and food stamps.

Consequently, the extension of the status of "legal" immigrant is really just an indication that the immigrant is more likely to collect taxpayer-funded benefits of one type or another. Yes, illegal immigrants are eligible for some social benefits, such as emergency medical care and taxpayer-funded childcare. Legal immigrants, however, are eligible for far more in the way of social benefits. A permanent resident—once declared "legal"—cannot be deported for being willfully unemployed or collecting social benefits. In other words, an immigrant can obtain and maintain legal status even if he or she is unemployed, on welfare, and a net drain on taxpayers. Note that such a person can be "legal" while self-reliant immigrants with jobs and private economic support can still be arbitrarily labeled "illegal."

Moreover, there are additional loopholes that allow the regime to declare many immigrants—many of them initially labeled "illegal"—as eligible for greater and more immediate access to taxpayer-funded benefits. For example, "[r]efugees, people granted asylum or withholding of deportation/removal, Cuban/Haitian entrants, certain Amerasian immigrants" and other specific groups are exempted from the waiting period. By definition, everyone in these groups is a legal immigrant, and we see it is not at all the case that legal immigrants necessarily place fewer demands on the taxpayers than illegal immigrants.

So, is the term "illegal alien" useful? It's not very useful beyond simply determining the state of that migrant's paperwork and the migrant's relationship with government authorities. When it comes to the private sector and the net economic contribution that person makes, the terminology doesn't tell us much about any specific case. 

A more fruitful question may be to ask how much taxpayers ought to be called upon to fund foreign nationals, legal or otherwise.1 By limiting access to the welfare state for all foreign nationals—not just the "illegal" ones, but also legal permanent residents—immigration policy would move toward a less arbitrary standard that is not quite so easily manipulated by government policymakers. 

  • 1. There are many definitions of "foreign national" in use, and many organizations claim that legal permanent residents are not foreign nationals. Common definitions of "foreign national" include "a person or organization who is not a citizen of the United States, and who is a citizen of a foreign country" or "a non-naturalized citizen of a country." The US Department of Homeland Security defines legal permanent residents as "foreign nationals who have been granted the right to reside permanently in the United States." According to the State Department, a "national" is "a person owing permanent allegiance to a state." It is reasonable, therefore, to conclude that a colloquial understanding of the term "foreign national" strongly suggests all non-citizens are best classified as foreign nationals. There are presently approximately 23 million foreign nationals living in the United States.

Maculate Disinflation

02/13/2024 Jonathan Newman

Stock markets tumbled this morning when the January Consumer Price Index (CPI) data came in hotter than expected. If you are wondering what the connection could be, the answer is that higher-than-expected price inflation means a longer-than-expected wait for the Fed to cut its interest rate target. It’s clear that financial markets are addicted to artificially low interest rates when any hint of a delay in rate cuts pushes stock prices off a ledge. Even news that most would consider good, like quarterly GDP growth and official unemployment rate data staying below 4%, can sour markets because of their implications for monetary policy.

The CPI release shows that “Team Transitory” ran its victory laps before the race was over. Paul Krugman has been declaring victory for over a year, with headlines like these:

  • Goodbye, Inflation: The latest numbers show that it’s yesterday’s problem.
  • The Soft Landing Is Happening: Why the new inflation numbers contain some very good news.
  • Why Did So Many Economists Get Disinflation Wrong?
  • Inflation Is Down, Disinflation Denial Is Soaring
  • None Dare Call It Victory: Has the war on inflation already been won?
  • How (Many) Economists Missed the Big Disinflation: The fault lay not in the models, but in themselves
  • Everything’s Coming Up Soft Landing: Inflation seems to be fading without a recession
  • Wonking Out: From Stagflation to ‘Immaculate Disinflation’

Meanwhile, monthly CPI data hasn’t reached the Fed’s 2 percent target.

Source: U.S. Bureau of Labor Statistics, Consumer Price Index for All Urban Consumers: All Items in U.S. City Average [CPIAUCSL] and Consumer Price Index for All Urban Consumers: All Items Less Food and Energy in U.S. City Average [CPILFESL], retrieved from FRED, Federal Reserve Bank of St. Louis.

Annualized monthly rates over the past few months also show that Krugman’s “Immaculate Disinflation” isn’t materializing.

Source: U.S. Bureau of Labor Statistics, Consumer Price Index for All Urban Consumers: All Items in U.S. City Average [CPIAUCSL] and Consumer Price Index for All Urban Consumers: All Items Less Food and Energy in U.S. City Average [CPILFESL], retrieved from FRED, Federal Reserve Bank of St. Louis.
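For readers who want to check the arithmetic, an "annualized monthly rate" compounds a single month's price change over twelve months. A minimal sketch, using illustrative numbers rather than actual BLS data:

```python
def annualized_monthly_rate(cpi_prev: float, cpi_curr: float) -> float:
    """Compound a one-month CPI change over twelve months."""
    return (cpi_curr / cpi_prev) ** 12 - 1

# Illustrative: a seemingly modest 0.3% monthly rise compounds
# to roughly 3.7% annualized, well above the Fed's 2 percent target.
rate = annualized_monthly_rate(100.0, 100.3)
```

This is why even small monthly CPI prints can signal that disinflation has stalled.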

Krugman was widely ridiculed for using tortured price inflation statistics that remove food, energy, shelter, and used cars to help him make the claim that the economic picture is better than surveys of economic sentiment suggest.

This prompted me to construct the “Anti-Krugman Price Index,” which only includes the items he excludes. When we compare the AKPI to average earnings, we see why he wants to ignore these components.

Source: U.S. Bureau of Labor Statistics, Consumer Price Index for All Urban Consumers: Food in U.S. City Average [CPIUFDSL], Consumer Price Index for All Urban Consumers: Energy in U.S. City Average [CPIENGSL], Consumer Price Index for All Urban Consumers: Shelter in U.S. City Average [CUSR0000SAH1], Consumer Price Index for All Urban Consumers: Used Cars and Trucks in U.S. City Average [CUSR0000SETA02], and Average Weekly Earnings of All Employees, Total Private [CES0500000011], retrieved from FRED, Federal Reserve Bank of St. Louis.

The prices of these items, as measured by their corresponding CPI components, have risen twice as much as average earnings since 2020.
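The comparison behind these charts rests on a simple operation: rebase each series so its value at the start of the period equals 100, then compare endpoints. A sketch with made-up index levels (the real inputs are the FRED series listed in the source notes, and the actual AKPI may weight its components differently):

```python
def rebase(series: list[float]) -> list[float]:
    """Index a series so its first observation equals 100."""
    return [100.0 * v / series[0] for v in series]

# Hypothetical index levels, start of 2020 through end of 2023
prices = [255.0, 280.0, 310.0, 319.0]       # a price component
earnings = [970.0, 1000.0, 1040.0, 1067.0]  # average weekly earnings

price_growth = rebase(prices)[-1] - 100.0       # ~25% cumulative rise
earnings_growth = rebase(earnings)[-1] - 100.0  # 10% cumulative rise
```

With these made-up numbers, prices rise roughly two and a half times as much as earnings; the claim in the text is the same kind of endpoint comparison applied to the real series.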

The moral of the story is that court intellectuals will weave a narrative that supports the State, using whatever (manipulated) statistics will help them tell their tales. Krugman especially wants to tell the story that under Biden’s lucid leadership, the economy is doing great and the government (with help from the Fed) can simply turn the dials to steer the economy toward stability and growth without any negative repercussions.

Of course, this is a farce. Printing money and manipulating interest rates have many consequences, and the full costs are yet to be realized.

Happy Centennial, "Rhapsody in Blue"!

The 1920s in America, as well as much of the West, were characterized by a feeling that anything was possible. In the Roaring ‘20s Americans no longer worried about war, and they had seen the post-war depression of 1920-1921 vanish quickly, thanks to a paucity of government meddling. The Federal Reserve was managing the money supply in what eventually became a fatal disaster but was heralded at the time, and inventions flowed forth from a great release of creative energy that had been pent-up during the war.

With low unemployment Americans were prosperous and mobile as mass production made cars affordable for the middle class. Women had won the right to vote and were asserting their independence in culture and work, and alcoholic beverages were prohibited, giving rise to massive defiance in the form of organized crime and speakeasies. In dance, the fast-kicking Charleston became wildly popular.

The 1920s was also the Jazz Age. Originating from African American musicians in New Orleans, “jazz” had no agreed-upon definition, though improvisation became one of its defining elements.

As one writer described it,

Jazz represented the Roaring Twenties’ spirit: energetic, modern, and slightly rebellious. Musicians like Louis Armstrong, Duke Ellington, and Bessie Smith became national icons, pushing musical boundaries with improvisation and new rhythms. Jazz clubs, especially in cities like New York and Chicago, became cultural hubs, drawing diverse audiences and facilitating the mingling of different racial and social groups.

The popularity of jazz rendered it an American signature, but the reigning classical orthodoxy considered it lowbrow. Americans, therefore, were lowbrow.

This bothered some jazz musicians, and one of them decided to shake-up that prejudice.

Paul Whiteman’s concert

On Friday, January 4, 1924, Ira Gershwin sat reading the morning’s New York Tribune while his younger brother George and a friend were nearby playing a game of pool. Ira noticed an item in the music section headlined “Committee Will Decide ‘What is American Music?’.” As he read on he learned that jazz bandleader Paul Whiteman was planning a concert for Lincoln’s Birthday, February 12 — five weeks away.

Whiteman’s “An Experiment in Modern Music,” he read, would be judged by four iconic musicians of the day: Sergei Rachmaninoff, Jascha Heifetz, Efrem Zimbalist, and Alma Gluck. How they would know if the “experiment” could be called American music was likely a mystery even to them.

It was the brief article’s last paragraph that made Ira straighten up and take notice:

George Gershwin is at work on a jazz concerto, Irving Berlin is writing a syncopated tone poem and Victor Herbert is working on an American suite.

His brother is at work on a jazz concerto?! In the upcoming days George would be occupied with a musical comedy he had written which was about to open on Broadway, Sweet Little Devil. Where did Whiteman get the idea Gershwin was writing a concerto?

It turned out George had forgotten about his promise to Whiteman during talks back in December. He called the band leader early the next morning to tell him it would be impossible to write a concerto in the time remaining. But Whiteman somehow talked him into it, though Gershwin promised not a concerto but a freer piece such as a rhapsody. Whiteman assured him he only needed to write the piano score; his trusted in-house arranger Ferde Grofé would do the orchestration.

After he had sold his first song in 1916 at age 17 for 50 cents, Gershwin worked as a song plugger and producer of piano rolls for a while. His first commercial success as a composer was the ragtime Rialto Ripples in 1917 followed by a bigger hit Swanee in 1919. With songwriter William Daly he began collaborating on Broadway musicals beginning in 1920. 

Gershwin had been in the habit of jotting down song ideas in what he called his Tune Books. Now, at age 25, he had collected an abundance of musical phrases, and he turned to these to get him started on what would eventually become a sure bet to pack concert halls here and abroad for the next 100 years — Rhapsody in Blue.

Manuscript evidence suggests he only worked on Rhapsody a total of 10 days from January 7, 1924, to the end of rehearsals in February.

Last minute anxiety

Carnegie Hall had been booked for February 12, 1924, and surrounding dates, so Whiteman settled for the less-capacious Aeolian Hall. 

Whiteman was taking a risk for the concert he had planned. On the day of the event, scheduled to begin at 2:45 p.m., he slipped out of the hall to check on the box office, and in his own words:

There I gazed upon a picture that should have imparted new vigor to my wilting confidence. It was snowing, but men and women were fighting to get into the door. . .

Such was the state of my mind by this time that I wondered if I had come to the right entrance. And then I saw Victor Herbert going in. It was the right entrance. . . The next day the ticket office people said they could have sold out the house ten times over.

All very encouraging but by late afternoon Whiteman’s experiment was fading. Applause had been polite for the performances up to that point. Slowly, people began to head for the exits.

Then Gershwin, “a lank and dark young man,” stepped quietly on stage. Settled at the piano he nodded to Whiteman, who gestured to Ross Gorman whose clarinet wail electrified the audience. The hall’s deserters rushed back in.

Later, critics said Rhapsody was flawed: it was too heavy on the piano part, and its form was not classically proportioned. But when the final ffz (very loud) chord was struck by Gershwin and orchestra, the audience exploded with applause. According to Whiteman, he and the young composer took five curtain calls.

The question “What is American music?” never got answered. 

Today, a who’s who of concert pianists have recorded the Rhapsody, many available on YouTube. 

In late January of this year pianist-composer Ethan Iverson wrote a piece for the NY Times, saying

Thanks to the centennial, you’re likely to come across a lot of “Rhapsody” performances this year — not that the anniversary makes much difference, because that’s always the case.

As conductor Michael Tilson Thomas once said, Gershwin’s music has that elusive quality of making people fall in love with it.

Happy centennial, Rhapsody in Blue!

Ron Paul for President

A few days ago, Ginny Garner sent me a message. Ginny wrote:


We know Biden is not a man of peace. This week Trump did not tell the Nevada Caucus rally audience he opposed sending billions for the war in Ukraine; instead he said European nations should contribute more. And, after a long pause, RFK Jr., when questioned by comedian/podcaster Dave Smith, said he didn’t know if he was concerned about Israel’s influence over American foreign policy. So much for RFK Jr. being the antiwar candidate. It looks like anyone wanting to vote for an authentic pro-peace presidential candidate is going to have to write in the name of Ron Paul.

Dr. Ron Paul is our greatest living American, and he is sound on all issues. None of the other candidates is sound on all issues. You might object, “It would be great if Ron would run—but he won’t!” The answer to that is simple. We have to draft him!

I spoke about being sound on all issues, and the most important of these issues is the one Ginny mentioned—being anti-war. Ron has been clear that the neocon gang who control brain-dead Biden have always been trying to get us into war. Here is what he said about them last December:

“Over the weekend Defense Secretary Lloyd Austin explained to the American people what’s really wrong with US foreign policy. Some might find his conclusions surprising.

The US standing in the world is damaged not because we spent 20 years fighting an Afghan government that had nothing to do with the attacks on 9/11. The problem has nothing to do with neocon lies about Iraq’s WMDs that led to untold civilian deaths in another failed “democratization” mission. It’s not because over the past nearly two years Washington has taken more than $150 billion from the American people to fight a proxy war with Russia through Ukraine.

It’s not the military-industrial complex or its massive lobbying power that extends throughout Congress, the think tanks, and the media.

Speaking at the Reagan National Defense Forum in California’s Simi Valley, Austin finally explained the real danger to the US global military empire.

It’s us.

According to Secretary Austin, non-interventionists who advocate “an American retreat from responsibility” are the ones destabilizing the world, not endless neocon wars.

Austin said the US must continue to play the role of global military hegemon – policeman of the world – because “the world will only become more dangerous if tyrants and terrorists believe that they can get away with wholesale aggression and mass slaughter.”

How’s that for reason and logic? Austin and the interventionist elites have fact-checked 30 years of foreign policy failures and concluded, “well it would have been far worse if the non-interventionists were in charge.”

Read the full article at LewRockwell.com.

CRE Maturities Pile Up

02/13/2024 Douglas French

The amount of commercial real estate loans coming due in 2024 has jumped to $929 billion, according to the Mortgage Bankers Association. Previously the amount had been estimated at $659 billion, with the 40% increase “attributed to loan extensions and other delays rather than new transactions,” according to Bloomberg’s John Gittelsohn.

Of the nearly $1 trillion in commercial-property debt maturing this year, banks hold $441 billion of it, the mortgage bankers group reported. 

With commercial-property prices down 21% from the early 2022 peak and office prices falling 35%, an estimated $85.8 billion of commercial property debt was considered distressed at the end of 2023, according to MSCI Real Assets, which believes there is an additional $234.6 billion of potential distress.

Reuters reports that “investors are combing through portfolios of regional banks, as small banks account for nearly 70% of all commercial real estate (CRE) loans outstanding, according to research from Apollo.”

"As long as interest rates stay high, it's hard for the banks to avoid problems with CRE loans," William C. Martin of short-seller Raging Capital Ventures told Reuters. Martin decided to place a bet against NYCB after the bank's disastrous Jan. 30 earnings release, which detailed real estate pain and led him to believe shares could sink further on more real estate losses.


"The regional banks ... (are) doubly more exposed to rates," said Dan Zwirn, co-founder and CEO of distressed debt investment firm Arena Investors, who is avoiding real estate for the next year or two, citing in part higher risk of default.

Nearly 1,900 banks with assets less than $100 billion had CRE loans outstanding greater than 300% of equity, according to Fitch.

Credit rating company Fitch, in a detailed report in December, said if commercial real estate “prices decline by approximately 40% on average, losses in CRE portfolios could result in the failure of a moderate number of predominantly smaller banks,” Reuters reported. (emphasis added)

NYCB said on Wednesday options could include loan sales and that the bank "will be razor-focused on reducing our CRE concentration."


But loan sales are unlikely with properties now valued 50%-75% below their valuations at the time loans were made. No one will buy loans at par where the underlying collateral has fallen by 25%-50%. Selling loans at a loss will generate capital-eroding losses these banks can’t afford. For the same reason, borrowers seeking an extension for their undercollateralized loan will be asked to pay down the principal balance with cash they likely don’t have. "Loans that were done over the last five to seven years, a lot of those are challenged now," said Ran Eliasaf, founder and managing partner of real estate investment firm Northwind Group, who is investing in the New York multifamily market.

Real estate investor Ken McElroy told Jeff Snider that if this real estate downturn were a baseball game, it would be in the second or third inning.

Watch this space. 

Pre-order the 4th Expanded Edition of Early Speculative Bubbles & Increases In The Supply of Money today. 

Tucker Slayed the Mainstream Media Dragon

02/13/2024Ron Paul

There has been much written and said about Tucker Carlson’s interview with Russian President Vladimir Putin last week. As of this writing the video on Twitter alone has been viewed nearly 200 million times, making it likely the most-viewed news event in history.

Many millions of viewers who may not have had access to the other side of the story were informed that the Russia/Ukraine military conflict did not begin in 2022, as the mainstream media continuously reports, but in fact began eight years earlier with a US-backed coup in Ukraine. The US media does not report this because they don’t want Americans to begin questioning our interventionist foreign policy. They don’t want Americans to see that our government meddling in the affairs of other countries – whether by “color revolution,” sanctions, or bombs – has real and deadly consequences to those on the receiving end of our foreign policy.

To me, however, perhaps the most interesting aspect of the Tucker Carlson interview with Putin was the US mainstream media reaction. As Putin himself said during the interview, “in the world of propaganda, it’s very difficult to defeat the United States.” Even a casual look at the US mainstream media’s reporting before and after the interview would show how correct he is about that. In the days and weeks before the interview, the US media was filled with stories about how horrible it was that Tucker Carlson was interviewing the Russian president. There was the danger, they all said, that Putin might spread “disinformation.”

That Putin might say something to put his country in a better light was, they were saying, reason enough to not interview him. With that logic, why have journalism at all? Everyone interviewed by journalists – certainly every world leader – will attempt to paint a rosy picture. The job of a journalist in a free society should be to do the reporting and let the people decide. But somehow that has been lost. These days the mainstream media tells you what to think and you better not dispute it or you will be cancelled!

What the US mainstream media was really worried about was that the “other side of the story” might start to ring true with the public. So they attacked the messenger.

The CNN reporting on Tucker’s interview pretty much sums up the reaction across the board of the US mainstream media. Their headline read, “Tucker Carlson is in Russia to interview Putin. He’s already doing the bidding of the Kremlin.”

By merely doing what used to be called “journalism” – interviewing and reporting on people and events, whether good or bad – one is “doing the bidding” of the subject of the interview or report?

No wonder fellow journalist Julian Assange has been locked away in a gulag for so many years. He dared to assume that in a free society, being a journalist means reporting the good, the bad, and the ugly even if it puts those in power in a bad light.

In the end, the massive success of the Tucker Carlson interview with Vladimir Putin demonstrates once and for all that the American people are sick to death of their mainstream media propagandists and liars. They are looking not for government narratives, but for truth. That’s the really good news about this interview.

Thanks to David Jarrett for These Iconic Photos of Mises

02/13/2024Mises Institute

[From the September-October 2023 issue of The Austrian.]

David Jarrett has been a longtime supporter of the Mises Institute and attended Mises’s NYU lectures in 1965, when they were held in Nicholas Hall. The building no longer stands, but the photos that David took one evening with his Leica M3 camera and APO-Summicron-M 90mm f/2 ASPH telephoto lens are still around. The negatives have been carefully guarded for decades, and now, David has generously donated them to the Mises Institute: 


The Re-Monetization of Gold

02/12/2024Gary North

[This article was originally published on LewRockwell.com in 2003.]

If gold is to be re-monetized, then this must mean that it has been de-monetized. But isn’t gold money?

No, gold is not money. It has not been money for Europeans since 1914, when the commercial banks stole it from depositors at the outbreak of World War I, and central banks then stole it from commercial banks before the war was over. Gold has not been money for Americans since 1933, when Roosevelt unilaterally by executive order stole it from the public.

Gold is high-powered money for central bankers, who settle their banks’ accounts in gold. But this is so far removed from the decisions of consumers that I can safely say that gold is not money.

The question is: Will it ever again become money?

This is the most important of all monetary questions.


Money is the most marketable commodity. Gold is therefore not money. You have to buy gold from a specialized broker. There are so few gold brokers anymore that they are all known to each other. Local coin stores don’t do much business in bullion gold coins such as the American Eagle or Canadian Maple Leaf. The large wholesale firms like Mocatta don’t deal with the public. There are so few full-time bullion coin dealers that you could have a convention of them in a Motel 6 conference room. (When was the last time you were in a Motel 6 conference room?)

But . . . it costs $39 to rent a Motel 6 room. That tells us something. It’s not $6 a room any longer. Inflation has done its work.

Money is liquid. Liquidity means that you can exchange money for goods and services directly without the following costs:

  • Advertising
  • Discounting
  • Waiting

There is a price spread between what you can sell a gold coin for (in money) and what you buy a gold coin for (in money). Gold coins therefore are not money.

I realize that old-time gold bugs go around saying “gold is the only true money” and similar slogans. These slogans reflect a lack of understanding of either gold or money. They are comforting slogans, no doubt, for someone who bought gold coins at twice the price that they command today, and held them for a quarter of a century at no interest while all other prices doubled or tripled. If he had instead made down payments on rental houses, he would be a whole lot richer. But the fact is, gold is not only not the only true money, it is not money at all. When you can walk into Wal-Mart and buy whatever you want with a gold coin or gold-denominated debit card, then gold will be money. Not until then.

To tell a gold bug this is to strike at his core beliefs. But his core beliefs are based on a lack of understanding of economics.

Money is the most marketable commodity. Gold is not the most marketable commodity. Given the lack of retail outlets where you can buy and sell gold, it is not even remotely money. Unless you are a central banker, gold is not money for you.


Gold is a valuable commodity. It was originally valuable for its physical properties: its glorious shine, its imperviousness to decay, its limited supply (high cost of mining), its malleability, its divisibility. In most religions, gold is used to represent deity or permanent truth. When something is “as good as gold,” it’s valuable.

Because of these properties, gold long ago became widely used in economic exchange. When the kingdom of Lydia started issuing gold coins over five centuries before the birth of Jesus, gold became the most recognizable form of money in the classical world. Gold had been monetized long before this, as all historical records indicate, but the convenience of the coins amplified what had already been the case. This increased the demand for gold.

Gold was no longer money in Western Europe after the fall of Rome in the fifth century. In March of 2003, I visited the British Museum. The museum has an exhibit of an early medieval grave-ship in which a Saxon seafaring king had been buried. The wood is gone, but metal implements remain, along with a small stash of gold coins. This is the Sutton Hoo exhibit. The burial’s date is estimated at 625. By that time, gold coins were rare in the West. In another of the museum’s exhibits of gold coins, you can see that from about 625 until Florence introduced the gold florin in 1252, there is only one gold coin on display.

Gold coins did circulate for the entire period in the Eastern Roman Empire (Byzantium), from 325 (Constantinople) to the fall of Byzantium to the Turks (1453). But there was little trade between the two halves of the old Roman Empire until late in the Middle Ages. The low division of labor in the West made barter far more common, and silver and bronze coins were the media of exchange.

It was the rise of the modern world, which was marked by an increasing division of labor, that brought gold coins back into circulation. Fractional reserve banking and gold coins developed side by side. Fractional reserve banking is why the boom-bust cycle has been with us, with credit money stimulating economic growth (an increase in the division of labor), and bank runs shrinking the money supply and contracting the economy (a decrease in the division of labor).

There has been a 500-year war in the West between gold coins and bank-issued credit money.


Bankers want to make money on money that their institutions create. They use the promise of redemption-on-demand in gold or silver as the lure by which they trick depositors into believing in something for nothing, i.e., the possibility of redemption on demand of money that has been loaned out at interest. The public believes this numerical impossibility, but then, one fine day, too many depositors present their IOU’s for gold or silver to the bank. A bank run begins, the lie is exposed, and the bank goes bankrupt (bank + rupture). The depositors lose their money. They get nothing for something, which is always the small-print inscription on the other side of something for nothing.

The bankers hate gold as money. Gold as money acts as a restraint on their profits, which are derived from creating money “out of thin air” and lending it at interest. Gold as money acts as a barrier to the expansion of credit money. The public initially does not trust the bankers or their money apart from the right of redemption on demand. Depositors initially insist on IOU’s for gold coins. So, the bankers partially submit to gold, but only grudgingly.

To keep from facing their day of judgment — redemption day, when the public presents its IOU’s and demands payment — fractional reserve bankers call on the government. They persuade the government to create a bankers’ monopoly, called a central bank, which stands ready to intervene and lend newly created fiat money to any commercial bank inside the favored cartel that gets into trouble with its depositors. By reducing the risk of local bank failures, the central bank extends the public’s acceptance of a system of unbacked IOU’s, called “an elastic currency” when members of the banking cartel create it, and called “counterfeiting” when non-members of the cartel create it.

Then why do central bankers use gold to settle their own interbank accounts? Because central bankers don’t trust each other — the same reason why the public prior to 1914 used gold coins and IOU’s to gold coins. The central bankers don’t want to get paid off in depreciating money. At the same time, they do want to retain the option of paying off the public in depreciating money.

It’s not that they want depreciating money. They want economic growth, lots of borrowers, and lots of opportunities to lend newly created money at interest. The problem is, they are never able to maintain the economic boom, which was fostered by credit money, without more injections of credit money. The same holds true for additional profits from lending. If a bank has additional money to lend and a booming economy filled with would-be borrowers, that’s great for the bankers. But the result has always been either a deflationary depression when the credit system collapses, or else price inflation, which overcomes the collapse at the expense of reliable money. The result in both cases is lost profits.

Bankers want the fruits of a gold coin standard: predictably stable or slowly falling prices, a growing economy, international trade, and a currency worth something when they retire. But they don’t want the roots of a gold coin standard: lending limited by deposits, a legal link between the time period of the loan and the time period when the depositor cannot redeem his deposit, and profits arising solely from matching lenders (depositors) with borrowers. Bankers sacrifice the roots for the profitable pursuit of the fruits. The results: boom-bust business cycles, bankruptcies, depreciating currencies, shattered dreams of retirement, and political revolutions.

In the twentieth century, fractional reserve bankers won the war of economic ideas: Keynesianism, monetarism, and supply-side economics. They also won the political wars. They succeeded in getting all governments to de-monetize gold, thereby creating unbreakable banking cartels (but not unbreakable currencies). The result was a 94% decline in the purchasing power of the dollar from 1913 to 2000.

Verify this with the Inflation Calculator, posted on the U.S. government’s Bureau of Labor Statistics Web site.

$1,000 in 1913 had the same purchasing power as $17,300 in 2000. 1 divided by 17.3 ≈ 0.06, or 6%. 100% minus 6% = 94%.
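The arithmetic above can be checked in a few lines. This is a minimal sketch using only the figures quoted in the text ($1,000 in 1913 equivalent to $17,300 in 2000, per the BLS Inflation Calculator):

```python
# Figures as given in the text, via the BLS Inflation Calculator:
dollars_1913 = 1_000
equivalent_2000 = 17_300

# A 2000 dollar buys this fraction of what a 1913 dollar bought:
remaining_fraction = dollars_1913 / equivalent_2000   # ~ 0.058, i.e. about 6%

# The decline in purchasing power is the rest:
decline_pct = (1 - remaining_fraction) * 100          # ~ 94%

print(f"Remaining purchasing power: {remaining_fraction:.1%}")
print(f"Decline since 1913: {decline_pct:.0f}%")
```

Rounding the remaining fraction to 6% gives the 94% decline cited above.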

In other nations, the depreciation was even worse: World War I and its post-war inflations, plus World War II and its post-war inflations, when added to the Communist revolutions, destroyed entire currency systems, sometimes more than once.


Official statistics indicate that most of the world’s gold is stored in the vaults of central banks. The bulk of the rest of it is in women’s dowries in India, or on ring fingers of Westerners, or in jewelry of affluent women. But, as I have argued previously, central banks have in fact been transferring their gold to private owners by way of the “bullion banks,” which have borrowed gold at 1% per annum, sold it to the public, and invested the money at high interest rates.

If my thesis is correct, then gold has been de-monetized almost completely. It is no longer serving as an ultimate restriction on central bank policies. The central bankers are now trading paper gold — promises to pay gold — that have been issued by private bullion banks, which cannot afford to buy the gold back to repay the central banks. The bullion bankers have done to the central bankers what the fractional reserve bankers did to their depositors, and the central bankers did to the commercial banks. They have gotten their hands on gold in exchange for written promises to repay this gold — promises that cannot possibly be fulfilled — and have made oodles of money by lending the money derived from the sale of the gold.

This means two things: this gold has been repatriated to the private markets (yea!), and gold in general is now almost fully de-monetized (boo!). Men have put bracelets and necklaces on their daughters (India) and wives (the West), but consumers do not have gold coins in their individual repositories, especially their pockets.

This means that what had been the highest-value use for gold for 2,600 years — gold as money — has disappeared except among central bankers, and even there it increasingly amounts to mere IOU’s to gold issued by bullion banks. There has been a huge, historically unprecedented reduction in demand for gold since 1914. This should be obvious to anyone. Demand for gold today is for industrial and ornamental uses, not monetary uses. Yet I am just about the only person within the camp of the gold bugs who is willing to admit this in print.


Nothing is permanent except death, taxes, and the lies of politicians, but in the West, the de-monetization of gold appears to be as permanent as the West. The West has bet its future on fractional reserve banking. This is additional evidence that the West is doomed. It has placed the extension of the division of labor into the hands of the bankers’ cartel.

Faculty members in Western universities are agreed on few things, but one universally shared assumption is that gold should not be money. In business schools and economics departments, in political science departments and history departments, the professors are agreed: gold is a relic, and probably a barbarous relic.

But then there is Asia.

In Asia, the people are still barbarians. This means that they don’t trust their governments because they know the truth: governments cheat, lie, and steal. Government corruption is a way of life in heartland Asia. This is a tremendous advantage that Asians enjoy. The less educated the Asian, the more likely he is to distrust the government. He is partially immunized against trusting promises to pay that are issued by governments. This is why Chinese peasants still want silver coins and Indian peasant wives still have gold jewelry. All over the Asian mainland, paper money has universally depreciated. The division of labor has been thwarted.

In the Asian tiger nations, whose economies have been closely tied to the capitalist West, fractional reserve banking is accepted, and fiat currencies are trusted. These nations have experienced a loss of trust in gold, which is the other side of the debased coin of fractional reserve banking. The war is on in Asia.

China’s currency is government-controlled and highly inflationary. What is saving China from mass price inflation is the rapid spread of the division of labor through freeing the economy. Capitalism’s extension of the division of labor is paralleling the Bank of China’s extension of credit money. So far, capitalism has won the race. But the race is not a sprint; it’s a marathon. At some point, there will be a massive recession in China as a result of the monetary inflation that has been going on for two decades. The boom will turn into a bust. Then the Chinese may remember the truth that their great-grandparents knew: you cannot safely trust government money. Those Chinese who did trust government’s money in 1948 were destroyed economically by Chiang’s mass inflation and then wiped out politically and economically by Mao’s tyranny.


Gold is an inflation hedge. There has been inflation since 1980. But gold has not risen in price since 1980 for many reasons: the gold bubble of 1979, the continuing de-monetization of gold by central banks, the steady sell-off of gold by central banks, the central banks’ gold leasing programs (disguised sales), and dollar supremacy internationally. The last of these factors, dollar supremacy, is looking shaky.

Gold is not a deflation hedge whenever it is not monetized, and it has not been monetized for generations. But, in the midst of deflation, there is a possibility of the re-monetization of gold. I regard this as a distant possibility. During a breakdown in the payments system — cascading cross defaults, as Greenspan calls it — there is an outside possibility that gold will become used again in the monetary system. But for this change to take place, a massive breakdown is necessary, in order to overcome a century of anti-gold economic theories. There is no case for gold being made by Ph.D.-holding economists, politicians, pastors, and TV commentators. A return to gold as money in the West will take a cataclysm, which will impose enormously high costs on the public for not using gold as money, thereby pressuring consumers to adopt gold as money. In a cataclysm, the cost of moving from fiat money to gold would be accompanied by a horrendous reduction in the social division of labor — life-threatening, in my view. A collapse of the derivatives market could produce such a cataclysm. To say that it cannot happen is foolish, but very few people can afford to do much to prepare for such an event. I have. Maybe you have. But we are a minority. We are all dependent on the division of labor to sustain our lives, let alone our lifestyles.

In Asia, the costs of returning to gold as money are much lower. The division of labor is lower. There is less trust in government. Old ideas die hard. There is also increasing wealth, which will further the purchase of gold. But I think this will be gold as ornament and investment, not gold as money.

That’s why I do not expect to see gold as money in my lifetime. But I still recommend gold as an investment. This is because, when it comes to monetary inflation, the namby-pamby policies of the post-war West are only a cautious prelude to the future. To overcome any deflation of the money supply in today’s debt-induced, credit-induced world economy, central bankers will stop acting like wussies. They will start inflating in earnest, for only through inflation can the fractional reserve process continue. It is inflate or die. They will inflate. Then the West’s currencies will die. But bankers will inflate now in order to postpone the death of money. They believe that “something will turn up” other than prices.

For gold to become money in the West will take an economic cataclysm. I am too old to be enthusiastic about going through such a cataclysm. So, I remain content with the de-monetization of gold. The consumer is economically sovereign, and he has not shown any interest in gold as money. Long live the consumer, especially in his capacity as a producer!

But as for gold as an inflation hedge . . . that’s a horse of a different color. Gold as a commodity will outperform digits as money.

In this sense, I remain a pessimist. The world needs gold as money, but the transition costs are astronomical. “Everybody wants to go to heaven, but nobody wants to die.”

Nevertheless, I would rather be a rich pessimist with gold than a poor optimist with digits.

How about you?