Power & Market
In a January 14 article for Vox, Ian Millhiser discusses a new proposal in the Harvard Law Review designed to accomplish four things:
(1) a transfer of the Senate’s power to a body that represents citizens equally; (2) an expansion of the House so that all citizens are represented in equal-sized districts; (3) a replacement of the Electoral College with a popular vote; and (4) a modification of the Constitution’s amendment process that would ensure future amendments are ratified by states representing most Americans.
The scheme consists of dividing up the District of Columbia into more than one hundred tiny new states, drastically increasing the number of leftist-controlled states in order to push through a wide variety of new constitutional amendments.1
Millhiser is enthusiastic, since he believes the US electoral system is too "undemocratic" and that the system must be overhauled "so that the United States has an election system 'where every vote counts equally.'"
[RELATED: Stop Saying "We're a Republic, Not a Democracy" by Ryan McMaken]
Millhiser's concept of "unequal" is illustrated more clearly when he calls the US Senate "ridiculous" because it is a system "where the nearly 40 million people in California have no more Senate representation than the 578,759 people in Wyoming."
So now we have arrived at the heart of the matter.
Millhiser contends Californians are underrepresented in Washington—or more likely he thinks leftists are underrepresented, given the context of the piece—because Wyoming and California have equal representation in the Senate.
This, we are supposed to believe, somehow leaves millions of Californians disenfranchised.
Common sense, however, strongly suggests Californians—or at least the politicians who claim to represent them—are quite well represented in Washington and wield quite a bit of power. Let's look at some numbers:
California has fifty-three votes in the US House of Representatives while Wyoming has one.
As a percentage of the full House, California's delegation makes up 12 percent of all votes while Wyoming's makes up 0.2 percent.
In fact, California's representation in the House is so large that California House members outnumber members from an entire region of the US: namely the Rocky Mountain region. If we add up all eight states of the Mountain West (Nevada, Utah, Arizona, Montana, Idaho, Wyoming, Colorado, and New Mexico) we come up with only thirty-one votes. Moreover, many of these states have split delegations (as with Arizona and Colorado) and rarely vote as a unified group. California, on the other hand, has only seven Republican House members out of fifty-three, meaning that the state's delegation tends to reliably vote together.
California enjoys a sizable advantage in the electoral college as well. It is true that the electoral college formula evens things up somewhat: California representatives wield more than 10 percent of all electoral college votes, while Wyoming holds only 0.6 percent. If the Rocky Mountain region were to vote together for a certain candidate, it could theoretically almost equal the power of California. But the region doesn't vote together, with Colorado, New Mexico, and Nevada often going for one party while the rest of the region goes for another. Thus California, because its electoral votes are centralized, brings more power to presidential elections than the entire Rocky Mountain region.
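The arithmetic behind these shares can be checked directly. A minimal sketch, using the pre-2020-census apportionment figures cited in the text (435 House seats, 538 electors; each state's electoral votes are its House seats plus two senators):

```python
# Representation shares for the House and the Electoral College,
# using the apportionment figures cited in the text above.
HOUSE_SEATS = 435
ELECTORAL_VOTES = 538

house = {"California": 53, "Wyoming": 1, "Mountain West (8 states)": 31}
# Electoral votes = House seats + 2 senators per state.
electors = {"California": 53 + 2, "Wyoming": 1 + 2, "Mountain West (8 states)": 31 + 2 * 8}

for name, seats in house.items():
    print(f"{name}: {seats / HOUSE_SEATS:.1%} of the House")
for name, ev in electors.items():
    print(f"{name}: {ev / ELECTORAL_VOTES:.1%} of the Electoral College")
```

This reproduces the figures in the text: California holds about 12.2 percent of the House and 10.2 percent of the electoral college, Wyoming about 0.2 and 0.6 percent respectively, while the whole Mountain West's forty-seven electors still fall short of California's fifty-five.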
So, while it's true that Wyoming's two votes in the US Senate make it easier for a Wyoming-led coalition to veto legislation that favors California, the same can be said of California in the House. While California and Wyoming theoretically have equal power in the Senate, Wyoming has essentially no power in the House, and could not possibly hope to do much to overcome opposition from California House members. The fact that voters in Wyoming have more US senators per person hardly places people with California-type interests at the mercy of people with Wyoming-type interests.
Essentially, the system as it now functions places significantly more veto power in the hands of California in the House. At best, though, Wyoming has only an equal veto to California in the Senate. Veto power, of course, is one of the most important aspects of US legislative institutions, since it is designed to help minority groups protect their own rights even when lacking a majority. This is the philosophy behind the Senate's filibuster, and the philosophy behind the bicameral legislature. The purpose is to provide numerous opportunities for minorities to veto legislation pushed by more powerful groups.
The importance of protecting minority rights, of course, is a mainstay of the ideology we used to call "liberalism"—the idea that people ought to enjoy basic human rights even when the majority doesn't like it.
All that is out the window, however, where progressives and other leftists suspect they are in the majority, in which case the protection of minority groups is null and void. Suddenly, the Democratic majority is the only thing that matters.
Had rank majoritarianism won the day in the past, Indian tribes, Catholics, Quakers, and Japanese Americans would have all been extirpated or run out of the country a century ago. But the sort of prudence that put some limits on the power of the majority in the past is now thoroughly unfashionable on the Left. The Left now strains to grasp the opportunity to rid itself forever of even the small amount of legislative resistance that can be offered by the hayseeds in flyover country.
The fact that California gets more than fifty times the House votes of North Dakota isn't enough for progressives. They must also stamp out what limited influence North Dakota has in the Senate as well.
This sort of thinking suggests that in order to make the US more "democratic" large minorities of voters—voters with specific economic and cultural interests that differ from those in other regions—ought to be rendered essentially powerless.
All that said, I endorse the plan from the Harvard Law Review that Millhiser is pushing. It undermines the idea that the US's current state boundaries are somehow sacrosanct and that enormous states with millions of people are a perfectly fine thing. The US and its member states are far too large, and could use a thorough dismembering. But Millhiser should be careful what he wishes for.
- 1. I should note there's nothing wrong with dividing up the US into a large number of small states. It's just that the scheme ought to encompass the entire country rather than just a tiny part of it with the goal of helping a single political party. Indeed, most US states are far too large and ought to be broken up into far smaller pieces: https://mises.org/wire/if-american-federalism-were-swiss-federalism-there-would-be-1300-states
George Mason University economist Tyler Cowen has penned a brief manifesto for what he calls "State Capacity Libertarianism" on the Marginal Revolution blog. In it he makes the case for libertarians to embrace "state capacity" in certain limited cases. You can read his essay here.
My initial responses, in no particular order, are as follows:
1. There is no political will or constituency for skillful technocratic state management of society. This is a pipe dream, once simply referred to as elusive "good government." When will public choicers, of all people, give this up?
2. There is no third way between state and market, regardless of technology or material development. Futurism is bunk; the question before us today is the same as thirty, fifty, or one hundred years ago: who decides? Decentralization vs. centralization is the most important policy question.
3. Western states won't give up their sclerotic regulatory, tax, central banking, and entitlement systems no matter how many flying cars or hyperloops we want. This reality will be a huge drag on science, infrastructure, medicine/health, and overall well-being.
4. The environmental movement will quash nuclear (especially after Fukushima), and the energy capacity vs. weight/cost issue will continue to plague electric cars/planes.
5. Left socialism, not libertarian futurism, is the rising tide across the West — and its constituency skews young. Adopting its pose, language, or ostensible goals won't produce Singapore.
6. Climate change is not a problem or issue for anyone to solve.
7. The West can't advance until it stops warring. War and peace won't be solved technocratically, and true noninterventionism requires a painful rethinking of the hubris known as universalism. I thought technocrats believed in realpolitik?
8. Human happiness and prosperity depend on elements of civil society which libertarian futurists don't like (faith, family, et al.). Hence the cheap jab at "Ron Paulism."
9. We build "capacity" in society through profit, saving, and capital investment. Government makes this worse, not better, in each and every case.
10. Libertarianism simply means "private." It is a non-state approach to organizing human society. It is not narrow or confining; in fact everything Cowen desires in an improved society can be advanced through private mechanisms.
Buried in the House Judiciary Committee's impeachment report is some insight into how the foreign policy establishment is attempting to manufacture a new Cold War with the Russian Federation. Much of the report is devoted to the primary charges against the president, the first being that he allegedly obstructed Congress's investigation.
The first claim is largely asserted through legalese about how Trump was insufficiently cooperative with Congressional investigators.
I'll let the lawyers cover that one.
The second charge, however, is more policy-based. The report claims it is possible to commit treason by simply acting to avoid giving money to a foreign government — if that money is being used against the US's adversaries. From there, the report asserts Trump essentially committed treason as part of an "aggravating factor" related to "an impeachable abuse of power." He did this by allegedly withholding foreign aid dollars from the Ukrainian state.
This "abuse of power" consists of the president engaging in bribery by attempting to use taxpayer dollars to extract political favors from the Ukrainian leadership in the form of dirt on former Vice President Biden. I'll leave the bribery and abuse charge to the lawyers, but what is of special interest here is the claim that the withholding of funds destined for the Ukraine government was objectionable largely because the withholding of funds imperiled the US's quasi war against Russia.
Let's follow the report's logic: according to the report, "a person commits treason if he uses armed force in an attempt to overthrow the government, or if he knowingly gives aid and comfort to nations (or organizations) with which the United States is in a state of declared or open war."
The report then asserts "America has a vital national security interest in countering Russian aggression, and our strategic partner Ukraine is quite literally at the front line of resisting that aggression."
The report goes on to claim it is essential that the US president "stand with our ally in resisting the aggression of our adversary." Basically, the logic of the report rests on the old propaganda tactic of claiming "we're fighting them over there so we don't have to fight them here."
Thus, the report strongly suggests that by briefly withholding foreign aid dollars from Ukraine, Trump committed treason because he was obstructing Ukraine's military efforts against Russia.
The report's language reminds us that in the minds of DC policymakers, the US is essentially at war with Russia and that any withholding of aid is the equivalent of conspiring with foreign enemies.
There are several problems with this logic, of course.
First, the United States is not "in a state of declared or open war" with Russia. Congress has not declared war on Russia — or anyone else at this time — as the law (i.e., the US Constitution) mandates. Nor is the US in a state of "open war" with Russia except in the minds of modern-day McCarthyites and their supporters.
It is telling that the phrase "open war" was added to the definition of "treason," since it is clear no legal state of war exists between the US and Russia. No doubt the authors of the report think the US must obviously be in a state of "open war" with Russia, but this is naturally a matter of opinion. This is why a legal process for declaring war on specific groups exists in the US Constitution. The fact that Congress has chosen not to declare war would suggest to the reasonable person that the US is, in fact, not at war with Russia. If certain people in the US government want the US to be at war with Russia, they ought to be forced to submit their motion to a majority vote in Congress. Until that happens, the US is not at war with Russia.
Secondly, given that Russia has not even been established as the US's "adversary" in accordance with Article I of the Constitution, it is difficult to see how any US agent commits treason by refusing to hand over taxpayer dollars to the Ukraine regime.
One could reasonably claim that by withholding these dollars, Trump was violating the law. This, however, is a long way from "treason." Moreover, it's entirely possible the president has engaged in bribery, obstruction of justice, or other offenses. The inclusion of the "treason" charge, however, suggests the Judiciary Committee thinks the obstruction and bribery charges were insufficient on their own. Thus, the treason charge had to be built on the back of the neo-Cold War ideology now prevailing in Washington.
This is the natural outcome of a foreign policy in which it is perceived to be the job of the United States to guarantee the safety of any and every foreign regime the US government happens to support.
This should surprise no one, since the bread and butter of Washington, DC, is perpetual war against countless real and imagined enemies. We're told ever greater resources must be devoted to Washington's continued dreams of global war. For example, the Pentagon is currently funded at levels above those of the Vietnam War, and above the Cold War average, yet we relentlessly hear about how the military establishment is at crisis levels of neglect. This is demonstrably false, as is the claim that withholding a few bucks from the corrupt Ukraine regime puts the US in danger from a Russian invasion.
There are certainly good reasons to impeach presidents, but not being sufficiently pro-war isn't one of them. If the US Congress were less committed to a maximalist foreign policy, it would be impeaching presidents for war crimes, instead of claiming — rather ridiculously — that the US has a "vital national security interest" in Ukraine. After all, virtually every president since 1945 — including the current one — has started or continued undeclared illegal wars against foreign regimes. Every president since Reagan has bombed foreigners without any legal justification whatsoever. Each one of them was eligible for impeachment on this issue.
But we never hear any call for impeachment from Congressional leaders on those grounds. Instead, what we have now appears to be an impeachment process largely driven by the desire to punish a president for not provoking a war.
Frederic Bastiat made a clear distinction between the good economist and the bad economist. For him, the good economist looks beyond what is immediately apparent and instead looks much further into the future. In this world, however, we have many bad economists, and as Bastiat writes in "That Which Is Seen, and That Which Is Not Seen," "It almost always happens that when the immediate consequence is favorable, the ultimate consequences are fatal, and the converse." As a result, Bastiat warns, "It follows that the bad economist pursues a small present good, which will be followed by a great evil to come."
A refusal to look beyond the immediate and “seen” to the “unseen” also leads to bad economic theory, which, beyond the mere foibles of individual economists, solidifies the practice of ignoring the hidden future effects of economic policies. These bad theoretical frameworks inevitably lead to bad policies and eventually the destruction of wealth via misallocation of resources within the economy.
This phenomenon is certainly common enough in modern economic policies and their underlying theoretical frameworks. For example, we have a large body of economics built on pseudo-facts and unrealistic assumptions. This has been characterized in part by the over-mathematization of economics. As a result, economic analysis relies only on those phenomena that can be quantified, measured, and fit within certain types of models. Other information is ignored.
In other words, policies built on these theoretical frameworks and mathematical models focus heavily on what is seen, such as prices, wages, volume, GDP, and other such metrics. A decline in these factors, it is believed, must immediately be remedied by other measurable activities, such as injection of money and credit into the economy, regulations, subsidies, and so on.
These have an immediate, short-term, and "seen" effect. But the long-term and "unseen" consequences, being hard to observe, are often ignored, disregarded, or outright denied.
However, the long term and unseen consequences of these may include changes in and the distortion of the market structure, inflation, stifled trade, destruction of capital and wealth, and misallocation of labor, capital, and production capacity. These factors may be very hard to observe, although they have deeper and lasting effects on the growth and sustainability of the economy.
These unmeasurable distortions of the economy often destroy capital and give rise to zombie companies and sectors that end up suffocating the economy by driving out productive companies and sectors.
Bastiat continues, "The true economist pursues a great good to come, at the risk of a small present evil." Good economics looks to the long term and it confers great importance to what is unseen due to the fact that it almost always has longer lasting, more severe, and deeper consequences. For example, excessive borrowing by government and then the handing out of welfare is seen. But the entrepreneur who is crowded out of the credit market, the potential jobs, and the increase in national wealth that he could have contributed are unseen.
So how to practice good economics?
The good economist takes a very different view, recognizing the importance of uncovering hidden effects and relationships. This can be facilitated with less attention to mere quantitative analysis and more attention to sound theory and qualitative analysis. This, however, requires prudence and restraint on the part of the economist. It requires that he or she allow natural market structures and mechanisms to follow their natural trajectories so as to organize the market in the most efficient and productive way. How or when this happens cannot always be directly observed, and thus it becomes difficult to justify tinkering endlessly with the machinery of the economy.
The good economist will not be discouraged by this, however, and will instead pursue a greater understanding of the economy that includes the unseen as well as the seen.
Today the Washington Post published a bombshell report titled “The Afghanistan Papers,” highlighting the degree to which the American government lied to the public about the ongoing status of the war in Afghanistan. Within the thousands of pages, consisting of internal documents, interviews, and other never-before-released intel, is a vivid depiction of a Pentagon painfully aware of the need to keep from the public the true state of the conflict and the doubts, confusion, and desperation of decision-makers spanning almost 20 years of battle.
As the report states:
The interviews, through an extensive array of voices, bring into sharp relief the core failings of the war that persist to this day. They underscore how three presidents — George W. Bush, Barack Obama and Donald Trump — and their military commanders have been unable to deliver on their promises to prevail in Afghanistan.
With most speaking on the assumption that their remarks would not become public, U.S. officials acknowledged that their warfighting strategies were fatally flawed and that Washington wasted enormous sums of money trying to remake Afghanistan into a modern nation....
The documents also contradict a long chorus of public statements from U.S. presidents, military commanders and diplomats who assured Americans year after year that they were making progress in Afghanistan and the war was worth fighting.
None of these conclusions surprise anyone who has been following America’s fool’s errand in Afghanistan.
What makes this release noteworthy is the degree to which it shows the lengths to which Washington went to knowingly deceive the public about the state of the conflict. This deception extends even to the federal government’s accounting practices. As the report notes, the “U.S. government has not carried out a comprehensive accounting of how much it has spent on the war in Afghanistan.”
As the war has dragged on, officials have struggled to justify America’s military presence. As the report notes:
A person identified only as a senior National Security Council official said there was constant pressure from the Obama White House and Pentagon to produce figures to show the troop surge of 2009 to 2011 was working, despite hard evidence to the contrary.
“It was impossible to create good metrics. We tried using troop numbers trained, violence levels, control of territory and none of it painted an accurate picture,” the senior NSC official told government interviewers in 2016. “The metrics were always manipulated for the duration of the war.”
Making Washington’s failure in Afghanistan all the more horrific is how easily predictable it was for those who desired to see the warfare state for what it is.
In the words of Lew Rockwell, in reflecting on the anti-war legacy of Murray Rothbard:
War is inseparable from propaganda, lies, hatred, impoverishment, cultural degradation, and moral corruption. It is the most horrific outcome of the moral and political legitimacy people are taught to grant the state.
On this note, the significance of the Washington Post’s report should not distract from another major story that has largely been ignored by mainstream news outlets.
Recently, multiple inspectors with the Organisation for the Prohibition of Chemical Weapons have come forward claiming that relevant evidence related to their analysis of the reported 2018 chemical gas attack in Douma, Syria, was suppressed. As Counterpunch.org has reported:
Assessing the damage to the cylinder casings and to the roofs, the inspectors considered the hypothesis that the cylinders had been dropped from Syrian government helicopters, as the rebels claimed. All but one member of the team concurred with Henderson in concluding that there was a higher probability that the cylinders had been placed manually. Henderson did not go so far as to suggest that opposition activists on the ground had staged the incident, but this inference could be drawn. Nevertheless Henderson’s findings were not mentioned in the published OPCW report.
The staging scenario has long been promoted by the Syrian government and its Russian protectors, though without producing evidence. By contrast Henderson and the new whistleblower appear to be completely non-political scientists who worked for the OPCW for many years and would not have been sent to Douma if they had strong political views. They feel dismayed that professional conclusions have been set aside so as to favour the agenda of certain states.
At the time, those who dared question the official narrative about the attack - including Rep. Tulsi Gabbard, Rep. Thomas Massie, and Fox News’s Tucker Carlson - were derided for being conspiracy theorists by many of the same Serious People who not only bought the Pentagon’s lies about Afghanistan but also the justifications for the Iraq War.
Once again we are reminded of the wise words of George Orwell: “truth is treason in an empire of lies.”
These attacks were promoted as justification for America to escalate its military engagement in the country, with the beltway consensus lobbying President Trump to reverse his administration's policy of pivoting away from the Obama-era mission of toppling the Assad regime. While Trump did respond with a limited missile attack, the administration rejected the more militant proposals promoted by some of its more hawkish voices, such as then-UN Ambassador Nikki Haley.
In a better timeline, the ability of someone like Rep. Gabbard to see through what increasingly looks like another attempt to lie America into war would warrant increased support in her ongoing presidential campaign.
Instead, we are likely to continue to see those that advocate peace attacked by the bipartisan consensus that provides cover for continued, reckless military action abroad.
We usually think of Friedrich Hayek as a moderate, at least when compared with Mises and Rothbard, but he had a radical side as well. Hidden away in a note to the third volume of Law, Legislation, and Liberty, he makes a comment that puts him far outside “respectable” public opinion. He says that the inventor of “freedom from want” was “the greatest of modern demagogues.” Hayek’s condemnation of Franklin Roosevelt is as forthright as any radical could wish.
Here is the passage: “In view of the latest trick of the Left to turn the old liberal tradition of human rights in the sense of limits to the powers both of government and of other persons over the individual into positive claims for particular benefits (like the 'freedom from want' invented by the greatest of modern demagogues) it should be stressed here that in a society of free men the goals of collective action can always only aim to provide opportunities for unknown people, means of which anyone can avail himself for his purposes, but no concrete national goals which anyone is obliged to serve. The aim of policy should be to give all a better chance to find a position which in turn gives each a good chance of achieving his ends than they would otherwise have.” (Law, Legislation, and Liberty, Volume 3, note 42, pp. 202–203 in the one-volume edition of the trilogy published by Routledge, 1982)
On Tuesday, Congressional impeachment hearings exposed an interesting facet of the current battle between Donald Trump and the so-called deep state: namely, that many government bureaucrats now fancy themselves as superior to the elected civilian government.
In an exchange between Rep. Devin Nunes (R-CA) and Alexander Vindman, a US Army Lt. Colonel, Vindman insisted that Nunes address him by his rank.
After being addressed as "Mr. Vindman," Vindman retorted "Ranking Member, it's Lt. Col. Vindman, please."
Throughout social media, anti-Trump forces, who have apparently now become pro-military partisans, sang Vindman's praises, applauding him for putting Nunes in his place.
In a properly functioning government — with a proper view of military power — however, no one would tolerate a military officer lecturing a civilian on how to address him "correctly."
It is not even clear that Nunes was trying to "dis" Vindman, given that junior officers have historically been referred to as "Mister" in a wide variety of times and places. It is true that higher-ranking officers like Vindman are rarely referred to as "Mister," but even if Nunes was trying to insult Vindman, the question remains: so what?
Military modes of address are for the use of military personnel, and no one else. Indeed, Vindman was forced to retreat on this point when later asked by Rep. Chris Stewart (R-UT) if he always insists on civilians calling him by his rank. Vindman blubbered that since he was wearing his uniform (for no good reason, mind you) he figured civilians ought to refer to him by his rank.
Of course, my position on this should not be construed as a demand that people give greater respect to members of Congress. If a private citizen wants to go before Congress and refer to Nunes or any other member as "hey you," that's perfectly fine with me. But the important issue here is we're talking about private citizens — i.e., the people who pay the bills — and not military officers who must be held as subordinate to the civilian government at all times.
After all, there's a reason that the framers of the US Constitution went to great pains to ensure the military powers remained subject to the will of the civilian government. Eighteenth and nineteenth century Americans regarded a standing army as a threat to their freedoms. Federal military personnel were treated accordingly.
Article I, Section 8 of the Constitution states that Congress shall have the power "to raise and support Armies …" and "to provide and maintain a Navy." Article II, Section 2 states, "The President shall be the Commander in Chief of the Army and Navy of the United States, and of the Militia of the several States when called into the actual Service of the United States." The authors of the Constitution were careful to divide up civilian power over the military, and one thing was clear: the military was to have no autonomy in policymaking. Unfortunately, early Americans did not anticipate the rise of America's secret police in the form of the CIA, FBI, NSA, and other "intelligence" agencies. Had they anticipated it, it is likely the anti-federalists would have written more into the Bill of Rights to prevent organizations like the NSA from shredding the Fourth Amendment, as has since happened.
The inversion of the civilian-military relationship that is increasingly on display in Washington is just another symptom of the growing power of often-secret and unaccountable branches of military agencies and intelligence agencies that exercise so much power both in Washington and around the world.
The Federal Reserve lowered its benchmark interest rate on Wednesday, cutting the target federal funds rate by 0.25 percent to a range of 1.5 to 1.75 percent.
The Fed's rate-setting committee, the FOMC, has now cut rates three times this year. The committee's rhetoric around the rate cut was the usual routine. The committee's statement indicated that "the labor market remains strong and that economic activity has been rising at a moderate rate." But the official statement says something similar nearly every time the committee meets. So there is no information here to suggest why the committee is cutting now versus all the other times the labor market is "strong" and economic activity is "moderate."
Two members of the committee voted against the cut: Esther L. George and Eric S. Rosengren.
Both preferred to hold rates steady rather than cut. George, like her predecessor Thomas Hoenig at the Kansas City Fed, is relatively hawkish — although not to the extent Hoenig was.
Thus, George noted in response to the rate cut: “While weakness in manufacturing and business investment is evident, it is not clear that monetary policy is the appropriate tool to offset the risks faced by businesses in those sectors when weighted against the costs that could be associated with such action.”
In other words, George recognizes that, yes, there are downsides to expansionary monetary policy.
Although the Fed statements offer no insights, the fact that the Fed continues to cut rates suggests it is working from a position of fear about the true strength of the economy. Although jobs data continues to point to expansion, a number of other indicators look less rosy. The Case-Shiller index, for example, has fallen to 2 percent growth and appears to be headed toward zero; we last saw a similar dynamic in 2006. Moreover, new housing permit growth has been negative (year over year) in six of the last ten months. Tax receipt data has also been weak, with seven of the last ten reported periods showing negative year-over-year growth.
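The "negative year-over-year" counts cited above come from comparing each month with the same month one year earlier. A minimal sketch of that calculation, using hypothetical permit figures rather than actual data:

```python
# Year-over-year growth: each month compared with the same month a year earlier.
# The permit figures below are hypothetical, purely to illustrate the calculation.
def yoy_growth(series):
    """Percent change of each value versus the value 12 entries earlier."""
    return [(cur - prev) / prev * 100 for prev, cur in zip(series, series[12:])]

permits = [100, 102, 101, 103, 104, 105, 104, 106, 107, 106, 108, 109,   # year 1
           101, 103, 100, 102, 103, 106, 103, 105, 108, 105, 107, 108]   # year 2

growth = yoy_growth(permits)
negative_months = sum(1 for g in growth if g < 0)
print(f"{negative_months} of the last {len(growth)} months negative year-over-year")
```

A month can show negative year-over-year growth even while the level ticks up from the prior month, which is why this measure catches weakness that month-to-month comparisons obscure.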
It's true that other indicators point to strength, but if things are going so well, why cut rates?
After all, the target rate is already remarkably low, even by the standards of the previous expansion, when the fed funds rate was allowed to rise above five percent.
The Fed has justified this ultra-low-rate policy with theories about the natural interest rate, and with the alleged need to keep price inflation at or above two percent.
The problem is that the Fed cannot actually observe the natural interest rate, and the two-percent inflation target is a completely arbitrary standard invented only in recent years.
Nonetheless, the Fed continues to look relatively restrained compared to other central banks, to which its policies are in part a reaction. Other central banks have set a very low bar, to be sure, but the Fed nonetheless looks almost hawkish compared to the ECB and the Bank of Japan, both of which are pursuing negative-interest-rate policies. Even with the latest rate cut, the Fed's target rate remains above that of the Bank of England and equal to that of the Bank of Canada.
But the target rate is, of course, not the Fed's only policy tool. To address liquidity problems observed during the recent repo crisis, the Fed has stepped up purchases and added to its balance sheet.
And then there is the interest the Fed pays on reserves. On Wednesday, the FOMC also announced a cut to the interest rate "paid on required and excess reserve balances," dropping the rate from 1.8 percent to 1.55 percent, mirroring the drop in the fed funds rate.
This keeps the interest paid on reserves 0.2 percentage points below the upper bound of the fed funds target range. That's the biggest gap we've seen since 2008, and it suggests the Fed wants more lending in the real economy, even though it's also apparently concerned about liquidity for banks.
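The spread is simple arithmetic, but worth checking against the figures quoted above; a minimal sketch, where the 1.75 percent figure is the upper bound of the target range implied by those numbers:

```python
# Spread between the fed funds target and the rate paid on reserves,
# using the figures cited above.
fed_funds_upper = 1.75       # percent, upper bound of the target range
interest_on_reserves = 1.55  # percent, after the announced cut from 1.80

spread = round(fed_funds_upper - interest_on_reserves, 2)
print(spread)  # 0.2
```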
This makes sense if we're in a late phase of the boom, which brings increased demand for loans but without sufficient savings and earnings at the street level to assure liquidity for banks through the marketplace. This is a problem one encounters only in an economy built on central-bank credit expansion. Central bankers no doubt are sure they can navigate these waters, but it's unclear how long they can keep the current boom going.
2019 was the year that the blood-test requirement for marriage was finally abolished in all 50 US states.
This past March, the governor of Montana signed the legislature's bill abolishing the state's requirement that women submit to blood tests to be screened for rubella prior to the granting of a marriage license.
Technically, Montana had removed the absolute mandate in 2007, but the change would only "allow brides to opt out after signing an acknowledgement of the pregnancy risks related to rubella, and only if the groom signs too. Otherwise, the female applicant must provide a medical certificate signed by a physician stating that she has been tested or is exempt for medical reasons."
The 2019 legislation now completely removes the requirement.
Montana was the only remaining state with a blood test requirement. As recently as 1980, though, 34 states still had laws on the books requiring blood tests before marriage. Kasey S. Buckles, Melanie Guldi, and Joseph Price provide a concise summary of the legislative trend:
Of these 34, 19 states repealed their law in the 1980s, 7 repealed in the 1990s, and 7 more repealed between 2000 and 2008, leaving only Mississippi with a BTR in 2009.
(Buckles, et al., counted the Montana "opt out" as abolition, leaving only Mississippi.)
Mississippi ended its requirement in 2012.
But why was there ever a requirement at all?
Like so many invasive procedures mandated by governments, mandatory blood tests for couples seeking marriage licenses were a product of the age of eugenics and Progressive politics — two things that often go together.
As Ruth C. Engs notes in The Progressive Era's Health Reform Movement: "'Racial improvement' through positive eugenics, such as marriage to a healthy individual, [and] blood tests for syphilis prior to marriage ... were promoted for improving the 'race,' thus leading to a healthier nation."
The right of individuals to marry whom they wished was thus swept aside in the name of "hygiene" and public health. Blood tests took their place alongside prohibitions on interracial marriage as a means of "racial improvement."
Between 1980 and 2008, abolition was more a function of improved medical treatment options than of any commitment to medical freedom or marriage freedom.
Buckles, et al., note:
Historically, many states have required applicants for a marriage license to obtain a blood test. These tests were for venereal diseases (most commonly syphilis), for genetic disorders (such as sickle-cell anemia), or for rubella. The tests for syphilis were part of a broad public health campaign enacted in the late 1930s by U.S. Surgeon General Thomas Parran. Parran argued that premarital testing was necessary to inform the potential marriage partner of the risk of contracting a communicable disease, and to reduce the risk of birth defects associated with syphilis. According to Brandt (1985), "by the end of 1938, twenty-six states had enacted provisions prohibiting the marriage of infected individuals." Screenings for genetic disorders and for rubella were also implemented in the interest of minimizing the risk of genetic disease or birth defects in the couple's offspring.
Buckles, et al., note that it soon became apparent that the cost of the mandate was very high and the benefits were quite low:
In the case of syphilis, however, it was soon recognized that premarital blood testing was not a cost-effective way to screen for the disease. Despite reports that 10% of Americans were infected, only 1.34% of applicants in New York City's first year of testing were found to have the disease. Brandt (1985) notes that a premarital exam was "not the optimal locus for screening," since couples seeking to marry were not likely to be in the most at-risk groups, and individuals who knew they were infected could wait until the infection cleared to apply for a license. ... Nationwide, couples spent over $80 million to reveal 456 cases.
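Those last figures imply a strikingly high price tag per case found. A quick back-of-envelope calculation, using only the numbers quoted above:

```python
# Cost per syphilis case detected by premarital screening,
# using the nationwide figures quoted above.
total_cost = 80_000_000  # dollars (reported as "over $80 million")
cases_detected = 456

cost_per_case = total_cost / cases_detected
print(f"${cost_per_case:,.0f} per detected case")  # over $175,000 each
```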
By the 1980s, sexually transmitted diseases were far more treatable than they had been in the 1930s. This lessened the importance of alerting future sex partners to one's health status. (This rationale, of course, relies partly on the assumption that people rarely have sex outside marriage, a view that was rather fanciful even in the 1930s.)
Nonetheless, abolition was not just a matter of deliberation over the medical efficacy of the laws. Ordinary people never appeared to be enthusiastic about the mandates, and many resented the additional hoops they needed to jump through to carry on with their personal lives.
It should surprise no one, then, that couples actively sought to avoid the costly and time-consuming test requirements.
Blood test requirements led some couples to marry in states that did not have the mandates: "it appears that about one-third of the decrease in licenses is due to couples marrying out of state, while about two-thirds choose not to marry at all."
So, it turns out the mandated blood tests worked to discourage marriage while doing little to actually identify people with disease or improve public health.
The mandate was great for the medical industry, however, since it required the payment of many millions of dollars for otherwise unnecessary medical procedures.
You'll Marry Your Sister!
Another source of confusion over mandated blood tests has been a belief held by some that blood tests were used to screen for the "marry-your-sister" problem. That is, some think that blood-test mandates exist for purposes of genetic testing.
Unlike tests for venereal disease, however, genetic testing for consanguinity is very expensive, and has never been generally mandated by states. The issue could much more cheaply and pragmatically be addressed by granting people the legal right to know who their biological parents are, when the information is available. Cases of consanguinity are generally tied to cases when a marriage partner has been adopted, abandoned, or otherwise is unaware of his or her biological parents.
Conflicting Messages on Medical Freedom
It may be, however, that the abolition of one violation of medical freedom could be replaced by another.
After all, Montana's blood-test requirement was justified on the grounds that it prevented health problems for a third party. That is, if a woman with rubella becomes pregnant, the disease can have devastating effects on the developing fetus. Functionally, it was more a pre-pregnancy requirement than a pre-marriage requirement.
Just as modern treatments for STDs lessened the need to test for syphilis ahead of time, the need for pre-pregnancy testing for rubella was largely supplanted by the prevalence of vaccines against rubella. Will the repeal of the rubella blood test signal a renewed drive toward mandatory rubella vaccination in Montana? For now, efforts to remove all non-medical exemptions for vaccines are concentrated in states like New York and California. But the issue is certainly not confined to these places.
It should not be assumed the movement toward abolishing blood tests was motivated by libertarian impulses.