OK Boomer, It's Not Important to Respect (All) Your Elders

11/20/2019 Ryan McMaken

Now that I've reached the ripe old age of 42, I've been married for twenty years, and I've partially raised four children.

The older I get, the more I realize how very wrong I was to ever think that a disproportionate number of people older than me possessed some sort of special knowledge about how to properly run one's life.

The amount of laziness, moral degeneracy, arrogance, and general buffoonery I've witnessed among the older set has forever cured me of the idea that my "elders," prima facie, are a source of wisdom.

This doesn't mean none of our elders provide excellent examples to aspire to. Many do.

But the problem lies in figuring out which ones are worthy of such consideration.

Many parents will recognize this conundrum from problems encountered while parenting.

After all, obedience and respect for others, practiced properly, are virtues. But who is deserving of obedience or respect?

As a parent, what quickly becomes apparent is that it takes very little effort to tell young people they should be obedient to people who are in positions of authority. This, apparently, is what people have done in a great many times and places. Many are told to "respect" cops, soldiers, their teachers, clergy, government officials, parents, elders, and people with impressive titles.

But this is also a very lazy way of teaching children how to engage with their world. Any half-wit can just wave a hand and tell children to respect people in positions of authority.

The proper — but much more difficult — way of teaching "respect" is to teach the young that only some people in positions of authority deserve respect. The hard part is figuring out who deserves it and who doesn't. (Even more difficult is the task of earning respect from others.)

For example, a police officer who doesn't know the law, shirks his duty, or abuses his power does not deserve respect. A politician who is dishonest or imagines himself a hero while living off the sweat of taxpayers doesn't deserve respect. A school teacher who is lazy, teaches her subject poorly, or treats students badly, deserves only contempt. A parent who spends the family budget on toys for himself doesn't deserve respect. An "elder" who lives a life of dissipation ought to be treated accordingly.

Unfortunately, all police officers wear the same uniform. All politicians wear similar "respectable" outfits. There is no easy way to just look at a teacher or college professor and know if he or she is competent.

This task is especially difficult for children, who are only just beginning to learn how to differentiate between honorable people and ignorant fools.

But we have to start somewhere, and a good place to start is not by insisting that just because Old Man Wilson managed to avoid death for a certain number of decades, his words must be heeded.

That many people still believe this nonsense, however, has been on display in recent years thanks to social media and the seemingly endless number of news articles and op-eds about "Millennials." The recent rise of the dismissive phrase "OK boomer" has elicited even more whining from some boomers about how the youngsters ought to show them more respect. Some have even attempted to claim the term is a slur like the "n-word" or a violation of federal anti-discrimination law.

Please.

And for what, exactly, is this respect deserved? Granting that boomers didn't directly exercise much political power until the 1990s, we may still ask:

Do they deserve respect for running up 20 trillion dollars of government debt since the 90s?

Do they deserve respect for inaugurating a period of endless war that began with the periodic bombing of Iraq and the Balkans, and which continues to today?

Do they deserve respect for ushering in a culture in decline, characterized by latchkey children, widespread divorce and out-of-wedlock children, a rising suicide rate, and the continued obliteration of civil society in general?

Do they deserve respect for the destruction of the Bill of Rights through "patriotic" legislation like the Patriot Act and the continued spread of our modern surveillance state?

Too Much Aggregation

This sort of "analysis," of course, misses most of the details and relies on broad generalizations. It is not true that all boomers supported the sort of policies that led to endless war, out-of-control spending, and the destruction of our human rights. Many boomers actively opposed this sort of thing. But many others did, directly or indirectly, support all these unfortunate developments in recent decades. And they deserve the scorn they receive.

But this very fact makes our point for us: it is never a good idea to pay respect to elders just because they are elders. They deserve no more respect than anyone else, until proven otherwise. The same ought to be applied to any group demanding respect, whether that be judges, cops, bishops, or university faculty.


The Tory Landslide May Soon Bring Scottish Independence

12/13/2019 Jeff Deist

The results from yesterday's general election in the United Kingdom are stark for the Labour Party, which lost 59 seats in Parliament. The Tories picked up 47 seats, leaving Labour with its worst showing since 1935. This "second referendum" on Brexit presents Prime Minister Boris Johnson with clear support for a no-deal Brexit, while delivering a stinging rebuke to the London-centric Remain bloc. It also signifies the likely end of Jeremy Corbyn's career, because unlike American politicians, British politicians at least have the decency to go away after voters reject them.

The more interesting story yesterday was the remarkable and ongoing success of the Scottish National Party (SNP), which gained a whopping 13 seats in Parliament. It now represents most of Scotland geographically: Liberal Democrats hold only the northernmost counties and parts of Edinburgh, while Labour clings to a tenuous hold in the southern part of that city. What does it portend when the two primary left-wing parties in the UK no longer represent left-leaning Scotland?


The 2014 Scottish referendum on independence from the UK revealed many of the same schisms present in the later Brexit referendum and the 2016 US presidential election: young vs. old, pensioner vs. worker, country vs. city, and cosmopolitan vs. parochial. But the attendant narrative of nationalist vs. globalist falls apart when it comes to the Scots, who are far more pro-EU than the UK generally. The pro-independence Yes! vote in 2014 skewed younger, favored left-wing policies across the board, and sought greater connection with Europe and Brussels. In fact, Scots later voted nearly 2-1 against Brexit.

But while Fox-hunting rural Brexiteers and Scottish secessionists may share the same disdain for London and Westminster, they do so for entirely different reasons.

Many older Scots worried that independence might threaten their pensions, and the banking community questioned whether Westminster would allow a breakaway Scotland to continue using the British pound. Nobody wanted a rushed transition to the euro, but without a central bank of its own (that pesky sovereignty issue again) Scotland might have been stuck in a vice between two currencies. Independence forces also failed to convince voters that Scotland's vaunted North Sea oil reserves would help fund the newly independent country, especially given falling oil prices and potential territorial disputes over revenues.

These economic concerns were enough to squelch the independence vote by a comfortable 55-45% margin. But economics is not everything. Politically, culturally, and socially it was clear the Scots wanted to be part of Europe, not part of an English-dominated UK.

It may be even clearer today. Already this morning the Twittersphere buzzes with talk of a renewed Scottish independence campaign, while the SNP yesterday announced its support for another referendum if a "material change in circumstances" arose between Scotland and the greater union. Surely a landslide victory by the Tories — who are widely disliked by the Scots — and a flashing green light for a deeply unpopular Brexit represent exactly such a change.

Scotland and England are not magically joined at the hip. If the Scots don't want Brexit, don't want Boris Johnson, and don't want the Tories, who says the current political makeup of the UK is forever and unchanging? Political arrangements are not something to impose on reluctant, disbelieving people. If we favor independence and political self-determination only when we like the results, the only liberty on offer is the liberty to agree. But political universalism is an abstraction, and an arrogant one at that.

If Scots choose Holyrood over Westminster, or even Brussels over Holyrood, who are we to object?


Upcoming State Minimum Wage Increases Will Outlaw Jobs for Low-Wage Workers

Americans have witnessed people picketing in support of a $15 minimum wage. It’s called “The Fight for $15.” Florida citizens will even vote on embedding a $15 minimum wage in their state constitution in the November 2020 election. A minimum wage plebiscite is pending in Idaho in 2020. If the past is any indication, the prospects for the Florida and Idaho initiatives are good. Of the 27 state minimum wage votes since 1988, 25 passed. The only two that were rejected, in Missouri and Montana, were in 1996. For the 25 that passed, electoral margins have almost always been substantial. For information about these ballot initiatives, see ballotpedia.org.

Interestingly, Floridians first put a minimum wage in their constitution in 2004, when it was set at $6.25 and indexed to inflation each year. That indexing is the reason Florida’s current minimum wage is $8.46, an otherwise curious number. The 2020 amendment proposes increasing the current minimum wage to $10.00 in September 2021, and thereafter in $1.00 increments to $15.00 in September 2026.
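The arithmetic behind both mechanisms is simple enough to sketch in a few lines of Python. The wage figures come from the amendments as described above; treating the 2004 indexation as fifteen annual adjustments (and the roughly 2 percent average annual rate that falls out of it) is my own simplifying assumption, for illustration only.

def florida_proposed_wage(year):
    # Stepped schedule from the 2020 amendment: $10.00 in September 2021,
    # rising by $1.00 each September until reaching $15.00 in September 2026.
    return min(10.00 + max(0, year - 2021), 15.00)

# Implied average annual inflation adjustment behind the 2004 amendment:
# $6.25 indexed yearly, reaching $8.46 by 2019 (about fifteen adjustments).
implied_rate = (8.46 / 6.25) ** (1 / 15) - 1
print(f"implied indexation rate: {implied_rate:.2%}")  # about 2% per year

for year in range(2021, 2027):
    print(year, florida_proposed_wage(year))  # 10.0, 11.0, ... 15.0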

Idaho’s plebiscite, if enacted, will raise the current $7.25 minimum wage to $12 by 2024. Moreover, and particularly important for what follows, it will eliminate a current provision that allows people under age 20 to be paid $4.25 an hour for their first 90 days on the job.

Media accounts of the “Fight for $15” are supportive. They typically offer numerical exercises showing the “inadequate” earnings of people working full time at the minimum wage. The accounts are laced with terminology such as “fair” and “living” wages, a tactic surely designed to evoke public sympathy and support for those struggling economically. It also seizes the moral high ground for supporters, since it puts their opponents in the untenable position of appearing to favor “unfair” and “non-living” wages. It always helps to control the terminology used in a debate, doesn’t it?

Sympathy and terminology without knowledge can be dangerous, or as the saying goes, “the road to hell can be paved with good intentions.” Or as University of Chicago Nobel Laureate in economics George Stigler once put it: “Whether one is a . . . churchman or a heathen, it is useful to know the causes and consequences of economic phenomena.”

The Source of the Problem

Whether intended or not, increases in the minimum wage doom those whose economic value to employers falls between the current minimum wage and the proposed higher one. The lower rungs of these people’s economic ladders are cut off. They lose their jobs regardless of the intentions of their supposed supporters. Their supposed supporters are actually their enemies. They need to “help” less.
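The logic can be made concrete with a small Python sketch, using Idaho’s current $7.25 and proposed $12.00 as the two minimums. The workers and productivity figures below are hypothetical; the point is only that everyone whose output is worth more than the old minimum but less than the new one becomes legally unemployable.

# Hypothetical hourly productivity (value to an employer) of five workers.
workers = {"Ann": 5.00, "Ben": 8.00, "Cam": 10.50, "Dee": 13.00, "Eva": 20.00}

def employable(productivity, minimum_wage):
    # Hiring anyone whose output is worth less than the legally
    # required wage is a money-losing proposition for the employer.
    return productivity >= minimum_wage

for wage in (7.25, 12.00):
    hired = sorted(name for name, p in workers.items() if employable(p, wage))
    print(f"minimum wage ${wage:.2f}: employable -> {hired}")

# At $7.25, Ben, Cam, Dee, and Eva are employable. At $12.00, only Dee and
# Eva remain: Ben and Cam, whose output falls between the two minimums, lose.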

It is unfortunate to read that the person who filed Florida’s ballot initiative said: “In life, I think that you’re supposed to do the most, for the most with the least. ... I did [the ballot initiative] in a way that would be business-friendly, and not just throw them in the deep end.” Similarly, an official with “Idahoans for a Fair Wage” said: “I think this would stimulate the economy, and our goal really is to help lift working Idahoans out of poverty. We’ve found that a lot of the people that this would really help would be people pretty much 35 and under.”

In truth, these bills only make many low-wage workers legally unemployable. This is an odd way to “stimulate the economy.” And as noted at the outset, eliminating a sub-minimum wage for workers under 20 is a way to hurt, not help, less productive workers. Just because a worker has relatively low productivity is no reason to outlaw that person’s job. But that is what minimum wage increases do.



Artwork Made from Old Bananas Shows Value Is Subjective

12/11/2019 Ryan McMaken

Last week, at the Art Basel fair in Miami, a work of art composed of a banana duct-taped to a wall sold for $120,000. At least one other identical piece sold for a similar amount. A third piece was priced at $150,000. The banana used in the display is a real banana, and on Saturday, a performance artist named David Datuna ate some of it.

Datuna's stunt merely illustrated what everyone should have already known: the value of the artwork had almost nothing to do with the banana itself. Its value came not from the amount of labor that went into it or from the cost of the physical materials involved. A spokeswoman for the gallery summed up the real source of the item's value, noting, “He [Datuna] did not destroy the artwork. The banana is the idea.”

In other words, the people who purchased the art weren't actually purchasing a banana and tape. They were buying the opportunity to communicate to peers that they were rich enough to throw around $120,000 on a work of art that would soon cease to exist. This was a transaction that involved purchasing status in exchange for money. The banana was only a tiny part of the exchange.

Moreover, the transaction offered the opportunity for the gallery, the art seller, and the art buyer to all further increase their status by being the topic of countless news articles and social media discussions. As was surely anticipated by the artist and everyone else involved in the banana sale, the media could be counted on to act as if this art were something new, outrageous, or exciting. "Art world gone mad," the New York Post announced on its front page. Hundreds of thousands of people in various social media forums chimed in on the matter.

One wonders, however, how many times this shtick can be repeated before people lose interest. Apparently: many times. After all, this sort of art is not a new thing. For decades, avant-garde artists have been using garbage and other found objects to create art. And people with a lot of disposable income have been willing to pay a lot of money for it. It's all basically an inside joke among rich people. And regular people have the same reaction over and over again.

But there's absolutely nothing at all that's shocking, confusing, or incomprehensible from the point of view of sound economics. Transactions like these should only surprise us if we're still in the thrall of faulty theories of value, such as the idea that goods and services are valued based on how much labor and materials went into them. That's not true of any good or service. And it's certainly not true of art.

Is It Garbage or Is It Art?

In fact, two identical items can be valued in two completely different ways if the context and description of the objects change.

According to the Daily Mail, a 2016 study suggests that people value ordinary objects differently depending on what they are told about the objects: "According to the new research, being told that something is art automatically changes our response to it, both on a neural and a behavioural level."

In this case, researchers in Rotterdam, the Netherlands, asked subjects to rate how much they valued objects shown in photographs. When told that those objects were "art," people valued them differently.

In other words, the perceived value of objects could change without any additional labor being added to them, and without any physical changes at all. 

The value, it seems, is determined by the viewer, and we're reminded of Carl Menger's trailblazing observations about value:

Value is a judgment economizing men make about the importance of the goods at their disposal for the maintenance of their lives and well-being. Hence value does not exist outside the consciousness of men.

One moment the viewer may think he's looking at garbage, which he has likely learned is of little value. When told that said junk is really "art," the entire situation changes. (Of course, we would need to see these preferences put into real action via economic exchange to know them for sure.)

The change, as both Menger and Mises understood it, is brought about not by changes to the object itself, but by changes in context and in the subjective valuation of the viewer. 

A glass of water's value in a parched desert is different from that of a glass next to a clean river. Indeed, a glass of water displayed in a museum as art — as in the case of Michael Craig-Martin's "An Oak Tree" — is different from water found in both deserts and along rivers. Similarly, the value of a urinal displayed in a museum as art — as with Marcel Duchamp's "Fountain" — is different from a physically identical urinal in a restroom. 

The Daily Mail article attempts to tie the researchers' observations to the theories of Immanuel Kant on aesthetics. But one need know nothing about aesthetics at all to see how this study simply shows us something about economic value: it is, to paraphrase Menger, found in the "consciousness of men."

And it is largely due to this fact that centrally planning an economy is so impossible. How can a central planner account for enormous changes in perceived value based on little more than being told something is art? 

Is a glass of water best utilized on a shelf in a museum, or is it best used for drinking? Maybe water is best used for hydroelectric power? Exactly how much should be used for each purpose? 

When discussing the problems of economic calculation in socialism, Mises observed that without the price system, there simply is no way to say that a specific amount of water is best used for drinking instead of being used for modern-art displays. Nor is the fact that people need water for drinking the key to determining the value of water. (See the diamond-water paradox.)

In a functioning market, consumers will engage in exchanges involving water in a way that reflects how much they prefer each use of water to other uses. At some moments, some consumers may prefer to drink it. At other moments, they may prefer to water plants with it. At still other moments, they may want to contemplate an art display composed of little more than a glass of water. The price of water at each time and place will reflect these activities. 

Without these price signals, attempting to create a central plan for how each ounce of water should be used is an impossible task.
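A deliberately toy example can make this concrete. Below, each bid is one consumer's subjective valuation of a single unit of water for some use; the numbers are invented for illustration, and they are precisely the dispersed information a central planner without market exchange never sees in the first place.

# Each bid: (proposed use for one unit of water, subjective valuation).
bids = [
    ("drinking",        9.00),
    ("watering plants", 2.50),
    ("art display",     7.00),  # e.g., a gallery's glass-of-water exhibit
    ("drinking",        4.00),
    ("hydroelectric",   3.00),
]

supply = 3  # only three units of water available

# Exchange allocates the scarce units to the highest-valued uses...
accepted = sorted(bids, key=lambda bid: bid[1], reverse=True)[:supply]
# ...and the price settles near the lowest valuation still served.
price = min(value for _, value in accepted)

print(accepted)  # drinking ($9.00), art display ($7.00), drinking ($4.00)
print(price)     # 4.0

Note that nothing in the outcome says water is "really" for drinking: the art display outbids one of the would-be drinkers, just as in the taped-banana case above.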

Do we need to know why people change their views of objects when told they are art? We do not. Indeed, were he here, Mises would perhaps be among the first to remind us that economics need not tell us the mental processes that lead people to prefer different uses for different objects, although we can certainly hazard a guess. It's unlikely that the buyer of the taped banana bought it because he or she planned to eat it.

But even if we are wrong about the buyer's motivation, the fact remains that the buyer valued the banana at $120,000 for some reason — and the value was subjective to the buyer.

Similarly, we can't know for sure why each individual values water for drinking over "art water" or vice versa. And a government planner or regulator — it should be noted — can't know this either.


How the Soviets Replaced Christmas with a Socialist Winter Holiday

12/10/2019 Ryan McMaken

Leftist revolutionaries have long been in the habit of reworking the calendar so as to make it easier to force the population into new habits and new ways of life better suited to the revolutionaries themselves.

The French revolutionaries famously abolished the usual calendar, replacing it with a ten-day week system with three weeks in each month. The months were all renamed. Christian feast days and holidays were replaced with commemorations of plants like turnips and cauliflower.

The Soviet communists attempted major reforms to the calendar themselves. Among these was the abolition of the traditional week with its Sundays off and predictable seven-day cycles.

RELATED: "When the Communists Abolished the Weekend" by Ryan McMaken

That experiment ultimately failed, but the Soviets did succeed in eradicating many traditional Christian holidays in a country that had for centuries been shaped by popular adherence to the Eastern Orthodox religion.

Once the communists took control of the Russian state, the usual calendar of religious holidays was naturally abolished. Easter was outlawed, and during the years when weekends were removed, Easter was especially difficult to celebrate, even privately.

But perhaps the most difficult religious holiday to suppress was Christmas, as evidenced by the fact that Christmas wasn't so much abolished as replaced by a secular version with similar rituals.

Emily Tamkin writes at Foreign Policy:

Initially, the Soviets tried to replace Christmas with a more appropriate komsomol (youth communist league) related holiday, but, shockingly, this did not take. And by 1928 they had banned Christmas entirely, and Dec. 25 was a normal working day.

Then, in 1935, Josef Stalin decided, between the great famine and the Great Terror, to return a celebratory tree to Soviet children. But Soviet leaders linked the tree not to religious Christmas celebrations, but to a secular new year, which, future-oriented as it was, matched up nicely with Soviet ideology.

Ded Moroz [a Santa Claus-like figure] was brought back. He found a snow maid from folktales to provide his lovely assistant, Snegurochka. The blue, seven-pointed star that sat atop the imperial trees was replaced with a red, five-pointed star, like the one on Soviet insignia. It became a civic, celebratory holiday, one that was ritually emphasized by the ticking of the clock, champagne, the hymn of the Soviet Union, the exchange of gifts, and big parties.

In the context of these celebrations, the word "Christmas" was replaced by "winter." According to a Congressional report from 1965,

The fight against the Christian religion, which is regarded as a remnant of the bourgeois past, is one of the main aspects of the struggle to mold the new "Communist man." … the Christmas Tree has been officially abolished, Father Christmas has become Father Frost, the Christmas Tree has become the Winter Tree, the Christmas Holiday the Winter Holiday. Civil-naming ceremonies are substituted for christening and confirmation, so far without much success.

It is perhaps significant that Stalin found the Santa Claus aspect of Christmas worth preserving; he apparently calculated that a father figure bearing gifts might be useful after all.

According to a 1949 article in The Virginia Advocate,

at children’s gatherings in the holiday season … grandfather frost lectures on good Communist behavior. He customarily ends his talk with the question “to whom do we owe all the good things in our socialist society?” To which, it is said, the children chorus the reply, ‘Stalin.’


The Warfare State Lied About Afghanistan, Iraq, and Syria. They Will Lie Again.

12/09/2019 Tho Bishop

Today the Washington Post published a bombshell report titled “The Afghanistan Papers,” highlighting the degree to which the American government lied to the public about the ongoing status of the war in Afghanistan. Within the thousands of pages, consisting of internal documents, interviews, and other never-before-released intel, is a vivid depiction of a Pentagon painfully aware of the need to keep from the public the true state of the conflict and the doubts, confusion, and desperation of decision-makers spanning almost 20 years of battle.

As the report states:

The interviews, through an extensive array of voices, bring into sharp relief the core failings of the war that persist to this day. They underscore how three presidents — George W. Bush, Barack Obama and Donald Trump — and their military commanders have been unable to deliver on their promises to prevail in Afghanistan.

With most speaking on the assumption that their remarks would not become public, U.S. officials acknowledged that their warfighting strategies were fatally flawed and that Washington wasted enormous sums of money trying to remake Afghanistan into a modern nation....

The documents also contradict a long chorus of public statements from U.S. presidents, military commanders and diplomats who assured Americans year after year that they were making progress in Afghanistan and the war was worth fighting.

None of these conclusions will surprise anyone who has been following America’s fool's errand in Afghanistan.

What makes this release noteworthy is the degree to which it shows the lengths to which Washington went in order to knowingly deceive the public about the state of the conflict. This deception extends even to the federal government’s accounting practices: the report observes that the “U.S. government has not carried out a comprehensive accounting of how much it has spent on the war in Afghanistan.”

As the war has dragged on, so has the struggle to justify America’s military presence. As the report notes:

A person identified only as a senior National Security Council official said there was constant pressure from the Obama White House and Pentagon to produce figures to show the troop surge of 2009 to 2011 was working, despite hard evidence to the contrary.

“It was impossible to create good metrics. We tried using troop numbers trained, violence levels, control of territory and none of it painted an accurate picture,” the senior NSC official told government interviewers in 2016. “The metrics were always manipulated for the duration of the war.”

Making Washington’s failure in Afghanistan all the more horrific is how easily predictable it was for those who desired to see the warfare state for what it is.

In the words of Lew Rockwell, in reflecting on the anti-war legacy of Murray Rothbard:

War is inseparable from propaganda, lies, hatred, impoverishment, cultural degradation, and moral corruption. It is the most horrific outcome of the moral and political legitimacy people are taught to grant the state. 

On this note, the significance of the Washington Post’s report should not distract from another major story that has largely been ignored by mainstream news outlets.

Recently, multiple inspectors with the Organisation for the Prohibition of Chemical Weapons have come forward claiming that relevant evidence related to their analysis of the reported 2018 chemical gas attack in Douma, Syria, was omitted from the organisation's published report. As Counterpunch.org has reported:

Assessing the damage to the cylinder casings and to the roofs, the inspectors considered the hypothesis that the cylinders had been dropped from Syrian government helicopters, as the rebels claimed. All but one member of the team concurred with Henderson in concluding that there was a higher probability that the cylinders had been placed manually. Henderson did not go so far as to suggest that opposition activists on the ground had staged the incident, but this inference could be drawn. Nevertheless Henderson’s findings were not mentioned in the published OPCW report.

The staging scenario has long been promoted by the Syrian government and its Russian protectors, though without producing evidence. By contrast Henderson and the new whistleblower appear to be completely non-political scientists who worked for the OPCW for many years and would not have been sent to Douma if they had strong political views. They feel dismayed that professional conclusions have been set aside so as to favour the agenda of certain states.

These attacks were promoted as justification for America to escalate its military engagement in the country, with the beltway consensus lobbying President Trump to reverse his administration's policy of pivoting away from the Obama-era mission of toppling the Assad regime. While Trump did respond with a limited missile attack, the administration rejected the more militant proposals promoted by some of its hawkish voices, such as then-UN Ambassador Nikki Haley.

At the time, those who dared question the official narrative about the attack - including Rep. Tulsi Gabbard, Rep. Thomas Massie, and Fox News’s Tucker Carlson - were derided as conspiracy theorists by many of the same Serious People who not only bought the Pentagon’s lies about Afghanistan but also the justifications for the Iraq War.

Once again we are reminded of the wise words, often attributed to George Orwell, that "truth is treason in an empire of lies."

In a better timeline, the ability of someone like Rep. Gabbard to see through what increasingly looks like another attempt to lie America into war would warrant increased support for her ongoing presidential campaign.

Instead, we are likely to continue to see those who advocate peace attacked by the bipartisan consensus that provides cover for continued, reckless military action abroad.


The Greatest of Modern Demagogues

12/09/2019 David Gordon

We usually think of Friedrich Hayek as a moderate, at least when compared with Mises and Rothbard, but he had a radical side as well. Hidden away in a note to the third volume of Law, Legislation, and Liberty, he makes a comment that puts him far outside “respectable” public opinion. He says that the inventor of “freedom from want” was “the greatest of modern demagogues.” Hayek’s condemnation of Franklin Roosevelt is as forthright as any radical could wish.

The passage where he says this is the following: “In view of the latest trick of the Left to turn the old liberal tradition of human rights in the sense of limits to the powers both of government and of other persons over the individual into positive claims for particular benefits (like the 'freedom from want' invented by the greatest of modern demagogues) it should be stressed here that in a society of free men the goals of collective action can always only aim to provide opportunities for unknown people, means of which anyone can avail himself for his purposes, but no concrete national goals which anyone is obliged to serve. The aim of policy should be to give all a better chance to find a position which in turn gives each a good chance of achieving his ends than they would otherwise have.” (Law, Legislation, and Liberty, volume 3, note 42, pp. 202-203 in the one-volume edition of the trilogy, Routledge, 1982)


We're Told Americans Have No Free Time, Yet We're Watching More than Four Hours of TV Per Day

12/06/2019 Ryan McMaken

We are repeatedly told that basic human rituals are falling by the wayside. Why don't we all sit down to dinner as a family anymore? Why don't we spend time with each other anymore? Why are we all sleep deprived?

Sometimes these problems are blamed on people spending too much time devoted to kids' intramural activities or other types of school- and recreation-based activities. Some analysts note people can't tear themselves away from their smart phones in order to go to bed at a decent hour.

But very often, we're told, this lack of time comes down to too much work. The articles covering these topics are full of anecdotal evidence of people with multiple jobs, long commutes, and crushing work responsibilities.

These problems no doubt afflict many people. They're certainly an issue for people at that stage of life when couples have school-age children and face a host of bills from the many responsibilities that come with raising a family.

But the anecdotal evidence is contradicted by years of data showing people aren't nearly as hard pressed for a few free moments as is supposed.

Specifically, consider the Q1 2019 data on media consumption provided by the Nielsen Company. According to its extensive sampling of TV, smart phone, and video game console users, American adults spend an average of four-and-a-half hours per day watching television. They spend an additional 54 minutes using TV-connected devices such as DVD players and video game consoles.


People over fifty watch the most television and generally consume the most screen-based media. People in the 50-64 age bracket watched nearly six hours of television and spent an additional two hours and forty-seven minutes on smart phones. People in the over-65 category watched even more television than that.

Not surprisingly, people in the 18-34 age group consumed the least media overall, and also used televisions the least. Those people have younger children — which makes TV viewing harder — and may be spending more time outside the house with friends. In this group, people watched on average one hour and fifty-four minutes of television, but were on phone apps for three-and-a-half hours.

Across age groups, media consumption ranged from nine hours to nearly thirteen hours. Per day.

But to err on the conservative side, let's remove radio time — which could just be part of the daily commute — and "internet on a computer," which could be chores and work time. Even if we do this, we find Americans are on average watching videos, playing video games, and consuming media seven or eight hours per day.
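To make that adjustment explicit, here is a quick Python sketch. The two television figures are the ones cited above; the other category values are placeholders of my own, not Nielsen's actual numbers.

# Assumed daily minutes per adult, loosely patterned on the Nielsen data
# discussed above; only the two TV figures come from the article.
minutes = {
    "live and time-shifted TV": 270,  # ~4.5 hours, per the article
    "TV-connected devices": 54,       # DVD players, game consoles
    "smart phone apps": 150,          # placeholder
    "radio": 105,                     # placeholder; excluded below
    "internet on a computer": 40,     # placeholder; excluded below
}

excluded = {"radio", "internet on a computer"}  # commute and work time
screen_leisure = sum(v for use, v in minutes.items() if use not in excluded)

print(f"total media time: {sum(minutes.values()) / 60:.1f} hours/day")
print(f"minus radio and computer: {screen_leisure / 60:.1f} hours/day")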

And yet, media outlets and pundits are often telling us that ordinary people absolutely don't have time to prepare a meal or maintain friendships. Given the data here, I'm skeptical of these assertions.

Now, these are averages, so it may be that people are very squeezed for time during the week but then consume enormous amounts of media on the weekends. Certainly, there are people out there who consume live sports programming virtually all day on Sunday during football season. But then that would imply these people at least have time to spend with friends and family on weekends.


But if people have more than seven hours per day on average to watch re-runs of Friends, watch in-depth analysis of NBA games, and fire up the Playstation, why can't they manage to get eight hours of sleep?

If this data is correct, then the anecdotal evidence just doesn't add up, and it's simply not the case that people don't have time to do anything other than work, eat some fast food, and then do it all over again.

This isn't to say that poverty doesn't exist or that everyone is more or less average. We've all encountered people who at least sometimes work multiple jobs or are pushed to their limits by family obligations, work, and medical problems.

But the statistical data on media consumption suggests this isn't the typical experience.


Prohibition Ended 86 Years Ago Today

12/05/2019 D.W. MacKenzie

Today is an interesting milestone for libertarian-minded people, as well as those with a fondness for trivia.

86 years ago today FDR 86’d prohibition.

Drinking became a crime on January 17, 1920, and remained a crime until December 5, 1933. Prohibition serves as a leading example of what happens when people in a largely free society lose part of their freedom. Prohibition did not stop Americans from drinking; it just drove an industry underground and into the control of gangs. Consequently, gang violence escalated during the Prohibition years.

Prohibition also escalated police raids against harmless commerce. Prohibition fueled speakeasies as dispensers of beer & booze. Speakeasies obviously dealt with violent gangs as suppliers, but speakeasy customers engaged in voluntary transactions for desired goods. Police raids on speakeasies drove willing customers out of these businesses now and then, and these raids prompted both corruption and a minor change in the English language.

One speakeasy was “Chumley’s,” located at 86 Bedford Street in Manhattan. Some police acted as informants for the bartenders at Chumley’s: shortly before a raid, they would call with the message to “86 the customers,” that is, to stop business and push all customers out the door. Hence “86’d” began as a term for putting a stop to illicit business in one bar, but it subsequently developed into a more general term for getting rid of something or refusing service. Prohibition ended 86 years ago today.

This is perhaps the only day of the year on which a libertarian-minded person might find it appropriate to raise a toast to FDR.

Cheers to the 32nd President, for just this one occasion.


Who Is on the Shortlist to Lead the Bank of England?

A few months ago, just after Boris Johnson had become Prime Minister, I wrote an article addressing the ongoing selection process for the next Governor of the Bank of England, in which I gave my prediction of the five most likely candidates.

Much has changed in the British political landscape since then, including the decision to hold a general election on 12th December. As a result, Chancellor of the Exchequer Sajid Javid has announced that he will not be making his selection for the next BoE Governor until after the election, to avoid compromising the Bank’s independence by announcing during a “politically sensitive” time.

However, an official shortlist has been delivered to the Treasury, and the 5 names “thought to be” on the shortlist, as reported by the Grauniad, are Andrew Bailey, Minouche Shafik, and Ben Broadbent (who were on my predicted shortlist) as well as Shriti Vadera and Jon Cunliffe (who were not).

I included Vadera as an “honourable mention” in my article, but am admittedly surprised she made it onto the official shortlist, given her reputed “fiery” management style and strong partisan links to the Labour Party. However, bearing in mind the government’s stated intention to make this a diverse hiring process, and their use of the headhunting firm Sapphire Partners which “specialises in diversity and placing women in top roles”, it makes sense that they would have wanted to include her, at least to avoid the shortlist being 80% pale, male, and frail.

Everyone seems to be surprised that former Reserve Bank of India Governor and central banking superstar Raghuram Rajan was not included on the list, having previously been second only to Andrew Bailey in the bookies’ estimations. It’s perfectly true, as has been pointed out, that his failure to be included on the shortlist (or even interviewed) doesn’t necessarily mean he’s out of the race; current Governor Mark Carney was not included in the shortlist to replace Mervyn King in 2013. However, I have long had my doubts about the likelihood of Rajan getting the job, mainly due to the simple fact that (through no fault of his own, mind you) he isn’t British. I imagine this wouldn’t be such an issue in normal circumstances, but current Governor Mark Carney is the first of the Bank’s 120 Governors to have been foreign, and his tenure has been marked by repeated accusations of insufficient familiarity with the British economy. So it’s easy to imagine the pressure that must exist to not give the job to a second full-blown foreigner in a row.

I say “full-blown” foreigner to distinguish Rajan from the person who I personally believe is most likely to get the job, the Egyptian-born Nemat “Minouche” Shafik. As I mentioned in my original article, Shafik’s status as a woman of colour would tick all the diversity boxes the government could reasonably hope for, yet she has sufficient “insider status,” both within the British economy and the Bank of England itself, to shield her from the sort of criticism to which Rajan might be subjected, and to be a considerable advantage in its own right. Educated at Oxford and the London School of Economics (two damn fine institutions, in my own entirely unbiased opinion), Shafik is the current Director of the latter institution, and was formerly a Deputy Governor at the Bank of England, having sat on its rate-setting Monetary Policy Committee from mid-2014 to early 2017. During her tenure, Shafik typically voted with the rest of the MPC, making it difficult to isolate her personal views on monetary policy. The only factor I can imagine holding her back would be her reportedly difficult working relationship with current Governor Mark Carney. However, if I were a betting man, the research for my original article would have led me to bet on Shafik, and that remains true now that the official shortlist is out.

The only character on the list whom I didn’t mention in my original article is Sir Jon Cunliffe, who is currently the Bank’s Deputy Governor for Financial Stability. Cunliffe has held a wide variety of senior civil service positions since 1990, and is currently on the Bank’s Financial Policy and Monetary Policy committees. He was educated at the University of Manchester, with his highest degree being a Master’s in English Literature, which, more than anything else, illustrates Britain’s unique status as a country where you can work in the financial sector with a degree in almost any subject. Cunliffe recently made headlines when he gave a speech arguing that low long-term interest rates put pressure on financial stability and risk more severe downturns: a potentially welcome sentiment for Austrian ears.

When Mark Carney’s term as Governor comes to an end in late January, the situation in British politics could be very different: either the Conservatives will win the election and pass Johnson’s EU withdrawal bill, in which case Britain will be out of the EU by February, or Jeremy Corbyn will be Prime Minister, Brexit will be delayed (potentially indefinitely), and this shortlist of candidates might be rethought or thrown out entirely. For the time being, however, this shortlist provides an interesting insight into the priorities and policy goals of Britain’s government and central bank, but it doesn’t provide much hope for Austrians.


Fight Another "Terror War" Against Drug Cartels? There's a Better Way!

12/02/2019 Ron Paul

The 50-year US war on drugs has been a total failure, with hundreds of billions of dollars flushed down the drain and our civil liberties whittled away fighting a war that cannot be won. The 20-year “war on terror” has likewise been a gigantic US government disaster: hundreds of billions wasted, civil liberties scorched, and a world far more dangerous than when this war was launched after 9/11.

So what to do about two of the greatest policy failures in US history? According to President Trump and many in Washington, the answer is to combine them!

Last week Trump declared that, in light of an attack last month on US tourists in Mexico, he would be designating Mexican drug cartels as foreign terrorist organizations. Asked if he would send in drones to attack targets in Mexico, he responded, “I don't want to say what I'm going to do, but they will be designated.” The Mexican president was quick to pour cold water on the idea of US drones taking out Mexican targets, responding to Trump’s threats by saying, “cooperation, yes; interventionism, no.”

Trump is not alone in drawing the wrong conclusions from the increasing violence coming from the drug cartels south of the border. A group of US Senators sent a letter to Secretary of State Mike Pompeo urging that the US slap sanctions on the drug cartels in response to the killing of Americans.

Do these Senators really believe that, facing US sanctions, these drug cartels will close down and move into legitimate activities? Sanctions don’t work against countries, and they sure won’t work against drug cartels.

A recent editorial in the conservative publication The Federalist urges President Trump to launch “unilateral, no-permission special forces raids” into Mexico, like those the US conducted in Pakistan to fight ISIS and al-Qaeda!

I am sure the military-industrial complex loves this idea! Another big war to keep Washington rich at the expense of the rest of us. And the 2001 Authorization for the Use of Military Force can even be trotted out to fight this brand new “terror war”!

Perhaps unintentionally, however, this sudden push to look at the Mexican drug cartels as we did ISIS and al-Qaeda does make sense. After all, the rise of the drug cartels and the rise of the terror cartels have both been due to bad US policy. It was the US invasion of Iraq based on neocon lies that led to the creation of ISIS and expansion of al-Qaeda in the Middle East and it was the US war on drugs that led to the rise of the drug cartels in Mexico.

Here’s another suggestion: maybe instead of doing the same things that do not work, we might look at the actual cause of the problems. The US war on drugs makes drugs enormously profitable to Mexican suppliers eager to satisfy a ravenous US market. A study last year by the Cato Institute found that, with the steady decriminalization and legalization of marijuana across the United States, the average US Border Patrol agent seized 78 percent less marijuana in fiscal year 2018 than in FY 2013.

Instead of declaring war on Mexico, perhaps the answer to the drug cartel problem is to take away their incentives by ending the war on drugs. Why not try something that actually works?

