Power & Market
Our regular readers are by now familiar with the work of economist Brendan Brown, who offers some of the most detailed analysis of investment and monetary trends at mises.org.
Joseph Salerno writes: "With this book, Brendan Brown joins the ranks of our leading monetary policy experts. His acute and learned analysis and critique of the failed fiat-money regimes since 1914 and the fatal flaws in the current 2-percent inflation standard constitute the definitive treatment of an approach to monetary policy that is rapidly approaching its end."
The Case Against 2 Per Cent Inflation analyses the controversial and critical issue of 2% inflation targeting, currently practiced by central banks in the US, Japan, and Europe. Where did the 2% inflation target originate, and why?
Brown's book presents novel theoretical perspectives, intertwined with historical and market understanding, and features analysis that draws on monetary theory (including the Austrian school), behavioral finance, and finance theory.
And finally, the book explores how the 2% global inflation standard could collapse and what would ideally follow its demise, including a new look at the role of gold.
On Monday, National Public Radio revealed that more than two-thirds of school shootings reported for the 2015-2016 school year never actually happened.
Morning Edition reports:
This spring the U.S. Education Department reported that in the 2015-2016 school year, "nearly 240 schools ... reported at least 1 incident involving a school-related shooting." The number is far higher than most other estimates.
But NPR reached out to every one of those schools repeatedly over the course of three months and found that more than two-thirds of these reported incidents never happened. Child Trends, a nonpartisan nonprofit research organization, assisted NPR in analyzing data from the government's Civil Rights Data Collection.
We were able to confirm just 11 reported incidents, either directly with schools or through media reports.
In 161 cases, schools or districts attested that no incident took place or couldn't confirm one. In at least four cases, we found, something did happen, but it didn't meet the government's parameters for a shooting. About a quarter of schools didn't respond to our inquiries.
The Education Department, asked for comment on our reporting, noted that it relies on school districts to provide accurate information in the survey responses and says it will update some of these data later this fall. But, officials added, the department has no plans to republish the existing publication.
...A separate investigation by the ACLU of Southern California also was able to confirm fewer than a dozen of the incidents in the government's report, while 59 percent were confirmed errors. ...
...Most of the school leaders NPR reached had little idea of how shootings got recorded for their schools. For example, the CRDC reports 26 shootings within the Ventura Unified School District in Southern California.
"I think someone pushed the wrong button," said Jeff Davis, an assistant superintendent there. The outgoing superintendent, Joe Richards, "has been here for almost 30 years and he doesn't remember any shooting," Davis added. "We are in this weird vortex of what's on this screen and what reality is."
Here's the report. The false claims appear on page 2:
Nearly 240 schools (0.2 percent of all schools) reported at least 1 incident involving a school-related shooting, and over 100 schools (0.1 percent of all schools) reported a school-related homicide involving a student, faculty member, or staff member. About 1 out of every 100,000 students was enrolled in a school that reported a school-related shooting or school-related homicide during the 2015–16 school year.
The government's definition of shooting includes "any discharge of a weapon at school-sponsored events or on school buses" even if no one is hurt. It's true that few would welcome news that someone is firing guns at their child's school, even if it resulted in no injuries. And, as NPR reports, the number would represent a rate of shootings, and a level of violence, much higher than anyone else had ever found.
However, once data like this makes it into the news, the 240 incidents are reported as "shootings," which strongly implies the presence of physically harmed victims. Moreover, even if we include all firearm discharges in the data, it appears that a great many of the schools that reported "shootings" can't remember them happening.
These sorts of substantial data discrepancies are part of a larger tendency not only to inflate the numbers, but also to blur the line between mass shooting, school shooting, and shootings that don't even take place near any classroom.
For example, back in February, at least one gun-control activist group was claiming that 18 "school shootings" had already occurred this year.
These shootings were usually brought up in the context of massive news coverage of multi-victim mass shootings. Many of these shootings, however, hardly fit the bill of what ordinary people would imagine a school shooting to be. For example, as John Cox and Steven Rich reported in the Washington Post, one often-cited list of school shootings included a case in which
On the afternoon of Jan. 3, a 31-year-old man who had parked outside a Michigan elementary school called police to say he was armed and suicidal. Several hours later, he killed himself. The school, however, had been closed for seven months. There were no teachers. There were no students.
Other cases include, according to CNN:
- "A student shot another student with a BB gun in Gloversville Middle School."
- "A teacher accidentally discharged a gun during a public safety class at Seaside High School, injuring a student."
The list also includes multiple cases of single individuals being shot in apartments and dorms on campus, including "an incident from Jan. 20, when at 1 a.m. a man was shot at a sorority event on the campus of Wake Forest University."
By May, media outlets were using this data and similar data to claim that 2018's tally was already up to 22.
These events are all certainly unfortunate, violent, and unjust, but describing them generally as "school shootings" is questionable. After all, the statistics are generally used with the intent of invoking images of mass shootings like the Columbine massacre.
Moreover, confusing mass shootings with a domestic shooting in an on-campus apartment blurs lines between suggested policy responses. Mass shootings such as the Parkland, Florida shooting are used to justify restrictions on high-capacity semi-automatic weapons. Most of the shootings listed in "school shooting" databases, however, are single-victim events that could be carried out with a small-caliber revolver. While murder is murder, these sorts of distinctions are relevant to the policy debate. After all, gun-control advocates themselves clearly think the sort of weapon is relevant since they tend to ignore homicides committed with weapons other than guns.
In addition to the fact that recent data appears to exaggerate school violence, evidence going back to the early nineties shows that school violence has declined over that period. In fact, forthcoming research from criminologist James Alan Fox — publicized by Northeastern University — concludes: "Four times the number of children were killed in schools in the early 1990s than today."
Nor should this be shocking for anyone familiar with homicide trends in the United States. Since the early nineties, homicide rates have been cut nearly in half. And while homicide rates have been increasing during the past two years, the numbers remain well below where they were 25 years ago. And most of that increase is attributable to homicides in a small number of American big cities.
Back in May, I did a Fox News segment with former Clinton aide Chris Hahn, who contended that school violence was at epidemic proportions. He insisted on ignoring all the trend data from the past 25 years, asserting that only the post-2014 data mattered. And yet here we are, now seeing that the data he was largely relying on was fabricated. Naturally, he also insisted on ignoring the overall homicide data and its obvious trend.
There's a whole lot of buzz about the sharing economy. Many seem to think it is something new, with some calling for a 'new economics' to explain it while others deride the 'gig economy' as a higher level of exploitation, inequality, and poverty. Neither is a good analysis.
First things first: the sharing economy was facilitated by advances in technology alongside consumer preferences changing from goods to services and thus from ownership to lease. These are not separate processes, but mutually constituting changes where each increases the other.
The advances in technology that brought about the sharing economy are those that allow for cheaper, faster, and more accurate communication, verification of factual claims, decentralized corroborated trust/reputation etc. They overall lower transaction costs by making information both available and trustable. The terrifying idea of 'getting into a car with a stranger' is no longer a problem if that stranger's reputation can be tracked, is publicly available, and is backed up by the experiences of many others doing the same thing (as with Uber, Lyft).
The availability of such information also means we don't need to rely only on first-hand (or second-hand through family and friends) trusted experience, but can, as it were, rely on the experience of unknown others - third-hand trust. This changes our behavior because the cost of making a mistake is much lower: getting into the car of a shady Uber driver is on average much less of a risk than doing the same as a hitchhiker or even taking a regular taxi. This change of behavior in response to 'outsourced' trust means the technology can progress further.
The sharing economy, as the term implies, also means we can 'share' (make available) productive resources in much more effective ways. In a sense, it undermines the materialist view of value creation by dissolving the difference between 'personal' property and private ownership of the means of production: your personal car can now be both your personal property and your source of income - and you, as the owner, decide when and how. It doesn't change the economic categories, but it releases economic analysis from its focus on material goods.
This is all well and proper, since the economy is about the creation and distribution of *value* and not of material things. As value is subjective, so is the distinction between consumer good and production good. Everything can be both, and what is what depends on how you use it.
In other words, the effect of the sharing economy on the study of economics may be (and likely is) the liberation of economic analysis from materialist biases. Economics becomes the subjectivist social science it was always meant to be. So in terms of economic theory, the sharing economy can help make the study of economics what it should always have been: the study of the creation and distribution of [subjective] value and the unplanned social orders that emerge in the prosperity-creating process.
But what about exploitation?
I find the argument that the 'gig economy' makes people accept lower wages highly fascinating. Suddenly it is a problem to these critics, mainly on the left, that 'workers' own their own capital.
When personal property becomes capital, a source of income, then the obvious implication is a decentralization of capital ownership.
I'm not saying companies like Uber are in any sense perfect. They could certainly go further, by e.g. implementing free pricing between drivers and riders. Why not allow drivers to set their required minimum to provide a ride? Why not allow riders to set their max to get a ride? (This may actually be the next step in the sharing economy - an open-marketization of fares and fees.) The common argument that drivers (which is usually the example used) don't make a 'living wage' is also a bias that remains from the materialist and industrialist view of economy. It assumes that a job is all about the salary, and that what matters is the monetary outcome of it. But many of those who choose to drive for a ride-share company do so because it is a *flexible* type of work, which is itself a value.
Sure, it may not be as high-paying as 'full-time' employment, but any choice involves a trade-off of value: to many, flexible work hours make up for the lower pay; for others, they don't - so they instead seek traditional employment. The solution to low wages in the sharing economy is not regulation, but more competition - and proper market wages. In fact, the reason ride-sharing companies can pay low wages (which, judging from my discussions with many a driver, exceed those earned by taxi drivers!) is the regulations already in place: primarily the barriers to entry in ride-sharing and elsewhere that keep overall wages down by providing employers with artificial influence (power). The expected outcome of the low-transaction-cost economy should be the ultimate gig economy, where all or most hierarchies (such as firms) dissolve into market relations - and where those who (choose to) work earn a market wage. What stands in the way of this development is not employers or even capital owners, but the artificial restrictions that produce their privilege.
It is a sad irony, really, that the restrictions - regulations - intended to protect workers from employers in the industrial age are what stand in the way of worker liberation in the new era.
Adapted from Twitter.
“Truth isn’t truth,” declared Rudy Giuliani, President Trump’s personal attorney, on Meet the Press on Sunday. Giuliani’s comment — the weirdest absolution yet proffered for Trump — is the “Trump era’s epitaph,” according to a Washington Post columnist. But truth really is defined differently inside the Beltway.
Trump could face a “perjury trap” from Special Counsel Robert Mueller because of the unique way that the FBI defines reality — and the truth. The FBI rarely records interviews and instead relies on written summaries (known as Form 302s) which “are widely held up in court as credible evidence of conversations,” the New York Times noted last year. Though defense attorneys routinely debunk the accuracy and credibility of 302s, prosecutors continue touting FBI interview summaries as the voice of God. Even if Trump made factually correct comments to Mueller, he could still face legal peril if his statements failed to harmonize with FBI “trust me on what I heard” memos containing contrary assertions.
Though other federal agencies cannot play the 302 game, they have plenty of options for editing the public record. Inside the Beltway, “plausible deniability” (a phrase first publicly used by CIA chief Allen Dulles in the 1950s) is “close enough for government work” to truth.
Congress enacted the Freedom of Information Act in 1966 to boost self-government by entitling Americans to learn what Washington did in their name. But FOIA is derided nowadays as a “Freedom From Information Act” that begets merely a mirage of transparency. Last year, individuals who filed FOIA requests “received censored files or nothing in 78 percent” of cases, according to the Associated Press. Federal agencies with the most power — such as the FBI, Department of Homeland Security, and the Justice Department — are among the worst FOIA abusers.
Federal agencies also maximize their discretion in defining truth via almost 50 million decisions to classify information each year. And the Justice Department can totally suppress embarrassing facts on the most contentious issues (such as torture or assassinations) by invoking the “state secrets” doctrine. The George W. Bush administration routinely invoked “state secrets” to seek “blanket dismissal of every case challenging the constitutionality of specific, ongoing government programs,” according to a study by the Constitution Project. A federal appeals court slammed the Obama administration’s use of “state secrets”: “According to the government’s theory, the judiciary should effectively cordon off all secret government actions from judicial scrutiny, immunizing the CIA and its partners from the demands and the limits of the law.” Government’s sway over damning information is boundless — at least until some scofflaw like Edward Snowden obliterates federal credibility.
Read the full article at The Hill
Chocolate company Toblerone announced this week that it will be returning its chocolate bars to the pre-2016 shape — at least in the British market. Back in 2016, the company had decided to put less chocolate in its chocolate bars by changing the shape. It was, the company said, part of an effort to deal with rising production costs. Here's what the two different versions of the bars looked like:
Source: The Telegraph
Among chocolate lovers, this change was rather scandalous. Most people, of course, didn't exactly make an issue of it on social media, but the change was one of many efforts to deal with price inflation at the time — a practice known as "shrink-flation."
Back in 2016, Christopher Westley reported on this, and how the Toblerone "scandal" was part of a larger inflation-created problem:
[Toblerone] widened the gaps between the segments of its iconic chocolate bar, reducing its total volume by some 10 percent. Although the reaction has something of an Old Coke-New Coke air to it, one can easily see it as a sign of the inflationary times, an effect of worldwide money creation coordinated by the leading central banks, with Toblerone being just one of many victims.
The economics of the decision shouldn’t surprise an actual student of economics. Since inflation is always and everywhere a monetary phenomenon, and since the world’s central banks have been pumping new money into the global economy at unprecedented rates for several years, we should expect an upward pressure on prices. In a Facebook post, Toblerone explained that it was forced into changing its product in response to “higher cost of numerous ingredients,” adding that
...we had to make a decision between changing the shape of the bar, and raising the price. We chose to change the shape to keep the product affordable for our customers, and it enables us to keep offering a great value product.
Statements such as this cause Toblerone to become, unwittingly, a case study for how firms in competitive markets respond when monetary inflation raises their costs of production. When that happens, firms are less able to pass the cost on to consumers in the form of higher prices because if they do, they face a strong likelihood of losing market share and revenues. Instead, these firms cut back in terms of volume, size, and portions.
We see this all the time. Have you been to a restaurant lately where the menu prices haven’t seemed to change but the portions of food on your plate has? Or opened a bag of chips that hasn’t fallen in size while the volume of chips inside has? Or consumed a product of lower quality than you remembered in less inflationary times because its producer was obligated to change ingredients to break even?
The fact is, Toblerone can’t raise its prices willy-nilly due to the many substitutes available to consumers. Critics claiming otherwise ignore this common side effect of inflation in competitive industries, a phenomenon that especially has applied to candy markets in recent years.
It remains unclear, however, if the company is reverting to the old shape and size, or just the old shape. If it's going back to the old shape and size, this would suggest that the company has found a way to produce candy bars less expensively, or that it can raise the price of the bars without driving down demand to the point that it will hurt the company.
It's a safe bet, though, that we shouldn't expect a general reversal of the shrink-flation trend. A "pound" of coffee in the US now seems to be 10 or 12 ounces.
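The hidden cost of shrink-flation is easy to miss at the register, since the sticker price never moves. A minimal sketch of the arithmetic (the prices and package sizes here are hypothetical, chosen only to illustrate the calculation):

```python
# Shrink-flation: the sticker price stays the same while the package shrinks,
# so the effective unit price rises. All figures below are hypothetical.

def unit_price(price, ounces):
    """Price per ounce."""
    return price / ounces

old = unit_price(8.00, 16)   # a true "pound" of coffee at a hypothetical $8.00
new = unit_price(8.00, 12)   # the same sticker price for a 12-ounce bag

hidden_increase = (new - old) / old
print(f"Effective price increase: {hidden_increase:.0%}")  # → 33%
```

Shrinking a package from 16 to 12 ounces at a constant price is equivalent to a one-third increase in the unit price, which is why the tactic is attractive to firms that fear losing market share from a visible price hike.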
On the other hand, median income growth has in recent years finally begun to significantly exceed old pre-2008 levels, so it may be food companies are now using the current income gains as an opportunity to re-adjust prices upward before the next recession, when sensitivity to price increases will be far more significant.
It's easy to find income inequality in the United States when we compare the super-rich with the middle class. But when we compare the middle class to the "poor," there's a surprising amount of income equality.
But how can the middle class have incomes similar to the poor? Isn't that logically impossible?
Well, this sort of income equality is made possible by the existence of government social benefit programs. When we account for income transfers to low-wage workers — or to people who don't work at all — we find that the incomes of middle-class people — in spite of their often working far longer hours — are similar to those of the poor.
The political implications of this are significant.
The authors, Gramm and Ekelund, note:
The most surprising finding is the astonishing degree of equality among the bottom 60% of American earners, generated in part by the explosion of social-welfare spending and the economic and wage stagnation during the Obama era. Hardworking middle-income and lower-middle-income families must have recognized that their efforts left them little better off than the growing number of recipients of government transfers. The perceived injustice of this equality helped drive the political shift among blue-collar workers, many of whom supported the pro-growth candidacy of Donald Trump in 2016 despite having voted for Mr. Obama in the two previous presidential elections.
The bottom quintile earned 2.2% of all earned income in 2013, but after adjusting for taxes and transfer payments, its share of spendable income rose to 12.9%—six times its proportion of earnings. The second quintile’s share more than doubled, rising from 7% of earned income to 13.9% of spendable income. For the third quintile, middle-income Americans, the increase was much smaller, from 12.6% to 15.4%.
Not surprisingly, high earners lost a considerable share of their earnings after taxes and transfers are taken into account. The fourth quintile’s share fell from 20.5% to 18.6%, while the top quintile dropped from 57.7% of earnings to 39.3% of consumable income. In other words, the top quintile’s share of earnings was 26 times that of the bottom quintile, but after taxes and transfer payments its share of spendable income was only three times as much.
Even more startling is the near equality among the bottom three quintiles. The bottom quintile, which earned only 2.2% of all earned income, had virtually the same share of spendable income as the second quintile, lower-middle-income Americans. This equality is despite the fact that lower-middle-income workers earned more than three times the share of income and worked 2½ times as much, measured by comparing each group’s number of full-time workers relative to its working-age population. Middle-income workers earned almost six times the share of income and worked almost four times as much compared with the bottom quintile, but they enjoyed only about 20% more spendable income.
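The headline ratios follow directly from the quintile shares quoted above, and they are easy to verify. A quick check (using only the percentages in the quoted passage):

```python
# Shares of earned vs. spendable income by quintile, as quoted above (percent).
earned    = {"bottom": 2.2, "second": 7.0, "middle": 12.6, "fourth": 20.5, "top": 57.7}
spendable = {"bottom": 12.9, "second": 13.9, "middle": 15.4, "fourth": 18.6, "top": 39.3}

# Top-to-bottom ratio before and after taxes and transfers are accounted for.
print(round(earned["top"] / earned["bottom"]))        # → 26
print(round(spendable["top"] / spendable["bottom"]))  # → 3
```

Taxes and transfers compress a 26-to-1 gap in earnings into roughly a 3-to-1 gap in spendable income, which is the compression driving the near equality among the bottom three quintiles.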
This equality in income endured in spite of the fact that many middle-income families worked far harder for what income they did have:
And even these numbers understate the huge difference in work effort. Compared with the bottom quintile, the lower-middle-income quintile had almost four times as many working-age families whose members worked two or more jobs, and the middle-income quintile had more than seven times as many families with members working two or more jobs.
As Gramm and Ekelund explain, middle class people know that the wealthy make a lot more than the middle class does. But the middle classes can also see they've benefited from the goods and services brought to the market by the wealthy.
Meanwhile, a middle-class worker who has two jobs, or a 55-hour work week, looks around and sees relatives or neighbors who never seem to work, but who also have a similar standard of living. They might know perpetually unemployed acquaintances who rely on CHIP or Medicaid, free school lunches, food stamps, and a myriad of other programs, all widely available. Meanwhile, the middle-class worker is putting in long hours to obtain the same amount of food and health care.
The middle-class worker then realizes he's working to pay not only for his own needs, but also for those of the neighbor.
It's easy to see why this might breed resentment.
Hillary Clinton was at it again the other day, complaining about how, if it weren't for that darned electoral college, she'd be president now. The Daily Mail reports on Clinton's remarks in her recent speech in the UK:
"Populists can stay in power by mobilizing a fervent base. Now, there are many other lessons like this," she said, adding that she had "my personal experience with winning three million more votes but still losing."
But there's nothing really novel about this. Clinton has been whining about the Electoral College since 2016.
The real news here, as Ed Morrissey at Hot Air notes, is that Clinton was condemning populism while at the same time condemning the electoral college. In other words, Clinton doesn't seem to realize that the electoral college is an anti-populist measure. Morrissey writes:
And why did the framers of the Constitution create it? To act as a buffer against populism, at least in form. The Electoral College reflects the popular vote on a state-by-state basis to prevent a handful of the most populous states from controlling the executive through the nationwide popular vote, which creates a buffer against the very impulse Hillary decries in this speech.
In other words, the purpose of the electoral college is to ensure that a successful presidential candidate appeals to a broader base of voters than would be the case under a simple majoritarian popular vote.
This makes it harder to win by doing what Clinton did during the campaign: focus on a thin sliver of rich Hollywood and business elites, coupled with urban ethnics. It's true that those two groups can offer a lot of votes and a lot of campaign dollars. But they also tend to be limited to very specific regions, states, and metro areas. The groups Clinton ignored, the suburban middle class and the working class, make up a much larger, more geographically diverse coalition. This can be seen in the fact that Trump won such diverse states as Alabama, Pennsylvania, and Wisconsin.
In 2016, the electoral college worked exactly as it's supposed to — it forces candidates to broaden their appeal. Or as a cynic like myself might say: it forces politicians to pander to a broader base.
Clinton complains that a fervent group of voters can take over the machinery of government. But that's harder to do with the electoral college than without it. So Clinton is making a mockery of her own argument by complaining about populism one minute and about the electoral college the next.
But it was the Clinton team that had the more populist strategy. For example, the four largest US states (California, Texas, New York, and Florida) constitute one-third of the US population. The ten largest states total 54 percent. Hillary thought she could just focus on the larger states and that would be enough. Her strategy was to ignore half the country, call them "deplorables," and count on the resentments of people in some big cities to carry her to victory. It's hard to see how that's somehow less "populist" than what Trump did.
For Hillary Clinton though, everything is personal, and the fact that the electoral college came between her and the presidency means it must be a bad thing.
The fact that it also guards against Clinton-style demagoguery, however, doesn't make the electoral college "anti-democratic," as is thought by many who so tiresomely chant "we're a republic, not a democracy." Fifty separate presidential elections (plus DC and the territories) are not somehow less democratic than holding one big national election. It's simply a democratic method designed to ensure more buy-in from a larger range of voters, not less. Other similar tactics include the "double majorities" used in Switzerland. And for all these reasons, as I note here, the electoral college should be expanded:
Double-majority and multiple-majority systems mandate more widespread support for a candidate or measure than would be needed under an ordinary majority vote.
Unfortunately, in the United States, it is possible to pass tax increases and other types of sweeping and costly legislation with nothing more than bare majorities from Congress, which is itself largely a collection of millionaires with similar educations, backgrounds, and economic status. Even this low standard is not required in cases where the president rules via executive order with "a pen and ... a phone."
In response to this centralization of political power, the electoral college should be expanded to function as a veto on legislation, executive orders, and Supreme Court rulings.
For example, if Congress seeks to pass a tax increase, their legislation should be null and void without also obtaining a majority of electoral college votes in a manner similar to that of presidential elections. Under such a scheme, the federal government would be forced to submit new legal changes to the voters for approval. The same could be applied to executive orders and treaties. It would be even better to require both a popular-vote majority in addition to the electoral-vote majority. And while we're at it, let's require that at least 25 states approve the measures as well.
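The multiple-majority scheme described above amounts to a decision rule with several independent thresholds that must all be cleared at once. A minimal sketch of that rule (the vote counts, electoral weights, and thresholds below are invented purely for illustration):

```python
# A measure passes only if it wins (1) a majority of electoral votes,
# (2) a majority of the nationwide popular vote, and (3) a majority of
# voters in at least 25 states. All data used here is hypothetical.

def measure_passes(states, total_electoral_votes=538, states_required=25):
    """Apply the triple-threshold rule to per-state results.

    Each entry in `states` is a dict with keys "electoral_votes",
    "yes", and "no" (popular vote counts for the measure).
    """
    ev_won = sum(s["electoral_votes"] for s in states if s["yes"] > s["no"])
    popular_yes = sum(s["yes"] for s in states)
    popular_no = sum(s["no"] for s in states)
    states_won = sum(1 for s in states if s["yes"] > s["no"])

    return (ev_won > total_electoral_votes / 2
            and popular_yes > popular_no
            and states_won >= states_required)
```

The point of stacking thresholds this way is that a measure backed only by a fervent regional or big-city majority fails one of the tests even when it clears the raw popular vote, which is the buy-in-broadening effect discussed above.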
No stranger to controversy, critically acclaimed rapper Kanye West has generated a whirlwind of media attention since returning to Twitter in late April. This controversial tweet storm reached its peak when West praised African American conservative activist Candace Owens for the way she thinks.
West also stirred the pot by claiming “Obama was in office for eight years and nothing in Chicago changed,” questioning former President Barack Obama’s record in bringing change to a city facing ongoing street violence.
A complete about-face from his race-baiting comments in 2005, when he stated on live TV that then-President George W. Bush did “not care about black people,” West’s recent comments have opened up considerable debate on racial affairs in America.
Pop Culture’s Potential to Change Politics
As mentioned before, West does have a penchant for stirring up controversy for publicity’s sake. In fact, West has openly admitted to being a “proud non-reader”, thus calling into question West’s political beliefs or lack thereof.
It remains to be seen whether West’s comments were sincere or reflect some sort of political “awakening” on his part, but the potential power of cultural figures like Kanye West still should not be underestimated.
Nobel laureate economist F.A. Hayek understood the power of second-hand dealers such as academics, artists, journalists, and teachers in disseminating and popularizing ideas. These second-hand dealers play a crucial role in influencing policymakers and the general public.
Thanks to growing levels of distrust of government, a large segment of the population has lost faith in the traditional political process. Consequently, these disillusioned individuals have turned to entertainers and other pop culture icons like West as sources of credibility and relatability.
Questioning the Democrat vs Republican Narrative
Like the entertainer that he is, Kanye West has taken his social media rabble rousing to the recording booth.
“That’s the problem with this damn nation/All Blacks gotta be Democrats/Man, we ain’t made it off the plantation”.
Provocative lyrics aside, there exists a nugget of truth in West’s rap verse.
It is no secret that the Democrat Party enjoys monolithic support from the African American community. Democrat presidential candidates have averaged 87 percent of the African American vote in the past 12 presidential elections.
Why Democrats have dominated among African American voters has been hotly debated by political commentators, but the majority of these discussions lead to the unproductive black hole of partisan politics.
And West has fallen into this partisan trap.
The real problem ignored in these debates is the elephant in the living room that is government interventionism — something both political parties have taken a fancy to implementing in one way or another once in power.
The broken schools, dependency on welfare services, and deterioration of the family unit that the average African American living in the inner cities must currently put up with were unheard of for a good portion of U.S. history. From 1890 to 1954, African Americans had labor force participation rates similar to those of whites and were able to ascend the economic ladder with ease.
However, the key ingredient in the African American community’s success during that period was limited government, a salient feature of American life from the Gilded Age up until the New Deal era. In sum, what will help African Americans prosper is not merely switching political parties, but building an institutional environment that facilitates economic growth.
Crush Dissent at All Costs
Unfortunately, minority leaders and pundits ignore the socialist elephant in the living room and prefer to turn to race-baiting and victim politics. As a result, constructive political discussion has remained stagnant.
When intellectuals and political personalities like Larry Elder, Thomas Sowell, and Walter Williams propose free-market alternatives to common social issues, entrenched political commentators immediately dismiss them as Uncle Toms or race traitors.
And when Kanye West dared to question certain sacred cows, he was met with the same scorn from the mainstream media.
This type of discourse embodies the authoritarian nature of modern-day liberalism: support diversity in name, but promote one-size-fits-all narratives when controversial political subjects emerge.
Bringing it Back to Basics
Breaking barriers and bucking conformist trends form the bedrock of hip-hop culture. In the status quo of identity politics, Kanye West’s audacious statements line up perfectly with the original spirit of hip-hop.
Starting out as an obscure movement in the South Bronx during the 1970s, hip-hop served as a creative outlet for disgruntled African American youth who were tired of the inner-city conditions they lived in.
By the late 1980s, hip-hop solidified itself as one of the hottest musical trends in the United States.
Rappers under the influence of social justice narratives can rant about the horrors of capitalism as much as they want, but it was this same capitalist system that made hip-hop an integral part of American popular culture.
The jury is still out on whether West’s media escapade will fundamentally change racial political discussions. Nevertheless, a healthy degree of skepticism is advised when breaking down these recent developments.
In today’s environment of Team R vs. Team D politics, the temptation to gravitate toward one political party or the other for solutions is still strong. For African Americans, joining the Republican Party — or any other political party for that matter — does not guarantee a path to the promised land.
The GOP’s interventionist policies merit substantial criticism, and just like their Democrat rivals, the GOP has played an integral role in perpetuating the current welfare state paradigm that disproportionately hurts minority groups.
Moral of the story:
African Americans must look beyond the traditional political process for genuine socio-economic stability.
Let’s hope that Kanye West’s recent actions don’t turn out to be another in his long line of publicity stunts. For inner-city dwellers’ sake, it’s high time to start talking about free-market solutions to their problems.
Dr. Guido Hulsmann joined the Tom Woods Show last week to discuss his work on the cultural and political consequences of inflation, a subject neglected by most economists.
To read more about Dr. Hulsmann's work on the topic, check out his book The Ethics of Money Production.
Yesterday’s Inspector General report on the FBI’s investigation of Hillary Clinton contained plenty of bombshells, including a promise by lead FBI investigator Peter Strzok that “We’ll stop” Donald Trump from becoming president. The report reveals how unjustified secrecy and squirrelly decisions helped ravage the credibility of both Hillary Clinton’s presidential campaign and the FBI. But few commentators are recognizing the vast peril to democracy posed by the sweeping prerogatives of federal agencies.
The FBI’s investigation of Clinton was spurred by her decision to set up a private server to handle her email during her four years as secretary of state. The server in her Chappaqua, N.Y. mansion was insecure and exposed emails with classified information to detection by foreign sources and others.
Clinton effectively exempted herself from the federal Freedom of Information Act (FOIA). The State Department ignored 17 FOIA requests for her emails prior to 2014 and insisted it required 75 years to disclose emails of Clinton's top aides. A federal judge and the State Department inspector general slammed the FOIA stonewalling.
Clinton’s private email server was not publicly disclosed until she received a congressional subpoena in 2015. A few months later, the FBI Counterintelligence Division opened a criminal investigation of the “potential unauthorized storage of classified information on an unauthorized system.”
The IG report gives the impression that the FBI treated Hillary Clinton and her coterie like royalty — or at least like personages worthy of endless deference. When BleachBit software or hammers were used to destroy email evidence under congressional subpoena, the FBI treated it as a harmless error. The IG report “questioned whether the use of a subpoena or search warrant might have encouraged Clinton, her lawyers ... or others to search harder for the missing devices [containing email], or ensured that they were being honest that they could not find them.” Instead, FBI agents worked on “rapport building” with Clinton aides.