Mises Daily Articles
The Free-Market Lesson of the Web
The World Wide Web was invented in 1989 by Sir Tim Berners-Lee as a simple mechanism for sharing scientific papers with colleagues. The key innovation of the web was the use of hypertext — the mechanism by which we click on a link, such as a chunk of highlighted text, and automatically download the target document. Although this is a simple idea, the web has changed the world we live in. Its rise is also a superb example of what happens when the private sector is left alone to meet market needs.
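A minimal sketch of why this mechanism was so powerful: a hyperlink is nothing more than plain text plus an agreed-upon marker, so any program — not just a browser — can find and follow links. The snippet below uses Python's standard-library HTML parser on a hypothetical page (the URLs and content are invented for illustration):

```python
from html.parser import HTMLParser

# A hypothetical page: ordinary text with two <a href="..."> hyperlinks.
SAMPLE_PAGE = """
<html><body>
  <p>Read the <a href="https://example.org/paper.html">original paper</a>
  or browse the <a href="https://example.org/archive.html">archive</a>.</p>
</body></html>
"""

class LinkExtractor(HTMLParser):
    """Collect the target URL of every hyperlink encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # An <a> tag's href attribute is the address of the target document.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed(SAMPLE_PAGE)
print(parser.links)
```

Because the format is an open, commonly understood convention, two dozen lines of code suffice to extract every link from any page on the web — no license, approval, or central registry required.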
Despite its great complexity and rapid development over the last 10 years, the web community works largely without state intervention of any sort. Web designers did not need the hand of government to develop the skills to create ever more complex websites; IT professionals did not wait to read official reports saying they had to adapt as the technology changed; and companies were quick to offer the ever-evolving range of services needed for the web to run smoothly.
In other words, the private sector adapted, and adapted very quickly. Free-market mechanisms did what they always do — they rushed to meet consumer needs. This is reflected not only in the wide range of products available but also in the rapid drop in prices of almost every aspect of the web. Ten years ago, a personal website was an expensive proposition, especially if you needed anything professional or polished. Today, in the form of blogging software or services like Facebook, it is free. The overall cost of entry — taking into account the cost of training needed only a decade ago and now no longer necessary — has not so much dropped as evaporated. This low cost of entry has allowed a wide variety of individuals and companies to trade online, providing considerable choice for consumers.
Although the growth we have seen online is exceptional, it is still only a faster version of something capitalism does well: meeting a myriad of needs in a diverse society. It is difficult to imagine a better example of the free market at work.
Equally important is what has not happened. The web is largely divorced from government control and provides private-enterprise examples of large-scale undertakings that many statists claim can only be accomplished by the public sector. We routinely hear that the guiding hand of the state is required for complex projects. But the Internet itself, with its vast number of interconnected computers, is one of the most complex entities ever created by human beings, and much of it has grown without any central planning at all.
Similarly, government often steps in whenever there are perceived dangers to the public; hence pharmaceutical testing, standards regulation, and antifraud laws. But it is evident that the Internet, as an example of a relatively free market, often derails these arguments. The roaring trade in pharmaceuticals online — from antibiotics to endless adverts for Viagra — demonstrates the willingness of many to make their own informed decisions about personal risk.
Statists have often argued that the government must regulate standards. But the web itself is built on commonly agreed-upon standards that allow any Internet browser to make sense of any page published online. Again, there was no central source for this; the market, in the form of web designers, programmers, and users, agreed upon common methods for handling the immense variety of websites, from streaming video to secure shopping carts to plain text. None of it required laws or regulation, and its extremely fast development was voluntary and benefited all.
Conversely, the formal written standards produced by various web organizations have typically lagged years behind actual practice. The lesson is simple: many competing players interacting with the public will hammer through a myriad of alternative options as a natural process, and the end result will by definition reflect the direct choices people wish to make.
Fraud is a criminal activity and is therefore seen as the natural domain for government intervention. But even here we see how a free market deals with this activity. Despite vigorous attempts by fraudsters, the public has largely learned to cope with online rackets. Spam emails are quickly deleted, assuming they make it past the specially designed spam filters developed by software engineers and sold to Internet service providers and individuals. Profit-oriented private companies have reputations to protect, prompting many of them to explicitly state to customers that they never communicate using insecure channels such as email, and that they will never ask for sensitive information unless elaborate protection is in place, such as on their own custom-designed, secure websites.
In other words, the market has adapted, partly with information campaigns to educate, and partly through voluntary changes in behavior. One thing is certain: no government intervention was required. But it is not difficult to imagine a statist proposal to limit the harm of fraudulent emails and other online scams. It would no doubt involve a set of regulations controlling who can send these dangerous electronic messages to innocent and technologically inept people. The pitch would emphasize the inability of individuals to understand the complexities of online systems — hence the need for government-approved expertise.
A key charge of those statists who focus on the Internet is the ostensible role government can play in controlling access to disturbing material such as child pornography or depictions of violent rape. Current legislation covering print and broadcast material is pointed to as an example of necessary intervention; surely it should be updated and transferred to encompass the web? But in a free market, goods and services are determined by customer demand, and demand for obscene material as defined by the existing legislation is virtually nonexistent. The web as it currently stands certainly contains illegal content, but it is not readily available; it must be specifically hunted down, and we can assume such knowledge requires a preexisting network not dependent on the Internet.
Those who call for more regulation of the Internet often state that their aim is to prevent hapless citizens from seeing this kind of extreme pornography. The main charge is that we need to ensure the innocent will not stumble upon it by chance. As noted, this scenario ignores the fact that output (in this case, web content) is determined by the market. Objectionable content is difficult to find online for the same reason it is not broadcast on primetime TV — because almost no one wants it.
The statists' claim also ignores the role content providers play. Every major online player that accepts content from the public has strict rules governing what is considered acceptable. No company can afford to be associated with anything the public finds disagreeable, whether it is legal or not.
Many confuse a desire for law enforcement with a desire for government control of an information medium simply because some people use it for illegal purposes. This is analogous to the state taking control of the roads on the grounds that they are sometimes used by bank robbers: no one would take that proposal seriously, yet the same idea is advanced for the web.
Legislation designed to protect web users from casual exposure to child pornography would do nothing to stop the storage of the material, because it is already illegal and happens anyway. Content providers, however, have every incentive to control access to it because they rely on the goodwill of their audience. That is, the free market provides a more efficient method of regulation, because content providers have a strong incentive to self-regulate based on the distasteful nature of extreme pornography. Any attempt by the state to target this narrow area, such as installing government-controlled firewalls, would target us all, and it would simply use the potential for illegal activity as an excuse to monitor everyone.
The meteoric success of the web is almost a textbook case for libertarians on the benefits of unfettered markets. When people understand they are in the driver's seat and that their actions count, things happen. The market provides, and people choose from their options. The more options we have, the more competition there is for our business, and the more likely we are to have our needs met. In addition, individuals are not static creatures. Something that 15 years ago was the exclusive domain of advanced computer users has become a simple everyday experience for many — one no more challenging than using the phone or a microwave oven.
This explosion in growth of the web should put to rest the criticism often expressed about libertarian ideals — that they are a fantasy because ignorant citizens interacting with the private sector cannot be trusted to provide for society's needs, and that the important stuff can only be managed by those who work for a higher ideal and not for profit.
There are many examples like the web, where government either does not or cannot intervene, and things work out just fine. The success of the web, and the clear role the consuming public has played in developing it along lines that it approves of, demonstrates the ability of society to get what it needs from a free market.
As we begin to hear distant rumblings from the political class for the need to regulate the Internet in order to protect us from pornography, jihadism, or attempted fraud, we must resist their calls for more control. Governments are not competent to control the web, and their tinkering would diminish our ability to make our own choices. They would move decision making away from our own tastes and preferences toward their state-approved model, ostensibly to protect us. As the web has shown, whatever protection we need in the future will soon be provided as if by magic, precisely because a free market responds directly to people's needs.