Technology and Life Itself
I'm reading about the remarkable push by IBM's Stuart S.P. Parkin to shrink the physical size of data memory to one-hundredth of its present bulk. It's an amazing prospect, and we aren't talking about technological advance for its own sake. "That means," says the New York Times writer, who knows how to interest readers in what would otherwise be arcane, "that the iPod that today can hold up to 200 hours of video could store every single TV program broadcast during a week on 120 channels."
Now we care! There is something about the commercial viability of technology that rivets our consumer-minded brains. This is not a bad impulse! The fewer the bumps and snags in life, the more productive we can be, and the more productive we are, the more wealth and time we have to cultivate higher pursuits. Even if we don't pursue the higher things, our well-being goes up with new and better technologies, and society is better off.
We know this with one part of our brains. But there's another part that doesn't consider the broader implications. In truth, even those familiar with market logic are accustomed to thinking of big gains in technology as the business of government or government-funded institutions such as universities or major research labs. We picture people in white coats who are somehow insulated from the horrible pressures of commercial society. They think big and long term, and eventually their discoveries trickle down to the rest of us and are snapped up by business, which uses them to make a profit.
Maybe this is a habit left over from the official story of the atom bomb in World War Two, probably the most famous case ever of government pushing people to innovate in a way that changed our world. Unfortunately, that technology was used to slaughter people in ghastly ways, and only later did it achieve some commercial viability and thereby justification. The same is true of the internet itself. When it was the exclusive province of bureaucrats and their messages, society didn't benefit. Now it is the world's primary means of knowing, sharing, communicating, and, increasingly, exchanging.
And what good is technology unless it has some benefit to people? If you think about it, none whatsoever. A chemical that could increase the world's supply of mosquitoes a trillionfold in one minute would be useless because we don't actually want to do that, no matter how impressive the "technological advance" might be. The rocket shoes that could make us fly would be wonderful, unless making them available cost more than the gross national product.
And how do we discern what is and is not beneficial to mankind? The answer comes down to economics. There is no point to advance for its own sake. Faster and better ways of doing something we don't need to do serve no reasonable purpose. It is people, acting through the profit-and-loss test, who are the ultimate determinant.
The story of IBM's work to shrink memory reminds us that the market is the primary means for pushing the technological frontiers. It is private enterprise that has the incentive to do it right, and can provide the profit-and-loss infrastructure to know whether the advance is really good for society, and the means to make the advance available for all of humanity. (Now, it's true that IBM gets government grants and to that extent, its R&D department partakes of the waste endemic to fully government-run organizations; but note that the exciting and dynamic part of its research is pointed toward the commercial marketplace.)
As for the incentive question, scientists who make breakthroughs are entrepreneurs of a different sort. They are not risk averse. They are dreamers who imagine things that are unknown and take bold steps forward that others are unwilling to take. These are traits that do not thrive within a large-scale bureaucratic framework, as Rothbard wrote in 1954.
The profit-and-loss test becomes the key to restraining their dreams, just as in everyday commerce. So we hear that the new push by IBM would "allow every consumer to carry data equivalent to a college library on small portable devices," and that it would "unleash the creativity of engineers who would develop totally new entertainment, communication and information products." But within the story itself, we are reminded of reality. As the head of storage at Seagate says: "There are a lot of neat technologies, but you have to be able to make them cost-effectively."
Finally, there is the question of marketing the technology. Herein lies most of the battle. Those who can go from the lab to the retail shop are the ones who make the big bucks in the market economy. Anyone can tinker at home. But it takes enterprise and marketing to get the product from here to there, and finally to its intended destination.
Now we come to the perpetual myth that we are experiencing a shortage of scientists. Every few years, this claim is touted by government elites who issue grave warnings that we'd better act, and fast. But the usual way to tell a shortage is to observe a price rise. That rise attracts new entrants into the sector. And scientists can tell you that there has been no wage increase relative to other professions, except insofar as these scientists serve commercial interests. In other words, there might be a shortage of specific kinds of talent, but there is no general shortage of people with merely abstract knowledge.
What does society need to do to make sure that it has enough scientists and the right level of technological advance that is also economically viable? It needs a free market.