There are two broad views of technology and how it is likely to change in the foreseeable future.
The first is pessimistic: Previous waves of technology had a much greater impact on the organization of work and society than any existing or conceivable future innovation could.
The second is more optimistic: The advent of cheap computing has fundamentally changed the way many people do their jobs. This will unleash productivity improvements that are at least as great as anything experienced by previous generations.
The data aren’t helpful for sorting out which of these views is correct. There is a lot of noise in short-term productivity numbers, and we are coming out of a very deep recession that has probably kept productivity growth lower than it otherwise would be.
And the pessimists have a point. Compare our world with that of 1850, particularly the U.S. or western Europe. The big changes in technology happened early: new forms of power (culminating in electricity), breakthroughs in medical technology (especially as applied to public health), and the arrival of fast, reliable transportation that was accessible to many (the train and then, of course, the car).
We spent more than a century improving these technologies and making them generally -- with some spectacular exceptions -- safer, cheaper and cleaner to run. But it is very unlikely that there will ever be a commonly available form of mass transportation that is faster than modern passenger jet airplanes. And today’s jets actually travel a bit slower -- to improve fuel efficiency -- than their predecessors did in the 1960s.
Has technology become stuck, with the result that productivity won’t improve as in the past and real wages -- at least on average -- will stagnate? To think about the answer, look in an unlikely place: the controversy about the use of imaging and computing in the sport of cricket.
To many who haven’t played cricket, the sport is largely incomprehensible, and matches seem to take far too long. The idea that any game could last five days seems strange in the modern world.
And it is certainly the case that cricket reflects, in part, a society that was formed beginning in the mid-1800s. The sport itself is much older, but the current organized form really came to the fore in the U.K. and some colonies during the latter part of the 19th century.
The first “tour” of England by players from another country was actually in 1868, when a team of indigenous Australians visited and did much better than expected. Cricket took off with the urbanization of U.K. society (along with soccer and rugby; the development of baseball, basketball and football in the U.S. follows a broadly similar timeline).
And since the 1880s, England and Australia have contested an epic series of grudge matches known as the Ashes (the name was originally a joking reference to the ashes of English cricket after the first Australian victory in 1882; there is now a small, symbolic urn awarded to the winners).
There is no better example of a product of the Industrial Revolution than this five-day international test match, played by two teams of 11 men dressed substantially in white on a well-groomed grass pitch in arenas with evocative names such as Lord's and the Oval. Yet at the heart of this traditional sport is now a technological revolution. I’m not speaking about the British Broadcasting Corp.’s use of the Internet to broadcast games, which includes insightful or entertaining comments from fans. And I’m definitely not referring to the ability of the authorities to mitigate the effects of rain (two of the England-Australia games this summer were spoiled) -- please don’t even mention the possibility of a retractable roof.
The technological issue is quite precise: how to measure where the cricket ball is headed and which part of the batsman or his equipment it strikes (a frequent occurrence with important potential implications).
International cricket now uses some high-tech measurement devices and software, but this isn’t going well. The England-Australia series this summer, which ended Aug. 25, witnessed all manner of embarrassing technological failures and misuses. Someone forgot to switch on the right piece of equipment and missed a critical moment. Officials frequently misinterpreted -- in the view of informed observers -- what the technology indicated did or didn’t happen, with all possible combinations of mistakes.
Some traditionalists argue that cricket should give up on the technology and go back to the old system, in which an umpire makes a judgment based purely on his own eyesight and knowledge of the game.
But you can’t go back again. You cannot refuse to embrace more precise measurements. Of course, cricket authorities could drop the technology -- stranger things have happened. But then millions of people watching around the world would know, based on television images or someone’s mobile-phone photo, what really happened -- and why the umpire has just made a fool of himself. Too much of that, and they’ll all go watch or play something else, like Premier League fantasy soccer.
Tennis has figured this out, though there are still a few prominent players who think they can determine whether the ball was out better than any computer projection can. (To be fair, the technical problem in tennis is simpler: Where did the ball hit? The cricket machinery frequently needs to determine where the ball was going next.)
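The core of that harder cricket problem is extrapolation: from a handful of measured ball positions, project the rest of the flight. As a toy sketch only -- not the actual system used in cricket broadcasts, and with all distances and heights invented for illustration -- one can fit the unique parabola through three observed (distance, height) samples and read off the predicted height at the stumps:

```python
# Toy trajectory extrapolation: fit the unique parabola through three
# observed (distance, height) samples of a ball's flight, then project
# the height at a chosen distance. All numbers are illustrative, not
# real match data, and real ball-tracking is far more sophisticated.

def fit_parabola(p1, p2, p3):
    """Return coefficients (a, b, c) of y = a*x^2 + b*x + c
    passing exactly through three points with distinct x values."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    c = (x2 * x3 * (x2 - x3) * y1
         + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / denom
    return a, b, c

def height_at(x, coeffs):
    """Evaluate the fitted parabola at distance x."""
    a, b, c = coeffs
    return a * x**2 + b * x + c

# Hypothetical samples after the bounce: ball at 10, 12 and 14 metres
# down the pitch, rising but decelerating. Would it still be below
# stump height (0.71 m) at an assumed stump distance of 17 m?
coeffs = fit_parabola((10.0, 0.2), (12.0, 0.38), (14.0, 0.5))
predicted = height_at(17.0, coeffs)
hitting = 0.0 < predicted < 0.71
```

The hard part in practice isn't this arithmetic; it's the measurement feeding it -- camera calibration, frame rates, spin and air resistance -- which is exactly where this summer's failures occurred.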
As Joel Mokyr pointed out in a recent essay, much scientific progress begins with improving measurements -- of heavenly bodies and microscopic organisms. A big chunk of the rest comes through trial and error and wanting to make things operate better, because otherwise someone else would steal the business. And science has made a bigger impact over time. As Mokyr wrote, the steam engine was a great piece of engineering, but “physics learned more from the steam engine than the steam engine from physics.”
Inventing new things is hard. Figuring out how to manage their applications in a sensible manner is even harder. Technological progress is no panacea. It doesn’t necessarily lift everyone out of poverty, and it doesn’t mean that any of our jobs are safe. But the process of creating and applying new technologies with far-reaching implications isn’t over.
Measuring and predicting the trajectory of a cricket ball is a very small part of the latest technological revolution unfolding before our eyes. This kind of applied-computing innovation will change how we apply ourselves to everything, even our most traditional activities.
(Simon Johnson, a professor at the MIT Sloan School of Management as well as a senior fellow at the Peterson Institute for International Economics, is co-author of “White House Burning: The Founding Fathers, Our National Debt, and Why It Matters to You.”)
To contact the writer of this article: Simon Johnson at email@example.com.
To contact the editor responsible for this article: Max Berley at firstname.lastname@example.org