‘Tectonic Shifts’ in Employment
The United States faces a protracted unemployment crisis: 6.3 million fewer Americans hold jobs than did at the end of 2007. And yet the country’s economic output is higher today than it was before the financial crisis. Where did the jobs go? Several factors, including outsourcing, help explain the state of the labor market, but fast-advancing, IT-driven automation might be playing the biggest role.
Since the beginning of the Industrial Revolution, people have feared that new technologies would permanently erode employment. Over and over again, these dislocations of labor have been temporary: technologies that made some jobs obsolete eventually led to new kinds of work, raising productivity and prosperity with no overall negative effect on employment.
There’s nothing to suggest that this dynamic no longer operates. But new research shows that advances in workplace automation are being deployed at a faster pace than ever, making it harder for workers to adapt and wreaking havoc on the middle class: the clerks, accountants, and production-line workers whose tasks can increasingly be mastered by software and robots. “Do I think we will have permanently high unemployment as a consequence of technology? No,” says Peter Diamond, the MIT economist who won a 2010 Nobel Prize for his work on market imperfections, including those that affect employment. “What’s different now is that the nature of jobs going away has changed. Communication and computer abilities mean that the type of jobs affected have moved up the income distribution.”
Erik Brynjolfsson and Andrew McAfee study information-supercharged workplaces and the innovations and productivity advances they continually create. Now they have turned their attention to how these IT-driven improvements affect employment. In their new book, Brynjolfsson, director of the Center for Digital Business at MIT’s Sloan School of Management, and McAfee, its principal research scientist, see a paradox in the first decade of the 2000s. Even before the economic downturn caused U.S. unemployment to rise from 4.4 percent in May 2007 to 10.1 percent in October 2009, a disturbing trend was visible. From 2000 to 2007, GDP and productivity rose faster than they had in any decade since the 1960s, but employment growth was comparatively tepid.
Brynjolfsson and McAfee posit that more work was being done by, or with help from, machines. For example, Amazon.com reduced the need for retail staffers; computerized kiosks in hotels and airports replaced clerks; voice-recognition and speech systems replaced customer support staff and operators; and businesses of all kinds took advantage of tools such as enterprise resource planning software. “A classically trained economist would say: ‘This just means there’s a big adjustment taking place until we find the new equilibrium—the new stuff for people to do,’ ” says McAfee.
We’ve certainly made such adjustments before. But whereas agricultural advances played out over a century and electrification and factory automation rolled out over decades, the power of some information technologies is essentially doubling every two years or so as a consequence of Moore’s Law. It took some time for IT to fully replace the paper-driven workflows in cubicles, management suites, and retail stores. (In the 1980s and early 1990s productivity grew slowly, and then it took off after 1996; some economists explained that IT was finally being used effectively.) But now, Brynjolfsson and McAfee argue, the efficiencies and automation opportunities made possible by IT are advancing too fast for the labor market to keep up.
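The compounding implied by a two-year doubling time can be made concrete with a quick back-of-the-envelope calculation. This is an illustration of the arithmetic, not a figure from Brynjolfsson and McAfee; the function name is ours:

```python
# Illustration only: how capability compounds when it doubles every two
# years, the rough cadence the article attributes to Moore's Law.
def relative_capability(years, doubling_period=2):
    """Capability after `years`, relative to the start, given a fixed doubling period."""
    return 2 ** (years / doubling_period)

# A technology that doubles every two years improves 32-fold in a decade
# and roughly 32,000-fold over the three decades an electrification-style
# rollout might have taken.
for span in (10, 20, 30):
    print(f"{span} years -> {relative_capability(span):,.0f}x")
```

The contrast with agricultural mechanization or electrification is the point: those transitions gave the labor market decades to absorb a given multiple of improvement, while exponential IT delivers the same multiple in a fraction of the time.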