Wednesday, October 26, 2011

Jobs For Machines, Not People

Economists See More Jobs For Machines, Not People, via the New York Times

A faltering economy explains much of the job shortage in America, but advancing technology has sharply magnified the effect, more so than is generally understood, according to two researchers at the Massachusetts Institute of Technology.

The automation of more and more work once done by humans is the central theme of “Race Against the Machine,” an e-book to be published on Monday.

“Many workers, in short, are losing the race against the machine,” the authors write.

Erik Brynjolfsson, an economist and director of the M.I.T. Center for Digital Business, and Andrew P. McAfee, associate director and principal research scientist at the center, are two of the nation’s leading experts on technology and productivity. The tone of alarm in their book is a departure for the pair, whose previous research has focused mainly on the benefits of advancing technology.

Indeed, they were originally going to write a book titled “The Digital Frontier,” about the “cornucopia of innovation that is going on,” Mr. McAfee said. Yet as the employment picture failed to brighten in the last two years, the two changed course to examine technology’s role in the jobless recovery.

The authors are not the only ones recently to point to the job fallout from technology. In the current issue of the McKinsey Quarterly, W. Brian Arthur, an external professor at the Santa Fe Institute, warns that technology is quickly taking over service jobs, following the waves of automation of farm and factory work. “This last repository of jobs is shrinking — fewer of us in the future may have white-collar business process jobs — and we have a problem,” Mr. Arthur writes.

But Mr. Brynjolfsson and Mr. McAfee argue that the pace of automation has picked up in recent years because of a combination of technologies including robotics, numerically controlled machines, computerized inventory control, voice recognition and online commerce.

Faster, cheaper computers and increasingly clever software, the authors say, are giving machines capabilities that were once thought to be distinctively human, like understanding speech, translating from one language to another and recognizing patterns. So automation is rapidly moving beyond factories to jobs in call centers, marketing and sales — parts of the services sector, which provides most jobs in the economy.

During the last recession, the authors write, one in 12 people in sales lost their jobs, for example. And the downturn prompted many businesses to look harder at substituting technology for people, if possible. Since the end of the recession in June 2009, they note, corporate spending on equipment and software has increased by 26 percent, while payrolls have been flat.

Corporations are doing fine. The companies in the Standard & Poor’s 500-stock index are expected to report record profits this year, a total $927 billion, estimates FactSet Research. And the authors point out that corporate profit as a share of the economy is at a 50-year high.

Productivity growth in the last decade, at more than 2.5 percent, they observe, is higher than in the 1970s and 1980s, and even edges out the 1990s. Still, the economy, they write, did not add to its total job count, the first time that has happened over a decade since the Depression.

From the authors' Web site:

The stagnation in median income and employment is not because of a lack of technological progress. On the contrary, the problem is that our skills and institutions have not kept up with the rapid changes in technology. In the past, as each successive wave of automation eliminated jobs in some sectors and occupations, entrepreneurs identified new opportunities where labor could be redeployed and workers learned the necessary skills to succeed. In the 19th and 20th centuries, millions of people left agriculture, but an even larger number found employment in manufacturing and services.

In the 21st century, technological change is both faster and more pervasive. While the steam engine, electric motor, and internal combustion engine were each impressive technologies, they were not subject to an ongoing level of continuous improvement anywhere near the pace seen in digital technologies. Already, computers are thousands of times more powerful than they were 30 years ago, and all evidence suggests that this pace will continue for at least another decade, and probably more. Furthermore, computers are, in some sense, the “universal machine” that has applications in almost all industries and tasks. In particular, digital technologies now perform mental tasks that had been the exclusive domain of humans in the past. General purpose computers are directly relevant not only to the 60% of the labor force involved in information processing tasks but also to more and more of the remaining 40%.

As the digital revolution marches on, each successive doubling in power will increase the number of applications where it can affect work and employment. As a result, our skills and institutions will have to improve faster to keep up lest more and more of the labor force faces technological unemployment. We need to invent more ways to race, using machines, not against them.

In the end, Andy and I are optimistic that we can harness the benefits of accelerating innovation. But addressing the problem starts with a correct diagnosis, and that’s what our e-book sets out to provide.
Commenting on Tyler Cowen's book, The Great Stagnation, Kevin Drum wrote:

So here's what I think Tyler missed: it's true that we've already made our big improvements in access to education, and we can't do that again. But even if the number of college grads stays about the same as it is now, and even if the quality of their education stays about the same as it is now, the effectiveness of their management skills is multiplied tremendously by the computerization of the workplace. The human beings who are managing our country might be about the same as the ones who managed it 30 years ago, but they're managing it with steadily improving software and networking. They'll keep doing that for a long time, and that will keep GDP growing in the same way that better and better exploitation of electricity did during most of the 20th century.

In other words, computerization isn't just about the internet, and it's not just about whether Facebook generates a lot of utility without generating a lot of traditional GDP. That's the sexy stuff, but for the next 30 years it's continuous improvements in the computerization of industry and the computerization of management that will be the big GDP driver. Providing well-educated humans with better computers is every bit as important as simply churning out more well-educated humans.

(And after that? I'm a true believer in artificial intelligence, and I figure that 30 or 40 years from now computers are literally going to put humans out of business. They'll dig ditches better than us, they'll blog better than us, and they'll make better CEOs than us. This is going to cause massive dislocations and huge social problems while it's happening, but eventually it will produce a world in which today's GDP looks like a tinker toy.)
Strangely, he does not elaborate on that statement at all (!!). In a later post, he wrote:

Most of the best known inventions of the early 20th century were actually offshoots of two really big inventions: electrification and the internal combustion engine. By contrast, the late 20th century had one really big invention: digital computers. Obviously two is more than one, but still, looked at that way, the difference between the two periods becomes a bit more modest. The difference between the offshoots of those big inventions is probably more modest than we think too. Just as we once made better and better use of electrification, we're now making better and better use of digital computing. And to call all these computing-inspired inventions mere "improvements" is like calling TV a mere improvement of radio. These are bigger deals than we often think. We have computers themselves, of course, plus smartphones, the internet, CAT scans, vastly improved supply chain management, fast gene sequencing, GPS, Lasik surgery, e-readers, ATMs and debit cards, video games, and much more.

The flip side of this is that it's all too easy to overlook backroom process improvements. Looking at the first half of the 20th century, cars and radios and TV get all the attention, but the moving assembly line was probably more important than any of them. In the second half, Facebook and smart phones are the attention-getters, but the containerization revolution was far more important than either one. Likewise, Walmart revolutionized the retail industry in the '90s via its logistics and supply chain innovations, but hardly 1 person in 100 knows it. You could put the recent revolution in global finance in this category as well (though we obviously still have a few wee wrinkles to iron out of that one.) Computerization may be changing our daily lives, but it's arguably changed backroom operations even more, and will continue to do so.
But the key quote is this one:

"The key to innovation is the exploitation of really big inventions. Computerization is as big as it gets, and it has a much longer tail than electrification. We're not even close to mining its full potential yet."

It's as big as it gets, and it has a much longer tail than electrification. Exactly my point. I think Drum is really onto something here, which is why I quoted him at length. But the end game of computerization is the replacement of much of the workforce. When I mentioned my What Are People Good For thesis to someone, he challenged me by saying that computerization has created many more jobs than it has destroyed. After all, look at all those workers in the IT field who would not have jobs otherwise. Look at all the information technology programs and the good salaries that graduates receive. What would they be doing otherwise? It's a common argument, and one I've heard before. And it's true, there has been a tremendous boom in IT work and IT workers. From that perspective, computers have been like the assembly line: net job creators, at least for those whose skills tend toward computers. They've been more helpful than destructive.

So as I thought about it, I had to admit he was right, to date. But where he goes wrong, and where the argument goes wrong, is in expecting that to continue into the future. To date, IT has created more jobs than it has destroyed. But going forward, I believe we will see a radical shift as we finally begin to mine the IT revolution as thoroughly as we once mined steam power and electrification. As Kevin Drum points out, we've merely scratched the surface. This is an important point (and we've already glutted the market for IT workers thanks to India, but that's another story).

A few posts ago I commented on the Singularity movement, and where I thought it goes wrong in its thinking. But, despite my criticisms, one thing I have to admit about the movement is that it has done an excellent job of visualizing where the IT revolution is headed; its "long tail," as Kevin Drum put it. Just as Mormons are good at genealogy because it ties into their religious beliefs, Singularitarians seem particularly good at looking at future technology because it's a part of their religion. While Moore's Law may not explain all of society, it's been pretty good at predicting the trajectory of computer power.
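As a rough sketch of what that trajectory looks like, here's a back-of-envelope calculation (the two-year doubling period is the conventional Moore's Law assumption, not a figure from any of the sources quoted here):

    # Back-of-envelope Moore's Law sketch: how much does computing power
    # multiply if it doubles every two years? (The doubling period is an
    # assumption; actual intervals have varied.)

    def power_multiplier(years, doubling_period=2.0):
        """Return the growth factor after `years` of steady doubling."""
        return 2 ** (years / doubling_period)

    for years in (10, 20, 30, 40):
        print(f"After {years} years: ~{power_multiplier(years):,.0f}x")

    # After 30 years: ~32,768x -- consistent with Brynjolfsson and McAfee's
    # claim that computers are "thousands of times more powerful" than they
    # were 30 years ago.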

And what they see is pretty spectacular. Almost every aspect of society can be automated using the artificial brain of the computer chip. As I pointed out in WAPGF, the vast majority of jobs are repetitive. Using a simple Pareto formula, it's easy to estimate that 80 percent of jobs are repetitive, while only 20 percent require real creative, managerial thought. How will we cope with 80 percent unemployment?
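To make that back-of-envelope arithmetic concrete, here's a minimal sketch (the 80/20 split is the Pareto heuristic from above; the labor force figure is a rough 2011 U.S. number assumed purely for illustration):

    # Rough Pareto-style estimate of how many jobs are "repetitive" and thus
    # candidates for automation. The 80/20 split follows the post's heuristic;
    # the labor force figure is an assumed, approximate 2011 U.S. value.

    labor_force = 154_000_000   # assumed: approximate 2011 U.S. labor force
    repetitive_share = 0.80     # Pareto heuristic: 80% of jobs are repetitive

    repetitive_jobs = labor_force * repetitive_share
    creative_jobs = labor_force * (1 - repetitive_share)

    print(f"Repetitive (automatable) jobs: ~{repetitive_jobs / 1e6:.0f} million")
    print(f"Creative/managerial jobs:      ~{creative_jobs / 1e6:.0f} million")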

You see, it's the imperative of Capitalism to create more output with less labor. If you take this to its logical conclusion, the Omega Point if you will, its goal is to create infinite output with zero labor!

Of course, this is impossible, but with computerization and other manufacturing techniques (lean manufacturing, 3D printing, nanotechnology), it may get a lot closer than we realize. And the economy will collapse long before that happens, long before we ever reach 80 percent unemployment. We're heading towards 20 percent already.

To reiterate, IT and automation do not have to replace all workers; they just have to prevent enough jobs from being created (and salaries from being paid) to cause a crisis. Higher productivity causes the economy to become unglued if displaced workers cannot find new work. I'm glad that another set of economists is finally taking this on. The fact that they reached their conclusion by examining the evidence, after initially being skeptical, is promising. I hope more people will finally start to take this seriously.
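A toy model makes the mechanism plain: automation doesn't have to eliminate every job, only to destroy jobs a bit faster than new ones are created. Every rate below is invented for the sketch, not an estimate from the book:

    # Toy simulation: unemployment climbs whenever jobs are destroyed faster
    # than they are created, even though most jobs survive each year.
    # All rates here are invented illustrations, not empirical estimates.

    labor_force = 100.0      # normalize the labor force to 100 units
    employed = 91.0          # assumed starting point (~9% unemployment)
    creation_rate = 0.02     # assumed: new jobs worth 2% of the labor force/year
    destruction_rate = 0.03  # assumed: 3% of existing jobs automated away/year

    for year in range(1, 11):
        employed += labor_force * creation_rate - employed * destruction_rate
        unemployment = 100 * (labor_force - employed) / labor_force
        print(f"Year {year:2d}: unemployment ~{unemployment:.1f}%")

    # A persistent one-point gap between destruction and creation pushes
    # unemployment steadily upward -- no total replacement of workers required.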

But as for optimism, I find very little reason for that. I know it's obligatory to put the final "optimism" chapter in any book highlighting whatever dire problem we're facing, but somehow they always seem half-hearted. They offer solutions that we know are completely impossible to implement given current political and social realities, yet we're supposed to "hope" that our leaders will rise to the challenge of making the necessary changes. Meanwhile, our leaders, like cargo cultists, still pray for "growth" and "innovation" to save our dying economic system. Ain't. Gonna. Happen.

What WILL happen? Nobody knows for sure, but over the next few months, we'll take a look at what history has to teach us. One thing's for sure: unless something major changes, the future isn't brighter than we think. It's much, much darker.
