4. Darkness on the Edge of Town
The wheels started to come off in the 1970s, prompted in part by successive oil crises. At the same time, the United States began to face steep international competition from other nations, particularly Japan, which had developed efficient, rational, and highly automated manufacturing systems based on the ideas of American thinkers such as Frederick Taylor and W. Edwards Deming. To compete, U.S. factories began to automate and consolidate, a trend that picked up steam with the financialization of the American economy in the 1980s under Ronald Reagan. America began to undergo the process of deindustrialization. For the first time since the war, joblessness became a major concern. As factory jobs disappeared, politicians told an anxious public that the losses were nothing to fear. As American companies became more "competitive," they argued, they would create new jobs in other areas through "innovation." The capital freed up by automating repetitive factory work would be invested in new industries that would easily reabsorb the displaced workers, they claimed. All that was needed were low taxes and deregulation.
Manufacturing had absorbed the mass of dispossessed workers when agriculture became mechanized. With manufacturing gone, what was to absorb them now? Some touted a "service economy" in which low-wage service work would take up the slack, but the low status and pay of these jobs led to a downward spiral of living standards. Industrial cities like Cleveland and Detroit started to fall apart. Many white Americans got into their cars and fled to the suburbs, where they took clerical and office jobs. For many African-Americans who had lost manufacturing jobs and had no access to transportation, things did not turn out so well. The nation's inner cities became ghettos, with crime and drugs taking their toll. Many towns throughout the Midwest had been founded entirely on manufacturing, converting local raw materials like wood, stone, agricultural products, or metal ore into products and shipping them off by railroad or seaway; America had once been the world's factory floor. Without access to education, these towns were decimated, with only low-wage service or health care jobs available. Some residents fled to cities to look for work, while others took civil service jobs or enlisted in the military as a way out. Wal-Mart took over from General Motors as the nation's largest employer. Food and durable goods were cheap and plentiful, but the costs of housing, education, transportation, and health care skyrocketed. Banks extended easy credit to make up for lost income, and people sank heavily into debt to pay for what their stagnant wages could no longer buy.
Then came the double whammy: the Internet and globalization. China, with its billion-plus population, offered an unbeatable combination. Unlike many developing countries, it was thoroughly industrialized thanks to the legacy of Mao, and it had a vast, seemingly inexhaustible pool of cheap, easily exploited labor. The Internet and computers allowed supply chains to stretch around the world, with the coup de grâce provided by discount retailers such as Wal-Mart, which pressured suppliers to lower prices at all costs or lose contracts to those who could. This perfect storm hollowed out American manufacturing. During the nineties, the vast pool of cheap Chinese labor substituted for automation: given the difference in currencies, American workers could not compete on price even at a dollar an hour. America entered what was termed the "post-industrial economy."
Eventually, American politicians no longer even paid lip service to a revival of American manufacturing. In the nineties, President Bill Clinton told the American people that "these jobs are gone, and they aren't coming back," and that unemployed workers needed to retrain for new "high-tech" jobs. Economists trotted out a series of buzzword-centric models, such as the "knowledge economy," the "information economy," and the "experience economy," claiming each would take the place of manufacturing and provide stable employment for the masses. None of them panned out. The wealth gap between workers and the investor class grew to obscene levels as corporations took advantage of the vast labor pool to increase profits while workers were increasingly left out in the cold. The jobs created since the 1970s were predominantly low-paying and of lesser quality than those of the past, at least for the majority of workers.
In 1996, iconoclastic economist Jeremy Rifkin published The End of Work. Rifkin contended that a combination of higher productivity and automation would eventually displace workers in every field as computers became cheaper and artificial intelligence became more refined. He provided an extensive history of labor dislocations, including an analysis of Technocracy's arguments during the Depression years. The number of workers, he contended, had already outstripped the need for them back in the Depression, and only a series of bubbles, from the postwar consumer/suburbanization bubble to the financial bubble to the dot-com bubble to the credit bubble (and, eventually, the housing bubble), had provided the illusion of an economy that could supply enough jobs for everyone. Yet, he predicted, the drive for automation would continue unabated, eventually destroying the purchasing power needed to sustain all those bubbles and inexorably ushering in an era of permanent unemployment for the masses. Nearly every page could be quoted as relevant to our discussion, but I will offer only this snippet from chapter 1:
"While earlier industrial technologies replaced the physical power of human labor, substituting machines for body and brawn, the new computer-based technologies promise a replacement of the human mind itself, substituting thinking machines for human beings across the entire gamut of economic activity. The implications are profound and far-reaching. To begin with, more than 75 percent of the labor force in most industrial nations engage in work that is little more than simple repetitive tasks. Automated machinery, robots, and increasingly sophisticated computers can perform many if not most of these jobs. In the United States alone, that means that in the years ahead more than 90 million jobs in a labor force of 124 million are potentially vulnerable to replacement by machines. With current surveys showing that less than 5 percent of companies around the world have even begun to make the transition to the new machine culture, massive unemployment of a kind never before experienced seems all but inevitable in the coming decades. Reflecting on the significance of the transition taking place, the distinguished Nobel laureate economist Wassily Leontief has warned that with the introduction of increasingly sophisticated computers, 'the role of humans as the most important factor of production is bound to diminish in the same way that the role of horses in agricultural production was first diminished and then eliminated by the introduction of tractors.'"
The End of Work, p. 6
The timing of Rifkin's message could not have been worse; the book came out just as America entered the throes of the dot-com boom, and mainstream economists proclaimed that the Internet was the long-promised "innovation" that would provide full employment for anyone who could learn HTML or become a Microsoft Certified Professional. Newly created Internet startups attracted billions in investment, and ordinary people were starting multi-million-dollar companies in their basements. It was a time of exuberant optimism. Of course, the dot-com bubble burst and made a mockery of such predictions. Internet companies folded left and right; a few survivors remained, but billions of dollars were lost, with the attendant employment fallout. More importantly, the Internet allowed office work to be done anywhere in the world, meaning even clerical and professional jobs could be performed by low-cost labor half a world away. Outsourcing was now coming for the suburban white-collar worker.
But the nation bounced back quickly; the next bubble was just around the corner, and this one was the biggest yet. The housing bubble was built on nothing more than rising asset values, financial fraud, and easy credit. The jobs it created were almost all in finance, real-estate sales, and services to those who used their homes as cash machines. Then, in 2008, it all came crashing down. All of the job gains of the previous decade vanished, leaving a decade of no net job growth despite an increasing population, massive immigration from Mexico, and numerous rounds of tax cuts. Unemployment soared, and even creating enough jobs for new entrants into the labor force became an impossible task. Even if job growth were to return to the levels of the 1990s, it would take a decade just to bring unemployment down to where it stood before the housing crash, and no one seriously expected that to happen. Without another bubble, Americans were told that high unemployment was simply a "new normal" and that government was powerless to change the situation. Rifkin's stark predictions, made twelve years too early, were finally starting to hit the mark.
Please read Part 1 of this article
Please read Part 2 of this article
Please read Part 3 of this article
Please read Part 5 of this article
Please read Part 6 of this article
Thursday, March 31, 2011
Wednesday, March 30, 2011
What Are People Good For? (part 3)
3. The Ballad of John Henry
We've been heading this way for quite some time - almost the entire history of the Industrial Revolution, in fact. For most of human history the vast majority of human beings have had to work the land in some way to provide adequate sustenance. The British Agricultural Revolution changed all that by drastically reducing the number of workers needed for agricultural production. These displaced workers filed into overcrowded and squalid cities, where they became the surplus labor for the newly emerging method of factory production, powered by England's vast coal deposits and inventions such as the steam engine and the power loom.
Factory work came to dominate the world in the nineteenth and early twentieth centuries. The earliest factories, like Richard Arkwright's cotton mills, were set up to produce textiles. The steam engine kicked off a race to develop more and more mechanical devices to do work, while the emergence of precision tooling and measurement meant that these inventions could be mass produced on a huge scale by relatively few workers. The application of steam-powered, and eventually electrically powered, engines allowed a few workers with machines to do jobs that would once have taken years of human labor; it has been estimated that a single barrel of oil contains the equivalent of ten years of human labor. Henry Ford pioneered the application of mass production and interchangeable parts (both prior inventions) to sophisticated mechanical devices like automobiles, and engineers like F.W. Taylor strove to eliminate waste and maximize worker efficiency in the new science of industrial production. Inventions such as the electric light allowed factories to run 24 hours a day.
In the 1920s, the Technocracy movement pointed out that as workers became more efficient, fewer of them were needed. As output per worker increased, small numbers of workers could produce massive amounts of goods: enough for everybody, and more than people could reasonably purchase. With such a surplus of goods, they argued, prices would have to fall, destroying the purchasing power needed to consume those goods. Their grim analysis seemed to be proven right when the nation's economy went off the rails in 1929, even as the United States was still flush with oil and coal and factories stretched from coast to coast. The Technocrats' radical solution was to plan economic production by putting engineers in charge, and to mass produce goods based on available energy reserves rather than money capital, distributing them without the burden of a price system. The Technocrats noted that the price system only functions if food and goods are reasonably scarce, not abundant, and that business interests would prefer scarcity to abundance, as it leads to higher profits. They also pointed out that the incredible efficiency of the production line meant that there were far more workers looking for work than there were jobs available. It's worth noting that this analysis was made well before the advent of advanced robotics, bar codes, or digital computers, and when almost half of all Americans still lived on family farms.
While it had its supporters, the Technocrats' radical solution, with its quasi-socialistic undertones, alienated much of the general public. The American people were not willing to abandon democracy and put engineers in charge. Unfortunately, this also meant that their astute analysis of the wage and employment situation fell by the wayside. The economy's woes were blamed entirely on an errant financial market, with the underlying problems swept under the rug. The nation turned instead to Roosevelt's New Deal, which advocated government creation of jobs (along lines later formalized by Keynes) to provide opportunities to the unemployed and pump purchasing power back into the economy. It did not address overproduction and the fundamental labor surplus, however, and the nation lingered on in the doldrums of the Depression for a decade.
We all know what happened next. World War II came along, and the government seized control of the economy and provided full employment. Essentially, what it did was similar to what the Technocrats had advised, only it took a war to make it happen. Goods were rationed. Prices were no longer a problem, as the government was picking up the tab, and overproduction was not an issue, since the end result of all these products was to be blown up. Factories worked overtime, and full employment was achieved. The United States' massive weapons production was decisive in defeating the Axis powers, which it could vastly outproduce.
After the war was over, there was a very real concern that the economy would slide right back into the depression from which it had emerged. The solution proposed by the nation's political and corporate elites was to create a consumer economy: a large middle class paid well enough to afford the vast output of the nation's factories and generate enough economic activity to put everyone to work. The middle class would be encouraged to consume by advertisements beamed into their homes by television, coupled with vast suburbanization, which required new house construction, automobiles, roads, furniture, household appliances, services, conveniences, schools, supermarkets, and more. The nation's factories, the only ones to emerge unscathed from the war, dominated world production and provided stable, well-paying employment for millions, with excellent benefits. The nation became dependent upon housing starts and automobile production, which still dominate economic discussion to this day.
Yet there was some unease. The novelist Kurt Vonnegut, then working in the newly created field of corporate public relations for General Electric, was given a tour of a factory where an early computer-operated milling machine was cutting rotor blades for jet engines and turbines, something extremely difficult for traditional machinists to accomplish. The experience prompted him to pen his first novel, Player Piano (1952), depicting a world where all labor has been displaced by machines and conflict ensues between the engineers and managers who run production and the displaced laborers who find themselves superfluous to society. The novel deals more with the moral issues presented by this society than with the practical ones; indeed, Vonnegut's displaced workers are fairly well cared for by a generous welfare state that would be impossible to imagine in the America of today. Vonnegut's extreme scenario was consigned to the realm of speculative fiction. The American economy at the time was booming, and any displaced factory workers and machinists easily found new jobs as salesmen, office workers, carpenters, radio announcers, ad copywriters, franchise owners, truck drivers, and the like. During this time one income was sufficient to maintain a comfortable lifestyle, and concerns about overproduction and automation were quickly forgotten.
Please read Part 1 of this article
Please read Part 2 of this article
Please read Part 4 of this article
Please read Part 5 of this article
Please read Part 6 of this article
Tuesday, March 29, 2011
What Are People Good For? (part 2)
2. Voices in the Wilderness
Economists tend to distinguish between cyclical unemployment, the temporary job losses that accompany downturns in the business cycle, and structural unemployment, a mismatch between the skills of available workers and the requirements of job openings. What they tend to ignore is technological unemployment. Anything that does not involve Bayesian utility maximizers operating in a rational market tending towards equilibrium is out of bounds, a tendency that has only become more pronounced as economics has turned to ever more abstract mathematical models. Automation simply does not fit into this world view; it's the stuff of science fiction, best left to engineers and physicists. Economists would much rather deal only with issues like interest rates, the money supply, trade flows, and tax policy. As Martin Ford put it:
So far, I have not seen a great deal of deep thought given to how the future economy will work. Most people—and nearly all economists—make the obvious assumption about that: they assume the economy will essentially work the way it has always worked. The basic principles that govern the economy are seen as being relatively fixed and reliable. Economists look to history and find evidence that the free market economy has always adjusted to impacts from advancing technology and from resource and environmental constraints, and they assume that the same will always occur in the future. Crises and setbacks are temporary in nature: in the long run, the economy will rebalance itself and put us back on the path to prosperity.
One person who is taking it seriously is engineer Marshall Brain, who is often associated with the Singularity movement and is the founder of the popular Web site How Stuff Works (and should not be confused with Brain from Pinky and the Brain). He was one of the earliest to sound the alarm about automation's effects on the workforce. His Web site, Robotic Nation, has been around a long time, and it argues that robots will eventually become advanced enough to perform nearly all the tasks necessary to make society function. As he says in his FAQ:
I firmly believe that the rapid evolution of computer technology will bring us smart robots starting in a 2030 time frame. These robots will take over approximately 50% of the jobs in the U.S. economy over the course of just a decade or two. Something on the order of 50 million people will be unemployed.
The economy may adjust and invent new jobs for those 50 million unemployed workers, but it will not do so instantaneously. What we will have is a period of economic turmoil. All of those unemployed workers will be in a very bad spot. The economy as a whole will suffer from this turmoil and the downward economic spiral it causes. No one will benefit when this happens.
Brain has spoken and written extensively on these issues in a wide variety of forums. He has also written a speculative short story called Manna, detailing two possible versions of a society where work has been taken over by robots. In the first scenario, set in the United States, extensive automation produces a dystopia in which workers are driven like dogs by automated "bosses" that run every conceivable business and monitor their every move. Employees who fail to produce adequately are digitally blacklisted and wind up in internment camps constructed of cheap foam materials, where robots keep watch to make sure they don't escape. Profits are maximized by a small class of fabulously wealthy business owners who control the entire economy. In the second scenario, set in Australia, robots preside over an egalitarian society providing every conceivable want. Goods are distributed equally via a credit system based on available resources. Work is voluntary, and people are free to engage in whatever captures their interest. Everything is free. There are some "singularity" elements, like space elevators and a virtual-reality Internet plugged directly into people's nervous systems. While some of Brain's speculations may be over the top, his basic concerns are becoming more and more a reality.
Another lonely voice of concern is Silicon Valley engineer and entrepreneur Martin Ford, who has written a book called The Lights in the Tunnel detailing his concerns about the economy of the future. His blog, Econfuture, is essential reading, and he has also written about automation for The Huffington Post and the Atlantic Monthly. Like Brain, and unlike most economists, he does not believe that jobs displaced by technology will automatically be replaced in other areas:
"The biggest problem with the conventional wisdom is the number of jobs we are talking about. In the U.S. we have a workforce of around 140 million workers. The majority of these jobs are basically routine and repetitive in nature. At a minimum, tens of millions of jobs will be subject to automation, self-service technologies or offshoring. The automation process will never stop advancing: computer hardware and, perhaps most importantly, software will continue to relentlessly improve. Therefore, simply upgrading worker skills is not going to be a long-term solution; automation will eventually (and perhaps rapidly) catch up. If you are willing to look far enough into the future, the number of impacted jobs is potentially staggering."
One of the few economists to seriously study the issue is David Autor of MIT. In a paper published in 2009, Autor came to an ominous conclusion: technology is rendering middle-class jobs obsolete. His work found that job growth in recent decades has been concentrated at the high end and low end of the wage scale, and his findings indicate that automation is a major cause of this phenomenon.
Economists group all such arguments under the term "Luddite fallacy." The Luddites were groups of textile artisans who opposed the use of mechanized looms in factories, fearing the loss of their occupations and consequent mass unemployment. Beginning in 1811, they smashed labor-saving knitting machines in protest, clashing with the British army in the process. Obviously, the Luddites lost, and industrialization proceeded apace. The vastly larger economy produced by subsequent mechanization eventually provided employment for the displaced labor force. If we had listened to the Luddites, the argument goes, we would have neither the marvelous technological achievements of today nor the affluence we all enjoy.
The conventional economic view holds that as output per worker rises, the cost of that output falls. As costs fall, so do prices, increasing the demand for those goods and allowing them to be supplied to more markets. The increased demand causes more workers to be hired, so automation maintains or even increases employment. Furthermore, even if workers are shed from one sector, lower prices increase demand in other sectors, and those sectors will absorb the unemployed; you simply need to match labor with what the economy needs. While workers could be reduced or eliminated in particular operations such as steel milling or automobile manufacture, it would never be the case that the need for workers in the economy as a whole was diminished. Economists simply do not believe that automation causes unemployment, period. Economist Alex Tabarrok summarizes the thinking this way: "If the Luddite fallacy were true we would all be out of work because productivity has been increasing for two centuries." A commenter on Paul Krugman's blog put it more succinctly: "Humanity is a beehive, there will always be work." Note, however, that this reasoning rests on a series of assumptions, of which these are but a few:
1. Our demands are insatiable.
2. The future will be similar to the past.
3. There are jobs that cannot be automated.
4. Our resources are infinite.
5. All displaced workers will find employment somewhere else.
6. There will always be new industries to absorb population growth.
It appears that economic "science" is little more than taking what had happened in the past, and assuming without any evidence that it will necessarily happen in the future. After all, all those wool weavers found other jobs, didn't they? But are conditions really the same? Can these questions even be realistically addressed in these abstract economic models that are used to dismiss the arguments? What if the underlying assumptions of economics have fundamentally been altered? Martin Ford again:
Among economists and people who work in finance it seems to be almost reflexive to dismiss anyone who says "this time is different." I think that makes sense where we’re dealing with things like human behavior or market psychology. If you’re talking about asset bubbles, for example, then it’s most likely true: things will NEVER be different. But I question whether you can apply that to a technological issue. With technology things are ALWAYS different. Impossible things suddenly become possible all the time; that’s the way technology works. And it seems to me that the question of whether machines will someday out-compete the average worker is primarily a technological, not an economic, question.
Economists did not always ignore technological unemployment. John Maynard Keynes was perhaps the most influential economist of the twentieth century. His General Theory was the guidepost for combating the Great Depression. In one passage, he wrote:
For the moment the very rapidity of these changes is hurting us and bringing difficult problems to solve. Those countries are suffering relatively which are not in the vanguard of progress. We are being afflicted with a new disease of which some readers may not yet have heard the name, but of which they will hear a great deal in the years to come–namely, technological unemployment. This means unemployment due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour.
This was in the 1930's, when a computer like Watson was beyond the imagination of all but speculative science fiction writers. Even though he acknowledged the impact increasing mechanization and the resultant productivity had on workers, Keynes still believed that aggregate demand was the main problem affecting the economy, and his theories were centered around stimulating demand. Despite this, Keynes clearly believed the future would be different from the past. Towards the end of his career, Keynes wrote an essay entitled Economic Possibilities for our Grandchildren, where he put forward the claim that we would someday have the material abundance to abolish scarcity. He believed that productivity gains would eventually lead to more leisure time and greater affluence for all. He saw this leading to a profound social change, where commercial values of wealth acquisition and material goods would give way to a more general appreciation of art, scientific discovery, and social relationships, and that this would be a necessary outgrowth of increasing automaton and efficiency.
Today, economists scoff at his naïveté. Keynes clearly did not foresee the automobile-driven globalized consumer economy. We didn't slow down, we just bought more and found lots of new people to sell to. Consequently, his arguments concerning automation and productivity were dismissed inteir entiretly, as well as Keynes’ ideas about a future where increased productivity was traded for leisure. If we have not brought about Keynes' vision, it is not because it is impossible, but rather it is because of the choices we have made. Those choices have been specifically designed to increase scarcity and preserve the intersts of the powerful. In fact, we work much harder for much less than we did thirty just years ago, despite the increasing aggregate wealth of our society.
It cannot be overstated that the mainstream economics profession entirely missed the economic crash and downturn of 2008 that we are continuing to experience, as well as the crash of 1929 and the Great Depression. In fact, every downturn has been missed by mainstream economists, otherwise they could theoretically have been prevented. Economists were assuring us that the economy was stable and the underlying fundamentals were sound up until the very eve of the crash! How much can we trust mainstream economists?
Please read Part 1 of this article
Please read part 3 of this article
Please read part 4 of this article
Please read part 5 of this article
Please read part 6 of this article
Economists tend to distinguish between cyclical unemployment, the temporary job losses of a downturn, and structural unemployment, a mismatch between workers' skills and available job openings. What they tend to ignore is technological unemployment. Anything that does not involve Bayesian utility maximizers operating in a rational market tending towards equilibrium is out of bounds. This has only become more pronounced as economics has dealt with ever more abstract mathematical models of the economy. Automation simply does not fit into their world view; it's the stuff of science fiction, best left to engineers and physicists. They would much rather deal only with issues like interest rates, money supply, trade flows, and tax policy. As Martin Ford put it:
So far, I have not seen a great deal of deep thought given to how the future economy will work. Most people—and nearly all economists—make the obvious assumption about that: they assume the economy will essentially work the way it has always worked. The basic principles that govern the economy are seen as being relatively fixed and reliable. Economists look to history and find evidence that the free market economy has always adjusted to impacts from advancing technology and from resource and environmental constraints, and they assume that the same will always occur in the future. Crises and setbacks are temporary in nature: in the long run, the economy will rebalance itself and put us back on the path to prosperity.
One person who is taking it seriously is engineer Marshall Brain, who is often associated with the Singularity movement and is the founder of the popular Web site How Stuff Works (not to be confused with Brain from Pinky and the Brain). He was one of the earliest to sound the alarm about automation's effects on the workforce. His Web site Robotic Nation has been around a long time, and it argues that robots will eventually become advanced enough to perform nearly all the tasks necessary to make society function. As he says in his FAQ:
I firmly believe that the rapid evolution of computer technology will bring us smart robots starting in a 2030 time frame. These robots will take over approximately 50% of the jobs in the U.S. economy over the course of just a decade or two. Something on the order of 50 million people will be unemployed.
The economy may adjust and invent new jobs for those 50 million unemployed workers, but it will not do so instantaneously. What we will have is a period of economic turmoil. All of those unemployed workers will be in a very bad spot. The economy as a whole will suffer from this turmoil and the downward economic spiral it causes. No one will benefit when this happens.
Brain has spoken and written extensively on these issues in a wide variety of forums. He has also written a speculative short fiction story called Manna, detailing a possible version of a society where work is taken over by robots. He presents two possible scenarios. In the first scenario, set in the United States, extensive automation produces a dystopia where workers are worked like dogs by automated "bosses" that run every conceivable business and monitor their every move. Employees that fail to produce adequately are digitally blacklisted, and wind up in internment camps constructed of cheap foam materials where robots keep watch to make sure they don't escape. Profits are maximized by a small class of fabulously wealthy business owners who control the entire economy. In the other scenario, set in Australia, robots preside over an egalitarian society providing every conceivable want. Goods are distributed equally via a credit system and based on resources available. Work is voluntary, and people are free to engage in whatever captures their interest. Everything is free. There are some "singularity" elements like space elevators, and a virtual-reality internet plugged right into people's nervous systems. While some of Brain’s speculations may be over the top, his basic concerns are becoming more and more a reality.
Another lonely voice of concern is Silicon Valley engineer and entrepreneur Martin Ford. He has written a book called The Lights In The Tunnel detailing his concerns about the economy of the future. His blog, Econfuture, is essential reading. Ford has also written for The Huffington Post and the Atlantic Monthly dealing with issues surrounding automation. Like Brain and unlike most economists, he does not believe that jobs displaced by technology are going to automatically be replaced in other areas:
"The biggest problem with the conventional wisdom is the number of jobs we are talking about. In the U.S. we have a workforce of around 140 million workers. The majority of these jobs are basically routine and repetitive in nature. At a minimum, tens of millions of jobs will be subject to automation, self-service technologies or offshoring. The automation process will never stop advancing: computer hardware and, perhaps most importantly, software will continue to relentlessly improve. Therefore, simply upgrading worker skills is not going to be a long-term solution; automation will eventually (and perhaps rapidly) catch up. If you are willing to look far enough into the future, the number of impacted jobs is potentially staggering."
One of the few economists to seriously study the issue is David Autor of MIT. In a paper published in 2009, Autor came to an ominous conclusion: technology is rendering middle-class jobs obsolete. His work found that job growth over the past few decades has occurred predominantly at the high and low ends of the wage scale, and his findings indicate that automation is a major cause of this polarization.
Economists group all such arguments under the term "Luddite Fallacy." The Luddites were groups of textile artisans who opposed the use of mechanized looms in factories, fearing the machines would destroy their occupations and cause mass unemployment. Beginning in 1811, they smashed the labor-saving knitting machines in protest, clashing with the British army in the process. Obviously, the Luddites lost, and industrialization proceeded apace. The vast economy produced by subsequent mechanization eventually provided employment for the displaced labor force. If we had listened to the Luddites, the argument goes, we would have neither the marvelous technological achievements of today nor the affluence we all enjoy.
The conventional economic view holds that as each worker produces more output, the cost of that output goes down. As costs fall, so do prices, increasing demand for the goods and allowing them to be supplied to more markets. The increased demand causes more workers to be hired, so automation maintains or even increases employment. Furthermore, even if workers are lost from one sector, lower prices increase demand in other sectors, which absorb the unemployed; you simply need to match labor to what the economy needs. While workers could be reduced or eliminated in certain operations, such as steel milling or automobile manufacture, the need for workers in the economy as a whole would never diminish. Economists simply do not believe that automation causes unemployment, period. Economist Alex Tabarrok summarizes the thinking this way: "If the Luddite fallacy were true we would all be out of work because productivity has been increasing for two centuries." A commenter on Paul Krugman's blog put it more succinctly: "Humanity is a beehive, there will always be work." Note, however, that this reasoning rests on a series of assumptions, of which these are but a few:
1. Our demands are insatiable.
2. The future will be similar to the past.
3. There are jobs that cannot be automated.
4. Our resources are infinite.
5. All displaced workers will find employment somewhere else.
6. There will always be new industries to absorb population growth.
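The chain of reasoning behind these assumptions can be made concrete with a toy model: productivity gains lower prices, demand responds with some elasticity, and employment is whatever labor the new level of output requires. The numbers and the constant-elasticity demand curve below are illustrative assumptions, not empirical claims; the point is that the conventional conclusion holds only when demand is elastic enough.

```python
# Toy model of the conventional "Luddite fallacy" rebuttal.
# Assumptions (illustrative only): price falls in proportion to the
# productivity gain, and demand follows a constant-elasticity curve.

def employment_after_automation(productivity_gain, demand_elasticity):
    """Ratio of post-automation employment to pre-automation employment."""
    price_ratio = 1.0 / productivity_gain             # cheaper output
    demand_ratio = price_ratio ** -demand_elasticity  # demand response
    return demand_ratio / productivity_gain           # fewer workers per unit of output

# Doubling productivity with elastic demand (elasticity 1.5): employment grows.
print(employment_after_automation(2.0, 1.5))  # ~1.41, more jobs
# Doubling productivity with satiable demand (elasticity 0.5): employment shrinks.
print(employment_after_automation(2.0, 0.5))  # ~0.71, fewer jobs
```

On the model's own terms, assumption 1 ("our demands are insatiable") does all the work: set the elasticity below one and automation destroys jobs even in this friendly framework.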
It appears that economic "science" is little more than taking what has happened in the past and assuming, without any evidence, that it will necessarily happen in the future. After all, all those wool weavers found other jobs, didn't they? But are conditions really the same? Can these questions even be realistically addressed by the abstract economic models used to dismiss the arguments? What if the underlying assumptions of economics have fundamentally changed? Martin Ford again:
Among economists and people who work in finance it seems to be almost reflexive to dismiss anyone who says "this time is different." I think that makes sense where we’re dealing with things like human behavior or market psychology. If you’re talking about asset bubbles, for example, then it’s most likely true: things will NEVER be different. But I question whether you can apply that to a technological issue. With technology things are ALWAYS different. Impossible things suddenly become possible all the time; that’s the way technology works. And it seems to me that the question of whether machines will someday out-compete the average worker is primarily a technological, not an economic, question.
Economists did not always ignore technological unemployment. John Maynard Keynes was perhaps the most influential economist of the twentieth century. His General Theory was the guidepost for combating the Great Depression. In one passage, he wrote:
For the moment the very rapidity of these changes is hurting us and bringing difficult problems to solve. Those countries are suffering relatively which are not in the vanguard of progress. We are being afflicted with a new disease of which some readers may not yet have heard the name, but of which they will hear a great deal in the years to come–namely, technological unemployment. This means unemployment due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour.
This was in the 1930's, when a computer like Watson was beyond the imagination of all but speculative science fiction writers. Even though he acknowledged the impact that increasing mechanization and the resulting productivity had on workers, Keynes still believed that aggregate demand was the main problem affecting the economy, and his theories centered on stimulating demand. Despite this, Keynes clearly believed the future would be different from the past. Towards the end of his career, he wrote an essay entitled Economic Possibilities for our Grandchildren, in which he claimed that we would someday have the material abundance to abolish scarcity. He believed that productivity gains would eventually lead to more leisure time and greater affluence for all. He saw this leading to a profound social change, in which the commercial values of wealth acquisition and material goods would give way to a more general appreciation of art, scientific discovery, and social relationships, a necessary outgrowth of increasing automation and efficiency.
Today, economists scoff at his naïveté. Keynes clearly did not foresee the automobile-driven, globalized consumer economy. We didn't slow down; we just bought more and found lots of new people to sell to. Consequently, his arguments concerning automation and productivity were dismissed in their entirety, along with his ideas about a future where increased productivity was traded for leisure. If we have not brought about Keynes' vision, it is not because it is impossible, but because of the choices we have made, choices specifically designed to increase scarcity and preserve the interests of the powerful. In fact, we work much harder for much less than we did just thirty years ago, despite the increasing aggregate wealth of our society.
It cannot be overstated that the mainstream economics profession entirely missed the crash of 2008 and the downturn we are still experiencing, just as it missed the crash of 1929 and the Great Depression. In fact, mainstream economists have missed every downturn; had they foreseen them, the downturns could theoretically have been prevented. Economists were assuring us that the economy was stable and the underlying fundamentals were sound up until the very eve of the crash! How much can we trust mainstream economists?
Please read part 1 of this article
Please read part 3 of this article
Please read part 4 of this article
Please read part 5 of this article
Please read part 6 of this article
Monday, March 28, 2011
What Are People Good For?
There is only one condition in which we can imagine managers not needing subordinates, and masters not needing slaves.
This condition would be that each (inanimate) instrument could do its own work, at the word of command or by intelligent anticipation, like the statues of Daedalus or the tripods made by Hephaestus, of which Homer relates that
"Of their own motion they entered the conclave of Gods on Olympus "
as if a shuttle should weave of itself, and a plectrum should do its own harp playing.
-Aristotle, Politics.
Any labor which competes with slave labor must accept the economic conditions of slave labor.
-Norbert Wiener, cybernetics pioneer.
1. I, Robot.
Recently my local library installed a series of automatic check-out stations. Instead of handing your books to a circulation aide (that's what they're technically called), you scan your card, enter a PIN (personal identification number), scan the items, and print a slip with the due dates. This came many months after the grocery store I used to frequent converted all its express checkout lanes to self-scan lanes, where you scan the bar code of each item yourself, insert cash into the machine (or, more commonly, swipe a card), receive your receipt, and bag your own groceries. Work that used to be done for you is now done either by you or by a computer, with the savings from eliminated workers' wages theoretically passed along to you in the form of lower prices.
At this point it is worth noting that the concept is somewhat self-defeating: the process is currently so unfamiliar that, in each of these situations, staff were needed on hand just to walk customers through checking out successfully. It would have been simpler for these workers to check the customers out themselves, not to mention less stressful for the customers. So although there seem to be no real net savings for now, the corporations deploying these machines are betting that once they become commonplace enough, customers will no longer need such hand-holding, and impersonal, automated check-out will simply become the standard way things are done. And they're probably right; these machines, once rarities, are cropping up everywhere: groceries, hardware stores, libraries, gas stations. ATMs have been around for decades; you no longer need to interact with a teller to do financial transactions (and you pay extra for the privilege, making banking the only business that charges you more for using automation).
When I call customer support, an automated menu with a robotic voice reads me my options and asks me to speak my choice into the receiver. You no longer need to interact with an attendant at any of these businesses. At one time, vending machines were an oddity and service over the phone was unthinkable; now both are commonly accepted.
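The self-checkout sequence described above (scan, total, pay, print a slip) is, at bottom, a small state machine tracking items and a running total. A minimal sketch, with invented barcodes and prices:

```python
# A minimal self-checkout sketch. Barcodes and prices are invented;
# a real terminal adds payment handling, weight checks, and the
# attendant-override states familiar from the store.

class SelfCheckout:
    def __init__(self, prices):
        self.prices = prices   # barcode -> unit price
        self.scanned = []

    def scan(self, barcode):
        """Look up a scanned barcode and add its price to the order."""
        self.scanned.append(self.prices[barcode])

    def total(self):
        """Amount due, rounded to cents."""
        return round(sum(self.scanned), 2)

till = SelfCheckout({"0001": 2.49, "0002": 5.99})
till.scan("0001")
till.scan("0002")
print(till.total())  # 8.48
```

The point of the illustration is how little of the transaction actually requires a person: the lookup, the arithmetic, and the receipt are trivially mechanical, which is exactly why the clerk's role shrinks to troubleshooting.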
Some of these developments in self-service have been theoretically possible since the barcode was invented, but they have become much more visible lately. The recession has actually spurred the drive for automation; the last recession, in 2000-2001, also saw such a wave of automation, and with each new wave the computers get more sophisticated. There has been a lot of discussion and hand-wringing over outsourcing, but scant attention has been paid to what is called technological unemployment, another horseman of the labor apocalypse.
But thankfully the discussion is finally starting. The prominent economist and New York Times columnist Paul Krugman wrote a column on March 7, 2011 entitled Degrees and Dollars. In the column, Krugman noted an earlier story in the Times about new software that allows legal documents to be intelligently perused by a computer program for relevant information, eliminating the need for large teams of junior lawyers and paralegals to sift through copious documents on large, complex cases:
Computers, it turns out, can quickly analyze millions of documents, cheaply performing a task that used to require armies of lawyers and paralegals. In this case, then, technological progress is actually reducing the demand for highly educated workers.
And legal research isn’t an isolated example. As the article points out, software has also been replacing engineers in such tasks as chip design. More broadly, the idea that modern technology eliminates only menial jobs, that well-educated workers are clear winners, may dominate popular discussion, but it’s actually decades out of date.
The fact is that since 1990 or so the U.S. job market has been characterized not by a general rise in the demand for skill, but by “hollowing out”: both high-wage and low-wage employment have grown rapidly, but medium-wage jobs — the kinds of jobs we count on to support a strong middle class — have lagged behind.
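The document-review software described in the quoted passage can be caricatured in a few lines: rank each document by how many of a query's terms it contains. Real e-discovery systems use far more sophisticated statistical and linguistic models; this hypothetical sketch, with invented documents and search terms, only illustrates the basic ranking step.

```python
# Hypothetical sketch of keyword-based document review: score each
# document by the fraction of query terms it contains, then rank.
# Documents and query terms are invented examples.

def relevance_score(document, query_terms):
    """Fraction of query terms appearing in the document."""
    words = set(document.lower().split())
    hits = sum(1 for term in query_terms if term.lower() in words)
    return hits / len(query_terms)

docs = [
    "The merger agreement was signed by both parties in March",
    "Lunch menu for the company cafeteria",
    "Email regarding the disputed merger terms and liability",
]
query = ["merger", "liability", "agreement"]

# Most relevant documents first; the cafeteria menu sinks to the bottom.
ranked = sorted(docs, key=lambda d: relevance_score(d, query), reverse=True)
```

Even this crude filter shows why the junior-lawyer task scales so badly for humans and so well for machines: scoring a million documents is just a million repetitions of the same cheap loop.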
In the article, Krugman argues that seeing education as a panacea for falling wages is out of date, as many medium-skill jobs are being automated out of existence as fast as low-skilled ones. Krugman also dealt with the falling demand for brains in a blog post, in which he mentions Watson, IBM's "intelligent" computer that only weeks earlier had defeated some of the best contestants on the television game show Jeopardy. To compete on Jeopardy, a computer must not merely sift through data but "understand" ordinary questions, context, relevancy, even puns, slang, and wordplay. Jeopardy champion Ken Jennings, one of the contestants bested by Watson, wrote a perceptive article for Slate magazine in which he said the following:
IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of "thinking" machines. "Quiz show contestant" may be the first job made redundant by Watson, but I'm sure it won't be the last.
Krugman still thinks that many low-wage jobs like janitors are less likely to be automated:
Most of the manual labor still being done in our economy seems to be of the kind that’s hard to automate. Notably, with production workers in manufacturing down to about 6 percent of U.S. employment, there aren’t many assembly-line jobs left to lose. Meanwhile, quite a lot of white-collar work currently carried out by well-educated, relatively well-paid workers may soon be computerized. Roombas are cute, but robot janitors are a long way off; computerized legal research and computer-aided medical diagnosis are already here.
His optimism over the creation of low-wage jobs is probably misplaced; they are not being created either. According to labor statistics, the unemployment rate for households earning over $150,000 a year is 3 percent, while it is 31 percent for the country's poorest households. In fact, janitors may not even be safe from self-cleaning automated toilets. That same week, Boing Boing posted a video of people riding in one of Google's experimental self-driving cars navigating an obstacle course. Goodbye, cab and truck drivers. A robot has been invented to fold laundry. Goodbye, housekeeping staff. Surely low-paid agricultural work is safe? Think again. The Institute of Agricultural Machinery at Japan's National Agriculture and Food Research Organization, along with SI Seiko, has developed a robot with stereoscopic vision that can select and harvest strawberries based on their color. Future developments are predicted for tomatoes, grapes, and other plants. Automation is already used extensively in dairy farming (milking machines, etc.), and M.I.T. is working on prototype bots that can monitor, feed, and harvest tomato plants. According to one commentator:
“The automation of agriculture could prove to be a pivotal development in the early 21st century, akin to the adoption of combustion engines in the early 20th century. Just as horses were eventually replaced by tractors, humans may find themselves replaced by robots in the remaining realms of agricultural labor in which they still hold sway.”
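The strawberry-harvesting robot mentioned above decides which fruit to pick based on color. A crude stand-in for that decision is a simple color threshold; the RGB cutoffs below are invented for illustration, and a real system relies on calibrated stereo vision and color models rather than a single reading.

```python
# Crude ripeness check by color. Thresholds are invented for
# illustration; a production harvester uses calibrated stereo
# cameras, not a single RGB sample.

def is_ripe(r, g, b):
    """Treat a berry as ripe when red strongly dominates green and blue."""
    return r > 150 and r > 1.5 * g and r > 1.5 * b

print(is_ripe(200, 60, 50))   # deep red reading: True
print(is_ripe(120, 160, 60))  # still green: False
```

What was once the textbook example of judgment that "only a human picker has" reduces, in its simplest form, to three comparisons.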
Construction workers and the trades aren’t immune, either. New housing starts are seen as barometers of economic health, yet there is a drive to construct entire houses in factories via automation, ensuring higher quality and lower costs than building houses ad hoc outdoors in unpredictable weather conditions. BLDGBlog recently posted an article entitled “The Robot and the Architect are Friends,” exploring the use of robots to translate building designs into reality without teams of construction workers. The article stated:
Swiss architects Gramazio & Kohler "have a vision: architecture using robotics to take command of all aspects of construction. Liberated from the sidelines, the profession would be freed to unleash all its creative potential—all thanks to its obedient servants, the robots."
In fact, the New York Times is running a whole series of reports, “Smarter Than You Think,” chronicling recent artificial intelligence breakthroughs. It is sobering reading. It seems there is already no task beyond the reach of a computer, and even relatively complex tasks are not far off. Having long ago mastered chess, computers are now playing poker against human opponents. NASA is already sending robotic astronauts into space. Highly paid television journalists are not immune, either. The blogger Randall Parker at Future Pundit wrote an article based on the same New York Times piece in which he imaginatively describes a legal system almost entirely devoid of people. He goes further than Krugman in describing an automated workforce of the future. His article may seem a bit fanciful, yet decision-making software is a reality, and it is getting more sophisticated all the time. There's even a computer that composes beautiful music! I could go on and on. The sense throughout the last decades of the twentieth century was that automation affected only low-skilled, blue-collar factory workers. College-educated workers thought, “If only those losers would go to school and hit the books, they wouldn’t have to worry. It’s their own damn fault.” Now we know that even many jobs requiring extensive education can be eliminated just as easily.*
So we have a continually growing population, thanks to the fruits of industrialization, that can feed itself using a mere 2.7 percent of its laborers. This growing population needs to sell its labor in return for wages to survive, yet every business in the economy strives for maximum "efficiency," which means getting the most work done with the fewest inputs possible. Closely related is productivity, the amount of output produced per worker. Any economist will tell you that rich societies are the ones with the highest productivity, and businesses are always striving to increase it. In the U.S., worker productivity has soared, especially since the advent of the computer (with the resulting gains not shared with the workers). Yet productivity gains also mean fewer workers are needed. Every incentive in a money/wage economy is to reduce the amount of human labor! That's why so few people are involved in agriculture, and why food is so (relatively) cheap. So you have an economy that needs to continually create more jobs for more and more people, yet provides every incentive to produce with fewer and fewer workers. And let's not forget that since the 1970's or so, both men and women are expected to work, and in many cases must work, so now we need to create enough jobs for all of them. The question is not so much "how can we make this system work?" as "how has this system worked for so long?" In the past, technology merely allowed workers to work more effectively and efficiently; it did not actually eliminate the need for workers themselves.
Some have pointed out that automation does not eliminate the need for lawyers: even if legal research is done by computers, you still need lawyers, and even if medical diagnosis is done by computer, you still need doctors. As Krugman noted, computers excel at routine tasks, "cognitive and manual tasks that can be accomplished by following explicit rules.” But this misses the point. The fact that we still need some doctors, lawyers, and janitors doesn’t matter. The economy must continually create more jobs just to accommodate the people constantly entering the workforce. In fact, the economy needs to add roughly a million and a half jobs every year just to keep the unemployment rate from increasing. That's some 125,000 jobs every month, year in and year out. With automation constantly decreasing the need for “cognitive and manual tasks," this becomes more and more of a receding horizon. Even a slight decrease in available jobs adds up after months and months; every month we miss the jobs target, the number of unemployed workers grows, and making up the deficit becomes harder and harder. We're fighting a losing battle here.
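The arithmetic here compounds quickly: a modest monthly shortfall in job creation, repeated month after month, leaves the economy millions of jobs behind. A back-of-the-envelope sketch, using round illustrative figures:

```python
# Back-of-the-envelope job-shortfall arithmetic. The monthly target
# is a round illustrative figure for U.S. workforce growth.

MONTHLY_JOBS_NEEDED = 125_000

def cumulative_shortfall(monthly_jobs_added, months):
    """Total jobs the economy falls behind after `months` months."""
    return (MONTHLY_JOBS_NEEDED - monthly_jobs_added) * months

# Adding 100,000 jobs a month sounds healthy, but over five years
# it leaves the economy 1.5 million jobs short.
print(cumulative_shortfall(100_000, 60))  # 1500000
```

A headline number that looks like solid growth can still be a slow-motion loss once workforce growth is subtracted, which is the receding-horizon point above.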
While we may not like repetitive jobs with clearly defined rules, the fact is that these jobs provide the majority of employment in our economy. They provide the entry-level positions that allow one to climb the career ladder; an economy without them amounts to a ladder with no rungs between the bottom and the top. The lawyers doing the grunt work today will be the senior associates in twenty years. The interns doing medical diagnosis now will be the future doctors. The architects building models now will eventually design entire buildings. At least, that’s how it used to work. Once those basic jobs are gone, who will get to be the doctors and lawyers of the future, and how will that be decided? The sad fact is that jobs requiring true thought and creativity are very limited, and competition for them is intense; it’s always been this way, whether we acknowledge it or not. Such jobs are highly desirable, and thus occupied by those with the requisite money and social connections. Competition for them is becoming ever more intense, which is no doubt driving the stratospheric rise in the cost of education in both time and money. Competition to get into the best schools now begins before children are even five years old! There is so much competition for jobs in the design and entertainment industries that employees must literally work for free just to get a foot in the door, and similar free internships are required at highly desirable workplaces like investment firms and media companies, where interns hope to make a killing or be discovered as the next celebrity journalist or on-air personality. Already, highly paid, creative positions are dominated by the children of the wealthy and privileged, who simply graduate right to the top. This leads to an even more class-stratified caste system than we have today. Are we heading toward a new aristocracy? We have what I call a creative surplus, a concept I hope to explore in more detail.
What it means is that our workforce has far more creative ideas than society can realistically implement at this time. Even if you are one of those intelligent, creative individuals, don’t count on making any money off of it; there is only a need for so many fashion designers, starchitects, or New York Times columnists. The massive amount of “free” creativity floating around the Internet (including this essay) is testament to that. Our workforce is far more skilled and educated now than in the nineteen-fifties, yet unemployment is much higher.
Please read part 2 of this article
Please read part 3 of this article
Please read part 4 of this article
Please read part 5 of this article
Please read part 6 of this article
*Oddly enough, the one place that should have been entirely run by robots, a nuclear facility in Japan, was not.