Monday, February 25, 2013

Human Extinction

“If all insects on Earth disappeared, within 50 years all life on
Earth would end. If all human beings disappeared from the Earth,
within 50 years all forms of life would flourish.”

― Biologist Jonas Salk
In the near future, many potential triggers could lead to a cataclysm. The 20th century gave us nuclear bombs and weaponized smallpox. The 21st will surely deliver a greater variety of bioweapons. The prospect of a natural killer like the influenza virus adapting to a globalized world of 7 billion people is worrisome. The machines we have built our civilization upon—computers, software, networks—contain the seeds of destruction for the simple fact that we have come to depend on them, and they are vulnerable to manipulation. We are always figuring out new ways of bringing apocalypse on our heads. Even climate, which we tend to think of as a slowly unfolding crisis, could conceivably bite us sooner than we think. Some researchers think that weather patterns such as the ones that bring monsoons to India and sustain glaciers in Antarctica could behave like dynamical systems, prone to sudden, unpredictable, and dangerous changes.  
It’s possible—perhaps likely—that any of these factors, or several acting at the same time, could cause a plunge in the human population in this century or the next. United Nations estimates have the world population, now 7 billion, rising to 10 billion by the end of the century and then leveling off. When we consider such estimates, we tend to make a questionable assumption: that the human population will behave like no other—that, after rising with breakneck speed, it will assume a steady state precisely at its peak. Ecologists will tell you that that is not usually the way it goes. Yeast cells that rapidly fill up their culture dish generally die off suddenly and in great numbers.
Could Humans Go Extinct? (Slate)
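The yeast comparison is easy to make concrete. Here is a toy simulation (my own illustration, not from the article) contrasting the two assumptions: a textbook logistic population that coasts smoothly up to its carrying capacity, versus one whose growth runs on a resource stock consumed faster than it renews. The parameter values are invented; only the qualitative difference matters.

```python
# Toy population models; all parameters are invented for illustration.

def logistic_step(n, r=0.1, k=10.0):
    """Textbook logistic growth: slows smoothly as n approaches capacity k."""
    return n + r * n * (1 - n / k)

def overshoot_step(n, stock, r=0.1, use=0.01, regen=0.02):
    """Growth fueled by a stock that is consumed faster than it regenerates."""
    harvest = min(stock, use * n)
    stock += regen - harvest
    n += r * n if harvest >= use * n else -0.05 * n  # starve when supply falls short
    return n, stock

smooth, boom, stock = 1.0, 1.0, 5.0
for _ in range(200):
    smooth = logistic_step(smooth)
    boom, stock = overshoot_step(boom, stock)
print(f"logistic plateau: {smooth:.1f}  boom-and-bust survivor: {boom:.1f}")
```

The first population levels off at its peak, which is exactly the questionable assumption the article describes; the second climbs fast, exhausts its stock, and crashes to a fraction of its former size - the yeast-in-the-culture-dish trajectory.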

There's also a good interview with the author about his book on existential threats to humanity on Skeptically Speaking. I was struck by this point about how the hunter-gatherer way of finding food is so different from the utter dependency on cereal-based monocrops today:
Host: You looked at the Sanak food chain as an example.

Fred Guterl: Yeah, this was some work that Jennifer Dunne at Santa Fe Institute and others were doing. Jennifer Dunne is part of a group that's studying ancient hunter-gatherer societies in the Aleutian Islands. And what she was doing, she was building a food web. She's trying to model in a computer how the hunter-gatherers of thousands of years ago fit in the food web in these islands. And what she found was really interesting. You think of humans as being on the top of the food pyramid, you know, we eat things that eat other things that eventually eat plants, right? Plants are the things, and algae are the things that take energy from the sun and convert it into chemical energy, and then they get eaten, and we're at the top. But we're actually in the middle. Humans are in the middle.

And hunter-gatherer societies, according to her data, are right smack in the middle. And that means that we are virtually unique. I think we're unique to the extent that we are omnivores, and we will eat things at the top of the food chain, the food web, and things at the bottom; we eat everything. We are just voracious. And one of the things that hunter-gatherer societies did, was that they would prey-switch; they would switch from one prey to another depending on availability.

If you were really into eating sea otters, you know, you were sneaking up on the sea otters and were hitting them over the head and roasting them over the fire, that was really great until one day, well, gee, I can't find any sea otters. So what do I do? Should I go extinct or should I find some other source of food? The answer is always, find some other source of food. So, well, there are a lot of mussels, so we'll gather some mussels and we'll eat those. Mussels are much farther down on the food chain. Or maybe we'll just eat a bunch of seaweed, you know, let's get the seaweed and we'll roast it over the fire. I don't know if this is what they really did, I'm just kind of making it up.

But this is the idea - we switch from one prey to the other depending on what's available. And this had a stabilizing effect on the ecosystem. So if there was some overpopulation, you could count on humans to go and chase these things down with their knives and forks, and when they became scarce, the humans would go chase something else.

Now, our modern agricultural system is not like this. Our modern agricultural system is, we have come to rely on a few staple crops. We've got these vast monocultures, and they're unstable. I mean, when you have billions of people being fed by a few different strains of wheat, a few different strains of rice, you're leaving yourself very vulnerable to some kind of pathogen that would affect that species. And we're starting to see this with things like wheat rust. There's a really nasty strain of wheat rust that has been resistant to all treatment that's spreading eastward through Uganda and has been for a number of years. And when it gets to India, it could be very bad. It could cause a lot of pain.

When you're talking about a hunter-gatherer society in the Aleutian Islands, you have a little, self-contained ecosystem. Then when you extend it to the entire globe, you have this mass agricultural system. And then you have a huge impact of humans on habitat. And then you have the fact that we fish species nearly to extinction, say, I'm thinking maybe of the bluefin tuna. I haven't checked the prices recently but they were rising and rising and rising as the fish are getting scarcer. And biologists are worried that the bluefin tuna will be threatened with extinction.

So we're not prey switching here. We're not saying, 'oh, bluefin tuna, there aren't that many of them left, let's go and eat something else.' We're saying, 'bluefin tuna, they fetch even more money, let's do whatever we can to find whatever bluefin tuna there are and deliver them to people's dinner plates!' Our agricultural incentives and our agricultural relationship with the rest of the world is really different than it was. And I think, if you were to say, well, let's have a world where there are 10 billion people, and let's devise a way to feed them, you would not devise the current agricultural system. Because the one we have now is taking us to a place that is very precarious. So the question then is, what do we do about that?
More info on food webs: http://www.foodwebs.org/
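The stabilizing effect Guterl describes is easy to see in a back-of-the-envelope model. The sketch below is my own invention, not Dunne's actual Sanak food-web model: it pits a forager locked onto a single staple against one who switches to whichever prey is currently more abundant, with both prey regrowing logistically.

```python
# A toy two-prey model, not the Santa Fe Institute's food-web data.

def regrow(prey, r=0.3, cap=1000.0):
    """Logistic regrowth of a prey population toward its carrying capacity."""
    return prey + r * prey * (1 - prey / cap)

def forage(switching, years=100, demand=120.0):
    otters, mussels = 500.0, 500.0
    for _ in range(years):
        otters, mussels = regrow(otters), regrow(mussels)
        if switching and mussels > otters:
            mussels = max(mussels - demand, 1.0)  # prey-switch to mussels
        else:
            otters = max(otters - demand, 1.0)    # keep hammering the otters
    return round(otters), round(mussels)

print("prey-switching forager:", forage(True))
print("single-staple forager: ", forage(False))
```

With switching, the pressure moves off whichever prey is scarce and both populations persist; locked onto one staple, the forager drives it to collapse - which is the fragility Guterl sees in a monoculture food system.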

Wikipedia on Risks to civilization, humans, and planet Earth

Sunday, February 24, 2013

Anthropologipalooza

A few final notes on Jared Diamond, and related topics. Here's an article in Slate questioning the intellectual rigor of Diamond's approach:
Diamond has a gift for storytelling. He presents his examples in a seductively readable voice with unflinching confidence, which makes his conclusions about the similarities and differences between traditional and modern society seem like common sense. But as I read the text, I found that I agreed with Diamond in inverse relation to my pre-existing knowledge about whatever subject he was addressing. When Diamond was writing about topics that I know in depth, I felt as though he was leaving out important information; when I didn’t know what he was writing about, I was thoroughly convinced. Diamond is a generalist and will always paint with a brush that a specialist finds too broad. The danger lies not in simplifying source material by leaving out extraneous details, but in selectively highlighting only the facts that support one’s argument and casting contravening cases aside.
Can You Trust Jared Diamond? One thing he could learn from traditional societies: Show your work. (Slate)

I also discovered this excellent blog - Living Anthropologically, which has several good pieces on Diamond and other topics:
It is important to first underscore that we cannot read anthropology’s ethnographic record for evidence of whether or not violence is inherent to human nature, as some have attempted. Fortunately, on this point Jared Diamond is clear and correct: “It is equally fruitless to debate whether humans are intrinsically violent or else intrinsically cooperative. All human societies practise both violence and cooperation; which trait appears to predominate depends on the circumstances.”  
It is also important to underscore that human groups have had varying levels of violence, both historically and across both state and non-state societies. Diamond also realizes this point. What I object to is that following these two acknowledgements, Diamond then portrays non-state societies as generally more violent than state societies, and believes that “the long-term effect of European, Tswana, or other outside contact with states or chiefdoms has almost always been to suppress tribal warfare. The short-term effect has variously been either an immediate suppression as well or else an initial flare-up and then suppression.” (Of course, the duration of this “initial flare-up” could be for centuries as Diamond writes a few sentences earlier, that in some cases “warfare had been endemic long before European arrival, but the effects of Europeans caused an exacerbation of warfare for a few decades (New Zealand, Fiji, Solomon Islands) or a few centuries (Great Plains, Central Africa) before it died out.”) I hope to have shown above that the empirical evidence for those claims is not reliable.
The Yanomami Ax Fight: Science, Violence, Empirical Data, and the Facts
This chapter challenges the repeated refrain of “absence of evidence is not evidence of absence.” War does leave behind recoverable evidence. True, in some cases, war could be present but for some reason not leave traces. However, comparison of many, many cases, from all different regions, shows some clear patterns. In the earliest remains, other than occasional cannibalism, there is no evidence of war, and barely any of interpersonal violence. In Europe’s Mesolithic, war is scattered and episodic, and in the comparable Epipaleolithic of the Near East, it is absent. Neolithic records vary, but all except one begin with at least a half a millennium of peace, then war appears in some places, and over time war becomes the norm. War does not extend forever backwards. It has identifiable beginnings.

 My suggestion is that as archaeologists search for signs of war, they also consider the possibility that humans are capable of systematically dealing with conflict in peaceful ways. . . . Across all of Europe and the Near East, war has been known from 3000 BC, or millennia earlier, present during all of written history. No wonder we think of it as “natural.” But the prevalent notion that war is “just human nature” is empirically unsupportable. The same types of evidence that document the antiquity of war refute the idea of war forever backwards. War sprang out of a warless world. Humankind has suffered infinite misery because systems of war conquered our social existence. Better understanding of what makes war, and what makes peace, is an important step toward bringing peace back.
War, Peace, & Human Nature: Convergence of Evolution & Culture
Beyond the brilliant headlines, Corry and Survival International may pull off a rare feat: to facilitate the entry of others into the discussion, challenging the structure of what Michel-Rolph Trouillot called the Savage slot: “In the rhetoric of the Savage slot, the Savage is never an interlocutor, but evidence in an argument between two Western interlocutors about the possible futures of humankind” (Global Transformations 2003:133).
Angry Papuan leaders demand Jared Diamond apologizes – Survival Intl
Myths of the Spanish conquest prove surprisingly durable, and Matthew Restall’s aptly titled Seven Myths of the Spanish Conquest is one of the best places to begin debunking. By highlighting the importance of indigenous allies both during and throughout colonization, Restall points the way to a different kind of history, of contingent outcome rather than inevitability.
Myths of the Spanish Conquest – Indigenous Allies & Politics of Empire

Here's another anthropology blog with some stuff about Diamond: Savage Minds. They even have a whole category dedicated to him.
I’m puzzled at Diamond’s claim that the purpose of law is to make peace — that it is the type of thing that would be improved by including mediators and restorative justice. It doesn’t take a lot of insight (or experience with the court system) to realize that law is a way of making war, not making peace. Diamond, thinking like Hobbes, seems to think this is the case with criminal law, where the state makes war with its citizens. But it’s equally true of civil law. Suing someone is what you do when the talking is done. This is something that Papua New Guineans have often remarked to me in the course of litigating against mining companies: when you have gavman (government) you fight with money, not arrows.
Law, Justice, and War in World Until Yesterday

Here are a couple posts from a blog called Easily Distracted:
In an earlier comment, I mentioned at least a few areas where there seems to me to be a genuine debate with a range of legitimate positions that require respect, if not agreement, in terms of Diamond’s latest (as well as Pinker’s latest book, which has some overlap):

1. Maybe New Guinea isn’t representative of all modern “traditional societies”, let alone hunter-gatherers in all of human history. Maybe there is considerably more variety in terms of violence and many other attributes than Diamond lets on. Maybe he’s not even paying attention to the full range of anthropological or historical writing about New Guinea. Maybe Diamond isn’t even living up to his own stated interest in the variations between such societies.

2. Maybe modern hunter-gathering societies are not actually pristine, unchanging survivals of an earlier era of human history, but instead the dynamic consequence of large and small-scale migrations of agriculturalists and even more recently, industrial workers. At least in some cases, that might be why hunter-gatherers inhabit remote or marginal environments, not because of preference, but as a response to the sometimes-violent movement of other human societies into territories that they used to inhabit. Meaning taking whatever it is that they have been doing in the 20th Century (violence or otherwise) as evidence of what they’ve always done is a mistake.

3. Maybe defining violence or war in a rigorous, consistent, measurable and fully comparative way is much harder than Diamond or Pinker think it is.

4. Maybe between what Diamond calls a “traditional society” and modern “WEIRD” societies (Western, educated, industrialized, rich and democratic) there are lots of other models. Maybe “between” is the wrong term altogether since it implies that there’s a straight developmental line between “traditional society” and modernity, an old teleological chestnut that most anthropologists and historians would desperately like to get away from. I haven’t read very far yet into the book, but Diamond doesn’t seem to have any idea, for example, that there have been numerous societies in human history where there have been many connected communities sharing culture and language at high levels of population density and complexity of economic structure that have nevertheless not had a “state” in the usual sense. What are those? Also: maybe Diamond frequently confuses “traditional” and “premodern”. Much of the time when he says, “Well, we modern WEIRD people do X, ‘traditional societies’ do Y”, the “Y” in question would apply equally to large premodern states and empires.

Or to summarize: maybe Diamond is pushing way, way too hard for a clean distinction between two broadly drawn “types”: “traditional society” and “modern society”, and is distorting, misquoting, truncating or overlooking much of what he read (hard to tell what he read, since there’s no footnotes) to make the distinction come out right.
On Diamond (Not Again!) (Easily Distracted)
Anything that arranges human history as a matter of “stages” progressing neatly towards the modern is just factually wrong before we ever get to the troubled instrumental and ideological history of such schema. Yes, that includes most versions of dialectical materialism: the dogged attempts of some Marxist historians and anthropologists in the 1970s and 1980s to get everything before 1500 into some kind of clear dialectical schema long since crashed into either an assertion that there’s only been one general world-systemic polity ever in human history (the “5,000 year-old world system”) or that lots of variant premodern histories collapsed into a single capitalist world-system after 1500.

When scholars who see politics or culture or warfare or many other phenomena in granular and variable terms rise to object to strong generalizing or universalizing accounts, their first motive is an empirical one: it just isn’t like that. Human political structures didn’t ALL go from “simple tribes” to “early states” to “feudalism” to “absolutist centralization” to “nation-states” to “modern global society”. They didn’t even go that way in Western Europe, really. Certain kinds of structures or practices appeared early in human history, sure, and then recurred because they radiated out from some originating site of practice or because of parallel genesis in relationship to common material and sociobiological dimensions of human life. Other common structures and practices appeared later, sometimes because new technological or economic practices allow for new scales or forms of political life and structure. But there is a huge amount of variation that is poorly described by a linear relation. There are movements between large and small, hierarchical and flat, organized and anarchic, imperial and national, etc., which are not linear at all but cyclical or amorphous.
Particularism as a Big Idea

Here are a couple of articles that might back Diamond up, depending on your point of view:
Stone Age farmers lived through routine violence, and women weren't spared from its toll, a new study finds.

The analysis discovered that up to 1 in 6 skulls exhumed in Scandinavia from the late Stone Age — between about 6,000 and 3,700 years ago — had nasty head injuries. And contrary to findings from mass gravesites of the period, women were equally likely to be victims of deadly blows, according to the study published in the February issue of the American Journal of Physical Anthropology.

Linda Fibiger, an archaeologist at the University of Edinburgh in Scotland, and her colleagues focused on the late Stone Age, when European hunter-gatherers had transitioned into farming or herding animals.

Some mass graves unearthed from that time contained mostly males who had died in violent conflicts. As such, researchers had thought women were spared from conflicts due to their potential childbearing value, Fibiger told LiveScience.

But looking only at the aftermath of big, bloody conflicts can obscure the day-to-day realities of Neolithic farmers.

"It would be like only looking at a war zone to assess violence," Fibiger said. "That's not going to tell you what's going on in your neighborhood."
Battered Skulls Reveal Violence Among Stone Age Women (Live Science)
A young mother was burned alive in Papua New Guinea this week after townspeople accused her of being a witch.

According to multiple reports, Kepari Leniata, 20, was tortured and killed in front of a mob of hundreds in the town of Mount Hagen. The woman, stripped naked and covered in gasoline, was burned alive on a pile of trash by relatives of a young boy who had died earlier in the week. The relatives had accused Leniata of killing him with sorcery.
Accused 'Witch' Kepari Leniata Burned Alive By Mob In Papua New Guinea (Huffington Post)

And not related to Diamond, but to one of the few anthropologists more famous and more controversial than he is, here is an excellent story in the New York Times Magazine about Napoleon Chagnon:

How Napoleon Chagnon Became Our Most Controversial Anthropologist (New York Times)

Much of the controversy surrounding Chagnon and Darkness in El Dorado is explored in the documentary film "Secrets of the Tribe." I was able to watch it on Vimeo, but it has since been made private. Here is a review:
To interview the Yanomami, Padilha did what those before him had done: He paid them. "Everything is trade with the Yanomami," he explains. How did he get Chagnon to willingly revisit the allegations that forced the embattled anthropologist into early retirement? "I say I am making a film about science," Padilha explains. "Everyone thinks they are the good scientists and everyone else is doing bad science.

"The methodology of anthropology is flawed," Padilha continues. "Each anthropologist finds exactly the evidence to fit his paradigm. To destroy the data you have to destroy the person. Who cares how you feel about Einstein? Take his data to the lab and see if what he says holds up. No one ever said that about Einstein, but you get my point...Chagnon doesn't agree with Ken Good, so he says, ‘Oh, he married a teenager.'"

The cavalcade of bickering eggheads that Padilha created in the editing room is riveting, sometimes even funny. The interviews with the Yanomami, who describe entire villages of people dying, sexual abuse and the havoc wrought by anthropologists who traded information for steel axes and machetes, create a cumulative effect that can only be described as heartbreak. Watching archival footage of Yanomami: A Multidisciplinary Study (1968) and The Feast (1970), both shot by Asch during the joint Neel-Chagnon study on a measles vaccine, we learn that most of the people on film died shortly thereafter.

Anthropologists behaving badly is nothing new. Franz Boas, the father of American anthropology, asked Arctic explorer Robert Peary to bring him back "a middle-aged Eskimo, preferably from Greenland," for the American Museum of Natural History's live dioramas. Within eight months, four of the six Inuits Peary delivered had died of tuberculosis. Congolese pygmy Ota Benga lived at the museum and later at New York's Bronx Zoo before killing himself. Ishi, the last of the California Yahi Indians, lived at the University of California's Museum of Anthropology, and some of his remains were shipped off to the Smithsonian. Robert Flaherty--whose Nanook of the North unleashed a controversy in ethnographic filmmaking that continues today--fathered an Inuit son he later refused to acknowledge, or help. Even the ethically meticulous Margaret Mead admitted to having considered a sexual affair with one of the Samoans she was studying.

Today, anthropology is going through another round of soul-searching. Barbara Rose Johnston, who saw Secrets when it premiered at the 2010 Sundance Film Festival, invited Padilha and his film to the American Anthropological Association's annual meeting, held in November in New Orleans. "I think it is a trap," Padilha joked back in April. "Maybe they will try to kill me." The film, minus its director, became part of a panel exploring the ethics of the discipline and, in a move that cannot be coincidental, the AAA decided to drop the word "science" from its statement on long-range plans. "The thing is, I think that biology has a lot to do with behavior," Padilha says. "But the science is clumsy. Chagnon is an embarrassment to sociobiology. This film will help that."
Anthropologists Behaving Badly: Jose Padilha's 'Secrets of the Tribe' Does Some Digging of Its Own (Documentary.org). Incidentally, there is also a good brief debunking of Pinker, Chagnon, and other portrayals of the impoverished, warlike pre-agricultural past in Sex at Dawn, chapters 11-14. Indeed, as one of the articles cited above points out:
Before wading into this issue around Diamond’s book, I had not realized how much the idea of a warlike human nature had become a near religious dictum. And I must again note the irony that in his 1987 breakout article Worst Mistake in the History of the Human Race, Jared Diamond pinned warfare not on non-state societies, but on agriculture: “Forced to choose between limiting population or trying to increase food production, we chose the latter and ended up with starvation, warfare, and tyranny.”

Thursday, February 21, 2013

Growth!


Take a tour of China's "economic growth":
As growth slows, China's huge investment in infrastructure is looking ever harder to sustain, leaving a string of ambitious projects - towns, shopping malls and even a theme park - empty and forlorn. 
"We have spoken a lot about these ghost towns in Ireland and Spain recently [but China] is Ireland and Spain on steroids," says Kevin Doran, a senior investment fund manager at Brown Shipley in the UK. Investment in infrastructure accounts for much of China's GDP - the country is said to have built the equivalent of Rome every two months in the past decade. And with such a large pool of labour, it is harder to put the brakes on when growth slows and supply outstrips demand.  
"You have got seven to eight million people entering the workforce in China every single year, so you have to give them something to do in order to retain the legitimacy of the government," says Doran. "Maybe 10 or 15 years ago they were doing things that made sense - roads, rail, power stations etc - but they have now got to the point where it's investment for investment's sake."
China's ghost towns and phantom malls (BBC)

So now we know what we're doing with the last of the fossil fuels. Glad to know growth is all going to improved living standards like the economists tell us. Hopefully the Keystone pipeline will enable much more of this.

The Disneyesque castle and medieval ramparts of this theme park north of Beijing, conceived nearly 20 years ago, lie abandoned. Local farmers grow crops among the empty buildings. In the mid-1990s, developers had promised to build the largest amusement park in Asia, but the project got mothballed over a land rights dispute. The site does in fact attract visitors, according to locals quoted by Chinese media, but hardly the sort the developers had in mind - they are drawing students, photographers and artists from Beijing, apparently, in search of a "ruin culture".

RELATED: These are China’s weirdest monuments (io9)

Wednesday, February 20, 2013

Bright Lights, Big City

 
Dr Linnell, from the university's psychology department, carried out cognitive tests with the Himba tribe in Namibia in south west Africa - and also included a further comparison with young people in London. She found that the Himba tribesmen and women who had stayed in a rural, cattle-herding setting were much better at tests requiring concentration than members of the same tribe who had been urbanised and were living in towns and cities. The results for urbanised Himba were "indistinguishable" from the results of undergraduates taking the same tests in London, said Dr Linnell. The researchers suggest that people in an urban setting have too much stimulation, with an overload of sights and sounds competing for attention.

Concentration is improved when people's senses are aroused, says Dr Linnell, but if this becomes excessive it seems to have the opposite effect and reduces the ability to focus on a single task. As such the people living in cities were not as good at tests which required sustained focus and the ability not to be distracted. The rural living people were much better at such tests of concentration, even computer-based tasks, where they might have been expected to be less familiar with the technology.

This is not necessarily a case of being better or worse, says Dr Linnell, but it could be a reflection of what is needed to survive in an overcrowded urban setting. It is also not a "fleeting" impact, she suggests, as the tests show that urbanised people from this tribe have developed a different way of looking at events. "There are really quite profound differences as a function of how we live our lives," she says.

Another finding is that the Himba people who have moved to the city are more likely to be dissatisfied and show signs of unhappiness. In contrast the simpler, frugal life of the rural tribespeople seems to leave them with a greater sense of contentment.
City living 'makes it harder to concentrate' (BBC)

More unhappy and distracted, eh? So massive urbanization isn't a panacea? Too bad we're herding the populations of the third world into shantytowns and slums to provide the last wave of cheap globalized labor. Somebody call the Long Now Foundation.


Tuesday, February 19, 2013

UK Powerdown

Britain 'on the brink' of energy crisis, warns regulator (Telegraph):
The country would become more reliant on foreign gas to generate electricity as European Union pollution laws meant the dirtiest coal-fired stations had to shut, said Alistair Buchanan, the regulator’s outgoing head. He pointed out that gas was already 60 per cent more expensive in countries such as Japan that relied on imports. It was impossible to predict how high bills could go for British households, he said.  
Ministers admitted that Britain faced a “looming energy gap” but blamed the previous government for agreeing to shut coal plants too quickly. Nick Clegg, the Deputy Prime Minister, admitted that consumers could feel a “pinch” starting within two years. The Government was fighting to “keep the lights on”, he insisted.

Mr Buchanan warned that the “near-crisis” would occur between 2015 and 2018, pushing up bills. Blackouts were “not likely” but there would be a “double squeeze” on energy prices.

The UK would face “the horror” of greater reliance on gas just as its cost on world markets was expected to rise, he said. The average household energy bill is more than £1,400 following a series of increases in the past six months. Prices have risen by almost a fifth in the past four years. Mr Buchanan told the BBC that nuclear power, many wind farms and clean coal technology would not be available until after 2020.

“So we’ll lean on gas and gas will account for about 60 per cent of our power station needs instead of 30 per cent as it does today,” he said. “And in order to get hold of that gas we’re going to have to go shopping around the world. And just at the time that we’re tight on power stations, the world is going to go tight on energy gas prices. So you’ve got a double squeeze.”

He said it was very important for the Government to persuade people to use less energy.
So much for things getting back to "normal." Hopefully Transition Towns is up to the task. Maybe they should take a cue from Germany:
Imagine a town which no longer relies on fossil fuels or nuclear power, a place where residents reached into their own pockets to build their own energy grid, reaping the benefit of lower electric and heating prices from their investment. You are dreaming of Feldheim, a 100% energy independent town.
German town goes off the grid, achieves energy independence (Treehugger)

Speiseabfälle (Food Waste)

Greeks aren't the only ones eating out of dumpsters:
Fellmer is on a three-year-old "money strike": he does not earn or spend a euro and he, his wife and child eat only food that has been rescued from the trash.

A rangy 29-year-old in a baggy blue jumper with spiky blond hair and a pointed beard, he is already something of a German media phenomenon. On a recent visit, a TV documentary crew and a reporter from a local daily were crowded into his one-room flat.

He plonks on the table a packet of ginger biscuits for Christmas - from a batch of hundreds fished out of the garbage nearby - bearing a "use by" date which is still a month away. They taste fine, as do some red and gold-wrapped chocolate Santas.

The "use by" dates infuriate the foodsharers, many of whom were first inspired by the 2011 film "Taste the Waste" by their guru Valentin Thurm.

It documents waste ranging from farmers discarding tomatoes that are not red enough to bakeries burning the excess bread they made to keep the shelves looking full until closing time.

Fellmer's friend Schmitt was brought up in a "very food-conscious vegetarian household". His mother is a food chemist who advises him on hygienic ways to eat and share food from plastic sacks that he admits are sometimes "mushy" under your fingers in the dark.

Like Fellmer, he lives not in east Berlin, with its history of squats and communes, but in the leafy western suburb of Dahlem where he dumpster dives under the noses of the German capital's most affluent residents.

Foodsharing appeals to the "hipster" culture of Berlin with its tradition of anti-establishment protest, Schmitt said.

The German crowdsourcing techniques could turn out to be "best practice" for reducing waste in other countries too, said the FAO's Bucatariu.

"Solutions may vary according to the culture, the context and to what access to food there is," she said. "But each and every one of us can do something."
German dumpster divers get connected to wage war on food waste (Reuters)

Robofarm

Forget about farm jobs saving us from automation:
The property Kevin Liefer and his son, Kirk, cultivate in southern Illinois has been expanding for decades without adding a single manager. These are boom times for farming and a bust for farm jobs. The 3,600 acres of mostly corn, wheat and soybeans the Liefers hold were about 30 separate, individually operated farms more than 40 years ago, said Kevin. As families left, the homesteads near Red Bud, about 40 miles (64 kilometers) southeast of St. Louis, melded into one operation. Older tractors were replaced with models that cultivate more ground and serve as miniature offices, complete with global positioning systems that allow them to steer themselves. Mobile phones enable communication while in the fields.

“There’s so much more you can do now without as much labor,” said Kevin, 58. “The consolidation has been rapid.”

A U.S. farm boom showing few signs of a let-up isn’t translating into more opportunities in one of the most robust areas of the economy. Farmers, ranchers and other agricultural managers will see the steepest decline of any employment category by 2020, losing a projected 96,000 jobs this decade out of 1.2 million positions, part of a broader trend toward less labor in the sector, according to the Bureau of Labor Statistics.

The drop comes even as agricultural managers have the highest median wage of any of the top 20 declining categories, at more than $60,000 a year. Farm owners like the Liefers are able to manage larger tracts of land without hiring overseers. Full-time farm managers hired by others can handle more property for more clients, said Jerry Warner, a past president of the American Society of Farm Managers and Rural Appraisers, a farm-management organization based in Denver.

Many of the fastest-declining U.S. job categories result from industry contraction: post offices closing because of lower mail volume and textiles factories because of outsourcing, for example. Agriculture is an expanding sector with rising profits, even as overall employment, including laborers, is projected to drop 2.3 percent over this decade.

Total planted acreage has risen in seven of the past 10 years, the prices of corn and soybeans reached records last year, and profits for 2012 of $114 billion are estimated to be second only to 2011, even after the worst drought since the 1930s. Farmer debt is near its lowest in at least 60 years of record-keeping while land prices are at an all-time high. Still, all 291,000 of the farm-manager positions expected to become available in the decade up to 2020 -- net of those lost -- will be replacement jobs, according to the Labor Department.

Consolidation tells only part of the story. The number of U.S. farms, which fell by half in the three decades up to 1986, dropped by just 3.1 percent in the past quarter-century, according to the U.S. Department of Agriculture. About one-quarter have sales of less than $100,000 annually and don’t need a full-time manager, while larger farms are more easily overseen by one person, said David Anderson, an agricultural economist at Texas A&M University in College Station.
Record Profits No Job Creator on Farms as Owners Automate (Bloomberg)

Maybe we should grow some vegetables besides corn and soybeans:
Gardener Mike Alt has come up with the idea, and thinks that making gardens and urban farms more techy is a great way to help new farmers find more success and streamline the efforts of more experienced farmers. In the process, the urban gardening movement -- from postage stamp-sized backyard gardens to urban farms housed in vacant lots -- can get a leg up from the rapidly advancing Internet of Things.

From the Kickstarter campaign:

We've built a sophisticated, yet easy to use device that will help remove the guesswork for new farmers and provide automation and optimization features for those more experienced. The device is deployed in your farm or garden to monitor the key environmental conditions for improving your yield. This information is relayed back to HarvestGeek where you are provided detailed analysis. This device has affectionately taken the name HarvestBot around the shop...
HarvestGeek Automates Your Entire Garden (Treehugger)
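For what it's worth, the monitor-and-relay loop the pitch describes is conceptually simple. Here is a minimal sketch of that pattern; the sensor readings and the endpoint URL are hypothetical stand-ins, since the actual HarvestGeek hardware and API aren't documented in the article.

```python
import json
import random
import time
import urllib.request

ENDPOINT = "https://example.com/api/readings"  # placeholder, not the real API

def read_sensors():
    """Stand-in for real probes: soil moisture, air temperature, humidity."""
    return {
        "soil_moisture": random.uniform(0.2, 0.6),  # fraction of saturation
        "air_temp_c": random.uniform(10.0, 30.0),
        "humidity_pct": random.uniform(30.0, 90.0),
        "timestamp": time.time(),
    }

def post_reading(reading):
    """Relay one reading to the server that does the detailed analysis."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(reading).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    while True:
        post_reading(read_sensors())
        time.sleep(300)  # sample every five minutes
```

The real device presumably adds the automation half - switching pumps and lights when readings cross thresholds - but a sense, log, and analyze loop like this is the core of it.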

Monday, February 18, 2013

Desperate Measures

From the BBC this weekend, a story of an expedition that is diving to the bottom of the ocean to search for the next generation of antibiotics. The 8 million pound (12 million dollar) expedition is going to trawl the sediment of the sea bed to look for bacteria and fungi that have not already been exploited:
The inappropriate prescribing of antibiotics - and an over-reliance on the drugs - has led to a rapid increase in resistant bugs and medical experts fear effective antibiotics might soon run out completely. In January, Chief Medical Officer for England, Dame Sally Davies, compared the threat to global warming and said going for a routine operation could become deadly due to the risk of untreatable infection.

Project leader Marcel Jaspars, professor of chemistry at the University of Aberdeen, said: "If nothing's done to combat this problem, we're going to be back to a 'pre-antibiotic era' in around 10 or 20 years, where bugs and infections that are currently quite simple to treat could be fatal."

He said there had not been a "completely new" antibiotic registered since 2003 - "partially because of a lack of interest by drugs companies as antibiotics are not particularly profitable". "The average person uses an antibiotic for only a few weeks and the drug itself only has around a five to 10-year lifespan, so the firms don't see much return on their investment."
Antibiotics search to focus on sea bed (BBC)

This is to me an ultimate example of what we've been saying here for years - that we're going to desperate lengths to do the things that were once relatively easy and that we took for granted. We will not be able to do this forever. If spending eight million pounds diving to the bottom of the ocean to look for new antibiotics isn't a sign of the increasingly desperate lengths we have to go to to keep this society going, I don't know what is. I'm sure the mainstream will spin it as a sign of human ingenuity. What it really is, is a sign that we're up against it. Note, too, how the everything-for-profit system is endangering humanity yet again.

I'm sure you recall the story of the accidental discovery of penicillin - how mold found its way into a petri dish and killed off some of the bacteria. Antibiotics were used extensively during World War Two and saved a lot of lives that otherwise would have been lost (throughout history, infection killed more of the wounded than the fighting itself). See the story here. After antibiotics became mass-produced, they saved millions of lives, but were also massively overexploited for profit in yet another tragedy of the commons. Four times as many antibiotics are given to animals as to sick people:


Antibiotics and Antibiotic-Resistant Bacteria in Meat: Not Getting Better (Wired)

This is what we've been warning about for so long - going to ever more dramatic lengths is a losing proposition in the long run. The technical term is intensification. As large mammals died off during the Pleistocene extinctions, we turned to growing cereal crops and herding. In the nineteenth century right whales were hunted to near extinction in a desperate bid to get enough whale oil, elephants were nearly driven extinct for their ivory tusks, and nitrogen-rich guano from the droppings of sea birds was strip-mined on islands in the Pacific Ocean. Of course, we know what happened in all of these instances - we substituted fossil fuels for all of these scarce resources. But how much longer can we do this? It's the classic Red Queen's race - running faster and faster just to stay in the same place.

And the analogy is perfect with fossil fuels themselves - we're taking ever more desperate measures, drilling down below the sea floor, strip-mining the tar sands of Canada, hydraulic fracturing and horizontal drilling to look for "tight oil", etc., to get at the last of the fuels that were once trivial to access:
On Sakhalin Island, in Russia’s far east, temperatures can fall to 35 degrees below zero. Many islanders herd reindeer. And in January, oil crews drilled the world’s longest and deepest extended-reach well, 7.7 miles down into the ground and 7.1 miles out under the ocean. Seven of the 10 longest oil wells on Earth have been drilled there since Exxon Mobil launched its Sakhalin-1 project in 2003. Crews expect to keep breaking their previous records in the coming months.

The seven-story oil rig at Sakhalin, nicknamed Yastreb (the Hawk), is the industry’s most powerful, with four 7,500-psi mud pumps, 14,000 barrels of liquid mud storage and six generators. It has two walls to help it withstand the cold and earthquakes, which are frequent. The Yastreb’s drill torque is approximately 91,000 foot-pounds (a pickup truck operates with about 200).

Extended-reach drills travel both outward and down. To control the position and angle of the wellbore, drilling engineers use magnetometers and inclinometers; the information the tools gather is sent back by pressure pulses in the drilling fluid, which the engineers then analyze at the surface. The team - about 800, mostly Russians - pre-maps each expedition using 3D seismic imagery to create visual models of the conditions in the rock and the locations of the oil reservoir. They can reach their target with an accuracy of just a few feet. It’s as if they were standing in the middle of Central Park and drilled down to a specific doorway of the New York Stock Exchange.
At The End Of The Earth: The Longest, Deepest Oil Wells In The World (PopSci).
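The survey math behind that last claim is public and fairly compact. Here is a sketch of the standard minimum curvature calculation directional drillers use to turn inclinometer and magnetometer readings at successive survey stations into a wellbore position. The formulas are the textbook ones, but the sample stations are invented for illustration.

```python
import math

def min_curvature(md1, inc1, azi1, md2, inc2, azi2):
    """Position change (north, east, vertical) between two survey stations.

    md is measured depth along the hole, inc is inclination from vertical,
    azi is azimuth from north; angles in degrees, depths in feet or meters.
    """
    i1, i2 = math.radians(inc1), math.radians(inc2)
    a1, a2 = math.radians(azi1), math.radians(azi2)
    # Dogleg: total angle change of the hole between the two stations.
    cos_dl = math.cos(i2 - i1) - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1))
    dogleg = math.acos(max(-1.0, min(1.0, cos_dl)))
    # Ratio factor: smooths the two readings into a circular arc.
    rf = 1.0 if dogleg < 1e-9 else (2 / dogleg) * math.tan(dogleg / 2)
    half = (md2 - md1) / 2 * rf
    north = half * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2))
    east = half * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2))
    tvd = half * (math.cos(i1) + math.cos(i2))
    return north, east, tvd

# Invented example: 100 ft of nearly horizontal hole turning slightly right.
print(min_curvature(38000, 85.0, 45.0, 38100, 86.0, 46.0))
```

Sum the result over every pair of stations in a survey and you get the positional bookkeeping that lets a crew hit "a specific doorway" miles from the rig.

And yet it's still not enough: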
Technically, the world isn’t even producing enough oil to keep pace with the rise in global incomes. Oil supply has risen by 2.3 percent since 2010. But the world economy has grown by 7.1 percent since then. The only reason that oil prices haven’t soared to record highs, Hamilton points out, is that countries have been undertaking new conservation measures. Americans, for instance, are buying more fuel-efficient cars in droves.

Granted, oil prices would almost certainly be even higher than they are now without the drilling boom over the past two years in places like North Dakota. But at this point, the extra drilling is struggling to keep up with the pace of global economic growth.

Most forecasters expect that to be the case for years to come. The International Energy Agency recently projected that U.S. oil production would continue rising through 2020 and beyond, as companies extract more “unconventional” oil from shale rock and other sources. But global demand was also expected to rise 35 percent between now and 2035, with China on pace to become the largest oil consumer in the world in the next two decades.
The boom in U.S. oil drilling hasn’t lowered gas prices (Washington Post)

 
Heck, we're even having to dilute our bourbon and spike our hamburgers with horsemeat. What's it going to take to get the message? Intensification does not lead to prosperity in the long run, just a bigger disaster.

Saturday, February 16, 2013

Power, Ponerology, and Party Politics

Via Fabius Maximus, here's a good blog entry that sums up our predicament pretty well, Political failure modes and the beige dictatorship by Charlie Stross. It has the courage to state what many people are afraid to admit - representative democracy has failed:
For a while I've had the unwelcome feeling that we're living under occupation by Martian invaders. (Not just here in the UK, but everyone, everywhere on the planet.) Something has gone wrong with our political processes, on a global scale. But what? It's obviously subtle — we haven't been on the receiving end of a bunch of jack-booted fascists or their communist equivalents organizing putsches. But we've somehow slid into a developed-world global-scale quasi-police state, with drone strikes and extraordinary rendition and unquestioned but insane austerity policies being rammed down our throats, government services being outsourced, peaceful protesters being pepper-sprayed, tased, or even killed, police spying on political dissidents becoming normal, and so on. What's happening?
Here's a hypothesis: Representative democracy is what's happening. Unfortunately, democracy is broken. There's a hidden failure mode, we've landed in it, and we probably won't be able to vote ourselves out of it.
He then points out several salient facts:
Our representative systems almost all run on a party system; even pure PR systems like that of Israel rely on a party list...Parties are bureaucratic institutions with the usual power dynamic of self-preservation, as per Michels's iron law of oligarchy: the purpose of the organization is to (a) continue to exist, and (b) to gain and hold power.

Per Michels, political parties have an unspoken survival drive. And they act as filters on the pool of available candidates. You can't easily run for election — especially at national level — unless you get a party's support, with the activists and election agents and assistance and funding that goes with it...Existing incumbent representatives have an incentive to weed out potential candidates who are loose cannons and might jeopardize their ability to win re-election and maintain a career.

A secondary issue is that professionals will cream amateurs in any competition held on a level playing field. And this is true of politics as much as any other field of human competition...The emergence of a class of political apparatchik in our democracies is almost inevitable.
His conclusion:
Overall, the nature of the problem seems to be that our representative democratic institutions have been captured by meta-institutions that implement the iron law of oligarchy by systematically reducing the risk of change. They have done so by converging on a common set of policies that do not serve the public interest, but minimize the risk of the parties losing the corporate funding they require in order to achieve re-election. And in so doing, they have broken the "peaceful succession when enough people get pissed off" mechanism that prevents revolutions. If we're lucky, emergent radical parties will break the gridlock (here in the UK that would be the SNP in Scotland, possibly UKIP in England: in the USA it might be the new party that emerges if the rupture between the Republican realists like Karl Rove and the Tea Party radicals finally goes nuclear), but within a political generation (two election terms) it'll be back to oligarchy as usual.
So the future isn't a boot stamping on a human face, forever. It's a person in a beige business outfit advocating beige policies that nobody wants (but nobody can quite articulate a coherent alternative to) with a false mandate obtained by performing rituals of representative democracy that offer as much actual choice as a Stalinist one-party state. And resistance is futile, because if you succeed in overthrowing the beige dictatorship, you will become that which you opposed.
This is interesting, in that historically, every time it's been tried, representative democracy tends to break down over time (Athens, Rome, etc.). Is oligarchy/feudalism the "natural state" of large-scale human societies? We've had a good run here in America, but it looks like we're nearing the end. Many have pointed out the similarities between the modern-day U.S. and the fall of the Roman Republic, for example, see this: Is the American Republic dying, as in the last days of the Roman Republic? (Fabius Maximus). From the article cited in the post:
The endemic corruption in the Roman state was another source of instability.  Corruption was illegal, but Rome lacked a formal criminal justice system with state prosecutors.  Instead, any Roman citizen could bring a case against any other in a relevant court.  Corruption prosecutions became a tool for settling political feuds or for winning over supporters. The famed Roman orator and politician, Cicero, first established himself as a major player due to his successful prosecution of Gaius Verres for his corruption as governor of Sicily. The threat of private prosecution for official corruption became a tool for political competition, and yet it was almost impossible not to be corrupt because the cost of running for office increased so dramatically during the 1st Century BC so that even wealthy men and families needed some way to recoup the costs.
As mentioned already, these problems were widely understood.  And yet the Roman state was unable to act.  Three dynamics combined with poorly designed state institutions to lead to paralysis.  The first dynamic was that within the Roman body politic, there was a small, but solid group of aristocrats who were ideologically opposed to reform.  Their argument, in short, was that it was a mistake to mess with success.  Roman institutions and values had led to greatness, and so the answer to any and all problems simply required a “return” to traditional Roman values.  This led to occasional anti-sumptuary laws and various efforts to outlaw corruption, but also to a determined resistance to any sort of structural reform such as land reform or multi-year terms for provincial governors.  The second dynamic was that even among reformers there was a great deal of concern about letting anyone get the “credit” for solving a problem.  Whoever managed to pass land reform, for instance, would instantly gain a wide following among those new small farmers given land to work.  Third, there was a relatively small number of wealthy aristocrats, some of whom had managed to effectively privatize public lands and were unwilling to allow distribution of those lands, even if as a matter of law they didn’t actually have title to them anyway.
Perhaps you recognize these opponents to reform.  In Rome, they were called the Optimates (“Best Men”), today we call them Republicans.  Same coalition of ideologues, political hacks, and the wealthy.  Same recognition of the existence of problems, and same empty solution: a call for a return to traditional values that made the nation great.  They were opposed by a loose coalition of Populares (“favoring the people”) made up of a combination of radicals and pragmatists, and often prone to turning on each other or making deals with the Optimates.  Many, though not all, of them favored “comprehensive” reforms aimed at addressing structural problems.  And indeed, some were probably little more than political opportunists.  Anyway, we call them Democrats today.  And like the Democrats of today, the Populares were largely ineffective. Aristocrats (“Patricians”) were mostly counted among the Optimates, but not exclusively, while commoners (“Plebeians or Plebs”) were often among the Populares, but also not exclusively.  Gaius Julius Caesar, a member of one of the oldest Patrician families, was a Populari, whereas Cicero, a Plebeian and a “new man”, became an Optimata.
But it's worse than that. As the Polish psychiatrist Andrzej Łobaczewski pointed out in his work on Ponerology, the most sociopathic people are the ones attracted to positions of power. I'm sure we can all personally vouch for this. Here's a good summary from a commenter at Naked Capitalism:
In Political Ponerology: Science of the Nature of Evil Adjusted for Political Purposes Andrew M. Lobaczewski postulates a process which he calls “ponerization” by which ideological movements become corrupted. As he explains:

In the ponerogenic process of the pathocratic phenomenon, characteropathic individuals adopt ideologies,… recast them into an active propaganda form, and disseminate it with their characteristic pathological egotism and paranoid intolerance for any philosophies which may differ from their own.

“Characteropathic individuals” Lobaczewski defines as “individuals who are insufficiently critical, frequently frustrated as a result of downward social adjustment, culturally neglected, or characterized by some psychological deficiencies of their own.”

As the ponerization process proceeds, those who have been “injured by social injustice” get purged by the “essential psychopaths.” “From this time on,” Lobaczewski warns, “using the ideological name of the movement in order to understand its essence becomes a keystone of mistakes.” Eventually the essential psychopaths, which he dubs “pathocrats,” achieve absolute domination of the movement, and if the movement is successful in taking control of the government, this results in a state he calls “pathocracy.”

Since Lobaczewski lived in Communist Poland, he uses the vitiation of Marxism as his example. But he says that all types of ideological movements—-social, political, and religious—-have fallen victim to ponerization.
And here are some key quotes (emphasis mine):
The actions of this phenomenon affect an entire society, starting with the leaders and infiltrating every village, small town, factory, business, or farm. The pathological social structure gradually covers the entire country, creating a “new class” within that nation. This privileged class of deviants feels permanently threatened by the “others”, i.e. by the majority of normal people. Neither do the pathocrats entertain any illusions about their personal fate should there be a return to the system of normal man.

A normal person deprived of privilege or high position will go about finding and performing some work which will earn him a living; but pathocrats never possessed any solid practical talent, and the time frame of their rule eliminates any residual possibilities of adapting to the demands of normal work. If the laws of normal man were reinstated, they and theirs could be subjected to judgment, including a moralizing interpretation of their psychological deviations; they would be threatened by a loss of freedom and life, not merely a loss of position and privilege. Since they are incapable of this kind of sacrifice, the survival of a system which is the best for them becomes a moral imperative. Such a threat must be battled by means of any and all psychological and political cunning implemented with a lack of scruples with regard to those other “inferior-quality” people that can be shocking in its depravity.

[….]

Any war waged by a pathocratic nation has two fronts, the internal and the external. The internal front is more important for the leaders and the governing elite, and the internal threat is the deciding factor where unleashing war is concerned…

The non-pathological majority of the country’s population will never stop dreaming of the reinstatement of the normal man’s system in any possible form. This majority will never stop watching other countries, waiting for the opportune moment: its attention and power must therefore be distracted from this purpose, and the masses must be “educated” and channeled in the direction of imperialist strivings. This goal must be pursued doggedly so that everyone knows what is being fought for and in whose name harsh discipline and poverty must be endured. The latter factor—-creating conditions of poverty and hardship—-effectively limits the possibility of “subversive” activities on the part of the society of normal people.
Which brings us to this post meditating on the revenge manhunt for Christopher Dorner and the push-button drone assassinations in Afghanistan: Is This Fucking Afghanistan?!?! (Power of Narrative). The conclusion:
Like I said, I'm totally stupid, but I think the lesson goes something like this: If you threaten the State itself or interfere with its plans, and especially if you threaten one of the State's armed, militarized branches, the State will go to war, including here at home. I don't mean "go to war" metaphorically. I mean the State will go to war. So I think we know what the response will be if/when there is widespread domestic unrest, when the economy further falls apart, for example. Entire neighborhoods cordoned off, whole blocks incinerated, checkpoints everywhere, people shot for any reason, or for no reason. Oh, yeah: no more fourth amendment for you, either (via). Well, it's not like you were doing anything with it. If you haven't done anything wrong, you don't have anything to hide, etc., etc., etc. And just to be safe, you don't want to be there. Yeah, where you are right now. Don't be there. And don't move. Haha. Gotcha!

Just as the fish rots from the head, the horrors start with the president. Why should Obama be the only one to have a Murder Program? It's only right that every police department should have its own Murder Program. Especially in war time.

What we're now seeing is further proof of an argument I've made for the last several years. It can be stated very briefly:

In addition to pursuing its goal of global hegemony, the United States government uses foreign countries as a lethal laboratory in which to practice the techniques it intends to use domestically, at home within U.S. borders.

Welcome to Afghanistan.

This is just the beginning.
Representative democracy, intended to safeguard against exactly this, has failed utterly. All the major corporate states (the United States, Russia, China, and much of Europe) are under the control of a business-financed oligarchy that ruthlessly suppresses any dissent even while holding "elections," while the mass of citizens suffers an ever-declining standard of living under financialized rentier capitalism, with no real alternative on offer. Was this the end of history that was predicted?

Friday, February 15, 2013

Outer Space

Everybody's talking about the meteor that hit Russia yesterday. Fortunately, everyone in Russia has dashboard cams. Why?
According to a report last year by Al Jazeera, an estimated one million Russian motorists have dashboard video cameras installed in their cars. This is not to capture moments like the meteor flight or even miraculous survivals of horrifying highway crashes. No, Al Jazeera reported that the cams are there to help stamp out police corruption.

New York Blogger Marina Galperina, who originally hails from Russia, wrote a fascinating piece last year about Russia's dash cam culture. Galperina called the cams "Russia's last hope for civility and survival on the road." The country's roads are "perilous" and, she wrote, "psychopaths are abundant." However Galperina also points to police and government corruption as a driving force behind the Russian dash cam explosion.

"The Russian Highway Patrol is known throughout their land for brutality, corruption, extortion and making an income on bribes. Dash-cams won’t protect you from being extorted for cash, because your ass shouldn’t have been speeding. It will however keep you safer from drunks in uniform, false accusations and unreasonable bribe hikes."

Galperina also notes that along with all the posts on YouTube, numerous Russian Web sites have sprung up to host videos of these uncut crashes and accidents and many of the videos are horrifying. It's not all bad, however. "There are moments of humanity among the crashes, in between the skidding, the burning, the kicking," wrote Galperina, "There are dash-cam videos with happy endings."
Here's Why So Many Russians Have Dash Cams (Mashable)

Of course this is very reminiscent of the 1908 Tunguska explosion. It must be the vastness of Russia that puts it in the way of all these meteors. Then again, maybe that one was caused by Tesla's death ray. Did the U.S. learn something when they raided his files?
Russian nationalist lawmaker Vladimir Zhirinovsky, long known for his flamboyance and outrageous remarks, said Friday that meteorite fragments had not rained down on Russia in the morning, but that the light flashes and tremors in several of the country’s regions resulted from US weapons tests, APA reports quoting RIA Novosti.
“Those aren’t meteors falling, it’s the Americans testing new weapons,” Zhirinovsky, leader of the Liberal Democratic Party, told journalists several hours after the Emergencies Ministry began issuing statements on the incident, which has injured hundreds and damaged scores of buildings.
He also said US Secretary of State John Kerry had wanted to warn Russian Foreign Minister Sergei Lavrov about the “provocation” on Monday, but couldn’t reach him – a reference to US State Department comments earlier this week that Kerry had spent several days trying to speak to Lavrov by phone to discuss North Korea and Syria.

Then of course there is also the asteroid near-miss that happened at the same time. Maybe we need to call Sean Connery:


Or else Jackie Chan:



Wednesday, February 13, 2013

Merger Mania

Lately I've noticed a trend that seems to be growing. I wasn't planning on writing about this, but today I see this headline: American, US Airways boards OK merger to create world's largest airline, meaning I guess the universe is telling me to mention it.

One of the few perks of being an architect is that building products companies will buy everyone in your firm lunch while they do a presentation about their products. It's called a "lunch and learn," and it's written off by companies as a business expense (whether it should be is another matter). They also typically combine the lunch with a lecture session through which you can earn the continuing education credits you need to keep your license.

We typically have one of these a week, and over the last few years one trend has been so pronounced as to be impossible to ignore: in nearly every product area, multiple companies have merged into just one or two giant conglomerates. It's fair to say that if there are a handful of major competitors in any market, before long most of them will be combined into one. Now, it's doubtful that people who don't interact with the construction industry are aware of this trend, or care. Certainly the people drooling in front of FOX News won't hear about it. But it's one of the many unheralded stories taking place outside of a media spotlight focused on political infighting, show business, hot-button social issues, and sensationalism rather than information.

So, for example, let's say the market for, I don't know, bricks, is dominated by BrickCo Brick Company, The Bricksman Corporation, and Butter-and-Stack Industries. You may have dealt with all these in the past, and even heard presentations from all of them at some point. But today, you'll be hearing from the BrickCo-Stack-Bricksman Corporation, and all the wonderful new products they have on offer.

It seems like every lunch presentation now starts off with a list of mergers. To be specific, today's lunch and learn was from Clark Dietrich Building Systems, makers of fine steel studs and related products. I wasn't paying enough attention to get this exactly right, so forgive the inaccuracies, but the presenter said something about 11 companies in 2001 merging down to just two (ClarkWestern Building Systems and Dietrich Metal Framing), finally merging together in 2011 to form Clark Dietrich. If you're interested in more details of these mergers, they have a handy chart on their Web site.

And they're hardly alone. It really doesn't shock us anymore when our product representatives suddenly work for a different company without ever changing jobs. These kinds of mega-mergers are the norm. Suddenly, two products which you might have specified as competitors end up owned by the same company. These deals may not be as noticeable as American and US Airways, but they are happening everywhere, and at a faster rate than ever.

The other trend is the "octopus company," the one company that has about a hundred different brands. You think you're buying from different companies, but all your money's flowing to the apex of the pyramid. Exhibit A: Ingersoll Rand. Hop on over and look at how many brand names they control. Exhibit B: Georgia-Pacific. Hop on over there to see how much of the construction industry ends up in the pockets of the Koch Brothers. Exhibit C: RPM International. I didn't even realize how many brands were ultimately owned by them until just now. Or how about SikaSarnafil? Yes, that also used to be two companies; now the combined brand is owned by the Sika Group out of Switzerland. That's just off the top of my head. I'm willing to bet that most people reading this have never heard of any of those companies. Yet their products are in some way a part of your life and the world around you.

These companies straddle the globe, and ownership truly knows no borders. All the money from the thousands of products these brands sell all over the world ultimately ends up in the hands of a small elite of international investors. And don't hand me the line about grandma's pension fund.

So what's the point I'm trying to make? What does all this consolidation mean? I'm glad you asked. Simple logic tells us that it means 1.) a concentration of wealth in the hands of fewer and fewer people, 2.) more control over the market and less competition, and 3.) drastically reduced job opportunities as competitors are forced out. And what are the major trends in the world today? Greater and greater wealth in fewer and fewer hands, and less and less employment for the masses. Now you can understand why it's happening. This is the thesis of Barry C. Lynn's book Cornered: The New Monopoly Capitalism and the Economics of Destruction. It's one thing to read about it in the abstract. But it's quite another when you see it firsthand every week while you're eating your Potbelly or Jimmy John's.

The massive consolidation in all sectors of the economy is a trend that's not getting enough attention. Why is it happening? I think it's simple: as the economy slows down and the rate of profit declines, companies merge to keep profits high by controlling entire sectors of the economy. That explains an often-cited contradiction: if the rate of profit is declining, why are overall corporate profits at all-time highs? Because of mergers and consolidation. The profits from ten little companies are all rolled into one giant one. Unfortunately for workers, the jobs that those ten companies might have created are now created by a single company. And the profits and the jobs are global, producing a small caste of international winners even as entire nations fall into disrepair. In my opinion, this is a major factor that's being ignored.

What are the perils and the promise? Perhaps entire sectors of the economy consolidating into single companies means it will be easier for workers to seize control, as Marx predicted. After all, once competition is eliminated, why do these companies even need to be private? Or perhaps they will take on the role of medieval lords in a new variety of corporate neofeudalism, where your rights depend exclusively on your recourse to these powerful entities rather than being guaranteed to you as the citizen of a nation-state.

I guess I learned more from lunch and learns than I was expecting.

Monday, February 11, 2013

When Work Doesn't Pay

There is a persistent myth that a skills shortage, or not enough education, is the reason wages are so low and jobs are going unfilled. It's one of the biggest crocks we're being sold (and that's saying a lot). Wages no longer cover the cost of employee training, which must instead be borne by the employee and/or the state; most jobs are low-wage (yet require a degree of some sort just to apply); and jobs go begging rather than employers training people or hiring anyone without a degree. Work no longer guarantees a decent standard of living for many people. What kind of economy cannot pay its workers enough to live on? Which economic theory accounts for that? Even slave owners had to take care of their slaves. Clearly, fiddling with interest rates and tax levels is not going to fix this. It looks like the Iron Law of Wages was less of a myth than we thought.
Last night 60 Minutes did a segment about manufacturing companies that can't find good entry-level workers even though millions of people are out of jobs. While reporter Byron Pitts was reading the intro, I threw my shoe at the TV (figuratively) and told Marian acidly that it would almost be worth watching the segment just to see what idiocy they were going to promote this time around. I was just about to switch back to the Bears-Texans game, but it turned out Marian wanted to watch the segment, so I ended up watching it too. About halfway through, after describing a local community college in Reno that trains students to run complex, computer-controlled machines, we got this:

    "Most of the students here will start at jobs paying 12 dollars an hour."

At that point I jeered. You're wondering why you can't get highly qualified applicants for 12 bucks an hour? Spare me.
Companies That Want Better Workers Need To Pay More (Mother Jones)
Friedman tells readers:

"Welding 'is a $20-an-hour job with health care, paid vacations and full benefits,' said Tapani, but 'you have to have science and math. I can’t think of any job in my sheet metal fabrication company where math is not important. If you work in a manufacturing facility, you use math every day; you need to compute angles and understand what happens to a piece of metal when it’s bent to a certain angle.'

Who knew? Welding is now a STEM job — that is, a job that requires knowledge of science, technology, engineering and math."

The obvious problem in this story is that Tapani apparently doesn't understand that you have to pay more money to get highly skilled workers. If the minimum wage had risen in step with inflation and productivity since the late sixties, it would be almost $20 an hour today. Back in the late sixties, a typical minimum wage worker would have a high school degree or less. Now, according to Friedman, we have CEOs who think that they can get highly skilled workers at the same productivity-adjusted wage as someone who would have had limited literacy and numeracy skills 45 years ago. If we applied the same standard to doctors, they would be averaging around $100,000 a year today (instead of around $250,000). If employers really do have such a poor understanding of how markets work, then it will certainly be a serious impediment to economic growth in the years ahead.
Thomas Friedman Says That Our Economy Is Being Killed By Employers Who Can't Do Arithmetic (CEPR)
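If you want to check Baker's arithmetic yourself, here's a minimal back-of-the-envelope sketch. The 1968 starting wage and the two multipliers below are my own rough assumptions, not Baker's exact inputs:

```python
# Rough sketch of the "minimum wage kept pace with inflation and
# productivity" claim. All three inputs are approximations.

MIN_WAGE_1968 = 1.60           # federal minimum wage in 1968, $/hour (assumption)
INFLATION_MULTIPLIER = 6.7     # approx. consumer price growth, 1968-2013 (assumption)
PRODUCTIVITY_MULTIPLIER = 1.9  # approx. growth in output per hour worked (assumption)

inflation_only = MIN_WAGE_1968 * INFLATION_MULTIPLIER
with_productivity = inflation_only * PRODUCTIVITY_MULTIPLIER

print(f"Adjusted for inflation alone:          ${inflation_only:.2f}/hour")
print(f"Adjusted for inflation + productivity: ${with_productivity:.2f}/hour")
```

With those inputs the first figure comes out around $10.70 and the second lands right around Baker's "almost $20 an hour."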
And yet, even as classes like Goldenberg’s are filled to capacity all over America, hundreds of thousands of U.S. factories are starving for skilled workers. Throughout the campaign, President Obama lamented the so-called skills gap and referenced a study claiming that nearly 80 percent of manufacturers have jobs they can’t fill. Mitt Romney made similar claims. The National Association of Manufacturers estimates that there are roughly 600,000 jobs available for whoever has the right set of advanced skills.

Eric Isbister, the C.E.O. of GenMet, a metal-fabricating manufacturer outside Milwaukee, told me that he would hire as many skilled workers as show up at his door. Last year, he received 1,051 applications and found only 25 people who were qualified. He hired all of them, but soon had to fire 15. Part of Isbister’s pickiness, he says, comes from an avoidance of workers with experience in a “union-type job.” Isbister, after all, doesn’t abide by strict work rules and $30-an-hour salaries. At GenMet, the starting pay is $10 an hour. Those with an associate degree can make $15, which can rise to $18 an hour after several years of good performance. From what I understand, a new shift manager at a nearby McDonald’s can earn around $14 an hour.

The secret behind this skills gap is that it’s not a skills gap at all. I spoke to several other factory managers who also confessed that they had a hard time recruiting in-demand workers for $10-an-hour jobs. “It’s hard not to break out laughing,” says Mark Price, a labor economist at the Keystone Research Center, referring to manufacturers complaining about the shortage of skilled workers. “If there’s a skill shortage, there has to be rises in wages,” he says. “It’s basic economics.” After all, according to supply and demand, a shortage of workers with valuable skills should push wages up. Yet according to the Bureau of Labor Statistics, the number of skilled jobs has fallen and so have their wages.
Skills Don't Pay The Bills (New York Times)
A front-page article in the Wall Street Journal presents a fascinating mystery: Despite persistent high unemployment, some employers are having a tough time filling jobs.

"In Bloomington, Ill., machine shop Mechanical Devices can't find the workers it needs to handle a sharp jump in business. Job fairs run by airline Emirates attract fewer applicants in the U.S. than in other countries. Truck-stop operator Pilot Flying J says job postings don't elicit many more applicants than they did when the unemployment rate was below 5 percent."

What gives? Employers these days seem taken aback when highly qualified, experienced people fail to rush to apply for the openings they post. The article supplies several possible explanations: For jobs that require specialized skills, there simply might not be enough qualified applicants; employees accustomed to working at higher-paying office jobs aren't eager to take lower-paying jobs at truck stops and restaurants; some of the unemployed might prefer collecting a few hundred dollars per week in unemployment benefits, while they last, to working a job that pays $8 per hour.

But the Journal article seems to overlook one important factor. Even in an age of historic underutilization of the labor force, the laws of supply and demand apply. Hiring is a negotiation between employers and employees over the terms at which they'll agree to come to work—wages, benefits, working conditions, length of commute, relocation requirements. Maybe some of these employers just aren't offering terms that are good enough.

At  "Mechanical Devices, which supplies parts for earthmovers and other heavy equipment to manufacturers such as Caterpillar Inc., part owner Mark Sperry says he has been looking for $13-an-hour machinists since early last year," the Journal reports. Thirteen dollars an hour, 40 hours per week, 50 weeks per year—that comes to an annual salary of $26,000. (In the adjacent Peoria, Ill., metropolitan area, per capita income was $39,965 in 2008.) Or take Emirates Airlines. When it held jobs fairs in cities like Miami and Houston, only about 50 people showed up, "compared to a global average of about 150 and as many as 1,000 at some events in Europe and Asia." The jobs don't require much in the way of education, and they come with benefits, free accommodations, and a starting salary of $30,000. But you'd have to move halfway around the world, to Dubai—an alien and expensive place. Would you uproot yourself and your family for $30,000 a year? Don't you think both of these employers would find many more interested applicants if they offered higher wages?
Is Any Job Better Than No Job? (Slate)
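The annualization the article does in passing is worth making explicit, since it applies to every offer in these stories. A minimal sketch, using only the figures quoted above:

```python
# Annualize an hourly wage the way the article does:
# hourly rate x 40 hours/week x 50 weeks/year.
def annual_salary(hourly_rate, hours_per_week=40, weeks_per_year=50):
    return hourly_rate * hours_per_week * weeks_per_year

print(f"$13/hour machinist: ${annual_salary(13):,.0f}/year")  # $26,000
# For comparison, per the article: Peoria-area per capita income was
# $39,965 in 2008, and Emirates' starting salary was $30,000 a year.
```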
While there is still an upper tier of positions that require a college education and in many cases, advanced degrees, the bulk of employment growth in this economy is in badly paid service jobs. Despite the fact that the pundit class keeps wailing how America isn’t growing enough high skilled workers to compete in the world economy, the evidence is otherwise. For instance, Gene Sperling will regularly contend that America needs more engineers. Yet engineering jobs don’t pay enough to reward the cost of getting that degree. I’ve had engineers regularly say in comments that the only way to do well with an engineering degree is to then get a law degree and become a patent/intellectual property attorney. If the US really does need more engineers for competitiveness reasons, then it needs to get the cost of their education down, much the way it subsidizes the cost of educating elite mathematicians and physicists.
70% Of Jobs "Created" Don't Require A College Education (Naked Capitalism)
“How,” I asked her after I read the report, “is this anything except just depressing?” As economic struggles push people into low-wage jobs, their limited resources press retailers to keep prices low and affordable — and one way to do that, as American Public Media’s Marketplace reported, is to create still more low-wage jobs, often temp jobs, that don’t cost a company in benefits or demand steady pay for a worker in the off-season. With one in four workers in a low-wage job today, and low-wage work projected to account for two out of every three new jobs in the United States over the next decade, it’s hard to see the constructive side of a report on how those same jobs are difficult not just for workers, but for their families, and even harder to see how we get out of either cycle.

It’s part of our cultural narrative to insist that parents are fully responsible for raising our children (despite our national need to “produce” another generation of healthy, well-educated adults capable of filling the jobs we now hold). But you don’t have to favor subsidized day care or mandated child-care leaves for all to recognize that the forces that create and structure low-wage jobs are just one of the ways our society’s choices actively make taking that responsibility more difficult. We choose, again and again, to ignore the very real needs of the significant proportion of the work force with children at home. The result, in the low-wage worker’s world, is that job flexibility becomes the ability to quit one job and later find another, and many people describe themselves as being one sick kid away from unemployment.

“These families couldn’t remain intact without the effort of young kids stepping into adult shoes,” Dr. Dodson said, but without the chance to cultivate their own talents, many of these teenagers will find themselves playing out the low-wage job cycle with their own children. That’s good for creating a nation of low-wage workers, and bad for the American tradition of helping each generation do better than the one before.
How Children Subsidize 'Low, Low Prices' (New York Times)
Across the country, tens of thousands of underemployed and jobless young people, many with college credits or work histories, are struggling to house themselves in the wake of the recession, which has left workers between the ages of 18 and 24 with the highest unemployment rate of all adults.

Those who can move back home with their parents — the so-called boomerang set — are the lucky ones. But that is not an option for those whose families have been hit hard by the economy, including Mr. Taylor, whose mother is barely scraping by while working in a laundromat. Without a stable home address, they are an elusive group that mostly couch surfs or sleeps hidden away in cars or other private places, hoping to avoid the lasting stigma of public homelessness during what they hope will be a temporary predicament.

These young adults are the new face of a national homeless population, one that poverty experts and case workers say is growing. Yet the problem is mostly invisible. Most cities and states, focusing on homeless families, have not made special efforts to identify young adults, who tend to shy away from ordinary shelters out of fear of being victimized by an older, chronically homeless population. The unemployment rate and the number of young adults who cannot afford college “point to the fact there is a dramatic increase in homelessness” in that age group, said Barbara Poppe, the executive director of the United States Interagency Council on Homelessness.
After Recession, More Young Americans Are Living On The Street (New York Times)
Beatriz Morales García, 31, said she could not remember the last time she went shopping for herself. A few years ago, she and her husband, Daniel Chiva, 34, thought that they had settled into a comfortable life, he as a bus driver and she as a therapist in a rehabilitation center for people with mental disabilities. His job is financed by the City of Valencia, and hers by the regional government of Valencia.

They never expected any big money. But it seemed reasonable to expect a reliable salary, to take on a mortgage and think about children. In the past year, however, both of them have had trouble being paid. She is owed 6,000 euros, nearly $8,000. They have cut back on everything they can think of. They have given up their landline and their Internet connection. They no longer park their car in a garage or pay for extra health insurance coverage. Mr. Chiva even forgoes the coffee he used to drink in a cafe before his night shifts. Still, the anxiety is constant.

“There are nights when we cannot sleep,” he said. “Moments when you talk out loud to yourself in the street. It has been terrible, terrible.”
In Spain, Having A Job No Longer Guarantees A Paycheck (New York Times)
Delta Air Lines Inc. (DAL), the world’s second-largest carrier, received 22,000 applications for about 300 flight attendant jobs in the first week after posting the positions outside the company.

The applications arrived at a rate of two per minute, Chief Executive Officer Richard Anderson told workers in a weekly recorded message. Applicants will be interviewed in January and those hired will begin flying in June, for the peak travel season.

“We’re hunting for foreign-language speakers as we continue to expand to all points around the globe,” Anderson said. “We are experiencing a phenomenal response to the job posting.”
Delta Air Gets 22,000 Applications for 300 Attendant Jobs (Bloomberg)
The millions of underemployed Americans today, working part-time or in jobs significantly beneath their skill level, underline a persistent feature of our workforce, starting long before the Great Recession: one out of four jobs pay sub-standard wages. Good Jobs America, a new book written by Paul Osterman and conceived with co-author Beth Shulman before her death, tackles this other half of the jobs crisis: the need to create more good jobs, with wages that can support a family.

A great strength of the book is the authors' creation of new data on low-wage jobs, bringing to light how little so many of us bring home from our work. The authors are exquisitely cognizant of the current policy and political climate that looks skeptically on the ability of government to intervene in the "power and correctness of the market." As a result, much of the book carefully examines the arguments and strategies that rely on non-government interventions in the labor market to increase job quality, as well as a refutation of conservative arguments against public policies to increase wage levels. In thoroughly exploring other avenues of change, and doing their best but ultimately failing to identify promising paths that don't rely principally on government, Osterman and Shulman make it clear why they conclude "what is needed is a broader political, social, and economic environment that supports progressive employment strategies." By exhausting the limits of other avenues, the book ultimately ends up making the case that we must have government action to ensure decent jobs for all.

The authors refute the "myth" that education is the solution to the problem by pointing out the obvious: "There will always be hotel room cleaners and food servers and medical assistants and the myriad of other low-wage jobs." Education may help an individual, but it won't solve the large societal problem. Furthermore, they review research that finds "most adults holding these jobs will not escape them." They describe numerous programs developed in industries like health care and hospitality to create career ladders for low-wage employees and -- while doing everything they can to accentuate the positive -- find that few of the programs are sustainable or result in many employees moving into better jobs.

The persistence of the problem is underlined in their discussion of another common bugaboo: immigration. Data they assembled show that from 1994 to 2010, while the proportion of immigrants in the workforce increased by 70 percent, the percentage of jobs that were low-wage stayed the same, 24 percent. Their data also reveal that while immigrants held more low-wage jobs in 2010, the percentage of immigrants who took jobs that were below the low-wage standard remained at just below 40 percent.

As the authors deeply believe that employers need to be part of the solution, they look closely at the problems that employers face in raising wages and promoting career training. But they find that "high-road" employers are few and far between, motivated by the rare business with a mission or a CEO committed to decent wages and benefits. They find no evidence that a Costco has any impact on a retail job market dominated by WalMart, which suppresses wages among its competitors when it comes into a market. The history of labor partnerships also is not promising: Levi-Strauss' attempt at paying good wages collapsed under the pressure of foreign competition, and an agreement between the hotel employee union HERE and San Francisco hotels to trade employer flexibility for more training and wage increases melted in the face of non-union competition.
Why We Need the Government to Create Not Just Jobs, but Good Jobs (New Deal 2.0)
Politicians across the political spectrum herald “job creation,” but frightfully few of them talk about what kinds of jobs are being created. Yet this clearly matters: According to the Census Bureau, one-third of adults who live in poverty are working but do not earn enough to support themselves and their families.

A quarter of jobs in America pay below the federal poverty line for a family of four ($23,050). Not only are many jobs low-wage, they are also temporary and insecure. Over the last three years, the temp industry added more jobs in the United States than any other, according to the American Staffing Association, the trade group representing temp recruitment agencies, outsourcing specialists and the like.

Low-wage, temporary jobs have become so widespread that they threaten to become the norm. But for some reason this isn’t causing a scandal. At least in the business press, we are more likely to hear plaudits for “lean and mean” companies than angst about the changing nature of work for ordinary Americans.

How did we arrive at this state of affairs? Many argue that it was the inevitable result of macroeconomic forces — globalization, deindustrialization and technological change — beyond our political control. Yet employers had (and have) choices. Rather than squeezing workers, they could have invested in workers and boosted product quality, taking what economists call the high road toward more advanced manufacturing and skilled service work. But this hasn’t happened. Instead, American employers have generally taken the low road: lowering wages and cutting benefits, converting permanent employees into part-time and contingent workers, busting unions and subcontracting and outsourcing jobs. They have done so, in part, because of the extraordinary evangelizing of the temp industry, which rose from humble origins to become a global behemoth.

The story begins in the years after World War II, when a handful of temp agencies were started, largely in the Midwest. In 1947, William Russell Kelly founded Russell Kelly Office Service (later known as Kelly Girl Services) in Detroit, with three employees, 12 customers and $848 in sales. A year later, two lawyers, Aaron Scheinfeld and Elmer Winter, founded a similarly small outfit, Manpower Inc., in Milwaukee. At the time, the future of these fledgling agencies was no foregone conclusion. Unions were at the peak of their power, and the protections that they had fought so hard to achieve — workers’ compensation, pensions, health benefits and more — had been adopted by union and nonunion employers alike.

But temp leaders were creating a new category of work (and workers) that would be exempt from such protections.

To avoid union opposition, they developed a clever strategy, casting temp work as “women’s work,” and advertising thousands of images of young, white, middle-class women doing a variety of short-term office jobs. The Kelly Girls, Manpower’s White Glove Girls, Western Girl’s Cowgirls, the American Girls of American Girl Services and numerous other such “girls” appeared in the pages of Newsweek, Business Week, U.S. News & World Report, Good Housekeeping, Fortune, The New York Times and The Chicago Daily Tribune. In 1961 alone, Manpower spent $1 million to put its White Glove Girls in the Sunday issue of big city newspapers across the country.

The strategy was an extraordinary success. Not only did the Kelly Girls become cultural icons, but the temp agencies grew and grew. By 1957, Kelly reported nearly $7 million in sales; in 1962, with 148 branches and $24 million in sales, it went public. Meanwhile, by 1956 Manpower had 91 branches in 65 cities (and 10 abroad) and, with sales at $12 million annually, employed some 4,000 workers a day. In 1962, Manpower also went public, boasting 270 offices across four continents and over $40 million in sales.

The temp agencies’ Kelly Girl strategy was clever (and successful) because it exploited the era’s cultural ambivalence about white, middle-class women working outside the home. Instead of seeking to replace “breadwinning” union jobs with low-wage temp work, temp agencies went the culturally safer route: selling temp work for housewives who were (allegedly) only working for pin money. As a Kelly executive told The New York Times in 1958, “The typical Kelly Girl… doesn’t want full-time work, but she’s bored with strictly keeping house. Or maybe she just wants to take a job until she pays for a davenport or a new fur coat.”

Protected by the era’s gender biases, early temp leaders thus established a new sector of low-wage, unreliable work right under the noses of powerful labor unions. While greater numbers of employers in the postwar era offered family-supporting wages and health insurance, the rapidly expanding temp agencies established a different precedent by explicitly refusing to do so. That precedent held for more than half a century: even today “temp” jobs are beyond the reach of many workplace protections, not only health benefits but also unemployment insurance, anti-discrimination laws and union-organizing rights.

By 1967 Manpower employed more workers than corporate giants like Standard Oil of New Jersey and the U.S. Steel Corporation. Manpower and the other temp agencies had gained a foothold, and temporary employment was widely considered a legitimate part of the economy. Now eyeing a bigger prize — expansion beyond pink-collar work — temp industry leaders dropped their “Kelly Girl” image and began to argue that all employees, not just secretaries, should be replaced by temps. And rather than simply selling temps, they sold a bigger product: a lean and mean approach to business that considered workers to be burdensome costs that should be minimized.

According to the temp industry, workers were just another capital investment; only the product of the labor had any value. The workers themselves were expendable.

The temp industry’s continued growth even in a boom economy was a testament to its success in helping to forge a new cultural consensus about work and workers. Its model of expendable labor became so entrenched, in fact, that it became “common sense,” leaching into nearly every sector of the economy and allowing the newly renamed “staffing industry” to become sought-after experts on employment and work force development. Outsourcing, insourcing, offshoring and many other hallmarks of the global economy (including the use of “adjuncts” in academia, my own corner of the world) owe no small debt to the ideas developed by the temp industry in the last half-century.

A growing number of people call for bringing outsourced jobs back to America. But if they return as shoddy, poverty-wage jobs — jobs designed for “Never-Never Girls” rather than valued employees — we won’t be better off for having them. If we want good jobs rather than just any jobs, we need to figure out how to preserve what is useful and innovative about temporary employment while jettisoning the anti-worker ideology that has come to accompany it.
The Rise Of The Permanent Temp Economy (New York Times)
America's long and steady march toward a fully disposable workforce continues apace, the Bureau of Labor Statistics reported this week. Union membership is at its lowest point in nearly a century, with just 11.3% of all workers – the same level it was in 1916. To put this in proper historical perspective, union members are as rare today as they were at a time when being one could get you shot to death in a mining camp by the Colorado national guard.
US unions' continued decline masks new forms of worker activism (Guardian)