Monday, July 14, 2014

Flight From Death

 ...For many Americans, modern medical advances have made death seem more like an option than an obligation. We want our loved ones to live as long as possible, but our culture has come to view death as a medical failure rather than life’s natural conclusion.

These unrealistic expectations often begin with an overestimation of modern medicine’s power to prolong life, a misconception fueled by the dramatic increase in the American life span over the past century. To hear that the average U.S. life expectancy was 47 years in 1900 and 78 years as of 2007, you might conclude that there weren’t a lot of old people in the old days — and that modern medicine invented old age. But average life expectancy is heavily skewed by childhood deaths, and infant mortality rates were high back then. In 1900, the U.S. infant mortality rate was approximately 100 infant deaths per 1,000 live births. In 2000, the rate was 6.89 infant deaths per 1,000 live births.

The bulk of that decline came in the first half of the century, from simple public health measures such as improved sanitation and nutrition, not open heart surgery, MRIs or sophisticated medicines. Similarly, better obstetrical education and safer deliveries in that same period also led to steep declines in maternal mortality, so that by 1950, average life expectancy had catapulted to 68 years.

For all its technological sophistication and hefty price tag, modern medicine may be doing more to complicate the end of life than to prolong or improve it. If a person living in 1900 managed to survive childhood and childbearing, she had a good chance of growing old. According to the Centers for Disease Control and Prevention, a person who made it to 65 in 1900 could expect to live an average of 12 more years; if she made it to 85, she could expect to go another four years. In 2007, a 65-year-old American could expect to live, on average, another 19 years; if she made it to 85, she could expect to go another six years.

Another factor in our denial of death has more to do with changing demographics than with advances in medical science. Our nation's mass exodus away from the land and an agricultural existence, toward a more urban lifestyle, means that we've antiseptically left death and the natural world behind us. At the beginning of the Civil War, 80 percent of Americans lived in rural areas and 20 percent lived in urban ones. By 1920, with the Industrial Revolution in full swing, the ratio was around 50-50; as of 2010, 80 percent of Americans lived in urban areas.

For most of us living with sidewalks and street lamps, death has become a rarely witnessed, foreign event. The most up-close death my urban-raised children have experienced is the occasional walleye being reeled toward doom on a family fishing trip or a neighborhood squirrel sentenced to death-by-Firestone. The chicken most people eat comes in plastic wrap, not at the end of a swinging cleaver. The farmers I take care of aren’t in any more of a hurry to die than my city-dwelling patients, but when death comes, they are familiar with it. They’ve seen it, smelled it, had it under their fingernails. A dying cow is not the same as a person nearing death, but living off the land strengthens one’s understanding that all living things eventually die.
Our unrealistic views of death, through a doctor's eyes (Washington Post)

See also: On Death and Dying (Daily Kos)

"Dying is not difficult. Everybody does it at least once. You don't even have to do anything special. It's usually kind of an automatic thing. Sometimes, it takes a while, and sometimes it happens in the blink of an eye.

"Leaving? Well, that's another story altogether. You cannot usually tell much about a person by how they die. But, you can tell a lot about a person by the way they leave."

And, from last week: Will today's children die earlier than their parents? (BBC)
