As a 42-year-old man born in England, I can expect to live for about another 38 years. In other words, I can no longer claim to be young. I am, without doubt, middle-aged.
To some people that is a depressing realization. We are used to dismissing our fifth and sixth decades as a negative chapter in our lives, perhaps even a cause for crisis. But recent scientific findings have shown just how important middle age is for every one of us, and how crucial it has been to the success of our species. Middle age is not just about wrinkles and worry. It is not about getting old. It is an ancient, pivotal episode in the human life span, preprogrammed into us by natural selection, an exceptional characteristic of an exceptional species.
Compared with other animals, humans have a very unusual pattern to our lives. We take a very long time to grow up, we are long-lived, and most of us stop reproducing halfway through our life span. A few other species have some elements of this pattern, but only humans have distorted the course of their lives in such a dramatic way. Most of that distortion is caused by the evolution of middle age, which adds two decades that most other animals simply do not get.
An important clue that middle age isn’t just the start of a downward spiral is that it does not bear the hallmarks of general, passive decline. Most body systems deteriorate very little during this stage of life. Those that do, deteriorate in ways that are very distinctive, are rarely seen in other species and are often abrupt.
For example, our ability to focus on nearby objects declines in a predictable way: Farsightedness is rare at 35 but universal at 50. Skin elasticity also decreases reliably and often surprisingly abruptly in early middle age. Patterns of fat deposition change in predictable, stereotyped ways. Other systems, notably cognition, barely change.
Each of these changes can be explained in evolutionary terms. In general, it makes sense to invest in the repair and maintenance only of body systems that deliver an immediate fitness benefit — that is, those that help to propagate your genes. As people get older, they no longer need spectacular visual acuity or mate-attracting, unblemished skin. Yet they do need their brains, and that is why we still invest heavily in them during middle age.
As for fat — that wonderfully efficient energy store that saved the lives of many of our hard-pressed ancestors — its role changes when we are no longer gearing up to produce offspring, especially in women. As the years pass, less fat is stored in depots ready to meet the demands of reproduction — the breasts, hips and thighs — or under the skin, where it gives a smooth, youthful appearance. Once our babymaking days are over, fat is stored in larger quantities and also stored more centrally, where it is easiest to carry about. That way, if times get tough we can use it for our own survival, thus freeing up food for our younger relatives.
These changes strongly suggest that middle age is a controlled and preprogrammed process not of decline but of development.
When we think of human development, we usually think of the growth of a fetus or the maturation of a child into an adult. Yet the tightly choreographed transition into middle age is a later but equally important stage in which we are each recast into yet another novel form.
That form is one of the most remarkable of all: a resilient, healthy, energy-efficient and productive phase of life that has laid the foundations for our species’s success. Indeed, the multiple roles of middle-aged people in human societies are so complex and intertwined, it could be argued that they are the most impressive living things yet produced by natural selection.
The claim that middle age evolved faces one obvious objection. For any trait to evolve, natural selection has to act on it generation after generation. Yet we often think of prehistoric life as nasty, brutish and short. Surely too few of our ancestors lived beyond age 40 to allow features of modern-day middle age, such as the deposition of a spare tire around the middle, to have been selected for.
This is a misconception. Although average life expectancy may sometimes have been very low, this does not mean that humans rarely reached the age of 40 during the past 100,000 years. Average life expectancy at birth can be a misleading measure; if infant mortality is high, then the average is skewed dramatically downward, even if people who survive to adulthood have a good chance of living a long, healthy life.
The evidence from skeletal remains suggests that our ancestors frequently lived well into middle age and beyond. Certainly many modern hunter-gatherers live well beyond 40.
The probable existence of lots of prehistoric middle-aged people means that natural selection had plenty to work on. Those with beneficial traits would have been more successful at nurturing their children to reproductive age and helping provide for their grandchildren, and hence would have passed on those traits to their descendants. As a result, modern middle age is the result of millennia of natural selection…
There is a very simple reason why some of Africa’s bloodiest, most brutal wars never seem to end: They are not really wars. Not in the traditional sense, at least. The combatants don’t have much of an ideology; they don’t have clear goals. They couldn’t care less about taking over capitals or major cities — in fact, they prefer the deep bush, where it is far easier to commit crimes. Today’s rebels seem especially uninterested in winning converts, content instead to steal other people’s children, stick Kalashnikovs or axes in their hands, and make them do the killing. Look closely at some of the continent’s most intractable conflicts, from the rebel-laden creeks of the Niger Delta to the inferno in the Democratic Republic of the Congo, and this is what you will find.
What we are seeing is the decline of the classic African liberation movement and the proliferation of something else — something wilder, messier, more violent, and harder to wrap our heads around. If you’d like to call this war, fine. But what is spreading across Africa like a viral pandemic is actually just opportunistic, heavily armed banditry. My job as the New York Times’ East Africa bureau chief is to cover news and feature stories in 12 countries. But most of my time is spent immersed in these un-wars.
I’ve witnessed up close — often way too close — how combat has morphed from soldier vs. soldier (now a rarity in Africa) to soldier vs. civilian. Most of today’s African fighters are not rebels with a cause; they’re predators. That’s why we see stunning atrocities like eastern Congo’s rape epidemic, where armed groups in recent years have sexually assaulted hundreds of thousands of women, often so sadistically that the victims are left incontinent for life. What is the military or political objective of ramming an assault rifle inside a woman and pulling the trigger? Terror has become an end, not just a means.
This is the story across much of Africa, where nearly half of the continent’s 53 countries are home to an active conflict or a recently ended one. Quiet places such as Tanzania are the lonely exceptions; even user-friendly, tourist-filled Kenya blew up in 2008. Add together the casualties in just the dozen countries that I cover, and you have a death toll of tens of thousands of civilians each year. More than 5 million have died in Congo alone since 1998, the International Rescue Committee has estimated.
Of course, many of the last generation’s independence struggles were bloody, too. South Sudan’s decades-long rebellion is thought to have cost more than 2 million lives. But this is not about numbers. This is about methods and objectives, and the leaders driving them. Uganda’s top guerrilla of the 1980s, Yoweri Museveni, used to fire up his rebels by telling them they were on the ground floor of a national people’s army. Museveni became president in 1986, and he’s still in office (another problem, another story). But his words seem downright noble compared with the best-known rebel leader from his country today, Joseph Kony, who just gives orders to burn.
Even if you could coax these men out of their jungle lairs and get them to the negotiating table, there is very little to offer them. They don’t want ministries or tracts of land to govern. Their armies are often traumatized children, with experience and skills (if you can call them that) totally unsuited for civilian life. All they want is cash, guns, and a license to rampage. And they’ve already got all three. How do you negotiate with that?
The short answer is you don’t. The only way to stop today’s rebels for real is to capture or kill their leaders. Many are uniquely devious characters whose organizations would likely disappear as soon as they do. That’s what happened in Angola when the diamond-smuggling rebel leader Jonas Savimbi was shot, bringing a sudden end to one of the Cold War’s most intense conflicts. In Liberia, the moment that warlord-turned-president Charles Taylor was arrested in 2006 was the same moment that the curtain dropped on the gruesome circus of 10-year-old killers wearing Halloween masks. Countless dollars, hours, and lives have been wasted on fruitless rounds of talks that will never culminate in such clear-cut results. The same could be said of indictments of rebel leaders for crimes against humanity by the International Criminal Court. With the prospect of prosecution looming, those fighting are sure never to give up.
How did we get here? Maybe it’s pure nostalgia, but it seems that yesteryear’s African rebels had a bit more class. They were fighting against colonialism, tyranny, or apartheid. The winning insurgencies often came with a charming, intelligent leader wielding persuasive rhetoric. These were men like John Garang, who led the rebellion in southern Sudan with his Sudan People’s Liberation Army. He pulled off what few guerrilla leaders anywhere have done: winning his people their own country. Thanks in part to his tenacity, South Sudan will hold a referendum next year to secede from the North. Garang died in a 2005 helicopter crash, but people still talk about him like a god. Unfortunately, the region without him looks pretty godforsaken. I traveled to southern Sudan in November to report on how ethnic militias, formed in the new power vacuum, have taken to mowing down civilians by the thousands.
Even Robert Mugabe, Zimbabwe’s dictator, was once a guerrilla with a plan. After transforming minority white-run Rhodesia into majority black-run Zimbabwe, he turned his country into one of the fastest-growing and most diversified economies south of the Sahara — for the first decade and a half of his rule. His status as a true war hero, and the aid he lent other African liberation movements in the 1980s, account for many African leaders’ reluctance to criticize him today, even as he has led Zimbabwe down a path straight to hell.
These men are living relics of a past that has been essentially obliterated. Put the well-educated Garang and the old Mugabe in a room with today’s visionless rebel leaders, and they would have just about nothing in common. What changed in one generation was in part the world itself. The Cold War’s end bred state collapse and chaos. Where meddling great powers once found dominoes that needed to be kept from falling, they suddenly saw no national interest at all. (The exceptions, of course, were natural resources, which could be bought just as easily — and often at a nice discount — from various armed groups.) Suddenly, all you needed to be powerful was a gun, and as it turned out, there were plenty to go around. AK-47s and cheap ammunition bled out of the collapsed Eastern Bloc and into the farthest corners of Africa. It was the perfect opportunity for the charismatic and morally challenged.
In Congo, there have been dozens of such men since 1996, when rebels rose up against the leopard-skin-capped dictator Mobutu Sese Seko, probably the most corrupt man in the history of this most corrupt continent. After Mobutu’s state collapsed, no one really rebuilt it. In the anarchy that flourished, rebel leaders carved out fiefdoms ludicrously rich in gold, diamonds, copper, tin, and other minerals. Among them were Laurent Nkunda, Bosco Ntaganda, Thomas Lubanga, a toxic hodgepodge of Mai Mai commanders, Rwandan genocidaires, and the madman leaders of a flamboyantly cruel group called the Rastas…
Screening Out the Introverts: Is academe biased against quiet, thoughtful listeners in favor of big-talking extroverts?
April 18, 2012
Some years ago I joined my students in taking the Myers-Briggs Type Indicator, a test to determine personality type. It was an assignment in a course I was teaching on vocational exploration.
Assuming there would be an average distribution of results among the 20 students, I planned a series of small-group assignments in which they would discuss their own results for each of the test’s personality dichotomies (e.g., thinking versus feeling). But a problem turned up immediately: Not one student had received an “I” for introversion. Everyone, it seemed, was an extrovert (Myers-Briggs spells it with an “a,” like “extra”). Everyone but me.
Extroverts—if you accept such categories—are oriented outward, toward other people and toward action over reflection. They draw energy from social interaction, and they tend to be outspoken and gregarious. Introverts, on the other hand, are oriented toward the inner life of thought; they tend to be reserved and cautious. They find social interactions draining, and they need solitude to recharge. It’s not that introverts are antisocial so much as that they appreciate fewer, more intimate friendships. They don’t like small talk but appreciate deeper discussions.
I knew my students well enough to suspect that I was not the only one with that tendency. A third of them barely spoke in class unless called upon. A few hardly spoke to anyone. Perhaps the introverted choices on the test were too stigmatizing to consider (e.g., “Would you rather go to a party or stay home reading a book?”). The students had used the test to confirm that they had the right, “healthy” qualities.
Given that introversion is frowned upon almost everywhere in U.S. culture, the test might as well have asked, “Would you prefer to be cool, popular, and successful or weird, isolated, and a failure?” In the discussion that followed, a few students observed—with general agreement—that introversion was a kind of mental illness (and, one student noted, a sign of spiritual brokenness). “We are made to be social with each other” was a refrain in the conversation.
A few sympathetic students tried to persuade me that my introvert result was a mistake. How could I stand in front of that room, leading that very conversation, smiling at them, without being an extrovert? The answer: careful planning, acting, and rationing my public appearances. Also, my introversion fades when I become comfortable with unfamiliar people (the first weeks of classes are a strain).
We soon moved on to other personality dichotomies that were more evenly distributed. When the class was over, many of the students continued talking in an animated way about their results. Several left, silently, by themselves. The conversation left me exhausted; I went to my office and closed the door for an hour as I prepared for my next performance.
Those experiences came back to me while reading Susan Cain’s new book Quiet: The Power of Introverts in a World That Can’t Stop Talking. The book is a hybrid work of cultural history, advocacy, personal narrative, and persuasive self-help. It’s a wider-ranging companion to such recent works as The Introvert Advantage: How to Thrive in an Extrovert World by Marti Olsen Laney; Living Fully With Shyness and Social Anxiety by Erika B. Hilliard; and The Introverted Leader: Building on Your Quiet Strength by Jennifer B. Kahnweiler. Cain does not offer a significant critique of the pharmaceutical industry, which has established rapturous sociability as a norm to which everyone should aspire. For that topic, one should read Shyness: How Normal Behavior Became a Sickness by Christopher Lane.
Most notably, Cain argues for the value of introverts in a culture that has a long history of privileging extroversion—something, she argues, that has only grown more powerful, and perhaps costly, in recent decades. It’s a trend that affects business, religion, education, parenting, and just about everyone’s sense of self-worth in the United States.
According to Cain, the 19th century valued personal character based on seriousness, discipline, and honor, but the 20th century emphasized personality: selling oneself and being a “mighty likeable fellow.” Dale Carnegie’s bestseller, How to Win Friends and Influence People (1937), is a major signal of that shift for Cain. Perhaps it would have made for an excessively long cultural history, but one can trace those tendencies in the United States back through the writings of Horatio Alger, P.T. Barnum, Stephen Burroughs, and even to that epitome of the Protestant work ethic: Benjamin Franklin, author of “The Way to Wealth,” who gave at least as much attention to the appearance of being hardworking as he did to working hard.
Meanwhile, one can easily find notable celebrations of reticence and reflection in writers such as Thoreau and Dickinson, just as there are many 20th-century critiques of the extrovert ideal, such as Babbitt (1922) and Death of a Salesman (1949).
The Power of Positive Thinking (1952) has coexisted in American culture with the pathos of The Catcher in the Rye (1951) for decades, but Cain makes a compelling case that, these days, Norman Vincent Peale has J.D. Salinger in a headlock, and he’s not letting go anytime soon.
We now live under a kind of extrovert tyranny, Cain writes, and that has led to a culture of shallow thinking, compulsory optimism, and escalating risk-taking in pursuit of success, narrowly defined. In other words, extroverts—amplifying each other’s groundless enthusiasms—could be responsible for the economic crisis because they do not listen to introverts, even when there are some around (and they are not trying to pass as extroverts).
If that’s stretching matters, it seems harder to deny that the routine exclusion and silencing of talented, quiet people has costs just like other forms of arbitrary discrimination. And, Cain argues, the extrovert ideal is discriminatory on the basis of ethnicity, particularly against those who share the Asian cultural ideal of speaking less and thinking more…