October 8, 2011
For generations, Americans have fretted over the relationship between the military and civilian society: how the one institution fits within the other, and how the broader population receives and perceives its soldiers. But as the U.S. approaches the 10th anniversary of the launch of the war in Afghanistan this week, this much is novel: The longest war in U.S. history is being fought by the smallest percentage of its population.
The resulting implications — which Jeff Shear touched on for Miller-McCune.com earlier this year — are unsettling. As these wars have moved off the front page, and as the soldiers fighting them have moved into their sixth and seventh deployments, is a disconnect evolving between the country’s servicemen and civilians? Can the U.S. make good decisions as a country about war when so few feel personally invested? And what does it mean to welcome home a soldier when so few people really know where he or she has been?
The country plows toward this latest anniversary with retiring Navy Adm. Mike Mullen’s observation about the nation’s civilians ringing in the air: “I fear they do not know us.”
That quote from the outgoing chairman of the Joint Chiefs of Staff to this year’s graduating class at West Point prefaced a massive new study, released Wednesday by the Pew Research Center, of veteran and civilian public opinion in the post-Sept. 11 era. The quote also popped up repeatedly during a discussion Pew hosted Wednesday to publicize the results.
The center surveyed nearly 2,000 veterans and as many civilians about their expectations of each other, their views on who should be (and who has been) carrying the burden during wartime, and whether these wars have been “worth it.”
One of the central findings was that post-Sept. 11 veterans appear to agree with Mullen. And — more surprisingly — the public does, too.
Eighty-four percent of modern-era veterans said the general public has little or no understanding of the problems they face. Among the public, 71 percent agreed. This raises the awkward question of whether it’s possible to “appreciate” service if you don’t really understand it.
Several factors likely are at play. In the all-volunteer professional military, soldiers are serving longer tours and aren’t as quick to return to the communities from which they came. In this sense, they become less visible to civilians. The conflicts in Iraq and Afghanistan are also distinctly unconventional, making it harder for civilians to understand not just what deployed soldiers are going through, but also literally what they’re doing. Are they manning Humvees or programming drones or supervising school construction in a combat zone?
What civilians say they do know is that soldiers and their families have had to make a lot of sacrifices since 9/11 (83 percent feel this way, compared to 43 percent when the same question is asked about the American people). Among people who feel the military has made more sacrifices than the public, 26 percent described this as “unfair.” But 70 percent felt the added burden was “just part of being in the military.”
This attitude contrasts sharply with past moments in American history. In 1930, Congress created a War Policies Commission that was charged, among other things, with considering ways to equalize the burdens of war across the population. The commission even weighed amendments to the U.S. Constitution to make this happen. Nothing ever came of the idea…
Is evil over? Has science finally driven a stake through its dark heart? Or at least emptied the word of useful meaning, reduced the notion of a numinous nonmaterial malevolent force to a glitch in a tangled cluster of neurons, the brain?
Yes, according to many neuroscientists, who are emerging as the new high priests of the secrets of the psyche, explainers of human behavior in general. A phenomenon attested to by a recent torrent of pop-sci brain books with titles like Incognito: The Secret Lives of the Brain. Not secret in most of these works is the disdain for metaphysical evil, which is regarded as an antiquated concept that’s done more harm than good. They argue that the time has come to replace such metaphysical terms with physical explanations—malfunctions or malformations in the brain.
Of course, people still commit innumerable bad actions, but the idea that people make conscious decisions to hurt or harm is no longer sustainable, say the new brain scientists. For one thing, there is no such thing as “free will” with which to decide to commit evil. (Like evil, free will is an antiquated concept for most.) Autonomous, conscious decision-making itself may well be an illusion. And thus intentional evil is impossible.
Have the new neuroscientists brandishing their fMRIs, the ghostly illuminated etchings of the interior structures of the skull, succeeded where their forebears from disciplines ranging from phrenology to psychoanalysis have failed? Have they pinpointed the hidden anomalies in the amygdala, the dysfunctions in the prefrontal lobes, the electrochemical source of impulses that lead a Jared Loughner, or an Anders Breivik, to commit their murderous acts?
And in reducing evil to a purely neurological glitch or malformation in the wiring of the physical brain, in eliminating the element of freely willed conscious choice, have neuroscientists eliminated as well “moral agency,” personal responsibility? Does this “neuromitigation” excuse—”my brain made me do it,” as critics of the tendency have called it—mean that no human being really wants to do ill to another? That we are all innocent, Rousseauian beings, some afflicted with defects—”brain bugs” as one new pop-neuroscience book calls them—that cause the behavior formerly known as evil?
Are those who commit acts of cruelty, murder, and torture just victims themselves—of a faulty part in the head that might fall under factory warranty if the brain were a car?
The new neuroscience represents the latest chapter in a millennia-old and still divisive cultural conflict over the problem of evil, the latest chapter in the attempt by science to reduce evil to malfunction or dysfunction rather than malevolence. It’s a quest I examined in Explaining Hitler: the way the varieties of 20th-century psychological “science” sought to find some physiological, developmental, sexual, or psychoanalytic cause for Hitler’s crimes. (One peer-reviewed paper sought to trace Hitler’s evil to a mosquito bite—to the secondary sequelae of mosquito-borne encephalitis, which were known to cause profound personality changes as long as a decade after the disease was contracted in the trenches of World War I.)
It would be consolatory if not comforting if we could prove that what made Hitler Hitler was a malfunction in human nature, a glitch in the circuitry, because it would allow us to exempt “normal” human nature (ours, for instance) from having Hitler potential. This somewhat Pollyannaish quest to explain the man’s crimes remains counterintuitive to many. I recall the late British historian and Hitler biographer Alan Bullock reacting to the claims of scientism by exclaiming to me vociferously: “If he isn’t evil, then who is? … If he isn’t evil the word has no meaning.”
Indeed, recent developments demonstrate that evil remains a stubborn concept in our culture, resistant to attempts to reduce it to pure “physicalism.” To read the mainstream media commentary on the Breivik case, for instance, is to come upon, time after time, the word “evil.” Not just that the acts were evil, but that he, Breivik, was, as a Wall Street Journal columnist put it, “evil incarnate.”
But what exactly does that mean? The incarnation of what? Satan? The word “incarnation,” even without explicit religious context, implies, metaphorically at least, the embedding of a metaphysical force in a physical body. One can understand the scientific aversion to this as a description of reality. But evil as a numinous force abides. It is not surprising that Pope Benedict issued a statement following the attacks in Norway calling on everyone to “escape from the logic of evil.” (Although what exactly is that “logic”?)
Even if it was not surprising for the Pope to invoke evil thus, it was surprising to see a devout atheist such as my colleague Christopher Hitchens invoke “evil” in his “obituary” for Osama bin Laden. Hitchens admits wishing he could avoid using “that simplistic (but somehow indispensable) word.” But he feels compelled to call whatever motivated bin Laden a “force” that “absolutely deserves to be called evil.”
But what is this “force,” which sounds suspiciously supernatural for an atheist to believe in? Some kind of Luciferian Kryptonite? Where is it located: in the material or nonmaterial world?
…One person whose work on these matters has received considerable attention lately is the British professor of psychopathology Simon Baron-Cohen. (Yes, a cousin of Sacha Baron-Cohen, aka Borat, but highly regarded as a serious scientist.) He’s the author of The Science of Evil, which seeks to dispose of the problem of evil, in part at least, by changing its name.
“My main goal,” says Baron-Cohen, “is replacing the unscientific term ‘evil’ with the scientific term ‘empathy.’ ” What he means is that instead of calling someone evil we should say they have no empathy.
Baron-Cohen goes to great lengths to posit an “empathy circuit” in the brain whose varying “degrees” of strength constitute a spectrum, ranging from total, 100 percent empathy to “zero degrees of empathy.”
This empathy circuit, he tells us, consists of 13 specific regions of the brain involved in the generation of nonevil choices, among them “the medial prefrontal cortex,” “the inferior frontal gyrus,” and “the posterior superior temporal sulcus.”
Ideally all of these act together empathetically to defeat “single minded focus,” which appears to be Baron-Cohen’s explanation for what was previously called evil. Single-mindedness is the inability to “recognize and respond” to the feelings of others. A healthy empathy circuit allows us to feel others’ pain and transcend single-minded focus on our own. This theory does, however, seem to carry a presumption that when one “recognizes and responds,” one will do so in warm and fuzzy ways. But what about those who “recognize and respond” to others’ feelings with great discernment—and then torture them? It happens.
One troubling aspect of Baron-Cohen’s grand substitution of a lack of empathy for evil is the mechanistic way he describes it.
He characterizes those who lack empathy as having “a chip in their neural computer missing.” He tells us “empathy is more like a dimmer switch than an all-or-none switch.” The big problem here is that by reducing evil to a mechanical malfunction in the empathy circuit, Baron-Cohen also reduces, or even abolishes, good. No one in this deterministic conceptual system chooses to be good, courageous, or heroic. They just have a well-developed empathy circuit that compels them to act empathetically—there’s no choice or honor in the matter.
And so evil for Baron-Cohen is just “zero degrees of empathy.” And I’m left with the nonempathetic feeling that his boast that he is “replacing” evil with nonempathy is more a semantic trick than a scientific discovery. It’s another instance of what one of the authors in Neuroethics, an important collection of academic papers from MIT Press, calls “Brain Overclaim Syndrome.”
A number of papers in Neuroethics pour cold water on the triumphalism of the giddy new pop-sci brain books. The collection makes clear that there is a debate within the neuroscience profession about what exactly all those impressive-looking fMRI images tell us. And these “neurocritics” or “neuroskeptics” warn about the consequences of acting too quickly on such claims. (There is a valuable British website called Neuroskeptic that offers the general reading public these critiques and correctives from the point of view of someone within the profession. People need to know!)
The “Brain Overclaim” paper by Stephen Morse of the University of Pennsylvania’s Center for Neuroscience and Society is a tongue-in-cheek “diagnostic note” on the grandiosity of the assumptions behind the brain-book fad, concerned mainly with the way they have been creeping into jurisprudence. fMRI evidence made its way into a Supreme Court opinion this year, for instance: Justice Stephen Breyer cited “cutting edge neuroscience” in his dissent from a ruling denying California the right to ban violent video games, because the otherwise pro-free-speech justice was alarmed by neuroscientific studies claiming that such games could create mental pathways for actual violence.
But Morse’s critique extends beyond the jurisprudential and goes to the heart of the failure of current neuroscience to explain or “replace” evil. Popular neuroscience has claimed to find the neural locus of love and God and evil, but Morse points out a fundamental flaw in that logic:
Despite all the astonishing advances in neuroscience, however, we still know woefully little about how the brain enables the mind and especially about how consciousness and intentionality can arise from the complicated hunk of matter that is the brain. … Discovering the neural correlates of mental phenomena does not tell us how these phenomena are possible.
In other words, correlation doesn’t always equal causation: We may know the 13 regions that light up on an fMRI when we feel “empathy” (or fail to light up when we choose evil), but that doesn’t tell us whether the lit-up state is causing empathy or merely reflecting it.
The problem of evil—and moral responsibility—is thus inseparable from what is known in the philosophical trade as “the hard problem of consciousness.” How does the brain, that electrified piece of meat, create the mind and the music of Mozart, the prose of Nabokov? Where is consciousness, anyway?…
Optimism is plummeting among working-class whites, but it is holding steady for minorities. What does this great divergence in hopefulness mean for the 2012 presidential election?
They dream in water, cotton, and brick. One of them is losing hope.
Not Tierra, who is black, and whose nursing ambitions could be delayed by another brutal electric bill. Not Ambar, a Latina and an aspiring lawyer who just lost the only home she ever knew.
Dave. Who is white, and who thought, finally, he’d made it. Who broke his back for a dream–a pension, a getaway cottage, security–that seems to be wavering in the Lake Erie haze.
He grew up in Detroit, where the upward mobility of the American middle class could be seen every Friday afternoon. Factory workers, driving cars they’d built, crowded I-75, heading north to their cottages. That was the deal that Dave Miller signed up for when he dropped out of Wayne State University and followed his dad into the firefighting ranks. The deal was supposed to include decent wages, health insurance, tuition, retirement, mortgages, and maybe, with overtime pay, a boat and a house on the lake–a physical reminder that hard work still pays like it always did.
“Here’s the ticket! Twenty-five years, a pension, health care, and nine working days a month–that’s how they sold it,” Dave says. Nobody mentioned going 10 years without a raise; or starting a construction company on the side to make ends meet; or wondering if he shouldn’t just sell the little lake cottage that his hard work bought, because he struggles just to make it up there.
Nobody said that one day Dave, 41, would sit around a table with five other white firefighters and admit, to nods of approval, that his hope for his kids’ future “takes a hit when shit goes sour.” It is 5 p.m. on a Thursday. He was supposed to escape to the lake 36 hours ago. He feels like he is running out of time.
It is early in the evening on a Monday in South Carolina when Tierra Stewart, 22, leaves the maroon tents where her extended family has passed the day barbecuing and catching up. She loads her 3-year-old boy into her aunt’s Jeep, damp blades of grass clinging to his white high-tops. Her cousin points the car down the highway away from their small hometown and back to the little house that Tierra shares with her son, Quay.
There is a $300 electric bill waiting for her there, the second one in a row. Both have been surprises, far more than she expected when she moved into the house this summer. To pay one electric bill–just that bill–Tierra will work eight straight days in two jobs, caring for senior citizens and disabled people. She knows that one more unexpectedly large bill, or one more problem with the car that has already broken down once this year, would devastate her fragile finances. But still, she dreams.
Tierra dreams of becoming a nurse; of wearing cotton hospital scrubs; of traveling outside of South Carolina. She wants to own a house and a car. She wants to earn at least $15 an hour–an annual salary of about $31,000. “I’ve never made more than $13 an hour,” she says. “Thirty thousand would be good for me.” She has grand plans for the sleepy boy in the backseat. “He will go off to college,” she says. “I don’t want him to be here pining on any women, his mom neither. I want him to be somebody. I want him to be successful.”
It is mid-morning on a Thursday on the lower west side of Chicago, and the late summer air blows in a chill. Ambar Gonzalez pulls a gray hooded sweatshirt over her sweater. She navigates her neighborhood from the passenger side of a rental car. She is 25, and she has never lived anywhere but this neighborhood. There is my grammar school, she says. There is the school where my friends went. There is the coal plant; I think it gave me my asthma. There is my church. They just washed the bricks. It’s even more gorgeous inside. Turn here.
Suddenly the voice that crackled with possibilities during breakfast–a job downtown! law school in Washington, D.C.!–deadens. She is back on a block that she crosses only if she’s riding with friends and they forget where they are. There it is on the corner: weather-beaten yellow brick, with red-and-white awnings. It’s the home she grew up in, the one that the economy made it impossible for her family to keep. “Turn,” she says.
The house appears again, this time on the left. Freckles tighten around her light brown eyes. Do you ever think about buying it back? “Yes,” she says, instantly. “Even if I don’t live there, I want it to be my house.” It gives her direction: She will leave, she will learn, she will win the job that will bring her back to claim the cradle of her middle-class dreams.
THE GREAT DIVERGENCE
Like Dave, Tierra, and Ambar, others in America’s working class still dream of a better life and of the totems—the cottage, the uniform, the house—that represent it, even in the grip of an economy that has snuffed out so many hopes. The dreams vary with the color of the dreamers’ skin, though not in the way you’d expect.
The Great Recession and the weak recovery have battered working-class Americans across racial and generational lines. But the groups who suffered most, amazingly, are the ones who remain most hopeful that life will improve for them and for their children. Optimism is plummeting among working-class whites but holding steady for minorities, a divergence that risks inflaming racial tensions. It could also sway the 2012 presidential election.
Census figures show that all American workers slid backward in the past few years. Median income fell by 6.4 percent between 2007 and 2010, to the lowest level in 13 years (adjusted for inflation). African-American incomes fell by more than 10 percent, and Latino incomes fell 7.2 percent. White incomes dropped less, by 5.4 percent. In 2010, income for blacks and Latinos remained 40 and 30 percent lower, respectively, than the median income for whites. Hispanics lost two-thirds of their household wealth between 2005 and 2009, and blacks lost more than half, according to the Pew Research Center’s Social & Demographic Trends project. Whites lost less than a fifth of theirs.
Yet polls suggest that far more minorities than whites believe they are still advancing toward their economic dreams. Latinos and blacks remain more than twice as likely to say that today’s children will have more opportunity than they did, according to an Allstate/National Journal Heartland Monitor poll conducted this summer. Minorities are also far more likely than whites to say that their own economic opportunity exceeds their parents’…