October 25, 2011
Psychologists tie the reluctance to protest Wall Street bailouts to a deep-seated need to justify the status quo.
Among the many issues raised by the Occupy Wall Street movement, perhaps the most basic is: What took so long? Why did three years elapse between the moment reckless financial traders nearly brought down the global economy and the moment large numbers of people began collectively expressing outrage?
A new psychological study provides at least a partial answer. It finds people are strongly motivated to perceive the socioeconomic system they live under as fair and just, and links this pro-status-quo impulse with a reluctance to protest against the Wall Street bailout.
“It is extremely difficult for most of us to believe that our political or economic system is inherently corrupt,” said New York University psychologist John Jost, “and it is a belief that we are tempted to resist, even when there is evidence suggesting deep and fundamental problems.
“Because of our immense psychological capacity to justify and rationalize the status quo, human societies are very slow to fix system-level injustices and to enact substantive changes.”
In a paper titled “Why Men (and Women) Do and Don’t Rebel,” recently published in the journal Personality and Social Psychology Bulletin, a research team led by Jost examines the proclivity to protest through the lens of system justification theory. According to this school of thought, which Jost helped formulate in the 1990s, a fundamental need for certainty and security dampens our desire to rock the boat.
Jost’s evidence suggests this stay-the-course impulse — which played a measurable role in recent debates over climate change and health-care reform — is inherently stronger in some people than in others. But it also can be manipulated. He and his colleagues demonstrated its dual nature in an experiment featuring 108 NYU students, who were asked about their willingness to protest Wall Street bailouts.
The participants first responded to a series of statements designed to reveal the degree to which they reflexively justify our economic system. They expressed their level of agreement or disagreement with statements such as “Laws of nature are responsible for differences in wealth in society,” and “There are no inherent differences between rich and poor; it is purely a matter of the circumstances into which you were born.”
Half of the students then wrote a short essay about “the experience of being uncertain,” while the other half wrote about watching television. Afterward, all read excerpts from a New York Times article about an Obama administration plan “to liberate the nation’s banks from a toxic stew of bad home loans and mortgage-related securities.” The story described this proposal as “more generous to private investors than expected, but it also puts the taxpayers at greater risk.”
Finally, the students were asked to rate (on a scale of one to seven) how willing they were to engage in both disruptive and non-disruptive protests against this proposed plan. Disruptive actions included “occupying an NYU building as a sign of protest”; non-disruptive ones included writing an angry letter or email to government officials.
Not surprisingly, the researchers found participants who support the economic system were unwilling to protest, whether or not the concept of uncertainty had been implanted in their minds. But for the others, remembering a time they felt uncertain “significantly reduced the motivation to engage in non-disruptive protest.”
In other words, for people who are disposed to challenge the system, uncertainty dampens the urge to demand change. Since those inclined to support the status quo aren’t going to take to the streets, this drastically reduces the pool of potential protestors.
“If we can extrapolate from our experimental research (and I think that we can, at least tentatively), the economic uncertainty that most Americans have felt for the past few years — as we have waited to see whether various stimulus measures are going to work — probably undermined (rather than hastened) the motivation to protest against the Wall Street bailout,” Jost said in an interview.
So what shifted in recent weeks? Jost points to a couple of possible factors…
October 25, 2011
Since the LAPD’s cold case unit began 10 years ago, detectives have used science to arrest serial killers and dozens of others who thought they had gotten away with murder.
His list of victims could read like a yearbook: Debra Jackson, 1985; Henrietta Wright, 1986; Barbara Ware, Bernita Sparks, and Mary Lowe, 1987; Alicia Alexander and Lachrica Jefferson, 1988. Then, after a break of more than a dozen years — the “sleeper” period that inspired his nickname — Valerie McCorvey, 2003. Four years after that — Jenica Peters, 2007.
All of the victims were black women. They were as young as 18 and as old as 36 when he ended their lives. Most were sexually assaulted and then shot, their bodies left in alleys or trash bins along a stretch of Western Avenue in South Los Angeles. It is a poor, predominantly black neighborhood hemmed in on all sides by freeways. Prostitution and drug activity are common. Murders committed there typically receive little media attention.

The South L.A. cases did not become a priority for the LAPD until 2007, when DNA analysis revealed the women had a common killer — one who had gone undetected for at least 22 years and was presumably still active. A task force of six detectives was assembled to hunt for the suspect, whom the media dubbed “The Grim Sleeper.” Countless leads were pursued but, frustratingly, none panned out. Detectives hoped his DNA profile might already be on record for some other offense, but it wasn’t. The investigation seemed to have reached a dead end — save for a new and controversial data-mining technique called “familial searching.”
In violent cases, when conventional DNA searches fail to produce a match and all other investigative leads have proved fruitless, there is a last-ditch option: searching the database for near-matches who are likely to be close relatives of the suspect. In the Grim Sleeper case, the familial search turned up the DNA profile of a felon who shared multiple genetic markers. The man was too young to have committed the earliest murders, but detectives quickly homed in on his father, a resident of South Los Angeles named Lonnie Franklin Jr. A surreptitious DNA sample was collected — from a discarded piece of pizza — and Franklin’s genetic profile was compared to the Grim Sleeper’s: they matched. In arresting Franklin, the LAPD wrote a new page in the history of DNA forensics: never before in American history had an active familial search been used to solve a homicide. California is one of only four states where familial searching is legal, but the LAPD’s success in the Grim Sleeper investigation has become a prime argument for expanding its use.
Today, a quarter-century after DNA analysis was used in a murder investigation for the first time, the LAPD has become renowned worldwide for its skill in using DNA to solve homicides. It wasn’t always this way.
Ten years ago this fall, the LAPD’s Cold Case Homicide Unit was born. When it opened its doors, the brand-new unit had seven detectives, and a staggering caseload: more than 9,000 unsolved murders committed in Los Angeles since 1960. The officer in charge of the new unit was a veteran LAPD homicide detective named David Lambkin. Lambkin retired in 2007. He and his wife, Jane, a former LAPD civilian employee, live in a small town on the Olympic Peninsula, northwest of Seattle. Violent crime of the type Lambkin routinely handled as a detective is nearly nonexistent there. The living room of the Lambkins’ modern home looks out on a slate-colored bay. In concert with an overcast sky, the view from the picture windows appears a study in grays. Los Angeles feels very far away.
Asked to recount the first days of the cold case unit, Lambkin speaks with the frankness of someone who’s proud of his association with the LAPD, an institution he served for nearly three decades but no longer feels beholden to — if he ever did. He does not whitewash the monumental task faced by the new unit when it went operational.
Not only was each team of detectives responsible for more than 3,000 cold cases, but nothing had been allocated in the LAPD’s budget for basic investigative necessities — like cars and computers that actually worked. “We weren’t given any fleet cars, only old cars that had been taken out of service because they were deemed not usable anymore,” Lambkin recalls.
Unlike fresh homicides, where the majority of suspects and witnesses are typically still in the city, with cold cases it is common for people to have moved far from Los Angeles, forcing detectives to travel greater distances to interview them. And Lambkin had to call in favors to scrounge up one computer for every two people. The new unit was crammed into a 250-square-foot former utility closet with no windows and a broken ventilation system. All the telephone jacks were along one wall, so phone handsets were constantly being passed across the room. A Los Angeles Times reporter who came in to write a feature on the new unit told Lambkin the office was the worst he’d ever seen.
But Lambkin felt there was no more dignified work than trying to solve murders that society seemed to have forgotten. For victims’ families, he says, “this stuff never goes away. After a while, they get tired of dealing with the department, and they quit calling. So there’s a huge moral reason to be looking back at them, now that we have these new tools.” The new tools were the revolutionary DNA, ballistics, and fingerprint databases that had come online in the 1990s. Lambkin had avidly followed their evolution. He knew these databases were rapidly improving detectives’ ability to identify people who very likely believed that they’d gotten away with murder…
October 25, 2011
My first edict as global overlord would be to impose the following rule on pundits: No one may bemoan a decay, decline, or degeneration without providing (1) a measure of the way the world is today; (2) a measure of the way the world was at some point in the past; (3) a demonstration that (1) is worse than (2).
This decree would, first of all, eliminate tedious jeremiads about the decline of the language. The genre has been around for centuries, and if the doomsayers were correct we would now be grunting like Tarzan. Not only do we see vast amounts of clear and competent prose in everyday outlets like Wikipedia and Amazon reviews, but a gusher of superb writing appears daily, as anyone who has lost a morning to sites like The Browser and Arts and Letters Daily can attest.
Language mavens commonly confuse their own peeves with a worsening of the language. A century ago editors issued fatwas against barbarous innovations such as “standpoint,” “bogus,” “to run a business,” and “to quit smoking.” Decades ago they fulminated against “six people” (as opposed to persons), “fix” (for repair), and the verbs “to contact” and “to finalise.” Today this linguistic contraband is unexceptionable, if not indispensable. Also vilified is the seepage of new technological jargon into the language (leverage, incentivise, synergy). Yet old technological jargon (proportional, placebo, false positive, trade-off) has made it easier for everyone to think about abstract concepts, and may even have contributed to the Flynn effect, the relentless increase in IQ scores during the 20th century.
And speaking of technology, today’s Luddites have a short memory. Parents who lament the iPods and mobile phones soldered onto the ears of teenagers forget that their own parents made the same complaint about them and their bedroom telephones and transistor radios. The abbreviated prose in tweets and instant messages is no more likely to corrupt the language or shorten attention spans than the telegrams, radio ads, and advertising catchphrases of yesteryear. Email can seem like a curse, but who would go back to stamps, phone booths, carbon paper, and piles of phone messages? And now that dinner companions can fact-check any assertion on an iPhone, we are coming to realise how many of our everyday beliefs are false—a valuable lesson in the fallibility of memory.
But nowhere is the confusion of a data point with a trend more pernicious than in our understanding of violence. A terrorist bomb explodes, a sniper runs amok, an errant drone kills an innocent, and commentators ask “What is the world coming to?” Yet they seldom ask, “How bad was the world in the past?”
By just about any quantitative standard, the world of the past was much worse. The medieval rate of homicide was 35 times the rate of today, and the rate of death in tribal warfare 15 times higher than that. Collapsed empires, horse-tribe invasions, the Crusades, the slave trade, the wars of religion, and the colonisation of the Americas had death tolls which, adjusted for population, rival or exceed those of the world wars. In earlier centuries the wife of an adulterer could have her nose cut off, a seven-year-old could be hanged for stealing a petticoat, a witch could be sawn in half, and a sailor could be flogged to a pulp. Deadly riots were common enough in 18th-century England to have given us the expression “to read the riot act,” and in 19th-century Russia to have given us the word pogrom. Deaths in warfare have come lurchingly but dramatically downward since their postwar peak in 1950. Deaths from terrorism are less common in today’s “age of terror” than they were in the 1960s and 1970s, with their regular bombings, hijackings, and shootings by various armies, leagues, coalitions, brigades, factions and fronts…