A decade after 9/11, the mystery is not why so many Muslims turn to terror — but why so few have joined al Qaeda’s jihad
August 28, 2011
The rental car turned onto the sidewalk behind the registrar’s office and rolled slowly down the brick path between a dining hall and the English department, a few steps from my office. “Beyond Time,” an upbeat German dance song, played on the car’s stereo. The driver, Mohammed Taheri-Azar, had just graduated from the University of North Carolina three months earlier, so he knew the campus well. Beyond the dining hall was a plaza known as the Pit, where students were hanging out at lunchtime on a warm winter day in early 2006. Taheri-Azar planned to kill as many of them as possible.
He brought no weapons except a knife, some pepper spray, and the four-wheel-drive sport utility vehicle he had rented in order to run people over without getting stuck on their bodies. When he reached the Pit, Taheri-Azar accelerated and swerved to hit people as they scattered out of his way. His fender clipped several students; others rolled over his hood and off the windshield. Taheri-Azar turned left at the end of the plaza, hit another couple of students in front of the library, and then sped off campus just beneath my office window.
Taheri-Azar drove down the hill that gave Chapel Hill its name, pulled over in a calm residential neighborhood, parked, and called 911 on his cell phone. “Sir, I just hit several people with a vehicle,” he told the operator. “I don’t have any weapons or anything on me; you can come arrest me now.”
Why did you do this? the operator asked. “Really, it’s to punish the government of the United States for their actions around the world.” So you did this to punish the government? “Yes, sir.” Following the operator’s instructions, he placed his phone on the hood of the car and put his hands on his head as police officers arrived.
Before leaving his apartment that morning, Taheri-Azar left a letter on his bed explaining his actions more fully:
Due to the killing of believing men and women under the direction of the United States government, I have decided to take advantage of my presence on United States soil on Friday, March 3, 2006 to take the lives of as many Americans and American sympathizers as I can in order to punish the United States for their immoral actions around the world.
In the Qur’an, Allah states that the believing men and women have permission to murder anyone responsible for the killing of other believing men and women. I know that the Qur’an is a legitimate and authoritative holy scripture since it is completely validated by modern science and also mathematically encoded with the number 19 beyond human ability. After extensive contemplation and reflection, I have made the decision to exercise the right of violent retaliation that Allah has given me to the fullest extent to which I am capable at present.
I have chosen the particular location on the University campus as my target since I know there is a high likelihood that I will kill several people before being killed myself or jailed and sent to prison if Allah wills. Allah’s commandments are never to be questioned and all of Allah’s commandments must be obeyed.
Nine people suffered broken bones and other injuries that day. Fortunately, Taheri-Azar didn’t kill anybody, though the toll could have been higher. Initially, Taheri-Azar had planned to join insurgents in Afghanistan or Iraq, but he was discouraged by visa restrictions on travel to those countries. Then he looked into joining the military and dropping a nuclear bomb on Washington, D.C., but he realized that his eyesight was too poor to qualify to be a military pilot. Turning closer to home, Taheri-Azar considered shooting people randomly at the university. His letters from prison indicate that he thought about targeting the dining hall where I often eat lunch.
In the weeks before his attack, Taheri-Azar test-fired a laser-sighted handgun at a nearby shooting range but was told that he couldn’t buy it without a permit. Taheri-Azar could have purchased a rifle on the spot if he had completed some federal paperwork, but he had his heart set on a Glock pistol. Later, at his apartment, he started to fill out the permit application, but gave up when he found that he would need three friends to attest to his good moral character. “[T]he process of receiving a permit for a handgun in this city is highly restricted and out of my reach at the present,” Taheri-Azar complained in the letter he left on his bed for the police. Months later, in prison, he rationalized his decision: “The gun may have malfunctioned and acquiring one would have attracted attention to me from the FBI in all likelihood, which could have foiled any attack plans.” Taheri-Azar may be the only terrorist in the world ever deterred by gun-control laws.
TAHERI-AZAR’S INCOMPETENCE as a terrorist is bewildering. Surely someone who was willing to kill and die for his cause, spending months contemplating an attack, could have found a more effective way to kill people. Why wasn’t he able to obtain a firearm or improvise an explosive device or try any of the hundreds of murderous schemes that we all know from movies, television shows, and the Internet, not to mention the news? And once Taheri-Azar decided to run people over with a car, why did he pick a site with so little room to accelerate?
Even more bewildering is that we don’t see more terrorism of this sort, a decade into the “global war on terror” launched by the United States in response to the attacks of Sept. 11, 2001. If every car is a potential weapon, then why aren’t there more automotive attacks? Car bombs have been around since the 1920s, when the first one was detonated on Wall Street in New York City, but they require a fair bit of skill. Drive-through murder, on the other hand, takes very little skill at all. People have been killing people with cars ever since the automobile was invented, and the political use of automotive assault was immortalized in a famous 1966 film, The Battle of Algiers, in which two Algerian revolutionaries drive into a bus stand full of French settlers. Yet very few people resort to this accessible form of terrorism. Out of several million Muslims in the United States, it appears that Taheri-Azar was the first to attempt this sort of attack; so far he has been followed by two possible copycats, leading to one fatality. (The trial of Omeed Popal, who killed a pedestrian, has been delayed for several years while the court tries to determine whether he is mentally fit to stand trial.) In addition to cars, plenty of other terrorist weapons are readily accessible. One manual for Islamist terrorists, published online in 2006, listed 14 “simple tools” that “are easy to use and available for anyone who wants to fight the occupying enemy,” including “running over someone with a car” (No. 14) and “setting fire to homes or rooms at sleep time” (No. 10).
If terrorist methods are as widely available as automobiles, why are there so few Islamist terrorists? In light of the death and devastation that terrorists have wrought, the question may seem absurd. But if there are more than a billion Muslims in the world, many of whom supposedly hate the West and desire martyrdom, why don’t we see terrorist attacks everywhere, every day?
Islamist terrorists ask these questions, too. In their view, the West is engaged in a massive assault on Muslim societies and has been for generations, long preceding 9/11. This assault involves military invasions, political domination, economic dependence, and cultural decadence — and, they believe, it is reaching new heights of aggression each year. Islamists offer a solution: the establishment of Islamic government. Revolutionary Islamists offer a strategy to achieve Islamic government: armed insurrection. Terrorist revolutionaries offer a tactic to trigger insurrection: attacks on civilians. These attacks are intended to demoralize the enemy, build Muslims’ self-confidence, and escalate conflict, leading Muslims to realize that armed insurrection is the sole path to defend Islam…
August 28, 2011
You can drive almost anywhere in the state of Michigan — pick a point at random and start moving — and you will soon come upon the wreckage of American industry. If you happen to be driving on the outer edge of Midland, you’ll also come upon a cavern of steel beams and ductwork, 400,000 square feet in all. When this plant, which is being constructed by Dow Kokam, a new venture partly owned by Dow Chemical, is up and running early next year, it will produce hundreds of thousands of advanced lithium-ion battery cells for hybrid and electric cars. Just as important, it will provide about 350 jobs in a state with one of the nation’s highest unemployment rates.
Over the last two years, the federal government has doled out nearly $2.5 billion in stimulus dollars to roughly 30 companies involved in advanced battery technology. Many of these might seem less like viable businesses than scenery for political photo ops — places President Obama can repeatedly visit (as he did early this month) to demonstrate his efforts at job creation. But in fact, the battery start-ups are more legitimate, and also more controversial, than that. They represent “the far edge,” as one White House official put it, of where the president or Congress might go to create jobs.
For decades, the federal government has generally resisted throwing its weight — and its money — behind particular industries. If the market was killing manufacturing jobs, it was pointless to fight it. The government wasn’t in the business of picking winners. Many economic theorists have long held that countries inevitably pursue their natural or unique advantages. Some advantages might arise from fertile farmland or gifts of vast mineral resources; others might be rooted in the high education rates of their citizenry. As the former White House economic adviser Lawrence Summers put it, America’s role is to feed a global economy that’s increasingly based on knowledge and services rather than on making stuff. So even as governments in China and Japan offered aid to industries they deemed important, factories in the United States closed or moved abroad. The conviction in Washington was that manufacturing deserved no special dispensation. Even now, as unemployment ravages the country, so-called industrial policy remains politically toxic. Legislators will not debate it; most will not even speak its name.
By almost any account, the White House has fallen woefully short on job creation during the past two and a half years. But, galvanized by the potential double payoff of skilled, blue-collar jobs and a dynamic clean-energy industry, the administration has tried to buck the tide with lithium-ion batteries. It had to start almost from scratch. In 2009, the U.S. made less than 2 percent of the world’s lithium-ion batteries. By 2015, the Department of Energy projects that, thanks mostly to the government’s recent largess, the United States will have the capacity to produce 40 percent of them. Whichever country figures out how to lead in the production of lithium-ion batteries will be well positioned to capture “a large piece of the world’s future economic prosperity,” says Arun Majumdar, the head of the Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E). The batteries, he stressed, are essential to the future of the global-transportation business and to a variety of clean-energy industries.
We may marvel at the hardware and software of mobile phones and laptops, but batteries don’t get the credit they deserve. Without a lithium-ion battery, your iPad would be a kludge. The new Chevrolet Volt and Nissan Leaf rely on big racks of lithium-ion battery cells to hold their electric charges, and a number of new models — including those from Ford and Toyota, which use similar battery technology — are on their way to showrooms within the next 18 months.
This flurry of activity comes against a dismal backdrop. In the last decade, the United States lost some five million manufacturing jobs, a contraction of about one-third. Added to the equally brutal decades that preceded it, this decline left large swaths of the country, the Great Lakes region in particular, without a clear economic future. As I drove through the hollowed-out cities and towns of Michigan earlier this year, it was hard to tell how some of these places could survive. Inside the handful of battery companies that I visited, though, the mood was starkly different. Many companies are working on battery-pack designs for dozens of car models. At the Johnson Controls factory in Holland, Mich., Ray Shemanski, who is in charge of the company’s lithium-ion operation, said, “We have orders that would fill this plant right now.” Every company I visited had plans not only to get its primary factories running at full speed by 2012 or 2013 but also to build or expand others. Jennifer Granholm, Michigan’s former governor, has predicted that advanced batteries will create 62,000 jobs over the next decade…
August 28, 2011
In the 1980s, Silicon Valley was populated by lefties and hippies who dreamed of a computer revolution. One of the pioneers recalls how the internet was born.
In Sofia Coppola’s 2006 film of the life of Marie Antoinette, there is a scene where an entourage of palace jeunes filles sweeps through a ball at which the set and costumes are period, but the music and manners are straight out of a modern dance club. The proposition seems to be that an elite few were able to put a toe into the future to experience what is ordinary today.
Something like that went on in the Silicon Valley I knew in the 1980s. The debates and dilemmas that occupy a generation today appeared in miniature before there was an internet. We took our anticipation of the internet deadly seriously, to the point where it seemed already real. Thus I have experienced the internet age twice.
Experiencing the internet in reality is different – and even bizarre, because although it seemed reasonable to expect the thing to come about, it is still uncanny that the reasoning was right. It feels as though we got away with something we shouldn’t have done.
The internet arrived from two directions, one top-down and the other bottom-up. Initially computers and computer networking were both developed in military and government labs. The way you experienced computation from the 1960s often reflected this point of origin, with early computer companies such as IBM exuding a grey, regimented stoniness in order to appear seductive to their patrons.
In the 1970s, a small market emerged for hobbyist computers. You could build your own little box with blinking lights that you could program by flipping lines of switches on the front panel. That’s all you could do at first, but oh, the ecstasy to be able to touch your own computer, if you had an inkling of where it all could lead.
A culture grew up around these hobbyist machines centred in Silicon Valley, and spawned the personal computer market – with Microsoft launching in 1975 and Apple in 1976. The centre of gravity split: the stony grey establishment on one side, delirious hippies and faux revolutionaries on the other.
The turbulent confluence between top-down and bottom-up continues to this day. Internet start-ups sprout like garage bands. Most die, but a few explode into national-scale empires, as in the case of Facebook. Dreary top-down institutions such as wireless carriers maintain their lofty entitlements, though occasionally they drain away, like the old music business. I used to be partisan, favouring the bottom-up approach, but now I appreciate the balance of tides, because all kinds of power should be checked.
My first encounter with Silicon Valley was at the end of my teens, which was also the end of the 1970s. The world seemed carved into zones according to the degree of magic available. The highest magic was found in nexuses of hippie exuberance such as the beach town of Santa Cruz, California, where pearlescent rainbows covered everything and even the most mediocre musicians could effortlessly invent melodies superior to almost anything heard since. Young, creative people with any sense of ambition tended to be drawn to these places like weight to gravity, but by the time I arrived the magic was receding.
The overwhelming explanation we held of our time and place was that we had been born too late to experience the one true orgasm of meaning, the 1960s. Young people who felt jilted by life because of a slight error in timing found solace in a twisted calculus of punk humour. An alternative to the Santa Cruz-type El Dorados of bohemia were the zones of brazen, barren reality: remote and violent desert towns, impoverished villages in Mexico, or tenements in New York City.
The most deficient places – condemned by hippies and punks alike – were the suburbs, the places of the conventional parent: an artificial world ruled by Disney and McDonald’s.
I did not arrive at this suspect ontology naturally, having grown up in a way that was both gritty and bohemian. My father and I couldn’t afford a home at one point, when I was 11, so we lived in tents on cheap land while building a crazed, geometric, spaceship-like house in a rough corner of southern New Mexico. I adapted to the flight from the suburbs because this seemed the ticket into the social world of my peers in that era. I well remember how my heart sank when I later realised that economic circumstances left me no choice but to force my old jalopy over the mountain pass that insulated dewy, arousing Santa Cruz from soul-killing, blandifying Silicon Valley, which was situated in, of all places, a suburb.
The mountain ridge that separates Silicon Valley and the town of Palo Alto from the ocean keeps out the famed fog of northern California in the summer. This has always made it an elite getaway from San Francisco, but to me Silicon Valley’s light looked incomplete and made me feel remote and depressed – so close to the ocean, but without its full light.
I despaired at the time that I had failed to earn enough to be able to remain at the fulcrum of hippie truth, but I was to learn, slowly, that I was moving from one narcissistic category war to another. Instead of hippies v suburbs, I enlisted in the turf war between nerds and – well, the opposite doesn’t have a name. A sort of muggle: the fool who doesn’t realise that he lives in a cocoon and serves only as a battery to power the action; a person who fails to understand that the world is an information system, and that life is programming.
Having moved from one kind of nonsense to another eventually helped me learn to be sceptical of both.
Palo Alto was nicknamed “Shallow Alto” by the hippie hackers, who felt that living there was a sell-out, a sign of failure. And yet, one by one, we gave in and entered an alternate, infinitely better-funded elite club. The place was much more than a suburb, naturally. A little more than a century earlier, there had been a Native American culture there, but it was murdered and erased, so little more can be said. Layers of mutually indifferent histories were then overlaid on to this, awaiting the final washout by Silicon Valley culture…
August 28, 2011
[Cartoon originally published at Town Hall; posted with express written permission.]