December 1, 2011
According to Immanuel Kant, the urge to philosophize is universal: “In all men, as soon as their reason has become ripe for speculation, there has always existed and will always continue to exist some kind of metaphysics.” The truth of this is apparent in children at an early age, whose questions exhaust even the most profound and patient of parents. But it does not follow that there must inevitably be a place for philosophy in our educational systems. It is rare in the United States, for instance, to encounter philosophy before college, and rare outside Catholic universities for philosophy to be required in college. (It was a pleasant feature of a recent year spent living in Morocco to find that almost everyone there, from pharmacists to cab drivers, had a basic grasp of what philosophy is, acquired from their high school days. In this country, in contrast, even well-educated people often have little idea what philosophy actually consists of.) At the university, we think of philosophy as an essential offering in the humanities. But there is nothing inevitable even about this, as reflection on the history of the subject reveals.
Philosophy, as it is generally studied in the modern university, springs from ancient Greece and the writings of Plato and Aristotle. The various famous ancient schools long thrived during the Hellenistic and Roman eras, but then slowly faded away during the sixth century CE. There followed several centuries of darkness—a true Dark Ages, as much as medievalists dislike the phrase—until philosophical forms of thought began to reemerge in the ninth century. Around that time, one finds distinct and quite independent philosophical movements afoot in Byzantium, in Latin Western Europe, and in the Islamic world. In time, the Latin tradition would become ascendant, as fostered within the European university and eventually reinvigorated by the Enlightenment and the rise of modern science. These developments, however, were still centuries away. In the year 900, by far the most robust and impressive philosophical tradition was found not in Europe, but in the Middle East. Islamic scholars there had embarked on a wholesale program to recover the traditions of Greek philosophy (particularly the works of Aristotle), translate them into Arabic, and rethink their message in light of the newly revealed teachings of the Qur’an. Anyone able to observe from on high these distinct intellectual traditions at the end of the first millennium would surely have put their money on the Muslims as the group most likely to inherit the Greek philosophical legacy, and so it was for several centuries, as a series of brilliant philosophers and scientists made Baghdad the intellectual center of the early medieval world.
Eventually, however, the center shifted—first to the western part of the Islamic world in northern Africa and southern Spain, and then north to Christian Europe. What we call the Middle Ages was, in Islam, the great classical era of philosophy and science. After several centuries of flourishing, however, the study of philosophy and science faded in Muslim countries, even while it was being pursued with increasing vigor in the Latin West.
What happened? How did Western Europe, by the late Middle Ages, become the prime locus for philosophical and scientific research? These are, of course, complex matters. But to see something of the factors at play, we might consider the life and work of Averroës, one of the last great Islamic philosophers, and the one who made the strongest arguments on behalf of philosophy. Those arguments would eventually take root, but not where he expected them to.
A Controversial Life
Abū al-Walīd Muhammad ibn Ahmad ibn Muhammad ibn Rushd—or Averroës, as he was known to Latin readers—was born in 1126 at the far western edge of the Islamic world, in Córdoba, Spain. His father and grandfather were prominent scholars and religious figures, and he, in turn, developed close ties with the Almohad caliphs who reigned over southern Spain and northwestern Africa during the twelfth century. These connections allowed him to serve as an influential religious judge in Seville and Córdoba and, later, as court physician in Marrakesh. Supposedly in response to the caliph’s complaint about the obscurity of Aristotle’s writings, Averroës devoted much of his scholarly efforts to a series of commentaries on Aristotle, producing both brief epitomes and exhaustive, line-by-line studies. These commentaries would eventually take on a life of their own, but the most striking feature of Averroës’s career is how little influence he had on the Islamic world of his time, despite his obvious brilliance. Many of his works no longer survive in Arabic at all, but only in Latin or Hebrew translation. Indeed, even during his life, Averroës became a controversial figure. For in 1195, when the then-reigning caliph felt the need to make concessions to conservative religious figures, he banished Averroës to the small Spanish town of Lucena, and ordered that his philosophical works be burned. Not long after, the caliph moved to Marrakesh, from where he evidently was able to restore Averroës to favor. The philosopher rejoined the caliph’s court, where he died in 1198.
What made Averroës so controversial, and what does this show us about the way in which philosophy has and has not persisted over the centuries? One can see something of the attitude among Muslim conservatives of that time from a popular Andalusian insult that has survived: “This fate has struck all the falsifiers who mix philosophy with religion and promote heresies. They have studied logic (mantiq), but it is said with reason that misfortune is passed through speech (mantiq).” Here an Arab proverb is invoked to play upon the twin meanings of mantiq as logic and speech. The jibe is fair enough, in a sense—Averroës did in his own way want to mix philosophy with religion, and, in particular, he promoted logic as the key to a true understanding of religion. As for whether the results were heretical, that, of course, is a matter of dispute; like all the great philosophers, Averroës arrived at his share of heterodox views…
December 1, 2011
Despite the boom of recent years, Indonesia is still the sort of place from which young people seek to escape. Nearly all Indonesians who can afford it send their children to study in universities abroad, and my parents were no different. But where many of my former classmates have since become Canadians and Australians, I am again in Jakarta. Like most of the Indonesians I know who studied in the United States, I had trouble staying there after graduating. Many of these would-be Americans are now doing exceptional things elsewhere. One person who was denied a visa to the United States is now back in Indonesia, founding his own tech company. Another went to the Netherlands and is working on a software platform to sell to Indonesian companies. They would have preferred to be making contributions to the United States, but the American immigration system wouldn’t permit it.
No one can deny that immigration is a major political issue in the U.S.—it has, in some respects, dominated this year’s Republican presidential primary—but it’s lamentable how narrowly the issue is usually defined. Much of the American public is obsessed with the specter of poor, illegal immigrants, but there is little attention paid to the byzantine system for skilled migrants. There is a bipartisan skilled immigration plan sponsored by Charles Schumer and Mike Lee currently meandering through the Senate, but its prospects, like all recent immigration reform efforts, are dim.
In the absence of reform, the American immigration system will remain what it is—the product of a series of accidents and miscalculations by policymakers. From the preferences for family unification that have come to dominate America’s immigration system to the absurd lotteries that determine green cards, the overall picture is unflattering: of a country on autopilot, with a civil service incapable of formulating a long-term strategic plan in the national interest, and with rabid interest groups jockeying for narrow victories. There’s no perfect way to determine how many family members, skilled migrants, refugees, or other groups America should admit—broader questions of economic vitality, social justice, belonging, and obligation all need to play a part—but as it stands, there’s simply no coherence at all.
THE UNITED STATES’ modern immigration system was born in the postwar era with the abolition of the ethnic quota system that favored Europeans. Conservatives and patriotic groups were initially opposed to the 1965 reform to abolish quotas, but they eventually relented, in part because they imagined the concept of family reunification would help to freeze the ethnic landscape of the 1960s in place. “Do you not agree with me that it would be vastly easier for us to assimilate into American life an Englishman than it would be to assimilate a person from Indonesia?” asked Senator Sam Ervin, a notable conservative in the hearings, about people like myself. “I think immigration should be restricted to those who have relatives already in this country. I think that we should have our immigration drawn in such a way as to reunite families.” There was a general understanding that family reunification would prevent an excess of non-white migrants from entering the country. As Representative Emanuel Celler, author of the 1965 Hart-Celler immigration act, said, “since the people of Africa and Asia have very few relatives here, comparatively few could immigrate from those countries because they had no family ties to the United States.”
The congressmen’s assumptions turned out to be completely, even spectacularly wrong. Immigration from Asia totaled about 15,000 a year in the 1950s but surged to 43,000 a year in the ’60s, then to six times that amount by the ’80s. Policymakers had no understanding of the concept of “chain migration,” which makes the composition of the existing cohort far less meaningful than the future supply of the incoming cohort. To put it simply, while there were many Americans of French descent already in the United States, they had comparatively few relatives available and willing to migrate, whereas the small number of Thai in America were linked to millions of other Thai seeking to emigrate. This miscalculation was important not because we ought to prefer European immigrants to Asian ones. Rather, if policymakers had understood the concept of “chain migration,” they might have ditched the family reunification system entirely in favor of a more meritocratic or humanitarian one. Instead, the country was saddled with an immigration policy that neither sought out the most skilled applicants nor recognized its historic obligation to the needy of the world (“give us your tired, your poor”), opting instead to cater to the special interests of some of its citizens.
The second accident that defines our current dysfunctional system comes on the heels of the first: By the 1980s, congressional panels began to think up ways to redress what they saw as the immigration system’s bias against Europeans. It had become clear that family reunification hurt would-be European migrants because they had no immediate relatives in America—most of their ancestors had migrated decades ago, whereas Asians and Latin Americans had “fresher” ties to the United States and could keep coming. At the same time, the number of illegal Irish and Italian immigrants in the country had grown, and powerful lobbies began to call for a solution.
The NP-5 visa lottery was subsequently created to aid nationalities “adversely affected” by the 1965 law. Because Irish groups had strong political ties to the immigration subcommittees, whose members included Senator Ted Kennedy, 40 percent of the lottery visas were initially allotted to Irish nationals alone, an extraordinary handout. Later, the program was expanded into what is now known as the green card lottery. Today’s lottery is not particularly discriminating—applicants need a high school diploma and two years of work experience—and in 2011, 12.1 million people applied for 50,000 visas. The national quotas have since been randomized, deprioritizing the Irish, but the lottery now contains a more basic, troubling feature: America is the only country to use lotteries to determine questions of citizenship, a practice it appears to have made a habit of, since lotteries also decide the recipients of the H-1B skilled worker visa when the 65,000-person cap is reached. Both lotteries exist partly to correct mistakes made in 1965. A non-lottery H-1B system is not possible until Congress votes to increase the number of skilled workers allowed to migrate or narrows the definition of skill to reduce applications. The political paralysis of the past decade, however, has made even basic reforms like this impossible.
The third accident also involves the 1965 law and demonstrates the power of special interest groups in crafting our current system. As it turns out, the White House never intended for the family reunification category to represent the largest category of visas in the first place. The initial plan President Kennedy and his advisor Abba Schwartz sent to Congress in July 1963 envisioned a 50 percent quota for skilled migration, with the remaining places for close relatives and refugees—similar to Canada today. But the House committee chairman deliberating the bill, Michael Feighan, was successfully lobbied by ethnic and labor groups in his district seeking to bring their own family members to America. Unions don’t tend to favor a large influx of immigration, but the AFL-CIO and other groups in the 1950s and 1960s were comprised mostly of recent migrants from Southern and Eastern Europe, who were particularly active in Feighan’s district. The congressman, who nearly lost his seat in 1964 and was fighting a tough reelection battle in 1966, caved easily to the pressure. As Feighan told a congressional reporter on September 30, a few days before the Hart-Celler bill that revamped American immigration was passed, “1,000 families in my district would benefit from the family reunification provisions of the final bill.”
As a result, the preferences that were unveiled in 1965 were diametrically opposed to what the Kennedy administration had imagined, with family reunification at 74 percent, professionals and skilled workers at 20 percent, and refugees at 6 percent. Today, these basic proportions hold, with two-thirds of immigrants arriving because they are related to American citizens, 13 percent from employment (of which a maximum of 10 percent are skilled), 5 percent via the green card lottery, and the remainder as refugees…
December 1, 2011
Steven Pinker was a 15-year-old anarchist. He didn’t think people needed a police force to keep the peace. Governments caused the very problems they were supposed to solve.
Besides, it was 1969, said Dr. Pinker, who is now a 57-year-old psychologist at Harvard. “If you weren’t an anarchist,” he said, “you couldn’t get a date.”
At the dinner table, he argued with his parents about human nature. “They said, ‘What would happen if there were no police?’ ” he recalled. “I said: ‘What would we do? Would we rob banks? Of course not. Police make no difference.’ ”
This was in Montreal, “a city that prided itself on civility and low rates of crime,” he said. Then, on Oct. 7, 1969, police officers and firefighters went on strike, and he had a chance to test his first hypothesis about human nature.
“All hell broke loose,” Dr. Pinker recalled. “Within a few hours there was looting. There were riots. There was arson. There were two murders. And this was in the morning that they called the strike.”
The ’60s changed the lives of many people and, in Dr. Pinker’s case, left him deeply curious about how humans work. That curiosity turned into a career as a leading expert on language, and then as a leading advocate of evolutionary psychology. In a series of best-selling books, he has argued that our mental faculties — from emotions to decision-making to visual cognition — were forged by natural selection.
He has also become a withering critic of those who would deny the deep marks of evolution on our minds — social engineers who believe they can remake children as they wish, modernist architects who believe they can rebuild cities as utopias. Even in the 21st century, Dr. Pinker argues, we ignore our evolved brains at our own peril.
Given this track record, Dr. Pinker’s newest book, published in October, struck some critics as a jackknife turn. In “The Better Angels of Our Nature” (Viking), he investigates one of the most primal aspects of life: violence.
Over the course of 802 pages, he argues that violence has fallen drastically over thousands of years — whether one considers homicide rates, war casualties as a percentage of national populations, or other measures.
This may seem at odds with evolutionary psychology, which is often seen as an argument for hard-wired Stone Age behavior, but Dr. Pinker sees that view as a misunderstanding of the science. Our evolved brains, he argues, are capable of a wide range of responses to their environment. Under the right conditions, they can allow us to live in greater and greater peace.
“The Better Angels of Our Nature” is full of the flourishes that Dr. Pinker’s readers have come to expect. He offers gruesomely delightful details about cutting off noses and torturing heretics. Like his other popular books, starting with “The Language Instinct” (1994), it is a far cry from his first published works in the late 1970s — esoteric reports from his graduate work at Harvard, with titles like “The Representation and Manipulation of Three-Dimensional Space in Mental Images.”
From Irregular Verbs, a Career
He came to Harvard after graduating from McGill University in 1976. At the time, he was convinced that a life in psychology would allow him to ask the big questions about the mind and answer them with scientific rigor. “It was the sweet spot for me in trying to understand human nature,” he said.
But he quickly realized that such explorations would have to wait. “You can’t do a Ph.D. thesis on human nature,” he said. “So I studied much smaller problems — academic bread-and-butter problems.”
He began by studying how we picture things in our heads, looking for the strategies people use to make sense of the visual information continually flooding the brain. As he worked on his dissertation, however, he recognized that many other scientists were also tackling the same problems of visual cognition.
“There were a lot of people studying them who were doing a better job than I could,” he said. So he looked for another problem.
The field he settled on was language, and it proved to be consuming. For Dr. Pinker, it was “a window into human nature.” Linguists have long debated whether language is a skill we develop with all-purpose minds, or whether we have innate systems dedicated to it.
Dr. Pinker has focused much of his research on language on a seemingly innocuous fluke: irregular verbs. While we can generate most verb tenses according to a few rules, we also hold onto a few arbitrary ones. Instead of simply turning “speak” into “speaked,” for example, we say “spoke.”
As a young professor at the Massachusetts Institute of Technology, he pored over transcripts of children’s speech, looking for telling patterns in the mistakes they made as they mastered verbs. Out of this research, he proposed that our brains contain two separate systems that contribute to language. One combines elements of language to build up meaning; the other is like a mental dictionary we keep in our memory.
This research helped to convince Dr. Pinker that language has deep biological roots. Some linguists argued that language simply emerged as a byproduct of an increasingly sophisticated brain, but he rejected that idea. “Language is so woven into what makes humans human,” he said, “that it struck me as inconceivable that it was just an accident.”…
December 1, 2011
[Editorial cartoon, originally published at Town Hall; reproduced with permission.]