The Diplomat of Shoah History: Does Yale historian Timothy Snyder absolve Eastern Europe of special complicity in the Holocaust?
July 31, 2012
The dispute between Poles and Jews about the Nazi period can move in unsettling directions, ones that make an unhealed wound hurt even worse. Perceived insults, like President Barack Obama’s recent reference to “Polish concentration camps,” are seen by right-wing Poles as part of a plot to blacken their country’s name in the West. Some on the Polish right are also quick to argue that Poles who assisted the Nazis in anti-Jewish actions, or who slaughtered Jews on their own initiative (such pogroms occurred both during and just after the war), acted from understandable motives: After all, Jewish “treachery” had handed their country to the Bolsheviks. But the treachery is a fiction. Polish Jews were overwhelmingly anti-Communist, and the Soviets deported many of them.
The Polish role in the Holocaust had other roots, darker ones: traditional anti-Semitism and the greedy desire for Jewish property. When the historian Jan Gross in his books Neighbors and Fear (and, most recently, Golden Harvest, written with Irena Grudzinska Gross) charged his fellow Poles with aiding the Nazi genocide and profiting from the death of the Jews in their midst, he wanted them to mourn the vanished Jewish lives they had known so well, to come to terms with their guilt, since many of them had been indifferent or complicit or satisfied in the face of the Shoah. Instead, Lech Walesa, the hero of Solidarity and former president of Poland, called Gross “a mediocre writer … a Jew who tries to make money.” (Gross’ father was Jewish.) When Gross, who teaches at Princeton, returns to his native Poland, he has to contend with public prosecutors who, a few years ago, threatened to take him to court for “slandering the Polish nation.” His fellow historian Jan Grabowski says that Gross demolished the myth of Polish innocence by focusing on the reaction of Poles to the murder of 3 million of their fellow citizens, a reaction that was often craven, money-hungry, and cruel. “He was the one who brought this stinking mess into the open, single-handedly,” Grabowski remarks.
Enter Timothy Snyder.
The Yale historian’s Bloodlands: Europe Between Hitler and Stalin—hailed by Antony Beevor when it appeared in 2010 as “the most important work of history for years”—is grim and magisterial; it puts together the tragedy of the Holocaust with earlier mass murders in the regions that Snyder christens the “bloodlands” (Lithuania, Latvia, Byelorussia, Poland, and Ukraine). Snyder begins with the terrible famine that Stalin inflicted on Ukraine (more than 3 million dead); he goes on to the Great Terror, in which 700,000 died, including many Poles; and he writes movingly of the 3 million Soviet prisoners of war whom the Nazis starved to death, many of them in Byelorussian camps that were little more than barbed wire strung around masses of helpless, doomed POWs.
Like Gross, Snyder seeks to explain the actions of the non-Jews of Eastern Europe, the nearest bystanders to the Holocaust. But unlike Gross, he demands no conscience-searching from Eastern Europeans. Snyder points out that the Soviets and the Germans had ravaged the countries of the bloodlands, whose loss of sovereignty led to social chaos, hunger, threats of death, and deportation. Suddenly, Poles, Ukrainians, and others realized there was a starkly unavoidable presence in their midst, the German desire to kill Jews. It should not be a surprise, Snyder argues, that, by and large, they had little empathy for the Jews. Neither did we Americans, and we were thousands of miles away from Hitler and Stalin. The great debate between Snyder and Gross is a key juncture in the politics of memory in Eastern Europe and a test case for our efforts to understand what the Nazi extermination of the Jews meant to the part of the world where it happened.
I recently met Snyder for coffee in New Haven’s Blue State Café. Excited and nervous, he was anticipating the birth of his second child, due within days of our meeting. When he saw me he quickly folded his newspaper, and we launched, without throat-clearing, into our inescapable theme: mass murder. Snyder has the look of a hard-worked scholar on the brink of middle age—not unfriendly, but with a certain wariness about being misread; he seemed tired but in conversation was alert and careful. This fall, he said, he is preparing to teach a course solely about the destruction of the Jews and is writing a book on the causes of the Holocaust.
Although Bloodlands describes an array of Nazi and Soviet mass murders, its secret, as every reader discovers, is that it turns out to be a book about the Holocaust. Why the Shoah is the inevitable end point of the story that Bloodlands tells is a question that Snyder elicits without fully answering: The Holocaust stands out because it is the most developed instance of genocide. Every single Jew was marked down for murder, with the goal of making the Jewish nation vanish forever from the earth, and the German state devoted its best resources to this end. The disappearance of the Jews became an absolute priority; this was not true of the Roma and Sinti, or the Soviet POWs, or the Ukrainians under Stalin, who suffered just as the Jews did, but whose fate did not carry the same symbolic weight.
The utopian, absurd idea that getting rid of Jews means liberating non-Jewish humanity points to the central, though hidden, role that Jews played in the Nazi imagination. Jews, the people of the Ten Commandments, were the incarnations of conscience; their presence on the earth reminded humanity of the difference between good and evil, right and wrong. No other genocide took on such a task: the redemption of the world from the disease of conscience. The victims of Stalin and Mao died just like Hitler’s, but their deaths weren’t intended to have the world-altering significance that the annihilation of the Jews had for the Nazis.
Unusually for a historian in his field, Snyder—who is from small-town southwestern Ohio, where his family has lived for two centuries—has no Jewish and no Eastern-European ancestry. “I grew up as an American kid with no connection to any of these places,” he told me. In college in the late 1980s, he said, “I thought I was going to grow up and become a diplomat and negotiate nuclear arms,” but with the fall of the Soviet Union, he veered toward Eastern European studies, where he discovered high-voltage connections between intellectual life, politics, and national identity and learned to speak Polish and Ukrainian.
While Snyder never planned to become a Holocaust historian, it appears that he may now be turning into one. In 2008, he wrote a masterful essay on the Shoah in Volhynia, integrating survivor testimony with a measured account of the roles that Germans and Ukrainians played in the killing of Jews. In Volhynia, Snyder wrote, Jews were in greater danger from Ukrainian nationalists than they were from Germans. “Many gentiles came to see the murder of Jews as corresponding to their personal economic interests,” he explained. He ended his essay with a haunting passage that he later incorporated into Bloodlands, in which he recounted the inscriptions scrawled on the walls of the synagogue in Kovel. Here, where 12,000 Jews awaited certain death, they wrote their parting messages, nearly unbearable for the reader (“My beloved mama! There was no escape. They brought us here from outside the ghetto, and now we must die a terrible death. … We kiss you over and over.”)…
July 31, 2012
There remains a widely perceived notion — still commonly held within intellectual, academic, and policy circles in the West and elsewhere — that “Muslim” societies are especially resistant to embarking upon the path of demographic and familial change that has transformed population profiles in Europe, North America, and other “more developed” areas (UN terminology). But such notions speak to a bygone era; they are utterly uninformed by the important new demographic realities that reflect today’s life patterns within the Arab world, and the greater Islamic world as well.
Throughout the Ummah, or worldwide Muslim community, fertility levels are falling dramatically for countries and subnational populations — and traditional marriage patterns and living arrangements are undergoing tremendous change. While these trends have not gone entirely unnoticed, no more than a handful of pioneering scholars and observers have as yet drawn attention to them and their potential significance.1 In this essay we will detail the dimensions of these changes in fertility patterns within the Muslim world, examine some of their correlates and possible determinants, and speculate about some of their implications.
THE GLOBAL MUSLIM POPULATION
There is some inescapable imprecision to any estimates of the size and distribution of the Ummah — an uncertainty that turns in part on questions about the current size of some Muslim-majority areas (e.g., Afghanistan, where, as one U.S. official country study puts it, “no comprehensive census based upon systematically sound methods has ever been taken”), and in part on the intrinsic difficulty of determining the depth of a nominal believer’s religious faith, but more centrally on the crucial fact that many government statistical authorities do not collect information on the religious profession of their national populations. For example: While the United States maintains one of the world’s most extensive and developed national statistical systems, the American government expressly forbids the U.S. Census Bureau from surveying the American public about religious affiliation; the same is true in much of the EU, in the Russian Federation, and in other parts of the “more developed regions” with otherwise advanced data-gathering capabilities.
Nevertheless, on the basis of local population census returns that do cover religion, Demographic and Health Survey (DHS) reports where religious preference is included, and other allied data sources, it is possible to piece together a reasonably accurate impression of the current size and distribution of the world’s Muslim population.
Two separate efforts to estimate the size and spread of the Ummah result in reasonably consistent pictures of the current worldwide Muslim demographic profile. The first, prepared by Todd M. Johnson of Gordon-Conwell Theological Seminary under the aegis of the World Christian Database, comes up with an estimate of 1.42 billion Muslims worldwide for the year 2005; by that reckoning, Muslims would account for about 22 percent of total world population. The second, prepared by a team of researchers for the Pew Forum on Religion and Public Life, placed the total global Muslim population a few years later, circa 2009, at roughly 1.57 billion, which would have been approximately 23 percent of the estimated world population at the time.
Although upwards of one fifth of the world’s population today is thereby estimated to be Muslim, a much smaller share of the population of the “more developed regions” adheres to Islam: perhaps just over three percent of that grouping (that is to say, around 40 million out of its total of 1.2 billion people). Thus the proportion of the world’s Muslims living in the less developed regions is not only overwhelming, but disproportionate: Well over a fourth of the population of the less developed regions — something close to 26 or 27 percent — would be Muslim, to go by these numbers.
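Readers who want to check these shares can redo the arithmetic from the figures just quoted. Below is a back-of-envelope sketch in Python; note that the world-population total of roughly 6.8 billion for 2009 is an assumed round figure in the spirit of standard UN estimates, not a number given in this essay.

```python
# Back-of-envelope check of the population shares cited above.
# Figures are in billions. The world total of 6.8 for circa 2009 is an
# assumed round number (UN-style estimate); it does not appear in the essay.
muslims_world = 1.57       # Pew Forum estimate of global Muslims, c. 2009
world = 6.8                # assumed total world population, c. 2009
more_dev = 1.2             # population of the "more developed regions"
muslims_more_dev = 0.04    # roughly 40 million Muslims in those regions

print(f"share of world population:           {muslims_world / world:.1%}")
print(f"share within more developed regions: {muslims_more_dev / more_dev:.1%}")
less_dev = (muslims_world - muslims_more_dev) / (world - more_dev)
print(f"share within less developed regions: {less_dev:.1%}")
# -> roughly 23%, 3.3%, and 27%, consistent with the text's
#    "just over three percent" and "close to 26 or 27 percent"
```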
Most of the world’s Muslim population inhabits a tropical and semitropical expanse that stretches across Africa and Asia from the Atlantic shores of Mauritania and Morocco to the Pacific archipelagos of Indonesia and the Philippines. The great preponderance of the world’s Muslims live in Muslim-majority countries — 73 percent according to the World Christian Database, nearly 80 percent according to the Pew Forum study (which lists 49 countries and territories in Asia, Africa, and Europe that it identifies as Muslim-majority). Another tenth of the Ummah (roughly 160 million people as of 2009) lives within India, where Muslims are a religious minority. In all, eight countries today account for over 60 percent of the world’s Muslim population: Indonesia, Pakistan, India, Bangladesh, Egypt, Nigeria, Iran, and Turkey. Note that only one of these eight is an Arab society in the Middle East.
FERTILITY DECLINE IN MUSLIM-MAJORITY COUNTRIES
Since the overwhelming majority of today’s Muslims live in Muslim-majority countries, and since those same countries are typically overwhelmingly Muslim (by the Pew study’s estimate, 43 of those 49 countries and places are over two-thirds Muslim, 40 of them over 90 percent Muslim), we can use national-level data on fertility for Muslim-majority countries as a fairly serviceable proxy for examining changes in fertility patterns for the Muslim world community. For our purposes, the advantage here is that a number of authoritative institutions — most importantly, the United Nations Population Division (UNPD) and the United States Census Bureau (USCB) — regularly estimate and project population trends for all the countries in the world.
The UNPD provides estimates and projections for period “total fertility rates” (births per woman per lifetime) for over 190 countries and territories across the planet for both the late 1970s and the 2005 to 2010 period. Using these data, we can appraise the magnitude of fertility declines in 48 of the world’s 49 identified Muslim-majority countries and territories.2
One way of considering the changes in fertility in these countries is to plot a 45-degree line across a chart and to compare fertility levels from three decades ago on one axis against recent fertility levels on the other axis. A country whose fertility level remains unchanged over time will remain exactly on this plotted line. If the fertility levels of the earlier time are plotted on the x-axis and the more current fertility levels on the y-axis, any country whose fertility level rises over time will be above the plotted line, whereas a country experiencing fertility decline will be located below the plotted line; the distance of these data points from the plotted line indicates the magnitude of a country’s absolute drop in fertility over these decades.
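As an illustration of the construction just described, here is a minimal Python/matplotlib sketch of such a chart. The three country values are hypothetical placeholders chosen for the example, not the actual UNPD estimates that underlie Figure 1.

```python
import matplotlib.pyplot as plt

# Hypothetical (country, TFR late 1970s, TFR 2005-2010) triples.
# Illustrative values only -- not the UNPD estimates behind Figure 1.
data = [("Iran", 6.5, 1.8), ("Niger", 7.9, 7.2), ("Kazakhstan", 3.1, 2.5)]

earlier = [row[1] for row in data]   # fertility three decades ago (x-axis)
recent = [row[2] for row in data]    # recent fertility (y-axis)

fig, ax = plt.subplots()
ax.scatter(earlier, recent)

# The 45-degree identity line: a country on the line saw no change;
# one below it experienced fertility decline, and its vertical distance
# from the line is the absolute size of that decline.
limits = (0, 9)
ax.plot(limits, limits, linestyle="--")
ax.set(xlim=limits, ylim=limits,
       xlabel="TFR, late 1970s", ylabel="TFR, 2005-2010")

for name, x, y in data:
    ax.annotate(name, (x, y))

plt.show()
```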
The results from this exposition of data are displayed in Figure 1. As may be seen, according to UNPD estimates and projections, all 48 Muslim-majority countries and territories witnessed fertility decline over the three decades under consideration. To be sure: For some high-fertility or extremely-high-fertility venues in sub-Saharan Africa, where TFRs (total fertility rates) in the six to eight range prevailed in the late 1970s, declines are believed to have been marginal (think of Sierra Leone, Mali, Somalia, and Niger). In other places where a fertility transition had already brought TFRs down to around three by the late 1970s, subsequent absolute declines also appear to have been somewhat limited (think of Kazakhstan). In most of the rest of the Muslim-majority countries and territories, however, significant or dramatic reductions in fertility have been registered — and in many of these places, the drops in question have been truly extraordinary…
July 31, 2012
In September 1863, a local paper in Somerset, England, ran an article about a man and a woman from Taunton whose child had been stricken with scarlet fever. The illness was depressingly common, so a suffering child was not in itself noteworthy—what made the news were the remedies proposed. Distraught, the parents had turned to a group of women for advice, and this “jury of matrons,” in the paper’s words, all agreed that there was no hope of survival. Instead, they suggested ways to prevent the child from “dying hard”: open all the doors, drawers, cupboards, and boxes in the house, untie any knots—perhaps in a shoelace, a curtain pull, or an apron sash—and remove all keys from their locks.
In 1707, Taunton had been the site of one of the last witch trials in England, and while the paper didn’t call these matrons witches outright, urban readers who came across the story would likely have reacted with anything from bemusement to dismay that ancient superstitions still persisted in England’s smaller towns. The household rituals suggested by these women rested on a belief that stretches back thousands of years, a belief in “sympathetic magic,” a phrase coined in 1890 by anthropologist James G. Frazer in The Golden Bough—a mammoth study of magic, science, and religion. The sympathies involved were commonplace: everyday objects could affect human behavior and physical actions. By throwing open the doors and untying the knots, the Somerset jury of matrons were offering their best advice so that a “sure, certain, and easy passage into eternity could be secured.”
Miraculously, the child did not die. A few years later, a surgeon familiar with the case suggested that the women’s advice had inadvertently ventilated the home, and he celebrated this bit of “magical” intervention. (“Oh, that there were in scarlet-fever cases a good many such old women’s—such a ‘jury of matrons’—remedies!”) In his book, Frazer had a good deal more scorn for the matrons and their advice: “Strange to say, the child declined to avail itself of the facilities for dying so obligingly placed at its disposal by the sagacity and experience of the British matrons of Taunton; it preferred to live rather than give up the ghost.”
It’s a shame that we owe so much of our understanding of sympathetic magic to someone whose attitude toward magic was so, in a word, unsympathetic. In the time since its publication, The Golden Bough has influenced and inspired everyone from T. S. Eliot to H. P. Lovecraft, W. B. Yeats to Joseph Campbell. No one before Frazer had so exhaustively documented the wide variety of shamanistic, magical, and religious practices throughout the world, nor had anyone sought to so thoroughly synthesize them into a work regarding the basic structures of human belief. Yet for all its rigor, Frazer repeated a vicious refrain throughout: magic is a “spurious system of natural law as well as a fallacious guide of conduct,” “a false science,” as well as an “abortive” and “bastard” art.
Drawing from questionnaires sent to anthropologists, field workers, missionaries, and colonial administrators, Frazer analyzed a broad spectrum of magical rites and rituals—the preservation of hair and nail clippings, the destruction of images and effigies, the healing power of color—ultimately grouping them into two broad categories: the Law of Similarity, whereby “like produces like” (a mutilated wax figure, for example, standing in for a hated person), and the Law of Contagion, in which “things which have once been in contact with each other continue to act on each other at a distance after the physical contact has been severed” (in which hair, nail clippings, or clothing once belonging to that person just might do the trick).
Sympathetic magic taps into a symbolic ordering of the world, where disparate objects and ideas can have unexpected correspondences and new potentials. This kind of magic reflects the order of our lives even as it seeks to gain mastery over that order. Its genius is its simplicity, which articulates a basic but all-encompassing explanation for human and natural events. With the proper gesture or carved totem, all of heaven and earth is in the hands of the magician. Sympathetic magic is easy to understand, requires no real training, and is catholic in its application; anyone who can tie knots or who can fashion a wax doll can employ it, from kings to their illiterate peasant subjects.
Perhaps this is why Frazer disdained sympathetic magic: in its very simplicity, scholars saw something juvenile and unsophisticated. Frazer describes the rainmaking ceremonies of an unspecified tribe of American Indians as involving a “sort of childish make-believe,” whereby one seeks to establish a sympathy with rain by making oneself wet, and fertility rites are “relics of an age of childish ignorance,” practiced by those who don’t understand modern obstetrics.
It’s no accident that the study of magic and ritual flourished during the wane of the British empire; the beliefs of native cultures were collected and studied so they could be corrected. The study of magic was important, the ethnologist E. E. Evans-Pritchard would later explain, “not only for the anthropologist but also for the colonial administrator and missionary, if they wish to show to the peoples whom they govern and teach that they understand their notions about right and wrong.” For some, the main reason for studying sympathetic magic was to eradicate it.
For all its erudition and analysis, The Golden Bough has for more than a century helped cement the idea that magic is inappropriate, wrongheaded thought. Yet what separates magic from religion or science is not its methodology—Frazer himself notes that it “is therefore a truism, almost a tautology, to say that all magic is necessarily false and barren; for were it ever to become true and fruitful, it would no longer be magic but science”—it’s that ordinary people can do it, transforming their lives with the ambitious power of everyday thought…
Buying Policy on Israel: American billionaire Sheldon Adelson is doing no favor to Israel by promoting policies that condemn it to perpetual conflict and isolation.
July 30, 2012
In this first presidential election since the Citizens United decision of the Supreme Court took away Congress’s legislative ability to reduce the corrupting influence of big money on the U.S. electoral process, there are worrisome manifestations of that influence every week. For example, Mitt Romney right now is doing some fund-raising in Britain among banking nabobs on the heels of the Libor-fixing scandal. A cochair of an event that is charging $25,000 to $75,000 a head to schmooze with the presumptive GOP nominee is the chief lobbyist of Barclays. He replaced in that role former Barclays chief executive Bob Diamond, who resigned (from his bank job and from his role in the Romney fund-raiser) because of his bank’s central role in the scandal.
But if I had to identify one source of big money whose influence is most worrisome on issues I happen to think about a lot, it would be someone who will meet Romney at a later stop on his current overseas trip, in Israel. That source is casino magnate Sheldon Adelson. Two things about Adelson’s role in this post-Citizens United world stand out. One is the sheer magnitude of the money involved. Adelson appears to be on track to be the single biggest individual donor in this U.S. election year—although we may never know that for sure, given the way the bundling of political money works and the refusal of the Romney campaign to identify the sources of its bundled money. Adelson’s fortune is currently estimated at about $24 billion. He has taken in stride the fluctuation of his wealth by many billions as shares of the Las Vegas Sands Corporation tanked during the recession before recovering, and he has repeatedly commented about how wide he intends to open his wallet to the candidate of his choice. During the primary season, that candidate was Newt Gingrich. Adelson said he would have been willing to give as much as $100 million to Gingrich’s campaign, before that campaign ended and Adelson turned his support to Romney.
The other distinguishing characteristic of Adelson is the strength of his affinity to a foreign government—not just to a foreign country but to the policies of the current government of that country. It is appropriate that Adelson will be one of the greeters when Romney arrives in Israel because, although Adelson is a U.S. citizen, his declared primary allegiance is to Israel. Adelson once commented that when he did military service as a young man it “unfortunately” was in a U.S. uniform rather than an Israeli one and that all he and his Israeli wife “care about is being good Zionists, being good citizens of Israel, because even though I am not Israeli born, Israel is in my heart.”
Adelson is using his fortune to push a political agenda in Israel as well as in the United States. One way he has done that is by establishing, five years ago, a free-distribution newspaper, Israel Hayom, which has become the highest-circulation daily in Israel. The paper follows a firmly rightist, pro-Netanyahu line. As a business the newspaper is a money loser, but Adelson has cheerfully indicated his willingness to continue losing money on the paper (not a significant loss, in comparison with his fortune) to get its message across.
Israel already has a government to Adelson’s liking, and he is using his money to sustain public support for it. In the United States, it is a matter of still trying to buy a government to his liking. His current hoped-for vehicle for doing that—Mitt Romney—has to date left his foreign policy largely a blank beyond slogans and the most general of themes. This was fully in evidence in his pre-trip VFW speech, in which the paucity of specific alternatives to the Obama administration’s policies was as evident as the rhetorical vehemence with which the Obama foreign policy in general was denounced. (Jacob Heilbrunn has furnished a good guide on how to interpret that speech.) It is possible, of course, that very specific foreign-policy ideas are firmly embedded in the candidate’s head, being kept in occultation there until he is elected. It is at least as plausible that those who would enjoy influence with a President Romney, including those most helpful in electing him, would have considerable opportunity to influence the policies that eventually emerge. In Adelson’s case, so much money is involved that it is hard to believe that money would not buy something on matters he feels most strongly about. When Gingrich was his man, it bought a candidate who dismissed the Palestinians as an “invented” people…
Healing Spirits: Why Medicine Has For Thousands Of Years Tried To Shed The Magical And Banish The Miraculous
July 30, 2012
Earlier this year at the hospital, where I am in training as a psychiatry resident, some colleagues were discussing the case of a young woman who had presented with worsening headaches. On CT scan she’d been found to have a meningioma, an often curable tumor of the membranes covering the brain. The prognosis was excellent, and she was offered surgery. She refused, instead flying to the Philippines, where a folk healer—a “psychic surgeon,” to use the term of trade—pressed on her forehead, extracted a dripping gobbet of flesh, and pronounced her cured.
The discussion was charged, running from the failures of modern medicine, to the ethical quandaries of how to counsel a patient seeking alternative therapies, to the power of the placebo (the tumor hadn’t vanished on follow-up CT, but she still claimed to feel relief). At the time, however, I was mostly unsettled by a memory. Ten years earlier, traveling in Brazil, I had heard again and again the story of another miracle surgeon. I remembered only the most extraordinary details of the case: the healer, a peasant with no medical training, was said to enter trances in which a spirit guided him as he operated on thousands of patients, from the destitute and hopeless of the nearby villages, to patients as prominent as the Brazilian president’s daughter.
Although I remembered little else, his fame was such that even in 2012, some forty years after his death, it wasn’t difficult to flesh out more details of his story. Jose Pedro de Freitas, known by his nickname Zé Arigó, was born in 1921 or 1922 at a farm site six kilometers outside the town of Congonhas do Campo, in the mountainous state of Minas Gerais. As a young man, he was different, tormented by headaches and a strange white light, and then, as he grew older, by dreams. In these, he found himself in an unfamiliar chamber, watching gowned and aproned figures speaking a foreign tongue. One night, a severe, stout, bald man, frock buttoned to his chin—a monster, Arigó would later say—separated himself from the others, identified himself as Dr. Adolph Fritz, a German killed in World War I, and announced that he’d selected Arigó to carry out his earthly work. That night of revelation, Arigó awoke screaming. He sought help from doctors and the local priest—to no avail. It was only when he obeyed the surgeon that the nightmares ceased. Speaking German (a language he had never learned) and operating without anesthesia or antisepsis, Arigó used any tool at hand—butcher knives, scissors, rusty garden shears. He removed tumors and kidney stones, scarcely shedding blood. He cured blindness by sliding a blade high behind the orbit. Other times, like Jesus and the paralytic at Capernaum, Arigó simply commanded an illness to desist. As word of his successes spread, visitors began to arrive, first the ill and then others: incredulous doctors from São Paulo, foreign parapsychologists, journalists. And then, in 1956, carrying a charge of charlatanism, the police.
For the next fifteen years, Arigó would be alternately arrested, tried, imprisoned, then rehabilitated to massive, adoring crowds. He would submit to videotaping meant to expose some sleight of hand and to electroencephalograms to look for seizures. He would be attacked by the Brazilian Medical Association as a fraud and ridiculed as a schizophrenic. Nor did the “Arigó wars” end with his death; only months after Arigó’s Chevy Opala collided with a truck on a rainy mountain road, Dr. Fritz returned to guide other healers, as he continues to do today.
While Arigó’s story remains dominated by polemics, the question of how best to care for the young woman with the meningioma seemed to require an exploration of something other than whether the most extraordinary claims of healing were simply true or false. After all, medicine and miracle have always been entwined. It was from medicine that magic was born, wrote Pliny the Elder, although one could make the reverse case as well: the very symbol of modern medicine—Asclepius’ staff—is named for a Greek divinity whose temples were sites of divine cures. And yet from its earliest years, medicine has tried to sever its ties, to shed the magical and to banish the miraculous. The more I read about Arigó, the more his story spoke to this uneasy history, from the nature of magical healers to the lasting impact they have had on the medical orthodoxy that has tried to stamp them out.
In his 1943 study of the psychology of medicine men, historian Erwin Ackerknecht surveyed a vast anthropological literature and distinguished between two patterns of initiation into the healing arts. For one group, medical knowledge was obtained through carefully practiced ritual, induced by fasting, drugs, or ceremonies invoking spirits who could lead the healer to a cure. For the second, Ackerknecht cited Russian travelers to Siberia, who had reported rituals among the Yakuts that were anything but methodical:
He who is to become a shaman begins to rage like a raving madman. He suddenly utters incoherent words, falls unconscious, runs through the forests, lives on the bark of trees, throws himself into fire and water, lays hold on weapons and wounds himself, in such ways that his family is obliged to keep watch on him.

Despite the ubiquity of the word shaman today, its diffusion is recent. It comes from saman, from the Tungus—known today as the Evenk—people of Siberia, and the first outsiders to take detailed note were exiled Russian intellectuals. After a trickle of reports in the late nineteenth century, shamanism arrived in the West in two principal waves: during the Russo-American cooperation in the 1897-1902 Jesup North Pacific Expedition, and later, in more popular works, describing convulsive “pre-shamanic psychosis” as a disease unique to the North Asian steppes. From then, the word proved infectious, acquiring the hazy meaning of any healer who works by mysterious means. Seeking a definition for his monumental survey, Shamanism, the historian of religion Mircea Eliade worried over these loose boundaries between medicine man, sorcerer, medium, physician. At the same time, he felt there was an essence—an archaic “technique of ecstasy”—that could be found in a spectrum of practices from around the world…
July 30, 2012
“My child could have done that!” Wrong – neuroaesthetics is starting to show us why abstract art can be so beguiling
STANDING in front of Jackson Pollock’s Summertime: Number 9A one day, I was struck by an unfamiliar feeling. What I once considered an ugly collection of random paint splatters now spoke to me as a joyous celebration of movement and energy, the bright yellow and blue bringing to mind a carefree laugh.
It was my road-to-Damascus moment – the first time a piece of abstract art had stirred my emotions. Like many people, I used to dismiss these works as a waste of time and energy. How could anyone find meaning in what looked like a collection of colourful splodges thrown haphazardly on a 5.5-metre-wide canvas? Yet here I was, in London’s Tate Modern gallery, moved by a Pollock.
Since then, I have come to appreciate the work of many more modern artists, who express varying levels of abstraction in their work, in particular the great Piet Mondrian, Paul Klee, and contemporary artist Hiroshi Sugimoto. Even so, when I tried to explain my taste, I found myself lost for words. Why are we attracted to paintings and sculptures that seem to bear no relation to the physical world?
Little did I know that researchers had already started to address this question. By studying the brain’s responses to different paintings, they have been examining the way the mind perceives art. Although their work cannot yet explain the nuances of our tastes, it has highlighted some of the unique ways in which these masterpieces hijack the brain’s visual system.
The studies are part of an emerging discipline called neuroaesthetics, founded just over 10 years ago by Semir Zeki of University College London. The idea was to bring scientific objectivity to the study of art, in an attempt to find neurological bases for the techniques that artists have perfected over the years. It has already offered insights into many masterpieces. The blurred imagery of Impressionist paintings seems to tickle the brain’s amygdala, for instance, which is geared towards detecting threats in the fuzzy rings of our peripheral vision. Since the amygdala plays a crucial role in our feelings and emotions, that finding might explain why many people find these pieces so moving.
Could the same approach tell us anything about the controversial pieces that began to emerge from the tail end of Impressionism more than 100 years ago? Whether it is Mondrian’s rigorously geometrical, primary-coloured compositions, or Pollock’s controversial technique of dripping paint onto the canvas in seemingly haphazard patterns, the defining characteristic of modern art has been to remove almost everything that could be literally interpreted.
Although these works often sell for whopping sums of money – Pollock’s No. 5 fetched $140 million in 2006 – they have attracted many sceptics, who claim that modern artists lack the skill or competence of the masters before them. Instead, they see the newer works as a serious case of the emperor’s new clothes, believing that people might claim to like them simply because they are in fashion. In the scathing words of the American satirist Al Capp, they are the “product of the untalented, sold by the unprincipled to the utterly bewildered”.
Chimp or Rothko?
We certainly do have a strong tendency to follow the crowd. When asked to make simple perceptual decisions such as matching up a shape with its rotated image, for instance, people will often choose a definitively wrong answer if they see others doing the same. It is easy to imagine that the herd mentality would have an even greater impact on a fuzzy concept like art appreciation, where there is no right or wrong answer.
Angelina Hawley-Dolan of Boston College, Massachusetts, responded to this debate by designing a fun experiment that played with her volunteers’ expectations of the pieces they were seeing. Their task was simple. The volunteers viewed pairs of paintings – either the creations of famous abstract artists or the doodles of amateurs, infants, chimps and elephants. Then they had to judge which they preferred. A third of the paintings were given no captions, while the rest were labelled. The twist was that sometimes the labels were mixed up, so that the volunteers might think they were viewing a chimp’s messy brushstrokes when they were actually seeing an expressionist piece by Mark Rothko. Some sceptics might argue that it is impossible to tell the difference – but in each set of trials, the volunteers generally preferred the work of the well-accepted human artists, even when they believed it was by an animal or a child (Psychological Science, vol 22, p 435). Somehow, it seems that the viewer can sense the artist’s vision in these paintings, even if they can’t explain why…
Read it all.
July 29, 2012
The Wisdom of George Washington: Have we deviated from the sound principles of his Farewell Address?
July 29, 2012
The world’s most generous prize money is attached not to the Nobel Prize but to the Mo Ibrahim Prize, awarded for good governance in Africa, as determined by a very simple test: a democratically elected leader who actually leaves office at the end of his term. The winner receives five million dollars plus two hundred thousand dollars a year for life. The 53 African nations yielded one claimant in 2011, but none for the two years previous. The precedent set by George Washington has not been easy to establish elsewhere, prize money or not.
George Washington is justly famous for his retirements: his republican refusal of perpetual power on two all-important occasions, first when he resigned supreme military authority in 1783 and then again when he relinquished presidential authority in 1796. Although he went willingly, it can’t be said that he went quietly. Not, of course, that he made any sort of fuss and bother—that was not his style—but he did on both occasions take the opportunity to speak to his fellow citizens about the perils ahead. This impulse to extend his guiding presence over the generations indicates, I think, how difficult it actually was for the most competent man on the stage to exit of his own accord and turn the nation’s performance over to an ensemble cast.
“Silence in Me Would Be a Crime”
In Washington’s first valedictory, the “Circular to the States,” the General had noted that there were some who might object to his even offering political counsel for the future, viewing it as an act of arrogant presumption, “stepping out of the proper line of . . . duty.” Washington responded by saying, “silence in me would be a crime.” Why a crime? Because although the war had been won, it was yet to be determined, according to Washington, “whether the Revolution must ultimately be considered as a blessing or a curse.” (We speak today of the Arab “Spring”—a hopeful metaphor, but also misleading since political life is not regular like the seasons. Washington was aware that what follows the revolution counts most and there are never any guarantees of what that might be.)
In view of what he called “the present Crisis,” Washington was convinced it was not only permissible but also incumbent on him to set forth his thoughts on government, which he proceeded to do by describing four “Pillars” that were needed to support “the glorious Fabrick of our Independency and National Character.”
Three years later, Washington was distressed both by the state of the Union and by his countrymen’s disregard of his parting words. In a letter to John Jay, Washington lamented that “my sentiments and opinions . . . have been neglected, tho’ given as a last legacy in the most solemn manner.” Of course, as it turned out, the Circular of 1783 was not his last legacy. After serving two terms as president under the new Constitution, Washington had a chance to compose another Farewell Address, now with higher expectations of finding a receptive and lasting audience.
Like the Circular, the Farewell Address was never delivered as a speech; it was, from the first, a written document, intended to be read not heard, pondered not applauded. Its audience and mode of distribution, however, were strikingly different from the Circular’s. The Circular was directed to the respective governors of the states. It bore the salutation “Sir” and called upon “your Excellency” to communicate the contents to “your Legislature.” The “Citizens of America” were mentioned, but always in the third person as “they.”
By contrast, the Farewell was published via the popular medium of the newspapers. It was an open letter, bearing the salutation “Friends, and Fellow-Citizens.” The second-person “you” now embraced the people, rather than the functionaries of the states. By the close of the Farewell, even the gap between “I,” George Washington, and “you,” my fellow citizens, was de-emphasized, as Washington shifted increasingly to the first-person plural possessive: as in “our country,” “our interest,” “our foreign relations,” “our destiny,” culminating in the evocative closing reference to “our mutual cares, labours, and dangers.”
By the way, this salutation, “Friends, and Fellow-Citizens,” is unique in Washington’s writings. Throughout his presidency, each Annual Message had been addressed to “Fellow-citizens of the Senate and of the House of Representatives.” So, too, the First Inaugural, although the Second Inaugural was addressed more broadly to “Fellow-citizens.” The Farewell takes this a step further. The addition of the word “Friends” sounds a new, more intimate note—a note that develops into one of the speech’s recurring motifs. Whereas the formal voice of the Circular had been actuated by duty—remember, “silence would be a crime”—the warmer voice of the Farewell is prompted by love. As Washington himself puts it, his counsels are those of “an old and affectionate friend.”
The Counsel of an Affectionate and Parting Friend
So what did the nation’s “parting friend” offer as his last legacy for our “solemn contemplation” and “frequent review”? The 50 paragraphs of the address are carefully structured. The primary divisions are an opening section of 6 paragraphs which constitutes the resignation proper, a central section of 36 paragraphs which delineates Washington’s maxims and warnings, and a concluding section of 8 paragraphs which measures Washington’s own administration against his expressed principles and solicits pardon for any shortcomings.
The language of the opening section, with its ostentatious modesty, is now alien to us. Our self-trumpeting politicians would never dream of drawing attention, as Washington does, to his “very fallible judgment” or “the inferiority of my qualifications.” For himself, Washington claims only “good intentions.” Of course, maybe it’s easier to appear humble when one’s actions have spoken as irrefutably as Washington’s have. The great man in the infant republic effaces himself, and deflects the credit onto his fellow citizens. “If benefits have resulted to our country from these services,” Washington insists, “let it always be remembered to your praise,” since “the constancy of your support was the essential prop of the efforts.”
The converse of Washington’s humility is his gratitude; he leaves office deeply indebted to “my beloved country.” He closes the opening section with a prayer—a carefully itemized prayer—hoping that the nation will be blessed with the favor of Heaven, perpetual Union, fidelity to the Constitution, the wise Administration of government, and a completion of national Happiness that will inspire the worldwide spread of liberty…