Mafia States: It has become difficult to distinguish the geopolitical calculations of some states from the profit motives of criminal organizations
May 31, 2012
The global economic crisis has been a boon for transnational criminals. Thanks to the weak economy, cash-rich criminal organizations can acquire financially distressed but potentially valuable companies at bargain prices. Fiscal austerity is forcing governments everywhere to cut the budgets of law enforcement agencies and court systems. Millions of people have been laid off and are thus more easily tempted to break the law. Large numbers of unemployed experts in finance, accounting, information technology, law, and logistics have boosted the supply of world-class talent available to criminal cartels. Meanwhile, philanthropists all over the world have curtailed their giving, creating funding shortfalls in the arts, education, health care, and other areas, which criminals are all too happy to fill in exchange for political access, social legitimacy, and popular support. International criminals could hardly ask for a more favorable business environment. Their activities are typically high margin and cash-based, which means they often enjoy a high degree of liquidity — not a bad position to be in during a global credit crunch.
But emboldened adversaries and dwindling resources are not the only problems confronting police departments, prosecutors, and judges. In recent years, a new threat has emerged: the mafia state. Across the globe, criminals have penetrated governments to an unprecedented degree. The reverse has also happened: rather than stamping out powerful gangs, some governments have instead taken over their illegal operations. In mafia states, government officials enrich themselves and their families and friends while exploiting the money, muscle, political influence, and global connections of criminal syndicates to cement and expand their own power. Indeed, top positions in some of the world’s most profitable illicit enterprises are no longer filled only by professional criminals; they now include senior government officials, legislators, spy chiefs, heads of police departments, military officers, and, in some extreme cases, even heads of state or their family members.
This fusing of governments and criminal groups is distinct from the more limited ways in which the two have collaborated in the past. Governments and spy agencies, including those of democratic countries, have often enlisted criminals to smuggle weapons to allied insurgents in other countries or even to assassinate enemies abroad. (The CIA’s harebrained attempt to enlist American mafia figures to assassinate Fidel Castro in 1960 is perhaps the best-known example.) But unlike normal states, mafia states do not just occasionally rely on criminal groups to advance particular foreign policy goals. In a mafia state, high government officials actually become integral players in, if not the leaders of, criminal enterprises, and the defense and promotion of those enterprises’ businesses become official priorities. In mafia states such as Bulgaria, Guinea-Bissau, Montenegro, Myanmar (also called Burma), Ukraine, and Venezuela, the national interest and the interests of organized crime are now inextricably intertwined.
Because the policies and resource allocations of mafia states are determined as much by the influence of criminals as by the forces that typically shape state behavior, these states pose a serious challenge to policymakers and analysts of international politics. Mafia states defy easy categorization, blurring the conceptual line between states and nonstate actors. As a result, their behavior is difficult to predict, making them particularly dangerous actors in the international environment.
A REVOLUTION IN CRIME
Conventional wisdom about international criminal networks rests on three faulty assumptions. First, many people believe that when it comes to illicit activities, everything has been done before. It is true that criminals, smugglers, and black markets have always existed. But the nature of international crime has changed a great deal in the past two decades, as criminal networks have expanded beyond their traditional markets and started taking advantage of political and economic transformations and exploiting new technologies. In the early 1990s, for example, criminal groups became early adopters of innovations in communications, such as advanced electronic encryption. Criminal syndicates also pioneered new means of drug transportation, such as “narco-submarines”: semi-submersible vessels able to evade radar, sonar, and infrared systems. (Drug cartels in Colombia eventually graduated to fully submersible submarines.) In more recent years, criminal organizations have also taken advantage of the Internet, leading to a dizzying growth in cybercrime, which cost the global economy some $114 billion in 2011, according to the Internet security firm Symantec.
A second common misperception is that international crime is an underground phenomenon that involves only a small community of deviants operating at the margins of societies. The truth is that in many countries, criminals today do not bother staying underground at all, nor are they remotely marginal. In fact, the suspected leaders of many major criminal groups have become celebrities of a sort. Wealthy individuals with suspicious business backgrounds are sought-after philanthropists and have come to control radio and television stations and own influential newspapers. Moreover, criminals’ accumulation of wealth and power depends not only on their own illicit activities but also on the actions of average members of society: for example, the millions of citizens involved in China’s counterfeit consumer-goods industry and in Afghanistan’s drug trade, the millions of Westerners who smoke marijuana regularly, the hundreds of thousands of migrants who every year hire criminals to smuggle them to Europe, and the well-to-do professionals in Manhattan and Milan who employ illegal immigrants as nannies and housekeepers. Ordinary people such as these are an integral part of the criminal ecosystem.
A third mistaken assumption is that international crime is strictly a matter of law enforcement, best managed by police departments, prosecutors, and judges. In reality, international crime is better understood as a political problem with national security implications. The scale and scope of the most powerful criminal organizations now easily match those of the world’s largest multinational corporations. And just as legitimate organizations seek political influence, so, too, do criminal ones. Of course, criminals have always sought to corrupt political systems to their own advantage. But illicit groups have never before managed to acquire the degree of political influence now enjoyed by criminals in a wide range of African, eastern European, and Latin American countries, not to mention China and Russia…
B. F. Skinner’s notorious theory of behavior modification was denounced by critics 50 years ago as a fascist, manipulative vehicle for government control. But Skinner’s ideas are making an unlikely comeback today, powered by smartphone apps that are transforming us into thinner, richer, all-around-better versions of ourselves. The only thing we have to give up? Free will.
MY YOUNGER BROTHER DAN gradually put on weight over a decade, reaching 230 pounds two years ago, at the age of 50. Given his 5-foot-6 frame, that put him 45 pounds above the U.S. National Institutes of Health’s threshold of obesity. Accompanying this dubious milestone were a diagnosis of type 2 diabetes and multiple indicators of creeping heart disease, all of which left him on a regimen of drugs aimed at lowering his newly significant risks of becoming seriously ill and of dying at an unnecessarily early age.
He’d be in good company: a 2007 study in The Journal of the American Medical Association found that each year, 160,000 Americans die early for reasons related to obesity, accounting for more than one in 20 deaths. The costs are not just bodily. Other studies have found that a person 70 or more pounds overweight racks up extra lifetime medical costs of as much as $30,000, a figure that varies with race and gender. And we seem to be just warming up: cardiologists who have looked at current childhood obesity rates and other health indicators predict a steep rise in heart disease over the next few decades, while a report from the Organization for Economic Cooperation and Development projected that two-thirds of the populations of some industrialized nations will be obese within 10 years.
Dan had always been a gregarious, confident, life-of-the-party sort of guy, but as his weight went up, he seemed to be winding down. Then, on a family visit to Washington, D.C., early last year, he and I dropped in on the National Gallery of Art, where 10 minutes of walking left him so sore in one leg that I had to find him a wheelchair. That evening, I decided to say the obvious: He was fast heading to incapacity and an early grave. He had a family to think of. He needed to get into some sort of weight-loss program. “Got any suggestions?” he retorted. As it happened, I did.
Today, my brother weighs 165 pounds—what he weighed at age 23—and his doctor has taken him off all his medications. He has his vigor back, and a brisk three-mile walk is a breeze for him.
Sorry if this sounds like a commercial for a miracle weight-loss program. But in fact my brother did it with plain old diet and exercise, by counting calories and walking. He had no surgery, took no supplements or pills, ate no unusual foods, had no dietary restrictions, embarked on no extreme exercise regimen. He will need to work his whole life to keep the weight off, but he shows every sign of being on the right track. He has changed his eating and exercise habits, and insists he enjoys the new ones more than the old.
In short, Dan seems a lot like many of the people in the National Weight-Control Registry, the research database of those who, despite the popular wisdom that avoiding weight regain is a Herculean task, have kept off a minimum of 30 pounds for at least a year. Most of us know someone who lost weight years ago and has kept it off, and we all see celebrities who claim to have slimmed down for good using plain old diet and exercise, from Bill Clinton to Drew Carey to Jennifer Hudson. But we keep hearing that the vast majority of us—98 percent is a figure that gets thrown about—can’t expect to do the same.
Alcoholics don’t seem to face such dismal prospects, thanks to Alcoholics Anonymous and similar multistep programs, which are widely regarded as effective treatments. With obesity, we’re apparently at a loss for a clear answer. Fads like the Atkins diet slowly fade in popularity after dieters watch the weight return. We’re left with the impression that the techniques needed to permanently lose weight don’t exist, or apply to only a tiny percentage of the population, who must be freaks of willpower or the beneficiaries of exotic genes. Scientists and journalists have lined up in recent years to pronounce the diet-and-exercise regimen a nearly lost cause—a view argued in no fewer than three cover stories and another major article in The New York Times Magazine over the past 10 years, and in a cover story in this magazine two years ago.
All of which is odd, because weight-loss experts have been in fairly strong agreement for some time that a particular type of diet-and-exercise program can produce modest, long-term weight loss for most people. But this program tends to be based in clinics operated by relatively high-priced professionals, and requires a significant time commitment from participants—it would be as if the only way to get treated for alcoholism were to check into the Betty Ford Center. The problem is not that we don’t know of a weight-control approach that works; it’s that what works has historically been expensive and inconvenient.
But now that’s changing. Consider my brother, who has never been to a weight-loss clinic. His program has taken place entirely in his home, at his office, and when he’s out at restaurants or visiting friends and family—and it happens at his convenience, or even automatically, literally without his doing more than lifting a finger.
Early studies of a fast-expanding pool of electronic weight-loss aids suggest that, by allowing people like Dan to construct their own regimen on their phone and computer, these tools could be a key to reversing the obesity epidemic. Applied across the health-care spectrum—to improve senior care, fix sleep problems, and cure addiction, for example—these affordable, accessible tools could radically change the way we conceive of and administer health care, potentially saving the system billions of dollars in the process.
And the basic formula underlying Dan’s weight loss reaches well beyond health. Behavioral technology allows users to gradually and permanently alter all kinds of behavior, from reducing their energy use to controlling their spending. Now, with the help of our iPhones and a few Facebook friends, we can train ourselves to lead healthier, safer, eco-friendlier, more financially secure, and more productive lives.
Ironically, this high-tech behavioral revolution is rooted in the work of a mid-century psychologist once maligned as morally bankrupt, even fascist. But the rise of social media has reoriented our societal paranoias, and more and more people are incorporating his theories into their daily lives. As a result, psychology’s most misunderstood visionary may finally get his due…
On a cold afternoon this winter I sat before a glass wall at the Museum of Jewish Heritage in New York City, fielding questions about Jewish dystopian literature. Outside was New York Harbour, and the audience seemed distracted by the passing boats. My fellow panellist was Joshua Cohen, author of Witz, a novel about the last Jew on earth. My novel The Flame Alphabet concerns a poisonous language spoken by children and is set in a world of failed science where Jewish mysticism might offer the only clue to the language’s toxicity. Cohen and I were asked, with some impatience, why the future in our novels was so dour. Why write about the future at all when the present was, you know, so interesting? Doesn’t the real trump the unreal? And maybe most importantly: what was this attraction to dark visions of the last days, a burgeoning literary genre that might as well be called “end times porn”?
For years I’ve been asked to justify mercilessly sad endings, stories lacking in redemption and narrative visions that strip characters of their humanity through gruelling moral tests. Finally it is difficult to argue – no matter how true it feels – that pain and sorrow, in the literary sense, equal pleasure. Sometimes rhapsodic pleasure. But you know what they say about one man’s pleasure: you start to feel like a fetishist luring customers on the street to come inside and sample the delights of your whip.
The audience that day wasn’t satisfied by the observation that the phrase “happy novel” might be an oxymoron. What a lonely, underpopulated bookshelf would result if it could house only happy novels. Neither did it help to intone that “the positive has already been given”. When you quote Kafka, even to a middle-aged Jewish crowd, people cry foul. Kafka was welcome to excuse his own bleakness, but we were not permitted to borrow his alibi.
Yet, in American fiction at least, the end times has graduated into de rigueur subject matter. Increasingly novelists cut their teeth on it and it’s starting to look like a rite of passage. Long a preoccupation of science fiction and horror writing, the apocalypse, as it looms closer, has become more intriguing to writers of literary fiction, more necessary to address. The last days no longer seem like a harmless fantasy. If this is a new development, it is worth considering why the end of the world is poised to join the suburbs and bad marriages as a distinctly American literary fascination.
After 11 September 2001, American novelists, asleep in their lairs, took heat for not responding soon enough to the attacks. In this case, “soon enough” meant within 24 hours. The question was pressing. Why must novelists take so long to process history? Or, worse, why must the novelist neglect history altogether, obsessed instead with the politics and sorrows within homes and suburban neighbourhoods that, measured against the rest of the world, are among the safest?
While other professional communicators had adapted to the faster times, the technological immaturity of the novelist was glaring, to some. What on earth was a novelist for, if not to rush to the microphone when his or her poor nation was battered and confused, to explain to everyone what the whole thing meant? Shouldn’t all of that training in graduate creative writing programmes have prepared these people for something like this?
Novelists are comfortable with the idea that their essential purpose should never be questioned but almost no one else is. In this case, concerns about the sluggish novelist’s absence from the public discourse betrayed a woeful, if predictable, failure of imagination. For a little while, the conspicuous silence of the novelist was in the news and eulogies were sung, again, for an art form that, over the past decades, had been left for dead several times over. But this was not the time for novelists to be defending themselves. As the 9/11 commentary and op-eds piled up, it became clear that many contemporary American novelists, in their work, maintain an asexual relationship to current affairs, even with game-changing events. Gone are the days of Mailer and Vidal pirouetting on television.
Among writers, many of whom were halfway out to sea on projects for novels that suddenly looked fit for the fish tank, a unique kind of despair set in. It wasn’t just that novelists supposedly had nothing to say about the events of 9/11. This didn’t, in the end, seem to trouble the writers I heard from. They weren’t political analysts and, in any case, opinion wasn’t the preferred rhetorical mode for writers besotted by narrative. The public field was already crowded with experts and it wasn’t hard to find articulate arguments from every perspective. Of far more concern to writers, if not shared very publicly, was that their narrative spectacles – when measured in terms of drama – could not possibly compete with the events of that day (never mind the atrocities that have occurred on other soil). Did all fiction suddenly amount to a kind of escapism, cowardly flight from what mattered most?
Many book projects were abandoned, and not just out of grief for the toll exacted that day. In literary fiction, there was a new kid on the block and he was called “bad shit happening in your own backyard”. If this kind of thing could happen and wasn’t just fodder for science-fiction fantasies, by rights it must be a candidate for literary realism, which meant that the genre had some stomach-stretching to do. American realism would need to grow beyond the suburban bedroom, absorbing what, the previous day, had been pure fantasy. If truth used to be merely stranger than fiction, now it had also gorged itself on part of what made fiction special: its use of nightmarish material to haunt us. Reality seen supersize was more disturbing and commanded the nation’s attention in ways that made the imagination, and its artificial fruits, seem newly frivolous.
What is odd about this literary despair – the sense that one’s work is boring compared to a generation-defining surprise attack on the homeland – is that American novelists had not been especially celebrated for their ability to imagine plausible calamity . . . or, really, any kind of calamity (witness the serial indifference in critical circles to science-fiction writers, whose doomsday scenarios, worked out sometimes with fanatical logic, could populate a large encyclopaedia).
French Philosopher Alain Badiou On The Real Expression Of Love: “If you limit yourself to sexual pleasure it’s narcissistic”
May 30, 2012
Love, says France’s greatest living philosopher, “is not a contract between two narcissists. It’s more than that. It’s a construction that compels the participants to go beyond narcissism. In order that love lasts one has to reinvent oneself.”
Alain Badiou, venerable Maoist, 75-year-old soixante-huitard, vituperative excoriator of Sarkozy and Hollande and such a controversial figure in France that when he was profiled in Marianne magazine they used the headline “Badiou: is the star of philosophy a bastard?”, smiles at me sweetly across the living room of his Paris flat. “Everybody says love is about finding the person who is right for me and then everything will be fine. But it’s not like that. It involves work. An old man tells you this!”
In his new book, Badiou writes about his love life. “I have only once in my life given up on a love. It was my first love, and then gradually I became so aware that this step had been a mistake that I tried to recover that initial love, late, very late – the death of the loved one was approaching – but with a unique intensity and feeling of necessity.” That abandonment and attempt at recovery marked all the philosopher’s subsequent love affairs. “There have been dramas and heart-wrenching and doubts, but I have never again abandoned a love. And I feel really assured by the fact that the women I have loved I have loved for always.”
But isn’t such laborious commitment a pointless fuss in this age of ready pleasures and easily disposable lovers? “No! I insist on this – that solving the existential problems of love is life’s great joy,” he says and then looks across the coffee table at his translator, Isabelle Vodoz, with a big, half-ironic grin. “There is a kind of serenity in love which is almost a paradise,” he adds, popping a biscuit in his mouth and giggling. She giggles, too. “I am not only his translator,” she tells me later. Below this sixth-floor apartment, an RER train screeches along the rails out of Denfert-Rochereau station.
I think about the distinction Badiou describes in In Praise of Love. “While desire focuses on the other, always in a somewhat fetishist[ic] manner, on particular objects, like breasts, buttocks and cock,” writes Badiou, “love focuses on the very being of the other, on the other as it has erupted, fully armed with its being, into my life that is consequently disrupted and re-fashioned.”
In other words love is, in many respects, the opposite of sex. Love, for Badiou, is what follows a deranging chance eruption in one’s life. He puts it philosophically: “The absolute contingency of the encounter takes on the appearance of destiny. The declaration of love marks the transition from chance to destiny and that’s why it is so perilous and so burdened with a kind of horrifying stage fright.” Love’s work consists in conquering that fright. Badiou cites Mallarmé, who saw poetry as “chance defeated word by word”. A loving relationship is similar. “In love, fidelity signifies this extended victory: the randomness of an encounter defeated day after day through the invention of what will endure,” writes Badiou.
But this encomium to creative fidelity surely shows Badiou to be a man out of his time. “In Paris now half of couples don’t stay together more than five years,” he says. “I think it’s sad because I don’t think many of these people know the joy of love. They know sexual pleasure – but we all know what Lacan said about sexual pleasure.”
Indeed. Jacques Lacan argued that sexual relationships don’t exist. (Badiou will shortly publish a book of conversations about Lacan with the psychoanalyst’s biographer, Elisabeth Roudinesco.) What is real is narcissistic, Lacan suggested; what binds is imaginary. “To an extent, I agree with him. If you limit yourself to sexual pleasure it’s narcissistic. You don’t connect with the other, you take what pleasure you want from them.”
But wasn’t the rampant hedonism unleashed during Paris’s May 1968 événements, in which Badiou participated, all about libidinal liberation from social constraint? How can he, of all people, hymn bourgeois notions such as commitment and conjugal felicity? “Well, I absolutely agree that sex needs to be freed from morality. I’m not going to speak against the freedom to experiment sexually like some old arse” – “un vieux connard” – “but when you liberate sexuality, you don’t solve the problems of love. That’s why I propose a new philosophy of love, wherein you can’t avoid problems or working to solve them.”
But, he argues, avoiding love’s problems is just what we do in our risk-averse, commitment-phobic society. Badiou was struck by publicity slogans for French online dating site Meetic such as “Get perfect love without suffering” or “Be in love without falling in love”. “For me these posters destroy the poetry of existence. They try to suppress the adventure of love. Their idea is you calculate who has the same tastes, the same fantasies, the same holidays, wants the same number of children. Meetic try to go back to organised marriages – not by parents but by the lovers themselves.” Aren’t they meeting a demand? “Sure. Everybody wants a contract that guarantees them against risk. Love isn’t like that. You can’t buy a lover. Sex, yes, but not a lover.”
For Badiou, love is becoming a consumer product like everything else. The French anti-globalisation campaigner José Bové once wrote a book entitled Le Monde n’est pas une Marchandise (The World Isn’t a Commodity). Badiou’s book is, in a sense, its sequel and could have been entitled L’Amour n’est pas une Marchandise non plus (Love Isn’t a Commodity Either).
Surely that makes him an old romantic? “I think that romanticism is a reaction against classicism. Romanticism exalted love against classical arranged marriages – hence l’amour fou, antisocial love. In that sense I’m neither romantic nor classic. My approach is that love is both an encounter and a construction. You have to resolve the problems in love – live together or not, to have a child or not, what one does in the evening.”…
I began to think of suicide at sixteen. An anxious and driven child, I entered in my mid-teens a clinical depression that would last for 40 years. I participated in psychotropic drug therapy for almost 30 of those, and now, owing in part, but only in part, to the drug Cymbalta, I have respite from the grievous suffering that is mental illness.
As a health policy scholar, I understand the machinations of the pharmaceutical industry. My students learn about “me-too” drugs, which barely improve on existing medications, and about “pay-for-delay,” whereby pharmaceutical companies cut deals with manufacturers of generic drugs to keep less expensive products off the market. I study policymakers’ widespread use of effectiveness research and their belief that effectiveness will contain costs while improving quality. I appreciate that randomized controlled trials are the gold standard for determining what works. Specifically, I know that antidepressant medication is vigorously promoted, that the diagnostic criteria for depression are muddled and limited, and that recent research attributes medicated patients’ positive outcomes to the placebo effect. In my own research and advocacy work, I take a political, rather than a medical, approach to recovery from mental illness.
Cymbalta in particular epitomizes pharmaceutical imperialism. Approved by the FDA in August 2004 for the treatment of major depressive disorder, it has since gotten the go-ahead for treating generalized anxiety disorder, fibromyalgia, and chronic musculoskeletal pain, including osteoarthritis and lower back pain. It remains under patent to Eli Lilly.
I would not have been surprised if Cymbalta had not worked for me or had not bested the myriad drugs and drug combinations that came before. My path through clinical depression is strewn with discarded remedies. “Who are these people?” I wondered about patients who were said to achieve happiness with the first pill and therefore to violate societal notions of identity and independence. I was just trying to get out of bed, and although my first antidepressant, at age 26, had a strong positive result, it also had incommodious side effects, and relief was tentative and partial. Decades of new and evolving treatment regimens followed. I have been treated with every class of antidepressant medication, often in combination with other psychotropic drugs. Some drugs worked better than others, some did not work at all, and some had unendurable side effects. But Cymbalta did not disappoint, and now I have become a teller of two tales, one about health policy, the other about health.
Like many depressed people, I resisted the idea of psychotropic medication. I was deeply hurt when my psychotherapist suggested I see a psychiatrist about antidepressant drugs. How could she think I was that crazy or that weak? But she said she was concerned for my survival, and I eventually did as she asked. I became an outpatient at a venerable psychiatric hospital, where I found a kind stranger who knew my deepest secrets and wanted to end my suffering. He wrote a prescription, and thus began my 30-year trek.
Depression is sometimes confused with sadness. Many depressed people are very sad, as I was, but the essence of my depression was feeling dead among the living. Everything was just so hard. William Styron describes depression as “a storm of murk.” Andrew Solomon’s atlas of depression is titled The Noonday Demon. I too found depression to be fierce, wrapping me in a heavy woolen blanket and mocking my attempts to cast it off. The self-loathing was palpable; it felt like I was chewing glass. I sensed that other people were seeing things I did not, and apparently they were, because when I began my first course of antidepressants, it was as if someone had turned on the lights. It did not make me happy or even content. The world simply looked different—brighter, deeper—and I was a part of it. I saw something other than the impassable flatness and enervating dullness, and I was amazed.
My progress came at a cost. In the late 1970s, before Prozac, antidepressant medication was seldom spoken of. The people I told about my treatment echoed my first reaction and sang throaty choruses of why-don’t-you-just-cheer-up and won’t-this-make-you-a-drug-addict. I was also drowsy after I ate, my mouth was always dry, and when a second medication was added, I began to lose control of my limbs and fall down. I insisted to my psychiatrist that it was the second drug that was causing me to fall. A champion of that one, he instructed me to discontinue the first. I responded in the way only privileged patients can: I went around him, using personal connections to wrest an informal second opinion from a resident in the lab run by my psychiatrist’s mentor. My doctor was convinced, and a little embarrassed, and we both learned something about therapeutic alliances.
In another city, another psychiatrist took up my case. We tweaked antidepressants and mood stabilizers, reintroducing the drug that had made me fall, this time in tiny doses. I fell less often, but ultimately gave it up, and when I left this psychiatrist to move again, we acknowledged that I was, in the words of mental health expert Gerald Klerman, better but not well…
Moral Principle vs. Military Necessity: The first code of conduct during warfare reflects ambiguities we struggle with to this day
May 30, 2012
During the hot and desperate summer of 1862, a senior American commander found himself consumed with the question of insurgents. Major General Henry Halleck had become general-in-chief of the Union armies in July of that year, and he soon discovered that the army had no laws or regulations to govern its contacts with the bands of irregular Southern forces in the field. A lawyer by training, Halleck found the absence of guidance maddening. Union troops were encountering an array of rebel forces, some uniformed, some not. “The rebel authorities claim the right to send men, in the garb of peaceful citizens, to waylay and attack our troops, to burn bridges and houses and to destroy property and persons within our lines,” Halleck vented in a letter sent on August 6.
Halleck’s correspondent was eager to help. Francis Lieber (1798–1872) was then a professor of history at Columbia College. A Prussian immigrant, he was a military veteran who had recently devoted himself to studying the conduct of war. What’s more, he was a passionate supporter of the Union cause and was keenly ambitious to influence national policy. Less than a year after that first exchange, a short paper Lieber wrote for the general on how international law regards insurgents and guerrillas had blossomed into America’s first code regulating the conduct of its army in warfare.
“Lieber’s Code,” as it soon became known, was widely disseminated, and it deeply influenced the later Hague and Geneva conventions. It is no exaggeration to say that this émigré professor with longstanding connections to the Southern aristocracy made one of the most substantial contributions to the modern law of war. Lieber was acutely aware of the novelty of his project. “It is an honor of the United States that they have attempted, first of all nations, to settle and publish such a code,” he wrote to Halleck.
The code achieved its stature with remarkable speed. Lieber completed the text in March 1863, and it was cursorily reviewed by a panel of generals and quickly approved by President Lincoln. Dispatched to military commanders in May 1863 as General Orders No. 100, it circulated through the army ranks and within a few years had been lauded by a United States Supreme Court Justice as an authoritative expression of the law of war.
But the deeper one delves into the details of this seemingly inspiring tale, the muddier it becomes. Lieber’s life and thought embodied some of the most serious contradictions in the struggle to humanize warfare. Those contradictions became painful as the Civil War grew more intense, and whether the gifted scholar restrained the conduct of the fighting in any way is uncertain at best. He certainly did not resolve the tensions he confronted; a century and a half later, his adopted country is still struggling to reconcile the competing demands of security and humanity, principle and pragmatism.
Francis Lieber may have been assigned a lawyer’s task, but he still wrote like the professor he was for most of his adult life. At South Carolina College, where he first taught, and at Columbia, Lieber’s lectures were famously dense essays that he read verbatim to his students, who were then asked to regurgitate this received wisdom in writing. Frank Freidel recounts in his superb 1947 biography of Lieber that the professor’s heavy-handed pedagogy often wore on his students and enervated his colleagues, one of whom described his teaching as “singularly ill-suited to the needs of undergraduates.”
That style infuses his pamphlet on guerrilla warfare and the code itself, which was, as one scholar wrote, less a code than a “persuasively written essay on the ethics of conducting war.” The two documents included lengthy asides on recent European military campaigns, lofty thoughts on the progress of civilization, and several obsequious references to General Halleck’s own writings on international law.
At the heart of Lieber’s view of how war should be fought was the distinction between combatants and civilians and the conviction that civilian life and property should be spared whenever possible. “The principle has been more and more acknowledged,” he wrote, “that the unarmed citizen is to be spared in person, property, and honor as much as the exigencies of war will admit.” Commanders had an obligation to give warning whenever feasible before bombarding a location where civilians were likely to be. Libraries, hospitals, and art collections were to be spared. Cruelty and revenge had no place in Lieber’s concept of war, and he insisted that soldiers pay heed to the effects their actions would have after the guns fell silent. “Military necessity,” he insisted, “does not include any act of hostility which makes the return to peace unnecessarily difficult.”
Lieber consistently opposed the abuse of prisoners, and he quickly dispensed with the notion that captured Southern soldiers should be treated as criminals, traitors, or bandits. Instead, they were to be housed humanely and fed “plain and wholesome food.” Torture and public humiliation were forbidden, and chivalry was very much alive: To reward exemplary bravery and honor, captors could even return sidearms to enemy officers.
Few of Lieber’s insights were new. Classic “just war” doctrine, developed by Christian theologians including St. Augustine and Thomas Aquinas, stressed the importance of proportionality in conflicts and the need to avoid action that would make a return to peace impossible. The doctrine of immunity for noncombatants had roots in the Middle Ages’ chivalric codes. In the 18th and early 19th centuries, scholars and philosophers including Emmerich de Vattel, Jean Jacques Rousseau, and Immanuel Kant plowed the field as well, anticipating many of Lieber’s provisions on prisoners and civilians. Rousseau insisted that wars take place between states, not between peoples, and that “private persons are only enemies accidentally.” Vattel contended that foreign civilians are, in theory, enemies, but he leavened that harsh conclusion with an injunction that they should not be harmed if they pose no danger.
Lieber wasn’t alone in realizing the value of a code to regularize the behavior of combatants. In the wake of the bloody Crimean War in the mid-1850s, a movement grew in Europe to address war’s savagery. In 1863, Jean Henri Dunant helped found what would become the International Committee of the Red Cross, the guardian of the law of war to this day. Lieber’s contribution lay in summarizing and synthesizing existing works, leavening them with examples from modern practice, and placing them in the form of a succinct code of conduct directed toward military commanders…
May 29, 2012
The current Supreme Court term has been dominated by the constitutional challenge to the Affordable Care Act, the health-care legislation better known as Obamacare. But the Court has recently heard another case, this one concerning the controversial Arizona immigration law passed in 2010. Though five other states have passed similar laws, Arizona’s is the toughest attempt to date to get control of illegal immigration and its social and economic costs. The problems surrounding illegal immigration that this bill attempts to solve involve not just practical policies, but the very meaning of American identity and history.
For Americans, these issues have particular resonance; as we continually hear, we are a “nation of immigrants.” Many see the laws targeting immigrants as a repudiation of this heritage, an ethnocentric or even racist attempt to impose and monitor an exclusive notion of American identity and culture. Additionally, opponents claim that these laws invite the police to practice discriminatory “racial profiling,” creating the possibility that legal immigrants and U.S. citizens will be unjustly detained and questioned.
As President Obama said in April 2010, laws like Arizona’s “threaten to undermine basic notions of fairness that we cherish as Americans.” The greater significance of this case, however, is the way it touches on deeply held and frequently conflicting beliefs about the role of immigration in American history and national identity. These beliefs have generated two popular metaphors: the melting pot and the salad bowl.
The melting pot metaphor arose in the eighteenth century, sometimes appearing as the “smelting pot” or “crucible,” and it described the fusion of various religious sects, nationalities, and ethnic groups into one distinct people: E pluribus unum. In 1782, French immigrant J. Hector St. John de Crevecoeur wrote that in America, “individuals of all nations are melted into a new race of men, whose labors and posterity will one day cause great changes in the world.”
A century later, Ralph Waldo Emerson used the “melting pot” image to describe “the fusing process” that “transforms the English, the German, the Irish emigrant into an American . . . The individuality of the immigrant, almost even his traits of race and religion, fuse down in the democratic alembic like chips of brass thrown into the melting pot.” The phrase gained wider currency in 1908, during the great wave of Slavic, Jewish, and Italian immigration, when Israel Zangwill’s play The Melting Pot was produced. In it, a character enthuses, “America is God’s Crucible, the great Melting-Pot where all the races of Europe are melting and re-forming!”
The idea of the melting pot, then, communicated the historically exceptional notion of American identity as one formed not by the accidents of blood, sect, or race, but by the unifying beliefs and political ideals enshrined in the Declaration of Independence and the Constitution: the notion of individual, inalienable human rights that transcend group identity. Of course, this ideal was violated in American history over the centuries by racism, ethnocentrism, xenophobia, and other ignorant prejudices. But over time, changes in law and social mores have taken place, making the United States today the most inclusive and tolerant nation in the world, the destination of choice for those millions desiring more freedom and opportunity.
In the melting pot metaphor, inalienable human rights transcend group identity.
Of course, this process of assimilation also entailed costs and painful sacrifices. Having voted with his feet for the superiority of America, the immigrant was required to become American, to learn the language, history, political principles, and civic customs that identified an American as American. This demand was necessarily in conflict with the immigrants’ old culture and its values, and, at times, it led to the painful loss of the old ways and customs. But how immigrants negotiated the conflicts and trade-offs between their new and old identities was up to them, and they were free in civil society to celebrate and retain those cultures through fraternal organizations, ethnic festivals, language schools, and religious guilds.
Still, they had to make their first loyalty to America and its ideals. If some custom, value, or belief of the old country conflicted with those core American values, then that old way had to be modified or discarded if the immigrant wanted to participate fully in American social, economic, and political life. The immigrant was the one who had to adjust; no one expected the majority culture to modify its values to accommodate the immigrant. After all, there were too many immigrants to do this without fragmenting American culture. No matter the costs, assimilation was the only way to forge an unum from so many pluribus.
Starting in the Sixties, however, another vision of American pluralism arose, captured in the metaphor of the salad bowl. Rather than assimilating, now different ethnic groups would coexist in their separate identities like the ingredients in a salad, bound together only by the “dressing” of law and the market. This view expresses the ideology of multiculturalism, which goes far beyond the demand that ethnic differences be acknowledged rather than disparaged.
Long before multiculturalism ever existed, Americans wrestled with the conflicts and clashes immigrants experienced in their lives. A book from the Forties on “intercultural education” announced its intent “to help our schools to deal constructively with the problem of intercultural and interracial tensions among our people” and to alleviate “the hurtful discrimination against some of the minority groups which compose our people.” One recommendation was to create school curricula that would “help build respect for groups not otherwise sufficiently esteemed.” Modern multiculturalism takes that idea but goes much farther by endorsing a species of identity politics predicated on victimization…
May 29, 2012
In an accelerated culture, 15 years is a long time. And last spring, when a stiff, cream-colored envelope arrived in the mail to announce preparations for my 10th college reunion, I realized that it had been nearly that long since my experience with antidepressants began.
When the envelope came, I was at work on a book about my generation’s relationship to psychiatric drugs. The book opened with a memory from the fall of 1997, when I was a dumped, homesick, anxious, and tearful freshman. I sought guidance in my school’s health and counseling center, where I was quickly treated to a remedy that seemed exotic—a diagnosis of depression and a prescription for a pill known as an SSRI, or selective serotonin reuptake inhibitor. Over the following months, I realized with a mounting sense of shock how many of my classmates were using medication, too.
For those of us who were teenagers in the 1990s, this feeling of surprise was fundamental to our experience of psychiatric drugs. In our midteen years, antidepressants and medication for attention-deficit hyperactivity disorder hadn’t been everywhere, and then suddenly they were. We attended college amid the first reports of a psychopharmaceutical explosion.
But people born in the late 1980s and early 1990s were raised in a very different world. They never knew a time before Prozac, and can scarcely remember when advertisements for prescription medication didn’t peer out from bus shelters or blare from TV. Prompted by the arrival of my reunion invitation, I began to wonder whether psychiatric medication meant something different to this new generation of students than it had to mine.
My interest was piqued by two sensationalistic but widely reported stories a few years ago. The first was of a precipitous deterioration in college students’ mental health. One survey of incoming college freshmen found that the self-reported mental well-being of this group had fallen to its lowest level since the survey began 25 years earlier. Another major survey announced that 30 percent of college students had felt “so depressed that it was difficult to function” at some point in the preceding year. College mental-health staffs across the country reported facing an unprecedented volume of requests for service and a nearly ceaseless stream of psychiatric emergencies.
The second story was of a stark rise in the amount of academic stress faced by college students. Reports noted that undergraduate admissions had become more selective in the past decade. Today’s students apply to more schools, endure more rejection, and live their precollege lives keenly attuned to the need to compete. Deans had noticed a more serious bent among college students lately, describing a group that was apt to approach college as though it were a professional job, rather than a time for exploration. One college president lamented that the “moments of woolgathering, dreaming, improvisation” that were integral to a liberal-arts education a generation ago had become a hard sell for today’s crop of highly driven students. Sometimes the stories about stress on campus implied that this new breed of students was the type of kids—from affluent, self-aware, achievement-oriented families—who had been raised to view antidepressants and ADHD medications as a means of keeping up.
Were these stories true, I wondered? What role did medication play on campus now, and what did students’ attitudes toward it augur for the future? With those questions in mind, I decided to return to a college whose size and orientation reminded me a little bit of my own, to look for the change that 15 years had brought.
Madrianne Wong was one of the directors of a campus group at Swarthmore College that offers free peer counseling to students. On a gray day in March, I sat down to talk with her in the college library. In the winter of 2011, Wong, then a senior, and her fellow director, Jessica Schleider, then a junior, published an article in Swarthmore’s online newspaper that described mental-health issues as a large and growing problem on the campus. The authors held stress and academic pressure partly responsible. But they also blamed a pervasive ethic of self-presentation which demands that students appear not to have any problems at all. In their article, Wong and Schleider called it a “culture of silence.”
“Being at Swarthmore,” Wong told me later, “there’s just this expectation of mental strength and resilience. If you’re here, you must perform. Otherwise, there’s this running joke about who the admissions mistake is.” It’s an expectation that makes students loath to admit to any vulnerabilities, insecurities, or bona fide mental problems, even with close friends.
Wong and Schleider weren’t the first to point out a campus taboo against seeming anything short of perfect. They borrowed the phrase “culture of silence” from a 2010 article by a Yale senior named Julia Lurie, who described her college as a place in which emotional problems were both ubiquitous and unmentionable. She wrote of working hard to make herself resemble the Yale ideal, someone academically top-notch but also popular, socially engaged, worldly, ambitious, involved in unique extracurriculars—and most important of all, appearing to fill these roles without effort. Outwardly, she had succeeded. But how surprised her classmates would be, she wrote, if they could see her private self, the girl who “takes her Zoloft and a sleeping pill” each night, then “writhes in hot, silent tears, white-knuckled, feeling like she could scream.”…
A Tropical Brew: Deep in the Brazilian rainforest there is a town built around a church where worshippers drink hallucinogenic tea
May 29, 2012
Rio Branco is the capital of Acre, Brazil’s most westerly state and its most Wild West one too. A congressman was jailed there in the late 1990s, accused of slicing off an enemy’s arms and legs with a chainsaw. I visited the city shortly after. I had recently arrived in Brazil, a freelance writer from the other side of the globe, and when an acquaintance invited me to a church service at which psychoactive drugs would be consumed, I jumped at the chance.
Ayahuasca—or Daime as it is known locally—is a muddy-looking concoction made from boiling the Banisteriopsis caapi vine and the Psychotria viridis leaf. Across the Amazon, indigenous people drink it as a part of their rituals. In Brazil a century ago, however, the hallucinogen led to the birth of a new Christian movement, the religion known as Santo Daime.
Daime services require worshippers to take the sacrament. At the church entrance I was served a cup of the brew. I swigged it down straight away, grimacing at its rank bitterness. After 20 minutes, feeling that it wasn’t working, I drank another cup.
Seconds later, I was overwhelmed with tiredness. My eyes shut and a sea of swirling, luminescent colours filled my head. I collapsed in the fetal position by the exit, cuddling a stool like a pillow. Urged by my acquaintance to return to the church—since outside the Devil’s spirits would get me and inside Jesus would protect me—I started to panic. Before taking the drug I had considered my Jewishness irrelevant. Worrying what Jesus might do sent me into a total freak-out.
So I stayed outside, reasoning that it was better the Devil I knew. I felt sick and vomited. My jaw started moving uncontrollably. I tried to focus on a woman, since I remembered that sexual impulses can lessen the effect. Unable to summon desire I asked myself why—and it occurred to me that I could not remember if I was straight or gay. The more I thought about myself the less I could be sure of. Was I a man or a woman? British or Brazilian? Was I thinking in Portuguese or English? I did not know.
I frantically pieced myself together, and an hour or so later found a taxi and returned to my hotel room. I switched on all the lights and sat up for another hour until I felt the urge to pee and, on relieving myself, the trip ended as abruptly as it began.
The experience did not feel at all spiritual. It was the most tormented five hours of my life. When I returned to Rio de Janeiro I was left with a respect for ayahuasca and a faint embarrassment for not having deduced a priori that it is inadvisable for Jews to take hallucinogenic drugs in bizarre jungle churches.
Several years later, I tried again.
It takes five hours to fly from Rio de Janeiro to Rio Branco. My destination is an isolated Santo Daime community founded 20 years ago by the priest Padrinho Sebastião, on the banks of the Purus, one of the Amazon’s grand southern tributaries. He named his New Jerusalem Céu do Mapiá (Heaven of Mapiá) and it now has a population of almost a thousand people.
To get there you must pass through Boca do Acre, a 200km taxi ride from Rio Branco along a precarious stretch of reddish, muddy track. On arrival I feel the intense, suffocating heat of the urban Amazon. The brick homes and asphalt streets have turned Boca do Acre into an open stove. The town may have had a purpose during the rubber era, but now it seems only a reminder of the futility of colonising the rainforest. There are no trees (nor, I suspect, jobs) and the only birds you see are vultures. Even the river looks sullen here: a menacing grey-green, mottled with floating logs and dirty with the town’s waste.
The road ends at Boca do Acre. We leave for Mapiá the next morning in a motorised dinghy loaded with eight people and heavy with provisions. There is no shade and the sun scalds my skin. After the Purus’s first meander we are in the wilderness. The river is a corridor about 200 metres wide. The foliage at its margins is a monotonous equatorial green. Only the huge sky changes, a lucid blue turning dirty grey as the sun makes way for a heavy downpour. After an hour and a half we slow down and turn up the Mapiá. It is as if we have come off the motorway and entered a rather precarious B-road. At times the creek is only a few metres wide. The rainforest is so close it assaults our senses. Everything now is a shade of brown—the muddy colour of the water and the trunks, roots and branches that force us to duck as they brush against us. We hear the screech of monkeys; swallows and kingfishers swoop by. The proximity to nature is exhilarating—and exasperating. At one point a tree has fallen across the river and we have to step off the boat and haul it over. During the operation an ant crawls in my ear, and as it fidgets and bites its way down my hearing tube it feels like someone is twisting a nail through my skull. A fellow passenger grabs my head and pours water into my ear hole. It is with some delight that, after a long six hours, our boat finally put-puts into port.
You only need ask the time to realise that Céu do Mapiá is a special place, the smallest community in the world with its own time zone, half an hour ahead of Boca do Acre and half an hour behind Pauini, the next town down the Purus. Such whimsy is just the start. As the vegetation disperses, we pass under a high wooden bridge. Then some neatly painted houses come into view. The homes have gardens and I catch sight of undulating grassland between them. I am deeper in the jungle than I have ever been, yet Céu do Mapiá has the feel of a European country village.
I jump off the dinghy. There is a smell of cut grass. Since I find myself by the main square, I take a stroll. I smile at what I see: A flagpole! A line of grocery stores! A public phone! Quite apart from the peculiarity of seeing such examples of “civilisation” in such a remote outpost, there are more basic reasons why the place seems out of character with its location. The defining characteristic of the rural Amazon is extreme poverty—and there is none. The paralysing heat has gone. The climate is deliciously fresh.
The wailing sound of girls singing drifts from many homes. A few people cross my path. I notice they all have a certain zombie stare: eyes wide open, deep set, with an unfocused gaze…