Romney Gets…Pranked

May 15, 2012

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

The New Republic:

There are two cathedrals in Coventry. The newer one, consecrated on May 25, 1962, stands beside the remains of the older one, which dates from the fourteenth century, a ruin testifying to the bombardment of the Blitz. Three years before the consecration, in one of the earliest ventures in the twinning of towns, Coventry had paired itself with Dresden. That gesture of reconciliation was recapitulated in 1962, when Benjamin Britten’s War Requiem received its first performance at the ceremony. The three soloists were an English tenor (Peter Pears), a German baritone (Dietrich Fischer-Dieskau), and a Russian soprano (Galina Vishnevskaya).

Since the 1960s, historians have worked—and debated—to bring into focus the events of the night of February 13, 1945, in which an Allied bombing attack devastated the strategically irrelevant city of Dresden. An increased understanding of the decisions that led to the fire-bombing, and of the composition of the Dresden population that suffered the consequences, has altered subsequent judgments about the conduct of war. The critical light of history has been reflected in the contributions of novelists and critics, and of theorists of human rights. Social and political changes, in other words, followed the results of humanistic inquiry, and were intertwined with the reconciliatory efforts of the citizens of Coventry and Dresden. Even music and poetry played roles in this process: what history has taught us is reinforced by the lines from Wilfred Owen that Britten chose as the epigraph for his score—“My subject is war, and the pity of war. The poetry is in the pity. All a poet can do today is warn.”

It is so easy to underrate the impact of the humanities and of the arts. Too many people, some of whom should know better, do it all the time. But understanding why the natural sciences are regarded as the gold standard for human knowledge is not hard. When molecular biologists are able to insert fragments of DNA into bacteria and turn the organisms into factories for churning out medically valuable substances, and when fundamental physics can predict the results of experiments with a precision comparable to measuring the distance across North America to within the thickness of a human hair, their achievements compel respect, and even awe. To derive one’s notion of human knowledge from the most striking accomplishments of the natural sciences easily generates a conviction that other forms of inquiry simply do not measure up. Their accomplishments can come to seem inferior, even worthless, at least until the day when these domains are absorbed within the scope of “real science.”

The conflict between the Naturwissenschaften and the Geisteswissenschaften goes back at least two centuries, and became intensified as ambitious, sometimes impatient researchers proposed to introduce natural scientific concepts and methods into the study of human psychology and human social behavior. Their efforts, and the attitudes of unconcealed disdain that often inspired them, prompted a reaction, from Vico to Dilthey and into our own time: the insistence that some questions are beyond the scope of natural scientific inquiry, too large, too complex, too imprecise, and too important to be addressed by blundering over-simplifications. From the nineteenth-century ventures in mechanistic psychology to contemporary attempts to introduce evolutionary concepts into the social sciences, “scientism” has been criticized for its “mutilation” (Verstümmelung, in Dilthey’s memorable term) of the phenomena to be explained.

The problem with scientism—which is of course not the same thing as science—is owed to a number of sources, and they deserve critical scrutiny. The enthusiasm for natural scientific imperialism rests on five observations. First, there is the sense that the humanities and social sciences are doomed to deliver a seemingly directionless sequence of theories and explanations, with no promise of additive progress. Second, there is the contrasting record of extraordinary success in some areas of natural science. Third, there is the explicit articulation of technique and method in the natural sciences, which fosters the conviction that natural scientists are able to acquire and combine evidence in particularly rigorous ways. Fourth, there is the perception that humanists and social scientists are only able to reason cogently when they confine themselves to conclusions of limited generality: insofar as they aim at significant—general—conclusions, their methods and their evidence are unrigorous. Finally, there is the commonplace perception that the humanities and social sciences have been dominated, for long periods of their histories, by spectacularly false theories, grand doctrines that enjoy enormous popularity until fashion changes, as their glaring shortcomings are disclosed.

These familiar observations have the unfortunate effect of transforming differences of degree into differences of kind, as enthusiasts for the alleged superiority of natural science readily succumb to stereotypes and over-generalizations, without regard for more subtle explanations. Let us consider the five foundations of this mistake in order.

THE MOST OBVIOUS EXPLANATION for the difficulties of the Geisteswissenschaften, the humanities and the study of history and society, is that they deal with highly complex systems. Concrete results are often achieved in particular instances: historians and anthropologists are able to be precise and accurate by sacrificing generality, by clear-headedly disavowing the attempt to provide any grand overarching theory. No large vision of history emerges from our clearer understanding of the bombing of Dresden, but the details are no less powerful and significant. In this respect, moreover, matters are no different in the natural sciences. As we shall see, science often forgoes generality to achieve a precise and accurate answer to an important question.

In English we speak about science in the singular, but both French and German wisely retain the plural. The enterprises that we lump together are remarkably various in their methods, and also in the extent of their successes. The achievements of molecular engineering or of measurements derived from quantum theory do not hold across all of biology, or chemistry, or even physics. Geophysicists struggle to arrive at precise predictions of the risks of earthquakes in particular localities and regions. The difficulties of intervention and prediction are even more vivid in the case of contemporary climate science: although it should be uncontroversial that the Earth’s mean temperature is increasing, and that the warming trend is caused by human activities, and that a lower bound for the rise in temperature by 2200 (even if immediate action is taken) is two degrees Celsius, and that the frequency of extreme weather events will continue to rise, climatology can still issue no accurate predictions about the full range of effects on the various regions of the world. Numerous factors influence the interaction of the modifications of climate with patterns of wind and weather, and this complicates enormously the prediction of which regions will suffer drought, which agricultural sites will be disrupted, what new patterns of disease transmission will emerge, and a lot of other potential consequences about which we might want advance knowledge. (The most successful sciences are those lucky enough to study systems that are relatively simple and orderly. James Clerk Maxwell rightly commented that Galileo would not have redirected the physics of motion if he had begun with turbulence rather than with free fall in a vacuum.)

The emphasis on generality inspires scientific imperialism, conjuring a vision of a completely unified future science, encapsulated in a “theory of everything.” Organisms are aggregates of cells, cells are dynamic molecular systems, the molecules are composed of atoms, which in their turn decompose into fermions and bosons (or maybe into quarks or even strings). From these facts it is tempting to infer that all phenomena—including human actions and interaction—can “in principle” be understood ultimately in the language of physics, although for the moment we might settle for biology or neuroscience. This is a great temptation. We should resist it. Even if a process is constituted by the movements of a large number of constituent parts, this does not mean that it can be adequately explained by tracing those motions.

A tale from the history of human biology brings out the point. John Arbuthnot, an eighteenth-century British physician, noted a fact that greatly surprised him. Studying the registry of births in London between 1629 and 1710, he found that all of the years he reviewed showed a preponderance of male births: in his terms, each year was a “male year.” If you were a mad devotee of mechanistic analysis, you might think of explaining this—“in principle”—by tracing the motions of individual cells, first sperm and eggs, then parts of growing embryos, and showing how the maleness of each year was produced. But there is a better explanation, one that shows the record to be no accident. Evolutionary theory predicts that for many, but not all, species, the equilibrium sex-ratio will be 1:1 at sexual maturity. If it deviates, natural selection will favor the underrepresented sex: if boys are less common, invest in sons and you are likely to have more grandchildren. This means that if one sex is more likely to die before reaching reproductive age, more of that sex will have to be produced to start with. Since human males are the weaker sex—that is, they are more likely to die between birth and puberty—reproduction is biased in their favor…

Read it all.
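A side note on the sex-ratio reasoning in that last passage: the logic can be written out in a couple of lines. What follows is only a back-of-the-envelope sketch of the argument as the article frames it (a 1:1 ratio at sexual maturity plus higher male mortality); the symbols and the survival figures are invented here for illustration and do not come from the article.

% A rough sketch of the argument as stated above, not from the article itself.
% p is the fraction of births that are male; s_m and s_f are hypothetical
% probabilities that a newborn male or female survives to reproductive age.
% If selection pushes the sex ratio at maturity to 1:1, then
\[
  p\,s_m = (1 - p)\,s_f
  \qquad\Longrightarrow\qquad
  \frac{p}{1 - p} = \frac{s_f}{s_m}.
\]
% With illustrative values s_m = 0.90 and s_f = 0.95 (males the weaker sex),
\[
  \frac{p}{1 - p} = \frac{0.95}{0.90} \approx 1.06,
\]
% that is, roughly 106 boys born for every 100 girls, the kind of persistent
% male bias Arbuthnot found in the London registers.

The observed human birth ratio is in fact close to 105 boys per 100 girls, which is why Arbuthnot saw one “male year” after another.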

The National Interest:

NOWHERE IN the world have the latest shocks to the Old Order been more powerful than in the Middle East and North Africa, where massive civic turmoil has swept away long-entrenched leaders in Egypt, Tunisia and Yemen, toppled a despot in Libya and now challenges the status quo in Syria. Over the past sixty years, the only other development of comparable game-changing magnitude was the 1989 fall of the Berlin Wall and the subsequent collapse of the Soviet Union.

It isn’t clear where the region is headed, but it is clear that its Old Order is dying. That order emerged after World War II, when the Middle East’s colonial powers and their proxies were upended by ambitious new leaders stirred by the force and promise of Arab nationalism. Over time, though, their idealism gave way to corruption and dictatorial repression, and much of the region slipped into economic stagnation, unemployment, social frustration and seething anger.

For decades, that status quo held, largely through the iron-fisted resolve of a succession of state leaders throughout the region who monopolized their nations’ politics and suppressed dissent with brutal efficiency. During the long winter of U.S.-Soviet confrontation, some of them also positioned themselves domestically by playing the Cold War superpowers against each other.

The United States was only too happy to play the game, even accepting and supporting authoritarian regimes to ensure free-flowing oil, a Soviet Union held at bay and the suppression of radical Islamist forces viewed as a potential threat to regional stability. Although successive U.S. presidents spoke in lofty terms about the need for democratic change in the region, they opted for the short-term stability that such pro-American dictators provided. And they helped keep the strongmen in power with generous amounts of aid and weaponry.

Perhaps the beginning of the end of the Old Order can be traced to the 1990 invasion of the little oil sheikdom of Kuwait by Iraq’s Saddam Hussein—emboldened, some experts believe, by diplomatic mixed signals from the United States. It wasn’t surprising that President George H. W. Bush ultimately sent an expeditionary force to expel Saddam from the conquered land, but one residue of that brief war was an increased American military presence in the region, particularly in Saudi Arabia, home to some of Islam’s most hallowed religious shrines. That proved incendiary to many anti-Western Islamists, notably Osama bin Laden and his al-Qaeda terrorist forces. One result was the September 11, 2001, attacks against Americans on U.S. soil.

The ensuing decade of U.S. military action in Iraq and Afghanistan produced ripples of anti-American resentment in the region. The fall of Saddam Hussein after the 2003 U.S. invasion represented a historic development that many Iraqis never thought possible. But the subsequent occupation of Iraq and increasing American military involvement in Afghanistan deeply tarnished the U.S. image among Muslims, contributing to a growing passion for change in the region. Inevitably, that change entailed a wave of Islamist civic expression that had been suppressed for decades in places such as Egypt and Tunisia.

Meanwhile, the invasion of Iraq also served to upend the old balance of power in the region. It destroyed the longtime Iraqi counterweight to an ambitious Iran. This fostered a significant shift of power to the Islamic Republic and an expansion of its influence in Syria, Lebanon, the Gaza Strip and the newly Iran-friendly Iraqi government in Baghdad. Iran’s apparent desire to obtain a nuclear-weapons capacity—or at least to establish an option for doing so—tatters the status quo further. With Israel threatening to attack Iran’s nuclear facilities, many see prospects for a regional conflagration, and few doubt that such a conflict inevitably would draw the United States into its fourth Middle Eastern war since 1991.

In short, recent developments have left the Middle East forever changed, and yet there is no reason to believe the region has emerged from the state of flux that it entered last year, with the street demonstrations in Tunisia and Egypt. More change is coming, and while nobody can predict precisely what form it will take, there is little doubt about its significance. Given the region’s oil reserves and strategic importance, change there inevitably will affect the rest of the world, certainly at the gas pump and possibly on a much larger scale if another war erupts.

A CAREFUL look at the history of the Middle East reveals that, although the West has sought to hold sway over the region since the fall of the Ottoman Empire following World War I, its hold on that part of the world has been tenuous at best. This has been particularly true since the beginning of the current era of Middle Eastern history, marked by the overthrow of colonial powers and their puppets.

Like so many major developments in the region, this one began in Egypt. On July 22, 1952, Colonel Gamal Abdel Nasser led a coup that toppled the British-installed King Farouk, a corrupt Western lackey. The son of a postal clerk, Nasser was a man of powerful emotions. He had grown up with feelings of shame at the presence of foreign overlords in his country, and he sought to fashion a wave of Arab nationalism that would buoy him up as a regional exemplar and leader. By 1956, he had consolidated power in Egypt and emerged as the strongest Arab ruler of his day—“the embodied symbol and acknowledged leader of the new surge of Arab nationalism,” as the influential columnist Joseph Alsop wrote at the time…

Read it all.

Washington Monthly:

With financial ties to nearly two dozen drug and biotech companies, Dr. Charles B. Nemeroff may hold some sort of record among academic clinicians for the most conflicts of interest. A psychiatrist, a prominent researcher, and chairman of the department of psychiatry and behavioral science at Emory University in Atlanta, Nemeroff receives funding for his academic research from Eli Lilly, AstraZeneca, Pfizer, Wyeth-Ayerst–indeed from virtually every pharmaceutical house that manufactures a drug to treat mental illness. He also serves as a consultant to drug and biotech companies, owns their stocks, and is a member of several speakers’ bureaus, delivering talks–for a fee–to other physicians on behalf of the companies’ products.

But it was just three of Nemeroff’s many financial entanglements that caught the eye of Dr. Bernard J. Carroll last spring while reading a paper by the Emory doctor in the prominent scientific journal, Nature Neuroscience. In that article, Nemeroff and a co-author reviewed roughly two dozen experimental treatments for psychiatric disorders, opining that some of the new treatments were disappointing, while others showed great promise in relieving symptoms. What struck Carroll, a psychiatrist in Carmel, Calif., was that three of the experimental treatments praised in the article were ones that Nemeroff stood to profit from–including a transdermal patch for the drug lithium, for which Nemeroff holds the patent.

Carroll and a colleague, Dr. Robert T. Rubin, wrote to the editor of Nature Neuroscience, which is just one of a family of journals owned by the British firm, Nature Publishing Group, pointing out the journal’s failure to disclose Nemeroff’s interests in the products he praised. They asked the editor to publish their letter, so that readers could decide for themselves whether or not the author’s financial relationships might have tainted his opinion. After waiting five months for their letter to appear, the doctors went to The New York Times with their story–a move that sparked a furor in academic circles, and offered the public yet another glimpse into conflict of interest, one of the most contentious and bitter debates in medicine.

In his defense, Nemeroff told the Times he would have been happy to list his (many) relationships with private industry–if only the journal had asked. “If there is a fault here,” he said, “it is with the journal’s policy,” which did not require authors of review articles to disclose their conflicts of interest.

And that is pretty much where the debate over conflict of interest in medical journals stands: Should research scientists who have financial stakes in the products they are writing about be forced to disclose those ties? To which the average person might reasonably respond, of course they should. But the more pertinent question is why scientists with financial stakes in the outcome of scientific studies are allowed anywhere near those studies, much less reviewing them in elite journals.

The answer to that question is at once both predictable and shocking: For the past two decades, medical research has been quietly corrupted by cash from private industry. Most doctors and academic researchers aren’t corrupt in the sense of intending to defraud the public or harm patients, but rather, more insidiously, guilty of allowing the pharmaceutical and biotech industries to manipulate medical science through financial relationships, in effect tainting the system that is supposed to further the understanding of disease and protect patients from ineffective or dangerous drugs. More than 60 percent of clinical studies–those involving human subjects–are now funded not by the federal government, but by the pharmaceutical and biotech industries. That means that the studies published in scientific journals like Nature and The New England Journal of Medicine–those critical reference points for thousands of clinicians deciding what drugs to prescribe patients, as well as for individuals trying to educate themselves about conditions and science reporters from the popular media who will publicize the findings–are increasingly likely to be designed, controlled, and sometimes even ghost-written by marketing departments, rather than academic scientists. Companies routinely delay or prevent the publication of data that show their drugs are ineffective. The majority of studies that found such popular antidepressants as Prozac and Zoloft to be no better than placebos, for instance, never saw print in medical journals, a fact that is coming to light only now that the Food and Drug Administration has launched a reexamination of those drugs.

Today, private industry has unprecedented leverage to dictate what doctors and patients know–and don’t know–about the $160 billion worth of pharmaceuticals Americans consume each year. This is an unsettling charge that many (if not a majority) of doctors and academic researchers don’t want to acknowledge. Once grasped, however, the full scope and consequences of medical conflict of interest beget grave doubts about the veracity of wide swaths of medical science. As Dr. Drummond Rennie, deputy editor of The Journal of the American Medical Association (JAMA), puts it, “This is all about bypassing science. Medicine is becoming a sort of Cloud Cuckoo Land, where doctors don’t know what papers they can trust in the journals, and the public doesn’t know what to believe.”…

Read it all.

JP Morgan Mess

May 15, 2012

Via Newsday

Ms Van Pelt Knows

May 15, 2012

This image has been posted with express written permission. This cartoon was originally published at Town Hall.
