GOP Culture Wars

February 20, 2012

Via About

The Nation:

A right thumb, a finger, a tooth. These were the contents of a reliquary acquired several years ago by a collector at an auction in Florence. Little did he know that for centuries the remains had been objects of profane devotion. Last seen in 1905, they had been sliced from the corpse of Galileo, along with another finger and a vertebra, during his highly publicized reburial in the Basilica of Santa Croce in 1737, almost 100 years after his death, and preserved in a slender case fashioned of glass and wood and crowned with a carved bust of the scientist. The reliquary’s new owner consulted Galileo experts about his find, and after the authenticity of its contents had been verified he donated it to the Museo Galileo, which is tucked behind the Uffizi in a quiet piazza overlooking the River Arno. (A dentist asked by the museum to examine the tooth concluded that Galileo suffered from gastric acid reflux and ground his teeth in his sleep.) The rediscovered reliquary is displayed adjacent to a smaller one containing Galileo’s other finger, a prized museum possession since 1927. Nearby are several artifacts of Galileo’s scientific genius: a telescope presented to the Medici and the broken objective lens of the original device with which Galileo sighted Jupiter’s four satellites in 1610.

Galileo was not the first scientist whose corpse was as revered as his corpus. That honor belongs to René Descartes, who was reburied numerous times after his death in 1650, initially to secure the return of his body to French soil and subsequently to install him in the pantheon of French genius. Yet Galileo’s remains in Florence have an added meaning. In 1633 the scientist was tried for heresy, having been accused of violating a 1616 papal decree condemning as contrary to Scripture the idea of a heliocentric universe, first described by Copernicus in On the Revolutions of the Heavenly Spheres (1543). The Florentines who snatched a few of Galileo’s bones in 1737 sought to canonize the scientist as a counter-saint, even as the Roman Catholic Church, with a century of hindsight, relented on its decision to deny Galileo a public burial and monument worthy of his fame when he died. Times were changing, but not rapidly enough for Galileo’s most ardent disciples. Their veneration of a few body parts privately commemorated his martyrdom for the cause of science. The church’s interment of his other remains in a sepulcher adjacent to Michelangelo’s in Santa Croce designated him a heroic embodiment of Tuscan genius and creativity.

Understanding Galileo has been the task of historians ever since he became a mythical figure. His youngest disciple, Vincenzo Viviani, spent more than half a century trying to get his biography right, never quite managing to meet his own impossibly high expectations of how to write about a great scientist. Bertolt Brecht was so mesmerized by the particulars of Galileo’s life that he wrote three versions of it for the stage, the first while in exile from Nazi Germany, the second while in postwar America after Hiroshima and Nagasaki, and the third during his voluntary exile from McCarthy’s America in communist East Berlin. Brecht’s Galileo was simultaneously the victim of a tragedy perpetrated by his society, and the tragedy himself. As Brecht witnessed the evolving role of the scientist in the mid-twentieth century, he began to see similarities between Galileo and J. Robert Oppenheimer, who paid a high price for attempting to work on the Manhattan Project while resisting its core values. In this respect, it may fairly be said that Brecht, more than any modern historian, got Galileo right in recognizing that he belonged to the ages, and that our perspective on him would be ever changing.

The publication of two recent biographies of Galileo, by John Heilbron and David Wootton, coincided with the 400th anniversary of the publication of Starry Messenger (1610), the treatise in which Galileo reported the astronomical observations he had made with the instrument not yet called the telescope. Heilbron, a distinguished historian of physics and mathematics, has spent many years studying the relations between science and religion, including how the Roman Catholic Church stimulated and materially supported a research program of Catholic astronomy. Wootton has previously written on the history of atheism and unbelief, and about Galileo’s controversial Venetian friend Paolo Sarpi—a theologian and tireless critic of the papacy. In Venice there is a statue of Fra Paolo in Campo Santa Fosca commemorating his survival of a botched assassination attempt in October 1607. The cutthroats were sheltered and paid by Rome, yet Sarpi continued to defend freedom of thought and belief, both in conversation and in print, and to discuss science with Galileo. In Heilbron’s account, Galileo is a versatile connoisseur and critic; in Wootton’s, he is all but a modern scientist without faith.

Before the appearance of Starry Messenger, Galileo was known as a poorly dressed, occasionally sarcastic and mechanically adroit college dropout who kept a mistress and had sired three illegitimate children. He admired and imitated the prose of Dante, Machiavelli and Ariosto; he enjoyed reading poetry and liked to draw and tell a good joke. He learned a fair bit of music from his father, Vincenzo Galilei, a Medici court musician, yet rebelled against his father’s desire that he become a physician. He frequently quarreled with his mother, Giulia Ammannati, who seems to have thought that a session with the Florentine Inquisitors might curb her son’s insolence. Wootton makes these fraught familial relations the basis of his depiction of Galileo as a proud, stubborn and sensitive man, a portrait reminiscent of Arthur Koestler’s 1959 account of Galileo as an anti-hero. But Wootton overreaches when he makes some imaginative and not particularly well-substantiated hypotheses about a third (illegitimate) daughter and a late-blooming love affair.

In his formative years Galileo cultivated a highly fertile geometric imagination that would nourish his study of mathematics and physics, and especially mechanics. His invention, when he was in his 20s, of a lightweight hydrostatic balance earned him the admiration of senior mathematicians in Italy. Heilbron lovingly explores Galileo’s resourcefulness by explaining, recalculating and diagramming all his most important insights into the nature of things. This work has been done piecemeal by other historians of science, but it is Heilbron’s accomplishment to have created a complete, accessible yet technical synthesis of Galileo’s findings. Such reading is not for the mathematically faint of heart, but it is essential for understanding Galileo’s science. By integrating this material into a sharp-witted and ironic narrative of Galileo as a man of culture and learning, Heilbron portrays Galileo as a child of the Renaissance, a man who saw the lunar mountains not only through the lens of his telescope and by the point of his compass but also in the context of Ariosto’s fantastic descriptions of them in Orlando Furioso…

Read it all.

Miller-McCune:

By age 10, most people are exposed to enough radiation to be at risk, but the science is so complicated that exposure could even have benefits.

Meet Reference Man, a kind of hypothetical Ken Doll: a 20-something white male, fit and hearty, out in the park doing a hundred one-armed pushups every morning at 5:30. He’s the guy most radiation exposure standards are designed to protect. But as a stand-in, he’s passé.

Reference Man was born when most of the evidence about the health effects of radiation came from high-dose exposures such as nuclear bombs. But the landscape has changed. Exposure now comes from low and often chronic levels of radiation such as medical technologies, which are the fastest-growing source of radiation exposure. Emerging science is eroding central assumptions about radiobiology. Effects at repeated low doses are different and subtler than those from episodic high doses. And mysterious, intertwining, and sometimes contradictory phenomena hint at both serious health risks and surprising benefits: cells communicate extensively about exposures, taking radiation’s influence far beyond the genome; cancer may not be the only harmful consequence; low-level exposure may enable organisms to build up a tolerance that would protect them from high doses; and healthy cells can give radiation-damaged cells the equivalent of a death sentence to stop the threat of disease.

These enigmas, and the fact that individual responses to radiation exposure vary widely, mean Reference Man can’t represent two-thirds of the population: the very young, the very old, the overweight, the immune-compromised — not to mention Reference Woman. Exposure limits based on Reference Man set by federal agencies, along with guidelines from advisory organizations worldwide, have yet to catch up with the strange realities now being revealed.

Is Any Exposure Safe?
Well before Hiroshima or even Madame Curie, people were exposed to natural radiation from cosmic rays and rocks. In the 20th century, humans added to the load with fallout from weapons tests, nuclear power waste, and medical advances like dental X-rays, CT scans, and cancer treatments.

The plethora of units used to measure all these sources gets in the way of understanding what the numbers mean. The activity, or amount of energetic particles emitted into the environment, is measured in curies and becquerels, but the dose absorbed by living tissue, weighted for its biological effect, is expressed in sieverts. Named for the Swedish physicist who was a founder of the International Radiation Protection Association, the sievert scale is the closest thing to a measure of biological effectiveness available. For humans, outright radiation sickness (red skin, nausea, organ failure) starts at about 1.0 sievert. (In this article and accompanying graphics, doses have been converted to sieverts for comparison.)
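
For readers keeping the units straight, here is a minimal sketch of how they relate. The curie-to-becquerel factor and the gray-to-sievert weighting for X-rays and gamma rays are standard physics; the 1.0-sievert reference point is the one quoted in this article.

# Radiation units, as a rough sketch of the relationships described above.
# Activity (how many energetic particles a source emits per second):
CURIE_IN_BECQUERELS = 3.7e10   # 1 curie = 3.7e10 decays per second, by definition

# Dose: energy absorbed per kilogram of tissue is measured in grays (Gy);
# the sievert (Sv) weights that energy for biological effect. For X-rays
# and gamma rays the weighting factor is 1, so 1 Gy absorbed is 1 Sv.
GAMMA_WEIGHTING_FACTOR = 1.0

def gamma_dose_in_sieverts(absorbed_grays: float) -> float:
    """Equivalent dose in sieverts for an X-ray or gamma dose given in grays."""
    return absorbed_grays * GAMMA_WEIGHTING_FACTOR

# Reference point from the article:
RADIATION_SICKNESS_SV = 1.0    # red skin, nausea, organ failure begin here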

Radiation exposure is cumulative over a lifetime, so limits on how much additional radiation it is safe to receive in any time period are set low as a precaution. Everyone in the U.S. is exposed to about 6 millisieverts a year from the combination of medical procedures, natural sources, and remnant radiation from the weapons testing era; by the age of 10, most people will have accumulated enough exposure from all sources to be at some risk.

The EPA’s and Nuclear Regulatory Commission’s current annual exposure limit is 1 millisievert — in addition to the annual estimate of 6 millisieverts mentioned above, so ideally people won’t be exposed to more than 7. But those who actually work around nuclear materials will likely reach or exceed that exposure, and then the precautionary principle becomes surprisingly pliable: the International Commission on Radiological Protection (which is different from Sievert’s organization) recommends an annual occupational limit of 20 millisieverts, while the NRC sets that limit at 50 millisieverts.
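
Putting these limits side by side may help. This is a minimal sketch using only the figures quoted above; the by-age-10 arithmetic simply compounds the 6-millisievert annual average.

# The exposure figures quoted above, all in millisieverts (mSv) per year.
AVERAGE_ANNUAL_MSV = 6        # typical U.S. exposure from all sources
PUBLIC_LIMIT_MSV = 1          # EPA/NRC limit on additional public exposure
ICRP_OCCUPATIONAL_MSV = 20    # ICRP recommended occupational limit
NRC_OCCUPATIONAL_MSV = 50     # NRC occupational limit

# The ideal ceiling for a member of the public in one year:
print(AVERAGE_ANNUAL_MSV + PUBLIC_LIMIT_MSV)   # -> 7 mSv

# Cumulative exposure by age 10, if the 6 mSv/yr average holds:
print(10 * AVERAGE_ANNUAL_MSV)                 # -> 60 mSv, approaching the
                                               # 100 mSv zone debated below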

To arrive at these numbers, regulators and international agencies like the ICRP follow what’s called the linear no-threshold hypothesis. This says that any exposure above zero always creates some risk of harm, and each added increment of dose adds a corresponding increment of risk. The paradox is that regulators have to settle on some exposure limits to protect workers and the public, and these limits imply that below those lines, there is a safe zone — even while they insist there is no threshold of exposure that’s truly risk free.
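
The hypothesis is easy to state as a formula: assumed excess risk is a straight line through zero, proportional to dose, so no dose is entirely risk free. Here is a minimal sketch; the 5-percent-per-sievert slope is roughly the ICRP’s nominal lifetime cancer-risk coefficient, used purely for illustration and not a figure from this article.

# Linear no-threshold (LNT) in one line: excess risk = slope * dose, no safe zone.
RISK_PER_SIEVERT = 0.05   # ~5% excess lifetime cancer risk per sievert (assumed
                          # illustrative slope, roughly the ICRP nominal value)

def excess_lifetime_risk(dose_msv: float) -> float:
    """Excess lifetime cancer risk implied by LNT for a dose in millisieverts."""
    return (dose_msv / 1000.0) * RISK_PER_SIEVERT

# Every dose above zero carries some implied risk, however small:
for dose_msv in (1, 7, 20, 100):
    print(f"{dose_msv:>4} mSv -> {excess_lifetime_risk(dose_msv):.5f} excess risk")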

Recent research has exposed how untenable this position is but has not yet offered much clarity or comfort.

While we are getting better at seeing radiation’s effects on individual cells, it remains difficult to predict the effect of any particular dose on a population. The conventional wisdom regarding what happens to people at exposures below 100 millisieverts has been essentially conjecture; many experts think exposures in that zone pose no harm. But epidemiological studies show there is wide variation in people’s sensitivity to radiation: for example, at 100 millisieverts — a tenth of the dose causing outright radiation sickness — women have about a third higher risk for cancers of solid organs than men, whereas men are significantly more at risk for leukemia, according to the National Academy of Sciences committee that is charged with recommending protection measures. Further, discussions of effects on populations focus on average dose, which doesn’t reflect what actually happens to individuals. “If you gave 1 millisievert to two people standing next to each other and looked at the total energy depositions in their cells,” says Keith Baverstock, a former radiation protection official at the World Health Organization, one might get pummeled and the other escape altogether, yet the average dose to that population would remain the same.

Based on emerging science, Baverstock considers 100 millisieverts “a high dose.” It takes about that much to destabilize a cell’s genome, suggested Mary Helen Barcellos-Hoff of the New York University School of Medicine, and colleagues, in 2008. And this year, German researchers Hagen Scherb and Kristina Voigt said that the ICRP’s risk models lead to “considerably underestimated health risks” at exposures under 100 millisieverts.

Scientists do agree that in our most sensitive population — fetuses — doses as low as 6 to 10 millisieverts are harmful. In 1958, pioneering British epidemiologist Alice Stewart found a 40 percent added risk of leukemia in children whose mothers had abdominal X-rays in that range while pregnant. Scherb and Voigt estimated in 2007 that prenatal exposure at the rate of 1.5 millisieverts a year could cause birth defects and even stillbirths. Limits formerly considered safe seem less and less so…

Read it all.

City Journal:

In the summer of 1755, 23-year-old George Washington galloped back and forth across a blood-soaked battlefield near present-day Pittsburgh, trying heroically but unsuccessfully to rally the panicked British force in which he served to withstand a withering attack by Britain’s French and Indian enemies in a war his own hotheadedness had ignited two years earlier. An Indian chief ordered his braves to shoot down the seemingly fearless six-footer, conspicuous not only for his height and daring but also for being, as Thomas Jefferson later marveled, “the best horseman of his age and the most graceful figure that could ever be seen on horseback.” The Indians fired volley after volley, putting four bullets through his coat and killing two horses out from under him, but he fought on unscathed. Fifteen years later, the same chief told him how vividly he remembered that day, which convinced him that the Great Spirit must have a brilliant future in store for the young officer whom his braves miraculously couldn’t kill no matter how hard they tried.

When Washington’s fellow delegates to the Second Continental Congress unanimously elected him commander in chief of the American armies on June 15, 1775, two months after the shots at Lexington and Concord had launched the American Revolution, he had a similar premonition. He wrote to his wife—“my dear Patcy”—to tell her that he was off to war, explaining that he couldn’t “refuse this appointment without exposing my Character to such censures as would have reflected dishonour upon myself, and . . . have lessend me considerably in my own esteem.” But he also felt that, “as it has been a kind of destiny that has thrown me upon this Service, I shall hope that, my undertaking of it, is designd to answer some good purpose. . . . I shall rely therefore, confidently, on that Providence which has heretofore preservd, & been bountiful to me, not doubting but I shall return safe to you in the fall.”

He was wrong about the timing—it was eight years before he came home—but right about the destiny. And it was in the next 19 months, mostly in New York and fleeing from it, that he knocked on the door of history and entered the pantheon of great men.

Washington loved the theater—Shakespeare, Sheridan, and, above all, Addison’s patriotic Roman tragedy of Cato—and a good thing, too, for running the war required adroit stagecraft. He became the master of appearance, the paragon of role-playing, a virtuoso actor who could move his audience to passion and to tears. And he loved dressing for a role. He designed his first coat, down to the fussiest detail, at 17 or 18; as a French and Indian War colonel, he bedecked himself with gilt buttons, a gold shoulder knot, and gold lace on his hat; and at the end of his life, he was still designing uniforms for himself, puzzling over whether to have embroidery or not, slash cuffs or not—but needing for sure “tasty Cockades (but not whimsically foolish),” incorporating silver eagles, for his hat.

The Battle of Bunker Hill blazed up as he headed toward Boston to take command of the army there. The British had marched 2,300 redcoats straight up the hill on June 17, intending to overawe the Americans by showing that “trained troops are invincible against any numbers or any position of undisciplined rabble,” as General John Burgoyne brayed. The shock and awe were on the other side, though, because the Americans, whom Colonel William Prescott had ordered not to fire “until you can see the whites of their eyes,” didn’t retreat until they had killed or wounded almost half the British, including 90 officers, compared with 430 American casualties out of some 1,200 men. It was a “dear bought victory,” mourned General Sir Henry Clinton; “another such would have ruined us.”

When Washington arrived in Massachusetts on July 2, the Continental Army had taken control both of Dorchester Neck between Boston and the rest of Massachusetts, and of Cambridge across the Charles River to the north, bottling up the sobered redcoats. Trouble was, the Americans had no ammunition for “Months together, with what will scarce be believed—not 30 rounds of Musket Cartridges a Man,” Washington wrote. Not only to make the British believe that they were in his power but also to keep his own men confident, the general had to pretend—convincingly, 24 hours a day, despite his own fear and frustration—that all was well, as he waited, like Mr. Micawber, for something to turn up. “I know that without Men, without Arms, without Ammunition, without any thing that is fit for the accomodation of a Soldier that little is to be done—and, which is mortifying; I know, that I cannot stand justified to the World without exposing my own Weakness & injuring the cause by declaring my wants,” he wrote. “[M]y Situation has been such that I have been obligd to use art to conceal it from my own Officers.” All this “produces many an uneasy hour when all around me are wrapped in Sleep. . . . I have often thought, how much happier I should have been, if . . . I had taken my Musket upon my Shoulder & enterd the Ranks, or . . . had retir’d to the back Country, & lived in a Wig wam—If I should be able to rise superior to these, and many other difficulties, . . . I shall most religiously believe that the finger of Providence is in it, to blind the Eyes of our Enemys.”

Nor was this all. For the first years of the war, Washington endured what his biographer Ron Chernow calls the “Sisyphean nightmare” of having his whole army evaporate on December 31, when their one-year hitches ended. By late November 1775, only 3,500 soldiers agreed to stay past their terms; by year-end, a paltry 9,650 untrained new recruits had signed on, half the number needed. “It takes you two or three Months to bring New men to any tolerable degree acquainted with their duty,” and even longer to bring independent-minded Americans to “such a subordinate way of thinking as is necessary for a Soldier,” Washington lamented. Then, as the end of their terms approaches, you try to cajole them to stay longer, so you “relax your discipline, in order as it were to curry favour with them”—meaning that “the latter part of your time is employed in undoing what the first was accomplishing.” Nevertheless, Washington crowed afterward, during those months “we have disbanded one Army & recruited another, within Musket Shot of two and Twenty Regimts, the Flower of the British Army.”

Meanwhile, Congress had written new roles for him and his army, and Washington had to establish them credibly in the eyes of the British commanders he faced, including General Thomas Gage, the commander in chief and governor of Massachusetts, who had served with him in the French and Indian War 20 years earlier. Little more than a month after taking command, Washington wrote Gage that he had heard reports that American soldiers captured at Bunker Hill, even “those of the most respectable Rank, when languishing with Wounds and Sickness,” had been “thrown indiscriminately, into a common Gaol appropriated for Felons.” Just be aware, he wrote, that we’ll treat British POWs exactly as you treat Americans. You choose: either “Severity, & Hardship” or “Kindness & Humanity.” Gage replied that of course he mixed up officers and enlisted men promiscuously, “for I acknowledge no rank not derived from the king.” This was the wrong response, especially to a newly minted commander in chief who, as a mere colonial officer two decades earlier, had resented having to defer to officers with less merit than he but with royal commissions.

“You affect, Sir, to despise all Rank not derived from the same Source with your own,” Washington thundered back, asserting a new, democratic understanding of legitimacy and worth. “I cannot conceive any more honourable, than that which flows from the uncorrupted Choice of a brave and free People—The purest Source & original Fountain of all Power.” Furthermore, you claim that you’ve shown “Clemency” by not hanging my men as rebels. But it remains to be seen “whether our virtuous Citizens whom the Hand of Tyranny has forced into Arms, to defend their Wives, their Children, & their Property; or the mercenary Instruments of lawless Domination, Avarice, and Revenge best deserve the Appellation of Rebels.” A higher authority than you will decide. “May that God to whom you then appealed, judge between America & you!”…

Read it all.

The Flock

February 20, 2012

Via AJC

Come And Gone

February 20, 2012

This image has been posted with express written permission. This cartoon was originally published at Town Hall.
