The Fine Print

January 10, 2012

Via About

Iran on the Brink

January 10, 2012

Defining Ideas:

For thirty years, America has appeased the Islamic Republic, setting the stage for the current crisis.

Just in the last few months, events have rushed toward a crisis in Iran’s long confrontation with the West. The ongoing civil war in Syria looks more and more likely to end with the ouster of strongman Bashar al-Assad, one of Tehran’s most stalwart regional allies and an important supporter of Hezbollah, Iran’s terrorist proxy in Lebanon. In November, the International Atomic Energy Agency reported evidence suggesting that Iran is carrying out “undisclosed nuclear-related activities,” including the “development of a nuclear payload for a missile.” According to Israeli intelligence, Iran now has enough material for four to five nuclear bombs.

Since the IAEA report was made public, mysterious explosions have rocked Iran. On November 12, a huge blast completely destroyed a military base that housed Iran’s long-range missile development facility, killing the founder of Iran’s missile program and destroying 180 missiles. Another explosion on November 28 seriously damaged a nuclear conversion site. And in December, blasts occurred at the Isfahan oil refinery, a military base in Kerman, and a steel factory making nose cones and other parts for missiles. These attacks have further rattled a regime already on edge over rumors of a possible Israeli military strike and the impact of international economic sanctions. Rioters made up of members of the brutal Basij militia recently attacked and sacked the British embassy in Tehran, either because of British sanctions against Iran’s banking sector or because of internal power struggles among Iran’s leaders.

All these signs of turmoil point to great instability in a region critical to American interests. The dangerous instability would only escalate if a regime ruled by clerics, who endorse an apocalyptic strain of Shia Islam, gained nuclear weapons capability. Yet the roots of this critical moment stretch back thirty years, when America’s policy of serial appeasement of the Iranian mullahs and their aggression first began.

The Islamic Republic of Iran was born in the 1979 attack on the U.S. embassy in Tehran and the kidnapping of our citizens there, 52 of whom were held hostage for 444 days. This blatant violation of international law and challenge to U.S. prestige was met with the whole repertoire of appeasing evasions evocative of the 1938 crisis in the Sudetenland: diplomatic “outreach,” a U.N. Security Council Resolution, a U.N. commission of inquiry into America’s “crimes,” secret third-party negotiations, and, of course, economic sanctions and a trade embargo rendered toothless by the Iranian threat to cut Europe off from Iranian oil.

In the end, this appeasement failed and the West lost Iran. What arose was an oil-rich, jihadist state that immediately began to make good on one ayatollah’s promise that “an Islamic and divine government, much like Iran” would be created in other Muslim nations.

As part of that expansion, in 1982 Iran’s Quds force––the unit of the Revolutionary Guards tasked with exporting the Islamic Revolution––turned Lebanon’s Bekaa Valley into a training facility for a number of terrorist outfits, most importantly Hezbollah. One of these groups, Islamic Jihad, struck America in 1983 with two suicide truck-bomb attacks in Beirut. The first targeted our embassy, killing 17. A few months later, the more devastating attack destroyed the U.S. Marine barracks, killing 241 Marines and other military personnel sent to Lebanon as peacekeepers. A high Iranian official would later say that the Iranians had trained the bombers and that “we were happy” on hearing of the attack.

Though the Reagan administration planned a retaliatory strike on the Bekaa training camps, it was never carried out. Instead, the Marines were pulled out “in a rush,” Secretary of State George Shultz wrote later, “amid ridicule from the French and utter disappointment and despair from the Lebanese.” Osama bin Laden would later refer to this retreat as further evidence that America had “foundations of straw” and could be toppled with attacks that sent troops home in body bags.

This unpunished aggression was followed by other murders and kidnappings of Americans, including CIA station chief William Buckley, who was beaten and tortured to death in 1985 by Imad Fayez Mugniyah, the architect of the Marine barracks bombing. In response, the Reagan administration undertook the disastrous arms-for-hostages operation that unfolded from 1985 to 1986. What came to be known as “Iran-Contra” involved ransoming American hostages held by terrorists in Lebanon by selling Iran several thousand anti-tank and anti-aircraft missiles in violation of an arms embargo. Three hostages were released––but then were immediately replaced by three other kidnapped Americans. The only message sent to the Iranians was that the United States would provide advanced weaponry to a regime that had declared war on it and murdered its troops and citizens.

Iran continues to support terrorist groups that attack U.S. interests.

Throughout the nineties, Iran continued with impunity to provide support and training to terrorist organizations attacking U.S. interests, including al Qaeda. One attack, a joint project of Hezbollah, Iran, and al Qaeda, was the 1996 truck bombing of the Khobar Towers, a residential complex for U.S. Air Force personnel near Dhahran, Saudi Arabia, which killed 19 Americans. A few years later, according to the 9/11 Commission Report, Iran helped al Qaeda members, including future 9/11 hijackers, cross into and out of Afghanistan. More recently, Iran’s Revolutionary Guards have continued to support the terrorists killing our troops in Iraq and Afghanistan, supplying insurgents with anti-aircraft missiles and roadside bombs designed to penetrate protective armor on vehicles. And let’s not forget other unpunished assaults on the United States, such as the kidnapping and detention of three U.S. citizens on the false charge of spying, and the uncovered plot to murder the Saudi ambassador in a public restaurant in Washington, D.C. Meanwhile, Iran moves ever closer to possessing nuclear weapons.

For thirty years, this aggression has been answered with ineffective economic sanctions, or rewarded with diplomatic “outreach” and “meaningful engagement,” the latter from President Obama. Over the years, America has failed to punish the Iranian regime for many reasons, ranging from Cold War geopolitical calculations to Iran’s importance as a major oil exporter. But the most important reason has been a failure to understand the religious motives behind Iranian behavior.

This misunderstanding was evident in 1979, when the West portrayed the architect of the revolution, Ayatollah Khomeini, as a fringe “fanatic,” in the words of Time magazine. Our foreign policy establishment understood the revolution itself as essentially secular, an anti-colonial overthrow of an American puppet who oppressed his people and stymied their aspirations for democratic self-determination.

Yet the roots of Khomeini’s revolution were mainly religious, a reaction against the Shah’s modernization and secularization programs. To the clerical establishment, these were evidence that the Shah was “fundamentally opposed to Islam itself and the existence of a religious class,” as Khomeini preached. To President Carter’s foreign policy team, however, the religious dimension of the revolt against the Shah was secondary, as an Assistant Secretary of State put it, to the “very substantial improvements . . . in living standards and economic and social opportunities” made under the Shah but vitiated by his refusal to give his people the democratic freedom they presumably were now demanding. But as Khomeini would say later, “We did not create a revolution to lower the price of melons.” The point, rather, was to create an Islamic government consistent with Sharia law, like the regime still ruling Iran…

Read it all.

Miller-McCune:

Communities with more primary care doctors enjoy better health, yet those physicians are a dying breed. Here is what some schools are doing to combat the looming shortage.

Technically, all Judy Sweet needs is a blood pressure test. In most doctors’ offices, this would be an in-and-out visit. Sweet’s doctor, however, never rushes her patients. Mary Elizabeth Sokach is a primary-care provider based in Exeter Township, a rural Pennsylvania community about 15 miles west of Scranton. When Sokach walks into the room, she greets Sweet like an old friend, then examines her closely. She asks when Sweet last had an eye exam. (“She’s a phenomenal artist, so we have to keep her hands and her vision going,” explains Sokach.) And she talks to Sweet about sleep and pain management.

Working alongside Sokach is a medical student named Adam Klein. Sokach kneads her fingers along the back of Sweet’s neck and invites Klein to do the same. “There’s tension in the right?” asks Klein. Sokach nods. She’s deduced that Sweet has muscle spasms, the source of years of nightly pains and sleep problems. A simple prescription should end all that discomfort.

Sokach works out of a converted house within earshot of the Susquehanna River; she has spent her life in Exeter. Her practice emphasizes primary and preventive care, and close attention to patients. But medical schools are turning out fewer and fewer doctors like her. Most schools operate under a century-old model that’s better suited to producing specialists than primary-care providers.

This is an ominous trend. Numerous studies show that communities with a high number of primary-care providers per capita have lower medical costs and better health outcomes. By contrast, according to a 2004 analysis of Medicare data conducted by researchers at Dartmouth College, a large market presence of specialists actually lowers care quality. The Association of American Medical Colleges estimates that the United States is on track to have about 90,000 fewer physicians than it needs a decade from now. Half that shortage will be concentrated in primary care.

To help stave off the looming shortfall of physicians, existing schools are expanding and new schools are opening. In 2002, there were 125 allopathic medical schools in the United States. (Allopathic physicians are the ones with M.D. after their names, as opposed to osteopaths, naturopaths, etc.) Today, there are 134. Another nine schools are in development and applying for accreditation. In the 1980s and ’90s, by comparison, only one new medical school opened in the United States.

That’s why Mary Elizabeth Sokach has a medical student working with her today. Sokach is a faculty member — and Adam Klein is a student — at The Commonwealth Medical College in Scranton. It’s a new medical school that’s been funded partly by the state, and partly by Blue Cross of Northeastern Pennsylvania, which has seen health-care dollars leaving the region as people seek care elsewhere.

Commonwealth is part of a cadre of medical schools that are trying out new models of education in hopes of producing more physicians who will stay and practice medicine in their immediate communities. Simply by virtue of being new and free of traditions, these schools are better able to innovate, according to Valerie Weber, the chair of Commonwealth’s Department of Clinical Sciences.

In the newer schools, interacting with patients and examining social factors that influence health are central to the curriculum. Faculty members are prized as much for their skills in the classroom as for their skills in the research lab. It’s a nontraditional approach intended to produce highly traditional doctors — physicians devoted to their communities. What policymakers and health-care experts are now waiting to see is whether it will really work.

To fully understand the link between medical schools and sophisticated hospitals — to understand medical education at all — you need to go back to Abraham Flexner, an educator who conducted a detailed study of American and Canadian medical schools for the Carnegie Foundation in 1910. Flexner visited all 155 medical schools in the United States, and what he found was an appalling number of ill-equipped and exploitative institutions that simply churned out diplomas. Many medical schools didn’t even require a high school education. Quackery flourished, and marginal doctors flooded the market.

To counter what he called the “over-production of uneducated and ill trained medical practitioners,” Flexner suggested a template for medical education that has since been adopted nearly universally: two years of classroom instruction in basic sciences followed by two years of clinical work in a sequence of specialties. To ensure that instruction would be anchored to institutions with proper scientific standards, Flexner recommended that medical schools affiliate themselves with established universities and hospitals. In Flexner’s view, Baltimore’s Johns Hopkins School of Medicine, tied to Johns Hopkins University, was the quintessential example of how to do medical education right. The Flexner Report remains the most influential document in medical education today.

Still, for all the improvements that Flexner helped bring about, his report is often blamed for glorifying hospital research and medical specialization at the expense of primary care. That’s why many of the new medical schools have been experimenting with significant variations on the Flexner model….

Read it all.

Foreign Policy:

When the euro officially entered circulation at the stroke of midnight on Jan. 1, 2002, fireworks lit up the night sky across Europe to celebrate the scrapping of the French franc, German deutsche mark, Greek drachma, and a clutch of other ancient currencies. Brussels hosted an extravagant sound-and-light show, while Frankfurt unveiled a five-story statue of the freshly minted euro as a pop band belted out “With Open Arms (Euro World Song).” “I am convinced,” European Central Bank President Wim Duisenberg declared, that the launch of euro coins and banknotes “will appear in the history books in all our countries and beyond as the start of a new era in Europe.”

The early 2000s did feel like the European moment. Enlightened policy wonks on both sides of the Atlantic gushed about the glamorous new arrival on the global stage. In this magazine in 2004, Parag Khanna described the “stylish” European Union as a “metrosexual superpower” strutting past the testosterone-fueled, boorish United States on the catwalk of global diplomacy. Later that year, economist Jeremy Rifkin penned a book-length encomium, The European Dream: How Europe’s Vision of the Future Is Quietly Eclipsing the American Dream, which was followed by Washington Post reporter T.R. Reid’s unlikely bestseller, The United States of Europe: The New Superpower and the End of American Supremacy. In 2005, foreign-policy expert Mark Leonard explained Why Europe Will Run the 21st Century.

One wonders how well these books are selling today, now that the European dream has become a nightmare for many, with the euro teetering on the brink of collapse and the union that produced it mired in a triple crisis that will take years, if not decades, to resolve.

First, there’s the economic catastrophe. Like the United States, Europe is living through its fiercest financial crisis since the 1930s. Unemployment is high — more than 20 percent in formerly go-go Spain — while growth is almost nonexistent, banks are collapsing, and indebted governments are running out of money. Some countries, among them Britain, Greece, Ireland, Italy, Portugal, and Spain, face the prospect of a generation of hardship.

Second, the economic crisis comes on top of the deepest political crisis the European Union has faced. Its most ambitious project, the creation of a single currency, is in danger of collapse. The principle of the free movement of people, another cornerstone of EU integration, is being challenged as some states reintroduce border controls. Visionary leadership is in short supply. And a disgruntled electorate is turning in droves toward anti-immigrant populism. In his annual address last September, European Commission President José Manuel Barroso admitted, “We are facing the biggest challenge in the history of our union.” A month later, German Chancellor Angela Merkel described the threat to the euro as Europe’s “worst crisis since the end of World War II.” For the first time in my 20 years in Brussels, the splintering of the European Union is no longer science fiction but a real, if still somewhat unlikely, possibility.

The European Union was built on the myth that we are one people with one common destiny — an “ever closer union,” in the words of the 1957 Treaty of Rome that founded what was then called the European Economic Community. We are now discovering that regional and national differences are not dissolving and that Europeans think and act very differently from one another. The British view of the state’s role is very different from the French view. The Greek or Italian concept of law is very different from that of Sweden or Denmark. Latvians have a very different view of Russia from Germans. What an Irishman is prepared to pay in taxes is very different from what a Dane or Belgian will allow.

This lack of unity is Europe’s third and most profound crisis, one that underlies the continent’s economic and political woes. Most Europeans have little idea what the EU stands for in the world, what binds its people together, where it has come from in the past, and where it is going in the future. After more than 60 years of EU integration, 200,000 pages of legislation, and a hefty (and still growing) stack of treaties, we have succeeded in building a European Union without Europeans.

“Yes, but what is a European?”…

Read it all.

When Harry Met Sally, 2012

January 10, 2012

Via Newsday

Early Election Results

January 10, 2012

This image has been posted with express written permission. This cartoon was originally published at Town Hall.
