May 14, 2012
One day last summer, Anne and her husband, Miguel, took their 9-year-old son, Michael, to a Florida elementary school for the first day of what the family chose to call “summer camp.” For years, Anne and Miguel have struggled to understand their eldest son, an elegant boy with high-planed cheeks, wide eyes and curly light brown hair, whose periodic rages alternate with moments of chilly detachment. Michael’s eight-week program was, in reality, a highly structured psychological study — less summer camp than camp of last resort.
Michael’s problems started, according to his mother, around age 3, shortly after his brother Allan was born. At the time, she said, Michael was mostly just acting “like a brat,” but his behavior soon escalated to throwing tantrums during which he would scream and shriek inconsolably. These weren’t ordinary toddler fits. “It wasn’t, ‘I’m tired’ or ‘I’m frustrated’ — the normal things kids do,” Anne remembered. “His behavior was really out there. And it would happen for hours and hours each day, no matter what we did.” For several years, Michael screamed every time his parents told him to put on his shoes or perform other ordinary tasks, like retrieving one of his toys from the living room. “Going somewhere, staying somewhere — anything would set him off,” Miguel said. These furies lasted well beyond toddlerhood. At 8, Michael would still fly into a rage when Anne or Miguel tried to get him ready for school, punching the wall and kicking holes in the door. Left unwatched, he would cut up his trousers with scissors or methodically pull his hair out. He would also vent his anger by slamming the toilet seat down again and again until it broke.
When Anne and Miguel first took Michael to see a therapist, he was given a diagnosis of “firstborn syndrome”: acting out because he resented his new sibling. While both parents acknowledged that Michael was deeply hostile to the new baby, sibling rivalry didn’t seem sufficient to explain his consistently extreme behavior.
By the time he turned 5, Michael had developed an uncanny ability to switch from full-blown anger to moments of pure rationality or calculated charm — a facility that Anne describes as deeply unsettling. “You never know when you’re going to see a proper emotion,” she said. She recalled one argument, over a homework assignment, when Michael shrieked and wept as she tried to reason with him. “I said: ‘Michael, remember the brainstorming we did yesterday? All you have to do is take your thoughts from that and turn them into sentences, and you’re done!’ He’s still screaming bloody murder, so I say, ‘Michael, I thought we brainstormed so we could avoid all this drama today.’ He stopped dead, in the middle of the screaming, turned to me and said in this flat, adult voice, ‘Well, you didn’t think that through very clearly then, did you?’ ”
Anne and Miguel live in a small coastal town south of Miami, the kind of place where children ride their bikes on well-maintained cul-de-sacs. (To protect the subjects’ privacy, only first or middle names have been used.) The morning I met them was overcast and hot. Seated on a sofa in the family’s spacious living room, Anne sipped a Coke Zero while her two younger sons — Allan, 6, and Jake, 2 — played on the carpet. So far, she said, neither of the younger boys exhibited problems like Michael’s.
“We have bookshelves full of these books — ‘The Defiant Child,’ ‘The Explosive Child,’ ” she told me. “All these books with different strategies, and we try them, and sometimes they seem to work for a few days, but then it goes right back to how it was.” A former elementary-school teacher with a degree in child psychology, Anne admitted feeling frustrated despite her training. “We feel like we’ve been spinning our wheels,” she said. “Is it us? Is it him? Is it both? All these doctors and all this technology. But nobody has been able to tell us, ‘This is the problem, and this is what you need to do.’ ”…
How Not to Cure Chronic Diseases: Have Congress pass costly legislation that fails to address the problem it’s trying to solve
May 14, 2012
Members of Congress can be astonishingly clueless about the shortcomings of federal agencies and how to correct them. At a press conference in February, U.S. Senator Barbara Mikulski (D-Maryland) and several bipartisan colleagues unveiled the Spending Reductions through Innovations in Therapies (SPRINT) Act, legislation intended to spur innovation in pharmaceutical research and development for chronic and costly health conditions such as Alzheimer’s Disease, cancer, diabetes, and heart disease. The speeches were heavy on promises but light on insight.
According to the press release, the bill will invest “in public-private partnerships to ensure scientists and researchers are able to develop new safe and effective drugs”; shrink product development timelines; increase the number of drugs in the development pipeline; and expedite “the Food and Drug Administration (FDA) review process so that drugs can be brought more quickly to market to the patients who need them.” But the legislation fails to address what actually impedes the development of important new medicines. Either the legislators who oversee the FDA and control the nation’s coffers are woefully uninformed or they’re poseurs, more concerned with posturing than with public health. (Neither of those would be a first.)
There is currently plenty in the development pipeline. The federal government has already announced a boost in spending on research and development for Alzheimer’s Disease so that the Department of Health and Human Services’ spending in the next fiscal year (FY 2013) alone would exceed $500 million. Moreover, drug companies currently spend more than $65 billion on pharmaceutical R&D, and they know better than anyone that the big payoffs will be for treatments for the kinds of prevalent chronic diseases targeted by the legislation.
For example, more than a hundred drugs are currently in development for Alzheimer’s Disease, dementias, and other cognitive disorders, and almost 900 medicines are being tested for cancer. The treatments involve drastically different approaches, including vaccines, human gene therapy, and orally administered and injectable medicines.
The problem is getting the new drugs through the pipeline and into the marketplace. Unfortunately, government regulation has become a significant obstacle for drug developers and a disincentive for potential investors. Bringing a new drug to market now takes twelve to fifteen years and costs more than $1.4 billion. The number of drugs approved by the FDA each year is trending downward despite significant annual increases in the agency’s budget. Perhaps the most ominous statistic of all is that drug manufacturers recoup their R&D costs for only one in five approved drugs, a deterioration from the one-in-four figure of about a decade ago.
Potential investors are responding to these disincentives. A recent survey of the intentions of venture capital firms reflects the negative impacts of drug and medical device regulation. It revealed that the firms have begun to avoid funding early-stage pharmaceutical and device companies in the United States, and both the dollars and the R&D are increasingly moving abroad.
Thirty-six percent of respondents said they plan to increase investments in life science companies in Europe, while only 13 percent plan to increase investment in U.S. companies; and 31 percent said they plan to decrease investment in life science companies in the United States, compared to 7 percent that plan to decrease investment in Europe. Sixty-one percent of the investors cited regulatory challenges as the primary reason; more specifically, they alluded to dysfunction, unpredictability, and risk aversion at the FDA. The proposed legislation would address none of these endemic problems.
Many of the challenges to pharmaceutical development are caused by the FDA’s excessive risk aversion—unchecked, encouraged, or even created by Congress—that has forced companies to perform ever-larger, longer, more complex, and more expensive clinical trials. Expressing his industry’s frustration at the FDA’s capriciousness and intransigence, Fred Hassan, the former CEO of the drug company Schering-Plough, said of the regulatory climate: “What will it take to get new drugs approved? The point is, we don’t know.” Kenneth Kaitin, director of the Tufts Center for the Study of Drug Development, described the obstructionist culture at the FDA as having caused it to become viewed as “an agency that is supposed to keep unsafe drugs off the market, not to speed access to life-saving drugs.”
The head of the FDA, Margaret Hamburg, continues to deny that her agency is in any way responsible for the crisis in drug development, but the reality is that the FDA has consistently pushed the envelope of its statutory authority in ways that stifle innovation. Although the statute requires only that a new drug be shown to be safe and effective, the agency—sometimes spurred by Congress—has invented new criteria, including a requirement to demonstrate superiority over existing drugs, which it applies arbitrarily. Proving that a test drug is better than existing drugs is often much more difficult and vastly more expensive than just proving that it is safe and effective. If the efficacies of two medicines differ only marginally, the clinical trials must be very large in order to show a statistically significant difference between them.
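To make that point concrete, consider the standard sample-size arithmetic for comparing two response rates. The sketch below is purely illustrative; the response rates and the helper function are assumptions chosen for this example, not figures from the FDA, the bill, or the companies quoted here. It uses the conventional two-proportion formula to show how the number of patients required per trial arm balloons as the efficacy gap between a new drug and an existing one narrows.

```python
# Illustrative only: approximate patients needed per arm to show that one drug's
# response rate beats another's, using the standard two-proportion formula.
from scipy.stats import norm

def n_per_arm(p_existing, p_new, alpha=0.05, power=0.80):
    """Sample size per arm for a two-sided test at the given alpha and power."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for statistical significance
    z_beta = norm.ppf(power)            # value corresponding to the desired power
    variance = p_existing * (1 - p_existing) + p_new * (1 - p_new)
    return (z_alpha + z_beta) ** 2 * variance / (p_new - p_existing) ** 2

# A 10-point improvement over a 50 percent response rate needs about 385 patients
# per arm; a 2-point improvement needs roughly 9,800, about 25 times as many.
print(round(n_per_arm(0.50, 0.60)), round(n_per_arm(0.50, 0.52)))
```

Under those assumptions, a superiority showing against a drug that already works reasonably well can multiply trial size, and therefore cost and duration, many times over compared with a trial that merely establishes safety and efficacy.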
If this superiority criterion were widely implemented, there would be less competition in the drug market, fewer treatment choices for physicians, and higher prices for drugs. Wyeth’s chairman and CEO Robert Essner described the impact of the requirement to show superiority this way: “If you’re the first company to get approved in a certain area and competitors can’t get on the market, the FDA is now establishing monopolies. And that’s certainly not their mandate.” Whatever one thinks of the existing requirements, surely we should not have an FDA that aggressively discourages competition.
The FDA’s demand that a new drug demonstrate superiority has contributed to the much-publicized recent shortages of various drugs. The fewer alternative medicines there are for a given ailment, the greater the vulnerability to shortages if the production or distribution of one of them is disrupted for any reason.
The FDA has begun to impose what amounts to yet another criterion for new drugs: post-marketing studies as a condition of approval. Such clinical trials used to be relatively rare—and were required for patient populations such as the elderly or pregnant women that were not adequately represented in the earlier studies—but are now required in more than three-quarters of approvals.
And in 2008, Congress created still another criterion for new (and certain old) drugs to meet by enabling the FDA to require companies to produce a Risk Evaluation and Mitigation Strategy (REMS) whenever regulators perceived the need for one. A REMS is “a strategy to manage a known or potential serious risk associated with a drug or biological product,” which could be said to encompass virtually every new medicine. The FDA’s explanation continues, “A REMS can include a Medication Guide, Patient Package Insert, a communication plan, elements to assure safe use, and an implementation system, and must include a timetable for assessment of the REMS.”…
The era of U.S.-approved, iron-fisted Arab dictators is over: Washington must get used to a Middle East in which anti-Western sentiment abounds and political Islam emerges as a major force.
May 14, 2012
NOWHERE IN the world have the latest shocks to the Old Order been more powerful than in the Middle East and North Africa, where massive civic turmoil has swept away long-entrenched leaders in Egypt, Tunisia and Yemen, toppled a despot in Libya and now challenges the status quo in Syria. Over the past sixty years, the only other development of comparable game-changing magnitude was the 1989 fall of the Berlin Wall and the subsequent collapse of the Soviet Union.
It isn’t clear where the region is headed, but it is clear that its Old Order is dying. That order emerged after World War II, when the Middle East’s colonial powers and their proxies were upended by ambitious new leaders stirred by the force and promise of Arab nationalism. Over time, though, their idealism gave way to corruption and dictatorial repression, and much of the region slipped into economic stagnation, unemployment, social frustration and seething anger.
For decades, that status quo held, largely through the iron-fisted resolve of a succession of state leaders throughout the region who monopolized their nations’ politics and suppressed dissent with brutal efficiency. During the long winter of U.S.-Soviet confrontation, some of them also positioned themselves domestically by playing the Cold War superpowers against each other.
The United States was only too happy to play the game, even accepting and supporting authoritarian regimes to ensure free-flowing oil, a Soviet Union held at bay and the suppression of radical Islamist forces viewed as a potential threat to regional stability. Although successive U.S. presidents spoke in lofty terms about the need for democratic change in the region, they opted for the short-term stability that such pro-American dictators provided. And they helped keep the strongmen in power with generous amounts of aid and weaponry.
Perhaps the beginning of the end of the Old Order can be traced to the 1990 invasion of the little oil sheikdom of Kuwait by Iraq’s Saddam Hussein—emboldened, some experts believe, by diplomatic mixed signals from the United States. It wasn’t surprising that President George H. W. Bush ultimately sent an expeditionary force to expel Saddam from the conquered land, but one residue of that brief war was an increased American military presence in the region, particularly in Saudi Arabia, home to some of Islam’s most hallowed religious shrines. That proved incendiary to many anti-Western Islamists, notably Osama bin Laden and his al-Qaeda terrorist forces. One result was the September 11, 2001, attacks against Americans on U.S. soil.
The ensuing decade of U.S. military action in Iraq and Afghanistan produced ripples of anti-American resentment in the region. The fall of Saddam Hussein after the 2003 U.S. invasion represented a historic development that many Iraqis never thought possible. But the subsequent occupation of Iraq and increasing American military involvement in Afghanistan deeply tarnished the U.S. image among Muslims, contributing to a growing passion for change in the region. Inevitably, that change entailed a wave of Islamist civic expression that had been suppressed for decades in places such as Egypt and Tunisia.
Meanwhile, the invasion of Iraq also served to upend the old balance of power in the region. It destroyed the longtime Iraqi counterweight to an ambitious Iran. This fostered a significant shift of power to the Islamic Republic and an expansion of its influence in Syria, Lebanon, the Gaza Strip and the newly Iran-friendly Iraqi government in Baghdad. Iran’s apparent desire to obtain a nuclear-weapons capacity—or at least to establish an option for doing so—tatters the status quo further. With Israel threatening to attack Iran’s nuclear facilities, many see prospects for a regional conflagration, and few doubt that such a conflict inevitably would draw the United States into its fourth Middle Eastern war since 1991.
In short, recent developments have left the Middle East forever changed, and yet there is no reason to believe the region has emerged from the state of flux that it entered last year, with the street demonstrations in Tunisia and Egypt. More change is coming, and while nobody can predict precisely what form it will take, there is little doubt about its significance. Given the region’s oil reserves and strategic importance, change there inevitably will affect the rest of the world, certainly at the gas pump and possibly on a much larger scale if another war erupts.
A CAREFUL look at the history of the Middle East reveals that, although the West has sought to hold sway over the region since the fall of the Ottoman Empire following World War I, its hold on that part of the world has been tenuous at best. This has been particularly true since the beginning of the current era of Middle Eastern history, marked by the overthrow of colonial powers and their puppets.
Like so many major developments in the region, this one began in Egypt. On July 23, 1952, Colonel Gamal Abdel Nasser led a coup that toppled the British-installed King Farouk, a corrupt Western lackey. The son of a postal clerk, Nasser was a man of powerful emotions. He had grown up with feelings of shame at the presence of foreign overlords in his country, and he sought to fashion a wave of Arab nationalism that would buoy him up as a regional exemplar and leader. By 1956, he had consolidated power in Egypt and emerged as the strongest Arab ruler of his day—“the embodied symbol and acknowledged leader of the new surge of Arab nationalism,” as the influential columnist Joseph Alsop wrote at the time.
From President Eisenhower’s secretary of state, John Foster Dulles, Nasser secured a promise of arms sales, but Dulles failed to deliver. So Nasser turned to Soviet leader Nikita Khrushchev, who was happy to comply. The young Egyptian leader’s anti-Western fervor impressed Khrushchev, and he saw an opportunity: ship arms to Nasser and gain a foothold in the Middle East. The West’s relations with Nasser deteriorated further when Dulles withdrew financial support for the Egyptian leader’s Aswan Dam project. Nasser, in retaliation, seized the Suez Canal, the hundred-mile desert waterway that cut nearly in half the length of the trade route from the oil-rich Persian Gulf to Great Britain.
Britain joined with France and Israel in a military move to recapture the canal, but Eisenhower sternly forced them to halt their offensive and withdraw the troops. Thus did Nasser manage to secure the canal for his country for all time. Alsop and his brother Stewart, both Cold War hawks, complained that Britain’s options narrowed down to “the grumbling acceptance of another major setback for the weakening West.”…
May 14, 2012
[Editorial cartoon, originally published at Town Hall.]