When Bureaucrats Stymie Science: The CDC’s policy on the meningococcal vaccine wastes taxpayer dollars and threatens public health
January 15, 2012
“If we can save only one child’s life…” is a phrase frequently used to justify one initiative or another. It has been invoked in recent years to promote causes ranging from the installation of seatbelts in school buses to anti-alcohol campaigns directed at pre-teens. But when it comes to saving lives through certain infant vaccinations, public health officials seem not to grasp the concept.
Consider meningococcal disease, a rare but devastating bacterial illness that primarily affects infants and children. Its elimination has been on the U.S. Centers for Disease Control’s list of priorities since 1999, but in early 2010, around the same time that a vaccine to prevent meningococcal disease in infants was submitted to the FDA for approval, the CDC began to show signs of retreating from its earlier resolve. Its motives are unclear.
Recommendations for children’s immunization schedules come from the CDC’s Advisory Committee on Immunization Practices (ACIP). Last year, the Meningococcal Working Group, made up of experts who advise the ACIP, considered a recommendation not to perform routine infant vaccination for meningococcal disease, ostensibly because the burden of the illness in the United States is “minimal” and does not justify the expense of adding the vaccine to the infant immunization schedule.
How does one define a minimal burden? The ACIP report doesn’t specify, and many infectious disease experts would beg to differ with that characterization. Meningococcal disease is both unpredictable and deadly, killing 10 to 15 percent of those who contract it. Among those who survive, as many as one in five will suffer permanent disability from amputations, seizures, paralysis, hearing loss, and learning disabilities—hardly a “minimal burden” for the individuals and the families affected—or for a health care system that shares the costs of lifelong disability. The onset of the disease is rapid and insidious, and it can be fatal within 24 to 48 hours of the first non-specific symptoms.
The narrow window of time to diagnose and successfully treat a patient makes the treatment—instead of the prevention—of meningococcal disease a dubious strategy. That’s why vaccination is so important. Given the success of current U.S. vaccine programs, which have almost eliminated many other infectious diseases, the prevention of meningococcal disease qualifies as an unmet public health need.
The goal of the U.S. vaccination program should be to eliminate meningococcal disease. The preferred strategy to eliminate the disease would include infant immunization, even though—for reasons that are unknown—the incidence of the disease has declined in recent years. In 2007, the CDC estimated that there were 1,000 cases and 130 deaths nationally, compared with 2,800 cases and 300 fatalities a decade earlier. History has shown, however, that spikes in the incidence of meningococcal disease can occur over a short time period and without warning. Moreover, the prevailing strains of meningococcus could evolve to become more virulent. What the ACIP considers a minimal burden could increase rapidly, with dire consequences.
The absence of satisfactory treatment options combined with the high risks for infants makes a compelling argument for aggressive protection against the disease in the United States through infant vaccination, which would complement the existing adolescent immunization strategy by ensuring that immunity is retained.
So who should be vaccinated? The Public Health Agency of Canada calls for use of a meningococcal vaccine as early as two months of age. However, in the United States, the current immunization schedule for this disease begins at two years of age for high-risk children (of whom there are very few) and is recommended for all at age 11. A vaccine for infants as young as two months of age was approved by the FDA in April, but the CDC appears to be wavering on whether to recommend its use for the routine vaccination of infants in spite of its own recent data indicating they are at high risk: The incidence of meningococcal disease is three to seven times higher in infants under one year old than in any other age group.
Many industrialized nations, including Australia, Canada, and 13 European nations, consider the threat of meningococcal disease to be sufficiently high to recommend routine infant vaccinations. The results have been impressive: According to the World Health Organization’s Initiative for Vaccine Research, the 12-year-old infant vaccination program in the U.K. has “had a tremendous impact on the incidence of the disease, resulting in a more than 90% decrease in the number of deaths and clinical cases.” (Significantly, the lower incidence of infections was found only in the single meningococcal strain contained in the vaccine, which offers assurance that the reduction in illness resulted from vaccination, not from a lower ambient incidence of meningococcal infections.)
Also according to WHO, the inclusion of the meningococcal vaccine in infant vaccination programs in the Netherlands and Canada has shown efficacy of 87 to 98 percent in various studies. And in millions of patients, meningococcal vaccines have demonstrated excellent safety profiles, like those of other routinely administered vaccines.
Vaccination against meningococcal infection is a public health imperative, but instead of placing a high priority on the elimination of this disease by promoting vaccination at an early age and ensuring that patients and parents have access to state-of-the-art vaccines, the CDC has been dithering. The possible reasons are worth exploring because they could apply to a wide range of decisions that the government will have to make as it plays a larger role in the funding of healthcare…
January 15, 2012
The Treaty of Versailles, negotiated by the fractious Allies in the wake of the First World War, did not crush Germany, nor did it bring her back into the family of nations. Antony Lentin examines a tortuous process that sowed the seeds of further conflict.
Nearly a century on, perceptions of the Paris Peace Conference and the Treaty of Versailles still bear the imprint of The Economic Consequences of the Peace by John Maynard Keynes (1883-1946), which became a bestseller in the wake of the conference. Bitter fruit of Keynes’ own experience as a delegate in Paris, the book condemned what he branded ‘the Carthaginian peace’. The expression was suggested to Keynes by the South African delegate, General Jan Smuts (1870-1950), who referred to the peace concluded in 201 BC after the Second Punic War, when Rome stripped Carthage of its army, navy and overseas possessions and imposed a 50-year indemnity. Otherwise Carthage was left independent and able to recover economically, which eventually it did. Keynes actually seems to have been thinking of the ‘peace’ of 146 BC, when, after the Third Punic War, the Romans slaughtered the inhabitants of Carthage or sold them into slavery, annexing what remained of Carthaginian territory. In The Economic Consequences of the Peace Keynes quoted and endorsed the German view that the Treaty of Versailles signalled ‘the death sentence of many millions of German men, women and children’.
The book was widely translated, has never been out of print and has never lost its authority. Its success may be attributed to Keynes’ reputation as an economist and the brilliance with which he conveyed the disenchantment shared by many of his colleagues in the British delegation. Neither the acute and prophetic analysis published soon after, Jacques Bainville’s Les conséquences politiques de la paix (1920), which has never been translated into English, nor the detailed refutation of Keynes by Etienne Mantoux, The Carthaginian Peace or The Economic Consequences of Mr Keynes (1944), succeeded in stemming its influence, even though none of Keynes’ predictions were realised while almost every one of Bainville’s were. More recent research contained in two collections of scholarly papers has fared little better. William Keylor, in his contribution to The Treaty of Versailles 75 Years After (1998), and Zara Steiner in ‘The Treaty of Versailles Revisited’, published in The Paris Peace Conference, 1919: Peace without Victory (2001), strove to correct what Steiner calls ‘the misused image of the “Carthaginian” peace’. In The Lights that Failed: European International History 1919-1933 (2005) Steiner repeats that ‘the traditional view’ of Versailles ‘needs to be abandoned’. But still historians have failed to break the Keynesian spell. Is the accepted image wholly illusory, or does it express an aspect of the truth about the peace treaty?
After the ‘war to end war’ extravagant hopes were raised by the Paris Peace Conference, the first and greatest ‘summit conference’ of modern times. Even before the conference opened President Woodrow Wilson, en route from the United States, feared that it might end in ‘a tragedy of disappointment’. At its height, more than a thousand statesmen, diplomats and their staff, representing some 30 nations, were engaged in the business of peacemaking. The British delegation alone numbered over 200. Among them was Harold Nicolson, a junior diplomat who later published another classic of disillusionment, Peacemaking 1919 (1933). Nicolson recalls the conference resembling ‘a riot in a parrot-house’. Fifty-two commissions met in 1,646 sessions to draft reports on subjects ranging from prisoners of war to undersea cables, from the internationalisation of the Kiel Canal to responsibility for the war, all incorporated in a treaty extending to 15 chapters and 440 clauses.
The conference eclipsed any other in the scope of the responsibilities it undertook, with the frontiers of a new Europe of nation states to delineate and treaties to conclude with Austria, Hungary, Bulgaria and Turkey, as well as with Germany. But progress suffered badly from the want of a basic organisational plan. Both Wilson and the British prime minister, David Lloyd George, mistrusted traditional diplomacy, which they believed had contributed to the outbreak of the war. They and the French premier, Georges Clemenceau, insisted on keeping both the shifting agenda and the conduct of negotiations in their own hands.
Wilson sought to establish the League of Nations, his panacea for world peace, as part and parcel of the peace treaties. The opening weeks of the conference were devoted to drafting the constitution, or Covenant, of the League. At the same time a council consisting of the five Allied leaders (of France, Britain, the US, Italy and Japan) and their foreign ministers sat through lengthy presentations of territorial claims from spokesmen of the new states. Clemenceau’s object was above all to ensure the future security of France against Germany, which he was sure would be intent on revenge. For Lloyd George the priority was reparations, which turned out to be the most time-consuming and divisive of all the problems faced.
Nicolson thought the lack of a systematic agenda vitiated the conference from the outset. Instead of getting to grips with the long-term challenge of Germany, the peacemakers found themselves struggling to cope with the distracting sideshows of a dozen minor wars and several sporadic and short-lived Communist revolutions. At the same time they were under domestic pressures from what Lloyd George called ‘the too fierce ardour of an expectant public’. He himself had done much to fan the flames with his electoral pledges to ‘make Germany pay’ and had to return periodically to London to face raucous backbenchers in his Conservative-dominated coalition. Wilson, too, returned temporarily to Washington for the opening session of a Congress dominated by his isolationist Republican opponents, whose suspicions of the League of Nations he failed to allay. Clemenceau was also briefly out of action when an assassination attempt left a bullet in his chest.
Not until the end of March 1919 – fearing that the example of Bolshevism in Russia might prove irresistible to a volatile Europe craving stability, work and bread – did Lloyd George, Wilson, Clemenceau and, to a lesser extent, Prime Minister Orlando of Italy, attempt to grasp the nettle of peace with Germany in ‘a race’, said Wilson, ‘between peace and anarchy’. Accompanied only by interpreters and advisers and meeting daily in 145 private sessions between late March and June, they took all the main decisions themselves as the Supreme Council or ‘Big Four’: ‘Four men’, said Lloyd George, ‘endeavouring to make the world spin round the way it should’.
It was from these closed sessions in stuffy rooms across six weeks of intensive bargaining that the treaty with Germany emerged as a set of improvised arrangements between Allies with different and often conflicting aims on such contentious territorial issues as Danzig, the Saar and the Rhineland, over which they deliberated at length. At various times one or other would stalk out of the room, threaten to leave the conference, or in Orlando’s case, to do so: the Big Four became the Big Three. ‘How did you get on?’ Clemenceau was asked after one stormy session with Wilson. ‘Splendidly’, he replied. ‘We disagreed about everything.’ On another occasion Clemenceau came close to blows with Lloyd George, whom he accused, not without cause, of serial duplicity. Wilson, exasperated at the demands of both Clemenceau and Lloyd George, ordered the SS George Washington to prepare for his early return. They all stuck it out – Orlando came back in the end – accepting that compromise was inevitable if the conference was not to collapse; but the compromises reached only after immense difficulty and heart-searching were between the Allies, not between them and Germany…
Diss ‘Like’: It’s, like, not a verbal tic. It’s an epidemic, symptomatic of our thought-thin, blathering age
January 15, 2012
Depending on how you do the math, there are between a quarter-million and a million words in the English language. The 20-volume second edition of The Oxford English Dictionary boasts north of 291,500 entries. A Texas-based outfit, the Global Language Monitor, puts the number of words at 1,010,649—that’s as of May 24, 2011—and growing at the rate of one new word every 98 minutes. Whatever the count, there are plenty of words to go around.
Of all these hundreds of thousands of words, only one do I hold in contempt. That word is “like”—not the tepid expression of mild appreciation but the parasitic form that now bleeds the mother tongue, marks the user as a dunce, and, were it truly understood, scandalizes our schools.
No word has less meaning or says as much about what has become of education.
It is easy—and fashionable—to dismiss it as a personal pet peeve (a pedagogical hypersensitivity), a verbal tic (like Tourette’s, a disability that, though embarrassing, calls for accommodation, not correction), or a sophomoric affliction akin to acne—soon to be outgrown and impolite to point out. It amuses others as an endearing aspect of the ingénue who texts through class and surfaces now and again, with hand raised, bursting with earnestness to volunteer that “like, when I, like, think about this, I, like. … ” (“Thank you, Heather,” says the instructor, grateful for any relief from his or her own monotonous monologue.) Then there are those who are merely disdainful, content to ridicule the afflicted, and take it on as part of the sackcloth and ashes that goes with being a teacher. The collective response of the academy: feigned deafness.
But having spent the last 30 years in the company of the possessed, I have come to view “like” as something more pernicious, a kind of carrier, like the flea that brings with it the plague. It is the byproduct of a culture that is loath to set standards, pathologically averse to confrontation, and prostrate in the face of precipitously declining verbal and writing skills. The endless cascade of “likes” is nothing more than the sound of our own collective dereliction, a verbal cue that we care less about our students and more about teaching to the test, conducting perfunctory self-assessments, winning student-evaluation pageants, and maintaining the illusion that all is well—the oral equivalent of grade inflation.
If we have succeeded in leaving no child behind, it is perhaps because we have condemned children to a common cluster of mediocrity, like, you know, unable to speak, and worse yet, unaware of the severity of their impediment. It is a form of cruelty, a velvet noose they will forever wear around their necks as they venture out into the world, presenting themselves in job interviews, addressing clients, patients, judges, customers, and peers, reminding one and all with whom they come into contact that, notwithstanding a diploma and high GPA, they have failed and, worse yet, we have failed them.
To those who imagine that “like” is merely empty calories, think again. It is used as the connective tissue, for many the basic webbing upon which all sentences are formed. These are sentences whose destination is a mystery even to the speaker. It is a way to buy time, a stalling device that keeps the sentence aloft even when the air is no longer under its wings. It creates the illusion of forward movement but imparts no progress toward an idea or a position. Such a sentence hovers, hoping that some direction will clarify itself before exhaustion—the speaker’s, the listener’s, or both—takes hold and the sentence collapses from sheer vacuity.
“Like” is merely an adhesive that, ironically, holds together unlike elements. It represents the antithesis of forethought, is inimical to critical thinking, a counterfeit expression, a poseur emboldened by years of self-indulgence and pedagogical neglect. It is a kind of get-out-of-jail-free card, relieving the speaker of accountability. It tells students that the world is so intrigued by what they have to say that it is willing to clean up after them, to sift through the verbal refuse for the nuggets concealed within.
Where “like” is the norm, silence is abhorred and the oxygen requisite to contemplation denied. Reflection, in a time of instant messaging, seems as quaint as the quill pen and the flickering of candles.
We extol critical thinking but wince at making room for the quiet period that would allow it entrance. We attach a premium to spontaneity, even if it produces blather. Better to have the mouth moving than the wheels of the brain quietly turning.
There is (or, at least, was) a difference between classroom participation and contribution. Half the time when students speak, it is like listening to the Delphic oracle—the meaning is so elliptical and vague as to be indecipherable…
January 15, 2012
This cartoon was originally published at Town Hall.