The Campaign-Finance Stories That Don’t Get Written: Consultants and insiders feed the fundraising frenzy. How much do they make, anyway?
April 20, 2012
There was something comically self-evident about the headline on the story that led the April 13 print edition of The New York Times: “Campaigns Plan Maximum Push to Raise Money.” Unless the political world is struck by a wave of Gandhian self-abnegation, it is nearly impossible to imagine a scenario that would justify the opposite headline: “Campaigns Languid About Money in White House Race.”
The actual story, by Nicholas Confessore, was standard-issue money-in-politics journalism. It offered vague estimates that as much as $1 billion each could be spent on behalf of Barack Obama and Mitt Romney; it predicted the final breakdown of the federal public financing system; it included the obligatory ain’t-it-terrible quote from the president of Common Cause; and it detailed the role that super PACs and state party committees will play in the upcoming presidential race.
What made the article relevant for our purposes was both its placement on the Times front page and the generic qualities that make it almost certain to be replicated in the weeks ahead. The next day, April 14, Politico featured a story by Kenneth P. Vogel and Robin Bravender headlined, “Democratic super PACs gain ground but still trail GOP in fundraising.” Confessore returned to the topic April 16 in a widely quoted blog post for the Times detailing an internal Romney memo that put the campaign’s fundraising goal (in conjunction with the Republican Party) at $800 million, with an estimated additional $200 million slated to come from super PACs.
Money in presidential politics is a worthy topic, covering everything from the contortions that candidates go through to raise it (Obama, according to Confessore, has already held more than 100 major fund-raisers) to the presidential access major donors receive in all administrations (remember Bill Clinton’s White House coffees and Lincoln Bedroom sleep-overs?). But the problem with most campaign spending coverage (and I am not trying to single out either Confessore or the Times) is what it leaves out. The stories almost invariably reflect the narrow worldview of campaign consultants and politics insiders, which holds that more money always equals more votes in presidential politics.
Missing from the equation is skepticism about the self-interested role of political insiders and campaign consultants in ballyhooing the merits of unlimited campaign spending. Good reporters would not be swayed if prominent Realtors trumpeted the benefits of home ownership over renting, but there is a long tradition of glossing over the built-in bias of campaign ad-makers and strategists when they prophesy doom if candidates fail to raise more money to pay for their services. A typical example is the April 16 Associated Press dispatch by Ken Thomas about the president’s March fundraising haul, which noted, “Obama’s campaign team has tried to generate a sense of urgency, telling donors they need to get involved because of Republican-leaning super PACs aiming to raise hundreds of millions of dollars to defeat the president.” Though the story distanced itself from the fundraising frenzy by attributing the “sense of urgency” to the Obama camp, nothing in the article challenged the underlying money-always-talks ethos of campaign consultants.
No, I have not been reading Rebecca of Sunnybrook Farm and taking my political cues from Doctor Pangloss. Cash-on-hand calculations can shape destiny in presidential primaries, such as when former Minnesota Gov. Tim Pawlenty, once regarded as the most serious rival to Romney, dropped out after losing the Iowa Straw Poll last August. Rick Santorum, in his first interview after bowing to Romney’s inevitability, invoked the political aphorism: “Every presidential campaign ends for the same reason—you run out of money.” And in congressional elections, too, money really shouts; this year, super PAC spending by groups like Karl Rove’s American Crossroads may hand the Republicans a decisive edge.
But the presidential general-election campaign is a huge exception to these political orthodoxies. With four days of national conventions plus three presidential debates—plus nearly four years of the Obama presidency and the GOP primary campaign—voters have such strong impressions of the candidates from news coverage that TV commercials only matter at the margins. (The independent-expenditure Swift Boat ads of 2004 that helped sink John Kerry represent a now-outmoded counterexample. The spots first aired at the moment of Kerry’s maximum vulnerability: when he was exhausted and nearly broke after winning the Democratic primaries. Kerry was also the last Democratic presidential nominee to accept federal funding for the fall campaign and the spending limitations that came with it.)
That is why the only financial issue that matters in shaping the outcome of the fall presidential election is whether the campaigns are in rough parity. A $50 million or $100 million difference between the Obama and Romney forces in a $2 billion race would be a rounding error, not a major strategic advantage. But many reporters get caught up in the raw data of campaign fundraising totals and independent expenditure figures that are released by the Federal Election Commission, and write with the assumption that these numbers are invariably meaningful. In contrast, a Washington Post article in late March by Dan Eggen represented a praiseworthy effort at debunking the conventional wisdom: “Super PACs could have a more limited impact on the general election than it appears from the Republican primaries, where they have dominated spending in part because most of the candidates have raised relatively little.”
When presidential horse-race reporters compare piles of candidate contributions and super PAC swag, they assume that all campaigns are equally efficient in deploying this cash on the political battlefield. What raw fundraising numbers hide is whether anyone is getting rich (or richer) as they try to elect a president. Almost never asked are questions like: How much personal profit are the ad-makers, the outside strategists, the pollsters, and the fundraising consultants making? What are their contractual arrangements, under which commissions can run as high as 8 to 10 percent of the TV ad buy? Which campaign is being more parsimonious with donor dollars?…
When Bad Is Good: Artworks that offend propriety are filling auctions, museums, and galleries. Is there anything left to be upset about?
April 20, 2012
“There is nothing worse than good taste,” thundered the English art critic Jonathan Jones in the Guardian in 2010. “Nothing more stultifying than an array of consumer choices paraded as a philosophy of life. And there is nothing more absurd than someone who aspires to show good taste in contemporary art.”
The occasion for such hyperbole was an exhibition of Damien Hirst’s work, at Paul Stolper gallery in London, widely derided by critics. Having often campaigned aggressively for Hirst’s status as a genius, Jones was defending himself against his peers on the slippery slopes of “taste.”
“Where being interested in Hirst would once have counted as good taste in terms of today’s art, it now stands exposed as bad taste,” wrote Jones. “I am happy to display the bad taste of still being interested in him.”
If, as Jones asserts, bad taste is nothing but good taste after a few years of aging, declaring support for someone with a bruised reputation can be just a clever way of getting a jump on next season’s fashion.
But the larger question is whether bad taste is even a consideration anymore. And if so, what might it mean? Take, for example, Hirst’s 2007 sculpture For the Love of God (a.k.a. the “Crystal Skull”), a platinum cast of a human cranium encrusted with 8,601 flawless diamonds: the piece was interpreted by some as a commentary on wretched excess in an art world awed by glamour and swimming in cash, but, whatever Hirst’s satiric intention may have been, it was hard to detect in the asking price of £50 million.
Through canny marketing and promotion, the skull became Hirst’s most talked-about work since The Physical Impossibility of Death in the Mind of Someone Living, from 1992. Crowds lined up around the glass case when the “Crystal Skull” went on view at London’s White Cube Gallery and then at the Rijksmuseum in Amsterdam, even as critics fumed.
The Maurizio Cattelan retrospective at the Guggenheim Museum in New York this past winter was another case of an artist whose bad-boy image—and impish mockery of same—has been immensely profitable for him and a winning ticket with audiences and some of the press. It didn’t seem to matter that the show received a shellacking from most of the critics; the spectacle nevertheless attracted record crowds (roughly 4,000 people a day), prompting the museum to add extra hours to accommodate them.
Cattelan, like Hirst, has hit on a formula that forecloses on the possibility of an audience’s feeling insulted. Only a tiny number of Catholics took umbrage at La Nona Ora, Cattelan’s 1999 sculpture of Pope John Paul II struck by a meteor, and even they weren’t sure why they should be offended. When the piece sold at auction in 2004 for $3 million, Cattelan’s act of smirking impiety was confirmed as a high-priced collectible. As Peter Schjeldahl wrote in the New Yorker, Cattelan’s career “reveals, or even fortifies, the fact that self-parody has become the life-support system of international art infrastructures. Make people feel smart, and they will put up with anything. The mindset cannot be outflanked or overturned, because it routinely performs those operations on itself.”
Bad taste often passes for avant-garde taste these days—so long as the artist signals “transgressive” intent. And whereas kitsch in art was once to be assiduously disdained, art that traffics in sentimentality and bathos behind a dancing veil of ironic laughter has become highly prized. Jeff Koons, John Currin, Lisa Yuskavage, Richard Prince, and Takashi Murakami are just a few of those who have learned that coy subversion can be popular and lucrative. As long as everyone is in on the joke that the art is satirizing its own historical codes of representation, there is nothing to be upset about.
More difficult to place is outsider art, a genre that has expanded over the last 20 years from focusing on traditional folk crafts to including obsessive outpourings by the mentally ill and flea-market pickings. Jim Shaw’s influential “Thrift Store Paintings” exhibition and book, from 1990, assembles and appropriates works by unknowns that prove to be as disturbing and complex as anything dreamed up by a schooled Surrealist. The Museum of Bad Art, founded in 1994 and now with three galleries in the Boston area, collects works by amateurs that, as its website says, have “a special quality that sets them apart in one way or another from the merely incompetent.” The sincerity and conviction of the ineffectual artist are often what make the work so moving.
Further muddying the issue are the many painters and sculptors who now make deliberately “bad” art. The awkward figuration and ugly color harmonies in the canvases of Albert Oehlen and Werner Büttner, for example, are polemical sorties against good taste and look back to late de Chirico, Picabia’s nudes from the late 1940s, and Magritte’s période vache, all of which were similarly directed against academic surrealism. The proudly slipshod handiwork of Martin Kippenberger has spawned a school of admirers and earned him a MoMA retrospective in 2009. The gimcrack, bauble-encrusted assemblages of Rachel Harrison and Joana Vasconcelos owe as much to Kippenberger as to Rauschenberg.
Art made in a riotous spirit of bad taste not only undermines academic notions of correctness and stability, but it also renders itself virtually impervious to criticism, arming itself against attack from realists, modernists, Minimalists, and Post-Minimalists alike by gleefully confessing to its own intentionally questionable quality…
IN THE FOURTH century, the Roman emperor Constantine laid the foundations for a remarkably durable venture. This was the spread of Christian empire. Constantine’s own empire would divide into Eastern and Western halves, from which multiple Christian empires emerged. One was Byzantium, and another was the Holy Roman Empire. Later European empires were legion: the Spanish, the Portuguese, the French, the Russian, the Austro-Hungarian, and the British empire, with its many claims on modern geopolitics. “Christianity is an inherently expansive faith,” Andrew Preston writes, and this faith has often accompanied imperial expansion. Church and empire were inclined to march together. Or so it must have seemed in Europe until World War I—“Christendom’s ultimate civil war,” in Preston’s words.
All the great Christian empires are now dead. France, Spain, Portugal, Austria, Russia, and Britain are no longer empires, and these diminished modern states currently do little to align their foreign policy with Christian causes. If the European Union is a federation of states, and one to which only Christians need apply, it is no empire, and its official language is studiously un-Christian.
Preston’s new book on religion and foreign policy is not about Europe. It is about America and Americans. In over six hundred pages, Preston charts the scope and the centrality of religion in American politics, from the seventeenth century to the present. This book merges American history with the history of Christianity, and in doing so it qualifies the story of Christian empire. Unlike the Christian empires of the past, America has never had an established church. Nor did the American Revolution result in empire. The animating spirit behind much of Preston’s narrative is Christian republicanism, and no Christian republic has ever had the territory or the influence or the power that the United States would come to possess.
Preston’s argument is worth outlining in detail. It has the shape of a double helix. One strand entails the melding of Christian sentiment with state power, through diplomatic maneuvers and the waging of war. This is the sword of the spirit, cherished by the Puritans and by George W. Bush alike. The other strand inverts the ideal of the church militant, appealing instead to a Christian hunger for international peace, for the beating of swords into ploughshares, for a fraternity of nations liberated from war. This is the shield of faith. Preston weaves these metaphors, both taken from Paul’s letter to the Ephesians, into a sweeping historical analysis.
Seeking to explain why “U.S. foreign policy has often acquired the tenor of a moral crusade,” Preston first turns his attention to the seventeenth century. Avidly Protestant, “the American colonies never underwent a counterreformation,” he observes, and they waged almost continuous war against enemies deemed theologically other—i.e. Catholics and Native Americans. These Christian soldiers prided themselves on fighting holy wars, regularly fitting themselves into Old Testament patterns, the New World’s Israelites imbued with “a consistent belief in America as a chosen nation and in Americans as a chosen people.”
Going forward, Preston accents the Protestant origins of the American Revolution. London was equated with Rome, and “the new political order [in America] newly codified a very old and very Protestant tradition of hostility to arbitrary power,” Preston observes. American historians have outdone themselves in analyzing the Founders and the Enlightenment, the legacy of Hume and Montesquieu in American political thought. Preston notes that “Adams, Washington, and especially Jefferson cited Milton to justify or explain their political views,” citations that reflect the rise of an American-style Christian republicanism. In the place of an established church, and opposed to the Church of England, not to mention the Church of Rome, was the First Amendment to the Constitution.
America’s Christian republicanism could be warlike, and it could just as well be pacifist. A Vermont newspaper labeled the War of 1812 “a holy war,” while this same war so outraged other (no less devout) New Englanders that they publicly debated secession from the Union. The War of 1812 provoked “the first truly pacifistic antiwar movement” in the United States, Preston writes. Antiwar movements would continue to emanate from New England for centuries to come. In antebellum America, Christian republicanism nurtured the abolitionist spirit, and the Civil War was (among other things) a war over the proper relationship between the Christian faith and the American polity.
Preston applies a consciously contemporary vocabulary to the Civil War. This was “the nation’s first war of humanitarian intervention,” he states, with North and South construed as separate countries, one advanced and the other backwards. Abolitionists defined the Union’s campaign as “a war of liberation.” The Civil War marked another portentous development: the entry of Catholics into American civic life. What had been implacably Protestant, in the American self-conception, was becoming more broadly Christian and was destined to become Judeo-Christian in the twentieth century. Catholics, followed by Jews, did a great deal to link America to the outside world. So did millions of Protestant missionaries in the far-flung lands they were laboring to convert. In the second half of the nineteenth century, these American missionaries were “the brokers of global cultural exchange,” Preston argues, just as the United States was inserting itself into the global economy and scrambling for empire with the great European powers.
Between World War I and World War II, pacifist aspirations kept colliding with the call for war. In fact, the American presidents of this period could only justify overseas war by promising international peace. Woodrow Wilson was the first to do so, motivated in his foreign policy by “Christian reformism,” as Preston calls it. Wilson was drawn forward by his vision of a League of Nations, which was to be headquartered in Geneva, “the birthplace of Calvinism and the seat of Reformed Protestantism,” Preston reminds us. Wilson’s dreams collapsed beneath the opposition of more conservative American Protestants.
Where Woodrow Wilson failed, Franklin Roosevelt succeeded. FDR’s was a “serene spirituality,” and no less tenacious for its serenity. Synthesizing centuries of historical experience, FDR held “the Christian republican view that religion was the source of democratic freedom because it was the source of conscience and private belief,” Preston writes. Roosevelt pushed this conviction in an ecumenical direction. Catholics and Jews were invited to participate in an American project sure to outshine the authoritarian evils of Nazi Germany and imperial Japan…