Iran’s Nuclear Research

February 16, 2012

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

GOP Ink

February 16, 2012

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

The Walrus:

CANADA EXISTS FOR NO NATURAL REASON

Let’s begin with an obvious fact no one will admit: Canadians and Americans are more or less the same people. A Torontonian in New York does not stick out, while a Kentuckian well might. Neither does a resident of Medicine Hat, Alberta, feel out of place in Butte, Montana, though a Vancouverite definitely would. Which is not to say that no significant differences exist between Canadians and Americans — just that our shared national border, unlike those of Europe, was not shaped by linguistic and ethnic variations. The War of 1812 made all the difference here. A complicated and unpleasant struggle, mostly forgotten, sundered our two countries. And that struggle is now 200 years old, which makes this as good a time as any to start remembering.

CANADA EXISTS BECAUSE OF THE WAR OF 1812

Military historians generally describe the War of 1812 as a stalemate. After two and a half years of fighting, not much changed between the United States and England, nor between the United States and Canada. But the war — as much as the more decisive Battle of the Plains of Abraham, the American Revolution, and the Civil War — foretold North America’s political shape, its current reality. For the US, the war confirmed its status as a sovereign state and tested the limits of manifest destiny. On this side of the border, the matter is much simpler: if we hadn’t won the War of 1812, we wouldn’t be Canadian.

CANADA EXISTS BECAUSE OF TAXES (AND TAX BREAKS)

In a continental irony, after the revolution the new American government had to raise taxes far higher than British authorities had ever dreamt of doing, to finance the overthrow of Westminster rule. Across the border, the British suddenly realized colonists could easily grow alienated. So they lowered taxes and offered prospective settlers of Upper Canada 200 acres of free land. Loyalists and late Loyalists, followed by tax exiles and land speculators from the newly united States, quickly populated what would become the province of Ontario. Upper Canada’s settler population ballooned, from 6,000 in 1785 to 14,000 in 1791, with men and women looking for opportunities and willing to wink at their US citizenship — just as British officials willingly welcomed them back from their flirtation with liberty, without too many questions.

So in the days leading up to the war, it was optimistic but not preposterous for Representative John A. Harper of New Hampshire to predict that Canadians would greet American soldiers as liberators: “They must sigh for an affiliation with the great American family — they must at least in their hearts hail that day, which separates them from a foreign monarch, and unites them by holy and unchangeable bonds, with a nation destined to rule a continent.” They would not, after all, be invaded by a foreign people. Canadians would be brought back into the fold of American Revolutionary ideals.

What occurred two centuries ago was more or less a family feud. The Pulitzer Prize–winning historian Alan Taylor titled his 2010 survey of the conflict The Civil War of 1812 — a phrase that captures the profound connection between the combatants, and the unstable relationship between the Empire and the Republic. “Brother fought brother in a borderland of mixed peoples,” he writes. Those on both sides instantly recognized, and noted, the unique horror of firing on people so like themselves.

CANADA EXISTS BECAUSE OF HUBRIS

The reasons the United States invaded Canada were, and remain, contentious and unclear. Officially, residues of the revolution — unresolved issues of maritime law, military conscription, and possession of the Ohio Country — led to the declaration of war on June 18, 1812. But the unofficial reasons — the prize and the odds of success — were grubby, petty. “United States,” then as now, was something of a misnomer, and the war emerged out of squabbling between the majority Jeffersonian Republicans, who hungered for expansion, and the minority Federalists, who benefited from close economic ties with Britain. By declaring war, the Republicans intended to make the Federalists look anti-patriotic and undemocratic. Both parties, however, believed the conquest of Upper and Lower Canada would be a cakewalk. At sea, the Royal Navy, though sixty times the size of the fledgling US Navy, was distracted by Napoleon’s ships; on land, nearly eight million Americans would square off against just 300,000 Canadians. In August 1812, former president Thomas Jefferson declared, “The acquisition of Canada, this year, as far as the neighborhood of Quebec, will be a mere matter of marching.”

CANADA EXISTS BECAUSE OF A SERIES OF LUCKY BREAKS

The comedy of errors began immediately. Even though the United States declared war, which should have given its soldiers the advantage of surprise, a messenger carried the news to Canadian military outposts on the Niagara River before it reached the New York side. British soldiers quickly detained several important — and unsuspecting — American personages on our side of the river. The initial blunder served as a telling prequel to subsequent disasters.

CANADA EXISTS BECAUSE OF BLUNDERING AMERICANS

The grand view of history has traditionally offered two paths for the interpretation of events. The first imagines social trends rising up from below to sweep humanity along in their irrepressible, all-powerful waves. The other dreams of iconic figures who shape history through their own vision and will. In the case of the United States during the War of 1812, we find neither. Instead, a third way emerges: history dominated by stupidity and impulse. From the revolution to the present moment, hardly a single generation of Americans has passed without giving rise to a bona fide military genius. The Civil War alone produced half a dozen. To Canada’s good fortune, the post-revolution US Army was stacked with bunglers and officers past their prime. It might have taken Canada easily, if not for the miraculously systemic idiocy among the top brass…

Read it all.

City Journal:

Squelching rumors this past fall of a presidential run, New Jersey governor Chris Christie observed that he had lots more to do to fix a “broken” state. He wasn’t kidding: though already the nation’s most heavily taxed state, New Jersey can balance its budget only by ignoring billions of dollars in employee pension liabilities and by slashing aid to struggling local governments. Christie has pushed through reforms that cut spending and cap property-tax increases. But he has only begun to grapple with an institution that bears much of the responsibility for the state’s fiscal woes: the New Jersey Supreme Court.

For half a century now, New Jersey has been home to the most activist state appellate court in America. Lauded by proponents of “living” constitutions who urge courts to make policy rather than interpret the law as written, the New Jersey Supreme Court has profoundly transformed the Garden State by seizing control of school funding, hijacking zoning powers from towns and cities to increase subsidized housing, and nullifying taxpayer protections in the state constitution. Its undemocratic actions have blown apart the state’s finances and led to ill-conceived and ineffective policies. If you want to understand what rule by liberal judges looks like on the state level, you need only look at New Jersey, which is teetering on bankruptcy though it remains one of America’s wealthiest states.

In January, Christie nominated two new members to the court, appointments that have the capacity to reshape the seven-member panel. But taming the court won’t be easy, even for the pugnacious Christie, whose initial efforts to reform it met ferocious resistance. “I don’t think the supreme court has any business being involved in setting the budget of the state government,” Christie complained last year. Yet it is involved, extensively—and that must change if Jersey taxpayers are ever to find relief.

New Jersey’s supreme court, charged with hearing cases brought to it from lower judicial levels, is the product of the state’s 1947 constitution, which replaced an unwieldy 16-member Court of Errors and Appeals with today’s seven-member body, appointed by the governor and confirmed by the state senate. The dean of New York University’s law school, Arthur Vanderbilt, served as the new court’s first chief justice. Vanderbilt is best remembered today for persuading President Dwight Eisenhower to appoint William Brennan, at the time also a Jersey justice, to the U.S. Supreme Court, where Brennan led the liberal activist wing for more than three decades.

As chief justice for nine years, Vanderbilt helped forge the New Jersey Supreme Court’s expansive understanding of its role. For instance, he wrote the majority opinion in Winberry v. Salisbury, a decision that gave the court itself, not the legislature, the power to make rules for the state judiciary. That ruling set New Jersey’s judiciary apart from the court systems in most other states—as well as from the federal judiciary, which ultimately derives its authority from Congress. Some critics have even argued that Winberry violates the U.S. Constitution’s guarantee that every state must have a republican form of government. “Under the doctrine of Winberry v. Salisbury,” wrote New Jersey lawyer Anthony Kearns in a 1955 ABA Journal article, “we can only conclude that laws of practice and procedure are exclusively in the hands of men who are not elected.”

Since Winberry, the court has usurped the roles of the governor and the state legislature in many other areas, relying on questionable readings of the New Jersey Constitution to pursue its own views of justice. But nowhere has the court’s ambition had a bigger or more disastrous impact than in education policy, particularly with a series of decisions, collectively known as Abbott v. Burke, that have massively extended judicial control over the Jersey schools.

The Abbott cases initially resembled dozens of “fiscal-equity” lawsuits filed around the country beginning in the late 1960s. These suits challenged education funding levels for urban school districts, arguing that because schools were financed through local property taxes, wealthy districts received far more funding than less affluent ones did—especially as migration from troubled cities dragged down property values. This, the lawsuits contended, violated various provisions in state constitutions.

In 20 states, judges largely dismissed the suits as outside the scope of those constitutions. In 16 others, courts ordered states to come up with more equitable ways to finance the schools. This generally meant spending more money, often raised through sales and income taxes, in lower-income districts. New Jersey was one of the 16; in its case, the key constitutional phrase guaranteed state residents a “thorough and efficient system of free public schools.” At first, the New Jersey Supreme Court followed the path of other state courts and simply ordered extra spending in poor districts. But in 1976, when the state legislature didn’t comply, the court ordered the schools shut down until the legislature agreed to institute a tax to fund the new spending. The chief justice at the time was Richard Hughes, who had previously spent eight years as governor trying fruitlessly to get the state legislature to enact an income tax to boost education spending. “They didn’t want the income tax then? Well, they’ll want one now,” Hughes told the press. Years later, he admitted that he had wielded as much power as chief justice as he had in the governor’s seat.

But an advocacy group called the Education Law Center challenged the new spending. Merely giving urban schools new funding wasn’t enough to satisfy the “thorough and efficient” clause, the group argued; the state had to fund education in urban districts at a level that would enable them to compete with plush suburban districts. In 1985, the New Jersey Supremes agreed, and when James Florio took the governor’s office five years later, he complied by passing $2.8 billion in sales- and income-tax increases—the largest such hike in state history—to bring the city schools up to par. The court, still unsatisfied, quickly ruled that the state had to add yet more “supplemental” spending to poor districts to help offset the “additional disadvantages” that students in those areas faced. After the tax increase cost Florio reelection in 1993, his successor, Christine Todd Whitman, passed her own education financing formula, which sought to increase spending in urban districts to within $1,200 per pupil of what Jersey’s richest districts spent.

That was still not enough for the supreme court. In what became known as Abbott IV, the judges declared Whitman’s plan unconstitutional and ordered the state to fund poor districts generously enough that their per-pupil spending would be the same as in the state’s wealthiest districts—which were among the richest in the nation. The court also ruled that the state had to pay for a menu of new social programs for kids in poor districts (now called “Abbott districts”).

Over the next decade, as the plaintiffs returned to the New Jersey Supreme Court nearly a dozen times, the judges steadily transformed the nature of the case. No longer was it simply a matter of “fiscal equity”; rather, it morphed into what judicial analysts call an “adequacy” case, in which the court determines what constitutes an adequate education. Judges don’t merely determine levels of spending; they also initiate and monitor specific programs—policy details that, in most states, are left to elected officials. In short, the supreme court seized power in education policy.

It used that power in unthinking and expensive ways. For example, it ordered pre-K classes to be offered to all three- and four-year-olds in Abbott districts, even though the state constitution guarantees public education only for children “between the ages of five and eighteen.” Some studies have found no educational gain from such programs, while others suggest that only students in extremely expensive versions, with small student-to-teacher ratios, can make tiny gains, hard to replicate across entire school systems. “The evidence does not support instituting broad, full-scale programs,” conclude education scholars Eric Hanushek and Alfred Lindseth in their 2009 book Schoolhouses, Courthouses and Statehouses. The pre-K program would grow to cost New Jersey $500 million annually.

Meanwhile, the court ignored numerous examples of how some urban school districts were failing because they were run by corrupt, wasteful political machines whose main goals were patronage and power, not educational success. In 1986, the state described the Jersey City school system as “adrift, having a managerial structure which is a product of politics and patronage.” In 1994, after a long investigation, the state issued a damning report on the Newark school system, accusing it of being “at best flagrantly delinquent and at worst deceptive in discharging its obligations to the children.” At the time, the Newark system, thanks to the Abbott-mandated state aid, was spending $10,700 per pupil, significantly higher than the Jersey average of $8,571. In 2002, a similar state investigation into Camden schools found, as the Courier-Post of Cherry Hill summarized it, “a lack of planning, a chaotic budget process, too many employees in virtually every department and lack of spending controls.” At the same time, a state arbitrator working in another Abbott district, Asbury Park, wrote that “a crisis exists” because of “a pervasive feeling of educational corruption.”…

Read it all.

TNR:

For many, the 2008 election wasn’t just a victory for Democrats—it was also the long-awaited return of young adults to the voting booth. Now Obama supporters are hoping that, come Election Day 2012, young adults will once again turn out in droves. But 2008 probably didn’t signal a permanent resurgence of the youth vote. In fact, there are good reasons to believe that young people will vote in significantly lower numbers this time around.

It has long been a puzzle why so many young adults do not vote—and why their already low voting rate has generally fallen over the decades. In 1972, 53 percent of 18-to-29-year-olds went to the polls. By 2000, the figure had fallen to just 36 percent, a historic low. (In contrast, the voting rate among people aged 65 or older rose five percentage points during those years, to 68 percent.) There is no doubt that the Obama campaign of 2008 energized the under-30 crowd, boosting their voting rate to 46 percent. But even then, fewer than half of 18-to-29-year-olds went to the polls compared with more than two-thirds of people aged 65 or older, according to the Census Bureau.

So why don’t young adults vote? That’s a vexing question political campaigns have been asking for decades. The most likely answer is that young adults do not vote because many are still—in a sense—children, without adult commitments or responsibilities. The data suggest that three factors consistently make a difference in voting rates: money, marriage, and homeownership. Those are the adult commitments that give people a stake in society; to protect and expand their stake, they vote. Take a look at money and voting: The gap in voter participation between the highest and lowest income groups is a stunning 26 percentage points. For marriage and homeownership, the gaps are 16 to 17 percentage points.

Recent years have seen Americans in their twenties delay starting careers, getting married, and buying homes—and as the road to adulthood has lengthened, voting rates among the young have generally fallen (the notable exceptions are 2004 and 2008). Now, the bad economy is exacerbating these trends. For the nation’s young, the Great Recession has turned money, marriage, and homeownership into an impossible dream.

Let’s start with money. Among 18-to-29-year-olds in the labor force, fully 44 percent are unemployed or underemployed, according to a Gallup survey—that’s more than any other demographic segment. The financial consequences are not pretty. Householders under age 25 lost more ground than any other group between 2008 and 2010, according to the Census Bureau, their median income falling by 13 percent after adjusting for inflation. Those aged 25 to 29 had the next-largest decline, their median income falling by 5 percent.

This economic climate has led twentysomethings to put off another traditional marker of adulthood: marriage. Young adults are not just postponing marriage—they are shunning it, and it’s not hard to figure out why. Being holed up in your parents’ basement with creditors pounding on the door does not impress the guys or girls. That scenario, playing out in communities across the nation, explains why 64 percent of men aged 25 to 29 were still single in 2011, up from 59 percent in 2008. Among women in the age group, the never-married share grew from 45 percent to just over 50 percent. Without financial security, marriage is increasingly off the table.

Not surprisingly, homeownership rates have similarly plunged among young adults. Historically, homeowners become the norm in the 30-to-34 age group, when the homeownership rate rises above 50 percent. This has been the case in every year of the Census Bureau’s data series, which began in 1982. A 53.5 percent majority of householders aged 30 to 34 owned their home in 2008. By 2011, however, only 49.8 percent were homeowners—the first time the figure has fallen below the 50 percent threshold…

Read it all.

Iran: Things Are Looking Up

February 16, 2012

Via Newsday

The Classics

February 16, 2012

This image has been posted with express written permission. This cartoon was originally published at Town Hall.
