May 16, 2012
This image has been posted with express written permission. This cartoon was originally published at Town Hall.
Transcontinental Education: Soon, nearly every state will have demanding standards for what students should know. A burst of innovation won’t be far behind
May 16, 2012
Like most enterprises in nineteenth-century America, railroads in the early 1800s were local affairs. The first trains served mainly to carry goods between towns that canals did not reach, so each region of the country built its own rail lines. As a result, rail gauges—the width between rails—varied widely. The tracks laid between Richmond and Memphis, for instance, used a five-foot gauge, while the gauges of the Erie and Lackawanna lines in New York were six feet wide. Those in the mid-Atlantic, such as the Baltimore and Ohio, used the gauge that was standard in England: four feet eight and a half inches wide. These variations made it exceedingly difficult to connect rail lines, which in turn effectively curbed the use of railroads to conduct commerce across regions.
During the Civil War, President Abraham Lincoln recognized that this balkanized rail system also hurt something else: the war effort. He wanted to transport military materiel and goods across the country by rail. So he proposed a standard track width of five feet for the planned transcontinental railroad. He later amended his proposal to four feet eight and a half inches, to match the gauges of the largest eastern railroads, backing a plan urged by rail barons who wanted to expand their lines and their industry. This standard gauge made it possible to connect lines, and led to an explosion of railroad building. The number of track miles tripled, to 90,000, between 1860 and 1880, and then more than doubled, to 190,000, by 1900. With that expansion came the growth of whole new industries that could only be born through interstate train travel—for example, the auto industry, which depended on steel from Pennsylvania, rubber from Ohio, and coal from West Virginia, all shipped and put together in Michigan. Lincoln’s idea of a common standard helped make the United States the world’s largest industrial power.
In some ways, the American elementary and secondary education system is undergoing a transition similar to what the American rail system underwent around the time of the Civil War. For decades, each state has set its own expectations for what students should know and be able to do at each grade level. These standards might reflect the tradition of local control of education, but they have made it difficult for students to move from state to state; students transferring from fourth grade in, say, Indiana, might face a different set of expectations when they arrive in fifth grade in Illinois. And, by fragmenting the educational marketplace, these varied standards have impeded the kinds of innovations that might otherwise come with economies of scale—in testing, textbooks, and teacher education.
A profound change is quietly under way. Over the past couple of years, under the leadership of national organizations representing state leaders, nearly every state, with little fanfare, has adopted common standards for student learning in English language arts and mathematics. These standards—known as “common core standards”—spell out the knowledge and skills all students are expected to acquire in order to be prepared for college and careers by the time they graduate from high school. For example, the standards state that by the end of the third grade, students should be able to distinguish their own point of view from that of a narrator. By the end of high school, students should be able to “[a]nalyze seventeenth-, eighteenth-, and nineteenth-century foundational U.S. documents of historical and literary significance (including the Declaration of Independence, the Preamble to the Constitution, the Bill of Rights, and Lincoln’s Second Inaugural Address) for their themes, purposes, and rhetorical features.”
While that last bar might seem high, the idea is that students need to be able to perform this kind of task in order to succeed in college and the workplace. And now, for the first time, nearly all students in the United States will be expected to meet that same standard, regardless of where they live. By setting common expectations, states have made it possible for students everywhere to graduate from high school similarly prepared for post-secondary education and work.
At the same time, the states have also opened the door for a flood of innovation in educational products and techniques. Educators in one state who have come up with a dynamic new way of teaching can now share their knowledge with educators throughout the country, without having to fear that their insights will be utterly lost in translation. Colleges of education can work together across state lines to redesign and improve teacher education, because the teachers they educate will be teaching to the same standards. And textbook companies can develop new and better products, taking advantage of digital technology, because they can sell to a near-national market. These materials will replace the widely loathed and ineffective products that were produced by cobbling together expectations from each of the states publishers hoped to sell to.
Perhaps most importantly, the standards are making possible new assessments that could radically transform instruction and learning. The assessments are being built by two consortia of states, which can pool resources to create better tests than states could develop on their own. As a result, the consortia plan to develop challenging and innovative assessments that measure the full range of standards, such as the ability to think critically and solve problems, rather than the narrow skills covered by conventional “fill-in-the-bubble” formats. And because of the strong influence of tests on instruction, these assessments are also likely to encourage a tremendous shift in teaching and learning in nearly every classroom in the United States. Humble standards can lead to great innovations.
Most countries have some form of educational standards. In the United States, the idea began to take off in the late 1980s. At the time, advocates believed that students would learn more if states spelled out specifically what all students should know and be able to do and aligned all aspects of the education system—teacher preparation, curriculum, testing—to those expectations. That way, all oars would be rowing in the same direction.
Unlike in other countries, where national ministries of education promulgated academic standards, the setting of standards in the U.S. began in a hybrid fashion, with national organizations developing nonbinding statements of what students should learn in each subject area, and states adopting their own standards for their students, sometimes—though not always—based on the national documents. These efforts were spurred by legislation enacted during the Clinton administration, which gave grants to states to pursue standards setting and then required states to set standards as a condition of federal aid. By the end of the 1990s, all but one state (Iowa) had developed standards.
The result of this effort was mixed. Mathematics performance for nine- and thirteen-year-olds rose substantially between 1999 and 2004, a period when standards were in place, after being flat throughout the 1990s, according to the National Assessment of Educational Progress (NAEP), a federal testing program. Reading scores for nine-year-olds went up as well. But scores for seventeen-year-olds remained flat in both subjects…
According to the conventional interpretation of the global economic recession, growth has ground to a halt in the West because demand has collapsed, a casualty of the massive amount of debt accumulated before the crisis. Households and countries are not spending because they can’t borrow the funds to do so, and the best way to revive growth, the argument goes, is to find ways to get the money flowing again. Governments that still can should run up even larger deficits, and central banks should push interest rates even lower to encourage thrifty households to buy rather than save. Leaders should worry about the accumulated debt later, once their economies have picked up again.
This narrative — the standard Keynesian line, modified for a debt crisis — is the one to which most Western officials, central bankers, and Wall Street economists subscribe today. As the United States has shown signs of recovery, Keynesian pundits have been quick to claim success for their policies, pointing to Europe’s emerging recession as proof of the folly of government austerity. But it is hard to tie recovery (or the lack of it) to specific policy interventions. Until recently, these same pundits were complaining that the stimulus packages in the United States were too small. So they could have claimed credit for Keynesian stimulus even if the recovery had not materialized, saying, “We told you to do more.” And the massive fiscal deficits in Europe, as well as the European Central Bank’s tremendous increase in lending to banks, suggest that it is not for want of government stimulus that growth is still fragile there.
In fact, today’s economic troubles are not simply the result of inadequate demand but the result, equally, of a distorted supply side. For decades before the financial crisis in 2008, advanced economies were losing their ability to grow by making useful things. But they needed to somehow replace the jobs that had been lost to technology and foreign competition and to pay for the pensions and health care of their aging populations. So in an effort to pump up growth, governments spent more than they could afford and promoted easy credit to get households to do the same. The growth that these countries engineered, with its dependence on borrowing, proved unsustainable.
Rather than attempting to return to their artificially inflated GDP numbers from before the crisis, governments need to address the underlying flaws in their economies. In the United States, that means educating or retraining the workers who are falling behind, encouraging entrepreneurship and innovation, and harnessing the power of the financial sector to do good while preventing it from going off track. In southern Europe, by contrast, it means removing the regulations that protect firms and workers from competition and shrinking the government’s presence in a number of areas, in the process eliminating unnecessary, unproductive jobs.
THE END OF EASY GROWTH
To understand what will, and won’t, work to restore sustainable growth, it helps to consider a thumbnail sketch of the economic history of the past 60 years. The 1950s and 1960s were a time of rapid economic expansion in the West and Japan. Several factors underpinned this long boom: postwar reconstruction, the resurgence of trade after the protectionist 1930s, more educated work forces, and the broader use of technologies such as electricity and the internal combustion engine. But as the economist Tyler Cowen has argued, once these low-hanging fruit had been plucked, it became much harder to keep economies humming. The era of fast growth came to a sudden end in the early 1970s, when the OPEC countries, realizing the value of their collective bargaining power, jacked up the price of oil.
As growth faltered, government spending ballooned. During the good years of the 1960s, democratic governments had been quick to expand the welfare state. But this meant that when unemployment later rose, so did government spending on benefits for the jobless, even as tax revenues shrank. For a while, central banks accommodated that spending with expansionary monetary policy. That, however, led to high inflation in the 1970s, which was exacerbated by the rise in oil prices. Such inflation, although it lowered the real value of governments’ debt, did not induce growth. Instead, stagflation eroded most economists’ and policymakers’ faith in Keynesian stimulus policies.
Central banks then changed course, making low and stable inflation their primary objective. But governments continued their deficit spending, and public debt as a share of GDP in industrial countries climbed steadily beginning in the late 1970s — this time without inflation to reduce its real value. Recognizing the need to find new sources of growth, Washington, toward the end of President Jimmy Carter’s term and then under President Ronald Reagan, deregulated many industries, such as aviation, electric power, trucking, and finance. So did Prime Minister Margaret Thatcher in the United Kingdom. Eventually, productivity began to pick up.
Whereas the United States and the United Kingdom responded to the slump of the 1970s with frenetic deregulation, continental Europe made more cosmetic reforms. The European Commission pushed deregulation in various industries, including the financial sector, but these measures were limited, especially when it came to introducing competition and dismantling generous worker protections. Perhaps as a result, while productivity growth took off once again in the United States starting in the mid-1990s, it fell to a crawl in continental Europe, especially in its poorer and less reform-minded southern periphery. In 1999, when the euro was introduced, Italy’s unemployment rate was 11 percent, Greece’s was 12 percent, and Spain’s was 16 percent. The resulting drain on government coffers made it difficult to save for future spending on health care and pensions, promises made even more onerous by rapidly aging populations.
DISRUPTING THE STATUS QUO
For the United States, the world’s largest economy, deregulation has been a mixed bag. Over the past few decades, the competition it has induced has widened the income gap between the rich and the poor and made it harder for the average American to find a stable, well-paying job with good benefits. But that competition has also led to a flood of cheap consumer goods, which has meant that whatever income the average American does earn now goes further than ever before.
During the postwar era of heavy regulation and limited competition, established firms in the United States had grown fat and happy, enjoying massive quasi-monopolistic profits. They shared these returns with their shareholders and their workers. For banks, this was the age of the “3-6-3” formula: borrow at three percent, lend at six percent, and head off to the golf course at 3 PM. Banks were profitable, safe, and boring, and the price was paid by depositors, who got the occasional toaster instead of market interest rates. Unions fought for well-paying jobs with good benefits, and firms were happy to accommodate them to secure industrial peace — after all, there were plenty of profits to be shared.
In the 1980s and 1990s, the dismantling of regulations and trade barriers put an end to this cozy life. New entrepreneurs with better products challenged their slower-moving competitors, and the variety and quality of consumer products improved radically, altering people’s lives largely for the better. Personal computers, connected through the Internet, have allowed users to entertain, inform, and shop for themselves, and cell phones have let people stay in constant contact with friends (and bosses). The shipping container, meanwhile, has enabled small foreign manufacturers to ship products speedily to faraway consumers. Relative to incomes, cotton shirts and canned peaches have never been cheaper…
In countries that did reform, deregulation was not an unmitigated blessing. It did boost entrepreneurship and innovation, increase competition, and force existing firms to focus on efficiency, all of which gave consumers cheaper and better products. But it also had the unintended consequence of increasing income inequality — creating a gap that, by and large, governments dealt with not by preparing their work forces for a knowledge economy but by giving them access to cheap credit…
May 16, 2012
The soldiers around me were barely visible, but I could smell them. They had not washed for days, and a sharp musk of sweat and sleeplessness, tobacco and chemically mummified food, wove through the fields and orchards. It was after midnight, moonless, the stars brilliant but unhelpful. The soldiers wore night-vision goggles, but I did not, so I stumbled after their scent along the remote edge of a fading war, envisioning things I could not see.
Up ahead, in the stream of black shapes, were the American soldiers I had come to fear. They were men who enjoyed demolishing Afghan houses, men who shot dogs in the face. The pair who had embraced like lovers, one tenderly drawing the blade of his knife along the pale, smooth skin of his friend’s throat. There was a guy who’d let the others tie his legs open and mock-rape him, and there were several men who had boasted of plans to murder their ex-wives and former girlfriends.
We paused in the darkness. A line of Afghan soldiers shuffled past, also nearly blind without night-vision equipment. They moved into position for the coming raid, clumsy as boxcars, trailing their own earthy stink. I thought back to what an American Army sergeant had told me hours earlier.
“This is where I come to do fucked-up things.”
His face had been clear and smooth, his smile almost shy. It was a statement of happy expectation, as though Afghanistan were a playground. He was the de facto leader of a platoon I will call Destroyer, and although he is a real person, not a composite, I have heard his words in many variations, from many American combat troops. But he and some of his men were the first I had met who seemed very near to committing the dumb and vicious acts that we call war crimes.
We marched on, toward houses the soldiers planned to raid and doors that would soon be blasted open, toward men who would be ripped awake, blindfolded, and hauled away. The sergeant’s words rattled in my head. I hoped the men would not do anything terrible.
Since 2006 I have written off and on about the wars in Afghanistan and Iraq. Nearly all of my work in those countries has been done embedded with NATO, mostly American military units. Many times I have watched soldiers or Marines, driven by boredom or fear, behave selfishly and meanly, even illegally, in minor ways. In a few searing moments I have wondered what would come next, what the men would do to prisoners or civilians or suspected insurgents. And I have wondered how to describe these moments without reporting melodramatic minutiae or betraying the men who allowed me in.
Most soldierly stupidity does not amount to crime; most soldiers never commit atrocities. U.S. soldiers shooting at goats, for example, or pilots getting drunk on base, or guards threatening the lives of prisoners, all things I have seen, defy military rules and erode efforts to win hearts and minds. But how bad is it, really? Do we care? What is my responsibility when I see it? I have never found good ways to write about the subhuman wash of aggression and the small episodes of violence military men and women cycle through daily, or the choices they make in the midst of this.
We tend to ignore such problems unless they are connected to a crime. An editor at a major magazine once dismissed such unsteady subjects by saying, “Yes, but bad things happen everywhere.” Perhaps she was telling me to lighten up. She was also summarizing a national attitude toward the wars. I write about it now because what I witnessed with Destroyer, and other units, routinely and unquietly returns to me.
I joined the platoon last summer at the end of a weeklong mission designed to clear insurgents from a series of towns and valleys in central Afghanistan. In 10 years of war, I was told, NATO troops had never visited the region. Intelligence reports called it a Taliban stronghold, and commanders expected heavy fighting. Going in, many soldiers told me they believed they would die.
Destroyer and several other units had dropped into the valleys by helicopter at night. During the day, they pushed through a sun-killed landscape of rock and withered grasses, where it was Destroyer’s job to search for weapons caches and battle insurgents alongside a wobbly unit of Afghan National Army (ANA) troops.
Each night, the men slept in abandoned qalats (fortified residential compounds), or they moved into occupied ones, handed the residents some cash, and kicked them out. I met the soldiers at a qalat they had temporarily confiscated, a large, newly painted house. Tall walls enclosed a courtyard containing a small orchard, a garden, and a well. Several rooms ran along one wall, and the soldiers had moved into them, sleeping head to foot on floors littered with cigarette packs, candy wrappers, and food scraps. The place was heavy with a scent I would later follow through the night.
I first met Staff Sergeant James Givens, as I will call him, outside one of these rooms. I had been asking about a dog that lay on the far side of the courtyard beside a heap of garbage. Like many soldiers, I sought out dogs whenever possible in Afghanistan, hoping to pet them or play with them, searching for a reminder of home. No one was paying this dog any attention. A soldier told me with a laugh that it was sleeping, so I walked over and found the animal leashed and dead, killed by a gunshot.
“What dead dog?” Givens said, grinning. “He’s just takin’ a nap.”
A captain standing nearby asked rhetorically and perhaps for my benefit if it had really been necessary to shoot a pet. Givens laughed.
“Sir, we’ve left plenty of animals alive in this area.”
One of those bits of violence. I shifted gears and began doing my job, hanging out with Givens and his men, hearing their stories while we waited for dark and that night’s raid. The dog continued napping for another day until Afghan soldiers, preparing their dinner a few feet away, wearied of the odor and moved the carcass.
The men of Destroyer said that so far the worst-case scenarios had not unfolded. They had searched houses and outbuildings and found little evidence of insurgents. Fighting in the valleys and towns was relatively light; mortars now and then, some rifle fire. Across the entire operation only one soldier, an Afghan, had been killed. The Taliban had not mined the region with IEDs or dug into the hillsides in anticipation of a grand battle. Most Taliban, if ever they had been in the area, slipped away while the Americans and the ANA flooded in…