The Medal of Honor is the highest award the United States bestows on members of its armed forces. It tells you something about the nature of the decoration that of the 246 Medals of Honor given during the Vietnam War, 154 had to be pinned to body bags. The citations of the 92 survivors make clear that what spared them was little more than dumb luck: bullets that barely missed their livers and hearts; feet that somehow danced around land mines and booby traps; and, on one occasion, a North Vietnamese grenade that fizzled out harmlessly while one extremely fortunate sailor lay on top of it, waiting for it to detonate and hoping his body would shield the others.
Bravery and luck are uneasy partners. A “good” person, according to our mothers, is one who makes the right decisions, even when they are hard ones. Being brave feels like it should be the same as being good: in a garden of forking paths, the trail that looks dark or deadly is often the morally correct one. Brave people take the right path, and what makes them brave is their mental state when they choose it. To say that luck is involved seems to violate the spirit of bravery. Indeed, the intuitive view is that the only thing that really determines whether an act is brave is whether you think the right path is the treacherous one, and take it anyway.
The exalted status of the Medal of Honor serves as a curious test case for our collective intuitions about bravery. And it turns out the intuition above is a confused one. The view that the Medal is something you get by being a brave soldier, and that bravery is a state of mind, has led veterans of the Afghanistan and Iraq wars to wonder why they have won only seven Medals—one every year and a half, compared to one every few weeks in Vietnam and one every few days in the Second World War. There will soon be an eighth: last week, the White House announced a forthcoming Medal of Honor for Staff Sgt. Sal Giunta, who ran through enemy fire to rescue a wounded comrade. Giunta will be the first living Medal of Honor recipient since Vietnam.
Combat bravery is not exactly rare, so why is it so difficult to merit the Medal of Honor? In practice, to earn the Medal you have two methods at your disposal: the hard way and, well, the other hard way. Call them Hard Way #1 and Hard Way #2.
Hard Way #1 is to commit a multi-part act of near comic-book-style heroism (Giunta’s case is an example) and, more often than not, die. Pentagon committees then convene to determine whether your valor merits an award traditionally given for acts so brave that no one would have even thought to complain if the soldier had neglected to do them.
Hard Way #2 is a faster and surer method of winning the Medal: smother a grenade with your body and save the lives of your fellow servicemen. This method nearly always wins the Medal—some 70 times in Vietnam, and three times since September 11—but the catch is that you almost always die.
Almost. In May 1968, Donald E. Ballard was serving in Vietnam as a Navy medical corpsman. His job meant that he constantly had to run into extreme danger, since casualties were often too badly wounded to come to him. In Quang Tri province, after a North Vietnamese Army ambush, Ballard leapt on what he thought was a live grenade. The grenade did not go off. According to his citation, he lay on the weapon for a few moments, then “calmly arose from his dangerous position” and kept on treating casualties. Ballard went on to a commission in the Army and distinguished service in the Kansas National Guard, retiring last year as a colonel.
The standard intuition about Ballard’s situation is that he deserves the Medal as much as the seventy-odd men who smothered grenades that detonated and killed them. Ballard wasn’t even scratched, but he had no way of knowing the grenade was a dud. If bravery consists in the mental act of jumping on a grenade, then his actions qualify.
The case gets trickier when you pose a few hypotheticals. What if the North Vietnamese soldier had thrown not a grenade but a bar of soap carved realistically to look like one? Would the nation have decorated Ballard for jumping on a bar of soap? What if it had been not a grenade at all, but a piece of fruit that fell from a tree and that, in the heat of combat, Ballard mistook for a grenade? The alternate-universe Ballard who jumped on a bar of soap or a piece of fruit had the same mental state as the Ballard who threw himself upon a dud grenade. Can you get the Medal of Honor for jumping on a mango?
Final exams incite panic in the souls of even the most diligent students. Everything about them is fraught with terror: the blue books passed out from the front of the room, the clock ticking on the wall, three hours to finish in some large auditorium with banked seating, and grade point averages hanging in the balance. If professors listen closely enough, they can hear the sound of pens scribbling and caffeine pumping through the veins of 200 students who have been cramming for days, intent on learning, if for no other reason than that they don’t want to fail.
These exams are not just a rite of passage, but a fundamental and longstanding tool that American college professors have been using, in some format, since the 1830s. Now comes the twist, the pop-quiz question of the day: What happens when the final exam starts vanishing from American higher education?
The answer: No one knows. But apparently we’re about to find out.
Across the country, there is growing evidence that final exams — once considered so important that universities named a week after them — are being abandoned or diminished, replaced by take-home tests, papers, projects, or group presentations. Anecdotally, longtime professors say they have been noticing the trend for years. And now, thanks to a recent discussion at Harvard University, there are statistics that make clear just how much the landscape has changed.
In the spring term at Harvard last year, only 259 of the 1,137 undergraduate courses had a scheduled final exam, the lowest number since 2002, according to Jay M. Harris, the dean of undergraduate education. Harris said he’s hesitant to read too much into the numbers, which, he said, don’t include whatever final exams were scheduled in language courses, don’t reflect the other forms of assessment that have replaced exams, and don’t account for small seminar classes, which typically would not have a traditional, sit-down, blue-book final.
But the low rate of actual scheduled finals at Harvard last spring — just 23 percent — was considered significant enough to prompt one striking change. For years, final exams in Cambridge were considered a given, and the bureaucratic rules reflected that reality. Courses were simply assumed to include a seated, three-hour final exam; any professor who wished to opt out had to request permission. But that wasn’t happening, Harris said, forcing the registrar’s office to track down professors each semester, only to learn that, no, they were not planning on a final exam. So starting this fall, the onus has been flipped: The university will assume there will be no finals in courses. Any professor who actually wants to hold one will need to say so.
The change, which was first reported in Harvard Magazine, is not a statement on the value of final exams one way or the other, Harris said. But the shrinking role of big, blockbuster tests at Harvard and colleges elsewhere is raising serious pedagogical questions about 21st-century education: How do students learn best? And what’s the best way to assess that? Is the disappearance of high-stakes, high-pressure final exams a sign that universities are failing to challenge today’s students, or is it just a long-overdue acknowledgment that such tests aren’t always the best indicator of actual knowledge?