Invisible Gorillas Are Everywhere

January 30, 2012

The Chronicle Of Higher Education:

By now almost everyone has heard about an experiment that goes something like this: Students dressed in black or white bounce a ball back and forth, and observers are asked to count the bounces among team members in white shirts. While that’s happening, another student dressed in a gorilla suit wanders into their midst, looks around, thumps his chest, then walks off, apparently unseen by most observers because they were so focused on the bouncing ball. Voilà: attention blindness.

The invisible-gorilla experiment is featured in Cathy Davidson’s new book, Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn (Viking, 2011). Davidson is a founder of a nearly 7,000-member organization called Hastac, or the Humanities, Arts, Sciences, and Technology Advanced Collaboratory, that was started in 2002 to promote the use of digital technology in academe. It is closely affiliated with the digital humanities and reflects that movement’s emphasis on collaboration among academics, technologists, publishers, and librarians. Last month I attended Hastac’s fifth conference, held at the University of Michigan at Ann Arbor.

Davidson’s keynote lecture emphasized that many of our educational practices are not supported by what we know about human cognition. At one point, she asked members of the audience to answer a question: “What three things do students need to know in this century?” Without further prompting, everyone started writing down answers, as if taking a test. While we listed familiar concepts such as “information literacy” and “creativity,” no one questioned the process of working silently and alone. And noticing that invisible gorilla was the real point of the exercise.

Most of us are, presumably, the products of compulsory educational practices that were developed during the Industrial Revolution. And the way most of us teach is a relic of the steam age; it is designed to support a factory system by cultivating “attention, timeliness, standardization, hierarchy, specialization, and metrics,” Davidson said. One could say it was based on the best research of the time, but the studies of Frederick Winslow Taylor, among others, that undergird the current educational regime (according to Davidson) depend upon faked data supporting the preconceptions of the managerial class. Human beings don’t function like machines, and it takes a lot of discipline—what we call “classroom management”—to make them conform. Crucial perspectives are devalued and rejected, stifling innovation, collaboration, and diversity.

It wasn’t always that way. Educational practices that seem eternal, such as letter grades, started hardly more than a century ago; they paralleled a system imposed on the American Meat Packers Association in the era of The Jungle. (At first the meatpackers objected because, they argued, meat is too complex to be judged by letter grades.) The factory assembly line provided inspiration for the standardized bubble test, which was adopted as a means of sorting students for admission to college. Such practices helped to make education seem efficient, measurable, and meritocratic, but they tended to screen out collaborative approaches to problem-solving.

Drawing on her scholarly work in American literary history, Davidson argued that resistance to technology in education is not new. Every new technology takes time to become accepted by institutional cultures. Writing, for example, was once considered a degenerate, impoverished form of communication; it’s why we know about the teachings of Socrates only from the writings of Plato. When the print revolution produced cheap novels for a mass audience, popular works were regarded as bad for young people, especially women, who secreted books in their skirt “offices.” Following the long trajectory of the Protestant Reformation, you no longer needed someone to tell you what to think: You could read for yourself, draw your own conclusions, and possibly select your own society. Now the Internet offers a radical expansion of that process of liberation: It challenges institutional authority, it’s uncontrolled, and it has the potential to disrupt existing hierarchies, opening up new fields of vision, and enabling us to see things that we habitually overlook.

Browsing the 2012 conference program of the Modern Language Association, which includes nearly 60 sessions involving the digital humanities, Stanley Fish recently observed: “I remember, with no little nostalgia, the days when postmodernism in all its versions was the rage and every other session at the MLA convention announced that in theory’s wake everything would have to change.” Now the isms of prior decades—“multiculturalism, postmodernism, deconstruction, postcolonialism, neocolonialism, racism, racialism, feminism, queer theory”—seem to have retreated. But the ethos and disciplinary range of the digital humanities on display at Hastac suggest that this movement is not a replacement for the old order of “Theory” that reigned in the 80s and 90s so much as it is a practical fulfillment of that movement’s vision of a more inclusive, egalitarian, and decentralized educational culture.

Providing examples of how people have worked collaboratively, using the Internet, to develop effective responses to real-world problems, Davidson made a compelling argument for significant reforms in higher education (many examples are provided in her book). Too many of our vestigial practices, such as the tenure monograph and the large-room lecture, have become impediments to innovative scholarship. Students often learn in spite of our practices, learning more outside of the structured classroom than in it. Google is not making the rising generations stupid, Davidson argued; on the contrary, they rely on it to teach themselves, and that experience is making students aware that invisible gorillas are everywhere—and that one of them is higher education as most of us know it.

I might add, as the cost of traditional education increases beyond affordability for more and more students, that they (and their employers) may increasingly decide that they don’t need us. We need to find more ways to expand and diversify higher education beyond traditional degrees earned in late adolescence. Without abandoning the value of preparing students for citizenship and a rewarding mental life, we need to develop more-flexible systems of transparent long-term and just-in-time credentialing, earned over the course of one’s life in response to changing needs and aspirations. Apparently to that end, Hastac is now supporting the exploration of digital “badges” signifying the mastery of specific skills, experiences, and knowledge.

Whatever the means, there is an emerging consensus that higher education has to change significantly, and Davidson makes a compelling case for the ways in which digital technology, allied with neuroscience, will play a leading role in that change.

Nevertheless, graduate students on Hastac panels—and especially in conversation—complain bitterly that their departments are not receptive to collaborative, digital projects. In most cases, their dissertation committees expect a written, 200-page proto-monograph; that’s nonnegotiable. Meanwhile, assistant professors complain that they can earn tenure only by producing one or perhaps two university press books that, in all likelihood, few people will read, when their energies might be more effectively directed toward online projects with, potentially, far greater impact.

In the context of a talk at Hastac on publishing, one graduate student observed that digital humanists—for some time, at least—must expect to perform double labor: digital projects accompanied by traditional written publications about those projects. The MLA and the American Historical Association have established guidelines for evaluating digital projects, but most faculty members are not yet prepared to put those guidelines into effect. It requires a radical change of perspective for scholars who have invested so much of their lives in written criticism as the gold standard. “The associate professors, especially,” one panelist noted, “judge the next generation by the standards they were expected to meet.” Senior professors seem more prepared to “let the kids do their thing.”…

Read it all.
