May 6, 2012
I have been an active blogger since 2006, and I often say that becoming one was the best decision I have ever made in my academic life.
In terms of intellectual fulfillment, creativity, networking, impact, productivity, and overall benefit to my scholarly life, blogging wins hands down. I have written books, produced online courses, led research efforts, and directed a number of university projects. While these have all been fulfilling, blogging tops the list: its room for experimentation and its potential to connect with timely, intelligent debate keep it at the top of the heap.
My academic identity—I’m a professor of educational technology at the Open University in the United Kingdom—is strongly allied with my blog. Increasingly we find that our academic identities are distributed. There was a time when you could have pointed to a list of publications as a neat proxy for your academic life, but now you might want to reference not only your publications, but also a set of videos, presentations, blog posts, curated collections, and maybe even your social network. All of these combine to represent the modern academic. My blog sits at the heart of these, the place where I reference the other media and representations.
This is not to argue that a blog should play the same role for everyone. A key aspect of the digital revolution is not the direct replacement of one form of scholarly activity with another, but rather the addition of alternatives to existing forms. In his book From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet (Quercus, 2012), my colleague John Naughton argues that this is a lesson we should learn. “Looking back on the history,” he writes, “one clear trend stands out: Each new technology increased the complexity of the ecosystem.”
This trend is evident in academic practice. Previously, if I wanted to convey an idea or a research finding, my choices were limited to a conference paper or journal article or, if I could work it up, a book. These choices remain, but in addition I can create a video, podcast, blog post, slidecast, and more. It may be that a combination of these is ideal—a blog post gets immediate reaction and can then be worked into a conference presentation, shared through SlideShare, or turned into a paper that is submitted to a journal. In each case the blog or social network becomes a key route for sharing and disseminating the findings. One recent study suggests that use of Twitter, for instance, can both boost and predict citations of journal articles.
So blogging works for me, but it might not work for you. Maybe you’re more of a YouTube person, or a podcaster, or maybe your skill really lies in acting as a filter and a curator, using a tool such as Scoop.it, which allows you to curate and share resources on a particular topic. Or maybe you’re the trusted source for finding the valuable research in your field. It’s clear, though, that our academic ecosystem is a more complex one now. This raises two difficult questions for academics who are expected to do research: First, do these new types of activity count as scholarship? And, if so, how do we recognize and reward them?
In my book The Digital Scholar: How Technology Is Transforming Scholarly Practice (Bloomsbury USA, 2011; free online), I argue that if you look across all scholarly activities, the use of new technology has the potential to change practice. For example, those who teach now have access to abundant, free, online content, while in the past teaching resources were often scarce and expensive.
Scholars no longer need a television series to engage the public. The academic publishing system is being thrown into question as we look at open-access approaches and different means of conducting peer review. And so on. There is hardly an area of scholarly practice that remains unaffected.
An example I like to cite is that of my colleague Tony Hirst, who blogs at OUseful.Info. He likes to play with open data, visualization tools, and mashups. On any one day an idea may occur to him, such as, “I wonder how the people who tweet a particular link are connected? And who are they connected to?” In a short space of time, he will have experimented with data and tools to provide an answer, and blogged his results. None of this requires research funds or a peer-review filter, and it takes place over a much shorter time period than traditional research. Yet it would be difficult to argue that the work does not meet widely accepted definitions of scholarly research.
So I would argue that the answer to the first question above, as to whether new approaches such as blogging constitute scholarly activity, is an emphatic yes. Which leads us to a more problematic question: How should we recognize it?…
Why fiction is good for you: The beautiful lies of novels, movies, and TV stories have surprisingly powerful effects, and may even help make society tick
May 6, 2012
Is fiction good for us? We spend huge chunks of our lives immersed in novels, films, TV shows, and other forms of fiction. Some see this as a positive thing, arguing that made-up stories cultivate our mental and moral development. But others have argued that fiction is mentally and ethically corrosive. It’s an ancient question: Does fiction build the morality of individuals and societies, or does it break it down?
This controversy has been flaring up — sometimes literally, in the form of book burnings — ever since Plato tried to ban fiction from his ideal republic. In 1961, FCC chairman Newton Minow famously said that television was not working in “the public interest” because its “formula comedies about totally unbelievable families, blood and thunder, mayhem, violence, sadism, murder, western bad men, western good men, private eyes, gangsters, more violence, and cartoons” amounted to a “vast wasteland.” And what he said of TV programming has also been said, over the centuries, of novels, theater, comic books, and films: They are not in the public interest.
Until recently, we’ve only been able to guess about the actual psychological effects of fiction on individuals and society. But new research in psychology and broad-based literary analysis is finally taking questions about morality out of the realm of speculation.
This research consistently shows that fiction does mold us. The more deeply we are cast under a story’s spell, the more potent its influence. In fact, fiction seems to be more effective at changing beliefs than nonfiction, which is designed to persuade through argument and evidence. Studies show that when we read nonfiction, we read with our shields up. We are critical and skeptical. But when we are absorbed in a story, we drop our intellectual guard. We are moved emotionally, and this seems to make us rubbery and easy to shape.
But perhaps the most impressive finding is just how fiction shapes us: mainly for the better, not for the worse. Fiction enhances our ability to understand other people; it promotes a deep morality that cuts across religious and political creeds. More peculiarly, fiction’s happy endings seem to warp our sense of reality. They make us believe in a lie: that the world is more just than it actually is. But believing that lie has important effects for society — and it may even help explain why humans tell stories in the first place.
IT’S NOT HARD to see why social critics have often been dismayed by fiction. We spend a huge amount of time lost in stories; the average American spends four hours per day on television alone.
And if the sheer time investment were not enough, there’s the content. Since fiction’s earliest beginnings, morally repulsive behavior has been a great staple of the stories we tell. From the sickening sexual violence of “The Girl with the Dragon Tattoo,” to the deranged sadism of Shakespeare’s Titus Andronicus, to Oedipus stabbing his eyes out in disgust, to the horrors portrayed on TV shows like “Breaking Bad” and “CSI” — throughout time, the most popular stories have often featured the most unpleasant subject matter. Fiction’s obsession with filth and vice has led critics of different stripes to condemn plays, novels, comic books, and TV for corroding values and corrupting youth.
Moreover, it’s clear that these stories really can change our views. As the psychologist Raymond Mar writes, “Researchers have repeatedly found that reader attitudes shift to become more congruent with the ideas expressed in a [fictional] narrative.” For example, studies reliably show that when we watch a TV show that treats gay families nonjudgmentally (say, “Modern Family”), our own views on homosexuality are likely to move in the same nonjudgmental direction. History, too, reveals fiction’s ability to change our values at the societal level, for better and worse. For example, Harriet Beecher Stowe’s “Uncle Tom’s Cabin” helped bring about the Civil War by convincing huge numbers of Americans that blacks are people, and that enslaving them is a mortal sin. On the other hand, the 1915 film “The Birth of a Nation” inflamed racist sentiments and helped resurrect an all but defunct KKK.
So those who are concerned about the messages in fiction — whether they are conservative or progressive — have a point. Fiction is dangerous because it has the power to modify the principles of individuals and whole societies.
But fiction is doing something that all political factions should be able to get behind. Beyond the local battles of the culture wars, virtually all storytelling, regardless of genre, increases society’s fund of empathy and reinforces an ethic of decency that is deeper than politics…
Delta blues is as much legend as it is music. In the popular telling, blues articulated the hopelessness and poverty of an isolated, oppressed people through music that was disconnected from popular trends and technological advances. Delta blues giants like Robert Johnson were victims, buffeted by the winds of racism, singing out mostly for personal solace. The story is undoubtedly romantic, but it just isn’t true. “It angers me how scholars associate the blues strictly with tragedy,” B.B. King complained in his 1999 autobiography Blues All Around Me. “As a little kid, blues meant hope, excitement, pure emotion.”
The tragic image of the blues that originated in the Mississippi Delta ignores the competitive and entrepreneurial spirit of the bluesman himself. While it is certainly true that the music was forged in part by the legacy of slavery and the insults of Jim Crow, the iconic image of the lone bluesman traveling the road with a guitar strapped to his back is also a story about innovators seizing on expanded opportunities brought about by the commercial and technological advances of the early 1900s. There was no Delta blues before there were cheap, readily available steel-string guitars. And those guitars, which transformed American culture, were brought to the boondocks by Sears, Roebuck & Co.
Music has always been an instrument of upward mobility in the black community. During slavery, performers were afforded higher status than field workers. As the entertainment for plantation soirees, musicians were expected to be well versed in the social dance styles demanded by white audiences. But when performing in slave quarters, they played roughly the same repertoire. Former slaves’ narratives reveal that the slave musical ensemble closely resembled later minstrel-show string bands: fiddles and banjos, accompanied by various percussion instruments, usually the tambourine and two bones being struck together as claves. While the image of slaves dancing waltzes seems odd now, it was common in rural black communities well into the 20th century.
At the conclusion of the Civil War, freed black men were suddenly looking for employment. Musicianers, as they were called, could earn more money than the typical day laborer. With newfound freedom of movement, and cultural norms that had established entertainment as one of the few widely accepted jobs for blacks, Reconstruction became a time of great opportunity for black musicians. In an 1882 article in The Century Magazine, a white onlooker at a Georgia corn shucking described the elite status of the musicianer like this: “The fiddler is the man of most importance. He always comes late, must have an extra share of whiskey, is the best-dressed man in the crowd, and unless every honor is shown him he will not play.”
The music played by these 19th-century musicians was not blues, and their plucked string instrument was not the guitar; it was the banjo. In 1781 Thomas Jefferson wrote about the instrument slaves played at his plantation, the banjar, “which they brought with them from the hinterlands of Africa.” These simple instruments usually had four strings and no frets.
It may seem odd that an instrument with African roots, originally played by plantation slaves, would become popular among the white masses, but the banjo was portable, melodic, and relatively easy to play. Banjo proselytizers, seeking to overcome anxieties about embracing a product of slave culture, would go so far in trying to whitewash the instrument’s ancestry as to claim that it had “reached its apogee through the contribution of whites” who had added frets and a fifth string to the original banjar.
A few early “classic blues” recordings featured the banjo, often fit with a guitar neck to provide a wider range. But these vaudevillian sides, cut by people like “Papa” Charlie Jackson, sound only distantly related to the Delta blues of Tommy Johnson or Skip James. The sound of the Delta is the sound of the steel-string guitar. The guitars of the 19th century used gut strings and were expensive and difficult to play. So despite having superior range and flexibility compared to banjos, guitars were still a rare sight in the black community. That all began to change in the 20th century.
The Mississippi Delta was a unique place in the American South. While the rich, alluvial soil appealed to antebellum planters, the dense, malaria-infested swamp that sat on top of it was barely hospitable, prompting many plantation owners to operate as absentee landlords. This led to blacks outnumbering whites by ratios as high as 10 to 1.
With the end of the Civil War, entrepreneurs rushed to cultivate this rich land. But the demographic makeup of the population did not change. Recruited by labor agents promising higher wages and greater opportunity, thousands of freedmen migrated to the region. While the work was taxing and the living conditions dreadful, there was a new kind of autonomy for the black man to sing about: the possibility of walking away from the mistreatment of an employer. This new sense of self-determination became a staple of later blues songs like “Key to the Highway” and “Dust My Broom.” The range of workplace alternatives was growing wider every day, as the railroad began to connect more and more of rural America with the outside world.
Charlie Patton, long considered the godfather of the Delta blues, was an early beneficiary of these new opportunities. Sometime around 1900, Charlie’s father, Bill, moved his family to Cleveland, Mississippi, to work on the farm of a man named Will Dockery. Dockery had arrived in Cleveland in 1888 fresh out of college and used a $1,000 gift from his grandmother to purchase a small plot of swampy timberland and open a mill. By the early 1900s, Dockery Farms had grown into a self-sufficient community, boasting a post office, general store, and a rail terminal. Dockery developed a reputation for paying good wages and treating his workers well. But wages weren’t all that drew Bill Patton to Dockery’s place. He also hoped to pry his adolescent son Charlie from the influence of the most prominent musical family in central Mississippi: the Chatmons. A God-fearing man, Bill Patton was determined not to let his son fall under the sway of secular music and the Chatmons’ decadent lifestyle…
May 6, 2012
[Cartoon originally published at Town Hall; posted with express written permission.]