GOP Politics

January 5, 2013

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

If America really is in decline, does it mean the West is in decline as well? Can a political system in seemingly permanent gridlock survive? Do Americans have the stomach for meaningful fiscal and political reforms?

The answers to those questions have far-reaching implications. The rest of the world is watching to see what we are made of. European Union nations, battered by currency instability, budget crises and burgeoning debt, are looking to the US for answers. Asia’s emerging economic giants threaten American and western economic dominance and hegemony. These nations are also building huge new military structures. Projecting power, it seems, is no longer an exclusively western manifestation of dominance.

The new Middle East reality poses dangers of its own, from rogue states to very real terror threats. Whereas it used to take armies and navies to wreak havoc on a nation, terror can now do the same at a much lower cost.

Think about this: Since 9/11, we have changed the rhythm of the nation. Consider how many hours of productive business are lost because we have to show up at the airport hours before our flights. Consider the costs of layer upon layer of security, and consider the costs of the national intelligence infrastructure. According to the Washington Post, there are now 3,984 counterterrorism organizations- and those are just the ones we know about. That figure doesn’t count the federal agencies, which have grown by over a third since 9/11.

“The government has built a national security and intelligence system so big, so complex and so hard to manage, no one really knows if it’s fulfilling its most important purpose: keeping its citizens safe.”- WAPO

When democracy has to contend with protecting itself as opposed to expanding freedoms, can it survive?

The National Interest:

A QUESTION haunts America: Is it in decline on the world scene? Foreign-policy discourse is filled with commentary declaring that it is. Some—Parag Khanna’s work comes to mind—suggests the decline is the product of forces beyond America’s control. Others—Yale’s Paul Kennedy included—contend that America has fostered, at least partially, its own decline through “imperial overstretch” and other actions born of global ambition. Still others—Robert Kagan of the Brookings Institution and Stratfor’s George Friedman, for example—dispute that America is in decline at all. But the question is front and center and inescapable.

It may be the wrong question. America is a product of Western civilization—part and parcel of it, inseparable from it. Thus, no serious analysis of America’s fate as a global power can be undertaken without placing it within the context of the West, meaning primarily Europe.

Kagan disputes this. In his influential little book of 2003, Of Paradise and Power: America and Europe in the New World Order, he famously suggested Americans are from Mars whereas Europeans are from Venus. “They agree on little and understand one another less and less,” he wrote, adding, “When it comes to setting national priorities, determining threats, defining challenges, and fashioning and implementing foreign and defense policies, the United States and Europe have parted ways.”

Perhaps. But they share the same cultural heritage, and their fates are bound together, whether they like it or not. Think of Greece and Rome, both part and parcel of the classical civilization. They honored the same gods, pursued the same modes of artistic expression and viewed politics in largely the same way during their periods of greatest flowering. And their fates were intertwined—enforced with brutal finality by Roman military potentates Mummius on the ground and Metellus at sea even as the younger Scipio was destroying Carthage, a fate the Greeks never experienced because, unlike Carthage, they didn’t represent an alien civilization. Will Durant pegs the end of Greek civilization at AD 325, when Constantine founded Constantinople and Rome took a decisive turn away from its heritage—and that of Greece.

So it is with America and Europe. Hence, an analysis of American decline must lead to questions about Western decline. And an analysis of Western decline must lead to Oswald Spengler, the German intellectual who in 1918 produced the first volume of his bombshell work Der Untergang des Abendlandes (The Decline of the West), followed by the second volume in 1922. Spengler’s thesis forced his readers to look at history through an entirely new prism. They did, and he enjoyed a surge of influence. But the man and his work are in eclipse today, and there’s little evidence that scholars pondering American decline have consulted the dark musings of this German romantic or his overarching theory of history. Robert D. Kaplan, the itinerant scholar of peoples and cultures, describes Spengler as “at once . . . turgid, hypnotic, profound, and, frankly, at times unintelligible in English translation.” He sees far more historical validity in the forces of geography than in Spengler’s ardent musings about the power of culture in directing history.

But it wasn’t always so. As John Farrenkopf points out in his Prophet of Decline: Spengler on World History and Politics, Spengler’s Decline beguiled numerous prominent men of ideas and action in post–World War II America. They included George Kennan, Henry Kissinger, Paul Nitze, Louis Halle, Hans Morgenthau and Reinhold Niebuhr. Kennan read Spengler in the original language during a stay in Germany in his youth. Kissinger’s undergraduate thesis at Harvard focused on Spengler, along with Toynbee and Kant, and he once confessed to a “perverse fascination” with the German’s thinking, although Kissinger ultimately rejected the idea of inevitable decline. Nitze left Wall Street as a young man specifically to study Decline at Harvard, while Halle reported receiving poor grades there because of his preoccupation with the book. And yet, as Farrenkopf notes, Spengler’s “place in modern international theory has received relatively little attention” and “his challenging ideas have not been reformulated into a theoretical stance on international relations.” Probably, he suggests, this is because his pessimism is a little too ominous for any but the most theoretical musings.

When Spengler’s book appeared in the wake of the Great War’s carnage, conventional historians attacked it immediately. The scholarly world, suggests H. Stuart Hughes in Oswald Spengler: A Critical Estimate, “has been embarrassed to know what to do about it.” Though it manifests prodigious study and substantial knowledge, Decline is not considered respectable scholarship. “It is too metaphysical,” writes Hughes, “too dogmatic—in all respects, too extreme. Yet there it sits—a massive stumbling-block in the path of true knowledge.” He seems to be saying that subsequent scholars couldn’t quite dismiss the book but also couldn’t figure out precisely how to incorporate its arguments into their thinking.

THE PURPOSE of this article is to hold up the Spengler thesis as a prism through which we might view the state of the world in AD 2013 and probe the question of American and Western decline. I do so without endorsement but with a conviction that elements of that thesis might enlighten efforts to understand our time. Spengler’s work might be viewed as somewhat akin to a potent medicine that can be beneficial in appropriate doses but dangerous when ingested whole, given its metaphysical, dogmatic and extreme qualities cited by Hughes. Besides, Spengler’s thesis is unyieldingly deterministic, which makes it philosophically suspect as well as psychologically unacceptable, given the human aversion to the amoral essence of determinism and its assault on the concept of salvation, whether divine or temporal.

But two elements of Spengler’s thinking merit particular attention. One is his rejection of the “Idea of Progress,” that hoary Western notion that mankind has advanced over the centuries through quickening stages of development, from primitiveness and barbarism to enlightenment and civilization—and that mankind will continue to advance through the human experience on earth. The Idea of Progress has animated the thinking of nearly all significant Western philosophy since its first stirrings in the thirteenth century. As writer and philosopher Robert Nisbet put it, “No single idea has been more important than, perhaps as important as, the idea of progress in Western civilization.”

In our own time, the Idea of Progress serves as progenitor of the concepts of Eurocentrism and American exceptionalism. It was the underpinning of Francis Fukuyama’s famous “End of History” perception that Western democratic capitalism represents the culmination of human civic development. It fuels today’s foreign-policy belief, so prevalent across the political spectrum, that America’s world role is to remake other societies and cultures in the Western image.

Spengler, by contrast, embraced a view of history as the story of various discrete civilizations, each with its own distinct culture, that emerged, developed, flowered and then declined. This cyclical view subsumes certain underlying perceptions. First, since civilizations and cultures are distinct, there can be no universal culture. No body of thought emanating from one culture can be imposed upon another, either peacefully or through force. And civilizational decline is an immutable rule that applies to all civilizations, including the West.

The second noteworthy element of Spengler’s thought is his view, based on his study of eight great civilizations, that the process of decline carries with it a surge of imperial fervor and a flight toward Caesarism. Hegemonic impulses come to the fore along with forms of dictatorship. As Charles and Mary Beard wrote in The American Spirit, “Spengler’s judgment of history certainly conveyed to American readers the notion that ‘Western civilization’ was doomed and that another Caesar, the conquering man of blood and iron, would bring it to an end.” This phase, which Spengler calls the civilizational phase, can last a couple centuries, and the question Americans face today, looking at the world through the Spenglerian prism, is whether their country, as leader of the West, is in the process of embracing these elements of Spengler’s civilizational phase.

BUT FIRST let’s look at the man and his philosophy. Spengler was born in 1880 in the northern region of the Harz Mountains. His father, austere and distant, was a mining engineer and postal official in the town of Halle. After a classical high school education, young Spengler studied mathematics and science at universities in Berlin, Munich and Halle. Then he experienced probably the greatest disappointment of his life when he failed his oral exams. Though he passed six months later, the lapse barred him from the rarefied life of the German university professor, and he resigned himself to teaching in the Realgymnasium (high school) system. But he soon gave that up and moved to Munich, where he lived quietly on his inheritance.

In 1911, he watched with mounting alarm as his country entered into a tense confrontation with France in what was known as the Second Moroccan Crisis. War was averted when Germany backed down—in humiliation—after Britain threw her weight behind France. But the episode left young Spengler with an indelible fear that war between Germany and the French-British alliance had become inevitable. He saw this looming conflict as a clash of epic proportions with profound consequences for Western civilization.

He set out to write a book predicting this conflagration and exploring the existential rivalry between Great Britain, the trade empire of democratic capitalism, perceived by many Germans as intrinsically decadent; and Germany, a rising socialistic empire widely viewed in Spengler’s country as representing a more hallowed Prussian Kultur. The question was which power would dominate the West during its civilizational phase.

But soon he developed a vision for a wider exploration of the rise and fall of world civilizations, including the culturally spent and sterile West. He plunged into the project, continuing even as the war he had predicted turned into blood-soaked reality. Finally, in 1918 the Viennese house of Wilhelm Braumüller brought out the first volume of Der Untergang des Abendlandes. Anticipating tepid interest and minimal sales, Braumüller printed just 1,500 copies.

The book hit the German consciousness like a boulder tossed upon an anthill. As one scholar wrote a few years later, “Never had a thick philosophical work had such a success—and in all reading circles, learned and uneducated, serious and snobbish.” Sales hit a hundred thousand within eight years, and the book was translated into numerous languages. As Hughes noted, Spengler became “the philosopher of the hour.” Readers were beguiled by his sheer audacity. He didn’t paint with little brushstrokes but attacked the canvas with wide swings of his arms, painting over whole strands of Western philosophy…

Read it all.

The Boss. 

No need for any name, no need for any other words. The Boss means Bruce Springsteen.

What sets Springsteen apart? Maybe it has to do with who he is and where he came from. He was a working-class kid from New Jersey, not particularly good looking and, arguably, not even all that talented as a vocalist or musician. Nevertheless, Springsteen’s music seems timeless and appeals to a wide spectrum of people.

The Boss’ music never makes you feel special or unique. Springsteen himself never claims any kind of iconic status. Both he and his music reflect the common man and his struggles. There is no magic on a grand scale, only the few moments of satisfaction and contentment each of us can choose to carve out for ourselves.

Springsteen’s music is less about how we feel than about who we are. Our essential selves do not change. We grow, we evolve, but our essential selves stay the same. Maybe that is why Bruce Springsteen and his music seem so timeless, eternal even. We know him, we can count on him and we want to believe he’d stand up for us in the same way we’d stand up for him.

That other Jersey phenom, Bon Jovi, may be more talented, have a better voice and be a genuine heartthrob, but he ain’t the Boss.

One of his best, I believe- Jersey Girl.

New Yorker:

Nearly half a century ago, when Elvis Presley was filming “Harum Scarum” and “Help!” was on the charts, a moody, father-haunted, yet uncannily charismatic Shore rat named Bruce Springsteen was building a small reputation around central Jersey as a guitar player in a band called the Castiles. The band was named for the lead singer’s favorite brand of soap. Its members were from Freehold, an industrial town half an hour inland from the boardwalk carnies and the sea. The Castiles performed at sweet sixteens and Elks-club dances, at drive-in movie theatres and ShopRite ribbon cuttings, at a mobile-home park in Farmingdale, at the Matawan-Keyport Rollerdrome. Once, they played for the patients at a psychiatric hospital, in Marlboro. A gentleman dressed in a suit came to the stage and, in an introductory speech that ran some twenty minutes, declared the Castiles “greater than the Beatles.” At which point a doctor intervened and escorted him back to his room.

One spring afternoon in 1966, the Castiles, with dreams of making it big and making it quick, drove to a studio at the Brick Mall Shopping Center and recorded two original songs, “Baby I” and “That’s What You Get.” Mainly, though, they played an array of covers, from Glenn Miller’s “In the Mood” to the G-Clefs’ “I Understand.” They did Sonny and Cher, Sam and Dave, Don & Juan, the Who, the Kinks, the Stones, the Animals.

Many musicians in their grizzled late maturity have an uncertain grasp on their earliest days on the bandstand. (Not a few have an uncertain grasp on last week.) But Springsteen, who is sixty-two and among the most durable musicians since B. B. King and Om Kalthoum, seems to remember every gaudy night, from the moment, in 1957, when he and his mother watched Elvis on “The Ed Sullivan Show”—“I looked at her and I said, ‘I wanna be just . . . like . . . that’ ”—to his most recent exploits as a multimillionaire populist rock star crowd-surfing the adoring masses. These days, he is the subject of historical exhibitions; at the Rock and Roll Hall of Fame Museum, in Cleveland, and at the National Constitution Center, in Philadelphia, his lyric sheets, old cars, and faded performing duds have been displayed like the snippets of the Shroud. But, unlike the Rolling Stones, say, who have not written a great song since the disco era and come together only to pad their fortunes as their own cover band, Springsteen refuses to be a mercenary curator of his past. He continues to evolve as an artist, filling one spiral notebook after another with ideas, quotations, questions, clippings, and, ultimately, new songs. His latest album, “Wrecking Ball,” is a melodic indictment of the recessionary moment, of income disparity, emasculated workers, and what he calls “the distance between the American reality and the American dream.” The work is remote from his early operettas of humid summer interludes and abandon out on the Turnpike. In his desire to extend a counter-tradition of political progressivism, Springsteen quotes from Irish rebel songs, Dust Bowl ballads, Civil War tunes, and chain-gang chants.

Early this year, Springsteen was leading rehearsals for a world tour at Fort Monmouth, an Army base that was shut down last year; it had been an outpost since the First World War of military communications and intelligence, and once employed Julius Rosenberg and thousands of militarized carrier pigeons. The twelve-hundred-acre property is now a ghost town inhabited only by steel dummies meant to scare off the ubiquitous Canada geese that squirt a carpet of green across middle Jersey. Driving to the far end of the base, I reached an unlovely theatre that Springsteen and Jon Landau, his longtime manager, had rented for the rehearsals. Springsteen had performed for officers’ children at the Fort Monmouth “teen club” (dancing, no liquor) with the Castiles, forty-seven years earlier.

The atmosphere inside was purposeful but easygoing. Musicians stood onstage noodling on their instruments with the languid air of outfielders warming up in the sun. Max Weinberg, the band’s volcanic drummer, wore the sort of generous jeans favored by dads at weekend barbecues. Steve Van Zandt, Springsteen’s childhood friend and guitarist-wingman, keeps up a brutal schedule as an actor and a d.j., and he seemed weary, his eyes drooping under a piratical purple head scarf. The bass player Garry Tallent, the organist Charlie Giordano, and the pianist Roy Bittan horsed around on a roller-rink tune while they waited. The guitarist Nils Lofgren was on the phone, trying to figure out flights to get back to his home, in Scottsdale, for the weekend.

Springsteen arrived and greeted everyone with a quick hello and his distinctive cackle. He is five-nine and walks with a rolling rodeo gait. When he takes in something new—a visitor, a thought, a passing car in the distance—his eyes narrow, as if in hard light, and his lower jaw protrudes a bit. His hairline is receding, and, if one had to guess, he has, over the years, in the face of high-def scrutiny and the fight against time, enjoyed the expensive attentions of cosmetic and dental practitioners. He remains dispiritingly handsome, preposterously fit. (“He has practically the same waist size as when I met him, when we were fifteen,” says Steve Van Zandt, who does not.) Some of this has to do with his abstemious inclinations; Van Zandt says Springsteen is “the only guy I know—I think the only guy I know at all—who never did drugs.” He’s followed more or less the same exercise regimen for thirty years: he runs on a treadmill and, with a trainer, works out with weights. It has paid off. His muscle tone approximates a fresh tennis ball. And yet, with the tour a month away, he laughed at the idea that he was ready. “I’m not remotely close,” he said, slumping into a chair twenty rows back from the stage.

Preparing for a tour is a process far more involved than middle-aged workouts designed to stave off premature infarction. “Think of it this way: performing is like sprinting while screaming for three, four minutes,” Springsteen said. “And then you do it again. And then you do it again. And then you walk a little, shouting the whole time. And so on. Your adrenaline quickly overwhelms your conditioning.” His style in performance is joyously demonic, as close as a white man of Social Security age can get to James Brown circa 1962 without risking a herniated disk or a shattered pelvis. Concerts last in excess of three hours, without a break, and he is constantly dancing, screaming, imploring, mugging, kicking, windmilling, crowd-surfing, climbing a drum riser, jumping on an amp, leaping off Roy Bittan’s piano. The display of energy and its depletion is part of what is expected of him. In return, the crowd participates in a display of communal adoration. Like pilgrims at a gigantic outdoor Mass—think John Paul II at Gdansk—they know their role: when to raise their hands, when to sway, when to sing, when to scream his name, when to bear his body, hand over hand, from the rear of the orchestra to the stage. (Van Zandt: “Messianic? Is that the word you’re looking for?”)

Springsteen came to glory in the age of Letterman, but he is anti-ironical. Keith Richards works at seeming not to give a shit. He makes you wonder if it is harder to play the riffs for “Street Fighting Man” or to dangle a cigarette from his lips by a single thread of spit. Springsteen is the opposite. He is all about flagrant exertion. There always comes a moment in a Springsteen concert, as there always did with James Brown, when he plays out a dumb show of the conflict between exhaustion and the urge to go on. Brown enacted it by dropping to his knees, awash in sweat, unable to dance another step, yet shooing away his cape bearer, the aide who would enrobe him and hustle him offstage. Springsteen slumps against the mike stand, spent and still, then, regaining consciousness, shakes off the sweat—No! It can’t be!—and calls on the band for another verse, another song. He leaves the stage soaked, as if he had swum around the arena in his clothes while being chased by barracudas. “I want an extreme experience,” he says. He wants his audience to leave the arena, as he commands them, “with your hands hurting, your feet hurting, your back hurting, your voice sore, and your sexual organs stimulated!”

So the display of exuberance is critical. “For an adult, the world is constantly trying to clamp down on itself,” he says. “Routine, responsibility, decay of institutions, corruption: this is all the world closing in. Music, when it’s really great, pries that shit back open and lets people back in, it lets light in, and air in, and energy in, and sends people home with that and sends me back to the hotel with it. People carry that with them sometimes for a very long period of time.”

The band rehearses not so much to learn how to play particular songs as to see what songs work with other songs, to figure out a basic set list (with countless alternatives) that will fill all of Springsteen’s demands: to air the new work and his latest themes; to play the expected hits for the casual fans; to work up enough surprises and rarities for fans who have seen him hundreds of times; and, especially, to pace the show from frenzy to calm and back again. In the past several years, Springsteen has been taking requests from the crowd. He has never been stumped. “You can take the band out of the bar, but you can’t take the bar out of the band,” Van Zandt says.

The E Street Band members are not Springsteen’s equals. “This is not the Beatles,” as Weinberg puts it. They are salaried musicians; in 1989, they were fired en masse. They await his call to record, to tour, to rehearse. And so when Springsteen sprang out of his chair and said, “O.K., time to work,” they straightened up and watched for his cue.

Huh . . . two . . . three . . . four…

Read it all.

Via US News

The Green Initiative

January 5, 2013

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

The Relationship Counselor

January 4, 2013

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

TMI- too much information- is a concept we are all familiar with.

With a keyboard and a mouse click or two, we can access just about any bit of information we might need. Acquiring information was once a long-term investment of time and commitment; now it can be absorbed in a matter of moments. Is that a bad thing?

Maybe, maybe not.

As access to knowledge increases, there will inevitably be problems and culture shock as one ‘regime’ replaces another- but that may very well be a small price to pay.

The same argument was made with the advent of the printing press, encyclopedias and other reference books. Prescribed avenues of study were upended as students could rely on books to search out information for themselves. Books allowed students to become familiar with related and unrelated disciplines, and as a result research flourished. As literacy spread and books came into wide circulation (no longer controlled by religious or aristocratic classes), our knowledge- and freedom- expanded exponentially. Education doesn’t just empower a few individuals. It empowers a nation, a culture and a society.

In nations where intellectual pursuits are encouraged, societies are free and advanced. In nations where intellectual pursuits are stifled, cultures and societies remain backward and constrained. A society and culture which fight to keep racism and bigotry institutionalized are lesser ones. A society and culture which fight and resist racism and bigotry are far healthier.

As revolutions rock the beginning of the 21st century, we can only hope the revolutionaries with access to global information networks choose to embrace the elevation of their populations, societies and cultures. Those revolutionaries who choose to rearrange the deck chairs on what is a sinking ship will themselves be overthrown and soon forgotten.

Once unleashed, an educated class and those with access to education and knowledge cannot be kept down.

The New Criterion:

We now live in the early part of an age for which the meaning of print culture is becoming as alien as the meaning of manuscript culture was to the eighteenth century. “We are the primitives of a new culture,” said Boccioni the sculptor in 1911. Far from wishing to belittle the Gutenberg mechanical culture, it seems to me that we must now work very hard to retain its achieved values.

—Marshall McLuhan, The Gutenberg Galaxy, 1962

Technological revolutions are far less obvious than political revolutions to the generations that live through them. This is true even as new tools, for better and worse, shift human history more than new regimes do. Innovations offer silent coups. We rarely appreciate the changes they bring until they are brought. Whether or not we become the primitives of a new culture, as the Futurist Umberto Boccioni observed, most of us still live behind the times and are content to do so. We expect the machines of the present to fulfill the needs of the past even as they deliver us into a future of unknowns.

World-changing inventions almost always create new roles rather than fill old ones. “It’s a great invention, but who would ever want to use one?” was the classic response to the telephone, variously attributed to Ulysses S. Grant or Rutherford B. Hayes but probably said by neither of them. Life-altering technologies often start as minor curiosities and evolve into major necessities with little reflection on how they reform our perceptions or even how they came to be.

In the eighteenth century, Edmund Burke could see the significance of the French Revolution while observing its developments in real time. Yet “in the sixteenth century men had no clue to the nature and effects of the printed word,” writes Marshall McLuhan in The Gutenberg Galaxy, his 1962 book on the printing revolution and the dawning of the electronic age. It wasn’t until nearly 200 years on that Francis Bacon located the printing press alongside gunpowder and the compass as changing “the whole face and state of things throughout the world.” Writing in his 1620 book Novum Organum (“New Instrument”), Bacon maintained that “no empire, no sect, no star seems to have exerted greater power and influence in human affairs than these mechanical discoveries.” In the nineteenth century, Victor Hugo called the invention of printing the “greatest event in history” and the “mother of revolution.” Political revolution began in this technological upheaval.

An argument can be made, and so I will make it here, that the invention of the Internet is the under-recognized revolution of our time. The world-changing technology of the Internet, of course, is already apparent and barely needs retelling. The Internet is more significant than the telephone, the television, the transistor, or the personal computer because it subsumes all these prior inventions into a new accumulation that is greater than the sum of its parts. As the network of networks—the “inter-network”—the Internet is a revolution of revolutions.

Yet while we appreciate the Internet’s technological wonders, the cultural landscape it leads to is less explored. We acknowledge the Internet’s effect on information but are less considering of its influence on us. Even as we use its resources, most of us have no understanding of its mechanics or any notion of the ideas, powers, and people that led to its creation.

One way to situate the Internet is to see it as inaugurating the next stage of copy culture—the way we duplicate, spread, and store information—and to compare it to the print era we are leaving behind. New technologies in their early development often mimic the obsolete systems they are replacing, and the Internet has been no different. Terms like “ebook” and “online publishing” offer up approximations of print technology while revealing little of the new technology’s intrinsic nature.

Just as the written word changed the spoken word and the printed word changed the written word, so too will the digital word change the printed word, supplementing but not replacing the earlier forms of information technology. Speaking and writing both survived the print revolution, and print will survive the Internet revolution. The difference is that the Internet, with its ability to duplicate and transmit information to an infinite number of destinations, will increasingly influence the culture of the copy.

“What the world is today, good and bad, it owes to Gutenberg,” wrote Mark Twain. “Everything can be traced to this source, but we are bound to bring him homage, . . . for the bad that his colossal invention has brought about is overshadowed a thousand times by the good with which mankind has been favored.”

The Gutenberg revolution occurred around 1440 in near obscurity. The life of Johannes Gutenberg, the German metalsmith from Mainz, is largely unknown. The exact nature of the invention that he first unveiled in Strasbourg remains a source of debate. Even as the technology of book printing spread through Germany and Italy, Gutenberg died a financial failure. His recognition as the inventor of typography only came at the start of the sixteenth century, over three decades after his death.

Gutenberg did not invent every component that gave rise to the printed page. His innovation, as commonly understood, was to put existing technologies together in a press that used oil ink and movable type to stamp Roman letters arranged in rows onto a page. Gutenberg’s expertise in metalwork helped him develop a metal alloy for the letter punches that could withstand the pressures of the printing process. He also devised a simple hand mold to recast the punches. This not only led to the rise of a book’s standardized font but also enabled the reproduction of the printing machine itself.

The rapid development of print culture in Europe occurred across two trajectories at once. Each printing press could produce hundreds and soon thousands of pages a day, just as the printing machines themselves could be duplicated. In the 1450s, the greatest early demonstration of the new technology was the production of the Gutenberg Bible. Copies of Gutenberg’s rare original Bibles are today considered among not only our most valuable printed books but also the most beautiful. Thirty years after this initial run—a start-up operation that landed Gutenberg in court with his disgruntled investors—there were 110 printing presses in operation across Europe, with fifty in Venice alone. By 1500, European presses had already produced over twenty million books. A century after that, the number was 150 to 200 million copies. The printing press made bestsellers out of writers in their own lifetimes. Erasmus sold a remarkable 750,000 copies. Luther distributed 300,000 printed tracts.

The rise of print culture had countless benefits, but it also overturned many of the achievements of the manuscript culture it replaced. The great proliferation of printed books meant that scholars no longer had to seek out rare copies of written material, but literature also no longer enjoyed the protection of a scholarly class and a culture of scholasticism went into decline. As the sixteenth century saw a renewed interest in ancient writing, due to the wide reproduction of classical works, Latin also lost ground as the lingua franca of high culture. An increasingly literate middle-class public, unfamiliar with Latin, sought out books in their own vernaculars, which helped give rise to new national identities. As reading became a silent activity to be accomplished alone, the printed book challenged the oral tradition. Likewise grammar and syntax were regularized to illuminate sense rather than stress…

Read it all.

It’s tough to be a true socialist nowadays. The right wants nothing to do with socialism, equating it to the devil’s work incarnate, and the left wants to distance itself from an orthodox ideology which has pretty much failed. Very few, if any, are willing to openly embrace an ideology which has been made an orphan.

That is changing. Socialism- or what purports to be socialism- is making a comeback. This newest incarnation of the ideology isn’t quite as confrontational or adversarial. Today’s socialism is indeed Marxist-oriented, to be sure, but it is focused more on quality of life and character than on the evils of capitalism. That is an important distinction. The emphasis is on the moral and not the material. The call is for vision and not violence.

In other words the ingredients are the same but the recipe has changed.

Will this new version of socialism take hold? Only time will tell.

Boston Review:

Outside of the odd bout of scare-mongering, socialism is a term with little currency in American political discourse. Only one member of Congress openly embraces the term. The Socialist Party itself was already a spent force by 1936 (when it barely outpaced the Prohibitionist Party). Since the end of the Cold War, America’s radical left has been marginalized by the Democratic Party and policy elite, and turned instead to anarchism and a vague kind of anti-corporatism.

However, socialism’s reputation is making a comeback, at least among the young. If Millennials need any pointers, the avowedly socialist editorial board of Jacobin is happy to oblige. Founded two years ago by Bhaskar Sunkara (then 21 years old), the magazine aims to reinvigorate America’s left with writing that is practical, accessible, and very radical. The enterprise so far has been surprisingly successful. Today, its Web site gets around 250,000 unique views a month. Sunkara decided the project would get boring if left entirely online and so financed a print magazine from a handful of subscriptions and $2,000 from his own pocket. Today the magazine has more than 2,000 subscribers, including influential activists, labor leaders, and some of the very mainstream media figures it occasionally targets. (Full disclosure: I periodically contribute to Jacobin.) The press is paying attention.

Last week, Sunkara and I spoke in a nondescript Manhattan office building overlooking the statue of the Wall Street Bull. Read on for the secrets of Sunkara’s origins, Jacobin’s engagement with liberals, and the political legacy of Michael Harrington, the last socialist to make a stir in the intellectual scene.

Note: This interview has been edited and condensed.

Jake Blumgart: Who is Jacobin’s intended audience? You don’t really seem to be trying to engage with conservatives.

Bhaskar Sunkara: The intended audience is connected to the two distinct goals of Jacobin. The first is an intra-left goal to reassert the importance of class and Marxist analysis in the context of an increasingly anarchist-inflected left. We aren’t dogmatic and orthodox, we don’t think the old ways of organizing and thinking are the way forward, but we’re committed to adapting those ways of thinking to new material realities.

But there is another goal, which is more directed to the general public and—I don’t think I’ve put it this crassly before—to liberals: articulating radical left ideas and doing so in a way that is clear and accessible. The pieces are meant to be uncompromising in content but informed, accessible, and in good faith. Over the course of the project, this attempt has been wildly successful. We may get furious cries from the left for getting attention from people such as Christopher Hayes, Reihan Salam, Andrew Sullivan, and whatnot. But that’s part of our intended purpose. We don’t want a world where Hayes and Katrina vanden Heuvel are the de facto left in this country. That’s not saying anything against them; they are principled social democrats. That’s a lot for the American context. But by existing and getting the amount of mass media attention we get—from Rolling Stone to the New York Times—we’re visible reminders of a long-forgotten, and uncompromisingly socialist, political tradition. We are also trying to bring a radical perspective on politics and economics to our predominantly young audience, while other publications from our generation are focused more on culture. It’s very much in the tradition of the Second International radicals—Kautsky, Lenin, Luxemburg, and their contemporaries weren’t academics.

That’s not saying that those frameworks don’t have their place, and I love publications like n+1 and the like, but I’m talking about poverty critiques that feel like they have to start with a hook from The Wire. I think that’s bullshit. I think we can just write the essay on poverty and include a few line graphs in it. I think the left can do with a dose of empiricism and that our ideas can stand up next to others by virtue of their seriousness.

The left has been speaking in euphemisms for too long, focused on branding and compromising their ideas. We say directly that we’re a socialist publication armed with a Marxist critique, seeking radical political goals, and we’re compelling enough to have entered into the spotlight. There’s something to be said for that.

We think our ideas should stand up to the same scrutiny as other publications. That’s why we get attention from people like Hayes and Salam.

JB: From the first article of the first issue, Jacobin has put forward ideas about the politics of free time vs. time spent at tedious jobs. “The Politics of Getting A Life,” as Peter Frase memorably put it. Mother Jones’s Kevin Drum had a couple of posts recently about the declining availability of low-skill human work, but he just describes a trend and says he can’t imagine how to change it. The options seem to be immiserating work or immiserating unemployment. Jacobin is one of the only places I’ve seen trying to show an alternative: socialism as not just a redistributive but an emancipatory ideology.

BS: Yes, we are reclaiming freedom for the left. There is something in the vision of socialism that isn’t just rooted in the kitschy Soviet producerism. To some extent socialism could ideally be the ultimate ownership society. A radical extension of democracy and control into the social and economic realms, a society without the exploitation of person by person—it’s a realm of individual empowerment.

Take this recent debate: technological improvements could create mass unemployment or lead to shorter working hours. But changes in the mode of production do not necessarily mean improvement or emancipation unless there is a political will to change the way the fruits of this productivity are distributed. This is a pretty basic point. That’s why people would benefit from reading Marx. But if you don’t have it in your political repertoire, you end up as someone like Drum who has a profoundly stunted political imagination. Drum and all these people are running into the distant limits of their brand of analysis, and it’s clear even to them that Jacobin has ideas for the future—even if some of these aren’t too novel and are just updates on traditional left-wing ideas that meet the material needs of our moment.

JB: I recall Matt Yglesias’s response to Peter Frase’s “Four Futures” essay, which tried to imagine this society beyond low wages and incredibly long hours. Reducing the length of the workweek has been a historical project of the left, but it hasn’t been in the American conversation much recently, to my knowledge. It seems valuable that someone like Yglesias is engaging with that idea.

BS: We’re at a moment where it seems as though there are no ideas. But society is always constantly in motion; there are always little imperceptible changes even if we don’t acknowledge them as they come about. Human society hasn’t reached its peak, but we aren’t living in the worst of all possible worlds either. We live in a world in which worker movements and other progressive movements have made life more livable, and capitalism itself has been an intensely dynamic force, in many ways, for human civilization and its future possibilities. That being said, there is so much unnecessary suffering in the world. So much exploitation and oppression that should have become absolutely obsolete thanks to the material abundance and technology we do have. The problems facing society are mostly political ones rather than actual material ones. That’s not to imply any undue optimism about the future, but to suggest that it’s important to keep the political imagination open enough to embrace visions of a better society.

Marxism today has become this super-academic hobby that has lost its political urgency.

That’s on an intellectual level, but we are a political journal and we have to deal with the fact that the left is so fragmented and marginalized even with the emergence of something like Occupy. On some level the relevance of Jacobin will hinge on objective conditions and political developments. In the course of two years the magazine hasn’t become massive in the mainstream sense, but it’s in the conversation and lots of people read it…

Read it all.

What exactly defines Buddhism? Is it the supposedly magical properties which can transform the individual, or is it something far more subtle- the ability to transcend a shallow reality into a more substantive and profound one?

When Buddhists talk of a world in harmony, what does that really mean? What is meant to be accepted on faith alone and what is meant to be intellectually dissected? Finally, is Buddhism a religion or a philosophy?

What follows is the story of an English priest who absorbed every incarnation of Buddhism and came to understand its manifestations as a way of life which could, and did, transform lives- not into predetermined or predestined molds but rather into an individual masterpiece of the human experience, unique to each of us and ever changing as life goes on.

Think of Buddhism and Zen as the ultimate guide to self-awareness.

aeon:

Ever since I was a child, I have been acutely sensitive to the idea — in the way that other people seem to feel only after bereavement or some shocking unexpected event — that the human intellect is unable, finally, to make sense of the world: everything is contradiction and paradox, and no one really knows much for sure, however loudly they profess to the contrary.

It is an uncomfortable mindset, and as a result I have always felt the need to build a conceptual box in my mind big enough to fit the world into. Most people seem to have a talent for denying or ignoring life’s contradictions, as the demands of work and life take them over. Or they fall for an ideology, perhaps religious or political, that appears to render the world a comprehensible place.

I have never been able to support either strategy. A sense of encroaching mental chaos was always skulking at the edges of my life. Which is perhaps why I fell into an acute depression at the age of 27, and didn’t recover for several years.

The consequence of this was my first book, a memoir called The Scent of Dried Roses (1996). While I was researching it, I read the work of the psychologist Dorothy Rowe, a quiet, almost secret, follower of Buddhist philosophy. Secret, because Rowe knew what the term ‘Buddhist’ implied to the popular imagination (as it did to me) — magical thinking, Tibetan bell-ringing, and sticking gold flakes on statues of the Buddha.

Truth is not to be found by picking everything to pieces like a spoilt child

It was through Rowe’s writing that I first came across Alan Watts, and he sounded like an unlikely philosopher. His name evoked the image of a paper goods sales rep on a small regional industrial estate. But through Watts and his writing, I was exposed directly to the ideas of Zen Buddhism. I was suspicious at first, perceiving Zen Buddhism to be a religion rather than a philosophy. I wasn’t interested in the Four Noble Truths, or the Eightfold Path, and I certainly didn’t believe in karma or reincarnation.

All the same, I read a couple of Watts’s books. They made a significant impact on me. The Meaning of Happiness (1940) and The Wisdom of Insecurity (1951) are striking primers to his work, and they underlined what Rowe was already teaching me: that life had no intrinsic meaning, any more than a piece of music had an intrinsic ‘point’. Life was, in Zen parlance, yugen — a kind of elevated purposelessness.

Watts, like Rowe, showed me how we construct our own meanings about life. That nothing is a given and, since everything is uncertain, we must put together a world view that might fit roughly with the facts, but is never anything other than a guess — a working fiction. This, too, is a typical Zen understanding — that life cannot be described, only experienced. Trying to see all of life is like trying to explore a vast cave with a box of matches.

Impressed though I was, I more or less forgot about Watts after I finished his books, and pursued my career as a fiction writer. I was weary of introspection. Then, years later, a bad spell in my life propelled me back into a chasm. In 2004, three close friends died in sudden succession. One died in front of my eyes. Another was murdered. A third succumbed to cancer. My depression — and that original sense of meaninglessness — resurfaced. I turned to Watts again. This time, it was as if I was reading for dear life.

Alan Watts had been prolific in his 58 years. He died in 1973, after producing not only 27 books but also scores of lectures, all of which were available online. They had intriguing titles such as ‘On Being Vague’, ‘Death’, ‘Nothingness’ and ‘Omnipotence’. I stopped writing novels and worked my way through every one of them instead.

I found a DVD of an animation of Watts by Trey Parker and Matt Stone (of South Park fame). I discovered that Van Morrison had written a song about him, and that Johnny Depp was a follower. But he remained largely unknown in Britain, even though he was English, albeit an expatriate.

Watts was born in 1915 in Chislehurst, Kent. His father had been a sales rep for the Michelin tyre company and his mother was a housewife whose father had been a missionary. In later life, Watts wrote of mystical visions he’d had after suffering fever as a child. During school holidays — while he was a scholar at King’s School, Canterbury — he went on trips with the Buddhism enthusiast Francis Croshaw, who first developed his interest in Eastern religion.

With penetrating eyes like Aleister Crowley’s, he described himself as a ‘spiritual entertainer’

By the age of 16, Watts was the secretary of the London Buddhist Lodge, which was run by the barrister Christmas Humphreys. But Watts spoiled his chances of a scholarship to Oxford because one examination essay was judged ‘presumptuous and capricious’. And, despite his obviously brilliant mind, Watts never achieved a British university degree. This, perhaps, is another of his qualities that chimes with my own spirit — I too left school with only two A-levels, and am, like Watts, an autodidact.

As a young man, Watts worked in a printing house and then a bank. During this time, he hooked up with the Serbian ‘rascal guru’ Dimitrije Mitrinović — a follower of the Armenian spiritual teacher GI Gurdjieff and the Russian esotericist PD Ouspensky — who became a major influence on his thinking.

At the age of 21, in 1936, he attended the World Congress of Faiths at the University of London. There, he heard the renowned Zen scholar DT Suzuki speak, and was introduced to him. Later that year, Watts published his first book The Spirit of Zen.

That same year, he met the American heiress Eleanor Everett, whose mother was involved with a traditional Zen Buddhist circle in New York. He married Eleanor in 1938 and they moved to America, where he trained as an Episcopal priest, before leaving the ministry in 1950, thus separating once and for all from his Christian roots. From then on he concentrated on the study and communication of Eastern philosophical ideas to Western audiences.

I felt powerfully attracted to Alan Watts. Not only to his ideas, but to him, personally. Watts was no dry, academic philosopher. With eyes hooded and penetrating like Aleister Crowley’s, he was a jester as well as a thinker, describing himself as a ‘spiritual entertainer’. Aldous Huxley described him as ‘a curious man. Half monk and half racecourse operator.’ Watts wholeheartedly agreed with Huxley’s characterisation. He carried a silver cane ‘for pure swank’, he hung out with Ken Kesey and Jack Kerouac (he is even parodied in On the Road as Arthur Whale). His English public school-educated voice was rich and deep, like a prophet’s, and his laugh juicy and contagious.

But it was his thinking that most excited me. He was, if not the earliest, then certainly the foremost translator of Eastern philosophical ideas to the West. In some ways, his interpretations were radical — for instance, he dismissed the core Zen idea of zazen (which meant spending hours seated in contemplative meditation) as unnecessary. ‘A cat sits until it is tired of sitting, then gets up, stretches, and walks away,’ was his forgiving interpretation of zazen. Slightly less forgiving was his comment on Western Zen enthusiasts, whom he mocked as ‘The uptight school … who seem to believe that Zen is essentially sitting on your ass for interminable hours.’ For someone like me, who found the idea of excessive meditation as unhealthy as the idea of excessive masturbation, it was a great relief to read this.

Watts also rejected the conventional ideas of reincarnation and the popular understanding of karma as a system of rewards and punishments carried out, lifetime after lifetime. It was this radical approach that made his ideas so fresh — he had no time for received wisdom, even from those who claimed to know Zen inside out.

The idea of walking around with a metaphorical stick to whack yourself with is foreign to a Zen master

Many Zen ideas have become debased into ‘new age’ philosophy, basely transmuted into wishful thinking, quasi-religious mumbo jumbo and the narcissistic fantasies of the ‘me generation’. But before the beatniks and the hippies got hold of it, Zen philosophy, as described by Watts, was hard-edged, practical, logical and, in some ways, oddly English in tone, as it had deep strands of scepticism and humour. (You’ll never see Christian saints laughing. But most of the great sages of Zen have smiles on their faces, as does Buddha.)

Zen and Taoism are more akin to psychotherapy than to religion, as Watts explained in his book Psychotherapy East and West (1961). They are about finding a way to maintain a healthy personality in a culture that tends to tangle you up in a lot of unconscious logical binds. On the one hand, you are told to be ‘free’ and, on the other, that you should follow the demands of the community. Another example is the instruction that you must be spontaneous…

Read it all.

Crisis Averted

January 4, 2013

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

Via US News

The Deal, Explained

January 3, 2013

This image has been posted with express written permission. This cartoon was originally published at Town Hall.

When a bad or careless decision is made, you can be sure the law of unintended and tragic consequences will be in effect.

The events that preceded the suicide of Rutgers student Tyler Clementi were textbook examples of fear, bigotry and hate. They weren’t seen or understood as such by Clementi’s persecutors, but in the end, that is exactly what they were. He was driven to suicide because he was relentlessly persecuted for being gay.

To put that in proper perspective: he was persecuted because he was gay. Nothing else. It wasn’t as if he hurt anyone, was overbearing or engaged in any kind of inappropriate public behavior. Just the opposite, really. He was a kind and considerate human being, an asset to the community.

No, he was persecuted simply because of who he was.

His persecutors were not evil- just the products of a society that tolerates some kinds of discrimination but not others. Good kids who would never tolerate discrimination based on race, religion, disability, national origin or any of a myriad of other distinguishing features were unable to understand that persecuting anyone based on sexual identity is an equal evil.

New Yorker:

Dharun Ravi grew up in Plainsboro, New Jersey, in a large, modern house with wide expanses of wood flooring and a swimming pool out back. Assertive and athletic, he used “DHARUNISAWESOME” as a computer password and played on an Ultimate Frisbee team. At the time of his high-school graduation, in 2010, his parents bought space in the West Windsor and Plainsboro High School North yearbook. “Dear Dharun, It has been a pleasure watching you grow into a caring and responsible person,” the announcement said. “You are a wonderful son and brother. . . . Keep up your good work. Hold on to your dreams and always strive to achieve your goals. We know that you will succeed.”

One day this fall, Ravi was in a courthouse in New Brunswick, fifteen miles to the north, awaiting a pre-trial hearing. In a windowless room, he sat between two lawyers, wearing a black suit and a gray striped tie. His eyes were red. Although he is only nineteen, he has a peculiarly large-featured, fully adult face, and vaguely resembles Sacha Baron Cohen. When Ravi is seen in high-school photographs with a five-o’clock shadow, he looks like an impostor.

His father, Ravi Pazhani, a slight man with metal-frame glasses, sat behind him. Some way to the right of Pazhani were Joseph and Jane Clementi. Jane Clementi, who has very straight bangs, wore a gold crucifix. She and her husband form a tall, pale, and formidable-looking couple. Their youngest son, Tyler, had died a year earlier, and the family’s tragedy was the silent focus of everyone in the room. That September, Tyler Clementi and Ravi were freshman roommates at Rutgers University, in a dormitory three miles from the courtroom. A few weeks into the semester, Ravi and another new student, Molly Wei, used a webcam to secretly watch Clementi in an embrace with a young man. Ravi gossiped about him on Twitter: “I saw him making out with a dude. Yay.” Two days later, Ravi tried to set up another viewing. The day after that, Clementi committed suicide by jumping from the George Washington Bridge.

Clementi’s death became an international news story, fusing parental anxieties about the hidden worlds of teen-age computing, teen-age sex, and teen-age unkindness. ABC News and others reported that a sex tape had been posted on the Internet. CNN claimed that Clementi’s room had “become a prison” to him in the days before his death. Next Media Animation, the Taiwanese company that turns tabloid stories into cartoons, depicted Ravi and Wei reeling from the sight of Clementi having sex under a blanket. Ellen DeGeneres declared that Clementi had been “outed as being gay on the Internet and he killed himself. Something must be done.”

Enraged online commentary called for life imprisonment for Ravi and Wei, and Ravi’s home address and phone number were published on Twitter. Ravi was called a tormenter and a murderer. Garden State Equality, a New Jersey gay-rights group, released a statement that read, in part, “We are sickened that anyone in our society, such as the students allegedly responsible for making the surreptitious video, might consider destroying others’ lives as a sport.” Governor Chris Christie, of New Jersey, said, “I don’t know how those two folks are going to sleep at night, knowing that they contributed to driving that young man to that alternative.” Senator Frank Lautenberg and Representative Rush Holt, both from New Jersey, introduced the Tyler Clementi Higher Education Anti-Harassment Act. Clementi’s story also became linked to the It Gets Better project—an online collection of video monologues expressing solidarity with unhappy or harassed gay teens. The site was launched the day before Clementi’s death, in response to the suicide, two weeks earlier, of Billy Lucas, a fifteen-year-old from Indiana who, for years, had been called a “fag” and told vicious things, including “You don’t deserve to live.” That October, President Barack Obama taped an It Gets Better message, referring to “several young people who were bullied and taunted for being gay, and who ultimately took their own lives.”

It became widely understood that a closeted student at Rutgers had committed suicide after video of him having sex with a man was secretly shot and posted online. In fact, there was no posting, no observed sex, and no closet. But last spring, shortly before Molly Wei made a deal with prosecutors, Ravi was indicted on charges of invasion of privacy (sex crimes), bias intimidation (hate crimes), witness tampering, and evidence tampering. Bias intimidation is a sentence-booster that attaches itself to an underlying crime—usually, a violent one. Here the allegation, linked to snooping, is either that Ravi intended to harass Clementi because he was gay or that Clementi felt he’d been harassed for being gay. Ravi is not charged in connection with Clementi’s death, but he faces a possible sentence of ten years in jail. As he sat in the courtroom, his chin propped awkwardly on his fist, his predicament could be seen either as a state’s admirably muscular response to the abusive treatment of a vulnerable young man or as an attempt to criminalize teen-age odiousness by using statutes aimed at people more easily recognizable as hate-mongers and perverts.

Ravi had made four court appearances since his indictment. That morning’s hearing was intended to set a trial date, and to consider motions previously submitted by Steven Altman, Ravi’s lawyer.

Judge Glenn Berman announced that he was denying the defense’s request to see various documents in the possession of the state, including a handwritten document—conceivably, a suicide note—found among Clementi’s things at Rutgers. Then, over the objections of Julia McClure, an attorney in the Middlesex County prosecutor’s office, Berman confirmed an earlier ruling: the defense should privately be given the full name of Clementi’s romantic partner on the night of the alleged offenses. The man, known in the public record as M.B., was a likely prosecution witness.

Ravi was visibly anxious when the judge addressed him. Last May, Berman reminded him, he had rejected a plea offer made by McClure. “You are presumed innocent,” he said. “But if you are found guilty, the exposure”—the sentencing potential—“is significant.” For the charge of bias intimidation alone, the judge would be expected to sentence Ravi to between five and ten years. If Ravi accepted the plea offer, he would serve no more than five years. Berman asked Ravi if he understood. Ravi said yes, in an unexpectedly high voice, and gave a reflexive smile.

He was not taking this deal. Berman set a trial date of February 21st. The Clementis waited for Ravi and his father to leave, then walked out, hand in hand.

On a Saturday night in August, 2010, a week before starting college, Dharun Ravi decided to look online for his future Rutgers roommate. He was living with his parents in Plainsboro. Ravi, who was planning to major in math and economics, had learned that he had been assigned to Davidson Hall—a collection of single-story, barracks-like dorms on Busch campus, which is considered the dullest of the four Rutgers campuses in New Brunswick and neighboring Piscataway. He would be in Davidson Hall C, a coed dorm for about eighty students. He knew Clementi’s first name and that his last name started with C; he also knew his e-mail address, keybowvio@yahoo.com—apparently, a distillation of musical terms—and had e-mailed him but received no reply.

Late that night, according to instant-message communications released by attorneys into the public record, Ravi Googled “keybowvio.” This set in motion a remote, electronic dynamic between the two students that was never quite overtaken by real-world engagement—even after they moved into a tiny room together.

A little before midnight, Ravi began an I.M. exchange with Jason Tam, a high-school friend. Ravi had found some of Keybowvio’s posts on a Yahoo forum: something about fish tanks, Ravi told Tam, and something else “pertaining to violins.” If, with “pertaining,” Ravi was aiming for sly disdain, Tam struck a different note: “I’m calling it now. This guy is retarded.” Ravi showed Tam a link to a page on a health forum where, three years earlier, Keybowvio had asked why his asthma symptoms had suddenly worsened, noting that he had prescriptions for Advair and Singulair. Nobody had replied. There was just Keybowvio’s follow-up: “Anyone?” (“What a pussy,” Tam wrote.) Ravi and Tam also found questions about anti-virus software and contributions to a Web site of counter-revolutionary peevishness called Anythingbutipod. In these old posts, at least, Keybowvio—who was indeed Tyler Clementi—seemed worried or defensive about computing. Ravi mocked his roommate for “asking if he should boot linux everytime he surfs internet.”

Just before midnight, Ravi wrote to Tam: “FUCK MY LIFE / He’s gay.” He had found Keybowvio’s name on Justusboys, a gay-pornography site that also has discussion areas. Ravi sent Tam a link to a page that contained sex-tinged ads but was otherwise mundane. It was a conversation, from 2006, prompted by Keybowvio’s question about a problem with his computer’s hard drive. Keybowvio noted that his electronic folders were fastidiously organized; perhaps jokingly, he added, “i have ocd.”

In the next few minutes, Ravi wrote “wtf”—“what the fuck”—seven times. He posted a link to the Justusboys page on his Twitter account: “Found out my roommate is gay.”

Read it all.

What is the price of academic success? Is the value of a college education measured by the education, the college experience or some ethereal blend of the two? Are those things supposed to be jam-packed into an undergraduate’s life? Are schedules meant to organize every aspect of a student’s life, leaving spontaneity and impulsive behavior out of the picture?

These questions matter. Spontaneity and impulse are, in many ways, like fingerprints: each of us is unique, and those behaviors reflect our own personality. They are stimulated by our own internal processes and are an expression of our unique selves.

Like the jazz musician or singer who improvises on an old classic, or the photographer who finds a new way to look at the world around us, our impulsive behaviors are one way others see us for who we really are- and that is just as important as the other, more deliberate behaviors we exhibit.

So what happens when life is scripted?

Harvard Magazine:

YOU WAKE UP EACH MORNING with a fever; you feel like a shadow of yourself. But no time for sickness today—the Adams House intramural crew has one of its thrice-weekly practices at 6 A.M., and you…will…row. Some mornings, you watch the sunrise from Lamont Library after hitting your study groove there around 11 the night before and bushwhacking through assignments during the quiet time between 3 A.M. and 5. The rower and late-night scholar is Becky Cooper ’10. “Lamont is beautiful at 5 A.M.—my favorite time,” she says. “Sunlight streams in.” There’s plenty to do—Cooper is taking five courses, concentrating in literature but still pre-med: “I can’t close doors.”

She writes out her daily schedule to the minute: “Shower, 7:15-7:20.” Lunch might be at the Signet Society, the private, arts-oriented, undergraduate club where she is vice-president. She also belongs to the Isis, a female social club, and has held the post of Dionysus at the Harvard Advocate, planning social events like the literary quarterly’s spring dinner (which she revived) for 70 attendees. Cooper has an omnivorous appetite for learning and experience: new fascinations constantly beckon, and she dives in wholeheartedly. Yet the ceaseless activity leaves little space or time for reflection on who she is or what she wants. “I’m more terrified of being bored than busy,” she explains. “Though I’m scared I’ll work myself into a pile of dust if I don’t learn when to stop.”

Cooper has always been super-active. Even in elementary and middle school, she “adopted an intense work ethic” and participated in track, basketball, chorus, a pottery class, and gymnastics. At the “pressure cooker” Stuyvesant High School in Manhattan, she put the shot and racewalked for the track squad, and added cheerleading. After track meets and practices on Saturdays, she had a Sunday job as a docent in a science museum. And from seventh grade on, she attended summer camps for gifted students at upstate college campuses.

At Harvard, she has hosted a two-hour weekly jazz show on WHRB, and as a freshman acted in Ivory Tower, the long-running Harvard TV soap opera viewable on YouTube. (Last summer, she also acted in an independent film shot by a friend in Miami, learning American Sign Language for the part.) In the summer of 2007, Cooper tasted some ravishing ravioli di zucca (pumpkin)—“I was in heaven”—and determined to learn Italian and cook in Italy. As a sophomore, she got a job with Harvard University Dining Services, working with their consultant, cookbook author Mollie Katzen, and the next summer, after two months in Paris with the International Herald Tribune, was baking in Italy as a pastry chef and speaking only Italian.

As a Crimson staffer, Cooper wrote a food column every other week for the arts section. Frequently, her classes and meetings ran from 8 A.M. until 11 P.M., when she went over her column, line by line, with another Crimson editor. She returned to college this spring after taking the fall term off to continue a summer job assisting New Yorker staff writer Adam Gopnik. “It’s exhausting—here now, where next?—continually hopping from one thing to another,” she says. “You never let yourself rest. Harvard kids don’t want to do 5,000 things at 97 percent; they’d rather do 3,000 things at 150 percent.”

There’s no irony intended: “That’s the standard operating procedure,” Cooper explains. “College here is like daring yourself to swim the length of a swimming pool without breathing. A lap is a semester. I want to do everything I possibly can.” She works on a 28-hour day, she says: some days sleeping 10 hours, others, two. She can describe different levels of exhaustion. One level, she explains, is a “goofy feeling, like feeling drunk all the time; you’re not quite sure what’s going on. Then there’s this extra level of exhaustion, where you feel dead behind your eyes. The last four weeks, that’s where I’ve been. I get sick a lot.”

Keeping Up with the Einsteins

AMAZINGLY ENOUGH, Cooper is not unusual at Harvard College. Students today routinely sprint through jam-packed daily schedules, tackling big servings of academic work plus giant helpings of extracurricular activity in a frenetic tizzy of commitments. They gaze at their Blackberries (nicknamed “Crackberries” for their addictive pull) throughout the day to field the digital traffic: e-mail and text messages, phone calls, Web access, and their calendars. Going or gone are late-night bull sessions with roommates and leisurely two-hour lunches—phone calls and texting punctuate meals, anyway.

“They are unbelievably achieving,” says Judith H. Kidd, formerly associate dean for student life and activities, who retired from Harvard last year. “They are always on. They prefer to be busy all the time, and multitask in ways I could not imagine. Students will sign up for three or four activities and take one of them up to practically NGO level. They were organizing international conferences.”

There’s a wide consensus that today’s undergraduates make up the most talented, accomplished group of polymaths ever assembled in Harvard Yard: there’s nothing surprising about meeting a first-chair cellist in the Harvard-Radcliffe Orchestra who is also a formidable racer for the cycling club, or a student doing original research on interstellar dark matter who organized a relief effort in sub-Saharan Africa. “You could say it’s a high-end problem,” says dean of admissions Bill Fitzsimmons ’67, Ed.D. ’71, “but one of the dilemmas for the kind of multitalented people who come to places like Harvard is that they could do almost anything. And especially if that’s true, you need to think hard about what it is you really value, which direction is right for you.”

The paradox is that students now live in such a blur of activity that idle moments for such introspection are vanishing. The French film director Jean Renoir once declared, “The foundation of all civilization is loitering,” saluting those unstructured chunks of time that give rise to creative ideas. If Renoir is right, and if Harvard students are among the leaders of the future, then civilization is on the precipice: loitering is fast becoming a lost art. And if the tornado of achievement that whirls through Cambridge has its obvious rewards, there are, as with most tornadoes, downsides.

Sleep deprivation, for example: varsity athletes, representing about 20 percent of undergraduates, seem to be the only sizable student category to sleep and rise at roughly conventional hours, according to Harry Lewis ’68, Ph.D. ’74, McKay professor of computer science and former dean of Harvard College. At Becky Cooper’s high school, the standing joke was: “Friends, grades, sleep: you only get two.” Sleep was nearly always the odd one out. Cooper attributes her own frequent low-level infections and colds to exhaustion. Undergraduates tend to push themselves relentlessly and to disbelieve physical limits. “Harvard kids,” Cooper says, “think of themselves as superheroes.”

New technologies vastly enlarge the game of keeping up with the Einsteins. “If you aren’t on Facebook, you feel guilty, you feel like you’re being a bad citizen, or worse, that you are out of it,” says Hobbs professor of cognition and education Howard Gardner ’65, Ph.D. ’71, who studies excellence in the realm of work. “One thing we discovered in our research is that kids look up people whom they don’t know on Facebook, because they want to see how much they’re achieving. If you’re on the Crimson, but someone else is on the Crimson and the swimming team, well, then….”

The explosion of busyness has occurred not in academics (most students still take four courses a semester), but largely in extracurricular activities. “Extracurriculars are now as important as coursework,” says Gardner. “I wouldn’t have said that 40 years ago.” The number of student organizations grew almost sevenfold from 1960 to 2007-08, skyrocketing from 60 groups to 416, although undergraduate enrollment grew only about 10 percent, from about 6,000 to 6,655. In recent years, the College has added an average of 40 to 50 new student groups annually (though about half don’t endure), says David Friedrich, M.T.S. ’04, assistant dean of Harvard College for student life. In singing, for example, there are now 19 small a cappella groups at the College; before the Radcliffe Pitches were founded in 1975, the Harvard Krokodiloes were the sole such group on campus….

Read it all.

There was a time when kids came home from school with an ‘I’m home!’ announcement and proceeded to disappear into a world of their own- friends, drawing, Lego building, bike riding, ball playing and whatever other interests and priorities they set for themselves. Like so many other things, that kind of lifestyle is seemingly gone.

Now, parents decide what extracurricular activities their children participate in, and monitor (read: demand) success milestones, achievements and endless practice, practice, practice. For many children, the interests of the parents trump the interests (and often the talents) of the kids. Far too many kids have come to believe their parents’ needs or desires outweigh their own need to explore their own interests.

So what happens when the young adult flies the coop and doesn’t achieve the expected success on the expected schedule?

What happens when those helicopter parents insist on continuing to manage their child’s life?

What happens to the development of the young adult?

The Chronicle Review:

Time: last year. Place: an undergraduate classroom, in the airy, well-wired precincts of Silicon Valley University. (Oops, I mean Sun-Kissed-Google-Apps-University.) I am avoiding the pedagogical business at hand—the class is my annual survey of 18th-century British literature, and it’s as rockin’ and rollin’ as you might imagine, given the subject—in order to probe my students’ reactions to a startling and (to me) disturbing article I have just read in the Harvard alumni magazine. The piece, by Craig Lambert, one of the magazine’s editors, is entitled “Nonstop: Today’s Superhero Undergraduates Do ’3000 Things at 150 Percent.’”

As the breaking-newsfeed title suggests, the piece, on the face of it, is anecdotal and seemingly light-hearted—a collegiate Ripley’s Believe It or Not! about the overscheduled lives of today’s Harvard undergraduates. More than ever before, it would appear, these poised, high-achieving, fantastically disciplined students routinely juggle intense academic studies with what can only seem (at least to an older generation) a truly dizzy-making array of extracurricular activities: pre-professional internships, world-class athletics, social and political advocacy, start-up companies, volunteering for nonprofits, research assistantships, peer advising, musical and dramatic performances, podcasts and video-making, and countless other no doubt virtuous (and résumé-building) pursuits. The pace is so relentless, students say, some plan their packed daily schedules down to the minute—i.e., “shower: 7:15-7:20 a.m.”; others confess to getting by on two or three hours of sleep a night. Over the past decade, it seems, the average Harvard undergraduate has morphed into a sort of lean, glossy, turbocharged superhamster: Look in the cage and all you see, where the treadmill should be, is a beautiful blur.

I am curious if my Stanford students’ lives are likewise chockablock. Heads nod yes; deep sighs are expelled; their own lives are similarly crazy. They can barely keep up, they say—particularly given all the texting and tweeting and cellphoning they have to do from hour to hour too. Do they mind? Not hugely, it would seem. True, they are mildly intrigued by Lambert’s suggestion that the “explosion of busyness” is a relatively recent historical phenomenon—and that, over the past 10 or 15 years, uncertain economic conditions, plus a new cultural emphasis on marketing oneself to employers, have led to ever more extracurricular add-ons. Yes, they allow: You do have to display your “well-roundedness” once you graduate. Thus the supersize CV’s. You’ll need, after all, to advertise a catalog of competencies: your diverse interests, original turn of mind, ability to work alone or in a team, time-management skills, enthusiasm, unflappability—not to mention your moral probity, generosity to those less fortunate, lovable “meet cute” quirkiness, and pleasure in the simple things of life, such as synchronized swimming, competitive dental flossing, and Antarctic exploration. “Yes, it can often be frenetic and with an eye toward résumés,” one Harvard assistant dean of students observes, “but learning outside the classroom through extracurricular opportunities is a vital part of the undergraduate experience here.”

Yet such references to the past—truly a foreign country to my students—ultimately leave them unimpressed. They laugh when I tell them that during my own somewhat damp Jurassic-era undergraduate years—spent at a tiny, obscure, formerly Methodist school in the rainy Pacific Northwest between 1971 and 1975—I never engaged in a single activity that might be described as “extracurricular” in the contemporary sense, not, that is, unless you count the little work-study job I had toiling away evenings in the sleepy campus library. What was I doing all day? Studying and going to class, to be sure. Reading books, listening to music, falling in love (or at least imagining it). Eating ramen noodles with peanut butter. But also, I confess, I did a lot of plain old sitting around—if not outright malingering. I’ve got a box of musty journals to prove it. After all, nobody even exercised in those days. Nor did polyester exist. Once you’d escaped high school and obligatory PE classes—goodbye hirsute Miss Davis; goodbye, ugly cotton middy blouse and gym shorts—you were done with that. We were all so counter cultural back then—especially in the Pacific Northwest, where the early 1970s were still the late sixties. The 1860s.

The students now regard me with curiosity and vague apprehension. What planet is she from.

But I have another question for them. While Lambert, author of “Nonstop,” admires the multitasking undergraduates Harvard attracts, he also worries about the intellectual and emotional costs of such all-consuming busyness. In a turn toward gravitas, he quotes the French film director Jean Renoir’s observation that “the foundation of all civilization is loitering” and wonders aloud if “unstructured chunks of time” aren’t necessary for creative thinking. And while careful to phrase his concerns ever so delicately—this is the Harvard alumni magazine, after all—he seems afraid that one reason today’s students are so driven and compulsive is that they have been trained up to it since babyhood: From preschool on, they are accustomed to their parents pushing them ferociously to make use of every spare minute. Contemporary middle-class parents—often themselves highly accomplished professionals—”groom their children for high achievement,” he suspects, “in ways that set in motion the culture of scheduled lives and nonstop activity.” He quotes a former Harvard dean of student life:

This is the play-date generation. … There was a time when children came home from school and just played randomly with their friends. Or hung around and got bored, and eventually that would lead you on to something. Kids don’t get to do that now. Busy parents book them into things constantly—violin lessons, ballet lessons, swimming teams. The kids get the idea that someone will always be structuring their time for them.

The current dean of freshmen concurs: “Starting at an earlier age, students feel that their free time should be taken up with purposeful activities. There is less stumbling on things you love … and more being steered toward pursuits.” Some of my students begin to look downright uneasy; some are now listening hard.

Such parental involvement can be distasteful, even queasy-making. “Now,” writes Lambert, parents “routinely ‘help’ with assignments, making teachers wonder whose work they are really grading. … Once, college applicants typically wrote their own applications, including the essays; today, an army of high-paid consultants, coaches, and editors is available to orchestrate and massage the admissions effort.” Nor do such parents give up their busybody ways, apparently, once their offspring lands a prized berth at some desired institute of higher learning. Lambert elaborates:

Parental engagement even in the lives of college-age children has expanded in ways that would have seemed bizarre in the recent past. (Some colleges have actually created a “dean of parents” position—whether identified as such or not—to deal with them.) The “helicopter parents” who hover over nearly every choice or action of their offspring have given way to “snowplow parents” who determinedly clear a path for their child and shove aside any obstacle they perceive in the way.

Now, as a professor I have had some experiences with “helicopter” parents, and were weather patterns on the West Coast slightly more rigorous, I’m sure I would have encountered “snowplow” parents as well. Indelibly etched on my brain, I tell the class, is a phone call I received one winter break from the aggrieved mother of a student to whom I had given a C-minus in a course that fall. The class had been a graduate course, a Ph.D. seminar, no less. The woman’s daughter, a first-year Ph.D. student, had spoken nary a word in class, nor had she ever visited during office hours. Her seminar paper had been unimpressive: Indeed it was one of those for which the epithet “gobsmackingly incoherent” might seem to have been invented. Still, the mother lamented, her daughter was distraught; the poor child had done nothing over the break but cry and brood and wander by herself in the woods. I had ruined everybody’s Christmas, apparently, so would I not redeem myself by allowing her daughter to rewrite her seminar paper for a higher grade? It was only fair.

While startled to get such a call, I confess to being cowed by this direct maternal assault and, against my academic better judgment, said OK. The student did rewrite the essay, and this time I gave it a B. Generous, I thought. (It was better but still largely incomprehensible.) Yet the ink was hardly dry when the mother called again: Why wasn’t her cherished daughter receiving an A? She had rewritten the paper! Surely I realized … etc. One was forced to feign the gruesome sounds of a fatal choking fit just to get off the phone.

Did such hands-on parental advocacy—I inquired—trouble my students? My caller obviously represented an extreme instance, but what did they think about the wider phenomenon? Having internalized images of themselves (if only unconsciously) as standard-bearers of parental ambition—or so Lambert’s article had it—their peers at Harvard didn’t seem particularly shocked or embarrassed by Ma and Pa’s lobbying efforts on their behalf. According to one survey, only 5 to 6 percent of undergrads felt their parents had been “too involved” in the admission process. Once matriculated (there’s an interesting word), most students saw frequent parental contact and advice-giving as normal: A third of Harvard undergraduates reported calling or messaging daily with a parent…

Read it all.

Via AJC

Good News, Bad News

January 3, 2013

Via TIME

We will resume regular posting on Thursday, January 3, 2013.

In this, our last post of 2012, SC&A want to extend New Year’s Greetings to our readers.

In a less than perfect world, we crave the warmth and security of home. Whether as children or adults, our home and family should be where we feel most comfortable and at ease. Home is the respite from daily battle, where you return when the day is done. Your home and family are the center of your life, whether you feel it at the moment or not.

It is in our homes that our most important life decisions are made. Our homes have shaped many of our beliefs and attitudes, our awareness and self-esteem, that feeling of worth- and, in a healthy individual, the motivation to give to others. It is from our homes and families that we learn to share, to cope, to play and to forgive. We learn to be comfortable with ourselves. Most importantly, in a healthy home, we learn to laugh and be happy.

The drive to acquire more ‘things’- larger homes, vacation homes, more luxurious homes and so on- contributes to and creates a false sense of security, a delusion that only further isolates the soul. As with investment portfolios, familial security requires diversification. If acquiring and overvaluing material goods is the only way one can find inner peace, then temporary peace and satisfaction are all that person will end up with.

We envy the mythical Bob Cratchit because the values of a real home, and the comforts that real home brings, are eternal. There is no later or newer model, no ‘next generation.’ In fact, a real home and its comforts become even more valuable with the passage of time- our lives and our experiences all add texture and meaning to ‘home.’ Those who have been fortunate enough to come from a healthy home can attest to it. Those who have not been so blessed see it clearly- like the cancer patient who sees the healthy person and understands, more than most, the value of the gift of health. The person who does not come from a healthy home does not have to be convinced of the values found therein.

A healthy home is really an extension of a common life, common values and common goals- positive contributions to our world and the world around us. That is not to imply that we all need be the same- it is to imply that whatever our differences, be they religious, political or philosophical, family and home need be our point of origin, the place from which who we are, differences and all, originates. When that is so, we relish going home. When it is not so, we are reluctant to return.

Going home isn’t about a party or a meal. Going home is a chance to refuel and recharge on those things that make us family. We want to go home, to be there, with those we love and care for- or those we want to be with or love. Why? Because we have all been there, at least once. We have all experienced that spiritual dimension of home and family, at least once in our lives- and it is intoxicating and addictive. We want more, no matter how far away it might be for some.

When we go home for the holidays, whether we do so with eagerness or trepidation, it would be wise to keep those things in mind.

Those who fear the place they came from can go home and participate in the toxicity, or they can choose not to engage and to ignore it. While some may choose to watch television, there might be someone willing to engage in meaningful conversation. Taking some time with a bored or sullen child can affect you and that child in ways you cannot begin to imagine. A few hours without the defensive facade, with the tension deliberately pushed aside, will not change the dysfunction and pain of the past. Still, those few hours, meaningfully spent, can serve to highlight how far you have come from that dysfunctional environment and how much that means to you. Those few hours might serve as a lamplight to further potential and possibilities.

That may not change the family you came from- but it will change the family you create- even if it is only a family of one.

May the coming year find your home a place of safety, security and family. In 2013, may your home be more about meaning and love for self and others and less about the things that matter far less.

Portions of this post have been previously published

Happy 2013 From Congress

December 30, 2012

Via US News
