In the summer of 1725 a peculiar youth was found in the forest of Hertswold near Hameln in northern Germany. Aged about 12, he walked on all fours and fed on grass and leaves. ‘A naked, brownish, black-haired creature’, he would run up trees when approached and could utter no intelligible sound. The latest in a long line of feral children – in turn celebrated, shunned and cursed through the ages – ‘The Wild Boy of Hameln’ would be the first to achieve real fame.
After a spell in the House of Correction in Celle, the boy was taken to the court of George I, Elector of Hanover and King of Great Britain, at Herrenhausen. There the young curiosity was initially treated as an honoured guest. Seated at table with the king, dressed in a suit of clothes with a napkin at his neck, he repelled his host with his complete lack of manners. He refused bread, but gorged himself on vegetables, fruit and rare meat, greedily grasping at the dishes and eating noisily from his hands, until he was ordered to be taken away. He was given the name of Peter, but was variously known as ‘Wild Peter’, ‘Peter of Hanover’, or, most famously, ‘Peter the Wild Boy’.
In the spring of 1726, after briefly escaping back to the forest, Peter was brought to London where his tale had aroused particular interest. As in Hanover, he caused a sensation and his carefree nature provided an amusing antidote to the stultifying boredom and decorum of court life. He appealed especially to Caroline, Princess of Wales, who persuaded the king to allow Peter to move to her residence in the West End, where he was kept virtually as a pet. Though he insisted on sleeping on the floor, he was dressed carefully each morning in a tailor-made suit of green and red. He was also appointed a tutor, who had him baptised and taught him to bow and kiss the hands of the ladies at court.
Peter quickly became a celebrity. On one level, tales of his antics busied the London gazettes. Jonathan Swift, whose fictional ‘Yahoos’ Peter appeared to personify, noted sourly that ‘there is scarcely talk of anything else’. He was soon the ‘talk of the town’, his portrait graced the walls of the King’s Grand Staircase at Kensington Palace and an effigy of him was erected in a waxworks on the Strand…
But Peter could not live up to the popular interest invested in him and a fickle public quickly abandoned him in favour of the next unfortunate. His academic progress also failed to match his earlier promise. He was declared ‘unable to receive instruction’, despite the attentions of ‘the ablest masters’. He could say nothing beyond his own name and a garbled form of ‘King George’. By 1728, his tutor had given up his efforts and Peter was retired to the country. A home was found for him on a farm near Northchurch in Hertfordshire and a generous crown pension of £35 per annum was supplied for his upkeep. The ‘talk of the town’ became a humble farm hand.
Though still only an adolescent, Peter faded into provincial obscurity and thereafter rarely troubled the gossip columns. He developed a taste for gin and loved music, reportedly swaying and clapping with glee and dancing until he was exhausted. But he never learned to speak and his lack of any sense of direction gave cause for concern. In 1745, the year of the Jacobite Rebellion, he was arrested as a suspected Highlander and, six years later, he wandered as far as Norwich, where he was thought to be a Spanish subversive. As a result he was fitted with a heavy leather collar bearing the inscription: ‘Peter, the Wild Man of Hanover. Whoever will bring him to Mr Fenn at Berkhamsted, Hertfordshire, shall be paid for their trouble.’ He finally died, aged around 72, in 1785.
Though Peter’s life is remarkable enough, what is most astounding is the sheer scale of scientific and philosophical interest that his case aroused. While wits opined that the boy might be corrupted by the sybaritic life of London high society, others saw in him an ideal test case for the nascent sciences of anthropology and psychology.
To the thinkers of the Age of Reason, Peter represented a blank slate. As humanity in its ‘raw’ state, he was what Jean-Jacques Rousseau called ‘the noble savage’, man ‘unspoilt’ by society and civilisation. He was indeed a fascinating subject, but he provoked further, disquieting, enquiry. He was undoubtedly human but, lacking speech and socialisation, could he be classed as a man? Could he have a soul? Could he possess the power of thought?
Of the numerous thinkers and writers who addressed the subject, Daniel Defoe did so with the most clarity in his pamphlet Mere Nature Delineated, published in 1726. He described Peter as an ‘object of pity’ but cast doubt on the story of his origins, dismissing it as a ‘Fib’. On the issue of Peter’s soul, he was more charitable. Possessed of the gift of laughter and thought, Peter clearly had a soul, he wrote, but its powers did not yet act within him. He was, in sum, ‘in a state of Mere Nature … a ship without a Rudder’. And it was the task of his tutors to bring him to ‘the Use of his Reason’. He deferred the final verdict on Peter, therefore, until the results of his education became apparent. If he could receive instruction – if he could be taught to heed his soul – then he would become a man. And, what was more, he would be a lesson to us all, especially, wrote Defoe, ‘those who think nobody so wise as themselves’.
Defoe wrestled manfully with the uncomfortable question that Peter posed: what was it that divided ‘us’ from ‘them’, man from the animals? Different minds arrived at different conclusions. But Carl Linnaeus, that habitual tidier of nature, was typical. He reassured mankind by creating a separate species of ‘wild men’, Homo ferus. Peter was still clearly an outsider – one of ‘them’.
Peter’s example was later used in numerous theories of child development, socialisation and the role of language. Many thinkers dwelt on his inability to learn to speak. The philosopher James Burnett (Lord Monboddo), whose ideas anticipated some of Darwin’s, presented him as an illustration of his theory of the evolution of language in the human species. He saw Peter as evidence that ‘man was born mute, and that articulation is altogether … a habit acquired by custom and exercise’. To others, Peter was thought to demonstrate the existence of a ‘critical window’ in which language and other skills are developed in the child. Having missed the ‘window’, Peter could never learn such skills again. Hence the apparent failure of his esteemed tutors…
Perhaps you’ve seen the helmet babies – on the T, strapped into portable carrying chairs between fidgety parents; on the street, curled up in slings against the chests of dads. Helmet babies look strange, their soft baby heads encased in shells of foam, tight straps hugging their chins. If you’ve spotted one at close range, you may have felt the temptation to ball your hand into a fist, reach over, and give that fortified little noggin a gentle “knock knock.”
Not long ago, babies were fitted with helmets only if they were born with irregularly shaped heads. But in recent years, entrepreneurial manufacturers have expanded the market, creating helmets designed for any children whose parents want to protect them from scrapes and bruises while they’re learning how to stand upright and walk. These helmets have names like ThudGuard, SoftTop, and Baby No Bumps. Some even come with decorative Mickey Mouse ears.
The baby helmet is just one piece of the protective armor being built around childhood these days. There are soft pads to shield babies’ knees from irritation while they’re learning how to crawl. Specialty feeding spoons change color when the food is too hot. GPS devices track babies’ movement in real time. The Safety Turtle antidrowning alarm alerts you when they get into the water.
As these products proliferate – perhaps you’d like to dress your baby in a full-body jumper with special pockets that make it impossible to drop him? – so does the sentiment that perhaps we’re going too far, and that parents have let their protective instincts get the best of them. In books, magazines, and parenting blogs, a divisive public debate has placed safety-conscious moms and dads on the defensive against a chorus of critics who believe America’s children are being crippled by paranoid overprotection.
At the center of this debate, as the authors of books like “Free Range Kids” and “Too Safe For Their Own Good” will tell you, is a worry that contemporary parenting trends are producing a generation of weaklings, kids so insulated from the world around them that they’ll never learn to navigate risk or handle pain. And though on the surface the debate concerns the future of childhood, it is also built on a belief about the past: that being a kid used to be radically different than it is today – more free, more rough-and-tumble, and on a deep level, better.
Over the past 20 years or so, historians specializing in childhood have forged a body of work that has begun to change how we see that past. The experience of childhood has, indeed, changed dramatically over time. And though
some historians have argued that the very concept of childhood as we know it did not properly exist in people’s minds until a few centuries ago, a rich body of evidence suggests that parents haven’t changed much at all – that they have always gone as far as possible to protect their offspring, in some cases outfitting them with early versions of the very safety devices we now think of as quintessentially modern.
What an overload of safety measures might actually do to children, psychologically and emotionally, remains up for debate. And while it’s incontrovertible that children used to have more freedom, it’s also clear that the march towards ever more sophisticated methods of child protection began long ago – as did the worries about their effects. Helmets for babies may seem a little extreme, but there is every reason to think they were also inevitable. The proper response to our zeal for safety may lie not in fighting the impulse, then, but in engineering ways to blunt its negative effects.
We want to believe there was a time when it was all very different – when kids could be kids, and parents weren’t too risk averse to let their offspring grapple with the world’s harshness. This is the idea embedded in much of the criticism one hears about contemporary child-rearing: that once upon a time kids were taught responsibility instead of fear, and were encouraged to make formative mistakes instead of being vigilantly insulated against them.
This belief formed the basis of one of the first works of childhood history ever published, when in the year 1960 a French historian named Philippe Ariès made the provocative claim that, during the Middle Ages, children had been treated like miniature adults. Ariès argued in his landmark book, “Centuries of Childhood,” that until the 18th century or so, the concept of childhood as distinct from other life stages did not really exist…
August 14, 2011
The 9/11 Memorial, with its shimmering pools, mesmerizing waterfalls, and elegiac bronze inscriptions of the names of the lost, will be dedicated this September 11. On the tenth anniversary of the attacks, the monument, floating on the footprint of the twin towers, will have the distinction of being the only element of the plans created, debated, endlessly revised, and repeatedly parsed for metaphorical, political, and practical implications that has actually come to fruition at Ground Zero, offering a degree of the closure that is elusive in so many other ways.
However, ongoing construction has blocked access to the memorial, resulting in limited tickets for the general public, which were snapped up by mid-July. The museum slated for the World Trade Center site, whose mission is to “bear solemn witness” using artifacts, photos, stories, and videos, won’t open until 2012. So the collective desire for a venue for communal reflection and remembrance, to face the incomprehensible and grasp at the intangible, cannot immediately be fulfilled at the site of the attacks themselves.
This has left an opening—and a profound dilemma—for the city’s cultural institutions. For New York museums, it’s not clear whether creating content related to the anniversary of 9/11 is a responsibility, an opportunity, or an invitation to inevitable and unwanted controversy.
In the aftermath of 9/11, it quickly became clear that art about or at Ground Zero was perceived by many as subject to a vetting process by constituencies connected to the attacks—and that stated priorities of patriotism, as well as the moral rights of victims and their families, trumped freedom of expression. This is why Eric Fischl’s Tumbling Woman disappeared so fast from Rockefeller Center and why the Drawing Center pulled out of a planned cultural facility on the World Trade Center site. Though the roles of irony and sincerity, initially reversed after the attacks, have been restored to their pre-2001 levels in the larger art world, it is not clear whether that’s the case with content related to 9/11. The chronicle in the official book of the 9/11 Memorial, A Place of Remembrance (National Geographic), shows, as if any more proof were needed, how sensitive, delicate, and fraught each object, image, and symbol of the attacks remains.
Yet several of the city’s museums are moving forward with 9/11 programming. The Metropolitan Museum of Art is exhibiting the 9/11 Peace Story Quilt (2006), Faith Ringgold’s project with New York City schoolchildren. The New Museum is showing [Swi:t] Home: A Chant (2001–6), by Elena del Rivero, who stitched together burnt papers and other detritus that blew into her studio, across from the twin towers. Other venues, including the Morgan Library & Museum, are presenting memorial readings or concerts. And three institutions are offering ambitious shows that reflect on the attacks or their legacy.
The New-York Historical Society, which has become the uptown clearinghouse for objects related to the tragedy, is presenting “Remembering 9/11,” a “memorial installation,” as curator Marilyn Satin Kushner puts it, of objects documenting “attack, shock, recovery.” It includes about 150 images from “here is new york: a democracy of photographs,” the wildly popular, critically acclaimed, crowdsourced, non-curated SoHo exhibition; objects from the shrines that materialized around the city after the attacks; photos of Tribute in Light, the public-art project whose projected beams stood in for the missing towers; children’s letters to firefighters and police officers; architect Michael Arad’s drawings of the 9/11 Memorial, and more. Satin Kushner’s approach for this show, she says, is very different from her approach to curating art, where she tries to explain why it’s relevant. “The objects speak for themselves,” she comments.
If such objects fulfill a visceral need to collect and remember, the challenge for art museums approaching 9/11 is different—presumably, to process this raw material into some kind of expression that transforms, questions, enlightens, inspires, soothes, or any of the other things culture is supposed to do. Documentary artifacts are “inextricably bound with the event,” says Peter Eleey, chief curator at MoMA PS1. “They don’t allow us to make our own meaning.” Yet as Eleey set out to curate a show on 9/11, he decided that there isn’t enough significant art inspired by 9/11 to anchor an exhibition. Individual works like del Rivero’s have been made with or about detritus from the attacks—or, as Art Spiegelman put it in the title of his 2004 graphic elegy, about living “In the Shadow of No Towers.” But what’s lacking, Eleey says, is work that “speaks to the immensity of the event”—though he’s not sure it’s even fair to demand such a thing. In part, he notes, addressing the spectacular nature of the attacks is a challenge for artists because it is so closely tied with terror, violence, and death. In addition, the ongoing presence of soldiers in the field and construction on the ground reflect a different problem: “How do you memorialize or commemorate an event that hasn’t been concluded but is referred to as a finished event?” he comments…