The Whole World Is Watching: In an increasingly monitored world, how can consumers and citizens reclaim ownership of their private lives?
November 30, 2011
Early in 2010, The Guardian reported plans by British police and the Home Office for a remarkable new venture in domestic surveillance. Unmanned aerial drones, now used for tracking insurgents in Pakistan and Afghanistan, are to be adapted (unarmed, one hopes) to monitor Britain’s civilian population. An initial aim of the project is crowd control during the 2012 London Olympics. Thereafter, these high-tech surveillance engines are to become a permanent feature of state security and law enforcement—much to the distress of civil libertarians and privacy advocates, who immediately objected to the plans.
But no one can say this is especially new. With an estimated 1.7 million video cameras deployed on the ground, George Orwell’s homeland can probably already claim world leadership in state-sponsored monitoring of its population. And the intensification of all forms of institutional tracking of individuals isn’t restricted to Britain—it is occurring the world over. All told, the United States has probably contributed more to these trends than any other country as both the creator and exporter of different means of government and corporate surveillance. The sheer variety of forms implicated in this monitoring is striking. They include real-time recording of consumers’ buying habits and finances; tracking of travelers’ movements by air, train, and road; monitoring of private citizens’ telecommunications; and the mass harvesting of tidbits of personal data from social sites like Facebook.
The seemingly relentless pace of innovation in surveillance cannot be ascribed to any one interest, policy, organizational purpose, or political mood. Instead, it suffuses all manner of relations between institutions and individuals, from the allocation of welfare-state benefits to the pursuit of suspected terrorists.
The result has been change in the very texture of everyday life. Being “alone” is not what it used to be. Our whereabouts, our financial transactions, our uses of the World Wide Web, and countless other data routinely register in the automated consciousness of corporate and state bureaucracies. More importantly, the results of such monitoring in turn shape the treatment we receive from these organizations—sometimes in ways that we know, and often in ways we hardly imagine.
Many observers dismiss these developments with a shrug: The fate of personal privacy in the face of institutional data-gathering may be hopeless, they hold, but such a development is not really serious. The collection of personal data supports all sorts of valued corporate conveniences and public policies, from easy credit to protection from terrorist threats. The fact that my most intimate medical information is held by a distant bureaucracy is hardly a loss, the argument goes, so long as the people who handle it don’t really know me in any personal way. And why should I care if government agencies track my communications, movements, or expenditures if I have nothing to hide? We ought to be grateful for these developments, and not challenge them with anachronistic values like privacy.
Such nonchalance shortchanges both the complexity of the changes we are enmeshed in and their repercussions in our everyday lives. In every realm of life, the flow (and restriction) of personal information confers advantage and disadvantage between parties, opening some possibilities and closing others. In face-to-face relationships, as elsewhere, we do not readily disclose information about areas of our lives in which we feel weak, troubled, or ashamed. Nor do we reveal information that could confer a strategic advantage on the other party: the maximum we are willing to pay in a purchase we are negotiating, for example. For all sorts of reasons, we cherish the ability to control sensitive information about ourselves.
Thus, even those who profess themselves unconcerned about privacy are apt to object when unauthorized use of their information works against them. They will not appreciate finding themselves the losers, for example, in “target pricing”—the practice in which online retailers raise or lower prices offered to different customers for identical items on the basis of their past buying habits. They will be displeased if they discover that their bosses have accessed their medical files from the company health-care plan and used that data as the basis for decisions on pay or promotions. They will feel aggrieved on finding themselves subjected to marketing appeals for embarrassing products or services—incontinence supplies, treatments for sexual dysfunction—on the strength of their past website visits or consumer choices. They will wax indignant if they discover that the prices they are quoted for insurance coverage have been raised because the insurer has discovered that they have low credit scores, which supposedly correlate with a greater likelihood of filing insurance claims. And they will be outraged if they find themselves victims of “universal default”—creditors’ policy of raising a customer’s rates on one credit account based on reports that the balances on the customer’s other credit accounts have risen.
In these cases and countless others, people resent receiving unfavorable treatment on the basis of information about themselves that they consider “nobody else’s business.” The trouble is, notions of what information constitutes anyone’s own “business” are in headlong transformation. We live in a world in which possibilities for accessing personal data are mutating in ways that institutions, unsurprisingly, exploit to their own advantage. What disclosures and uses of personal data are held “reasonable” under such circumstances is constantly up for grabs. That is why the need for serious public conversations about privacy is so urgent.
Classic visions of liberal society stress judicious limitations of institutional power, both governmental and corporate, coupled with preservation of individual autonomy and freedom of choice. We accept that institutions like the IRS have investigative powers sufficient to collect most taxes owed, most of the time. But we recoil—I hope—at an idea like unlimited IRS monitoring of all taxpayers’ e-mails and phone conversations aimed at registering key words associated with tax evasion. Such (hypothetical, but quite feasible) measures could be very effective in spotting underreporting of taxable income. But even greatly increased compliance with tax obligations is not worth such sweeping losses to privacy…
How Brooklyn Got Its Groove Back: New York’s biggest borough has reinvented itself as a postindustrial hot spot
November 30, 2011
In 1982, I moved with my husband and our two young children into a partly renovated brownstone in Park Slope, Brooklyn. Last year, New York pronounced the area “the most livable neighborhood in New York City,” but in those days, real-estate agents euphemistically described it as “in transition,” meaning that the chances you’d get mugged during a given year were pretty good. Educated middle-class couples like us, who had been moving into the area between Seventh Avenue and Prospect Park for more than a decade, lived alongside the Irish, Italian, and Puerto Rican immigrants who had given Brooklyn its working-class identity and its former nickname, “Borough of Churches.” For us, Saint Francis Xavier’s was just the sponsor of our children’s Little League teams, but it remained a religious and community center for those who also frequented smoke-filled bars on Seventh like Snooky’s and Moody’s.
Among the old-timers was our neighbor Peggy Lehane. Her late husband had been a postal worker, and, like a lot of Park Slopers back in the day, she supplemented the family’s modest finances by taking in boarders in their four-story house. Legend had it that at one time she had a clientele of respectable bachelors and shabby “heiresses” to whom she served tea on silver trays. By the time we arrived, her boarders were elderly men and women on government assistance. We could hear the 3 AM moaning of these sad creatures and smell the contents of their bedpans, which they sometimes tossed into the patch of grass in back. Like the neighborhood, the house was transitional. More than once, and much to the wonderment of my children, ambulances arrived to remove a white-sheet-covered body.
As the decade proceeded and crime worsened, Park Slope trembled between shabby respectability and drug-fueled violence. Rumors circulated of a crack house near Prospect Park, which had become a dangerous shadow of the original Olmsted-and-Vaux masterpiece. The streets around us endured a nightly explosion of shattered glass from car windows. Kids on their way home from elementary school were knocked down; parents walking from the subway station were held up at gunpoint. When my children went to camp, suburban kids, hearing that they were from Brooklyn, would ask: “Have you ever been shot?”
Our neighbor’s house reflected the Slope’s perilous condition. When we first moved in, Mrs. Lehane, always wearing a faded but neatly pressed dress, thick stockings, and lipstick, used to sweep the sidewalk with the intensity of a corporate lawyer on a gym treadmill. Now, her lipstick was smudged, her dresses were torn, and her stockings sagged. Instead of elderly renters, she took in “former” alcoholics and drug addicts living on disability payments. Mrs. Lehane’s children had moved to Long Island, but her foul-tempered granddaughter moved in, supposedly to oversee the house. The granddaughter’s violent fights with her boyfriend would sometimes wake us in the middle of the night. One day, Mrs. Lehane disappeared—to a nursing home, I heard. Many of our friends and acquaintances—fed up with vagrants on their stoops and graffiti, or terrified for the safety and education of their kids—left as well. I don’t know what combination of denial and passivity made us stay. It seemed inevitable that something terrible would happen.
And so it did. One October night in 1995, after putting our Brooklyn-born youngest child to bed, my husband smelled smoke. Sure enough, a thin film of gray was swaying through our second-floor hallway, and we quickly spotted sickening black waves of the stuff pouring out of the moldings atop our bedroom windows. We ran outside to find the street jammed with fire trucks, ambulances, and awestruck neighbors. For the next two hours, we sat on a neighbor’s stoop and watched the Lehane house—its 1890s mahogany-trimmed parlor; its oak parquet floors; its memories of bourgeois Victorian respectability, of hard-knock immigrants, of addiction and decay—consumed in a conflagration apparently caused by a tenant who’d fallen asleep with a lit cigarette in his hand. We were lucky: our house suffered only some smoke damage. Others were not: several firemen were hurt, and a boarder died from smoke inhalation. For the next three years, the charred and empty house brooded over the block, a symbol of an uncertain urban future.
If you’ve been in Park Slope recently, you can probably guess how things turned out for the Lehane house. But you may not know why. How did the Brooklyn of the Lehanes and crack houses turn into what it is today—home to celebrities like Maggie Gyllenhaal and Adrian Grenier, to Michelin-starred chefs, and to more writers per square foot than any place outside Yaddo? How did the borough become a destination for tour buses showing off some of the most desirable real estate in the city, even the country? How did the mean streets once paced by Irish and Italian dockworkers, and later scarred by muggings and shootings, become just about the coolest place on earth? The answer involves economic, class, and cultural changes that have transformed urban life all over America during the last few decades. It’s a story that contains plenty of gumption, innovation, and aspiration, but also a disturbing coda. Brooklyn now boasts a splendid population of postindustrial and creative-class winners—but in the far reaches of the borough, where nary a hipster can be found, it is also home to the economy’s many losers.
To understand the emergence of the new Brooklyn, it’s best to start by recalling its original heyday. From the mid-nineteenth century to 1898, when it became part of New York City, Brooklyn was one of the nation’s preeminent industrial cities, and its dominance continued until about 1960. Facing New York’s deepwater harbor and the well-traveled East River, Brooklyn’s waterfront was lined with factories. Workers in those factories lived in the borough’s numerous tenements, row houses, and subdivided townhouses. Some worked the assembly line in the Ansonia Clock Factory in Park Slope. (It later became the neighborhood’s first condo-loft space.) Others worked in the Brooklyn Navy Yard, in an area now known as Vinegar Hill. Still others worked on the docks in Red Hook, the inspiration for the Marlon Brando movie On the Waterfront; in the Arbuckle coffee-roasting factory under the Manhattan Bridge; in the paint factories and metal shops in Gowanus; in the breweries in the once-German enclaves of Williamsburg, Greenpoint, and Bushwick; and in the pharmaceutical factory founded in East Williamsburg by Charles Pfizer. They worked in the Domino sugar refinery, at one time the largest in the world, whose big red DOMINO sign (still illuminating the East River at night) was all that some Manhattanites knew firsthand of Brooklyn…
November 30, 2011
When I was very small I lived on a defunct chicken farm. There was a house with a yard, and these together took up half an acre. To the north there was a long, thin chicken coop, empty of chickens, and behind it lay the back pasture, which occupied one acre. Perpendicular to this, to the west, there was the side pasture. Steers dwelled in the back pasture, ate hay, shat, sculpted odd forms on the salt lick (until we had them shot and butchered). As far as I know, these were actually existing steers. But the side pasture was inhabited, I imagined for a long time, by a fox. When I went there by day, I felt I was entering upon its territory; and when I lay in bed at night, I was certain it was out there, in its burrow, dwelling. It lived there like a human in a home, and was as real as any neighbor—except that I had myself brought it into existence, likely by projecting it out of a picture in a book.
The fox did not need to exist in order to function in my imagined community, one which must be judged no more or less real than that of, say, Indonesians, or of humanity. It was enough that there be foxes at all, or creatures that fit that description, in order for me to conjure community with the imaginary fox in the side pasture. And it was no mere puerile phantasm that caused me to imagine this community, either. It was rather my thinking upon my own humanity, a condition which until very recently remained, over the course of an entire human life, embedded within a larger community of beings.
These days, we are expected to grow out of that sort of thinking well before puberty. Our adult humanity consists in cutting off ties of community with animals, ceasing, as Lévi-Strauss put it, to think with them. When on occasion adults begin again to think about animals, if not with them, it is to assess whether animals deserve the status of rights-bearers. Animal rights, should there be such things, are now thought to flow from neurophysiological features and behavioral aptitudes: recognizing oneself in the mirror, running through mazes, stacking blocks to reach a banana.
But what is forgotten here is that the animals are being tested for re-admission to a community from which they were previously expelled, and not because they were judged to lack the minimum requirements for the granting of rights. They were expelled because they are hairy brutes, and we learned to be ashamed of thinking of them as our kin. This shame only increased when Darwin confirmed our kinship, thus telling us something Paleolithic hunters already knew full well. Morality redoubled its efforts to preserve a distinction that seemed to be slipping away. Since the 19th century, science has colluded with morality, always allowing some trivial marker of human uniqueness or other to function as a token for entry into the privileged moral universe of human beings. “They don’t have syntax, so we can eat them,” is how Richard Sorabji brilliantly reduces this collusion to absurdity.
Before and after Darwin, the specter of the animal in man has been compensated for by a hierarchical scheme that separates our angelic nature from our merely circumstantial, and hopefully temporary, beastly one. And we find more or less the same separation in medieval Christian theology, Romantic nature poetry, and current cognitive science: All of it aims to distinguish the merely animal in us from the properly human. Thus Thoreau, widely lauded as a friend of the animals, cannot refrain from invoking animality as something to be overcome: “Men think that it is essential,” he writes, “that the Nation have commerce, and export ice, and talk through a telegraph, and ride thirty miles an hour, without a doubt, whether they do or not; but whether we should live like baboons or like men, is a little uncertain.” What the author of Walden misses is that men might be living like baboons not because they are failing at something or other, but because they are, in fact, primates. Thoreau can’t help invoking the obscene and filthy beasts that have, since classical antiquity, formed a convenient contrast to everything we aspire to be.
The best evidence suggests that this hatred of animals—there’s no other word for it, really—is a feature of only certain kinds of society, though societies of this kind have dominated for so long that the hatred now appears universal. Until the decisive human victory over other predatory megafauna several thousand years ago, and the subsequent domestication of certain large animals, the agricultural revolution, the consequent stratification of society into a class involved with food production and another, smaller class that traded in texts and values: Until these complex developments were well under way, human beings lived in a single community with animals, a community that included animals as actors and as persons.
In that world, animals and human beings made up a single socio-natural reality. They killed one another, yes, but this killing had nothing in common with the industrial slaughter of domestic animals we practice today: Then, unlike now, animals were killed not because they were excluded from the community, but because they were key members of it. Animals gave themselves for the sake of the continual regeneration of the social and natural order, and in return were revered and treated as kin.
As human beings abandoned community for domination, thinking with animals became a matter of symbolism. Bears showed up on coats of arms, for example, not because the warriors who fought behind these shields were fighting as bears, as magically transformed ursine warriors. They were fighting behind the bear shield simply because that’s what their clan chose, as today one might choose Tasmanian Devil mudflaps for one’s truck. It was an ornament, a mere symbol…