July 19, 2012
In the 13th century, the Chinese emperor Kublai Khan embarked on a bold experiment. China at the time was divided into different regions, many of which issued their own coins, discouraging trade within the empire. So Kublai Khan decreed that henceforth money would take the form of paper.
It was not an entirely original idea. Earlier rulers had sanctioned paper money, but always alongside coins, which had been around for centuries. Kublai’s daring notion was to make paper money (the chao) the dominant form of currency. And when the Italian merchant Marco Polo visited China not long after, he marveled at the spectacle of people exchanging their labor and goods for mere pieces of paper. It was as if value were being created out of thin air.
Kublai Khan was ahead of his time: He recognized that what matters about money is not what it looks like, or even what it’s backed by, but whether people believe in it enough to use it. Today, that concept is the foundation of all modern monetary systems, which are built on nothing more than governments’ support of and people’s faith in them. Money is, in other words, a complete abstraction—one that we are all intimately familiar with but whose growing complexity defies our comprehension.
Today, many people long for simpler times. It’s a natural reaction to a world in which money is becoming not just more abstract but more digital and virtual as well, in which sophisticated computer algorithms execute microsecond market transactions with no human intervention at all, in which below-the-radar economies are springing up around their own alternative currencies, and in which global financial crises are brought on for reasons difficult to parse without a Ph.D. Back in the day, the thinking goes, money stood for something: Gold doubloons and cowrie shells had real value, and so they didn’t need a government to stand behind them.
In fact, though, money has never been that simple. And while its uses and meanings have shifted and evolved throughout history, the fact that it is no longer anchored to any one substance is actually a good thing. Here’s why.
Let’s start with what money is used for. Modern economists typically define it by the three roles it plays in an economy:
It’s a store of value, meaning that money allows you to defer consumption until a later date.
It’s a unit of account, meaning that it allows you to assign a value to different goods without having to compare them with one another directly. So instead of saying that a Rolex watch is worth six cows, you can just say that it (or the cows) costs $10,000.
And it’s a medium of exchange—an easy and efficient way for you and me and others to trade goods and services with one another.
All of these roles have to do with buying and selling, and that’s how the modern world thinks of money—so much so that it seems peculiar to conceive of money in any other way.
Yet in tribal and other “primitive” economies, money served a very different purpose—less a store of value or medium of exchange, much more a social lubricant. As the anthropologist David Graeber puts it in his recent book Debt: The First 5,000 Years (Melville House, 2011), money in those societies was a way “to arrange marriages, establish the paternity of children, head off feuds, console mourners at funerals, seek forgiveness in the case of crimes, negotiate treaties, acquire followers.” Money, then, was not for buying and selling stuff but for helping to define the structure of social relations.
How, then, did money become the basis of trade? By the time money makes its first appearance in written records, in Mesopotamia during the third millennium B.C.E., that society already had a sophisticated financial structure in place, and merchants were using silver as a standard of value to balance their accounts. But cash was still not widely used.
It’s really in the seventh century B.C.E., when the small kingdom of Lydia introduced the world’s first standardized metal coins, that you start to see money being used in a recognizable way. Located in what is now Turkey, Lydia sat on the cusp between the Mediterranean and the Near East, and commerce with foreign travelers was common. And that, it turns out, is just the kind of situation in which money is quite useful.
To understand why, imagine doing a trade in the absence of money—that is, through barter. (Let’s leave aside the fact that no society has ever relied solely or even largely on barter; it’s still an instructive concept.) The chief problem with barter is what economist William Stanley Jevons called the “double coincidence of wants.” Say you have a bunch of bananas and would like a pair of shoes; it’s not enough to find someone who has some shoes or someone who wants some bananas. To make the trade, you need to find someone who has shoes he’s willing to trade and wants bananas. That’s a tough task.
With a common currency, though, the task becomes easy: You just sell your bananas to someone in exchange for money, with which you then buy shoes from someone else. And if, as in Lydia, you have foreigners from whom you’d like to buy or to whom you’d like to sell, having a common medium of exchange is obviously valuable. That is, money is especially useful when dealing with people you don’t know and may never see again.
The Lydian system’s breakthrough was the standardized metal coin. Made of a gold-silver alloy called electrum, one coin was exactly like another—unlike, say, cattle. Also unlike cattle, the coins didn’t age or die or otherwise change over time. And they were much easier to carry around. Other kingdoms followed Lydia’s example, and coins became ubiquitous throughout the Mediterranean, with kingdoms stamping their insignia on the coins they minted. This had a dual effect: It facilitated the flow of trade, and it established the authority of the state.
Modern governments still like to place their stamp upon money, and not just on bills and coins. In general, they prefer that money—whether physical cash or digital—be issued and controlled only by official entities and that financial transactions (especially international ones) be traceable. And so the recent rise of an alternative currency like Bitcoin [see “The Cryptoanarchists’ Answer to Cash,” in this issue], which is based on a cryptographic code that allows for anonymous transactions and that so far has proved to be uncrackable, is the kind of thing that tends to make governments very unhappy.
The spread of money throughout the Mediterranean didn’t mean that it was universally used. Far from it. Most people were still subsistence farmers and existed largely outside the money economy.
But as money became more common, it encouraged the spread of markets. This, in fact, is one of the enduring lessons of history: Once even a small part of your economy is taken over by markets and money, they tend to colonize the rest of the economy, gradually forcing out barter, feudalism, and other economic arrangements. In part this is because money makes market transactions so much easier, and in part because using money seems to redefine what people value, pushing them to view things in economic, rather than social, terms.
Governments were quick to embrace hard currency because it facilitated the collection of taxes and the building of military forces. In the third century B.C.E., with the rise of Rome, money became an important tool for unifying and expanding the empire, reducing the costs of trade, and funding the armies that kept the emperors in power.
The decline of the Roman Empire, starting in the third century C.E., saw a decline in the use of money as well, at least in the West. Parts of the former empire, like Britain, simply stopped using coins. Elsewhere people still used money to balance accounts and keep track of debts, and many small kingdoms minted their own coins. But in general, the circulation of money became less central, as cities shrank in size and commerce dwindled…
July 19, 2012
One of the great achievements of Western civilization is what we commonly call “the rule of law.” By this we mean the basic principles of fairness and due process that govern the application of power in both the public and the private spheres. The rule of law requires that all disputes — whether among private parties or between private parties and the state — be tried before neutral judges, under rules that are known and articulated in advance. Every party must have notice of the charge against him and an opportunity to be heard in response; each governing rule must be consistent with all the others, so that no person is forced to violate one legal requirement in order to satisfy a second. In the United States, our respect for such principles has made our economy the world’s strongest, and our citizens the world’s freest.
Though we may take it for granted, the rule of law is no easy thing to create and preserve. Dictators and petty despots of all sorts will rebel against these constraints in order to exercise dominion over the lives and fortunes of their subjects. But anyone, of any political persuasion, who thinks of government as the servant of its citizens — not their master — will recognize that compliance with the rule of law sets a minimum condition for a just legal order.
That, however, is precisely where the difficulties begin — for minimum conditions by themselves are not enough. Law is not just an idealized system of rules: It also involves the public administration of those rules by a wide range of elected and appointed officials in an endless array of particular circumstances. For those who would defend a just legal order, the basic challenge is to strike a proper balance between limiting the discretion of these officials, so that they do not undermine the rule of law, and allowing them enough leeway to perform their essential roles.
Lately in America, we have done a poor job of preserving this balance. In practice — and, increasingly, in legal theory — government officials have been given unprecedented ability to make exceptions to the law, both in enforcing it and in respecting the rights granted under it. Indeed, the past year has seen two of the most sweeping pieces of legislation in U.S. history — the Patient Protection and Affordable Care Act and the Wall Street Reform and Consumer Protection Act — make the imbalance far worse. Both laws seek to dramatically transform vast swaths of the American economy; both give enormous power to the government to bring about these transformations. And yet both laws are stunningly silent on exactly how these overhauls are to take place. The vague language of these statutes delegates blanket authority to government officials who will, effectively, make the rules up as they go along.
As these officials stumble through how to implement these sprawling new laws, they will inevitably come up against unanticipated obstacles (or powerful interests) that will demand exceptions to the statutes’ far-reaching provisions. In some cases, special benefits or permissions releasing companies from government regulations will simply be granted. In others, the releases will be provided only if the regulated parties agree to waive some legal protection to which they would otherwise be entitled.
Neither of these practices — providing waivers or demanding waivers — is necessarily pernicious. Indeed, in some cases, they are part and parcel of the ordinary course of business in the modern administrative state. But both are open to abuse, and that abuse makes for a particularly dangerous form of government power.
After all, people concerned for their freedom and rights are always most alert to threats that arise when governments (or other powerful institutions) force us to do what we don’t want to do. The power of coercion is relatively easy to define, to identify, and to resist. But we are not sufficiently alert to the flip side of this problem: the risks that come with the power to create exceptions and to grant dispensations. Indeed, this is a much more subtle, insidious assault by government: Rather than setting the state and the private sector against each other in a healthy tension, it fuses them, making the private sphere dependent on the government’s benevolence. And when currying the favor of capricious government officials is required for a person’s well-being or a firm’s very existence, government abuse becomes nearly impossible to oppose.
“Government by waiver” is thus among the most serious challenges to the rule of law in our time.
WAIVERS AND THE MODERN STATE
The issue of government by waiver arises in any system of public administration. But as the size of the state has expanded dramatically, the scope of the problem has grown right along with it.
Under the traditional classical-liberal model of limited government, the state has a few critical, but well-defined, objectives — each directed toward controlling the use of force and facilitating voluntary agreements among private parties. Together, they ensure that the rules of the road are clear and knowable to all individuals, and in turn two felicitous consequences follow: First, individuals who know their rights are able to take easy steps to avoid getting enmeshed with the law; second, clear rules make it easier to monitor the conduct of public officials. Thus, the more limited the scope of government, the fewer difficulties there are in controlling the discretion of its officers.
Prior to the rise of the modern administrative state, the delineation of property rights and the enforcement of contracts were governed primarily by such simple common-law rules. Each person was expected to forbear from any physical invasion of the person or property of another, which meant that simple conformity to a bright-line rule could keep most people out of mischief. That one rule was easy to understand, and its content did not vary with the number of people in society, their personal characteristics, or their levels of wealth. Exchanges between individuals, meanwhile, could be conducted through voluntary agreements, which tended to clarify rights and preserve flexibility. The risk of abuse was low, because everyone could pick the parties with whom he chose to deal and the terms on which he wished to interact. Thus labor contracts were often written “at will,” which meant that an employer could hire and fire as he chose, and a worker could accept employment or quit his job whenever he wanted.
These rules were, of course, subject to exceptions that dealt with duress, fraud, non-disclosure, incompetence, and undue influence. They left open areas such as child labor, where there can be genuine differences over whether legal limits in fact protect children from parental exploitation or deny families the opportunity to work their way out of poverty. But such exceptions wane in importance in any regime that uses voluntary exchange to achieve economic growth.
Most important for the rule of law, such a system requires little administrative oversight to run. When disputes arise, judicial determinations are generally easy to make, because everything turns on readily verifiable public acts. Moreover, the public enforcement of laws — even when it requires some case-by-case discretion — involves relatively modest functions. Issuing building permits, for instance, is limited to straightforward matters of health and safety — touching only such issues as falling objects or traffic interference (as opposed to, say, more vague interests like “neighborhood character,” “economic development,” and “urban renewal”). The constrained range of tasks assigned to government officials thus leaves more decisions in private hands, where individual choice is driven by individual preference or by competitive market conditions (rather than by government command). Under such circumstances, the harmful pressures exerted on the rule of law are very low.
But the modern state does not limit itself to these defined objectives. Ostensibly desirous of assuring the welfare of its citizens, today’s state claims that it must take a far greater role in the life of the nation. This conception of politics comes to see such welfare — measured in terms of material well-being and access to some essential goods like housing, health care, and education — as a right to be guaranteed to the people by their government. Providing for every such right requires resources, which must be obtained by the state through taxation and mandates. Since no regime of positive rights can repeal the iron law of scarcity — which dictates that the provision of goods to some people necessarily imposes correlative burdens on others — a regime of positive rights is a regime of very demanding duties and requirements.
The question, then, is whether these duties and requirements must be enforced in absolutely all circumstances. To any reasonable observer, the answer is surely not: The immense range of circumstances that present themselves in a huge, complex society means that there will always be hard cases calling for exceptions. Sometimes various requirements have to be waived to avoid unreasonable hardship, and sometimes various requirements have to be waived for the system to function at all.
But can such waivers be made compatible with the rule of law?…
Once Upon a Time: The lure of the fairy tale and really violent stories designed to scare us to death
July 19, 2012
In Grimms’ Fairy Tales there is a story called “The Stubborn Child” that is only one paragraph long. Here it is, in a translation by the fairy-tale scholar Jack Zipes:
Once upon a time there was a stubborn child who never did what his mother told him to do. The dear Lord, therefore, did not look kindly upon him, and let him become sick. No doctor could cure him and in a short time he lay on his deathbed. After he was lowered into his grave and covered over with earth, one of his little arms suddenly emerged and reached up into the air. They pushed it back down and covered the grave with fresh earth, but that did not help. The little arm kept popping out. So the child’s mother had to go to the grave herself and smack the little arm with a switch. After she had done that, the arm withdrew, and then, for the first time, the child had peace beneath the earth.
This story, with its unvarnished prose, should be clear, but it isn’t. Was the child buried alive? The unconsenting arm looks more like a symbol. And what about the mother? Didn’t it trouble her to whip that arm? Then we are told that the youngster, after this beating, rested in peace. Really? When, before, he had seemed to beg for life? But the worst thing in the story is that, beyond disobedience, it gives us not a single piece of information about the child. No name, no age, no pretty or ugly. We don’t even know if it is a boy or a girl. (The Grimms used ein Kind, the neuter word for “child.” Zipes decided that the child was a boy.) And so the tale, without details to attach it to anything in particular, becomes universal. Whatever happened there, we all deserve it. A. S. Byatt has written that this is the real terror of the story: “It doesn’t feel like a warning to naughty infants. It feels like a glimpse of the dreadful side of the nature of things.” That is true of very many of the Grimms’ tales, even those with happy endings.
Jacob and Wilhelm Grimm were born to a prosperous couple (the father was a lawyer), Jacob in 1785, Wilhelm in 1786. The family lived in a big house in the Hessian town of Hanau, near Frankfurt, and the boys received a sound primary education at home. But when they were eleven and ten everything changed. Their father died, and the Grimms no longer had any money. With difficulty, the brothers managed to attend a good lyceum and then, as their father would have wished, law school. But soon afterward they began a different project, which culminated in their famous book “Nursery and Household Tales” (“Die Kinder- und Hausmärchen”), first published in two volumes, in 1812 and 1815, and now generally known as Grimms’ Fairy Tales.
The Grimms grew up in the febrile atmosphere of German Romanticism, which involved intense nationalism and, in support of that, a fascination with the supposedly deep, pre-rational culture of the German peasantry, the Volk. Young men fresh from reading Plutarch at university began sharing stories about what the troll said to the woodcutter, and publishing collections of these Märchen, as folk tales were called. That is the movement that the Grimms joined in their early twenties. They had political reasons, too—above all, Napoleon’s invasion of their beloved Hesse, and the installation of his brother Jérôme as the ruler of the Kingdom of Westphalia, a French vassal state. If ever there was a stimulus to German intellectuals’ belief in a German people that was culturally and racially one, and to the hope of a politically unified Germany, this was it.
Two things sustained the Grimms. First, their bond as brothers. For most of their lives, they worked in the same room, at facing desks. Biographers say that they had markedly different personalities—Jacob was difficult and introverted, Wilhelm easygoing—but this probably drew them closer. Wilhelm, when he was in his late thirties, made bold to get married, but the lady in question simply moved into the brothers’ house and, having known them for decades, made the domestic operations conform to their work schedule.
That was their other lodestar: their work. Eventually, their specialties diverged somewhat. Wilhelm remained faithful to folklore, and it was he who, after the second edition of “Household Tales” (1819), did all the editorial work on the later editions, the last of which was published in 1857. Jacob branched out into other areas of German history. Independently, Jacob wrote twenty-one books; Wilhelm, fourteen; the two men in collaboration, eight—a prodigious output. Though their most popular and enduring book was “Household Tales,” they were serious philologists, and, in the last decades of their lives, what they cared about most was their German Dictionary, a project on the scale of the Oxford English Dictionary. Wilhelm died at seventy-three. Jacob carried on for four years, and brought the dictionary up to “F.” Then he, too, died. Later scholars finished the book.
There are two varieties of fairy tales. One is the literary fairy tale, the kind written, most famously, by Charles Perrault, E. T. A. Hoffmann, and Hans Christian Andersen. Such tales, which came into being at the end of the seventeenth century, are original literary works—short stories, really—except that they have fanciful subject matter: unhappy ducks, princesses who dance all night, and so on. To align the tale with the hearthside tradition, the author may also employ a certain naïveté of style. The other kind of fairy tale, the ancestor of the literary variety, is the oral tale, whose origins cannot be dated, since they precede recoverable history. Oral fairy tales are not so much stories as traditions. In the words of the English novelist Angela Carter, who wrote some thrilling Grimm-based stories, asking where a fairy tale came from is like asking who invented the meatball. Every narrator reinvents the tale. The historian Robert Darnton compares the oral tale tellers to the Yugoslavian bards studied in the twentieth century by Albert Lord and Milman Parry, in the effort to understand how the Homeric epics were composed. The premodern tale tellers might also be thought of as descendants of the scops of the Anglo-Saxon Dark Ages or of the griots of West Africa, men whose job it was to carry stories. But scholars tend to associate fairy tales with women, at home, telling stories to one another to relieve the tedium of repetitive tasks such as spinning (which often turns up in these narratives). Each woman would add or subtract a little of this and that, and so the story changed.
In the Grimms’ time, industrialization was starting to simplify or eliminate certain domestic chores. For that reason, among others, the oral tale was beginning to disappear. Intellectuals considered this a disaster. Hence the many fairy-tale collections of the period, including the Grimms’. They were rescue operations. The Grimms, in the introduction to their first edition, assert that almost all their material was “collected” from oral traditions of their region and is “purely German in its origins.” This suggests that the tales were supplied by humble people, and the brothers say that their primary source, Dorothea Viehmann, was a peasant woman from a village near Kassel. They claim that they did not change what Viehmann or the others said: “No details have been added or embellished.”
Much of this was not true. The people who supplied the first-edition tales were largely middle class: the brothers’ relatives, friends, and friends of friends. As for Viehmann, she was not a peasant but the wife of a tailor. She was also a Huguenot. In other words, her culture was basically French, and she was no doubt well acquainted with French literary fairy tales, Perrault’s and others’. So much for the material’s being “purely German in its origins.” But at least Viehmann was an oral source. Many items in the Grimms’ first edition came not from interviewees but from other fairy-tale collections.
Most important, the brothers, especially Wilhelm, revised the tales thoroughly, making them more detailed, more elegant, and more Christian, as one edition followed another. In the process, the stories sometimes doubled in length. The folklore scholar Maria Tatar supplies three sentences from the brothers’ original draft of “Briar Rose,” which we call “The Sleeping Beauty”:
[Briar Rose] pricked her finger with the spindle and immediately fell into a deep sleep. The king and his retinue had just returned and they too, along with the flies on the wall and everything else in the castle, fell asleep. All around the castle grew a hedge of thorns, concealing everything from sight.
And here, after seven successive revisions, is how that passage reads in the final edition of “Household Tales”:
[Briar Rose] took hold of the spindle and tried to spin. But no sooner had she touched the spindle than the magic spell took effect, and she pricked her finger with it. The very moment that she felt the prick she sank down into the bed that was right there and fell into a deep sleep. And that sleep spread throughout the entire palace. The king and the queen, who had just come home and entered the great hall, fell asleep, and the whole court with them. The horses fell asleep in the stables, the dogs in the courtyard, the pigeons on the roof, and the flies on the wall. Even the fire that had been flaming on the hearth stopped and went to sleep, and the roast stopped crackling, and the cook, who was about to pull the kitchen boy’s hair because he had done something wrong, let him go and fell asleep. And the wind died down and not a single little leaf stirred on the trees by the castle.
All around the castle a briar hedge began to grow. Each year it grew higher, and finally it surrounded the entire castle and grew so thickly beyond it that not a trace of the castle was to be seen, not even the flag on the roof….