The Withering of the Affluent Society: Though Americans see upward mobility as their birthright, that assumption faces growing challenges, with consequences not just for the size of our wallets but for the tenor of our politics
August 14, 2012
The future of affluence is not what it used to be. Americans have long believed—it’s part of our national character—that our economic well-being will constantly increase. We see ourselves as a striving, inventive, and pragmatic people destined for higher living standards. History is a continuum of progress, from Robert Fulton’s steamboat to Henry Ford’s assembly line to Bill Gates’ software. Every generation will live better than its predecessors.
Well, maybe not.
For millions of younger Americans—say, those 40 and under—living better than their parents is a pipe dream. They won’t. The threat to their hopes does not arise from an impending collapse of technological gains of the sort epitomized by the creations of Fulton, Ford, and Gates. These advances will almost certainly continue, and per capita income—the average for all Americans and a conventional indicator of living standards—will climb. Statistically, American progress will resume. The Great Recession will be a bump, not a dead end.
The trouble is that many of these gains will bypass the young. The increases that might have fattened their paychecks will be siphoned off to satisfy other groups and other needs. Today’s young workers will have to finance Social Security and Medicare for a rapidly growing cohort of older Americans. Through higher premiums for employer-provided health insurance, they will subsidize care for others. Through higher taxes and fees, they will pay to repair aging infrastructure (roads, bridges, water systems) and to support squeezed public services, from schools to police.
The hit to their disposable incomes would matter less if the young were major beneficiaries of the resultant spending. In some cases—outlays for infrastructure and local services—they may be. But these are exceptions. By 2025 Social Security and Medicare will simply reroute income from the nearly four-fifths of the population that will be under 65 to the older one-fifth. And health care spending at all age levels is notoriously skewed: Ten percent of patients account for 65 percent of medical costs, reports the Kaiser Family Foundation. Although insurance provides peace of mind, the money still goes from young to old: Average health spending for those 45 to 64 is triple that for those 18 to 24.
The living standards of younger Americans will almost certainly suffer in comparison to those of their parents in a second crucial way. Our notion of economic progress is tied to financial security, but the young will have less of it. What good are higher incomes if they’re abruptly revoked? Though it wasn’t a second Great Depression, the Great Recession was a close call, shattering faith that modern economic policies made broad collapses impossible. Except for the savage 1980-82 slump, post-World War II recessions had been modest. Only a minority of Americans had suffered. By contrast, the Great Recession hurt almost everyone, through high unemployment, widespread home foreclosures, huge wealth losses in stocks and real estate—and fears of worse. A 2012 Gallup poll found that 68 percent of Americans knew someone who had lost a job.
The prospect of downward mobility is not just dispiriting. It assails the whole post–World War II faith in prosperity. Beginning in the 1950s, commentators celebrated the onrush of abundance as marking a new era in human progress. In his 1958 bestseller The Affluent Society, Harvard economist John Kenneth Galbraith announced the arrival of a “great and unprecedented affluence” that had eradicated the historical “poverty of the masses.”
Economic growth became a secular religion that was its own reward. Perhaps its chief virtue was that it dampened class conflict. In The Great Leap: The Past Twenty-Five Years in America (1966), John Brooks observed, “The middle class was enlarging itself and ever encroaching on the two extremes”—the very rich and the very poor. Business and labor could afford to reconcile because both could now share the fruits of expanding production. We could afford more spending on public services (education, health, environmental protection, culture) without depressing private incomes. Indeed, that was Galbraith’s main theme: Our prosperity could and should support both.
To be sure, there were crises of faith, moments when economic progress seemed delayed or doomed. The longest lapse occurred in the 1970s, when double-digit inflation spawned pessimism and frequent recessions, culminating in the 1980-82 downturn. Monthly unemployment peaked at 10.8 percent. But after Federal Reserve chairman Paul Volcker and President Ronald Reagan took steps to suppress high inflation, faith returned.
Now, it’s again imperiled. A 2011 Gallup poll found that 55 percent of Americans didn’t think their children would live as well as they did, the highest rate ever. We may face a crimped and contentious future.
Let’s be clear: The prospect is not national impoverishment but relative deprivation. Even if disposable per capita income fell 10 percent—an extreme outcome—Americans would remain wealthy by any historical standard. Such a change would entail a decline in annual disposable income per person from $37,000 to $33,300 (in 2011 inflation-adjusted dollars), probably spread over many years. People might adjust in ways that barely affected daily routines. They might live in slightly smaller houses, drive more fuel-efficient vehicles, or eat out a bit less. These are inconveniences, not tragedies…
The Future of U.S. Health Care: With the Affordable Care Act here to stay, what are the prospects for Medicaid, employer-based insurance, and single-payer?
August 14, 2012
The Affordable Care Act is a monumental accomplishment. Thanks to its expansion of health care coverage and new regulations, tens of millions of Americans will feel more secure, knowing that they can seek medical attention when they need it and that they will be protected from the insurance industry’s most egregious practices.
But the reform was very much limited by the American terms of the debate, particularly the enduring belief that markets are always more efficient than government (even though our current private insurance system demonstrates otherwise) and the conviction that any changes to the arrangements of the insured cannot fly. The result is a sprawling, confusing, Gorgon-headed workaround, whose beneficial features are difficult for the typical consumer to discern.
Now that the Act has run the Supreme Court gauntlet, how will it affect the structure and politics of health care going forward? Assuming the ACA survives Republican repeal attempts, does it represent a large step toward a single-payer system? What will happen to the employer-provided sector of health insurance? Will health insurance in the United States settle into a pattern of competing private plans? And will some states really opt out of the ACA’s Medicaid expansion and forgo billions of federal dollars?
Prognostication is always a dangerous business—witness the many observers who thought the Supreme Court would uphold the Medicaid expansion but strike down the individual mandate, the opposite of the Court’s ruling. Policy changes of this magnitude are especially difficult to assess, since they take years to unfold, with all the legislative tweaks and developments that are sure to follow. As political scientist Eric Patashnik has pointed out, Social Security was reshaped by decades of amendments and changes before it took the form we know today. A recent Congressional Budget Office report on the ACA admits that the legislation contains so many moving parts, and the changes are so consequential, that prediction is difficult. With this caveat, I offer the following thoughts with modesty.
A review of past trends and policy experiences provides some guidance. It suggests that the prospects for a single-payer system are no brighter than they were before the Act was passed. The employer-provided system will likely survive for the foreseeable future, although the forces that have fed its deterioration over time remain in place. What appears most likely is a competing system of private insurers, attractive in that the health exchanges in which they will operate steer subsidies toward lower- and middle-income consumers, less attractive in that private insurers retain many of the administrative inefficiencies of the present system. And we can expect that conservative governors will not be able to refuse the Medicaid expansion indefinitely, as there will be considerable pressure from providers and voters (who will otherwise see their tax dollars flowing out of state) to join.
The Unraveling of Employer-Provided Insurance
Just over half of all Americans get health insurance through an employer (another 30 percent are covered through public insurance programs such as Medicare and Medicaid, and 16 percent have no health insurance). The employer-based system has been deteriorating for a long time: rising costs have led fewer and fewer employers to offer health insurance, pushing more and more employees into the ranks of the uninsured. This trend was a key motivation behind health reform in the first place. The ACA and the similar 2006 Massachusetts reform that preceded it both contain a mandate requiring employers of a certain size to offer health insurance to employees or pay a fine ($295 per employee in Massachusetts, $2,000 per employee nationally under the ACA).
At first glance, we might expect employers to drop health insurance: the cost of the fine is well below the cost of offering insurance, and employees will still be able to get insurance, with subsidies in many cases, on the exchanges established by the ACA. However, the Massachusetts experience, along with projections for the ACA, suggests this outcome is unlikely. In Massachusetts, employer-based coverage increased slightly after health care reform was implemented. Projected declines under the ACA are modest. A March 2012 Congressional Budget Office study of the ACA estimates that from 2019 through 2022, three to five million fewer non-elderly persons will have employer-provided insurance compared to a pre-ACA baseline. Similarly, the Urban Institute estimated that employer-provided insurance coverage would have declined by 500,000 people if the ACA had been fully implemented in 2010, while the Lewin Group predicted a decline of three million if the law had been in place in 2011. A RAND analysis predicted that employer-based coverage would actually increase by four million in 2016 thanks to the ACA.
Why would employer-based coverage endure when the employer fine is so low? As the CBO points out, the ACA and the market for employees present a complex web of incentives; there is more for employers to think about than the fine. Chiefly, businesses have to compete for employees, and health insurance has proven one of the most desirable benefits over the decades. Businesses that do not offer health insurance have to raise cash compensation to compete, and that extra compensation has to be worth more than a comparable amount of health insurance, since wages and salaries are taxed while health insurance benefits are not.
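To see why the tax treatment matters, here is a back-of-the-envelope sketch of the wedge it creates. The $10,000 premium and 25 percent marginal tax rate are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope sketch of the tax wedge described above. The
# $10,000 premium and 25 percent marginal tax rate are illustrative
# assumptions, not figures from the article.

def wage_equivalent(insurance_value: float, marginal_tax_rate: float) -> float:
    """Pre-tax wages needed to leave an employee with the same after-tax
    value as tax-free health insurance worth `insurance_value`."""
    return insurance_value / (1.0 - marginal_tax_rate)

premium = 10_000   # annual value of an employer-paid health plan (assumed)
tax_rate = 0.25    # employee's combined marginal tax rate (assumed)

print(f"Tax-free insurance value:  ${premium:,.0f}")
print(f"Wages needed to match it:  ${wage_equivalent(premium, tax_rate):,.0f}")
# -> roughly $13,333: dropping coverage means paying more than the premium
#    itself in extra cash just to keep total compensation competitive.
```

The higher the employee's marginal rate, the larger the cash substitute has to be, which is one reason the tax exclusion continues to prop up the employer-based system.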
Americans’ deep suspicion of government makes the prospects of a single-payer system very poor.
That said, employers’ incentives to drop health insurance might grow over time, particularly because of the so-called “Cadillac tax” on high-value health plans. The tax break for employer-provided health insurance, in which neither employers nor employees pay taxes on health insurance premiums, currently has no cap. This encourages the adoption of elaborate insurance plans, which, in turn, create incentives to overconsume health care. Beginning in 2018, a 40 percent excise tax will be imposed on the value of employer-provided plans above a threshold of $10,200 for individual coverage and $27,500 for family coverage. This provision was included in the ACA to cut costs and to discourage both employers and employees from choosing such plans. Many current plans will reach the thresholds between now and 2018, and after that date, more and more plans will be subject to the tax since the thresholds will be indexed to the Consumer Price Index, which rises more slowly than medical inflation. In essence, the Cadillac tax is a gradual repeal of the government’s enormous subsidization of employer-provided health care…
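For a sense of how the excise tax would bite, here is a minimal sketch of the arithmetic. The plan values are hypothetical; the 40 percent rate and the $10,200/$27,500 thresholds are the 2018 figures cited above.

```python
# Minimal sketch of the excise-tax arithmetic described above. The plan
# values are hypothetical; the 40 percent rate and the $10,200 / $27,500
# thresholds are the 2018 figures cited in the article.

CADILLAC_RATE = 0.40
THRESHOLDS = {"individual": 10_200, "family": 27_500}

def cadillac_tax(plan_value: float, coverage: str) -> float:
    """Excise tax owed on the portion of a plan's value above the threshold."""
    excess = max(0.0, plan_value - THRESHOLDS[coverage])
    return CADILLAC_RATE * excess

print(cadillac_tax(12_000, "individual"))  # 0.40 * 1,800 = 720.0
print(cadillac_tax(30_000, "family"))      # 0.40 * 2,500 = 1000.0
```

Because the thresholds grow with the CPI while premiums grow with medical inflation, more and more of a typical plan's value falls into the taxable excess over time.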
When a moth flies at night, it uses the moon and the stars to steer a straight path. Those light sources are fixed and distant, so the rays always strike the moth’s multilensed eyes at the same angle, making them reliable for nocturnal navigation. But introduce something else bright—a candle, say, or a campfire—and there will be trouble. The light radiates outward, confusing the moth and causing it to spiral ever closer to the blaze until the insect meets a fiery end.
For years Richard Dawkins has used the self-immolation of moths to explain religion. The example can be found in his 2006 best seller, The God Delusion, and it’s been repeated in speeches and debates, interviews and blog posts. Moths didn’t evolve to commit suicide; that’s an unfortunate byproduct of other adaptations. In much the same way, the thinking goes, human beings embrace religion for unrelated cognitive reasons. We evolved to search for patterns in nature, so perhaps that’s why we imagine patterns in religious texts. Instead of being guided by the light, we fly into the flames.
The implication—that religion is basically malevolent, that it “poisons everything,” in the words of the late Christopher Hitchens—is a standard assertion of the New Atheists. Their argument isn’t just that there probably is no God, or that intelligent design is laughable bunk, or that the Bible is far from inerrant. It’s that religion is obviously bad for human beings, condemning them to ignorance, subservience, and endless conflict, and we would be better off without it.
But would we?
Before you can know for sure, you have to figure out what religion does for us in the first place. That’s exactly what a loosely affiliated group of scholars in fields including biology, anthropology, and psychology is working on. They’re applying evolutionary theory to the study of religion in order to discover whether or not it strengthens societies, makes them more successful, more cooperative, kinder. The scholars, many of them atheists themselves, generally look askance at the rise of New Atheism, calling its proponents ignorant, fundamentalist, and, worst of all, unscientific. Dawkins and company have been no more charitable in return.
While the field is still young and fairly small—those involved haven’t settled on a name yet, though “evolutionary religious studies” gets thrown around—its findings could reshape a very old debate. Maybe we should stop asking whether God exists and start asking whether it’s useful to believe that he does.
Let’s say someone gives you $10. Not a king’s ransom, but enough for lunch. You’re then told that you can share your modest wealth with a stranger, if you like, or keep it. You’re assured that your identity will be protected, so there’s no need to worry about being thought miserly. How much would you give?
If you’re like most people who play the so-called dictator game, which has been used in numerous experiments, you will keep most of the money. In one recent experiment, reported in a paper with the ominous title “God Is Watching You,” the average subject gave $1.84. Meanwhile, another group of subjects was presented with the same choice but was first asked to unscramble a sentence that contained words like “divine,” “spirit,” and “sacred.”
The second group of subjects gave an average of $4.22, with a solid majority (64 percent) giving at least five bucks. A heavenly reminder seemed to make subjects significantly more magnanimous. In another study, researchers found that prompting subjects with the same vocabulary made some more likely to volunteer for community projects. Intriguingly, not all of them: Only those who had a specific dopamine receptor variant volunteered more, raising the possibility that religion doesn’t work for everybody.
A similar experiment was conducted on two Israeli kibbutzes. The scenario was more complicated: Subjects were shown an envelope containing 100 shekels (currently about $25). They were told that they could choose to keep as much of the money as they wished, but that another member of the kibbutz was being given the identical option. If the total requested by the participants (who were kept separated) exceeded 100 shekels, both walked away with nothing. If the total was less than or equal to 100, each kept the amount requested, plus a bonus based on what was left over.
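The payoff rule can be sketched in a few lines. The article says only that players receive "a bonus based on what was left over," so the even split of the leftover below is purely an illustrative assumption, as are the sample requests.

```python
# A minimal sketch of the payoff rule described above, for two players.
# The article says only that players receive "a bonus based on what was
# left over"; the even split of the leftover here is an assumption made
# for illustration, and the sample requests are made up.

POT = 100  # shekels in the envelope

def payoffs(request_a: int, request_b: int) -> tuple[float, float]:
    """Each player keeps what he or she requested plus a share of the leftover,
    unless the combined requests exceed the pot, in which case both get nothing."""
    total = request_a + request_b
    if total > POT:
        return (0.0, 0.0)
    bonus = (POT - total) / 2  # assumed even split of whatever remains
    return (request_a + bonus, request_b + bonus)

print(payoffs(30, 40))   # restrained requests leave a leftover bonus -> (45.0, 55.0)
print(payoffs(60, 60))   # over-claiming the pot wipes out both payoffs -> (0.0, 0.0)
```

Restraint is rewarded only if both players exercise it, which is what makes the game a measure of trust and cooperation.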
The kicker is that one of the kibbutzes was secular and one was religious. Turns out, the more-devout members of the religious kibbutz, as measured by synagogue attendance, requested significantly fewer shekels and expected others to do the same. The researchers, Richard Sosis and Bradley Ruffle, ventured that “collective ritual has a significant impact on cooperative decisions.”
See also a study that found that religious people were, in some instances, more likely to treat strangers fairly. Or the multiple studies suggesting that people who were prompted to think about an all-seeing supernatural agent were less likely to cheat. Or the study of 300 young adults in Belgium that found that those who were religious were considered more empathetic by their friends.
The results of other studies are less straightforward. A Harvard Business School researcher discovered that religious people were more likely to give to charity, but only on the days they worshiped, a phenomenon he dubbed the “Sunday Effect.” Then there’s the survey of how belief in the afterlife affected crime rates in 67 countries. Researchers determined that countries with high rates of belief in hell had less crime, while in those where the belief in hell was low and the belief in heaven high, there was more crime. A vengeful deity is better for public safety than a merciful one…