October 4, 2011
Until very recently, the role of father was one of great respect in our culture, and the image of the good father was a source of societal integration, or at least one of widespread social agreement. Mass entertainment, including movies and television, generally supported the idea that aspiring to be a good father was something noble as well as commonplace and accessible. The father was loved, trusted and revered. The image of the good father was everywhere. On television we had Father Knows Best, My Three Sons, Make Room For Daddy, Leave It To Beaver, Bonanza and so on. In movies the quintessential good father was often played by Gregory Peck (The Yearling, To Kill a Mockingbird), but the good father was also found in Westerns (Van Heflin in Shane, Jimmy Stewart in Shenandoah). This began to change during the sixties, when the father of the family became the butt of jokes in television situation comedies (Archie Bunker), and the trend has continued ever since, with a brief revival of the good father in the 1980s with The Cosby Show. All the while, the image of the bad father was becoming more commonplace (Married with Children), even if it was quite shocking at first (Christopher Walken in At Close Range).
Today, fathers, like priests, are automatically suspect. Casey Anthony was able to make an allegation of sexual abuse against her father (with no substantiating evidence) and was believed, at least to some extent, by the jury during her murder trial, because suspicion of fathers is at an all-time high. From Oprah Winfrey’s repressed memories to Kathryn Harrison’s The Kiss to crime dramas in which the innocent-seeming father is revealed to be the villain, these images combine to undermine our trust in fathers. Underneath all this is the loss of faith in the ultimate Father, God, and a loss of trust in God’s fatherly nature, which was once taken as self-evident. God-knowing souls throughout the ages have repeatedly confirmed that God is not simply like a father but acts consistently as a father, a good father, even a perfect father in the lives of the faithful.
Common religious understanding allows that the Heavenly Father bids his children to come to him and provides everything needful for us to do so in complete freedom. I think it is safe to say that coercion has no part in our understanding of the divine plan; in fact, the existence of forced conformity in any belief system may be seen as evidence of its falsity. In truth, we are free at every stage to accept or reject the Father’s leading, in all or in part, to roam away or to return. Every moral decision we make either advances or retards our progress. Human beings may indulge in acts of coercion or forcing conformity on their brethren, but the divine being never does this. The good father respects the free will of his children, for the value of our love for him lies in the very freedom of its bestowal. Love is reduced to nothing if it is not freely given.
It is fashionable today to cite the very fact of our freedom as proof of God’s malevolence. (And if God is malevolent, then it becomes incumbent on us to reject him.) The thinking goes that if God loved us, he would not allow the natural outworking that often results from our own free choosing – cruelty, violence, destruction and death. But the fact that God allows evil to flourish temporarily does not mean he creates evil, unless our philosophy rejects the idea of free will. For in order to save us from ourselves, the Father would be forced to remove our free will, which is the very purpose of our creation, the very thing that makes us valuable and that makes life valuable to us. That God allows the temporary manifestation of evil as the natural result of human freedom does not mean God is evil.
The question then becomes: would there be value inherent in the life of a will-less, computer-minded robot whose worth is found only in its function as a cog in the wheel of divine will? An example of this thinking is found in Islam, where a man’s value is measured by his conformity to, and function within, the Islamic system. The individual human being has no intrinsic value in himself; the individual is sacrificed to the system. In Islam, Allah may be described as a king or a judge, but he cannot be described as a father, much less a good father.
Faith is rightly defined as trust in God. Implicit in this is not only the idea that God is good, but that God is knowable. One cannot develop trust in an unknowable being. Faith may also be defined as having the belief in one’s own ability to know God. If a person doesn’t believe he can know God, he can have no relationship with God and can therefore never develop faith in God. In Islam God is defined as thoroughly transcendent and unknowable, therefore Islam itself cannot properly be defined as a “faith” in the Western sense of the word…
October 4, 2011
Can cities and states reject their long history of fiscal irresponsibility?
A century ago, America’s states and cities faced a crisis in government. A number of commonwealths — California, New York, New Jersey, Wisconsin, and Illinois conspicuous among them — as well as cities across the land labored under the heavy weight of costly and corrupt misrule. This state of affairs was generally blamed on an unholy triple alliance of large corporations (“the trusts”) and other business interests, party bosses and machines, and compliant legislators and officials. Lincoln Steffens, the most prominent muckraking journalist of the day, scathingly described the results in influential magazine articles, gathered together in his books The Shame of the Cities (1904) and The Struggle for Self-Government (1906).
Since then the clock of time has completed a circuit of 100 years and more, and the nation’s states and municipalities again face a crisis in government. A number of commonwealths — yes, California, New York, New Jersey, Wisconsin, and Illinois still conspicuous among them — as well as a number of counties and cities, labor under a heavy weight of debt, deficits, and future obligations. This state of affairs is generally blamed on an Iron Triangle of public employee unions, compliant governors and legislators, and a complaisant electorate.
The current crisis is the product not of contracts, graft, and patronage — the mother’s milk of early-20th-century state and local politics — but of sweetheart salary, pension, and health insurance deals secured by public employee unions: the mother’s milk of early-21st-century politics. The parlous state of American state and local government circa 1900 may properly be seen as a consequence of the rise, admirable in many other respects, of a democratic party politics over the course of the preceding century. The fiscal-deficit crisis of the past several years may properly be seen as a consequence of the rise, admirable in many other respects, of the welfare state.
Progressivism, the movement that rose in reaction to state and local malfeasance a century or so ago, was defined by the dictates of party politics; the clash of social, ideological, and economic interest groups; and the federalism of American government. It both reflected and reinforced the growing legitimacy of the administrative and social welfare state, and a rising discontent with boss-machine-party politics.
There are signs that a reaction is taking shape comparable in its scale and impact to progressivism, but this time aimed at the excesses rather than the insufficiencies of American government. A hundred years ago a generation of governors undertook (with varying degrees of intensity) to combat the unsavory corporate-political machine combines of their time. Among the more conspicuous were Republicans Robert La Follette in Wisconsin, Hiram Johnson in California, Theodore Roosevelt and Charles Evans Hughes in New York, and Democrat Woodrow Wilson in New Jersey. Their counterparts today focus, with varying degrees of intensity, on the fiscal consequences of overgenerous pension and health care arrangements for public employees. Among them: Republicans Scott Walker in Wisconsin, Chris Christie in New Jersey, and Mitch Daniels in Indiana, and Democrats Jerry Brown in California and Andrew Cuomo in New York.
Insofar as they see themselves engaged in an effort to change policies that stem from an abuse of government power and threaten the well-being of the polity, they are very much the descendants of their counterparts a century and more ago. They are contesting the spend-and-tax conventional wisdom of past decades, much as their progressive equivalents took on the reigning assumptions of Gilded Age machine politics.
The core problem is the massive, rapidly growing fiscal burden of current debt and future obligations spawned by public employee pensions and health care. It has become a touchstone public issue. In June 2011, the Hoover Institution’s David Brady and Michael McConnell convened a State and Municipal Fiscal Default Workshop, designed to subject the crisis to concentrated, expert examination. Participants included leading law professors specializing in bankruptcy and contract law, political scientists and historians, and a number of professionals who manage state and local government finance.
The crisis stems from the rise of pension and health care provisions for public employees. In the past, these were commonly seen as fringe benefits. Today they are most often regarded as entitlements. There is a wealth of political and governmental meaning in that verbal shift.
The United States was long notable for its failure to provide such benefits for employees of any sort, private or public. They were more common in Europe, the product of old feudal and new socialist influence. As in so much of modern American history, the Great Depression and the Second World War were the great divide. Widespread private pension and health care coverage came with the war. In the decades after 1945, these provisions grew from inexpensive fringe benefits to ever more costly entitlements in the public as well as the private sector. Out of this emerged the present crisis.
Social Security required the pension recipient, the employer, and the state to contribute to what was defined as a form of paid-for insurance, and hence a vested interest. Wartime wage freezes, tax breaks, and the need to woo scarce workers spurred companies to make entitlements part of their employment contracts, and postwar growth and prosperity, favorable tax treatment, and powerful unions turned them into normal components of employment.
But private sector unions steadily shrank in size and clout during the late 20th century. By the early 2000s, only about eight percent of the private workforce was unionized. When, in the 1980s, the prevailing defined benefit pension plans began to pose a fiduciary threat, firms began to switch to defined contribution plans. The change was helped along by the creation of 401(k) pension plans, the decline of private employee unions, increasing employee mobility, and a mutual fund industry that has been called “the sales force of the defined contribution revolution.” And when the cost of health care rose steeply with the introduction of Medicare/Medicaid, government bore the brunt of the increase.
The public sector was another story. State and local public employee unions won collective bargaining rights in the late 1950s and early 1960s. As their membership grew to 40 percent of the public workforce, so did their political clout. Public sector union leaders, much like party bosses in an earlier time, secured benefits for their constituents and thus added to their political power…
October 4, 2011
My lifespan encompasses the era when the United States of America was capable of launching human beings into space. Some of my earliest memories are of sitting on a braided rug before a hulking black-and-white television, watching the early Gemini missions. This summer, at the age of 51—not even old—I watched on a flatscreen as the last Space Shuttle lifted off the pad. I have followed the dwindling of the space program with sadness, even bitterness. Where’s my donut-shaped space station? Where’s my ticket to Mars? Until recently, though, I have kept my feelings to myself. Space exploration has always had its detractors. To complain about its demise is to expose oneself to attack from those who have no sympathy that an affluent, middle-aged white American has not lived to see his boyhood fantasies fulfilled.
Still, I worry that our inability to match the achievements of the 1960s space program might be symptomatic of a general failure of our society to get big things done. My parents and grandparents witnessed the creation of the airplane, the automobile, nuclear energy, and the computer, to name only a few. Scientists and engineers who came of age during the first half of the 20th century could look forward to building things that would solve age-old problems, transform the landscape, build the economy, and provide jobs for the burgeoning middle class that was the basis for our stable democracy.
The Deepwater Horizon oil spill of 2010 crystallized my feeling that we have lost our ability to get important things done. The OPEC oil shock was in 1973—almost 40 years ago. It was obvious then that it was crazy for the United States to let itself be held economic hostage to the kinds of countries where oil was being produced. It led to Jimmy Carter’s proposal for the development of an enormous synthetic fuels industry on American soil. Whatever one might think of the merits of the Carter presidency or of this particular proposal, it was, at least, a serious effort to come to grips with the problem.
Little has been heard in that vein since. We’ve been talking about wind farms, tidal power, and solar power for decades. Some progress has been made in those areas, but energy is still all about oil. In my city, Seattle, a 35-year-old plan to run a light rail line across Lake Washington is now being blocked by a citizen initiative. Thwarted or endlessly delayed in its efforts to build things, the city plods ahead with a project to paint bicycle lanes on the pavement of thoroughfares.
In early 2011, I participated in a conference called Future Tense, where I lamented the decline of the manned space program, then pivoted to energy, indicating that the real issue isn’t about rockets. It’s our far broader inability as a society to execute on the big stuff. I had, through some kind of blind luck, struck a nerve. The audience at Future Tense was more confident than I that science fiction (SF) had relevance—even utility—in addressing the problem. I heard two theories as to why:
1. The Inspiration Theory. SF inspires people to choose science and engineering as careers. This much is undoubtedly true, and somewhat obvious.
2. The Hieroglyph Theory. Good SF supplies a plausible, fully thought-out picture of an alternate reality in which some sort of compelling innovation has taken place. A good SF universe has a coherence and internal logic that makes sense to scientists and engineers. Examples include Isaac Asimov’s robots, Robert Heinlein’s rocket ships, and William Gibson’s cyberspace. As Jim Karkanias of Microsoft Research puts it, such icons serve as hieroglyphs—simple, recognizable symbols on whose significance everyone agrees.
Researchers and engineers have found themselves concentrating on more and more narrowly focused topics as science and technology have become more complex. A large technology company or lab might employ hundreds or thousands of persons, each of whom can address only a thin slice of the overall problem. Communication among them can become a mare’s nest of email threads and PowerPoints. The fondness that many such people have for SF reflects, in part, the usefulness of an over-arching narrative that supplies them and their colleagues with a shared vision. Coordinating their efforts through a command-and-control management system is a little like trying to run a modern economy out of a Politburo. Letting them work toward an agreed-on goal is something more like a free and largely self-coordinated market of ideas.
SPANNING THE AGES
SF has changed over the span of time I am talking about—from the 1950s (the era of the development of nuclear power, jet airplanes, the space race, and the computer) to now. Speaking broadly, the techno-optimism of the Golden Age of SF has given way to fiction written in a generally darker, more skeptical and ambiguous tone. I myself have tended to write a lot about hackers—trickster archetypes who exploit the arcane capabilities of complex systems devised by faceless others.
Believing we have all the technology we’ll ever need, we seek to draw attention to its destructive side effects. This seems foolish now that we find ourselves saddled with technologies like Japan’s ramshackle 1960s-vintage reactors at Fukushima when we have the possibility of clean nuclear fusion on the horizon. The imperative to develop new technologies and implement them on a heroic scale no longer seems like the childish preoccupation of a few nerds with slide rules. It’s the only way for the human race to escape from its current predicaments. Too bad we’ve forgotten how to do it.
“You’re the ones who’ve been slacking off!” proclaims Michael Crow, president of Arizona State University (and one of the other speakers at Future Tense). He refers, of course, to SF writers. The scientists and engineers, he seems to be saying, are ready and looking for things to do. Time for the SF writers to start pulling their weight and supplying big visions that make sense. Hence the Hieroglyph project, an effort to produce an anthology of new SF that will be in some ways a conscious throwback to the practical techno-optimism of the Golden Age…