The Stone City

Words Made to Last

Friday, July 29, 2005

Minions

Consider William Rehnquist: a once-brilliant man, now 80 years old, infirm, possibly dying of thyroid cancer, likely on pain medication that still further dulls his fading faculties. Yet he clings to "power" -- to the extent that it can still be called his.

Now, consider William Rehnquist's senior clerks. Probably in their thirties, probably brilliant, certainly powerful beyond all their expectations. They now have, for practical purposes, most of the power nominally vested in their senile figurehead. And I conjecture that they cling to it as well, murmuring encouragement and flattery to strengthen a fading man's fading resolve.

People say Justices are unaccountable; how much more so, though, are the faceless functionaries who now hold the reins.

Tuesday, July 26, 2005

Mi Casa Es Su Casa

I have been mulling this idea for some time. Commenting on a QandO post describing the ongoing Republican sellout, I said:
We (Republican-leaning citizens) can afford to see a Democratic House of Representatives in the 2007-2008 term. It is clear that if the Republicans remain in power in both houses, they will become worse than the Democrats until they lose power in some meltdown. I urge every Republican to vote Democratic for the House of Representatives in November 2006.
I really can't put it more clearly than that. Social conservatives are entitled to care about the Supreme Court, and thus about the Senate. But the House, at present, exists only to gargle about Terri Schiavo and lard up appropriations bills. It is necessary to the Republican Party, but to the Republican cause it is worse than useless.

Mi casa es su casa. Take it away, boys.

[Update 2 August: Virginia Postrel voices the same complaints, but doesn't have a plan. Linked to Beltway Traffic Jam.]

[Update 23 August: Mark Tapscott at Townhall joins the chorus. Hat tip: Stephen Bainbridge.]

Monday, July 25, 2005

The Case for Prudishness

James Lileks and Jeff Jarvis have a long-running colloquy on the merits of broadcast standards.
Mr. Jarvis is a steady opponent of restrictive standards, and has done some excellent primary journalism using the Freedom of Information Act to demonstrate that the FCC is allowing itself to be manipulated by very small groups of advocates for restricted content. [I am deliberately avoiding the word "censorship" because, as I will show later, it is incorrect here.]

Mr. Lileks, who watches and enjoys television and has a very young child, has volunteered to fight in the rear guard against the advance of obscenity. As Alex Whitlock would predict, he uses the slippery-slope argument:
I don’t know why people think that there’s some limit out there we’ll reach and say “okay, that much and no more,” and that limit will consist of laws that say you can only say “f*ck” six times in one prime-time broadcast show. Of course it just gets worse and worse until you find yourself in a nursing home watching the TV, and an ad comes on for “Now That’s What I Call Music #403,823,” including the smash hit “Go F*ck a Dead Man in the *ss, Bitch” and when you ask the orderly to turn it down he laughs and calls you a prude. Hey, that’s just how people talk, man.
[Mr. Whitlock's own contribution is here.]

But there is a better argument in favor of content standards for media. To begin, we note Mr. Jarvis's Rules of Engagement:
No personal attacks, hate speech, bigotry, or seven dirty words in the comments or comments will be killed along with commenters.
This is not evidence of hypocrisy. Mr. Jarvis's stated preference is for a market-based, as opposed to regulation-based, determination of potentially offensive content. With his content restrictions, he has chosen the market segment in which to position himself, just as he believes broadcast networks should be free to do.

The underlying theory -- that a market for non-obscene entertainment will prompt its emergence -- is appealing. But it rests on a questionable premise, namely that society's optimum should be the same as the market's. In fact, there are strong reasons to suppose that this is not the case.

By way of analogy, consider the regulations promulgated by the Occupational Safety and Health Administration. These regulations distort the free market, denying employees the choice of whether to sell their own safety in exchange for a higher wage.

Nonetheless it is generally accepted that they are justified, for two reasons. One is simply that not all prospective employees can be relied upon to demand a fair price for the risk they would bear. Thus the state [or the Nanny-State, if you prefer] forbids that sort of commerce altogether.

The deeper justification of OSHA regulations is that they move the domain of competition. Employers, denied the option of trading safety for efficiency, must seek innovations to gain efficiency without sacrificing safety. These innovations might be economically inefficient in the short term [so that, in the absence of OSHA regulations, it would be cheaper to pay an employee to bear more risk], but may nonetheless evolve into economically efficient practices, especially if the cost of bodily risk continues to increase through time. [This shows how the concept of safety regulation is predicated on assumptions of progress, but space constraints prevent further exploration of this idea.]

The same reasoning applies to content restrictions in broadcast media. The first justification translates directly into the claim that viewers and broadcasters, even with mutual consent, should not be free to negotiate the obscenity content of entertainment in the marketplace. That claim is rightly ridiculed -- individual viewers are obviously the best judges of their own preferred obscenity content -- so on this count the OSHA analogy is no justification at all.

The second justification, however, is in fact far stronger in this case. Content restrictions move the domain of competition, by denying broadcasters the choice of whether to use obscenity to appeal to a given audience segment. As long as these restrictions are stable and predictable, broadcasters may be expected to respond with innovation [of a charmingly retro kind] so that they can attract, without recourse to obscenity, an audience which not only condones but arguably enjoys it. [The difficulty of this challenge varies across genres, so the mix of content would be affected. For example, large swathes of stand-up comedy rely on obscenity both for brevity of expression when precision is not needed, and for comic contrast against expectations. But this is not, in itself, a problem.] This is a desirable outcome.

Mr. Jarvis has imposed his own Rules of Engagement in order to protect the quality of discourse in discussions on his blog. The result is that people who might otherwise use the "seven dirty words" in ordinary conversation will instead have to find another way to convey their meaning. Restriction of content provides an impetus toward reasoned articulation.

In struggling to interest and entertain an audience without the blunt tool of obscenely "realistic" dialogue, creators of all kinds will be obliged to convey their meaning through "unrealistic" articulate invective. Art and entertainment are shapers, not mere mirrors, of culture; thus the level of discourse seen in mass media will be reflected in the everyday world in which you and I live. This is the case for prudishness.

Mr. Jarvis's position is that the freedom of the conversation between broadcaster and viewer is a more important good than the elevation of discourse I hope for. This is a defensible position, whose acceptance or rejection largely hinges on an article of faith. My own profession is that
There is no significant facet of the human condition which cannot be communicated without recourse to obscenity.
If this axiom is rejected, then restriction of content is indeed censorship. But broadcast restrictions, if they are stable and evenly enforced, cannot rightly be called censorship because they do not restrict the substance of what is communicated; the "right" to use obscenity in entertainment is not much weightier than the "right" to broadcast a TV signal on channel six-and-a-half.

This defense of speech restrictions in general is, alas, not fully sufficient to defend our own mercurial and easily manipulated system. Advocates of restrictions have largely brought this situation on themselves by hiding behind bureaucracy. They should not allow themselves to be cowed by spurious cries of censorship. A national discourse on how to raise the level of national discourse is overdue.

[Update 3 August: former blogger Obvious Troll has an amusing post on reserving profanity for when it is really needed.]

Friday, July 22, 2005

Zugzwang

If the London police have apprehended a member of the current active terrorist cell for questioning, then the leaders of that cell know their time is short; their preparation and recruiting are a rapidly wasting asset. Their logical response is to immediately launch any attacks they can. It seems that just this might be happening; a bomb carrier was shot at Stockwell station this morning, and Moorgate has just been evacuated. I expect the next few days to see a continuation of this, as the leaders rush to "spend" their recruits. Batten down, mates.

Wednesday, July 20, 2005

Curious

In T. S. Eliot's Old Possum's Book of Practical Cats, we get a tongue-in-cheek description of a criminal mastermind:
Macavity, Macavity, there's no one like Macavity,
He's broken every human law, he breaks the law of gravity.
His powers of levitation would make a fakir stare,
And when you reach the scene of crime--Macavity's not there!
You may seek him in the basement, you may look up in the air--
But I tell you once and once again, Macavity's not there!
I am wondering why no one has thought to apply this description to Karl Rove.

Monday, July 18, 2005

Miners' Strikes

Edward "Ted" Heath, the ill-fated Conservative Prime Minister brought down by the opposition of British unions, died Sunday. He called elections in 1974 on an anti-Union platform, using the slogan "Who Governs Britain?" which may have been coined by Margaret Thatcher. The results were not what Mr. Heath hoped:
In 1974 industrial unrest led to two general elections. Labour won them both and Ted Heath returned to the backbenches and eventually had to give way to Margaret Thatcher as leader of the Conservative Party. Wilson returned to government and gave the miners a 35% pay rise. Denis Healey became Chancellor of the Exchequer - he faced a debt of over £4 billion. Inflation was in double figures and wages were linked to prices. Wage increases brought the country to its knees and the government to the polls for the second time in a year. Labour won by 3 votes.
It was Mr. Heath's second loss -- even after the Labour government under Mr. Wilson had caved in to strikes and been rewarded with more strikes -- that led to Ms. Thatcher's replacing him as head of the Conservatives. The re-elected Labour government again failed to staunch the strikes, leading eventually to Ms. Thatcher's long ascendancy on a vigorously anti-union platform.

In reading about the "Who Governs Britain?" rhetoric, I found official histories of the 1984-85 miners' strike from both the Police Federation and the Trades Union Congress. [Strip URLs for organization homes.] To understand the situation, it is necessary to understand the practice, never widespread in America, of "sympathetic strikes", in which workers in one industry or union would strike until the demands of another union were met. Sympathetic strikes are remembered today only by those nostalgic for the sweet bygone days of rampant union glory; in Britain, they were made illegal by the Trade Disputes and Trade Unions Act of 1927, following a general strike sympathetic to the coal miners' union.

But in the flush of relief which brought Labour to power after World War II, this ban was removed. As unions grew steadily more assertive in the postwar era, strikes in support of workers in unrelated industries again became common. The goal of these strikes was to bring national pressure to bear on unyielding employers, so they were naturally deployed where they would cause the most damage to the national economy -- power stations and fuel depots being the favored targets. Unions evolved the tactic of "flying pickets", union members brought in from other parts of the country to support, with violence if necessary, picketing and disruption. ['Picketing', in American usage, has generally peaceful connotations -- the British practice might more accurately be called 'blockading'.] The manufacturers and police could not match this mobility, so that unions were assured of local numerical superiority at each confrontation.

For example, power stations at Dunston, Stella and Saltley Marsh were picketed in 1972. This looks like "alternative" press of the day; here is a sympathetic history [which calls the head of the National Union of Mineworkers [NUM] "a complete and utter right winger"]. A coal-centric history gives us this:
At first the miners picketed at coal power stations, but then it was decided to target all power stations, and also steelworks, ports, coal depots and other major coal users. In South Wales, dockers at Newport and Cardiff supported the miners by refusing to unload coal from ships. On the 21st January, the NUM decided to try to stop the movement of all fuel supplies. Miners from South Wales were involved in the pickets at the Saltley Marsh Coal Depot of the West Midlands Gas Board.

On the 9th February, a state of emergency was declared and 2 days later, the three day working week was introduced to save electricity. On the 19th February, after much negotiation, an agreement was reached between the National Executive Committee of the NUM and the Government. Picketing was called off, and on the 25th February, the miners accepted the offer [a 35% pay increase] in a ballot, returning to work on the 28th February.

The result of the strike was that the miners' wages became almost the highest amongst the working class. The strike also showed the country how important coal was to the country's economy.

By 1973, however, the miners had moved from first in the industrial wages league to eighteenth.
The miners had won a sudden and drastic pay rise, but the inflation triggered by this rise (and the example set for other union negotiations) and by the onset of the Arab Oil Embargo quickly eroded their gains. So they struck again, bringing down Mr. Heath's government and quickly gaining another large pay rise from the incoming Labour government.

These triumphs were very much in the miners' minds in 1984, when proposed closures of money-losing pits triggered another NUM strike [the strikers demanded that the pits remain open to prevent job losses]. Arguably, the again-empowered Tories and the Coal Board also remembered them, and the desire for revenge led them to take a harder stance. The forces of capital, however, this time had the advantage in mobility:
The police service had undergone huge changes in the years between the Heath and Thatcher governments. Mass mergers had reduced the number of forces to 43, including six new Metropolitan units.
The government also acted to reduce the miners' mobility:
Miners found themselves placed under unofficial marshal [sic] law. Flying pickets were turned back hundreds of miles from their destination, pit villages were occupied, phones tapped, mail opened, miners and their supporters suffered savage beatings, two miners were killed and well over 10,000 arrested. And while the tactics advocated by the NUM leadership did not go much beyond push and shove, the rank and file exhibited as much creativity as they did courage. Spontaneous instinct led all the way to embryonic forms of working class state power. In the hit squads -- embryonic workers’ militias. In the Women Against Pit Closures movement -- an embryonic mass working class women’s movement. In the miners’ support groups -- embryonic soviets.
[I have enclosed a larger quote than is strictly necessary, to give the flavor of this source and to savor the contrast between "savage" police beatings and "creative" miners' hit squads.]

Back to the Police Federation:

Throughout the winter of 1984/5 it was becoming obvious to most miners that they could not win. By March, the drift back to work had become a flood. Older miners decided to take the generous redundancy packages. Left wing leaders around Scargill, except for Tony Benn, were distancing themselves from his increasingly frantic assertions that victory was just around the next corner. In South Wales, the NUM leadership publicly dissociated themselves from Scargill's tactics.

On Sunday 3rd March 1985, just a year after the strike began, a special conference of the NUM voted narrowly to end it. The Coal Board had made no concessions and the pits would soon close for ever.


I will address the moral of this story in a later post.

Thursday, July 14, 2005

Global Faith

For those interested in the evolution of faiths, Philip Jenkins' The Next Christianity discusses the demographic and doctrinal changes ahead for the Catholic Church.
If we look beyond the liberal West, we see that another Christian revolution, quite different from the one being called for in affluent American suburbs and upscale urban parishes, is already in progress. Worldwide, Christianity is actually moving toward supernaturalism and neo-orthodoxy, and in many ways toward the ancient world view expressed in the New Testament: a vision of Jesus as the embodiment of divine power, who overcomes the evil forces that inflict calamity and sickness upon the human race. In the global South (the areas that we often think of primarily as the Third World) huge and growing Christian populations — currently 480 million in Latin America, 360 million in Africa, and 313 million in Asia, compared with 260 million in North America — now make up what the Catholic scholar Walbert Buhlmann has called the Third Church, a form of Christianity as distinct as Protestantism or Orthodoxy, and one that is likely to become dominant in the faith. The revolution taking place in Africa, Asia, and Latin America is far more sweeping in its implications than any current shifts in North American religion, whether Catholic or Protestant. There is increasing tension between what one might call a liberal Northern Reformation and the surging Southern religious revolution, which one might equate with the Counter-Reformation, the internal Catholic reforms that took place at the same time as the Reformation — although in references to the past and the present the term "Counter-Reformation" misleadingly implies a simple reaction instead of a social and spiritual explosion.

In a Name

Joseph Britt's final post on Belgravia Dispatch [reviewed here] has been refined and reprinted as a Washington Post editorial: Arab Genocide, Arab Silence.

We've heard a lot since Sept. 11, 2001, about how Arabs feel humiliated, ashamed, resentful at being regarded by the West as inferior in some way. Sometimes we ignore these feelings; sometimes we try to appease them. Perhaps it is time to say plainly that the way to earn respect is through deeds worthy of respect.

Also today, the Belmont Club mourns for murdered Iraqi children:

The empirical fact is that no group has been killed more often and more brutally by the "Jihad" than Muslims themselves.... The first objective of terror, indeed of the Terror, and the first objective of the Jihad is to maintain internal control over its base.

... But many conservatives have also been blind to the urgent requirement of creating a liberation movement within Islam, in part because they half believe all Muslims are themselves the enemy; in part because they despair of Muslims ever rising up against the medieval institutions which constrain them; in part because they haven't thought about it. But they should. That pile of bloody children's slippers on an Iraqi street is a tally of spirits who were created to be free.

But how can "a liberation movement within Islam" form? How can its adherents show themselves to be different from the butchers and enablers who dominate the public face of their faith? If a struggle to reform Islam were taking place right now, how would we even know?

The problem is that there is no word for "non-fascist, post-medieval Islam" or for the followers thereof, nor is it simple to create one. When the Church was mired in corruption and the exercise of temporal power, those seeking to purge it called themselves Protestants, but they did not thereby cease being Christians. Try to describe the Reformation, without using the words "Protestant" or "Catholic", to someone who does not know the results.

Only once such a movement can be named can we say who belongs to it, or how it can be supported. At present we see only the opposite.

Sheltered Life

I have occasionally run into problems related to obtaining too large a fraction of my knowledge from books:
  • I had no idea who Vera Lynn was when I heard that Pink Floyd song.
  • I read the line "Then the fit hit the Shan," without ever recognizing the pun.
  • The first I heard of the M25 London Orbital was that it was designed by a junior devil.
  • I studied All the King's Men for clues to its State of residence.
Worse still, I would explain my ignorance of pop culture by telling people, "I grew up in a closet." Of course, I had never heard the phrase "out of the closet"...

But worst of all, this is probably still happening and I just don't know it.

Tuesday, July 12, 2005

Inevitability

Yesterday I examined Darwinism, the anti-morality which accepts that might makes right. It occurs to me that this is closely related to the "historical inevitability" claimed by Marxists and used as a justification for Communism.

Fortunately, they were egregiously wrong in practice; but the theoretical sleight-of-hand behind this justification is interesting. A system of government which explicitly rejects God, and seeks to reshape society, cannot then claim any moral basis for its actions in the existing world. By embracing the fallacy of Darwinism, Marx and later Lenin coolly finessed this problem, feeding it its own tail. Communism was inevitable because it was for the good of mankind, and its goodness was in turn evidenced by that same inevitability.

Recommended

Scott Burgess, an American engineer in London, has been unusually busy since 7/7:
I'm not going to take the time to debunk his comments point by point, but I do ask you to read his (why am I certain that it's a male?) post and note the contortions he goes through; first, to deny that the bombings were bombings at all, and afterwards to deny that they were the work of terrorists in general, much less Muslim extremists in particular.
Idiots beware! "He used... sarcasm. He knew all the tricks, dramatic irony, metaphor, bathos, puns, parody, litotes and... satire. He was vicious."

Lubos Motl puts current troubles in perspective:
A few million years ago, you would be eaten by another mammal if you did one error. 14.3 billion years ago (without 300,000 years), the global warming was so bad that you could not even form the Hydrogen atom.

Monday, July 11, 2005

Darwinism

The axioms of evolutionary theory can be simply stated, and understood by anyone. We divide them, roughly, into axioms of optimization and speciation. The former describe "survival of the fittest" -- for some definition of "fit" -- while the latter deal with the division of populations into genetically distinct species. The variety of theories in modern evolutionary biology, and the bulk of professional interest, are directed toward the mechanisms of speciation; but it is the process of optimization which defines evolution for nonprofessionals.

So much is read into that one phrase, "survival of the fittest". When we begin to delve into its meaning, it seems that it is mostly tautological, and means essentially "survival of the most survivable". That is what an organism must be fit for. Another interesting part of modern biology is unravelling this tautology by finding the evolutionary use of mysterious characteristics.

In common parlance, this phrase is also given an economic as well as a biological reading, due to its frequent application to the competition among corporations. In this case, it is often assumed that there is no tautology, because we can determine independently what "fittest" means [e.g., producing the best product, or having the lowest operating cost]. The assumptions underlying this reasoning are not always true -- consider Chrysler or, more recently, Fiat -- but they are not utterly invalid, and the "theory" of corporate evolution has some explanatory power.

Something else comes from this economic usage: the idea that evolution, or the competition it implies, is somehow a moral good. The positive principle, that the fittest are likeliest to pass on their characteristics to future generations, becomes conflated with the normative statement that the fittest should be those who survive.

This repulsive pseudo-morality immediately collapses when seen directly. In fact, it is the negation of all morality by the claim that whatever wrongs are done -- a strong man stealing food from a starving child in a refugee camp, an unarmed populace herded into ghettos for later extermination, a politically inept functionary sent to the gulags, a lone man standing in front of a row of tanks at Tiananmen -- are simply facets of a struggle for survival, instructive information about how the world works. Survival of the fittest, and all that.

It is not hard to recognize the illogic, as well as the immorality, of substituting Darwinism for traditional ideas of morality. But there is another problem just behind this, exemplified by the fact that you knew what I meant by "Darwinism" in the preceding sentence. The anti-evolution movement damages our schools, our children and our nation; and the conflation of evolution with Darwinism abets its spread outside the isolated camps of extreme fundamentalism. It is as if Adam Smith had been named Fasc.

We reject the idea that what is strong is intrinsically good, but there is a moral lesson to be drawn from the theory of competition for survival:
What is good should be made strong.

[Update 12 July: corrected "inflation" to "evolution" in penultimate paragraph. Further thoughts here.]

Wednesday, July 06, 2005

Free to Good Home

The (self-imposed) pressure of creating daily posts for this blog is leading me to produce lower-quality posts, and is consuming more time than I would like. I am reluctant to maintain a blog with less-than-daily posts, which would be hardly worth checking. It seems to me that the best solution would be to discontinue The Stone City and join forces with another blogger or group thereof.

In this event, I would post 2-3 times a week. I think that, from my existing posts, you can get a clear idea of what I would (and would not) contribute. I would ask that Mr. Whitlock and Mr. Maguire be blogrolled.

If anyone is interested, just leave a comment here or email me: asammler@gmail.com.

Tuesday, July 05, 2005

Justice, not Justices

An impressive idea at Law and Society: "Eliminating the Supreme Court as a Standing Body." Go there!

[Hat tip: Volokh Conspiracy, though their link removes the "as a standing body" qualifier, which is exactly the interesting part of the idea.]

Monday, July 04, 2005

In Praise of Darkness

In yesterday's Washington Post, Professor Linda Ross Meyer eulogizes Sandra Day O'Connor's tenure on the Supreme Court. In writing this editorial, she manages to combine complacent fuzziness about the impact of Court power with self-serving sophistry smirking at the rule of law. I have reproduced the full text in this post, with added commentary.

Will the Supreme Court be different without Sandra Day O'Connor? You bet. But it will be different not only because she provided crucial votes for a right to privacy, against race-based redistricting, against state endorsement of religion and for affirmative action, etc. The court will be different without her because, if President Bush nominates a replacement on the basis of that person's position on constitutional theory, there will be no one to take the court on an outing from its ivory tower.

Justice O'Connor is known in court circles for organizing field trips for her clerks. She was determined to pry us clerks from behind our computers and from under our piles of briefs, and scoot us into the whirl of working Washington. Getting out into the real world, talking with real people, was a trial for some of her library-carrel-reared clerks, but these outings were not just matters of tourism. They gave us the foundational experience for understanding O'Connor's approach to jurisprudence and her place on the court. And they taught us what the court will lack when O'Connor leaves.

She would troop us around the District, taking in exhibits, touring postal facilities or wandering about historic ships [sic], all the while peppering every docent or captain or postal worker with her questions and characteristic energy and enthusiasm. "Isn't this interesting!" she would exclaim over the concept of self-adhering postal stamps or machinery for dredging clams from the muck of Chesapeake Bay. We bleary-eyed bookworms would dutifully agree.

To translate: Ms. O'Connor is not just sheltered, she is so sheltered that the sight of a self-adhesive postal stamp is a novelty to her. I wonder if she saw a digital watch. Probably not, since the locations chosen for these anthropological field trips -- "postal facilities" and historic ships -- are not exactly hotspots in the cauldron of the real world. Think of Ms. O'Connor "taking in exhibits": she is so isolated from the outside world that she goes to museums to catch up. And no one tells her what foolishness this is.

At the court itself, it seemed to us as if O'Connor kept her feet on the ground in a place often dominated by Grand Constitutional Theory -- a combination of theoretical absolutes and scholarly minutiae of the kind that law professors love. Justice Stephen Breyer's opinions, for example, often sound as though they came directly from the policy wonks at the nonprofit Rand Corp., complete with appendices and concordances. Justice Clarence Thomas, meanwhile, believes his job is to return the court to the constitutional law of the 18th century, according to the Framers' intent, and his opinions usually contain long historical discourses about the state of law in 1790, as capital defendants dance on the head of an historical pin.

Mr. Thomas has a framework for interpreting law. The horror! So might Mr. Breyer, if his "policy wonk" opinions are consistent.

Justice Antonin Scalia, likewise, believes in theoretical absolutes. The court should expunge all rights not explicitly articulated in the text. It should never look at legislative history. And it should establish neat, clear, interlocking rules that can be applied consistently for all areas of law (and without further judgment by presumably less-competent trial courts). "What distinguishes the rule of law from the dictatorship of a shifting Supreme Court majority is the absolutely indispensable requirement that judicial opinions be grounded in consistently applied principle," Scalia wrote in his dissent to last week's court decision limiting the display of the Ten Commandments on government property.

Mr. Scalia not only has a framework for interpreting the law; he also points out the real-world meaning of the lack of such a framework. Thus he merits a full paragraph, rather than having to share one with Messrs. Breyer and Thomas. To Ms. Meyer's credit, she reasonably summarizes Mr. Scalia's reasoning, noting that the Court's job is to interpret laws (made by the legislature) in predictable ways.

But O'Connor believed that judging is not policy analysis or scholarship or ideology. Justice doesn't come from a book. The difficult decisions come in shades of gray, not black and white, and the view from the Supreme Court bench can be a narrow one. She looked to the pragmatic consequences of the court's decisions for everyday citizens, and she tried to do justice in the case while leaving play in the constitutional joints so as not to tie the hands of those closer to the facts and the people.

In place of "theoretical absolutes", we get a pragmatic absolute: more power to judges.

As a former trial judge and state legislator, she knew firsthand that abstract rules can't always assure a just result, and so her legal "rules" took the form of balancing tests that allowed wiggle room for the vast and infinite variety of real life (much to chagrin of her more theoretical colleagues). State governments could regulate abortion as long as they didn't place an "undue burden" on women; governments could display religious symbols as long as they didn't "endorse" religion or make religious minorities feel excluded from citizenship. She was also one of the few justices willing to grant certiorari in (and review) death penalty cases not just when the constitutional theories needed to be clarified, but to fix mistakes in particular cases.

They could have the death penalty, as long as Ms. O'Connor thought they were using it right. They could have racial quotas, as long as they agreed to lie about them. Stripped of the fuzzy rhetoric, this means that Ms. O'Connor decided the outcome she felt the law should dictate in each case, and developed the justifications afterwards. But she's really a good person: she "knew firsthand that abstract rules can't always assure a just result", so she systematically gave as much power as possible to judges everywhere, because litigation with an unpredictable outcome is such a boon to our society.

Her jurisprudence was characterized by a deference for local knowledge and a practical humility about the Court's ability to construct or reconstruct a seamless theory of constitutional law. If people were treated unjustly -- arrested and handcuffed for a seatbelt violation, unfairly denied an equal vote by crazy-quilt redistricting, stripped of their property in order to increase the local tax base, discriminated against because of who they were and not what they had accomplished, or detained without a hearing -- then Grand Theory had to bend to Justice, not Justice to Grand Theory. Like her predecessor, Justice Potter Stewart, she didn't need a theory to know an injustice when she saw it.

I fear that this might really be accurate: that Ms. O'Connor truly saw herself as an avatar of Justice, put in her exalted office that she might right the wrongs of society. Certainly the alternative -- that there are some powers which government may justly wield, and others which it may not, and that she might be attempting to accurately interpret existing guidelines distinguishing the two -- has no appeal to Ms. Meyer. Did she need a law to know an injustice when she saw it?

Most of what we have been hearing about potential nominees is about their "judicial philosophy" or their "theory of the Constitution." During last year's presidential campaign debates, Bush said, "I would pick people that would be strict constructionists."

And now, in anticipation not of O'Connor's retirement but of Chief Justice William Rehnquist's departure, activist groups on the left and the right have raised millions of dollars to promote or fight against nominees depending on their judicial philosophies. We're asked to support Janice Rogers Brown because she is dedicated to "preserving the Constitution as ratified," Michael McConnell because of his theory of the establishment clause, Emilio Garza or Edith Jones because they are constitutional originalists, Alberto Gonzales because of his position on substantive due process and abortion rights.

We are being asked to support judges who will honestly and predictably interpret the laws, past and future, of Congress. Apparently this is a Bad Thing.

But these positions and writings reveal just one part of what the president should know to make his next decision. Character and range of human experience should also matter. Does the nominee have truly practical wisdom -- an on-the-muddy-ground understanding of the sheer diversity of human aspiration, emotion, frailty, and passion? Does the nominee have a sense of justice to prevail where simple theory proves inadequate? O'Connor did, and a court without members of these qualities does not bode well for our future.

Judicial fiat isn't a bug, it's a feature! How dry, how unappealing is a judge who merely studies the letters of Law. Let us instead have more like Ms. O'Connor, more fearless dispensers of living Justice based on their understanding (superior, one presumes, to that of lower-court justices) of the true human condition.

All this verbiage studiously ignores the central point, which is that if judges legislate, then legislatures ipso facto do not. This is the world Ms. O'Connor has brought us toward, where power is neither vested in the legislature nor reserved to the people.

To be fair, Ms. O'Connor did not steer a straight course toward this end. To the extent that she sometimes defended principle and clarity, she deserves praise rather than scorn. But Ms. Meyer precisely inverts this equation, lauding what Robert Bork has called "the political seduction of the law" -- she has internalized this corruption so deeply that she can no longer even notice it. How many others, clerks, judges and professors, have joined the same ranks?

[Hat tip: Real Clear Politics.]

[Update 6 July: Beldar has it right. For example: "When they do join a majority opinion, it's often through watering down and muddying up the law (offering up another damned "balancing test" that barely conceals the fundamentally ad hoc and arbitrary nature of so much recent Supreme Court law), or through some sort of horsetrading like O'Connor's pragmatic but absolutely unprincipled opinion for the majority in the Michigan Law School affirmative action case, Grutter."]

[Update 7 July: Stephen Green joins in, citing John Yoo's comments in Time.]

Friday, July 01, 2005

Singularity Discounting

The Singularity (a.k.a. "the Rapture for nerds") is widely expected, among technophiles (who are best positioned to know), within 20 to 40 years. On the other hand, markets in the U.S. Dollar (and other developed currencies) trade bonds and swaps implying a liquid lending market out to 30 or 40 years. In addition, that yield curve (of implied forward rates) is at present essentially flat, with annual rates ranging from 4.85 to 5.05% over the interval in question. Can we deduce anything from this? For the sake of argument, assume that the Singularity has small temporal extent, so we can treat it as instantaneous.

One feature of a Singularity is that today's national currencies will become worthless; so will any prior investment, whether in possessions or in the means of production. This complicates our analysis by introducing a "0 divided by 0" problem. However, we reason as follows: some expenditures (personal upgrades and/or transient pleasures) will remain worthwhile up to the moment of the Singularity. Thus a discounting curve built by a rational actor with foreknowledge of the date of the Singularity ("the Date") would show discount factors (the present value of a unit future payment) declining smoothly to zero as the Date approached.
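To make this concrete, here is a minimal sketch in Python of such a believer's discount curve. The 5% base rate, the 30-year Date, and the linear fade are illustrative assumptions only; the argument requires nothing more than a decline to zero at the Date.

    import math

    def believer_discount_factor(t, r=0.05, date=30.0):
        # Ordinary exponential discounting, scaled by a factor that
        # falls to zero at the Date. The linear fade is an arbitrary
        # choice of shape; any smooth decline to zero would serve.
        fade = max(0.0, 1.0 - t / date)
        return math.exp(-r * t) * fade

    # A believer values a promised dollar at 25 years at
    # exp(-1.25) * (1/6), about 0.048, versus exp(-1.25), about 0.287,
    # for a non-believer -- a distortion far too large to hide in a
    # flat yield curve.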

A substantial population of investors with such a belief would distort the yield curve; thus the flatness of the yield curve is an indicator of their absence. We can quantify this a little, but only if we extrapolate the yield curve's behavior in the absence of any Singularity.

For example, if we imagine that annual yields after 20 years would, in the absence of a Singularity, approach a steady state of 3.5% (a rough historic low) with a decay rate of 0.1/year (i.e., a half-life of about 7 years), the relative change in the 40-year discount factor (i.e., the implied probability of a Singularity) is only 4%. The exact number depends on our extrapolation, but the implication is clear: the financial markets do not believe in any Singularity.
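For readers who wish to check the arithmetic, here is a minimal sketch of the calculation in Python. The 4.95% observed forward rate is the midpoint of the range quoted above, the decay parameters are those just named, and the year at which the extrapolation takes over is an additional assumption; the implied probability is read as one minus the ratio of the observed to the extrapolated 40-year discount factor. The output swings between roughly 5% and 15% depending on where the decay is taken to begin -- illustrating just how sensitive the exact figure is to the extrapolation -- but under no plausible choice does the market price a Singularity as likely.

    import math

    def integrated_forwards(T, r_obs=0.0495, r_inf=0.035, t_decay=20.0, k=0.1):
        # Integral of the no-Singularity forward curve out to maturity T:
        # flat at the observed rate until t_decay, then decaying at
        # rate k per year toward the steady state r_inf.
        if T <= t_decay:
            return r_obs * T
        tail = T - t_decay
        return (r_obs * t_decay + r_inf * tail
                + (r_obs - r_inf) * (1.0 - math.exp(-k * tail)) / k)

    T = 40.0
    df_observed = math.exp(-0.0495 * T)  # the actual, essentially flat curve
    for t_decay in (20.0, 30.0):  # where the extrapolated decay takes over
        df_no_sing = math.exp(-integrated_forwards(T, t_decay=t_decay))
        p_implied = 1.0 - df_observed / df_no_sing  # value "missing" from long bonds
        print(f"decay from year {t_decay:.0f}: implied probability {p_implied:.1%}")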