The Volokh Conspiracy 

Saturday, December 28, 2002


UNEXPECTED BENEFITS: "Woman Shot in Chest But Saved by Breast Implants." Here are the two best paragraphs:
A plastic surgeon was called in to fix the damage and took the opportunity to increase the size of Mrs Soares' breasts with more silicone.

She said: "I'm twice happy, first because my prosthesis saved my life and also because now I look even more beautiful."
(Thanks to GeekPress for the pointer.)

Friday, December 27, 2002


WHEN MAY THE POLICE DEMAND THAT A PEDESTRIAN IDENTIFY HIMSELF? The police may not just stop someone without any reason and demand that he identify himself; the Supreme Court made clear in Brown v. Texas (1979) that stopping someone for no reason violates the Fourth Amendment. (There are exceptions to this, such as for drunk driving checkpoints and the like, but this is the general rule.) The Supreme Court also made clear, in Terry v. Ohio (1968), that the police may stop someone if they have "reasonable suspicion" to believe that the person may be involved in crime -- a standard below the "probable cause" needed to arrest him -- and then briefly detain the person while they figure out what's going on.

     But the Court has not clearly decided whether a state law may obligate people (pedestrians or passengers, and not just drivers) to present identification once they are lawfully stopped. A 1983 case, Kolender v. Lawson, raised this Fourth Amendment issue but didn't squarely decide it because the Court held that the state law was unconstitutionally vague.

     The Nevada Supreme Court has just made it fairly likely that the U.S. Supreme Court will indeed have to settle the matter. The Ninth Circuit, in Carey v. Nevada Gaming Control Bd. (2002), struck down the Nevada law requiring people to present identification when they are lawfully stopped; but the Nevada Supreme Court has just upheld the same law, in Hiibel v. District Court. The Nevada Supreme Court decision creates a disagreement between a federal circuit court (the Ninth Circuit) and a state supreme court in that circuit (the Nevada Supreme Court) about the constitutionality of a particular statute. The Supreme Court tends to view such disagreements as particularly in need of resolution, even more so than the conventional "circuit split," where different circuits (or some circuits and some state supreme courts outside the circuit) reach different results.

     I predict that the Court will agree to hear the case, unless there's some procedural problem that I'm unaware of (such problems are often hard to spot until one sees the response to the petition for certiorari). You heard it here first (unless you read the Las Vegas Review-Journal, which seems to be the only newspaper that has covered this, or unless you subscribe to the Westlaw WLB service, which is where I first found a paragraph-long description of the case).


THE CONSTITUTION IS NOT A COMPLETE GLOSSARY: I got another message yesterday from a thoughtful reader who protested against the term "states' rights," on the theory that the Constitution recognizes only the rights of individuals, not states. I responded by pointing him to my earlier post that cited lots of Framing-era evidence that showed that states were described as having "rights" at the time.

     Ah, my correspondent responded, but isn't it more important to see what the Constitution says, especially when we're considering whether states' rights should be recognized as a matter of constitutional law? And the Constitution seems to refer only to the rights of individuals; when it talks about states -- as in the Tenth Amendment -- it discusses their "powers" rather than their "rights."

     I don't think that's quite right. If your goal is to understand the meaning of a particular bit of constitutional text, then it does matter what neighboring clauses say; as I've argued before, it would be odd if "the right of the people" meant an individual right in the First and Fourth Amendments, and a states' right or a right of some small state-selected body in the Second Amendment.

     But if your goal is to understand more broadly whom the Framers saw as rights-bearing entities, or how the Framers were generally willing to use concepts such as "right," then the Constitution is too narrow a document for that purpose. Likewise, if your goal is to understand what the Constitution meant to the Framers, you can't limit yourself to the text.

     The Constitution is a short document, which wasn't intended or understood as providing some authoritative and complete glossary for the legal terms or legal concepts of the era. The body of the Constitution uses the term "right" only in one place -- the Copyright/Patent Clause -- even though it protects many things that are seen today and were seen then as rights. The Bill of Rights itself uses the term "right" only in some of its provisions. It doesn't speak of the right to free speech, the right to due process, the right to be free from cruel and unusual punishment, or the right to compensation if one's property is taken. I know of no well-supported theory that shows that these provisions were somehow seen by the Framing generation as being materially different from those that were called "rights." (After all, they were all in the Bill of Rights.)

     The Framers cared about the words they used, but they didn't write with an eye towards having the Constitution be the sole source of all legitimate legal concepts. Their failure to use a certain term -- whether it's their not using the term "right" to refer to the freedom of speech, or their not using the term "right" to refer to states -- thus can't tell us much about the legitimacy of that concept in the constitutional structure.

     Rather, we should consider the broader structure of Anglo-American jurisprudence within which the Framers were writing. If that jurisprudence recognized a concept, such as the concept that states and other entities besides individuals could have rights, then the failure to mention the concept in the constitutional text doesn't make the concept constitutionally illegitimate. The Constitution does secure to states certain immunities from federal government action, just as it secures to individuals certain immunities from state government action. And there's nothing in the Constitution that makes it illegitimate or unsound to refer to those immunities as "rights," much as the Framers at times did.


NOT QUITE SURE WHY I FIND THIS FUNNY: "Much of the driving force behind the Declaration of 2003 as the Year of Freshwater has come from Tajikistan."

P.S. Just in case you're wondering, because this is the United Nations talking, of course there's no talk of market solutions to freshwater shortages, only "conservation," "sustainable use," sending money around the world, and setting ambitious goals for improving access to clean water.

UPDATE: Reader Kevin Connors notes that you shouldn't take the Boston Globe article above at face value: it recommends that people in "rich deserts" like Southern California pay the full price of their water, without noting that agricultural users generally (not just in deserts) pay hugely subsidized rates for their water -- which is why Californians grow water-intensive crops like cotton and rice. Another problem with the article is that it recommends market pricing only for "wealthier regions."

UPDATE 2: Reader Maria Eugenia Garibotti also recommends this paper on how the privatization of water services in Argentina has reduced child mortality:

In the 1990s Argentina embarked on one of the largest privatization campaigns in the world as part of a structural reform plan. The program included the privatization of local water companies covering approximately 30 percent of the country's municipalities. Since clean water and sewage treatment are critical to control the spread of infectious and parasitic diseases[,] access expansions, quality improvements, and tariff changes associated [with] privatization may have affected health outcomes. Using the variation in ownership of water provision across time and space generated by the privatization process, we find that child mortality fell 5 to 7 percent in areas that privatized their water services overall[] and that the effect was largest in the poorest areas. In fact, we estimate that child mortality fell by 24 percent in the poorest municipalities. These results suggest that the privatization of water services prevented approximately 375 deaths of young children per year.


Thursday, December 26, 2002


TOP FIVE SCOUNDRELS OF 2002?: In late December, lots of news sources supplement their year-end coverage with stories recapping the news of the year. Sometimes they try to look back on some of the major stories or trends; sometimes they pick a Person of the Year or something similar. At their best, these stories offer opportunities for reflection, sort of a first cut of history; at their worst, they fill up time in a slow news day.

     The silliest of such stories that I've seen recently has to be this ABC News story on the top 5 People Behaving Badly in 2002. To be fair, ABC has pitched this list as the subjects of the Top 5 stories involving "scoundrels," rather than the "top scoundrels" of the year. Still, ABC's list is pretty odd:
5) Michael Jackson
4) James Traficant
3) Corporate Bad Guys (yes, that's ABC's phrase)
2) Saddam Hussein
1) Osama bin Laden
To summarize, the people who behaved badly in ways that ABC finds newsworthy are terrorists and dictators trying to destroy the Western World, corrupt businessmen and politicians, and a freakish musician who dangles his child from a balcony. Um, okay.


BETTER SAFE THAN SORRY: The American Council on Science and Health, generally a reputable anti-junk-science outfit, has a piece on the precautionary principle from Thomas DeGregori's book Bountiful Harvest. The author makes a lot of good points about the precautionary principle -- the principle that we should avoid new technologies unless they're proven harmless: it ignores any benefits from the technology; it assumes that the status quo is free of (natural) danger; and, had it been applied in the past, many of the technologies that we take for granted today (fire, the knife, penicillin) would never have been developed.

But even critics of the precautionary principle can go too far. The author writes:

The precautionary principle is often defined as the view that "absence of evidence is not the same as absence of risk." What this really says is that the proponents of the principle have lost the argument on the evidence (otherwise they would argue the evidence), so they argue that we should follow their policy prescriptions anyway. Stated differently: if our fears and phobias are right, we are right, but even if we are wrong, well, we are still right. It's "my policy, right or wrong."

I don't like the idea that you need to have evidence of harm to be credible. The environmentalist precautionary principle looks a lot like the economic precautionary principle that you can find in people like Hayek. Here's Hayek explaining why you should oppose even good-looking interventions because you can't predict what happens in spontaneous orders:

Any . . . restriction [of freedom], any coercion other than the enforcement of general rules, will aim at the achievement of some foreseeable particular result, but what is prevented by it will usually not be known. The direct effects of any interference with the market order will be near and clearly visible in most cases, while the more indirect and remote effects will mostly be unknown and will therefore be disregarded. We shall never be aware of all the costs of achieving particular results by such interference.

And so, when we decide each issue solely on what appear to be its individual merits, we always over-estimate the advantages of central direction. Our choice will regularly appear to be one between a certain known and tangible gain and the mere probability of the prevention of some unknown beneficial action by unknown persons. If the choice between freedom and coercion is thus treated as a matter of expediency, freedom is bound to be sacrificed in almost every instance. . . . There are probably few restrictions on freedom which could not be justified on the grounds that we do not know the particular loss they will cause.

That freedom can be preserved only if it is treated as a supreme principle which must not be sacrificed for particular advantages was fully understood by the leading liberal thinkers of the nineteenth century, one of whom even described liberalism as "the system of principles" [citing Constant]. Such is the chief burden of their warnings concerning "What is seen and what is not seen in political economy" [citing Bastiat] and about the "pragmatism that contrary to the intentions of its representatives inexorably leads to socialism" [citing Menger].

This is Law, Legislation, and Liberty, volume 1, pp. 56-57. Now I wouldn't exactly call myself a Hayekian, and I think Hayek is probably wrong to think of freedom as a supreme principle in this way. But what's right here is Hayek's precautionary principle, which has obvious parallels to the environmentalist version that you can find in Jurassic Park and more serious works.

Where the environmentalists go wrong -- and probably you can say the same of Hayek -- is in their asymmetry. As DeGregori himself puts it:

[The precautionary principle] implies that the realm beyond what we currently know is the exclusive preserve of the critics of technology. In other words, if the known evidence does not support your phobia, then the evidence must lie in the great unknown. The position claims that all the possible unknown dangerous actions will be from technological practices, while the less technologically advanced, presumably more "natural" alternatives, carry no unseen dangers.

For instance, one could imagine a pro-technology precautionary principle -- we don't know what hidden poisons natural herbs may contain, so, just in case, let's only take pills made in the safety of a lab. Which one is better depends on where the greater unknowns are. Environmentalists claim that genetically modified organisms or certain new synthetic chemicals may have unknown side effects, and maybe they're right about some of these, especially when the comparison is to natural alternatives whose side effects are well known.

The result of the environmentalists' asymmetry is their unwillingness to balance -- but once you go symmetric, you can't escape balancing the risks of new technologies with the risks of old alternatives. Inevitably, the risk-averse will end up saying -- as I do for government interventions in the economy -- that you shouldn't just compare the mean estimates of the benefits but that you should also take into account the variance, that is, figure out which of the alternatives has more uncertainty and possible unknown bad outcomes and be a little bit biased against it. Which is, in a sense, an argument without evidence. Is that so bad?


FRIST & MARION BARRY REVISITED: Eugene makes some very good points about William Saletan's article on Senator Frist's 1994 comment about Marion Barry, but I think Eugene may understate the problems with Saletan's bottom line. Whereas Saletan believes Frist's comment "seems to meet the Lott standard," I do not believe the Frist comment and the Lott comment are in the same league.

First, let's not forget that Marion Barry was convicted of smoking crack. The tape of Mayor Barry curled up into the corner of a hotel room sucking on a crack pipe like some two-bit junkie -- and his post-arrest conniption ("B**** set me up!") -- were broadcast repeatedly on national television. At the time of Frist's comment, Barry was running for mayor again (and was reelected). For those of us who live outside the Beltway, the idea that someone who was arrested for smoking crack could run successfully for office only a few years later is inconceivable -- and would be no less so were Barry white instead of black. As Eugene suggests, had Frist mentioned black, non-crack-smoking D.C. political figures -- or had he failed to criticize white, crack-smoking D.C. politicians (were there any) -- his critics would have a point. In sum, I believe that the default interpretation of Frist's comment should be that it contained no inherent racial message. To conclude otherwise would be to conclude that any criticism of black political figures is inherently suspect.

Not so with Lott's comment. I don't wish to reopen the whole Lott question, but I think it is fair to say that the default interpretation of any unqualified endorsement of Strom Thurmond's 1948 presidential run is that it implies acceptance of the 1948 Dixiecrat platform -- a platform focused, almost exclusively, on segregation. The only innocent explanation of Lott's remark is that Lott was ignorant and insensitive: ignorant of what the Dixiecrats stood for, and insensitive to the message that praise for Thurmond's 1948 campaign would inherently convey. This excuse may be plausible -- Lott's been known to say stupid things -- but it doesn't make the statement itself any less objectionable.

To sum up, I believe that the Frist comment was innocent on its face, but open to racial interpretations whereas the Lott comment was the reverse, racial on its face, but open to innocent interpretations. To me, that is a meaningful difference.


I would like to add one more factor to your brief blurb on the rape of women in combat. My father, a Ranger who served two and a half combat tours in Vietnam, is adamantly opposed to women in combat for the very reasons you cite. He has, however, one twist on your scenario.

Soldiers who are captured are trained to recognize that they will eventually break under the stress and pain of captivity. In light of this fact they are schooled to hold out as long as possible, in the hope that any information they give will be obsolete or altered by that time. With women in the POW camps, this ability to withhold information until the last minute would be compromised. A man hearing a comrade being tortured can say to himself: "Hold on, X. You can take it." But none but the most heartless of men could listen to, let alone witness, the rape of a woman knowing that it is within his power to stop it. The women POWs would be tools used to get at the other soldiers, and would compromise combat operations and the lives of others. This is in addition to the increased shock and trauma such events would have upon these men after their release.


WOMEN IN COMBAT: So here's the nasty issue that I think has to be at the heart of the women-in-combat debate -- the risk of rape of female prisoners of war.

     Soldiers of many nations have been notorious rapists of civilians; consider the reputations of the German, Soviet, and Japanese armies during World War II, as well as the more recent experience in the former Yugoslavia. (To my knowledge, rapes of civilians by American and British soldiers have been much less extensive, and much less widely tolerated or encouraged, than the rapes by the Germans, Soviets, and Japanese; but that's not even the point here. I'm not trying to lay blame or give credit, but simply to point out that many nations' soldiers have in fact engaged in mass rapes of the women they could safely get their hands on.) There's little reason to think that the nastier of today's armies, including the very ones we're likely to be fighting, will act much differently towards our soldiers. Yes, there is always the risk of homosexual rape of male soldiers, but a variety of factors suggest that the likelihood of any male prisoner of war being raped is much less than the likelihood of a female prisoner of war being raped.

     Why should we worry about this, given that we send our soldiers to risk death? I suppose one could say that a high risk of rape is a serious enough extra threat, even compared to death, that we shouldn't inflict it on our soldiers; and that might be enough to, for instance, justify not drafting women into combat positions. But it doesn't by itself justify prohibiting even volunteers from serving in combat. After all, some soldiers volunteer for tasks where the risk of death is considerably higher than the average; why shouldn't they also be able to volunteer for tasks where this other risk is present?

     The answer, I think, has to be that the risk is such that it will substantially change the behavior of soldiers, and of their comrades. Combat units with women in them may be especially reluctant to surrender, either because the women soldiers refuse, or because men soldiers refuse to put their women comrades at risk. Officers may be especially reluctant to send such units (or, if all units have women, then to send units generally) into a position where they might have to surrender. Missions to rescue prisoners may become a much higher priority. Civilian reactions to the war would be affected in a variety of ways.

     Nor could this be dealt with by simply saying "The women soldiers voluntarily assumed the risk." Even if every woman who really wants to be a soldier signs a statement acknowledging the danger and specifically agreeing that fellow soldiers can and should put the danger out of their minds (and I suspect that some brave and patriotic women soldiers would indeed be willing to take such a view), I doubt that many people will be much affected by this sort of waiver.

     Some male soldiers might reason "True, my comrade may end up getting raped if we surrender, but, hey, she knew the job was dangerous when she took it, so I'll just put the risk out of my mind" -- but many soldiers wouldn't. I'm not a military expert, so maybe I'm missing something big here, but this seems to be simply human nature. If you were a female soldier, a male platoonmate, or an officer, would your judgment really be unaffected by the risks I describe, even if everyone involved had signed every conceivable waiver?

     Now none of these pressures are completely novel. Soldiers' decisions to surrender or not to surrender, commanders' decisions to commit troops, soldiers' and commanders' decisions to rescue prisoners, and the public's willingness to continue a war more generally (or perhaps to fight it especially harshly) have long been affected by a variety of factors that might not fully coincide with the nation's best interests (which, after all, is what military staffing decisions are supposed to ultimately serve). But it seems to me that the risk of women prisoners being raped will likely be yet another factor -- and an important one -- that can drive people's decisions, and can increase the risk to other soldiers' lives and to the success of the war effort.

     The other concerns about women in combat may also have some merit. The possible risks to unit cohesion do strike me as quite serious, as does the risk of pregnancy, even setting aside rape. The disparity between male and female strength is not as relevant, I think, because it just counsels in favor of using actual strength, rather than gender, as the criterion. But the most important factor, I think, is the one I describe above, because it seems to me much harder to eliminate through measures short of excluding women from combat forces.

     Again, I stress that I'm not an expert on the subject, and it might well be that there's some empirical evidence that I'm missing here. Moreover, I come at this with the presumption that women and men should indeed be treated equally, even in military matters, unless there's some very powerful reason to the contrary. Here, though, it seems to me -- at least tentatively -- that there is indeed such a powerful reason.


GAY INDEX: Also in the Washington Monthly, an urban-planning article by Richard Florida on The Rise of the Creative Class. The growth of cities, he says, depends on their ability to attract what he calls the "creative class":

The super-creative core of this new class includes scientists and engineers, university professors, poets and novelists, artists, entertainers, actors, designers, and architects, as well as the "thought leadership" of modern society: nonfiction writers, editors, cultural figures, think-tank researchers, analysts, and other opinion-makers.

Mmmm, think-tank researchers . . . I like this categorization already. And, says Florida, what it takes isn't building huge faceless downtowns and sports arenas, but providing the stuff the creative class values -- fun things to do, a diverse atmosphere, an openness to new ideas:

Talented people seek an environment open to differences. Many highly creative people, regardless of ethnic background or sexual orientation, grew up feeling like outsiders, different in some way from most of their schoolmates. When they are sizing up a new company and community, acceptance of diversity and of gays in particular is a sign that reads "non-standard people welcome here."

The most interesting part of the article is that the author constructs a Creativity Index for different cities, one component of which is "diversity, measured by the Gay Index, a reasonable proxy for an area's openness to different kinds of people and ideas." Surprise: San Francisco is the most creative city by this index, followed by Austin, San Diego, Boston, Seattle, Chapel Hill, Houston, Washington, New York, Dallas, and Minneapolis. ("Other measures I came up with, like the Bohemian Index---a measure of artists, writers, and performers---produced similar results.")

Well, any urban planner who uses a gay index as a measure of a city's coolness is, by osmosis, cool himself, and probably right to boot. Politically, if his ideas catch on, I don't want to speculate on whether the world would be better or worse. I dislike cities' building sports arenas and faceless downtowns, because these often end up kicking people out of their homes and businesses through eminent domain, and regardless, tend to be a waste of tax money. New London, Conn. is the tenth most creative small city, but I know it as the site of an eminent domain lawsuit I helped out with, so this theory would provide ammunition in favor of keeping the blue-collar immigrant neighborhood that the city of New London was trying to kick out. On the other hand, the emphasis on "authentic" experiences and local charm could reinforce the opposite sin of development restrictions and restrictive zoning.

Just appreciate the Gay Index, is all I'm saying.

UPDATE: For some alternative perspectives on urban planning, see this column by Robert Robb (link from reader Jim Morse). I see the Robb article less as questioning Florida's descriptive analysis than as warning that artificially creating "creative conditions" may not be sustainable, since if these conditions don't exist in a city it's probably because there isn't really demand for them. So it's more against drawing simple-minded policy prescriptions from Florida's work (cities that attract cool people grow fast, therefore attract cool people for fast growth).


WOMEN IN COMBAT: In the Washington Monthly -- a magazine I greatly respect -- an interesting article by Phillip Carter on women in the armed forces, discussing how the gender composition of the army has changed in the last decade in favor of women, how this change was driven by women in the armed forces who want to get the combat experience necessary for promotion, and how the next war, possibly with Iraq, will test whether that was a good idea.

I don't know much about the issue, and it seems that the article makes too much of a point of disproving strong negatives -- for instance, units with women in the Gulf War didn't collapse under the stresses generated by the women's presence; the worst predictions in Afghanistan didn't come true, and there were no reports of mass pregnancies. What about moderately bad predictions -- mild (but disruptive) amounts of pregnancy, units performing marginally less well because of stresses generated by the presence of women? So I don't know how much to trust the author here. But if we do go to war, I'll be glad to see what military experts conclude about the effectiveness of women in the military, and I hope the current trend continues.


PEOPLE STILL UNCLEAR ON THE CONCEPT: But note, in the San Francisco Rainbow Coalition post below . . . "anti-Semitism"? and "oppression"?


"DENOUNC[ING] AN UNPOPULAR BLACK PERSON": William Saletan has an interesting piece in Slate about the charges that Frist engaged in race-baiting when he said -- in his first 1994 Senatorial race against Sen. Jim Sasser, D-Tenn. -- "While I've been transplanting lungs and hearts to heal Tennesseans, Jim Sasser has been transplanting Tennesseans' wallets to Washington, home of Marion Barry." (See Mickey Kaus and Josh Marshall for more on this.) The article's ultimate conclusion is that Frist isn't guilty, partly because he also denounced other, white, politicians.

     Nonetheless, while I generally like the piece, I'm somewhat troubled by one thing that it says:
The Barry line, however, seems to meet the Lott standard: It's open to innocent interpretations but conveys a racial message to anyone who wants to hear it that way. Like Marshall, I get suspicious whenever a white politician denounces an unpopular black person before a white audience.
Now the author is speaking of what makes him suspicious, not of what he thinks is per se wrong, but still -- is it really good for us to become "suspicious" (and suspicious of racism) simply because people denounce unpopular black politicians, who may richly deserve denunciations? Seems to me that in a democracy, denouncing politicians, whether white or black, is a fundamental check on power and abuse. Black politicians ought to be as subject to this as white ones, Jewish or Catholic ones as much as Protestant ones, female as much as male, and vice versa.

     Some might point out that indeed some speakers who condemned Marion Barry might have been racist, as were some listeners with whom such condemnations scored points. But just how much weight should this carry? I don't think that we should stop speaking out against bad people or bad policies simply because some other people who agree with us do so for racist reasons. Even a stopped clock is right twice a day; even racists might accidentally stumble on the right view (e.g., that Marion Barry was a bad guy [clearly right], and that Congress should have exercised more oversight over money it was sending to a badly dysfunctional city government [possibly right]), if for the wrong reason.

     I think the Slate piece is correct when it suggests that things may be different when someone only condemns black (or Jewish or female or Catholic) politicians and doesn't likewise condemn similarly situated white, male, or Protestant politicians. (As I mentioned, the piece concludes that Frist does not fit into this category.) But I think that before we express suspicions of racism, we ought to have at least this sort of evidence -- and not just the fact that a white politician has "denounc[ed] an unpopular black person" (who might well be quite justly unpopular) in front of a white audience.


Besieged by Jewish protesters, San Francisco's Rainbow Grocery backed away from a partial boycott of Israeli-made goods Sunday . . . .

The worker-owned supermarket caused an uproar when two of its largest departments, packaged foods and bulk foods, banned certain Israeli-made goods to protest Israel's treatment of Palestinians.

But as 200 people gathered in front of the grocery Sunday for a demonstration, the cooperative's board of directors announced that the boycott was null and void.

"We apologize for the lack of oversight in allowing our internal processes to break down," said a director, Seanetta, who goes by only one name, reading from a prepared statement explaining that the two departments acted on their own. "We at Rainbow Grocery Cooperative do not tolerate any anti-Semitism on any level, or any form of oppression." . . .
Thanks to reader Dick King for passing this along.


SHOULD SAUDI ARABIA BE ALLOWED TO JOIN THE WTO, GIVEN ITS CONTINUING ANTI-ISRAEL BOYCOTT? E.V. Kontorovich (a lawprof at George Mason) argues that it shouldn't:
The accession process requires nations that don't have open economies to make radical, and often painful, reforms in the direction of free trade, as China did recently. Riyadh, on the other hand, has been tightening a boycott of Israeli goods, a secondary boycott of American and other firms bold enough to trade with Israel and even a tertiary boycott of businesses trading with businesses that trade with Israel. Other boycotting nations with much smaller economies -- Lebanon and Algeria -- have applied to join the WTO, but their candidacies have not progressed as far.
I don't know enough about the subject to have an informed opinion, but Kontorovich's op-ed seems much worth reading.


JERUSALEM POST COLUMN ON MORMONS AND JEWS: I think it's generally quite excellent (and I thank reader Larry Rothenberg for passing it along); here are some key excerpts:
Gevalt! The Mormons are out to get us when we die. . . .

This is what we Jews are afraid of? Have we fallen, to borrow a Hebrew expression, on our heads?

I know, I know. . . . It's the principle of the thing. After all, we resent it when Christians try to convert us while we're alive, even if they don't succeed. We even have a law against missionizing in the State of Israel. Why should it be different because we're dead?

IT SHOULDN'T be. Our resentment in both cases is uncalled for. So is the law against missionizing. So is the whole idea that there's anything wrong with Christians trying to convert us. It's their right to try, and it's time we grew up and stopped being such crybabies about it -- especially when, far from being something to cry about, much of it, like the Mormons' plan for church expansion in Heaven, is so funny. . . .

[A]s a member of a tiny minority in this world, [I need] to try to see to it that the world lives as openly, freely, and fairly as possible, because these are the conditions under which minorities do best. The last thing I want is to live in a society which thinks it has to control or legislate what some people are allowed to tell other people about religion.

I would be upset, as I think most Jews would be, not only if the United States Congress were to make it illegal for Jews to proselytize among Christians, but even if such proselytization were to be frowned upon by American society at large -- and since it was a famous rabbi who said, "Do not unto others what is hateful unto yourself," it is both hypocritical and foolish for Jews to want to ban Christian missionizing anywhere, including Israel, in this life or the next. . . .

Deep down, of course, the real Jewish fear is that if Christians are permitted to go all-out to convert Jews on a level playing field, they will succeed. This is a fear, however, that calls not for whining about Christianity, but for thinking seriously about Judaism. If it's justified, we'd better understand why. If it isn't, we should outgrow it.

We can't have it both ways, happy to be treated by Christians as their political and social equals while clamoring to be regarded by them as a protected minority. When you're really equal, you don't need special protection.


THOUGH IT RISKS REOPENING AN OLD STORY, I thought I had to pass along a claim of actual facts about the warm gun question. OK, it's not exactly actual facts, but merely something printed in a book; and as we know, it may be sheer rumor or fabrication despite that. Nonetheless, here it is, from reader Richard Horvath:
I've dug out the book with at least some information on the song. The book itself is "Revolution in the Head: The Beatles' Records and the Sixties", by Ian MacDonald. The book does mention that the title came from a "slogan of America's National Rifle Association, which Lennon saw in a gun magazine", page 254. It also attributes a quote to Lennon, "I thought, what a fantastic, insane thing to say. A warm gun means you've just shot something", without providing a source.

Also, in footnote 3 on the same page, the author states that Lennon denied that the song had anything to do with drugs. MacDonald also suggests that the song does have erotic undertones (or would it be overtones?) given the influence Lennon's relationship with Yoko Ono was having over most of his music at the time (in other words, it looks like MacDonald is taking a wild guess on the real "meaning" of the song).

As a side note, it appears that the BBC did ban the song for its sexual references.
Come to think of it, I am highly skeptical that this was ever a slogan of the NRA, and though it's not implausible that it might have appeared in a gun magazine, I'm far from sure about that, either. Still, here it is, for whatever it's worth.


MALAWI IN THE NEWS: Todd Seavey e-mailed me to point out that Malawi, despite its Official Status as one of the world's most obscure countries, is in the news:
Rumours that Western aid agencies are collecting human blood in return for food aid in famine-stricken Malawi have spread panic among villagers, who are barricading their homes, fearing attacks by vampires. The anxiety is so great in parts of the south that farm workers in one of Africa's poorest countries are staying at home. . . .
I suppose this is further evidence of Malawi's obscurity, though -- other countries make it into the news through more traditional channels (economy, science, culture, naked aggression), but to get Malawi in the papers requires vampire stories.


ENGLAND, CRIME, AND FOOD: I just read, at the recommendation of guest blogger Iain, Theodore Dalrymple's essay "The Starving Criminal," and I echo Iain's recommendation -- it really does seem to be first-rate. Readable, thoughtful, and (if accurate, and I have no reason to doubt this though I stress as usual that I'm not expert enough to know for myself) quite telling.


SEASON'S GREETINGS: I've always hated the phrase "Season's Greetings," which strikes me as a classic example of leaching all the vigor and familiarity from an American tradition in the name of inoffensiveness. It grates on my ear on so many levels. First, no one celebrates the season as such. No one gets warm feelings when they hear about a "season" in the abstract as opposed to a particular concrete holiday.

     Second, who ever says "[fill-in-the-blank]'s Greetings"? Have you ever heard of anyone saying "Summer's Greetings"? "Graduation's Greetings"? This isn't the way normal English speakers talk.

     Third, can't we have something a bit warmer than just "Greetings"? There's nothing terribly friendly or pleasant in greetings as such. Wishing someone a happy this or a merry that is a nice gesture, but just greeting them conveys little by way of good feeling. (True, I don't approve of criticizing well-established phrases on the grounds that they're different from the sum of their parts; that's what idioms are like. But when a new idiom is being pioneered, one should look closely at its elements, since at least its original meaning and connotation will flow from those elements.)

     Now I've never much minded Merry Christmas. I am not a Christian and wasn't raised a Christian, but I certainly like having a merry day on December 25th; I'm even happy to wish others a Merry Christmas. Merry Christmas isn't an attempt to proselytize -- it's merely an expression of goodwill. I can see how some people who didn't grow up with a Christmas tradition (whether or not they're religious) might not be particularly moved by it. But I think it's unreasonable for people to be offended by this. (As I may have mentioned before, it's vitally important to maintain a distinction between reasonable offense and unreasonable offense, even as a matter of manners rather than law. It's obviously hard to tell what offense is reasonable and what isn't, but it's necessary, lest we lose literally hundreds of words and phrases that someone somewhere for some reason thinks are offensive.)

     Still, even if one decides that one doesn't want to wish people a Merry Christmas, why not stick with a familiar, sensible, and pleasant wish -- Happy New Year? The year is changing; even those Americans whose traditional or religious calendars have a different new year's day in fact live their lives by the standard American calendar. We all want our new year to be happy.

     Hence, "Happy New Year"; nothing wrong with that. "Happy New Year" -- traditional, simple, warm. "Season's Greetings" -- bureaucratese, bland, and formal.

     I am not invariably anti-P.C., in part because sometimes the political correctness forces do have it right. Columbus isn't someone to be revered (though he was no great monster by the standards of his monstrous day). Using "gay" as a general pejorative, which is apparently all the rage among kids these days, is hardly right. We should be aware that we live in a society filled with people from many traditions, and sometimes (though only sometimes) adjust our actions, words, and thinking accordingly.

     But "Season's Greetings" captures much of what makes so much political correctness so obnoxious: a tin ear for the language; a pointless rejection of cherished traditions; an insistence on finding offense where none is intended; a bureaucratization of the way we express ourselves; and the rejection of perfectly sensible alternatives that actually fit the way real Americans speak.

Tuesday, December 24, 2002


FEELS LIKE I ONLY JUST GOT HERE: Time for me to turn to familial duties and leave the verdant shores of the Conspiracy. If you've been entertained, interested or even repelled by some of my posts here, please feel free to surf over to The Edge, where you'll find much of the same, as well as priapic Spartans. And my compliments of the season to all of you.


PARDON MY COUGH: It seems that the WTC cough, seen as a harbinger of respiratory doom for New Yorkers, is not as significant as it first appeared. The cough is most likely due to hacking up large particulate matter, which will not be dangerous long-term (depending on the size of the particle, obviously). Smaller particulate matter, which is the real long-term danger, comprised only 1% of the dust cloud, compared to the 20-30% that was feared. The WTC cough made page A1 of the Times on September 10 this year, so one is entitled to ask why this piece of good news was reported on page A22 on Christmas Eve.


IN PRAISE OF SNOWMOBILES: First the Times comes out for DDT, now a New York Times columnist defends the Bush administration's decision to allow snowmobiles in Yellowstone. Is there something in the water?


DDT'S BELATED VINDICATION: The New York Times offers a welcome Christmas present to African nations ravaged by malaria -- if only they had taken such a sensible position sooner. (Perhaps if they had read this or this . . . )


ANARCHY IN THE UK: More evidence of the crisis facing the British legal system. My old friend Michael Gove of The Times of London examines two recent decisions: the ruling of the Lord Chief Justice that first-time burglars should not be imprisoned, and the decision of the Metropolitan Police to run a huge advertising campaign urging people to inform on their neighbors if they have said things that might be homophobic, racist or otherwise discriminatory. Michael sees these twin developments as a sign that the UK is on the verge of anarchy. His conclusion:

It is ultimately respect for the law, firmly and fairly applied, that keeps anarchy at bay. If the law will not protect my property by taking those who steal it off the streets then why should I continue to respect it? If it becomes a means of enforcing one, limited, set of acceptable opinions then how can I be certain that mine are among those that are worthy of respect? What sort of law is it that cannot defend my free enjoyment of either private property or public discourse? The sort of law an anarchist might design.

A very important point, very well put.


TORTUOUS REFORM: Britain has a very low and decreasing conviction rate for indicted criminals, which may be one of the reasons crime rose in the UK while it fell in the US, as this joint study by the Bureau of Justice Statistics and Cambridge University points out. Her Majesty's Government, however, has decided that this is because little things like due process get in the way of "justice." So, they've decided to phase it out. The right to trial by jury and the principle that the jury should only hear facts directly relevant to the case at hand are among the first to go. I'm not a fan of the Lord Chief Justice of England and Wales, but he is right to object to the suggestion that juries should be allowed to hear the defendant's criminal record in deciding a case.

I recently looked at the US Bill of Rights from an English standpoint, representing as it does the received wisdom, with a few minor quibbles, about the historic rights of Englishmen. Barring the obsolete 3rd Amendment, the rights guaranteed in every other Amendment have come under attack in the UK in recent years:

Amendment 1: Freedom of speech has been further restricted by the EU crime of 'xenophobia'.
Amendment 2: Notwithstanding the Bill of Rights 1689, it is now virtually impossible for an ordinary householder to own a gun legally.
Amendment 4: The Prevention of Terrorism Act and the Regulation of Investigatory Powers Act allow searches and seizures that would be banned under the traditional system.
Amendment 5: Double jeopardy protections are to be abolished when there is "compelling new evidence." The government essentially re-nationalized the railroad infrastructure company Railtrack without initially offering compensation to shareholders.
Amendments 6 and 7: The right to trial by jury is to be restricted further.
Amendment 8: The British libel laws allow excessive fines, such as those that ruined Nikolai Tolstoy. The punishment of householders such as Tony Martin who were defending their lives and property might be regarded as cruel and/or unusual.
Amendment 9: The European Charter of Fundamental Rights and the Human Rights Act contain no such clause, thereby implying that the rights contained therein are the only ones that can be enjoyed.
Amendment 10: The European Charter of Fundamental Rights contains the opposite of this clause, allowing the EU and member states to suspend "fundamental" rights in the interests of advancing the Union.

And most of these innovations come from a "modernizing" and "reforming" Government. Just why are those epithets generally used approvingly?

Monday, December 23, 2002


ANTI-SEMITISM IN EUROPE: Just to point out that Britain isn't the hopeless case I'm often accused of making it seem, I think I ought to point to this Anti-Defamation League poll (PDF link), which helps prove that Britain is probably the least anti-Semitic nation in the Western world, including America. The ADL proposes four statements that seem to be pretty good indicators of anti-Semitism. Here are the percentages agreeing with them in each of Germany, France, the US and the UK:

Jews don't care what happens to anyone but their own kind
Germany: 24%
France: 20%
USA: 16%
UK: 10%

Jews are more willing to use shady practices to get what they want
Germany: 21%
France: 16%
USA: 16%
UK: 11%

Jews are more loyal to Israel than to this country
Germany: 55%
France: 42%
USA: 33%
UK: 34%

Jews have too much power in the business world
Germany: 32%
France: 42%
USA: 24%
UK: 21%

All of which probably indicates that the strong support for the Palestinians seen in UK polls is driven more by media coverage than by latent anti-Semitism (this is borne out, I think, by the poll at hand, which indicates that Britons are more likely to support the Palestinians if they follow the crisis closely). Once again, however, this shows how careful people should be when they generalize about "European anti-Semitism."


THE PROPORTION OF INIQUITY. My colleague Orin offers criticisms (a few posts below) of the New York Times' coverage of Total Information Awareness. This puts me in a most difficult position. When asked to pick between Howell Raines and John Poindexter, I am put in mind of what Dr. Johnson said when asked whether Rousseau was as bad a man as Voltaire: "Why, Sir, it is difficult to settle the proportion of iniquity between them."


THE DEVIL IS IN THE DETAILS: That odd magazine, The Economist, has a superficially interesting article contrasting the fortunes of Britain's Conservatives and America's Republicans. The thesis is not that the Tories are exceptional in their straits, but that the Republicans are blessed by operating in a right-wing country. But that's all going to change, so the underlying message is that liberal conservatism is doomed. I have no doubt that Tony Blair and his fellows will be delighted by this message. The trouble is, however, that the indicators the Economist uses to support its thesis that America is right-wing have been cherry-picked or distorted. Take, for instance, the suggestion that America is right-wing because it is a largely Christian country. This is a somewhat odd statement coming from a country that has an established Christian Church that is generally regarded as left-wing. Does opposition to abortion make you right-wing? Tell that to the new Archbishop of Canterbury, probably the most left-wing holder of that office ever, who is personally opposed to abortion. Tony Blair has qualms on the subject too, we are often told. The Economist is on sounder ground when it refers to the distinction as cultural -- if fewer than 1 in 50 Britons says grace, then that is largely because the family meal is also disappearing as a social activity, as Theodore Dalrymple pointed out in the most important article about the UK I read all year. Britain's traditional culture is certainly in much shakier shape than America's.

Nevertheless, The Economist's choice of social indicators strikes me as deliberately designed to emphasize "right-wing" issues. The "gun deaths" indicator, for instance, includes suicides, which research has regularly shown are indifferent to method -- if guns are not available, people will jump off cliffs. But if The Economist had wanted to show the reverse, they could. Here is a table of 6 indicators chosen to show that America and the UK aren't that different overall, or that the US is a better place to live:

Index of hourly compensation per production worker (2000): US 100, UK 80
Unemployment rate (2000): US 4.0%, UK 5.5%
Police-recorded violent crimes per 1000 population (2000): US 5, UK 10
Number of months served in prison per murderer (mid-90s): US 59.2, UK 52.2
Defense spending as %age of GNP (1997): US 3.3, UK 2.7
Gross public debt as %age of GNP (2000): US 59.5, UK 53.5

Given its importance in the UK political scene, one might also add:

Health expenditure as % of GDP (1999): US 13.7, UK 7.0

The picture you'd get from these stats is of an America where the working man is less likely than his British counterpart to be out of work, better compensated, and less likely to be a victim of violence, while the American state as a whole is not much tougher on crime than Britain, spends little more proportionately on defense and finances its public spending much more by debt than by taxes. More right-wing? Not from these stats. Less committed to social justice? Hardly -- the American is much more likely to have a job, a good wage and to live free from the fear of crime (and to get decent, quick medical care).

The British historian Simon Schama -- no conservative, he -- recently commented that Britain's national character exhibited 'a passion for social justice with a tenacious attachment to bloody-minded liberty.' But, as is the nature of these things, the balance between the two tends to swing. In the 80s-mid 90s, Britain was more bloody-minded about liberty than it was passionate for social justice. The pendulum is currently near the far end of its swing the other way. It will swing back. The same is, I think, true about America, although the pendulum is in the bloody-minded part of its swing at the moment. This isn't about one country being more conservative than the other. It's about both sharing the same national characteristics, but being slightly out of synch at the moment. Anyone who misinterprets these swings as proving that one or another political party is finished (as one might have thought about the Labour Party in 1987) is no student of history.


GRAMMATICUS: The well-bred one has kindly asked me to "guest-blog" for a couple of days. As he says, I promised Eugene my posts here would be primarily scientific or Anglospheric. However, just to be perverse, my first post at the Conspiracy has naught to do with statistics or science and only tangentially with the Anglosphere. I feel I must add something to the discussion about split infinitives and end prepositions by citing the greatest of English grammarians, H.W. Fowler and Sir Ernest Gowers (Modern English Usage, 2nd Edn., 1965). Of the split infinitive, they say:

The English-speaking world may be divided into (1) those who neither know nor care what a split infinitive is; (2) those who do not know, but care very much; (3) those who know and condemn; (4) those who know and approve; and (5) those who know and distinguish.

1. Those who neither know nor care are the vast majority, and are a happy folk, to be envied by most of the minority classes. 'To really understand' comes readier from their lips and pens than 'really to understand'; they see no reason why they should not say it (small blame to them, seeing that reasons are not their critics' strong point), and they do say it, to the discomfort of some among us, but not to their own.

They go on to describe the barbarities committed by the other groups, identifying the common problem that adverbs inappropriately placed sound awkward. This goes as much for an awkward split infinitive as it does for an adverb placed oddly in order to preserve the integrity of the infinitive. They conclude:

We maintain ... that a real s.i., though not desirable in itself, is preferable to either of two things, to real ambiguity, and to patent artificiality ... more than that, we will freely admit that sufficient recasting will get rid of any s.i. without involving either of those faults, and yet reserve to ourselves the right of deciding in each case whether recasting is worthwhile.

Ah, reserved rights. How anglospheric!

As for prepositional placement, they state:

The fact is that the remarkable freedom enjoyed by English in putting its prepositions late and omitting its reflexives is an important element in the flexibility of the language. ... Follow no arbitrary rule, but remember that there are often two or more possible arrangements between which a choice should be consciously made. If the final preposition that has naturally presented itself sounds comfortable, keep it; if it does not sound comfortable, still keep it if it has compensating vigour, or when among awkward possibilities it is the least awkward.

The English way is, in so many things, if nothing else, the triumph of sense over rule. Finally, to those who might quote Churchill in this area, mentioning no names, please remember:

If the 'preposition' is in fact the adverbial particle of a PHRASAL VERB, no choice is open to us; it cannot be wrested from its partner. Not even Dryden could have altered which I will not put up with to up with which I will not put.

And is it not wonderful to see the great lost art of the semi-colon in all its glory?


IAIN MURRAY GUEST-BLOGGING TODAY AND TUESDAY: Please welcome Iain Murray of The Edge of England's Sword (one of the most dramatic blog names I've seen), who'll be guest-blogging today and much of the day tomorrow. Iain's main interests are in the use and abuse of scientific and statistical information (among other things, he wrote the Encyclopaedia Britannica's item on the use of statistics in the gun control debate), and in what the Anglosphere countries can learn from each other. Iain is reachable at iain at


HOW MANY DAYS IN A ROW CAN THE N.Y. TIMES FEATURE A STORY THAT DISCUSSES "TOTAL INFORMATION AWARENESS" AND WARNS OF THE COMING ARRIVAL OF BIG BROTHER? We're up to three days in a row right now, but this may be just the beginning. Today's installment is a piece by John Markoff and John Schwartz titled Many Tools of Big Brother Are Up and Running. Here's the gist of the piece:
In the Pentagon research effort to detect terrorism by electronically monitoring the civilian population [through Total Information Awareness], the most remarkable detail may be this: Most of the pieces of the system are already in place.

Because of the inroads the Internet and other digital network technologies have made into everyday life over the last decade, it is increasingly possible to amass Big Brother-like surveillance powers through Little Brother means. The basic components include everyday digital technologies like e-mail, online shopping and travel booking, A.T.M. systems, cellphone networks, electronic toll-collection systems and credit-card payment terminals.
. . .

The civilian population, in other words, has willingly embraced the technical prerequisites for a national surveillance system that Pentagon planners are calling Total Information Awareness.
     As a substantive matter, the piece seems, well, a bit much. But what's more interesting to me today is how often the Times has been hitting this theme (the American people foolishly giving up their privacy to Big Brother in the age of terrorism, etc.) in the past week or so. Yesterday, Margaret Talbot made this argument about TIA in an essay in the Sunday magazine titled Losing the Home Front. She wrote:
Total Information Awareness would be a sacrifice -- indeed, a violation of what it has long meant to be an American in relation to the government -- that would last beyond wartime. Unlike those, it would be a renunciation less of comforts than of principles. War will claim a home front, if it's useful enough, and pliant enough, whether we've decided we're living on one or not. So far, the war on Al Qaeda hasn't demanded much more from most of us than airport security hassles, to which we've pretty well grown accustomed. But it could soon be different, and so far, we've hardly noticed.
The day before that, the Times hit the theme through the article on the Denver police investigations that I blogged about on Saturday here.

     Beyond TIA specifically, the Times has averaged about one article a day on the basic theme of technology and civil liberties (and how the former is encroaching upon the latter in the post 9/11 age) during the last week. Here's my tally for the last six days:
1) Many Tools of Big Brother Are Up and Running, by John Markoff and John Schwartz (12/23)
2) Cities Wary of Antiterror Tactics Pass Civil Liberties Resolutions, by Michael Janofsky (12/23)
3) Losing the Home Front, by Margaret Talbot (12/22)
4) Going Electronic, Denver Reveals Long-Term Surveillance, by Ford Fessenden and Michael Moss (12/21)
5) Bush Administration to Propose System for Wide Monitoring of Internet, by John Markoff and John Schwartz (12/20)
6) Some Companies Will Release Customer Records on Request, by John Schwartz (12/18)
     I may be missing one or two articles; I'm just creating this list from memory and some past posts.

UPDATE: I MISSED ONE! I was wondering how the Times could miss a day -- you'll notice that the list above has no entry for 12/19 -- so I did a quick search and realized that yes, there actually was a story on privacy, surveillance, Total Information Awareness, etc. in the December 19 paper: Study Seeks Technology Safeguards for Privacy by John Markoff. So that's six days in a row! Way to go, Times!

Sunday, December 22, 2002


WISHFUL THINKING. Speaking of the New York Times, about three weeks after the Sept. 11 terrorist attacks, Linda Greenhouse announced in that newspaper's pages that "The Supreme Court's federalism revolution has been overtaken by events." She said that reading the Court's opinion in Printz v. United States, one of its recent cases (from 1997) limiting federal power, "is like unearthing an artifact from a bygone era." Then she continued with the thought that "The end of the federalism revolution raises another question..." again repeating the assumption that a major shift had occurred at the Court.

     The basis for Greenhouse's assumption, of course, was not anything the Court had said. It was her own wishful thinking. She had never liked the Court's federalism cases anyway, so when events occurred that seemed like they might reinforce the case for a strong national government she rushed to declare that her preferred outcome -- the demise of that branch of the Court's jurisprudence -- had come to pass. Or perhaps she was hoping that saying this would make it more likely to become true; maybe the Justices would read it, or read other writings pointing the same way, and conclude that their recent case law on federalism had become passé. In any event, Greenhouse's piece was embarrassingly premature.

     I mention this because another specimen of the same journalistic vice appears in today's New York Times. In the Week in Review section, R.W. Apple has a piece titled "Aftershocks of Lott's Fall." It begins in a fashion familiar from Greenhouse's piece a little over a year ago: "Could it really have been less than two months ago? It feels like a decade." Apple is speaking, of course, of the Republican triumph in the November elections. Republicans controlled both houses of Congress; and "[w]ith a highly popular Republican president ensconced at the other end of Pennsylvania Avenue, politicians and pundits described his party as an awesome machine standing on the verge of an era of partisan preponderance."

     But now, after Lott's resignation, everything is of course very different: "the Republicans were plunged back into a controversy that they thought they had left behind, a debate over the party's traditional reliance on racist or near-racist votes in the South." Apple asks: "will the whole Lott affair quickly recede into insignificance?" He is obliging with the answer: "In political terms, certainly not. The difficult question of race is back, revived and dangerous, and President Bush faces a far harder task in his campaign to demonstrate that the Republican Party is still the party of Lincoln, open to all."

     So true, isn't it? Yessir, Bush and his party are in real trouble. And I'm sure we all probably share Apple's sense that those Republican victories in November seem like they happened a decade ago. Quite a distant memory, really.

     Apple's piece is another instance of a depressing variety of amateurism found increasingly and surprisingly often in the work of veterans at the New York Times: journalists prematurely announcing the arrival of outcomes they are rooting for -- or trying to make the outcomes they are rooting for more likely to occur by declaring that they already have arrived.

This page is powered by Blogger.