A selection, first, of commentaries and reviews, followed by longer feature stories, all of them appearing in The Dallas Morning News. If you're not sick of me after all this, Critical Mass, the National Book Critics Circle's website, ran an interview here back when I was still on staff at the News.


Book column
July 18, 2004

When a book becomes a front-page "event," its reviews are akin to news photos. They're snapshots of book critics chasing after insights.

Needless to say, this isn't an admired innovation in reviewing.

A book's release can become a news event when, as with Bill Clinton's memoir, My Life, it's a "one-day laydown." Publishers make bookstores and online services put the book up for sale all on the same day -- with no advance peeks for critics. Like everyone else, reviewers must scramble to buy a copy. Unlike everyone else, we then have to write a book report. Very, very quickly.

As recently as five years ago, such laydowns were rare. But since June of last year, in addition to My Life, we've had Bob Woodward's Plan of Attack, Madonna's children's book, the new Harry Potter and Hillary Clinton's Living History.

As one of the overcaffeinated critics who has read and reviewed each of these in less than 48 hours, I abhor what the 24-hour news cycle has done to thoughtful considerations (and the fact that The New York Times always gets a look before the rest of us, anyway). This is impulse-reaction reviewing.

Shamelessly, publishers have been manipulating this trend. The publicity advantages that a "laydown" brings are too good to pass up. For once, a book can have as much media clout as a Hollywood movie.

Books are small potatoes in the media -- but not because our increasingly portly selves are turning dim and plant-like in the light of the internet. The great mass of readers simply don't experience a book -- like a TV show -- all at the same time. It takes months for us to check it out. What's more, books generally don't make gigant-o profits. So they're not exalted like bankable movie stars. But with a mass laydown, a book has a shot at just that kind of flashbulb celebrity. The everyone's-talking-about-it-at-once kind.

For what it's worth, I always was good at cramming for tests. This may explain any knack I have for these turnarounds. But they're still scary -- there's always the chance I'll miss the key phrase that will become the next day's "talking point" for TV and radio gasbags -- none of whom has actually read the book.

Not that those talking points really matter. The coverage of such books follows a general pattern. The first reports rush out with controversial tidbits that don't do justice to the book's larger argument, good or bad. Recall that The Price of Loyalty initially sparked talk for its portrait of President Bush as a clueless manager. Weeks later, the more thoughtful reviews pushed forward the book's real thesis -- that the administration didn't care about deficits or what its tax cuts did to the economy.

In this zippy new world of drag-race reviewing, Mr. Clinton's My Life was a surprisingly easy read. At 957 pages, this was a small mercy.

Unlike most executive memoirists, the former president is often funny, poignant and even inspiring. But then he buries it all to impress us with how busy he was in the White House. It's as if he hopes to earn extra credit.

Yet this is also why My Life was good material for a high-velocity review. The best writing is in the first half, so one gets a sense of Mr. Clinton as an author early on. After that, he mostly reads us his day planner. It's a frictionless skim: just look for the "hot" parts.

In contrast, good fiction -- with nuanced characters and shifting stories -- is much harder to seize and synopsize like this. I'm grateful that, so far, I've not been required to reduce a fine novel to a fast-food, drive-through meal.

That will come. At their best, news photos rise to the level of art. And in time, these panic-attack reviews may, too. Right now, publishing industry trends reflect the fact that we're at war and in a brutal election. We're getting fired at from all directions, book-wise.

There'll be lots of practice before I have to speed-mulch a novel.

Flap about James Frey's "memoir" ignites debate over mixing fact and fiction
Book essay
January 29, 2006

[NOTE: This essay was written within hours immediately after Frey's appearance on Oprah's show. The final suggestion in the essay was made, therefore, before it became a common industry response.]

To embellish a famous line by that noted nonfiction author, B. B. King: Don't trust nobody 'cept your mother.

And she could be jivin', too.

If any good comes out of the controversy surrounding the now-admitted embellishments in James Frey's A Million Little Pieces, it would be that readers might be more sophisticated about what they read in memoirs, especially those by recovering addicts boasting about their bad selves. And that publishers would fact-check manuscripts more -- that is, fact-check them at all.

Or perhaps they could develop a labeling system to distinguish poetic, first-person reveries from archival research. The labels could look like those government warnings on nicotine, saving us from any exposure to literary invention: "Caution! This autobiography contains 5 percent fiction."

Not very likely scenarios, any of them. But several were vehemently proposed on Thursday's Oprah Winfrey Show. Having boosted Mr. Frey's book, not only by picking it for her club but also by defending it on-air during Larry King Live, Ms. Winfrey brought the author back and raked him over the coals for betraying her faith in him.

She said he "betrayed millions of readers," but we know which reader has her name on Oprah's Book Club. She tossed in his publisher, Doubleday, as well, for deceiving her over early reports about the book's inaccuracies, and then she took on the book industry at large.

When Nan A. Talese, whose Doubleday imprint published A Million Little Pieces, explained that memoirs are based on human memory and publishers check them for possible libel claims but little else, Ms. Winfrey snapped, "Well, that needs to change."

Perhaps it will, now that the industry has embarrassed its Biggest Client. And has cast a pall over one of its biggest profit sources. Nonfiction sells; it outsells fiction by a huge margin, and since at least Tobias Wolff's This Boy's Life in 1989, memoirs have been hot: Tuesdays with Morrie, The Liar's Club, Running with Scissors, Angela's Ashes.

The great majority of frowny faces directed at Mr. Frey have been worn by journalists, including several of Oprah's guests Thursday. However the public may view us (promoters of disloyalty, suck-ups to power, outmoded old typists), journalists are, as a Bush administration official once put it, "reality based." We have a rather conservative faith in facts.

Not surprisingly, then, many of us have been loudly defending truth against all the purported liars: Mr. Frey, Oprah, the book industry in general. Which, in general, was taken aback by the furor. As Sara Nelson, editor-in-chief of Publishers Weekly, put it, publishers just sell people the stories they want to read. "Readers say they want the truth, but they can't handle the truth. Not unless it reads like a novel."

Indeed, Bill Bastone, the investigative journalist behind the Smoking Gun, the Court TV-owned Web site that broke the story about Mr. Frey's fabrications, has reported that 40 percent of the e-mails he has received have expressed outrage at him, not Mr. Frey. So what if the author never served three months in jail? So what if -- as The New York Times reported last week -- counselors at the clinic he supposedly attended say his account is untrue? It's called poetic license, isn't it?

"If it were my choice," Mr. Frey said in 2003, "A Million Little Pieces would be listed as literature. It doesn't really matter, though. What matters is how many people read it and how it affects them."

Actually, it does matter how it's listed. That's what this has all been about. Even as entertainment blurs with news, even as we can fake anything digitally on the Internet or in a movie, we're obsessed with the tabloid-authentic, with reality TV and its humiliating public confessions, like Mr. Frey's on Thursday. And like Ms. Winfrey herself, many of us have a simple faith in a publisher's imprint: It means they vouch for the book's content.

So, backed by the publisher's say-so, Mr. Frey stole a little of the power of the real. And he cast into doubt all other memoirs.

It was so simple once, pined Susan Salter Reynolds in the Los Angeles Times. Fact was fact, fiction was fiction -- until Tom Wolfe started mixing the two. Yes, it's all Tom Wolfe's fault. Conservative columnist John Leo, on the other hand, denounced Oprah and her cult of the "emotional truth" -- instead of the truthy-truth -- and took a few whacks at postmodernism for contending "there is no literal truth, only voices and narratives."

But then there was liberal columnist Frank Rich, one of Ms. Winfrey's guests, who found Mr. Frey's dishonesty to be part and parcel of the "White House propaganda machine" that, he says, has been selling stories to hide its incompetence with Katrina or the hard-right beliefs of Samuel Alito.

Each of these scenarios posits some burgeoning cultural crisis in the printed truth. But most of the furor would never have happened if Mr. Frey's book had been released as a novel, as he originally intended. Or if the book had been touted on TV by, say, Charlie Rose.

Instead, the book got the Oprah Seal of Authentic Worthiness, and Ms. Winfrey was forced to do damage control on her public image. It made for a remarkable piece of talk-show television, but it seemed to be about Placating an Angry Oprah as much as anything.

Memoirs have been messing with our heads since St. Augustine's Confessions in A.D. 397. Augustine was revolutionary, the first author to recount his childhood as an influence on his moral development. Scholars have never taken his book as a literal testament of fact but as part of his self-examination, his seeking a "larger truth."

Since then, one would think we've come to distinguish outright fraud from, say, the dream-like beauties of Vladimir Nabokov's Speak, Memory or the so-called "fictional memoir" of Frederick Exley's A Fan's Notes. Maybe not. There was The Education of Little Tree (author Forrest Carter wasn't Cherokee; he was a Klansman) and Lillian Hellman's Pentimento (she embellished the story about anti-Nazi work that became the movie Julia).

And who can forget the swarms of forgettable movie star tell-alls, political memoirs and the many testimonials (weight-loss, religious conversion, financial wonder-working), most of which are about as reliable as Texas rain?

Not every nonfiction writer falsifies, of course. Samuel Freedman teaches a course in literary nonfiction at Columbia University (full disclosure: I've taken the course). And there's no one more committed to hard reporting.

In his biography of his mother, Who She Was, Mr. Freedman declares that he has a "fundamentalist's faith" in truth and facts. He wrote the book partly as a stand against the memoirists who invent scenes they never saw or recall 30-year-old chats word for word.

But Mr. Freedman set an impossible goal for himself and for nonfiction writing. Despite all his research, he found he could not know "with absolute, 100 percent certainty what was happening inside my mother's head."

Who could? Truth isn't always easy. It's hard enough just knowing yourself. But for Mr. Freedman's pains, reviewers of Who She Was suggested he should have just used more imagination.

You know, invented things.

OK, so how much invention is permitted? Mr. Frey got caught with his lies but insisted Thursday that "not very much" of his book is fabricated. The problem with this line of reasoning is that truth isn't a hard, measurable object, a substance a writer includes on this page but forgets on that one. Neither is truth an all-or-nothing affair, one falsehood and it's gone.

Truth is a destination a writer aims for, and we judge how diligently or nimbly he heads there, what he finds along the way, how he reveals himself, his artistry. Mr. Frey's sin wasn't writing A Million Little Pieces. It was what followed: labeling the book a memoir, lying about his life in interviews, lying about all the documentation he supposedly had that would back up everything. And then came the pathetic backtracking and hedging.

A simple disclaimer, the kind we see every day -- "Names and details have been changed to protect identities; the time frame has also been altered" -- would have made all this jiving unnecessary.

An age-old discussion evolves yet again; two new books take a look

Book essay
September 4, 2005

A few years ago, a young neighbor confronted me as I was mowing the yard. Evidently, I had been portrayed to him as the local skeptic. "You know," he announced, "evolution is just a theory."

He had me there. Evolution is indeed a theory. But then, Albert Einstein's relativity is a theory, too. So I explained that he shouldn't stick a finger into a wall socket to test whether the Comanche Peak nuclear power plant really worked or not.

Many of us still take "theory" to mean "hunch," when in science, it is closer to "a system of well-founded assumptions." That basic misunderstanding characterizes much of our oldest culture war, the still-fractious firefight between creationism and evolution. Witness two new, very different histories that trace the well-worn trenches of this battlefield right up to the present.

Marvin Olasky and John Perry's Monkey Business assumes most Americans don't know what really happened at the 1925 "monkey trial" in Dayton, Tenn. The same could be said about our knowledge of much of our history, but this ignorance has shaped the popular image of creationists' "countrified imbecility" vs. the wise evolutionists. In fact, the trial was hardly a defeat for creationism. The Darwinians won the image battle among the educated classes, but a fair number of states and school boards quickly ditched Darwin.

And the public relations victory among the Eastern and educated was not due to evolution's validity, either, the authors argue. Biased journalists such as H. L. Mencken got the Dayton story wrong, plus there was the highly influential play-turned-film, Inherit the Wind. Dallas, amazingly enough, premiered the 1955 stage drama when Broadway wouldn't.

So Monkey Business sets out to expose, as its subtitle says, The True Story of the Scopes Trial -- with the implication that the news will be revelatory.

But little of it is news to anyone who has paid attention. To cite a small, typical example, the Scopes trial was a "media event," concocted by the tourism-hungry town leaders and the American Civil Liberties Union, out to challenge the state's anti-evolution law. John Scopes may never have even taught Darwin. Monkey Business is drily outraged by all of this, yet it was uncovered decades ago. Authors such as Garry Wills and Stephen Jay Gould have discussed it. Even a History Channel program dealt with it.

To their credit, Mr. Perry and Dr. Olasky, the Texas journalism professor best known as the Bush advisor behind "compassionate conservatism," provide an accurate account of the trial, once we discount their efforts at spin.

But the "true story" of Monkey Business is that the monkey trial is mostly a pretext for the book's outright advocacy of biblical literalism. Dr. Olasky has promoted "faith-based initiatives," and this book certainly counts as one.

Monkey Business makes much of Darwin's Anglican anxiety, his desire to separate God from a natural world order that Darwin saw as driven by brutal competition. Why would a benevolent deity devise such a dog-eat-dog existence?

This implies, though, that evolution has a theological component, a way of seeing God, the authors note. And at the very least, evolution does propose a "hands off" deity. But this would mean that evolution has no special standing in class. It, too, is a "faith-based initiative."

The catch here is that all science presupposes a "hands off" deity. No miracles are allowed to monkey up the lab results. Evolution can't do anything else and still be science.

What Monkey Business actually underscores is not religion but politics. A scientific-religious issue is being argued in courts and school boards, as if they ever could settle one's faith in God or what hominid line led to Homo sapiens. Which is why Americans are still fighting about this while the rest of the scientific world has moved on.

In this political struggle, Darwinians have relied on the courts to bar the unconstitutional use of tax money to teach religion. But in doing so, they have fueled widespread (and often Southern, regional) resentments against "elitist experts" and "activist judges."

Creationists, meanwhile, tend to appeal to school boards and the public, knowing they can sway a popular vote. Science isn't based on votes, but political careers are. The well-funded conservative push to teach both intelligent design and evolution is the latest such popular appeal.

In all this, Dr. Olasky and Mr. Perry approvingly cite Michael Ruse, a philosophy professor who made the "evolution is really a religion" argument above. What's more, Dr. Ruse's new book, The Evolution-Creation Struggle, charges many evolutionists, such as Richard Dawkins, with being their own worst enemies, alienating devout supporters with an acidic atheism.

So the surprise is that The Evolution-Creation Struggle is a rich, thoughtful overview, far wiser than the stacked deck of Monkey Business. A Quaker, Dr. Ruse is sympathetic to people's efforts to reconcile religion and reason, but he's also well-versed in scientific thought. Struggle is a succinct but nuanced history of ideas, tacking back and forth between the origins of Biblical inerrancy and Darwinism.

Ultimately, he faces the question of the day, whether intelligent design is a science. Because it makes no reference to God or Genesis, should it be taught with evolution?

Consider evolution first. Darwin developed it 150 years ago, before Mendelian genetics, DNA testing and plate tectonics. These involve whole areas of science unknown to Darwin, yet they confirmed and expanded his theory.

ID, in contrast, is an odd science in that it leads nowhere. It has generated no useful experiments. With DNA, we have an amazing creation, supposedly too complex for random mutation to shape. So why is this intelligent design such a mess? Who came up with all of these redundancies and old, useless bits? ID doesn't just fail to offer answers, it doesn't even provide avenues of research.

Dr. Ruse concludes with a sweeping rejection: "We find no empirical or conceptual reason whatsoever to think of intelligent design theory as genuine science."

What we have in creationism vs. evolution is a metaphysical argument, he writes, "a struggle for the hearts and souls of people." Which is why it has become a political campaign to wedge the Bible back into schools. And this is why, justifiably, many scientists and teachers have been up in arms.

Rather as though someone had just jammed their fingers into a wall socket.

'Accomplishment' collapses on itself
Book column
December 7, 2003

A local college dean often gave a speech at public occasions extolling excellence in the arts. And this "excellence" he would define as - that which is excellent.

What could be simpler? Actually, what we have here is a tautology, or what's known as circular reasoning: it assumes what it should explain. Tautology will be the chief charge against Charles Murray's Human Accomplishment: The Pursuit of Excellence in the Arts and Sciences, 800 B.C. to 1950 (HarperCollins, $29.95).

Co-author of the controversial study of race and intelligence, The Bell Curve, Dr. Murray set out to assemble humanity's "résumé." It seems we all were applying for a job. Naturally, this résumé features our best work, the historic "greats" in the arts and sciences, the Einsteins, Shakespeares and Lao Tzus. Dr. Murray compiled this list of 4,002 "significant figures" from standard encyclopedias and art histories.

His greats, in short, are those people who have been called great. Dr. Murray faces down this tautology by arguing, basically, "More people agree with me." He trumps us with the size and consistency of his database. You may like, oh, LeRoy Neiman's sports paintings, but Dr. Murray has a dozen experts who ignore them entirely.

Dr. Murray draws several unsurprising conclusions from his data, chiefly that the white Christian West leads in producing geniuses, although it's in decline. He claims this as he also declares, "I am choosing one type of expertise and rejecting another, allying myself with the classic aesthetic tradition and rejecting" what might be called relativistic modernism.

In other words, he's making a political case here. Yet Dr. Murray believes all of this is objectively demonstrable. By using statistical methods, he has proven his standards are true and real.

One may well agree with some of Dr. Murray's conclusions: They seem self-evident. Collectively, critics - having pondered art at length - may be fair guides to what's best. Armed with science, logic, capitalism and Christianity, the West has dominated modern history. These are not new ideas. George Orwell believed the novel wouldn't have existed without the inward-looking individualism that Protestantism lent us.

Conservatives may hold up Dr. Murray's book as proof - scientific proof! - that realistic art and traditional music are best, that our culture has been going to hell in a badly-made Cubist handbasket ever since liberal modernism took over. Schools should dump this multi-culti stuff and get back to teaching Western classics, although in Texas, we're lucky if they teach any music or art history at all. But his champions should check the fine print. Dr. Murray is admirably honest and clear: He buttresses his arguments with caveats, cautions and exceptions.

He frames his points, for example, in the easiest terms, stating that it's a fact that Dante is a greater poet than Carl Sandburg. Of course. But having compiled far too many top 10 lists, I can tell you the choice rarely comes down to Dante or Sandburg. It's most often a muddle among three dozen writers kinda-sorta like Sandburg.

One test of any such system, in short, is in the worth and fineness of its distinctions. At one point, Dr. Murray makes an arbitrary 50 percent cutoff because there were so many figures cited by only one source. Let's face it, he says, as beloved as they might be, James Thurber and Dorothy Parker, for example, aren't up there with John Steinbeck or Theodore Dreiser.

What he doesn't discern here is the difference in genre, not genius. Thurber and Parker were humorists. They weren't even trying to compete with the novelists. As with the Oscar for best film, there's a strong bias toward the solemn and big being more "profound."

Speaking of film, Dr. Murray reluctantly rejects the entire form as too "premature" to consider. A debatable but understandable choice. The reason he gives, however, is that films only gained sound 23 years before his 1950 cutoff - implying he doesn't consider silent films worthy at all. There go Buster Keaton and D. W. Griffith and Erich von Stroheim.

Perhaps most telling, however, is that Dr. Murray's entire enterprise is premised on counting the number of lines each genius gets in each book. The more lines, the greater the genius. What could be simpler? That this method - historiometry - has a 134-year-old tradition doesn't make it any less laughably crude. Dr. Murray even says he counted color plates in art books. Genius, it seems, is to be determined by publishers' budgets.

There are many such holes, but the larger point is that what's supposedly scientific is looking creaky and jerry-built. It is still an argument to be made, not a fact to be believed. Arguments about art are matters of persuasion, judgment and insight: a critic's tools. And considering Dr. Murray's more ignorant statements about art - "most serious novels from 1920-1950" are "arid and ephemeral" - I wouldn't trust his judgment.


Current events echo Kipling's 100-year-old portrayal of sparring match for control of Afghanistan
October 21, 2001
[NOTE: Written only a month and a half after 9/11.]

There's a story about violent covert action and native uprisings in the mountains of Afghanistan and India, a story about espionage and Western military intervention.

The story is called Kim, and Rudyard Kipling wrote it 100 years ago.

As the United States switches from cruise missiles and bomber sorties in Afghanistan to much dicier military actions on the ground, Kipling's novel - as well as his 1888 short story, "The Man Who Would Be King" - provides lessons on the risks the country now faces, even lessons on the quagmires of nation-building.

In Afghanistan, American armed forces will be walking in the footsteps of Victorian soldiers who wore pith helmets and carried sabres, soldiers who helped the British Empire hold sway over the region longer than any other imperial power. They did so, more or less, from 1839-1921 - but at the cost of three wars, countless skirmishes and one of the deadliest disasters in British military history, the massacre of more than 12,000 people at the Kabul garrison in 1841.

When considering Afghan history, it's hard not to be struck by déjà vu - the same events keep happening. In 1996, when the victorious Taliban entered Kabul, they killed Russian-backed President Mohammad Najibullah, who had been in hiding for four years, and displayed his battered corpse. One hundred and fifty-five years before, Afghans, angry with British intervention, did the same thing in the same city to Sir William McNaghten, the bungling British envoy.

The violence aside, the incidents demonstrate what journalist-historian Karl Meyer calls "remarkable continuity" - the same factors, even many of the same tribes, still operate as in Kipling's time. Largely untouched by Western technology or social networks (Afghans were never given the railroad, telegraph, civil service and legal systems that Britain built in India), much of Afghanistan remains a "fossil society," says Mr. Meyer, co-author of Tournament of Shadows: The Great Game and the Race for Empire in Central Asia.

It's precisely this small, highly traditional, rural warrior society that al-Qaeda has found refuge in. In fact, in his book, For A Pagan Song, British author Jon Bealby retraces the fictional trek that Kipling's two mercenaries took in "The Man Who Would Be King." The two rogues, played by Sean Connery and Michael Caine in the 1975 John Huston film, plan on stealing a kingdom from the quarreling Afghans in the forbidding Hindu Kush mountains (literally, "Hindu Killer"). They figure that their military training will give them the upper hand in uniting the tribes. In 1998, having seen these same tribes in action, Mr. Bealby concluded that if Kipling's adventurers "were to tumble from the skies once again, more than a hundred years later, the task confronting them would be exactly the same."

That's because "Afghanistan is the Balkans of Central Asia," says Andrew Hess, professor of diplomacy at the Fletcher School of Law and Diplomacy at Tufts University in Massachusetts.

Mountainous regions that have long been strategic crossroads between larger empires - regions such as the Balkans, the Caucasus (Chechnya, Azerbaijan) or the Hindu Kush (Afghanistan, Pakistan) - have repeatedly flared into savage warfare. Despite (even because of) modern inroads, they remain inherently unstable and difficult to control. In addition to their longstanding ethnic or religious hatreds, these regions are extremely isolated, split up by daunting mountains and populated by warrior peoples.

And for generations, these crossroads have been yanked in different directions by outside powers.

In the 19th century, the outside powers competing for Afghanistan were Czarist Russia and the British Empire, and their sometimes-bloody, sometimes-secretive sparring match was known as "the Great Game." Basically, Russia sought expansion into the Persian Gulf, while Britain feared such moves were a prelude to toppling the Ottoman Empire in the Middle East or stealing its own prize, India. Hence, the Crimean War, the Russo-Turkish War, three Afghan wars and numerous insurrections in the area.

It was Kim that popularized the term "The Great Game," and Kipling who remains our chief literary chronicler of the British foot soldier's experience in it. Kipling actually entered Afghanistan only once - in 1885. Born in Bombay but educated in England, he'd returned to India three years before as a journalist - the start of a career that would lead him, in 1907, to be the first British writer to win the Nobel Prize in literature. As a reporter, he visited the famous Khyber Pass near the Afghan-Pakistan border and got shot at, he said, by a Pathan (now called Pashtun) tribesman.

Still, in all, Kipling spent 13 years in India (which, while under the British raj or rule, also included Pakistan and pieces of Afghanistan). This became the central experience of his work. In fact, this colonial writer who was essentially an outsider in both India and Britain became immensely popular as what George Orwell dismissively dubbed "the prophet of British Imperialism." Kipling pugnaciously identified himself with British military interests (and by extension, white, Western interests). It was Kipling, after all, who infamously advised America to "take up the White Man's Burden" - in an 1899 poem urging our conquest of the Philippines.

But if Kipling's esteem rose with the empire, it sank with it, too. As popular sentiment turned against such colonial efforts as the Boer War in 1899-1902, his critics increasingly castigated him for his old-fashioned artistry as well as the jingoism and racism in such lines as "Your new-caught sullen peoples/Half-devil and half-child."

He became safely sanitized as the children's author of The Jungle Book. Yet Kim also remains remarkably popular. It sells more than 1,000 copies per week in English, and even such a staunch anti-imperialist as post-colonial academic Edward Said, author of Orientalism, has admired and edited the novel.

Obviously, there are profound differences between the situations facing Kipling's soldiers and the U.S. military. For starters, the primary goal then was containing Russia. Today, Russia is an ally, just as fearful of terrorism being exported to its people as America is.

"The Great Game eventually developed into the Cold War," says Beatrice Manz, an associate professor of Middle Eastern history at Tufts University. The objective in Central Asia was the same in both - to contain Russia. In the Cold War, however, it was the United States that "took up the old British imperial interests."

But as the rivalries of the Great Game wound down in the early 20th century, says Dr. Hess, both Russia and Great Britain "adopted a policy of using Afghanistan as a buffer state. They isolated it between them."

Hence, Afghanistan's "fossil society" - augmented by a series of leaders suspicious of any outside aid. It's one reason for the continued relevance of Kipling's writings 100 years later. For good or ill, before Osama bin Laden and Sept. 11, Kipling, more than anyone else, shaped the West's ideas of Afghanistan.

In doing so, says Ronald Cluett, professor of classics and history at Pomona College in California, "he did us a tremendous disservice in the way he romanticized the region. He was recalling his own time there, writing nostalgically about it. So he appended an image of schoolboy adventure to Afghanistan that is totally inaccurate.

"On the other hand," Dr. Cluett says, "Kipling also established how central covert actions and espionage are in this world. President Bush has explicitly said that part of this war is going to be fought in secret, and Kipling was a pioneer" in the literature of spycraft.

The story of a mixed-race street urchin who befriends a Tibetan monk while assisting a counterespionage network, Kim is based on real-life British intelligence efforts - efforts designed to forestall native revolts (such as the Great Mutiny of 1857) or stop Russian incursions into India.

But beyond his novel's surface adventure, Mr. Meyer argues, Kipling can't help depicting the rich complexity that is Central Asia and India. With its Hindu, Muslim and Buddhist characters, Kim is a "celebration of a multi-ethnic society," Mr. Meyer says.

"What Kipling can teach us," Mr. Meyer says, "is we err if we underestimate the Afghans. We should approach people of profound difference with a degree of modesty. Whatever his other political sins, Kipling did not see the Empire as simply an excuse to exploit" the Afghan people.

Indeed, "The Man Who Would Be King" can be taken as a forecast of the demise of the British Empire - or what Mr. Meyer calls a "clairvoyant" caution about colonial exploitation, even nation-building. Its two adventurers, Peachey and Danny, succeed in teaching the Afghans military tactics and leading them in battle. They snag a king's ransom in jewels.

But when Peachey figures it's time to skip town with their plunder, Danny opts to stay. His new dream is to marry and establish a benevolent dynasty, to modernize the country with hospitals and schools and an army of "two hundred and fifty thousand men, ready to cut in on Russia's right flank when she tries for India!"

The Afghans won't have it. For his hubris, Danny is killed; Peachey barely escapes with his life.

In other words, imposing a political solution on a people, particularly these people, will not work, no matter how enlightened that solution may seem to be.

"There is a limit to what we can do," says Dr. Manz. "If we seek only a military solution in Afghanistan, we'll fail. If we, as we did in Iran, try to find one strong man, put him in power and work through his secret police, we'll fail. You can't dominate the Afghans that way."

But neither can America just walk away. During his election campaign, George W. Bush criticized President Bill Clinton's nation-building efforts in Somalia and Haiti. But in recent weeks, his administration has conceded that it may have to help organize a post-Taliban, Afghan government - somewhere, somehow - out of the warring ethnic groups, including the ruling Pashtuns and the Uzbeks and Tajiks, who make up much of the Northern Alliance.

"We can't just wash our hands," says Dr. Manz, "because we did it before [when the Soviet Union pulled out of Afghanistan in 1989]. And look what it got us. Any place so desperately poor, with absolutely no social infrastructure left, is going to be a threat to everyone around it. It will destabilize the entire region.
"We have to recognize, on moral grounds, that by our own exertions, we have had a share in creating this."

In his 1942 essay on Kipling, Orwell begins by declaring that Kipling is an imperialist, "morally insensitive and aesthetically disgusting." He simply never understood, Orwell argues, that the British Empire was an economic machine for exploiting its colonies.

But what Kipling had that other British writers didn't, that even his liberal critics don't, Orwell writes, was a "sense of responsibility." He had the sense of responsibility of the person in charge who "is always faced with the question, 'In such and such circumstances, what would you do?'"

Painter David Hockney says the Old Masters used long-lost tracing techniques in their masterpieces
December 1, 2001

David Hockney hasn't found a new way of looking at things. He believes he's uncovered an old one, and it's rocking art historians back on their heels.

It's Mr. Hockney's belief that many Old Masters - far more than had previously been assumed, among them Frans Hals, Hans Holbein and Jan van Eyck - used optical devices from 1420 to the 1830s to capture detailed, realistic effects.

This weekend, New York University is holding a conference called "Art and Optics," where Mr. Hockney will face critics and historians, including Susan Sontag (On Photography) and Martin Kemp (The Science of Art). For those who want to learn what the uproar is about, there is now Mr. Hockney's new book, Secret Knowledge: Rediscovering the Lost Techniques of the Old Masters (Viking, $60).

The optical devices in question are the camera obscura, the camera lucida, and the concave mirror. The camera obscura is basically a photo camera without film: a box (it can be as big as a room) with a hole that lets in light and projects an inverted image of the scene outside. A camera lucida is a tiny prism on a stick. It casts an image on the observer's eye, letting him sketch "through" the image, outlining it on paper. And a concave mirror, too, projects inverted images on a wall.

It has been known for years that Albrecht Durer and Johannes Vermeer used the camera obscura. Vermeer, in particular, was friends with Anton van Leeuwenhoek, the inventor of the microscope, and had access to high-quality lenses. Tracy Chevalier's acclaimed 1999 novel, Girl With a Pearl Earring, even revolves around a fictional maid who learns how to assist Vermeer, including with his camera.

But it was when Mr. Hockney was studying pencil sketches by the 19th-century French painter Jean Auguste Dominique Ingres that he noticed a revealing aspect of Ingres' drawing technique. Ingres drew dozens of these small portraits, yet they are uncannily detailed, even capturing complex fabric patterns.

That's because "his line was guided," says Mr. Hockney by phone from Los Angeles. "It was traced. There's quite a difference between a traced line and what I call an 'eyeballed' one." He recognized the confidence and directness of Ingres' lines from their similarity to Andy Warhol's drawings, which Mr. Hockney knew were traced from projections. He also knew from experience that portraits so exceptional take hours, even days. As a busy painter, where did Ingres find the time?

"It was when I began to experiment with the camera lucida," Mr. Hockney says, "that the issue got bigger." In a few moments with an optic device, an artist can set down the eyes and mouth - the most important features - and outline other details. Then he fills in the portrait at his leisure.

How long had painters known they could handle fleeting expressions and tricky lighting this way?

Arguably, Mr. Hockney's theory stands large parts of art history on their heads, says Ted Pillsbury, part owner of the Pillsbury and Peters Gallery and former director of the Kimbell Art Museum in Fort Worth. Rather than Caravaggio's distinctive style, say, growing out of period ideas, it developed from his optical tools. Projections are weak, for example, and require extremely strong light - hence, the highly theatrical spotlight effects known as chiaroscuro. Optics also have a narrow range of focus - Caravaggio's paintings frequently present a line of people with little depth behind them.

"It's the sort of thing only another working artist would uncover," Dr. Pillsbury says. "They're always looking at a work and asking, 'How did he do it?'"

After taping up hundreds of color reproductions in his studio and lining up their increasingly specific treatments of fabrics, armor, foreshortening, and lighting, Mr. Hockney came to believe that around 1420, artists discovered the projection properties of concave mirrors.

"Art historians already have acknowledged that a major change took place around 1420," Mr. Hockney says. But when his book contrasts the almost photographic precision achieved by middling artists (Bronzino, Moroni) with the vaguer efforts of geniuses from only a few years before (Masaccio, della Francesca), it's clear one artist isn't simply more skilled than another. The lesser painters had methods of rendering details that their earlier betters simply did not.

Why keep such methods secret? Actually, by the 19th century, they weren't. Victorian manufacturers even advertised their optic instruments as artists' "aids."

But during the Renaissance, painters belonged to guilds, the forerunners of unions, and swore an oath to keep the guilds' techniques secret. Beyond the normal desire to protect one's advantages, there was also the Roman Catholic Church. The English philosopher Roger Bacon wrote a 13th-century treatise on optics. He barely escaped being burned alive.

"Projecting an image was magic," Mr. Hockney explains. "And magic is heresy."

Coincidentally, Mark Tucker and Nica Gutman, conservators at the Philadelphia Museum of Art, have established that Thomas Eakins, the great American 19th-century realist, used photographs to help with his works. Amon Carter Museum curator Patricia Junker confirms that the Fort Worth museum's prized Eakins, Swimming, was one of those paintings.

Unfortunately, some in the news media have simplified all this into the revelation that artists "cheated." Mr. Hockney dismisses this.

"Optics are a tool," he says, "and the tool doesn't paint the painting. It doesn't have the skill. But once you've seen optic images, you know painters would use them."

And once, he says, you notice that from around 1500 to 1860, "there never is a badly drawn basket; lace or satin is rendered perfectly even under the most difficult lighting," you have to wonder: "This didn't just happen."

Throughout history, thinkers have willingly acknowledged uncertainty. Can today's politicians?
Book essay
October 23, 2004
[NOTE: This was written nearly three years before today's spate of "atheist" books.]

Most Americans don't know that in a 1797 treaty, President John Adams declared, "the Government of the United States is not, in any sense, founded on the Christian religion."

One might expect such a declaration from Thomas Jefferson, a noted skeptic regarding religion and faith, but it was Adams, the more conventional Christian, who signed the Treaty with Tripoli. He later wrote to Jefferson that laws against religious doubt would be "great obstructions to the improvement of the human mind."

Similarly, most of us are unaware that Abraham Lincoln belonged to no organized church. He believed in God, but his faith was one of doubts and struggles.

During the Civil War, Lincoln said, "I suppose it will be granted that I am not to expect a direct revelation. I must study the plain, physical facts of the case, ascertain what is possible and learn what appears to be wise and right."

In short, he wasn't holding his breath for divine guidance. Imagine how popular a politician would be today saying that - as Lincoln did - to an audience of ministers.

What links Adams and Lincoln (and Jefferson) could be loosely termed empiricism, pragmatic skepticism or just plain doubt.

None of these men believed that God talked to him - or, if God did, that he offered day-to-day counsel on affairs of state.

Today, not many public figures make a case for such skepticism or disbelief. President Bush has expressly courted the Christian right with his religious devotion. Perhaps more importantly, and more popularly, the president has made his own unwavering righteousness key to his re-election and his war on terrorism. In response, John Kerry has touted his Catholic upbringing, his own steadfast values. We hear much more about what the candidates believe than what they have reservations about.

That's because doubt, scholars say, is often seen as a weakness.

"People commonly associate doubt with an unfortunate state, a state of worry," said Jennifer Michael Hecht, a history professor at New York's Nassau Community College. "And that colors everything else."

Dr. Hecht is the author of Doubt: A History (HarperSanFrancisco, $27.95), one of two books released this year that examine the history of such thinking. Her exhaustive study looks at doubt in the philosophy of thinkers and authors from Socrates (condemned to death for atheism) to Salman Rushdie (whose novel, The Satanic Verses, garnered him a fatwa, or Islamic death sentence). Susan Jacoby has a narrower but lively subject in Freethinkers: A History of American Secularism (Metropolitan Books, $27.50), which chronicles the role of skepticism or secular humanism in American politics.

While doubt is often seen as fuzzy and skepticism as counterproductive, rock-solid certitude is generally prized as a strength - in business leaders, public officials, ministers and action heroes. Yet Ronald Reagan, Phil Gramm, Charlton Heston, St. Paul and St. Augustine - among many others - changed their moral stances, their theological or political views.

For them, doubt became a door to new ways of thinking, new proofs. When one looks at doubt like this, Dr. Hecht argued - not as a personal failing but as an intellectual approach that can lead to insights - the history of faith gets turned on its head.

Consider the Book of Job. To test Job's faith, God let Satan steal Job's property, plague him with boils and murder his children. But Job stands fast and is rewarded. God returns everything and more.

Accordingly, Job's story is taught as a great drama of faith.

But, Dr. Hecht noted, it's also cited by agnostics and atheists as the source of their first uncertainties about religion. They began to see God as terrifyingly capricious.

Job's suffering, after all, can be viewed as some brutal, divinely ordained experiment. And when, in the story's most human moment, Job wails, Why? Why am I afflicted like this? God answers, basically, "Who are you to question me?"

In short, the "perfect and upright" Job asks about justice: Why do the good suffer? But God's answer asserts his authority: I'm God. Trust me.

No wonder novelist Virginia Woolf mused in a letter, "I read the book of Job last night - I don't think God comes well out of it."

The ancient Greeks took a whack at this problem of justice - why do the good suffer? - and a great many other doubts about the purpose of the universe, such as: What is the proper way to live? Indeed, with the Skeptics, Stoics, Epicureans and Cynics, "the Greeks came up with the basic building blocks of doubt," said Dr. Hecht, "all before 300 B.C."

We've been re-stacking those blocks ever since. Doubt, Dr. Hecht argued, is not one single thing, like a disbelief in God or in the afterlife. It can be a system of thought (scientific doubt, the testing of hypotheses) or a dead end (the solipsist's doubt that anything exists outside his own mind).

There is Zen Buddhist doubt, a tool for enlightenment. As the Zen saying goes, "Little doubt, little awakening. Big doubt, big awakening. No doubt, no awakening."

The Book of Job, written sometime between 600 and 400 B.C., is one of the earliest Hebraic wrestlings with doubt. Jews have a long tradition of doubters, notably Maimonides, the great 12th-century Talmudic scholar. Maimonides wrote his Guide for the Perplexed for those who knew both Jewish law and Greek rationalism and had trouble reconciling the two. Seeking a middle way between blind faith and outright skepticism, he became one of the precursors of Reform Judaism.

One reason Jews have a deep history of doubt and skepticism, Dr. Hecht said, is their historical "outsider" status in the West. People are more likely to question a system from such a vantage.

"Questions often come from the fringe, from those who aren't heavily invested in an idea," she said. "It's a paradox. There are often good reasons to distrust ideas from the fringe, but it's also where the great innovations come from."

In the history of doubt, the United States stands as one of those innovations, Ms. Jacoby writes in Freethinkers. Our country was the first established without recourse to divine authority: Nowhere in the Constitution, she said, is God invoked to justify our rights or government. The highest authority is "we, the people."

That's not to say America is not a nation of believers. The vast majority of us say we're religious, and roughly eight in 10 claim adherence to some brand of Christianity.

But having rejected the divine right of kings, the Founding Fathers, in their statement of principles, sidestepped the divine entirely. They left ample room for doubt. Does God exist? Does he guide our country? What does he want us to do? On these questions, the Constitution is conspicuously silent.

As author Jason Epstein has noted, the American system of government is itself an expression of pragmatic skepticism. With its three branches and their checks and balances, the United States doesn't let any one authority call the shots. Yet our government is not paralyzed by this. Rather, we are constantly fine-tuning the system: amending things, appealing them or voting out the current crop.

Given all this, it's not surprising, said Dr. Hecht, that so much discussion in America about faith and skepticism is tied up with politics and not philosophy. We continue to argue about the place of personal belief and religious influence in public affairs.

In a sense, America is the conflicted child of the Puritan and the French Revolutions (which, admittedly, followed ours but was motivated by many of the same principles). The Puritans gave us freedom of religious expression and our prime spiritual-cultural instincts (individualism, work ethic, salvation by faith). The French, meanwhile, emphasized freedom from religion and authority. Hence, Ms. Jacoby writes, our separation of church and state, designed to free both sides, the secular and religious, from interference by the other.

But it was also the French Revolution that first cast a suspicion of "foreignness" and "subversion" on skepticism in America, said Ms. Jacoby. Freethinkers such as Jefferson and Thomas Paine were accused of being French radicals, of undermining religion.

While the 18th century had its revolutions and its Enlightenment, it was not the Great Age of Disbelief, said Ms. Jacoby. That would be the 19th century. Evolution, abolitionism and the suffragette movement: When they first arose, all were seen as attacks on Christian authority.

Passages from the Bible had long been used to justify slavery and the subjugation of wives. And anyone who wondered about the origin of species could simply consult Genesis. But after Darwin, the Civil War and women's winning the vote, such positions were harder to hold in the public sphere.

In this way, Ms. Jacoby said, skepticism and doubt have actually helped religions in our age of science. They've helped make religion more tolerant, less coercive, more suitable for a pluralistic, democratic society. By working to peel away inhumane or discriminatory tenets (or mistaken interpretations), they've made it possible for many people to hold to reason and faith.

Mainstream Protestantism, Catholicism and Judaism have all made various accommodations with reason - as in the 1950 Catholic encyclical Humani generis which, although it re-stated traditional ideas on Genesis, declared that "research and discussions" on evolution should continue.

Indeed, Jacques Derrida, the French deconstructionist philosopher who died this month, argued that there is no religion without uncertainty. Mark Taylor, professor of humanities at Williams College in Williamstown, Mass., has said that for Mr. Derrida, "the great religious traditions are profoundly disturbing because they all call certainty and security into question."

Or, as Dr. Hecht put it, "The great questions in the history of religion - how shall we live? what do we know of ourselves? - these are the great questions in the history of doubt as well."

British writer plays devil's advocate to Mother Teresa
November 8, 2003

FORT WORTH - Christopher Hitchens is a rare bird these days.

Not just because he's a British-born journalist who writes about American politics - as well as a contentious, left-wing contrarian who supports the war in Iraq.

Mr. Hitchens has also played "devil's advocate" - against Mother Teresa. This would seem an outsized case of windmill-tilting and nose-thumbing. While she was alive, the Nobel Prize-winning nun often topped international polls as the most admired person on the planet. Last month, she was beatified by the Catholic Church - the second major step toward sainthood, after being found "venerable." Yet Mr. Hitchens testified against her - at the request of the church.

A devil's advocate is not the Keanu Reeves character in an Al Pacino movie. When an individual is being considered by the church for sainthood, a "postulator" is appointed to make the case for that candidate. The devil's advocate, on the other hand, is the person who presents the evidence against sainthood. He's called that because, obviously, in trying to keep candidates out of the ranks of the saintly, he's like a corporate recruiter for the sinful side.

But still, no demons, no pitchforks, no special effects. The advocatus diaboli, as he's called in Latin, is merely the opposing counsel, a prosecuting attorney with a flashy title. Officially, he's even known as the "Promoter of the Faith." It was an honorable position: Before he became Pope Benedict XIV (1740-58), Prospero Lambertini served in the post for 20 years.

It is also a defunct position: The church did away with it 20 years ago. Nonetheless, the Vatican asked Mr. Hitchens to bring evidence against Mother Teresa. That's as close as ordinary mortals get these days to donning the devil's robes in an ecclesiastical court.

Mr. Hitchens was an obvious choice to raise hell. His 1995 book, The Missionary Position: Mother Teresa in Theory and Practice, aggressively attacked the famous nun's reputation as a selfless servant of the poor. He questioned her relationships with some unsavory global characters and the efficacy and purpose of her missionary work in Calcutta, India. (The iconoclastic Vanity Fair columnist was in Fort Worth last week working on a documentary about Texas that he's doing for Britain's Channel 4. The same TV company produced Hell's Angel, a documentary about Mother Teresa that he co-wrote.)

In his book and documentary, Mr. Hitchens pointed out that Mother Teresa associated with (and applauded) the Duvalier clan, the dictators of Haiti. She accepted a donation of more than $1 million from Charles Keating Jr., the convicted savings-and-loan fraud. Paul Turley, the Los Angeles deputy district attorney in that case, sent her a letter stating that the money she received was not Mr. Keating's to give, that it was stolen from hundreds of small investors. Mother Teresa never returned it.

On a broader level, Mr. Hitchens argued that Catholics and non-Catholics all over the world gave money to help Mother Teresa with her efforts among the poor and sick of Calcutta. But, he maintained, she and her order, the Missionaries of Charity, have not so much provided physical or medical aid as they have worked to convert the poor. The Lancet, the prestigious British medical journal, called the care dispensed at her Calcutta clinic "haphazard."

"Her clinic is just as threadbare as when she began," Mr. Hitchens said in Fort Worth. "Yet she said with pride that she's built more than 500 convents in 125 countries."

Mother Teresa was always candid that her goal was ministering to what she saw as the poor's spiritual needs and not their medical or economic ones. "We are not social workers," she once said. What was needed, she said, was more prayer, more faith. As for her association with what Mr. Hitchens termed "the corrupt and worldly rich," the Rev. Brian Kolodiejchuk, the postulator in her case, has pointed out that Jesus himself sat down with Roman tax collectors.

Many in the Catholic press have called Mr. Hitchens' charges "slurs" and even "bizarre." Bishop Salvadore Lobo of Baruipur, India, labeled Hell's Angel a "very distorted" depiction of the beloved nun and her work.

The objections have often focused on Mr. Hitchens' vitriolic atheism. He's hardly impartial to religion: "I'm hostile to it," he told Free Inquiry magazine in 1996. "I think it is a positively bad idea, not just a false one. And I mean not just organized religion, but religious belief itself."

But, Mr. Hitchens noted, no one has disproved his assertions about the Duvaliers, the Keating money or Mother Teresa's consistently ultra-conservative views. (She opposed the reforms of Vatican II, for example, and supported a proposed ban on divorce in Ireland while supporting Princess Diana's own divorce.)

Still, he said, when it came to handling her beatification, Father Kolodiejchuk was "a fair-minded guy." Attempts to reach Father Kolodiejchuk, who was said to be traveling in Calcutta, were unsuccessful.

Mr. Hitchens was one of several non-Catholics who testified - among hundreds of witnesses whose testimony filled some 34,200 pages of what is called the Acts of the Diocesan Inquiry.

When he received his letter a few years ago from the Vatican asking for input, "I thought, terrific, because I thought I would go to Rome," Mr. Hitchens recalled. "Possibly even to the Sacred Congregation for the Doctrine of the Faith, which is in the old Inquisition office." Maybe he'd find a few leftover thumbscrews lying around. Mr. Hitchens said he'd even pay for the trip himself.

But it turned out that his appearance was overseen by the Catholic archdiocese in his adoptive home, Washington, D.C.

"It was just a taxi ride to Catholic University," Mr. Hitchens grumped. He can see the spire of the campus from his apartment.

Under Pope John Paul II, the canonization protocol has been greatly streamlined. The process was once painstakingly slow, requiring two verified miracles for beatification and two more for canonization. When St. Therese of Lisieux was canonized in 1925 - 28 years after her death - it set a modern speed record. In comparison, Queen Isabella - the one who bankrolled Christopher Columbus - is still waiting, 499 years after her death.

The snail's pace was intended to dampen any faddish enthusiasms. The church would not be buffaloed. Mr. Hitchens likes to quote the 19th-century British historian, Thomas Babington Macaulay, who observed that one of the great achievements of the church was its "containment" of fanaticism.

But in a 1983 "apostolic constitution," John Paul II fast-tracked the canonization process. The four-miracle requirement was cut to two. The devil's advocate position - created in 1587 - was abolished.

Since then, John Paul has become the most prolific saint-maker in history, having canonized 476 people and beatified more than 1,300. Together, all of his 20th-century predecessors canonized 98.

Well before her death in 1997, Mother Teresa was being hailed as a "living saint" and "the saint of the gutters." And now the Macedonian nun has been beatified faster than anyone in modern history. It took only five years, three months.

That's partly because Archbishop Henry D'Souza of Calcutta petitioned for a waiver of the five-year "cooling off" period that the church had imposed before a person could be considered for beatification. The pope agreed. As Cardinal Joseph Ratzinger, the Vatican's doctrinal overseer, said after Mother Teresa's death, "I am not privy to the innermost thoughts of the Holy Father, but I think he wants it [her canonization] speeded up."

The resulting inquiry wasn't the most august process, Mr. Hitchens said. He likened it to "a seminar hearing in a rundown college." He met with a three-member tribunal in a paneled room at Catholic University -- one of 14 such tribunals worldwide in Mother Teresa's case.

The journalist began by thanking the committee for the chance to present his objections.

"'As you know,' he recalled saying, 'I am not a believer, not a member of the faith, and so whom you make a saint is none of my concern. It's very decent of you to ask me into your internal affairs. But to the extent that the word sainthood or beatification has a secular meaning regarding an exemplary person, I would like to enter a dissent.'"

They accepted a copy of his Mother Teresa book as evidence, and then proceeded through 263 questions -- "a standard questionnaire from Rome that everyone has to fill in," Mr. Hitchens said. "There is no deviation. I had to simply check 'yes' or 'no' or 'no comment.'"

The famously combative journalist was disappointed once again. There was no debate, no probing questions, "no dialectical opposition," he said. Instead, when the questionnaire was done, he was politely told the record of his testimony would be processed quickly.

Father Kolodiejchuk submitted all of the testimony to the Congregation for the Causes of the Saints (the Vatican office that handles canonization). The cardinals and bishops voted. The pope concurred. Mother Teresa can now officially be called "Blessed."

Before she can be called "Saint," another verified miracle is needed. Not surprisingly, Mr. Hitchens dismisses the validity of the first one, the disappearance of an abdominal cyst from a Hindu mother.

But given the enormous popularity of Mother Teresa and given the reverence in which she's held both in the church and out, the only question now would seem to be whether John Paul II will live to see her canonized.

As for the dissenting Mr. Hitchens, he believes "I was restrained by the rules from making my best case."

But he consoles himself with this thought: "I did represent the devil pro bono."

About this Entry

This page contains a single entry by book/daddy published on March 22, 2007 10:47 AM.
