Monday, May 28, 2007

Atheists with Attitude
Why do they hate Him?
by Anthony Gottlieb May 21, 2007
Great portents and disasters turn some minds to God and others away from him. When an unusually bright and long-tailed comet was tracked through the sky in the last two months of 1680, posters and sermons called on Christians to repent. A hen in Rome seemed to confirm that the Day of Judgment was near. On December 2nd, it made an extraordinarily loud cackle and produced an exceptionally large egg, on which could be seen a likeness of the comet, or so it was said. This added to the religious panic. But the comet also sparked a small triumph for rationalism. In the next few years, as Armageddon somehow failed to arrive, a stream of pamphlets across Europe and America argued that heavenly displays were purely natural phenomena. The skeptics won the day. From the eighteenth century onward, no respectable intellectual saw comets as direct messages from God—though there were still some fears that one might eventually hit the earth.
The felling of the World Trade Center in New York, on September 11, 2001, brought its share of religion. Two populist preachers, Pat Robertson and Jerry Falwell, called it divine punishment (though both quickly withdrew their remarks), and not only the bereaved prayed for help. But September 11th and its aftershocks in Bali, Madrid, London, and elsewhere are more notable for causing an outbreak of militant atheism, at least on bookshelves. The terrorist attacks were carried out in the name of Islam, and they have been taken, by a string of best-selling books, to illustrate the fatal dangers of all religious faith.
The first of these books was “The End of Faith,” by Sam Harris, which was published in 2004 and was on the Times paperback best-seller list for thirty-three weeks. Then came “Breaking the Spell: Religion as a Natural Phenomenon,” by Daniel Dennett, a philosopher at Tufts University, who has written popular books on the science of consciousness and on Darwin. Next was “The God Delusion,” by Richard Dawkins, an evolutionary biologist and Britain’s preëminent science writer. Harris joined battle again last year with “Letter to a Christian Nation,” which renewed his attack on Christianity in particular. And now there is “God Is Not Great: How Religion Poisons Everything” (Twelve; $24.99), by Christopher Hitchens, which is both the most articulate and the angriest of the lot. Hitchens is a British-born writer who lives in Washington, D.C., and is a columnist for Vanity Fair and Slate. He thrives at the lectern, where his powers of rhetoric and recall enable him to entertain an audience, go too far, and almost get away with it. These gifts are amply reflected in “God Is Not Great.”
Hitchens is nothing if not provocative. Creationists are “yokels,” Pascal’s theology is “not far short of sordid,” the reasoning of the Christian writer C. S. Lewis is “so pathetic as to defy description,” Calvin was a “sadist and torturer and killer,” Buddhist sayings are “almost too easy to parody,” most Eastern spiritual discourse is “not even wrong,” Islam is “a rather obvious and ill-arranged set of plagiarisms,” Hanukkah is a “vapid and annoying holiday,” and the psalmist King David was an “unscrupulous bandit.”
It’s possible to wonder, indeed, where plain speaking ends and misanthropy begins: Hitchens says that the earth sometimes seems to him to be “a prison colony and lunatic asylum that is employed as a dumping ground by far-off and superior civilizations.” He certainly likes to adopt the tone of a bemused Martian envoy hammering out a report for headquarters. (We hear of “a showbiz woman bizarrely known as Madonna.”) In a curious rhetorical tic, Hitchens regularly refers to people whom he wishes to ridicule by their zoological class. Thus the followers of Muhammad are “mammals,” as is the prophet himself, and so are the seventeenth-century false messiah Sabbatai Zevi and St. Francis of Assisi; Japan’s wartime Emperor Hirohito is a “ridiculously overrated mammal,” and Kim Il Sung, the father of North Korea’s current dictator, is a “ludicrous mammal.” Hitchens is trying to say that these people are mere fallible mortals; but his way of saying it makes him come across as rather an odd fish.
He is also a fallible one. After rightly railing against female genital mutilation in Africa, which is an indigenous cultural practice with no very firm ties to any particular religion, Hitchens lunges at male circumcision. He claims that it is a medically dangerous procedure that has made countless lives miserable. This will come as news to the Jewish community, where male circumcision is universal, and where doctors, hypochondria, and overprotective mothers are not exactly unknown. Jews, Muslims, and others among the nearly one-third of the world’s male population who have been circumcised may be reassured by the World Health Organization’s recent announcement that it recommends male circumcision as a means of preventing the spread of AIDS.
Hitchens is on firmer ground as he traipses around the world on a tour of sectarian conflicts. He recounts how, a week before September 11th, a hypothetical question was put to him by Dennis Prager, an American talk-show host. Hitchens was asked to imagine himself in a foreign city at dusk, with a large group of men coming toward him. Would he feel safer, or less safe, if he were to learn that they were coming from a prayer meeting? With justified relish, the widely travelled Hitchens responds that he has had that experience in Belfast, Beirut, Bombay, Belgrade, Bethlehem, and Baghdad, and that, in each case, the answer would be a resounding “less safe.” He relates what he has seen or knows of warring factions of Protestants and Catholics in Ulster; Christians and Muslims in Beirut and in Bethlehem; Hindus and Muslims in Bombay; Roman Catholic Croatians, Orthodox Serbians, and Muslims in the former Yugoslavia; and Shiites, Sunnis, and Christians in Baghdad. In these cases and others, he argues, religion has exacerbated ethnic conflicts. As he puts it, “religion has been an enormous multiplier of tribal suspicion and hatred.”
That’s more plausible than what Sam Harris has to say on the subject. He maintains that religious belief not only aggravates such conflicts but is “the explicit cause” of them. He believes this even of Northern Ireland, where the Troubles between pro-British Unionists and pro-Irish Republicans began around 1610, when Britain confiscated Irish land and settled English and Scottish planters on it. As far as Harris is concerned, Islam brought down the Twin Towers, thanks in no small part to the incendiary language of the Koran; Middle East politics, history, and economics are irrelevant sideshows. This thesis suffers from a problem of timing: if he is right, why did Al Qaeda not arise, say, three hundred years ago, when the Koran said exactly what it says now?
One practical problem for antireligious writers is the diversity of religious views. However carefully a skeptic frames his attacks, he will be told that what people in fact believe is something different. For example, when Terry Eagleton, a British critic who has been a professor of English at Oxford, lambasted Dawkins’s “The God Delusion” in the London Review of Books, he wrote that “card-carrying rationalists” like Dawkins “invariably come up with vulgar caricatures of religious faith that would make a first-year theology student wince.” That is unfair, because millions of the faithful around the world believe things that would make a first-year theology student wince. A large survey in 2001 found that more than half of American Catholics, Episcopalians, Lutherans, Methodists, and Presbyterians believed that Jesus sinned—thus rejecting a central dogma of their own churches.
So how is a would-be iconoclast supposed to tell exactly what the faithful believe? Interpreting the nature and prevalence of religious opinions is tricky, particularly if you depend on polls. Respondents can be lacking in seriousness, unsure what they believe, and evasive. Spiritual values and practices are what pollsters call “motherhood” issues: everybody knows that he is supposed to be in favor of them. Thus sociologists estimate that maybe only half of the Americans who say that they regularly attend church actually do so. The World Values Survey Association, an international network of social scientists, conducts research in eighty countries, and not long ago asked a large sample of the earth’s population to say which of four alternatives came closest to their own beliefs: a personal God (forty-two per cent chose this), a spirit or life force (thirty-four per cent), neither of these (ten per cent), don’t know (fourteen per cent). Depending on what the respondents understood by a “spirit or life force,” belief in God may be far less widespread than simple yes/no polls suggest.
In some religious research, it is not necessarily the respondents who are credulous. Harris has made much of a survey that suggests that forty-four per cent of Americans believe that Jesus will return to judge mankind within the next fifty years. But, in 1998, a fifth of non-Christians in America told a poll for Newsweek that they, too, expected Jesus to return. What does Harris make of that? Any excuse for a party, perhaps. He also worries about a poll that said that nearly three-quarters of Americans believe in angels—by which, to judge from blogs and online forums on the subject, some of them may have meant streaks of luck, or their own delightful infants.
The Bible is a motherhood issue, too. Harris takes at face value a Gallup poll suggesting that eighty-three per cent of Americans regard it as the Word of God, and he, like Dawkins and Hitchens, uses up plenty of ink establishing the wickedness of many tales in the Old Testament. Critics of the Bible should find consolation in the fact that many people do not have a clue what is in it. Surveys by the Barna Research Group, a Christian organization, have found that most Christians don’t know who preached the Sermon on the Mount.
The tangled diversity of faith is, in the event, no obstacle for Hitchens. He knows exactly which varieties of religion need attacking: namely, the whole lot. And if he has left anyone out, he would probably like to hear about it so that he can rectify the omission. From the perspective of the new atheists, religion is all one entity; those who would apologize for any of its forms—Harris and Dawkins, in particular, insist on this point—are helping to sustain the whole. But, though the vague belief in a “life force” may be misguided, it’s hard to make the case that it’s dangerous. And there’s a dreamy incoherence in their conviction that moderate forms of religion somehow enable fundamentalist zeal and violence to survive. Are we really going to tame the fervor of an extremist imam’s mosque in Waziristan by weakening the plush-toy creed of a nondenominational church in Chappaqua? If there were no religion, it’s true, neither house of worship would exist. So perhaps we are just being asked to sway along with John Lennon’s “Imagine.” (“Imagine there’s no countries / It isn’t hard to do / Nothing to kill or die for / And no religion too.”)
When Hitchens weighs the pros and cons of religion in the recent past, the evidence he provides is sometimes lopsided. He discusses the role of the Dutch Reformed Church in maintaining apartheid in South Africa, but does not mention the role of the Anglican Church in ending it. He attacks some in the Catholic Church, especially Pope Pius XII, for their appeasement of Nazism, but says little about the opposition to Nazism that came from religious communities and institutions. In “Humanity: A Moral History of the Twentieth Century,” Jonathan Glover, who is the director of the Center of Medical Law and Ethics at King’s College London, documents such opposition, and writes, “It is striking how many protests against and acts of resistance to atrocity have . . . come from principled religious commitment.” The loss of such commitment, Glover suggests, should be of concern even to nonbelievers. Still, Hitchens succeeds in compiling a list of evils that the faithful, too, should find sobering. Now that so much charitable work is carried out by secular bodies, religious ones have to work harder to keep the moral high ground. For the Catholic Church in particular—with its opposition to contraception, including the distribution of condoms to prevent the spread of AIDS, and the covering up of child abuse by priests—the ledger is not looking good.
Bertrand Russell, who had a prodigious knowledge of history and a crisp wit, claimed in 1930 that he could think of only two useful contributions that religion had made to civilization. It had helped fix the calendar, and it had made Egyptian priests observe eclipses carefully enough to predict them. He could at least have added Bach’s St. Matthew Passion and more than a few paintings; but perhaps the legacy of religion is too large a conundrum to be argued either way. The history of the West has been so closely interwoven with the history of religious institutions and ideas that it is hard to be confident about what life would have been like without them. One of Kingsley Amis’s lesser-known novels, “The Alteration,” tried to envisage an alternative course for modern history in which the Reformation never happened, science is a dirty word, and in 1976 most of the planet is ruled by a Machiavellian Pope from Yorkshire. In this world, Jean-Paul Sartre is a Jesuit and the central mosaic in Britain’s main cathedral is by David Hockney. That piece of fancy is dizzying enough on its own. But imagine attempting such a thought experiment in the contrary fashion, and rolling it back several thousand years to reveal a world with no churches, mosques, or temples. The idea that people would have been nicer to one another if they had never got religion, as Hitchens, Dawkins, and Harris seem to think, is a strange position for an atheist to take. For if man is wicked enough to have invented religion for himself he is surely wicked enough to have found alternative ways of making mischief.
In the early days of the Christian era, nobody was fantasizing about a world with no religion, but there were certainly those who liked to imagine a world with no Christians. The first surviving example of anti-Christian polemic is strikingly similar in tone to that of some of today’s militant atheists. In the second century, it was Christians who were called “atheists,” because they failed to worship the accepted gods. “On the True Doctrine: A Discourse Against the Christians” was written in 178 A.D. by Celsus, an eclectic follower of Plato. The Christian deity, Celsus proclaimed, is a contradictory invention. He “keeps his purposes to himself for ages, and watches with indifference as wickedness triumphs over good,” and only after a long time decides to intervene and send his son: “Did he not care before?” Moses is said to be “stupid”; his books, and those of the prophets, are “garbage.” Christians have “concocted an absolutely offensive doctrine of everlasting punishment.” Their injunction to turn the other cheek was put much better by Socrates. And their talk of a Last Judgment is “complete nonsense.”
There’s not much more where that came from, because within a couple of hundred years Christians became the ones to decide who counted as an atheist and was to be punished accordingly. Pagan anti-Christian writings were destroyed wherever possible. In truth, from the start of the Christian era until the eighteenth century, there were probably very few people in the West who thought that there was no God of any sort. Those thinkers who had serious doubts about the traditional conception of God—of whom there were many in the seventeenth century—substituted another sort of deity, usually a more distant or less personalized one.
Even Voltaire, one of the fiercest critics of superstition, Christianity, and the Church’s abuse of power, was a man of deep religious feeling. His God, though, was beyond human understanding and had no concern for man. (Voltaire’s satirical tale “Candide,” which attacks the idea that all is for the best in a world closely watched over by a benevolent God, was partly inspired by a huge earthquake in Lisbon, which struck while the faithful were at Mass on All Saints’ Day in 1755 and killed perhaps thirty thousand people.)
Voltaire, like many others before and after him, was awed by the order and the beauty of the universe, which he thought pointed to a supreme designer, just as a watch points to a watchmaker. In 1779, a year after Voltaire died, that idea was attacked by David Hume, a cheerful Scottish historian and philosopher, whose way of undermining religion was as arresting for its strategy as it was for its detail. Hume couldn’t have been more different from today’s militant atheists.
In his “Dialogues Concerning Natural Religion,” which was published posthumously, and reports imaginary discussions among three men, Hume prized apart the supposed analogy between the natural world and a designed artifact. Even if the analogy were apt, he pointed out, the most one could infer from it would be a superior craftsman, not an omnipotent and perfect deity. And, he argued, if it is necessary to ask who made the world it must also be necessary to ask who, or what, made that maker. In other words, God is merely the answer that you get if you do not ask enough questions. From the accounts of his friends, his letters, and some posthumous essays, it is clear that Hume had no trace of religion, did not believe in an afterlife, and was particularly disdainful of Christianity. He had a horror of zealotry. Yet his many writings on religion have a genial and even superficially pious tone. He wanted to convince his religious readers, and recognized that only gentle and reassuring persuasion would work. In a telling passage in the “Dialogues,” Hume has one of his characters remark that a person who openly proclaimed atheism, being guilty of “indiscretion and imprudence,” would not be very formidable.
Hume sprinkled his gunpowder through the pages of the “Dialogues” and left the book primed so that its arguments would, with luck, ignite in his readers’ own minds. And he always offered a way out. In “The Natural History of Religion,” he undermined the idea that there are moral reasons to be religious, but made it sound as if it were still all right to believe in proofs of God’s existence. In an essay about miracles, he undermined the idea that it is ever rational to accept an apparent revelation from God, but made it sound as if it were still all right to have faith. And in the “Dialogues” he undermined proofs of God’s existence, but made it sound as if it were all right to believe on the basis of revelation. As the Cambridge philosopher Edward Craig has put it, Hume never tried to topple all the supporting pillars of religion at once.
In Paris, meanwhile, a number of thinkers began to profess atheism openly. They were the first influential group to do so, and included Denis Diderot, the co-editor of the Enlightenment’s great Encyclopédie, and Baron D’Holbach, who hosted a salon of freethinkers. Hume visited them, and made several friends there; they presented him with a large gold medal. But the philosophes were too dogmatic for Hume’s taste. To Hume’s like-minded friend the historian Edward Gibbon, they suffered from “intolerant zeal.” Still, they represented a historical vanguard: explicit attacks on religion as a whole poured forth within the next hundred years.
Since all the arguments against belief have been widely publicized for a long time, today’s militant atheists must sometimes wonder why religion persists. Hitchens says that it is born of fear and probably ineradicable. Harris holds that there are genuine spiritual experiences; having kicked sand in the faces of Judaism, Christianity, and Islam, he dives headlong into the surf of Eastern spirituality, encouraging readers to try Buddhist techniques of meditation instead of dangerous creeds. Dawkins devotes a chapter, and Dennett most of his book, to evolutionary accounts of how religion may have arisen and how its ideas spread. It’s thin stuff, and Dennett stresses that these are early days for a biological account of religion. It may, however, be too late for one. If a propensity toward religious belief is “hard-wired” in the brain, as it is sometimes said to be, the wiring has evidently become frayed. This is especially true in rich countries, nearly all of which—Ireland and America are exceptions—have relatively high rates of unbelief.
After making allowances for countries that have, or recently have had, an officially imposed atheist ideology, in which there might be some social pressure to deny belief in God, one can venture conservative estimates of the number of unbelievers in the world today. Reviewing a large number of studies among some fifty countries, Phil Zuckerman, a sociologist at Pitzer College, in Claremont, California, puts the figure at between five hundred million and seven hundred and fifty million. This excludes such highly populated places as Brazil, Iran, Indonesia, and Nigeria, for which information is lacking or patchy. Even the low estimate of five hundred million would make unbelief the fourth-largest persuasion in the world, after Christianity, Islam, and Hinduism. It is also by far the youngest, with no significant presence in the West before the eighteenth century. Who can say what the landscape will look like once unbelief has enjoyed a past as long as Islam’s—let alone as long as Christianity’s? God is assuredly not on the side of the unbelievers, but history may yet be. ♦

Wednesday, May 23, 2007


Eugen Weber, Authority on Modern France, Dies at 82
By ANDREW L. YARROW, The New York Times
Eugen Weber, a history professor and former dean at U.C.L.A. and a leading authority on modern France, has died at 82. The cause was pancreatic cancer, the university said.
Mr. Weber’s accessible style made him popular among students, historians and the public in the United States as well as in France, where his books on modern French history are considered classics. Over the years, hundreds of thousands of students got their first taste of modern European history from Mr. Weber’s best-selling textbooks like “A Modern History of Europe” (1971) and “Europe Since 1715: A Modern History” (1972).
And he was a familiar, charming presence to Americans who saw his acclaimed 52-part lecture series, “The Western Tradition,” produced by WGBH in Boston for public television in 1989. It became the basis of a video instructional series with companion books that students have used ever since.
Mr. Weber wrote about French history and taught it in the United States for more than 40 years. He published more than a dozen distinguished books, which have been translated into more than half a dozen languages. He was a professor of modern European history at the University of California at Los Angeles, and his classes were so popular that he became a campus celebrity. A chair there was endowed in his name.
Mr. Weber was encyclopedic in his depiction of an era, a movement or a social trend, focusing more on the many facets of everyday life than on historical theories. History, he wrote in “Europe Since 1715,” was “not just the epic of collective deeds, but the tissue of the times; not just what happened, but to whom and how; not just wars and politics, the doings of a relatively restricted group, but the way people lived — humbler and middling people, and the rich as well — their food, their housing, the warp and woof of their existence.”
Mr. Weber’s work was admired in the country that so fascinated him. Tony Judt, a professor of history at New York University and a leading writer on French history, once observed:
“On the whole, the French write their own history and write it with much sophistication. But occasionally they come across a foreigner who does it differently or better, and then, with much fanfare and generosity, they adopt him for their own. Such is the case of Eugen Weber.”
At least two of Mr. Weber’s works have become standard reading in France: “Action Française: Royalism and Reaction in Twentieth-Century France” (1962), a history of the royalist movement, which dominated the French right from the Dreyfus Affair until 1940; and “Peasants Into Frenchmen: The Modernization of Rural France, 1870-1914” (1976), an account of how a country that was still largely rural, “inhabited by savages” and a hodgepodge of cultures was transformed in the half-century after the Franco-Prussian War of 1870-71.
Mr. Weber maintained that before the 20th century, France was largely “a Parisian political project rather than a national reality.” Modern French identity, he said, was a relatively recent creation, a product of mass education, conscription and the coming of modern communications. After “Peasants” was published, Mr. Judt said, Mr. Weber’s thesis became the new orthodoxy.
Several of Mr. Weber’s books explored the development of right-wing nationalism in Europe. To Mr. Weber, nationalism had changed from a humanitarian, Enlightenment-based movement in the early 19th century into an angry, “tribal” and “exclusivist” movement in the 20th century. Despite its affinities with the historic right, the virulent, xenophobic and radical nationalism of the 20th century, he concluded, was statist and anti-individualistic.
In “The Nationalist Revival in France: 1905-1914,” published in 1959, Mr. Weber traced French right-wing nationalism from its royalist roots during the Third Republic beginning in the 1870s to the fascism of the Vichy regime in World War II.
Mr. Weber took a pragmatic approach to history. “Nothing is more concrete than history, nothing less interested in theories or in abstract ideas,” he once wrote. “The great historians have fewer ideas about history than amateurs do; they merely have a way of ordering their facts to tell their story. It isn’t theories they look for, but information, documents, and ideas about how to find and handle them.”
Eugen Joseph Weber was born in Bucharest, Romania, on April 24, 1925, the son of Sonia and Emmanuel Weber, an industrialist. At age 12, he was sent to boarding school in Herne Bay, in southeastern England, and later to Ashville College in the Lake District.
After graduating in 1943, he joined the British Army and was stationed in Belgium, India and occupied Germany. He rose to captain in the King’s Own Scottish Borderers, one of Britain’s oldest regiments. After his service ended in 1947, he attended Cambridge.
While at Cambridge, Mr. Weber spent two yearlong interludes in Paris, studying at the Institut d’Études Politiques and teaching English at the suburban Lycée Lakanal. He fell for France, he said, “just as one falls in love.”
While in London he met Jacqueline Brument, who was taking classes in art history. They married in 1950. Mr. Weber always said that he wrote for his wife, who is his only immediate survivor.
Mr. Weber spent a year as a visiting lecturer at U.C.L.A. and decided in 1954 to move permanently to North America. He taught at the University of Alberta in Edmonton and the University of Iowa before rejoining the U.C.L.A. faculty in 1956. The move began a tenure that was to last the rest of his life, interrupted only by visiting professorships and other academic work in France and the United States.
Mr. Weber helped to build the U.C.L.A. history department into one of the nation’s best. He was also dean of social sciences in 1976 and dean of the College of Letters and Sciences from 1977 to 1982.
Among his other books are “Paths to the Present: Aspects of European Thought From Romanticism to Existentialism” (1960); “Varieties of Fascism: Doctrines of Revolution in the Twentieth Century” (1964); “The European Right: A Historical Profile” (1965), which he edited and wrote with Hans Rogger, a colleague at U.C.L.A.; “France: Fin de Siècle” (1986); “The Hollow Years: France in the 1930s” (1994); and “Apocalypses: Prophecies, Cults and Millennial Beliefs Through the Ages” (1999).
Mr. Weber was a Fulbright and Guggenheim fellow and in 1977 was decorated with the Ordre National des Palmes Académiques for his contribution to French culture.
In his introduction to a book of collected essays, “My France: Politics, Culture, Myth” (1991), Mr. Weber admitted to having a mind that was “more like a jumbly hayloft than an orderly library.” Tireless in pursuit of detail, he delighted in sharing the fruits of a lifetime spent in provincial archives.
In a review of “My France” for The New York Times Book Review, Mr. Judt gave an example of Mr. Weber’s feel for France and his “richly textured sense of the complex otherness of the recent past” by noting his discussion of why “La Marseillaise,” which emerged from a Strasbourg drawing room on April 26, 1792, as the battle song of the Rhine Army, took almost a century to become the French national anthem.
“Mr. Weber reminds the reader that in 1790 only about three million people could speak French; as late as 1893, about one-quarter of the population of 30 million still had not mastered the national language,” Mr. Judt wrote. “Small wonder that it took so long for the ‘Marseillaise’ to catch on — many of those trying to learn the words could not understand them.”

Thursday, May 17, 2007


The Last Temptation of Al Gore
By Eric Pooley, Time Magazine
Let's say you were dreaming up the perfect stealth candidate for 2008, a Democrat who could step into the presidential race when the party confronts its inevitable doubts about the front runners. You would want a candidate with the grass-roots appeal of Barack Obama—someone with a message that transcends politics, someone who spoke out loud and clear and early against the war in Iraq. But you would also want a candidate with the operational toughness of Hillary Clinton—someone with experience and credibility on the world stage.
In other words, you would want someone like Al Gore—the improbably charismatic, Academy Award–winning, Nobel Prize–nominated environmental prophet with an army of followers and huge reserves of political and cultural capital at his command. There's only one problem. The former Vice President just doesn't seem interested. He says he has "fallen out of love with politics," which is shorthand for both his general disgust with the process and the pain he still feels over the hard blow of the 2000 election, when he became only the fourth man in U.S. history to win the popular vote but lose a presidential election. In the face of wrenching disappointment, he showed enormous discipline—waking up every day knowing he came so close, believing the Supreme Court was dead wrong to shut down the Florida recount but never talking about it publicly because he didn't want Americans to lose faith in their system. That changes a man forever.
It changed Gore for the better. He dedicated himself to a larger cause, doing everything in his power to sound the alarm about the climate crisis, and that decision helped transform the way Americans think about global warming and carried Gore to a new state of grace. So now the question becomes, How will he choose to spend all the capital he has accumulated? No wonder friends, party elders, moneymen and green leaders are still trying to talk him into running. "We have dug ourselves into a 20-ft. hole, and we need somebody who knows how to build a ladder. Al's the guy," says Steve Jobs of Apple. "Like many others, I have tried my best to convince him. So far, no luck."
"It happens all the time," says Tipper Gore. "Everybody wants to take him for a walk in the woods. He won't go. He's not doing it!" But even Tipper—so happy and relieved to see her husband freed up after 30 years in politics—knows better than to say never: "If the feeling came over him and he had to do it, of course I'd be with him." Perhaps that feeling never comes over him. Maybe Obama or Clinton or John Edwards achieves bulletproof inevitability and Gore never sees his opening. But if it does come, if at some point in the next five months or so the leader stumbles and the party has one of its periodic crises of faith, then he will have to decide once and for all whether to take a final shot at reaching his life's dream. It's the Last Temptation of Gore, and it's one reason he has been so careful not to rule out a presidential bid. Is it far-fetched to think that his grass-roots climate campaign could yet turn into a presidential one? As the recovering politician himself says, "You always have to worry about a relapse."
For now, at least, Gore is firmly in the program. He's working mightily to build a popular movement to confront what he calls "the most serious crisis we've ever faced." He has logged countless miles in the past four years, crisscrossing the planet to present his remarkably powerful slide show and the Oscar-winning documentary that's based on it, An Inconvenient Truth, to groups of every size and description. He flies commercial most of the time to use less CO2 and buys offsets to maintain a carbon-neutral life. In tandem with Hurricane Katrina and a rising chorus of warning from climate scientists, Gore's film helped trigger one of the most dramatic opinion shifts in history as Americans suddenly realized they must change the way they live. In a recent New York Times/CBS News poll, an overwhelming majority of those surveyed—90% of Democrats, 80% of independents, 60% of Republicans—said they favor "immediate action" to confront the crisis.
The day that poll was published, in April, I spent some time with Gore, 59, in his hotel room in Buffalo, N.Y., during a break between two slide-show events at the state university. Draped across an easy chair, he looked exhausted—not as heavy as he has been (he is dieting and working out hard these days) but flushed and a little bleary. He was in the throes of an eight-show week—4,000 people in Regina, Sask.; 1,200 in Indianapolis; 2,000 near Utica, N.Y.; a flight to New York City the night before for a meeting with U.N. Secretary-General Ban Ki-moon; then back to Buffalo this morning for a matinee for 4,000 and, soon, an evening show for 6,000. I congratulated him on the poll and mentioned the dozen or so states that—in the absence of federal action—have moved to restrict CO2 emissions. Gore wasn't declaring victory. "I feel like the country singer who spends 30 years on the road to become an overnight sensation," he said with a smile. "And I've seen public interest wax and wane before—but this time does feel different."
So Gore is turning up the pressure. He has testified before both houses of Congress, recommending policies and warning the lawmakers that the Alliance for Climate Protection, his nonprofit advocacy group, will be running ads in their districts next year. He has been meeting privately with the presidential candidates (but won't talk about the meetings or handicap the race). He has trained a small army of volunteers to give his slide show all over the world. And on July 7, he will preside over Live Earth, producer Kevin Wall's televised global rock festival (nine concerts on seven continents in a single day), designed to get 2 billion people engaged in the crisis all at once. Since Gore is sometimes accused of profiting from the climate crisis, it's worth noting that he donates all his profits from the Inconvenient Truth movie and book to the alliance. He can afford to: he's a senior adviser at Google and sits on the board of directors at Apple. He's also a co-founder of Current TV, the cable network that was an early champion of user-generated content, and chairman of Generation Investment Management, a sustainable investment fund with assets approaching $1 billion. "I'm working harder than I ever have in my life," he says. "The other day a friend said, 'Why don't you just take a break, Al, and run for President?'"
That night, at the University at Buffalo’s Alumni Arena, there was a moment when Gore seemed to be doing just that. After the people—students, middle-aged men and women, retirees—took their seats, images of the earth appeared on three giant screens, and a natural-born teacher took them on a two-hour planetary tour. He was playful, eloquent, fully restored from his afternoon lull. He has given this presentation some 2,000 times yet still imbues it with a sense of discovery. He laid out the overwhelming evidence that human activity has given the earth a raging fever, then urged the people to respond—"If the crib's on fire, you don't speculate that the baby's flame retardant! If the crib's on fire, you save the baby!" Yet he was optimistic. There's still time to act—two decades at most, according to the U.N.'s Intergovernmental Panel on Climate Change—and by rising to meet the challenge, this generation will achieve "the enhanced moral authority" it needs to solve so many other problems. Then, suddenly, Gore was laying American democracy itself on the couch, asking why the U.S. has been unable to take action on global warming, why it has made so many other disastrous choices—rushing into war in Iraq, spying on Americans without search warrants, holding prisoners at Guantánamo Bay without due process.
"I'm trying to say to you, be a part of the change," he told the crowd. "No one else is going to do it. The politicians are paralyzed. The people have to do it for themselves!" He was getting charged up now. "Our democracy hasn't been working very well—that's my opinion. We've made a bunch of serious policy mistakes. But it's way too simple and way too partisan to blame the Bush-Cheney Administration. We've got checks and balances, an independent judiciary, a free press, a Congress—have they all failed us? Have we failed ourselves?"
As it happens, these are the themes that animate The Assault on Reason, Gore's new book (an excerpt follows). The crowd seemed to like them—people were hollering and stomping on the aluminum risers—and right on cue, a bright-eyed Buffalo student named Jessica Usborne stood up and asked the Question. "Given the urgency of global warming, shouldn't you not only educate people but also help implement the changes that will be necessary—by running for President?" The place erupted, and Usborne dipped down onto one knee and bowed her head. Her dark hair fell across her eyes and her voice rose. "Please! I'll vote for you!" she cried above the crowd's roar, which sounded like a rocket launcher and lasted almost 30 seconds, all but drowning out Gore's simple, muted, five-word response: "I'm not planning to run."
Sorry, Jessica, there is no stealth campaign. Despite what you may have read, there are no shadowy meetings in which Gore and his operatives plot his path to power. There is no secret plan. There's only a vigorous draft-Gore movement that he has nothing to do with (two independent websites—draftgore.com and algore.org—have gathered almost 150,000 signatures so far) and, from time to time, social events at which old Gore hands get together and play a few friendly rounds of what-if.
Some people who know Gore assume he's biding his time, waiting to pounce; since he's at 12% in the polls—tied with John Edwards, without even being in the race—he would easily get on the primary ballots if he declared before the deadlines. He may not be rich enough to self-finance, but with his Apple and Google stock, Web following and Silicon Valley connections, money wouldn't be a huge problem either. "He just has to say the word," says a wealthy friend. But those who know him well would be very surprised if it happened. He hasn't built a shadow organization. His travel isn't calibrated to the primaries. And he's just not thinking much about politics anymore. "He used to be intensely interested in political gossip—who's up in the latest poll, and did you hear about so-and-so," says Carter Eskew, an old friend and former media adviser. "I haven't had a conversation like that with him since 2002 or 2003 [around the time he decided not to seek a rematch against Bush]. He's moved on, at least for the time being." In recent months, as Gore moneymen were recruited by other campaigns, they checked in with Gore. "I said, 'If I'm raising money for the wrong person, please tell me,'" says one. "Everyone asked that question, and his answer was always the same: 'Don't keep your money in your pocket waiting for me.'"
People looking for signs that Gore has a secret plan often point to the fact that he has lost a few pounds and hopes to lose many more. They mention that he hasn't asked the draft organizers to stop, the way he did before the 2004 election. They point out that in May, a group of former Gore fund raisers met at the Washington home of his onetime chief of staff, Peter Knight. (Someone handed out buttons that said "Al Gore Reunion 2007," but it was just a social event; Gore didn't attend.) They cite October as a good time for him to get in, since that's when the Nobel Committee announces its Peace Prize. Finally, they point to The Assault on Reason, the sort of book that could be a talisman of intent, since it takes aim at George W. Bush from multiple directions, diagnoses what's wrong with our democracy and offers ideas for curing it. Why else would you write a book like that, they say, if you weren't laying down a marker for 2008?
Al and Tipper Gore's home, a 1915 antebellum-style mansion in the wealthy Belle Meade section of Nashville, is laid out a bit like Gore himself: a gracious and formal Southern façade; slightly stuffy rooms when you walk in the door; and startlingly modern, relaxed, informal living spaces to the rear. The Gores bought the old place five years ago and are still retrofitting it, making it energy efficient with new windows, new heating and cooling units, solar panels on the roof. (The anti-Gore crowd zinged him recently because his electricity bill last August was 10 times the local average. The Gores pay extra to get 100% of their power from renewable sources, and their zealous retrofitting will no doubt bring their costs down. But it stung.) A new addition has a slate-floor family room (with a pool table and a flat-panel TV; Tipper's drum set and some nice acoustic guitars are nearby) and a gym and an office suite upstairs; there's a set of his-and-hers hybrid Mercury SUVs in the garage. Al Gore and I settle down on the patio, near the swimming pool and the barbecue. "Did some grilling last night with my friend Jon Bon Jovi," he says. "His new record is great." He props his black cowboy boots on a brightly painted folk-art coffee table, scratches his mutt Bojangles behind the ears and talks about The Assault on Reason.
"The real reason I wrote the book," he begins, "is that I've tried for years to tell the story of the climate crisis, and it has taken far too long to get through. When the best evidence is compiled and there's no longer room for dragging out a pointless argument, we're raised as Americans to believe our democracy is going to respond. But it hasn't responded. We're still not doing anything. So I started thinking, What's going on here?" While Gore was mulling that, another test of American democracy presented itself—the walk-up to war in Iraq—and American democracy flunked again. "In both cases, our democracy was pushed around by false impressions and wasn't able to hold its focus," he says. "That's the common denominator. Once I'd thought through all of that, I couldn't not write this book."
The Assault on Reason will be hailed and condemned as Gore's return to political combat. But at heart, it is a patient, meticulous examination of how the participatory democracy envisioned by our founders has gone awry—how the American marketplace of ideas has gradually devolved into a home-shopping network of 30-second ads and mall-tested phrases, a huckster's paradise that sells simulated participation to a public that has all but lost the ability to engage. Gore builds his argument from deep drafts of political and social history and trenchant bits of information theory, media criticism, computer science and neurobiology, and reading him is by turns exhausting and exhilarating. One moment he is lecturing you about something you think you know pretty well, and the next moment he's making a connection you had never considered. The associative leaps are dazzling, but what will stoke the Democratic faithful are his successive chapters on the Iraq war, each one strafing the Administration for a different set of misdeeds: exploiting the politics of fear, misusing the politics of faith, misleading the American people, throwing out the checks and balances at the heart of our democracy, undermining the national security and degrading the nation's image in the world. For anyone who stepped into the Oval Office now and tried to end the war, he says, "it would be like grabbing the wheel of a car that's in mid-skid. You're just trying to work the wheel to see what pulls you out of it." But the mess we're in can't be blamed solely on the President or the Vice President or the post-9/11 distortion field that muzzled the media, immobilized Congress and magnified Executive power. "I think this started before 9/11, and I think it's continued long after the penumbra of 9/11 became less dominant," he says. "I think it is part of a larger shift driven by powerful forces"—print giving way to television as our dominant medium for examining ideas, television acting on our brains in ways that scientists are just beginning to unlock. As such, it's not the sort of problem that legislation is going to fix. Gore hopes that the Internet, which is so good at inviting people back into the conversation, will be the key to restoring American democracy. "It's going to take time," he says. "After all, we've been veering off course for a while."
If that sounds like a reference to 2000, so be it. But some will be disappointed to learn that Gore's book does not contain his long-suppressed account of that contentious year. He has never opened up publicly about the Florida debacle, and even in private he avoids the topic. Friends say he thinks the Supreme Court basically stole the election, but he won't say it. He has never indulged in postmortems—not even in the immediate aftermath. His psychological survival depended on looking ahead. "It was all about what's next," says his friend Reed Hundt, who was FCC chairman during the Clinton years. "He was not willing to be a victim—didn't want to call himself that, didn't want people to think of him that way. He didn't want Americans to doubt America."
Gore often compares the climate crisis to the gathering storm of fascism in the 1930s, and he quotes Winston Churchill's warning that "the era of procrastination" is giving way to "a period of consequences." To his followers, Gore is Churchill—the leader who sounds the alarm. And if no declared candidate steps up to lead on this issue, many of them believe he will have a "moral obligation"—you hear the phrase over and over—to jump in. "I understand that position and I respect it, but I'm not convinced things will evolve that way," says Gore. "If I do my job right, all the candidates will be talking about the climate crisis. And I'm not convinced the presidency is the highest and best role I could play. The path I see is a path that builds a consensus—to the point where it doesn't matter as much who's running. It would take a lot to disabuse me of the notion that my highest and best use is to keep building that consensus."
What would it take, specifically?
"I can't say because I'm not looking for it. But I guess I would know it if I saw it. I haven't ruled it out. But I don't think it's likely to happen."
His wife is more blunt. "He's got access to every leader in every country, the business community, people of every political stripe," says Tipper. "He can do this his way, all over the world, for as long as he wants. That's freedom. Why would anyone give that up?"
Gore knows it's in his interest to keep the door ajar. It builds curiosity. Before he could get serious about running, however, he would have to come to terms with the scars of 2000 and accept the possibility that he could lose again in 2008. That prospect may be too much to bear. "If he ran, there's no question in my mind that he would be elected," says Steve Jobs. "But I think there's a question in his mind, perhaps because the pain of the last election runs a lot deeper than he lets most of us see." There's an even deeper issue here, and with Gore, it's always the deepest issue that counts. What's at stake is not just Gore losing another election. It's Gore losing himself—returning to politics and, in the process, losing touch with the man he has become.
He was never quite the wooden Indian his detractors made him out to be in 2000 (nor did he claim to have invented the Internet), but he did carry himself with a slightly anachronistic Southern formality that was magnified beneath the klieg lights of the campaign. And his fascination with science and technology struck some voters (and other politicians) as weird. "In politics you want to be a half-step ahead," says Elaine Kamarck, his friend and former domestic-policy adviser. "You don't want to be three steps ahead." But now his scientific bent has been vindicated. The Internet is as big a deal as he said it would be. Global warming is as scary as he had warned. He wasn't being messianic, as people used to say, just prescient. And today he's still the same serious guy he always was, but the context has changed around him. He used to spend his time in Washington, but now his tech work takes him to Silicon Valley, to the campuses of Apple and Google, where his kind of intellectual firepower is celebrated. At Apple, where Jobs invited him to join the board in 2003, Gore patiently nudged the CEO to adopt a new Greener Apple program that will eliminate toxic chemicals from the company's products by next year. Last summer, Gore led the committee that investigated an Apple scandal—the backdating of stock options in the years before Gore joined the board—and cleared Jobs of wrongdoing. Political people were surprised Gore took that controversial assignment. "That's silly," he says.
Gore's role at Google is less formal. He started as a senior adviser when it was still a small company, before the IPO. "I assumed he'd give us geopolitical advice," says CEO Eric Schmidt, "and he did—but he was also superb at management and leadership. He likes to dive into teams that don't get a lot of attention—real engine-room stuff, like problems inside an advertising support group. He offers his strategies and solutions and then goes on his way. It's fun for him."
"It aggravates me when people say, 'He's the real Al Gore now' or 'He's changed,'" says Tipper. "Excuse me! He hasn't changed that much. This is somebody I have always known." The old Gore, she says, "was an unfair stereotype painted by cliques in the media and Republican opponents. Now, yes, there were constraints"—the vice presidency, the Monica mess, the campaign—"that weighed on him. And, yes, you grow and you change and you learn. So I see the same person, and I also see a new person who is free and liberated and doing exactly what he wants to do. And that is fabulous."
That's the person Gore would risk losing if he re-entered politics. "He learned something from his very difficult time after 2000," says Schmidt. "I think he got more comfortable with who he is. He had to go through a difficult personal transformation in order to achieve greatness. That sets him up for the next chapter. I have no idea what he'll do. My advice is to do whatever he's most passionate about. Because that is working."
"The slide show is a journey," says Gore, standing beside his trusty screen in a Nashville hotel ballroom. It's mid-March, and he's addressing 150 people—students, academics, lawyers, a former Miss Oklahoma contestant, a fashion designer, a linebacker for the Philadelphia Eagles. They've come at their own expense to learn how to give the slide show. There's an undeniable buzz in the room, the feeling that takes over a group that knows it's part of something that's big and getting bigger.
It has been five years since Tipper first urged her husband to dust off his slide show. The couple was still climbing from the wreckage of 2000, and she was convinced that his survival depended on reconnecting with his core beliefs. He assembled the earliest slide show in 1989, while writing Earth in the Balance—carrying an easel to a dinner party at David Brinkley's house, standing on a chair to show CO2 emissions heading off the charts. She wanted him to find that passion again. They were living in Virginia, and the Kodak slides were gathering dust in the basement. So he pulled them out, arranged them in the carousel and gave his first show with the images mostly backward and upside down. Tipper said, "Hey, Mr. Information Superhighway, they have computers now. Maybe you should use one."
A year passed before they realized what a phenomenon this was becoming. "We were on tour, doing the slide show, and men and women would come up to Al after," Tipper says. "Silently weeping." The weather started getting unmistakably weird, and Gore kept working on the slides, making the show more powerful. Producer Laurie David and director Davis Guggenheim saw it and asked him to turn it into a film. Gore didn't think it would work as a movie. It has now grossed $50 million globally and sold more than 1.5 million DVD copies, and its viral effect continues. In Los Angeles, producer Kevin Wall saw it and decided to put on the global extravaganza called Live Earth. In Washington, a retired Republican businessman named Gary Dunham—in town from Sugar Land, Texas, for his wife's Daughters of the American Revolution convention—saw it and started giving his own version of the show to anyone who would listen. Dunham became the first of more than 1,200 to be trained as presenters. "All the trainees will tell you the same thing," he says. "That movie changed our lives."
In the ballroom, Gore gives the trainees some advice about the limits of time and complexity. ("Trust me on this. If audiences had an unlimited attention span, I'd be in my second term as President.") Even more important is the hope budget. "You're telling some not only inconvenient truths but hard truths, and it can be scary as hell. You're not going to get people to go with you if you paralyze them with fear."
And then, for the next five hours, Gore walks them through it, slide by slide, deconstructing the art and science, making it clear both how painstakingly well crafted and how scrupulous it is. He relishes the process, taking his time, bathing these people in a sea of data in which he has been splashing happily for years. He punctuates his presentation with pithy attention grabbers—"O.K., here's the key fact ... Here's your pivot ..."—and brings to bear much of what he knows about politics. "Here's something you need to know about for defensive purposes," he says, explaining the science behind a terrifying series of slides illustrating how a 20-ft. rise in sea level would swamp Florida, San Francisco, the Netherlands, Calcutta and lower Manhattan. The trainees are scribbling hard, arming themselves. Gore smiles. He was always better at political combat than people give him credit for. Later, a woman stands up in the back of the big room and asks the Question. "Not to put any pressure on you," she says, "but, by golly, we deserve a leader like you." They've got one—whether or not he runs.
"I have enjoyed the luxury of being able to focus single-mindedly on this issue," says Gore, back on the patio at his Nashville home. "But I am under no illusions that any position has as much ability to influence change as the presidency does. If the President made climate change the organizing principle, the filter through which everything else had to flow, then that could really make a huge difference."
What would President Gore do? Well, on Capitol Hill in March, Citizen Gore offered his ideas. He advocates an immediate freeze on CO2 emissions and a campaign of sharp reductions—90% by 2050. To get there, he would eliminate the payroll tax and replace it with a carbon tax, so the cost of pollution is finally priced into the market. "I understand this is considered politically impossible," he told the House Energy and Commerce Committee. "But part of our task is to expand the limits of what's possible." He would adopt a cap-and-trade program that would allow U.S. industry to meet reduction targets in part by trading pollution credits. Critics often dismiss carbon offsets as the green equivalent of religious indulgences, but in fact they stimulate the market—moving entrepreneurs to find dirty plants, clean them up and sell the CO2 reductions. Gore also wants a moratorium on new coal-fired power plants that don't capture and store their carbon emissions and much higher fuel-economy standards for cars. After Gore presented these views on Capitol Hill, critics assailed them as costly, unworkable economy cripplers. His reply: in a few years, when the crisis worsens, these proposals "will seem so minor compared to the things people will be demanding then." And, of course, he's not running for anything these days. He's in the vision business now.
I ask Gore if he regrets not having made climate change the organizing principle of his cautious 2000 campaign. Doing so might not have won many votes by itself, but it might have helped free him from the consultants, unleashing a more authentic Gore—and that could have made all the difference. "There's a tree-falls-in-the-forest factor here," he says. "Because the many speeches that I made about this were not really reported. More than half the articles written about global warming that year said it might not even be real. But I take responsibility for not having the skills needed to break through the clutter. At least not then. Perhaps I still don't."
But what if he does? What if he could take who he is now, all that he's learned, and carry it back into the maelstrom? Could he stay as he is or would he revert? What if he launched a new kind of campaign: no handlers, just the liberated Gore talking about what really matters to him? Would he seem too squishy? These days he improvises, giving freer rein to matters of the heart and spirit than he ever could as a candidate. He draws from a number of faiths, from philosophy and self-help and poetry and from Gandhi's concept of truth force, the idea that people have an innate ability to recognize the most powerful truths. He often cites an African proverb that says, "If you wish to go quickly, go alone. If you wish to go far, go together." Then he builds on it. "We have to go far, quickly," he said in April at the Tribeca Film Festival, where he was introducing a series of environmental films that will be shown at Live Earth. "We have to make it through an uncharted region, to the outer boundaries of what's known, beyond the limits of what we imagine is doable." Then he recited a famous line from the poet Antonio Machado: "Pathwalker, there is no path. You must make the path as you walk." I once heard him get tangled in that line during the 2000 campaign, but this time, he wasn't trying too hard. "We must find a path that we create together, quickly," he said. "With truth force. To seize the opportunity that lies before us." His words were simple, direct and powerful. One clue to how he found that power lies at the end of the poem, in a line Gore doesn't recite, as the poet reveals his desire "to be what I have never been ... a man all alone, walking with no road, with no mirror."
Gore is not carrying a mirror. He's not selling himself; he's selling a cause, a journey. There are no consultants fussing at him, telling him how to be himself. "There's no question I'm freed up," he says. "I don't want to suggest that it's impossible to be free and authentic within the political process, but it's obviously harder. Another person might be better at it than I was. And it's also true that the process is changing and that it may become freer in time. Obama is rising because he is talking about politics in a way that feels fresh to people ... But anyway, I came through all of that"—he waves a hand that seems to encompass everything, the advisers pecking at him, the attacks in the media, his own mistakes, the unspeakable Florida debacle—"and I guess I changed. And now it is easier for me to just let it fly. It's like they say: What doesn't kill me makes me stronger." What would this Gore be like as a candidate? This Gore is just not all that tempted to find out.
Faith-Based Fraud
By Christopher Hitchens
Posted Wednesday, May 16, 2007, at 12:46 PM ET
The discovery of the carcass of Jerry Falwell on the floor of an obscure office in Virginia has almost zero significance, except perhaps for two categories of the species labeled "credulous idiot." The first such category consists of those who expected Falwell (and themselves) to be bodily raptured out of the biosphere and assumed into the heavens, leaving pilotless planes and driverless trucks and taxis to crash with their innocent victims as collateral damage. This group is so stupid and uncultured that it may perhaps be forgiven. It is so far "left behind" that almost its only pleasure is to gloat at the idea of others being abandoned in the same condition.
The second such category is of slightly more importance, because it consists of the editors, producers, publicists, and a host of other media riffraff who allowed Falwell to prove, almost every week, that there is no vileness that cannot be freely uttered by a man whose name is prefaced with the word Reverend. Try this: Call a TV station and tell them that you know the Antichrist is already on earth and is an adult Jewish male. See how far you get. Then try the same thing and add that you are the Rev. Jim-Bob Vermin. "Why, Reverend, come right on the show!" What a fool Don Imus was. If he had paid the paltry few bucks to make himself a certified clergyman, he could be jeering and sneering to the present hour.
Falwell went much further than his mad 1999 assertion about the Jewish Antichrist. In the time immediately following the assault by religious fascism on American civil society in September 2001, he used his regular indulgence on the airwaves to commit treason. Entirely exculpating the suicide-murderers, he asserted that their acts were a divine punishment of the United States. Again, I ask you to imagine how such a person would be treated if he were not supposedly a man of faith.
One of his associates, Bailey Smith, once opined that "God does not hear the prayers of a Jew." This is one of the few anti-Semitic remarks ever made that has a basis in fact, since God does not exist and does not attend to any prayers, but Smith was not quite making that point. Along with his friend Pat Robertson, who believes in secret Jewish control of the world of finance, and Billy Graham, who boasted to Richard Nixon that the Jews had never guessed what he truly thought of them, Falwell kept alive the dirty innuendo about Jews that so many believing Christians seem to need. This would be bad enough in itself, and an additional reason to deplore the free ride he was given on television, if his trade-off had not been even worse.
Seeking to deflect the charge of anti-Jewish prejudice, Falwell adopted the cause of the most thuggish and demented Israeli settlers, proclaiming that their occupation of the West Bank and Gaza was a holy matter and hoping that they might help to bring on Armageddon and the return of the Messiah. A detail in this ghastly narrative, as adepts of the "Left Behind" series will know, is that the return of the risen Christ will require the mass slaughter or mass conversion of all Jews. This consideration did not prevent Menachem Begin from awarding Falwell the Jabotinsky Centennial Medal in 1980 and has not inhibited other Israeli extremists from embracing him and his co-thinkers ever since. All bigots and frauds are brothers under the skin. Trying to interrupt the fiesta of piety on national television on the night of Falwell's death, I found myself waiting while Ralph Reed went all moist about the role of the departed in empowering "people of faith." Here was the hypocritical casino-based Christian who sought and received the kosher stamp from Jack Abramoff. Perfect.
Like many fanatical preachers, Falwell was especially disgusting in exuding an almost sexless personality while railing from dawn to dusk about the sex lives of others. His obsession with homosexuality was on a par with his lip-smacking evocations of hellfire. From his wobbly base of opportunist fund raising and degree-mill money-spinning in Lynchburg, Va., he set out to puddle his sausage-sized fingers into the intimate arrangements of people who had done no harm. Men of this type, if they cannot persuade enough foolish people to part with their savings, usually end up raving on the street and waving placards about the coming day of judgment. But Falwell, improving on the other Chaucerian frauds from Oral Roberts to Jim Bakker to Ted Haggard, not only had a TV show of his own but was also regularly invited onto mainstream ones.
The evil that he did will live after him. This is not just because of the wickedness that he actually preached, but because of the hole that he made in the "wall of separation" that ought to divide religion from politics. In his dingy racist past, Falwell attacked those churchmen who mixed the two worlds of faith and politics and called for civil rights. Then he realized that two could play at this game and learned to play it himself. Then he won the Republican Party over to the idea of religious voters and faith-based fund raising. And now, by example at least, he has inspired emulation in many Democrats and liberals who would like to borrow the formula. His place on the cable shows will be amply filled by Al Sharpton: another person who can get away with anything under the rubric of Reverend. It's a shame that there is no hell for Falwell to go to, and it's extraordinary that not even such a scandalous career is enough to shake our dumb addiction to the "faith-based."

Christopher Hitchens is a columnist for Vanity Fair and the author of God Is Not Great: How Religion Poisons Everything.

Tuesday, May 15, 2007

Conservative Evangelist Jerry Falwell Dies at 73
By Joe Holley
Washington Post Staff Writer
Tuesday, May 15, 2007; 5:56 PM
Jerry Falwell, 73, a Southern Baptist preacher who as founder and president of the Moral Majority presided over a marriage of Christian religious belief and conservative political values -- a bond that bore prodigious fruit for the Republican Party during the past 25 years -- died May 15 of congestive heart failure after he was found unconscious in his office at Liberty University in Lynchburg, Va.
According to a school spokesman, he was taken to Lynchburg General Hospital, where CPR efforts were unsuccessful.
With his outspoken pronouncements on matters moral, political and religious, Falwell became not only one of the most polarizing religio-political figures in America but also one of the most powerful. He built one of the nation's first mega-churches, founded a cable television network and a growing Bible-based university, and was considered the voice of the religious right in the early 1980s. In 1983, U.S. News & World Report named him one of the 25 most influential people in America.
Although his political influence and public profile had diminished in recent years as he devoted more of his time to Liberty University, his positions on a number of core issues have become canonical for the mainstream of the modern Republican Party. Liberty also has become a stop on the campaign trail for Republican presidential candidates, most recently Arizona Sen. John McCain.
Last year, six years after labeling Falwell one of the political "agents of intolerance," McCain delivered the commencement address at the university. Falwell told The Washington Post that he believed a resolution of their past differences helped McCain politically, noting the political power of some 80 million evangelicals in the United States.
"For all his critics, he was the most instrumental person in getting a heretofore apolitical group to become politically engaged. And that's no small accomplishment," said Michael Cromartie, an expert on evangelicals at the Washington-based Ethics and Public Policy Center.
A large man, whose preacherly voice and cocksure confidence could drive his detractors into paroxysms of rage, he had a penchant for provocative comments. Perhaps his most provocative came on Sept. 13, 2001, when he appeared on "The 700 Club," the Rev. Pat Robertson's TV show, and blamed pagans, abortionists, feminists, gays, the ACLU and others for 9-11.
"I point the finger in their face and say, 'You helped this happen,'" he said.
He later apologized, telling Geraldo Rivera that his words were the result of fatigue.
"I would never blame any human being except the terrorists, and if I left that impression with gays or lesbians or anyone else, I apologize," he told CNN.
Although Falwell told Geraldo Rivera his choice of words was the result of fatigue, he was a master media provocateur, said Mel White, Falwell's former speechwriter and ghostwriter for many of his books.
"He was a media genius, but part of that was in exaggerating, hyperbole and outrageousness," said White. "He told me once that if he didn't have people protesting him, he'd have to hire them. He felt it was publicity for the kingdom of God."
White, who left Falwell's employ when he announced in 1994 that he was gay, continued to attend Falwell's church and to live with his partner across the street from the church. He and Falwell remained friends.
"I invested so much in believing he could change, but he went and died instead," White said.
Before Falwell, Southern Baptists and most other evangelical Christian groups were reluctant to get involved in "things of this world," including politics; they had their eyes on what they considered higher things, primarily saving souls. When Falwell founded the Moral Majority in 1979, fellow fundamentalist Bob Jones called the organization "the work of Satan," because the organization was making common cause with Catholics, Mormons and Jews in an ecumenical-political alliance. "Many people forget that Falwell had critics to his right," Cromartie said.
"He came to understand that if people of faith were not engaged in the larger culture, eventually the culture would move in a direction so hostile to its values it would be difficult to live in that culture," said Ralph Reed Jr., former executive director of the Christian Coalition. "If the culture becomes polluted, then ultimately the church and the faith community suffer."
Meanwhile Falwell, Pat Robertson, Jim Bakker and other televangelists throughout the South quickly mastered the new media available to them, primarily cable television, and built huge new audiences of people hungry for traditional values and increasingly agitated by what they saw as the moral decline of America. Falwell, who started out doing local radio and television in Lynchburg, became president of the Liberty Broadcasting Network in 1985.
In an interview with the Lynchburg News and Advance available on the Jerry Falwell Ministries website, Falwell contended that "America began losing her soul only a generation ago." He decried prayer expelled from public schools, legalized abortion, a high divorce rate, teen pregnancy, a drug epidemic, the gay and lesbian lifestyle, school violence and pornography. "America is in serious jeopardy of self-destructing," he said.
Liberals, leftists, anti-God politicians and activist judges were primarily to blame, Falwell proclaimed over the years. A fusion of politics and conservative Christian piety became the antidote.
Falwell founded the Moral Majority with the express purpose of organizing a Christian right electorate, registering voters, raising funds for candidates and exerting political leverage at state and national levels. The organization first applied that leverage in Ronald Reagan's election to the presidency in 1980, helping forge a bond between the Republican Party and the religious right that remains strong.
Although George H. W. Bush shared many of the same positions as his predecessor, he didn't claim the affections of the religious right the way Reagan did. After enduring eight long years in the Bill Clinton wilderness -- despite Clinton's Southern Baptist heritage -- the religious right became increasingly euphoric during the first term of George W. Bush's administration, as political adviser Karl Rove assiduously courted Falwell and other religious right leaders.
"Moral Majority by necessity became the lightning rod of the conservative movement," Falwell told the New York Times in 1987. "It was first. It was extremely successful in 1980. And that brought down a firestorm from all who disagreed."
In 1983, just as the "firestorm" began to rage, Larry Flynt's sex magazine Hustler carried a parody of a Campari ad that featured a fake interview with Falwell in which he admits to incest with his mother. He sued, alleging invasion of privacy, libel and intentional infliction of emotional distress. A jury rejected the invasion of privacy and libel claims, holding that parody could not reasonably be considered a description of actual events, but ruled in favor of Falwell on the emotional distress claim.
After the ruling was upheld on appeal, Flynt appealed to the U.S. Supreme Court and won. The court confirmed that public figures cannot recover damages based on emotional distress caused by parodies.
In 1987, Falwell took over the scandal-plagued PTL ministry of its disgraced founder Jim Bakker. PTL gave him access to a nationwide cable television network that reached 13.5 million homes. Unable to salvage the Bakker empire, with its deficit of some $70 million, he resigned a few months later.
During the 1980s, Falwell preached three times a week at the Thomas Road Baptist Church in Lynchburg, hosted "The Old Time Gospel Hour" on TV stations around the country, taped a half-hour of Bible study for daily broadcasts on several hundred radio stations and a five-minute news commentary carried on radio stations three times a day, and covered up to 5,000 miles a week by jet for appearances at rallies and meetings.
"I am not a Republican, I am not a Democrat! I am a noisy Baptist!" he told crowds of supporters.
Falwell dissolved the Moral Majority in 1989, insisting the organization had accomplished what he had set out to accomplish.
"He had awakened the slumbering giant of evangelical politics and made it a force to be reckoned with," Reed said. "It has become the most critical and vibrant constituency in the American electorate, certainly on the Republican side."
Jerry Lamon Falwell was born August 11, 1933, in Lynchburg, where he grew up in the rough, blue-collar neighborhood of Fairview Heights. His family was relatively affluent, thanks to his father, a Prohibition-era bootlegger who owned a bus line, gas stations, a nightclub, a restaurant and a motel. An alcoholic and an agnostic who hated preachers (until he converted to Christianity while lying on his deathbed), he died when Falwell was 15.
In high school, Falwell played football and edited the school newspaper. He graduated as valedictorian, but the principal would not allow him to give the valedictory address because of a minor bit of mischief he had committed back in the third grade.
He was an 18-year-old student at Lynchburg College, the first in his family to get past high school, when he became a Christian in 1952. He had dreamed about being a professional baseball player and seriously considered becoming a journalist, but soon began to feel a call to the ministry.
After two years at Lynchburg College, he transferred to Baptist Bible College, a radically fundamentalist, unaccredited school in Springfield, Mo., where, he said, "God literally turned my life around." Working part time as a youth pastor at Kansas City Baptist Temple, he was invited to bring the Sunday morning message on a day the regular minister was out of town. When 19 of his listeners responded to his sermon by giving their lives to Christ, he knew in his heart that preaching was his God-given work.
Falwell returned to his hometown in 1956 and, at 22, founded Thomas Road Baptist Church in the old Donald Duck Cola building. In his autobiography, "Strength for the Journey" (1987), he explained how he built the congregation, starting with 35 charter members. He knocked on a hundred doors a day, on some days from 9 in the morning until 10 at night, knowing he might encounter "a sick child who needed prayer, a lonely and frightened widow who needed someone to talk to, an isolated alcoholic who wanted help. . . ."
Those people, and their friends and families, made their way to Falwell's church. Today Thomas Road has more than 22,000 members and held its 50th anniversary celebration last year in a new building near Liberty University. Falwell also built Christian elementary schools, the Elam Home for alcohol and drug-dependent men and the Liberty Godparent Home for unwed mothers.
He was chancellor and president of Liberty University, an institution he founded in 1971 as Lynchburg Baptist College, later renamed Liberty Baptist College. The college opened with 154 students and four full-time faculty members. Today the university enrolls more than 10,000 students. On its 4,400-acre campus is a tombstone-shaped shrine to fetuses aborted since the Supreme Court legalized the practice in 1973 and a museum of "creationism."
Falwell intended Liberty to be for evangelical Christians what Brigham Young University was to Mormons and Notre Dame University to Catholics. He hoped to raise $100 million in endowment funds during his lifetime and expand the student body to 20,000.
The university's website noted that he was regularly seen driving around campus in his Suburban and was called Jerry by most of the students, many of whom he knew by name. He rarely missed a home football, basketball or baseball game.
Falwell continued as senior pastor at Thomas Road Baptist Church but turned over the daily administration of the church to his son, Jonathan, and 15 other ministers. His philosophy -- to change with the culture without abandoning core principles -- made the church successful, David Randlett, the church's senior associate pastor, told The Washington Post in 2005.
"Most older ministers can't do that, but Jerry Falwell is unique," Randlett said. "He knows the Bible doesn't change, but the delivery has to, in order to speak the language of the public."
He could still be provocative. An outspoken supporter of South Africa's apartheid regime in the 1980s, he visited the country and voiced his support for the white minority government.
In 1999, he told an evangelical conference that the Antichrist was a male Jew alive in the world today. He later apologized for his remarks but not for holding the belief. That same year he warned parents that Tinky Winky, a character on the children's TV show "Teletubbies," was a gay role model.
On "60 Minutes" in 2002, he labeled Muhammad a terrorist.
In 2004, after voters told pollsters that moral values were important to them in the presidential election, Falwell founded the Faith and Values Coalition, calling it the "21st century resurrection of the Moral Majority." The organization's objectives included support for anti-abortion judges and a constitutional amendment banning same-sex marriage.
Survivors include his wife of 49 years, Macel Pate Falwell of Lynchburg; three children, Jerry Falwell Jr. and Jonathan Falwell, both of Lynchburg, and Jeannie Falwell Savas of Richmond; and eight grandchildren.

Sunday, May 13, 2007

This perfect storm will finally destroy the neocon project
Americans are sick of the unrepentant arrogance of this elite. But the realisation has come at a very heavy cost
Geoffrey Wheatcroft
Friday May 11, 2007
The Guardian
Now and again people have found themselves in places where the course of history was dramatically changed: Paris in 1789, Petrograd in 1917, Berlin in 1989. Sometimes the feeling of momentous change is illusory. When Tony Blair won his first election 10 years ago, perfectly sane people proclaimed that "these are revolutionary times". As most of us realised long before his ignominious departure, that was just what they weren't.
And yet to visit the US at present, as I have done, is to experience an overwhelming sensation of drastic impending change. It's not merely that President Bush, to whom Blair so disastrously tethered himself, is "in office but not in power". Most Americans can't wait for him to go, Congress is beyond his control, and the Senate majority leader, Senator Harry Reid, has told him that the war in Iraq is lost - for which statement of the obvious Reid was accused of "defeatism" by the vice-president, Dick Cheney.
Besides that, the portents range from Paul Wolfowitz's travails at the World Bank to the Senate interrogation of Alberto Gonzales, the attorney general, and the trial of Conrad Black. This might sound like the "succession of small disasters, oh trifling in themselves", in Alan Bennett's Forty Years On ("a Foreign Secretary's sudden attack of dysentery at the funeral of George V, an American ambassador found strangled in his own gym-slip..."). And yet there really is an observable pattern.
Along with the collapse of Bush's authority, all these episodes are connected to the great disaster in Iraq. And all illustrate the hubristic, impenitent arrogance of the people who have been guiding America's destiny - as well as ours, alas - for the past six years. What one senses so acutely are the conditions building for a political perfect storm, which will engulf and destroy the whole neoconservative project.
In Washington I took part in a debate with Christopher Hitchens, my old sparring partner and drinking companion (mots justes, all of them), who supports Bush with a defiance worthy of a better cause. He surpassed himself by insisting that his friend Wolfowitz is a wronged man. A World Bank committee reportedly disagrees, and has found that Wolfowitz did violate the bank's rules in the matter of his lady friend's salary.
But in any case everyone else in Washington says the same thing: Wolfowitz cannot survive. His appointment was widely resented in the first place - the German, French, Dutch and Scandinavian governments have warned that they might withhold funds if he stays in office; and severe damage is being done to the organisation he claims to have at heart by his refusal to accept reality.
Then again, detachment from reality is perhaps to be expected from one of the architects of the war, a man who thought that the Iraqis would rise up to greet the American army as liberators. As the Nobel-winning economist Joseph Stiglitz said, Wolfowitz and his cabal "do not seem to understand that being president of the World Bank is a privilege, not an entitlement".
Gonzales was just a Texan hack lawyer who acted as Bush's consigliere, but he made his contribution to the great enterprise when he ruled that torture could be justified in the "war on terror". His Senate hearing provided a little comic relief, what with his acute amnesia followed by the deathless admission that "I now understand there was a conversation with myself and the president". One day Blair may understand that there was a conversation between himself and the president about the invasion of Iraq, and that his commitment to the war took place much earlier than he has ever admitted.
While Lord Black has never worked for the Bush administration, he was aligned with the neocon elite through the National Interest, the journal he used to publish, and he brought some of its members, such as Richard Perle, on to the board of his companies. Perle seems to have taken his fiduciary duties as lightly as he and his colleagues took the problems that would arise in Iraq as a result of the invasion. What has struck me about Black's trial was that we were hearing another version of the arrogance and denial we have heard from Wolfowitz and many others. It will give me no particular pleasure if my former employer is banged up, but his downfall is another grave blow for the neocons.
All of which has vital implications for British politics. Nicolas Sarkozy has been called "an American neoconservative with a French passport", which he is not. But Blair really is an American neoconservative with a British passport. He revealingly and accurately said that "there isn't a world of difference" between himself and the neocons politically, and his party must now, as it shakes off the burden of these past years, ask itself what, in that case, he was doing as Labour leader.
The Tories have questions of their own. Even the stupidest have grasped that the war and the American alliance are unpopular with the electorate, but they should now ask if sceptical, pragmatic Conservatism ever had anything in common with neoconservatism and its vast revolutionary scheme. One who did understand is Matthew Parris, the former Tory MP. Before the 2004 presidential election he said he wanted Bush re-elected: his presidency was halfway through an "experiment whose importance is almost literally earth-shattering" and should be played out to its inevitable failure.
But that failure must be demonstrated beyond contradiction. "The theory that liberal values and a capitalist system can be spread across the world by force of arms... should be tested to destruction ... The president and his neoconservative court should be offered all the rope they need to hang themselves."
His wish has come true; neocons are dangling all around us. In a flicker of self-knowledge, Wolfowitz told a recent World Bank meeting: "I understand that I've lost a lot of trust, and I want to build that trust back up." But it's too late, for him and all the other courtiers. They never really enjoyed the trust of most Europeans, let alone Africans and Asians, and they have now lost the trust of the American people.
All the readings on the barometer and the wind gauge say the same thing. The perfect storm is gathering. Unfortunately the collapse of the neocon project comes at a very heavy cost, not only to the people of Iraq but to all of us.
· Geoffrey Wheatcroft is the author of Yo, Blair!
wheaty@compuserve.com

Tuesday, May 08, 2007

Read it and weep!
Rethinking the war
By Andrew Sullivan
We are used to thinking of the war in Iraq in terms of what has happened to Iraq. And this is a completely defensible priority. Maybe two million of the country's crucial elite have fled. Perhaps as many have been forced to relocate within the country. The infrastructure has been shattered; Baghdad remains a place where 30 bodies appear on the streets overnight, even during the "surge"; suicide bombers continue their rampage through the country almost at will. The war has given al Qaeda a new base of operations and a new front. And the occupation continues to provide them with more recruits.
But we have not considered as much the damage that has been done to America. The first casualty has been the military itself. This war is now in its fifth year. In Iraq, there is no safe zone anywhere. The tours of duty are much longer than at any time in recent U.S. history. While equipment has been eroded very quickly in the punishing desert of the Middle East, the human toll has been perhaps more profound. The figure of more than three thousand dead understates the toll, because so many seriously wounded soldiers now survive, but with terrible and permanent injuries. The psychological toll on an over-stretched military is also profound:
"A considerable number of Soldiers and Marines are conducting combat operations everyday of the week, 10-12 hours per day seven days a week for months on end," wrote Col. Carl Castro and Maj. Dennis McGurk, both psychologists. "At no time in our military history have Soldiers or Marines been required to serve on the front line in any war for a period of 6-7 months."
But much more alarming, it seems to me, is the moral cost to this country of such a brutal and brutalizing occupation. For the first time in history, the president of the United States has allowed torture as an option for treatment of military captives. We saw some of the worst consequences of the Bush policy in Abu Ghraib. But Abu Ghraib represents a fraction of the incidents of abuse and torture throughout the conflict. The military itself reports the following staggering facts:
More than one-third of U.S. soldiers in Iraq surveyed by the Army said they believe torture should be allowed if it helps gather important information about insurgents, the Pentagon disclosed yesterday. Four in 10 said they approve of such illegal abuse if it would save the life of a fellow soldier.
In addition, about two-thirds of Marines and half the Army troops surveyed said they would not report a team member for mistreating a civilian or for destroying civilian property unnecessarily. "Less than half of Soldiers and Marines believed that non-combatants should be treated with dignity and respect," the Army report stated. About 10 percent of the 1,767 troops in the official survey - conducted in Iraq last fall - reported that they had mistreated civilians in Iraq, such as kicking them or needlessly damaging their possessions.
This is how we win hearts and minds? Notice that this is not the "ticking bomb" scenario touted by torture advocates such as Charles Krauthammer. This is torture merely as a means to gather important information about insurgents. It's routine torture. Over a third of U.S. soldiers, taking the lead from their pro-torture commander-in-chief, see nothing wrong with this, even in a war clearly under Geneva guidelines. Two-thirds won't report it. One in ten say they have abused Iraqi civilians just for the hell of it. Imagine what we don't know and will never know about the rest.
In thinking about the costs of this war, and thinking about renewing it, we have to reconsider what it has done to America. It has turned the U.S. military into a force at ease with abuse of captives and civilians, occupying a Muslim nation. Some of this is surely due to the sheer hell of fighting an enemy you cannot see, surrounded by people you do not understand or trust, and being killed randomly in urban or desert insurgency conditions where friend and foe are close to indistinguishable, and where your buddies are killed on a regular basis by faceless cowards. You can certainly understand how soldiers grow completely numb in the face of abuse in those circumstances. Every "hajji" can seem like the enemy after a while. It requires men and women of almost saintly capabilities to keep their moral bearings among terrorists who massacre scores of innocents as a religious duty, among people whose differences are impossible for young troops to figure out in split seconds. In such conditions, and as a consequence of grotesque under-manning, the breakdown in ethical discipline is no big surprise. But that doesn't make it any less of a big deal.
This is why the Iraq war, so far, must be seen as a huge al Qaeda propaganda victory. Their narrative is that Muslims are under siege by an evil, imperialist, infidel army that tortures and abuses Muslims at will. Before Iraq, this was an absurdity. After Iraq, less so. Iraq has helped sustain al Qaeda's narrative with imagery and violence that will always stain the image of America in the Middle East. Yes, the paranoia of the Arab street would have invented such atrocities even if they didn't exist. But they did exist and continue to exist. The images of Abu Ghraib did not shock Iraqis used to far worse horrors under Saddam. But they did help educate Arabs and Muslims across the world into believing the very worst about U.S. intentions. Because this war is a war of ideas and ideals, this matters a huge amount. Our one massive advantage - that we are a free and decent civilization - has been fatally blurred.
At home, the public has come to accept torture as a legitimate instrument of government, something that the Founding Fathers would have been aghast at. We have come to accept that the president is not bound by habeas corpus, if he decides he isn't. He can sign laws and say they don't apply to him. We know that an American citizen can be detained for years without charges and tortured and abused - and then critical evidence of his torture will be "lost". We have come to accept our phones being tapped without a warrant and without our even knowing about it. These huge surrenders of liberty have occurred without much public outcry. When the next major terrorist attack comes, the question will simply be how much liberty Americans have left. That is a victory al Qaeda could not have achieved by force of arms. It is something they have achieved with our witting and conscious help.
In reassessing the war, in other words, the moral cost to America must come into the equation. The Iraq war has removed for a generation the concept of the U.S. military being an unimpeachable source of national honor. It has infringed civil liberties. It has legalized and institutionalized torture as a government tool - and helped abuse and brutality metastasize throughout the field of conflict. To be sure, abuse of captives always happens in wartime. What's different now is that the commander-in-chief has authorized and legitimized it, and so the contagion has spread like wildfire. The tragedy is that none of this will help us actually win this war. By alienating so many Iraqis, the occupation has badly damaged American soft-power in the world. It has alienated many allies. It has exhausted the military itself. It has failed to quell an insurgency. History also teaches us that success against such an insurgency in such a country would require over a decade of a brutal war of attrition.
The question we have to ask is: if this is the way we achieve victory, what kind of country would America be at the end of it? To paraphrase Robert Bolt, it profit a man nothing if he gain the whole world and lose his soul. But for Iraq?

Rosewood