Tuesday, November 28, 2006

Humans and Apes Joined by Humpbacks in 'Spindle Cell Brigade'

by Gunika Khurana - November 28, 2006

U.S. researchers who examined the brain cells of humpback whales reported on Monday that the whales possess a type of brain cell previously found only in humans, great apes and other cetaceans such as dolphins.
These cells were once thought to set humans and great apes apart from other mammals. Now even humpback whales have been found to have these spindle cells: specialized neurons in the brain involved in processing emotions and social interaction.
Spindle cells help transmit signals in the nervous system. They are characterized by a large spindle-shaped soma that tapers gradually into a single apical axon in one direction, with only a single dendrite facing the opposite direction. Whereas other types of neurons have large numbers of dendrites, spindle cells have been found in only two very restricted regions of the brain, setting them apart from the rest.
Scientists have discovered that these cells play an important role in a wide range of cognitive abilities, including the development of intelligent behavior and adaptive responses to changing conditions. The cells are largest and most abundant in humans, where they connect to large parts of the brain, which may underlie the greater ability of hominids to focus on complicated problems.
“This might mean such whales are more intelligent than they have been given credit for, and suggests the basis for complex brains either evolved more than once or has gone unused by most species of animals,” the researchers said.
The researchers also said these findings will help them understand and explain the sophisticated behaviors the whales exhibit, such as complex communication, the formation of alliances, cooperation, cultural transmission and tool use.
It is not only humpback whales: killer whales, fin whales and sperm whales also have spindle cells, in the same areas of the brain as humans, namely the anterior cingulate cortex and the frontoinsular cortex.
The research was conducted by Patrick Hof and Estel Van der Gucht of the Department of Neuroscience at the Mount Sinai School of Medicine. They also found that the humpbacks had structures resembling "islands" in the cerebral cortex, seen in some other mammals, and suggested that these islands may have evolved to promote quick and efficient communication between neurons.
“It’s absolutely clear to me that these are extremely intelligent animals,” says Patrick Hof.
“We must be careful about anthropomorphic interpretation of intelligence in whales,” says Hof. “But their potential for high-level brain function, clearly demonstrated already at the behavioral level, is confirmed by the existence of neuronal types once thought unique to humans and our closest relatives,” he says.
“They communicate through huge song repertoires, recognize their own songs and make up new ones. They also form coalitions to plan hunting strategies, teach these to younger individuals, and have evolved social networks similar to those of apes and humans,” Hof says.
Spindle cells appeared in humans and apes about 15 million years ago, but the researchers indicated that in cetaceans they may have evolved earlier, possibly as early as 30 million years ago.
The researchers were also surprised to find spindle cells in the frontopolar cortex at the front of the whale brain, an area where they do not occur in humans. Hof says, “I do not yet know the significance of spindles found in areas other than those that contain the cells in humans and great apes.”
The functioning of the spindle cells is still under investigation, but Hof believes the cells help carry messages to and from parts of the cortex.
The humpback whale is a migratory species found in almost all oceans, spending its summers in cooler, high-latitude waters but mating and calving in tropical and subtropical waters. It migrates some 25,000 km annually, making it one of the most traveled mammalian species. However, the whales of the Arabian Sea do not migrate, remaining in tropical waters year round.
The species is not found in the eastern Mediterranean, the Baltic Sea or the Arctic Ocean.

Sunday, November 26, 2006

Will Justice Prevail?
From Andrew Sullivan
American democracy has now begun tackling the kind of issues that were once the province of South American countries. The authorization of war crimes, torture, and illegal wiretapping by this administration needs to be thoroughly investigated so that more than a few scapegoat grunts are held responsible. The definitive proof is in the hands of the administration - and it has a constitutional duty to hand it over to Congress.
Since the Bush administration has repeatedly said that it never authorized torture or war crimes, it should presumably be eager to hand over the critical, relevant documents to the Senate Judiciary Committee in order to exonerate itself as quickly as possible. According to the president, he has never signed any memos authorizing torture - so what does he have to lose?
The usual arguments will be made about "national security" requiring complete secrecy. But these are not operational secrets that the enemy can use. These are documents that may or may not reveal techniques that have already been exhaustively documented in public, and that any enemy with a modem knows about in full. The only secret is who signed off on them, and when. The fundamental question is not so much the content of the memos as who authored them and what exactly they signed off on.
Money quote: Justice Department officials have long said they will resist efforts to require disclosure of classified documents that provide legal advice to other agencies. But in the interview this week, Mr. Leahy signaled that he expected the department to provide a fuller documentary history on issues like detention.
The senator's letter to Mr. Gonzales requested "all directives, memoranda, and/or orders including any and all attachments to such documents, regarding C.I.A. interrogation methods or policies for the treatment of detainees." It also sought an index of all documents related to Justice Department inquiries into detainee abuse by "U.S. military or civilian personnel in Guantánamo Bay, Abu Ghraib prison or elsewhere."
We need proof of Bush's, Cheney's, Rumsfeld's and Gonzales' direct involvement in turning the United States into an international pariah on questions of prisoner abuse and torture. Then we need justice.

Saturday, November 25, 2006

Betty Comden, Half of Lyrics Team Behind Musicals of Grace and Wit, Dies at 89
By ROBERT BERKVIST
Betty Comden, who with her longtime collaborator Adolph Green wrote the lyrics and often the librettos for some of the most celebrated musicals of stage and screen, died yesterday in Manhattan. She was 89 and lived in Manhattan.
The cause was heart failure, said Ronald Konecky, her lawyer and the executor of her estate.
During a professional partnership that lasted for more than 60 years, and which finally ended with Mr. Green’s death in 2002, the Comden-Green blend of sophisticated wit and musical know-how lit up stage shows like “On the Town,” “Wonderful Town,” “Peter Pan” and “Bells Are Ringing.” Their Hollywood credits included the screenplays for two landmark film musicals, “Singin’ in the Rain” and “The Band Wagon.”
Through the years they worked with composers like
Leonard Bernstein, Cy Coleman, Jule Styne and André Previn, creating songs like “New York, New York,” “The Party’s Over,” “It’s Love” and “Some Other Time.” They were adept at making their lyrics fit the mood, whether it was rueful (“Lonely Town”), raucous (“100 Easy Ways to Lose a Man”) or romantic (“Just in Time”).
The title of one of their own songs, from “Bells Are Ringing,” summed up their joint career: it was truly a “Perfect Relationship” in which they met daily, most often in Ms. Comden’s living room, to work on a show, trade ideas or just talk about the weather.
“We stare at each other,” Ms. Comden said in a 1977 interview with The New York Times. “We meet, whether or not we have a project, just to keep up a continuity of working. There are long periods when nothing happens, and it’s just boring and disheartening. But we have a theory that nothing’s wasted, even those long days of staring at one another. You sort of have to believe that, don’t you? That you had to go through all that to get to the day when something did happen.”
Ms. Comden, slim, dark-haired and composed, was the ideal counterbalance to the often rumpled, wild-haired and restless Mr. Green. Sometimes, during discussions, Ms. Comden would finish one of his sentences, or vice versa. Songs and shows grew that way too, although the story was always the starting point.
“The book comes first,” Ms. Comden said in the 1977 interview, recalling how the song “Just in Time” took shape. “At some point when we were working on ‘Bells Are Ringing,’ ” she said, “Jule Styne wrote that tune. Dee da dum, da dee da dee da dum. We all agreed it had to be in the show somewhere, but for months we couldn’t find a place for it, or even a title, but Jule was playing it all over town at parties, calling it ‘Dee Da Dum.’ And that became the official title until the point where, rewriting part of the book one day, the situation was there, and we finally wrote the words to fit ‘Dee Da Dum.’ ”
The starting point for their partnership was Greenwich Village where, in the late 1930s, they joined up with another aspiring entertainer named
Judy Holliday and two other friends to form a cabaret act. They called themselves the Revuers and persuaded Max Gordon, the owner of a club called the Village Vanguard, that their act would be good for business. It was.
The Revuers opened at the Vanguard in 1939, performing material that included freewheeling sketches like “The Banshi Sisters” and “The Baroness Bazuka,” a zany operetta, and frequently accompanied at the piano by one of Mr. Green’s friends, a talented young musician named Leonard Bernstein, who dropped in often enough to be taken for part of their act.
The act’s success earned them a movie offer, and the Revuers traveled west in hopes of finding instant fame in “Greenwich Village,” a 1944 movie starring
Carmen Miranda and Don Ameche, in which the newcomers turned out to be virtually invisible. Ms. Comden and Mr. Green came back to New York, where they resumed working at the Vanguard and other clubs.
It wasn’t long before they heard from Mr. Bernstein, their erstwhile accompanist, who said he’d been working on a ballet with
Jerome Robbins and that the two of them had decided that the ballet, called “Fancy Free,” had the makings of a Broadway show. They were looking for someone to write the book and lyrics.
Ms. Comden and Mr. Green jumped at the chance and jumped into the limelight with their work on the show. The result, “On the Town,” the story of three sailors on shore leave in New York, opened late in 1944 and was a smash. Both Ms. Comden and Mr. Green appeared in the show, he as one of the sailors and she as Claire de Loone, an amorous anthropologist. New Yorkers inside and outside the theater were soon humming the town’s geography à la Comden and Green:
The Bronx is up and the Battery’s down,
The people ride in a hole in the ground,
New York, New York,
It’s a helluva town.
Ms. Comden and Mr. Green were definitely on their way up, not to the Bronx but to big-time success.
Betty Comden was born Elizabeth Cohen on May 3, 1917, in Brooklyn. Her father, Leo, was a lawyer, her mother, Rebecca, a teacher. She attended Erasmus Hall High School and studied drama at
New York University, graduating in 1938.
By that time she had changed her surname to Comden, had had nose surgery to make her look more stageworthy, had acted with the Washington Square Players and had met and become friends with Mr. Green, another aspiring actor. Their circle soon included three other would-be entertainers, Ms. Holliday, Alvin Hammer and John Frank. Then came their decision to form the Revuers, and all else followed.
Ms. Comden married Steven Kyle, a designer and businessman, in 1942. He died in 1979, and she never remarried. They had two children, a daughter, Susanna, and a son, Alan. Their son, a drug addict, contracted AIDS and died of complications of his addiction in 1990. She is survived by her daughter, Susanna Kyle, of Manhattan.
Ms. Comden reminisced about her Brooklyn childhood, her student years and her long marriage in a 1995 memoir, “Off Stage,” in which she also told of the difficult circumstances of her son’s struggle with drugs. The book included tributes to some of her friends and colleagues, among them Mr. Bernstein and
Lauren Bacall, but hardly dealt at all with her professional life.
After the success of “On the Town,” Ms. Comden and Mr. Green tried their hands at writing the book for another Broadway musical. “Billion Dollar Baby,” which opened in 1945, had a score by Morton Gould and choreography by Mr. Robbins and was directed by George Abbott, but it was not as well received.
Hollywood called again, and this time, for the most part, they had better luck. “Good News” (1947), with
June Allyson and Peter Lawford as singing, dancing campus sweethearts, was their first screenplay. They wrote “The Barkleys of Broadway” (1949), which marked the film reunion of Ginger Rogers and Fred Astaire, and adapted “On the Town” (1949) for the screen, with Gene Kelly, Frank Sinatra and Jules Munshin as the freewheeling sailors navigating the streets of New York.
They contributed heavily to the success of “The Band Wagon” (1953), the
Vincente Minnelli musical for which they wrote the witty screenplay. The film had a score by Howard Dietz and Arthur Schwartz and starred Mr. Astaire, Cyd Charisse, Oscar Levant and Nanette Fabray. The Comden-Green screenplay received an Academy Award nomination.
They were still wedded to Broadway, however, and their stage work during the next few years included the Jule Styne musical “Two on the Aisle” (1951), a revue with Bert Lahr and Dolores Gray; “Wonderful Town” (1953), an adaptation of the 1939 comedy hit “My Sister Eileen,” with music by Mr. Bernstein and starring
Rosalind Russell and Edie Adams as two sisters from Ohio trying to make it in the Big Town; and, most notably, “Bells Are Ringing.”
That 1956 musical reunited them with Ms. Holliday, who headed the cast as an operator at an answering service who falls in love with one of the service’s male clients (Sydney Chaplin) after listening to his voice over her telephone line. The score included the comic lament “I’m Going Back (To the Bonjour Tristesse Brassiere Company)” as well as some songs that became part of the standard pop repertory, like “Just in Time,” “Long Before I Knew You” and “The Party’s Over,” which ended with the melancholy verse:
Now you must wake up
All dreams must end
Take off your makeup
The party’s over
It’s all over, my friend.
Ms. Comden and Mr. Green also wrote the screenplay for the 1960 film version, which starred Ms. Holliday.
Even their less successful shows yielded musical nuggets, one example being the 1960 “Do Re Mi,” which had a book by
Garson Kanin and music by Mr. Styne for which they wrote the lyrics. It featured Phil Silvers and Nancy Walker. That largely unmemorable score included one gem, “Make Someone Happy.”
Ms. Comden and Mr. Green went on to write “Subways Are for Sleeping” (1961), with Carol Lawrence and Mr. Chaplin, and “Fade Out-Fade In” (1964), with
Carol Burnett and Jack Cassidy, both shows with music by Mr. Styne, and the lyrics for “Hallelujah, Baby!” (1967), which had music by Mr. Styne and a book by Arthur Laurents and which starred Leslie Uggams and Robert Hooks. They wrote the book for “Applause” (1970), adapted from the film “All About Eve,” with music by Charles Strouse and lyrics by Lee Adams. The show starred Ms. Bacall as the take-no-prisoners movie queen Margo Channing played by Bette Davis in the film.
In the years that followed Ms. Comden and Mr. Green teamed up with the composer Cy Coleman for “On the Twentieth Century” (1978), based on the Ben Hecht-Charles MacArthur play about a flamboyant movie producer (John Cullum) and his leading lady (
Madeline Kahn), traveling from Hollywood to Broadway on the Twentieth-Century Limited in the 1930s. The show was a hit and brought them Tony Awards for their book and score.
In 1982 they wrote the book and lyrics — Larry Grossman wrote the music — for what was meant to be a kind of musical sequel to Ibsen’s “A Doll’s House,” but the result, “A Doll’s Life,” was a four-performance disaster.
Their last major Broadway show was “The
Will Rogers Follies,” a 1991 Ziegfeld-style extravaganza with music by Mr. Coleman, book by Peter Stone and direction and choreography by Tommy Tune. Keith Carradine starred as the folksy humorist-philosopher. Despite mixed reviews the show won six Tony Awards, including one for the music and lyrics, and enjoyed a run of two and a half years.
By the time “Will Rogers” came along, Ms. Comden and Mr. Green had worked together for more than a half-century. On Broadway, starting with “On the Town” in 1944, they had won a shelf full of Tony Awards. They were among the recipients of the 1991
Kennedy Center honors for their contributions to American musical theater.
Their early Hollywood credits included “Take Me Out to the Ball Game” (1949), with Gene Kelly, Mr. Sinatra and
Esther Williams, and “It’s Always Fair Weather” (1955), with Mr. Kelly, Dan Dailey, Michael Kidd and Ms. Charisse, for which their screenplay received an Academy Award nomination. A bittersweet sequel of sorts to “On the Town,” the plot of “It’s Always Fair Weather” revolved around the reunion a decade after World War II of three former G.I. companions who find that time has altered their friendship for the worse.
“I don’t think there’s ever been a musical quite like it,” Ms. Comden said in a 1999 interview with The Times. “The corrosive effect that time has on friendships — that’s a very unusual subject for a musical.” She and Mr. Green said it was one of their favorites.
After their stage debut in “On the Town” they didn’t perform on Broadway again until 1958, when they appeared in “A Party with Betty Comden and Adolph Green,” a revue that included some of their early favorites like “The Baroness Bazuka,” which they described as a tribute to the Shubert brothers (“J. J., O. O., and Uh-Uh.”). The revue was well received, and they brought an updated version back to Broadway in 1977.
Ms. Comden also performed in films from time to time. She acted in
Sidney Lumet’s “Garbo Talks” (1984), in which Mr. Green also made a fleeting appearance, and James Ivory’s “Slaves of New York” (1989). She appeared onstage in 1983 in a rare dramatic role in Wendy Wasserstein’s “Isn’t It Romantic,” playing the mother of a footloose girl waiting for Mr. Right to come along.
In 1999 Ms. Comden and Mr. Green were saluted by their peers in a two-night program at
Carnegie Hall. Elaine Stritch and Brian Stokes Mitchell were among the performers who sang numbers from the Comden-Green repertoire.
Recent Broadway revivals of their work included a 2001 production of “Bells Are Ringing” starring Faith Prince, which closed after a brief run, and the 2003 revival of “Wonderful Town” with Donna Murphy, which settled in for a long stay at the
Al Hirschfeld Theater.
After Mr. Green’s death in October 2002, Broadway turned out in force two months later for a memorial program at the Shubert Theater.
Kevin Kline, Joel Grey, Ms. Bacall and others paid affectionate tribute to Mr. Green in song and story. At one point during her own reminiscence about him, Ms. Comden paused and said to the audience, “It’s lonely up here.” After six decades the perfect relationship was over.

Wednesday, November 22, 2006

Robert Altman, Director With Daring, Dies at 81
By RICK LYMAN
Robert Altman, one of the most adventurous and influential American directors of the late 20th century, a filmmaker whose iconoclastic career spanned more than five decades but whose stamp was felt most forcefully in one, the 1970s, died Monday in Los Angeles. He was 81.
His death, at Cedars-Sinai Medical Center, was caused by complications of cancer, his company in New York, Sandcastle 5 Productions, announced. A spokesman said Mr. Altman had learned that he had cancer 18 months ago but continued to work, shooting his final film,
“A Prairie Home Companion,” which was released in June, and most recently completing pre-production on a new film that he intended to begin shooting in February.
Mr. Altman had a heart transplant in the mid-1990s, a fact he publicly revealed for the first time last March while accepting an honorary Oscar at the Academy Awards ceremony.
A risk taker with a tendency toward mischief, Mr. Altman put together something of a late-career comeback capped in 2001 by
“Gosford Park,” a multiple Oscar nominee. But he may be best remembered for a run of masterly films — six in five years — that propelled him to the forefront of American directors and culminated in 1975 with what many regard as his greatest film, “Nashville,” a complex, character-filled drama told against the backdrop of a presidential primary.
They were free-wheeling, genre-bending films that captured the jaded disillusionment of the ’70s. The best known was
“MASH,” the 1970 comedy that was set in a field hospital during the Korean War but that was clearly aimed at antiwar sentiments engendered by Vietnam. Its success, both critically and at the box office, opened the way for Mr. Altman to pursue his ambitions.
In 1971 he took on the western, making
“McCabe & Mrs. Miller” with Warren Beatty and Julie Christie. In 1972, he dramatized a woman’s psychological disintegration in “Images,” starring Susannah York. In 1973, he tackled the private-eye genre with a somewhat loopy adaptation of Raymond Chandler’s “The Long Goodbye,” with the laid-back Elliott Gould playing Philip Marlowe as a ’70s retro-hipster. And in 1974 he released two films, exploring gambling addiction in “California Split” and riffing on the Dust Bowl gangster saga with “Thieves Like Us.”
Unlike most directors whose flames burned brightest in the early 1970s — and frequently flickered out — Mr. Altman did not come to Hollywood from critical journals and newfangled film schools. He had had a long career in industrial films and television. In an era that celebrated fresh voices steeped in film history — young directors like Francis Ford Coppola, Peter Bogdanovich and Martin Scorsese — Mr. Altman was like their bohemian uncle, matching the young rebels in their skeptical disdain for the staid conventions of mainstream filmmaking and the establishment that supported it.
Most of his actors adored him and praised his improvisational style. In his prime, he was celebrated for his ground-breaking use of multilayer soundtracks. An Altman film might offer a babble of voices competing for attention in crowded, smoky scenes. It was a kind of improvisation that offered a fresh verisimilitude to tired, stagey Hollywood genres.
But Mr. Altman was also famous in Hollywood for his battles with everyone from studio executives to his collaborators, leaving more burned bridges than the Luftwaffe. He also suffered through periods of bad reviews and empty seats but always seemed to regain his stride, as he did in the early ’90s, when he made
“The Player” and “Short Cuts.” Even when he fell out of popular favor, however, many younger filmmakers continued to admire him as an uncompromising artist who held to his vision in the face of business pressures and who was unjustly overlooked by a film establishment grown fat on special effects and feel-good movies.
He was often referred to as a cult director, and it rankled him. “What is a cult?” Mr. Altman said. “It just means not enough people to make a minority.”
The Breakthrough
The storyline had to do with a group of boozy, oversexed Army doctors in a front-line hospital, specifically a Mobile Army Surgical Hospital. Fifteen directors had already turned the job down. But at 45, Mr. Altman signed on, and the movie, “MASH,” became his breakthrough.
Audiences particularly connected with the authority-bashing attitude of the film’s irreverent doctors, Hawkeye (
Donald Sutherland) and Trapper John (Mr. Gould).
“The heroes are always on the side of decency and sanity; that’s why they’re contemptuous of the bureaucracy,” the critic Pauline Kael wrote in The New Yorker. “They are heroes because they are competent and sane and gallant, and in this insane situation their gallantry takes the form of scabrous comedy.”
The villains are not the Communist enemy but marble-hearted military bureaucrats personified by the pious Frank Burns (
Robert Duvall) and the hypocritical Hot Lips Houlihan (Sally Kellerman).
The film was nominated for five Academy Awards, including one for best picture and one for Mr. Altman’s direction. It also won the Golden Palm, the top award at the 1970 Cannes Film Festival, and the best picture of the year award of the National Society of Film Critics.
But “MASH” was denied the best-picture Oscar; that award went to
“Patton.” In later years Mr. Altman received four more Academy Award nominations for best director and two for producing best-picture nominees, “Nashville” and “Gosford Park.” The only Oscar he received, however, was the honorary one in March.
Mr. Altman was angry that the lone Oscar given to “MASH” went to Ring Lardner Jr., who got sole screen credit for the script. Mr. Altman openly disparaged Mr. Lardner’s work, touching off one of his many feuds. Later, when Mr. Altman seemed unable to duplicate the mix of critical and box-office success that “MASH” had achieved, he grew almost disdainful of the film.
“ ‘MASH’ was a pretty good movie,” Mr. Altman said in an interview. “It wasn’t what 20th Century-Fox thought it was going to be. They almost, when they saw it, cut all the blood out. I fought with my life for that. The picture speaks for itself. It became popular because of the timing. Consequently, it’s considered important, but it’s no better or more important than any of the other films I’ve made.”
Mr. Altman’s interest in film genres was candidly subversive. He wanted to explode them to expose what he saw as their phoniness. He decided to make “McCabe & Mrs. Miller” for just that reason. “I got interested in the project because I don’t like westerns,” Mr. Altman said. “So I pictured a story with every western cliché in it.”
His intention, he said, was to drain the glamour from the West and show it as it really was — filthy, vermin-infested, whisky-soaked and ruled by thugs with guns. His hero, McCabe (Mr. Beatty), was a dimwitted dreamer who let his cockiness and his love for a drug-addicted prostitute (Ms. Christie) undo him.
“These events took place,” Mr. Altman said of westerns in general, “but not in the way you’ve been told. I wanted to look at it through a different window, you might say, but I still wanted to keep the poetry in the ballad.”

“Nashville” interwove the stories of 24 characters — country-western stars, housewives, boozers, political operators, oddball drifters — who move in and out of one another’s lives in the closing days of a fictional presidential primary. Mr. Altman returned to this multi-character approach several times (in
“A Wedding,” “Health,” “Short Cuts,” “Prêt-à-Porter” and “Kansas City”), but never again to such devastating effect.
“Nashville is a radical, evolutionary leap,” Ms. Kael wrote in The New Yorker. “Altman has already accustomed us to actors who don’t look as if they’re acting; he’s attuned us to the comic subtleties of a multiple-track sound system that makes the sound more live than it ever was before; and he’s evolved an organic style of moviemaking that tells a story without the clanking of plot. Now he dissolves the frame, so that we feel the continuity between what’s on the screen and life off-camera.”
Mr. Altman’s career stalled after “Nashville,” although he continued to attract top actors.
Paul Newman starred in “Buffalo Bill and the Indians” in 1976, Sissy Spacek in “3 Women” in 1977 and Mr. Newman again in “Quintet” in 1979. But critical opinion turned against Mr. Altman in the late ’70s, and his films fared worse and worse at the box office.
The crushing blow came in 1980, when Mr. Altman directed
Robin Williams in a lavish musical based on the “Popeye” cartoon. Though it eventually achieved modest commercial success, the movie was considered a dud because it made less money than had been expected and drew almost universal scorn from the critics. Mr. Altman retained his critical champions, including Ms. Kael and Vincent Canby of The New York Times, who in 1982 called Mr. Altman one of “our greatest living directors.” But the tide had turned against him.
In “Before My Eyes,” a 1980 collection of film essays, Stanley Kauffmann spoke for other critics when he derided what he saw as the director’s middle-brow pretensions. “He’s the film equivalent of the advertising-agency art director who haunts the galleries to keep his eye fresh,” he wrote.
If Mr. Altman never fully regained his critical pre-eminence, he came close, recapturing much of his luster in the final years of his life. And he always kept in the game.
He remade his career in the early ’80s with a string of films based on stage dramas: Ed Graczyk’s “Come Back to the Five and Dime, Jimmy Dean, Jimmy Dean” in 1982, David Rabe’s
“Streamers” in 1983 and Sam Shepard’s “Fool for Love” in 1985. He also did some fresh work for television, a medium he had reviled when he left it two decades earlier.
In 1988, he directed a strong television adaptation of “The Caine Mutiny Court-Martial,” a stage play by Herman Wouk based on his novel
“The Caine Mutiny.” The Altman version restored the class conflict and anti-Semitism that had been excised from the 1954 Hollywood treatment starring Humphrey Bogart.
The ’90s brought an even more satisfying resurgence for Mr. Altman. It began with a pair of critical film successes: “The Player,” an acerbic satire based on the
Michael Tolkin novel about a ruthless Hollywood executive, and “Short Cuts,” an episodic, character-filled drama based on the short stories of Raymond Carver. The films earned him his third and fourth Oscar nominations for best director.
Then, in 2001, came “Gosford Park,” an elaborate murder mystery with an ensemble cast that capped his comeback.
Mr. Altman’s last film, “A Prairie Home Companion,” based on Garrison Keillor’s long-running radio show, was released in June and starred
Meryl Streep and Kevin Kline in another ensemble cast. Writing in The Times, A.O. Scott called the film a minor Altman work “but a treasure all the same.” “I seem to have become like one of those old standards, in musical terms,” Mr. Altman said in a 1993 interview. “Always around. Lauren Bacall said to me, ‘You just don’t quit, do you?’ Guess not.”
Son of a Salesman
Robert Bernard Altman was born on Feb. 20, 1925, in Kansas City, Mo., to Helen and B.C. Altman, a prosperous insurance salesman for the Kansas City Life Insurance Company. Mr. Altman’s grandfather, the developer Frank G. Altman, had built the Altman Building, a five-story retail mecca in downtown Kansas City. (It was razed in 1974.)
Young Robert attended Catholic schools and the Wentworth Military Academy in Lexington, Mo., before enlisting in the Air Force in 1945. He eventually became a co-pilot on a B-24. It was during this period that he invented what he called “Identi-code,” a method for tattooing numbers on household pets to help identify them if they were lost or stolen; he even talked
President Harry S. Truman into having one of his dogs tattooed.
After the Air Force, Mr. Altman went to work with the Calvin Company, a film company in Kansas City, making training films, advertisements and documentaries for industrial clients. In 1947 he married LaVonne Elmer, but they divorced two years later after they had a daughter, Christine. He married Lotus Corelli in 1950, and they divorced in 1955; they had two sons, Michael (who wrote lyrics to “Suicide Is Painless,” the “MASH” theme song, when he was just 14) and Stephen, a film production designer who frequently worked with his father.
Mr. Altman began to set his sights on Hollywood while still working in Kansas City. His first screen credit came for helping write
“Bodyguard” (1948), a B movie about a hard-boiled detective.
It was not until 1955 that he actually headed for Hollywood; he had gotten a call offering him a job directing an episode of the television series “
Alfred Hitchcock Presents.”
Over the next decade, he directed dozens of episodes of
“Maverick,” “Lawman,” “Peter Gunn,” “Bonanza,” “Hawaiian Eye,” “Route 66,” “Combat!” and “Kraft Suspense Theater.”
It was while on the set of the TV series “Whirlybirds” that Mr. Altman met his third wife, Kathryn Reed. They married in 1957 and had two sons, Robert and Matthew. Mr. Altman’s wife and children survive him, as do a stepdaughter, Connie Corriere, 12 grandchildren and five great-grandchildren. Although Mr. Altman interrupted his early Kansas City work to crank out a teen exploitation movie called
“The Delinquents” (1957), it was not until 1968 that he moved up to directing major actors in a Hollywood feature. The film, “Countdown,” starring James Caan and Robert Duvall, was a critically praised drama about the first flight to the moon. He followed that up in 1969 with “That Cold Day in the Park,” a psychological thriller starring Sandy Dennis as a woman driven mad by her sex urges.
In 1970, he made what is perhaps his strangest film,
“Brewster McCloud,” about a nerdish youth who wanted to build his own flying machine and whiz around the Houston Astrodome.
Then came “MASH.”
In later years he gathered around him a company of favored performers, among them Mr. Gould,
Lily Tomlin, Shelley Duvall, Bert Remsen and Keith Carradine. Many of his sets were celebrated for their party atmosphere, which often came through on the screen. He thought that creating a casual mood helped him expand the boundaries of filmmaking.
To achieve his vision, Mr. Altman was willing to battle studio executives over the financing of his films and ultimate creative control.
“Robert Altman is an artist and a gambler,” his longtime assistant director,
Alan Rudolph, wrote in a 1994 tribute in Film Comment. “Pursuing artistic vision on film in America can sometimes put everything you own at risk.”
When a studio refused to distribute Mr. Rudolph’s first film,
“Welcome to L.A.,” Mr. Altman responded by forming his own independent distribution company, Lion’s Gate, for the sole purpose of releasing the film. It was a harbinger of the independent film companies of the ’80s and ’90s.
“There’s a big resistance to me,” Mr. Altman told The Washington Post in 1990. “They say, ‘Oh, he’s going to double-cross us somewhere.’ When I explain what I want to do, they can’t see it, because I’m trying to deliver something that they haven’t seen before. And they don’t realize that that’s the very reason they should buy it.”
Mr. Altman acknowledged that his career had suffered as a consequence of his own behavior — his hard drinking, procrastination and irascibility, his problem with authority. He also had a long history of bitter relations with screenwriters. Many complained that he injected himself into the rewriting process and took credit for work he did not do.
But many actors said they loved working with Mr. Altman because of the leeway he gave them in interpreting the script and in improvising in their scenes. “For somebody like me who likes to hang out with my pals and goof off and take the path of least resistance,” Sally Kellerman said, “he’s wonderful that way.”
Mr. Altman said giving actors freedom could draw things out of them that they did not know were there. “I look for actors where there’s something going on there, behind that mask,” Mr. Altman said. “
Tim Robbins fascinated me. This John Cusack guy: I always see something going on in there and I don’t know what it is.”
He never mellowed in his view of the movie business.
“The people who get into this business are fast-buck operators, carnival people, always have been,” Mr. Altman said in a 1993 interview. “They don’t try to make good movies now; they’re trying to make successful movies. The marketing people run it now. You don’t really see too many smart people running the studios, running the video companies. They’re all making big money, but they’re not looking for, they don’t have a vested interest in, the shelf life of a movie. There’s no overview. No one says, ‘Forty years from now, who’s going to want to see this.’ No visionaries.”

Tuesday, November 21, 2006


Ideology Has Consequences
Bush rejects the politics of prudence.

by Jeffrey Hart
Many Republicans must feel like that legendary man at the bar on the Titanic. Watching the iceberg slide by outside a porthole, he remarked, “I asked for ice. But this is too much.” Republicans voted for a Republican and got George W. Bush, but his Republican Party is unrecognizable as the party we have known.
Recall the Eisenhower Republican Party. Eisenhower, a thoroughgoing realist, was one of the most successful presidents of the 20th century. So was the prudential Reagan, wary of using military force. Nixon would have been a good secretary of state, but emotionally wounded and suspicious, he was not suited to the presidency. Yet he, too, with Henry Kissinger, was a realist. George W. Bush represents a huge swing away from such traditional conservative Republicanism.
But the conservative movement in America has followed him, evacuating prudence and realism for ideology and folly. Left behind has been the experienced realism of James Burnham. Also vacated, the Burkean realism of Willmoore Kendall, who aspired, as he told Leo Strauss, to be the “American Burke.” That Burkeanism entailed a sense of the complexity of society and the resistance of cultures to change. Gone, too, has been the individualism of Frank Meyer and the commonsense Western libertarianism of Barry Goldwater.
The post-2000 conservative movement has abandoned all that to back Bush and has followed him over the cliff into our calamity in Iraq. On top of all that, the Bush presidency has been fueled by the moral authoritarianism of the current third evangelical awakening.
Yes, aware Republicans are like that man on the Titanic who asked for ice, and this iceberg is too much.
The problem is that Bush campaigned in 2000 as a “compassionate conservative.” Today, the media calls him a conservative, yet there is nothing at all conservative about his policies, whether foreign or domestic. William F. Buckley once said that conservatism is the “politics of reality.” But Bush has not pursued reality-based policies. Will we have to find another word? It certainly looks that way.
Buckley has said that Bush has been “engulfed” by Iraq and that if he had been a European prime minister he would have resigned by now. Other commentators known as conservatives have agreed: Andrew Sullivan, George Will, Francis Fukuyama. It is worth considering a statement by Richard Cheney:
Once you get to Baghdad, it’s not clear what you do with it. It’s not clear what kind of government you put in place of the one that’s currently there now. Is it going to be a Shia regime, a Sunni regime, a Kurdish regime? Or one that tilts toward the Baathists, or one that tilts toward Islamic fundamentalists? How much credibility is that going to have if it’s set up by the American military there? How long does the United States military have to stay there to protect the people that sign on for that government, and what happens once we leave?
Smart man, that Cheney. The only problem is that he said that back in 1991 during the first Gulf War when he was secretary of defense in the administration of George H.W. Bush. At that time, Brent Scowcroft was national security adviser and James Baker was secretary of state. Recently, Scowcroft has said that though he has been friends with Cheney for more than 30 years, he no longer really knows him. What has happened to Cheney is anybody’s guess.
It can’t be 9/11. We know from many sources that Bush had decided to invade Iraq long before 9/11. In The Right Man, David Frum recounts being interviewed for a position by Michael Gerson, head Bush speechwriter and also policy adviser, not long after Bush became president. Gerson told Frum that Bush would topple Saddam. At that time nothing was being said about weapons of mass destruction.
National Review editor Rich Lowry sheds some light on the president’s motivation for invading Iraq in a column titled “The Revenge of Orthodoxy.” Following historian Walter Russell Mead, he notices that we are in the “Third Awakening” of Protestant evangelicalism and that the Bush presidency should be stamped “Brought to you by orthodox Christian believers.” He makes clear the implications of this for American foreign policy:
The reinvigorated Wilsonian foreign policy championed by Bush—and motivated less by Woodrow Wilson’s secular values (international law, etc.) and more by religious beliefs (the God-given rights of all people)—is a reflection of Bush’s Christian base.
Lowry, following Mead, is surely correct here. But just what is conservative about it? Historically, American evangelicalism has veered wildly from the crusading lyrics of Julia Ward Howe’s “Battle Hymn of the Republic” to the pacifism of William Jennings Bryan.
And has anyone ever claimed that Wilsonianism is conservative? To give Woodrow a bit of a break, his “Wilsonianism” was much more temperate than is sometimes thought: “It will now be our fortunate duty,” he said, “to assist by example, by sober, friendly counsel, and by material aid in the establishment of democracy throughout the world.” That statement by Wilson reflects the original meaning of the torch the Statue of Liberty holds aloft: the United States is a beacon of liberty. Emma Lazarus’s famous lines about welcoming immigrants amounted to a misinterpretation. True enough, Lloyd George, when he returned to England from Versailles, remarked that he had not done badly considering that he had been sitting between Napoleon (Clemenceau) and Jesus Christ (Wilson). But just what did Wilson mean by “the world” when he spoke of “establishing democracy”? I hazard the thought that he focused on the West and was not thinking of Borneo or the Congo, nor, surely, of launching invasions and occupations of Mesopotamia. With Bush in mind, Woodrow’s “Wilsonianism,” though naïve and though certainly not conservative, can be declared Not Guilty.
To define what “conservative” in fact means, the place to turn is Edmund Burke, the founder of modern political philosophy, the first political thinker to base his thought on empirical fact and on history. Both Hobbes and Locke were empiricists, but in their political thought they reasoned from assumptions they posited about human nature.
Hobbes took a relatively dark view of human nature, seeing human life in a mythical pre-social state of nature as “solitary, poor, nasty, brutish, and short.” Such creatures needed firm control. Locke, in contrast, was more optimistic, seeing man in a state of nature as governed by reason and thus requiring a much less intrusive government. The empiricism reflected by Locke, however, represented a new way of seeing the world and made political philosophy, beginning with Burke, possible. The opening pages of Locke’s An Essay Concerning Human Understanding (1690) possess the promise of a new and innocent dawn as Locke brushes aside much of Western philosophy, judging metaphysics to be a distraction from his focus on the facts of this world, with a view to improving it. As a result, we have the facticity reflected in the birth of the novel (Defoe), history (Gibbon, Hume), biography (Boswell), and Burke. In Robinson Crusoe (1719) we have the thrill of Locke’s empiricism as it appears in the prose of our first novel, that is, in the first distinctively modern form of literature:
The sixth day of our being at sea we came into Yarmouth Roads; the wind having been contrary, and the Weather calm, we had made but little Way since the Storm. Here we were obliged to come to an Anchor, and here we lay, the Wind continuing contrary—viz. at South-west—for seven or eight Days, during which time a great many Ships from Newcastle came into the same Roads, as the common Harbour where the ships might wait for a wind from the river.
Never before in literature had man been placed so thoroughly in a physical (empirical) environment. Never before had biography come to us with the detail Boswell uses in his Life of Samuel Johnson.
Burke does not begin with hypothetical “states of nature” but with the facts of history and human behavior. His great breakthrough into new territory—he wrote that he had been “alarmed into reflection” by the completely unique events in France—came in his Reflections on the Revolution in France (1790). To see his thought develop here in an exploratory way, then see him make further discoveries a year later, is to experience enormous intellectual excitement.
Once, while I was a graduate student at Columbia, I took a seminar in important thinkers with Jacques Barzun and Lionel Trilling. Barzun, in particular, liked to start by identifying the core of a great thinker’s thought. When it came to Burke’s Reflections on the Revolution, I offered: “Burke knows that if you tried to tie your shoes in the morning by means of reason you would never get out of the house.” That is, you tie your shoes by habit. Barzun nodded approval but gave this a social dimension, saying, “Burke wanted his morning newspaper delivered on time.” That is, the writing, manufacture, and delivery of that newspaper require a great many actions that are accomplished by habit. Social institutions are the habits of society.
What Burke faced in the radical philosophes across the Channel was something new: an actual society in France being attacked by abstract “rights of man.” To this he opposed the historic liberties of England. He saw the abstraction-based attack on an actual society as something new in history—and inherently dangerous. Part of the excitement of the Reflections consists in Burke confronting this novelty, searching for a vocabulary to describe it: “abstract theory,” “metaphysical dogma.” Burke was seeking terms to describe a belief system impervious to fact or experience, and he brought to bear a permanently valid analysis of human behavior and the role of social institutions. Burke’s “abstract theory” and “metaphysical dogma” we would call ideology.
Burke’s thought, however, did not conclude with the Reflections. And it is exciting to watch him responding to events as they unfold. By 1791, in his “Thoughts on French Affairs,” he recognized that the social forces converging against the absolute monarchy had made revolution inevitable. Saying that the French Revolution had occupied him for two years, he now recognized that:
If a great change is to be made in human affairs, the minds of men will be fitted to it; the general opinions and feelings will draw that way. Every fear, every hope will forward it; and they, who persist in opposing this mighty current in human affairs, will appear rather to resist the decrees of Providence itself, than the mere designs of men. They will not be resolute and firm, but perverse and obstinate.
Burke there moved from social structure in the Reflections to social process. In his great essay “The Function of Criticism at the Present Time” (1865) Matthew Arnold rightly described this as one of the great moments in modern thought.
In the free nations of the world at the present time, we have experienced changes that can be called revolutions, certainly the biomedical, also the women’s revolution, which has been one of the most far-reaching in its implications. Not until 1912 was women’s suffrage on the agenda of a major American political party, Theodore Roosevelt’s Progressive (“Bull Moose”) Party. And women’s suffrage implied women’s equality. The sources of women’s demand for equality surely went back before 1912. The result today can be seen in almost any college or university graduate school, indeed in the armed forces. I know the subject is fraught with emotion and contention, but I consider analytically that the demand for the availability of abortion is a derivative of women’s equality: that is, equality requires that women be able to shape their lives as freely as men do. Many will find that analytical conclusion disagreeable. No doubt Burke hated to see that the French Revolution had been inevitable. Yet he knew that those who “persist in opposing [the implications of] this mighty current in human affairs … will not be resolute and firm but perverse and obstinate.”
While it is not incorrect to call Burke a conservative, it is also correct to call him an analytical realist. And I suggest that they may be the same thing. Indeed there is a sense in which any successful government must be based upon such analytical realism. Today, many historians judge that Franklin Roosevelt and Dwight Eisenhower were among the best presidents in the 20th century and rank them among the best in American history. I think Ronald Reagan will join them. All were realistic in handling the challenges they faced.
Bush has offered two justifications for his invasion of Iraq. First, that Saddam had weapons of mass destruction. None were discovered, and Bush’s claims, upon examination, have been found suspect. He has also projected a democratic Iraq, some of his statements being so disconnected from actuality as to qualify as pure ideology.
For example, at the American Enterprise Institute on Feb. 26, 2003, Bush put forth the following theory of human behavior:
Human cultures can be vastly different. Yet the human heart desires the same good things, everywhere on earth. In our desire to be safe from brutal and bullying oppression, human beings are the same. For these fundamental reasons, freedom and democracy will always and everywhere have greater appeal than the slogans of hatred and the tactics of terror.
Yes, human beings do dislike “brutal and bullying oppression,” but everything else there is false. The people going to work at the World Trade Center on 9/11 did not want the same things as Mohammed Atta. Historically, holiness, power, glory, conquest, and empire have had greater appeal than freedom and democracy. But Bush’s belief in the convergence and even identity of goals apparently is unshakable.
Speaking in Whitehall later in 2003, Bush was at it again, claiming, “The establishment of a free Iraq in the heart of the Middle East will be a watershed event in the global expansion of democracy ... as the alternative to instability and hatred and terror.” Sure, “global expansion of democracy.” Andrew Bacevich of Boston University, a strategic thinker, wrote of Bush’s
fusion of breathtaking utopianism with barely disguised machtpolitik. It reads as if it were the product not of sober, ostensibly conservative Republicans but of an unlikely collaboration of Woodrow Wilson and the elder Field Marshal von Moltke.
On April 24, Bush repeated his fantastic theory in a speech in Irvine, California:
I based a lot of my foreign policy decisions on some things I think are true. One, I believe that there’s an Almighty, and secondly, I believe one of the great gifts of the Almighty is the desire in everybody’s soul, regardless of what you look like or where you live, to be free. I believe liberty is universal. I believe people want to be free. And I know that democracies do not war with each other. And I know that the best way to defeat the enemy, the best way to defeat their ability to exploit hopelessness and despair is to give people a chance to live in a free society.
Well, it is certainly taking a long time for what the Almighty wants to make its appearance in the actual world. Most of the world today is far from democratic. Over the long span of human history, democracy is almost invisible. In the real world, many people want a society in which the rules laid down in the Koran govern all activities and take absolute precedence over liberty. In Iraq, the radical cleric Moqtada al-Sadr has no interest in freedom, and al-Sadr is the power behind the present Prime Minister Maliki. What planet is Bush living on? He makes the “metaphysical dogma” of the radical philosophes seem sober by comparison.
Before long, students may be allowed to take entire history courses in the expanding library of books analyzing Bush’s Iraq calamity and other failures of his administration, which also derive from his tendency to privilege ideology over realism. Supply-side ideology led to large tax cuts and mountainous deficits. Privatization ideology led to an incomprehensible and unnecessarily expensive prescription-drug plan. No previous administration has produced such an outpouring.
Is Bush a conservative? Of course not. When all the evidence is in, I think historians will agree with Princeton’s Sean Wilentz, who wrote a carefully argued article judging Bush to have been the worst president in American history. The problem is that he is generally called a conservative, perhaps because he obviously is not a liberal. It may be that Bush, in the magnitude of his failure, defies conventional categories. But the word “conservative” deserves to be rescued. Against the misconception that Bush is a conservative, and appealing to Burke, all of our analytical energies must be brought to bear. I hope I have made a beginning here.
Jeffrey Hart is a senior editor of National Review and author, most recently, of The Making of the American Conservative Mind.

Monday, November 20, 2006



The Times
November 20, 2006
Beatles remixed for the 21st century
Adam Sherwin,
An act of sacrilege or a magical reworking of pop’s greatest legacy? Beatles fans will today deliver their verdict when a remix album of the Fab Four’s hits is given a global release.
Sir George Martin and his son, Giles, spent three years splicing and dicing the Beatles’ catalogue to create Love, the soundtrack to a Las Vegas stage show with Cirque du Soleil. A “mash-up” in modern musical parlance, it mixes elements of Penny Lane with Strawberry Fields Forever and marries Blackbird to Yesterday in a reinterpretation of the band’s work.
Bob Spitz, author of The Beatles: The Biography, said: “They are the Beatles’ songs and overdubbing them and massaging them allows other people to impose their own creative ideas on something that was so immediate and of a particular time. I thought that legacy was virtually tamper-proof, until now.” The project’s supporters argue that its juxtapositions and layerings, together with a digital enhancement of the original recordings at the Abbey Road studios, shed new light on the Beatles’ legacy. “We know there’ll be letters,” Giles Martin said. “But sometimes we take this music for granted. I hope this will help people to hear Beatles music again.”
EMI expects the first “new” Beatles album in 26 years to be a bestseller. “Best of” albums by U2 and Oasis are also released today. A spokesman for HMV said that the albums had attracted the largest numbers of pre-orders of any to date. The company said that the Oasis CD was likely to sell the most copies this week, but that the Beatles album would do well over Christmas.
Sir George, 80, said that the record, which has united Sir Paul McCartney and Yoko Ono, concludes his involvement with the band. He believes that it will see off the challenge from Oasis. Sir George said: “I think the Gallaghers are very talented, but they are not as good as the Beatles. Noel once told me he felt he was the inheritor of Paul McCartney’s ethos. They have modelled themselves on the Beatles, but possibly too much for the good of themselves.”
The Love stage show is selling 2,000 tickets a night at the Mirage Hotel in Las Vegas, even with prices of up to $150 (£80). Discussions about a global tour are continuing. A London staging could become one of the West End’s success stories if a suitable venue were found.
Sir George said: “They spent $100 million building a theatre specially for the show in Vegas. If you could take over the Royal Albert Hall and convert it that would be fine. But it’s very difficult to transfer.”
Philip Anschutz, the American tycoon, could bring the show to the Dome in London. The venue, renamed the O2, is part of his Anschutz Entertainment Group. A spokeswoman said: “It would be great to have Love in London and, as a world-class facility, we would welcome it at the O2.”
Up for auction
£100,000 estimated price that a Maton MS500 Mastersound guitar played by George Harrison in 1963 is expected to fetch at auction
£45,000 expected for John Lennon’s hand-made, hand-painted psychedelic jacket, from 1967, made for him by Mick Jagger’s brother, Chris
The Cooper Owen’s Music Legends auction, including 300 pieces of pop and rock memorabilia, takes place at Air Studios in London on November 30

Sunday, November 19, 2006


WTC's Freedom Tower starts to take shape
By Amy Westfeldt, Associated Press Writer November 18, 2006
NEW YORK -- Seventy trucks rolled into ground zero Saturday to pour the concrete base of the signature skyscraper at the new World Trade Center, creating the first visible signs of the long-delayed tower.
The concrete mixers began by dropping 520 cubic yards of concrete near thin steel bars jutting from the bottom of ground zero. The base will anchor the 1,776-foot Freedom Tower's concrete core. Next month, the first steel beams for the tower are scheduled to rise.
The site has been bustling in recent months, with work on half a dozen projects under way after years of disputes about designs and authority over the redevelopment.
"We've really turned this site around," said Steve Plate, director of priority capital programs for the Port Authority of New York and New Jersey, which owned the trade center.
The work continued despite calls from some family members of victims to halt construction. A search for human remains is ongoing in the area after more than 200 bones were found in manholes on the site's western edge. A search on the rooftop of the Millennium Hilton continued Saturday.
Gov. George Pataki's chief of staff John Cahill said the city needs to be sensitive to families' needs, but "it is time to build this site."
For visitors who peer through distant metal fences for a look into the 70-foot-deep pit, it may take time to see the buildings take shape on Manhattan's skyline. Project manager Mel Ruffini said it would take nearly two years for the Freedom Tower to reach street level.
Work on the Freedom Tower got under way earlier this year after the Port Authority renegotiated developer Larry Silverstein's lease to rebuild on the site. It is scheduled to open in 2011.
The agency is preparing the eastern end of the site for three more office towers, and earlier this week poured part of the concrete foundation for a permanent transit hub.
Construction of the memorial to the Sept. 11, 2001, terrorist attacks began this spring. This week workers began drilling into parts of the south tower's footprint to begin building columns that will support the twin reflecting pools meant to symbolize the lost towers.

Sunday, November 12, 2006



That way son
From Andrew Sullivan
Sunday London Times
After his election humiliation George Bush has slunk back to Dad for help. It's Shakespeare meets Freud, says Andrew Sullivan
The events of last week in America have an almost Shakespearean quality to them. It’s like some ghastly conflation of Richard II’s doom-laden “Down, down, I come” and Richard III’s “winter of our discontent”. Richard II is how Bush would like the world to see him — a king of noble motives brought low by injustice and fate. Richard III is . . . well, ask Karl Rove, the hunch in W’s back.

At the centre of this epic psycho-political drama is a royal family of sorts in a war for survival: the Bush dynasty, a story of a father and his son, their tortured relationship and what they have had to do to survive.

Last week George W Bush was forced back — once again — to the protective arms of his father. They call the first President Bush “Poppy” in the family, and it captures both the authority and the slight daffiness of the 41st president. His first son always lived in his shadow — both deeply admiring him and deeply resenting him, the way dauphins often do their monarchs.

In his own presidency, with the Yankee Bush clan reforged in the desert of Midland, Texas, Dubya tried to chart his own course, create his own destiny, become his own man. He would have two terms, not one. He would never raise taxes. And he would remove Saddam, not just corner him; liberate Iraq, not just contain it.

Last week the dream collapsed in the sands of Anbar and the voting booths of the Midwest. The first son, who always wanted to make a name for himself, to escape the suffocating legacy of a presidential father, was forced by the American people to go back to Poppy.

By nominating Robert Gates to the Pentagon, Bush Jr was reduced to asking one of his father’s closest friends to clean up the mess. What was Gates’s last job? As president of Texas A&M University, Gates hosted Poppy’s own presidential library. What was his previous claim to fame? Poppy had appointed him CIA director.
Poppy himself had been CIA director — manoeuvred into the shell-shocked institution after Vietnam by a wily young Donald Rumsfeld in the Ford administration. Gates was a CIA director’s CIA director. He was Poppy’s Poppy.

Is it possible to come up with a figure other than Gates more closely connected to the patriarch and not the dauphin? Actually, yes: James Baker. By asking Baker, another close confidant of the senior Bush, to head up a commission to solve the Iraq disaster, the president was forced again to return to the wise men of his prudent father’s circle.

The irony last week was even worse for the 43rd president. By firing his defence secretary, Bush was also firing his dad’s old enemy. He was surrendering one of his dad’s foes and replacing him with one of the old man’s closest pals. In his latest book, State of Denial, Bob Woodward is clear about the long-held animosity between Poppy and Rummy. They couldn’t stand each other. “Bush senior thought Rumsfeld was arrogant, self-important, too sure of himself and Machiavellian. He believed that in 1975 Rumsfeld had manoeuvred President Ford into selecting him to head the CIA. The CIA was at perhaps its lowest point in the mid-1970s. Serving as its director was thought to be a dead end . . . Rumsfeld had also made nasty private remarks that Bush was a lightweight, a weak cold war CIA director who did not appreciate the Soviet threat and was manipulated by secretary of state Henry Kissinger.”

If you want to know why Bush Jr held onto Rumsfeld longer than any sane person should have, one clue lies in the paternal relationship. Surrendering Rumsfeld means that Poppy was right. Not just right about Rumsfeld’s skills and nasty streak. Right about the biggest things: war and peace, country and honour. Rummy was in some ways the personification of the son’s refusal to be his father.
Rummy was the prickly, querulous, impolitic businessman, everything Poppy was not.

By picking one of his father’s old nemeses to head the Pentagon in 2000, W was also telling the old man to stay out of his affairs. And Poppy did. As Woodward also recounts, Bush Sr told his friend Prince Bandar of Saudi Arabia: “I had my turn. It is his turn now. I just have to stay off the stage . . . I will not make any comment vis-à-vis this president, not only out of principle but to let him be himself.”

W was indeed himself, which makes the failure now that much harder. Last week was a moment of complete humiliation. Wednesday, for Bush Jr, must have been the most crushing psychological moment in his presidency.

Of course, this kind of analysis would be dubbed by the president mere psychobabble. But the facts are plain.

George W was the first son, but never the favoured one, of the Bush dynasty. Jeb, his younger brother, was always going to be president. W was the loser, the joker, the wastrel. But W was also, in his heart, desperate to emulate his father, while too driven by his own ego to listen to him. He desperately both wanted approval and just as desperately wanted to be free and independent. It is self-evidently hard to be the son of a vice-president and president. It is hard to feel that every business deal you ever made was not because you were shrewd but because your father was powerful. It is hard both to support your father’s career and also to resent him. But that is the story of this president and, in part, of this administration.

When W was campaigning with his father in 1992, he described the oedipal conflict himself. At the New Orleans convention that year W confided to the Houston Chronicle that he was ambivalent about his father’s re-election campaign.
He said his father’s defeat might be good for him, according to the invaluable early biography of Bush by Bill Minutaglio, First Son. Then he brought himself up short and said to the reporter: “That is a strange thing to say, isn’t it? But if I were to think about running for office and he was president, it would be more difficult to establish my own identity. It probably would help me out more if he lost.”

The struggle between father and son began early. W was his mother’s boy, his father distant. “My father doesn’t have a normal life,” he once told a classmate, according to Minutaglio. “I don’t have a normal father.”

He was also a rebel. After a period at Yale and in the National Guard, W spent his twenties partying. One night when he was 26, he had been out drinking and had driven home. He had drunkenly barrelled his car into a neighbour’s rubbish bin, which had become attached to the car, and Bush drove down the street with the bin making a hell of a racket.

He pulled up, walked into the house, and was told that his father wanted to see him in the family den immediately. It was only a few weeks after the death of the über-patriarch Prescott Bush, W’s grandfather. But the young man was in a feisty mood, as the journalist David Maraniss revealed way back in 1989.

“I hear you’re looking for me,” W said to Poppy, slurring his words. “You wanna go mano a mano right here?”

Jeb, the favourite son, intervened. He told his parents that W had just been accepted by Harvard Business School, something W had kept from them. They were stunned, and the potentially violent stand-off was defused. “You should think about that, son,” Poppy said. “Oh, I’m not going,” W replied. “I just wanted to let you know I could get into it.”

Of course, W went. And in that tortured interaction, all the subsequent psycho-drama can be found. Supremely rebellious and yet deeply loyal, all W wanted was to please and yet outdo his dad.
In the end he achieved neither.

W went to the Ivy League but hated it for what he saw as the American elite’s snootiness and liberalism. At his news conference last Wednesday the president looked at the press corps and saw the same type of people. “Why all the glum faces?” he sneered bitterly. They reminded him of everything he loathed in his dad.

But the love was there as well. A family friend, Joe O’Neill, even ascribed Bush’s decision at 40 to stop drinking to the paternal factor. “He looked in the mirror and said, ‘Some day I might embarrass my father. It might get my dad in trouble.’ And boy, that was it. That’s how high a priority it was,” O’Neill told Minutaglio. “He never took another drink.”

W surpassed his dad by actually becoming a businessman. But his oil company exploits never worked out. He kept trying to find the magic oilfield that would reward his investors, but it never arrived. And there was always the suspicion that his family’s money and his father’s political power greased the wheels.

The more W tried to get past his father’s legacy the more it tracked him. Here is a passage from Minutaglio’s book that bears rereading this week. It’s about Bush’s early attempts to strike a big oilfield in Texas.

“The project was simply too large for him, and it was like putting a steel cap on a dream . . . Bush was extremely disappointed at losing . . . the chance to be deeply, independently capitalised without having to rely on his uncle’s investors. ‘We had never found the huge liberator,’ is what Bush once said to a Dallas writer.”

It’s almost too poignant a parallel to the present. Bush so wanted to be a huge liberator in another desert. But the wells were dry.

When Bush failed in business, his family contacts kept him financially afloat. He was kept on boards and bailed out of trouble by people eager to keep in his father’s good graces. His connections were also inextricable from his successful bid to be governor of Texas.
But when he won re-election as governor, he felt empowered for the first time as a political force independent of his father. He had found Karl Rove, who had honed his skills in the gutters of Southern political campaigning. And he had an ease with people that his father lacked and a shrewdness he inherited from his mother.

It was a powerful combination, and when W ran for the presidency it was both a way to avenge his father’s defeat at the hands of Bill Clinton and at the same time a way to show how he was not like his father at all. Ideologically he was much closer to the religious right, he was adamant on taxes, he wasn’t prudent fiscally, and he wasn’t timid in the world at large.

By putting Rumsfeld, his father’s enemy, in the Pentagon he sent a signal that he was his own president and his own man. Gaining the presidency was emulating his father. But regaining it was the final moment when Bush surpassed his one-term dad. It was also the moment when this administration started falling apart at the seams.

W loves boldness. It’s his greatest strength and his deepest weakness. When it came to Iraq, his decision was laden with memory. His father had fought a war against Saddam. Its hallmarks were a vast multinational coalition, huge numbers of troops and distinctly limited goals. The son’s war would be different.

With Rumsfeld in the Pentagon, it would be fought with an extra-light force, with far fewer allies, and with far more ambitious goals. It would not only defang Saddam, it would establish democracy. If his father always had trouble articulating the “vision thing”, as he once memorably put it, the son was all vision. In fact, the vision blinded him to the reality.

You can forgive W for the innovative, lightning decapitation of the Baghdad regime. In fact it was a stroke of genius. But you cannot forgive him for the hubris afterwards, for having no plan for the post-invasion, no troops to keep order, no strategy for everything his father had once worried so much about.
His father would never have done such a thing. Wouldn’t be “prudent”, would it? As Woodward recounts in his new book, the parents were worried all along. At a black-tie dinner on the eve of invasion, Barbara Bush took aside a Washington friend, David Boren, a former Democratic senator who had been the chairman of the select committee on intelligence during Poppy’s presidency.

“You always told me the truth,” Barbara opened, drawing Boren aside for a private chat.

“Yes, ma’am,” Boren replied.

“Will you tell me the truth now?”

“Certainly.”

“Are we right to be worried about this Iraq thing?”

“Yes. I’m very worried.”

“Do you think it’s a mistake?”

“Yes, ma’am,” Boren replied. “I think it’s a huge mistake if we go in right now, this way.”

“Well, his father is certainly worried and is losing sleep over it. He’s up at night worried.”

“Why doesn’t he talk to him?”

“He doesn’t think he should unless he’s asked,” Barbara Bush said.

This time there was no Jeb to intervene to avert a father-son clash. And the father was too decent and too loyal to force one.

Poppy’s closest allies did what they could before the war to stage an intervention. Brent Scowcroft, Poppy’s former national security adviser, wrote a newspaper article warning against war in Iraq. Scowcroft was a realist of the old Poppy school. He had no illusions about spreading democracy among Arabs; he’d been happy to deal with Saddam as a bulwark against Iran; he was content to stand back in 1991 as Saddam, left in power, murdered countless Shi’ites and Kurds, because the United States was not prepared to occupy what Churchill once called the “ungrateful volcano” of Iraq.

Scowcroft was Condoleezza Rice’s mentor. He was part of the Bush famiglia, governed by the clan’s code of omerta. For him to be disloyal in public was a warning shot from the old man. But the son didn’t listen. Too many of us were deaf.

There is another irony. Poppy was prudent but not bold. W was bold but not prudent.
If Poppy had been as bold as his son back in 1990 and had actually invaded Iraq, the coalition would indeed have been greeted as liberators in Baghdad. There would even have been enough troops to succeed in an occupation. The anti-American suspicions that the Shi’ites retained from their bitter experience of being abandoned in 1991, and the rapid deterioration in Iraq’s civil society during the sanctions regime of the 1990s, might never have come about.

The ironies are painful. If the father had been more like the son in 1990 the world might now be a very different place. And if the son had been more like the father in 2003, had responded to obvious errors and brought sufficient allies and troops to the task, he might have succeeded as well. But the tragedy of history is that we never know what might have been: 1990 wasn’t 2003, and Poppy wasn’t W.

W stuck with Rumsfeld’s vision even when no one else would. Poppy stuck to caution even when he had a historic opportunity to remake the Middle East before the toxin of Islamism could become more potent. Each was his own man and each, in his own way, therefore failed. Except that the consequences of W’s failure are immeasurably greater than Poppy’s.

The truth about this president is that he still loves and reveres his father. This cathartic moment in American and world history might also be a catharsis within the Bush family. The ranks are already closing. With Gates and Baker now back in the fold, Poppy’s faction has solidified behind W’s. They want to help him out, to rescue his presidency, to rebalance American power in the world and to carve something from the wreckage in Iraq.

Last week the American people forced the family intervention. They knew what they were doing. If you combine W’s shrewdness with Poppy’s wisdom you might have the beginning of a new day in world politics. This Shakespearean drama is not over. We have merely finished Act IV. W has two more years.
The Democrats will force him to move domestically to the centre, and Daddy’s team will not abandon the son in his hour of need. Their price was Rumsfeld’s head, and they now have it on a platter.

What they will do is not yet knowable. Much is on the table. As recently as two years ago Robert Gates authored a Council on Foreign Relations paper advocating direct negotiations with Iran. Baker’s Iraq Study Group has already deemed the goal of democracy impossible there. The American people, for their part, do not want defeat or a Vietnam-style retreat in Iraq. They just want a sane strategy, shorn of delusion, fanaticism and arrogance.

Whether Bush has the strength to reconcile with his father at this moment and do what is necessary is also unknowable. These are not characters in a play. They are still human beings, as unpredictable and inscrutable as Shakespeare saw them. The American voters just shifted the underlying plot; and the Iraqi people have their own painful decisions to make and loyalties to break.

Act V, in other words, is about to begin.