Monday, April 24, 2017

Iran Election 2017

Iran's election: It's not about moderates or hardliners

The presidential election will determine whether the rationalisation and normalisation of Iran continues.

by Saeid Golkar

Saeid Golkar is a lecturer at the Middle East and North African Studies Program at Northwestern University.

The upcoming presidential election in May 2017 will determine not only Iran's policies in the short term but also the future direction of the Islamic Republic. Based on the result of this election, Iran can move towards either a more theocratic, militaristic regime or a more democratic, electoral one.

Many of Iran's observers have analysed elections through the binary lens of moderates v hardliners. From this viewpoint, the primary candidates who can pass the filter of the Guardian Council, a conservative institution responsible for endorsing the final candidate list, belong to two groups: hardliners and moderates. 
Right now, three main candidates are running: incumbent President Hojatoleslam Hassan Rouhani from the moderate wing; and Hojatoleslam Ebrahim Raisi and Mohammad Ghalibaf, both hardliners.

Raisi is the national prosecutor-general and the custodian of the Imam Reza shrine in Mashhad, while Ghalibaf is a former senior commander in the Islamic Revolutionary Guard Corps (IRGC), a former chief of Iran's national police, and the current Mayor of Tehran.

However, the binary of moderates and hardliners is no longer a useful paradigm with which to understand Iranian politics. It cannot explain the difference among hardliner candidates; nor can it predict Iran's political future.

To better understand Iran's politics, it helps to examine which power bloc each candidate belonged to before running for office: the clergy, the technocrats, or the military/security forces. 
Three blocs that shaped Iran
Since 1979, these three blocs and the interactions between them have shaped Iranian policy. Three forms of alliances have arisen from the interactions between these groups: the clergy-military, the clergy-technocrats and the military-technocrats.

In the 1980s, the clergy-military alliance was the dominant axis, and the clergy had the upper hand therein. In the 1990s, power shifted in favour of the technocrats and against the Revolutionary Guard.

During the Hashemi Rafsanjani period (1989–1997), the clergy-technocrats alliance was dominant, and the clerics had more weight. However, during Khatami's presidency (1997-2005), this alliance changed to a technocrat-clergy alliance in which the bureaucrats had the upper hand.

The power shifted again under Ahmadinejad as the IRGC-bureaucrats alliance became the most influential group and political base of his hardliner administration (2005-2013). Currently, under Rouhani's presidency, technocrats have returned to power and are responsible for shaping government policy, while the IRGC has become marginalised, at least in the administration.

Using this framework, we can sort the main presidential candidates into three categories: those close to the clerical network (Howzeh), those close to the bureaucracy, and those close to the military. For example, Hassan Rouhani represents the clergy-technocrat alliance; Raisi, the clergy-military/security alliance; and Ghalibaf, the security/military-technocrat alliance.

Iran's Supreme Leader, Ayatollah Khamenei, is also closer to the alliance of the clergy and the military (IRGC). This explains his appointment of Raisi as the custodian of the Imam Reza shrine, the wealthiest conglomerate foundation in Mashhad. Raisi also has a good relationship with the IRGC and the Basij militia, as well as the Iranian judiciary.

These three axes - the clergy-military, the clergy-technocrat, and the technocrat-military - have different political, social, cultural and even economic orientations. As a result, the victory of one of these candidates in the May presidential election will decide which group holds the most power in shaping Iran's domestic and foreign policies. Not only that, this most powerful axis will affect the future of the Islamic Republic by influencing future elections and, most importantly, the appointment of the next supreme leader.

Second most powerful figure
While Iran's presidents lack ultimate authority over the country's security and foreign policies, which fall within the supreme leader's domain, the president is still the second most powerful figure in Iran's political structure.

The re-election of President Rouhani would lead to continued normalisation and rationalisation of the regime and a strengthening of the technocrats. The victory of Raisi, by contrast, would strengthen the axis of the radical clergy and the Revolutionary Guard, leading to a radicalisation of the administration and a narrowing of the political space. Similarly, Ghalibaf's victory would strengthen the security/military-technocrat alliance and probably increase the involvement of the IRGC in politics.

The election is significant in determining short-term policies, but it also has a substantial impact on the selection of the next supreme leader. Although the president is not directly involved in the selection process, the president still plays an important role. The Iranian constitution states that, in the event of the death of the Supreme Leader, the president is one of three key figures in a council that will take over the duties of the supreme leader until the Assembly of Experts, a body of 88 jurists, appoints a new one.

The other two members of this council are the head of the judicial power and a jurist from the Guardian Council. Because there are no constraints on the length of the appointment process, the temporary ruling council could be in power for a long time.

Among the presidential candidates, both Rouhani and Raisi are members of the Assembly of Experts. As a result, they have more power in the selection of the next supreme leader when the time comes. Because Ayatollah Khamenei is 77 and has been reported not to be in good health, there is a possibility that the next supreme leader will be elected within the next four years.

The leaders who are affiliated with the clergy-military alliance can move Iran in a different direction, possibly towards a more militaristic regime, than leaders who are closer to the clergy-technocrats axis. For this reason, it seems that this is the most important election in the history of the Islamic Republic so far.

Saeid Golkar is a lecturer at the Middle East and North African Studies Programme at Northwestern University, and senior fellow of Iran policy at The Chicago Council on Global Affairs.

The views expressed in this article are the author's own and do not necessarily reflect Al Jazeera's editorial policy.

Thursday, April 20, 2017

Why the FBI Kept a 1,400-Page File on Einstein

From The National Geographic on Einstein's birthday

The world-famous physicist was outspoken against racism, nationalism, and nuclear bombs, prompting deep suspicion from J. Edgar Hoover.

Albert Einstein was already a world-famous physicist when the FBI started keeping a secret dossier on him in December 1932. He and his wife Elsa had just moved to the United States from their native Germany, and Einstein had been very vocal about the social issues of his time, arguing publicly against racism and nationalism.

By the time of Einstein’s death on April 18, 1955, that FBI file would be 1,427 pages long. Agency director J. Edgar Hoover was deeply suspicious of Einstein’s activism; the man was quite possibly a communist, according to Hoover, and was certainly “an extreme radical.”

Einstein himself probably would have laughed out loud at those labels if he’d known about them; he’d heard far worse from the Nazis back home. And he was not at all intimidated by officialdom. “Unthinking respect for authority is the greatest enemy of truth,” he declared in 1901.

Einstein's defiant attitude had gotten him kicked out of the German equivalent of high school at age 15, and that had led him to renounce his citizenship at age 17. He wanted nothing more to do with Germany’s authoritarian schools and rampant militarism, which he loathed.

Instead, Einstein attended the Zurich Polytechnic Institute in Switzerland, became a Swiss citizen, and, after he graduated, went to work at the Swiss patent office in Bern, where he did his revolutionary work on relativity and quantum theory in 1905. (Read about the 2016 discovery of gravitational waves, a phenomenon Einstein predicted a hundred years ago.)

Einstein did not return to live in Germany until April 1914, when his achievements led to a prestigious appointment at the University of Berlin. There, he continued to develop his ideas about relativity and gravity, which received spectacular confirmation in 1919 from observations of a solar eclipse and which have shaped our understanding of the universe ever since.

The rising Nazi party was soon denouncing relativity as “a Jewish perversion”—the 1920s equivalent of using “fake news” as an all-purpose put-down—and Einstein was receiving so many anonymous death threats that he tried to avoid walking alone.

But threats didn’t shut him down. Instead, he repeatedly used his newfound fame to speak out against what he saw as the wrongs of the world. Silence in the face of evil, he once said, “would have made me feel guilty of complicity.”

He denounced militant nationalism. “The measles of mankind,” he called it in 1929.

He questioned capitalism. “I regard class differences as contrary to justice and, in the last resort, based on force,” he wrote in 1931. “Let every man be respected as an individual and no man idolized.”

He protested racism. In 1937, when African-American singer Marian Anderson was denied a hotel room in Einstein’s new home town of Princeton, New Jersey, he and Elsa invited Anderson to stay in their home—the beginning of a lifelong friendship. He also befriended the African-American singer Paul Robeson, who had been ostracized for being a communist. And in a 1946 address at the historically black Lincoln University in Pennsylvania, Einstein declared segregation to be “a disease of white people.”

After 1933, the rise of Hitler forced Einstein to concede that pure pacifism was no longer realistic. In August 1939, fearing that German physicists were already racing to exploit the newly discovered phenomenon of nuclear fission, Einstein signed a letter to U.S. President Franklin Roosevelt warning that “the element uranium may be turned into a new and important source of energy in the immediate future”—that is, a bomb.
Roosevelt’s response was the Manhattan Project: a crash program to develop the atomic bomb before Hitler could.

Einstein played no further role in the project. But in the spring of 1945 he did write another letter urging the President to meet with Manhattan Project scientists who were concerned about the rush to finish the bomb and use it, even though Germany was near defeat and had clearly given up on uranium.

Roosevelt died on April 12 before he could read the letter, and when Einstein learned in August that an atomic bomb had been detonated over the Japanese city of Hiroshima, he could only whisper, “Oh my God.”

For the rest of his life, he was a tireless advocate for bringing nuclear weapons under some form of international control. In the atomic age, he argued, war had become a form of insanity.

We can only guess what Einstein would have said about today’s political atmosphere. But we do know his reaction to an earlier era of government crackdowns: the anti-communist hysteria of the 1950s.

“Every intellectual who is called before one of the committees ought to refuse to testify,” Einstein declared in 1953, referring to the congressional investigations that were intimidating and ruining the careers of many innocent people.

That statement earned him outraged editorials in newspapers across the country, including the Washington Post and New York Times. But he wore their condemnation proudly.

After his firsthand experience with the “brute force and fear” taking hold in Europe, what impressed Einstein most about America “was the country’s tolerance of free thought, free speech, and nonconformist beliefs”—the very qualities that had always animated his science, says his biographer Walter Isaacson.

Einstein was not about to stand by and watch while, in the great physicist’s words, “the German calamity of years ago repeats itself.”

Wednesday, April 12, 2017

I Thought I Understood the American Right. Trump Proved Me Wrong.

Rick Perlstein NY Times

A historian of conservatism looks back at how he and his peers failed to anticipate the rise of the president.


Until Nov. 8, 2016, historians of American politics shared a rough consensus about the rise of modern American conservatism. It told a respectable tale. By the end of World War II, the story goes, conservatives had become a scattered and obscure remnant, vanquished by the New Deal and the apparent reality that, as the critic Lionel Trilling wrote in 1950, liberalism was “not only the dominant but even the sole intellectual tradition.”

Year Zero was 1955, when William F. Buckley Jr. started National Review, the small-circulation magazine whose aim, Buckley explained, was to “articulate a position on world affairs which a conservative candidate can adhere to without fear of intellectual embarrassment or political surrealism.” Buckley excommunicated the John Birch Society, anti-Semites and supporters of the hyperindividualist Ayn Rand, and his cohort fused the diverse schools of conservative thinking — traditionalist philosophers, militant anti-Communists, libertarian economists — into a coherent ideology, one that eventually came to dominate American politics.

I was one of the historians who helped forge this narrative. My first book, “Before the Storm,” was about the rise of Senator Barry Goldwater, the uncompromising National Review favorite whose refusal to exploit the violent backlash against civil rights, and whose bracingly idealistic devotion to the Constitution as he understood it — he called for Social Security to be made “voluntary” — led to his crushing defeat in the 1964 presidential election. Goldwater’s loss, far from dooming the American right, inspired a new generation of conservative activists to redouble their efforts, paving the way for the Reagan revolution. Educated whites in the prosperous metropolises of the New South sublimated the frenetic, violent anxieties that once marked race relations in their region into more palatable policy concerns about “stable housing values” and “quality local education,” backfooting liberals and transforming conservatives into mainstream champions of a set of positions with enormous appeal to the white American middle class.

These were the factors, many historians concluded, that made America a “center right” nation. For better or for worse, politicians seeking to lead either party faced a new reality. Democrats had to honor the public’s distrust of activist government (as Bill Clinton did with his call for the “end of welfare as we know it”). Republicans, for their part, had to play the Buckley role of denouncing the political surrealism of the paranoid fringe (Mitt Romney’s furious backpedaling after joking, “No one’s ever asked to see my birth certificate”).

Then the nation’s pre-eminent birther ran for president. Trump’s campaign was surreal and an intellectual embarrassment, and political experts of all stripes told us he could never become president. That wasn’t how the story was supposed to end. National Review devoted an issue to writing Trump out of the conservative movement; an editor there, Jonah Goldberg, even became a leader of the “Never Trump” crusade. But Trump won — and conservative intellectuals quickly embraced a man who exploited the same brutish energies that Buckley had supposedly banished, with Goldberg explaining simply that Never Trump “was about the G.O.P. primary and the general election, not the presidency.”

The professional guardians of America’s past, in short, had made a mistake. We advanced a narrative of the American right that was far too constricted to anticipate the rise of a man like Trump. Historians, of course, are not called upon to be seers. Our professional canons warn us against presentism — we are supposed to weigh the evidence of the past on its own terms — but at the same time, the questions we ask are conditioned by the present. That is, ultimately, what we are called upon to explain. Which poses a question: If Donald Trump is the latest chapter of conservatism’s story, might historians have been telling that story wrong?

American historians’ relationship to conservatism itself has a troubled history. Even after Ronald Reagan’s electoral-college landslide in 1980, we paid little attention to the right: The central narrative of America’s political development was still believed to be the rise of the liberal state. But as Newt Gingrich’s right-wing revolutionaries prepared to take over the House of Representatives in 1994, the scholar Alan Brinkley published an essay called “The Problem of American Conservatism” in The American Historical Review. American conservatism, Brinkley argued, “had been something of an orphan in historical scholarship,” and that was “coming to seem an ever-more-curious omission.” The article inaugurated the boom in scholarship that brought us the story, now widely accepted, of conservatism’s triumphant rise.

That story was in part a rejection of an older story. Until the 1990s, the most influential writer on the subject of the American right was Richard Hofstadter, a colleague of Trilling’s at Columbia University in the postwar years. Hofstadter was the leader of the “consensus” school of historians; the “consensus” being Americans’ supposed agreement upon moderate liberalism as the nation’s natural governing philosophy. He didn’t take the self-identified conservatives of his own time at all seriously. He called them “pseudoconservatives” and described, for instance, followers of the red-baiting Republican senator Joseph McCarthy as cranks who salved their “status anxiety” with conspiracy theories and bizarre panaceas. He named this attitude “the paranoid style in American politics” and, in an article published a month before Barry Goldwater’s presidential defeat, asked, “When, in all our history, has anyone with ideas so bizarre, so archaic, so self-confounding, so remote from the basic American consensus, ever gone so far?”

It was a strangely ahistoric question; many of Goldwater’s ideas hewed closely to a well-established American distrust of statism that goes back all the way to the nation’s founding. It betokened too a certain willful blindness toward the evidence that was already emerging of a popular backlash against liberalism. Reagan’s gubernatorial victory in California two years later, followed by his two landslide presidential wins, made a mockery of Hofstadter. Historians seeking to grasp conservatism’s newly revealed mass appeal would have to take the movement on its own terms.

That was my aim when I took up the subject in the late 1990s — and, even more explicitly, the aim of Lisa McGirr, now of Harvard University, whose 2001 book, “Suburban Warriors: The Origins of the New American Right,” became a cornerstone of the new literature. Instead of pronouncing upon conservatism from on high, as Hofstadter had, McGirr, a social historian, studied it from the ground up, attending respectfully to what activists understood themselves to be doing. What she found was “a highly educated and thoroughly modern group of men and women,” normal participants in the “bureaucratized world of post-World War II America.” They built a “vibrant and remarkable political mobilization,” she wrote, in an effort to address political concerns that would soon be resonating nationwide — for instance, their anguish at “liberal permissiveness” about matters like rising crime rates and the teaching of sex education in public schools.

But if Hofstadter was overly dismissive of how conservatives understood themselves, the new breed of historians at times proved too credulous. McGirr diligently played down the sheer bloodcurdling hysteria of conservatives during the period she was studying — for example, one California senator’s report in 1962 that he had received thousands of letters from constituents concerned about a rumor that Communist Chinese commandos were training in Mexico for an imminent invasion of San Diego. I sometimes made the same mistake. Writing about the movement that led to Goldwater’s 1964 Republican nomination, for instance, it never occurred to me to pay much attention to McCarthyism, even though McCarthy helped Goldwater win his Senate seat in 1952, and Goldwater supported McCarthy to the end. (As did William F. Buckley.) I was writing about the modern conservative movement, the one that led to Reagan, not about the brutish relics of a more gothic, ill-formed and supposedly incoherent reactionary era that preceded it.

A few historians have provocatively followed a different intellectual path, avoiding both the bloodlessness of the new social historians and the psychologizing condescension of the old Hofstadter school. Foremost among them is Leo Ribuffo, a professor at George Washington University. Ribuffo’s surname announces his identity in the Dickensian style: Irascible, brilliant and deeply learned, he is one of the profession’s great rebuffers. He made his reputation with an award-winning 1983 study, “The Old Christian Right: The Protestant Far Right From the Great Depression to the Cold War,” and hasn’t published a proper book since — just a series of coruscating essays that frequently focus on what everyone else is getting wrong. In the 1994 issue of The American Historical Review that featured Alan Brinkley’s “The Problem of American Conservatism,” Ribuffo wrote a response contesting Brinkley’s contention, now commonplace, that Trilling was right about American conservatism’s shallow roots. Ribuffo argued that America’s anti-liberal traditions were far more deeply rooted in the past, and far angrier, than most historians would acknowledge, citing a long list of examples from “regional suspicions of various metropolitan centers and the snobs who lived there” to “white racism institutionalized in slavery and segregation.”

After the election, Ribuffo told me that if he were to write a similar response today, he would call it, “Why Is There So Much Scholarship on ‘Conservatism,’ and Why Has It Left the Historical Profession So Obtuse About Trumpism?” One reason, as Ribuffo argues, is the conceptual error of identifying a discrete “modern conservative movement” in the first place. Another reason, though, is that historians of conservatism, like historians in general, tend to be liberal, and are prone to liberalism’s traditions of politesse. It’s no surprise that we are attracted to polite subjects like “colorblind conservatism” or William F. Buckley.

Our work might have been less obtuse had we shared the instincts of a New York University professor named Kim Phillips-Fein. “Historians who write about the right should find ways to do so with a sense of the dignity of their subjects,” she observed in a 2011 review, “but they should not hesitate to keep an eye out for the bizarre, the unusual, or the unsettling.”

Looking back from that perspective, we can now see a history that is indeed unsettling — but also unsettlingly familiar. Consider, for example, an essay published in 1926 by Hiram Evans, the imperial wizard of the Ku Klux Klan, in the exceedingly mainstream North American Review. His subject was the decline of “Americanism.” Evans claimed to speak for an abused white majority, “the so-called Nordic race,” which, “with all its faults, has given the world almost the whole of modern civilization.” Evans, a former dentist, proposed that his was “a movement of plain people,” and acknowledged that this “lays us open to the charge of being hicks and ‘rubes’ and ‘drivers of secondhand Fords.’ ” But over the course of the last generation, he wrote, these good people “have found themselves increasingly uncomfortable, and finally deeply distressed,” watching a “moral breakdown” that was destroying a once-great nation. First, there was “confusion in thought and opinion, a groping and hesitancy about national affairs and private life alike, in sharp contrast to the clear, straightforward purposes of our earlier years.” Next, they found “the control of much of our industry and commerce taken over by strangers, who stacked the cards of success and prosperity against us,” and ultimately these strangers “came to dominate our government.” The only thing that would make America great again, as it were, was “a return of power into the hands of everyday, not highly cultured, not overly intellectualized, but entirely unspoiled and not de-Americanized average citizens of old stock.”

This “Second Klan” (the first was formed during Reconstruction) scrambles our pre-Trump sense of what right-wing ideology does and does not comprise. (Its doctrines, for example, included support for public education, to weaken Catholic parochial schools.) The Klan also put the predations of the international banking class at the center of its rhetoric. Its worldview resembles, in fact, the right-wing politics of contemporary Europe — a tradition, heretofore judged foreign to American politics, called “herrenvolk republicanism,” that reserved social democracy solely for the white majority. By reaching back to the reactionary traditions of the 1920s, we might better understand the alliance between the “alt-right” figures that emerged as fervent Trump supporters during last year’s election and the ascendant far-right nativist political parties in Europe.

None of this history is hidden. Indeed, in the 1990s, a rich scholarly literature emerged on the 1920s Klan and its extraordinary, and decidedly national, influence. (One hotbed of Klan activity, for example, was Anaheim, Calif. McGirr’s “Suburban Warriors” mentions this but doesn’t discuss it; neither did I in my own account of Orange County conservatism in “Before the Storm.” Again, it just didn’t seem relevant to the subject of the modern conservative movement.) The general belief among historians, however, was that the Klan’s national influence faded in the years after 1925, when Indiana’s grand dragon, D.C. Stephenson, who served as the de facto political boss for the entire state, was convicted of murdering a young woman.

But the Klan remained relevant far beyond the South. In 1936 a group called the Black Legion, active in the industrial Midwest, burst into public consciousness after members assassinated a Works Progress Administration official in Detroit. The group, which considered itself a Klan enforcement arm, dominated the news that year. The F.B.I. estimated its membership at 135,000, including a large number of public officials, possibly including Detroit’s police chief. The Associated Press reported in 1936 that the group was suspected of assassinating as many as 50 people. In 1937, Humphrey Bogart starred in a film about it. In an informal survey, however, I found that many leading historians of the right — including one who wrote an important book covering the 1930s — hadn’t heard of the Black Legion.

Stephen H. Norwood, one of the few historians who did study the Black Legion, also mined another rich seam of neglected history in which far-right vigilantism and outright fascism routinely infiltrated the mainstream of American life. The story begins with Father Charles Coughlin, the Detroit-based “radio priest” who at his peak reached as many as 30 million weekly listeners. In 1938, Coughlin’s magazine, Social Justice, began reprinting “Protocols of the Learned Elders of Zion,” a forged tract about a global Jewish conspiracy first popularized in the United States by Henry Ford. After presenting this fictitious threat, Coughlin’s paper called for action, in the form of a “crusade against the anti-Christian forces of the red revolution” — a call that was answered, in New York and Boston, by a new organization, the Christian Front. Its members were among the most enthusiastic participants in a 1939 pro-Hitler rally that packed Madison Square Garden, where the leader of the German-American Bund spoke in front of an enormous portrait of George Washington flanked by swastikas.


The Bund took a mortal hit that same year — its leader was caught embezzling — but the Christian Front soldiered on. In 1940, a New York chapter was raided by the F.B.I. for plotting to overthrow the government. The organization survived, and throughout World War II carried out what the New York Yiddish paper The Day called “small pogroms” in Boston and New York that left Jews in “mortal fear” of “almost daily” beatings. Victims who complained to authorities, according to news reports, were “insulted and beaten again.” Young Irish-Catholic men inspired by the Christian Front desecrated nearly every synagogue in Washington Heights. The New York Catholic hierarchy, the mayor of Boston and the governor of Massachusetts largely looked the other way.

Why hasn’t the presence of organized mobs with backing in powerful places disturbed historians’ conclusion that the American right was dormant during this period? In fact, the “far right” was never that far from the American mainstream. The historian Richard Steigmann-Gall, writing in the journal Social History, points out that “scholars of American history are by and large in agreement that, in spite of a welter of fringe radical groups on the right in the United States between the wars, fascism never ‘took’ here.” And, unlike in Europe, fascists did not achieve governmental power. Nevertheless, Steigmann-Gall continues, “fascism had a very real presence in the U.S.A., comparable to that on continental Europe.” He cites no less mainstream an organization than the American Legion, whose “National Commander” Alvin Owsley proclaimed in 1922, “the Fascisti are to Italy what the American Legion is to the United States.” A decade later, Chicago named a thoroughfare after the Fascist military leader Italo Balbo. In 2011, Italian-American groups in Chicago protested a movement to rename it.

Anti-Semitism in America declined after World War II. But as Leo Ribuffo points out, the underlying narrative — of a diabolical transnational cabal of aliens plotting to undermine the very foundations of Christian civilization — survived in the anti-Communist diatribes of Joseph McCarthy. The alien narrative continues today in the work of National Review writers like Andrew McCarthy (“How Obama Embraces Islam’s Sharia Agenda”) and Lisa Schiffren (who argued that Obama’s parents could be secret Communists because “for a white woman to marry a black man in 1958, or ’60, there was almost inevitably a connection to explicit Communist politics”). And it found its most potent expression in Donald Trump’s stubborn insistence that Barack Obama was not born in the United States.

Trump’s connection to this alternate right-wing genealogy is not just rhetorical. In 1927, 1,000 hooded Klansmen fought police in Queens in what The Times reported as a “free for all.” One of those arrested at the scene was the president’s father, Fred Trump. (Trump’s role in the melee is unclear; the charge — “refusing to disperse” — was later dropped.) In the 1950s, Woody Guthrie, at the time a resident of the Beach Haven housing complex the elder Trump built near Coney Island, wrote a song about “Old Man Trump” and the “Racial hate/He stirred up/In the bloodpot of human hearts/When he drawed/That color line” in one of his housing developments. In 1973, when Donald Trump was working at Fred’s side, both father and son were named in a federal housing-discrimination suit. The family settled with the Justice Department in the face of evidence that black applicants were told units were not available even as whites were welcomed with open arms.

The 1960s and ’70s New York in which Donald Trump came of age, as much as Klan-ridden Indiana in the 1920s or Barry Goldwater’s Arizona in the 1950s, was at conservatism’s cutting edge, setting the emotional tone for a politics of rage. In 1966, when Trump was 20, Mayor John Lindsay placed civilians on a board to more effectively monitor police abuse. The president of the Patrolmen’s Benevolent Association — responding, “I am sick and tired of giving in to minority groups and their gripes and their shouting” — led a referendum effort to dissolve the board that won 63 percent of the vote. Two years later, fights between supporters and protesters of George Wallace at a Madison Square Garden rally grew so violent that, The New Republic observed, “never again will you read about Berlin in the ’30s without remembering this wild confrontation here of two irrational forces.”

The rest of the country followed New York’s lead. In 1970, after the shooting deaths of four students during antiwar protests at Kent State University in Ohio, a Gallup poll found that 58 percent of Americans blamed the students for their own deaths. (“If they didn’t do what the Guards told them, they should have been mowed down,” one parent of Kent State students told an interviewer.) Days later, hundreds of construction workers from the World Trade Center site beat antiwar protesters at City Hall with their hard hats. (“It was just like Iwo Jima,” an impressed witness remarked.) That year, reports the historian Katherine Scott, 76 percent of Americans “said they did not support the First Amendment right to assemble and dissent from government policies.”

In 1973, the reporter Gail Sheehy joined a group of blue-collar workers watching the Watergate hearings in a bar in Astoria, Queens. “If I was Nixon,” one of them said, “I’d shoot every one of them.” (Who “they” were went unspecified.) This was around the time when New Yorkers were leaping to their feet and cheering during screenings of “Death Wish,” a hit movie about a liberal architect, played by Charles Bronson, who shoots muggers at point-blank range. At an October 2015 rally near Nashville, Donald Trump told his supporters: “I have a license to carry in New York, can you believe that? Nobody knows that. Somebody attacks me, oh, they’re gonna be shocked.” He imitated a cowboy-style quick draw, and an appreciative crowd shouted out the name of Bronson’s then-41-year-old film: “ ‘Death Wish’!”

In 1989, a young white woman was raped in Central Park. Five teenagers, four black and one Latino, confessed to participating in the crime. At the height of the controversy, Donald Trump took out full-page ads in all the major New York daily papers calling for the return of the death penalty. It was later proved the police had essentially tortured the five into their confessions, and they were eventually cleared by DNA evidence. Trump, however, continues to insist upon their guilt. That confidence resonates deeply with what the sociologist Lawrence Rosenthal calls New York’s “hard-hat populism” — an attitude, Rosenthal hypothesizes, that Trump learned working alongside the tradesmen in his father’s real estate empire. But the case itself also resonates deeply with narratives dating back to the first Ku Klux Klan of white womanhood defiled by dark savages. Trump’s public call for the supposed perpetrators’ hides, no matter the proof of guilt or innocence, mimics the rituals of Southern lynchings.

When Trump vowed on the campaign trail to Make America Great Again, he was generally unclear about when exactly it stopped being great. The Vanderbilt University historian Jefferson Cowie tells a story that points to a possible answer. In his book “The Great Exception,” he suggests that what historians considered the main event in 20th century American political development — the rise and consolidation of the “New Deal order” — was in fact an anomaly, made politically possible by a convergence of political factors. One of those was immigration. At the beginning of the 20th century, millions of impoverished immigrants, mostly Catholic and Jewish, entered an overwhelmingly Protestant country. It was only when that demographic transformation was suspended by the 1924 Immigration Act that majorities of Americans proved willing to vote for many liberal policies. In 1965, Congress once more allowed large-scale immigration to the United States — and it is no accident that this date coincides with the increasing conservative backlash against liberalism itself, now that its spoils would be more widely distributed among nonwhites.

The liberalization of immigration law is an obsession of the alt-right. Trump has echoed their rage. “We’ve admitted 59 million immigrants to the United States between 1965 and 2015,” he noted last summer, with rare specificity. “ ‘Come on in, anybody. Just come on in.’ Not anymore.” This was a stark contrast to Reagan, who venerated immigrants, proudly signing a 1986 bill, sponsored by the conservative Republican senator Alan Simpson, that granted many undocumented immigrants citizenship. Shortly before announcing his 1980 presidential run, Reagan even boasted of his wish “to create, literally, a common market situation here in the Americas with an open border between ourselves and Mexico.” But on immigration, at least, it is Trump, not Reagan, who is the apotheosis of the brand of conservatism that now prevails.

A puzzle remains. If Donald Trump was elected as a Marine Le Pen-style — or Hiram Evans-style — herrenvolk republican, what are we to make of the fact that he placed so many bankers and billionaires in his cabinet, and has relentlessly pursued so many 1-percent-friendly policies? More to the point, what are we to make of the fact that his supporters don’t seem to mind?

Here, however, Trump is far from unique. The history of bait-and-switch between conservative electioneering and conservative governance is another rich seam that calls out for fresh scholarly excavation: not of how conservative voters see their leaders, but of the neglected history of how conservative leaders see their voters.

In their 1987 book, “Right Turn,” the political scientists Joel Rogers and Thomas Ferguson presented public-opinion data demonstrating that Reagan’s crusade against activist government, which was widely understood to be the source of his popularity, was not, in fact, particularly popular. For example, when Reagan was re-elected in 1984, only 35 percent of voters favored significant cuts in social programs to reduce the deficit. Much excellent scholarship, well worth revisiting in the age of Trump, suggests an explanation for Reagan’s subsequent success at cutting back social programs in the face of hostile public opinion: It was business leaders, not the general public, who moved to the right, and they became increasingly aggressive and skilled in manipulating the political process behind the scenes.

But another answer hides in plain sight. The often-cynical negotiation between populist electioneering and plutocratic governance on the right has long been not so much a matter of policy as it has been a matter of show business. The media scholar Tim Raphael, in his 2009 book, “The President Electric: Ronald Reagan and the Politics of Performance,” calls the three-minute commercials that interrupted episodes of The General Electric Theater — starring Reagan and his family in their state-of-the-art Pacific Palisades home, outfitted for them by G.E. — television’s first “reality show.” For the California voters who soon made him governor, the ads created a sense of Reagan as a certain kind of character: the kindly paterfamilias, a trustworthy and nonthreatening guardian of the white middle-class suburban enclave. Years later, the producers of “The Apprentice” carefully crafted a Trump character who was the quintessence of steely resolve and all-knowing mastery. American voters noticed. Linda Lucchese, a Trump convention delegate from Illinois who had never previously been involved in politics, told me that she watched “The Apprentice” and decided that Trump would make a perfect president. “All those celebrities,” she told me: “They showed him respect.”

It is a short leap from advertising and reality TV to darker forms of manipulation. Consider the parallels since the 1970s between conservative activism and the traditional techniques of con men. Direct-mail pioneers like Richard Viguerie created hair-on-fire campaign-fund-raising letters about civilization on the verge of collapse. One 1979 pitch warned that “federal and state legislatures are literally flooded with proposed laws that are aimed at total confiscation of firearms from law-abiding citizens.” Another, from the 1990s, warned that “babies are being harvested and sold on the black market by Planned Parenthood clinics.” Recipients of these alarming missives sent checks to battle phony crises, and what they got in return was very real tax cuts for the rich. Note also the more recent connection between Republican politics and “multilevel marketing” operations like Amway (Trump’s education secretary, Betsy DeVos, is the wife of Amway’s former president and the daughter-in-law of its co-founder); and how easily some of these marketing schemes shade into the promotion of dubious miracle cures (Ben Carson, secretary of housing and urban development, with “glyconutrients”; Mike Huckabee shilling for a “solution kit” to “reverse” diabetes; Trump himself taking on a short-lived nutritional-supplements multilevel marketing scheme in 2009). The dubious grifting of Donald Trump, in short, is a part of the structure of conservative history.

Future historians won’t find all that much of a foundation for Trumpism in the grim essays of William F. Buckley, the scrupulous constitutionalist principles of Barry Goldwater or the bright-eyed optimism of Ronald Reagan. They’ll need instead to study conservative history’s political surrealists and intellectual embarrassments, its con artists and tribunes of white rage. It will not be a pleasant story. But if those historians are to construct new arguments to make sense of Trump, the first step may be to risk being impolite.

Rick Perlstein is the author, most recently, of “The Invisible Bridge: The Fall of Nixon and the Rise of Reagan.”

Tuesday, April 04, 2017


Who Was Psychology's First True Genius?

The real founder of psychology was not a psychologist.

Here’s a one-item test: “Who founded the science of psychology?”

One possible answer would be “William James,” who wrote the first psychology textbook, Principles of Psychology, in 1890. 


Douglas T. Kenrick, Ph.D., Psychology Today


You would get a few more points for answering “Wilhelm Wundt.”  Indeed, Wundt started the first formal psychology laboratory in 1879, at the University of Leipzig, and William James was initially inspired to study psychology when he read one of Wundt’s papers in 1868, whilst visiting Germany. 

But Wundt himself had started his career as a lab assistant to the man I would nominate as psychology’s first true genius: Hermann Helmholtz. 

Helmholtz made at least two great contributions to modern psychology:

1. He was the first to measure the speed of a neural impulse. (In doing so, Helmholtz completely overturned the previous assumption that nervous signals were instantaneous, traveling at an infinite speed.)

2. He advanced the trichromatic theory of color vision, brilliantly inferring that there were three different types of color receptors in the eye, which responded specifically to blue, green, and red (an inference that was proven true a century later).  This theory ran contrary to the view, popular only a few years before his time, that any kind of nerve cell could transmit any kind of information.  It suggested not only that different kinds of neurons transmitted different kinds of information, but that even within the visual sense, there were different kinds of information being sent along different neurons in the eye. 

Hermann von Helmholtz. From Wikimedia Commons, public domain.

There is one problem with identifying Helmholtz as psychology’s first genius: Helmholtz would not have defined himself as a psychologist. This is partly because there was no such field as psychology back in the early 1800s. Wilhelm Wundt was trained as a biologist, and William James as a philosopher, but both ended up defining themselves as psychologists. Helmholtz, on the other hand, started his career as a professor of physiology and, after dabbling in psychophysics for a while, switched his professional identity to become a professor of physics. His last years were devoted not to the scientific study of the mind but to thermodynamics, meteorology, and electromagnetism. Indeed, Helmholtz’s contributions to physics won him his widest acclaim, and they led the emperor to promote him to the nobility (hence his name became Hermann von Helmholtz).

Helmholtz’s life was not exactly a rags-to-riches story, but it was certainly a noteworthy case of upward mobility. His father was a schoolteacher who did not have the means to send his brilliant son to university to study physics. Instead, Helmholtz took advantage of a deal offered by the Prussian army: they would pay for his training in medicine if he agreed to serve eight years as an army surgeon after graduation.

Along the way to becoming a member of the aristocracy for his acclaimed accomplishments in physics, and to inspiring budding psychologists like Wundt and James, Helmholtz also invented the ophthalmoscope and wrote a textbook on optics that was widely used for half a century. While he was supposed to be studying Latin in high school, he was instead making optical diagrams under his desk. While he was in medical school, he found time to play the piano, read Goethe and Byron, and study integral calculus (Fancher & Rutherford, 2015). 

Let’s look more closely at what was so ingenious about this young polymath’s studies of neural impulses and his theory of color vision.

Clocking the speed of a neural impulse. 

What’s the big deal about measuring the speed of a neural impulse?  Well, before Helmholtz’s time, the experts believed that a neural impulse was instantaneous, traveling at infinite or near infinite speed. When a pin pricks your finger, on that view, your brain is immediately aware of it. Helmholtz’s own advisor, the brilliant physiologist Johannes Müller, explained this presumed immediate transmission as outside the realm of scientific study, an example of the operation of the mysterious “life force” that underpinned the activities of all living organisms. 

But Helmholtz and some of Müller’s other students believed there was no such mysterious force. Instead, they guessed that if you could shine a light on any process happening inside a living organism, you would discover merely the operation of basic chemical and physical events. As a young professor at the University of Königsberg, Helmholtz devised an apparatus that hooked a frog’s foot to a galvanometer, in such a way that a current passed through the frog’s thigh muscle would trigger a kick that turned off the electrical current. What he discovered was that when he zapped the frog’s leg closer to the foot, the twitch happened measurably sooner than when he zapped further up the leg. This timing difference let him estimate a definite speed: the signal seemed to be traveling along the nerves of the frog’s leg at about 57 mph. 
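
The arithmetic behind that estimate is simple: stimulate the nerve at two distances from the muscle, time the twitch in each case, and divide the extra distance by the extra delay. Here is a minimal sketch of that logic in Python; the distances and latencies below are invented for illustration and are not Helmholtz’s actual measurements.

```python
# Toy illustration of Helmholtz's two-point method (the numbers are invented
# for illustration, not his actual measurements): stimulate the nerve at two
# distances from the muscle, time the twitch in each case, and divide the
# extra distance by the extra delay.

def conduction_velocity(d_near_m, t_near_s, d_far_m, t_far_s):
    """Estimate conduction velocity (m/s) from two stimulation sites."""
    return (d_far_m - d_near_m) / (t_far_s - t_near_s)

# Hypothetical readings: stimulating 3 cm further up the leg delays the
# twitch by about 1.2 milliseconds.
v = conduction_velocity(d_near_m=0.02, t_near_s=0.0030,
                        d_far_m=0.05, t_far_s=0.0042)
print(f"{v:.0f} m/s (~{v * 2.237:.0f} mph)")   # -> 25 m/s (~56 mph)
```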

Then he repeated the study with living human beings. He taught his subjects to press a button as soon as they felt a poke to their legs. When he zapped the toe, it took longer for the subject to register it than when he zapped the thigh. Obviously, the toe is further from the brain, so this indicated that the neural impulse took measurably longer to register when it had to travel farther. This was amazing because people usually experience mental processes as happening instantaneously. And at the time, physiologists had been assuming that the underlying processes must also be instantaneous. Incidentally, if we were whales, it would take almost a full second for the brain to learn that a fish had taken a bite out of our tail, and another full second to send a message back to the tail muscles to swat the fish away. 
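
A quick back-of-the-envelope check of that whale claim, assuming a body length of roughly 30 metres and a conduction speed in the same range Helmholtz measured (both figures are rough assumptions, for illustration only):

```python
# Rough check of the whale example: how long would a signal take to travel
# the length of a large whale at a speed in the range Helmholtz measured?
body_length_m = 30.0    # assumed length of a large whale, in metres
speed_m_per_s = 25.0    # assumed nerve conduction speed (~57 mph)

one_way_s = body_length_m / speed_m_per_s
print(f"tail to brain: {one_way_s:.1f} s, round trip: {2 * one_way_s:.1f} s")
# -> tail to brain: 1.2 s, round trip: 2.4 s
```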

During the next century, psychologists made great use of this “reaction time” method, using it to estimate how much neural processing is involved in different tasks (doing long division or translating a sentence in our second language versus adding two numbers or reading the same sentence in our native tongue, for example).
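
A minimal sketch of how that subtraction logic is applied to whole tasks rather than nerve segments: compare the mean reaction time for a harder task with that of a simpler baseline, and read the difference as extra mental processing. The task names and times below are invented for illustration.

```python
# A sketch of the reaction-time ("mental chronometry") logic described above,
# with invented example numbers: the extra time a harder task takes over an
# easier baseline is read as extra mental processing.
mean_rt_s = {
    "read a sentence (native language)": 0.60,        # hypothetical mean RTs
    "translate a sentence (second language)": 1.45,
}

baseline = mean_rt_s["read a sentence (native language)"]
extra = mean_rt_s["translate a sentence (second language)"] - baseline
print(f"extra processing attributed to translation: {extra:.2f} s")  # -> 0.85 s
```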

The three kinds of color-detecting receptors in the eye

Johannes Müller, who was Helmholtz’s advisor, may have clung to an archaic belief in an instantaneously-acting life force, but he also championed some revolutionary new ideas, including the “law of specific nerve energies” – which was the idea that every sensory nerve conducts only one kind of information. Psychology historian Raymond Fancher points out that one traditional view before then was that neurons were hollow tubes capable of transmitting any kind of energy – color, brightness, volume, tone, even scent or taste or skin pressure.  But the new view was that each sense had its own separate neurons. 

The trichromatic theory suggested that it was more specific than that: the eye might contain three different kinds of receptors, each one transmitting information about a particular section of the spectrum. Helmholtz noted that all the different colors of the spectrum could be reconstructed by combining lights of three primary colors: blue, green, and red. If you shine a green light and a red light at the same spot, you will see yellow. If you shine a blue light and a red light at the same spot you will see purple, and if you shine all three colors, you will see white. Helmholtz inferred from this that perhaps the brain could determine which color you were looking at by integrating information from three types of retinal receptors. If the red receptors are firing away but the blues are silent, you are seeing bright red; if the blue and red are both firing at a moderate pace, you are seeing a dull purple; and so on. The idea had also been suggested earlier by the British physician Thomas Young, but Helmholtz developed it more fully. Today, the theory is called the Young-Helmholtz trichromatic theory.
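
A toy sketch of that inference (an illustration of the idea, not Helmholtz’s actual model): treat each receptor type as simply “active” or “quiet” and read the perceived hue off the combination.

```python
# Toy version of the trichromatic inference (an illustration of the idea,
# not Helmholtz's actual model): treat each receptor type as "active" or
# "quiet" and read the perceived hue off the combination.
def perceived_colour(red, green, blue, threshold=0.5):
    active = frozenset(name for name, level in
                       (("red", red), ("green", green), ("blue", blue))
                       if level >= threshold)
    names = {
        frozenset(): "black",
        frozenset({"red"}): "red",
        frozenset({"green"}): "green",
        frozenset({"blue"}): "blue",
        frozenset({"red", "green"}): "yellow",
        frozenset({"red", "blue"}): "purple",
        frozenset({"green", "blue"}): "cyan",
        frozenset({"red", "green", "blue"}): "white",
    }
    return names[active]

print(perceived_colour(0.9, 0.9, 0.1))   # red + green active -> yellow
print(perceived_colour(0.9, 0.9, 0.9))   # all three active   -> white
```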

Sensitivity of the three different color receptors in the eye. Original by author, including elements from Wikipedia Commons.
A century later, in 1956, a physiologist at the University of Helsinki named Gunnar Svaetichin found direct support for the trichromatic theory by using microelectrodes to record the signals sent by different cells in fish retinas. Sure enough, some were maximally sensitive to blue, some to green, and some to red. 

Even before this theory was directly supported, it had very important practical implications. Television screens trick the eye into seeing colors not by reproducing all the colors of the rainbow, but by using only three kinds of pixels: red, green, and blue. Tweaking the brightness of each of those three channels produces images that our brain perceives as bright orange, dull tan, sparkling turquoise, and lustrous lavender. 
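
For a concrete sense of the three-channel trick, here are approximate 8-bit red/green/blue values for the hues mentioned above; the triples are the standard CSS named colors, quoted as approximations.

```python
# Approximate 8-bit red/green/blue triples for the hues mentioned above,
# taken from the standard CSS named colours (values quoted as approximations).
screen_colours = {
    "orange":    (255, 165,   0),   # strong red, medium green, no blue
    "tan":       (210, 180, 140),   # all three channels, red-leaning
    "turquoise": ( 64, 224, 208),   # mostly green and blue
    "lavender":  (230, 230, 250),   # near-white, slightly blue-heavy
}

for name, (r, g, b) in screen_colours.items():
    print(f"{name:>9}: R={r:3d} G={g:3d} B={b:3d}")
```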

Psychophysics and the discovery of human nature

Thinking about Helmholtz and his fellow “psychophysicists” can make us aware of just how much we have learned about human nature in the last two centuries. Philosophers had debated a number of questions about how the mind maps the physical universe, but the psychophysicists were able to use new and rigorous scientific methods to actually answer some of these basic questions. Physicists developed the methods to precisely measure the changes in physical energy in sound waves and light waves, and the psychophysicists then developed methods to record how people’s experiences changed, or did not change, along with those physical changes. What they discovered was that what the human brain experiences is not everything that is happening in the world. Some forms of physical energy, like infrared light or ultra-high-pitched sound waves, are invisible to us but obvious to other animals (like bees and bats). Other forms of energy are highly salient to us but far less so to our pet cats and dogs, which have fewer kinds of color receptors than we do and see a much more limited range of colors (while living in a far richer world of smells). 
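
As an illustration of the kind of method involved, here is a sketch of one classical psychophysical procedure (the “method of constant stimuli,” which the article does not name), with invented data: present a stimulus at several fixed intensities, record how often the observer reports detecting it, and take the lowest intensity detected at least half the time as an estimate of the absolute threshold.

```python
# Sketch of one classical psychophysical procedure (the "method of constant
# stimuli", not named in the article), with invented data: present a stimulus
# at several fixed intensities, record how often it is detected, and take the
# lowest intensity detected at least half the time as the absolute threshold.
detection_rate = {   # stimulus intensity (arbitrary units) -> proportion detected
    1: 0.05,
    2: 0.20,
    3: 0.55,
    4: 0.90,
    5: 1.00,
}

threshold = min(i for i, p in detection_rate.items() if p >= 0.5)
print(f"estimated absolute threshold: intensity {threshold}")   # -> intensity 3
```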

Rosewood