Friday, May 27, 2016

WHAT PRESIDENTS TALK ABOUT WHEN THEY TALK ABOUT HIROSHIMA
By Alex Wellerstein, The New Yorker

On Friday, President Obama became the first sitting U.S. President to visit Hiroshima. After laying a wreath at Peace Memorial Park, several hundred yards from where the atomic bomb exploded, on August 6, 1945, he delivered a short address, with Prime Minister Shinzo Abe at his side. As the President’s spokesman, Ben Rhodes, promised earlier this month, Obama neither apologized for the attack nor relitigated the decision to drop the bomb. “Seventy-one years ago, on a bright, cloudless morning, death fell from the sky and the world was changed,” he said. “A flash of light and a wall of fire destroyed a city and demonstrated that mankind possessed the means to destroy itself.” If the fact of Obama’s presence was remarkable, his choice of words wasn’t. Hiroshima has played an important role in how Americans typically understand both the end of the Second World War and the origins of the Cold War, but it has shown up rarely in the speeches of most Presidents, and then usually as a generic invocation of the nuclear age—with heavy emphasis on the past tense and passive voice. (The bombing of Nagasaki, meanwhile, is either ignored or tacked on as an afterthought.)

Harry Truman was an exception to this rule. Although he learned of the existence of the bomb only after Franklin Roosevelt’s death and had essentially no part in its creation, his Presidency became inextricably linked with it. In the hours after the Hiroshima attack, his office issued a press release. Drafted by Arthur Page, the vice-president of A.T. & T. and a friend of the Secretary of War, and edited by a committee of civilian, scientific, and military leaders, it was written, as Page put it, with the goal of persuading the Japanese that “they might well capitulate to the power of the universe.” At the time, Truman was at sea, on his way back from the Potsdam Conference. He didn’t write the document, but he approved it. Since he never gave an official order to drop the bomb, this is the closest thing we have on record to signal that he ever assented to its use. Three days later, after the bombing of Nagasaki, Truman delivered a nighttime radio address. He didn’t write this one, either—Samuel Rosenman, a White House lawyer and speechwriter, did, with help from the poet Archibald MacLeish, who was then serving as the Librarian of Congress and the Assistant Secretary of State for Public Affairs. The text of the address reckoned with the “tragic significance of the atomic bomb.” But it also made a point of calling Hiroshima “a military base”—earlier drafts had said “purely a military base”—rather than what it was: a city with a military base in it.

Soon after the Second World War ended, the men who had been involved in the creation of the bomb, including General Leslie Groves, the head of the Manhattan Project, and Henry Stimson, the Secretary of War, developed a standardized account of the decision to attack Hiroshima. By 1955, when Truman published his memoirs, this account had acquired the sheen of orthodoxy. The gist was that Truman and his advisers had carefully weighed the question of using the bomb on Japan, ultimately deciding that it was the lesser of two evils, since it would prevent the need for a land invasion and save many American and Japanese lives. Most of the historians who have studied the topic in depth agree that this narrative is largely false. Truman himself was a fairly peripheral figure, busy with many other pressing matters, like the disposition of Soviet-occupied Europe. There was no White House debate about whether the bomb should be dropped, no singular decision to use it, no suggestion that it was an alternative to invasion. (The plan was to bomb and then invade, if necessary.) It would be wrong to call the shaping of the orthodox narrative an outright conspiracy, but there was certainly an element of collusion. Personal legacies and political careers were at stake. As early as the late nineteen-forties, strong voices had started questioning the wisdom of the Hiroshima attack, and they did not come from the places that we might expect today.

One source of dissent was Dwight D. Eisenhower. In 1948, he had just stepped down as Chief of Staff of the Army and was beginning his tenure as president of Columbia University. He published a wartime memoir that year, “Crusade in Europe,” which met with rave reviews. In the book, Eisenhower described a meeting in which he stated his misgivings about the use of the bomb to Stimson: “I expressed the hope that we would never have to use such a thing against any enemy because I disliked seeing the United States take the lead in introducing into war something as horrible and destructive as this new weapon was described to be.” The historian Barton Bernstein has concluded, after consulting as many corroborating sources as possible, that this discussion probably never took place and that Eisenhower likely misremembered it, perhaps in the service of making himself look like a morally centered military man. (It is true, though, that Eisenhower grasped the revolutionary power of nuclear weapons earlier than most of his fellow career officers. After the war, he set up a staff of young analysts to work through the implications and collaborate with government scientists.)

Eisenhower’s self-presentation was in keeping with the postwar statements of several other top military officials—a tinge of regret, a sense of skepticism about whether the bomb was necessary, or whether it even played the role in ending the war that people said it did. The U.S. Strategic Bombing Survey of 1946, for instance, concluded that “Japan would have surrendered even if the atomic bombs had not been dropped, even if Russia had not entered the war, and even if no invasion had been planned or contemplated.” Admiral William Leahy, in his memoirs, called the bomb “barbarous” and said that it provided “no material assistance in our war against Japan,” since the Japanese were “already defeated and ready to surrender.”

These critiques can seem shocking today, because they upset our understanding of how Hiroshima and Nagasaki map onto modern politics. We assume that Republicans, especially those in the military, are retrospectively pro-bomb, and that liberals see the attacks as something between a mistake and a war crime. But this interpretation removes the critiques from their historical context. Many commanders in both the European and Pacific theatres resented that the bomb got credit for ending the war. They saw their own strategic efforts, including the ruinous firebombing of at least sixty-seven Japanese cities, led by General Curtis LeMay, as being overshadowed by a scientific “gadget.” They feared that nuclear weapons would become an excuse to cut funding for conventional armed forces: if the bomb maintained the peace, who needed generals? (Their fears proved not entirely unfounded—Truman’s second Secretary of Defense, Louis A. Johnson, did try to slash military budgets—but they eventually learned to love the bomb.) When these leaders proposed that the attacks on Hiroshima and Nagasaki were unnecessary, they meant that they were unnecessary because Japan had already been bombed to dust. It was not a peacenik argument.

As President, Eisenhower remained mute on Hiroshima. He oversaw a rapid expansion of the U.S. nuclear arsenal, which grew from around twelve hundred warheads when he took office, in 1953, to more than twenty-two thousand when he left, in 1961—from the equivalent of five thousand Hiroshima bombs to the equivalent of more than a million at its height. Eisenhower, in other words, is an unlikely hero for opponents of nuclear weapons. After he left the Presidency, however, he made more critical statements on the bombings. In “Mandate for Change,” published in 1963, he wrote that, during the alleged meeting with Stimson, he had “been conscious of a feeling of depression,” and claimed that he had told the Secretary of War that “the dropping of the bomb was completely unnecessary.” In an interview with Newsweek from later that year, Eisenhower stated bluntly that “the Japanese were ready to surrender and it wasn’t necessary to hit them with that awful thing.”

Whether or not Eisenhower’s views jibe with the documentary evidence from the time, what is most curious is how inexpressible these same views would be for American politicians—much less Presidents—today. The politics of the present are defined far more by the events of the late Cold War and its aftermath than by the arguments of the nineteen-forties. In 1995, a group of veterans of the Second World War objected sharply, and effectively, to a planned exhibit at the Smithsonian Institution centered on the restored fuselage of the plane that dropped the Hiroshima bomb, the Enola Gay. Much of the original text of the exhibit dwelt on the suffering of the Japanese and on historical arguments that the exhibit’s detractors termed “revisionist.” (Eisenhower’s quotes about the bombing were among those they objected to.) The exhibit went forward, but in a considerably neutered state, focussing on the mechanics of the plane and carefully avoiding discussions of the human consequences. The controversy was less a debate about actual history than an extension of the mid-nineteen-nineties culture war into the nostalgic memories of the Greatest Generation. And the consequence was a polarization of opinion. Either you were for the bombings, or you were a revisionist: there was no middle ground. Not surprisingly, politicians have tended to play it safe.

Today, the polarization, at least among historians, seems to have abated considerably. There are still those who take strong views on whether the bombs should have been dropped, but the narratives themselves diverge less. Veterans no longer play as much of a role in the discussion: they are too few in number, and very elderly. It remains to be seen whether distance from the living past will open up a path to public consensus, or cause us to veer even further between the extremes of support and condemnation.

Monday, May 16, 2016

6 things you need to know about Iran's powerful Islamic Revolutionary Guards Corps

By Afshon Ostovar, Vox

Despite the historic nuclear agreement between the US and Iran, and talk of the rising political influence of Iran’s "moderates," who favor easing tensions with the US, the fact is that a core constituency in the Iranian regime remains vehemently opposed to warming relations with America.

That constituency is led by the Islamic Revolutionary Guards Corps (IRGC), the security and military organization that is responsible for the protection and survival of the regime. You can't really understand Iran without understanding the IRGC: why it was formed, what role it plays in Iran's foreign policy, and why it is so deeply anti-American.

Here, then, are six facts about the IRGC that help explain why Iran behaves the way it does — and why it's unlikely to change that behavior anytime soon.

1) The IRGC is not just a military — it also has huge political and economic power in Iran
In order to understand how the IRGC got to where it is, you need to know where it came from.

The 1979 Islamic Revolution in Iran was many things, but above all it was a reaction to American influence in Iran. Iran’s ruling monarch, Mohammad Reza Pahlavi, was considered a tyrant by the revolutionaries. They saw the shah’s oppressive policies as the bitter fruit of American influence. Armed and trained by America, Iran’s secret police and military were the blunt ends of that oppression.

The secret police was dissolved after the revolution, but the military was spared. The revolutionaries still didn’t trust it, however. To curb the military’s power and protect the revolution from a potential coup, they formed the IRGC.

Since its establishment in the midst of the 1979 revolution, the IRGC has gradually grown to become a pillar of the Islamic Republic. It is the designated defender of the Islamic Revolution, the guardian of the supreme leader, and the chief mechanism of Iranian coercive power.

The IRGC has a massive commercial arm that is involved in industrial construction, shipping, telecommunications, and media. It oversees Iran’s sensitive covert operations and much of its foreign intelligence gathering, from Latin America to Southeast Asia.

It controls Iran’s most important strategic deterrent, its ballistic missile program, and facilitates relations with Iran’s closest regional allies and clients, such as Lebanese Hezbollah, Shia militias in Iraq, and the regime of Bashar al-Assad in Syria.

2) The IRGC sees itself as the guardian of Iran’s Islamic Revolution and the supreme leader
The IRGC, whose name in Persian means "Army of the Guardians of the Islamic Revolution," was started by loyalists of Ayatollah Ruhollah Khomeini — the founder of the Islamic Republic and its first supreme leader. Its charge was to safeguard the revolution and Iran’s new governmental system of clerical rule.

The supreme leader was the heart of that system and its highest authority. Above all else, defending the revolution meant defending the supreme leader and the primacy of his rule.

From Khomeini to his successor, Ali Khamenei, Iran’s two supreme leaders have benefited from the IRGC’s support. The IRGC’s coercive power has underwritten the domestic policies and foreign ambitions of the supreme leader’s office.

Unlike Khomeini, who had a large following before the revolution, Ali Khamenei (Iran’s current supreme leader) lacked a natural constituency when he took power. The IRGC filled that void, and forged a tight, symbiotic relationship with the supreme leader. The IRGC has vigorously supported Khamenei’s policies, and in return he has preserved the IRGC’s unique and dominant position.

This doesn’t mean the IRGC gets whatever it wants. The supreme leader rules by balancing the interests of Iran’s main stakeholders — the elected government, the parliament, the clergy, the electorate, and the IRGC.

However, because of the IRGC’s proximity to the supreme leader and staunch advocacy for his station, the organization gets what it wants more often than not.

3) The IRGC’s aims are a mix of ideology and national self-interest
Safeguarding the role of the supreme leader and the centrality of Islam in Iran are the IRGC’s top priorities. To that end, the IRGC wants to preserve its status and political influence, because it believes itself to be the only organization in Iran that can truly defend the country’s Islamic system of governance.

The IRGC sees Western influence as the revolution’s fiercest challenge. Democracy is viewed as a problematic outgrowth of Western influence, which is why the IRGC fears pro-democratic populist movements that aren’t outwardly supportive of the regime and conservative Islamic values.

To the IRGC, the intrusion of Western influence is a form of cultural warfare that it must fight at every turn. Within Iran, it does this mainly through its massive volunteer militia, known as the Basij. Local branches of the Basij exist at every level of schooling (primary through university), in all government offices and businesses, and in neighborhood mosques throughout Iran.

At one end of the spectrum, the Basij operates like an Iranian version of the Boy Scouts or Girl Scouts, in that its aim is to develop young people into proper, moral citizens who are religiously devout and devoted to the supreme leader.

At the other end, the Basij operates more like a political pressure group. The Basij has used violence and coercion to disrupt pro-democratic popular movements, combat political unrest, and promote a rigid social order. During the 2009 protests, for example, it was blamed for some of the worst violence, including the killing and rape of scores of protestors.

Outside Iran, the IRGC’s mission is more complex. It wants to purge the Middle East of American influence, which means both compelling the retreat of American forces from the Persian Gulf and seeking the defeat of the state of Israel. It is also deeply mistrustful of Iran’s Arab neighbors — especially Saudi Arabia — for their close ties to Washington.

Riyadh’s pro-American stance and its promotion of a virulently anti-Shia form of Sunni Islam underpin Iran’s tensions with the Saudi kingdom. As those tensions have helped fuel Saudi and Iranian involvement in regional wars (in Syria, Iraq, and Yemen), the IRGC has become a central node in a simmering Iranian-Saudi conflict.

To rid the Middle East of American influence, and to counter Saudi Arabia’s regional interests, the IRGC relies on a robust network of client armed groups. Hezbollah, Iraqi Shia militias, various Palestinian groups (including Hamas), and Yemen’s Ansarallah (also known as the Houthis) are all backed by the IRGC to further Iranian interests across the region.

Nowhere is this more apparent than in Syria’s civil war, where the IRGC has enlisted foreign militants and its own soldiers to defend the Assad regime against the largely Sunni rebellion backed by Saudi Arabia, Qatar, Turkey, and the United States.

4) The bitter experience of the Iran-Iraq war — and US support for Iraq during it — still looms large for the IRGC
The IRGC was a ragtag militia manned by poorly trained, under-armed recruits when Saddam Hussein invaded Iran in September 1980. The Iran-Iraq war lasted nearly eight years and devastated Iran. Hundreds of thousands of Iranian troops were killed.

The United States supported Iraq during the war, which Iran saw as an American plot to destroy the fledgling revolution. Iran blames America for providing intelligence on Iranian positions to the Iraqis, enabling Saddam’s use of chemical weapons, and pressuring Iran through the use of US naval power in the Persian Gulf.

The downing of Iran Air Flight 655 by the USS Vincennes, which killed 290 civilians, is still fresh in the minds of Iran’s war veterans. Regardless of how the accident occurred, to the IRGC and Iran’s leadership it was an act of war, and a harbinger of escalations to come if the war continued. Unable to fight both Iraq and the US, Iran’s leadership felt compelled to end the war with Saddam.

Iran’s Arab neighbors — with the exception of Syria — all supported Saddam as well. Saudi Arabia and Kuwait, both American allies, provided much of the financing for Iraq’s war effort. The IRGC — and Iran’s leadership more broadly — holds a deep grudge against the United States, Saudi Arabia, and others as a result of their support for Saddam Hussein.

The vast majority of the IRGC’s current leadership got their start as volunteers in the Iran-Iraq war. They bear all the scars of that conflict, but also the pride and confidence of having survived it. The IRGC and its Basij popular militia wing lost more lives than any other force during the war — a fact that has not been lost on them.

Since the end of the war, the IRGC has shaped its politics to ensure that the sacrifices of the war veterans are not overlooked and that the causes they fought for (the preservation of the revolution and its Islamic ideals) are not compromised by things such as the spread of popular democracy, the weakening of social mores, and the normalization of relations with America.

5) The IRGC sees sinister motives behind nearly all US actions in the Middle East
There was a brief opening after 9/11 when the IRGC flirted with cooperating with the United States. Qassem Suleimani, the commander of the IRGC’s covert special operations wing, known as the Quds (Jerusalem) Force, greenlit limited intelligence sharing with US diplomats in the buildup to the US invasion of Afghanistan. The IRGC had almost gone to war with the Taliban in 1998, and was somewhat open to assisting America’s war on their mutual enemy.

The collaboration was short-lived.

President George W. Bush’s "axis of evil" speech tainted the budding relationship, and was seen as a direct betrayal by Suleimani. It was confirmation to the IRGC and the supreme leader that the United States would never change its stance toward the Islamic Republic, and could never be trusted.

Instead of cooperating with the US, the IRGC worked to undermine the US occupations of Afghanistan and Iraq. The IRGC helped train and arm anti-American Afghan militants, and led a covert ground war against US influence in Iraq. The IRGC’s control of Iraqi militias and influence over Iraqi politicians contributed to the eventual departure of US forces from the country.

The history of mistrust similarly underlies the IRGC’s understanding of US policies in Syria and Iraq today. From the IRGC’s perspective, the US war against ISIS is a smokescreen designed to conceal America’s efforts to uproot Iranian influence in the Levant by toppling Iran’s longstanding ally, the Assad regime, weakening Lebanese Hezbollah, and destabilizing the region to create a pretext for the return of US forces to Iraq.

6) The IRGC matters more to US policy than most care to admit
The IRGC is an inextricable part of Iranian foreign and strategic policy. It is second only to the supreme leader in its influence over the strategic arena and plays an outsize role in foreign policy through its relationships and on-the-ground investments.

The supreme leader sits atop Iranian foreign and strategic policy, and sets guidelines for both. Under the leader, the elected government takes the lead in managing foreign policy. Similarly, strategic policy is the purview of the Supreme National Security Council (on which the president, the IRGC commander, and several other institutional heads sit).

The government has a lot of influence over policy. It runs the diplomatic corps and sets the tone for Iran’s foreign policy writ large. However, in Iran’s most important relationships — in Lebanon, Iraq, and Syria — the IRGC sets the parameters for Iranian policy. This is because the heart of Iran’s influence in those countries is the IRGC’s relationships with foreign leaders and client armed groups.

The US government interacts with Iran’s elected government, which serves as the face of the Islamic Republic in international affairs. This gives the appearance that the government’s policy preferences are backed by the whole of the regime, but often they are not. The government speaks for the regime in official discussions, but outside of negotiations like the Iran deal, the government’s predilections are not necessarily shared by the IRGC.

The issues that are of central concern to Washington and Iran’s neighbors are generally those dominated by the IRGC. From the ballistic missile program to Iranian activities across the Middle East and beyond, the IRGC controls much of the strategic space that worries the West and Iran’s rivals the most.

The supreme leader maintains ultimate authority over strategic and foreign policies, including covert activities, but he cedes much of the decision-making to the IRGC so long as its decisions are not deemed harmful to Iran’s broader interests. Although the supreme leader and IRGC disagree at times, they are in agreement more often than not.

This is a challenge for the US because the IRGC is central to all of the regional issues that Washington is currently deeply engaged in (Syria, Iraq, Yemen, Israel) and to regional security in general.

In order for US-Iran relations to ever transcend mutual antagonism, the IRGC will need to either abandon the investments and activities that have made it so formidable or move beyond its fundamental anti-Americanism. That is unlikely to happen. What the future holds is less likely to be a US-Iranian rapprochement than the continued management of tensions and disagreements.

The nuclear deal opened the door for Iran’s return from isolation. More than ever before, Iran is in control of its own destiny. If it wants peace and prosperity, it will need to rethink its priorities — starting with those of the IRGC.

Afshon Ostovar is the author of the new book Vanguard of the Imam: Religion, Politics, and Iran’s Revolutionary Guards. He is a senior analyst at the Center for Strategic Studies at CNA, and will be moving on to the Naval Postgraduate School in the summer.

Monday, May 09, 2016

Sensory Studies 

Feel Me: What the new science of touch says about us.

By Adam Gopnik, The New Yorker

Our skin is no neutral envelope; it is a busily sensing organ that situates us in relation to others and the world.
On a bitter, soul-shivering, damp, biting gray February day in Cleveland—that is to say, on a February day in Cleveland—a handless man is handling a nonexistent ball. Igor Spetic lost his right hand when his forearm was pulped in an industrial accident six years ago and had to be amputated. In an operation four years ago, a team of surgeons implanted a set of small translucent “interfaces” into the neural circuits of his upper arm. This afternoon, in a basement lab at a Veterans Administration hospital, the wires are hooked up directly to a prosthetic hand—plastic, flesh-colored, five-fingered, and articulated—that is affixed to what remains of his arm. The hand has more than a dozen pressure sensors within it, and their signals can be transformed by a computer into electric waves like those natural to the nervous system. The sensors in the prosthetic hand feed information from the world into the wires in Spetic’s arm. Since, from the brain’s point of view, his hand is still there, it needs only to be recalled to life.

Now it is. With the “stimulation” turned on—the electronic feed coursing from the sensors—Spetic feels nineteen distinct sensations in his artificial hand. Above all, he can feel pressure as he would with a living hand. “We don’t appreciate how much of our behavior is governed by our intense sensitivity to pressure,” Dustin Tyler, the fresh-faced principal investigator on the Cleveland project, says, observing Spetic closely. “We think of hot and cold, or of textures, silk and cotton. But some of the most important sensing we do with our fingers is to register incredibly minute differences in pressure, of the kinds that are necessary to perform tasks, which we grasp in a microsecond from the feel of the outer shell of the thing. We know instantly, just by touching, whether to gently squeeze the toothpaste or crush the can.”
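Tyler’s description suggests a simple pipeline: pressure readings from the hand’s sensors are converted by a computer into stimulation intensities for the nerve interfaces. Below is a minimal Python sketch of that mapping; the channel count, pressure range, and linear encoding are illustrative assumptions, not the Cleveland team’s actual parameters.

```python
# A minimal sketch of the sensing-to-stimulation loop the article describes:
# pressure readings from the prosthetic hand are mapped, channel by channel,
# into stimulation intensities for the nerve interfaces. The channel count,
# ranges, and the linear mapping are illustrative assumptions only.

NUM_SENSORS = 14       # "more than a dozen pressure sensors" (assumed count)
MAX_PRESSURE = 40.0    # assumed sensor range, in pounds, borrowed from the
                       # hand's stated forty-pound grip capability
MAX_STIM = 1.0         # normalized peak stimulation intensity

def pressures_to_stim(pressures):
    """Map raw sensor pressures (lbs) to per-channel stimulation intensities."""
    stim = []
    for p in pressures:
        p = min(max(p, 0.0), MAX_PRESSURE)        # clamp to the sensor range
        stim.append(MAX_STIM * p / MAX_PRESSURE)  # simple linear encoding
    return stim

# e.g., a light grasp registering mostly on the thumb and index sensors:
readings = [0.8, 1.1] + [0.0] * (NUM_SENSORS - 2)
print(pressures_to_stim(readings))
```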

With the new prosthesis, Spetic can sense the surface of a cherry in a way that allows him to stem it effortlessly and precisely, guided by what he feels, rather than by what he sees. Prosthetic hands like Spetic’s tend to be super-strong, capable of forty pounds of pressure, so the risk of crushing an egg is real. The stimulation sensors make delicate tasks easy.

Spetic comes into the lab every other week; the rest of the time he is busy pursuing a degree in engineering, which he has taken up while on disability. The researchers try to use their time with him energetically, so there is an excited murmur while the experiments go on—shoptalk conducted mostly in acronyms and initials. It is perfectly possible to hear a sentence beginning “One of the difficulties about being the P.I. on a DARPA-funded study, post I.R.B. . . . ” and see gentle nods of agreement. Though Spetic is an industrial worker, he has been in the study long enough to have absorbed the language of the investigators, and he now speaks easily of “the double-blind data” and “following the expanding parameters of the experiment.”

Spetic, burly and broad-faced, with the quietly powerful look of someone accustomed to working hard with his arms and hands, is undertaking a new set of tests: with no prosthesis on at all, simply by willing the nerves—in what is crudely called his stump, what is politely called his residual—he is manipulating a virtual hand in a virtual space, represented on a flat screen in front of him. He wills the movement in his head, the muscles in his arm respond, and the hand on the screen moves, too, reaching out and grasping the ball.

“Turn the stim on,” he says, almost longingly. An experimenter raises an eyebrow—protocol stipulates that the subject should not know when the stim is turned on—but he does, and immediately Spetic begins to pick the ball up easily. “I can feel it in my thumb and my fingers,” he says. Then he corrects himself: “In this space.” Tyler whispers to an observer, “He began saying ‘the’ thumb or ‘the’ finger. Now he says ‘my’ thumb, ‘my’ finger!”

Touch is not a one-way deduction of sensation but a constant two-way interchange between what Tyler calls the “language” of sensation and the raw data of reception. “What we’ve discovered is that the language of touch is what matters most,” he says. “When we first fed the stimulus in, Igor only felt a tingle. The question was, how do we go from tingle to touch? By analogy, think of what we do with pure sound.” Tyler stops and makes a kind of inarticulate cry. “I make a noise, but there’s no information in it. Break it up in the right way, and it’s words. That’s what happens when you have epilepsy—it’s a kind of constant brain sound. But the healthy body works with patterns of information. And there’s a narrow window within which the body interprets. Shouting ‘Baaah!’ is not very different from talking sense.”

Tyler’s lab is a hive of busy graduate students and assistants, monitoring screens and fiddling with cables. Tyler glides among them encouragingly. “When we started,” he says, “we couldn’t get past the tingle. We couldn’t make the tingle become touch. There’s a nerve called the digital nerve, and it’s superficial, close to the skin, so we hooked me up to the stimulus, and I started to feel sensation. It took forever, ninety-eight per cent failure and two per cent success. There were so many things to vary! But finally one pattern emerged: a sinusoidal envelope, modulating at one hertz, that fits within the biological range of rhythm and change. Tighten the wave, and tingle becomes touch. It may be coincidence, but that wave, the one that communicates touch, is just around the rhythm of a heartbeat, a sort of essential bodily beat.”
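Tyler’s account of the pattern that turned tingle into touch—stimulation pulses shaped by a sinusoidal envelope modulating at about one hertz—can be sketched as a waveform. The Python below generates such a signal; the carrier pulse rate, duration, and the power-law “tightening” are illustrative assumptions, not the lab’s published parameters.

```python
import numpy as np

# A minimal sketch of the stimulation pattern Tyler describes: a train of
# stimulation pulses whose intensity is shaped by a slow sinusoidal envelope.
# All concrete numbers here (pulse rate, duration) are assumptions.

FS = 10_000          # samples per second for this simulation
DURATION = 3.0       # seconds of stimulation to generate
PULSE_RATE = 100.0   # pulses per second (assumed carrier rate)
ENVELOPE_HZ = 1.0    # the ~1 Hz sinusoidal envelope from the article

t = np.arange(0, DURATION, 1.0 / FS)

# Carrier: an idealized pulse train (a rectified square wave at the pulse rate).
carrier = (np.sin(2 * np.pi * PULSE_RATE * t) > 0).astype(float)

# Envelope: a sinusoid at ~1 Hz, scaled to [0, 1], so pulse intensity slowly
# rises and falls at roughly the rhythm of a heartbeat.
envelope = 0.5 * (1 + np.sin(2 * np.pi * ENVELOPE_HZ * t))

stim = carrier * envelope  # modulated pulse intensity over time

# "Tightening" the envelope -- here modeled, speculatively, by raising it to a
# power so its peaks narrow -- is one way to picture the change that reportedly
# turned a diffuse tingle into something felt as touch.
tight_stim = carrier * envelope ** 3
```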

The day wears on; Igor Spetic gets a little sad. “I hate to go,” he says, pausing in the doorway and looking back. “When I leave this room, I leave my hand behind.”

I started thinking about how touch happens when something buzzed in my pocket that wasn’t there. Sometimes we think we’re going crazy when we’re actually in tune with our time and in synch with our fellows. We go to watch a high-fidelity, high-frame-rate movie, think it looks eerily like a local television news show from our childhood, and discover that this is a well-noted phenomenon, called the “soap-opera effect.” We feel a strange compulsion to leap off a high cliff, and discover that it’s the high-place phenomenon, and that, far from a death wish, it may be a backward phenomenon of self-recording: we come to the edge, instantly retreat, and then our brain explains our actions to us and retrospectively reorders our memory to believe that we must have actually been thinking of jumping. And we see a blue-and-black dress and think it’s white and gold, and everybody else in the country has the same problem.

Or we begin to get the jumps at feeling a cell phone vibrate that isn’t there. I’d feel a distinct, small buzzing, would reach down and—nothing. I thought maybe some nerve ending in my thigh had become so habituated to the vibration that it had gone into permanent iPhone spasm. In fact, as the neuroscientist David Linden explained to me, it involves a predictable misread by something called a Pacinian corpuscle.

“The phantom cell phone is such a widespread thing,” Linden says. We were speaking in his office at Johns Hopkins University, in Baltimore. “I think something like ninety per cent of college students report it at one time or another. Something else stimulates the Pacinian—one of the sense receptors in your thigh—and the skin says, ‘Oh, it must be that damn cell phone again!’ It’s a nice example of how our entire skin is a sensing, guessing, logic-seeking organ of perception, a blanket with a brain in every micro-inch. So any vibration near the pocket, and the system organizes it in advance, and interprets it as the buzz of your phone.”

Linden’s original research involves glow-in-the-dark neurons in mouse brains—he manipulates the mouse’s DNA, allowing its neural pathways to shine under blue light like psychedelic poster patterns—but he has written at length about the science of touch and has become widely expert in the field. For Linden, it is where the tingle is. Only recently has brain science fully grasped that skin and touch are as rich and paradoxical as any other part of our humanity. Touch is the unsung sense—the one that we depend on most and talk about least. We know the illusions that our eyes or ears can create. But our skin is capable of the same high ordering and the same deceptions. It is as though we lived within a five- or six-foot-tall eye, an immense, enclosing ear, with all an eye or ear’s illusions, blind spots, and habitual mistakes. We are so used to living within our skins that we allow them to introduce themselves as neutral envelopes, capable of excitation at the extremities (and at extreme moments), rather than as busy, body-sensing organs. We see our skins as hides hung around our inner life, when, in so many ways, they are the inner life, pushed outside.

“More papers have been published on the molecular and cellular basis of touch in the past decade than in the past century,” Linden says. “Over the past fifty years, there have been probably a hundred papers about vision for every paper about touch in the scientific literature. Part of that is that vision is more accessible to our experience. People go blind often. But almost no one is touch-blind—the fact that you have to say ‘touch-blind’ is a hint of the problem. Being touch-blind isn’t compatible with life. There are no national foundations for the hard-of-touch.”

David Ginty, a neuroscientist at Harvard Medical School who studies the “low-threshold mechanosensory neurons” that allow our brains to interpret touch, emphasizes the breakthroughs in animal models that have led to what he calls a renaissance in touch science. “For the basic research, it was the conquest of mouse genetics,” he says. “Rodents as animal models have come of age, and our ability to bring modern molecular-genetic approaches to age-old questions on somatic sensation is now incredibly powerful.” He goes on to explain how mouse genes allow us to explain human touch: “We can turn an itch system off or turn it on. We’re interested in the sensory neurons that innervate the skin. And we try to make sense of the complexity: Why are there so many kinds of sensory neurons? What do they do? How are they integrated to give rise to the perception of a touch?”

The world of tactile research is divided into a bewildering variety of names and specialties—haptics, prosthetics, somatosensory studies, haptic feedback prosthetics, and on and on—but they all have in common the relations between our skin and our sense of ourselves. Linden believes that, among all the new discoveries about touch and haptic sensation, the most important are the least generalized. Startlingly specific touch systems, or “labelled lines,” as they are called, have been identified. “Each time we study the touch system more deeply, we realize that it is more specialized than we’d known,” Linden says. “These systems aren’t usefully understood just as different cognitive responses to the same stimuli—they’re completely different integrated systems. There are separate labelled lines for so many seemingly intermingled systems.” The difference between “affective” touch—a loving caress—and other kinds, like a threatening or a clinical grope, involves two different sensing systems working in close concert.

Still more strikingly “specific” work is being done, down the hall from Linden’s office, by another Hopkins neuroscientist, Xinzhong Dong. Dong is the Einstein of itch, the scientist who established that the itch system qualifies as a labelled line, with dedicated neurons of its own. A native of China, he speaks a clipped, intense, and amiable English. “People used to assume that itch was just small pain, the little brother of pain,” he says. “But not so. It’s a separate system loaded by itself. There’s a lot of debate about how itch and pain are coded in the sensory neurons. A few years ago, we discovered a group of cells that function as a specific itch receptor. And that was a breakthrough.”

Dong bred mice whose gene for the suspected receptor was turned off. But, to test the itch system, a reliable means of making mice itchy was required. “Many bodybuilders develop severe itch,” he said. “If you go online to bodybuilders’ sites, you can find a drug they take to prevent acid buildup. Without muscle fatigue, they say it feels like a thousand mosquito bites! So we tried it in mice and they scratched very robustly. And the ones without the receptors become insensitive. So that showed the receptor we found was the right one. For itch we have very dedicated behavior. It’s really cool. We inject a chemical into a face. If it’s painful, the animals use a front paw to gently rub it. If you inject an itchy substance, they use a hind leg to scratch. Almost always animals use their hind paw to scratch. So we can tell if they are itchy or painful.”

In videos, you see the difference: mice delicately pawing their faces in mild pain; mice scratching fiercely at an itch—two separate systems, turned on and off like porch lights. Even more, the experiments suggest an odd asymmetry between the two systems. You can trade pain for itch, Dong points out: that’s why mice and men both scratch. But it won’t work the other way around: you can pain your itch, but you can’t itch your pain. A signature of itch is that it’s specific to the skin. Your bones can ache, but they can’t itch. In still one more experiment, Dong made his itch-specific fibres fluorescent. They appeared, as expected, only in the skin.

Why should itch be so catchy? Why should itch be, as it were, pre-installed and so neatly differentiated from pain? Several theories present themselves. The most probable is that it arises from the paramount adaptive need for animals to guard against parasites, which are more likely to produce itch than pain. If we put insect bites on a dimension measured in pain, they would not register sufficiently or at all. There could be survival value in being able, so to speak, to tell a bug up the ass from a pain in the rear.

One strange thing about the unsung sense is that it has no songs. Every other sense has an art to go with it: the eyes have art, the ears have music, even the nose and the tongue have perfume and gastronomy. But we don’t train our hands to touch as we train our eyes to look or our ears to listen. Every now and again, someone comes up with a “touch museum” or starts a program for the visually handicapped to experience art through their fingers. But such enterprises often have a hopeful, doomed feeling to them: they seem more willed than wanted.

Is it possible that the absence of tactile art is a mere accident of history? The historian Constance Classen reminds us that in the eighteenth century touching the objects in proto-museums—cabinets of curiosities and amateur collections—was invited and expected and even, in a way, compulsory. “When the underkeeper of the Ashmolean in 1760 tried to prevent a museum visitor from handling artifacts he was accused of incivility,” she writes, in “The Book of Touch,” an anthology of writings on the tactile. The current reign of the optical museum—where all the objects are shut away, even ones that demand to be touched to be understood at all, like scientific or musical instruments—is, Classen shows, in “The Deepest Sense,” a cultural history of touch, a recent one, due to “the association of touch with irrationality and primitivism.” The museumgoer who touched was a woman or a child; the patriarchs shut things up in cases and then looked at them imprisoned.

Of course, there may be more insurance than episteme in this change: when ten people a week come to see your Greek bust, letting them caress it is one thing; when ten thousand come, it is something else. And, indeed, one of the ways in which the ten still distinguish themselves from the ten thousand is that they are allowed to touch the objects: seeing and handling art objects out of their frames and cases is one of the perks of becoming an art professional. (Art pros will often, perhaps unconsciously, talk or even brag about handling a famous thing—“I saw ‘The Scream’ without its frame and held it up!” “The Jasper Johns flashlight was actually in my grasp and I got a sense of its magic!”—to assert their authority.)

In the absence of art, touch turns easily to entertainment. The high-water mark of the touch world can be found at the haptics conferences that fill the calendar of hapticians everywhere, most notably the Institute of Electrical and Electronics Engineers’ annual Haptics Symposium, which this spring was held at a hotel in downtown Philadelphia, on a perfect April weekend. Since the upper hall of the hotel is eerily like a high-school gym, one can get the impression of being at a science fair to which only really smart kids can submit projects. It helps the effect that, haptics engineers being professionally unpretentious, they customarily refer to their innovations as “incredibly cool,” as in “Did you see the locating device they developed at M.I.T.? It’s incredibly cool!” An I.E.E.E. haptics fair is exactly what Ben Franklin would have dreamed of for American science—practical-minded, eccentric, and, as with bifocals, solving problems that one was not entirely aware were problems until an inventor found a solution to them.

The crowd includes the usual engineering types—Midwesterners, Asian-Americans, Asians from Asia—and, in a historically male-dominated discipline, a surprising number of women. There are also numerous special visitors from Apple and Google, extremely anxious about saying too much about what, exactly, they’re looking for, the wrong word likely both to spill the beans to the competition and to boost undue speculation about somebody’s startup. The air crackles with the distinctive combination of altruism and entrepreneurialism which governs the tech world.

Many and cool are the devices on offer: a “Novel Vibrotactile Feedback Assisted Mid Air Writing device”; a “New Wearable Fingertip Haptic Interface for the Rendering of Virtual Shapes and Surface Features.” And here is the Animotus, designed by Adam Spiers, of Yale, and intended “to communicate proximity and heading to navigational targets”; it’s a small white two-story cube that sits innocently in your hand, willfully changing shape as—Wi-Fi’d or Bluetoothed to a G.P.S. system—it nudges and pushes you in the right direction down streets and around corners and up alleys, leading you with silent efficiency to whatever destination you have entered. It is like having a tiny guide dog in the center of your hand, nudging your palm with his tongue. (Eventually, it might be connected to an obstacle-spotter, so that it actually could replace those guide dogs for the blind.) Another new haptic device allows for long-distance Swedish massage; created by a team of Mexican engineers, it allows the masseuse to simply wave her hands over a motion sensor, which reproduces the precise sensitivities of her touch on the back of a patient lying on a pinpoint-tuned motion-sensor pad. Swedish masseuses would no longer have to leave Sweden; they could stay in Stockholm and e-mail massages anywhere in the world.

The attendees like to assure you, and one another, that it is only in the past few years that they have really put the happy in haptics. The haptics devices that most of us are familiar with are the simple ones that make a controller vibrate when the assassin is killed in Assassin’s Creed or the defenseman crunches a forward in NHL 16. The new generation of haptics-makers tend to be a little embarrassed by these primitive devices, which they have been known to refer to as “joy buzzers” or even “whoopee cushions.” A standard trope in an I.E.E.E. demo is to place the old trembly technology beside the new, sleek and persuasive full-range touch illusion.

William Provancher, formerly a professor of mechanical engineering at the University of Utah, now runs a startup called Tactical Haptics, and had the hit demo of the conference. He can create astonishing touch illusions using simple gaming controls. With the HTC Vive—those virtual-reality goggles—he conjures a vast, empty white skin of space, stretching out to every horizon. Life-size zombies come at you from zombie-style holes that expand within the white sheets, like the resurrected dead in Signorelli’s painting of the Last Judgment. Armed only with a bow and arrow—though what you are actually clutching is a controller with a trigger, shaped more or less like a gas-pump nozzle—you can feel the tension on your virtual bow as you release the arrow, and then the flutter of the arrow and the thunk of the ground trembling when the arrow strikes an onrushing zombie and he falls.

Heather Culbertson, now a postdoc at Stanford, worked at Penn in its famous GRASP lab—the acronym stands for General Robotics, Automation, Sensing, and Perception—and she has returned to Philadelphia to show off her own invention. It is a haptic system that can create the illusion of a hundred distinct textures when you hold it and drag it against a neutral surface. Metal mesh, metal shelving, sandpaper, linoleum, bubble wrap, cardboard, coffee filter, painted brick: holding a pen-shaped utensil in your right hand, you touch the desired texture’s name and then drag the utensil across a countertop, say, and in your fingers you feel exactly the sensation that you would feel if the tool were being dragged across the material you specified. You feel wood; you feel brick; you feel paper. More astonishing, the virtual textures change in feeling, as real ones do, depending on the force and speed with which you move the tool across them.

The Queen of Haptics is Katherine J. Kuchenbecker, the brilliant Stanford-trained engineer who oversees the haptics group at the GRASP lab and supervised Culbertson’s work. The daughter of a developmental psychologist—and, one is not surprised to learn, a member of the Stanford volleyball team that twice won N.C.A.A. titles—she recognizes the gratifyingly large number of women engineers in haptics. (It was Kuchenbecker who trained Culbertson, then passed her on to her own supervisor, the formidable Allison Okamura, at Stanford.) She is understandably reluctant to say that women study feelings better because they have more of them than men, but then she more or less says it. “We have a long tradition of women as team leaders in haptics,” she volunteers—the founder of the GRASP lab is a legendary roboticist named Ruzena Bajcsy—“and I think it’s fair to say that women are drawn to areas of engineering with obvious human interface. Places where what you’re doing obviously reaches people, touches them, you might say.”

She likes the potential of haptic devices to serve both pros and amateurs. Heather Culbertson’s tool allows designers to choose fabrics at a distance and someone searching for clothes online to feel the linen of a summer shirt while sitting at her computer.

“What Heather and I did was to take a haptic camera—a touch-based camera—and a swatch of material, and record ten seconds of interaction, dragging the tool back and forth, fast and then slow, light and then heavy,” she explains. “But the key to creating a compelling illusion that you’re touching a real object is that the sensations you feel match all the motions that you make. So we cut that recording up into tiny pieces, fifty milliseconds or a hundred milliseconds of touch, so that we got the minute details right—exactly what you felt on canvas when you moved fast but pushed lightly, and the next time, when you were going slower but pushing harder.”

The illusion of texture arises when the vibration pattern is played back. The sensing stylus you hold, which resembles a very fat ballpoint pen with a cable attached to its rump, transmits patterned vibrations to your fingers. In a way, it’s something like the needle in the groove of an old-fashioned vinyl album, only it plays back into your fingers rather than into your ears.

“When you change how hard you are pressing or how fast you are moving, the spectrum of the vibration waveform changes to match the spectral changes we measured during the original data recording,” Kuchenbecker says. “It’s like recording a certain natural sound, like a waterfall, and then being able to generate a synthetic sound that sounds the same but goes on forever and never repeats, so it’s not just a looped recording. The trick is that we constantly change the properties of the waveform to match the exploration conditions, like adjusting how fast the waterfall seems to be flowing. And it creates a fluid, moving, three-dimensional illusion of texture.” Choose your texture, drag the tool across nothing, and you feel touch plus time, which is all that texture is.
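As a rough sketch of the playback scheme Kuchenbecker describes, one can imagine cutting a ten-second recording into fifty-millisecond segments, each tagged with the speed and force at which it was captured, and then, during rendering, playing back the segment whose conditions best match the user’s current motion. The Python below illustrates the idea with a simple nearest-neighbor lookup; the real system matches the spectrum of the vibration waveform to the exploration conditions, and every class name and number here is an assumption for illustration.

```python
import numpy as np

# A simplified sketch of data-driven texture playback: a texture recording is
# cut into short segments, each tagged with the drag speed and normal force at
# which it was captured; during rendering, the segment captured under
# conditions closest to the user's current speed and force is played back.
# The published method matches vibration *spectra*; this nearest-neighbor
# lookup is an illustrative simplification.

SEGMENT_MS = 50  # segment length from the article: 50-100 ms of touch

class TextureModel:
    def __init__(self):
        # Each entry: (speed mm/s, force N, vibration samples for one segment)
        self.segments = []

    def add_recording(self, speeds, forces, vibration, fs=2000):
        """Cut a recorded drag into tagged segments."""
        n = int(fs * SEGMENT_MS / 1000)
        for start in range(0, len(vibration) - n, n):
            sl = slice(start, start + n)
            self.segments.append(
                (np.mean(speeds[sl]), np.mean(forces[sl]), vibration[sl])
            )

    def render(self, speed, force):
        """Return the vibration segment recorded nearest to (speed, force)."""
        def distance(seg):
            s, f, _ = seg
            return (s - speed) ** 2 + (f - force) ** 2
        return min(self.segments, key=distance)[2]

# Usage: feed in one ten-second recording, then query as the user moves.
fs = 2000
t = np.arange(0, 10, 1 / fs)
model = TextureModel()
model.add_recording(
    speeds=50 + 40 * np.sin(0.5 * t),          # fake drag speed, mm/s
    forces=1 + 0.5 * np.cos(0.3 * t),          # fake normal force, N
    vibration=np.random.randn(len(t)) * 0.1,   # fake accelerometer trace
    fs=fs,
)
segment = model.render(speed=60.0, force=1.2)  # ~50 ms of vibration to play
```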

Pressure is tone, and texture melody, but touch presses itself on us most urgently at the extremities, in the experience of pain and of sexual pleasure. In phantom pain, limbs and appendages that no longer exist continue to feel and even to suffer. In sexual touching, as distinct from the affective kind, touch seems driven toward necessity. Both are forms of hyperbolized touch, making more of the stimulation than the stimulation alone would suggest was plausible.

Igor Spetic, in Cleveland, suffered after his amputation from extreme, persistent pain, which he felt permanently emanating from the hand he no longer had. “It was unbearable, twenty-four seven, as though my hand were in a clamp,” he says. Since the last thing he vaguely recalls about his accident is his hand clutched in a vise as he reached out toward the mechanical press that crushed it, it seems that his mind had continued to feel that final moment, like a clanging bell that is the last thing remembered, and still heard on his hospital bed, by the victim of a train accident. His hand is so much there from the brain’s point of view that the brain may be creating the pain it thinks the hand ought to be feeling, the last tactile sensation it can recall.

This kind of phantom pain in amputated limbs is a widely observed phenomenon, but for a long time it was thought to be a response to trauma of the “cauterized” nerves in the residual limb. One of the things that Dustin Tyler’s project in Cleveland has helped confirm is that it is also a cognitive phenomenon, placed much “higher up” in the system. After the sensors in Spetic’s arm were stimulated, his pain diminished, and then vanished. Reassured that the hand had moved on, that the trauma had passed and was no longer in need of response, the brain released it from the emergency state of feeling pain. Tyler thinks that, given the extraordinary cost of supplying his prosthetic hand to amputees—the eventual cost of the operation and the equipment, if it ever becomes widely available, would probably reach tens of thousands of dollars—its brightest future may lie exactly in this kind of therapeutic use for patients with extreme neuropathic and phantom pain. The stim can heal.

In a similar way, even normal pain has turned out to be intricately story-driven. The severity of pain, as Ronald Melzack, of McGill University, and his students showed many years ago, varies dramatically according to the context it takes place in: soldiers whose battlefield wounds will send them home from the war are numb and happy; women in childbirth—an off-the-charts agony, measured by any objective standard—report it afterward as painful but productive work, and rarely refuse to have another child because of it. It’s not that the soldier doesn’t feel the wound, or the mother the labor; but they reorganize their experience to suit their situation. It is one reason that, as has often been pointed out, those who suffer even from debilitating neuropathic pain often lead satisfying lives, while those who are born unable to feel pain usually die young. We can retune the warning system; we can’t live without one. Pain is, of course, a critical part of the new science of touch: most of the money for projects like Dustin Tyler’s comes from a research arm of the Department of Defense, and the Department of Defense has invested tens of millions of dollars in sensing prosthetics because so many soldiers came back from Iraq and Afghanistan missing arms and legs—having survived injuries that in earlier wars would have left them dead.

Most touch acts are surreptitious or subconscious or quietly social, but sexual touching is sought, specific, pointed in desire, and enormous in consequence. It is, in its way, phantom pleasure—an experience so discontinuous with other feelings that one expects it to be not merely a labelled but a licensed line of its own, as though there were a hundred things to hear and one that must be listened to. Yet while we tend, experientially, to separate sexuality from other forms of touching—or at least men do, seeing sex not as a blossom from the world of the tactile but as a thing unto itself—sexual touch seems, in the realm of neurophysiology, curiously unspecified.

“You’d think this would be a real obvious thing, with conferences about it,” David Linden says. “But there seems to be nothing special about the sexual skin. We’ve got this nerve ending we’ve looked at and we don’t know if it’s involved in sexual sensation. There are a lot of them in the clitoris and a lot in the glans penis, at the highest density where most men report the strongest sexual sensations. But that’s not proof.” It has long been established that on the somatosensory cortex—the “map” that exists in the brain, relating specific areas of the cortex to specific places on the skin—the genitalia are represented both in their expected place (around the lower trunk and upper leg) and then again below the leg, around the feet and toes. This may help explain why, as one student of sexual fetishism reports, “in search data there were 93,885 sexual searches for feet and only 5,831 sexual searches for hands.”

“And then there are small meaningful oddities,” Linden goes on. “There are people who have orgasm syndrome. They’re like what we call pain asymbolics—people who lose the emotional content of pain. You hit them with a hammer, and they know they’ve been hit, but it doesn’t trouble them. The same thing is true of pleasure—we think of orgasm as intrinsically pleasurable. But you can have an orgasm that is more convulsive than compelling. All the same things happen on the periphery—rhythmic contractions of the rectum and so on. But it doesn’t feel like much more than a sneeze. What are they missing?” A favorite case in the literature is that of a woman who would get a seizure every time she brushed her teeth—the seizures are probably triggered by the repetitive physical activity—and then the seizure would provoke an orgasm. The steady regimen of tooth-brushing orgasms was exhausting, rather than exalting, and led to an unusual morning dilemma: to brush or not to brush.

Among ordinary people, though, the two touch systems that seem most automatic and involuntary, relating to hurting and wanting, turn out to be among the most socially embedded. Pain is not a shared illusion, and sex is not a cultural condition: cut yourself with a carving knife and it will hurt no matter what company you’re in; an orgasm felt like an orgasm to Cleopatra as to a Meg Ryan character. But both are surprisingly dependent on our ideas about what they ought to be like. Itch passes through our bodies in direct currents, as if from ancient history; sex and pain enter our lives communally, loaded with the local news.

And so if the acceptable frontier of haptic technology is virtual-reality gaming, the unspoken but quietly recognized frontier is romantic. There is already a “hug shirt” that can transmit touch from sender to wearer. It was designed by Ryan Genz and Francesca Rosella, of the London fashion firm Cutecircuit, who decided, more than a decade ago, that touch was the missing link in modern talk: “We can transmit voice, we can transmit images—but we couldn’t transmit touch,” Ryan Genz says. Originally made as a sort of giant blood-pressure cuff, constricting and releasing the wearer in haptic harmony with another wearer, the shirt proved alarming, and now one hug shirt merely vibrates in long-range synch with another. The first transatlantic hug happened during a conference in 2006, and still longer, fiercer hugs can be imagined. (The newest designs include L.E.D. elements, so that the trace of the embrace lights up.) The hug shirt’s love children are almost too obvious to be enumerated. “The only logical advancement in haptics is to full-on virtual sex,” the sex-tech journalist Emma McGowan writes. “Full-body haptic suits are no longer a far-fetched sci-fi nerd’s dream.” Haptics engineers chat about allowing virtual sex with fictional characters or famous celebrities.

At that point, haptics crosses over not just into erotics but into aesthetics. As the Canadian researcher Meredith Chivers points out, however, there is a demonstrable disconnect between what women, at least, respond to physically and what they self-report as provocative. When it comes to sex, the science of touch confirms that stories, more than sensations, are what stir us. A story-making machine is more likely than a haptic suit to turn us on, as has been the rule of the erotic life of touch since it began.

Every haptic application, once its cool stuff is demonstrated, is followed by a sober explication from its maker on its four potential uses, always offered in descending order of piety: medicine, prostheses, commerce, and gaming. A haptic device might help you operate on a prostate, add touch sensitivity to an artificial hand, allow you to assess the fabric on an online shirt, or make you feel the trigger pressure when you shoot at zombies in a virtual-reality game.

But the real apotheosis of the enterprise will be achieved when artificial haptic intelligence is successfully modelled in robots. Just as the long-standing dream of the artificial-intelligence community was to make a computer that could defeat a chess master, so it is the dream of the robotics community to build, by 2050, a team of humanoid robots that can defeat the World Cup soccer champions. Kuchenbecker says, smiling, “It’s our BHAG”—the Big Hairy Audacious Goal.

For her, the core discovery of the past decade’s research in touch is that skin-smart is as smart as any other kind of smart. She talked about this in the GRASP lab, while showing off a Da Vinci, a robotlike surgical system. The Da Vinci—a grinning panoply of robotic arms and sharp tiny tools, like the torture device in a Bond film—can operate internally and make incisions with a precision that no human surgeon can hope to have. Although temporarily down for repairs, the Da Vinci bears a sign warning visitors to keep clear of it: “Do Not Touch. Testing Is in Progress. Robot Is Active.” Though down, it is apparently far from out, and not to be trifled with. Nearby, a second, chubbier robot, designed to have more cushioned arms and less lethal swing-back, keeps it consoling company. Both reside in a room devoted to robots; nearby are several knee-high would-be soccer players on an undersized soccer field. They will, in principle, be scaled up one day, as they are perfected to meet the BHAG. (For the moment, they tend to fall over and lose their heads when trying to recover the ball and kick it accurately in one move.)

Kuchenbecker’s goal is to provide robots with more than mere mechanical expertise. She wants them to have “haptic responsiveness,” so that the surgeon operating the robot can feel in her own hands the bounce or flab of an internal muscle, or palpate a liver from long distance. Ultimately, that intelligence could be infused in the robot itself, so that it would need no human to control it.
“Haptic intelligence is vital to human intelligence,” she concludes. “It’s not just dexterity. It’s finding your way in the world: it’s embodiment, emotion, attack. Haptic intelligence is human intelligence. We’re just so smart with it that we don’t know it yet. It’s actually much harder to make a chess piece move correctly—to pick up the piece and move it across the board and put it down properly—than it is to make the right chess move.” She adds, slyly, “When I took A.I. as a student, I was so dismayed to find that most A.I. is just stupid brute force, just running through the possibilities a machine can look at quickly. Computer chess looks intelligent, but it’s under-the-hood stupid. Reaching and elegantly picking up the right chess piece fluidly and having it land in the right place in an uncontrolled environment—that’s hard. Haptic intelligence is an almost irreproducible miracle! Because people are so good at that, they don’t appreciate it. Machines are good at finding the next move, but moving in the world still baffles them.”

The study of haptic intelligence leads to even deeper questions about the somatic self. Our skin is us because it draws a line around our existence: we experience the world as ourselves. We can separate ourselves from our eyes and ears, recognize the information they give us as information, but our tactile and proprioceptive halos supply us with the sense that we are constant selves.

There are rare conditions in which you come to believe that while, say, the right half of your body is you being yourself, the left half of your body is someone else’s—some uncomfortably close-talking, peering stranger you would like to get away from. Out-of-body experiences are related to these illusions, and they are probably key both to religious experience and to tales of alien abductions. The possibility of such illusions suggests that their opposite—our agreed-on coherent sense of a continuous self—may be a convenient fiction, an organized cognitive heuristic that we impose on experience to let us go on having it.

When somatic illusions strike, in other words, they strike our very sense of who we are. It is possible, by tapping at sequential spots on the skin, to create the illusion of intermediate taps between them, as though a rabbit were hopping down our arm. The so-called “cutaneous rabbit,” whose paws we feel strongly, can even be made to hop out of the body and leap onto a stick held by a subject. (The stick shakes, or so the subject feels, as though the rabbit had jumped on it.) The rabbit is just us, leaping out of our own skin.

In another way, it is increasingly possible to imagine oneself as being discontinuous with one’s skin. Igor Spetic feels something like this when he leaves his hand behind. “Think about it,” Dustin Tyler says. “There’s no real constraint on how far in space the connection could go. You could be sitting here in Cleveland and performing surgery in Tahiti, and actually feel the flesh and organs of your patient. Actually feel them. For that matter, you could text-message a handshake to a friend.” Even a visitor, playing with Spetic’s virtual hand, without the added bonus of the “stim” that enables him to feel the surface shapes of nonexistent objects, can find the experience of solving problems so intense that he feels that his hand, too, is in there, on a screen, inside a box. You are here; your hand is six feet away. The philosopher Daniel Dennett, playing with this idea, came up with a thought experiment in which one’s brain sits in a vat in Texas while artificial, remote-controlled hands and eyes and limbs engage at its direction from Oklahoma. The essay he wrote about this thought experiment was titled, simply, “Where Am I?” For the first time, this fantasy is becoming readily imaginable in the real world: in a sense, Spetic’s hand is left in the lab on the weekend. A bit of him is there.

It can sometimes seem as if the world of thinking about touch were divided between that of the philosophers and students of culture who study the “phenomenology” of sensing and that of the scientists and engineers who study its mechanics, with an abyss of understanding between them. In the introduction to “The Book of Touch,” Constance Classen explains that the anthology “does not offer any scientific information about touch,” because “attempts to explain tactile culture through scientific models tell more about the culture of science than about the scientific basis of culture.” The humanists are certain that what the scientists are doing is really cultural studies that don’t yet know themselves.

One of the few “multilinguals” in the field—someone equally at home with neuroscience and with phenomenology, with the language of data and with the talk of daily human experience—is Dacher Keltner. A professor of psychology at Berkeley, he is a specialist in the science of emotions; he was the scientific adviser on Pixar’s “Inside Out,” the movie about the inner life of a little girl. I dropped him a line one Sunday morning, and discovered that, serendipitously, he was in New York that day. He suggested that I meet him that very afternoon, in Washington Square Park, where he was “going to spend some time watching people be embodied.”

I found Keltner, calm and inquisitive, observing the world from a park bench. He looks like a Pixar version of the emotion of Benevolence: graying blond hair, worn long and parted in the middle, a serene smile always on his lips, and creased eyes suggesting perpetual, hope-filled curiosity mixed with wisdom. He explained that he likes to come out and watch emotions becoming embodied, by which he means seeing all the ways in which people take on the poses of their feelings, with the additional twist that he thinks, in effect, that the poses come first. In his view, touch is the primary moral experience: it is morality as we experience it in the first instance in the actual world. The thoughts come afterward to administer the thing. “Touch is the first system to come online, and the foundations of human relationships are all touch,” he says. “Skin to skin, parent to child, touch is the social language of our social life. It lays a basis for embodiment in feeling.”

Keltner has the power, shared by true students of a science, to make one see with his eyes. Looking out across the panoply of human interaction in the sunlit square, one sees at once how much depends on skin and near-skin encounters: dating couples lean forward, hair brushes and fingertips touch; children bump as they play, not too hard and then hard enough to be warning and instructing; chess players off in their corner imply tentativeness, certainty, triumph, and mid-game anxiety by the sureness or the uncertainty with which they grasp and move their pieces.
“The foundation of human relationships is all touch,” Keltner goes on. “There are four years of touch exchanged between mother and baby. Among primates, the sense of reciprocal altruism emerges from food sharing, and they are always systematically touching each other as they share food. Reciprocity is tactile. Aggression is tactile. Sex is tactile. It’s the root moral precept of our sense of common humanity. In the social realm, our social awareness is profoundly tactile.” Keltner was one of the co-authors of a much-talked-of study that coded twelve distinct kinds of “celebratory touches” among pro basketball players, including “fist bumps, high-fives, chest bumps, leaping shoulder bumps, chest punches, head slaps, head grabs, low fives, high tens, full hugs, half hugs, and team huddles.” They discovered that teams whose players touched one another a lot did better than those whose players didn’t. Touch lowers stress, builds morale, and produces triumphs—a chest bump instructs us in coöperation, a half-hug in compassion.

Keltner’s approach to touch turns on the deeper idea that consciousness itself is “exteriorized”—that we are alive in relation to others, not in relation to some imagined inner self, the homunculus in our heads. Our bodies are membranes in the world, with sensation and meaning passing seamlessly through them. Our experience of our bodies—the things they feel, the moves they make, and the textures and the people they touch—is our primary experience of our minds. “The brain is just simply part of our bodies” is how the philosopher Alva Noë often puts it. The truer cartoon, in a sense, would be “Outside In,” with the emotions produced by people bumping against one another. A key to being embodied in this way is tactile experience—what we touch, whom we touch, how many we touch, and why we find them touching. Grasping, hugging, striking, playing, caressing, reaching, scratching backs, and rubbing rears: these are not primitive forms of communication. They are the fabric of being conscious. The work of the world is done by handling it. We live by feel.

Later, in a café near the square, Keltner had a cappuccino and, sitting at the counter, watched the variety of human touch as it revealed itself in that unending theatre: fingers flying on the keyboard, hands darting out to make a point, heads turning to underline a joke, bodies slouching and primping and jostling and soliciting attention. An intensity of feeling combines, in our tactile lives, with a plurality of kinds.

Perhaps the reason that touch has no art form is that its supremacy makes it hard to escape. We can shut our eyes and cover our ears, but it’s our hands that do it when we do. We can’t shut off our skins. It is the obscurity of the other senses that makes us enliven them with art: touch is too important to be elaborated or distilled. It just is. What we see we long for; what we hear we interpret; what we touch we are. The art we aspire to is a remote sensation, always out of reach. Life is the itch we are still trying to scratch. ♦

Tuesday, May 03, 2016

Democracies end when they are too democratic.

And right now, America is a breeding ground for tyranny.

By Andrew Sullivan New York Magazine
As this dystopian election campaign has unfolded, my mind keeps being tugged by a passage in Plato’s Republic. It has unsettled — even surprised — me from the moment I first read it in graduate school. The passage is from the part of the dialogue where Socrates and his friends are talking about the nature of different political systems, how they change over time, and how one can slowly evolve into another. And Socrates seemed pretty clear on one sobering point: that “tyranny is probably established out of no other regime than democracy.” What did Plato mean by that? Democracy, for him, I discovered, was a political system of maximal freedom and equality, where every lifestyle is allowed and public offices are filled by a lottery. And the longer a democracy lasted, Plato argued, the more democratic it would become. Its freedoms would multiply; its equality spread. Deference to any sort of authority would wither; tolerance of any kind of inequality would come under intense threat; and multiculturalism and sexual freedom would create a city or a country like “a many-colored cloak decorated in all hues.”

This rainbow-flag polity, Plato argues, is, for many people, the fairest of regimes. The freedom in that democracy has to be experienced to be believed — with shame and privilege in particular emerging over time as anathema. But it is inherently unstable. As the authority of elites fades, as Establishment values cede to popular ones, views and identities can become so magnificently diverse as to be mutually uncomprehending. And when all the barriers to equality, formal and informal, have been removed; when everyone is equal; when elites are despised and full license is established to do “whatever one wants,” you arrive at what might be called late-stage democracy. There is no kowtowing to authority here, let alone to political experience or expertise.

The very rich come under attack, as inequality becomes increasingly intolerable. Patriarchy is also dismantled: “We almost forgot to mention the extent of the law of equality and of freedom in the relations of women with men and men with women.” Family hierarchies are inverted: “A father habituates himself to be like his child and fear his sons, and a son habituates himself to be like his father and to have no shame before or fear of his parents.” In classrooms, “as the teacher ... is frightened of the pupils and fawns on them, so the students make light of their teachers.” Animals are regarded as equal to humans; the rich mingle freely with the poor in the streets and try to blend in. The foreigner is equal to the citizen.

And it is when a democracy has ripened as fully as this, Plato argues, that a would-be tyrant will often seize his moment.

He is usually of the elite but has a nature in tune with the time — given over to random pleasures and whims, feasting on plenty of food and sex, and reveling in the nonjudgment that is democracy’s civil religion. He makes his move by “taking over a particularly obedient mob” and attacking his wealthy peers as corrupt. If not stopped quickly, his appetite for attacking the rich on behalf of the people swells further. He is a traitor to his class — and soon, his elite enemies, shorn of popular legitimacy, find a way to appease him or are forced to flee. Eventually, he stands alone, promising to cut through the paralysis of democratic incoherence. It’s as if he were offering the addled, distracted, and self-indulgent citizens a kind of relief from democracy’s endless choices and insecurities. He rides a backlash to excess—“too much freedom seems to change into nothing but too much slavery” — and offers himself as the personified answer to the internal conflicts of the democratic mess. He pledges, above all, to take on the increasingly despised elites. And as the people thrill to him as a kind of solution, a democracy willingly, even impetuously, repeals itself.

And so, as I chitchatted over cocktails at a Washington office Christmas party in December, and saw, looming above our heads, the pulsating, angry televised face of Donald Trump on Fox News, I couldn’t help but feel a little nausea permeate my stomach. And as I watched frenzied Trump rallies on C-SPAN in the spring, and saw him lay waste to far more qualified political peers in the debates by simply calling them names, the nausea turned to dread. And when he seemed to condone physical violence as a response to political disagreement, alarm bells started to ring in my head. Plato had planted a gnawing worry in my mind a few decades ago about the intrinsic danger of late-democratic life. It was increasingly hard not to see in Plato’s vision a murky reflection of our own hyperdemocratic times and in Trump a demagogic, tyrannical character plucked directly out of one of the first books about politics ever written.

Could it be that the Donald has emerged from the populist circuses of pro wrestling and New York City tabloids, via reality television and Twitter, to prove not just Plato but also James Madison right, that democracies “have ever been spectacles of turbulence and contention … and have in general been as short in their lives as they have been violent in their deaths”? Is he testing democracy’s singular weakness — its susceptibility to the demagogue — by blasting through the firewalls we once had in place to prevent such a person from seizing power? Or am I overreacting?

Perhaps. The nausea comes and goes, and there have been days when the news algorithm has actually reassured me that “peak Trump” has arrived. But it hasn’t gone away, and neither has Trump. In the wake of his most recent primary triumphs, at a time when he is perilously close to winning enough delegates to grab the Republican nomination outright, I think we must confront this dread and be clear about what this election has already revealed about the fragility of our way of life and the threat late-stage democracy is beginning to pose to itself.

Plato, of course, was not clairvoyant. His analysis of how democracy can turn into tyranny is a complex one more keyed toward ancient societies than our own (and contains more wrinkles and eddies than I can summarize here). His disdain for democratic life was fueled in no small part by the fact that a democracy had executed his mentor, Socrates. And he would, I think, have been astonished at how American democracy has been able to thrive with unprecedented stability over the last couple of centuries even as it has brought more and more people into its embrace. It remains, in my view, a miracle of constitutional craftsmanship and cultural resilience. There is no place I would rather live. But it is not immortal, nor should we assume it is immune to the forces that have endangered democracy so many times in human history.

Part of American democracy’s stability is owed to the fact that the Founding Fathers had read their Plato. To guard our democracy from the tyranny of the majority and the passions of the mob, they constructed large, hefty barriers between the popular will and the exercise of power. Voting rights were tightly circumscribed. The president and vice-president were not to be popularly elected but chosen by an Electoral College, whose representatives were selected by the various states, often through state legislatures. The Senate’s structure (with two members from every state) was designed to temper the power of the more populous states, and its term of office (six years, compared with two for the House) was designed to cool and restrain temporary populist passions. The Supreme Court, picked by the president and confirmed by the Senate, was the final bulwark against any democratic furies that might percolate up from the House and threaten the Constitution. This separation of powers was designed precisely to create sturdy firewalls against democratic wildfires.

Over the centuries, however, many of these undemocratic rules have been weakened or abolished. The franchise has been extended far beyond propertied white men. The presidency is now effectively elected through popular vote, with the Electoral College almost always reflecting the national democratic will. And these formal democratic advances were accompanied by informal ones, as the culture of democracy slowly took deeper root. For a very long time, the elites of the political parties selected their candidates at their quadrennial conventions, with the vote largely restricted to party officials from the various states (and often decided in, yes, smoke-filled rooms in large hotel suites). Beginning in the early 1900s, however, the parties began experimenting with primaries, and after the chaos of the 1968 Democratic convention, today’s far more democratic system became the norm.

Direct democracy didn’t just elect Congress and the president anymore; it expanded the notion of who might be qualified for public office. Once, candidates built a career through experience in elected or Cabinet positions or as military commanders; they were effectively selected by peer review. That elitist sorting mechanism has slowly imploded. In 1940, Wendell Willkie, a businessman with no previous political office, won the Republican nomination for president, pledging to keep America out of war and boasting that his personal wealth inoculated him against corruption: “I will be under obligation to nobody except the people.” He lost badly to Franklin D. Roosevelt, but nonetheless, since then, nonpolitical candidates have proliferated, from Ross Perot and Jesse Jackson, to Steve Forbes and Herman Cain, to this year’s crop of Ben Carson, Carly Fiorina, and, of course, Donald J. Trump. This further widening of our democracy — our increased openness to being led by anyone; indeed, our accelerating preference for outsiders — is now almost complete.

The barriers to the popular will, especially when it comes to choosing our president, are now almost nonexistent. In 2000, George W. Bush lost the popular vote and won the election thanks to Electoral College math and, more egregiously, to a partisan Supreme Court vote. Al Gore’s eventual concession spared the nation a constitutional crisis, but the episode generated widespread unease, not just among Democrats. And this year, the delegate system established by our political parties is also under assault. Trump has argued that the candidate with the most votes should get the Republican nomination, regardless of the rules in place. It now looks as if he won’t even need to win that argument — that he’ll bank enough delegates to secure the nomination uncontested — but he’s won it anyway. Fully half of Americans now believe the traditional nominating system is rigged.

Many contend, of course, that American democracy is actually in retreat, close to being destroyed by the vastly more unequal economy of the last quarter-century and the ability of the very rich to purchase political influence. This is Bernie Sanders’s core critique. But the past few presidential elections have demonstrated that, in fact, money from the ultrarich has been mostly a dud. Barack Obama, whose 2008 campaign was propelled by small donors and empowered by the internet, blazed the trail of the modern-day insurrectionist, defeating the prohibitive favorite in the Democratic primary and later his Republican opponent (both pillars of their parties’ Establishments and backed by moneyed elites). In 2012, the fund-raising power behind Mitt Romney — avatar of the one percent — failed to dislodge Obama from office. And in this presidential cycle, the breakout candidates of both parties have soared without financial support from the elites. Sanders, who is sustaining his campaign all the way to California on the backs of small donors and large crowds, is, to put it bluntly, a walking refutation of his own argument. Trump, of course, is a largely self-funding billionaire — but like Willkie, he argues that his wealth uniquely enables him to resist the influence of the rich and their lobbyists. Those despairing over the influence of Big Money in American politics must also explain the swift, humiliating demise of Jeb Bush and the struggling Establishment campaign of Hillary Clinton. The evidence suggests that direct democracy, far from being throttled, is actually intensifying its grip on American politics.

None of this is necessarily cause for alarm, even though it would be giving the Founding Fathers palpitations. The emergence of the first black president — unimaginable before our more inclusive democracy — is miraculous, a strengthening, rather than weakening, of the system. The days when party machines just fixed things or rigged elections are mercifully done with. The way in which outsider candidates, from Obama to Trump and Sanders, have brought millions of new people into the electoral process is an unmitigated advance. The inclusion of previously excluded voices helps, rather than impedes, our public deliberation. But it is precisely because of the great accomplishments of our democracy that we should be vigilant about its specific, unique vulnerability: its susceptibility, in stressful times, to the appeal of a shameless demagogue.

What the 21st century added to this picture, it’s now blindingly obvious, was media democracy — in a truly revolutionary form. If late-stage political democracy has taken two centuries to ripen, the media equivalent took around two decades, swiftly erasing almost any elite moderation or control of our democratic discourse. The process had its origins in partisan talk radio at the end of the past century. The rise of the internet — an event so swift and pervasive its political effect is only now beginning to be understood — further democratized every source of information, dramatically expanded each outlet’s readership, and gave everyone a platform. All the old barriers to entry — the cost of print and paper and distribution — crumbled.

So much of this was welcome. I relished it myself in the early aughts, starting a blog and soon reaching as many readers as some small magazines, if not more. Fusty old-media institutions, grown fat and lazy, deserved a drubbing. The early independent blogosphere corrected facts, exposed bias, earned scoops. And as the medium matured, and as Facebook and Twitter took hold, everyone became a kind of blogger. In ways no 20th-century journalist would have believed, we all now have our own virtual newspapers on our Facebook newsfeeds and Twitter timelines — picking stories from countless sources and creating a peer-to-peer media almost completely free of editing or interference by elites. This was bound to make politics more fluid. Political organizing — calling a meeting, fomenting a rally to advance a cause — used to be extremely laborious. Now you could bring together a virtual mass movement with a single webpage. It would take you a few seconds.

The web was also uniquely capable of absorbing other forms of media, conflating genres and categories in ways never seen before. The distinction between politics and entertainment became fuzzier; election coverage became even more modeled on sportscasting; your Pornhub jostled right next to your mother’s Facebook page. The web’s algorithms all but removed any editorial judgment, and the effect soon had cable news abandoning even the pretense of asking “Is this relevant?” or “Do we really need to cover this live?” in the rush toward ratings bonanzas. In the end, all these categories were reduced to one thing: traffic, measured far more accurately than in any previous medium.

And what mainly fuels this is precisely what the Founders feared about democratic culture: feeling, emotion, and narcissism, rather than reason, empiricism, and public-spiritedness. Online debates become personal, emotional, and irresolvable almost as soon as they begin. Godwin’s Law — it’s only a matter of time before a comments section brings up Hitler — is a reflection of the collapse of the reasoned deliberation the Founders saw as indispensable to a functioning republic.

Yes, occasional rational points still fly back and forth, but there are dramatically fewer elite arbiters to establish which of those points is actually true or valid or relevant. We have lost authoritative sources for even a common set of facts. And without such common empirical ground, the emotional component of politics becomes inflamed and reason retreats even further. The more emotive the candidate, the more supporters he or she will get.

Politically, we lucked out at first. Obama would never have been nominated for the presidency, let alone elected, if he hadn’t harnessed the power of the web and the charisma of his media celebrity. But he was also, paradoxically, a very elite figure, a former state and U.S. senator, a product of Harvard Law School, and, as it turned out, blessed with a preternaturally rational and calm disposition. So he has masked, temporarily, the real risks in the system that his pioneering campaign revealed. Hence many Democrats’ frustration with him. Those who saw in his campaign the seeds of revolutionary change, who were drawn to him by their own messianic delusions, came to be bitterly disappointed by his governing moderation and pragmatism.

The climate Obama thrived in, however, was also ripe for far less restrained opportunists. In 2008, Sarah Palin emerged as proof that an ardent Republican, branded as an outsider, tailor-made for reality TV, proud of her own ignorance about the world, and reaching an audience directly through online media, could also triumph in this new era. She was, it turned out, a John the Baptist for the true messiah of conservative populism, waiting patiently and strategically for his time to come.

Trump, we now know, had been considering running for president for decades. Those who didn’t see him coming — or kept treating him as a joke — had not yet absorbed the precedents of Obama and Palin or the power of the new wide-open system to change the rules of the political game. Trump was as underrated for all of 2015 as Obama was in 2007 — and for the same reasons. He intuitively grasped the vanishing authority of American political and media elites, and he had long fashioned a public persona perfectly attuned to blast past them.

Despite his immense wealth and inherited privilege, Trump had always cultivated a common touch. He did not hide his wealth in the late-20th century — he flaunted it in a way that connected with the masses. He lived the rich man’s life most working men dreamed of — endless glamour and women, for example — without sacrificing a way of talking about the world that would not be out of place on the construction sites he regularly toured. His was a cult of democratic aspiration. His 1987 book, The Art of the Deal, promised its readers a path to instant success; his appearances on “The Howard Stern Show” cemented his appeal. His friendship with Vince McMahon offered him an early entrée into the world of professional wrestling, with its fusion of sports and fantasy. He was a macho media superstar.

One of the more amazing episodes in Sarah Palin’s early political life, in fact, bears this out. She popped up in the Anchorage Daily News as “a commercial fisherman from Wasilla” on April 3, 1996. Palin had told her husband she was going to Costco but had sneaked into J.C. Penney in Anchorage to see … one Ivana Trump, who, in the wake of her divorce, was touting her branded perfume. “We want to see Ivana,” Palin told the paper, “because we are so desperate in Alaska for any semblance of glamour and culture.”

Trump assiduously cultivated this image and took to reality television as a natural. Each week, for 14 seasons of The Apprentice, he would look someone in the eye and tell them, “You’re fired!” The conversation most humane bosses fear to have with an employee was something Trump clearly relished, and the cruelty became entertainment. In retrospect, it is clear he was training — both himself and his viewers. If you want to understand why a figure so widely disliked nonetheless powers toward the election as if he were approaching a reality-TV-show finale, look no further. His television tactics, as applied to presidential debates, wiped out rivals used to a different game. And all our reality-TV training has conditioned us to hope he’ll win — or at least stay in the game till the final round. In such a shame-free media environment, the assholes often win. In the end, you support them because they’re assholes.

In his classic 1951 tract, The True Believer, Eric Hoffer sketches the dynamics of a genuine mass movement. He was thinking of the upheavals in Europe in the first half of the century, but the book remains sobering, especially now. Hoffer’s core insight was to locate the source of all truly mass movements in a collective sense of acute frustration. Not despair, or revolt, or resignation — but frustration simmering with rage. Mass movements, he notes (as did Tocqueville a century before him), rarely arise when oppression or misery is at its worst (say, 2009); they tend to appear when the worst is behind us but the future seems not so much better (say, 2016). It is when a recovery finally gathers speed and some improvement is tangible but not yet widespread that the anger begins to rise. After the suffering of recession or unemployment, and despite hard work with stagnant or dwindling pay, the future stretches ahead with relief just out of reach. When those who helped create the last recession face no consequences but renewed fabulous wealth, the anger reaches a crescendo.

The deeper, long-term reasons for today’s rage are not hard to find, although many of us elites have shamefully found ourselves able to ignore them. The jobs available to the working class no longer contain the kind of craftsmanship or satisfaction or meaning that can take the sting out of their low and stagnant wages. The once-familiar avenues for socialization — the church, the union hall, the VFW — have become less vibrant and social isolation more common. Global economic forces have pummeled blue-collar workers more relentlessly than almost any other segment of society, forcing them to compete against hundreds of millions of equally skilled workers throughout the planet. No one asked them in the 1990s if this was the future they wanted. And the impact has been more brutal than many economists predicted. No wonder suicide and mortality rates among the white working poor are spiking dramatically.

“It is usually those whose poverty is relatively recent, the ‘new poor,’ who throb with the ferment of frustration,” Hoffer argues. Fundamentalist religion long provided some emotional support for those left behind (for one thing, it invites practitioners to defy the elites as unholy), but its influence has waned as modernity has penetrated almost everything and the great culture wars of the 1990s and 2000s have ended in a rout. The result has been a more diverse mainstream culture — but also, simultaneously, a subculture that is even more alienated and despised, and ever more infuriated and bloody-minded.

This is an age in which a woman might succeed a black man as president, but also one in which a member of the white working class has declining options to make a decent living. This is a time when gay people can be married in 50 states, even as working-class families are hanging by a thread. It’s a period in which we have become far more aware of the historic injustices that still haunt African-Americans and yet we treat the desperate plight of today’s white working class as an afterthought. And so late-stage capitalism is creating a righteous, revolutionary anger that late-stage democracy has precious little ability to moderate or constrain — and has actually helped exacerbate.

The white working class, having had their morals roundly mocked, their religion deemed primitive, and their economic prospects decimated, now find their very gender and race, indeed the very way they talk about reality, described as a kind of problem for the nation to overcome. This is just one aspect of what Trump has masterfully signaled as “political correctness” run amok, or what might be better described as the newly rigid progressive passion for racial and sexual equality of outcome, rather than the liberal aspiration to mere equality of opportunity.

Much of the newly energized left has come to see the white working class not as allies but primarily as bigots, misogynists, racists, and homophobes, thereby condemning those often at the near-bottom rung of the economy to the bottom rung of the culture as well. A struggling white man in the heartland is now told to “check his privilege” by students at Ivy League colleges. Even if you agree that the privilege exists, it’s hard not to empathize with the object of this disdain. These working-class communities, already alienated, hear — how can they not? — the glib and easy dismissals of “white straight men” as the ultimate source of all our woes. They smell the condescension and the broad generalizations about them — all of which would be repellent if directed at racial minorities — and see themselves, in Hoffer’s words, “disinherited and injured by an unjust order of things.”

And so they wait, and they steam, and they lash out. This was part of the emotional force of the tea party: not just the advancement of racial minorities, gays, and women but the simultaneous demonization of the white working-class world, its culture and way of life. Obama never intended this, but he became a symbol to many of this cultural marginalization. The Black Lives Matter left stoked the fires still further; so did the gay left, for whom the word magnanimity seems unknown, even in the wake of stunning successes. And as the tea party swept through Washington in 2010, as its representatives repeatedly held the government budget hostage, threatened the very credit of the U.S., and refused to hold hearings on a Supreme Court nominee, the American political and media Establishment mostly chose to interpret such behavior as something other than unprecedented. But Trump saw what others didn’t, just as Hoffer noted: “The frustrated individual and the true believer make better prognosticators than those who have reason to want the preservation of the status quo.”

Mass movements, Hoffer argues, are distinguished by a “facility for make-believe … credulity, a readiness to attempt the impossible.” What, one wonders, could be more impossible than suddenly vetting every single visitor to the U.S. for traces of Islamic belief? What could be more make-believe than a big, beautiful wall stretching across the entire Mexican border, paid for by the Mexican government? What could be more credulous than arguing that we could pay off our national debt through a global trade war? In a conventional political party, and in a rational political discourse, such ideas would be laughed out of contention, their self-evident impossibility disqualifying them from serious consideration. In the emotional fervor of a democratic mass movement, however, these impossibilities become icons of hope, symbols of a new way of conducting politics. Their very impossibility is their appeal.

But the most powerful engine for such a movement — the thing that gets it off the ground, shapes and solidifies and entrenches it — is always the evocation of hatred. It is, as Hoffer put it, “the most accessible and comprehensive of all unifying elements.” And so Trump launched his campaign by calling undocumented Mexican immigrants a population largely of rapists and murderers. He moved on to Muslims, both at home and abroad. He has now added to these enemies — with sly brilliance — the Republican Establishment itself. And what makes Trump uniquely dangerous in the history of American politics — with far broader national appeal than, say, Huey Long or George Wallace — is his response to all three enemies. It’s the threat of blunt coercion and dominance.

And so after demonizing most undocumented Mexican immigrants, he then vowed to round up and deport all 11 million of them by force. “They have to go” was the typically blunt phrase he used — and somehow people didn’t immediately recognize the monstrous historical echoes. The sheer scale of the police and military operation that this policy would entail boggles the mind. Worse, he emphasized, after the mass murder in San Bernardino, that even the Muslim-Americans you know intimately may turn around and massacre you at any juncture. “There’s something going on,” he declaimed ominously, giving legitimacy to the most hysterical and ugly of human impulses.

To call this fascism doesn’t do justice to fascism. Fascism had, in some measure, an ideology and occasional coherence that Trump utterly lacks. But his movement is clearly fascistic in its demonization of foreigners, its hyping of a threat by a domestic minority (Muslims and Mexicans are the new Jews), its focus on a single supreme leader of what can only be called a cult, and its deep belief in violence and coercion in a democracy that has heretofore relied on debate and persuasion. This is the Weimar aspect of our current moment. Just as the English Civil War ended with a dictatorship under Oliver Cromwell, and the French Revolution gave us Napoleon Bonaparte, and the unstable chaos of Russian democracy yielded to Vladimir Putin, and the most recent burst of Egyptian democracy set the conditions for General el-Sisi’s coup, so our paralyzed, emotional hyperdemocracy leads the stumbling, frustrated, angry voter toward the chimerical panacea of Trump.

His response to his third vaunted enemy, the RNC, is also laced with the threat of violence. There will be riots in Cleveland if he doesn’t get his way. The RNC will have “a rough time” if it doesn’t cooperate. “Paul Ryan, I don’t know him well, but I’m sure I’m going to get along great with him,” Trump has said. “And if I don’t? He’s gonna have to pay a big price, okay?” The past month has seen delegates to the Cleveland convention receiving death threats; one of Trump’s hatchet men, Roger Stone, has already threatened to publish the hotel-room numbers of delegates who refuse to vote for Trump.

And what’s notable about Trump’s supporters is precisely what one would expect from members of a mass movement: their intense loyalty. Trump is their man, however inarticulate they are when explaining why. He’s tough, he’s real, and they’ve got his back, especially when he is attacked by all the people they have come to despise: liberal Democrats and traditional Republicans. At rallies, whenever a protester is hauled out, you can almost sense the rising rage of the collective identity venting itself against a lone dissenter and finding a catharsis of sorts in the brute force a mob can inflict on an individual. Trump tells the crowd he’d like to punch a protester in the face or have him carried out on a stretcher. No modern politician who has come this close to the presidency has championed violence in this way. It would be disqualifying if our hyperdemocracy hadn’t already abolished disqualifications.

And while a critical element of 20th-century fascism — its organized street violence — is missing, you can begin to see it in embryonic form. The phalanx of bodyguards around Trump grows daily; plainclothes bouncers in the crowds have emerged as pseudo-cops to contain the incipient unrest his candidacy will only continue to provoke; supporters have attacked hecklers with sometimes stunning ferocity. Every time Trump legitimizes potential violence by his supporters by saying it comes from a love of country, he sows the seeds for serious civil unrest.

Trump celebrates torture — the one true love of tyrants everywhere — not because it allegedly produces intelligence but because it has a demonstration effect. At his rallies he has recounted the mythical acts of one General John J. Pershing when confronted with an alleged outbreak of Islamist terrorism in the Philippines. Pershing, in Trump’s telling, lines up 50 Muslim prisoners, swishes a series of bullets in the corpses of freshly slaughtered pigs, and orders his men to put those bullets in their rifles and kill 49 of the captured Muslim men. He spares one captive solely so he can go back and tell his friends. End of the terrorism problem.

In some ways, this story contains all the elements of Trump’s core appeal. The vexing problem of tackling jihadist terror? Torture and murder enough terrorists and they will simply go away. The complicated issue of undocumented workers, drawn by jobs many Americans won’t take? Deport every single one of them and build a wall to stop the rest. Fuck political correctness. As one of his supporters told an obtuse reporter at a rally when asked if he supported Trump: “Hell yeah! He’s no-bullshit. All balls. Fuck you all balls. That’s what I’m about.” And therein lies the appeal of tyrants from the beginning of time. Fuck you all balls. Irrationality with muscle.

The racial aspect of this is also unmissable. When the enemy within is Mexican or Muslim, and your ranks are extremely white, you set up a rubric for a racial conflict. And what’s truly terrifying about Trump is that he does not seem to shrink from such a prospect; he relishes it.

For, like all tyrants, he is utterly lacking in self-control. Sleeping a handful of hours a night, impulsively tweeting in the early hours, improvising madly on subjects he knows nothing about, Trump rants and raves as he surfs an entirely reactive media landscape. Once again, Plato had his temperament down: A tyrant is a man “not having control of himself [who] attempts to rule others”; a man flooded with fear and love and passion, while having little or no ability to restrain or moderate them; a “real slave to the greatest fawning,” a man who “throughout his entire life ... is full of fear, overflowing with convulsions and pains.” Sound familiar? Trump is as mercurial and as unpredictable and as emotional as the daily Twitter stream. And we are contemplating giving him access to the nuclear codes.

Those who believe that Trump’s ugly, thuggish populism has no chance of ever making it to the White House seem to me to be missing this dynamic. Neo-fascist movements do not advance gradually by persuasion; they first transform the terms of the debate, create a new movement based on untrammeled emotion, take over existing institutions, and then ruthlessly exploit events. And so current poll numbers are only reassuring if you ignore the potential impact of sudden, external events — an economic downturn or a terror attack in a major city in the months before November. I have no doubt, for example, that Trump is sincere in his desire to “cut the head off” ISIS, whatever that can possibly mean. But it remains a fact that the interests of ISIS and the Trump campaign are now perfectly aligned. Fear is always the would-be tyrant’s greatest ally.

And though Trump’s unfavorables are extraordinarily high (around 65 percent), he is already showing signs of changing his tune, pivoting (fitfully) to the more presidential mode he envisages deploying in the general election. I suspect this will, to some fools on the fence, come as a kind of relief, and may open their minds to him once more. Tyrants, like mob bosses, know the value of a smile: Precisely because of the fear he’s already generated, you desperately want to believe in his new warmth. It’s part of the good-cop-bad-cop routine that will be familiar to anyone who has studied the presidency of Vladimir Putin.

With his appeal to his own base locked up, Trump may well also shift to more moderate stances on social issues like abortion (he already wants to amend the GOP platform to a less draconian position) or gay and even transgender rights. He is consistent in his inconsistency, because, for him, winning is what counts. He has had a real case against Ted Cruz — that the senator has no base outside ideological-conservative quarters and is even less likely to win a general election. More potently, Trump has a worryingly strong argument against Clinton herself — or “crooked Hillary,” as he now dubs her.

His proposition is a simple one. Remember James Carville’s core question in the 1992 election: Change versus more of the same? That sentiment once elected Clinton’s husband; it could also elect her opponent this fall. If you like America as it is, vote Clinton. After all, she has been a member of the American political elite for a quarter-century. Clinton, moreover, has shown no ability to inspire or rally anyone but her longtime loyalists. She is lost in the new media and has struggled to put away a 74-year-old socialist who is barely a member of her party. Her own unfavorables are only 11 points lower than Trump’s (far higher than Obama’s, John Kerry’s, or Al Gore’s were at this point in the race), and the more she campaigns, the higher her unfavorables go (including in her own party). She has a Gore problem. The idea of welcoming her into your living room for the next four years can seem, at times, positively masochistic.

It may be that demographics will save us. America is no longer an overwhelmingly white country, and Trump’s signature issue — illegal immigration — is the source of his strength but also of his weakness. Nonetheless, it’s worth noting how polling models have consistently misread the breadth of his support, especially in these past few weeks; he will likely bend over backward to include minorities in his fall campaign; and those convinced he cannot bring a whole new swath of white voters back into the political process should remember 2004, when Karl Rove helped engineer anti-gay-marriage state constitutional amendments that increased conservative voter turnout. All Trump needs is a sliver of minority votes inspired by the new energy of his campaign, and the alleged dominance of the Obama coalition could crack (especially without Obama). Throughout the West these past few years, from France to Britain and Germany, the polls have kept missing the power of right-wing insurgency.

Were Trump to win the White House, the defenses against him would be weak. He would likely bring a GOP majority in the House, and Republicans in the Senate would be subjected to almighty popular fury if they stood in his way. The 4-4 stalemate in the Supreme Court would break in Trump’s favor. (In large part, of course, this would be due to the GOP’s unprecedented decision to hold a vacancy open “for the people to decide,” another massive hyperdemocratic breach in our constitutional defenses.) And if Trump’s policies are checked by other branches of government, how might he react? Just look at his response to the rules of the GOP nomination process. He’s not interested in rules. And he barely understands the Constitution. In one revealing moment earlier this year, when asked what he would do if the military refused to obey an illegal order to torture a prisoner, Trump simply insisted that they would obey: “They won’t refuse. They’re not going to refuse, believe me.” He later amended his remark, but it speaks volumes about his approach to power. Dick Cheney gave illegal orders to torture prisoners and coerced White House lawyers to cook up absurd “legal” defenses. Trump would make Cheney’s embrace of the dark side and untrammeled executive power look unambitious.

In his 1935 novel, It Can’t Happen Here, Sinclair Lewis wrote a counterfactual about what would happen if fascism as it was then spreading across Europe were to triumph in America. It’s not a good novel, but it remains a resonant one. The imagined American fascist leader — a senator called Buzz Windrip — is a “Professional Common Man … But he was the Common Man twenty-times-magnified by his oratory, so that while the other Commoners could understand his every purpose, which was exactly the same as their own, they saw him towering among them, and they raised hands to him in worship.”

He “was vulgar, almost illiterate, a public liar easily detected, and in his ‘ideas’ almost idiotic.” “ ‘I know the Press only too well,’ ” Windrip opines at one point. “ ‘Almost all editors hide away in spider-dens, men without thought of Family or Public Interest … plotting how they can put over their lies, and advance their own positions and fill their greedy pocketbooks.’ ”

He is obsessed with the balance of trade and promises instant economic success: “ ‘I shall not be content till this country can produce every single thing we need … We shall have such a balance of trade as will go far to carry out my often-criticized yet completely sound idea of from $3000 to $5000 per year for every single family.’ ” However fantastical and empty his promises, he nonetheless mesmerizes the party faithful at the nominating convention (held in Cleveland!): “Something in the intensity with which Windrip looked at his audience, looked at all of them, his glance slowly taking them in from the highest-perched seat to the nearest, convinced them that he was talking to each individual, directly and solely; that he wanted to take each of them into his heart; that he was telling them the truths, the imperious and dangerous facts, that had been hidden from them.”

And all the elites who stood in his way? Crippled by their own failures, demoralized by their crumbling stature, they first mock and then cave. As one lone journalist laments before the election (he finds himself in a concentration camp afterward): “I’ve got to keep remembering … that Windrip is only the lightest cork on the whirlpool. He didn’t plot all this thing. With all the justified discontent there is against the smart politicians and the Plush Horses of Plutocracy — oh, if it hadn’t been one Windrip, it’d been another … We had it coming, we Respectables.”

And, 81 years later, many of us did. An American elite that has presided over massive and increasing public debt, that failed to prevent 9/11, that chose a disastrous war in the Middle East, that allowed financial markets to nearly destroy the global economy, and that is now so bitterly divided the Congress is effectively moot in a constitutional democracy: “We Respectables” deserve a comeuppance. The vital and valid lesson of the Trump phenomenon is that if the elites cannot govern by compromise, someone outside will eventually try to govern by popular passion and brute force.

But elites still matter in a democracy. They matter not because they are democracy’s enemy but because they provide the critical ingredient to save democracy from itself. The political Establishment may be battered and demoralized, deferential to the algorithms of the web and to the monosyllables of a gifted demagogue, but this is not the time to give up on America’s near-unique and stabilizing blend of democracy and elite responsibility. The country has endured far harsher times than the present without succumbing to rank demagoguery; it avoided the fascism that destroyed Europe; it has channeled extraordinary outpourings of democratic energy into constitutional order. It seems shocking to argue that we need elites in this democratic age — especially with vast inequalities of wealth and elite failures all around us. But we need them precisely to protect this precious democracy from its own destabilizing excesses.

And so those Democrats who are gleefully predicting a Clinton landslide in November need to both check their complacency and understand that the Trump question really isn’t a cause for partisan Schadenfreude anymore. It’s much more dangerous than that. Those still backing the demagogue of the left, Bernie Sanders, might want to reflect that their critique of Clinton’s experience and expertise — and their facile conflation of that with corruption — is only playing into Trump’s hands. That it will fall to Clinton to temper her party’s ambitions will be uncomfortable to watch, since her willingness to compromise and equivocate is precisely what many Americans find so hard to trust. And yet she may soon be all we have left to counter the threat. She needs to grasp the lethality of her foe, moderate the kind of identity politics that unwittingly empowers him, make an unapologetic case that experience and moderation are not vices, address much more directly the anxieties of the white working class—and Democrats must listen.

More to the point, those Republicans desperately trying to use the long-standing rules of their own nominating process to thwart this monster deserve our passionate support, not our disdain. This is not the moment to remind them that they partly brought this on themselves. This is a moment to offer solidarity, especially as the odds are increasingly stacked against them. Ted Cruz and John Kasich face their decisive battle in Indiana on May 3. But they need to fight on, with any tactic at hand, all the way to the bitter end. The Republican delegates who are trying to protect their party from the whims of an outsider demagogue are, at this moment, doing what they ought to be doing to prevent civil and racial unrest, an international conflict, and a constitutional crisis. These GOP elites have every right to deploy whatever rules or procedural roadblocks they can muster, and they should refuse to be intimidated.

And if they fail in Indiana or Cleveland, as they likely will, they need, quite simply, to disown their party’s candidate. They should resist any temptation to loyally back the nominee or to sit this election out. They must take the fight to Trump at every opportunity, unite with Democrats and Independents against him, and be prepared to sacrifice one election in order to save their party and their country.

For Trump is not just a wacky politician of the far right, or a riveting television spectacle, or a Twitter phenom and bizarre working-class hero. He is not just another candidate to be parsed and analyzed by TV pundits in the same breath as all the others. In terms of our liberal democracy and constitutional order, Trump is an extinction-level event. It’s long past time we started treating him as such.


*This article appears in the May 2, 2016 issue of New York Magazine.
