Wednesday, February 26, 2020

The Romanticized Belle Epoque in Paris Was an Age of Political Crisis


Julian Barnes on an Era When Fake News, Religious Tension, and "Gangster Imperialism" Abounded


By Julian Barnes Literary Hub

Merrie England, the Golden Age, la Belle Epoque: such shiny brand names are always coined retrospectively. No one in Paris ever said to one another, in 1895 or 1900, “We’re living in the Belle Epoque, better make the most of it.” The phrase describing that time of peace between the catastrophic French defeat of 1870–71 and the catastrophic French victory of 1914–18 didn’t come into the language until 1940–41, after another French defeat. It was the title of a radio program which morphed into a live musical-theater show: a feel-good coinage and a feel-good distraction which also played up to certain German preconceptions about oh-la-la, can-can France. 

The Belle Epoque: locus classicus of peace and pleasure, glamor with more than a brush of decadence, a last flowering of the arts, and a last flowering of a settled high society before, belatedly, this soft fantasy was blown away by the metallic, unfoolable 20th century, which ripped those elegant, witty Toulouse-Lautrec posters from the leprous wall and rank vespasienne. Well, it might have been like that for some, and Parisians more than most. But then as Douglas Johnson, wise historian of France, once wrote, "Paris is only the outskirts of France."

At the time, however, the Beautiful Era was—and felt like—an age of neurotic, even hysterical national anxiety, filled with political instability, crises and scandals. In such hyperventilating times, prejudice could swiftly metastasize into paranoia, so that the "known affinity" between the historically persecuted Protestant and Jew could be turned by some minds into a vivid threat.

In 1899, a certain Ernest Renauld published Le Péril protestant, whose purpose, he explained, was “to unmask the enemy, the Protestant, allied with the Jew and the Freemason against the Catholic.” 

Nobody knew what would happen, because what “should” have happened rarely did. The Prussian demand for reparations in 1871, which should have crippled the country for decades, was quickly paid off, and cost France much less than the phylloxera epidemic which had devastated the French vineyards from 1863 onwards. Enormous constitutional changes which should have come about were averted at the last minute for seemingly trivial reasons. 

After the defeat by Prussia, the monarchy was all set to return until the Pretender, the Comte de Chambord, jibbed at having the tricolore as the national flag. He insisted on the white fleur-de-lys or nothing; he got nothing. In the late 1880s General Boulanger—Catholic, royalist, populist, Revengist—was expected to win power in the 1889 election. (One of his more unlikely candidates was Prince Edmond de Polignac, chosen to stand in Nancy, who found the campaign trail all too fatiguing and withdrew.)

After this democratic bid failed, a coup d’état seemed certain; except that Boulanger too balked at the last minute, seemingly on the advice of his exquisitely named mistress, Mme de Bonnemains. One major constitutional change which did happen was the separation of church and state; the law of 1905 remains the basis of the French secular state to this day. 

The cure for—or at least distraction from—domestic political confusion is often the same: foreign adventure. The French believed at this time, as did the British, that they had a unique mission civilisatrice in the world; and each, predictably, thought their own civilizing mission more civilized than the other’s. Although to the actual civilizees, it felt different—more like conquest. 

Thus, in the spring of 1881 the French invaded Tunisia, and in the autumn of that year they put down a rebellion. In between, they signed a “treaty of protection” with the country’s previous rulers. The phrase is telling. Those who offer protection have their hands out for protection money: this was the era of gangster imperialism. Meanwhile, between 1870 and 1900, the British Empire expanded to cover 4 million square miles. 

Political corruption in France was endemic: it was said that “each banker has his personal senator and his deputies.” The press was violent in its language; libel laws slack; fake news prevalent; and killing never far away. In 1881 the International Anarchist Congress gave approval to “propaganda by the deed” (the very phrase was French), and the high life of which the Belle Epoque boasted—the world of opera houses and fashionable restaurants—was targeted. 

When the anarchist Ravachol was tried and guillotined in 1892, the response was a nail bomb thrown into the Chambre des Députés, which injured 50. There were high-level assassinations: of the President of the Republic, Sadi Carnot, in 1894; of the Socialist anti-war leader, Jean Jaurès, in 1914.

There was also a rise of blood-and-soil nativism, which urged a “reawakening” of Old Gaul; a fierce desire, articulated by Boulanger, for revenge against Prussia; and convulsive, nationwide eruptions of anti-Semitism. All three strands were plaited into the Dreyfus case, the overriding political event of the period, one which, beyond the “simple” matter of justice, concentrated the past and shaped the future. Everyone was involved, in one way or another. At Dreyfus’s “degradation” in 1895, Sarah Bernhardt sat in the front row. At his second trial, at Rennes in 1899, Pozzi was there (Pozzi was everywhere). 

And yet—and in keeping with the historical illogic of the period—the Dreyfus case had an effect out of all proportion to its content. Its victim confirmed the rule that the martyr often fails to live up to the mystique of his own martyrdom. “We were prepared to die for Dreyfus,” commented the poet Charles Péguy, “but Dreyfus wasn’t.” As for the gravity of the actual spying, “There was nothing in it,” concluded Douglas Johnson. 

The case was far more important for what others made of it than for what it contained in itself. Indeed, if you were looking for an example of high corruption liable to promote anti-Semitic feeling, then the Panama Scandal of 1892–93—in which three Jewish financiers bribed several cabinet ministers, 150 deputies, and virtually every major newspaper—ought to have been far more significant. But there is often little “ought” in history. 

Political memory runs long in France. In 1965, the novelist François Mauriac, then 80, wrote: “I was a child at the time of the Dreyfus Affair, but it has filled my life.” That same year I was teaching in France, and discovering French (and Francophone) singer-songwriters. My favorite was Jacques Brel, who 12 years later—and 63 years after the event—was to record his lyrical lament “Jaurès,” with the refrain, “Pourquoi ont-ils tué Jaurès?” 

And then there was a quirky, minor, amusing business thousands of miles away, which nicely demonstrates the historical law of unintended consequence. In 1896, during the Scramble for Africa, an expeditionary force of eight French and 120 Senegalese soldiers crossed the continent from west to east: their target was a ruined fort on the Sudanese Upper Nile. Frenchly, they set off with 1,300 litres of claret, fifty bottles of Pernod, and a mechanical piano.

The journey took them two years; they arrived in July 1898, two months after Zola published J’Accuse. They raised the tricolore at the ruined fort of Fashoda, and seemed to have no more geopolitical purpose than to annoy the British. This they did, just a little, until Kitchener, then in charge of the Egyptian Army (and, contrary to his reputation, a Francophile who spoke fluent French), turned up and advised them to hop it. 

He also gave them copies of recent French newspapers, in which they read of the Dreyfus Affair and wept. The two sides fraternized, and the British band played “La Marseillaise” as the French withdrew. No one was hurt or abused, let alone killed. 

How could this not have been a trifling comic sideshow amid the broader imperial rivalry? The British have long forgotten Fashoda (but then it was they who forced the tiny withdrawal). In French eyes, however, it was a key moment of national humiliation and dishonor, one that made a profound impact on a certain eight-year-old French boy, who in later years remembered it as a “childhood tragedy.” 

How was Kitchener to know, as he was drinking warm champagne with eight Frenchmen at that distant fort, and noting how its brief occupants had even planted a garden—“Flowers at Fashoda! Oh these Frenchmen!”—that these events would play out, decades later, in Charles de Gaulle’s obstreperous and infuriating (translate into French as “determined and patriotic”) behavior during his London wartime exile, then later in his stubbornly vindictive (“principled and statesmanlike”) triple refusal to allow Britain to join (“disrupt”) the European Common Market?


Tuesday, February 18, 2020

What We Become: Jefferson Airplane, “White Rabbit”


Stuck, Ch. 15. What We Become: Jefferson Airplane, “White Rabbit”

by Akim Reinhardt 3 Quarks Daily


Stuck is a weekly serial appearing at 3QD every Monday through early April. 


Charles Lutwidge Dodgson was an odd fellow who eventually became someone else.

Born in 1832, he was the fourth of twelve children, and descended from a long line of English soldiers and priests all named Charles Dodgson. His parents were first cousins. He stuttered. A childhood fever left him deaf in one ear. As an adult he would suffer from migraines and epilepsy.

At age 12 he was sent away to school. He hated it. Still, he aced his classes and went on to Christ Church College in Oxford. He did not always apply himself, but nonetheless excelled at mathematics and eventually earned a teaching position. He remained at the school for the rest of his life.

Dodgson was conservative, stuffy, and shy. He was awed by aristocrats and sometimes snobbish to his social inferiors. He was mildly self-deprecating and earnestly religious. He had a reputation for being a very good charades player. He invented a number of gadgets, including a stamp collecting folder, a note taking tablet, a new type of money order, and a steering device for tricycles. He also created an early version of Scrabble. He liked little girls.

Dodgson enjoyed photographing and drawing nude children. He never married or had any children of his own. Whether his affection for pre-pubescent girls was sexual, or merely tied to Victorian notions of children representing innocence, is still debated. In the prime of his adulthood, one girl in particular caught his fancy: eleven-year-old Alice Liddell.

Dodgson spent much time with the Liddell family. A favorite activity was taking Alice and her two siblings out on a rowboat, where he would tell them stories. Alice so enjoyed the stories that she begged Charles to write them down. He presented her with a handwritten, illustrated collection in 1864. He called it Alice’s Adventures Underground.

Next, he brought the manuscript to a publisher. Several new titles were considered, including Alice among the Fairies and Alice’s Golden Hour. In 1865 it was published as Alice’s Adventures in Wonderland. It was an immediate success. Charles Lutwidge Dodgson, the odd man fascinated by numbers and other people’s children, had become celebrated author Lewis Carroll.

The book and its 1871 sequel, Through the Looking Glass, are at once simple tales of childhood fancy and stunningly complex pieces of literature. The stories are at turns straightforward and absurd. For example, Carroll occasionally employs what scholars call literary nonsense. A famous example is the poem "Jabberwocky," from Through the Looking Glass; Carroll's later mock-epic "The Hunting of the Snark" works the same vein. Such verse is largely gibberish and full of make-believe words. Yet the books are so rich in word play, Victorian cultural references, parodies of other poems, mathematical concepts, and contemporary satire, that by 1960 The Annotated Alice was published, replete with hundreds of footnotes explaining this deceptively complex children's story, because few adult readers could be expected to really "get it" without substantial help.

So simple and so complicated, Carroll’s two books about a small girl who went down the rabbit hole could be almost anything one wanted them to be. And so they became many things.

First the books began to change. In 1890, Carroll himself produced a shorter version for small children. Five years later, author Anna Richards imagined a new girl, also named Alice, who travels to Wonderland and has her own adventures. In 1897, a string of parody editions began appearing.

A stage production was developed as early as 1886. It featured musical pantomime. In 1903, Alice in Wonderland became a pioneering silent movie, with color tinting of the film and innovative special effects packed into its 12 minutes. After several more silent adaptations, the first talkie Alice premiered in movie houses in 1931. Betty Boop starred in a 1934 cartoon version called Betty in Blunderland. The story eventually served as an animated vehicle for Mickey Mouse, Popeye, The Three Stooges, Abbott and Costello, Hello Kitty, and the Care Bears among others; Walt Disney's studio took its turn in 1951.

Alice in Wonderland first became a British TV show in 1937, many years before almost anyone had a television. The list of TV shows adapting the books in one way or another is exhaustive, ranging from the original Star Trek to Lost. Several Batman villains were drawn directly from Carroll's roster of characters, including the Mad Hatter and Tweedledum and Tweedledee. In 1969, Salvador Dalí made a dozen paintings based on the story. In 1976, the X-rated Alice in Wonderland: A Musical Porno appeared in theaters.

In the world of music, Alice became an orchestral piece (1914), a choral work (1942), a symphony (1969), two ballets (both in 2010) and several operas (1973, 1995, 2007). But none of those clung to me. Rather, it was a short song, written in 1965 by a woman who was in the process of reinventing herself, and who soon thereafter reinterpreted her own song with a new band that also kept changing and remaking itself.

Grace Barnett Wing was born outside Chicago in 1939 and grew up in Palo Alto, California. After a year of college in New York City and another in Miami, she dropped out, married an aspiring filmmaker, and settled in San Francisco. She spent three years working as a model in a department store, and now went by her married name: Grace Slick.

During this period she also began writing songs. In 1965, Grace Slick, her husband Jerry, her brother-in-law Darby, and another friend teamed up to form a musical group. They called themselves The Great Society, a mocking reference to President Lyndon Johnson’s domestic political program. By the end of the year they were a popular local act.

It was around this time that Grace Slick wrote a song about Alice called “White Rabbit.” The Great Society was at the forefront of San Francisco’s emerging psychedelic scene, and Slick’s lyrics rendered imagery from Lewis Carroll’s books into a cipher for hallucinogenic drug use.

One pill makes you larger and one pill makes you small
And the ones that mother gives you don’t do anything at all
Go ask Alice, when she's ten feet tall

Around Slick's lyrics and chords, The Great Society painted a musical tableau that reflected psychedelia's ethos of open-mindedness, drug experimentation, and sometimes naive overtures to proto-multiculturalism. The piece opens with an extended instrumental jam that conjures images of Middle Eastern snake charmers before Slick starts singing about "a hookah-smoking caterpillar" who has summoned you to take this trip.

Listen to a live Great Society recording from 1966 and you find that the lyrics and melody are already sorted out. But the singing is different. This is not yet the famous Grace Slick. This is the mid-20s, first-marriage, model-turned-singer who had yet to pursue music as a full-time career, and who is still learning how to use her voice. Her signature vibrato is there, but it is not fully formed. She's occasionally off key.

As The Great Society played club dates around town, they sometimes found themselves on the same bill as another band quickly bubbling up from San Francisco’s vibrant music scene. They were called Jefferson Airplane, and they too had a female singer.

Signe Toly Anderson was already an accomplished folk and jazz singer when she signed on as a founding member of Jefferson Airplane in 1965. She sang on their debut album, Jefferson Airplane Takes Off. However, when she became pregnant by her husband (Jerry Anderson, who was one of Ken Kesey’s Merry Pranksters), she quite reasonably decided that life on the road with a drug addled rock band was no way to raise a kid. So she quit.

The band needed a replacement. They wanted a woman with a deep, powerful voice like Anderson. The lead singer of The Great Society now became the lead singer of Jefferson Airplane. And she brought her songs with her.

Grace Slick's new band worked over "White Rabbit" before recording it for their second album, Surrealistic Pillow. They slowed down the tempo, giving it a more meditative feeling. The musical introduction was shortened substantially and re-arranged to recall Maurice Ravel's "Bolero," which had originally inspired Slick while writing the song. The whole thing was now barely two and a half minutes, its centerpiece the rising crescendo of Grace Slick's vocals, which culminated in a soaring finale. And this time she didn't miss any notes.

Charles Lutwidge Dodgson had become Lewis Carroll. Alice Liddell had become the Alice of Carroll’s fantasy world. Stories told on a rowboat had become world famous books. Alice in Wonderland had become many things, including parodies, plays, paintings and pornography. And in the hands of Grace Wing, who herself had become Grace Slick, it became an early hippie jam for The Great Society, the breakthrough single for Jefferson Airplane, and an anthem of the 1960s counterculture.
*
CODA (noun): 1. a concluding musical passage typically forming an addition to the basic structure; 2. a concluding event, remark, or section.  A coda on the circuitous and comical, tangled and tragic evolution of Jefferson Airplane can be found here.


Thursday, February 13, 2020

The Horror of Unwanted Oral Sex



The Specific Horror of Unwanted Oral Sex

Harvey Weinstein is accused of doing it to Annabella Sciorra and Mimi Haleyi. Years ago, a man did it to me.


By Lisa Taddeo
The NY Times


After my mother died I drugged myself to sleep every night. I couldn’t exist without Ambien. When I recall the great loves of my life, Ambien is in the top three. 

Almost every night I walked from Harlem down to Wall Street trying to tire myself out, like a dog or a kid. A few blocks from home I’d swallow the pill dry and walk the rest of the way feeling that wavy head-high I’d come to associate with imminent peace. 

There was a man during those times who listened, devotedly, as I described my unlivable grief, the way I hadn’t tossed away my mother’s deodorant and the way I’d preserved the last blanket that had covered my father — a thin yellow microfleece — on a shelf in my bedroom. I never used it or let anyone else use it, lest my father’s essence be diluted. 

One summer night that man walked me home from dinner. I was terrifically lonely and he was a tremendous listener. I removed an Ambien pill from the little silver box my mother had used to house the pills she took after the death of my father. 

I swallowed it dry and this man laughed because it was a silly, sad little routine. He told me how unsafe it was to be impaired on the streets of Manhattan. How I might be raped. 

There was trance music in the elevator of my building. He made a joke about it — how it sounded like the New Jersey shore of my past, that I might not be able to refrain from clubbing. 

At my bedroom door I told him I was fine. I told him to go home. He walked out and I got into bed with my clothes on. I woke sometime later to find the man’s head between my legs. He was so restrained in the act that in my fog it seemed faraway and lulling, like something happening to somebody else. And then I woke fully. I asked him what was going on and he stood up quickly, looking a bit guilty. But he didn’t apologize. After all, it was merely my drowsiness that stood between him and his capacity to provide me with that ultimate pleasure often associated with cunnilingus. 

In recent weeks, during the Harvey Weinstein trial, I've been listening to the reactions around forced oral sex. Annabella Sciorra and Mimi Haleyi both recall instances of Mr. Weinstein pinning them down and forcing oral sex upon them.

There’s a different response to this type of assault. 

As a recent article in The Daily Beast relates, until fairly recently the legal notion of forced cunnilingus was foggy at best. In the state of Georgia it still counts as rape only if the male organ penetrates the female organ. Plenty of people feel the same way. Oral sex does not actually qualify as "sex," as President Bill Clinton asserted in 1998. It carries a reduced risk of sexually transmitted infections compared to penetrative rape. And, of course, there's the despicable suggestion that it can be a sort of pampering. An indulgent visit to a spa.

Mr. Weinstein allegedly said to Ms. Sciorra, “This is for you.” 

People, naturally, speak differently in person than they do in the neat sheet music of social media. I’ve been listening to both men and women negotiate the claim of forced cunnilingus. 

She could have just used her feet to shove him off, come on. 

It’s not, like, the worst thing in the world to have forced upon you. 

I argued that, for me and for others, the worst part is not blaming the perpetrator: It’s blaming yourself. But that’s a difficult feeling to parse. Off the internet — and as usual — I found that the people who were talking weren’t listening. 

In her vivid essay, "The Trouble with Following the Rules," Mary Gaitskill describes three assaults she suffered. One was a violent rape, inflicted by a stranger. Another took place when she was 16: alone and on acid with a new acquaintance, she allowed herself to be drawn into sex "because I could not face the idea that if I said no, things might get ugly." She writes that she felt more violated by the latter — the date rape — because she felt complicit.

She also describes a third assault, an experience during which her initial feelings of attraction quickly turned to alarm after the man became aggressive. This one, which took place when she was in her thirties, involved a younger man and longtime friend. She explains that she’d initially ended the essay without stating the fact that she’d gone on to date the man for months. She knew that would render the violation moot in the eyes of many. 

In my bedroom that evening I felt the polar opposite of pleasure. I was sickened by the way it felt, by the self-impressed tongue, by the forced closeness of an act that many women feel is more intimate than intercourse.

“I’m sleeping,” I said. “Please go home.” 

I wanted to stay awake to make sure that he left. But just as I wanted to drug myself for a few hours of respite from my grief, so too did I want to leave my body so that I might wake in the morning and find the assault had paled in intensity. 

Here’s one sick part: I don’t feel good using the word “assault.” Part of the reason is my feeling of complicity. Part is my humiliation. And finally, there’s the thought that someone reading this will think that it’s not “as big of a deal” as intercourse. That I am being overly dramatic. That the poor guy was just trying to make a sad girl feel better. 

But that, in fact, is the worst part. The blur. 

For some women, the way it feels for someone to force themselves on you in a nearly emotional way carries with it a certain diabolical confusion. 

I didn’t kick the man in the head. I didn’t scream. Deplorably, I felt that if I kicked him, I’d not only be considered unreasonable but even unhinged. I remember, with an indescribable nausea, that I didn’t want the man to feel he was not “doing a good job.” Even just writing those words makes me feel powerless, existentially subjugated. Self-hating. 

After the two rapes Ms. Haleyi allegedly suffered at the hands of Mr. Weinstein, she went on to write to him, to ask him for work and advice. If you haven’t been in a similar situation, it might be nearly impossible to hear that testimony and not feel a shred of suspicion. Why, if she was so despicably assaulted, would she continue to speak to the man? Even if she were afraid to contact the authorities, why would she not keep away from the monster? 

I kept in touch with that man for a year or so after that evening. There were a number of reasons, not least of which was the emotional support he provided. On top of that, he professed to love me. He didn’t need anything from me except my willingness to let him be there for me. I was missing a father and he subtly played that role. I made it clear — with my words and my actions — that I wasn’t interested in anything more and he made it clear that he didn’t need anything more. 

The confusion, the imbalance of that relationship, culminated in that night. That I did not kick him, that I did not scream when I found his head between my legs and my underwear pulled down around my ankles. I merely asked what he was doing. That moment was the ultimate symbol of my complicity. That is why, even though I was passed out, I feel tremulous using the word “assault.” 

She could have just used her feet to shove him off, come on. 

I wanted to stay awake that night, but I fell asleep. In the months that followed, the event did pale in intensity. I had larger wounds to heal: Another person I loved died. I didn’t see the man as a monster, exactly, but I was repulsed by his face, by his greedy eyes, by the way he still looked at me like I was that prostrate girl in her lonely, white bed, whom he was drawing up from the bowels of hell. 

I woke the next morning to find that he had covered me in my father’s yellow blanket, as though he were giving me something else that I didn’t realize I needed. 

Lisa Taddeo (@lisadtaddeo) is a columnist for The Sunday Times in London and the author of “Three Women.”


Sunday, February 09, 2020

The Age of Decadence


The Age of Decadence

By Ross Douthat NY TIMES

Cut the drama. The real story of the West in the 21st century is one of stalemate and stagnation.



Ross Douthat is an Opinion columnist and the author of the forthcoming book "The Decadent Society," from which this essay is adapted.
Feb. 7, 2020

Everyone knows that we live in a time of constant acceleration, of vertiginous change, of transformation or looming disaster everywhere you look. Partisans are girding for civil war, robots are coming for our jobs, and the news feels like a multicar pileup every time you fire up Twitter. Our pessimists see crises everywhere; our optimists insist that we’re just anxious because the world is changing faster than our primitive ape-brains can process.

But what if the feeling of acceleration is an illusion, conjured by our expectations of perpetual progress and exaggerated by the distorting filter of the internet? What if we — or at least we in the developed world, in America and Europe and the Pacific Rim — really inhabit an era in which repetition is more the norm than invention; in which stalemate rather than revolution stamps our politics; in which sclerosis afflicts public institutions and private life alike; in which new developments in science, new exploratory projects, consistently underdeliver? What if the meltdown at the Iowa caucuses, an antique system undone by pseudo-innovation and incompetence, was much more emblematic of our age than any great catastrophe or breakthrough?

The truth of the first decades of the 21st century, a truth that helped give us the Trump presidency but will still be an important truth when he is gone, is that we probably aren’t entering a 1930-style crisis for Western liberalism or hurtling forward toward transhumanism or extinction. Instead, we are aging, comfortable and stuck, cut off from the past and no longer optimistic about the future, spurning both memory and ambition while we await some saving innovation or revelation, growing old unhappily together in the light of tiny screens.

The farther you get from that iPhone glow, the clearer it becomes: Our civilization has entered into decadence.

The word “decadence” is used promiscuously but rarely precisely. In political debates, it’s associated with a lack of resolution in the face of threats — with Neville Chamberlain and W.B. Yeats’s line about the best lacking all conviction. In the popular imagination, it’s associated with sex and gluttony, with pornographic romances and chocolate strawberries. Aesthetically and intellectually it hints at exhaustion, finality — “the feeling, at once oppressive and exalting, of being the last in a series,” in the words of the Russian poet Vyacheslav Ivanov.

But it’s possible to distill a useful definition from all these associations. Following in the footsteps of the great cultural critic Jacques Barzun, we can say that decadence refers to economic stagnation, institutional decay and cultural and intellectual exhaustion at a high level of material prosperity and technological development. Under decadence, Barzun wrote, “The forms of art as of life seem exhausted, the stages of development have been run through. Institutions function painfully. Repetition and frustration are the intolerable result.” He added, “When people accept futility and the absurd as normal, the culture is decadent.” And crucially, the stagnation is often a consequence of previous development: The decadent society is, by definition, a victim of its own success.

Note that this definition does not imply a definitive moral or aesthetic judgment. (“The term is not a slur,” Barzun wrote. “It is a technical label.”) A society that generates a lot of bad movies need not be decadent; a society that makes the same movies over and over again might be. A society run by the cruel and arrogant might not be decadent; a society where even the wise and good can’t legislate might be. A crime-ridden society isn’t necessarily decadent; a peaceable, aging, childless society beset by flares of nihilistic violence looks closer to our definition.

Nor does this definition imply that decadence is necessarily an overture to a catastrophe, in which Visigoths torch Manhattan or the coronavirus has dominion over all. History isn’t always a morality play, and decadence is a comfortable disease: The Chinese and Ottoman empires persisted for centuries under decadent conditions, and it was more than 400 years from Caligula to the actual fall of Rome.

“What fascinates and terrifies us about the Roman Empire is not that it finally went smash,” wrote W.H. Auden of that endless autumn, but rather that “it managed to last for four centuries without creativity, warmth, or hope.”


Whether we are waiting for Christians or barbarians, a renaissance or the Singularity, the dilemma that Auden described is now not Rome’s but ours.

II.


“Do people on your coast think all this is real?”

The tech executive sounded curious, proud, a little insecure. We were talking in the San Francisco office of a venture capital firm, a vaulted space washed in Californian sun. He was referring to the whole gilded world around the Bay, the entire internet economy.

That was in 2015. Here are three stories from the five years since.

A young man comes to New York City. He’s a striver, a hustler, working the borderlands between entrepreneurship and con artistry. His first effort, a credit card for affluent millennials, yanks him into the celebrity economy, where he meets an ambitious rapper-businessman. Together they plan a kind of internet brokerage where celebrities can sell their mere presence to the highest bidder. As a brand-enhancing advertisement for the company, they decide to host a major music festival — an exclusive affair on a Caribbean island for influencers, festival obsessives and the youthful rich.

The festival’s online rollout is a great success. There is a viral video of supermodels and Instagram celebrities frolicking on a deserted beach, a sleek website for customers and the curious, and in the end, more than 5,000 people buy tickets, at an average cost of $2,500 to $4,000 — the superfluity of a rich society, yours for the right sales pitch.

But the festival as pitched does not exist. Instead, our entrepreneur’s plans collapse one by one. The private island’s owners back out of the deal. The local government doesn’t cooperate. Even after all the ticket sales, the money isn’t there, and he has to keep selling new amenities to ticket buyers to pay for the ones they’ve already purchased. He does have a team working around the clock to ready … something for the paying customers, but what they offer in the end is a sea of FEMA tents vaguely near a beach, a catering concern that supplies slimy sandwiches, and a lot of cheap tequila.


Amazingly, the people actually come — bright young things whose Instagram streams become a hilarious chronicle of dashed expectations, while the failed entrepreneur tries to keep order with a bullhorn before absconding to New York, where he finds disgrace, arrest and the inevitable Netflix documentary.


That’s the story of Billy McFarland and the Fyre Festival. It’s a small-time story; the next one is bigger.

A girl grows up in Texas, she gets accepted to Stanford, she wants to be Steve Jobs. She has an idea that will change an industry that hasn’t changed in years: the boring but essential world of blood testing. She envisions a machine, dubbed the Edison, that will test for diseases using just a single drop of blood. And like Jobs she quits college to figure out how to build it.

Ten years later, she is the internet era’s leading female billionaire, with a stream of venture capital, a sprawling campus, a $10 billion valuation for her company, and a lucrative deal with Walgreens to use her machines in every store. Her story is a counterpoint to every criticism you hear about Silicon Valley — that it’s a callow boys’ club, that its virtual realities don’t make the world of flesh and blood a better place, that it solves problems of convenience but doesn’t cure the sick. And she is the toast of an elite, in tech and politics alike, that wants to believe the Edisonian spirit lives on.

But the Edison box — despite endless effort and the best tech team that all that venture capital can buy — doesn’t work. And over time, as the company keeps expanding, it ceases even trying to innovate and becomes instead a fraud, using all its money and big-time backers to discredit whistle-blowers. Which succeeds until it doesn’t, at which point the company and all its billions evaporate — leaving behind a fraud prosecution, a best-selling exposé and the inevitable podcast and HBO documentary to sustain its founder’s fame.

That’s the story of Elizabeth Holmes and Theranos. It’s a big story. But our third story is bigger still, and it isn’t finished yet.

An internet company decides to revolutionize an industry — the taxi and limousine market — that defines old-school business-government cooperation, with all the attendant bureaucracy and unsatisfying service. It promises investors that it can buy its way to market dominance and use cutting-edge tech to find unglimpsed efficiencies. On the basis of that promise, it raises billions of dollars across its 10-year rise, during which time it becomes a byword for internet-era success, the model for how to disrupt an industry. By the time it goes public in 2019, it has over $11 billion in annual revenue — real money, exchanged for real services, nothing fraudulent about it.

Yet this amazing success story isn’t actually making any profit, even at such scale; instead, it’s losing billions, including $5 billion in one particularly costly quarter. After 10 years of growth, it has smashed the old business model of its industry, weakened legacy competitors and created value for consumers — but it has done all this using the awesome power of free money, building a company that would collapse into bankruptcy if that money were withdrawn. And it has solved none of the problems keeping it from profitability: The technology it uses isn’t proprietary or complex; its rival in disruption controls 30 percent of the market; the legacy players are still very much alive; and all of its paths to reduce its losses — charging higher prices, paying its workers less — would destroy the advantages that it has built.

So it sits there, a unicorn unlike any other, with a plan to become profitable that involves vague promises to somehow monetize all its user data and a specific promise that its investment in a different new technology — the self-driving car, much ballyhooed but as yet not exactly real — will make the math add up.

That’s the story of Uber — so far. It isn’t an Instagram fantasy or a naked fraud; it managed to go public and maintain its outsize valuation, unlike its fellow unicorn WeWork, whose recent attempt at an I.P.O. hurled it into crisis. But it is, for now, an example of a major 21st-century company invented entirely out of surplus, and floated by the hope that with enough money and market share, you can will a profitable company into existence. Which makes it another case study in what happens when an extraordinarily rich society can’t find enough new ideas that justify investing all its stockpiled wealth. We inflate bubbles and then pop them, invest in Theranos and then repent, and the supposed cutting edge of capitalism is increasingly defined by technologies that have almost arrived, business models that are on their way to profitability, by runways that go on and on without the plane achieving takeoff.

Do people on your coast think all this is real? When the tech executive asked me that, I told him that we did — that the promise of Silicon Valley was as much an article of faith for those of us watching from the outside as for its insiders; that we both envied the world of digital and believed in it, as the one place where American innovation was clearly still alive. And I would probably say the same thing now because, despite the stories I’ve just told, the internet economy is still as real as 21st-century growth and innovation gets.

But what this tells us, unfortunately, is that 21st-century growth and innovation are not at all what we were promised they would be.

III.

The decadent economy is not an impoverished one. The United States is an extraordinarily wealthy country, its middle class prosperous beyond the dreams of centuries past, its welfare state effective at easing the pain of recessions, and the last decade of growth has (slowly) raised our living standard to a new high after the losses from the Great Recession.

But slowly compounding growth is not the same as dynamism. American entrepreneurship has been declining since the 1970s: Early in the Jimmy Carter presidency, 17 percent of all United States businesses had been founded in the previous year; by the start of Barack Obama’s second term, that rate was about 10 percent. In the late 1980s, almost half of United States companies were “young,” meaning less than five years old; by the Great Recession, that share was down to only 39 percent, and the share of “old” firms (founded more than 15 years ago) rose from 22 percent to 34 percent over a similar period. And those companies increasingly sit on cash or pass it back to shareholders rather than investing in new enterprises. From World War II through the 1980s, according to a recent report from Senator Marco Rubio’s office, private domestic investment often approached 10 percent of G.D.P.; in 2019, despite a corporate tax cut intended to get money off the sidelines, the investment-to-G.D.P. ratio was less than half of that.



[Illustration by Sammy Harkham]


This suggests that the people with the most experience starting businesses look around at their investment opportunities and see many more start-ups that resemble Theranos than resemble Amazon, let alone the behemoths of the old economy. And the dearth of corporate investment also means that the steady climb of the stock market has boosted the wealth of a rentier class — basically, already-rich investors getting richer off dividends — rather than reflecting surging prosperity in general.

Behind this deceleration lurks the specter of technological stagnation. Andrew Yang’s presidential campaign notwithstanding, leaping advances in robotics aren’t about to throw everybody out of work. Productivity growth, the best measure of technology’s effect on the economy, has been weak in the United States and weaker in Europe ever since the first dot-com bust.

In 2017 a group of economists published a paper asking, “Are Ideas Getting Harder to Find?” The answer was a clear yes: “We present a wide range of evidence from various industries, products, and firms showing that research effort is rising substantially while research productivity is declining sharply.” In his 2011 book “The Great Stagnation,” Tyler Cowen cited an analysis from the Pentagon physicist Jonathan Huebner, who modeled an innovations-to-population ratio for the last 600 years: It shows a slowly ascending arc through the late 19th century, when major inventions were rather easy to conceive and adopt, and a steepening decline ever since, as rich countries spend more and more on research to diminishing returns.

These trends don’t mean progress has ceased. Fewer blockbuster drugs are being approved, but last month still brought news of a steady generational fall in cancer deaths, and a possible breakthrough in cystic fibrosis treatment. Scientific research has a replication crisis, but it’s still easy to discern areas of clear advancement — from the frontiers of Crispr to the study of ancient DNA.

But the trends reveal a slowdown, a mounting difficulty in achieving breakthroughs — a bottleneck if you’re optimistic, a ceiling if you aren’t. And the relative exception, the internet and all its wonders, highlights the general pattern.

The Northwestern University economist Robert Gordon, one of the most persuasive theorists of stagnation, points out that the period from 1840 to 1970 featured dramatic growth and innovation across multiple arenas — energy and transportation and medicine and agriculture and communication and the built environment. Whereas in the last two generations, progress has become increasingly monodimensional — all tech and nothing else. Even within the Silicon Valley landscape, the clear success stories are often the purest computer-and-internet enterprises — social media companies, device manufacturers, software companies — while the frauds and failures and possible catastrophes involve efforts to use tech to transform some other industry, from music festivals to office-space rentals to blood tests.

The Silicon Valley tycoon Peter Thiel, another prominent stagnationist, likes to snark that “we wanted flying cars, instead we got 140 characters.” And even the people who will explain to you, in high seriousness, that nobody would really want a flying car can’t get around the basic points that Thiel, Gordon, and others have been making. Take a single one of the great breakthroughs of the industrial age — planes and trains and automobiles, antibiotics and indoor plumbing — and it still looms larger in our everyday existence than all of the contributions of the tech revolution combined.

We used to travel faster, build bigger, live longer; now we communicate faster, chatter more, snap more selfies. We used to go to the moon; now we make movies about space — amazing movies with completely convincing special effects that make it seem as if we’ve left earth behind. And we hype the revolutionary character of our communications devices in order to convince ourselves that our earlier expectations were just fantasies, “Jetsons stuff” — that this progress is the only progress we could reasonably expect.
    

IV.

With this stagnation comes social torpor. America is a more peaceable country than it was in 1970 or 1990, with lower crime rates and safer streets and better-behaved kids. But it’s also a country where that supposedly most American of qualities, wanderlust, has markedly declined: Americans no longer “go west” (or east or north or south) in search of opportunity the way they did 50 years ago; the rate at which people move between states has fallen from 3.5 percent in the early 1970s to 1.4 percent in 2010. Nor do Americans change jobs as often as they once did. For all the boosterish talk about retraining and self-employment, all the fears of a precarious job market, Americans are less likely to switch employers than they were a generation ago.

Meanwhile, those well-behaved young people are more depressed than prior cohorts, less likely to drive drunk or get pregnant but more tempted toward self-harm. They are also the most medicated generation in history, from the drugs prescribed for A.D.H.D. to the antidepressants offered to anxious teens, and most of the medications are designed to be calming, offering a smoothed-out experience rather than a spiky high. For adults, the increasingly legal drug of choice is marijuana, whose prototypical user is a relaxed and harmless figure — comfortably numb, experiencing stagnation as a chill good time.





And then there is the opioid epidemic, whose spread across the unhappiest parts of white America passed almost unnoticed in elite circles for a while because the drug itself quiets rather than inflames, supplying a gentle euphoria that lets its users simply slip away, day by day and bit by bit, without causing anyone any trouble. The best book on the epidemic, by the journalist Sam Quinones, is called “Dreamland” for a reason.

In the land of the lotus eaters, people are also less likely to invest in the future in the most literal of ways. The United States birthrate was once an outlier among developed countries, but since the Great Recession, it has descended rapidly, converging with the wealthy world’s general below-replacement norm. This demographic decline worsens economic stagnation; economists reckoning with its impact keep finding stark effects. A 2016 analysis found that a 10 percent increase in the fraction of the population over 60 decreased the growth rate of states’ per capita G.D.P. by 5.5 percent. A 2018 paper found that companies in younger labor markets are more innovative; another found that the aging of society helped explain the growth of monopolies and the declining rate of start-ups.

This feedback loop — in which sterility feeds stagnation, which further discourages childbearing, which sinks society ever-deeper into old age — makes demographic decline a clear example of how decadence overtakes a civilization. For much of Western history, declining birthrates reflected straightforward gains to human welfare: victories over infant mortality, over backbreaking agrarian economies, over confining expectations for young women. But once we crossed over into permanent below-replacement territory, the birth dearth began undercutting the very forces (youth, risk-taking, dynamism) necessary for continued growth, meaning that any further gains to individual welfare are coming at the future's expense.

V.

Now the reader will probably have an obvious objection to this portrait of senescence and stagnation: What about politics? Would a decadent society really reproduce the 1969 Days of Rage on social media, with online mobs swarming and the old extremes back in action? Would it produce a populist surge and a socialist revival, a domestic civil war so polarizing that Americans could mistake the work of Russian hackers for the sincere convictions of their fellow citizens? Would it elect Donald Trump as president?

Strangely, the answer might be “yes.” Both populism and socialism, Trump and Bernie Sanders, represent expressions of discontent with decadence, rebellions against the technocratic management of stagnation that defined the Obama era. “Make America Great Again” is the slogan of a reactionary futurism, a howl against a future that wasn’t what was promised, and the Sanders Revolution promises that what the left lost somewhere in the Reagan era can be regained, and the climb to utopia begun anew.

But the desire for a different future only goes so far, and in practical terms the populist era has mostly delivered a new and deeper stalemate. From Trump’s Washington to the capitals of Europe, Western politics is now polarized between anti-establishment forces that are unprepared to competently govern and an establishment that’s too disliked to effectively rule.

The structures of the Western system, the United States Constitution and administrative state, the half-built federalism of the European Union, are everywhere creaking and everywhere critiqued. But our stalemates make them impervious to substantial reform, let alone to revolution. The most strident European nationalists don’t even want to leave the European Union, and Trump’s first term has actually been much like Obama’s second, with failed legislation and contested executive orders, and policy made mostly by negotiation between the bureaucracy and the courts.

There is a virtual Trump presidency whose depredations terrify liberals, one that airs on Fox in which Trump goes from strength to strength. But the real thing is closer to the genre the president knows best, reality television, than to the actual return of history. (Trump’s recent State of the Union, with its theatrics and premature declaration of victory over decadence, was a particularly striking case in point.)

Likewise in the wider political culture. The madness of online crowds, the way the internet has allowed the return of certain forms of political extremism and the proliferation of conspiracy theories — yes, if our decadence is to end in the return of grand ideological combat and street-brawl politics, this might be how that ending starts.

But our battles mostly still reflect what Barzun called "the deadlocks of our time" — the Kavanaugh affair replaying the Clarence Thomas hearings, the debates over political correctness cycling us backward to fights that were fresh and new in the 1970s and '80s. The hysteria with which we're experiencing them may represent nothing more than the way that a decadent society manages its political passions: by encouraging people to playact extremism, to re-enact the 1930s or 1968 on social media, to approach radical politics as a sport, a hobby, a kick to the body chemistry that doesn't put anything in their relatively comfortable late-modern lives at risk.

Close Twitter, log off Facebook, turn off cable television, and what do you see in the Trump-era United States? Campuses in tumult? No: The small wave of campus protests, most of them focused around parochial controversies, crested before Trump’s election and have diminished since. Urban riots? No: The post-Ferguson flare of urban protest has died down. A wave of political violence? A small spike, maybe, but one that’s more analogous to school shootings than to the political clashes of the 1930s or ’60s, in the sense that it involves disturbed people appointing themselves knights-errant and going forth to slaughter, rather than organized movements with any kind of concrete goal.

Internet-era political derangement is partially responsible for white supremacists goading one another into shooting sprees, or the Sanders supporter who tried to massacre Republicans at a congressional baseball game in 2017. But these episodes are terrible and also exceptional; they have not yet established a pattern that looks anything like the early 1970s, when there were more than 2,500 bombings across the continental United States in one 18-month period.

Maybe today’s outliers are the forerunners of something worse. But our terrorists don’t feel like prophets or precursors; they often feel more like marks.

The terrorist in 21st-century America isn’t the guy who sees more deeply than the rest; he’s the guy who doesn’t get it, who takes the stuff he reads on the internet literally in a way that most of the people posting don’t, who confuses virtual entertainment with reality. The left-winger who tries to assassinate Republicans isn’t just a little deeper into the Resistance mind-set than the average activist; he’s the guy who totally misunderstands the Resistance, who listens to all the online talk about treason and Fascism and thinks that he’s really in 1940s France. The guy who parks his truck on the Hoover Dam and demands that certain imaginary indictments be unsealed isn’t just a little more action oriented than the typical QAnon conspiracy theorist; he fundamentally misunderstands those labyrinthine theories, taking them as literal claims about the world rather than as what they are for their creators (a sport, a grift, a hobby) and for most of their participants (an odd form of virtual community).

This doesn’t excuse the grifting or the rage stoking, especially presidential grifting and rage stoking, and it doesn’t make the mass shootings, when they come, any less horrific. But it’s important context for thinking about whether online politics is really carrying our society downward into civil strife. It suggests that the virtual realm might make our battles more ferocious but also more performative and empty; and that online rage is a steam-venting technology for a society that is misgoverned, stagnant and yet, ultimately, far more stable than it looks on Twitter.

If you want to feel as if Western society is convulsing, there’s an app for that, a convincing simulation waiting. But in the real world, it’s possible that Western society is leaning back in an easy chair, hooked up to a drip of something soothing, playing and replaying an ideological greatest-hits tape from its wild and crazy youth.

Which is, to be clear, hardly the worst fate imaginable. Complaining about decadence is a luxury good — a feature of societies where the mail is delivered, the crime rate is relatively low, and there is plenty of entertainment at your fingertips. Human beings can still live vigorously amid a general stagnation, be fruitful amid sterility, be creative amid repetition. And the decadent society, unlike the full dystopia, allows such signs of contradiction to exist, which means that it's always possible to imagine and work toward renewal and renaissance.





This is not always true when you gamble on a revolution: The last hundred-odd years of Western history offer plenty of examples of how the attempt to throw off decadence can bring in far worse evils, from the craving for Meaning and Action that piled corpses at Verdun and Passchendaele, to the nostalgic yearning for the Cold War that inspired post-9/11 crusading and led to a military quagmire in the Middle East.

So you can even build a case for decadence, not as a falling-off or disappointing end, but as a healthy balance between the misery of poverty and the dangers of growth for growth’s sake. A sustainable decadence, if you will, in which the crucial task for 21st-century humanity would be making the most of a prosperous stagnation: learning to temper our expectations and live within limits; making sure existing resources are distributed more justly; using education to lift people into the sunlit uplands of the creative class; and doing everything we can to help poorer countries transition successfully into our current position. Not because meliorism can cure every ill, but because the more revolutionary alternatives are too dangerous, and a simple greatest-good-for-the-greatest-number calculus requires that we just keep the existing system running and give up more ambitious dreams.


But this argument carries you only so far. Even if the dystopia never quite arrives, the longer a period of stagnation continues, the narrower the space for fecundity and piety, memory and invention, creativity and daring. The unresisted drift of decadence can lead into a territory of darkness, whose sleekness covers over a sickness unto death. And true dystopias are distinguished, in part, by the fact that many people inside them don’t realize that they’re living in one, because human beings are adaptable enough to take even absurd and inhuman premises for granted. If we feel that elements of our own system are, shall we say, dystopia-ish — from the reality-television star in the White House to the addictive surveillance devices always in our hands; from the drugs and suicides in our hinterlands to the sterility of our rich cities — then it’s possible that an outsider would look at our decadence and judge it more severely still.

So decadence must be critiqued and resisted — not by fantasies of ennobling world wars, not by Tyler Durden from “Fight Club” planning to blow every Ikea living room sky-high, but by the hope that where there’s stability, there also might eventually be renewal, that decadence need not give way to collapse to be escaped, that the renaissance can happen without the misery of an intervening dark age.

Today we are just 50 years removed from the peak of human accomplishment and daring that put human beings on the moon, and all the ferment that surrounded it. The next renaissance will be necessarily different, but realism about our own situation should make us more inclined, not less, to look and hope for one — for the day when our culture feels more fruitful, our politics less futile and the frontiers that seem closed today are opened once again.
