Monday, November 30, 2020
Review of “The Hype Machine”
Saturday, November 28, 2020
Indiana Asks the Supreme Court to Let It Strip Rights From Same-Sex Parents
The justices have shown interest in a case that could begin the rollback of marriage equality.
Friday, November 20, 2020
Hospitals Know What’s Coming
“We are on an absolutely catastrophic path,” said a COVID-19 doctor at America’s best-prepared hospital.
Ed Yong, The Atlantic
Perhaps no hospital in the United States was better prepared for a pandemic than the University of Nebraska Medical Center in Omaha.
After the SARS outbreak of 2003, its staff
began specifically preparing for emerging infections. The center has the
nation’s only federal quarantine facility and its largest biocontainment unit,
which cared for airlifted Ebola patients in 2014. The people on staff had
detailed pandemic plans. They ran drills. Ron Klain, who was President Barack
Obama’s “Ebola czar” and will be Joe Biden’s chief of staff in the White House,
once told me that UNMC is “arguably the best in the country” at handling
dangerous and unusual diseases. There’s a reason many of the Americans who
were airlifted from
the Diamond Princess cruise ship in February were sent to UNMC.
In the past two weeks, the hospital has had to
convert an entire building into a COVID-19 tower, from the top down. It now has
10 COVID-19 units, each taking up an entire hospital floor. Three of the units
provide intensive care to the very sickest people, several of whom die every
day. One unit solely provides “comfort care” to COVID-19 patients who are
certain to die. “We’ve never had to do anything like this,” Angela Hewlett, the
infectious-disease specialist who directs the hospital’s COVID-19 team, told
me. “We are on an absolutely catastrophic path.”
To hear such talk from someone at UNMC,
the best-prepared of America’s hospitals, should shake the entire nation. In
mid-March, when just 18 Nebraskans had tested positive for COVID-19, Shelly Schwedhelm,
the head of the hospital’s emergency-preparedness program, sounded gently
confident. Or, at least, she told me: “I’m confident in having a plan.” She
hoped the hospital wouldn’t hit capacity, “because people will have done the
right thing by staying home,” she said. And people did: For a while, the U.S.
flattened the curve.
But now about 2,400 Nebraskans are testing
positive for COVID-19 every day—a rate five times higher than in the spring.
More than 20 percent of tests are
coming back positive, and up to 70 percent in some rural counties—signs that
many infections aren’t being detected. The number of people who’ve been
hospitalized with the disease has tripled
in just six weeks. UNMC is fuller with COVID-19 patients—and patients, full
stop—than it has ever been. “We’re watching a system breaking in front of us
and we’re helpless to stop it,” says Kelly Cawcutt, an infectious-disease and
critical-care physician.
Cawcutt knows what’s coming. Throughout
the pandemic, hospitalizations have lagged behind cases by about 12 days. Over
the past 12 days, the total number of confirmed cases in Nebraska has risen
from 82,400 to 109,280. That rise represents a wave of patients that will slam
into already beleaguered hospitals between now and Thanksgiving. “I don’t see
how we avoid becoming overwhelmed,” says Dan Johnson, a critical-care doctor.
People need to know that “the assumption we will always have a hospital bed for
them is a false one.”
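The arithmetic behind that warning is simple enough to sketch. Below is a minimal Python illustration of the lag logic, using the case totals reported in the article; the hospitalization fraction is an invented placeholder, since the piece does not report one.

```python
# Illustrative sketch of the case-to-hospitalization lag described above.
# Case totals and the ~12-day lag come from the article; HOSP_FRACTION is
# a hypothetical placeholder, NOT a number reported by UNMC.

CASES_12_DAYS_AGO = 82_400
CASES_NOW = 109_280
LAG_DAYS = 12

# Assumed share of confirmed cases that end up hospitalized (illustrative).
HOSP_FRACTION = 0.05

new_cases = CASES_NOW - CASES_12_DAYS_AGO          # 26,880 new infections
expected_admissions = new_cases * HOSP_FRACTION    # the wave arriving now

print(f"New confirmed cases over the last {LAG_DAYS} days: {new_cases:,}")
print(f"At an assumed {HOSP_FRACTION:.0%} hospitalization rate, roughly "
      f"{expected_admissions:,.0f} admissions are already baked in.")
```

Whatever fraction one assumes, the point of the lag is the same: the admissions of the next two weeks are determined by infections that have already happened.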
What makes this “nightmare” worse, he
adds, “is that it was preventable.” The coronavirus is not unstoppable, as
some have suggested and as New Zealand, Iceland, Australia, and Hong
Kong have
resoundingly disproved—twice. Instead, the
Trump administration never mounted a serious effort to stop it.
Whether through gross incompetence or deliberate
strategy, the president and his advisers left the virus to run amok,
allowed Americans to get sick, and punted the consequences to the health-care
system. And they did so repeatedly, even after the ordeal of the spring, after
the playbook for controlling the virus became clear, and despite months of
warnings about a fall surge.
Not even the best-prepared hospital can
compensate for an unchecked pandemic. UNMC’s preparations didn’t fail so much
as the U.S. created a situation in which hospitals could not possibly succeed.
“We can prepare over and over for a wave of patients,” says Cawcutt, “but we
can’t prepare for a tsunami.”
A full hospital means that everyone waits.
COVID-19 patients who are going downhill must wait to enter a packed
intensive-care unit. Patients who cannot breathe must wait for the many minutes
it takes for a nurse elsewhere in the hospital to remove cumbersome protective
gear, run over, and don the gear again. On Tuesday, one rapidly deteriorating
patient needed to be intubated, but the
assembled doctors had to wait, because the anesthesiologists were all busy
intubating four other patients in an ICU and a few more in an emergency room.
None of the people I spoke with would
predict when UNMC will finally hit its capacity ceiling, partly because they’re
doing everything to avoid that scenario, and partly because it’s so grim as to
be almost unthinkable. But “we’re rapidly approaching that point,” Hewlett
said.
When it arrives, people with COVID-19 will
die not just because of the virus, but because the hospital will have nowhere
to put them and no one to help them. Doctors will have to decide who to put on
a ventilator or a dialysis machine. They’ll have to choose whether to abandon
entire groups of patients who can’t get help elsewhere. While cities like New
York and Boston have many big hospitals that can care for advanced strokes,
failing hearts that need mechanical support, and transplanted organs, “in this
region, we’re it,” Johnson says. “We provide care that can’t be provided at any
other hospital for a 200-mile radius. We’re going to need to decide if we
continue to offer that care, or if we admit every single COVID-19 patient who
comes through our door.”
During the spring, most of UNMC’s COVID-19
patients were either elderly people from nursing homes or workers in
meatpacking plants and factories. But with the third national surge, “all the
trends have gone out the window,” Sarah Swistak, a staff nurse, told me. “From
the 90-year-old with every comorbidity listed to the 30-year-old who is the
picture of perfect health, they’re all requiring oxygen because they’re so
short of breath.”
This lack of pattern is a pattern in
itself, and suggests that there’s no single explanation for the current surge.
Nebraska reopened too early, “when we didn’t have enough control, and in the
absence of a mask mandate,” Cawcutt says. Pandemic fatigue set in. Weddings
that were postponed from the spring took place in the fall. Customers packed
into indoor spaces, like bars and restaurants, where the virus most easily
finds new hosts. Colleges resumed in-person classes. UNMC is struggling not
because of any one super-spreading event, but because of the cumulative toll of
millions of bad decisions.
When the hospital first faced the pandemic
in the spring, “I was buoyed by the realization that everyone in America was
doing their part to slow down the spread,” Johnson says. “Now I know
friends of mine are going about their normal lives, having parties and dinners,
and playing sports indoors. It’s very difficult to do this work when we know so
many people are not doing their part.” The drive home from the packed hospital
takes him past rows of packed restaurants, sporting venues, and parking lots.
To a degree, Johnson sympathizes. “I don’t
think people in Omaha thought we could ever have something that resembles New
York,” he told me. “To be honest, in the spring, I would have thought it
extremely unlikely.” But he adds that the Midwest has taken entirely the wrong
lesson from the Northeast’s ordeal. Instead of learning that the pandemic is
controllable, and that physical distancing works, people instead internalized
“a mistaken belief that every curve that goes up must come down,” he said.
“What they don’t realize is that if we don’t change anything about how we’re
conducting ourselves, the curve can go up and up.”
Speaking on Tuesday afternoon, Nebraska
Governor Pete Ricketts once again refused
to issue a statewide mask mandate. He promised to tighten restrictions once
a quarter of the state’s beds are filled with COVID-19 patients, but
even then, some restaurants will still offer indoor dining; gyms and
churches will remain open; and groups of 10 people will still be able to gather
in enclosed spaces. Ricketts urged Nebraskans to avoid close contact, confined
areas, and crowds, but his policies nullify his pleas. “People have the
mistaken belief that if the government allows them to do something, it is safe
to do,” Johnson said.
There are signs that citizens and
businesses are acting ahead of policy makers. Some restaurants are ceasing
indoor dining even without a prohibition. Parents are pulling their children
out of schools and sports leagues. “I have heard from more friends and family
about COVID-19 in the last two weeks than I have in the previous six months,
expressing support and a change in attitudes,” Johnson said.
But COVID-19 works slowly. It takes
several days for infected people to show symptoms, a dozen more for newly
diagnosed cases to wend their way to hospitals, and even more for
the sickest of patients to die. These lags mean that the pandemic’s
near-term future is always set, baked in by the choices of the past. It means
that Ricketts is already too late to stop whatever UNMC will face in the coming
weeks (but not too late to spare the hospital further grief next month). It
means that some of the people who get infected over Thanksgiving will struggle
to enter packed hospitals by the middle of December, and be in the ground by
Christmas.
Officially, Nebraska has
4,223 hospital beds, of which 1,165—27 percent—are still available. But
that figure is deceptive. It includes beds for labor and deliveries, as well as
pediatric beds that cannot be repurposed. It also says nothing about how
stretched hospitals have already become in their efforts to create capacity.
UNMC has postponed elective
surgeries—those that can be deferred for four to 12 weeks. Patients with
strokes and other urgent traumas aren’t getting the normal level of attention,
because the pandemic is so all-consuming. Clinical research has stopped because
research nurses are now COVID-19 nurses. The hospital is forced to turn down
many requests to take in patients from rural hospitals and neighboring states
that are themselves almost out of beds.
Empty hospital beds might as well be hotel
beds without doctors and nurses to staff them. And though health-care workers
are resilient, “many of us feel like we haven’t had a day off since this thing
began,” Hewlett says. The current surge is pushing them to the limit because
people with COVID-19 are far sicker than the average patient. In an ICU, they
need twice as much attention for three times the usual stay. To care for them,
UNMC’s nurses and respiratory therapists are now doing mandatory overtime. The
hospital has tried to hire travel nurses, but with the entire country calling
for help, the pool of reinforcements is dry. “Even before COVID-19 hit, we were
short-staffed,” says Becky Long, a lead nurse on a COVID-19 ICU floor. Of late,
there have been days when the hospital had 45 to 60 fewer nurses than it
needed. “Every time I’ve been at work, I’ve thought: This is going to be
the final straw. But somehow we continue to make it work, and I truly have
no idea how.”
Before COVID-19, Long worked in oncology.
Death is no stranger to her, but she tells me she can barely comprehend the
amount she has seen in recent weeks. “I used to be able to leave work at work,
but with the pandemic, it follows me everywhere I go,” she said. “It’s all I
see when I come home, when I look at my kids.”
Long and other nurses have told many
families that they can’t see their dying loved ones, and then sat with those
patients so they didn’t have to die alone. Lindsay Ivener, a staff nurse, told
me that COVID-19 had recently killed an elderly woman whom she was caring for,
the woman’s husband, and one of her grandchildren. A second grandchild had just
been admitted to the hospital with COVID-19. “It just tore this whole family
apart in a month,” Ivener said. “I couldn’t even cry. I didn’t have the
energy.”
Until recently, Ivener worked in corporate
America as a retail buyer and inventory manager. Wanting to help people, she
retrained as a nurse and graduated this May. “I’ve only worked as a nurse
during a pandemic,” she told me. “It’s got to get better, right?”
Monday, November 16, 2020
Wikipedia, “Jeopardy!,” and the Fate of the Fact
In the Internet age, it can seem as if there’s no reason to remember anything. But information doesn’t always amount to knowledge.
By Louis Menand, The New Yorker
Is it still cool to memorize a lot of stuff? Is there even a reason to memorize anything? Having a lot of information in your head was maybe never cool in the sexy-cool sense, more in the geeky-cool or class-brainiac sense. But people respected the ability to rattle off the names of all the state capitals, or to recite the periodic table. It was like the ability to dunk, or to play the piano by ear—something the average person can’t do. It was a harmless show of superiority, and it gave people a kind of species pride.
There is still no artificial substitute for the
ability to dunk. It remains a valued and nontransferable aptitude. But today
who needs to know the capital of South Dakota or the atomic number of hafnium
(Pierre and 72)? Siri, or whatever chatbot you use, can get you that information
in nanoseconds. Remember when, back in the B.D.E. (Before the Digital Era),
you’d be sitting around with friends over a bottle of Puligny-Montrachet, and
the conversation would turn on the question of when Hegel published “The Phenomenology of Spirit”? Unless you had an
encyclopedia for grownups around the house, you’d either have to trek to your
local library, whose only copy of the “Phenomenology” was likely to be checked
out, or use a primitive version of the “lifeline”—i.e., telephone a Hegel
expert. Now you ask your smartphone, which is probably already in your hand. (I
just did: 1807. Took less than a second.)
And names and dates are the least of it. Suppose, for
example, that you suspected that one of your friends was misusing Hegel’s term
“the cunning of reason.” So annoying. But you don’t even have to be sober
to straighten that person out. As you contemplate another glass, Siri places in
your hand a list of sites where that concept is explained, also in under a
second. And, should the conversation ever get serious, Hegel’s entire corpus is
searchable online. Interestingly, when I ask Siri, “Is Dick Van Dyke still
alive?,” Siri says, “I won’t respond to that.” It’s not clear if that’s because
of the Dick or the Dyke. (He is, and he’s ninety-four.)
There is also, of course, tons of instant information
that is actually useful, like instructions for grilling corn on the cob, or
unclogging a bathtub drain. And it’s free. You do not have to pay a plumber.
Leaving the irrefutably dire and dystopian effects of
the Web aside for a moment, this is an amazing accomplishment. In less than
twenty years, a huge percentage of the world’s knowledge has become accessible
to anyone with a device that has Wi-Fi. Search engines work faster than the
mind, and they are way more accurate. There is plenty of misinformation on the
Web, but there is plenty of misinformation in your head, too. I just told you
what the atomic number of hafnium is. Do you remember it correctly?
The most radical change that instant information has
made is the levelling of content. There is no longer a distinction between
things that everyone knows, or could readily know, and things that only experts
know. “The cunning of reason” is as accessible as the date Hegel’s book was
published and the best method for grilling corn. There is no such thing as
esoterica anymore. We are all pedants now. Is this a cause for concern? Has it
changed the economic and social value of knowledge? Has it put scholars and
plumbers out of business and made expertise obsolete?
In the early years of the Web, the hub around which
such questions circled was Wikipedia. The site will be twenty years old on
January 15th, and a collection of articles by scholars, called “Wikipedia @ 20: Stories of an Incomplete Revolution”
(M.I.T.), is being published as a kind of birthday tribute. The authors survey
many aspects of the Wiki world, not always uncritically, but the consensus is
that Wikipedia is the major success story of the Internet era. A
ridiculously simple principle—“Anyone can edit”—has produced a more or less
responsibly curated, perpetually up-to-date, and infinitely expandable source
of information, almost all of it hyperlinked to multiple additional sources.
Andrew Lih’s history of the site, “The Wikipedia Revolution: How a Bunch of Nobodies Created the
World’s Greatest Encyclopedia,” published in 2009, is similarly smitten.
Wikipedia took off like a shot. Within a month, it had
a thousand articles, a number that would have been impossible using a
traditional editorial chain of command. Within three years, it had two hundred
thousand articles, and it soon left print encyclopedias in the dust. Today,
Wikipedia (according to Wikipedia) has more than fifty-five million articles in
three hundred and thirteen languages. In 2020, it is the second most visited
site on the Web in the United States, after YouTube, with 1.03 billion visits a
month—over four hundred million more visits than the No. 3 Web site, Twitter.
The Encyclopædia Britannica, first published in 1768 and for centuries the gold
standard of the genre, had sixty-five thousand articles in its last print
edition. Since 2012, new editions have been available only online, where it
currently ranks fortieth in visits per month, with about thirty-two million.
In the beginning, the notion that you could create a
reliable encyclopedia article about Hegel that was not written by, or at least
edited by, a credentialled Hegel expert was received, understandably, with
skepticism. Teachers treated Wikipedia like the study guide SparkNotes—a
shortcut for homework shirkers, and a hodgepodge compiled by autodidacts and
trivia buffs. The turning point is customarily said to have been a study published
in Nature, in 2005, in which academic scientists compared forty-two
science articles in Wikipedia and the Encyclopædia Britannica. The experts
determined that Wikipedia averaged four errors per article and Britannica
averaged three. “Wikipedia comes close to Britannica in terms of the accuracy
of its scientific entries” was the editors’ conclusion. By then, many teachers
were consulting Wikipedia regularly themselves.
The reason most people today who work in and on
digital media have such warm feelings about Wikipedia may be that it’s one of
the few surviving sites that adhere to the spirit of the early Internet, to
what was known affectionately as the “hacker ethos.” This is the ethos of
open-source, free-access software development. Anyone can get in the game, and
a person doesn’t need permission to make changes. The prototypical open-source
case is the operating system Linux, released in 1991, and much early
programming was done in this communal barn-raising spirit. The vision, which
now seems distinctly prelapsarian, was of the Web as a bottom-up phenomenon,
with no bosses, and no rewards other than the satisfaction of participating in
successful innovation.
Even today, no one is paid by Wikipedia, and anyone
can (at least in theory, since a kind of editorial pecking order has evolved)
change anything, with very few restrictions. In programming shop talk, all work
on Wikipedia is “copyleft,” meaning that it can be used, modified, and
distributed without permission. No one can claim a proprietary interest. There
are scarcely any hard-and-fast rules for writing or editing a Wikipedia
article.
That seems to have been what got hacker types, people
typically allergic to being told what to do, interested in developing the site.
“If rules make you nervous and depressed,” Larry Sanger, the site’s co-founder,
with Jimmy Wales, wrote in the early days, “then ignore them and go about your
business.”
Wikipedia is also one of the few popular sites whose
content is not monetized and whose pages are not personalized. Nothing is
behind a paywall; you do not have to log in. There are occasional pop-ups
soliciting contributions (in 2017-18, almost a hundred million dollars was donated
to the nonprofit Wikimedia Foundation, headed by Wales), but no one is trying
to sell you something. Everyone who looks up Pierre, South Dakota, sees the
same page. There is no age-and-gender-appropriate clickbait, no ads for drain
de-cloggers and books by German philosophers.
Wikipedia has some principles, of course. Contributors
are supposed to maintain a “neutral point of view”; everything must be
verifiable and, preferably, given a citation; and—this is probably the key to
the site’s success with scholars—there should be no original research. What
this means is that Wikipedia is, in essence, an aggregator site. Already
existing information is collected, usually from linkable sources, but it is not
judged, interpreted, or, for the most part, contextualized. Unlike in scholarly
writing, all sources tend to be treated equally. A peer-reviewed journal and a
blog are cited without distinction. There is also a semi-official indifference
to the quality of the writing. You do not read a Wikipedia article for the
pleasures of its prose.
There are consequently very few restrictions on
creating a page. The bar is set almost as low as it can be. You can’t post an
article on your grandmother’s recipe for duck à l’orange. But there is an
article on duck à l’orange. There are four hundred and seventy-two subway
stations in New York City; each station has its own Wikipedia page. Many
articles are basically vast dumping grounds of links, factoids, and data.
Still, all this keeps the teachers and scholars in business, since knowledge
isn’t the data. It’s what you do with the data. A quickie summary of “the
cunning of reason” does not get you very far into Hegel.
But what about the folks who can recite the periodic
table, or who know hundreds of lines of poetry “by heart,” or can tell you the
capital of South Dakota right off the bat? Is long-term human memory obsolete?
One indication of the answer might be that the highest-rated syndicated program
on television for the first ten weeks of 2020 was “Jeopardy!” The ability to
recall enormous numbers of facts is still obviously compelling. Geek-cool
lives.
“Jeopardy!” is thirty-seven years old under its
host Alex
Trebek, who died earlier this month, at the age of eighty. But the show is
much older than that. It first went on the air in 1964, hosted by Art Fleming,
and ran until 1975. And the “Jeopardy!” genre, the game show, is much older
still. Like a lot of early television—such as soap operas, news broadcasts, and
variety shows—game shows date from radio. The three national broadcast
networks—CBS, NBC, and ABC—were originally radio networks, so those were genres
that programmers already knew.
Shows like “Jeopardy!” were as popular in the early
years of television as they are today. In the 1955-56 season, the highest-rated
show was “The $64,000 Question,” in which contestants won money by answering
questions in different categories. Soon afterward, however, a meteor struck the
game-show planet when it was discovered that Charles Van Doren, a contestant on
another quiz show, “Twenty-One,” who had built up a huge following and whose
face had been on the cover of Time, had been given the answers in advance.
It turned out that most television quiz shows were rigged. The news was
received as a scandal; there were congressional hearings, and the
Communications Act was amended to make “secret assistance” to game-show
contestants a federal crime.
Whom did such “assistance” help? Mostly, the networks.
When a player is on a streak, audience size increases, because more and more
people tune in each week to see if the streak will last. In the
nineteen-fifties, there were usually just three shows to choose from in a given
time slot, so audiences were enormous. As many as fifty-five million people—a
third of the population—tuned in to “The $64,000 Question.” It was the
equivalent of broadcasting the Super Bowl every week. The financial upside of a
Van Doren was huge.
But the scandal made it clear that game shows are
popular because they are also reality television. “Jeopardy!” and “The
Apprentice” belong to the same genre. So, for that matter, does TikTok. The
premise of reality television is that the contestants are ordinary people, not
performers. This approach allows viewers to feel that they are matching wits
with the people on the screen, but there is also something awe-inspiring about
watching Charles Van Doren, or Ken Jennings, the owner of a six-month winning
streak on “Jeopardy!,” run up the score. Still, you have to be able to believe
that these people are not professionals, and that they are doing it without
help.
In retrospect, the Van Doren fan-demic seems odd. He
held advanced degrees and taught at Columbia; he was distinctly not the man on
the street. It helped that he was young and good-looking, and that he really
seemed to be sweating out the answers. One of the most popular “Jeopardy!”
winners, on the other hand, is Frank Spangenberg, who for a long time held the
record for five-day winnings ($102,597). Spangenberg was a member of the New
York City Transit Police. He was the ideal game-show type, someone viewers
could relate to.
As Claire McNear explains in “Answers in the Form of Questions: A Definitive History and
Insider’s Guide to ‘Jeopardy!’ ” (Twelve), a book mainly for fans, the
Van Doren scandal helped define “Jeopardy!” in two respects. The first is the
concept for the show, which is credited to Julann Griffin, Merv Griffin’s wife.
She is supposed to have argued that, if it was a crime to give quiz-show contestants
the answers in advance, then giving them the answers up front and having them
come up with the questions would get the show around the Communications Act.
This nonsensical reasoning is repeated in virtually every book on the show.
The other piece of long-term fallout from the
quiz-show scandals is that when contestants on “Jeopardy!” return home, and
everyone asks them, “So what is Alex Trebek really like?,” they have no answer.
This is because, except when the game is in progress, the contestants never
interact with him. The policy is intended to insure that no contestant is
getting off-camera help (which is also nonsensical, since contestants could be
getting help from someone besides the host). But the lack of face time with
Trebek is considered a major disappointment.
For Trebek was something between a cult figure and an
icon. “Our generation’s Cronkite,” Ken Jennings called him in a column
published last year, and the comparison is apt. Walter Cronkite did not report
the news. He read cue cards on the air every weeknight on CBS for nineteen
years. Trebek did not write the clues on “Jeopardy!” He read them on the
morning of the taping, to make sure he had the pronunciations right. His aura
of knowing the answers (or the questions) was, like Cronkite’s air of gravitas,
part of the onscreen persona. Cronkite was trained as a journalist. He knew
what was going on in the world and he understood the events he reported on. But
that is not why he became an icon. Trebek, too, was an educated man with
genuine curiosity and many interests. But it would not have mattered if he
wasn’t. By some combination of familiarity and longevity, he and Cronkite
acquired an outsized cultural status.
Like another TV icon, Johnny Carson, who hosted the
“Tonight Show” for thirty years, Trebek had a great talent for being supremely
at ease in front of a camera. Whoever he was when he was at home, on the air he
was himself. In thirty-seven years, he never missed a taping. When he was
diagnosed with cancer, in March, 2019, he was seventy-eight years old. But he
worked right up to the end. On days when he was undergoing treatment, he would
be suffering terribly. Between games—“Jeopardy!” tapes five games a day, in
Culver City, with fifteen-minute breaks—he sometimes writhed in agony on the
floor of his dressing room. Fifteen minutes later, on the set and with the
cameras rolling, he behaved as though he were perfectly healthy.
By his own account, offered in his brief and cheery
memoir, “The Answer Is: Reflections on My Life” (Simon &
Schuster), and confirmed by other reports, including McNear’s, when Trebek was
off the air he was more laid-back and salty, less like your eighth-grade math
teacher. But his tastes were conventional, and so was his career. He hosted
numerous short-lived shows, in Canada, where he was born, and in the U.S.,
before getting the “Jeopardy!” gig. He did not think that the success of “Jeopardy!”—it
ranked No. 1 or 2 among syndicated shows for many years—had anything to do with
him. “You could replace me as the host of the show with anybody and it would
likely be just as popular,” he says in the memoir. I guess we’ll see.
If there is a mystique about Trebek, one of the things
we learn from McNear’s book is that there is also a mystique about the
contestants. Today, many of them are not, in fact, ordinary people. They are
trivia professionals, people who spend countless hours practicing and preparing.
A major skill required on the show, for instance, is mastering the
buzzer. Aspiring contestants now manufacture their own buzzers and practice
to get reaction times, measured in milliseconds, as low as possible. (You
cannot press your buzzer until the host has finished reading the answer; if you
press it too early, there is a quarter-second wait before you can press it
again, and by then the other contestants are likely to have pressed theirs.)
Since contestants who make it onto the show typically know almost all the
answers, the outcome tends to turn on who is the fastest buzzer-presser. “A
reaction-time test tacked onto a trivia contest” is how one contestant
described it.
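For a sense of how decisive those milliseconds are, here is a toy Python simulation of the buzzer rule as described: buzz early and you are locked out for a quarter second. All reaction-time numbers are invented for illustration, not measured data.

```python
import random

# Toy model of the "Jeopardy!" buzzer rule described above: a press that
# lands before the buzzers unlock triggers a quarter-second lockout.
# All timing numbers are invented for illustration.

LOCKOUT = 0.250  # seconds, the quarter-second penalty from the article

def registered_press(skill_offset, jitter=0.050):
    """When a contestant's buzz registers, relative to the moment the
    buzzers unlock (time 0). A negative raw time means they jumped early
    and must wait out the lockout before pressing again."""
    t = random.gauss(skill_offset, jitter)
    if t < 0:
        return LOCKOUT + random.uniform(0.0, 0.05)  # re-press after lockout
    return t

def winner(offsets):
    times = [registered_press(o) for o in offsets]
    return min(range(len(times)), key=times.__getitem__)

random.seed(1)
# Three contestants who all know the answer; only buzzer timing differs.
offsets = [0.020, 0.040, 0.060]  # hypothetical average press offsets (s)
wins = [0, 0, 0]
for _ in range(10_000):
    wins[winner(offsets)] += 1

print("Win shares with identical knowledge:", [w / 10_000 for w in wins])
# A 20-40 ms timing edge yields a lopsided win rate: "a reaction-time test
# tacked onto a trivia contest."
```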
Competing on “Jeopardy!” brings fame, and for most
contestants being able to say that they played a game on the show is all the
reward they require. But winning on “Jeopardy!” does not bring riches. In fact,
to cast a cold economic eye on the show, “Jeopardy!” contestants constitute an
exploited class. Together with its sibling show, “Wheel of Fortune,” another
Merv Griffin creation, “Jeopardy!” is said to bring in a hundred and
twenty-five million dollars a year. (Griffin wrote the “Jeopardy!” theme tune,
and he claimed, before he died, in 2007, to have made more than seventy million
dollars in royalties from it.) Trebek, who worked only forty-six days a year,
was paid in the neighborhood of ten million dollars.
But contestants’ travel and hotel expenses are not
paid, and the second- and third-place finishers do not keep the money they’ve
“won”; they are given consolation prizes—two thousand dollars for second place
and a thousand dollars for third—plus a tote bag and a “Jeopardy!” cap. (This
is to incentivize riskier play.) According to McNear, in the 2017-18 season,
the average amount that winners took home was $20,022. In his six-month streak,
Jennings won $2.5 million, but during those six months ratings increased by
fifty per cent over the previous year’s, and “Jeopardy!” became the
second-ranked show on all television, after “CSI.” Two and a half million
dollars was a very small price to pay. The riches of “Jeopardy!” are not
necessarily what they seem. Other pockets got much fuller than Ken Jennings’s.
Something of the same could be said about Wikipedia’s
reputation as a “free encyclopedia.” Yochai Benkler has a peculiar essay in the
“Wikipedia @ 20” collection. (Benkler is the lead author of a recent study,
widely reported, showing that right-wing media, like Fox and Breitbart, not
trolls or Russian hackers, are responsible for most of the misinformation about
“voter fraud.”) In his essay on Wikipedia, Benkler argues that the site is “a
critical anchor for working alternatives to neoliberalism. . . .
People can work together, build a shared identity in a community of practice,
and make things they need without resorting to enforced market exchange.”
But that is not quite how Wikipedia works. A major
influence on Jimmy Wales’s conception of the site was an essay by Friedrich
Hayek called “The Use of Knowledge in Society,” published in 1945, and Hayek is
virtually the father of postwar neoliberalism. His tract against planning, “The Road to Serfdom,” published in 1944, has sold hundreds
of thousands of copies, and is still in print. Hayek’s argument about knowledge
is the same as the neoliberal argument: markets are self-optimizing mechanisms.
No one can know the totality of a given situation, as he puts it in “The Use of
Knowledge” (he is talking about economic decision-making), but the optimal
solution can be reached “by the interactions of people each of whom possesses only
partial knowledge.”
This theory of knowledge is not unrelated to the
wisdom-of-crowds scenario in which a group of people are guessing the number of
jelly beans in a jar. The greater the number of guesses, the closer the mean of
all guesses will come to the true number of jelly beans. A crucial part of
crowdsourcing knowledge is not to exclude any guesses. This is why Wales, in
his role as Wikipedia’s grand arbiter, is notoriously permissive about allowing
access to the site’s editing function, and why he doesn’t care whether some of
the editors are discovered to be impostors, people pretending to expertise that
they don’t really have. For, when you are calculating the mean, the outliers
are as important as the numbers that cluster around the average. The only way
for the articles to be self-correcting is not to correct, to let the invisible
hand do its job. Wikipedia is neoliberalism applied to knowledge.
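The jelly-bean claim is easy to check numerically. The minimal Python sketch below assumes unbiased Gaussian guessing error (an illustrative assumption, not anything from Hayek or the essay) and shows the crowd's mean tightening around the true count as guesses accumulate.

```python
import random

# Wisdom-of-crowds sketch: individual guesses are noisy, but their mean
# converges on the true count as the crowd grows. The noise model
# (unbiased Gaussian error) is an illustrative assumption.

TRUE_COUNT = 1_000          # jelly beans actually in the jar
GUESS_SPREAD = 300          # how wildly individuals misjudge

def crowd_mean(n):
    guesses = [random.gauss(TRUE_COUNT, GUESS_SPREAD) for _ in range(n)]
    return sum(guesses) / n

random.seed(42)
for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} guessers -> mean guess {crowd_mean(n):8.1f}")

# Typical output: the mean wanders with 10 guessers and sits within a few
# beans of 1,000 with 10,000. When errors are unbiased, excluding
# "outlier" guesses is unnecessary; they wash out in the average.
```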
Still, the people who post and who edit the articles
on Wikipedia are not guessing jelly beans. They are culling knowledge that has
already been paid for—by universities, by publishers, by think tanks and
research institutes, by taxpayers. The editors at Nature who, back in
2005, compared Wikipedia with the Encyclopædia Britannica seem not to have
considered whether one reason Wikipedia’s science entries had fewer errors than
they expected was that its contributors could consult the Encyclopædia
Britannica, which pays its contributors. There is no such thing as a free
fact. ♦
Published in the print edition of the November 23, 2020,
issue, with the headline “What Do You Know?”
Louis Menand has
been a staff writer at The New Yorker since 2001. He teaches at Harvard
University.
Saturday, November 07, 2020
Trump Is Gone. Trumpism Just Arrived
The air has been cleared. And democracy is working.
Andrew Sullivan, Weekly Dish
The
“wisdom of the American people” is a horrifying cliché, routinely hauled out
every four years as pious pabulum by those whose candidate just won. But the
complicated and close election results of 2020, in so far as we can understand
them at this point in time, really do seem to capture where America now is, for
good and ill, defying the caricatures and wishful thinking of both Republicans
and Democrats, revealing a sanity that has helped keep me rather serene in this
chaotic week.
The
key fact is that Donald J. Trump has been decisively defeated. He will be a
one-term president. This was by no means inevitable. But in a massive turnout,
where both sides mobilized unprecedented hordes of voters, and when the GOP
actually made gains in the House, and did much better than expected, Trump
lost. A critical mass of swing voters and moderate Republicans picked Biden
over him. Our nightmare of four years — an unstable, malignant, delusional
maniac at the center of our national life — is over.
Take
a moment to feel that relief. Breathe. Rejoice. He’s done.
He
will not concede. He cannot concede — because he would suffer a psychic break
if he did. And what we witnessed Thursday night, in his drained yet still
despicable rant, was a picture of a sad, lost, delusional person, a man utterly
unfit to hold the office he holds, lying and lying and lying, spinning paranoid
conspiracy theories like a drunk on Tumblr. He said without any basis in fact:
“This is a case where they’re trying to steal an election, they’re trying to
rig an election, and we can’t let that happen.”
In
the early hours of this morning, we got this
tweet: “I easily WIN the Presidency of the United States with LEGAL VOTES
CAST. The OBSERVERS were not allowed, in any way, shape, or form, to do their
job and therefore, votes accepted during this period must be determined to be
ILLEGAL VOTES. U.S. Supreme Court should decide!” This, to put it plainly, is a
form of mental illness. The idea that election observers have been completely
barred from doing their jobs in counts around the country is a fantasy. The
notion that vast numbers of votes can suddenly be deemed “illegal” is also
bonkers. Ditto the idea that counting votes after election day is somehow
“finding” votes. These are absurd, delusional, desperate fantasies, made by
someone who has no understanding of the word “responsibility”.
Donald
Trump, in other words, is now showing exactly why he had to be defeated. Policy
is irrelevant in his singular case. No serious democracy can have a delusional,
utterly incompetent, psychologically disturbed madman as president and
survive.
But
Trumpism? It did far better than anyone expected. Down-ballot, many Republicans
out-performed their nominal leader. The GOP made real gains in the House —
during a health crisis and a recession — and will probably hold the Senate,
effectively checkmating any truly progressive ambitions Biden might have had.
The rural turnout was spectacular, responding perhaps to Trump’s incredibly
boisterous series of big rallies as the campaign came to a close. This was far
from the Biden landslide I had been dreaming about a few weeks back. It was
rather the moment that the American people surgically removed an unhinged
leader and re-endorsed the gist of his politics. It was the moment that Trump’s
core message was seared into one of our major political parties for the
foreseeable future, and realigned American politics. If Trump were sane, this
is how he would describe his success — and leave office graciously to become
the kingmaker in his own party. But he is not sane.
His
impact, however, is undeniable. Neoconservatism is over; globalization as some
kind of conservative principle is over; a conservatism that allows for or looks
away from unrestrained mass immigration is over. What was cemented in place
this week is a new GOP, not unlike the new Tories in the UK. They’re
nationalist, culturally conservative, geared toward the losers of capitalism as
well as its winners, and mildly protectionist and isolationist. It is a natural
response to the unintended consequences of neoliberalism’s success under a
conservative banner. And it speaks in a language that working class Americans
understand, devoid of the woke neologisms of the educated elite. It seems
to me that this formula is a far more settled and electorally potent coalition
than what we now see among the deeply divided Democrats.
And
this is where I think I have been wrong about Trump’s appeal, and where I think
I’ve misunderstood why otherwise decent people could support such a foul
disrupter of democratic norms. Many of them simply didn’t take Trump’s threat
to our system seriously. They took all his assaults on democracy as so much
bluster from the kind of car salesman he is. They deal with this kind of
bullshit all the time, took liberal democracy for granted and saw little reason
to fret about its future. The writer Jamie Kirchick says that everything Trump
says makes sense if it is preceded by the following words: “And now, Donnie
from Queens, you’re on the air.” Many people heard Trump exactly that way, and
couldn’t see what all the fuss was about. They weren’t endorsing his
madness. They were looking past it. They were, in my opinion, wrong to be so
cavalier. But I don’t think most were malignant extremists of any kind, or
unaware of the hideous personal qualities of Trump.
And
they enjoyed economic rewards that, absent the Covid-19 recession, might well
have swept Trump to victory. One of the more revealing results from the
polls this year came in the answers to the core question made famous
by Reagan: “Are you better off now than you were four years ago?” In previous
re-election campaigns, Reagan won in a landslide with only 44 percent saying
they were better off; George W. Bush won with 47 percent; and Obama with 45
percent. For Trump, a mighty 56 percent said they were better off now than
when he took office — a fundamental that, along with incumbency, should have
led to a landslide re-election — and yet he still
lost. That tells you something about Americans’ understanding of how unfit a
president Trump turned out to be, even as they felt very good about their own
wellbeing.
And
this was also clearly and unequivocally a rejection of the woke left. The riots
of the summer turned many people off. In exit
polls, 88 percent of Trump voters said it was a factor in their choice. On
the question of policing and criminal justice, Trump led Biden 46 to 43 percent.
For the past five years, Democrats have been telling us that Trump and his
supporters were white supremacists, that he was indeed the “First
White President” in Ta-Nehisi Coates’ words, that all minorities were under
assault by the modern day equivalent of the KKK. And yet, the GOP got the
highest proportion of the minority vote since 1960! No wonder Charles
Blow’s head exploded.
We
may find out more as exit
polling is pored over, but in the current stats, Trump measurably
increased his black, Latino, gay and Asian support. 12 percent of blacks — and
18 percent of black men — backed someone whom the left has identified as a
“white supremacist”, and 32 percent of Latinos voted for the man who put
immigrant children in cages, giving Trump Florida and Texas. 31 percent of
Asians and 28 percent of the gay, lesbian and transgender population also went
for Trump. The gay vote for Trump may have doubled! We’ll see if this pans
out. But it’s an astonishing rebuke of identity politics and its crude
assumptions about how unique individuals vote.
Why
did minorities shift slightly rightward after enduring four years of Trump?
First off, many obviously rejected the narrative being pushed out by every
elite media source: that the core of Trump’s appeal was racism. They saw a more
complicated picture. I suspect that many African-Americans, for example, were
terrified of “defunding the police” and pleased to be economically better off,
with record-low unemployment before Covid-19 hit. Many legal Latino citizens, perplexing
leftists, do not want continued mass immigration, and are socially
conservative. Asians increasingly see the woke as denying their children fair
access to education, and many gays just vote on various different issues, now
that the civil rights question has been largely resolved by the Supreme
Court.
Obviously
a big majority of non-white and non-straight voters still backed Democrats. But
the emergence of this coalition of minority conservatives is fascinating — and,
of course, a complete refutation of what critical race theory tells us about
how minorities must feel. Ditto the gender gap. It’s there, but not quite the gulf
we were led to believe. We have again been told insistently that being female
in America today is a constant nightmare of oppression, harassment, violence
and misogyny; and that no one represents this more potently than Donald “grab
‘em by the pussy” Trump. And yet white women still voted for Trump 55 to 43 percent.
Among white women with no college education, arguably those most vulnerable to
the predations of men, Trump got 60 percent support. This is not a wave of
rage; and it suggests that the left’s notion of patriarchy is, in 2020,
something many, many women just don’t buy, or do not believe should outweigh
other, more important issues.
And
look at California, one of the most leftist states in the country, and
minority-majority. The initiative to allow public institutions to discriminate
openly on the basis of race — in order to favor some groups over others on the
Ibram X. Kendi model — decisively failed, after months of unceasing propaganda
about “white supremacy” and the need to counter it. So did an attempt to
regulate the gig economy and to expand rent control. The appeal of assimilation
and economic success among Latinos is not, pace the critical race
theorists, an attempt to gain the advantages of “white-adjacency”. It’s simply
the American way, paved by generations of immigrant groups before them.
And
it’s important to acknowledge that almost everyone in the elite and in the
polling industry was misled by the polling. I was surprised at the resilience
of Trump’s coalition in what was a huge turnout (breaking another paradigm of
leftists, that somehow if more people voted, they would always gain). I was
less surprised by the politics of Covid-19. The idea that Trump’s manifest
failures would translate to a major rebuke missed an underlying dynamic. Many
Americans want to move on; they’re sick of the shutdowns and restrictions; and
they’re more receptive to fake reassurance than we might imagine. Indeed, the
counties with the biggest recent surges in Covid cases voted
overwhelmingly for Trump. Democrats need to spend less time looking
for exogenous factors — like demographic change, or the Covid epidemic — and
more time making arguments that capture the country’s move toward more leftist economic
policy, while jettisoning woke madness.
Pollsters,
despite attempts to fix what they got so wrong last time, also missed a huge
swathe of “shy Trump” voters. But why? Why did people not tell the truth to
pollsters? Eric Kaufmann, one of the most astute political scientists writing
today, notes that the segment of the Trump vote the polling missed was educated white voters.
He suspects they were afraid to say out loud to pollsters how they were really
going to vote. After all, “45% of Republicans with degrees, compared to 23% of
Democrats with degrees, said they feared that their careers could be at risk if
their views became known.”
So
the polling got the less inhibited white non-college-educated Trump voters
right, but the graduates very wrong: “The exit
polls show that Trump ran even among white college graduates 49-49,
and even had an edge among white female graduates of 50-49! This puts
pre-election surveys out by a whopping 26-31 points among white graduates.” The
threat of wokeness both alienated educated white voters — and caused more of
them to vote Trump than anyone expected. The problem with woke media is that
they mislead Democrats who then misread the country.
And
that’s what I mean about the clearer air we breathe after surveying what
Americans actually believe — in cold hard data. This mass secret vote revealed
that the New York Times’ woke narrative of America — the centuries-long
suffocating oppression of minorities and women by cis white straight men — is
simply a niche elite belief, invented in a bubble academy, and imposed by
bullying, shaming and if possible, firing dissenters. Some of us who refused to
cower can gain real satisfaction from knowing we were not mad, not evil, not
bigots, and that a huge swathe of our fellow citizens agree.
Recall
also the huge money advantage the Democrats had in many Senate races they still
lost. In California’s Prop 16 vote, for good measure, supporters of bringing
back race discrimination by public authorities outspent
opponents 14 to 1! What this election shows is that leftists cannot
bully voters into abandoning core principles of liberal democracy, and they
can’t buy their submission either.
The
clarifying truth is that we’re a very closely divided country, growing further
apart culturally and socially, exploited by extremists on right and left, and
yet still, fundamentally, sane. The American people do not want a revolution,
but they also realized they do not want Trump as head of state. They removed
the nutcase, defanged the woke, showed up to vote in vast numbers, and gave us
a constellation of forces in Washington that pleases no one. And that’s
ok.
America
is a vast and complicated place — and our representatives mirror that rather
accurately, it seems. That’s democracy working, not failing. And in such a
country, there is a place in the center for compromise if we can unwind the
hysteria and polarization that the far left has fueled and Trump has
exacerbated. We have now a president-elect with little personal ambition ahead
of him, deep relationships with the Senate he will desperately need, elected by
the sane center, with a check on his left flank. If we can get pragmatic and
less inflamed, we can move forward. Biden won the primary thanks to moderate
and realistic black voters; and he has won the election with a broad coalition.
There are deals to be done. There is politics to engage. And we’ll get there if
we can use this moment to listen to each other, especially those whose opinions
we have spurned, and whose identities we have feared.
But
the maniac is gone. It will take some serious effort to keep him from
inflicting terrible damage on his way out. But he is gone. The republic will
survive, battered and bruised, but it will survive. And with a little grace
from all of us, it can also begin to heal.