Monday, March 30, 2020

A World Turned Upside-Down



By Paul Theroux, The New York Times


In 1966, the writer Paul Theroux was in Uganda at a time of curfew and violence. It shaped his thinking about travel writing’s imperative to bear witness. 

In this season of infection, the stock market little more than a twitching corpse, in an atmosphere of alarm and despondency, I am reminded of the enlightenments of the strict curfew Uganda endured in 1966. It was, for all its miseries, an episode of life lessons, as well as monotonous moralizing (because most crises enliven bores and provoke sententiousness). I would not have missed it for anything. 

That curfew evoked — like today — the world turned upside-down. This peculiarity that we are now experiencing, the nearest thing to a world war, is the key theme in many of Shakespeare’s plays and Jacobean dramas, of old ballads, apocalyptic paintings and morality tales. It is the essence of tragedy and an occasion for license or retribution. As Hamlet says after meeting his father’s ghost, “The time is out of joint.” 

In Uganda, the palace of the king of Buganda, the Kabaka, Mutesa II — also known as King Freddie — had been attacked by government troops on the orders of the prime minister, Milton Obote. From my office window at Makerere University, where I was a lecturer in English in the Extra Mural department, I heard the volleys of heavy artillery and saw smoke rising from the royal enclosure on Mengo Hill. The assault, led by Gen. Idi Amin, resulted in many deaths. But the king eluded capture; he escaped the country in disguise and fled to Britain. The period that followed was one of oppression and confusion, marked by the enforced isolation of a dusk-to-dawn curfew. But, given the disorder and uncertainty, most people seldom dared to leave home at all. 

The curfew was a period of fear, bad advice, arbitrary searches, intimidation and the nastiness common in most civil unrest, people taking advantage of chaos to settle scores. Uganda had a sizable Indian population, and Indian people were casually mugged, their shops ransacked and other minorities victimized or sidelined. It was also an interlude of hoarding, and of drunkenness, lawlessness and licentiousness, born of boredom and anarchy. 

“Kifugo!” I heard again and again of the curfew — a Swahili word, because it was the lingua franca there. “Imprisonment!” Yes, it was enforced confinement, but I also felt privileged to be a witness: I had never seen anything like it. I experienced the stages of the coup, the suspension of the constitution, the panic buying and the effects of the emergency. My clearest memory is of the retailing of rumors — outrageous, frightening, seemingly improbable — but who could dispute them? Our saying then was, “Don’t believe anything you hear until the government officially denies it.” 

Speaking for myself, as a traveler, any great crisis — war, famine, natural disaster or outrage — ought to be an occasion to bear witness, even if it means leaving the safety of home. The fact that it was the manipulative monster Chairman Mao who said, “All genuine knowledge originates in direct experience,” does not make the apothegm less true. It is or should be the subtext for all travelers’ chronicles. 

The curfew — three years into my time in Africa — was my initiation into the misuse of power, into greed, cowardice and selfishness, as well as their opposites — compassion, bravery, mutual aid and generosity. Even at the time, 24 years old and fairly callow, I felt I was lucky in some way to be witnessing this convulsion. It was not just that it helped me to understand Africa better; it offered me insights into crowds and power and civil unrest generally, allowing me to observe in extreme conditions the nuances of human nature. 

I kept a journal. In times of crisis we should all be diarists and documentarians. We’re bound to wail and complain, but it’s also useful to record the particularities of our plight. We know the progress of England’s Great Plague of 1665 because Samuel Pepys anatomized it in his diary. On April 30 he wrote: “Great fears of the sickness here in the City it being said that two or three houses are already shut up. God preserve us all!” Later, on June 25, “The plague increases mightily.” And by July 26: “The Sicknesse is got into our parish this week; and is endeed everywhere.” 

A month later he notes the contraction of business: “To the Exchange, which I have not been a great while. But Lord how sad a sight it is to see the streets empty of people, and very few upon the Change, jealous of every door that one sees shut up lest it should be the plague — and about us, two shops in three, if not more, generally shut up.” 

In that outbreak of bubonic plague, spread by rat fleas, a quarter of London’s population died. 

My diary these days sounds a lot like Pepys’, though without the womanizing, snobbery or name dropping. The progress of the Covid-19 pandemic is remarkably similar to that of the plague year, the same upside-down-ness and the dizziness it produces, the muddle of daily life, the collapse of commerce, the darkness at noon, a haunting paranoia in the sudden proximity to death. And so much of what concerned me as important in the earlier pages of my diary now seems mawkish, trivial or beneath notice. This virus has halted the routine of the day to day and impelled us, in a rare reflex from our usual hustling, to seek purification. 

Still, writing gives order to the day and helps inform history. In my journal of the Ugandan curfew I made lists of the rumors and tried to estimate the rate at which they traveled; I noted the instances of panic and distraction — there were many more car crashes than usual, as drivers’ minds were on other things. Ordinary life was suspended, so we had more excuses to do as we pleased. 

My parents lived through the Great Depression, which this present crisis much resembles. They were — ever after — frugal, cautious and scornful of wasters: My father developed a habit of saving string, paper bags, nails and screws that he pried out of old boards. The Depression made them distrustful of the stock market, regarding it as a casino. They were believers in education, yet their enduring memory was of highly educated people rendered destitute — “college graduates selling apples on street corners in Boston!” My mother became a recycler and a mender, patching clothes, socking money away. This pandemic will likely make us a nation of habitual hand-washers and doorknob avoiders. 

In the Great Depression, Americans like my parents saw the country fail — and though it rose and became vibrant once more, they fully expected to witness another bust in their lifetime. Generally speaking, we have known prosperity in the United States since the end of World War II. But the same cannot be said for other countries, and this, of course, is something many travelers know, because travel often allows us glimpses of upheaval or political strife, epidemics or revolution. Uganda evolved after the curfew into a dictatorship, and then Idi Amin took over and governed sadistically. 

But I’d lived in the dictatorship and thuggery of the Malawi of Dr. Hastings Banda (“Ngwazi” — the Conqueror), so Uganda’s oppression was not a shock. And these experiences in Africa helped me deconstruct the gaudy dictatorship of Saparmurat Niyazov, who styled himself “Turkmenbashi” — Great Head of the Turks — when, years later, I traveled through Turkmenistan; the Mongolia of Jambyn Batmönkh, the Syria of Hafez al-Assad, the muddy dispirited China of Mao’s chosen successor, Hua Guofeng. As for plague, there have been recent outbreaks of bubonic plague in Madagascar, Congo, Mongolia and China, producing national moods of blame-shifting and paranoia, not much different from that of Albert Camus’s “The Plague.” 

We’re told not to travel right now, and it’s probably good advice, though there are people who say that this ban on travel limits our freedom. But in fact, travel produces its own peculiar sorts of confinement. 

The freedom that most travelers feel is often a delusion, for there is as much confinement in travel as liberation. This is not the case in the United States, where I have felt nothing but fresh air on road trips. It is possible to travel in the United States without making onward plans. But I can’t think of any other country where you can get into a car and be certain at the end of the day of finding a place to sleep (though it might be scruffy) or something to eat (and it might be junk food). For my last book, I managed a road trip in Mexico — but with hiccups (bowel-shattering meals, extortionate police, bed bugs). But the improvisational journey is very difficult elsewhere, even in Europe, and is next to impossible in Africa. It is only by careful planning that a traveler experiences a degree of freedom, but he or she will have to stick to the itinerary, nagged by instructions, which is a sort of confinement. 

In fact, most travel is a reminder of boundaries and limits. For example, millions of travelers go to Bangkok or Los Cabos, but of them, a great number head for a posh hotel and rarely leave: The hotel is the destination, not the city. The same can be said for many other places, where the guest in the resort or spa — essentially a gated and guarded palace — luxuriates in splendid isolation. 

The most enlightening trips I’ve taken have been the riskiest, the most crisis-ridden, in countries gripped by turmoil, enlarging my vision, offering glimpses of the future elsewhere. We are living in just such a moment of risk; and it is global. This crisis makes me want to light out for the territory ahead of the rest. It would be a great shame if it were not somehow witnessed and documented. 

Paul Theroux’s latest book, “On the Plain of Snakes: A Mexican Journey,” was published in 2019 by Houghton Mifflin Harcourt.


Saturday, March 28, 2020

Toward an Economic Democracy



Why the coronavirus crisis is an opportunity to reshape the relationship between workers and their employers

By Christopher Mackin, The New Republic

The most fundamental tragedy of the coronavirus crisis is human. It is lives being lost. Somewhere close behind is the feeling of desperation shared by working people. In an economy where it is estimated that 50 percent of the labor force survives from paycheck to paycheck, we are facing an economic crisis of unprecedented proportions that exposes a fundamental flaw in our widely accepted idea of the relationship between working people and their places of work. 

That fundamental flaw is a long-standing acceptance across the ideological spectrum of a division between wage earners and the owners of capital assets. While owners of businesses are able to fall back on accumulated wealth and assets in a crisis, it has become abundantly clear that a majority of workers are prisoners of wage income. As long as that divide persists, the threat of economic breakdown will loom both in the coming months and into the next crisis. That divide is the heart of economic inequality. Near-term measures that maintain or increase wage income should be implemented. But it is time to think more deeply about the causes of inequality, and it is time to introduce remedies that serve as conditions for the provision of federal government assistance. 

As Mark Cuban, owner of the NBA’s Dallas Mavericks, has wisely counseled, no governmental interventions now being considered should be entered into without consideration of how that intervention will address inequality. 

A prominent test case—the airline industry—can help lead the way. Any federal funds loaned to the airlines should be repaid in two steps. The first dollar repaid should be directed to newly established Employee Stock Ownership Plans, or ESOPs, at each company whose beneficiaries are the more than 500,000 airline workers, from luggage handlers and flight attendants to mechanics and pilots. The second dollar repaid should return directly to the federal government. Using that formula, over a short period of time, employees will accrue a substantial stake in these companies. They or their selected professional representatives should serve prominently on company boards of directors to give voice to the employees that make the business work. 

As Columbia law professor Tim Wu points out, the American public also deserves corporate governance representation to help steer the airlines back to responsible stewardship. Narrowly focused stock buybacks in public companies that have enriched a small set of the corporation’s stakeholders, senior management, and quick-flipping, short-term shareholders should end. If management balks, they should be educated on how this arrangement is a superior corporate model for workers, shareholders, and the public. There is abundant empirical evidence that it is. It simply requires more of management. 

A second mechanism to use with the airlines and with any other private-sector company receiving governmental support can also speed up the process toward greater wealth participation by ordinary working people. Business taxes should be rethought. They should be paid in full only by companies that perpetuate status quo arrangements keeping workers on the outside of ownership. A necessary reform would simplify existing rules by allowing payments that would ordinarily be directed to the federal government to be directed instead to purchase company stock held by trusts for company employees, credited dollar for dollar against the company’s federal tax obligations. Over time, working people would become substantial owners of the companies they work for and enjoy a voice and a share in the wealth they have helped create. 

The idea of restructuring our economy so that capital is a resource that works for labor and not just for itself is not a new one. Workers didn’t always work solely for wages. They used to work in small shops and on farms. In the middle of the nineteenth century, as industrialization was taking hold, some labor leaders warned that an employer-employee relationship where the first group owned and the second group was expected to survive on wages was a trap that would result in dependence and servility. They argued for employee ownership of the newly emerging industrial economy as an alternative where labor should work for both wages and capital ownership. 

Strangely enough, so did a smattering of legendary industrialists, including Robert Brookings, Leland Stanford, and, early in the twentieth century, the chairman of the General Electric Corporation, Owen D. Young. On July 4, 1927, Young took the podium on the newly installed granite steps of the Baker Library at the Harvard Business School. He was the guest speaker for the opening of that grand building, and he had a surprise vision to share with the audience. He asked his audience to consider whether American capitalism, then barely a century old in its industrial form, had been launched on the right foot. 

Into these [larger-scale businesses] we have brought together larger amounts of capital and larger numbers of workers than existed in cities once thought great. We have been put to it, however, to discover the true principles which should govern their relations. From one point of view, they were partners in a common enterprise. From another they were enemies fighting for the spoils of their common achievement. 

He spoke hopefully that the Harvard Business School might be a place where his alternative vision could be fleshed out and made to work. 

Perhaps someday we may be able to organize human beings engaged in a particular undertaking so that they truly will be the employer buying capital as a commodity in the market at the lowest price.… I hope the day may come when these great business organizations will truly belong to the men who are giving their lives and their efforts to them, I care not in what capacity.… Then we shall dispose once and for all, of the charge that in industry organizations are autocratic and not democratic.… Then, in a word, men will be as free in cooperative undertakings and subject only to the same limitations and chances as men in individual businesses. Then we shall have no hired men. That objective may be a long way off, but it is worthy to engage the research and efforts of the Harvard School of Business. 

It takes a crisis of the magnitude of the coronavirus to reveal to us how poorly designed the dominant U.S. corporate economic arrangements are from the point of view of sharing the common wealth all workers help create. There are alternative wealth-sharing arrangements to the dominant U.S. corporate structure that are within reach. Some of them, like the 7,000 companies across the U.S. that are owned by their employees through ESOPs, have survived and prospered on the margins of this dominant structure. It is time to expand the reach of these ideas to the commanding heights of the American economy in order to design an inclusive form of capitalism that ends the utter dependence of most working people on their weekly paycheck. 

The wealthy in America are disturbed by the coronavirus crisis, but they can sleep at night knowing that they have reserves to get them through these difficult times. It is now painfully evident that the same comfort of having stored up wealth through one’s life’s work must become an opportunity extended to the rest of working America. 

Finally, we need to appreciate that this is a root-and-branch moment. Owen Young is not the only neglected prophet whose stock is rising. Visionaries and critics who have warned about the dangers of unchecked economic growth, industrial agriculture, and remote supply chains must get a new hearing. Private patents on life-saving technologies should be terminated. Stock buybacks limited to grasping senior managements should be officially over. 

But an economy where workplaces are composed of fellow owners, where there are “no hired men,” can still sound, at this acute moment of crisis, like special pleading. What about all of those outside the reach of the workplace? How are these ideas going to help them? 

The best answers to that challenge are partial. And while there are concrete advances that should follow from reengineering the ownership and governance of the modern workplace, the responses on offer are also necessarily abstract. Perhaps the broadest claim that can be made in favor of these reforms is that the vitality and the moral responsibility of an economy are the best guarantees that society can extend to all of its members. The economy is what will make and deliver their resources, food, shelter, health, and technology. It is where our problems will be solved or allowed to fester. 

The reigning, now staggering, modern structures of economic life have arguably delivered on something we can narrowly describe as vitality. Technology has achieved wonders. Increases in productivity have reduced poverty. But modern economic life has also become significantly unmoored from responsibility to people and the planet. There is not only a coronavirus loose upon the land. When inequality is allowed to reach the unprecedented heights that prevail today, we are also confronting a historic responsibility deficit that traces back to a lack of accountability, a lack of democracy, in our economic institutions. 

Unless we are going to fall for an even more romantic and already historically discredited idea of government ownership, a “do-over” for the socialism of the twentieth century, we can hope and reasonably expect that workplaces that are governed from within, not by the state but by their workers, engineers, and managers, will lead to a more responsible economy. That is economic democracy, the long-neglected complement to political democracy. It is not socialism. 

And what standards of social responsibility might we expect from firms that are owned and governed democratically? Workers are also citizens. They drink the same water that consumers in their communities drink. They are not absentee investors, buying and selling their stock in nanoseconds. It seems reasonable to bet that if given the chance, they and their counterparts in management can be counted upon to arrive at answers about how best to carry out our economic life far better than the impersonal stewards of modern finance. The regulatory power of the state would not disappear under economic democracy. It would remain as the vigilant protector of the public interest. But for democracy to live up to its potential in society at large, the realms of the polity and the economy must remain distinct and in constructive tension. 

Exactly 20 years ago, in a neglected book called Democracy at Risk, attorney Jeff Gates coined a metaphor that aptly described the “maximizing shareholder value” framework that has served as the conceptual North Star for elite opinion and for our business and law schools across the land. Ralph Nader was one of Gates’s most prominent supporters. Gates referred to the prevailing economic regime as “money on autopilot.” It is time that we design an economy where we confront our responsibility deficit, where we disable the autopilot machinery and replace it with an “eyes wide open” ethos and regime of law and corporate governance that manages consciously, ethically, and with responsibility. 

Christopher Mackin is a Ray Carey and a Louis Kelso Fellow at the Rutgers University School of Management and Labor Relations. He also serves as a strategic adviser to companies, employee groups, and governments on the topic of broad-based employee ownership. 

Monday, March 23, 2020

Ancestor of all animals identified in Australian fossils



By the University of California, Riverside
Artist's rendering of Ikaria wariootia. Credit: Sohail Wasif/UCR

A team led by UC Riverside geologists has discovered the first ancestor on the family tree that contains most familiar animals today, including humans.




The tiny, wormlike creature, named Ikaria wariootia, is the earliest bilaterian, or organism with a front and back, two symmetrical sides, and openings at either end connected by a gut. The paper is published today in Proceedings of the National Academy of Sciences.

The earliest multicellular organisms, such as sponges and algal mats, had variable shapes. Collectively known as the Ediacaran Biota, this group contains the oldest fossils of complex, multicellular organisms. However, most of these are not directly related to animals around today, including lily pad-shaped creatures known as Dickinsonia that lack basic features of most animals, such as a mouth or gut.

The development of bilateral symmetry was a critical step in the evolution of animal life, giving organisms the ability to move purposefully and a common, yet successful way to organize their bodies. A multitude of animals, from worms to insects to dinosaurs to humans, are organized around this same basic bilaterian body plan.

Evolutionary biologists studying the genetics of modern animals predicted the oldest ancestor of all bilaterians would have been simple and small, with rudimentary sensory organs. Preserving and identifying the fossilized remains of such an animal was thought to be difficult, if not impossible.


A 3D laser scan showing the regular, consistent shape of a cylindrical body with a distinct head and tail and faintly grooved musculature. Credit: Droser Lab/UCR

For 15 years, scientists agreed that fossilized burrows found in 555 million-year-old Ediacaran Period deposits in Nilpena, South Australia, were made by bilaterians. But there was no sign of the creature that made the burrows, leaving scientists with nothing but speculation.

Scott Evans, a recent doctoral graduate from UC Riverside, and Mary Droser, a professor of geology, noticed minuscule, oval impressions near some of these burrows. With funding from a NASA exobiology grant, they used a three-dimensional laser scanner that revealed the regular, consistent shape of a cylindrical body with a distinct head and tail and faintly grooved musculature. The animal ranged from 2 to 7 millimeters long and about 1 to 2.5 millimeters wide, with the largest the size and shape of a grain of rice—just the right size to have made the burrows. 

"We thought these animals should have existed during this interval, but always understood they would be difficult to recognize," Evans said. "Once we had the 3-D scans, we knew that we had made an important discovery."

The researchers, who include Ian Hughes of UC San Diego and James Gehling of the South Australia Museum, describe Ikaria wariootia, named to acknowledge the original custodians of the land. The genus name comes from Ikara, which means "meeting place" in the Adnyamathanha language. It's the Adnyamathanha name for a grouping of mountains known in English as Wilpena Pound. The species name comes from Warioota Creek, which runs from the Flinders Ranges to Nilpena Station.


Ikaria wariootia impressions in stone. Credit: Droser Lab/UCR

"Burrows of Ikaria occur lower than anything else. It's the oldest fossil we get with this type of complexity," Droser said. "Dickinsonia and other big things were probably evolutionary dead ends. We knew that we also had lots of little things and thought these might have been the early bilaterians that we were looking for."

In spite of its relatively simple shape, Ikaria was complex compared to other fossils from this period. It burrowed in thin layers of well-oxygenated sand on the ocean floor in search of organic matter, indicating rudimentary sensory abilities. The depth and curvature of Ikaria represent clearly distinct front and rear ends, supporting the directed movement found in the burrows.

The burrows also preserve crosswise, "V"-shaped ridges, suggesting Ikaria moved by contracting muscles across its body like a worm, known as peristaltic locomotion. Evidence of sediment displacement in the burrows and signs the organism fed on buried organic matter reveal Ikaria probably had a mouth, anus, and gut.

"This is what evolutionary biologists predicted," Droser said. "It's really exciting that what we have found lines up so neatly with their prediction."

Wednesday, March 18, 2020

The World Lacks Leadership



By Yuval Noah Harari, Time



Yuval Noah Harari is a historian, philosopher and the bestselling author of Sapiens, Homo Deus and 21 Lessons for the 21st Century. 

Many people blame the coronavirus epidemic on globalization, and say that the only way to prevent more such outbreaks is to de-globalize the world. Build walls, restrict travel, reduce trade. However, while short-term quarantine is essential to stop epidemics, long-term isolationism will lead to economic collapse without offering any real protection against infectious diseases. Just the opposite. The real antidote to epidemic is not segregation, but rather cooperation. 

Epidemics killed millions of people long before the current age of globalization. In the 14th century there were no airplanes and cruise ships, and yet the Black Death spread from East Asia to Western Europe in little more than a decade. It killed between 75 million and 200 million people – more than a quarter of the population of Eurasia. In England, four out of ten people died. The city of Florence lost 50,000 of its 100,000 inhabitants. 

In March 1520, a single smallpox carrier – Francisco de EguĂ­a – landed in Mexico. At the time, Central America had no trains, buses or even donkeys. Yet by December a smallpox epidemic devastated the whole of Central America, killing according to some estimates up to a third of its population. 

In 1918 a particularly virulent strain of flu managed to spread within a few months to the remotest corners of the world. It infected half a billion people – more than a quarter of the human species. It is estimated that the flu killed 5% of the population of India. On the island of Tahiti 14% died. On Samoa 20%. Altogether the pandemic killed tens of millions of people – and perhaps as many as 100 million – in less than a year. More than the First World War killed in four years of brutal fighting. 

In the century since 1918, humankind has become ever more vulnerable to epidemics, owing to a combination of growing populations and better transport. A modern metropolis such as Tokyo or Mexico City offers pathogens far richer hunting grounds than medieval Florence, and the global transport network is today far faster than in 1918. A virus can make its way from Paris to Tokyo and Mexico City in less than 24 hours. We should therefore have expected to live in an infectious hell, with one deadly plague after another. 

However, both the incidence and impact of epidemics have actually gone down dramatically. Despite horrendous outbreaks such as AIDS and Ebola, in the twenty-first century epidemics kill a far smaller proportion of humans than in any previous time since the Stone Age. This is because the best defense humans have against pathogens is not isolation – it is information. Humanity has been winning the war against epidemics because in the arms race between pathogens and doctors, pathogens rely on blind mutations while doctors rely on the scientific analysis of information. 

When the Black Death struck in the 14th century, people had no idea what caused it or what could be done about it. Until the modern era, humans usually blamed diseases on angry gods, malicious demons or bad air, and did not even suspect the existence of bacteria and viruses. People believed in angels and fairies, but they could not imagine that a single drop of water might contain an entire armada of deadly predators. Therefore when the Black Death or smallpox came to visit, the best thing the authorities could think of doing was organizing mass prayers to various gods and saints. It didn’t help. Indeed, when people gathered together for mass prayers, it often caused mass infections. 

During the last century, scientists, doctors and nurses throughout the world pooled information and together managed to understand both the mechanism behind epidemics and the means of countering them. The theory of evolution explained why and how new diseases erupt and old diseases become more virulent. Genetics enabled scientists to spy on the pathogens’ own instruction manual. While medieval people never discovered what caused the Black Death, it took scientists just two weeks to identify the novel coronavirus, sequence its genome and develop a reliable test to identify infected people. 

Once scientists understood what causes epidemics, it became much easier to fight them. Vaccinations, antibiotics, improved hygiene, and a much better medical infrastructure have allowed humanity to gain the upper hand over its invisible predators. In 1967, smallpox still infected 15 million people and killed 2 million of them. But in the following decade a global campaign of smallpox vaccination was so successful that in 1979 the World Health Organization declared that humanity had won, and that smallpox had been completely eradicated. In 2019, not a single person was infected or killed by smallpox. 

Guard Our Border 

What does this history teach us for the current coronavirus epidemic? 

First, it implies that you cannot protect yourself by permanently closing your borders. Remember that epidemics spread rapidly even in the Middle Ages, long before the age of globalization. So even if you reduce your global connections to the level of England in 1348 – that still would not be enough. To really protect yourself through isolation, going medieval won’t do. You would have to go full Stone Age. Can you do that? 

Secondly, history indicates that real protection comes from the sharing of reliable scientific information, and from global solidarity. When one country is struck by an epidemic, it should be willing to honestly share information about the outbreak without fear of economic catastrophe – while other countries should be able to trust that information, and should be willing to extend a helping hand rather than ostracize the victim. Today, China can teach countries all over the world many important lessons about coronavirus, but this demands a high level of international trust and cooperation. 

International cooperation is needed also for effective quarantine measures. Quarantine and lockdown are essential for stopping the spread of epidemics. But when countries distrust one another and each country feels that it is on its own, governments hesitate to take such drastic measures. If you discover 100 coronavirus cases in your country, would you immediately lock down entire cities and regions? To a large extent, that depends on what you expect from other countries. Locking down your own cities could lead to economic collapse. If you think that other countries will then come to your aid – you will be more likely to adopt this drastic measure. But if you think that other countries will abandon you, you would probably hesitate until it is too late. 

Perhaps the most important thing people should realize about such epidemics is that the spread of the epidemic in any country endangers the entire human species. This is because viruses evolve. Viruses like the coronavirus originate in animals, such as bats. When they jump to humans, initially the viruses are ill-adapted to their human hosts. While replicating within humans, the viruses occasionally undergo mutations. Most mutations are harmless. But every now and then a mutation makes the virus more infectious or more resistant to the human immune system – and this mutant strain of the virus will then rapidly spread in the human population. Since a single person might host trillions of virus particles that undergo constant replication, every infected person gives the virus trillions of new opportunities to become more adapted to humans. Each human carrier is like a gambling machine that gives the virus trillions of lottery tickets – and the virus needs to draw just one winning ticket in order to thrive. 

This is not mere speculation. Richard Preston’s Crisis in the Red Zone describes exactly such a chain of events in the 2014 Ebola outbreak. The outbreak began when some Ebola viruses jumped from a bat to a human. These viruses made people very sick, but they were still adapted to living inside bats more than to the human body. What turned Ebola from a relatively rare disease into a raging epidemic was a single mutation in a single gene in one Ebola virus that infected a single human, somewhere in the Makona area of West Africa. The mutation enabled the mutant Ebola strain – called the Makona strain – to link to the cholesterol transporters of human cells. Now, instead of cholesterol, the transporters were pulling Ebola into the cells. This new Makona strain was four times more infectious to humans. 

As you read these lines, perhaps a similar mutation is taking place in a single gene in the coronavirus that infected some person in Tehran, Milan or Wuhan. If this is indeed happening, this is a direct threat not just to Iranians, Italians or Chinese, but to your life, too. People all over the world share a life-and-death interest not to give the coronavirus such an opportunity. And that means that we need to protect every person in every country. 

In the 1970s humanity managed to defeat the smallpox virus because all people in all countries were vaccinated against smallpox. If even one country failed to vaccinate its population, it could have endangered the whole of humankind, because as long as the smallpox virus existed and evolved somewhere, it could always spread again everywhere. 

In the fight against viruses, humanity needs to closely guard borders. But not the borders between countries. Rather, it needs to guard the border between the human world and the virus-sphere. Planet Earth is teeming with countless viruses, and new viruses are constantly evolving due to genetic mutations. The borderline separating this virus-sphere from the human world passes inside the body of each and every human being. If a dangerous virus manages to penetrate this border anywhere on earth, it puts the whole human species in danger. 

Over the last century, humanity has fortified this border like never before. Modern healthcare systems have been built to serve as a wall on that border, and nurses, doctors and scientists are the guards who patrol it and repel intruders. However, long sections of this border have been left woefully exposed. There are hundreds of millions of people around the world who lack even basic healthcare services. This endangers all of us. We are used to thinking about health in national terms, but providing better healthcare for Iranians and Chinese helps protect Israelis and Americans too from epidemics. This simple truth should be obvious to everyone, but unfortunately it escapes even some of the most important people in the world. 

A Leaderless World 

Today humanity faces an acute crisis not only due to the coronavirus, but also due to the lack of trust between humans. To defeat an epidemic, people need to trust scientific experts, citizens need to trust public authorities, and countries need to trust each other. Over the last few years, irresponsible politicians have deliberately undermined trust in science, in public authorities and in international cooperation. As a result, we are now facing this crisis bereft of global leaders that can inspire, organize and finance a coordinated global response. 

During the 2014 Ebola epidemic, the U.S. served as that kind of leader. The U.S. also fulfilled a similar role during the 2008 financial crisis, when it rallied enough countries behind it to prevent global economic meltdown. But in recent years the U.S. has resigned its role as global leader. The current U.S. administration has cut support for international organizations like the World Health Organization, and has made it very clear to the world that the U.S. no longer has any real friends – it has only interests. When the coronavirus crisis erupted, the U.S. stayed on the sidelines, and has so far refrained from taking a leading role. Even if it eventually tries to assume leadership, trust in the current U.S. administration has been eroded to such an extent that few countries would be willing to follow it. Would you follow a leader whose motto is “Me First”? 

The void left by the U.S. has not been filled by anyone else. Just the opposite. Xenophobia, isolationism and distrust now characterize most of the international system. Without trust and global solidarity we will not be able to stop the coronavirus epidemic, and we are likely to see more such epidemics in the future. But every crisis is also an opportunity. Hopefully the current epidemic will help humankind realize the acute danger posed by global disunity. 

To take one prominent example, the epidemic could be a golden opportunity for the E.U. to regain the popular support it has lost in recent years. If the more fortunate members of the E.U. swiftly and generously send money, equipment and medical personnel to help their hardest-hit colleagues, this would prove the worth of the European ideal better than any number of speeches. If, on the other hand, each country is left to fend for itself, then the epidemic might sound the death-knell of the union. 

In this moment of crisis, the crucial struggle takes place within humanity itself. If this epidemic results in greater disunity and mistrust among humans, it will be the virus’s greatest victory. When humans squabble – viruses double. In contrast, if the epidemic results in closer global cooperation, it will be a victory not only against the coronavirus, but against all future pathogens. 

Copyright © Yuval Noah Harari 2020


Friday, March 06, 2020

The People of Las Vegas



AMANDA FORTINI The Believer

Las Vegas is both stranger and more normal than you might imagine, and for some reason, people don’t think anyone lives there



1. 

It’s February in Las Vegas, and because I have managed to step on my glasses and break them, as I do at least once a year, I have gone to the LensCrafters at the Boulevard Mall, a faux deco artifact of midcentury Vegas that, like so many malls in America, is a mere husk of its former self. In a faculty meeting a few days earlier, I’d watched as one of my colleagues bent and manipulated a paper clip, then used it to refasten the left bow of his glasses, creating a tiny antenna at his temple. That’s not a look I’m after, so I am here, obsessively trying on frame after frame, as the young Iranian man who is helping me on this quiet Monday afternoon patiently nods or shakes his head: yes, yes; no, no, no. 

I order two pairs. LensCrafters, the movie theater chain of eyeglasses, is always offering deals: half off a second set of frames, a supersize popcorn for fifty cents more. While I wait, I walk around the mall, a 1.2-million-square-foot monstrosity built on seventy-five acres, with a 31,000-square-foot SeaQuest aquarium and a 28,000-square-foot Goodwill. 

Next door to LensCrafters, there’s a shop that sells gemstones, crystals, sage, and pink Himalayan salt lamps. The burning sage makes that end of the mall smell musky, animalistic—a strangely feral odor in this synthetic environment. Snaking its lazy way around the scuffed tile floor is an automated miniature train, the sort children might ride at the zoo, driven by an adult man dressed as a conductor; it toots loudly and gratingly at regular intervals. JCPenney and Macy’s and Dillard’s closed months and years ago, while Sears is limping along in its fire-sale days. At Foot Locker, I try on black-and-white Vans in an M. C. Escher print. At Hot Topic, I browse the cheap T-shirts printed with sayings like Keep Calm and Drink On and Practice Safe Hex. I eat a corn dog, fried and delicious, at a place called Hot Dog on a Stick. (I really do.) The atmosphere is depressing, in all its forced cheerfulness and precise evocation of the empty material promises of my ’80s-era youth. 

I am almost three miles east of the Strip, but I could be anywhere, at any ailing mall in America. The only clues that I am in Las Vegas are a couple of clothing shops that carry items like six-inch Lucite stilettos and pearl-encrusted crop tops. And then, outside, a well-worn swimsuit someone has discarded on a pedestal near the entrance, where Fleetwood Mac’s “Rhiannon” blares. The swimsuit has a built-in corset-like bra, an exoskeleton of sorts—it could probably stand on its own—and it’s as if someone has left a part of her body behind. There’s no pool, I think. Who undressed here? Such odd Vegas-y details are everywhere in this city—the Elvis impersonator shopping in full-spangled regalia at my local health food store, the pink vibrator melting on Maryland Parkway in 110-degree heat—and I assume you eventually become blind to them, but after four years here, I still see them. 

2. 

Las Vegas is a place about which people have ideas. They have thoughts and generalizations, takes and counter-takes, most of them detached from any genuine experience and uninformed by any concrete reality. This is true of many cities—New York, Paris, Prague in the 1990s—owing to books and movies and tourism bureaus, but it is particularly true of Las Vegas. It is a place that looms large in popular culture as a setting for blowout parties and high-stakes gambling, a place where one might wed a stripper with a heart of gold, like Ed Helms does in The Hangover, or hole up in a hotel room and drink oneself to death, as Nicolas Cage does in Leaving Las Vegas. Even those who would never go to Las Vegas are in the grip of its mythology. Yet roughly half of all Americans, or around 165 million people, have visited, and one slivery weekend glimpse bestows on them a sense of ownership and authority. 

Of course, most tourists stay on the Strip, that 4.2-mile neon stretch of hotels and casinos: an artificial, sealed-off capsule where they remain for the duration of their visit, having some carefully orchestrated corporate fun. The city understands how the mythology fuels the business of tourism, and it does its part to sell it. City of sin, city of vice, of wild abandon and crazy drunk antics. “What happens here stays here,” says the infamous official ad slogan—whether that’s a lap dance at Spearmint Rhino or an embarrassingly pricey brunch at Giada. Implicit in this unabashed celebration of excess and vice is the notion that nothing that occurs here will be too sinful, too dangerous, or too scary: “Just the right amount of wrong,” as one casino ad goes. For the most part, all of this is true. But anyone who has walked along Flamingo Road and observed the Strip’s human backwash in the pale gray light of a weekend morning, or who has talked to a gaming lawyer about what happens when a person can’t pay their markers (more than $650 is a felony in Nevada, carrying one to four years in prison and up to $5,000 in fines), knows that the fun isn’t always without consequences. One might argue that people have ideas about Las Vegas because they have a shit-ton of ideas about morality, and a fearful desire to distance themselves from their weaker, more susceptible human counterparts. 

All these received narratives, these Vegas hand-me-downs, get recycled by the journalists who parachute in and out of here. These writers (who, it’s worth noting, are almost always male) swoop in for a day or two or four, steep themselves in authority and gin at a casino bar, and deliver their pronouncements on the very essence of this “wild and crazy” place. They attach like barnacles to the same tired tropes, even the same language (all the singers in Las Vegas tend to “croon”), and their takes are often hackneyed and snooty at once. They will remind you that Las Vegas, that neon fever dream, is set down smack in the middle of the Mojave Desert. They’ll note that the lights of the Strip can be seen from outer space. They’ll train their lens on the excessive, the gaudy, the vulgar, and the seedy, of which there is certainly plenty here. There will be a scene from the airport, and it will mention the slot machines there, plus the weirdos playing them—of that you can be sure. It’s not that these ideas are wrong, or not exactly; it’s more that they’re hazy, or half-baked, or tend to conflate the marketing of the Strip with the city itself. Any writer knows that you can’t be a years-deep expert on everything you write about, but when you have very little experience of a subject, a single point can look like a line. One colleague perfectly summed up this lazy tourist-journalism with the phrase “Let’s check in and see how stupid and craven everyone is in Las Vegas.” 

At its root, such writing is often not about the myths of Las Vegas but about the myths these writers hold about themselves, and how those play out against the backdrop of this city. Las Vegas is the setting, the mise-en-scène, for a rambling, gambling writer, in the now-familiar vein of Hunter S. Thompson and his drug-addled, hallucinatory early ’70s romp through Circus Circus. Other people are merely bit players in a private script. A handful of uncouth, uneducated characters might get a mention—the conventioneers in their Hawaiian shirts sucking on daiquiris as big as prizewinning squash, the alt-right talk radio dummies roaming around a gun show with delusions of heroism in their heads—but it’s rare that anyone is actually interviewed. Writing about Las Vegas is inevitably an extreme case of the problem of travel writing more generally: its practitioners forget that the way to understand a place is to get out and see it, and to talk to its people. 

In most cases, the parachute writers seem unaware of—or perhaps just uninterested in—the fact that the city has people: there are roughly 2.23 million permanent residents of the Las Vegas metropolitan area. These are the bartenders and cooks, the cocktail waitresses and card dealers, the valet parkers and hotel maids, who keep this adult playland in motion, but they are also the doctors and nurses and teachers and lawyers who keep any city in motion. The Las Vegas Valley, a vast, sprawling sixteen-hundred-square-mile expanse of desert surrounded almost entirely by mountains and foothills, looks like pretty much any other Western metropolitan area, with churches, big-box stores, fast-food chains, and strip malls populated by insurance offices and vape shops. Three times as many people live outside the city limits as within them, in one of the area’s four other cities (Boulder City, Henderson, Mesquite, North Las Vegas) or in unincorporated Clark County.1 They live like much of America does: going to church, to work, to school, to bars, to buy garbage bags, to get their teeth cleaned. 

Las Vegans who consider themselves culturally sophisticated (like many of my university colleagues) tend to distance themselves from the Strip’s uncomplicated and coarse enchantments, emphatically claiming they never set foot in any of the restaurants, nightclubs, or overpriced boutiques that populate the casinos. This may be true, but I also think it’s a defensive, contrarian reaction to the prevailing clichĂ©s—a kneejerk assertion of individuality in response to years of stereotyping. One could argue that some residents get so caught up in rejecting the clichĂ©s that they, too, cease to see the place as it is. Because to say that the world of the casinos and the Vegas beyond them are wholly separate fiefdoms is just as inaccurate as saying that Las Vegas and the Strip are synonymous. As one friend, a gifted writer in his late twenties who was born and raised here, told me, “It’s not either/or. To say that is just wrong.” 

The residents of Las Vegas interact with the tens of millions of tourists who visit each year—around 42 million, according to the Las Vegas Convention and Visitors Authority, three times as many as go to Mecca—in fascinating and complicated ways. This is a company town, after all. The Bureau of Labor Statistics reports that approximately 300,000 people work in the leisure and hospitality sector here, far more than in any other industry. (The next closest, trade, transportation, and utilities, employs 176,000 people.) My students tell me about their jobs on the Strip: as cocktail waitresses at swimming pools and sports bars, where men comment on their uniforms and tourists bet on games; at the Fashion Show mall, where at night they wait on drunk people who have decided to do a little shopping. My female students talk about the male tourists hitting on them at nightclubs, where, even more so than in other cities, every interaction comes freighted with the awareness that it will only ever be temporary. Some weekends, my friends arrange a field trip to a casino buffet, where people from all over the world drink bottomless mimosas. One Friday night, a girlfriend takes me to see the show Magic Mike Live, and we sit marooned in a sea of hooting bachelorettes wearing veils. In some ways, Las Vegans are like the permanent crew of a cruise ship—Las Vegas as The Love Boat, if you will—and the tourists, the real character actors, stream on and off, week in and week out. 

I have often wondered whether the general ignorance about Las Vegas is born of laziness, snobbery, or an altogether more insidious impulse. Las Vegas was, of course, dĂ©classĂ© and embarrassing from the start: founded by the Mafia, the first “unaristocratic” Americans, as Tom Wolfe wrote, “to have enough money to build a monument to their style of life.” It’s frequently said that Las Vegas has no culture, but that’s not true. My Italian relatives from Illinois—my aunts with their Carmela Soprano hairdos and long acrylic nails—love it for a reason. They love playing the slots downtown at the Golden Nugget and going out for martini dinners at old-school Italian places. (At one of these, I heard Pia Zadora breathily sing about her “accidents and arrests.”) They love Cirque du Soleil shows, where you can sit and watch first-class acrobats fly across the stage while you sip from a plastic cup of beer. Las Vegas is vernacular culture—“prole,” Wolfe called it—and thus, he notes, “it gets ignored, except on the most sensational level.” Those who think of themselves as cultured and educated look down on Las Vegas as garish and brazen. But concern about “good taste” is often just socially palatable code for classism and racism. This is a working-class town that’s nearly 33 percent Hispanic, 12 percent Black, and 7 percent Asian. It has one of the largest populations of undocumented immigrants in the country, and the eighth-highest rate of homelessness. Consider these demographics, and one starts to understand why the people of Las Vegas get overlooked. 

3. 

One of the strangest side effects of moving to Las Vegas is that no one can remember where I live. I have always divided my time between multiple locales, and my friends and colleagues never had a problem recalling where I was. I’m convinced this amnesia is an outgrowth of the fact that no one quite believes anyone lives here. When they do remember, they’ll say, fuzzily, confusedly: “Are you still in Las Vegas? What’s that like? How’s that going?” I can’t help but detect more than a little class bias in their incredulity; they can’t seem to understand why someone who has a choice, who isn’t required to live in Las Vegas, would choose to do so anyway. 

When you say that yes, you do in fact reside in Las Vegas, they’ll say, “I could never live there.” They mean on the Strip, with its hectic all-night carousing, its decadent, exorbitant dinners with steaks as big as your face, its bass-heavy club music that feels like it’s rattling your organs. (Once, at XS, the nightclub at the Encore, I became convinced the thudding techno would give me a heart attack and made my husband leave with me.) Or they’ll ask you where in Las Vegas you live—like they’d know if you told them—and with a little probing you’ll realize that they assume you live in a casino hotel. Many people know Las Vegas as the place where entertainers are sent to perform in their dotage, a place populated almost solely by magicians and strippers. One friend told me he thought residents lived in trailers behind the casinos, like on the set of a Hollywood film, or maybe on golf courses, because he’d heard of politicians coming to Las Vegas to golf. People don’t grasp that most of the housing here is like housing anywhere—apartments, condos, single-family dwellings in suburban subdivisions—except that the homes are mostly stucco in flat desert shades of dirt, sand, and clay.2 It’s funny to think that one of the most surprising things you can say about a city is that people actually live there. 

But a counter idea is still an idea, an abstraction that doesn’t tell you much about the feeling and fabric of life. And if you’re playing a game of contrariness or defiance, you’re still captive to what’s come before. What does it mean, really, to say that people live in Las Vegas? Show me their lives. I say this to a colleague one afternoon as we wait in line to order burritos. He stops talking, cocks his head, and looks stricken. “Please,” he says, “tell me you’re not writing another thing about ‘the real Las Vegas.’” 

Oh, but I am, I think. To my mind, there isn’t much out there that evokes this so-called “real Las Vegas,” that treats it simply as a place some people visit and other people live. Literature should portray, raise questions, and perhaps come to some conclusions about existence, which nobody ever seeks through Las Vegas. People come to Las Vegas looking for their idea of Las Vegas; they don’t come here looking for life. 

4. 

I came here four years ago. I hadn’t planned to stay, but I somehow couldn’t leave. “This place grows on you like a fungus,” one fellow relative newcomer said. My husband had a literary fellowship, and I was hanging around here when a professor, an unassuming connector type who has since become a close friend, suggested I apply for a job at UNLV’s journalism school. My husband and I rented an apartment near the university, in the Vegas Towers: two “luxury” high-rise buildings built in 1974 that have some of the worst Yelp reviews I’ve ever read. “One of the shadiest places to rent in Vegas,” reads a typical one. “The staff seem pleasant on first encounter but they are all snakes.” My objections were mostly aesthetic: to the wall-to-wall beige ’80s-style carpeting whose chemical smell meant we had to keep the windows open at all times, to the dingy hallways that seemed never to be vacuumed—a green M&M once sat there, untouched, for a full three weeks. The airport is ten minutes away, and at around 5 a.m. each morning, I was awakened by the sound of airplanes ripping holes in the atmosphere. At night, a symphony of construction noises would begin, the players taking up their instruments at 9 p.m. and playing straight through until morning. On Sundays—cleaning day, I guess—someone would walk the hallways and spray an air freshener so cloyingly fragrant I could taste it for hours. 

But my students delighted and astonished me. UNLV is the most diverse undergraduate campus in the country, and their families came from all over the world—in Nevada, 38 percent of children live with at least one parent who is an immigrant. Many of these young people had made their way through one of the worst public school systems in the country (Nevada is ranked fiftieth overall, behind only New Mexico); they were the success stories. 

Fierce, tenacious, and hardworking (that’s not to say some weren’t infuriating or lazy), they were going to college while holding down full-time jobs in retail stores, as waitresses, as substitute teachers. One student, when I asked why he kept nodding off in class, told me he had to get up at 4:00 a.m. to open the Krispy Kreme at Excalibur. Another student, also always sleepy, told me she worked a 3:30 a.m.–2 p.m. security shift at the airport. Yet another young woman—a student I became quite close to—was, at fifteen years old, kicked out of her home in Pahrump by her mother, who hoped to protect her from her stepfather and his motorcycle-gang friends; she told me she used to steal toilet paper from her job so she didn’t have to buy it. She’d read every book I mentioned, even in passing. The White Album, Battleborn—I’d see them peeking out of her purse. Each week, she’d arrive at my office hours, where she was joined by other student regulars, who would come to hang out, eat cereal, recite slam poetry, ask advice, and tell me about their lives. 

When people wonder what I’m still doing in Las Vegas—someone is always asking you to justify your decision to live here, a phenomenon I’ve never experienced with any other place, and I’ve lived in rural Montana—I talk about my students. I mention the mass shooting here, which I covered for the better part of a year. But I could also say that in Las Vegas there is, at least in my experience, a curious and refreshing lack of class consciousness, what the critic Dave Hickey, a former resident, has called “a suppression of social differences,” and that, as a result, I know a wider and more varied range of people than I do almost anywhere else, whereas in New York, boringly, I knew mostly writers. I could explain that the creative community here is so small that writers and artists and intellectuals of all ages and backgrounds mix with one another. I could mention the arresting variation of the landscape: the fact that you can drive up the Eastside at dusk and look down at the city lights glittering like a sequined dress, and that a summer day in downtown Summerlin feels, save for the palm trees, like a day in suburban Connecticut. I could say that when I had kidney surgery last spring, friends took over my classes, offered to bring groceries, called and texted for updates—and the gestures were not dutiful but sincere. (You could argue that this would happen in other cities, but I have lived in those cities, and in my experience it doesn’t.) Sometimes I just say that Las Vegas—with its glossy celebrity-chef outposts, where the meals are painstakingly perfected, and its off-Strip restaurants, where you can find pretty much any cuisine you want—has the best food in the country. I sound like a travel magazine article, but it does. 

Yet the deeper truth is something far more complicated. As a writer, as a human, no place has ever captured my attention, my imagination, and my concern as this city has. There’s certainly plenty of mundane shit here—I have spent many lonely nights zombie-ing around the Target on Flamingo and Maryland—but I have also seen and heard things here that I’ve never witnessed anywhere else. Some of them are beautiful, some hilarious, some perplexing, dark, and disturbing, but they are all blessedly out in the open. “What is hidden elsewhere exists here in quotidian visibility,” to quote Dave Hickey again. 

Why is what’s invisible in other cities so visible here? Las Vegas was a place founded on a kind of clarity about human nature, and it has never pretended otherwise. Starting in 1931—the same year that gambling was legalized—Nevada passed the most lenient divorce law in the country, requiring only six weeks of residency to file, compared to years in some states. Prohibition barely registered here; alcohol continued to be served on Fremont Street, the town’s main gambling drag at the time, and on Block 16, the erstwhile red light district, where railroad workers, travelers, and, in the 1930s, Hoover Dam laborers would come for the saloons and the prostitutes. Sex work was legal here until 1941, and still is in various brothels around Nevada—the closest to the city are in nearby Pahrump, a little more than an hour away. Indeed, still, today, the whole tourism industrial complex is devoted to serving appetites of all kinds. On top of this, there aren’t the same brakes on behavior that exist elsewhere. Joan Didion once called Las Vegas “the most extreme and allegorical of American settlements, bizarre and beautiful in its venality and in its devotion to immediate gratification.” 

The metropolitan area also is—and, with the exception of the housing crisis in 2008, has long been—one of the fastest-growing communities in the country. In 1950, four years after Bugsy Siegel opened the Flamingo Hotel, the city’s first major luxury casino (he stepped in with a mob loan when Hollywood Reporter founder and columnist Billy Wilkerson ran out of funds), the population of Las Vegas proper was 24,624; now, a mere seventy years later, it is almost 645,000 and growing. Between 1985 and 1995, in fact, as the lavish resort hotels began to go up and gaming and tourism flourished, the city’s population nearly doubled. The entire Las Vegas Valley is relatively young and full of displaced people—in 2018, Nevada was the fourth most moved-to state, according to United Van Lines, with many of the transplants fleeing high housing costs in California. People are also drawn by the fact that the entrenched cultural institutions of the coasts, and their rigid mores, simply don’t exist here. Whoever you are, whatever you pretended to be back in Boston or New York, you don’t have to keep up that pretense. You can let it all hang out, and Las Vegas has long promised to let you. Tourists can be staggeringly drunk on the streets, couples can fight in public—I recently saw a married couple nearly come to blows over the amount of time it took to use a four-dollar coupon at Target—and truly no one will bat an eye. Combine the city’s dedication to encouraging shameless self-indulgence and its anything-goes outlaw ethos with seriously light policing outside of the Strip (Clark County’s services, like its school system, are, let’s just say, a little lacking) and you have human peccadilloes blatantly on display, along with human suffering. 

One early morning as I am leaving my apartment, two esoteric sports cars are idling in front of me, bumper to bumper: a man gets out of the rear car holding a giant aspirin-pink designer purse and hurls it, with all the rage in his body, into the first car, which is presumably occupied by the purse’s owner. Recently, at a party on the Strip, a four- or five-year-old girl in a mermaid costume posed for photos with partygoers; her parents, also dressed as mermaids, were placing her in people’s laps. “I don’t think children should be used as props,” my friend whispered, after the parents tried to sit the child on her, “but that’s just me.” I agreed, but the kid seemed to be enjoying herself. Downtown, on a sweltering late-spring afternoon, my husband and I watched as a man in a wheelchair determinedly kicked his way up Fremont Street, backward and uphill, with one leg, his only limb. My heart collapsed in on itself, as it does so often here. Just last Saturday, I saw a woman on the sidewalk outside my apartment, bathing her legs in beer. Well, it’s not water, I thought as I passed her, but it works. That’s a thought I never would have had before moving to Las Vegas. 

It's difficult to generalize about the people of an entire city, but one thing can be said about Las Vegans: They are honest, sometimes bleakly so, and they tend to recognize this quality in others. They believe, to quote Gretel Ehrlich, that "honesty is stronger medicine than sympathy, which may console but often conceals." Sure, there are those whose instinct is to protect, to boost, to paper over the city's problems—I once horrified a colleague by regaling a famous visiting writer with a story of witnessing a robbery in the Walgreens on Flamingo—but most Las Vegans are clear-eyed about the naked human drama taking place around them. In this, they are like the artists I know, and not surprisingly: to live here requires a certain independence of spirit and, to quote F. Scott Fitzgerald, "the ability to hold two opposed ideas in mind at the same time and still retain the ability to function." People here know firsthand that the banal lies alongside the sensational, that just because it's tacky doesn't mean it isn't fun, and that wealth here sits far too close to painful, abject poverty. Las Vegas is as regular a place as any other, where people shave their stubble and pay their bills, and as savage, as vulgar, and as glamorous a place as the Las Vegas of lore. Our state legislature votes on insulin pricing and voters elect a dead pimp to office, and when I need my computer fixed, I drive to the Apple store at Caesars Palace, where tourists are jostling to see water spout from fake Greco-Roman statues. "You laugh," someone here told me, "or you die."


5. 

Back at the Boulevard Mall, I return to LensCrafters to pick up my glasses. I'm anxious because I have one ear that's higher than the other and a touch of undiagnosed OCD, and getting them to fit the way I like them—almost floating, not gripping my ears like tight little baby hands—is always an ordeal. But the young man who helped me choose the frames bends and shapes them with such slow and attentive care that I am put immediately at ease. As he works, we talk. His name is Pouyan. He is twenty-eight years old, bearded and solidly built, with intense blue eyes and a warm, open manner. He immigrated to Las Vegas three years ago from Fardis, a city outside Tehran, by way of Turkey, to which he had escaped on foot, and where he was later met by his parents and younger brother. He was an optician in Iran, as was his father, but his family is Baha'i, a persecuted religious minority there, and the Iranian government shut their optical shops down.

In order to practice in the US, he had to go back to school to get his degree. He learned English from talking with his friends, he says, none of whom are from Iran, but his parents are struggling with the language. He tells me he's completed his courses and taken one of the five qualifying tests required in Nevada—they cost three hundred dollars each—but wants to help his mom set up an eyebrow-threading business before he studies for the rest. He hands me my glasses. I try them on. They are ever so slightly crooked, so he gently, delicately manipulates them some more. I imagine him extending such careful kindness to every customer, day in, day out, here at this nondescript LensCrafters in the mall.

What does it mean to write about people who are usually overlooked or ignored? I am thinking about this as I walk back to my office that afternoon, through a corridor on campus where someone often builds little sculpture-towers out of rocks—they remind me of Stonehenge in miniature. I see this found art every day; I wonder who creates these sculptures, and I marvel that the artist persists in re-creating them when students or maintenance workers topple them, as they always do. As I walk across the quad, I see a wire-thin man with close-cropped gray hair placing the rocks, rough-hewn and triangular, one on top of another. They stand as if by some ineffable magic. He tells me his name is Ken; he's part Mi'kmaq Indian, a civil engineer who ran an environmental remediation business for eleven years but is no longer practicing. He says his company removed asbestos from underneath the Statue of Liberty. He became an artist seventeen years ago, when he had a vision while planting a garden for his disabled mother in upstate New York, and he moved to Las Vegas in 2013. He calls his pieces "geoglyphs."

I’m always curious what compels people to create art outside the spotlight or the marketplace, so I ask him why he does it. He tells me that he can look at a pile of rocks—he gestures to a river of stones on the grass, an undifferentiated mass to my eye—and see how they could be beautiful. The shapes, angles, planes speak to him. They’re puzzle pieces he has to make fit. Every morning when he walks his dog, he will be here, making his towers, one rock precariously balanced on another. When they get knocked down, he will pile them up again. He will do this whether I write about him or not.
