Wednesday, January 08, 2014

Are we the new Rome? Heading for an uncontrollable collapse?
Commentary by Rob Goodman
Politico Magazine
 
What does decline sound like? I imagine equal parts self-pity and self-flagellation, moral outrage and exhaustion. Once we could have heard it from the original Capitol Hill, the seat of the failing Roman Republic. As Caesar told Rome’s Senate, “Certainly there was greater merit and wisdom in those who raised so mighty an empire from humble means, than in us, who can scarcely preserve what they so honorably acquired.” His enemy Cato responded, “There were other things that made them great, which we lack: industriousness at home; fair government abroad; minds impartial in council.” They lived in an era of decline, and they both knew it.
Do we? That exchange resonates as we look again over last year’s bruising budget battles—which, thankfully, appear to have reached their wearying apotheosis. In unsteady times, we’re compelled to look back: Tea Partiers imagine themselves as revolutionary Americans; revolutionary Americans (churning out pamphlets under names like “Publius,” “Brutus,” and “Cato”) imagined themselves as republican Romans; and those Romans measured themselves against the generations that bequeathed them an empire. We live in a nation modeled on Rome, founded by men who modeled themselves on Romans—and having traced Rome’s history in outline, from backwater republic to imperial power, it’s natural to wonder if the next step is ours as well.
It’s a fair worry. Across time and place, the breakdowns of republican governments share eerie similarities, as political conflicts spill beyond the bounds of the norms designed to hold them in check. Rome’s example warns us that a cycle of crisis politics, once entered into, grows increasingly difficult to escape. There is reason to believe that we’ve entered into just such a cycle. But there is also reason to hope that we can respond with a rededication to political norms—not with the panicked cries of tyranny and self-fulfilling predictions of collapse that doomed Rome.
Brinksmanship, “nuclear options” and shutdowns are not unique to American politics. The Roman Republic’s final years were increasingly prone to political conflicts so intractable that they left the government paralyzed. In 60 BCE, Cato, the leader of Rome’s traditionalist optimates faction, ground the Senate to a halt for months through unprecedented use of the filibuster. His prime target was a program of land reform that would distribute public farms to Rome’s veterans, a measure that the optimates feared would create the constituency for a military tyrant.
For the following year, Rome elected a government headed by Caesar, who promised to carry out the land reform, and Cato’s son-in-law Bibulus, who pledged to stop it (proving that we aren’t the first to struggle with divided government). Cato, a man of great personal courage but the politics of a brick wall, again blocked a Senate vote on the land bill. Bibulus attempted to halt public business by declaring every remaining day on the legislative calendar a religious holiday; as he announced to the public, “You shall not have this law this year, not even if you all want it!” Caesar answered with a constitutionally questionable step of his own, bypassing the Senate and pushing land reform directly through the Roman people’s assembly. Bibulus retaliated by barricading himself in his home, boycotting the government and postponing the next election for three months.
 
Though the victory was Caesar’s, the episode left a deep legacy of bitterness. Three years later, the optimates went on legislative strike again, in renewed protest against Caesar’s political faction. Rome’s hardline senators shut down the chamber, dressed in black mourning clothes, and, in the words of the ancient historian Cassius Dio, “spent the rest of the year as if they were in bondage and possessed no authority to choose officials or carry on any other public business.” Most significantly, they refused to schedule elections. The Republic faced the prospect of a new year with no elected government at all, until the senators backed down and allowed a vote on the calendar’s final day. By this point, stalemated government and manipulation of elections had become routine: Over the Republic’s last decade, elections were postponed in five consecutive years. And in the midst of the squabbling, the Forum heard louder and louder cries for a strongman to save Rome from the muck of self-government.
History has been called “a distant mirror” for good reason. We can see our own features reflected in the past, but only vaguely. A distance of 2,000 years means that there are no easy reflections between this shutdown story and ours. But at the same time, history would be worthless to us if we didn’t try to translate truths from one age to another. And one truth that does seem to translate is this: In republican government, norms matter profoundly.
Political elites aren’t simply bound by written rules; they’re bound as well by unwritten rules that are developed and refined in practice. We wouldn’t want to write the entire code of political behavior into law. Instead, we prize freedom of action and flexibility; we understand that a code of written rules that tried to anticipate every situation would be an oppressive failure. Perhaps, as a political community, we also value systems that depend on a degree of mutual trust in order to function. And yet our reliance on unwritten rules leaves us vulnerable. Built over generations of conflict, compromise and accommodation, norms can be discarded in an instant. They are far easier to break than to build—and breaking them only takes the defection of one side.
The Roman Republic was nearly five centuries old when it collapsed. In that time, it had developed norms against permanent filibuster campaigns, boycotts of government, bypassing the Senate to enact policy and postponement of elections. All of the steps I’ve described were legal. They were also disastrous. Collectively, layered one on the other, they normalized a state of crisis politics.

But given how far down the road of norm failure we have already traveled, there is also reason to fear that any such rededication to norms will come too little, too late. When norms collapse, they often fall in unpredictable cycles of resentment and recrimination. As two sides engage in tit-for-tat behavior, or as one side discovers that unwritten rules are hollow, republics can suffer cascades of norm-breaking. The last years of the Roman Republic saw unwritten rules called into question across nearly every dimension of its political life. Taboos against multiple terms in the highest office, and on “extraordinary,” multi-year military commands—both conceived as checks on elite ambition—were discarded. The Senate broke precedent and executed suspected conspirators without trial. Campaign spending exploded, pumping so much money into the economy that Rome’s interest rate briefly doubled. Religion devolved from a unifying force to a political tool, as politicians increasingly claimed the sanction of gods and omens to delay elections and block unfavorable laws.
It would be too facile to draw a direct causal line between each of these events. But they all took place within an atmosphere of deepening distrust and loss of faith, in which each breach made the next more likely. The political scientist Robert Dahl argued that self-government is most stable where societies develop a “system of mutual security”: where conflict is confined to a finite battlefield and political actors agree not to use all of the weapons at their disposal. What we see in the last years of the Roman Republic is a system of mutual security falling to pieces. In its last days, as Rome’s leaders met in a final effort to avert civil war, Cato himself scuttled a face-saving deal: As he shouted to the negotiators on his own side of the table, “You’re being deceived again!” The last measure of political trust had leached away.
But it’s not enough to identify these vicious cycles of failing norms. What sets them in motion? Rome’s example suggests at least one cause: These vicious cycles often begin when the stakes of politics increase more rapidly than a political culture can adapt.
The Republic’s consensus-driven institutions were built by and for an elite that enjoyed a rough and hardy equality. The founding republican elites, in a popular shorthand of the time, prayed to clay gods rather than marble ones. Empire changed that: In the judgment of an ancient historian who observed the Romans firsthand, “It is clear that when the state achieves considerable prosperity, lifestyles become more extravagant and men become unduly keen for offices and other objects of ambition.”
Feeding on foreign conquest, power grew more lucrative than it had ever been. A term in office at home was a promise of plunder abroad. That wealth could be turned into the best Greek education for one’s children, into the purchase of political “clients” and hangers-on and into the massive outlays that would secure election again. The growth of the political stakes outpaced the evolution of new norms for a new world: Rome never discovered how to sustainably meld empire abroad and republic at home.
In our time, a norm has held that a president is entitled to have qualified nominees confirmed by the Senate; yet judicial vacancies are at a historic high, and 68 confirmable positions were unfilled at the end of President Barack Obama’s first term. A norm has held that the Senate filibuster protects the rights of the minority in extraordinary circumstances; yet filibuster use has hit all-time record levels, and a 60-vote threshold in the Senate has turned from extraordinary to routine. A norm has held that Congress never threatens a debt default to gain political leverage; yet we have come within days of default twice in the past three years. All of these steps have been legal.
There’s reason to hope that this autumn’s brush with default has strengthened the norm against such threats. As Senate Minority Leader Mitch McConnell (R-Ky.) colorfully put it, “There’s no education in the second kick of a mule.” Similarly, there’s reason to hope that Senate Democrats’ important decision to limit the filibuster will actually shift the supermajority requirement back to its historic role: a gauge to register exceptional opposition, not a chronic choke point on legislation.
But growing scarcity can raise the stakes of political competition, too—and this seems to be the case in our republic. As the conservative commentator David Frum has persuasively argued, “Once it seemed possible to have the spending Democrats wanted, financed at the tax rates the Republicans wanted, while paying for sufficient national security and running bearable deficits. That sense of expansiveness is gone.” Whether the cause is an influx of wealth, or a dawning discovery of scarcity, the effect can be the same. Losing becomes much more costly and far more infuriating—in the first case, because of the fear that opponents will use their wealth to exclude you from power, and in the second case, because of the fear that one more election will put your political priorities permanently out of reach.

And this fear is redoubled whenever politics is seen not simply as a struggle over wealth, but as a struggle over identity. The early Roman Republic prided itself on being a society of small landowners who practiced the traditional virtues; the late Republic saw those small farms swallowed up by the plantations of the ascendant elite, a social revolution that added fire to the brutal debates over land reform. And in contrast to the homogeneity of the early days, the late Republic was thrown open to the world, and a tone of cultural paranoia crept into the language of the old guard. We can hear it, for instance, in a letter from Cato’s great-grandfather: “In due course, my son Marcus, I shall explain to you what I found out in Athens about those Greeks….They are a worthless and unruly tribe. Take this as a prophecy: when those folk give us their writings they will corrupt everything. All the more if they send their doctors here.”
The cultural paranoia of the optimates has its echo on today’s American right, in a nation reshaped by immigration and a shrinking white majority. Listen to an evangelical voter asked in a recent focus group to describe the disappearing world he grew up in: “Everybody is happy. Everybody is white. Everybody is middle class, whether or not they really are. Everybody looks that way….Very homogenous.” When politics looks like a question of cultural self-defense, political norms may also look like empty niceties unsuited to the urgency of the times.
We know where this downward spiral takes us. But what can we do to check it? A bit paradoxically, the worst thing we can do in a situation like this is to predict decline and collapse—because such predictions become self-fulfilling.
Rome’s hardliners were certainly born into an unstable state. But long before anyone else—long before it was a reality—they insisted that the Republic was not merely unstable, but falling apart. For more than a decade, Cato made his name as Rome’s prophet of tyranny. Inspired by these fears, the optimates took dramatic, uncompromising action that made collapse more likely.
On the eve of civil war, Cato declared to the Senate, as only a spurned prophet could, “Now these things are come to pass which I foretold to you!” But he failed to consider that doom-mongering is itself a political act: It turns the unthinkable into the acceptable, and justifies radicalism in the name of liberty. Rome’s political leaders broke norms because they believed the Republic was at stake—when the Republic was really in the norms themselves.
The idea that our republic is perpetually one small step away from tyranny is our most dangerous inheritance from the Romans. America’s founders regularly branded their opponents as would-be “Caesars,” and in our time, their style of argument has blended with apocalyptic religion and taken on new life, from Sen. Ted Cruz’s claim that “this is an administration that seems bound and determined to violate every one of our bill of rights,” to Gov. Rick Perry’s argument that implementing Obamacare is “a criminal act,” to Rep. Michele Bachmann’s belief that “we are in God’s end times history.” And while Democrats in the George W. Bush years did not practice obstruction with nearly such apocalyptic gusto, they too suffered their hyperbolic moments (Al Gore’s invocation of the president’s “digital Brown Shirts” comes to mind).
In all, it has become the background drone of our politics, the dull hum of impending doom. Let’s understand why this thinking appeals. Envisioning decline is addictive. It offers us the chance to imagine our times as extraordinary and to cast ourselves in heroic roles to meet them. And the thrill demands a higher dose of doom each year.
But let’s also understand what this thinking does. If our republic is at stake, then it’s reasonable to treat an elected president as illegitimate. If our republic is at stake, then it’s fair to nullify laws that offend us. If our republic is really at stake, then defaulting on our debts to save it—paying any price at all—is a bargain.
To study the Roman Republic’s last years is to watch this pattern play out in a distant mirror. And to study the years of its strength is to come into contact with an entirely different cast of mind. This is the awareness that the norms that grow up over generations of experience embody more wisdom than we know, are worthy of respect for their complexity and their practicality, and, if radically disturbed, will react in unpredictable ways. It’s the calm faith that our times are likely to be no more or less extraordinary than any other times. And it’s the conviction that vigilance is liberty’s price, but paranoia is its solvent. These are all fundamentally conservative insights, and they sit uncomfortably with radicals of any age. The Romans had a name for this cast of mind: mos maiorum, “the way of the elders.”
Rome’s tragedy is that the men who saw and sold themselves as guardians of the way of the elders did more than anyone to undermine it. Our hope is that we have what they lacked: the example of their failure.
Rob Goodman is a former House and Senate speechwriter.

Republicans Hold Unemployed Hostage to Partisan Politics

John Cassidy THE NEW YORKER
If you think Tuesday’s vote in the Senate to extend unemployment benefits means that Washington has finally come to its senses, think again. Although six Republican senators broke with their party and joined Democrats in supporting the notion of preserving benefits for about 1.3 million Americans who have been out of work for more than six months, this was just a procedural vote that paves the way for a full debate on the measure. And Republicans, in both the Senate and the House, have made clear that they won’t approve any actual legislation unless the White House agrees to cut spending in other areas, to cover the cost of the extension—about 6.4 billion dollars, according to the National Journal.
From a political perspective, it’s easy to see the appeal of this maneuver. Going into an election year, the last thing the Republicans want is to be depicted as heartless goons with no sympathy for the millions of Americans struggling to find work, the blameless victims of the Great Recession and its aftermath. (Of course, this is exactly how the Democrats would like to portray them.) At the same time, though, the average G.O.P. congressman or senator lives in mortal fear of upsetting right-wing groups, such as Heritage Action for America and the Club for Growth, which are leading the fight against extending jobless benefits. (On Monday, Heritage Action said it would include the Senate vote on its “legislative scorecard,” which ranks elected officials on their fealty to the conservative cause.)
This political dilemma explains why the list of Republicans who voted with the Democrats includes just one senator up for reëlection this year: Susan Collins, of Maine, who is widely regarded as a shoo-in. In the upside-down world of today’s G.O.P., the only members who can put the party’s best interests front and center are mavericks, such as Collins, and those who don’t face the prospect of being challenged from the right in a primary election. None of the other five defectors—Dean Heller, of Nevada; Kelly Ayotte, of New Hampshire; Dan Coats, of Indiana; Rob Portman, of Ohio; and Lisa Murkowski, of Alaska—is up for reëlection until 2016. Even so, most of them have said that they won’t support the final bill unless the White House agrees to cut other government programs.
So far, the Obama Administration has rejected this quid pro quo, and it should hold firm. The Emergency Unemployment Compensation system, which provides benefits averaging about two hundred and fifty dollars a week to the jobless after their twenty-six weeks of state-provided benefits run out, isn’t an ongoing spending program. As its name suggests, it is a contingency measure that goes into effect only when the unemployment rate is abnormally high and the unemployed are finding it unusually difficult to find work. Until very recently, the program enjoyed all-party support. Indeed, it was George W. Bush, in July, 2008, who signed the current version into law.
Back then, it is worth noting, the unemployment rate was 5.6 per cent and the long-term unemployment rate—i.e., the proportion of people in the workforce who had been out of work for more than six months—was one per cent. The average duration of unemployment was 17.1 weeks. Today, all of these figures are considerably higher. The unemployment rate is seven per cent, the long-term rate is about 2.5 per cent, and the average duration of unemployment is more than thirty-five weeks.
After a recession ends, it is always a judgment call when to wind down the emergency benefits program and limit eligibility to twenty-six weeks. But, as the White House Council of Economic Advisers noted in a recent study, the suggestion that extended benefits should be withdrawn now goes against more than half a century of history. “In no prior case has Congress allowed special extended benefits to expire when the unemployment rate was as high as it is today,” the study noted. “Moreover, the long-term unemployment rate is twice as high today as in any prior month when extended benefits were allowed to expire.” This record applies to Republican and Democratic Administrations alike. After the recession of the early nineteen-seventies, the Nixon Administration waited until the long-term jobless rate had reached 0.9 per cent before cutting off extended benefits; in the nineteen-eighties, the Reagan Administration waited until the rate had reached 1.2 per cent.
Today’s G.O.P. is different, of course. Its insistence that an extension be held hostage to finding equivalent cost savings in other areas defies precedent and economic logic. Firstly, the financial cost of the program is pretty modest: about two billion dollars a month in 2013. Secondly, almost all of this money goes straight back into the economy, where it generates additional spending, hiring, and tax revenues. And finally, as I said earlier, this is a temporary program, not a permanent one. Assuming the economy continues to grow and the unemployment rate continues to trend down, it will eventually be put on hiatus until the next recession.
Other than an ideological aversion to government spending of any kind, there is no reason not to extend unemployment benefits for a while longer. Economists sometimes worry that making them available for long periods will encourage the jobless to remain unemployed rather than taking jobs, but careful studies have failed to show much evidence of this. When employment openings are scarce, as they still are, a bigger worry is that curtailing benefits will encourage some of the long-term unemployed to drop out of the labor force completely. (As a condition for receiving benefits, recipients have to be looking for work.) When that happens, it inflicts further suffering on many of the people concerned, and crimps the growth potential of the economy at large.
Perhaps it isn’t accurate to say that most Republican senators and congressmen don’t care about these things. But they are trapped inside a party and a conservative movement that, increasingly, makes them act as though this were the case. The best hope for the long-term unemployed—and, indeed, for the G.O.P.—is that cynical self-interest and a fear of alienating moderate voters will ultimately persuade lawmakers to do the right thing, even if that means defying the ultras.
Will it happen? Let’s hope so.

Sunday, January 05, 2014

The Rise of the New New Left

Peter Beinart
DAILY BEAST 9/12
Bill de Blasio’s win in New York’s Democratic primary isn’t a local story. It’s part of a vast shift that could upend three decades of American political thinking.
Maybe Bill de Blasio got lucky. Maybe he only won because he cut a sweet ad featuring his biracial son. Or because his rivals were either spectacularly boring, spectacularly pathological, or running for Michael Bloomberg’s fourth term. But I don’t think so. The deeper you look, the stronger the evidence that de Blasio’s victory is an omen of what may become the defining story of America’s next political era: the challenge, to both parties, from the left. It’s a challenge Hillary Clinton should start worrying about now.
To understand why that challenge may prove so destabilizing, start with this core truth: For the past two decades, American politics has been largely a contest between Reaganism and Clintonism. In 1981, Ronald Reagan shattered decades of New Deal consensus by seeking to radically scale back government’s role in the economy. In 1993, Bill Clinton brought the Democrats back to power by accepting that they must live in the world Reagan had made. Located somewhere between Reagan’s anti-government conservatism and the pro-government liberalism that preceded it, Clinton articulated an ideological “third way”: Inclined toward market solutions, not government bureaucracy, focused on economic growth, not economic redistribution, and dedicated to equality of opportunity, not equality of outcome. By the end of Clinton’s presidency, government spending as a percentage of Gross Domestic Product was lower than it had been when Reagan left office.
For a time, small flocks of pre-Reagan Republicans and pre-Clinton Democrats endured, unaware that their species were marked for extinction. Hard as they tried, George H.W. Bush and Bob Dole could never muster much rage against the welfare state. Ted Kennedy never understood why Democrats should declare the era of big government over. But over time, the older generation in both parties passed from the scene and the younger politicians who took their place could scarcely conceive of a Republican Party that did not bear Reagan’s stamp or a Democratic Party that did not bear Clinton’s. These Republican children of Reagan and Democratic children of Clinton comprise America’s reigning political generation.
By “political generation,” I mean something particular. Pollsters slice Americans into generations at roughly 20-year intervals: Baby Boomers (born mid-1940s to mid-1960s); Generation X (mid-1960s to early 1980s); Millennials (early 1980s to 2000). But politically, these distinctions are arbitrary. To understand what constitutes a political generation, it makes more sense to follow the definition laid out by the early-20th-century sociologist Karl Mannheim. For Mannheim, generations were born from historical disruption. As he argued—and later scholars have confirmed—people are disproportionately influenced by events that occur between their late teens and mid-twenties. During that period—between the time they leave their parents’ home and the time they create a stable home of their own—individuals are most prone to change cities, religions, political parties, brands of toothpaste. After that, lifestyles and attitudes calcify. For Mannheim, what defined a generation was the particular slice of history people experienced during those plastic years. A generation had no set length. A new one could emerge “every year, every thirty, every hundred.” What mattered was whether the events people experienced while at their most malleable were sufficiently different from those experienced by people older or younger than themselves.
Mannheim didn’t believe that everyone who experienced the same formative events would interpret them the same way. Germans who came of age in the early 1800s, he argued, were shaped by the Napoleonic wars. Some responded by becoming romantic-conservatives, others by becoming liberal-rationalists. What they shared was a distinct generational experience, which became the basis for a distinct intra-generational argument.
Barack Obama and Bill Clinton share a moment at the 2012 Democratic National Convention. (Getty Images)

If Mannheim’s Germans constituted a political generation because in their plastic years they experienced the Napoleonic Wars, the men and women who today dominate American politics constitute a political generation because during their plastic years they experienced some part of the Reagan-Clinton era. That era lasted a long time. If you are in your late 50s, you are probably too young to remember the high tide of Kennedy-Johnson big government liberalism. You came of age during its collapse, a collapse that culminated with the defeat of Jimmy Carter. Then you watched Reagan rewrite America’s political rules. If you are in your early 40s, you may have caught the tail end of Reagan. But even if you didn’t, you were shaped by Clinton, who maneuvered within the constraints Reagan had built. To pollsters, a late 50-something is a Baby Boomer and an early 40-something is a Gen-Xer. But in Mannheim’s terms, they constitute a single generation because no great disruption in American politics divides them. They came of age as Reagan defined a new political era and Clinton ratified it. And as a rule, they play out their political struggles between the ideological poles that Reagan and Clinton set out.
To understand how this plays out in practice, look at the rising, younger politicians in both parties. Start with the GOP. If you look at the political biographies of nationally prominent 40-something Republicans—Bobby Jindal, Scott Walker, Paul Ryan, Marco Rubio, Ted Cruz—what they all have in common is Reagan. Jindal has said about growing up in Louisiana, “I grew up in a time when there weren’t a whole lot of Republicans in this state. But I identified with President Reagan.” At age 17, Scott Walker was chosen to represent his home state of Wisconsin in a Boys Nation trip to Washington. There he met “his hero, Ronald Reagan,” who “played a big role in inspiring me.” At age 21, Paul Ryan interned for Robert Kasten, who had ridden into the Senate in 1980 on Reagan’s coattails. Two years later he took a job with Jack Kemp, whose 1981 Kemp-Roth tax cut had helped usher in Reaganomics. Growing up in a fiercely anti-communist Cuban exile family in Miami, Marco Rubio writes in his autobiography that “Reagan’s election and my grandfather’s allegiance to him were defining influences on me politically.” Ted Cruz is most explicit of all. “I was 10 when Reagan became president,” he told a conservative group earlier this year. “I was 18 when he left the White House … I’ll go to my grave with Ronald Wilson Reagan defining what it means to be president … and when I look at this new generation of [Republican] leaders I see leaders that are all echoing Reagan.”
Younger Democratic politicians are less worshipful of Clinton. Yet his influence on their worldview is no less profound. Start with the most famous, still-youngish Democrat, a man who although a decade older than Rubio, Jindal, and Cruz, hails from the same Reagan-Clinton generation: Barack Obama. Because he opposed the Iraq War, and sometimes critiqued the Clintons as too cautious when running against Hillary in 2008, some commentators depicted Obama’s victory as a rejection of Clintonism. But to read The Audacity of Hope—Obama’s most detailed exposition of his political outlook—is to be reminded how much of a Clintonian Obama actually is. At Clintonism’s core was the conviction that to revive their party, Democrats must first acknowledge what Reagan got right.
Obama, in describing his own political evolution, does that again and again: “as disturbed as I might have been by Ronald Reagan’s election … I understood his appeal” (page 31). “Reagan’s central insight … contained a good deal of truth” (page 157). “In arguments with some of my friends on the left, I would find myself in the curious position of defending aspects of Reagan’s worldview” (page 289). Having given Reagan his due, Obama then sketches out a worldview in between the Reaganite right and unreconstructed, pre-Reagan left. “The explanations of both the right and the left have become mirror images of each other” (page 24), he declares in a chapter in which he derides “either/or thinking” (page 40). “It was Bill Clinton’s singular contribution that he tried to transcend this ideological deadlock” (page 34). Had the term not already been taken, Obama might well have called his intermediary path the “third way.”
The nationally visible Democrats rising behind Obama generally share his pro-capitalist, anti-bureaucratic, Reaganized liberalism. The most prominent is 43-year-old Cory Booker, who is famously close to Wall Street and supports introducing market competition into education via government-funded vouchers for private schools. In the words of New York magazine, “Booker is essentially a Clinton Democrat.” Gavin Newsom, the 45-year-old lieutenant governor of California, has embraced Silicon Valley in the same way Booker has embraced Wall Street. His book, Citizenville, calls for Americans to “reinvent government,” a phrase cribbed from Al Gore’s effort to strip away government bureaucracy in the 1990s. “In the private sector,” he told Time, “leaders are willing to take risks and find innovative solutions. In the public sector, politicians are risk-averse.” Julian Castro, the 39-year-old mayor of San Antonio and 2012 Democratic convention keynote speaker, is a fiscal conservative who supports NAFTA.
The argument between the children of Reagan and the children of Clinton is fierce, but ideologically, it tilts toward the right. Even after the financial crisis, the Clinton Democrats who lead their party don’t want to nationalize the banks, institute a single-payer health-care system, raise the top tax rate back to its pre-Reagan high, stop negotiating free-trade deals, launch a war on poverty, or appoint labor leaders rather than Wall Streeters to top economic posts. They want to regulate capitalism modestly. Their Reaganite Republican adversaries, by contrast, want to deregulate it radically. By pre-Reagan standards, the economic debate is taking place on the conservative side of the field. But—and this is the key point—there’s reason to believe that America’s next political generation will challenge those limits in ways that cause the leaders of both parties fits.
America’s youngest adults are called “Millennials” because the 21st century was dawning as they entered their plastic years. Coming of age in the 21st century is of no inherent political significance. But this calendric shift has coincided with a genuine historical disruption. Compared to their Reagan-Clinton generation elders, Millennials are entering adulthood in an America where government provides much less economic security. And their economic experience in this newly deregulated America has been horrendous. This experience has not produced a common generational outlook. No such thing ever exists. But it is producing a distinct intragenerational argument, one that does not respect the ideological boundaries to which Americans have become accustomed. The Millennials are unlikely to play out their political conflicts between the yard lines Reagan and Clinton set out.
In 2001, just as the first Millennials were entering the workforce, the United States fell into recession. By 2007 the unemployment rate had still not returned to its pre-recession level. Then the financial crisis hit. By 2012, data showed how economically bleak the Millennials’ first decade of adulthood had been. Between 1989 and 2000, when younger members of the Reagan-Clinton generation were entering the job market, inflation-adjusted wages for recent college graduates rose almost 11 percent, and wages for recent high school graduates rose 12 percent. Between 2000 and 2012, it was the reverse. Inflation-adjusted wages dropped 13 percent among recent high school graduates and 8 percent among recent graduates of college.
But it was worse than that. If Millennials were victims of a 21st-century downward slide in wages, they were also victims of a longer-term downward slide in benefits. The percentage of recent college graduates with employer-provided health care, for instance, dropped by half between 1989 and 2011.
Christine Quinn and Hillary Clinton meet in Manhattan. (Getty Images)

The Great Recession hurt older Americans, too. But because they were more likely to already have secured some foothold in the job market, they were more cushioned from the blow. By 2009, the net worth of households headed by someone over 65 was 47 times the net worth of households headed by someone under 35, almost five times the margin that existed in 1984.
One reason is that in addition to coming of age in a terrible economy, Millennials have come of age at a time when the government safety net is far more threadbare for the young than for the middle-aged and old. As the Economic Policy Institute has pointed out, younger Americans are less likely than their elders to qualify for unemployment insurance, food stamps, Temporary Assistance for Needy Families, or the Earned Income Tax Credit. (Not to mention Medicare and Social Security.)
Millennials have also borne the brunt of declines in government spending on higher education. In 2012, according to The New York Times, state and local spending per college student hit a 25-year low. As government has cut back, universities have passed on the (ever-increasing) costs of college to students. Nationally, the share of households owing student debt doubled between 1989 and 2010, and the average amount of debt per household tripled, to $26,000.
Economic hardship has not always pushed Americans to the left. In the Reagan-Clinton era, for instance, the right often used culture and foreign policy to convince economically struggling Americans to vote against bigger government. But a mountain of survey data—plus the heavily Democratic tilt of Millennials in every national election in which they have voted—suggests that they are less susceptible to these right-wing populist appeals. For one thing, right-wing populism generally requires rousing white, Christian, straight, native-born Americans against Americans who are not all those things. But among Millennials, there are fewer white, Christian non-immigrants to rouse. Forty percent of Millennials are racial or ethnic minorities. Less than half say religion is “very important” to their lives.
And even those Millennials who are white, Christian, straight, and native-born are less resentful of people who are not. According to a 2010 Pew survey, whites under the age of 30 were more than 50 points more likely than whites over 65 to say they were comfortable with someone in their family marrying someone of another ethnicity or race. A 2011 poll by the Public Religion Research Institute found that almost 50 percent of evangelicals under the age of 30 back gay marriage.
Of course, new racial, ethnic, and sexual fault lines could emerge. But today, a Republican seeking to divert Millennial frustrations in a conservative cultural direction must reckon with the fact that Millennials are dramatically more liberal than the elderly and substantially more liberal than the Reagan-Clinton generation on every major culture war issue except abortion (where there is no significant generational divide).
They are also more dovish on foreign policy. According to the Pew Research Center, Millennials are close to half as likely as the Reagan-Clinton generation to accept sacrificing civil liberties in the fight against terrorism  and much less likely to say the best way to fight terrorism is through military force.
A protester carries a flag at Occupy Wall Street. (Andrew Burton/Getty Images)

It is these two factors—their economic hardship in an age of limited government protection and their resistance to right-wing cultural populism—that best explain why on economic issues, Millennials lean so far left. In 2010, Pew found that two-thirds of Millennials favored a bigger government with more services over a cheaper one with fewer services, a margin 25 points above the rest of the population. While large majorities of older and middle-aged Americans favored repealing Obamacare in late 2012, Millennials favored expanding it, by 17 points. Millennials are substantially more pro–labor union than the population at large.
The only economic issue on which Millennials show much libertarian instinct is the privatization of Social Security, which they disproportionately favor. But this may be less significant than it first appears. Historically, younger voters have long been more pro–Social Security privatization than older ones, with support dropping as they near retirement age. In fact, when asked if the government should spend more money on Social Security, Millennials are significantly more likely than past cohorts of young people to say yes.
Most striking of all, Millennials are more willing than their elders to challenge cherished American myths about capitalism and class. According to a 2011 Pew study, Americans under 30 are the only segment of the population to describe themselves as “have nots” rather than “haves.” They are far more likely than older Americans to say that business enjoys more control over their lives than government.  And unlike older Americans, who favor capitalism over socialism by roughly 25 points, Millennials, narrowly, favor socialism.
There is more reason to believe these attitudes will persist as Millennials age than to believe they will change. For starters, the liberalism of Millennials cannot be explained merely by the fact that they are young, because young Americans have not always been liberal. In recent years, polls have shown young Americans to be the segment of the population most supportive of government-run health care. But in 1978, they were the least supportive. In the last two elections, young Americans voted heavily for Obama. But in 1984 and 1988, Americans under 30 voted Republican for president.

Nor is it true that Americans necessarily grow more conservative as they age. Sometimes they do. But academic studies suggest that party identification, once forged in young adulthood, is more likely to persist than to change. There’s also strong evidence from a 2009 National Bureau of Economic Research paper that people who experience a recession in their plastic years support a larger state role in the economy throughout their lives.
The economic circumstances that have pushed Millennials left are also unlikely to change dramatically anytime soon. A 2010 study by Yale economist Lisa Kahn found that even 17 years later, people who had entered the workforce during a recession still earned 10 percent less than those who entered when the economy was strong.  In other words, even if the economy booms tomorrow, Millennials will still be suffering the Great Recession’s aftershocks for decades.
And the economy is not likely to boom. Federal Reserve Chairman Ben Bernanke doesn’t believe the unemployment rate will reach 6 percent until 2016, and even that will be higher than the 1990s average. Nor are the government protections Millennials crave likely to appear anytime soon. To the contrary, as a result of the spending cuts signed into law in 2011 and the sequester that began this year, non-defense discretionary spending is set to decline by decade’s end to its lowest level in 50 years.
If Millennials remain on the left, the consequences for American politics over the next two decades could be profound. In the 2008 presidential election, Millennials constituted one-fifth of America’s voters. In 2012, they were one-quarter. In 2016, according to predictions by political demographer Ruy Teixeira, they will be one-third. And they will go on constituting between one-third and two-fifths of America’s voters through at least 2028.
This rise will challenge each party, but in different ways. In the runup to 2016, the media will likely feature stories about how 40-something Republicans like Marco Rubio, who blasts Snoop Dogg from his car, or Paul Ryan, who enjoys Rage Against the Machine, may appeal to Millennials in ways that geezers like McCain and Romney did not. Don’t believe it. According to a 2012 Harvard survey, young Americans were more than twice as likely to say Mitt Romney’s selection of Ryan made them feel more negative about the ticket than more positive. In his 2010 Senate race, Rubio fared worse among young voters than any other age group. The same goes for Rand Paul in his Senate race that year in Kentucky, and Scott Walker in his 2010 race for governor of Wisconsin and his recall battle in 2012.
Pre-election polls in Ted Cruz’s 2012 Senate race in Texas (there were no exit polls) also showed him faring worst among the young.
The likeliest explanation for this is that while younger Republican candidates may have a greater cultural connection to young voters, the ideological gulf is vast. Even if they are only a decade older than Millennials, politicians like Cruz, Rubio, and Walker hail from a different political generation both because they came of age at a time of relative prosperity and because they were shaped by Reagan, whom Millennials don’t remember. In fact, the militantly anti-government vision espoused by ultra-Reaganites like Cruz, Rubio, and Walker isn’t even that popular among Millennial Republicans. As a July Pew survey notes, Republicans under 30 are more hostile to the Tea Party than any other Republican age group. By double digits, they’re also more likely than other Republicans to support increasing the minimum wage.
Republicans may modestly increase their standing among young voters by becoming more tolerant on cultural issues and less hawkish on foreign policy, but it’s unlikely they will become truly competitive unless they follow the counsel of conservative commentators Ross Douthat and Reihan Salam and “adapt to a new reality—namely, that today, Americans are increasingly worried about their economic security.” If there’s hope for the GOP, it’s that Millennials, while hungry for government to provide them that economic security, are also distrustful of its capacity to do so. As a result of growing up in what Chris Hayes has called the “fail decade”—the decade of the Iraq War, Hurricane Katrina and the financial crisis—Millennials are even more cynical about government than the past generations of young Americans who wanted less from it. If a Republican presidential candidate could match his Democratic opponent as a champion of economic security and yet do so in a way that required less faith in Washington’s competence and benevolence, he might boost the GOP with young voters in a way no number of pop-culture references ever could.
If the Millennials challenge Reaganite orthodoxy, they will likely challenge Clintonian orthodoxy, too. Over the past three decades, Democratic politicians have grown accustomed to campaigning and governing in the absence of a mobilized left. This absence has weakened them: Unlike Franklin Roosevelt or Lyndon Johnson, Bill Clinton and Barack Obama could never credibly threaten American conservatives that if they didn’t pass liberal reforms, left-wing radicals might disrupt social order. But Democrats of the Reagan-Clinton generation have also grown comfortable with that absence. From Tony Coelho, who during the Reagan years taught House Democrats to raise money from corporate lobbyists, to Bill Clinton, who made Goldman Sachs co-chairman Robert Rubin his chief economic adviser, to Barack Obama, who gave the job to Rubin’s former deputy and alter ego, Larry Summers, Democrats have found it easier to forge relationships with the conservative worlds of big business and high finance because they have not faced much countervailing pressure from an independent movement of the left.
But that may be changing. Look at the forces that created Occupy Wall Street. The men and women who assembled in September 2011 in Zuccotti Park bore three key characteristics. First, they were young. According to a survey published by City University of New York’s Murphy Institute for Worker Education and Labor, 40 percent of the core activists involved in taking over the park were under 30 years old. Second, they were highly educated. Eighty percent possessed at least a bachelor’s degree, more than twice the percentage of New Yorkers overall. Third, they were frustrated economically. According to the CUNY study, more than half the Occupy activists under 30 owed at least $1,000 in student debt. More than a third had lost a job or been laid off in the previous five years. In the words of David Graeber, the man widely credited with coining the slogan “We are the 99 percent,” the Occupy activists were “forward-looking people who had been stopped dead in their tracks” by bad economic times.
 
Occupy Wall Street protesters picket during a May Day rally in front of the Bank of America building in 2012. (Monika Graff/Getty Images)

For a moment, Occupy shook the country. At one point in December 2011, Todd Gitlin points out in Occupy Nation, the movement had branches in one-third of the cities and towns in California. Then it collapsed. But as the political scientist Frances Fox Piven has argued, “The great protest movements of history … did not expand in the shape of a simple rising arc of popular defiance. Rather, they began in a particular place, sputtered and subsided, only to re-emerge elsewhere in perhaps a different form, influenced by local particularities of circumstance and culture.”
It’s impossible to know whether the protest against inequality will be such a movement. But the forces that drove it are unlikely to subside. Many young Americans feel that economic unfairness is costing them a shot at a decent life. Such sentiments have long been widespread among the poor. What’s new is their prevalence among people who saw their parents achieve—and expected for themselves—some measure of prosperity, the people Chris Hayes calls the “newly radicalized upper-middle class.”
If history is any guide, the sentiments behind Occupy will find their way into the political process, just as the anti-Vietnam movement helped create Eugene McCarthy’s presidential bid in 1968, and the civil-rights movement bred politicians like Andrew Young, Tom Bradley, and Jesse Jackson. That’s especially likely because Occupy’s message enjoys significant support among the young. A November 2011 Public Policy Polling survey found that while Americans over 30 opposed Occupy’s goals by close to 20 points, Millennials supported them by 12.
Bill de Blasio’s mayoral campaign offers a glimpse into what an Occupy-inspired challenge to Clintonism might look like. In important ways, New York politics has mirrored national politics in the Reagan-Clinton era. Since 1978, the mayoralty has been dominated by three men—Ed Koch, Rudy Giuliani, and Michael Bloomberg—who although liberal on many cultural issues have closely identified Wall Street’s interests with the city’s. During their time in office, New York has become far safer, cleaner, more expensive, and more unequal. In Bloomberg’s words, New York is now a “high-end product.”
City Council Speaker Christine Quinn, despite her roots on the left as a housing and LGBT activist, became Bloomberg’s heir apparent by stymieing bills that would have required businesses to give their employees paid sick leave and mandated a higher minimum wage for companies that receive government subsidies. Early in the campaign, many commentators considered this a wise strategy and anticipated that as New York’s first lesbian mayor, Quinn would symbolize the city’s unprecedented cultural tolerance while continuing its Clintonian economic policies.
Then strange things happened. First, Anthony Weiner entered the race and snatched support from Quinn before exploding in a blaze of late-night comedy. But when Weiner crashed, his support went not back to Quinn but to de Blasio, the candidate who most bluntly challenged Bloomberg’s economic philosophy. Calling it “an act of equalization in a city that is desperately falling into the habit of disparity,” de Blasio made his central proposal a tax on people making over $500,000 to fund universal childcare. He also called for requiring developers to build more affordable housing and ending the New York Police Department’s “stop and frisk” policies that had angered many African-Americans and Latinos. Bloomberg’s deputy mayor Howard Wolfson tweeted that de Blasio’s “agenda is clear: higher taxes, bigger govt, more biz mandates. A u-turn back to the 70s.”
But in truth, it was Wolfson who was out of date: Fewer and fewer New Yorkers remember the 1970s, when economic stagnation, rising crime, and bloated government helped elect both Ed Koch and Ronald Reagan. What concerns them more today is that, as The New Yorker recently noted, “If the borough of Manhattan were a country, the income gap between the richest twenty per cent and the poorest twenty per cent would be on par with countries like Sierra Leone, Namibia, and Lesotho.”  In Tuesday’s Democratic primary, Quinn defeated de Blasio in those parts of New York where average income tops $175,000 per year.  But he beat her by 25 points overall.
Democrats in New York are more liberal than Democrats nationally. Still, the right presidential candidate, following de Blasio’s model, could seriously challenge Hillary Clinton. If that sounds far-fetched, consider the last two Democratic presidential primary campaigns. In October 2002, Howard Dean was so obscure that at a Jefferson-Jackson Day dinner, Iowa Sen. Tom Harkin repeatedly referred to him as “John.” But in the summer of 2003, running against the Iraq War amidst a field of Washington Democrats who had voted to authorize it, Dean caught fire. In the first quarter of the year he raised just under $3 million, less than one-third of John Kerry’s total. In the second quarter, he shocked insiders by beating Kerry and raising over $7 million. In the third quarter, he raised almost $15 million, far more than any Democrat ever had. By November, Harkin, Al Gore, and the nation’s two most powerful labor unions had endorsed Dean and he was well ahead in the Iowa polls.
At the last minute, Dean flamed out, undone by harsh attacks from his rivals and his campaign’s lack of discipline. Still, he established a template for toppling a Democratic frontrunner: inspire young voters, raise vast funds via small donations over the Web, and attack those elements of Clintonian orthodoxy that are accepted by Democratic elites but loathed by liberal activists on the ground.
In 2008, that became the template for Barack Obama. As late as October 2007, Hillary enjoyed a 33-point lead in national polls. But Obama made her support for the Iraq War a symbol of her alleged timidity in challenging the right-leaning consensus in Washington. As liberals began to see him as embodying the historic change they sought, Obama started raising ungodly amounts via small donors over the Internet, which in turn won him credibility with insiders in Washington. He overwhelmed Hillary Clinton in caucus states, where liberal activists wield greater power. And he overwhelmed her among younger voters. In the 2008 Iowa caucuses, youth turnout rose 30 percent and among voters under the age of 30, Obama beat Hillary by 46 points.
Hillary starts the 2016 race with formidable strengths. After a widely applauded term as secretary of state, her approval rating is 10 points higher than it was when she began running in 2008. Her vote to authorize Iraq will be less of a liability this time. Her campaign cannot possibly be as poorly managed. And she won’t have to run against Barack Obama.
Still, Hillary is vulnerable to a candidate who can inspire passion and embody fundamental change, especially on the subject of economic inequality and corporate power, a subject with deep resonance among Millennial Democrats. And the candidate who best fits that description is Elizabeth Warren.
A crowd watches Azealia Banks onstage during the Coachella Festival. (Karl Walter/Getty Images)

First, as a woman, Warren would drain the deepest reservoir of pro-Hillary passion: the prospect of a female president. While Hillary would raise vast sums, Dean and Obama have both shown that in the digital age, an insurgent can compete financially by inspiring huge numbers of small donations. Elizabeth Warren can do that. She’s already shown a knack for going viral. A video of her first Senate Banking Committee hearing, where she scolded regulators that “too-big-to-fail has become too-big-for-trial,” garnered 1 million hits on YouTube. In her 2012 Senate race, despite never before having sought elected office, she raised $42 million, more than twice as much as the second-highest-raising Democrat. After Bill Clinton and the Obamas, no other speaker at last summer’s Democratic convention so electrified the crowd.
Warren has done it by challenging corporate power with an intensity Clinton Democrats rarely muster. At the convention, she attacked the “Wall Street CEOs—the same ones who wrecked our economy and destroyed millions of jobs—[who] still strut around Congress, no shame, demanding favors, and acting like we should thank them.”
And in one of the biggest applause lines of the entire convention, taken straight from Occupy, she thundered that “we don’t run this country for corporations, we run it for people.”
Don’t be fooled by Warren’s advanced age. If she runs, Millennials will be her base. No candidate is as well positioned to appeal to the young and economically insecure. Warren won her Senate race by eight points overall, but by 30 points among the young. The first bill she introduced in the Senate was a proposal to charge college students the same interest rates for their loans that the Federal Reserve offers big banks. It soon garnered 100,000 hits on YouTube.
A big reason Warren’s speech went viral was its promotion by Upworthy, a website dedicated to publicizing progressive narratives. And that speaks to another, underappreciated, advantage Warren would enjoy. Clinton Democrats once boasted a potent intellectual and media infrastructure. In the late 1980s and 1990s, the Democratic Leadership Council and its think tank, the Progressive Policy Institute, were the Democratic Party’s hottest ideas shops, and they dedicated themselves to restoring the party’s reputation as business-friendly. Influential New Democratic–aligned magazines like The New Republic and Washington Monthly also championed the cause. 
Today, that New Democratic infrastructure barely exists. The DLC has closed down. The New Republic and Washington Monthly have moved left. And all the new powerhouses of the liberal media—from Paul Krugman (who was radicalized during the Bush years) to Jon Stewart (who took over The Daily Show in 1999) to MSNBC (which as late as 2008 still carried a show hosted by Tucker Carlson)—believe the Democrats are too soft on Wall Street.
You can see that shift in the race for chairman of the Federal Reserve, where the liberal media has rallied behind Janet Yellen and against the more Wall Street–identified Larry Summers. In the age of MSNBC, populist Democrats enjoy a media echo chamber that gives them an advantage over pro-business Democrats that did not exist a decade ago. And if Clinton, whom liberal pundits respect, runs against Warren, whom liberal pundits revere, that echo chamber will benefit Warren.
Of course, Warren might not run. Or she might prove unready for the national stage. (She has no foreign-policy experience.) But the youthful, anti-corporate passion that could propel her candidacy will be there either way. If Hillary Clinton is shrewd, she will embrace it, and thus narrow the path for a populist challenger. Just as New York, by electing Ed Koch in 1978, foreshadowed a national shift to the right, New York in 2013 is foreshadowing a national shift to the left. The door is closing on the Reagan-Clinton era. It would be ironic if it were a Clinton herself who sealed it shut.
