Monday, May 22, 2017



How the Swastika Became a Confederate Flag

Editorial Observer
By BRENT STAPLES NY TIMES

The easy-listening white supremacists who surged out of the shadows during the presidential campaign are no less dangerous than their white power survivalist or raving skinhead counterparts. But they are hoping to rebrand themselves by wearing business clothes and attempting to sound reasonable as they advance a racist agenda. The debate about removing monuments to white supremacy that were built throughout the South a century or more ago is tailor-made for this tactic.

The white supremacist protest in Charlottesville, Va., this month over a plan to remove a statue of Robert E. Lee, the Confederate general, shows how this is likely to go. The marchers feigned civility. But a closer look shows that the protest drew on the toxic symbolism of the Third Reich in ways that few Americans would recognize.

By wielding torches in a protest staged by night, the demonstrators nodded to Nazi rallies held during the 1930s at Nuremberg, where the open flame was revered as a mystical means of purifying the Aryan spirit. They reinforced this toxic connection by chanting “blood and soil,” a Nazi-era slogan that connected German ethnic purity to cultivation of the land and, more broadly, to the notion that the “master race” was divinely entitled to confiscate the holdings of “lesser peoples,” even if it meant slaughtering them along the way.

The demonstrators at Charlottesville — led by the theatrically inclined white supremacist Richard Spencer — had no real interest in the civic or aesthetic value of the monument they ostensibly came to defend. The essence of their argument was that any attempt to renounce Confederate ideology by moving this — or any — monument would be an assault on the so-called white race.

The protest also celebrated the intimate connection between Nazi-era rule in Germany and Jim Crow-era rule in the United States. That connection, long overlooked by historians, was obvious to the network of black-owned newspapers that reached the peak of its influence during World War II.

The barons of the Negro press ridiculed the attempt to frame the war as a fight for liberty at a time when the military was segregating by race soldiers, nurses and even plasma in the wartime blood bank, and running Jim Crow military bases in ways that were fully consistent with the German view of Negroes and others as not fully human.

Editorial cartoonists underscored this point by depicting Hitler and Hirohito together, laughing uproariously, while reading newspaper accounts of lynchings in the American South. The Pittsburgh Courier finally made it palatable to African-Americans to support the war in Europe by recasting it as a struggle to vanquish Nazism abroad and Jim Crow racism at home.

Hitler drew a similar, more sinister comparison in “Mein Kampf.” He described the United States as “the one state” that had made headway toward what he regarded as a healthy and utterly necessary racist regime. Historians have long sought to minimize the importance of that passage. But in recent years, archival research in Germany has shown that the Nazis were keenly focused on Jim Crow segregation laws, on statutes that criminalized interracial marriage and on other policies that created second-class citizenship in the United States.

The Yale legal scholar James Q. Whitman fleshes this out to eerie effect in his new book “Hitler’s American Model: The United States and the Making of Nazi Race Law.” He illustrates how German propagandists sought to normalize the Nazi agenda domestically by putting forth the United States as a model. They assured the German people that Americans had “racist politics and policies,” just as Germany did, including “special laws directed against the Negroes, which limit their voting rights, freedom of movement, and career possibilities.” Embracing the necessity of lynching, one propagandist wrote: “What is lynch justice, if not the natural resistance of the Volk to an alien race that is attempting to gain the upper hand?”

“Hitler’s American Model” shows that homegrown American racism played a role in the notorious Nuremberg Laws of 1935, which deprived “non-Aryans” of citizenship and the right to marry “true” Germans. As Mr. Whitman writes, Nuremberg “signaled the full-scale creation of a racist state in a Germany on the road to the Holocaust.”

Nazism and the tradition of American white supremacy that is memorialized in monuments throughout the South are the fruit of the same poisonous tree. In this light, the Confederate flag can legitimately be seen as an alternate version of the Nazi emblem.

After the war, Germany tried to put Nazism back in its box by banning public display of swastikas and other emblems of the Third Reich. Later generations understood that to wear such an insignia was to smear oneself with history’s worst filth. Many Americans have failed to grasp this point. This explains why one still sees people parading around with both Nazi emblems and Confederate flags, openly embracing the meanings of both.

The new-age white supremacists who are so eager to expand their market share recognize that covering themselves with swastikas is a route to marginalization. They are betting that they can achieve the effect they seek by embracing the Confederate cause, while serving up easy-listening Nazism on the side.

Saturday, May 20, 2017

Killing C.I.A. Informants, China Crippled U.S. Spying Operations

By MARK MAZZETTI, ADAM GOLDMAN, MICHAEL S. SCHMIDT and MATT APUZZO NY TIMES

WASHINGTON — The Chinese government systematically dismantled C.I.A. spying operations in the country starting in 2010, killing or imprisoning more than a dozen sources over two years and crippling intelligence gathering there for years afterward.

Current and former American officials described the intelligence breach as one of the worst in decades. It set off a scramble in Washington’s intelligence and law enforcement agencies to contain the fallout, but investigators were bitterly divided over the cause. Some were convinced that a mole within the C.I.A. had betrayed the United States. Others believed that the Chinese had hacked the covert system the C.I.A. used to communicate with its foreign sources. Years later, that debate remains unresolved.

But there was no disagreement about the damage. From the final weeks of 2010 through the end of 2012, according to former American officials, the Chinese killed at least a dozen of the C.I.A.’s sources. According to three of the officials, one was shot in front of his colleagues in the courtyard of a government building — a message to others who might have been working for the C.I.A.

Still others were put in jail. All told, the Chinese killed or imprisoned 18 to 20 of the C.I.A.’s sources in China, according to two former senior American officials, effectively unraveling a network that had taken years to build.
Assessing the fallout from an exposed spy operation can be difficult, but the episode was considered particularly damaging. The number of American assets lost in China, officials said, rivaled those lost in the Soviet Union and Russia during the betrayals of both Aldrich Ames and Robert Hanssen, formerly of the C.I.A. and the F.B.I., who divulged intelligence operations to Moscow for years.

The previously unreported episode shows how successful the Chinese were in disrupting American spying efforts and stealing secrets years before a well-publicized breach in 2015 gave Beijing access to thousands of government personnel records, including intelligence contractors. The C.I.A. considers spying in China one of its top priorities, but the country’s extensive security apparatus makes it exceptionally hard for Western spy services to develop sources there.

At a time when the C.I.A. is trying to figure out how some of its most sensitive documents were leaked onto the internet two months ago by WikiLeaks, and the F.B.I. investigates possible ties between President Trump’s campaign and Russia, the unsettled nature of the China investigation demonstrates the difficulty of conducting counterespionage investigations into sophisticated spy services like those in Russia and China.

The C.I.A. and the F.B.I. both declined to comment.

Details about the investigation have been tightly held. Ten current and former American officials described the investigation on the condition of anonymity because they did not want to be identified discussing the information.
The first signs of trouble emerged in 2010. At the time, the quality of the C.I.A.’s information about the inner workings of the Chinese government was the best it had been for years, the result of recruiting sources deep inside the bureaucracy in Beijing, four former officials said. Some were Chinese nationals who the C.I.A. believed had become disillusioned with the Chinese government’s corruption.

But by the end of the year, the flow of information began to dry up. By early 2011, senior agency officers realized they had a problem: Assets in China, one of their most precious resources, were disappearing.

The F.B.I. and the C.I.A. opened a joint investigation run by top counterintelligence officials at both agencies. Working out of a secret office in Northern Virginia, they began analyzing every operation being run in Beijing. One former senior American official said the investigation had been code-named Honey Badger.

As more and more sources vanished, the operation took on increased urgency. Nearly every employee at the American Embassy was scrutinized, no matter how high ranking. Some investigators believed the Chinese had cracked the encrypted method that the C.I.A. used to communicate with its assets. Others suspected a traitor in the C.I.A., a theory that agency officials were at first reluctant to embrace — and that some in both agencies still do not believe.

Their debates were punctuated with macabre phone calls — “We lost another one” — and urgent questions from the Obama administration wondering why intelligence about the Chinese had slowed.

The mole hunt eventually zeroed in on a former agency operative who had worked in the C.I.A.’s division overseeing China, believing he was most likely responsible for the crippling disclosures. But efforts to gather enough evidence to arrest him failed, and he is now living in another Asian country, current and former officials said.

There was good reason to suspect an insider, some former officials say. Around that time, Chinese spies compromised National Security Agency surveillance in Taiwan — an island Beijing claims is part of China — by infiltrating Taiwanese intelligence, an American partner, according to two former officials. And the C.I.A. had discovered Chinese operatives in the agency’s hiring pipeline, according to officials and court documents.

But the C.I.A.’s top spy hunter, Mark Kelton, resisted the mole theory, at least initially, former officials say. Mr. Kelton had been close friends with Brian J. Kelley, a C.I.A. officer who in the 1990s was wrongly suspected by the F.B.I. of being a Russian spy. The real traitor, it turned out, was Mr. Hanssen. Mr. Kelton often mentioned Mr. Kelley’s mistreatment in meetings during the China episode, former colleagues say, and said he would not accuse someone without ironclad evidence.

Those who rejected the mole theory attributed the losses to sloppy American tradecraft at a time when the Chinese were becoming better at monitoring American espionage activities in the country. Some F.B.I. agents became convinced that C.I.A. handlers in Beijing too often traveled the same routes to the same meeting points, which would have helped China’s vast surveillance network identify the spies in its midst.

Some officers met their sources at a restaurant where Chinese agents had planted listening devices, former officials said, and even the waiters worked for Chinese intelligence.

This carelessness, coupled with the possibility that the Chinese had hacked the covert communications channel, would explain many, if not all, of the disappearances and deaths, some former officials said. Some in the agency, particularly those who had helped build the spy network, resisted this theory and believed they had been caught in the middle of a turf war within the C.I.A.

Still, the Chinese picked off more and more of the agency’s spies, continuing through 2011 and into 2012. As investigators narrowed the list of suspects with access to the information, they started focusing on a Chinese-American who had left the C.I.A. shortly before the intelligence losses began. Some investigators believed he had become disgruntled and had begun spying for China. One official said the man had access to the identities of C.I.A. informants and fit all the indicators on a matrix used to identify espionage threats.

After leaving the C.I.A., the man decided to remain in Asia with his family and pursue a business opportunity, which some officials suspect that Chinese intelligence agents had arranged.

Officials said the F.B.I. and the C.I.A. lured the man back to the United States around 2012 with a ruse about a possible contract with the agency, an arrangement common among former officers. Agents questioned the man, asking why he had decided to stay in Asia, concerned that he possessed a number of secrets that would be valuable to the Chinese. It’s not clear whether agents confronted the man about whether he had spied for China.

The man defended his reasons for living in Asia and did not admit any wrongdoing, an official said. He then returned to Asia.

By 2013, the F.B.I. and the C.I.A. concluded that China’s success in identifying C.I.A. agents had been blunted — it is not clear how — but the damage had been done.

The C.I.A. has tried to rebuild its network of spies in China, officials said, an expensive and time-consuming effort led at one time by the former chief of the East Asia Division. A former intelligence official said the former chief was particularly bitter because he had worked with the suspected mole and recruited some of the spies in China who were ultimately executed.

China has been particularly aggressive in its espionage in recent years, beyond the breach of the Office of Personnel Management records in 2015, American officials said. Last year, an F.B.I. employee pleaded guilty to acting as a Chinese agent for years, passing sensitive technology information to Beijing in exchange for cash, lavish hotel rooms during foreign travel and prostitutes.

In March, prosecutors announced the arrest of a longtime State Department employee, Candace Marie Claiborne, accused of lying to investigators about her contacts with Chinese officials. According to the criminal complaint against Ms. Claiborne, who pleaded not guilty, Chinese agents wired cash into her bank account and showered her with gifts that included an iPhone, a laptop and tuition at a Chinese fashion school. In addition, according to the complaint, she received a fully furnished apartment and a stipend.

Thursday, May 18, 2017

Israel-Palestine

Israel-Palestine: the real reason there’s still no peace

The possibility of a lasting deal seems as far away as ever – and the history of failed negotiations suggests it’s largely because Israel prefers the status quo
by Nathan Thrall London Guardian

Scattered over the land between the Jordan river and the Mediterranean Sea lie the remnants of failed peace plans, international summits, secret negotiations, UN resolutions and state-building programmes, most of them designed to partition this long-contested territory into two independent states, Israel and Palestine. The collapse of these initiatives has been as predictable as the confidence with which US presidents have launched new ones, and the current administration is no exception.
In the quarter century since Israelis and Palestinians first started negotiating under US auspices in 1991, there has been no shortage of explanations for why each particular round of talks failed. The rationalisations appear and reappear in the speeches of presidents, the reports of thinktanks and the memoirs of former officials and negotiators: bad timing; artificial deadlines; insufficient preparation; scant attention from the US president; want of support from regional states; inadequate confidence-building measures; coalition politics; or leaders devoid of courage.

Among the most common refrains are that extremists were allowed to set the agenda and there was a neglect of bottom-up economic development and state-building. And then there are those who point at negative messaging, insurmountable scepticism or the absence of personal chemistry (a particularly fanciful explanation for anyone who has witnessed the warm familiarity of Palestinian and Israeli negotiators as they reunite in luxury hotels and reminisce about old jokes and ex-comrades over breakfast buffets and post-meeting toasts). If none of the above works, there is always the worst cliche of them all – lack of trust.
Postmortem accounts vary in their apportioning of blame. But nearly all of them share a deep-seated belief that both societies desire a two-state agreement, and therefore need only the right conditions – together with a bit of nudging, trust-building and perhaps a few more positive inducements – to take the final step.

In this view, the Oslo accords of the mid-1990s would have led to peace had it not been for the tragic assassination of the Israeli prime minister Yitzhak Rabin in 1995. The 1998 Wye River Memorandum and its commitment to further Israeli withdrawals from the West Bank would have been implemented if only the Israeli Labor party had joined Benjamin Netanyahu’s coalition to back the agreement. The Camp David summit in July 2000 would have succeeded if the US had been less sensitive to Israeli domestic concerns, insisted on a written Israeli proposal, consulted the Arab states at an earlier phase, and taken the more firm and balanced position adopted half a year later, in December 2000, when President Clinton outlined parameters for an agreement. Both parties could have accepted the Clinton parameters with only minimal reservations had the proposal not been presented so fleetingly, as a one-time offer that would disappear when Clinton stepped down less than a month later. The negotiations in Taba, Egypt, in January 2001 were on the brink of agreement but failed because time ran out, with Clinton just out of office, and Ehud Barak facing almost certain electoral defeat to Ariel Sharon. The two major peace plans of 2003 – the US-sponsored road map to peace in the Middle East and the unofficial Geneva accord – could have been embraced had it not been for a bloody intifada and a hawkish Likud prime minister in power.

And on it goes: direct negotiations between the Palestinian president Mahmoud Abbas and Netanyahu in 2010 could have lasted more than 13 days if only Israel had agreed to temporarily halt construction of some illegal settlements in exchange for an extra $3bn package from the United States. Several years of secret back-channel negotiations between the envoys of Netanyahu and Abbas could have made history if only they hadn’t been forced to conclude prematurely in late 2013, because of an artificial deadline imposed by separate talks led by secretary of state John Kerry. And, finally, the Kerry negotiations of 2013–2014 could have led to a framework agreement if the secretary of state had spent even a sixth as much time negotiating the text with the Palestinians as he did with the Israelis, and if he hadn’t made inconsistent promises to the two sides regarding the guidelines for the talks, the release of Palestinian prisoners, curtailing Israeli settlement construction, and the presence of US mediators in the negotiating room.

Each of these rounds of diplomacy began with vows to succeed where predecessors had failed. Each included affirmations of the urgency of peace or warnings of the closing window, perhaps even the last chance, for a two-state solution. Each ended with a list of tactical mistakes and unforeseen developments that resulted in failure. And, just as surely, each neglected to offer the most logical and parsimonious explanation for failure: no agreement was reached because at least one of the parties preferred to maintain the impasse.

The Palestinians chose no agreement over one that did not meet the bare minimum supported by international law and most nations of the world. For years this consensus view supported the establishment of a Palestinian state on the pre-1967 lines with minor, equivalent land swaps that would allow Israel to annex some settlements. The Palestinian capital would be in East Jerusalem, with sovereignty over the holy site known to Jews as the Temple Mount and to Muslims as the Noble Sanctuary or al-Aqsa mosque compound, and overland contiguity with the rest of the Palestinian state. Israel would withdraw its forces from the West Bank and release Palestinian prisoners. And Palestinian refugees would be offered compensation, a right to return not to their homes but to their homeland in the State of Palestine, acknowledgment of Israel’s partial responsibility for the refugee problem, and, on a scale that would not perceptibly change Israel’s demography, a return of some refugees to their pre-1948 lands and homes.
Although years of violence and repression have led Palestinians to make some small concessions that chipped away at this compromise, they have not fundamentally abandoned it. They continue to hope that the support of the majority of the world’s states for a plan along these lines will eventually result in an agreement. In the meantime, the status quo has been made more bearable thanks to the architects of the peace process, who have spent billions to prop up the Palestinian government, create conditions of prosperity for decision-makers in Ramallah, and dissuade the population from confronting the occupying force.

Israel, for its part, has consistently opted for stalemate rather than the sort of agreement outlined above. The reason is obvious: the deal’s cost is much higher than the cost of making no deal. The damages Israel would risk incurring through such an accord are massive. They include perhaps the greatest political upheaval in the country’s history; enormous demonstrations against – if not majority rejection of – Palestinian sovereignty in Jerusalem and over the Temple Mount/Noble Sanctuary; and violent rebellion by some Jewish settlers and their supporters.

There could also be bloodshed during forcible evacuations of West Bank settlements and rifts within the body implementing the evictions, the Israeli army, whose share of religious infantry officers now surpasses one third. Israel would lose military control over the West Bank, resulting in less intelligence-gathering, less room for manoeuvre in future wars, and less time to react to a surprise attack. It would face increased security risks from a Gaza-West Bank corridor, which would allow militants, ideology and weapons-production techniques to spread from Gaza training camps to the West Bank hills overlooking Israel’s airport. Israeli intelligence services would no longer control which Palestinians enter and exit the occupied territories. The country would cease extraction of the West Bank’s natural resources, including water, lose profits from managing Palestinian customs and trade, and pay the large economic and social price of relocating tens of thousands of settlers.

Only a fraction of these costs could be offset by a peace agreement’s benefits. But chief among them would be the blow dealt to efforts to delegitimise Israel and the normalisation of relations with other nations of the region. Israeli businesses would be able to operate more openly in Arab states, and government cooperation with such countries as Saudi Arabia and the United Arab Emirates would go from covert to overt. Through a treaty with the Palestinians, Israel could attain the relocation of every Tel Aviv embassy to Jerusalem, and receive additional financial and security benefits from the US and Europe. But all of these combined do not come close to outweighing the deficits.

Nor have the moral costs of occupation for Israeli society been high enough to change the calculus. Ending international opprobrium is indeed important to the country’s elites, and as they find themselves increasingly shunned, the incentive to withdraw from the occupied territories will likely increase. But so far Israel has proven quite capable of living with the decades-old label of “pariah”, the stain of occupation and the associated impact on the country’s internal harmony and relations with diaspora Jews. For all the recent fretting about decreasing American Jewish support for Israel, the conversation today is not so different than it was at the time of the first Likud-led governments decades ago. Similarly enduring – and endurable – are the worries that occupation delegitimises Zionism and causes discord within Israel. More than 30 years ago, former deputy mayor of Jerusalem Meron Benvenisti wrote of growing numbers of Israelis who had doubts about Zionism, “expressed in the forms of alienation, emigration of young Israelis, the emergence of racist Jews, violence in society, the widening gap between Israel and the diaspora, and a general feeling of inadequacy”. Israelis have grown adept at tuning such criticisms out.
It was, is, and will remain irrational for Israel to absorb the costs of an agreement when the price of the alternative is so comparatively low. The consequences of choosing impasse are hardly threatening: mutual recriminations over the cause of stalemate, new rounds of talks, and retaining control of all of the West Bank from within and much of Gaza from without. Meanwhile, Israel continues to receive more US military aid per year than goes to all the world’s other nations combined, and presides over a growing economy, rising standards of living and a population that reports one of the world’s highest levels of subjective wellbeing. Israel will go on absorbing the annoying but so-far tolerable costs of complaints about settlement policies. And it will likely witness several more countries bestowing the State of Palestine with symbolic recognition, a few more negative votes in impotent university student councils, limited calls for boycotts of settlement goods, and occasional bursts of violence that the greatly overpowered Palestinians are too weak to sustain. There is no contest.

The real explanation for the past decades of failed peace negotiations is not mistaken tactics or imperfect circumstances, but that no strategy can succeed if it is premised on Israel behaving irrationally. Most arguments put to Israel for agreeing to a partition are that it is preferable to an imagined, frightening future in which the country ceases to be either a Jewish state or a democracy, or both. Israel is constantly warned that if it does not soon decide to grant Palestinians citizenship or sovereignty, it will become, at some never-defined future date, an apartheid state. But these assertions contain the implicit acknowledgment that it makes no sense for Israel to strike a deal today rather than wait to see if such imagined threats actually materialise. If and when they do come to be, Israel can then make a deal. Perhaps in the interim, the hardship of Palestinian life will cause enough emigration that Israel may annex the West Bank without giving up the state’s Jewish majority. Or, perhaps, the West Bank will be absorbed by Jordan, and Gaza by Egypt, a better outcome than Palestinian statehood, in the view of many Israeli officials.

It is hard to argue that forestalling an agreement in the present makes a worse deal more likely in the future: the international community and the PLO have already established the ceiling of their demands – 22% of the land now under Israeli control – while providing far less clarity about the floor, which Israel can try to lower. Israel has continued to reject the same Palestinian claims made since the 1980s, albeit with a few added Palestinian concessions. In fact, history suggests that a strategy of waiting would serve the country well: from the British government’s 1937 Peel Commission partition plan and the UN partition plan of 1947 to UN Security Council Resolution 242 and the Oslo accords, every formative initiative endorsed by the great powers has given more to the Jewish community in Palestine than the previous one. Even if an Israeli prime minister knew that one day the world’s nations would impose sanctions on Israel if it did not accept a two-state agreement, it would still be irrational to strike such a deal now. Israel could instead wait until that day comes, and thereby enjoy many more years of West Bank control and the security advantages that go with it – particularly valuable at a time of cataclysm in the region.
Israel is frequently admonished to make peace in order to avoid becoming a single, Palestinian-majority state ruling all the territory from the Jordan river to the Mediterranean Sea. But that threat does not have much credibility when it is Israel that holds all the power, and will therefore decide whether or not it annexes territory and offers citizenship to all its inhabitants. A single state will not materialise until a majority of Israelis want it, and so far they overwhelmingly do not. The reason Israel has not annexed the West Bank and Gaza is not for fear of international slaps on the wrist, but because the strong preference of most of the country’s citizens is to have a Jewish-majority homeland, the raison d’être of Zionism. If and when Israel is confronted with the threat of a single state, it can enact a unilateral withdrawal and count on the support of the great powers in doing so. But that threat is still quite distant.

In fact, Israelis and Palestinians are now farther from a single state than they have been at any time since the occupation began in 1967. Walls and fences separate Israel from Gaza and more than 90% of the West Bank. Palestinians have a quasi-state in the occupied territories, with its own parliament, courts, intelligence services and foreign ministry. Israelis no longer shop in Nablus and Gaza the way they did before the Oslo accords. Palestinians no longer travel freely to Tel Aviv. And the supposed reason that partition is often claimed to be impossible – the difficulty of a probable relocation of more than 150,000 settlers – is grossly overstated: in the 1990s, Israel absorbed several times as many Russian immigrants, many of them far more difficult to integrate than settlers, who already have Israeli jobs, fully formed networks of family support and a command of Hebrew.

As long as the Palestinian government and the Oslo system are in place, the world’s nations will not demand that Israel grant citizenship to Palestinians. Indeed, Israel has had a non-Jewish majority in the territory it controls for several years. Yet even in their sternest warnings, western governments invariably refer to an undemocratic Israel as a mere hypothetical possibility. Most of the world’s nations will refuse to call Israel’s control of the West Bank a form of apartheid – defined by the International Criminal Court as a regime of systematic oppression and domination of a racial group with the intention of maintaining that regime – so long as there is a chance, however slim, that Oslo remains a transitional phase to an independent Palestinian state.

Contrary to what nearly every US mediator has asserted, it is not that Israel greatly desires a peace agreement but has a pretty good fallback option. It is that Israel greatly prefers the fallback option to a peace agreement. No tactical brilliance in negotiations, no amount of expert preparation, no perfect alignment of the stars can overcome that obstacle. Only two things can: a more attractive agreement, or a less attractive fallback. The first of these options has been tried extensively, from offering Israel full normalisation with most Arab and Islamic states to promising upgraded relations with Europe, US security guarantees, and increased financial and military assistance. But for Israel these inducements pale in comparison to the perceived costs.

The second option is to make the fallback worse. This is what President Eisenhower did following the 1956 Suez crisis when he threatened economic sanctions to get Israel to withdraw from Sinai and Gaza. This is what President Ford did in 1975 when he reassessed US relations with Israel, refusing to provide it with new arms deals until it agreed to a second Sinai withdrawal. This is what President Carter did when he raised the specter of terminating US military assistance if Israel did not immediately evacuate Lebanon in September 1977. And this is what Carter did when he made clear to both sides at Camp David that the United States would withhold aid and downgrade relations if they did not sign an agreement. This, likewise, is what the US secretary of state James Baker did in 1991, when he forced a reluctant Prime Minister Yitzhak Shamir to attend negotiations in Madrid by withholding a $10bn loan guarantee that Israel needed to absorb the immigration of Soviet Jews. That was the last time the United States applied pressure of this sort.

The Palestinians, too, have endeavored to make Israel’s fallback option less attractive through two uprisings and other periodic bouts of violence. But the extraordinary price they paid proved unsustainable, and on the whole they have been too weak to worsen Israel’s fallback for very long. As a result, Palestinians have been unable to induce more from Israel than tactical concessions, steps meant to reduce friction between the populations in order not to end occupation but to mitigate it and restore its low cost.

Forcing Israel to make larger, conflict-ending concessions would require making its fallback option so unappealing that it would view a peace agreement as an escape from something worse. That demands more leverage than the Palestinians have so far possessed, while those who do have sufficient power have not been eager to use it. Since Oslo, in fact, the US has done quite the reverse, working to maintain the low cost of Israel’s fallback option. Successive US administrations have financed the Palestinian government, trained its resistance-crushing security forces, pressured the PLO not to confront Israel in international institutions, vetoed UN Security Council resolutions that were not to Israel’s liking, shielded Israel’s arsenal from calls for a nuclear-free Middle East, ensured Israel’s military superiority over all of its neighbours, provided the country with more than $3bn in military aid each year, and exercised its influence to defend Israel from criticism.
No less importantly, the United States has consistently sheltered Israel from accountability for its policies in the West Bank by putting up a facade of opposition to settlements that in practice is a bulwark against more significant pressure to dismantle them. The US and most of Europe draw a sharp distinction between Israel and the occupied territories, refusing to recognise Israeli sovereignty beyond the pre-1967 lines. When the limousine of the US president travels from West to East Jerusalem, the Israeli flag comes down from the driver-side front corner. US officials must obtain special permission to meet Israelis at the IDF’s central command headquarters in the Jerusalem settlement of Neve Yaakov or at the Justice Ministry in the heart of downtown East Jerusalem. And US regulations, not consistently enforced, stipulate that products from the settlements should not bear a made-in-Israel label.

Israel vehemently protests against this policy of so-called differentiation between Israel and the occupied territories, believing that it delegitimises the settlements and the state, and could lead to boycotts and sanctions of the country. But the policy does precisely the opposite: it acts not as a complement to punitive measures against Israel, but as an alternative to them.

Differentiation creates an illusion of US castigation, but in reality it insulates Israel from answering for its actions in the occupied territories, by assuring that only settlements and not the government that creates them will suffer consequences for repeated violations of international law. Opponents of settlements and occupation, who would otherwise call for costs to be imposed on Israel, instead channel their energies into a distraction that creates headlines but has no chance of changing Israeli behaviour. It is in this sense that the policy of differentiation, of which Europeans and US liberals are quite proud, does not so much constitute pressure on Israel as serve as a substitute for it, thereby helping to prolong an occupation it is ostensibly meant to bring to an end.

Support for the policy of differentiation is widespread, from governments to numerous self-identified liberal Zionists, US advocacy groups such as J Street that identify with centrist and centre-left parties in Israel, and the editorial board of the New York Times. Differentiation allows them to thread the needle of being both pro-Israel and anti-occupation, the accepted view in polite society. There are of course variations among these opponents of the settlements, but all agree that Israeli products that are created in the West Bank should be treated differently, whether through labelling or even some sort of boycott.

What supporters of differentiation commonly reject, however, is no less important. Not one of these groups or governments calls for penalising the Israeli financial institutions, real estate businesses, construction companies, communications firms, and, above all, government ministries that profit from operations in the occupied territories but are not headquartered in them. Sanctions on those institutions could change Israeli policy overnight. But the possibility of imposing them has been delayed if not thwarted by the fact that critics of occupation have instead advocated for a reasonable-sounding yet ineffective alternative.

Supporters of differentiation hold the view that while it may be justifiable to do more than label the products of West Bank settlements, it is inconceivable that sanctions might be imposed on the democratically elected government that established the settlements, legalised the outposts, confiscated Palestinian land, provided its citizens with financial incentives to move to the occupied territories, connected the illegally built houses to roads, water, electricity and sanitation, and provided settlers with heavy army protection. They have accepted the argument that to resolve the conflict more force is needed, but they cannot bring themselves to apply it to the state actually maintaining the regime of settlement, occupation and land expropriation that they oppose.

Since the end of the cold war, the United States has not so much as considered using the sort of pressure it once did, and its achievements during the past quarter-century have been accordingly meagre. US policymakers debate how to influence Israel, but without using almost any of the power at their disposal, including placing aid under conditions of changes in Israeli behaviour, a standard tool of diplomacy that officials deem unthinkable in this case.

Listening to them discuss how to devise an end to occupation is like listening to the operator of a bulldozer ask how to demolish a building with a hammer. The former Israeli defence minister Moshe Dayan once said: “Our American friends offer us money, arms and advice. We take the money, we take the arms, and we decline the advice.” Those words have become only more resonant in the decades since they were uttered.

Until the US and Europe formulate a strategy to make Israel’s circumstances less desirable than the concessions it would make in a peace agreement, they will shoulder responsibility for the oppressive military regime they continue to preserve and fund. When peaceful opposition to Israel’s policies is squelched and those with the capacity to dismantle the occupation don’t raise a finger against it, violence invariably becomes more attractive to those who have few other means of upsetting the status quo.

Through pressure on the parties, a peaceful partition of Palestine is achievable. But too many insist on sparing Israelis and Palestinians the pain of outside force, so that they may instead continue to be generous with one another in the suffering they inflict.

This is an adapted extract from The Only Language They Understand: Forcing Compromise in Israel and Palestine, published by Metropolitan Books.

Wednesday, May 17, 2017

The Criminal President?

By RICHARD W. PAINTER and NORMAN L. EISEN NY Times

After the revelations of the past 24 hours, it appears that President Trump’s conduct in and around the firing of the F.B.I. director, James Comey, may have crossed the line into criminality. The combination of what is known and what is credibly alleged would, if fully substantiated, constitute obstruction of justice. It is time for Congress and a special counsel in the executive branch to conduct objective, bipartisan inquiries into these allegations, together with the underlying matters involving Michael Flynn and Russia that gave rise to them.

First, the facts. On Jan. 26, Sally Yates, then the acting attorney general, informed the White House that Mr. Flynn had apparently lied about his conversations with the Russian ambassador. The next day, President Trump hosted Mr. Comey for a private dinner, during which he allegedly asked Mr. Comey repeatedly whether he would pledge his “loyalty” to him, which Mr. Comey declined to do.

On Feb. 14, the day after Mr. Flynn’s resignation as national security adviser, President Trump allegedly held Mr. Comey back after a meeting to say that Mr. Flynn had done nothing wrong and that “he is a good guy. I hope you can let this go.” Mr. Comey declined to drop the investigation, going on in March to confirm before Congress that it was ongoing, and later requesting greater resources from the Department of Justice to pursue it.

Finally, on May 9, President Trump fired Mr. Comey. We were first told he did so because Mr. Comey bungled the F.B.I.’s investigation into Hillary Clinton’s email. Two days later, President Trump changed his story: “In fact, when I decided to just do it, I said to myself, I said, ‘You know, this Russia thing with Trump and Russia is a made-up story. It’s an excuse by the Democrats for having lost an election that they should have won.’” The day after that, President Trump threatened Mr. Comey on Twitter, warning him against leaking to the press.

Any one of these facts or allegations, by itself, likely would not constitute obstruction of justice. After all, as the F.B.I. director himself stated, the president has the undisputed power under the Constitution to hire and fire members of his administration in the normal course of government business.

But what he cannot do is exercise that power corruptly, to spare himself or those associated with him, like Mr. Flynn, from scrutiny and possible criminal liability. To do so would run afoul of a series of federal statutes that define the crime of obstruction of justice. They are variations on the theme that anyone who “corruptly” or by “any threatening letter or communication” tries “to influence, obstruct, or impede, the due administration of justice” will be subject to criminal penalties.
The operative word here is “corruptly.” It means “an improper purpose,” or one that is “evil” or “wicked.” There is no precise formula for defining it; those involved in the administration of justice must continually wrestle with its interpretation.

Here, the evidence strongly suggests that the president acted corruptly. That starts with the demand for loyalty from Mr. Comey, the account of which the White House disputes. That demand can reasonably be understood to mean that Mr. Comey should protect Trump and follow his bidding, rather than honoring his oath to follow the evidence. It is also an implicit threat: Be loyal, or you will be fired.

When Mr. Comey did not seem to take the hint, Mr. Trump made his meaning crystal-clear on Feb. 14: Let the investigation go, and let Mr. Flynn go, too. The president denies this as well, of course, as he has denied so much else that has proven to be true. Who are we to believe: Mr. Comey, who would have no reason to accuse the president of obstruction of justice, and who has apparently preserved meticulous notes of his conversations? Or the president, who fact-checkers have demonstrated has told more lies in less time than any other modern occupant of the Oval Office?

While Mr. Trump might have been within his rights to fire Mr. Comey, this pattern of demands to protect himself and Mr. Flynn, followed by retaliation when the demands were not met, if proven, is a textbook case of wrongful conduct. Add to this the fact that Mr. Flynn was already offering testimony about the Russia connection in exchange for immunity from prosecution, and Mr. Trump’s clumsy attempt to dissemble the cause of the firing, and it is clear that a cover-up was afoot.

Finally, Mr. Trump topped things off with his tweeted threat to Mr. Comey; witness intimidation is both obstruction of justice in itself, and a free-standing statutory offense.

Taken together, this evidence is already more than sufficient to make out a prima facie case of obstruction of justice — and there are likely many more shoes to drop. Mr. Comey reportedly took notes on all of his encounters with the president. If what has emerged so far is any indication, this is unlikely to offer much comfort to Mr. Trump.

And there remains the core question of the president’s motives. Is he withholding his tax returns because they show evidence of “a lot of money pouring in from Russia,” as his son once stated, or do they show no such thing, as his lawyers claim? Why is Mr. Trump so fervently protecting Mr. Flynn: out of loyalty to a friend, or because Mr. Trump fears what that friend would say if he received immunity?

We have previously called for Congress to set up an independent 9/11-style commission on the Russia and Flynn investigations, and for the Department of Justice to appoint a special prosecutor. This appointment is necessary because Congress can’t actually prosecute anyone who may have committed crimes, including obstruction of justice, in connection with the Trump-Russia matter. This week’s revelations about the president, the most powerful man in the country, emphasize the need for these independent structures to be erected and to encompass these new allegations.

At least for now, we need not address the question, fully briefed to the Supreme Court during Watergate, but never resolved, of whether a special prosecutor could indict the president; as with Nixon, the question may again be obviated by other events, like the House initiating impeachment proceedings and the president resigning.

In the meantime, the House and Senate must continue their existing investigations and expand them, with the Judiciary Committees of both bodies immediately beginning hearings into the president’s abuse of power. Congress must be prepared to follow the evidence wherever it may lead.

Richard W. Painter, a professor at the University of Minnesota Law School, is the vice chairman, and Norman L. Eisen is the chairman, of Citizens for Responsibility and Ethics in Washington. They were chief White House ethics lawyers for Presidents George W. Bush and Barack Obama, respectively.

Friday, May 12, 2017

Rising high water blues

PETER COATES Times Literary Supplement

Only the catchment areas of the Amazon, the Congo and the Nile exceed that of the Mississippi, which drains 40 per cent of the United States, encompassing thirty-one states (and two Canadian provinces). Other renowned calamities in the United States – the (human-caused) Johnstown (Pennsylvania) flood of 1889 (more than 2,000 German and Welsh immigrant ironworkers lost their lives when a rickety dam burst higher up the valley), the Galveston (Texas) hurricane of 1900 (the nation’s deadliest hurricane), the San Francisco earthquake and fire (1906) and Hurricane Katrina (2005) – involved more physical damage and greater loss of life. But the area affected by the Mississippi flood of 1927, the most severe in US history, is unrivalled in the annals of American “natural” disasters. Unusually heavy and persistent precipitation began in August 1926 throughout the Mississippi basin and did not let up until the spring of 1927. Long-term processes of deforestation in the upper basin, wetland drainage and installation of monoculture agricultural regimes had seriously compromised the earth’s capacity to store moisture from rain and snow, hastening runoff and erosion of the “naked” soil. So, by late spring, 30,000 square miles across seven states, from Cairo, Illinois, to the Gulf of Mexico, inhabited by nearly a million people, stood under up to 30 feet of water.

At the flood’s height, an expanse equivalent to all the New England states was awash, and the river was 80 miles wide in places. As Vernon Tull, a character in William Faulkner’s novel, As I Lay Dying (1930), put it, you “couldn’t tell where was the river and where the land. It was just a tangle of yellow and the levee not less wider than a knife-back”. According to the American Red Cross, which spearheaded relief efforts, the death toll was 246. But this figure did not include the deaths of black Americans; the total body count was probably over a thousand. Between 700,000 and 900,000 people were rendered homeless. Around 130,000 homes were destroyed. Some 300,000 African Americans were consigned to makeshift refugee camps. At Mounds Landing, just north of Greenville, Mississippi, when a crevasse appeared in the levee on April 21, 1927, a wall of water poured through with a force equivalent to that of Niagara Falls. Around 13,000 residents were evacuated to higher ground, and local black men toiled at gunpoint to shore up the defences. This incident inspired the husband and wife duo, Kansas Joe McCoy and Memphis Minnie, to record “When the Levee Breaks” (1929), a song reworked in 1971 by Led Zeppelin for their fourth album.

The flood of 1927 has multiple dimensions: the harrowing individual sagas of up to a million refugees; a relief effort of unprecedented scale in American history; the hubristic “levees only” conviction of over-confident hydraulic engineers that the unruly, indomitable river had finally been tamed in the early twentieth century by lining its entire lower stretch with enormous dykes 30 feet high and 188 feet wide at the base; the uneven impact of the disaster on blacks and whites, rich and poor, and the inequitable, often brutal treatment of African Americans in the relief camps; the forced levee and relief work imposed on African Americans in a variation on debt peonage and convict-lease, and white bosses’ coercive attempts to prevent the loss of a cheap and servile black labour force enticed by the “promised land” of northern cities and factory jobs; the exacerbation of already entrenched racial tensions in the Jim Crow South, from which charitable operations were in no sense exempt (“Farms, cattle, furniture and houses may be washed away by the disastrous Mississippi flood, but race prejudice remains as prominent as a butte on a Western plain. So deeply is the philosophy ingrained in the soul of the white South, that even a major calamity cannot eradicate it”, editorialized the Pittsburgh Courier on May 28, 1927); how the elite of New Orleans, the South’s richest city and banking centre, saved its bacon by sacrificing politically and economically less influential parishes downriver (Plaquemines and St Bernard); the flood’s boost to the political fortunes of the populist Democrat Huey Long (helping him become Governor of Louisiana) and Herbert Hoover (it assisted the presidential bid of Calvin Coolidge’s secretary of commerce, a man known as the “great humanitarian” after his efforts to ameliorate the conditions of destitute Europeans after the First World War, and who was placed in charge of the relief operations after the ravages of the “water enemy”); the flood’s contribution to the reshaping of the landscape of American politics by shifting the political allegiance of disenchanted black Americans from Republicans (“The party of Lincoln”) to Democrats; its role in fuelling the massive internal migration northward of millions of African American sharecroppers, and concomitant erosion of the political and economic power of the old planter class in the Yazoo-Mississippi Delta; and, not least, the federal government’s assumption of primary responsibility for flood protection measures (Flood Control Act of 1928) that presaged the interventionism of the New Deal in the early 1930s. For those familiar with Pete Daniel’s Deep’n as It Come: The 1927 Mississippi River flood (1979) and John Barry’s Rising Tide: The great Mississippi flood of 1927 and how it changed America (1997), these are well-known stories and themes.

Since the resonance within American culture of the river known as “Father of Waters”, “Ol’ Man River” and “Big Muddy” matches its ecological and economic significance, it comes as no surprise that the cultural fallout from the shock of 1927 was also enormous – the only comparable phenomenon is the musical inspiration provided by the boll weevil cotton pest, as recounted in James Giesen’s Boll Weevil Blues (2011). But it is much less familiar. Barry barely scratched the surface of the flood’s legacy in the realms of music and literature. And Daniel was content to note that, even before the waters subsided, Bessie Smith had released “Back-Water Blues”, followed by “Muddy Water (A Mississippi Moan)”, and the country singer Vernon Dalhart had recorded “The Mississippi Flood”. Daniel also listed Sippie Wallace’s “The Flood Blues”, Ernest Stoneman’s “Mighty Mississippi” and “Blind Lemon” Jefferson’s “Rising High Water Blues” as notable flood songs.

“Water in Arkansas, people screaming in Tennessee . . . If I don’t leave Memphis, backwater spill all over poor me”, lamented Jefferson. Meanwhile, Barbecue Bob was “crying” “Mississippi Heavy Water Blues”, walking along the levee “with mud all in my shoes” and his head “hanging low”. Richard Mizelle has written specifically about the flood’s impact on African Americans from the bottom-up perspective of sources such as blues lyrics, through which African Americans located this particularly dire environmental calamity within the larger context of racial oppression and the precariousness of life. In Backwater Blues: 1927 in the African American imagination (2014) – the music historian David Evans has identified more than two dozen “flood songs” – Mizelle examines the alternative archive of blues lyrics (characterized as folk poetry by the African American poet Sterling Brown in 1931). He also explores the material contribution of benefit concerts in northern cities featuring Duke Ellington, Louis Armstrong and Nat King Cole, as well as the flood-inflected output of imaginative literature and investigative journalism, black and white.

When she arrived in Nashville, Tennessee, by train, in late December 1926, to perform her “Harlem Frolics” show at the Bijou Theater, Smith had to be rowed to dry land because the railway station was inundated. A few months later, she wrote in “Back-Water Blues”: “Then they rowed a little boat about five miles cross the pond . . . I packed all my clothes, throwed them in and they rowed me along . . . Then I went and stood upon some high old lonesome hill . . . Then looked down on the house where I used to live”. The disaster of 1927 also shaped the late-stage development of vaudeville as a genre. The African American comedians Aubrey Lyles and Flournoy Miller, who grew up just beyond the flood zone in Tennessee, topped the comic relief bill at Harlem’s biracial Lafayette Theatre. But it was Will Rogers, the Cherokee-cowboy humorist and entertainer, one of the era’s biggest celebrities, who raised more money for flood victims than any other individual (including $48,000 for a single show in New Orleans). Rogers had extensively toured the flooded region and spoken with sharecropper evacuees and hydraulic engineers alike, and at a gala performance at Broadway’s brand new Ziegfeld Theatre on May 1, 1927, the proud Southlander turned Hollywood cowboy wowed the audience with his astonishing rope tricks and gum-chewing antics (Rogers’s wife explained in a 1941 reminiscence that he acquired his chewing habit from the professional baseball players he hung out with, and that his gum routine involved parking his wad temporarily somewhere convenient like the lower part of a proscenium arch).

The catastrophe was also crucial in launching the literary careers of William Faulkner (twenty-nine in 1927) and the novelist Richard Wright (who was just eighteen, and joined the Great Migration northwards in 1927), and it was a major event for already prominent African American writers and public figures such as W. E. B. Du Bois, Walter White (soon to become executive secretary of the National Association for the Advancement of Colored People) and Ida B. Wells-Barnett (a notable critic of discriminatory and authoritarian Red Cross relief practices). The “sage of Baltimore”, the newspaperman and satirist H. L. Mencken, did not waste this opportunity to pour more bile on backward white, creationist Southern “hicks” and “Ku Kluxers”.

Given this almost saturation coverage, do we really need another big study of the flood of 1927? Susan Scott Parrish, whose previous book was the award-winning American Curiosity: Cultures of natural history in the colonial British Atlantic world (2006), is less concerned with the flood itself than with “how this disaster took on form and meaning as it was nationally and internationally represented across multiple media platforms”. The effects were certainly felt beyond the United States. Representing “Cottonopolis” and alert to Lancashire textile operatives’ reliance on the cotton plantations that occupied the rich alluvial bottomlands of the lower river, the Manchester Guardian seems to have taken a keener interest in the flood than many US newspapers. Parrish also points out that Walter Benjamin included the Mississippi flood as one of the topics in his series of radio broadcasts for young Berliners on disasters in 1932. Public meaning and significance were imparted to the flood through newspaper articles (including cartoons – many of which are reproduced in Parrish’s text), radio broadcasts, vaudeville benefit concerts, blues songs and imaginative literature. Many of these are familiar, but Parrish breaks new ground as an exponent of eco-criticism, which she characterizes as “the branch of literary study that puts the natural world, and human/natural relations, at the center of its inquiry”.

Having published eco-critical readings of Wright, Faulkner and Zora Neale Hurston (whose own flood novel, Their Eyes Were Watching God, 1937, was about a different flood – the Lake Okeechobee flood of 1928 in Florida, product of a hurricane-fuelled storm surge; most of the over 2,000 lives lost were African American), Parrish is particularly concerned with links between race, environment and epistemology. Moreover, contemporary concerns strongly flavour her approach – she notes that she began working on this book a few months after Katrina, since when the United States has also been buffeted by the Deepwater Horizon oil spill of 2010 and Superstorm Sandy (2012), as well as rattled by the Fukushima earthquake/tsunami/nuclear catastrophe in Japan (2011) and the Haitian earthquake of 2010.

In The Flood Year 1927 – a phrase used by a character in Faulkner’s third flood novel, The Wild Palms (1939) – Parrish covers much the same soggy, creatively fertile terrain as Mizelle. And, like Mizelle, she points out that music offered an attractively oblique medium for critique that sidestepped direct confrontation with the white power structure, and agrees with him that black Americans were fully alert to environmental damage, vigorously dismissing, too, what Mizelle refers to as “the historical myth of black environmental illiteracy”. Parrish departs from previous accounts in devoting more sustained attention to the imaginative flood writings of Faulkner and Wright. With Faulkner, Parrish really comes into her own; her commendable objective – which she delivers on – is to “resituate his first two major novels [The Sound and the Fury and As I Lay Dying] in the milieu of environmental trauma in which they were written”.

Building on William Howard’s article on Wright’s flood stories, published over thirty years ago in the Southern Literary Journal, Parrish subjects to close eco-critical scrutiny Wright’s novella Down by the Riverside, published in the collection Uncle Tom’s Children (1938) – a tale of multiple woes, perhaps inspired by the Mounds Landing crevasse, that begins with the desperate efforts of a black farmer, Mann, to procure a boat to take his wife Lulu, who is in labour, to the Red Cross hospital in town – and the short story “Silt” (a seven-pager first published in the left-leaning magazine The New Masses in 1937). The latter relates the experience of a black family returning to their cabin after the floodwaters have receded, and Parrish gives it a gratifyingly close and visceral reading:

As they move toward the house, they find that the cabin, its bottom half painted yellow by the silt, “looked weird, as though its ghost were standing beside it”. The door is not closed, as they left it, but “half-opened”, as if a visitation has occurred . . . . Indeed, nature – or muddy water “eight feet high” – has been the visitor here, or the intruder, marking, opening, drowning, warping, sweeping. It has awoken objects into a grotesque life. Thus, when the family steps inside their house, they encounter something like a murder scene: a “smell” assaults their nostrils; a dresser appeared, “its drawers and sides bulging like a bloated corpse”; the bed “was like a giant casket forged of mud”.

The only things the silt has left unscathed are a box of matches and a half-filled sack of Bull Durham tobacco.

Though her book is sub-titled A cultural history, precisely because it adopts an eco-critical approach, there is a large dose of environmental history too. The conviction of the narrator of Faulkner’s short story “Old Man” (which Parrish quotes), that “the course of [Man’s] destiny and his actual physical appearance [were] rigidly coerced and postulated” by the Mississippi, communicates a powerful sense of a fluvial protagonist and a riverine agency that is fully consonant with the current materialist backlash in environmental history against a decades-long emphasis on cultural constructions of nature. Declining an offer to write a non-fiction book about the Mississippi in 1935, Faulkner explained: “I’m a novelist, you see: people first, where second”. As Parrish knows full well, this was not a formula he actually managed to adhere to: as she amply demonstrates, people and “where” both took first place in his writing, inextricably, gloriously co-constituted.

At the same time, Parrish engages frequently with the intellectual history of attitudes to “nature”. Sometimes deftly tucked away in the endnotes, but sometimes surfacing as rather obtrusive (and tedious) interruptions to the text, these engagements take in Foucaultian notions of biopower and biopolitics, environmental justice and environmental racism, and the unnaturalness/social production of natural disasters, and also elaborate on notions such as new materialism, Rob Nixon’s concept of slow violence, the Anthropocene and the “New Southern Studies”.

For a straightforward, non-meandering, blow-by-blow account of the 1927 flood, rich in thick description and attentive both to detail and to the larger context, and crafted in plain prose, Barry’s Rising Tide is still unmatched. But in Barry’s study (and Daniel’s and Mizelle’s) there is no place for 1920s phenomena such as Cubism or Dadaism. There is no reference to the photomontage work of the German artist Helmut Herzfeld (John Heartfield), the Chicago School of sociology’s concept of human ecology, Howard W. Odum’s distinctive brand of Southland sociology pursued at the University of North Carolina’s Institute for Research in Social Science, Clementsian climax ecology, or the discussion led by Walter Lippmann and John Dewey about public communication, the notion of the public and the distinction between news and truth. There are no departures from the riverbank in Barry and Daniel for ruminations on modernity, modernization and the place of nature within them. And there is definitely no Anthropocene in Rising Tide.

Those with more patience for stuff about the ascendancy of key intellectual paradigms and the status of “subaltern counterpublics”, and for abstract ruminations on cultures of nature and notions of “naturalcultural” derived from Donna Haraway, will find Parrish more to their taste than Barry. Yet there is also plenty here for environmental humanists and environmentalists, who will be curious to see what happens to the infamous flood when eco-critical and eco-historicist perspectives are applied. Read eco-critically, Bessie Smith’s blues lyrics become a modern Southern version of a Virgilian eclogue and Faulkner’s flood novels become environmental trauma narratives (reinterpreting the flood as an experience and expression of modernization’s environmental impacts). The Sound and the Fury (1929), specifically, written the year after the flood, is served up as a “post-eco-catastrophe novel”, Richard Wright’s flood-induced stories are read as “the black environmental Heimlich”, and vaudevillian benefit acts become a late 1920s precursor of 1960s Performance Group “environmental theatre”. A folk sermon, “Noah built the Ark”, published as a poem in 1927 by the African American author James Weldon Johnson, serves Susan Scott Parrish as a springboard for eco-sermonizing about the urgent imperative for a new Noah to help us negotiate the turbulent waters of the Anthropocene. Whether or not we live on the banks of the Mississippi, we certainly need all the help we can get with that.

Rosewood