Thursday, January 26, 2017


This granular life

Carlo Rovelli AEON

That the world is not solid but made up of tiny particles is a very ancient insight. Is it humanity’s greatest idea?


According to tradition, in the year 450 BCE, a man embarked on a 400-mile sea voyage from Miletus in Anatolia to Abdera in Thrace, fleeing a prosperous Greek city that was suddenly caught up in political turmoil. It was to be a crucial journey for the history of knowledge. The traveller’s name was Leucippus; little is known about his life, but his intellectual spirit proved indelible. He wrote the book The Great Cosmology, in which he advanced new ideas about the transient and permanent aspects of the world. On his arrival in Abdera, Leucippus founded a scientific and philosophical school, to which he soon affiliated a young disciple, Democritus, who cast a long shadow over the thought of all subsequent times.

Together, these two thinkers built the majestic cathedral of ancient atomism. Leucippus was the teacher. Democritus, the great pupil who wrote dozens of works on every field of knowledge, was deeply venerated in antiquity, when those works were still known. ‘The most subtle of the Ancients,’ Seneca called him. ‘Who is there whom we can compare with him for the greatness, not merely of his genius, but also of his spirit?’ asks Cicero.

What Leucippus and Democritus had understood was that the world can be comprehended using reason. They had become convinced that the variety of natural phenomena must be attributable to something simple, and had tried to understand what this something might be. They had conceived of a kind of elementary substance from which everything was made. Anaximenes of Miletus had imagined this substance could compress and rarefy, thus transforming from one to another of the elements from which the world is constituted. It was a first germ of physics, rough and elementary, but in the right direction. An idea was needed, a great idea, a grand vision, to grasp the hidden order of the world. Leucippus and Democritus came up with this idea.

The idea of Democritus’s system is extremely simple: the entire universe is made up of a boundless space in which innumerable atoms run. Space is without limits; it has neither an above nor a below; it is without a centre or a boundary. Atoms have no qualities at all, apart from their shape. They have no weight, no colour, no taste. ‘Sweetness is opinion, bitterness is opinion; heat, cold and colour are opinion: in reality only atoms, and vacuum,’ said Democritus. Atoms are indivisible; they are the elementary grains of reality, which cannot be further subdivided, and everything is made of them. They move freely in space, colliding with one another; they hook on to and push and pull one another. Similar atoms attract one another and join.

This is the weave of the world. This is reality. Everything else is nothing but a by-product – random and accidental – of this movement, and this combining of atoms. The infinite variety of the substances of which the world is made derives solely from this combining of atoms.

When atoms aggregate, the only thing that matters, the only thing that exists at the elementary level, is their shape, their arrangement, and the order in which they combine. Just as by combining letters of the alphabet in different ways we can obtain comedies or tragedies, ridiculous stories or epic poems, so elementary atoms combine to produce the world in its endless variety. The metaphor is Democritus’s own.

There is no finality, no purpose, in this endless dance of atoms. We, just like the rest of the natural world, are one of the many products of this infinite dance – the product, that is, of an accidental combination. Nature continues to experiment with forms and structures; and we, like the animals, are the products of a selection that is random and accidental, over the course of aeons of time. Our life is a combination of atoms, our thoughts are made up of thin atoms, our dreams are the products of atoms; our hopes and our emotions are written in a language formed by combinations of atoms; the light that we see is composed of atoms, which bring us images. The seas are made of atoms, as are our cities, and the stars. It’s an immense vision: boundless, incredibly simple, and incredibly powerful, on which the knowledge of a civilisation would later be built.

On this foundation Democritus wrote dozens of books articulating a vast system, dealing with questions of physics, philosophy, ethics, politics and cosmology. He writes on the nature of language, on religion, on the origins of human societies, and on much else besides. All these books have been lost. We know of his thought only through the quotations and references made by other ancient authors, and by their summaries of his ideas. The thought that thus emerges is a kind of intense humanism, rationalist and materialist.

Democritus combines a keen attention to nature, illuminated by a naturalistic clarity in which every residual system of mythic ideas is cleared away, with a great attention to humanity and a deep ethical concern for life – anticipating by some 2,000 years the best aspects of the 18th-century Enlightenment. The ethical ideal of Democritus is that of a serenity of mind reached through moderation and balance, by trusting in reason and not allowing oneself to be overwhelmed by passions.

Plato and Aristotle were familiar with Democritus’s ideas, and fought against them. They did so on behalf of other ideas, some of which were later, for centuries, to create obstacles to the growth of knowledge. Both insisted on rejecting Democritus’s naturalistic explanations in favour of trying to understand the world in finalistic terms – believing, that is, that everything that happens has a purpose, a way of thinking that would reveal itself to be very misleading for understanding the ways of nature – or, in terms of good and evil, confusing human issues with matters that do not relate to us.

Aristotle speaks extensively about the ideas of Democritus, with respect. Plato never cites Democritus, but scholars today suspect that this was a deliberate choice, not a lack of knowledge of his works. Criticism of Democritus’s ideas is implicit in several of Plato’s texts, as in his critique of ‘physicists’, for example. In a passage in his Phaedo, Plato has Socrates articulate a reproach to all ‘physicists’. He complains that when ‘physicists’ had explained that Earth was round, he rebelled because he wanted to know what ‘good’ it was for Earth to be round; how its roundness would benefit it. How completely off-track the great Plato was here!
The greatest physicist of the second half of the 20th century, Richard Feynman, wrote at the beginning of his wonderful introductory lectures on physics:

If, in some cataclysm, all scientific knowledge were to be destroyed, and only one sentence passed on to the next generation of creatures, what statement would contain the most information in the fewest words? I believe it is the atomic hypothesis, or the atomic fact, or whatever you wish to call it, that all things are made of atoms – little particles that move around in perpetual motion, attracting each other when they are a little distance apart, but repelling upon being squeezed into one another. In that one sentence you will see an enormous amount of information about the world, if just a little imagination and thinking are applied.

Without needing anything from modern physics, Democritus had already arrived at the idea that everything is made up of indivisible particles. He did it in part by marshalling arguments based upon observation; for example, he imagined, correctly, that the wearing down of a wheel, or the drying of clothes on a line, could be due to the slow flight of particles of wood or of water. But he also had arguments of a philosophical kind. Let’s dwell on these, because their potency reaches all the way to quantum gravity.

Democritus observed that matter could not be a continuous whole, because there is something contradictory in the proposition that it should be so. We know of Democritus’s reasoning because Aristotle reports it. Imagine, says Democritus, that matter is infinitely divisible, that is to say, that it can be broken down an infinite number of times. Imagine then that you break up a piece of matter ad infinitum. What would be left?

Could small particles of extended dimension remain? No, because if this were the case, the piece of matter would not yet be broken up to infinity. Therefore, only points without extension would remain. But now let us try to put together the piece of matter starting from these points: by putting together two points without extension, you cannot obtain a thing with extension, nor can you with three, or even with four. No matter how many you put together, in fact, you never have extension, because points have no extension. Therefore, we cannot think that matter is made of points without extension, because no matter how many of these we manage to put together, we never obtain something with an extended dimension. The only possibility, Democritus concludes, is that any piece of matter is made up of a finite number of discrete pieces that are indivisible, each one having finite size: the atoms.

The origin of this subtle mode of argumentation predates Democritus. It comes from the Cilento region in the south of Italy, from a town now called Velia, which in the fifth century BCE was a flourishing Greek colony called Elea. This was home to Parmenides, the philosopher who had taken to the letter – perhaps too much – the rationalism of Miletus and its idea that reason can reveal to us how things can be other than they appear.

Parmenides had explored an avenue to truth via pure reason alone, which led him to declare that all appearances are illusory, thus opening the path that would progressively move toward metaphysics and distance itself from what would come to be known as ‘natural science’. His pupil Zeno, also from Elea, had brought subtle arguments to bear in support of this fundamentalist rationalism, which radically refutes the credibility of appearances. Among these arguments, there was a series of paradoxes that became celebrated as ‘Zeno’s paradoxes’, and that seek to show how all appearance is illusory, arguing that the commonplace notion of motion is absurd.

The most famous of Zeno’s paradoxes is presented in the form of a brief fable: the tortoise challenges Achilles to a race, starting out with a 10-metre advantage. Will Achilles manage to catch up with the tortoise? Zeno argues that rigorous logic dictates that he will never be able to do so. Before catching up, in effect, Achilles needs to cover the 10 metres and, in order to do this, he will take a certain amount of time. During this time, the tortoise will have advanced a few centimetres. To cover these centimetres, Achilles will have to take a little more time, but meanwhile the tortoise will have advanced further, and so on, ad infinitum. Achilles therefore requires an infinite number of such times to reach the tortoise, and an infinite number of times, argues Zeno, is an infinite amount of time. Since, however, we do see the swift Achilles reaching and overtaking as many tortoises as he likes, it follows that what we see is irrational, and therefore illusory.


Let’s be honest: this is hardly convincing. Where does the error lie? One possible answer is that Zeno is wrong because it is not true that, by accumulating an infinite number of things, one ends up with an infinite thing. Think of taking a piece of string, cutting it in half, and then again in half, and so on ad infinitum. At the end, you will obtain an infinite number of small pieces of string; the sum of these, however, will be finite, because they can add up only to the length of the original piece of string. Hence, an infinite number of pieces of string can make up a finite string; an infinite number of increasingly short times can make a finite time, and the hero, even if he will have to cover an infinite number of distances, ever smaller, will take a finite time to do so, and will end up catching the tortoise. In mathematics, we call this a converging series.
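The convergence is easy to check numerically. A minimal sketch, with illustrative numbers not taken from the text (Achilles at 10 m/s, the tortoise at 1 m/s), sums the Zeno stages one by one and compares the result with the closed-form catch-up time:

```python
# Achilles runs 10 m/s; the tortoise 1 m/s, starting 10 m ahead.
# (These speeds are illustrative assumptions, not from the text.)
# Each "Zeno stage" covers the gap left by the previous stage, so the
# stage times form a geometric series: 1 + 0.1 + 0.01 + ...
achilles_speed = 10.0
tortoise_speed = 1.0
head_start = 10.0

gap = head_start
total_time = 0.0
for _ in range(60):                       # sum many stages of the infinite series
    stage_time = gap / achilles_speed     # time to cross the current gap
    total_time += stage_time
    gap = tortoise_speed * stage_time     # the tortoise's new, smaller lead

# Closed form: catch-up time = head start / relative speed
exact = head_start / (achilles_speed - tortoise_speed)
print(total_time, exact)  # both ≈ 1.111 seconds: the series converges
```

Each stage shrinks the gap by a factor of ten, so the stage times form exactly the kind of converging series the paragraph describes: infinitely many terms, finite sum.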

It seems that the paradox is resolved. The solution, that is, lies in the idea of the continuum – arbitrarily small times can exist, an infinite number of which make up a finite time. Aristotle is the first to intuit this possibility, subsequently developed by ancient and modern mathematics. But is this really the correct solution in the real world? Do arbitrarily short strings really exist? Can we really cut a piece of string an arbitrary number of times? Do infinitely small amounts of time exist? These are not just long-ago questions for Aristotle to ponder. They are precisely the problems that modern physicists face in trying to create a theory of quantum gravity, one that merges the large-scale rules of Albert Einstein’s general relativity with the tiny distances of quantum mechanics.

According to tradition, Zeno had met Leucippus and had become his teacher. Leucippus was therefore familiar with Zeno’s riddles. But he had devised a different way of resolving them. Maybe, Leucippus suggests, nothing arbitrarily small exists: there is a lower limit to divisibility. The universe is granular, not continuous. With infinitely small points, it would be impossible to ever construct extension – as in Democritus’s argument reported by Aristotle and mentioned previously. Therefore, the extension of the string must be formed by a finite number of finite objects with finite size. The string cannot be cut as many times as we want; matter is not continuous, it is made of individual ‘atoms’ of a finite size.

Whether this abstract argument is correct or not, its conclusion – as we know today – contains a great deal of truth. Matter does indeed have an atomic structure. If I divide a drop of water in two, I obtain two drops of water. I can divide each one of these two drops again, and so on. But I cannot continue to infinity. At a certain point, I have only one molecule, and I am done. No drops of water exist smaller than a single molecule of water. 

Evidence for the atomic nature of matter accumulated over centuries, much of it from chemistry. Chemical substances are made up of combinations of a few elements and are formed by proportions (of weight) given by whole numbers. Chemists have constructed a way of thinking about substances as composed of molecules made up of fixed combinations of atoms. Water, for example – H2O – is composed of two parts hydrogen and one part oxygen.
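The whole-number proportions can be checked for water itself. A small sketch, using standard atomic weights that are my own added figures rather than anything in the text:

```python
# Mass proportions in water from standard atomic weights (approximate values,
# supplied here for illustration; they do not appear in the original text).
H = 1.008   # hydrogen, atomic weight
O = 15.999  # oxygen, atomic weight

water = 2 * H + O                      # one H2O molecule: two hydrogens, one oxygen
hydrogen_fraction = 2 * H / water
oxygen_fraction = O / water
ratio = oxygen_fraction / hydrogen_fraction
print(round(ratio, 2))                 # close to 8: oxygen outweighs hydrogen ~8:1
```

Two light hydrogen atoms against one heavy oxygen atom give the familiar proportion of roughly 1:8 by weight, even though the atoms themselves combine 2:1 — just the kind of fixed whole-number regularity that pointed chemists toward molecules.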

But these were only clues. At the beginning of the previous century, numerous scientists and philosophers still did not consider the atomic hypothesis to be credible. Among them was the renowned physicist and philosopher Ernst Mach, whose ideas on space would come to have great importance for Einstein. At the end of a lecture by Ludwig Boltzmann at the Imperial Academy of Sciences in Vienna, Mach publicly declared: ‘I do not believe that atoms exist!’

This was in 1897. Many, like Mach, understood chemical notation only as a conventional method of summarising laws of chemical reactions – not as evidence that there actually were molecules of water composed of two atoms of hydrogen and one of oxygen. You can’t see atoms, they would say. Atoms will never be seen, they would say. And then, they asked, how big would an atom be? Democritus could never measure the size of his atoms.

But somebody else could. The definitive proof of the ‘atomic hypothesis’ had to wait until 1905. It was found by a rebellious 25-year-old who had studied physics but had not been able to find employment as a scientist and was making ends meet by working in the patent office in Bern. In my new book I speak a lot about this young man, and about the three articles he sent to the most prestigious physics journal of the time, the Annalen der Physik. The first of these articles contained the definitive proof that atoms exist and calculated their dimensions, solving the problem posed by Leucippus and Democritus 23 centuries earlier.

The name of this 25-year-old, obviously, is Albert Einstein.

His method is surprisingly simple. Anyone could have arrived at it, from the time of Democritus onward, if he had had Einstein’s acumen, and a sufficient mastery of mathematics to make what was not an easy calculation. The idea goes like this: if we attentively observe very small particles, such as a speck of dust or a grain of pollen, suspended in still air or in a liquid, we see them tremble and dance. Pushed by this trembling, they move, randomly zigzagging, and so they drift slowly, gradually moving away from their starting point. This motion of particles in a fluid is called Brownian motion, after Robert Brown, a botanist who described it in detail in the 19th century. It is as if the small particle is receiving blows randomly from each side. No, it isn’t ‘as if’ it were being hit; it really is hit. It trembles because it is struck by the individual molecules of air, which collide with the particle, at times from the right and at times from the left.


The subtle point is that there is an enormous number of molecules of air. On average, as many hit the granule from the left as from the right. If the air’s molecules were infinitely small and infinitely numerous, the effect of the collisions from right and left would balance at each instant, and the granule would not move. But the finite size of the molecules, and the fact that these are present in finite rather than infinite number, causes there to be fluctuations (this is the key word). That is to say, the collisions never balance out exactly; they balance out only on average. Imagine for a moment the molecules were very few in number and large in size. The granule would clearly receive a blow only occasionally: now one on the right, then one on the left. Between one collision and the other, it would move here and there to a significant degree, like a football kicked by boys running around a playing field. The smaller the molecules, the shorter the interval between collisions, the better that hits from different directions would cancel out one another, and the less the granule would move.
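This scaling argument can be illustrated with a toy one-dimensional random walk — a sketch under stated assumptions, not Einstein’s actual calculation, and every number in it is invented for illustration. The grain receives random kicks; splitting the same total impulse into many small kicks (small, numerous molecules) produces a much smaller net drift than a few large kicks (large, scarce molecules):

```python
# A toy 1-D random walk standing in for Brownian motion. All parameters are
# illustrative assumptions, chosen only to show how drift scales with kick size.
import random

random.seed(0)  # deterministic runs for reproducibility

def rms_drift(n_collisions, kick, n_trials=2000):
    """Root-mean-square displacement after n_collisions random +/- kicks."""
    total = 0.0
    for _ in range(n_trials):
        x = 0.0
        for _ in range(n_collisions):
            x += random.choice((-kick, kick))  # a blow from the left or the right
        total += x * x
    return (total / n_trials) ** 0.5

# Same total impulse (400 * 0.05 = 4 * 5.0), split differently:
many_small = rms_drift(n_collisions=400, kick=0.05)  # fine-grained "air"
few_big = rms_drift(n_collisions=4, kick=5.0)        # coarse-grained "air"
print(many_small, few_big)  # the coarser the molecules, the larger the drift
```

For a random walk the root-mean-square displacement grows as kick × √n, so finer molecules cancel one another better and the granule drifts less — this residual fluctuation is what lets one work back from the observable drift to the size of the molecules.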

It is possible, with a little mathematics, to work back from the amount of movement of the granule, which can be observed, to the dimensions of the molecules. Einstein does this at the age of 25. From observations of granules drifting in fluids, from the measurement of how much these ‘drift’ – that is, move away from a position – he calculates the dimensions of Democritus’s atoms, the elementary grains of which matter is made. Einstein provides, after 2,300 years, the proof of the accuracy of Democritus’s insight: matter is granular.

‘Sublime Lucretius’s work will not die, Until the day the world itself passes away,’ wrote Ovid. I often think that the loss of the works of Democritus in their entirety is the greatest intellectual tragedy to ensue from the collapse of the old classical civilisation. We have been left with all of Aristotle, by way of which Western thought reconstructed itself, and nothing by Democritus. Perhaps if all the works of Democritus had survived, and nothing of Aristotle’s, the intellectual history of our civilisation would have been better. But centuries dominated by monotheism have not permitted the survival of Democritus’s naturalism.

The closure of the ancient schools such as those of Athens and Alexandria, and the destruction of all the texts not in accordance with Christian ideas, were vast and systematic at the time of the brutal antipagan repression following the edicts of Emperor Theodosius, which in 390–391 CE declared that Christianity was to be the only and obligatory religion of the empire. Plato and Aristotle, pagans who believed in the immortality of the soul or in the existence of a Prime Mover, could be tolerated by a triumphant Christianity. Not Democritus.

But a text survived the disaster and has reached us in its entirety. Through it, we know a little about ancient atomism, and above all we know the spirit of that science. It is the splendid poem De Rerum Natura (The Nature of Things, or On the Nature of the Universe), by the Latin poet Lucretius.

Lucretius adheres to the philosophy of Epicurus, a pupil of a pupil of Democritus. Epicurus is interested more in ethical than scientific questions, and does not have Democritus’s depth. He sometimes translates Democritean atomism a little superficially. But his vision of the natural world is substantially that of the great philosopher of Abdera. Lucretius decants in verse the thought of Epicurus and the atomism of Democritus, and in this way a part of this profound philosophy was saved from the intellectual catastrophe of the Dark Ages. Lucretius sings of atoms, the sea, the sky, of nature. He expresses in luminous verse philosophical questions, scientific ideas, refined arguments:

I will explain by what forces nature steers the courses of the sun and the journeyings of the moon, so that we shall not suppose that they run their yearly races between the heaven and earth of their own free will … or that they are rolled round in furtherance of some divine plan.

The beauty of the poem lies in the sense of wonder that pervades the vast atomistic vision. The sense of the profound unity of things, derived from the knowledge that we are all made of the same substance as are the stars, and the sea:

We are all sprung from heavenly seed. All alike have the same father, from whom all-nourishing mother earth receives the showering drops of moisture. Thus fertilised, she gives birth to smiling crops and lusty trees, to mankind and all the breeds of beasts. She it is that yields the food on which they all feed their bodies, lead their joyous lives and renew their race.

There is a deep acceptance of the life of which we are an integral part:

Do you not see that nature is clamouring for two things only, a body free from pain, a mind released from worry and fear for the enjoyment of pleasurable sensations?

And there is a serene acceptance of the inevitability of death, which cancels every evil, and about which there is nothing to fear. For Lucretius, religion is ignorance: reason is the torch that enlightens.

Lucretius’s text, forgotten for centuries, was rediscovered in January 1417 by the humanist Poggio Bracciolini, in the library of a German monastery. Poggio had been the secretary of many popes and was a passionate hunter of ancient books, in the wake of the celebrated rediscoveries made by Francesco Petrarch. His rediscovery of a text by Quintilian modified the course of the study of law throughout the faculties of Europe; his discovery of the treatise on architecture by Vitruvius transformed the way in which fine buildings were designed and constructed. But his triumph was rediscovering Lucretius.


The actual codex found by Poggio has been lost, but the copy made by his friend Niccolò Niccoli (now known as the ‘Codex Laurenziano 35.30’) is still preserved in its entirety in the Biblioteca Laurenziana in Florence. The ground was already surely prepared for something new when Poggio gave Lucretius’s book back to humanity. The rediscovery of De Rerum Natura had a profound effect upon the Italian and European Renaissance, and its echo resounds, directly or indirectly, in the pages of authors ranging from Galileo to Johannes Kepler, and from Francis Bacon to Niccolò Machiavelli. In William Shakespeare’s Romeo and Juliet, a century after Poggio, atoms make a delightful appearance:

MERCUTIO: O, then I see Queen Mab hath been with you. 
She is the fairies’ midwife, and she comes 
In shape no bigger than an agate-stone 
On the fore-finger of an alderman, 
Drawn with a little team of atomies 
Athwart men’s noses as they lie asleep.

From there, the influence of Lucretius extended to Isaac Newton, John Dalton, Baruch Spinoza, Charles Darwin, all the way to Einstein. The very idea that the existence of atoms is revealed by the Brownian motion of minute particles immersed in a fluid can be traced back to Lucretius. Here is a passage in which Lucretius provides a ‘living proof’ of the notion of atoms:

Observe what happens when sunbeams are admitted into a building and shed light on its shadowy places. You will see a multitude of tiny particles mingling in a multitude of ways in the empty space within the light of the beam, as though contending in everlasting conflict, rushing into battle rank upon rank with never a moment’s pause in a rapid sequence of unions and disunions. From this you may picture what it is for the atoms to be perpetually tossed about in the illimitable void… their dancing is an actual indication of underlying movements of matter that are hidden from our sight. There you will see many particles under the impact of invisible blows, changing their course and driven back upon their tracks, this way and that, in all directions. You must understand that they all derive this restlessness from the atoms. It originates with the atoms, which move of themselves.

Einstein resuscitated the proof presented by Lucretius, and probably first conceived of by Democritus, and translated it into mathematical terms, thus managing to calculate the size of the atoms.

The Catholic Church attempted to stop Lucretius: in the Florentine Synod of December 1516, it prohibited the reading of Lucretius in schools. In 1551, the Council of Trent banned his work. But it was too late. An entire vision of the world that had been swept away by medieval Christian fundamentalism was re-emerging in a Europe that had reopened its eyes. It was not just the rationalism, atheism and materialism of Lucretius that were being proposed in Europe. It was not merely a luminous and serene meditation on the beauty of the world. It was much more: it was an articulate and complex structure of thinking about reality, a new mode of thinking, radically different from what had been for centuries the mind-set of the Middle Ages.

The medieval cosmos so marvellously sung by Dante was interpreted on the basis of a hierarchical organisation of the universe that reflected the hierarchical organisation of European society: a spherical cosmic structure with Earth at its centre; the irreducible separation between Earth and heavens; finalistic and metaphorical explanations of natural phenomena; fear of God, fear of death; little attention to nature; the idea that forms preceding things determine the structure of the world; the idea that the source of knowledge could only be the past, in revelation and tradition.

There is none of this in the world of Leucippus and Democritus as sung by Lucretius. There is no fear of the gods; no ends or purposes in the world; no cosmic hierarchy; no distinction between Earth and heavens. There is a deep love of nature, a serene immersion within it; a recognition that we are profoundly part of it; that men, women, animals, plants and clouds are organic threads of a marvellous whole, without hierarchies. There is a feeling of deep universalism, in the wake of the splendid words of Democritus: ‘To a wise man, the whole earth is open, because the true country of a virtuous soul is the entire universe.’


There is, too, the ambition of being able to think about the world in simple terms. Of being able to investigate and understand the secrets of nature. To know more than our parents. There are extraordinary conceptual tools on which Galileo, Kepler and Newton will build: the idea of free and rectilinear motion in space; the idea of elementary bodies and their interactions, out of which the world is constructed; the idea of space as a container of the world.

And there is the simple idea of the finite divisibility of things – the granular quality of the world. It is the idea that stops the infinite between our fingers. This idea is at the root of the atomic hypothesis, but it has returned with augmented force in quantum mechanics. Energy can move only in discrete units, and we might yet find that space and time are likewise composed of their own fundamental units. Importing the atomic philosophy of Democritus into modern physics might be essential for reconciling general relativity (which assumes a continuous reality) with quantum mechanics (which very much does not).

Merging relativity and quantum mechanics into a new theory of quantum gravity will lift physics to the next level, and will also achieve an appealing historical closure. Einstein’s paper on Brownian motion was inspired by atomism, whereas his theory of relativity emerged from the anti-atomic philosophy of Mach. With quantum gravity, the last barrier will fall, and the song of Lucretius will ring out through all of physics.

This is an extract from ‘Reality Is Not What It Seems’ by Carlo Rovelli, translated by Simon Carnell and Erica Segre, published by Riverhead Books, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. Copyright © 2014 by Carlo Rovelli. Translation copyright © 2016 by Simon Carnell and Erica Segre.

Sunday, January 15, 2017


How Barack Obama paved the way for Donald Trump

Gary Younge The London Guardian

Don’t blame it all on racism. During the financial crash Obama sided with the bankers, not people losing their homes – making Trump’s victory possible

To celebrate its 225th anniversary, the US Mint and Treasury last week unveiled plans to issue a 24-carat commemorative coin depicting Lady Liberty as an African-American woman. With full lips and braided hair tied back in a bun, her gold-embossed profile is framed by the words “LIBERTY” above and “In God We Trust” below. “As we as a nation continue to evolve,” said Elisa Basnight, the Mint’s chief of staff, “so does Liberty’s representation.”

Sadly, the representation is evolving far faster than the nation. The coin is worth $100 (£80); in 2010 the median net wealth for women of colour was calculated at just $5. Black women now earn 65 cents for every $1 made by a white man – the same gap as 20 years ago. So the Treasury has produced a coin in these women’s image that most cannot afford – because the economy is producing low-wage jobs that leave them with liberty without equality.

For the past eight years American liberals have gorged themselves on symbolism. A significant section of the population, including those most likely to support Barack Obama, have felt better about their country even as they have fared worse in it. The young, good-looking, intact, scandal-free black family in the White House embodied a hopeful future for America and beyond. Photogenic, with an understated chic, here were people of colour who looked even better in black and white. With personal stories of progress without privilege, they provided Camelot without the castle: evoking a sense of possibility in a period of economic stagnation, social immobility and political uncertainty.

As Obama passes the keys and the codes to Donald Trump at the end of this week, so many liberals mourn the passing of what has been, remain in a state of disbelief for what has happened, and express deep anxiety about what is to come. It is a steep cliff – politically, rhetorically and aesthetically – from the mocha-complexioned consensual intellectual to the permatanned, “pussy-grabbing” vulgarian.

But there is a connection between the “new normal” and the old that must be understood if resistance in the Trump era is going to amount to more than Twitter memes driven by impotent rage and fuelled by flawed nostalgia. This transition is not simply a matter of sequence – one bad president following a good one – but consequence: one horrendous agenda made possible by the failure of its predecessor.

It is easy for liberals to despise Trump. He is a thin-skinned charlatan, a self-proclaimed sexual harasser, a blusterer and a bigot. One need not exhaust any moral energy in making the case against his agenda. That is precisely what makes it so difficult to understand his appeal. Similarly, it is easy for liberals to love Obama. He’s measured, thoughtful, smart and eloquent – and did some good things despite strong opposition from Republicans. That is precisely what makes it so difficult for liberals to provide a principled and plausible critique of his presidency.

One cannot blame Obama for Trump. It was the Republicans – craven to the mob within their base, which they have always courted but ultimately could not control – who nominated and, for now, indulge him. And yet it would be disingenuous to claim Trump rose from a vacuum that bore no relationship to the previous eight years.

Some of that relationship is undeniably tied up in who Obama is: a black man, with a lapsed Muslim father from Kenya. That particular constellation of identities was like catnip to an increasingly strident wing of the Republican party in a time of war, migration and racial tumult. Trump did not invent racism. Indeed, race-baiting has been a staple of Republican party strategy for more than 50 years. But as he refused to observe the electoral etiquette of the Nixon strategy (“You have to face the fact that the whole problem is really the blacks,” Richard Nixon told his chief-of-staff, HR Haldeman. “The key is to devise a system that recognizes that while not appearing to”), his campaign descended into a litany of brazen racist taunts.

Racism’s role should not be underplayed, but its impact can arguably be overstated. While Trump evidently emboldened existing racists, it’s not obvious that he created new ones. He received the same proportion of the white vote as Mitt Romney in 2012 and George W Bush in 2004. It does not follow that because Trump’s racism was central to his meaning for liberals, it was necessarily central to his appeal for Republicans.

There is a deeper connection, however, between Trump’s rise and what Obama did – or rather didn’t do – economically. He entered the White House at a moment of economic crisis, with Democratic majorities in both Houses and bankers on the back foot. Faced with the choice of preserving the financial industry as it was or embracing far-reaching reforms that would have served the interests of those who voted for him, he chose the former.

Just a couple of months into his first term he called a meeting of banking executives. “The president had us at a moment of real vulnerability,” one of them told Ron Suskind in his book Confidence Men. “At that point, he could have ordered us to do just about anything and we would have rolled over. But he didn’t – he mostly wanted to help us out, to quell the mob.” People lost their homes while bankers kept their bonuses and banks kept their profits.

In 2010 Damon Silvers of the independent congressional oversight panel told Treasury officials: “We can either have a rational resolution to the foreclosure crisis, or we can preserve the capital structure of the banks. We can’t do both.” They chose the latter. Not surprisingly, this was not popular. Three years into Obama’s first term 58% of the country – including an overwhelming majority of Democrats and independents – wanted the government to help stop foreclosures. His Treasury secretary, Timothy Geithner, did the opposite, setting up a programme that would “foam the runway” for the banks.

So when Hillary Clinton stood for Obama’s third term, the problem wasn’t just a lack of imagination: it was that the first two terms had not lived up to their promise.

This time last year, fewer than four in 10 Americans were happy with Obama’s economic policies. When asked last week to assess progress under Obama, 56% said the country had lost ground or stood still on the economy, while 48% said it had lost ground on the gap between rich and poor – against just 14% who said it had gained ground. These were the Obama coalition – black and young and poor – who did not vote in November, making Trump’s victory possible. Their hopes had not been met, and people are more likely to go to the polls because they are inspired by the promise of a better future than because they fear a worse one.

Naturally, Trump’s cabinet of billionaires will do no better and will, in all likelihood, do far worse. And even as we protest about the legitimacy of the “new normal”, we should not pretend it is replacing something popular or effective. The old normal was not working. The premature nostalgia for the Obamas in the White House is not a yearning for Obama’s policies.

As any recipient of the new coin will tell you, there’s a difference between things that look different and make you feel good, and things that make a difference and actually do good. Symbols should not be dismissed as insubstantial; but nor should they be mistaken for substance.

Saturday, January 07, 2017

THE VOICES IN OUR HEADS

Why do people talk to themselves, and when does it become a problem?

By Jerome Groopman The New Yorker

“Talking to your yogurt again,” my wife, Pam, said. “And what does the yogurt say?”

She had caught me silently talking to myself as we ate breakfast. A conversation was playing in my mind, with a research colleague who questioned whether we had sufficient data to go ahead and publish. Did the experiments in the second graph need to be repeated? The results were already solid, I answered. But then, on reflection, I agreed that repetition could make the statistics more compelling.

I often have discussions with myself—tilting my head, raising my eyebrows, pursing my lips—and not only about my work. I converse with friends and family members, tell myself jokes, replay dialogue from the past. I’ve never considered why I talk to myself, and I’ve never mentioned it to anyone, except Pam. She very rarely has inner conversations; the one exception is when she reminds herself to do something, like change her e-mail password. She deliberately translates the thought into an external command, saying out loud, “Remember, change your password today.”

Verbal rehearsal of material—the shopping list you recite as you walk the aisles of a supermarket—is part of our working memory system. But for some of us talking to ourselves goes much further: it’s an essential part of the way we think. Others experience auditory hallucinations, verbal promptings from voices that are not theirs but those of loved ones, long-departed mentors, unidentified influencers, their conscience, or even God.

Charles Fernyhough, a British professor of psychology at Durham University, in England, studies such “inner speech.” At the start of “The Voices Within” (Basic), he also identifies himself as a voluble self-speaker, relating an incident where, in a crowded train on the London Underground, he suddenly became self-conscious at having just laughed out loud at a nonsensical sentence that was playing in his mind. He goes through life hearing a wide variety of voices: “My ‘voices’ often have accent and pitch; they are private and only audible to me, and yet they frequently sound like real people.”

Fernyhough has based his research on the hunch that talking to ourselves and hearing voices—phenomena that he sees as related—are not mere quirks, and that they have a deeper function. His book offers a chatty, somewhat inconclusive tour of the subject, making a case for the role of inner speech in memory, sports performance, religious revelation, psychotherapy, and literary fiction. He even coins a term, “dialogic thinking,” to describe his belief that thought itself may be considered “a voice, or voices, in the head.”

Discussing experimental work on voice-hearing, Fernyhough describes a protocol devised by Russell Hurlburt, a psychologist at the University of Nevada, Las Vegas. A subject wears an earpiece and a beeper sounds at random intervals. As soon as the person hears the beep, she jots notes about what was in her mind at that moment. People in a variety of studies have reported a range of perceptions: many have experienced “inner speech,” though Fernyhough doesn’t specify what proportion. For some, it was a full back-and-forth conversation, for others a more condensed script of short phrases or keywords. The results of another study suggest that, on average, about twenty to twenty-five per cent of the waking day is spent in self-talk. But some people never experienced inner speech at all.

In his work at Durham, Fernyhough participated in an experiment in which he had an inner conversation with an old teacher of his while his brain was imaged by fMRI scanning. Naturally, the scan showed activity in parts of the left hemisphere associated with language. Among the other brain regions that were activated, however, were some associated with our interactions with other people. Fernyhough concludes that “dialogic inner speech must therefore involve some capacity to represent the thoughts, feelings, and attitudes of the people with whom we share our world.” This raises the fascinating possibility that when we talk to ourselves a kind of split takes place, and we become in some sense multiple: it’s not a monologue but a real dialogue.

Early in Fernyhough’s career, his mentors told him that studying inner speech would be fruitless. Experimental psychology focusses on things that can be studied in laboratory situations and can yield clear, reproducible results. Our perceptions of what goes on in our heads are too subjective to quantify, and experimental psychologists tend to steer clear of the area.

Fernyhough’s protocols go some way toward working around this difficulty, though the results can’t be considered dispositive. Being prompted to enter into an inner dialogue in an fMRI machine is not the same as spontaneously debating with oneself at the kitchen table. And, given that subjects in the beeper protocol could express their experience only in words, it’s not surprising that many of them ascribed a linguistic quality to their thinking. Fernyhough acknowledges this; in a paper published last year in Psychological Bulletin, he wrote that the interview process may both “shape and change the experiences participants report.”

More fundamentally, neither experiment can do more than provide a rough phenomenology of inner speech—a sense of where we experience inner speech neurologically and how it may operate. The experiments don’t tell us what it is. This hard truth harks back to William James, who concluded that such “introspective analysis” was like “trying to turn up the gas quickly enough to see how the darkness looks.”

Nonetheless, Fernyhough has built up an interesting picture of inner speech and its functions. It certainly seems to be important in memory, and not merely the mnemonic recitation of lists, to which my wife and many others resort. I sometimes replay childhood conversations with my father, long deceased. I conjure his voice and respond to it, preserving his presence in my life. Inner speech may participate in reasoning about right and wrong by constructing point-counterpoint situations in our minds. Fernyhough writes that his most elaborate inner conversations occur when he is dealing with an ethical dilemma.

Inner speech could also serve as a safety mechanism. Negative emotions may be easier to cope with when channelled into words spoken to ourselves. In the case of people who hear alien voices, Fernyhough links the phenomenon to past trauma; people who live through horrific events often describe themselves “dissociating” during the episodes. “Splitting itself into separate parts is one of the most powerful of the mind’s defense mechanisms,” he writes. Given that his fMRI study suggested that some kind of split occurred during self-speech, the idea of a connection between these two mental processes doesn’t seem implausible. Indeed, a mainstream strategy in cognitive behavioral therapy involves purposefully articulating thoughts to oneself in order to diminish pernicious habits of mind. There is robust scientific evidence demonstrating the value of the method in coping with O.C.D., phobias, and other anxiety disorders.

Cognitive behavioral therapy also harnesses the effectiveness of verbalizing positive thoughts. Many athletes talk to themselves as a way of enhancing performance; Andy Murray yells at himself during tennis matches. The potential benefits of this have some experimental support. In 2008, Greek researchers randomly assigned tennis players to one of two groups. The first was trained in motivational and instructional self-talk (for instance, “Go,” “I can,” “Shoulder, low”). The second group got a tactical lecture on the use of particular shots. The group trained to use self-talk showed improved play and reported increased self-confidence and decreased anxiety, whereas no significant improvements were seen in the other group.

Sometimes the voices people hear are not their own, and instead are attributed to a celestial source. God’s voice figures prominently early in the Hebrew Bible. He speaks individually to Adam, Eve, Cain, Noah, and Abraham. At Mt. Sinai, God’s voice, in midrash, was heard communally, but was so overwhelming that only the first letter, aleph, was sounded. But in later prophetic books the divine voice grows quieter. Elijah, on Mt. Horeb, is addressed by God (after a whirlwind, a fire, and an earthquake) in what the King James Bible called a “still small voice,” and which, in the original Hebrew (kol demamah dakah), is even more suggestive—literally, “the sound of a slender silence.” By the time we reach the Book of Esther, God’s voice is absent.

In Christianity, however, divine speech continues through the Gospels—the apostle Paul converts after hearing Jesus admonish him. Especially in evangelical traditions, it has persisted. Martin Luther King, Jr., recounted an experience of it in the early days of the bus boycott in Montgomery, in 1956. After receiving a threatening anonymous phone call, he went in despair into his kitchen and prayed. He became aware of “the quiet assurance of an inner voice” and “heard the voice of Jesus saying still to fight on.”

Fernyhough relates some arresting instances of conversations with God and other celestial powers that occurred during the Middle Ages. In fifteenth-century France, Joan of Arc testified to hearing angels and saints tell her to lead the French Army in rescuing her country from English domination. A more intimate example is that of the famous mystic Margery Kempe, a well-to-do Englishwoman with a husband and family, who, in the early fifteenth century, reported that Christ spoke to her from a short distance, in a “sweet and gentle” voice. In “The Book of Margery Kempe,” a narrative she dictated, which is often considered the first autobiography in English, she relates how a series of domestic crises, including an episode of what she describes as madness, led her to embark on a life of pilgrimage, celibacy, and extreme fasting. The voice of Jesus gave her advice for negotiating a deal with her frustrated and worried husband. (She agreed to eat; he accepted her chastity.) Fernyhough writes imaginatively about the various registers of voice she hears. “One kind of sound she hears is like a pair of bellows blowing in her ear: it is the susurrus of the Holy Spirit. When He chooses, our Lord changes that sound into the voice of a dove, and then into a robin redbreast, tweeting merrily in her ear.”

Forty years ago, Julian Jaynes, a psychologist at Princeton, published a landmark book, “The Origin of Consciousness in the Breakdown of the Bicameral Mind,” in which he proposed a biological basis for the hearing of divine voices. He argued that several thousand years ago, at the time the Iliad was written, our brains were “bicameral,” composed of two distinct chambers. The left hemisphere contained language areas, just as it does now, but the right hemisphere contributed a unique function, recruiting language-making structures that “spoke” in times of stress. People perceived the utterances of the right hemisphere as being external to them and attributed them to gods. In the tumult of attacking Troy, Jaynes believed, Achilles would have heard speech from his right hemisphere and attributed it to voices from Mt. Olympus:

The characters of the Iliad do not sit down and think out what to do. They have no conscious minds such as we say we have, and certainly no introspections. When Agamemnon, king of men, robs Achilles of his mistress, it is a god that grabs Achilles by his yellow hair and warns him not to strike Agamemnon. It is a god who then rises out of the gray sea and consoles him in his tears of wrath on the beach by his black ships. . . . It is one god who makes Achilles promise not to go into battle, another who urges him to go, and another who then clothes him in a golden fire reaching up to heaven and screams through his throat across the bloodied trench at the Trojans, rousing in them ungovernable panic. In fact, the gods take the place of consciousness.

Jaynes believed that the development of nerve fibres connecting the two hemispheres gradually integrated brain function. Following a theory of Homeric authorship that assumed the Odyssey to have been composed at least a century after the Iliad, he pointed out that Odysseus, who is constantly reflecting and planning, manifests a self-consciousness of mind. The poem’s emphasis on Odysseus’ cunning starts to seem like the celebration of the emergence of a new kind of consciousness. For Jaynes, hearing the voice of God was a vestige of our past neuroanatomy.

Jaynes’s book was hugely influential in its day, one of those rare specialist works whose ideas enter the culture at large. (Bicamerality is an important plot point in HBO’s “Westworld”: Dolores, an android played by Evan Rachel Wood, is led to understand that a voice she hears, which has urged her to kill other android “hosts” at the park, comes from her own head.) But Jaynes’s thesis does not stand up to what we now know about the development of our species. In evolutionary time, the few thousand years that separate us from Achilles are a blink of an eye, far too short to allow for such radical structural changes in the brain. Contemporary neurologists offer alternative explanations for hearing celestial speech. Some speculate that it represents temporal-lobe epilepsy, others schizophrenia; auditory hallucinations are common in both conditions. They are also a feature of degenerative neurological diseases. An elderly relative with Alzheimer’s recently told me that God talks to her. “Do you actually hear His voice?” I asked. She said that she does, and knows it is God because He said so.

Remarkably, Fernyhough is reluctant to call such voices hallucinations. He views the term as pejorative, and he is notably skeptical about the value of psychiatric diagnosis in voice-hearing cases:

It is no more meaningful to attempt to diagnose . . . English mystics (nor others, like Joan, from the tradition to which they belong) than it is to call Socrates a schizophrenic. . . . If Joan wasn’t schizophrenic, she had “idiopathic partial epilepsy with auditory features.” Margery’s compulsive weeping and roaring, combined with her voice-hearing, might also have been signs of temporal lobe epilepsy. The white spots that flew around her vision (and were interpreted by her as sightings of angels) could have been symptoms of migraine. . . . The medieval literary scholar Corinne Saunders points out that Margery’s experiences were strange then, in the early fifteenth century, and they seem even stranger now, when we are so distant from the interpretive framework in which Margery received them. That doesn’t make them signs of madness or neurological disease any more than similar experiences in the modern era should be automatically pathologized.

In his unwillingness to draw a clear line between normal perceptions and delusions, Fernyhough follows ideas popularized by a range of groups that have emerged in the past three decades known as the Hearing Voices Movement. In 1987, a Dutch psychiatrist, Marius Romme, was treating a patient named Patsy Hage, who heard malign voices. Romme’s initial diagnosis was that the voices were symptoms of a biomedical illness. But Hage insisted that her voice-hearing was a valid mode of thought. Not coincidentally, she was familiar with the work of Julian Jaynes. “I’m not a schizophrenic,” she told Romme. “I’m an ancient Greek!”

Romme came to sympathize with her point of view, and decided that it was vital to engage seriously with the actual content of what patients’ voices said. The pair started to publicize the condition, asking other voice-hearers to be in touch. The movement grew from there. It currently has networks in twenty-four countries, with more than a hundred and eighty groups in the United Kingdom alone, and its membership is growing in the United States. It holds meetings and conferences in which voice-hearers discuss their experiences, and it campaigns to increase public awareness of the phenomenon.

The movement’s followers reject the idea that hearing voices is a sign of mental illness. They want it to be seen as a normal variation in human nature. Their arguments are in part about who controls the interpretation of such experiences. Fernyhough quotes an advocate who says, “It is about power, and it’s about who’s got the expertise, and the authority.” The advocate characterizes cognitive behavioral therapy as “an expert doing something to” a patient, whereas the movement’s approach disrupts that hierarchy. “People with lived experience have a lot to say about it, know a lot about what it’s like to experience it, to live with it, to cope with it,” she says. “If we want to learn anything about extreme human experience, we have to listen to the people who experience it.”

Like other movements that seek to challenge the authority of psychiatry’s diagnostic categories, the Hearing Voices Movement is controversial. Critics point out that, while depathologizing voice-hearing may feel liberating for some, it entails a risk that people with serious mental illnesses will not receive appropriate care. Fernyhough does not spend much time on these criticisms, though in a footnote he does concede the scant evidentiary basis of the movement’s claims. He mentions a psychotherapist sympathetic to the Hearing Voices Movement who says that, in contrast to the ample experimental evidence for the efficacy of cognitive behavioral therapy, “the organic nature of hearing voices groups” makes it hard to conduct randomized controlled trials.

Fernyhough is not only a psychologist; he also writes fiction, and in describing this work he emphasizes the role of hearing voices. “I never mistake these fictional characters for real people, but I do hear them speaking,” he writes in “The Voices Within.” “I have to get their voices right—transcribe them accurately—or they will not seem real to the people who are reading their stories.” He notes that this kind of conjuring is widespread among novelists, and cites examples including Charles Dickens, Joseph Conrad, Virginia Woolf, and Hilary Mantel.

Fernyhough and his colleagues have tried to quantify this phenomenon. Ninety-one writers attending the 2014 Edinburgh International Book Festival responded to a questionnaire; seventy per cent said that they heard characters speak. Several writers linked the speech of their characters to inner dialogues even when they were not actively writing. As for plot, some writers asserted that their characters “don’t agree with me, sometimes demand that I change things in the story arc of whatever I’m writing.”

The importance of voice-hearing to many writers might seem to validate the Hearing Voices Movement’s approach. If the result is great literature, it would be perverse to judge hearing voices an aberration requiring treatment rather than a precious gift. It’s not that simple, however. As Fernyhough writes, “Studies have shown a particularly high prevalence of psychiatric disorders (particularly mood disorders) in those of proven creativity.” Even leaving aside the fact that most people with mood disorders are not creative geniuses, many writers find their creative talent psychologically troublesome, and even prize an idea of themselves as, in some sense, abnormal. The novelist Jeanette Winterson has heard voices that she says put her “in the crazy category,” and the idea has a long history: Plato’s “mad poet,” Aristotle’s “melancholic genius,” and John Dryden’s dictum that “great wits are sure to madness near allied.” But, in cases where talent is accompanied by real psychological disturbance, do the creative benefits really outweigh the costs to the individual?

On a frigid night in January, 1977, while working as a young resident at Massachusetts General Hospital, I was paged to the emergency room. A patient had arrived by ambulance from McLean Hospital, a famous psychiatric institution in nearby Belmont. Sitting bolt upright, laboring to breathe, was the poet Robert Lowell. I introduced myself and performed a physical examination. Lowell was in congestive heart failure, his lungs filling with fluid. I administered diuretics and fitted an oxygen tube to his nostrils. Soon he was breathing comfortably. He seemed sullen and, to distract him from his predicament, I asked about a medallion that hung from a chain around his neck. “Achilles,” he replied, with a fleeting smile.

I’ve no idea if Lowell knew of Jaynes’s book, which had come out the year before, but Achilles was a figure of lifelong importance to him, one of many historical and mythical figures—Alexander the Great, Dante, T. S. Eliot, Christ—with whom he identified in moments of delusional grandiosity. In Achilles, Lowell seemed to find a heroic reflection of his own mental volatility. Achilles’ defining attribute—it’s the first word of the Iliad—is mēnin, usually translated as “wrath” or “rage.” But in a forthcoming book, “Robert Lowell, Setting the River on Fire: A Study of Genius, Mania, and Character,” the psychiatry professor Kay Redfield Jamison points out that Lowell’s translation of the passage renders mēnin as “mania.” As it happens, mania was Lowell’s most enduring diagnosis in his many years as a psychiatric patient.

In her account of Lowell’s hospitalization, Jamison cites my case notes and those of his cardiologist in the Phillips House, a wing of Mass General where wealthy Boston Brahmin patients were typically housed. Lowell wrote a poem about his stay, “Phillips House Revisited,” in which he overlays impressions of the medical crisis I had witnessed (“I cannot entirely get my breath, / as if I were muffled in snow”) with memories of his grandfather, who had died in the same hospital, forty years earlier.

There was a long history of mental illness in Lowell’s family. Jamison digs up the records of his great-great-grandmother, who was admitted to McLean in 1845, and who, doctors noted, was “afflicted with false hearing.” Lowell, too, suffered from auditory hallucinations. Sometimes, before sleep, he would talk to the heroes from Hawthorne’s “Greek Myths.” During a hospitalization in 1954, he often chatted to Ezra Pound, who was a friend—but not actually there. Among his contemporaries, recognition of Lowell’s mental instability was inextricably bound up with awe of his talent. The intertwining of madness and genius remains an essential part of his posthumous legend, and Lowell himself saw the two as related. Jamison quotes a report by one of his doctors:

Patient’s strong emotional ties with his manic phase were very evident. Besides the feeling of well-being which was present at that time, patient felt that, “my senses were more keen than they had ever been before, and that’s what a writer needs.”

But Jamison also shows that Lowell sometimes saw his episodes of manic inspiration in a more coldly medical light. After a period of intense religious revelation, he wrote, “The mystical experiences and explosions turned out to be pathological.” Splitting the difference, Jamison suggests that his mania and his imagination were welded into great art by the discipline he exerted between his manic episodes.

Lowell was discharged from Mass General on February 9th. Jamison quotes a note that one of my colleagues wrote to the doctors at McLean: “Thank you for referring Mr. Lowell to me. He proved to be just as interesting a person and a patient as you suggested he might be.” Later that month, Lowell had recovered sufficiently to travel to New York and do a reading with Allen Ginsberg. He read “Phillips House Revisited.” That September, he died. ♦