Science or rather Art
A frenzied desire to be first inspired Darwin and Einstein to bursts of creativity. Like writers and artists, scientists strive to have their names attached to a work of brilliance, but any breakthrough depends on the efforts of countless predecessors. Ian McEwan reflects on originality and collaboration.
Ian McEwan, guardian.co.uk, Friday 23 March 2012
In June 1858 a slender package from Ternate, an island in the Dutch East Indies, arrived for Charles Darwin at his country home in Down, Kent. He may well have recognised the handwriting as that of Alfred Wallace, with whom he had been in correspondence and from whom he was hoping to receive some specimens. But what Darwin found in the package along with a covering letter was a short essay. And this essay was to transform Darwin’s life.
Wallace’s 20 pages, so it seemed to their reader on that momentous morning, covered all the principal ideas of evolution by natural selection that Darwin had been working on for more than two decades and which he thought were his exclusive possession – and which he had yet to publish. Wallace, working alone, with very little in the way of encouragement or money, drew from his extensive experience of natural history, gathered while sending back specimens for collectors. He articulated concisely the elements as well as the sources familiar to Darwin: artificial selection, the struggle for survival, competition and extinction, the way species changed into different forms by an impersonal, describable process, by a logic that did not need the intervention of a deity. Wallace, like Darwin, had been influenced by the geological speculations of Charles Lyell, and the population theories of Thomas Malthus.
In a covering letter Wallace politely asked Darwin to forward the essay to Lyell. Now, Darwin could have quietly destroyed Wallace’s package and no one would have known a thing – it had taken months to arrive, and the mail between the Dutch East Indies and England could hardly have been reliable in the mid-19th century. But Darwin was an honourable man, and knew that he could never live with himself if he behaved scurrilously. And yet he was in anguish. In his own letter to Lyell, which accompanied Wallace’s essay and which he forwarded that same day, he lamented: “So all my originality, whatever it may amount to, will be smashed.” He was surprised at the depth of his own feelings about priority, about being first. As Janet Browne notes in her biography of Darwin, the excitement of discovery in his work had been replaced by profound anxieties about possession and ownership. He was ambushed by low emotions – mortification, irritation, rancour. In a much-quoted phrase, he was “full of trumpery feelings”.
He had held off publishing his own work in a desire to perfect it, to amass instances, to make it as immune to disproof as he could. And, of course, he was aware of his work’s theological implications – and that had made him cautious too. But he had been “forestalled”. That day he decided he must yield priority to Wallace. He must, he wrote, “resign myself to my fate”.
Within a day, he had even more pressing concerns. His 15-year-old daughter, Henrietta, fell ill and there was fear that she had diphtheria. The next day the baby, Charles, his and Emma’s 10th and last child, developed a fever. Meanwhile, Lyell was urging Darwin to concede nothing and to publish a “sketch”, which would conclusively prove Darwin’s priority over Wallace.
Taking his turn to nurse the sick baby, Darwin could decide nothing, and left the matter to his close friend Joseph Hooker, and to Lyell. They discussed the matter and proposed that Darwin’s “sketch” should be read along with Wallace’s essay at a meeting of the Linnean Society, and the two pieces would be published in the society’s journal. Speed was important. Wallace might have sent his essay to a magazine, in which case, Darwin’s priority would be sunk, or at least compromised. There was no time to ask Wallace’s permission to have his essay read.
But before Darwin could consider the proposal, the baby died. In his grief, Darwin hastily made a compilation for Hooker to edit. An 1844 set of notes, though out of date, seemed to make a conclusive case for priority, for they bore Hooker’s pencilled marks. A more recent 1857 letter to Asa Gray, the professor of botany at Harvard, set out concisely Darwin’s thoughts on evolution by natural selection.
Lyell, Hooker and Darwin were eminent insiders in the closed world of Victorian metropolitan science. Wallace was the outsider. He came from a far humbler background, and if he was known at all, it was as a provider of material for gentlemen experts. It was customary at the Linnean Society for double contributions to be read in alphabetical order. And so, in Darwin’s absence – he and Emma buried their baby that day – his 1844 notes were followed by his detailed 1857 letter, and then, almost as a footnote, came Wallace’s 1858 essay.
Darwin had delved far deeper over many years and certainly deserved priority. Wallace found it difficult to think through the implications of natural selection, and was reluctant in later years to allow that humans too were subject to evolutionary change. The point, however, is Darwin’s mortification about losing possession. As he wrote later to Hooker, “I always thought it very possible that I might be forestalled, but I fancied that I had a grand enough soul not to care.”
Hooker began to press his friend to write a proper scientific paper on natural selection. Darwin protested. He needed to set out all the facts, and they could not be accommodated within a single paper. Hooker persisted, and so Darwin began his essay, which in time grew to become On the Origin of Species. In Browne’s description, what was suddenly released were “years of pent-up caution”. Back at Down House, Darwin did not use a desk, but sat in an armchair with a board across his knees and wrote like a fiend. “All the years of thought,” writes Browne, “climaxed in these months of final insight … the fire within came from Wallace.”
The Origin, written in 13 months, represents an extraordinary intellectual feat: mature insight, deep knowledge and observational powers, the marshalling of facts, the elucidation of near-irrefutable arguments in the service of a profound insight into natural processes. The reluctance to upset his wife Emma’s religious devotion, or to contradict the theological certainties of his scientific colleagues, or to find himself in the unlikely role of iconoclast, a radical dissenter in Victorian society, all were swept aside for fear of another man taking possession of and getting credit for the ideas he believed to be his.
In modern times, we have come to take for granted in art – literature as well as painting and cinema – the vital and enduring concept of originality. Despite all kinds of theoretical objections, it remains central to our notion of quality. It carries with it an idea of the new, of something created in a godlike fashion out of nothing. “Perfectly unborrowed”, as Coleridge said of Wordsworth’s poetry. Originality is inseparable from a powerful sense of the individual, and the boundaries of this individuality are strongly protected.
In traditional societies, conformity to certain respected patterns and conventions was the norm. The pot, the carving, the exquisite weaving needed no signature. By contrast, the modern artefact bears the stamp of personality. The work is the signature. The individual truly possesses his or her own work, has rights in it, defines himself by it. It is private property that cannot be trespassed on. A great body of law has grown up around this possessiveness. Countries that do not sign up to the Berne Convention and other international agreements relating to intellectual property rights find themselves excluded from the mainstream of a globalised culture. The artist owns his work, and sits glowering over it, like a broody hen on her eggs. We see the intensity of this fusion of originality and individuality whenever a plagiarism scandal erupts. (I’ve had some experience of it myself.)
The dust-jacket photograph, though barely relevant to an appreciation of a novel, seals the ownership. This is me, it says, and what you have in your hands is mine. Or is me. We see it too in the cult of personality that surrounds the artist – individuality and personality are driven to inspire near-religious devotion. The coach parties at Grasmere, the cult of Hemingway, or Picasso, or Neruda. These are big figures – their lives fascinate us sometimes even more than their art.
This fascination is relatively new. In their day, Shakespeare, Bach, Mozart, even Beethoven were not worshipped, they did not gleam in the social rankings the way their patrons did, or in the way that Byron or Chopin would do, or in the way a Nobel Prize-winner does today. How the humble artist was promoted to the role of secular priest is a large and contentious subject, a sub-chapter in the long discussion about individuality and modernity. The possible causes make a familiar list – capitalism, a growing leisured class, the Protestant faith, the Romantic movement, new technologies of communication, the elaboration of patent law following the Industrial Revolution. Some or all of these have brought us to the point at which the identification of the individual and her creativity is now complete and automatic and unquestionable. The novelist today who signs her name in her book for a reader, and the reader who stands in line waiting for his book to be signed collude in this marriage of selfhood and art.
There is an antithetical notion of artistic creation, and though it has been expressed in different forms by artists, critics and theoreticians, it has never taken hold outside the academies. This view holds that, of course, no one escapes history. Something cannot come out of nothing, and even a genius is bound by the constraints and opportunities of circumstance. The artist is merely the instrument on which history and culture play. Whether an artist works within his tradition or against it, he remains its helpless product. The title of Auden’s essay, “The Dyer’s Hand”, is just a mild expression of the drift. Techniques and conventions developed by predecessors – perspective, say, or free indirect style (the third person narrative coloured by a character’s subjective state) are available as ready-made tools and have a profound effect. Above all, art is a conversation conducted down through the generations. Meaningful echoes, parody, quotation, rebellion, tribute and pastiche all have their place. Culture, not the individual talent, is the predominant force; in creative writing classes, young writers are told that if they do not read widely, they are more likely to be helplessly influenced by those whose work they do not know.
Such a view of cultural inheritance is naturally friendly to science. Darwin worked against a background of all kinds of evolutionary views, including those of his grandfather, Erasmus. Darwin relied on the observations of animal breeders, pigeon fanciers, natural historians, as well as the work of Malthus and Lyell. Einstein, another great creator, could not have begun his special theory of relativity without the benefit of countless others, including Hendrik Lorentz and Max Planck. He was entirely dependent on mathematicians to give expression to his ideas. (Newton’s much-cited claim to have stood on the shoulders of giants was inverted some years ago to illustrate the potency of predecessors in science: “If I have seen less far than others, it was because giants were standing on my shoulders.”)
Given the tools that were available to scientists in the mid-20th century, including x-ray crystallography, and given the suppositions that were in the air, and the different groups that were working in this field, DNA would have been described sooner or later by someone or other. It should hardly matter, then, in the realms of pure rationality and scientific advance, who actually got there first. If it had been Linus Pauling and not Crick and Watson, what difference would it have made in the sum of things? But what a difference being ahead by a few months made to the lives of Crick and Watson.
Consider another celebrated moment of priority-anxiety. It came at the end of a 10-year process during which Einstein pursued the ambitious project of “generalising” his special theory of relativity, formulated in 1905. As his thinking developed in the years after its publication, he predicted that light would be influenced by gravitation. His biographer Walter Isaacson points out that Einstein’s success thus far had “been based on his special talent for sniffing out the underlying physical principles of nature”, leaving to others the more mundane task of providing the best mathematical expression. “But,” as Isaacson notes, “by 1912 Einstein had come to appreciate that maths could be a tool for discovering – and not merely describing – nature’s laws.”
Isaacson quotes the physicist James Hartle: “The central idea of general relativity is that gravity arises from the curvature of space-time.” Two complementary processes were to be described – how matter is affected by a gravitational field, and how matter generates a gravitational field in space-time and causes it to curve. These startling, near-ungraspable notions were eventually to find expression in Einstein’s adaptation of the non-Euclidean geometry of tensors devised by the mathematicians Riemann and Ricci. By 1912 Einstein had come close to a mathematical strategy for an equation, but then he turned aside, looking for a more physics-based route. It was only partially successful, and he had to be satisfied with publishing with his colleague Marcel Grossmann an outline of a theory, the famous “Entwurf” of 1913, which, as Einstein came to realise, contained important errors.
The upheavals of the first world war, and Einstein’s struggle against German nationalism among scientific colleagues, his ongoing attempts to see his young sons in Zurich and to obtain a divorce from their mother form the background to another extraordinary intellectual supernova, extending not over 13 months this time, but four outstanding weeks.
In June of 1915 Einstein lectured on the Entwurf at the University of Göttingen. The lectures were a great success. Also, in private conversations with the eminent German mathematician David Hilbert, a fellow pacifist, Einstein explained relativity and what he was attempting to achieve, and the mathematical problems he was encountering. Afterwards, Einstein declared himself enchanted with Hilbert. He seemed to understand right down to the fine details what Einstein was trying to achieve, and the mathematical obstacles in his way.
In fact, Hilbert understood rather too well, and soon he was working hard to find a formulation of his own for a general theory, just as Einstein was discovering more errors and contradictions in the Entwurf. He abandoned it in October and turned back to the maths-based strategy of 1912. And thus, painfully conscious of Hilbert, the superior mathematician, on his heels, Einstein began what Isaacson, surely rightly, calls one of “the most concentrated frenzies of scientific creativity in history”. As he worked on his theory, he was presenting his ideas immediately to the Prussian Academy in a set of four weekly lectures, beginning on 4 November 1915.
By his third lecture, Einstein’s theory in its present state accurately predicted the shift in Mercury’s orbit – he was, he wrote to a friend, “beside myself with joyous excitement”. Just days before Einstein gave his final lecture, Hilbert submitted his own formulation of general relativity to a journal in an essay with the not-so-humble title of “The Foundation of Physics”. Einstein wrote bitterly to a friend: “In my personal experience I have hardly come to know the wretchedness of mankind better.”
Unlike Wallace, who worked independently of Darwin, Hilbert was trying to give mathematical expression to theories that were Einstein’s. Nevertheless, Einstein, like Darwin, was driven to a great creative outpouring for fear of losing priority. The formulation he gave in his final lecture on 28 November was described by the physicist Max Born as “the greatest feat of human thinking about nature, the most amazing combination of philosophical penetration, physical intuition and mathematical skill”. Einstein himself said of the theory that it was of “incomparable beauty”.
The Einstein-Hilbert priority dispute still rumbles on in its small way. But it should be noted that both Wallace and Hilbert were quick and generous to concede priority to Darwin and Einstein. If Einstein’s friendship with Hilbert became strained during that momentous month of November 1915, their friendship was soon re-established.
As children we race each other to be first into the sea. There have been heroic, sometimes fatal races to be first at the north or south poles, or round the north-west passage or up this river or across that desert. Sometimes, intense nationalistic passions are involved. First to swim or fly across the Channel, first to ascend into space, first on the Moon, on Mars – these great endeavours, for all their technical accomplishment, have a childlike quality.
In literature, everyone is first. We do not need to ask who was first to write Don Quixote. Better, in fact, to consider the possibility of being the second – Pierre Menard, who in Borges’s famous story independently reconceives, centuries after Cervantes, the entire novel, down to the last word. The worst novelist in the world can at least be assured that he will be the first to write his terrible novel. And mercifully, the last. And yet, to be first, to originate, to be original is key to the quality of a work of literature. However minimally, it must advance – in subject matter, in means of expression – our understanding of ourselves, of ourselves in the world.
But novelists are the grateful inheritors of an array of techniques and conventions and subject matter, which themselves are the products of social change. I’ve mentioned free indirect style, first deployed in extended form by Jane Austen. Samuel Richardson’s novel Clarissa was perhaps the first to describe in exacting detail and at length the qualities of a subjective mental state. Nineteenth-century novelists bequeathed penetrating and sophisticated means for delineating character. A long time had to pass before a novelist troubled to inhabit the mind of a child. In Ulysses, Joyce made a new poetry out of the minutiae of the every day. And he and modernists like Virginia Woolf found new means of representing the flow of consciousness that now are common, even in children’s books. But Richardson, Austen, Joyce and Woolf were inheritors in their turn. They sat on the shoulders of giants too.
Darwin and Einstein came first and were overwhelmed by celebrity and profound respect, and became icons in the culture, while Wallace and Hilbert languished in relative obscurity. And this “first”, this originality, is precisely defined. Not first along an absolute Newtonian timeline, but first in a recognisable and respectable public forum. Hence the Linnean Society, hence the Prussian Academy – presentations made at speed and under immense pressure.
Nineteenth-century science had teetered for decades on the edge of evolutionary ideas, and if Darwin – or for that matter Wallace – had not given expression to the idea of evolution by natural selection, others surely would have. The same biological realities confronted everyone, and taxonomy was at an advanced stage.
Likewise, it is inconceivable that the brilliant generation that laid down the foundations of classical quantum mechanics in the first 30 years of the 20th century would not have found a means of binding matter, energy, space and time, though their routes may have differed from Einstein’s, and they may not at first have achieved it with such elegant economy by way of Riemann’s tensor.
To be first, to be original in science matters profoundly. Laboratories race each other to publication. Powerful passions are involved, and Nobel prizes too. To be for ever associated with a certain successful idea is a form of immortality. In longing for it, scientists demonstrate a concern for themselves as creators, as irreplaceable makers. In this we see a parallel with the fiercely individualistic world of novelists, poets, artists and composers who know in their hearts that they are utterly reliant on those who went before them. In both, we see a human face.
I want to touch on another point of convergence between the arts and science. And this is the question of aesthetics. In 1858 and 1915, Darwin and Einstein, driven in part by the somewhat ignoble or worldly ambition to be first, not only redirected the course of science, but redefined our sense of ourselves. These twin revolutions, barely 60 years apart, represent the most profound as well as the most rapid shift and dislocation in human thought that has ever occurred. This rapidity is worth considering. The counter-intuitive notion that the Earth revolves around the Sun took generations to spread and take hold across Europe. Likewise, the brilliant invention of three- and four-crop rotation. A teeming microscopic world was available to medicine from the 1670s onwards, when Antoni van Leeuwenhoek began sending his observations to the Royal Society in London. But stubborn tradition-bound medicine kept its back turned on science, and it took almost another 200 years before an understanding of harmful microorganisms and the concept of antisepsis shaped medical practice.
A theory that suggested the relatedness of all species, including humans, was a challenge to dignity, and the church found it hard at first to accept the suggestion that species were not fixed, unchanging and recently made by God. Generally, however, Darwin’s ideas explained too much, too well, and were too much in accord with new observations in geology to be resisted, especially by biologists, and many English clergymen with country livings were good naturalists and could immediately grasp the theory’s utility. What is interesting about the publication of On the Origin of Species is the rapidity of its acceptance.
Einstein’s theory could be empirically tested by observing the degree of deflection of starlight by the Sun, best achieved at a total eclipse. Various expeditions were sent from 1918, and though they returned what seemed a positive result, in reality the margin of error in measurements was too great to provide absolute confirmation. And, meanwhile, the theory was already in the textbooks by the late 1920s. Radio telescopes in the early 50s provided the definitive proof, and by then relativity theory was a staple of physics and astronomy.
The accelerated acceptance of Darwin and Einstein’s work in 1858 and 1915 cannot be explained entirely by reference to their effectiveness or truthfulness. Here is what the great American biologist Edward O. Wilson has to say about a scientific theory: “The elegance, we can fairly say the beauty, of any particular scientific generalisation is measured by its simplicity relative to the number of phenomena it can explain.” Many physicists, notably Steven Weinberg, are convinced that it was the elegance, the sheer beauty, of Einstein’s general theory that drove its rapid acceptance ahead of its empirical validation.
Those lucky enough to understand Paul Dirac’s famous equation (it explains the spin of the electron and predicted the existence of anti-matter) speak of its intellectual daring and breathtaking beauty. This is a music most of us will never hear. The equation, as brief as Einstein’s, can be found carved in stone in Westminster Abbey.
If one might make use of Darwin’s theory to think about Einstein’s, we could speculate that evolution has granted us only sufficient understanding of space and time as is necessary to function and reproduce effectively. The relentless logic of natural selection is not organised to grant organisms, even most humans, an intuitive grasp of the kinds of counter-intuitive insights that the special and general theories of Einstein present.
Gravity may well be a function of the bending of space-time, matter and energy may lie along a continuum, but most of us cannot feel this as part of our immediate world. We are the evolved inhabitants of Middle Earth. You might say we continue to dwell in a Newtonian universe, but in fact it is one that would also be familiar to Jesus and Plato.
When a well-known scientist, John Wheeler, writes that “matter tells space-time how to curve, and curved space-time tells matter how to move”, we may or may not be impressed, but it is hard to reorient one’s worldview accordingly, to abandon the sense that there is an absolute “now” in every corner of the universe and that empty space is just a void ready to be filled, and cannot be bent, and is a distinct entity from time. The Einsteinian revolution may have redefined the absolute basics of matter, energy, space and time, but the limits of our mental equipment keep us in our evolutionary homelands, in the savannah of commonsense.
On the other hand, as Steven Pinker has pointed out, the ramifications of natural selection are multiple. And, relatively, they are easily, if uneasily, understood: the Earth and life on it are far older than the Bible suggests. Species are not fixed entities created at one time. They rise, fall, become extinct, and there is no purpose, no forethought in these patterns. We can explain these processes now without reference to the supernatural. We ourselves are related, however distantly, to all living things. We can explain our own existence without reference to the supernatural. We may have no purpose at all except to continue. We have a nature derived in part from our evolutionary past. Underlying natural selection are physical laws. The evolved material entity we call the brain is what makes consciousness possible. When it is damaged, so is mental function. There is no evidence for an immortal soul, and no good reason beyond fervent hope that consciousness survives the death of the brain.
It is testimony to the originality as well as the diversity of our species that some of us find such ramifications horrifying, or irritating, or self-evidently untrue and (literally) soulless, while others find them both beautiful and liberating and discover, with Darwin, “grandeur in this view of life”. Either way, if we do not find our moments of exaltation in religious awe and the contemplation of a supreme supernatural being, we will find them in the contemplation of our arts and our science. When Einstein found that his general theory made correct predictions for the shift in Mercury’s orbit, he felt so thrilled he had palpitations, “as if something had snapped inside. I was,” he wrote, “beside myself with joyous excitement.” This is the excitement any artist can recognise. This is the joy, not of simple description, but of creation. It is the expression, common to both the arts and science, of the somewhat grand, somewhat ignoble, all too human pursuit of originality in the face of total dependence on the achievements of others.