Monday, January 31, 2011

What's it really about?


If the Daily Mail review is anything to go by (see below), people are going to find in Unnatural just what they expect to find.

And I think that must also be how to understand Robert Harris’s review in the Sunday Times. Take a look at it (if you have access), and I think you’ll allow that it reads as a review of a book in which a scientist complains bitterly about Mary Shelley and Aldous Huxley, accusing them of having “a lot to answer for”: writing novels that doomed blameless modern biomedical research to alarmist newspaper headlines that would never have appeared in the novels’ absence.

Well, well. So how do we account for this passage in the book’s introduction? –

Scientists engaged in new ways of ‘making people’ – what I here call anthropoesis, which condenses that phrase into its Greek equivalent – in modern times, such as those researching in vitro fertilization and cloning, resent and lament these intrusions of myth and legend into their field of work. Here we are, the scientists will say, trying to improve medicine and to relieve man’s estate – trying to do good – and all the rest of the world can see are Gothic ghouls and mad inventors. ‘Whatever today’s embryologists may do, Frankenstein or Faust or Jekyll will have foreshadowed, looming over every biological debate’, said Robert Edwards, a pioneer of IVF, in 1989 at the height of the debate about research on the human embryos that IVF had suddenly made available. Edwards was impatient with the way, in his view, science-fiction narratives were shaping the discussion: ‘The necessity or otherwise for experiments on human embryos sparks the most intense argument, as fears arise about tailor-made babies, or clones, or cyborgs, or some other nightmarish fancy.’

‘The trouble really started way back in the 1930s, by courtesy of the brilliant Aldous Huxley’, Edwards asserted. But he was wrong about that. Aldous Huxley did not conceive a tale that subsequently shaped thinking about embryo research, any more than did Mary Shelley, Robert Louis Stevenson or Goethe. Rather, they and other writers gave particular embodiments to pre-existing myths and legends that would have exerted their influence come what may. Edwards might well have wished that Brave New World had never been written, but as we shall see, Huxley’s authorship of that novel was almost incidental; the ideas were firmly bedded down before he put pen to paper.

Edwards also failed to perceive the true role of fictional tropes of anthropoesis. It is not simply the case that there happen to be stories and legends that create inconvenient and misleading stereotypes. In the stories we tell about artificial people – how they are made, and what we assume they are like – we reveal some of our most profound feelings about what is natural and what is not, and about what this distinction implies in moral terms. For making people has always been cause for moral judgement, which is at root a judgement about naturalness.

Could this be more clearly saying that it is naïve to hold Shelley, Huxley, Stevenson et al. responsible for the way Frankenstein and Brave New World loom large in all public/media debates about these issues?

Yet here is what Harris presents as a criticism of my alleged position:
“If it had not been raining in the summer of 1816… [Mary Shelley] almost certainly would not have written Frankenstein, and there would have been no plays and no films showing a man with a bolt through his neck. But I doubt whether the widespread contemporary unease aroused by scientific intervention in the processes of procreation would be one jot the less. Our response has not been “conditioned” by Mary, Huxley and the rest: it has been expressed by them.”

Now let me give you again a bit of that quote from Unnatural:
“Aldous Huxley did not conceive a tale that subsequently shaped thinking about embryo research, any more than did Mary Shelley, Robert Louis Stevenson or Goethe. Rather, they and other writers gave particular embodiments to pre-existing myths and legends that would have exerted their influence come what may.”

Can you see any substantial difference between these two points of view?

Ah, but that’s where Harris encounters a little cognitive dissonance, which compels him to concede, “to be fair”, that I do put forward that case (while citing a different quote that masks exactly how I do so). However, he says, this means I am trying to “have it both ways”. Well, I suppose it would, if one of those ways were not Harris’s invention.

Incidentally, you’ll come away from his review imagining I am a “staunch supporter” of cloning. I should be very surprised if you would come away from the book with the same opinion.

My book is not, as Harris claims, about ‘artificial life’. I steer clear of that phrase, for the simple reason that it is the wrong one. ‘Artificial life’ can mean several things, and the idea of making human beings by artificial means or interventions has some overlap with some of them, but is a much more specific enterprise – which is why I invented a new word for it. I say very explicitly why the computer-science visions of artificial life are not very relevant here – that the myths matter primarily for the ‘wetware’ rather than the ‘software’ version. So HAL and AI and all the rest of that stuff are not simply ignored but explicitly set aside.

It is also irksome that Harris implies that my statement that IVF “does not exactly make people” is an admission that it is off-topic. He neglects to mention that I also explained why the association of IVF with legends of ‘making people’ is nonetheless “not only permissible but essential… the central issue in both cases is that human life is seen to be initiated by art, by means of human ingenuity rather than merely human biology.” Perhaps he disagrees with this claim (which is central to the thrust of the book). But to imply that I don’t even offer such a justification for my choices strikes me as odd.

Besides, if Harris himself felt it was indeed the case that IVF is not a valid part of what he calls ‘artificial life’, why does he then say, “The commercial success of IVF, and its social and political acceptance by almost everyone except the Roman Catholic church, may well point the way to a new attitude regarding artificial life”?

So why do I think Harris wrote this stuff? I certainly don’t think he was trying to be unfair. Rather, I think it is obvious that he has a deafening narrative playing in his mind: ‘here’s a scientist berating popular/media culture for presenting serious scientific research in ways that are [in his words] “highly unscientific, irresponsible, alarmist, dangerous and just plain wrong”.’ It’s frustrating that this so badly misunderstands my point; and doubly so given the pains I took to argue that rants of that kind, which scientists do make, are themselves missing the real point.

But really, what a fool I would be to be surprised by this. My book is about how the assumptions and prejudices we bring to this issue get in the way of a clear debate (as opposed, let me reiterate at the risk of boring you, to getting in the way of an uncritical acceptance of what the scientists want to do). Can I be surprised if some reviews exhibit that same problem? Indeed, I now have evidence of two utterly different readings of the book based on what the reviewers expected to hear. For Robert Harris, this is a polemic against the alarmist ignorance obstructing the noble science. For the Daily Mail, it is a book pointing out how ‘our hunger to play God could be the death of us’. I suppose I must look on the bright side and tell myself that it’s heartening to have my thesis illustrated so clearly.

Saturday, January 29, 2011

'Frankenstein plays God' shock horror



When I wrote recently in New Humanist, apropos of my new book Unnatural, that “It is all too easy for self-appointed moralists who warn that reproductive technologies will lead to Frankenstein monsters and Brave New Worlds – whether they are the Daily Mail, the religiously motivated bioethicists who determined George W Bush’s biomedical policies, or anti-biotechnology crusaders – to tap into familiar, legendary nightmares that foreclose a grown-up debate about how, why and when to regulate the technical possibilities” (the piece was accompanied by the wonderful image above by Martin Rowson), I have to admit that I wasn’t thinking very much about the possibility that the Daily Mail would review Unnatural. But if I had been, I needn’t have worried, for it seems that folks at the Mail have ruthless mental filters that transform words into precisely what they want to hear. And so it is that Christopher Hudson’s extensive review in yesterday’s Mail is amazingly favourable, calling Unnatural a “fascinating and disturbing book”.

Never mind that I pick apart the idle journalese of “playing God” – Unnatural is apparently “the story of all the Frankensteins who wanted to play God by creating mankind.” Never mind that my point about Brave New World is that it is not “eerily prophetic” at all. Never mind that the central point of the book (there’s a clue in the title) is to challenge accepted notions of “unnaturalness” – one of the key problems for human cloning is apparently “How can its unnaturalness be overcome?” Hudson says that my book “demonstrates [that cloning] could eventually destroy what it means to us to be a human being.” Well, no; I set out to demonstrate that, whatever cloning might do, it will not in fact be that.

I would like to imagine that a lot of Mail readers will be in for a surprise if they buy the book on the back of this review (and please don’t let me stop you); but who knows, maybe they’ll talk themselves into drawing the same conclusions.

“From Frankenstein to clones, how our hunger to play God could be the death of us”, says the standfirst to the review. Now if only they had run it a bit earlier, that would have supplied the perfect headline for the tabloid that Martin’s Creature grasps. 

Thursday, January 27, 2011

Artificial hydrogen poses heavy challenge to quantum theory


Another piece for Nature’s online news, and while this is pretty hardcore, it is also a gorgeously bold experiment.

*****************************************************************

Analogues of hydrogen made with exotic particles test quantum chemistry to its limits.

Scientists have made new ultralight and ultraheavy forms of the element hydrogen, and investigated their chemical properties.

Donald Fleming of the University of British Columbia in Vancouver, Canada, and his coworkers have created artificial analogues of hydrogen that have masses of a little over one tenth and four times that of ordinary hydrogen. These pseudo-hydrogens both contain short-lived subatomic particles called muons, superheavy versions of the electron.

The researchers looked at how these new forms of hydrogen behave in a chemical reaction in which a lone hydrogen atom plucks another out of a two-atom hydrogen molecule – just about the simplest chemical reaction conceivable. They find that both the weedy and the bloated hydrogen atoms behave just as quantum theory predicts they should [1] – which is itself surprising.

The experiment is a ‘tour de force’, says Paul Percival of Simon Fraser University in Burnaby, Canada, a specialist in muonium chemistry.

‘I would never attempt such a difficult task myself’, Percival admits, ‘and when I first saw the proposal I was very doubtful that anything of value could be gained from the herculean effort.  Don Fleming proved me wrong. I doubt if anyone else could have achieved these results.’

A normal hydrogen atom contains a single, negatively charged electron orbiting a single positively charged proton in the nucleus. About 0.015 percent of natural hydrogen consists of the heavy isotope deuterium, in which the atoms also contain an electrically neutral neutron in the nucleus. And there is a third isotope of hydrogen (tritium) with two neutrons, produced in some nuclear reactions, but which is too dangerously radioactive for use in such experiments.

Because the chemical behaviour of atoms depends on the number of electrons they have, the three hydrogen isotopes are chemically almost identical. But the greater mass of the heavy isotopes means that they vibrate at different frequencies, and quantum theory suggests that this will produce a small difference in the rate of their chemical reactions, such as the one examined by Fleming and colleagues.
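
For a rough sense of why mass matters, treat the bond as a harmonic spring (a textbook idealization, not the full calculation reported in the paper):

```latex
% Harmonic-oscillator estimate of a bond's vibrational frequency:
\omega = \sqrt{\frac{k}{\mu}}, \qquad \mu = \frac{m_1 m_2}{m_1 + m_2}
% At fixed force constant k, swapping H for D roughly doubles the reduced
% mass mu and lowers the frequency by a factor of about sqrt(2), shifting
% zero-point energies and hence reaction rates.
```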

If lighter and heavier versions of hydrogen could be made, that theory could be subjected to more rigorous testing. Fleming and colleagues did this using muons produced by collisions in the Canadian particle accelerator TRIUMF in Vancouver.

Muons are related to electrons, but are more massive. “A muon is an overgrown electron – an electron on steroids – with a mass about 200 times that of an electron”, explains Richard Zare, a physical chemist at Stanford University. “But unlike the free electron the free muon falls apart, with a mean lifetime of about 2.2 microseconds.” This meant that the researchers had to work fast to study their pseudo-hydrogen.

To make the ultralight form, they replaced the proton with a positively charged muon, which has just 11 percent of the mass of a proton. And to make ultraheavy hydrogen, they replaced one of the electrons in a helium atom with a negative muon.

Helium has two electrons, two protons and two neutrons. But because the muon is so much more massive than the electron it replaces, it orbits much more tightly around the nucleus, and so in effect the atom becomes a kind of composite nucleus – the existing two-proton nucleus plus the muon – orbited by the remaining electron. So it has a mass a little over four times that of hydrogen.
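
As a back-of-envelope check on those figures (my own arithmetic with standard particle masses in atomic mass units, not numbers from the paper), a short calculation reproduces both mass ratios:

```python
# Rough check of the quoted mass ratios (standard values, atomic mass units).
m_e, m_mu = 0.000549, 0.1134      # electron, muon
m_p, m_alpha = 1.00728, 4.00151   # proton, helium-4 nucleus
m_H = m_p + m_e                   # ordinary hydrogen atom

muonium = m_mu + m_e                  # positive muon orbited by an electron
muonic_helium = m_alpha + m_mu + m_e  # He nucleus + negative muon, plus one electron

print(muonium / m_H)        # ~0.113: a little over one tenth of hydrogen
print(muonic_helium / m_H)  # ~4.08: a little over four times hydrogen
```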

Fleming and colleagues found that the reaction rates calculated from quantum theory were close to those measured experimentally. “This gives confidence in similar theoretical methods applied to more complex systems”, says Fleming.

The good agreement wasn’t necessarily to be expected, since the calculations rely on the so-called Born-Oppenheimer approximation, which assumes that the electrons adapt their trajectories instantly to any movement of the nuclei. This is generally true for electrons, which are nearly 2000 times lighter than protons. But it wasn’t obvious that it would hold up for muons, which have about a tenth of the proton’s mass.

“It surprises me at first blush that the theoretical treatments hold up so well”, says Zare. “The Born-Oppenheimer approximation is based on the small ratio of the mass of the electron to that of the mass of the nuclei. Yet suddenly the mass of the electron is increased by two-hundred-fold and all seems to be well.”
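
The mass ratios behind that worry (simple arithmetic with standard values, not figures from the paper):

```latex
% Born-Oppenheimer relies on the light-to-heavy mass ratio being small:
\frac{m_e}{m_p} \approx \frac{1}{1836} \quad \text{(ordinary hydrogen)},
\qquad
\frac{m_e}{m_\mu} \approx \frac{m_e}{207\,m_e} = \frac{1}{207} \quad \text{(muonium)}
% In muonium the 'nucleus' weighs only 207 electron masses, so the ratio is
% nearly an order of magnitude less favourable than in ordinary hydrogen.
```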

Because the muon has such a short lifetime, extending such studies to more chemically complex systems is even more challenging. However, Fleming and his colleagues now propose to look at the ‘hydrogen’ exchange reaction between the superheavy ‘hydrogen’ and methane (CH4).

References

1. Fleming, D. G. et al. Science 331, 448-450 (2011).

Monday, January 24, 2011

How words get the message across


Here is the pre-edited version of my latest news article for Nature online, with a bit of extra stuff appended for which there was no room.
***********************************************************

Languages are adapted to deliver information efficiently and smoothly.

Longer words tend to carry more information, according to research by a team of cognitive scientists at the Massachusetts Institute of Technology.

It’s a suggestion that might sound intuitively obvious, until you start to think about it. Why, then, the difference in length between ‘now’ and ‘immediately’? For many years, linguists have tended to believe that word length depended primarily on how often the word is used – a relationship discovered in the 1930s by the Harvard linguist George Kingsley Zipf [1].

Zipf believed that this link between word length and frequency stemmed from an impulse to minimize the amount of time and effort needed for speaking and writing, since it means we use more short words than long ones. But Steven Piantadosi and colleagues say that, to convey a given amount of information, it is more efficient to shorten the least informative – and therefore the most predictable – words, rather than the most frequent ones.

Zipf’s relationship is roughly correct, as implied by how much more often ‘a’, ‘the’ and ‘is’ are used in English than, say, ‘extraordinarily’. And this relationship of length to use seems to hold up in many languages. Because written and spoken length are generally similar, it applies to both speech and text.

But after analysing word use in 11 different European languages, Piantadosi and colleagues found that word length was more closely correlated with their information content than with their usage frequency. They describe their results in the Proceedings of the National Academy of Sciences USA [2].

“This is a landmark study”, says linguist Roger Levy of the University of California at San Diego. “Our understanding of the relationship between word frequency and length has remained relatively static since Zipf’s discoveries”, he says, and he feels that this new study may now supply “the largest leap forward in 75 years in our understanding of how principles of communicative efficiency govern the evolution of natural language lexicons.”

Measuring the information content of a word isn’t easy, especially because it can vary depending on the context. The more predictable a word is, the less informative it is. The word ‘nine’ in ‘A stitch in time saves nine’ contains less information than it does in the phrase ‘The word that you will hear is nine’, because in the first case it is highly predictable.

The MIT group devised a method for estimating the information content of words in digitized texts by looking at how each word is correlated with – and thus, predictable from – the preceding words. For just a single preceding word, Piantadosi explains that “we count up how often all pairs of words occur together in sequence, such as ‘the man’, ‘the boy’, ‘a man’, ‘a tree’ and so on. Then we use this count to estimate the probability of a word conditioned on the previous word – or more generally, the probability of any word conditioned on any preceding sequence of a given number of words.” According to information theory, the information content is then proportional to the negative logarithm of this probability.
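
To make the procedure concrete, here is a minimal sketch in Python of the single-preceding-word version (my own illustration, not the authors’ code; they conditioned on longer contexts and vastly larger corpora):

```python
from collections import Counter
from math import log2

def bigram_information(tokens):
    # Count single words and adjacent pairs. (Rough estimate: the final
    # token also appears in the unigram counts, so probabilities are
    # slightly underestimated - fine for a sketch.)
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    totals = {}
    for (prev, word), n in bigrams.items():
        # Information content of `word` given one word of context:
        # -log2 P(word | prev), with P estimated from the counts.
        bits = -log2(n / unigrams[prev])
        s, c = totals.get(word, (0.0, 0))
        totals[word] = (s + n * bits, c + n)
    # Average information content of each word over all its contexts.
    return {w: s / c for w, (s, c) in totals.items()}

text = "the cat sat on the mat and the cat saw the rat".split()
for word, bits in bigram_information(text).items():
    print(f"{word:4s} length={len(word)} info={bits:.2f} bits")
```

Correlating the resulting per-word averages with word length, rather than with raw frequency, is then the comparison at the heart of the paper.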

However, physicist Damián Zanette of the Centro Atómico Bariloche in Argentina, who has studied Zipf-type relationships in linguistics, is not persuaded that this method accurately captures the real information content of a word in context. This, he says, is typically determined by a span of several hundred surrounding words, not just a few [3].

Piantadosi and colleagues suggest that the relationship of word length to information content might not only make it more efficient to convey information linguistically but also make language cognition a smoother ride for the reader or listener. If shorter words carry less information, then the density of information throughout a phrase or sentence will be smoothed out, so that it is delivered at a roughly steady rate rather than in lumps. In this way, the results suggest how the lexical structure of language might aid communication.

Surprising though it may seem, some linguists have suggested previously that communication might not in fact be the primary purpose of language – Noam Chomsky, for example, has claimed that it is about establishing social relationships. Yet according to cognitive scientist Florian Jaeger of the University of Rochester in New York, these new results “suggest that communication is a sufficiently important aspect of language to shape it over time”.


References

1. Zipf, G. The Psychobiology of Language (Routledge, London, 1936).
2. Piantadosi, S. T., Tily, H. & Gibson, E. Proc. Natl Acad. Sci. USA 10.1073/pnas.1012551108 (2011).
3. Montemurro, M. A. & Zanette, D. H. Adv. Complex Syst. 13, 135-153 (2010).


Some further comments from Steven Piantadosi in response to my questions:

PB: In terms of the possible reasons for your central finding: are you suggesting that shorter words carry less information largely so that information tends to be rather evenly distributed through both text and (because of the relationship of orthographic to phonetic length) speech, i.e. the short, 'rapid-fire' words don't carry a lot of info and so don't impose a sudden high demand on cognitive processing?

SP: Yes, that's probably the most likely theory for what's going on. There are quite a few papers in psycholinguistics showing these kinds of effects (references 7,8,9,10,12 in the paper). In Levy & Jaeger, for instance, people insert optional syntactic elements like "that" in locations where there would otherwise be a peak in information content – inserting another word helps keep information per unit time lower.

PB: In this respect, what do the findings imply for the long-standing idea that language is a compromise between the needs of the speaker and those of the listener? It rather seems the balance here is in favour of the listener, who gets a smooth rather than lumpy informational stream, whereas the speaker has to do rather more speaking than if length depended primarily on frequency. Or does your idea also optimize the total amount (time) of speaking needed to convey a given amount of information, and so benefit the speaker too?

SP: This is a really interesting issue. It could be caused by speakers thinking about what listeners would want, or it could just reflect intrinsic properties of language production systems, or both. Speakers have more trouble accessing low frequency (probably also high information content) words, so I wouldn't say that this necessarily has to come from speakers designing speech for listeners. It's true that speakers have to do more speaking, but that also means they have more time to plan and produce their utterances. It also helps listeners by giving them more time to process. I don't think we know who it's really for, yet.

PB: Finally, more for my own curiosity than anything, I can't help wondering if anything of this sort works for Chinese. Obviously one tends to lose the phonetic/orthographic link there - and while commonly used words do sometimes have simpler written characters, this is not always so. Do you nonetheless expect to see any kind of relationship between information content and the number of strokes in the characters? Does any such thing then survive in speech patterns?

SP: Ah that's interesting. I'm not sure I would necessarily predict effects in Chinese orthography per se, but it would be interesting to look – it would be a neat case for seeing if there are actually influences on the writing system. In the current work, we used orthography largely as a proxy for phonetic length. Chinese has very many monosyllabic words so it's not clear that word length has much variance to be explained there. That raises the interesting question of why Chinese is like that. It may be that information content is modulated in other ways in Chinese, but I don't know.

Thursday, January 20, 2011

Unnatural events


Seems a timely point to mention that my new book Unnatural is about to appear – it’s officially released at the start of February. I have a forthcoming Opinion piece in New Scientist on the topic (5 Feb issue), and have just recorded an item about it for the Guardian books podcast. I have several talks on this (and other things) coming up in the next few months, and will put a list on my web site.

Thursday, January 13, 2011

For geeks only

That means you.

First, for anyone interested in the regulation of synthetic biology, there is a set of guidelines issued by the International Risk Governance Council in Geneva, in the writing of which I played a part.

Second, here is a little news item about lead-acid batteries with a fun bottom line (I know, it sounds unlikely).

Friday, January 07, 2011

What is a bond?

My piece on the chemical bond is now published in Nature. I hope it attracts more comment – already I’m pleased to see remarks from the IUPAC team who are redefining the hydrogen bond (I had no room to talk about this in any detail, or to supply the link), and also some comment on Bader’s perspective, to which again I could only allude in the briefest of terms – it deserves more space.

... ah, Julie's post about the inaccessibility behind Nature's firewall makes me feel bad, so here's the whole piece after all, before final editing and so with a few more refs and details included:

******************************

Not so long ago the chemistry student’s standard text on the theory of chemical bonding was Charles Coulson’s Valence (1952). Absent from it was Coulson’s real view of the sticks that generations of students have drawn to link atoms into molecules. ‘A chemical bond is not a real thing: it does not exist: no one has ever seen it, no one ever can. It is a figment of imagination which we have invented,’ he wrote [1].

There is a good reason for postponing this awkward truth. The bond is the glue that makes the entire discipline cohere, and so to consider it an objective reality is necessary for any kind of chemical discourse. Chemistry is in fact riddled with such convenient (but contested [2]) fictions, such as electronegativity, oxidation state, tautomerism and acidity.

Disputes about the correct description of bonding have ruffled chemists’ feathers since the concept of molecular structure first emerged in the mid-nineteenth century. Now they are proliferating, as new theoretical and experimental techniques present new ways to probe and quantify chemical bonds [3]. Traditional measures such as crystallographic atomic distances and dissociation energies have been supplemented by spectroscopic techniques for determining vibrational frequencies, shifts in the electronic environment of the atom, magnetic interactions between atoms, measurements of force constants, and a host of quantum-chemical tools for calculating such aspects as electron distributions, electron localization and orbital overlap.

The nature of the chemical bond is now further complicated by the introduction of the dynamical dimension. Molecules have traditionally been regarded, if not as static, then as having platonic architectural frameworks which are merely shaken and rotated by thermal motions. The bonds get stretched and bent, but they still have an equilibrium length and strength that seems to justify their depiction as lines and stalks. Now, thanks to ultrafast spectroscopies, we are no longer restricted to these time-averaged values to characterize either structure or reactivity. What you ‘measure’ in a bond depends also on when you measure it.

Some chemists argue that in consequence the existence (or not) of a bond depends on how the problem is probed; others are committed to absolute criteria [4]. This difference of opinion goes to the heart of what chemistry is about: can all be reduced to quantum physics or are fuzzy heuristics essential? More pressingly, the issue of how best to describe a chemical bonding pattern has tangible implications for a wide range of problems in chemistry, from molecules in which atoms are coerced out of their usual bonding geometry [5] to the symmetric hydrogen bond (where the hydrogen is shared equally between two atoms) [6,7] and new variations on old themes such as aromaticity (special patterns of ‘smeared-out’ bonding like that in benzene) [8].

Just about every area of chemistry harbours its own bonding conundrums, almost any of which illustrate that we have a far from exhaustive understanding of the ways in which quantum rules will permit atoms to unite – and that in consequence our chemical inventiveness suffers from a limited view of the possibilities.

Carving up electrons

We can all agree on one thing: chemical bonding has something to do with electrons. Two atoms stick together because of the arrangement of electrons around their nuclei. In the nineteenth century it was commonly thought that this attraction was electrostatic: that atoms in molecules are positively or negatively ionized. That left the puzzle of how identical atoms can form diatomic molecules such as H2 and O2. American chemist G. N. Lewis proposed that bonding can instead result from the sharing of electrons to create filled shells of eight, visualized as the corners of a cube [9].

In the 1920s and 30s Linus Pauling showed how this interaction could be formulated in the language of quantum mechanics as the overlap of electron wavefunctions [10]. In essence, if two atomic orbitals each containing a single electron can overlap, a bond is formed. Pauling generalized earlier work on the quantum description of hydrogen to write an approximate equation for the wavefunction created by orbital overlap. This became known as the valence-bond (VB) description.

But an approximation is all it is. At the same time, Robert Mulliken and Friedrich Hund proposed another way to write an approximate wavefunction, which led to an alternative way to formulate bonds: not as overlaps between specific orbitals on separate atoms but as electron orbitals that extend over many atoms, called molecular orbitals (MOs). The relative merits of the VB and MO descriptions were debated furiously for several decades, with no love lost between the protagonists: Mulliken’s much-repeated maxim ‘I believe the chemical bond is not so simple as some people seem to think’ was possibly a jibe at Pauling. By the 1960s, for all Pauling’s salesmanship, it was generally agreed that MO theory was more convenient for most purposes. But the debate is not over [11], and Roald Hoffmann of Cornell University insists that ‘discarding any one of the two theories undermines the intellectual heritage of chemistry’.

Both options are imperfect, because they insist on writing the electronic wavefunction as some combination of one-electron wavefunctions. That’s also the basis of the so-called Hartree-Fock method for calculating the ground-state wavefunction and energy of a molecular system – a method that became practical in the 1950s, when computers made it possible to solve the equations numerically. But separating the wavefunction into one-electron components is a fiction, since the distribution of one electron depends upon the distributions of the others. The difference between the true ground-state energy and that calculated using the Hartree-Fock approach is called the correlation energy. More recent computational methods can capture most of the correlation energy – but none can give an exact solution. As a result, describing the quantum chemical bond remains a matter of taste: all descriptions are, in effect, approximate ways of carving up the electron distribution.
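
In symbols, the fiction and its price (standard textbook definitions, not tied to any particular code or method):

```latex
% Hartree-Fock writes the N-electron wavefunction as an antisymmetrized
% product (a Slater determinant) of one-electron orbitals phi_i:
\Psi_{\mathrm{HF}}(\mathbf{r}_1,\dots,\mathbf{r}_N)
  = \frac{1}{\sqrt{N!}}\,\det\bigl[\phi_i(\mathbf{r}_j)\bigr],
\qquad
E_{\mathrm{corr}} = E_{\mathrm{exact}} - E_{\mathrm{HF}}
% The correlation energy E_corr is what that one-electron factorization misses.
```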

If that were the limit of the bond’s ambiguity, there would be little to argue about. It is not. There is, for example, the matter of when to regard two atoms as being bonded at all. Pauling’s somewhat tautological definition gave the game away: ‘there is a chemical bond between two atoms or groups of atoms in case that the forces acting between them are sufficient to lead to the formation of an aggregate with sufficient stability to make it convenient for the chemist to consider it as an independent molecular species’ [10]. Pauling himself admitted that although his definition will in general exclude the weak van der Waals (‘induced dipole’) attraction between entities, occasionally – as in the association of two oxygen molecules into the O4 cluster – even this force can be strong enough to be regarded as a chemical bond.

It’s no use either suggesting (as Coulson did) that a bond exists whenever the combined energy of the objects is lower than that when they are separated by an infinite distance. This is essentially always the case, at least for electrically neutral species. Even two helium atoms experience mutual van der Waals attraction, which is after all why helium is a liquid at very low temperature, but they are not generally thought to be chemically bonded as a result.

Besides, the ‘bonded or not’ question becomes context-dependent once atoms are embedded in a molecule, where they may be brought into proximity merely by geometric factors, and where there is inevitably some arbitrariness in assigning them an individual electronic configuration. The resulting ambiguities were illustrated recently when three experts on inorganic compounds failed to agree about whether two sulphur atoms in an organometallic compound are linked by a bond [12]. The argument involved different interpretations of quantum-chemistry calculations, tussles over the best criteria for identifying a bond, and evidence of precedent from comparable compounds.

All this is merely a reminder that the molecule is ultimately a set of nuclei embedded in a continuous electron cloud that stabilizes a particular configuration, which balls and sticks can sometimes idealize and sometimes not. This doesn’t mean that disputes about the nature of the chemical bond are simply semantic. It matters, for example, whether we regard a very strong multiple bond as quintuple or sextuple, even if this is a categorization that only textbooks, and not nature, recognize.

Besides, how we choose to talk about bonds can determine our ability to rationalize real chemical behaviour. For example, the different descriptions of the bonds in what are now called non-classical ions of hydrocarbons – whose relative merits were furiously debated in the 1950s and 60s – have direct implications for the way these species react. Whether to consider the bonding non-classical, in the sense that it involved electrons spread over more than two atomic nuclei, or tautomeric, involving rapid fluctuations between conventional two-atom bonds, was not just a question of convention. It had immediate consequences for organic chemistry [13].

Perhaps one might seek a distinction between bonded and not-bonded in terms of how the force between two atoms varies with their separation? Yes, there is an exponential fall-off for a covalent bond like that in H2, and a power-law decay for van der Waals attraction. But the lack of any clear distinction between these two extremes has been emphasized in the past two decades by the phenomenon of aurophilicity [14,15]. Organometallic compounds containing gold with only a few chemical groups attached tend to aggregate, forming dimers or linear chains. In aurophilic bonds, the basic interaction has the same origin as the van der Waals force: the electron clouds ‘feel’ each other’s movements, so that random fluctuations of one induce mirror-image fluctuations of the other. But that interaction is modified here by relativistic effects: the changes in electron energies resulting from their high speeds in orbitals close to gold’s highly charged, massive nuclei [15,16]. Aurophilic bonds have therefore been described as a ‘super van der Waals’ interaction. Does that make them true bonds? It’s chemically meaningful to treat them that way (they’ll even serve for cementing new ‘designer’ molecular crystals [17]), but perhaps at the cost of relinquishing potential distinctions.
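
The two limiting forms just mentioned, in their usual idealized versions:

```latex
% Covalent attraction decays with the exponential fall-off of orbital
% overlap; London (van der Waals) dispersion follows a power law:
V_{\mathrm{cov}}(r) \sim -A\,e^{-r/a},
\qquad
V_{\mathrm{vdW}}(r) \sim -\frac{C_6}{r^6}
% Aurophilic attraction has the dispersion form, but amplified by
% relativistic effects near gold's massive nuclei.
```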

In uniting ‘closed-shell’ atoms, aurophilicity has sometimes been compared to hydrogen bonding, which is of comparable strength. Hydrogen bonds have traditionally been rationalized in electrostatic terms: positively polarized hydrogen atoms drawn towards regions of high electron density, due for example to ‘lone pairs’. But the bond has some covalent, electron-sharing character too, as is clear from its directional nature (it tends to have a 180° bond angle). Quantifying that is not at all straightforward, however, and has only very recently been done experimentally [18], prompting a task group of the International Union of Pure and Applied Chemistry to propose a new definition of the hydrogen bond (open for comment until this March) to replace the older electrostatic picture [19]. It’s an indication of how new methodology can restructure thinking about apparently familiar – and vitally important – modes of bonding. Even then, the IUPAC report warns that ‘there will be borderline cases for which the interpretation of the evidence might be subjective’: an explicit admission that categorizing bonds must remain an art, informed but not wholly determined by scientific criteria.

Moving target

How dynamics colours the notion of a chemical bond is an increasingly subtle matter. Atomic motions make even a ‘simple’ molecule complex; any movement of one nucleus demands that the entire electron cloud adjusts. So a jiggle of one group of nuclei can make it easier to cleave off another.

This complication never used to matter much in chemistry. The movements were too rapid to be observable, much less exploitable. But ultrashort pulsed lasers have moved the goal posts. For example, we can pump energy into a vibrational mode to weaken a specific bond, enabling selective molecular surgery [20]. We can ask about the chemical behaviour of a molecule at a particular moment in its dynamical evolution: even a strong bond is weakened when a vibration stretches it beyond its average, equilibrium length, so in ultrafast chemistry it may no longer be meaningful to characterize bonds simply as strong or weak. As Fleming Crim of the University of Wisconsin-Madison puts it, ‘a bond is an entity described by quantum mechanics but not a fixed ‘entity’ in that it will behave differently depending on how we perturb and interrogate it.’ The trajectory of a chemical reaction must then be considered not as a simple making and breaking of bonds but as an evolution of atoms on a potential-energy surface. This was always implicit in classical drawings of transition states as molecular groupings containing dashed lines, a kind of ‘almost bond’ in the process of breaking or forming. Now that is explicitly revealed as a mere caricature of a complicated dynamical process in space and time.

Underlying most of these discussions is an unspoken assumption that it is meaningful to speak, if not of a ‘bond’ as an unchanging entity, then at least of an instantaneous bound state for a particular configuration of nuclei. This assumes that the electrons can adjust more or less instantly to any change in the nuclear positions: the so-called Born-Oppenheimer approximation. Because electrons are so much lighter than nucleons, this assumption is usually justified. But some clear breakdowns of the approximation are now well documented [21]. They are best known in solid-state systems [22], and in fact superconductivity is one of the consequences, resulting from a coupling of electron and nuclear motions. Such things may also happen in molecules, particularly in the photochemistry of polyatomic molecules, which have a large number of electronic states close in energy [23]; they have also been observed for simple diatomic molecules in strong electric fields [24]. As a result, the molecular degrees of freedom may become interdependent in strange ways: rotation of the molecule, for example, can excite vibration. In such situations, the very notion of an electronic state begins to crumble [21].

Embrace the fuzziness

These advances in dynamical control of quantum states amount to nothing less than a new vision of chemistry. The static picture of molecules with specific shapes and bond strengths is replaced by one of a bag of atoms in motion, which can be moulded and coaxed into behaviours quite different from those of the equilibrium species. It does not demand that we abandon old ideas about chemical bonds, nor does it truly challenge the ability of quantum theory to describe atoms and their unions. But it recommends that we view these bonds as degrees of attraction that wax and wane – or as cartoon representations of a molecule’s perpetual tour of its free-energy landscape. At a meeting in 1970, Coulson asserted that the simple notion of a chemical bond had already become lost, and that it seemed ‘something bigger’ was needed to replace it. ‘Whether that ‘something bigger’… will come to us or not is a subject, not for this Symposium, but for another one to be held in another 50 years time’, he said [25]. That moment is almost upon us.

But we needn’t fret that the ‘rules’ of bonding are up for grabs – quite the converse. While there may be some parts of science fortunate enough to be exhaustively explained by a single, comprehensive theory, this isn’t likely to be a general attribute. We are typically faced with several theories, some overlapping, some conflicting, some just different expressions of the same thing. Our choice of theoretical framework might be determined not so much by the traditional criterion of consistency with experiment as by more subjective considerations. According to Hoffmann, these preferences often have an aesthetic component, depending on factors such as simplicity, utility for ‘telling a story’ about chemical behaviour, the social needs of the community, and whether a description is productive.

As Hoffmann says, ‘any rigorous definition of a chemical bond is bound to be impoverishing’. So his advice to ‘have fun with the fuzzy richness of the idea’ seems well worth heeding.


References

1. Coulson, C. A. The Spirit of Applied Mathematics 20-21 (Clarendon Press, Oxford, 1953).
2. Jansen, M. & Wedig, U. Angew. Chem. Int. Ed. 47, 10026-10029 (2008).
3. J. Comput. Chem. special issue, 28, 1-466 (2007).
4. Cortés-Guzmán, F. & Bader, R. F. W. Coord. Chem. Rev. 249, 633 (2005).
5. Merino, G., Méndez-Rojas, M. A., Vela, A. & Heine, T. J. Comput. Chem. 28, 362-372 (2007).
6. Jensen, S. J. K. & Csizmadia, I. G. Chem. Phys. Lett. 319, 220-222 (2000).
7. Benoit, M., Marx, D. & Parrinello, M. Nature 392, 258-261 (1998).
8. Abersfelder, K., White, A. J. P., Rzepa, H. S. & Scheschkewitz, D. Science 327, 564-566 (2010).
9. Lewis, G. N. J. Am. Chem. Soc. 38, 762 (1916).
10. Pauling, L. The Nature of the Chemical Bond (Cornell University Press, Ithaca, 1939).
11. Hoffmann, R., Shaik, S. & Hiberty, P. C. Acc. Chem. Res. 36, 750-756 (2003).
12. Alvarez, S., Hoffmann, R. & Mealli, C. Chem. Eur. J. 15, 8358-8373 (2009).
13. Brown, H. C. The Nonclassical Ion Problem (Springer, Berlin, 1977).
14. Schmidbaur, H. Gold Bull. 13, 3-10 (2000).
15. Pyykkö, P. Chem. Soc. Rev. 37, 1967-1997 (2008).
16. Schmidbaur, H., Cronje, S., Djordjevic, B. & Schuster, O. Chem. Phys. 311, 151-161 (2005).
17. Katz, M. J., Sakai, K. & Leznoff, D. B. Chem. Soc. Rev. 37, 1884-1895 (2008).
18. Isaacs, E. D. et al., Phys. Rev. Lett. 82, 600-603 (1999).
19. Arunan, E. et al., ‘Definition of the hydrogen bond’, recommendation submitted by IUPAC task group 2004-026-2-100, October 2010. See http://media.iupac.org/reports/provisional/abstract11/arunan_310311.html
20. Crim, F. F. Science 249, 1387 (1990).
21. Sukumar, N. Found. Chem. 11, 7-20 (2009).
22. Pisana, S. et al., Nature Mater. 6, 198-201 (2007).
23. Worth, G. A. & Cederbaum, L. S. Ann Rev. Phys. Chem. 55, 127-158 (2004).
24. Sindelka, M., Moiseyev, N. & Cederbaum, L. S., Preprint http://www.arxiv.org/abs/1008.0741.
25. Coulson, C. A. Pure Appl. Chem. 24, 257-287 (1970).

Water mess

I could say a lot about this murky business, but won’t. Michael Banks has done a good job of presenting the facts here, as far as I (as one of the organizing committee) can tell. None of us knows quite what is going to come of it all, except that it seems unlikely that the Nobel decision will be changed. It seems to set a troubling precedent. But if nothing else, it seems to confirm how woefully vulnerable water research is to outbreaks of a pathological nature.

Sunday, January 02, 2011

The Year of Chemistry (but some physics and biology too)

I seem to have ended 2010 with a little cluster of articles here and there. In Physics World I have a feature on single-molecule sequencing of DNA using nanopores – an exciting area that I’m now convinced is going to pay off some time soon, and which will demonstrate that advances in understanding of biology still frequently hinge on the technical capability that physics and chemistry supply. Oddly the December issue of Physics World seems still not to be in circulation or live online, but there’s a preview of the piece here. In Nature I have a couple of pieces to mark the Year of Chemistry in 2011 – an In Retrospect perspective on Linus Pauling’s classic text The Nature of the Chemical Bond and, as the main course to that hors d’oeuvre, an article on changing views of the chemical bond. The first of these is the first item below (the long version, with material that was rightly cut for the published version); the second is too long for that, but will appear in this week’s issue of Nature. I have a follow-up on the Peter Debye story below as my Crucible column in the January Chemistry World; that’s the second item below. And finally, I have a piece in New Humanist that trails my next book Unnatural, coming out in February, which picks up on the forthcoming production of Frankenstein at the National Theatre, directed by Danny Boyle. I’m greatly looking forward to that performance, and hope to be reviewing it for Nature. The NH piece is graced by one of Martin Rowson’s fabulous illustrations – worth the cover price for this alone.

And Happy New Year to everyone.

***********************

Linus Pauling’s The Nature of the Chemical Bond has, like Newton’s Principia or Darwin’s Origin of Species, the kind of legendary status that is commonly deemed to obviate any obligation to read it. Every chemist learns of its transformative role in uniting the prevailing view of molecules as assemblies of atoms with the new quantum-mechanical picture of atomic wavefunctions. But the book is long, by chemists’ standards mathematical, and anyway we now know that there are more versatile and useful approaches to the quantum bond than Pauling’s.

Yet Pauling’s book remains a good primer on the basic facts of chemical bonding – impressive for a book almost 70 years old. That’s not to say that the book should be more widely read – there are naturally better and more relevant treatments of the subject now, and The Nature of the Chemical Bond does not benefit from the elegant prose of Darwin’s works – but it is still bracing to do so. The best preparation is to look first at what more or less contemporary textbooks have to say about bonding. To take two random examples: Inorganic Chemistry (Macmillan, 1922), by the eminent T. Martin Lowry, professor of physical chemistry at Cambridge, barely gets beyond John Dalton’s symbolic ‘ball’ molecules and the Law of Multiple Proportions (elements combine in simple ratios); Outlines of Physical Chemistry (16th edn, Methuen, 1930) by George Senter of Birkbeck College, a student of Wilhelm Ostwald and Nernst, doesn’t even mention the chemical bond but speaks in terms of affinities. They are products of the nineteenth century.

It’s true that this is not entirely representative, for by then the problem of how to describe the chemical bond was already being shaped by atomic physics. The English chemist Edward Frankland introduced the term in 1866, but regarded it not as a physical connection, as implied by the practice then common of drawing lines between elemental symbols, but as a kind of force akin to that which binds the solar system. Berzelius suspected that this force was electrostatic: the attraction of oppositely charged ions. That view seemed favoured by J. J. Thomson’s discovery of the electron in 1897, since ions could result from an exchange of electrons between nuclei.

But Gilbert Lewis, another Nernst protégé at the University of California at Berkeley, argued that bonding results instead from sharing, not exchange, of electrons. More precisely, this gives rise to what Irving Langmuir later called a covalent bond, as opposed to the ionic bond that comes from electron exchange. In 1916 Lewis outlined the view that atoms are stabilized by having a full ‘octet’ of electrons, visualized as the corners of a cube, and that this might come about by sharing vertices or edges of the cubes. Langmuir popularized (in Lewis’s view, appropriated) this model, which seemed vindicated when Niels Bohr explained how the octets arise from quantum theory, as discrete electron shells.

Yet this remained a rudimentary grafting of quantum theory onto the notions that chemists used to rationalize molecular formulae. Pauling, a supremely gifted young man from a poor family in Oregon who won a scholarship to the prestigious California Institute of Technology in 1922, was convinced that chemical bonding needed instead to be understood from quantum first principles. He wasn’t (as sometimes implied) alone in that – in particular, Richard Tolman at Caltech held the same view. Pauling had a golden opportunity to develop the notion, however, when in 1926 a Guggenheim scholarship allowed him to travel to Europe to visit the architects of quantum theory: Bohr at Copenhagen, Arnold Sommerfeld at Munich and Erwin Schrödinger at Zurich. He also met Fritz London and Walter Heitler, who in 1927 published their quantum-mechanical description of the hydrogen molecule. Here they found an approximate way to write the wavefunction of the molecule which, when inserted into the Schrödinger equation, allowed them to calculate the binding energy, in reasonable agreement with experiment.

Pauling expanded this treatment to the molecular hydrogen ion H2+, and generalized it into a description called the valence-bond model. He considered that if the wavefunction that offers the lowest energy turns out to be a combination of the wavefunctions of two or more structures, the molecule can be considered to ‘resonate’ between those structures. The molecule is then stabilized by ‘resonance energy’. “It is found that there are many substances whose properties cannot be accounted for by means of a single electronic structure of the valence-bond type, but which can be fitted into the scheme of classical valence theory by the consideration of resonance among two or more such structures.” For example, the H2+ ion can be considered a resonance between HA+·HB and HA·HB+: the electron resonates between the two nuclei.
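
In modern textbook notation (a sketch of the idea, not Pauling’s own formulation): if ψA and ψB are atomic wavefunctions placing the electron on nucleus A or on nucleus B, the resonating combinations for H2+ are

```latex
\psi_{\pm} = \frac{\psi_A \pm \psi_B}{\sqrt{2(1 \pm S)}},
\qquad
E_{\pm} = \frac{H_{AA} \pm H_{AB}}{1 \pm S}
% S is the overlap integral of psi_A and psi_B; the cross term H_AB
% supplies the 'resonance energy' that stabilizes the bonding combination.
```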

Pauling also showed in a paper of 1928 how the bonding in molecules such as those of four-valent carbon can be explained in terms of the concept of ‘hybridization’, in which atomic electron orbitals (here the so-called 2s and three 2p orbitals) are ‘mixed’ into hybrid orbitals with a new geometric distribution in space: for carbon, they give rise to four sp3 orbitals which create a tetrahedral covalent bonding arrangement. These ideas were published in a series of papers in 1931 in the Journal of the American Chemical Society that formed the core of The Nature of the Chemical Bond. The book went through three editions, the last appearing in 1960. The scope of the book is breathtaking: it brings multiple, ionic, metallic and hydrogen bonds all within the framework, and explains how the ideas fit with observations of bond lengths and ionic sizes in X-ray crystallography, the technique that Pauling studied from the outset at Caltech and which eventually led to his seminal work in the 1950s on the structure of proteins and nucleic acids.
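
The four sp3 hybrids can be written out explicitly (the standard construction, included here for concreteness):

```latex
h_1 = \tfrac{1}{2}\,(s + p_x + p_y + p_z), \qquad
h_2 = \tfrac{1}{2}\,(s + p_x - p_y - p_z),
h_3 = \tfrac{1}{2}\,(s - p_x + p_y - p_z), \qquad
h_4 = \tfrac{1}{2}\,(s - p_x - p_y + p_z)
% Four equivalent, mutually orthogonal orbitals pointing to the corners
% of a tetrahedron - the geometry of four-valent carbon.
```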

Pauling acknowledges in his book that it is a bit arbitrary to divide up the bonding into particular, resonating configurations of nuclei and electrons; but he says we do that all the time. “The description of the propane molecule as involving carbon-carbon single bonds and carbon-hydrogen single bonds is arbitrary; the concepts themselves are idealizations.” The wavefunction is all that really matters.

It is one thing to say it, however, and quite another to accept this arbitrariness in the face of an alternative. In the late 1920s, Robert Mulliken at the University of Chicago and Friedrich Hund in Göttingen devised a different quantum description of chemical bonding which approximated the electron wavefunctions in another way, giving rise to ‘molecular orbitals’ in which electrons were considered to be distributed over several nuclei. This model gave a rather simpler picture for explaining molecular electronic spectra: the quantum energy levels of electrons. What is more, it could offer a single description of some molecules for which the valence-bond approach needed to invoke resonance between a great many discrete structures. This was especially true for aromatic molecules such as benzene: the VB model needed something like 48 separate structures for naphthalene, and, in the case of ferrocene described in the 3rd (1960) edition of The Nature of the Chemical Bond, no fewer than 560. Evidently, while neither the MO nor VB models could lay claim to being more fundamental or ‘correct’, the former had significant advantages from a practical point of view. This was suspected even when Pauling’s book first appeared – some reviewers criticised him for not mentioning the rival theory, while one suspected that the VB method might triumph purely because of Pauling’s superior presentational skills. Pauling himself never accepted that MO theory was generally more useful, although it was the consensus among chemists by the 1970s.

The significance of The Nature of the Chemical Bond was not so much that it pioneered the quantum-mechanical view of bonding – London and Heitler had done that – but that it made this a chemical theory, a description that chemists could appreciate rather than an abstract physical account of wavefunctions. It recognized that, for a mathematical model of physical phenomena to be useful, it needs to accommodate itself to the intuitions and heuristics that scientists need in order to talk coherently about the problem. Emerging from the forefront of physics, this was nevertheless fundamentally a book for chemists.

**********************

In Kurt Vonnegut’s 1961 novel Mother Night, an American writer named Howard Campbell is brought to trial for his crimes as a Nazi propagandist during the Second World War. The apolitical Campbell decided to remain in Germany after Hitler came to power in 1933, where he is persuaded to make English radio broadcasts of Nazi propaganda. But he has also been enlisted by an operative of the US War Department to lace his broadcasts with intelligence messages coded in coughs and pauses. This role is never made public, and Campbell is constantly threatened with exposure of his ‘Nazism’ while trying to lead an anonymous life post-war in New York.

It would be unwise to stretch too far any parallels with the life of Peter Debye, the Dutch physical chemist who won the 1936 chemistry Nobel for his work on molecular structure and dipole moments. But Mother Night came to my mind after hearing the latest suggestion that Debye, who has been reviled in the past for alleged collaboration with the pre-war Nazi regime, might have been passing on information about German war technology to a spy for the British secret service in Berlin.

The evidence for that, outlined in a paper by retired chemist Jurrie Reiding after consulting Debye’s archival documents in America, is extremely circumstantial [1]. Debye was a lifelong friend of Paul Rosbaud, an Austrian chemist who hated the Nazis and spied for the Allies during the war under the codename ‘Griffin’. Reiding says that such a friendship would be inconceivable if Debye was a Nazi sympathizer. But there are no more than vague hints about whether Debye was actually one of Rosbaud’s informants.

Debye’s links with Nazism were asserted in a 2006 book Einstein in Nederland by the Dutch journalist Sybe Rispens, and were outlined in an article ‘Nobel Laureate with dirty hands’ published in a Dutch periodical in connection with the book. Here Rispens explained (as already known to historians) that Debye, as president of the German Physical Society (DPG), had signed a letter in 1938 expelling Jews from the society. Panicked by the media exposé, the University of Utrecht removed Debye’s name from its institute for nanomaterials science, while the University of Maastricht withdrew from an annual research prize named after Debye.

A follow-up report on the matter commissioned by the Netherlands Institute for War Documentation (NIOD) changed the accusation of collaboration to one of ‘opportunism’, and the decisions of both universities have now been reversed. But Debye’s name remained tainted in the Netherlands, despite protestations from many scientists both in Europe and in the US, where Debye worked at Cornell University after leaving Germany in 1940.

There’s good reason to think that Debye was no friend of the Nazis. He collected his Nobel prize against their expressed wishes, and they thought him far too friendly to the Jews in his role as DPG president. Indeed, he even – with Rosbaud’s assistance – helped the Jewish nuclear physicist Lise Meitner flee Germany.

And yet why did he stay in Germany so long, when others left? Roald Hoffmann at Cornell has argued that this inevitably taints Debye’s reputation. ‘In the period 1933-39’, he says, ‘Debye took on positions of administration and leadership in German science, aware that such positions would involve collaboration with the Nazi regime. The oppressive, undemocratic, and obsessively anti-Semitic nature of that regime was clear. Debye chose to stay and, through his assumption of prominent state positions within a scientific system that was part of the state, supported the substance and the image of the Nazi regime.’

Clearly Debye’s story is not one of heroic self-sacrifice; the issue is rather where mild resistance blends into passive collusion. Cornelis Gorter, a physicist at Leiden University who knew Debye well, said that (like Howard Campbell) ‘he was not at all a Nazi sympathizer but was apolitical.’ Yet it seems that, also like Campbell, his deeds can tell quite different narratives viewed from different perspectives. The accusation of opportunism in the NIOD report came largely because, having occupied positions of power in Nazi Germany, Debye went on to serve the US war effort enthusiastically, for example through his work on synthetic rubber. That could suggest ingratiating collaboration with any ruling power, but it also fits the picture of Debye striving to limit Nazi abuses before finally fleeing to oppose them more openly.

This situation is reminiscent also of the controversy about Werner Heisenberg, memorably explored in Michael Frayn’s play Copenhagen. Did Heisenberg actively drag his heels to thwart the Nazi efforts to make an atomic bomb, or did he simply get the physics wrong? Did he even know his motives himself? And if not, how can we hope to?

A clue to Debye’s position may lie in a letter he wrote to the physicist Arnold Sommerfeld just before he left Germany for good. His aim, he said, was ‘not to despair and always be ready to grab the Good which whisks by, without granting the Bad any more room than is absolutely necessary. That is a principle of which I have already made much use.’

But maybe the real moral is the one that Vonnegut adduced for Mother Night: ‘We are what we pretend to be, so we must be careful about what we pretend to be.’

1. Reiding, J. Ambix 57, 275-300 (2010).