On nanotechnology and biology

The second issue of Nature Nanotechnology is now available on-line (see here for my comments on the first issue). I think this issue is also free to view, but from next month a subscription will be required.

Among the articles is an overview of nanoelectronics, based on a report from a recent conference, and a nice letter from a Belgian group describing the placement and reaction of individual macromolecules at surfaces using an AFM. The regular opinion column this month is contributed by me, and concerns one of my favourite themes: Is it possible to use modern science and engineering techniques to improve on nature, or has evolution already found the best solutions?

Silicon and steel

Two of the most important materials underpinning our industrial society are silicon and steel. Without silicon, the material from which microprocessors and memory chips are made, there would be no cheap computers, and telecommunications would be hugely less powerful and more expensive. Steel is at the heart of most building and civil engineering, making possible both cars and trucks and the roads they run on. So I was struck, while reading Vaclav Smil’s latest book, Transforming the Twentieth Century (about which I may write more later), by some contrasting statistics for the two materials.

In the year 2000, around 846 million tonnes of steel was produced in the world, dwarfing the 20,000 tonne production of pure silicon. In terms of value, the comparison is a little closer – at around $600 a tonne, the annual production of steel was worth $500 billion, compared to the $1 billion value of silicon. Smil quotes a couple of other statistical nuggets, which may have some valuable lessons for us when we’re considering the possible economic impacts of nanotechnology.
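The contrast is easy to check with a few lines of back-of-the-envelope arithmetic (a sketch using the approximate figures quoted above):

```python
# Rough check of Smil's year-2000 statistics for steel vs. pure silicon
# (all figures are the approximate ones quoted in the text).
steel_tonnes = 846e6          # world steel output, tonnes
steel_price = 600.0           # ~$ per tonne
silicon_tonnes = 20_000       # pure silicon output, tonnes
silicon_value = 1e9           # ~$ total

steel_value = steel_tonnes * steel_price               # ~$5.1e11, i.e. ~$500 billion
silicon_price_per_kg = silicon_value / (silicon_tonnes * 1000)  # ~$50 per kg

tonnage_ratio = steel_tonnes / silicon_tonnes          # ~42,000x more steel by mass
value_ratio = steel_value / silicon_value              # but only ~500x more by value
```

The interesting number is the last pair: silicon is produced in quantities forty thousand times smaller than steel, but its value is only five hundred times smaller.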

Steel, of course, has been around a long time as a material, but it’s easy to overlook how significant technological progress in steel-making has been. In 1920, it took the equivalent of 3 hours of labour to make 1 tonne of steel, but by 1999, this figure had fallen to about 11 seconds – a one thousand-fold increase in labour productivity. When people suggest that advanced nanotechnologies may cause social dislocation, by throwing workers in manufacturing and primary industries out of work, they’re fighting yesterday’s battle – this change has already happened.
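For what it's worth, the arithmetic behind the thousand-fold figure checks out:

```python
# Labour productivity in steel-making: 3 hours of labour per tonne in
# 1920 vs. about 11 seconds per tonne in 1999.
seconds_1920 = 3 * 3600       # 10,800 seconds of labour per tonne
seconds_1999 = 11
productivity_gain = seconds_1920 / seconds_1999   # ~982: roughly a thousand-fold
```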

As for silicon, what’s remarkable about it is how costly it is given the fact that it’s made from sand. One can trace the addition of value through the production chain. Pure quartz costs around 1.7 cents a kilogram; after reduction to metallurgical grade silicon the value has risen to $1.10 a kilo. This is transformed into trichlorosilane, at $3 a kilo, and then after many purification processes one has pure polycrystalline silicon at around $50 a kilo. Single crystal silicon is then grown from this, leading to monocrystalline silicon rod worth more than $500 a kilo, which is then cut up into wafers. One of the predictions one sometimes hears about advanced nanotechnology is that it will be particularly economically disruptive, because it will allow anything to be made from abundant and cheap elements like carbon. But this example shows the extent to which the value of products doesn’t necessarily reflect the cost of the raw ingredients at all. In fact, in cases like this, involving complicated transformations carried out with high-tech equipment, it’s the capital cost of the plant that is most important in determining the cost of the product.
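To see how steep this value chain is, here's a quick sketch using the approximate per-kilogram prices quoted above (the step multipliers are illustrative):

```python
# Value added along the silicon production chain, in approximate $/kg.
# Each step multiplies the value of the material; the raw sand is a
# negligible fraction of the final cost.
chain = [
    ("quartz", 0.017),
    ("metallurgical-grade silicon", 1.10),
    ("trichlorosilane", 3.00),
    ("polycrystalline silicon", 50.0),
    ("monocrystalline silicon rod", 500.0),
]

for (name_a, price_a), (name_b, price_b) in zip(chain, chain[1:]):
    print(f"{name_a} -> {name_b}: x{price_b / price_a:.0f}")

overall = chain[-1][1] / chain[0][1]   # ~29,000-fold markup from sand to crystal
```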

Nature Nanotechnology

I’ve been meaning to write for a while about the new journal from the Nature stable – Nature Nanotechnology (there’s complete free web access to this first edition). I’ve written before about the importance of scientific journals in helping relatively unformed scientific fields to crystallise, and the fact that this journal comes with the imprint of the very significant “Nature” brand means that the editorial policy of this new journal will have a big impact on the way the field unfolds over the next few years.

Nature is, of course, one of the two rivals for the position of the most important and influential science publication in the world. Its US rival is Science. While Science is published by the non-profit American Association for the Advancement of Science, Nature, for all its long history, is a ruthlessly commercial operation, run by the British publishing company Macmillan. As such, it has recently been expanding its franchise to include a number of single subject journals, starting with biological titles like Nature Cell Biology, moving into the physical sciences with Nature Materials and Nature Physics, and now adding Nature Nanotechnology. Given the fact that just about everybody is predicting the end of printed scientific journals in the face of web-based preprint servers and open access models, how, one might ask, do they expect to make money out of this? The answer is an interesting one, in that it is to emphasise some old-fashioned publishing values, like the importance of a strong editorial hand, the value of selectivity and the role of design and variety. These journals are nice physical objects, printed on paper of good enough quality to read in the bath, and they have a thick front section, with general interest articles and short reviews, in addition to the highly selective set of research papers at the back of the journal. What the subscriber pays for (and their marketing is heavily aimed at individual subscribers rather than research libraries) is the judgement of the editors in selecting the handful of outstanding papers in their field each month. It seems that the formula has, in the past, been successful, at least to the extent that the Nature journals have consistently climbed to the top of their subject league tables in the impact of the papers they publish.

So how is Nature Nanotechnology going about defining its field? This is an interesting question, in that at first sight there looks to be considerable overlap with existing Nature group journals. Nature Materials, in particular, has already emerged as a leading journal in areas like nanostructured materials and polymer electronics, which are often included in wider definitions of nanotechnology. It’s perhaps too early to be making strong judgements about editorial policies yet, but the first issue seems to have a strong emphasis on truly nanoscale devices, with a review article on molecular machines, and the lead article describing a single nanotube based SQUID (superconducting quantum interference device). The front material makes a clear statement about the importance of wider societal and environmental issues, with an article from Chris Toumey about the importance of public engagement, and a commentary from Vicki Stone and Ken Donaldson about the relationship between nanoparticle toxicity and oxidative stress.

I should declare an interest, in that I have signed up to write a regular column for Nature Nanotechnology, with my first piece to appear in the November edition. The editor is clearly conscious enough of the importance of new media to give me a contract explicitly stating that my columns shouldn’t also appear on my blog.

The Royal Society’s verdict on the UK government’s nanotech performance

The UK’s science and engineering academies – the Royal Society and the Royal Academy of Engineering – were widely praised for their 2004 report on nanotechnology – Nanoscience and nanotechnologies: opportunities and uncertainties, which was commissioned by the UK government. So it’s interesting to see, two years on, how they think the government is doing implementing their suggestions. The answer is given in a surprisingly forthright document, published a couple of days ago, which is their formal submission to the review of UK nanotechnology policy by the Council of Science and Technology. The press release that accompanies the submission makes their position fairly clear. Ann Dowling, the chair of the 2004 working group, is quoted as saying “The UK Government was recognised internationally as having taken the lead in encouraging the responsible development of nanotechnologies when it commissioned our 2004 report. So it is disappointing that the lack of progress on our recommendations means that this early advantage has been lost.”

Nanotechnology and the food industry

The use of nanotechnology in the food industry seems to be creeping up the media agenda at the moment. The Times on Saturday published an extended article by Vivienne Parry in its “Body and Soul” supplement, called Food fight on a tiny scale. As the title indicates, the piece is framed around the idea that we are about to see a rerun of the battles about genetic modification of food in the new context of nano-engineered foodstuffs. Another article appeared in the New York Times a few weeks ago: Risks of engineering a better ice cream.

Actually, apart from the rather overdone references to a potential consumer backlash, both articles are fairly well-informed. The body of Vivienne Parry’s piece, in particular, makes it clear why nanotechnology in food presents a confusingly indistinct and diffuse target. Applications in packaging, for example in improving the resistance of plastic bottles to gas permeation, are already with us and are relatively uncontroversial. Longer ranged visions of “smart packaging” also offer potential consumer benefits, but may have downsides yet to be fully explored. More controversial, potentially, is the question of the addition of nanoscaled ingredients to food itself.

But this issue is very problematic, simply because so much of food is made up of components which are naturally nanoscaled, and much of traditional cooking and food processing consists of manipulating this nanoscale structure. To give just one example, the traditional process of making whey cheeses like ricotta consists of persuading whey proteins like beta-lactoglobulin to form nanoparticles each containing a small number of molecules, and then getting those nanoparticles to aggregate in an open, gel structure, giving the cheese its characteristic mechanical properties. The first example in the NY Times article – controlling the fat particle size in ice cream to get richer-feeling low-fat ice cream – is best understood as simply an incremental development of conventional food science, which uses the instrumentation and methodology of nanoscience to better understand and control food nanostructure.

There is, perhaps, more apparent ground for concern with food additives that are prepared in a nanoscaled form and directly added to foods. The kinds of molecules we are talking about here are molecules which add colour, flavour and aroma, and increasingly molecules which seem to confer some kind of health benefit. One example of this kind of thing is the substance lycopene, which is available from the chemical firm BASF as a dispersion of particles which are a few hundred nanometers in size. Lycopene is the naturally occurring dye molecule that makes tomatoes red, for which there is increasing evidence of health benefits (hence the unlikely sounding claim that tomato ketchup is good for you). Like many other food component molecules, it is not soluble in water, but it is soluble in fat (as anyone who has cooked an olive oil or butter based tomato sauce will know). Hence, if one wants to add it to a water based product, like a drink, one needs to disperse it very finely for it to be available to be digested.

One can expect, then, more products of this kind, in which a nanoscaled preparation is used to deliver a water or oil soluble ingredient, often of natural origin, which on being swallowed will be processed by the digestive system in the normal way. What about engineered nanoparticles that are soluble in neither oil nor water, and that have raised toxicity concerns in other contexts? These are typically inorganic materials, like carbon in its fullerene forms, or titanium dioxide, as used in sunscreen, or silica. Some of these inorganic materials are used in the form of micron scale particles as food additives. It is conceivable (though I don’t know of any examples) that nanoscaled versions might be used in food, and that these might fall within a regulatory gap in the current legal framework. I talked about the regulatory implications of this, in the UK, a few months ago in the context of a consultation document issued by the UK’s Food Standards Agency. The most recent research report from the UK government’s Nanotechnology Research Coordination Group reveals that the FSA has commissioned a couple of pieces of research about this, but the FSA informs me that it’s too early to say much about what these projects have found.

I’m guessing that the media interest in this area has arisen largely from some promotional activity from the nanobusiness end of things. The consultancy Cientifica recently released a report, Nanotechnologies in the food industry, and there’s a conference in Amsterdam this week on Nano and Microtechnologies in the Food and Healthfood Industries.

I’m on my way to London right now, to take part in a press briefing on Nanotechnology in Food at the Science Media Centre. My family seems to be interacting a lot with the press at the moment, but I don’t suppose I’ll do as well as my wife, whose activities last week provoked this classic local newspaper headline in the Derbyshire Times: School Axe Threat Fury. And people complain about scientific writing being too fond of stacked nouns.

A molecular computer that plays tic-tac-toe

I remember, when I was a (probably irritatingly nerdy) child, being absolutely fascinated by making a tic-tac-toe playing automaton out of match-boxes and beads, following a plan in one of Martin Gardner’s books. So my eye was caught by an item on Martyn Amos’s blog, reporting on a recent paper in Nano Letters (abstract and graphic freely available, subscription required for article) from a group in Columbia University, demonstrating a tic-tac-toe playing computer made, not from matchboxes or even more high-tech transistors, but from individual molecules.

The basic logic gate of this molecular computer is a single short DNA strand of a prescribed sequence which can act as a catalyst – a deoxyribozyme. Like the protein molecules used in the molecular computing and signalling operations inside living cells, these molecular logic gates operate by allostery. This is the principle that when one molecule binds to the gate molecule, the gate changes shape, making it either easier or harder for a second, different, molecule to bind. In this way you can get differential catalytic activity – that is, you can get a situation where the logic gate molecule will only catalyse a reaction to produce an output if a given input molecule is present. This simple situation would define a gate that implemented the logical operation YES; if you needed two inputs to stimulate the catalytic activity, you would have an AND gate, and if you have an AND gate whose catalytic activity can be suppressed by the presence of a third molecule, you have the logical operation xANDyANDNOTz. It is these three logical operations that are integrated in their molecular computer, which can play a complete game of tic-tac-toe (or noughts and crosses, as we call it round here) against a human opponent.
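For readers who think more naturally in code than in biochemistry, the three gate types can be abstracted as simple boolean functions (a purely illustrative sketch, not the paper's notation – in the real system "true" means an input DNA strand is present, and a true output means the gate catalyses a fluorescence-producing cleavage):

```python
# Boolean abstractions of the three deoxyribozyme gate types described
# above. Function names are illustrative, not from the paper.
def yes_gate(x: bool) -> bool:
    # Catalysis occurs only if the single input strand is present.
    return x

def and_gate(x: bool, y: bool) -> bool:
    # Both input strands are needed to activate catalysis.
    return x and y

def x_and_y_and_not_z(x: bool, y: bool, z: bool) -> bool:
    # An AND gate whose catalytic activity is suppressed by a third strand.
    return x and y and not z
```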

The Columbia group have integrated a total of 128 logic gates, plausibly describing it as the first “medium-scale integrated molecular circuit”. In their implementation, the gates were in solution, in macroscopic quantities, in a multi-well plate, and the outputs were determined by detecting the fluorescence of the output molecules. But there’s no reason in principle at all why this kind of molecular computer cannot be scaled down to the level of single or a few molecules, paving the way, as the authors state at the end of their paper, “for the next generation of fully autonomous molecular devices”.

The work was done by Joanne Macdonald and Milan Stojanovic, of Columbia University, and Benjamin Andrews and Darko Stefanovic of the University of New Mexico – there’s a useful website for the collaboration here. Also on the author list are five NYC high school students, Yang Li, Marko Sutovic, Harvey Lederman, Kiran Pendri, and Wanhong Lu, who must have got a great introduction to the excitement of research by their involvement in this project.

For Spanish speaking readers

A couple of weeks ago, Spanish television broadcast an extended interview with me by the academic, writer, and broadcaster Eduardo Punset (bio in English here). This is the interview I gave on my visit to Sevilla a few months ago. A full transcript of the interview, in Spanish, is now available on the web-site of Radio Televisión Española.

Does “Soft Machines” present arguments for Intelligent Design?

I’m normally pretty pleased when my book Soft Machines gets any kind of notice, but a recent rather favourable review of it leaves me rather troubled. The review is on the website of a new organisation called Truth in Science, whose aim is “to promote good science education in the UK”. This sounds very worthy, but of course the real aim is to introduce creationist thinking into school science lessons, under the guise of “teaching the controversy”. The controversy in question is, of course, the suggestion that “intelligent design” is a real scientific alternative to the Darwinian theory of evolution as an explanation of the origin and development of life.

The review approvingly quotes a passage from Soft Machines about the lack of evidence for how the molecular machine ATP synthase developed as evidence that Darwinian theory has difficulties. Luckily, my Darwinian credentials aren’t put in doubt – the review goes on to say “Despite the lack of hard evidence for how molecules are meant to have evolved via natural selection, Jones believes that evolution must have occurred because it is possible re-create a sort of molecular evolution ‘in silico’ – or via computer simulation. However, as more is discovered about the immense complexity of molecular systems, such simulations become increasing difficult to swallow.” This is wrong on a couple of counts. Firstly, as Soft Machines describes, we have real experiments – not in-silico ones – notably from Sol Spiegelman, that show that molecules really can evolve. The second point is more subtle and interesting. Actually, there’s a strong argument that it is in complex molecular systems that Darwinian evolution’s real power is seen. It’s in searching the huge, multidimensional conformational spaces that define the combinatorially vast number of possible protein conformations, for example, that evolution is so effective.
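To illustrate the point about the power of selection in huge search spaces, here's a toy simulation (a deliberately simple sketch in the spirit of Dawkins' "weasel" program, not a model of Spiegelman's experiments): mutation plus selection finds a 19-letter target out of 26^19 (about 10^27) possibilities in a modest number of generations, where blind random search would be hopeless.

```python
# Toy Darwinian search in a sequence space: random mutation plus
# selection of the fittest of 100 offspring per generation. Purely
# illustrative of the argument above.
import random

random.seed(1)
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
TARGET = "METHINKSITISAWEASEL"

def fitness(seq):
    # Number of positions matching the target.
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.05):
    # Each letter has a small chance of being replaced at random.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

seq = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
generations = 0
while fitness(seq) < len(TARGET):
    # Selection: keep the fittest of 100 mutant offspring.
    seq = max((mutate(seq) for _ in range(100)), key=fitness)
    generations += 1
```

The search space has roughly 10^27 points, yet selection homes in on the target in a number of generations you can count in hundreds, which is the essence of why evolution is so effective at searching vast conformational spaces.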

The review signs off with a reiteration of a very old argument about design: “In the final chapter, ‘Our nanotechnological future’, Jones acknowledges that our ‘…only true example of a nanotechnology…is cell biology…’. Could that lead to an inference of design?” Maybe, like many scientists, I have brought this sort of comment on myself by talking extensively about “Nature’s design principles”. The point, though, is that evolution is a design method, and a very powerful one (so powerful that we’re seeing more use of it in entirely artificial contexts, such as in software engineering). However, design doesn’t necessarily need a designer.

“Truth in Science” may present itself as simply wishing to encourage a critical approach to evaluating competing scientific theories, but a little research reveals the true motives of its sponsors. The first name on the Board of Directors is Andy Mckintosh, Professor of Thermodynamics and Combustion Science at Leeds University. Far from being a disinterested student of purported controversies in evolutionary theory, this interview reveals him to be a young earth creationist:
“So you believe in a world created about 6,000 years ago, cursed on account of sin, then devastated by Noah’s Flood?
“Absolutely. There’s nothing in real science (if you take all the assumptions into account) to contradict that view.”

I don’t have a problem if people want to believe in the literal truth of either of the creation stories in Genesis. But I don’t think it is honest to pretend that a belief which, in reality, is based on faith, has any relationship to science, and I think it’s quite wrong to attempt to have these beliefs insinuated into science education in publicly funded schools.

Review of David Berube’s Nanohype in Chemical and Engineering News

My review of David Berube’s book Nano-Hype: The Truth Behind the Nanotechnology Buzz has been published in Chemical and Engineering News, the magazine of the American Chemical Society.

The review (which seems to be available without subscription) is a reworked, expanded and generally better edited version of what I wrote about Nanohype earlier this year on this blog.

DNA as a constructional material

The most sophisticated exercises in using self-assembly to make nanoscale structures and machines have used, as a constructional material, the biomolecule DNA. This field was pioneered by NYU’s Ned Seeman. DNA is not exactly stuff we’re familiar with as a constructional material, though, so I don’t suppose many people have much of a feel for some of its basic mechanical properties, like its stiffness. An elegant experiment, reported in Science at the end of last year, Rapid Chiral Assembly of Rigid DNA Building Blocks for Molecular Nanofabrication (abstract free, subscription required for full article), sheds a lot of light on this question.

The achievement of this work, reported also in this Science News article, was to devise a method of making rigid DNA tetrahedra, with edges less than 10 nm in size, at high (95%) yield (previous methods of making DNA polyhedra had much lower yields than this). A model of one of these tetrahedra is shown below. But, not satisfied with just making these tetrahedra, Russell Goodman (a graduate student in Andrew Turberfield’s group at Oxford) was able to image them with an atomic force microscope and measure the response of a tetrahedron to being compressed by the AFM tip. In this way he was able to measure the spring constant of each tetrahedron.

The spring constants he found had an average of 0.18 N/m, which is reasonable in the light of what we know about the stiffness of DNA double helices. We can use this number to estimate what the stiffness – the Young’s Modulus – of the solid that would be made if you coupled together many of these tetrahedra. The precise value will depend on how the tetrahedra are linked, but a good estimate is about 20 MPa. Compared with a covalently bonded solid, like diamond (whose modulus, at around 1000 GPa, is 50 thousand times greater than that of our DNA solid), it’s very much floppier. In fact, this modulus is in the range of a relatively hard rubber, of the kind a shoe sole might be made of. On the other hand, given that the material would be mostly water, it’s pretty stiff – probably about a thousand times stiffer than Jello, which is similarly made up of a network of biopolymers in water.
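The estimate is easy to reproduce: treating the measured spring constant k as acting over a length comparable to the tetrahedron edge L gives an effective modulus E ~ k/L (a dimensional-analysis sketch, not a substitute for the proper calculation, which depends on how the tetrahedra are linked):

```python
# Back-of-the-envelope modulus estimate for a solid built from the
# DNA tetrahedra: E ~ k / L (dimensional analysis only).
k = 0.18          # measured spring constant per tetrahedron, N/m
L = 10e-9         # tetrahedron edge length, ~10 nm

E_dna = k / L                   # ~1.8e7 Pa, i.e. ~20 MPa
E_diamond = 1000e9              # diamond's modulus, ~1000 GPa
ratio = E_diamond / E_dna       # roughly fifty-thousand-fold stiffer
```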

A DNA tetrahedron

A rigid tetrahedron formed by self-assembly from DNA, figure from Goodman et al, Science 310 p1661 (2005)