Self-replicating (macro-)bots

A brief communication in this week’s Nature reports (macroscopic) machines that autonomously self-replicate from randomly positioned components. The work is from Saul Griffith and colleagues at MIT’s Media Lab. Self-replication proceeds in two stages: a recognition step, in which two correctly oriented building blocks that randomly collide latch together, and an error-correction step, in which incorrectly joined sub-units are separated.

A movie (9 MB Quicktime movie – I think this works without a subscription) shows the self-replication happening. The blocks are placed on an air-table and agitated to bring them randomly into contact with each other. Of course, this is just a macroscopic analogue of the random Brownian motion that is so important at the nanoscale. It’s interesting to compare this with another, much-publicised example of a macroscale self-replicating system reported earlier this year in Nature. In that case, self-replication was a deterministic process that relied on its components being supplied in a well-ordered way. Griffith’s approach (see his website for more context) consciously mimics the self-assembly processes used in biological systems; the architecture of the structure is encoded in each of its components, and assembly depends on random interactions between those components. The combination of these two features is what gives this approach its huge potential advantage over more deterministic techniques – the potential to make the process massively parallel and inherently scalable. This is fascinating and thought-provoking work.
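The statistical logic of the two-stage process is simple enough to caricature in a toy simulation. This is purely my own illustration – the block counts and the collision, mis-joining and correction probabilities (`n_blocks`, `p_collide`, `p_wrong`, `p_correct`) are invented, not taken from Griffith’s system:

```python
import random

def simulate(n_blocks=200, n_steps=5000, p_collide=0.5,
             p_wrong=0.3, p_correct=0.9, seed=1):
    """Toy model of two-stage self-replication: random collisions
    latch free blocks into pairs (recognition), and wrongly joined
    pairs are broken apart again (error correction)."""
    random.seed(seed)
    free, good, bad = n_blocks, 0, 0
    for _ in range(n_steps):
        # Recognition: a random collision between two free blocks
        if free >= 2 and random.random() < p_collide:
            free -= 2
            if random.random() < p_wrong:
                bad += 1    # mis-oriented blocks latch incorrectly
            else:
                good += 1   # correctly oriented blocks latch for good
        # Error correction: an incorrect join is undone, freeing both blocks
        if bad > 0 and random.random() < p_correct:
            bad -= 1
            free += 2
    return good, bad, free

good, bad, free = simulate()
print(good, bad, free)  # nearly all blocks end up in correct pairs
```

Because correct joins are (in this caricature) absorbing while incorrect ones are reversible, the system drifts inexorably toward the correct structure – the same statistical trick that makes error-corrected self-assembly work at the nanoscale.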

More on Nanojury UK

Here are a few more links about Nanojury UK, the citizens’ jury on nanotechnology which has just reported its verdict.

The press release about the results, from Greenpeace.
An article about it, from the German newspaper die Tageszeitung (in German). Thanks to author Niels Boeing for letting me know about this.
Detailed commentary on the results and the launch day from David Berube (research director of NanoScience and Technology Studies at the University of South Carolina).

Finally, here’s a complete list of my posts on the process as it unfolded:
The launch
Week 1
Week 3
Finalising the conclusions
The verdict.

Nanojury UK – the final verdict

Nanojury UK – the citizens’ jury on nanotechnology that has been deliberating over the summer – delivered its verdict on Wednesday at an event in London. In full, there were twenty recommendations, which attracted various degrees of support. But at the launch, four jurors attended in person, and they singled out the four recommendations they believed the whole jury felt most strongly about. After presenting these four key recommendations, they took questions from a large audience, and then the sponsors of the process gave their reactions.

The four recommendations were:
1. Health – nano-enabled medicines had big potential for reducing the time people spent in hospital. These should be developed via improved funding mechanisms and should be available without discrimination on the National Health Service.
2. The Government should support those nanotechnologies that bring jobs to the UK by investment in education, training and research.
3. Scientists should learn to communicate better – some of the jury sometimes felt patronised, they didn’t like all the long words scientists used, and scientists didn’t always agree with each other.
4. Products containing manufactured nanoparticles should be labelled in plain English.

The questions threw up some interesting insights. The most direct and straightforward came from the Guardian reporter: after this process, what was their general impression of nanotechnology? All four were in agreement: if safety could be assured, they were very positive. Another journalist asked them what they felt were the most exciting applications, and again they agreed, on medicine and renewable energy. A Greenpeace representative asked them a rather leading question about whether they would agree with the proposition, which he claimed many scientists held, that if the public only understood the science they would support it. They answered this by saying that as they learned about the science, they got excited about it and talked about it to their friends. One juror told a story about his daughter at school, whose class was asked about nanotechnology. She said “oh, yes, I know loads about nanotechnology”, to which the teacher replied along the lines of “how can you know about that, your dad’s just a taxi driver”, whereupon she was able to say that her father was taking part in this citizens’ jury and was telling her all about it.

One thing was absolutely clear – the jurors were tremendously positive about the process itself. They even managed to say some positive things about the scientists involved, despite conclusion 3. One juror rather accurately identified the problem with the upstream nature of the process, commenting that “some of this stuff is so far ahead that even the scientists aren’t sure where it is going”. This positive view chimed well with the independent evaluation made by Nick Pidgeon, a social scientist from UEA who assessed the ill-fated GM Nation project. His view was also very positive, and he noted as good features the very representative jury, the very strong multi-stakeholder oversight panel, and the direct link into government. He noted as a challenge for the upstream approach precisely the problem that the juror had pointed out.

From the sponsors, Mark Welland, from the Cambridge Nanotechnology IRC, talked a lot about the importance of the integrity of the process, and pronounced himself very satisfied with this. Doug Parr, from Greenpeace, sounded a slight air of disappointment. He didn’t think the recommendations reflected the richness of the discussions, and he noted the importance of discussing, beyond the technology itself, the wider issues of economics and the disconnects between science, government, industry and the public. He noted that there had been no mention of the idea of a moratorium on the new technology. I should note here, of course, that Jim Thomas, of the ETC group, which has been calling for a moratorium, was one of the witnesses and presented the case for one to the jury.

For the Government, the reaction was given by Adrian Butt, Chair of the Nanotechnology Issues Dialogue Group, the multi-department body set up to coordinate nanotechnology policy across government. He gave an explicit commitment to table the recommendations in the policy meetings of the NIDG and report back the outcome of discussions. He seemed really rather pleased with the outcome, which he took as being not far from an endorsement of the approach the government was taking. Nonetheless, he did exercise a certain amount of “expectations management” about how seriously the government would take this. In his words, “the results of this kind of exercise will not by themselves directly determine policy, but will provide social intelligence on the wider environment in which policy is made”.

For nanobusiness, Barry Park, COO of Oxonica, expressed broad comfort with the balanced tone of the recommendations.

What of my personal recollections and feelings? I found it one of the most stressful things I’ve done in my career. I have massive admiration for Becky Willis, who chaired the oversight panel and kept the whole thing together in the face of what seemed at times overwhelming centrifugal forces (I composed one unsent resignation letter, and I suspect I wasn’t the only one who came close to walking out on the whole thing). The facilitators have immense power in this kind of exercise, and I ended up with immense respect for the professional effectiveness of Tom Wakeford and his team. But Tom has his own strong political views, which, as he conceded in his own self-critique, he doesn’t always rigorously exclude from the process, and these aren’t calculated to make life easy for the scientists. It would be impossible for me not to take the criticisms of scientists’ communication skills personally, but I honestly don’t think the scientific witnesses should have done anything differently. I think the jurors got a very honest, unspun and unvarnished impression of the science, and in return I found the interactions with them very rewarding.

At the end of it all, one disappointment was the very low level of press coverage – this perfunctory piece in the Guardian was the only thing in the nationals. There are some mitigating circumstances for the lack of press interest – the fact that the Guardian was the media sponsor limited the appeal for other papers, while the Guardian itself basically lost interest as a result of the decision to drop its weekly science section when the paper relaunched in near-tabloid format. But I can’t help feeling that there would have been a lot more coverage if the result had been different. There were approving words in an editorial in this week’s Nature (subscription required). Its conclusion is a good place to finish: “The results of the citizens’ jury suggest that nanotechnology is not perceived as a serious threat to the values of anyone but die-hard anti-technologists”.

Nanotechnology Engagement Group

I was in London on Monday for the first meeting of the Nanotechnology Engagement Group (NEG), a body funded by the UK government to coordinate activities around public engagement and the discussion of social and ethical issues in the context of nanotechnology. The establishment of the body was announced in a rather low-key way in the summer, when the government issued its draft strategy for public engagement on nanotechnologies. The group is being run by the think-tank Involve, and I’m chairing it.

Here are a few first impressions, mostly of the potential pitfalls that it’s easy to imagine this enterprise falling into. The first is that it might cement the trend already identified by Demos, and contribute to a simultaneous professionalisation and marginalisation of the public engagement field. One can easily imagine NEG developing as a forum in which the professionals cheerfully discuss at length the methodological advantages of citizens’ juries against consensus conferences or focus groups, while failing to make any real impact either on the development of science policy or on the wider public discourse about technology as it’s carried out through the media.

The second is the tension that exists between the idea of public engagement and the idea of “engaging stakeholders”. A very popular way of doing some sort of wider consultation about something like technology is to assemble a bunch of “stakeholders” – regulators, industry groups, consultancy organisations, and advocacy groups. I have deep worries about the representativeness of such groups on all sides. There’s an unwillingness of the private sector to put its collective head above the parapet, on the one hand, and on the other there’s a tendency to assume that NGOs, sometimes representing very narrow constituencies, have a mandate to represent the concerns of a wider public. It’s tempting to view the results of such consultations as being much more representative than they are; when so many people are unwilling or unable to speak, the voice of anyone who is willing and motivated to say anything at all ends up carrying far too much weight. This, to my mind, is one of the main strengths of processes like citizens’ juries – done well, they should yield something that represents the views of the public much more accurately than an advocacy group does.

Finally, there is the question of what the public, in these engagement exercises, are actually being asked to decide on. The drawback of this kind of upstream engagement is that it is not clear what the outcomes of the technology might be. Maybe we need to start doing some serious scenario construction to try and present a range of plausible futures to focus the discussion down a bit.

All these issues come into sharp focus with the launch of the findings of Nanojury UK (see here for previous reports on this), which took place today at the headquarters of the Guardian. I’ll be writing my impressions about the launch event tomorrow.

‘Twas on the good ship Venus…

If you’ve enjoyed the bout of transatlantic name-calling that my piece on public engagement produced (generally along the well-worn lines of Europeans from Venus versus Martian Americans), you might want to look at this exchange on the Foresight Institute’s Nanodot blog. Here Foresight VP Christine Peterson enthusiastically agrees with my not wholly serious suggestion that the origin of the UK’s aversion to the positive vision of Drexlerian nanotechnology can be traced to the generally pessimistic and miserabilist disposition of the inhabitants of this rain-sodden archipelago, and I desperately try and extract myself from the hole I’ve dug myself into.

Model Railways

I’ve been in Leeds for a few days for the biennial conference of the Polymer Physics Group of the UK’s Institute of Physics. Among many interesting talks, the one that stood out for me was the first – an update from Andrew Turberfield on his efforts to make a molecular motor from DNA.

Turberfield, who is at the Oxford IRC in Bionanotechnology, is building on the original work from Ned Seeman, exploiting the remarkable self-assembling properties of DNA to make nanoscale structures and devices. A few years ago, Turberfield, working with Bernie Yurke at Lucent Bell Labs, designed and built a DNA nano-machine (see here for a PDF preprint of the original Nature paper), and in 2003 they published a paper describing a free-running motor powered by the energy released when two complementary strands of DNA meet to make a section of double helix (abstract here).

This motor doesn’t actually do anything, apart from sit around in solution cyclically changing shape. What Turberfield wants to do now is make something a bit like the linear motors common in cell biology, in which the motor molecule moves along a track, often carrying a cargo. To make this kind of molecular railway, Turberfield’s scheme is to prepare a track along a surface by grafting strands of DNA to it. The engine is another DNA molecule; what remains to be done is to devise a scheme whereby the engine molecule is systematically passed along from strand to strand.

His first effort, in collaboration with Duke University’s John Reif, involves using enzymes to alternately cut DNA strands and rejoin them in a sequence that has the effect of making a short strand of DNA move linearly in one direction. In this case, it’s the energy used by the enzyme that joins two bits of DNA that makes the motor run. The full paper is here (PDF). In motor mark 2, it’s a so-called nicking enzyme that makes the engine move, and the directionality is imposed by the fact that the track is destroyed in the wake of the engine (abstract here, subscription probably required for full article). What Andrew really wants to do, though, is have a motor that is solely powered by the energy released when DNA strands make a helix, which doesn’t chew up the track behind it, and which doesn’t involve the use of any biological components like enzymes. He has a scheme, and he is confident that it’s not far off working.

These motors are inefficient and slow in their current form. But they are important, because they work on the same basic principles as biological motors, principles which are very different from the mechanical principles that underlie the motors we are familiar with. They rely on the Brownian motion and stickiness of the nanoscale environment. But because of the simplicity of the base-pair interaction, the calculations you need to do to predict whether a motor will work are tractably simple. By learning to make model railways from these simple, modular components, we’ll learn the design rules that will enable us to make a wider variety of practical nanoscale motors.
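To give a flavour of how simple those base-pairing estimates can be, here is a back-of-envelope sketch. It is my own illustration: the per-base free energies in `PAIR_DG` and the strand sequences are invented round numbers, not the nearest-neighbour parameters a real design calculation would use:

```python
# Rough per-base-pair hybridisation free energies, kcal/mol near 37 C.
# G-C pairs (three hydrogen bonds) bind more strongly than A-T (two).
# Illustrative round numbers only, not real thermodynamic parameters.
PAIR_DG = {'A': -1.0, 'T': -1.0, 'G': -2.0, 'C': -2.0}

def duplex_dg(strand):
    """Approximate free energy released when this strand pairs with its
    full complement (more negative means a more stable duplex)."""
    return sum(PAIR_DG[base] for base in strand.upper())

# A DNA motor steps forward when forming the new, longer duplex with a
# fuel strand releases more energy than the duplex holding it in place.
foothold = "ACGTACGTAC"        # hypothetical strand tethering the engine
fuel = "ACGTACGTACGTACGT"      # longer fuel strand that can displace it
print(duplex_dg(foothold))     # -15.0
print(duplex_dg(fuel))         # -24.0
```

Because the fuel duplex is the more stable of the two, the forward step is downhill in free energy – the motor is paid in base pairs, which is exactly the accounting that makes these designs predictable.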

Uncertainties about public engagement

The thinktank Demos has released another report on science and public engagement. The Public Value of Science is, in some ways, a follow up to their earlier pamphlet See-through Science. But whereas the earlier report was rather confident in its diagnosis of the failings of previous attempts to engage the public in science, and in its prescription of a new type of “upstream engagement”, the new report seems much more uncertain in its tone.

On the face of it, this is odd, because the news seems good. There is no evidence of any growing crisis in public confidence in science; on the contrary, the report quotes a recent opinion poll from the UK which found that “86 per cent of people think science ‘makes a good contribution to society’ – up 5 per cent on two years ago”. And the idea of “upstream engagement” is riding high in fashionability, both in government and among the scientific great and good. Nonetheless, there seems to be a nagging worry, a sense that this conversion to real public engagement is only skin deep. It’s true that there’s been some open opposition (for example from Lord Taverne’s organisation, Sense about Science), but this seems to worry Demos less than the feeling that all the attention paid to public engagement still amounts to little more than lip-service, leading to “a well-meaning, professionalised and busy field, propelled along by its own conferences and reports, but never quite impinging on fundamental practices, assumptions and cultures”.

I think they are quite right. The danger they have identified is that all this activity about public engagement still isn’t actually pulling the levers they need to operate to achieve their ambition, which is to steer the direction of the research enterprise itself. The next phase is to work on what they call the “software” of scientific engagement – “the codes, values and norms that govern scientific practice, but which are far harder to access and change”. This is a much more difficult matter than simply setting up a few focus groups and citizens’ juries. In essence, their aim here is to use the input from this kind of deliberative process to redefine the way the scientific community defines “good science”.

This kind of cultural shift isn’t entirely unprecedented. In fact, I’ve argued myself that the rise of nanoscience itself constitutes just such a shift; in this case the definition of good science swung away from testing theories and characterising materials, and towards making widgets or gizmos. But the process of change is difficult, unpredictable and hard to control. It’s not about the Minister for Science issuing a rational order to his obedient research councils; the process is probably closer to the way fashions spread among sub-teenagers. The editors of Nature and Science, like the editors of Smash Hits, might think they have some influence, but they’re at the mercy of the social dynamics of the playground. One obvious difficulty is that the values of the scientific enterprise are now highly globalized. All over the world scientists aspire to publish the same kinds of paper in the same journals, and to be invited to the same conferences. Another difficulty is the sheer self-confidence of the scientific community. Lord Broers’ Reith lectures captured the spirit exactly – paraphrasing Marx, scientists may concede that philosophers and social scientists have done something to understand the world, but scientists and technologists have a deep conviction that it is they who have changed it.

Moving to some more parochial issues, the report identifies some specific barriers that UK scientific politics puts in the way of their vision. The Research Assessment Exercise, which determines the level of baseline research funding in UK universities over a five-year period, operates on a strictly disciplinary basis, using peer review of papers describing original research. There’s been some lip-service paid to the notion that there may be valid outputs that aren’t papers in Physical Review Letters, but I’m not sure many people are going to be willing to gamble on this, and I can’t disagree with Demos’s conclusion that “it reinforces the model of the highly specialised researcher, locked in a cycle of publish-or-perish”. The research councils clearly see some of the problems and are starting some useful initiatives, but they’re hampered by the difficulty that the different councils have in working cooperatively. The big picture, though, is that there are precious few career incentives for scientists to divert their efforts in this way, and quite a few significant disincentives.

The big weakness in the Demos analysis, in my view, is its failure to address the power of the market. The authors are very equivocal about the growing emphasis on the commercialisation of university-generated research. Agreeing that in principle this is a good thing, they nonetheless report “growing disquiet among university scientists that the drive for ever closer ties with business is distorting research priorities”, and worry about the effects of this on the openness and integrity of the research process. All these are valid concerns, but what’s missing is a recognition that the market is now the predominant mechanism by which technology impacts on society. Demos says “We believe everyone should be able to make personal choices in their daily lives that contribute to the common good”. The truth is, the way society is set up now, what people buy is one of the major ways in which these choices are made. And the messages that people send through the market by these personal choices might well differ from the messages they would send if you asked them directly. If you ask a bunch of young people where they would like to see money spent to develop nanotechnology, they might well answer that they’d like to see it spent on improving the environment and on ending world poverty; but if they then go and spend their money on iPods and personal care products, their votes are effectively cast for quite different priorities.

This isn’t to say that the market is a very efficient way of setting research priorities – far from it. At the moment we have marketing and product development people making more or less informed guesses (which often turn out to be spectacularly inaccurate) about what people are going to want to buy. Researchers, on the other hand, are obliged to try and predict some kind of application for the outcome of their research when they apply for funding, and to do this they end up trying to guess, not so much what the potential markets might be, but what they think will best match the preconceptions of referees and research councils. Somehow the idea that in ten years everyone will want flexible television sets, or personal gene-testing kits, or nutraceutical-laden yoghurts, enters and spreads through the collective mind of the research community like a Pokemon craze. This isn’t to say that these ideas are necessarily wrong; it’s just that the process by which they gain currency is not particularly well controlled or evidence-based. It’s this sort of process that sociologists of science ought to understand, but I’m not convinced they do.

On the road again

I’m sorry that I’ve left my blog unattended for a few days; I went away and forgot that I’d changed the blog’s password, so I couldn’t get to it from my laptop.

I’ve been doing a whistle-stop tour of the Celtic capitals – first to Dublin for the meeting of the British Association, where I appeared in a panel discussion about whether we should use nanotechnology for human enhancement. Then to Edinburgh, where the EuroNanoForum was discussing nanotechnology and the health of the European citizen. I gave a talk in the session on converging technologies, recorded an interview for French radio, and went to an interesting session on public engagement, after which I had the pleasure of meeting my fellow nano-blogger, David Berube. Then, over a supper of haggis, neeps and tatties, I was subjected to what I thought was a rather aggressive interrogation from some of my fellow European citizens about the quality of the British contribution to international food culture. I’ll post something more substantive tomorrow.

Farewell to Nanobot

Howard Lovy announced last week that he’s drawing a line under his popular and entertaining blog Howard Lovy’s Nanobot. I guess this is the natural consequence of his transition from nanobusiness gamekeeper to poacher, with his new post as Director of Communications at the nanotechnology company Arrowhead Research. I’ve never met Howard in person, though I’ve felt I’ve got to know him through exchanges on our respective blogs and through some email correspondence; I’m delighted that he’s found a niche to use his talents in the nanotechnology sector and I wish him all the best in this new phase of his career.

I’ll miss Nanobot. I certainly didn’t agree with everything Howard said, and I wish he’d got to understand the scientific community better. But it’s been a provocative and interesting read, and its emphasis on the way the idea of nanotechnology is being interpreted in the wider world has been helpful and salutary.

Making life from the bottom up

I wrote below about Craig Venter’s vision of synthetic biology – taking an existing, very simple organism, reducing its complexity even further by knocking out unnecessary genes, and then inserting new genetic material to accomplish the functions you want. One could think of this as a kind of top-down synthetic biology; one is still using the standard mechanisms and functions of natural biology, but one reprogrammes them as desired. Could there be a bottom-up synthetic biology, in which one designs entirely new structures and systems for metabolism and reproduction?

One approach to this goal has been pioneered by Steven Benner at the University of Florida. He’s been concentrating on creating synthetic genetic systems by analogy with DNA, but he’s not shy about where he wants his research to go: “The ultimate goal of a program in synthetic biology is to develop chemical systems capable of self-reproduction and Darwinian-like evolution.” He’s recently written a review of this kind of approach, Synthetic biology, in Nature Reviews Genetics (subscription only).

David Deamer, from UC Santa Cruz, has a slightly different take on the same problem in another recent review, this time in Trends in Biotechnology (again, subscription only, I’m afraid). “A giant step towards artificial life?” concentrates on the idea of creating artificial cells by using self-assembling lipids to make liposomes (the very same creatures that L’Oreal uses in its expensive face creams). Encapsulated within these liposomes are some of the basic elements of metabolism, such as the mechanisms for protein synthesis. How close can this approach get to creating something like a living, reproducing organism? In Deamer’s words: “Everything in the system grows and reproduces except the catalytic macromolecules themselves, the polymerase enzymes or ribosomes. Every other part of the system can grow and reproduce, but the catalysts get left behind. This is the final challenge: to encapsulate a system of macromolecules that can make more of themselves, a molecular version of von Neumann’s replicating machine.” He sees a glimmer of hope in the work of David Bartel at MIT, who has made an RNA enzyme that synthesizes short RNA sequences, pointing the way to RNA-based self-replication.

But all these approaches still follow the pattern set by the life we know about on earth; they depend on the self-assembling properties of a familiar repertoire of lipids and macromolecules, like DNA, RNA and proteins, in watery environments. Could you do without water entirely? Benner is quoted in an article by Philip Ball in this week’s Nature (Water and life: Seeking the solution, subscription required) arguing that you can: “Water is a terrible solvent for life…. We are working to create alternative darwinian systems based on fundamentally different chemistries. We are using different solvent systems as a way to get a precursor for life on Earth.”