Nanotechnology gets complex

The theme of my book Soft Machines is that the nanomachines of biology operate under quite different design principles from those we are familiar with at the macroscale. These design principles exploit the different physics of the nanoworld, rather than trying to engineer around it. The combination of Brownian motion – the relentless shaking and jostling that’s ubiquitous in the nanoworld, at least at normal temperatures – and strong surface forces is exploited in the principle of self-assembly. Brownian motion and the floppiness of small scale structures are exploited in the principle of molecular shape change, which is how our muscles work. We are well on our way to exploiting both these principles in synthetic nanotechnology. But there’s another design principle that’s extensively used in Nature, that nanotechnologists have not yet exploited at all. This is the idea of chemical computing – processing information by using individual molecules as logic gates, and transmitting messages through space using the random motion of messenger molecules, driven to diffuse by Brownian motion. These are the mechanisms that allow bacteria to swim towards food and away from toxins, but they also underlie the intricate way in which cells in higher organisms like mammals interact and differentiate.
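
To get a feel for what diffusive message-passing means in practice, here is a toy random-walk simulation (an illustrative sketch of my own, not anything taken from the biology):

```python
import random

random.seed(0)

def mean_squared_displacement(n_steps, n_walkers=2000):
    """Average squared distance from the origin after n_steps unit steps.

    Each walker takes one random step along +/-x or +/-y per tick -- a
    crude stand-in for the Brownian motion of a messenger molecule.
    """
    total = 0.0
    for _ in range(n_walkers):
        x = y = 0
        for _ in range(n_steps):
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x += dx
            y += dy
        total += x * x + y * y
    return total / n_walkers

# Diffusive scaling: mean squared displacement grows linearly with time,
# so typical distance covered grows only as the square root of time.
msd_short = mean_squared_displacement(100)
msd_long = mean_squared_displacement(400)
```

The point is the square-root scaling: quadrupling the number of steps roughly doubles the typical distance travelled, which is why diffusion is a perfectly good transport mechanism across a bacterium but hopeless over macroscopic distances.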

One argument that holders of a mechanical conception of radical nanotechnology sometimes use against trying to copy these control mechanisms is that they are simply too complicated to deal with. But there’s an important distinction to make here. These control systems and signalling networks aren’t just complicated – they’re complex. Recent work on the statistical mechanics of these multiply connected, evolving networks is beginning to yield fascinating insights (see, for example, Albert-László Barabási’s website). It seems likely that these biological signalling and control networks have some generic features in common with other complex networks, such as the internet, and even, perhaps, free market economies. Rather than being the hopelessly complicated result of billions of years of aimless evolutionary accretion, we should perhaps think of these networks as being optimally designed for robustness in the noisy and unpredictable nanoscale environment.
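
To make the robustness point concrete, here’s a toy numerical experiment in the spirit of Barabási’s work (all parameters are my own illustrative choices): grow a network by preferential attachment, then compare how well its largest connected component survives random node failures versus a targeted attack on the best-connected hubs.

```python
import random
from collections import defaultdict, deque

random.seed(1)

def preferential_attachment(n, m):
    """Grow a Barabasi-Albert-style network: each new node links to m
    existing nodes chosen with probability proportional to degree."""
    edges = set()
    weighted = []                 # node list with multiplicity ~ degree
    targets = list(range(m))      # seed nodes
    for new in range(m, n):
        for t in set(targets):
            edges.add((new, t))
        weighted.extend(targets)
        weighted.extend([new] * m)
        targets = random.sample(weighted, m)
    return edges

def giant_component(survivors, edges):
    """Size of the largest connected component among surviving nodes."""
    adj = defaultdict(set)
    for a, b in edges:
        if a in survivors and b in survivors:
            adj[a].add(b)
            adj[b].add(a)
    seen, best = set(), 0
    for start in survivors:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            v = queue.popleft()
            size += 1
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

n = 500
edges = preferential_attachment(n, 2)
degree = defaultdict(int)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

k = n // 10  # knock out 10% of the nodes
survivors_random = set(range(n)) - set(random.sample(range(n), k))
hubs = sorted(range(n), key=lambda v: degree[v], reverse=True)[:k]
survivors_attack = set(range(n)) - set(hubs)

gc_random = giant_component(survivors_random, edges)
gc_attack = giant_component(survivors_attack, edges)
```

Random failures barely dent the giant component, while knocking out the hubs does considerably more damage – the classic signature of the robustness (and fragility) of scale-free networks.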

It seems to me that if we are going to have nanoscale systems of any kind of complexity, we are going to have to embrace these principles. Maintaining rigid, central control of large scale systems always looks superficially attractive, but such control systems are often brittle and fail to adapt to unpredictability, change and noise. The ubiquity of noise in the nanoscale world offers a strong argument for using complex, evolved control systems. But we still lack some essential tools for doing this. In particular, biological signalling relies on allostery. This principle underlies the operation of the basic logic gates in chemical computing; the idea is that when a messenger molecule binds to a protein, it subtly changes the shape of the protein and affects its ability to carry out a chemical operation. Currently, synthetic analogues for this crucial function are very thin on the ground (see this abstract for something that seems to be going the right way). It would be good to see more effort put into this difficult, but exciting, direction.
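
As a caricature of how allostery can implement a logic gate, consider a ‘protein’ that is only active when both of two effector molecules are bound, with each binding event described by a Hill function. This is purely an illustrative sketch, not a model of any real protein:

```python
def hill(c, k=1.0, n=2):
    """Hill function: fractional occupancy of a binding site at
    effector concentration c (k = half-saturation, n = cooperativity)."""
    return c ** n / (k ** n + c ** n)

def and_gate(c1, c2):
    """Caricature of an allosteric AND gate: the 'enzyme' is only active
    when both effector sites are occupied, so its activity is the
    product of the two occupancies."""
    return hill(c1) * hill(c2)

high, low = 10.0, 0.1  # effector concentrations well above / below k
```

The product of two steep binding curves behaves like an AND gate: high activity only when both effector concentrations are high.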

Self-replicating (macro-)bots

A brief communication in this week’s Nature reports (macroscopic) machines that autonomously self-replicate from randomly positioned components. The work is from Saul Griffith and colleagues at MIT’s Media lab. Self-replication proceeds in a two-stage process: a recognition step, in which two correctly oriented building blocks latch together when they randomly collide, and an error-correction step, in which incorrectly joined sub-units are separated.

A movie (9 MB Quicktime movie – I think this works without a subscription) shows the self-replication happening. The blocks are placed on an air-table and agitated to bring blocks randomly into contact with each other. Of course, this is just a macroscopic analogue of the random, Brownian motion that is so important at the nanoscale. It’s interesting to compare this with another, much publicised, example of a macroscale, self-replicating system reported earlier this year in Nature. In that case, self-replication was a deterministic process that relied on its components being supplied in a well-ordered way. Griffith’s approach (see his web-site for more context) consciously mimics the self-assembly processes used in biological systems; the architecture of the structure is encoded in each of its components, and assembly depends on random interactions between these components. The combination of these two features is what gives this approach its huge potential advantage over more deterministic techniques – the possibility of making the process massively parallel and inherently scalable. This is fascinating and thought-provoking work.
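
The recognition-plus-error-correction logic is simple enough to caricature in a few lines of code. In this toy model (my own sketch, not anything from the paper), free parts collide at random; a correctly matched pair latches for good, while a mismatched pair is assumed to latch only transiently before the error-correction step separates it again:

```python
import random

random.seed(2)

def assemble(pool, target=("A", "B"), max_collisions=2000):
    """Toy recognition-plus-error-correction assembly.

    Free parts collide at random (the air-table's stand-in for Brownian
    motion). A collision that matches the target pair latches for good;
    a mismatched pair is assumed to latch briefly and then be separated
    again by the error-correction step, so it is simply left free here.
    """
    free = list(pool)
    finished = 0
    for _ in range(max_collisions):
        if len(free) < 2:
            break
        i, j = random.sample(range(len(free)), 2)
        if {free[i], free[j]} == set(target):     # recognition step
            for idx in sorted((i, j), reverse=True):
                free.pop(idx)
            finished += 1
        # else: error correction undoes the mismatch (a no-op here)
    return finished, free

finished, leftover = assemble(["A"] * 20 + ["B"] * 20)
```

Despite the completely random collisions, in a run like this the balanced pool is converted entirely into finished pairs: randomness plus selective latching does all the work, just as in self-assembly proper.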

‘Twas on the good ship Venus…

If you’ve enjoyed the bout of transatlantic name-calling that my piece on public engagement produced (generally along the well-worn lines of Europeans from Venus versus Martian Americans), you might want to look at this exchange on the Foresight Institute’s Nanodot blog. Here Foresight VP Christine Peterson enthusiastically agrees with my not wholly serious suggestion that the origin of the UK’s aversion to the positive vision of Drexlerian nanotechnology can be traced to the generally pessimistic and miserabilist disposition of the inhabitants of this rain-sodden archipelago, and I desperately try to extract myself from the hole I’ve dug for myself.

A view from the Greenhouse

Here’s another brief report on the Nottingham nanotechnology debate. It’s from Jack Stilgoe, from the thinktank Demos, who was the non-scientist on the panel. He frames the debate in rather nationalistic terms. Is this really just a clash between the habitual rainsoaked pessimism of the British, and sunny American optimism and its associated can-do attitude?

Nanotechnology debate at Nottingham

I don’t know about anybody else, but I enjoyed yesterday’s nanotechnology debate at Nottingham. The whole thing was filmed, and as soon as it’s been edited and tidied up we’ll get the video put up on the web. Given that everyone will soon have the opportunity to judge for themselves how the thing went, I’ll confine myself here to some general observations. There was a big crowd, mostly graduate students attending the surface science summer school, supplemented by a good fraction of the local nanoscientists. The nature of the audience meant that the debate rapidly got quite technical; I don’t think anyone could say that the molecular manufacturing point of view didn’t get a serious hearing. I must say that I was a little apprehensive, given the rancour that has entered previous debates, but I felt the tone, while robust, was mutually respectful.

My prize for gnomic aphorism of the evening goes to my fellow-panellist Saul Tendler (bionanotechnologist and pharmacy professor). “If a cat had wheels, who would change its tyres?”

An open debate about radical nanotechnology

A public debate about nanotechnology – Nanotechnology: Radical New Science or Plus ça Change? – has been organised by Philip Moriarty at the University of Nottingham as part of a Surface Science Summer School, at 4.30 pm on Wednesday 24th August. The themes of the debate are:

  • Are nanofactories capable of manufacturing virtually anything with little or no environmental impact really just a few decades away, as some groups are claiming?
  • Is nanotechnology based on scaled-down everyday engineering concepts viable or should we look to biology for insights into how to tame the nanoworld?
  • Are there potential risks associated with the manipulation of matter at the atomic and molecular levels and how might those risks be controlled?
  • Or, is nanotechnology simply a buzzword for science which, far from being a radical departure from what has gone before, simply represents a natural convergence of the conventional disciplines…?
The panel includes myself and J. Storrs Hall, author of the recently published book Nanofuture: What’s next for nanotechnology. As it happens, Nanofuture was part of my holiday reading, so I know that we will be getting a robust and wholehearted defence of the Drexlerian position. In addition, we have a science policy expert from the thinktank Demos who has been studying public perceptions of nanotechnology, Jack Stilgoe, and further names to be announced.

The primary audience for the debate will be young graduate students doing PhDs in nanoscience, so we can be sure that there’ll be a vigorous technical discussion. But anyone’s welcome to turn up (Philip asks that you drop him an email – see his personal web-page for an address – if you want to come). And if you can’t make it in person, submit your question online via this link.

(Updated 13 July following Philip’s information below that Dave King can’t now come to the Summer School)

Debating the feasibility of molecular manufacturing

The Soft Machines blog is getting some visitors referred from a page on the new Foresight Institute website discussing the various debates there have been on the feasibility of Drexler’s version of a radical nanotechnology. For their convenience, and for anyone else who is interested, here is a quick summary of some of the relevant posts on Soft Machines. When I get a moment, I will move a version of this summary to a more permanent home.

  • “Molecular nanotechnology, Drexler and Nanosystems – where I stand” is a concise summary of my overall position.
  • The mechanosynthesis debate. This began with a critique by Philip Moriarty, an experimental nanoscientist from the University of Nottingham, of a detailed proposal by Robert Freitas for implementing diamondoid mechanosynthesis. The debate is introduced here. The critique received a riposte from Chris Phoenix, of the Center for Responsible Nanotechnology, which developed into an extensive exchange of views. The whole 56-page correspondence can be downloaded as a PDF from this post: “Is mechanosynthesis feasible? The debate continues.” My commentary on the debate can be found in this post: “The mechanosynthesis debate”. Each of these posts also contains many illuminating comments from various readers. As a postscript to this debate, I hope Philip Moriarty won’t mind me adding that the private correspondence he mentions between Robert Freitas and himself is still continuing constructively.
  • Is matter digital? Here I argue that the focus of radical nanotechnology should be moved away from the question of how artefacts are to be made, and towards a deeper consideration of how they will function, and I question the assumption that a single basic technology, like diamondoid-based molecular nanotechnology, can carry out all the functions we need in an optimal way. The original post, Making and doing, attracted detailed comments from Christine Peterson, of the Foresight Institute, and Chris Phoenix. I responded to these criticisms in Bits and Atoms.
  • Drexler and Smalley. The most high-profile scientific opponent of Drexler has been Richard Smalley. I asked the question Did Smalley deliver a killer blow to Drexlerian MNT?, and concluded that he probably didn’t.
  • The argument from biology. The existence of biology is often cited as an existence proof for radical nanotechnology. In this post – What biology does and doesn’t prove about nanotechnology – I argue that we can learn a lot from the biological example, but that the conclusions we should draw aren’t the ones that the supporters of MNT reach.
Biomimetic nanotechnology with synthetic macromolecules

This is a draft of a piece I’ve been invited to write for the special edition of Journal of Polymer Science: Polymer Physics Edition that is associated with the March meeting of the American Physical Society. The editors invited views from a few people about where they saw the future of polymer science. Here’s my contribution, with themes that will be familiar to readers of Soft Machines. Since the intended audience consists of active researchers in polymer science, the piece has more unexplained technical language than I usually use here.

In the first half of the twentieth century, polymer science and biochemistry developed together. With synthetic polymer chemistry in its infancy, most laboratory examples of macromolecules were of natural origin, and the conceptual foundations of polymer science, such as Staudinger’s macromolecular hypothesis, were as important for biology as for chemistry. Techniques for the physical characterisation of macromolecules, like Svedberg’s ultracentrifuge, were applied as much to biological macromolecules as synthetic ones. But with the tremendous development of the field of structural biology that x-ray protein crystallography made possible, the preoccupations of polymer science increasingly diverged from those of what was now being termed molecular biology. The issues that are so central to protein structure – secondary and tertiary structural motifs, ligand-receptor interactions and allostery – had no real analogue in synthetic polymer science. Meanwhile, the issues that exercised polymer scientists – crystallisation, melt dynamics and rheology – had little relevance to biology. Of course there were exceptions, but conceptually and culturally the two disciplines had become worlds apart.

I believe that in the next fifty years we need to see much more interaction between polymer science and cell biology. In polymer science, we’ve seen the focus shift away from the properties of bulk materials to the search for new functionality by design at the molecular level. In cell biology, the new methods of single molecule biophysics permit us to study the behaviour of biological macromolecules in their natural habitat, rather than in a protein crystal, allowing us to see how these molecular machines actually work. Meanwhile, synthetic polymer chemistry has started to give us access to control over molecular architecture. This is not yet at the precision that we obtain from biology, but we are already seeing the exploitation of non-trivial macromolecular architectures to achieve control over structure and function. The next stage is surely to take the insights from single molecule biophysics about how biological molecular machines work and design synthetic molecules to perform similar tasks.

We could call this field biomimetic nanotechnology. Biomimetics, of course, is a well-known field in materials science; what we are talking about here is biomimetics at the level of single molecules, at the level of cell biology. Can we make synthetic analogues of molecular motors and other energy conversion devices? Can we learn from membrane biophysics to make selective pumps and valves, which would allow the easy and energy-efficient separation and sorting of molecules? Will it be possible to create any synthetic analogue of the systems of molecular sensing, communication and computation that systems biology is just starting to unravel? It’s surely only by achieving this degree of nanoscale control that the promise of molecular medicine could be fulfilled, to give just one example of a potential application.

What are the areas of polymer science that need to be advanced to enable these developments? Obviously, in polymer chemistry, synthesis with precise architectural control is key, and achieving this goal in water-soluble systems is going to be important if this technology is going to find wide use, particularly in medical applications. Polymer physicists are still much less comfortable dealing with systems involving water and charges than with polymer solutions in simple non-polar solvents, and we’ll need more work to ensure that we have a good understanding of the physical environment in which our devices will be operating.

The importance of self-assembly as a central theme will continue to grow. This way of creating intricate nanostructures by programmed interactions in macromolecules is well known to polymer science; the rich variety of morphologies that can be obtained in block copolymer systems is a familiar example. But in comparison with the sophistication of biological self-assembly, synthetic self-assembly still operates at a very crude level. One new element that we should import from biology is the exploitation of secondary structure and its coupling to nanoscale morphology. Another important idea is to exploit the single chain folding of a sequenced copolymer in an analogue of protein folding. This, of course, would require considerable precision in synthesis, but theoretical developments are also necessary. We have learnt from the theory of protein folding that only a small fraction of possible sequences are foldable, so we will need to learn how to design foldable sequences.

Another important principle will be exploiting molecular shape change. In biology, this principle underlies the operation of most sophisticated nanoscale machines, including molecular motors, ion channel proteins and signalling molecules. In polymer physics the phenomenon of the coil-globule transition in response to changing solvent conditions is well known and has its macroscopic counterpart in thermoresponsive gels. To be widely useful, we need to engineer responsive systems with much more specific triggers and with a more highly amplified response. One promising way of doing this uses the coupling between transitions in secondary structure and global conformation; however we’re still a long way from the remarkable lever arms of biological motor proteins, in which rather subtle changes at a binding site produce a large overall mechanical response.
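
The coil-globule transition itself can be illustrated with a back-of-the-envelope Flory-type estimate: balance the chain’s entropic elasticity against two-body and three-body monomer interactions, and watch the preferred chain size jump as the solvent quality (the sign of the two-body term) changes. The numbers here are purely illustrative:

```python
def flory_size(n, v, w=1.0):
    """Chain size r (in monomer units) minimising a Flory-style free
    energy  F(r) = r**2 / n  +  v * n**2 / r**3  +  w * n**3 / r**6.

    The first term is entropic elasticity, the second the two-body
    (solvent-quality) term, the third a three-body repulsion that
    stabilises the collapsed globule when v < 0. Minimised by a crude
    grid scan, which is plenty for an order-of-magnitude sketch.
    """
    best_r, best_f = None, float("inf")
    r = 0.5
    while r < 100.0:
        f = r * r / n + v * n * n / r ** 3 + w * n ** 3 / r ** 6
        if f < best_f:
            best_f, best_r = f, r
        r += 0.05
    return best_r

coil = flory_size(100, v=1.0)      # good solvent: swollen coil
globule = flory_size(100, v=-1.0)  # poor solvent: collapsed globule
```

Switching the sign of v collapses the chain from a swollen coil to a compact globule; responsive polymer systems exploit exactly this kind of abrupt conformational change.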

Some of the most powerful ideas from biology still remain essentially unexploited. An obvious one is, of course, evolution. At the molecular level, evolution offers a spectacularly powerful way of searching multidimensional parameter spaces to find efficient design solutions. It’s arguable that, given the combinatorial complexity that arises with even modest degrees of architectural control and our unfamiliarity with the design rules that are appropriate for the nanoscale environment, significant progress will positively require some kind of evolutionary approach, whether that is executed in computer simulation or with real molecules.
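
The flavour of such an evolutionary search can be conveyed by a minimal mutation-plus-selection loop over bit strings, with the number of 1-bits standing in for any scalar measure of design quality over a huge parameter space (a deliberately simplistic sketch):

```python
import random

random.seed(3)

def evolve(n_bits=40, pop_size=30, generations=200, mu=0.02):
    """Minimal mutation-plus-selection search over bit strings.

    Fitness is just the number of 1-bits -- a stand-in for any scalar
    measure of design quality over a 2**40-point parameter space.
    """
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)
        parents = pop[: pop_size // 2]       # truncation selection
        children = [[1 - bit if random.random() < mu else bit
                     for bit in p] for p in parents]
        pop = parents + children             # elitist: parents survive
    return max(sum(individual) for individual in pop)

best = evolve()
```

Even this crude scheme homes in on near-optimal strings within a few hundred generations, with no gradient information and no model of the fitness landscape, which is precisely the appeal of evolutionary design in a regime where we barely know the design rules.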

Perhaps the most fundamental difference between the operating environments of biology and polymer science is the question of thermodynamic equilibrium. Polymer scientists are used to systems at, or perturbed slightly away from, equilibrium, while biological systems are driven far from equilibrium by a continuous energy input. How can we incorporate this most basic feature of life into our synthetic devices? What will be our synthetic analogue of life’s universal energy currency, adenosine triphosphate?

Ultimately, what we are talking about here is the reverse engineering of biology. It’s obvious that the gulf between the crudities of synthetic polymer science and the intricacies of cell biology is currently immense (certainly quite big enough to mean that the undoubted ethical issues that would arise if we could make any kind of reasonable facsimile of life are still very distant). Nonetheless, even rudimentary devices inspired by cell biology would be of huge practical benefit. An even more significant benefit, though, would be the deep understanding of the workings of biology that would arise from trying to copy it.

Nanotechnology – with nature or against it?

I’ve been covering two big debates about nanotechnology here. On the one hand, there’s the question of the relative merits of Drexler’s essentially mechanical vision of nanotechnology and the more biologically inspired soft and biomimetic approaches. On the other, we see the efforts of campaigning groups like ETC to paint nanotechnology as the next step after genetic modification in humanity’s efforts to degrade and control the natural world. Although these debates at first sight look very different, they both revolve around issues of control and our proper relationship with the natural world.

These issues are identified and situated in a deep historical context in a very perceptive article by Bernadette Bensaude-Vincent, of the Philosophy Department in the Université Paris X. The article, Two Cultures of Nanotechnology?, is in HYLE – the International Journal for Philosophy of Chemistry, Vol. 10, No. 2 (2004).

The whole article is well worth reading, but this extract gets to the heart of the matter:

“There is nothing new in the current artificialization of nature. Already in antiquity, there were two different and occasionally conflicting views of technology. On the one hand, the arts or technai were considered as working against nature, as contrary to nature. This meaning of the term para-physin provided the ground for repeated condemnations of mechanics and alchemy. On the other hand, the arts – especially agriculture, cooking, and medicine – were considered as assisting or even improving on nature by employing the dynameis or powers of nature. In the former perspective, the artisan, like Plato’s demiurgos, builds up a world by imposing his own rules and rationality on a passive matter. Technology is a matter of control. In the latter perspective the artisan is more like the ship-pilot at sea. He conducts or guides forces and processes supplied by nature, thus revealing the powers inherent in matter. Undoubtedly the mechanicist [i.e. Drexlerian] model of nanotechnology belongs to the demiurgic tradition. It is a technology fascinated by the control and the overtaking of nature.”

Bensaude-Vincent argues that soft and biomimetic approaches to nanotechnology fall more naturally into the second culture, “conducting or guiding forces and processes supplied by nature, thus revealing the powers inherent in matter”.

Intelligent yoghurt by 2025

Yesterday’s edition of the Observer contained the bizarre claim that we’ll soon be able to enhance the intelligence of bacteria by using molecular electronics. This came in an interview with Ian Pearson, who is always described as the resident futurologist of the British telecoms company BT. The claim is so odd that I wondered whether it was a misunderstanding on the part of the journalist, but it seems clear enough in this direct quote from Pearson:

“Whether we should be allowed to modify bacteria to assemble electronic circuitry and make themselves smart is already being researched.

“We can already use DNA, for example, to make electronic circuits so it’s possible to think of a smart yoghurt some time after 2020 or 2025, where the yoghurt has got a whole stack of electronics in every single bacterium. You could have a conversation with your strawberry yogurt before you eat it.”

This is the kind of thing that puts satirists out of business.