Watching an assembler at work

The only software-controlled molecular assembler we know about is the ribosome – the biological machine that reads the sequence of bases on a strand of messenger RNA, and, converting this genetic code into a sequence of amino acids, synthesises the protein molecule that corresponds to the gene whose information was transferred by the RNA. An article in this week’s Nature (abstract, subscription required for full paper, see also this editor’s summary) describes a remarkable experimental study of the way the RNA molecule is pulled through the ribosome as each step of its code is read and executed. This experimental tour-de-force of single molecule biophysics, whose first author is Jin-Der Wen, comes from the groups of Ignacio Tinoco and Carlos Bustamante at Berkeley.

The experiment starts by tethering a strand of RNA between two micron-sized polystyrene beads. One bead is held fixed on a micropipette, while the other is held in an optical trap – the point at which a highly focused laser beam has its maximum intensity. The central part of the RNA molecule is folded into a single hairpin, and the ribosome binds to the RNA just to one side of this hairpin. As the ribosome reads the RNA molecule, it pulls the hairpin apart, and the resulting lengthening of the RNA strand is measured directly from the change in position of the bead in its optical trap. What’s seen is a series of steps – the ribosome moves about 2.7 nm in about a tenth of a second, then pauses for a couple of seconds before making another step.

This distance corresponds exactly to the size of the triplet of bases that represents a single character of the genetic code – the codon. What we are seeing, then, is the ribosome pausing on a codon to read it, before pulling the tape through to read the next character. What we don’t see in this experiment, though we know it’s happening, is the addition of a single amino acid to the growing protein chain during this read step. This takes place by means of the binding to the RNA codon, within the ribosome, of a shorter strand of RNA – the transfer RNA – to which the amino acid is attached. What the experiment does make clear is that the operation of this machine is by no means mechanical and regular. The times taken for the ribosome to move from the reading position for one codon to the next – the translocation times – are fairly tightly distributed around an average value of about 0.08 seconds, but the dwell times on each codon vary from a fraction of a second up to a few seconds. Occasionally the ribosome stops entirely for a few minutes.
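
To get a feel for what such a trace looks like, here is a minimal sketch in Python that simulates a ribosome stepping along its message using the figures quoted above: a 2.7 nm advance taking roughly 0.08 s, separated by codon dwell times drawn from a broad distribution with occasional long stalls. The dwell-time distribution and the probability of a long pause are invented for illustration; this is not the authors’ analysis, just a cartoon of the kind of record the experiment produces.

```python
import random

STEP_NM = 2.7           # advance per codon, from the measured step size
TRANSLOCATION_S = 0.08  # typical time spent actually moving
MEAN_DWELL_S = 2.0      # illustrative mean pause on a codon (assumed)
LONG_STALL_PROB = 0.02  # chance of a multi-minute stall (assumed)

def simulate_trace(n_codons=20, seed=1):
    """Return (time, extension) points for an idealised stepping record."""
    random.seed(seed)
    t, x = 0.0, 0.0
    trace = [(t, x)]
    for _ in range(n_codons):
        # dwell on the codon: broadly distributed, with rare long stalls
        dwell = random.expovariate(1.0 / MEAN_DWELL_S)
        if random.random() < LONG_STALL_PROB:
            dwell += random.uniform(60, 180)
        t += dwell
        trace.append((t, x))
        # translocation: one codon's worth of RNA pulled through the machine
        t += TRANSLOCATION_S
        x += STEP_NM
        trace.append((t, x))
    return trace

for time_s, ext_nm in simulate_trace()[:8]:
    print(f"t = {time_s:7.2f} s   extension = {ext_nm:5.1f} nm")
```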

This experiment is far from the final word on the way ribosomes operate. I can imagine, for example, that people are going to be making strenuous efforts to attach a probe directly to the ribosome, rather than, as was done here, inferring its motion from the location of the end of the RNA strand. But it’s fascinating to have such a direct probe of one of the most central operations of biology. And for those attempting the very ambitious task of creating a synthetic analogue of a ribosome, these insights will be invaluable.

Leading nanotechnologist gets top UK defence science job

It was announced yesterday that the new Chief Scientific Advisor to the UK’s Ministry of Defence is to be Professor Mark Welland. Mark Welland is currently Professor of Nanotechnology and head of Cambridge University’s Nanoscience Centre. He is one of the pioneers of nanotechnology in the UK; he was, I believe, the first person in the country to build a scanning probe microscope. Most recently he has been in the news for his work with the mobile phone company Nokia, who recently unveiled their Morph concept phone at Design and the Elastic Mind, an exhibition at New York’s Museum of Modern Art.

How can nanotechnology help solve the world’s water problems?

The lack of clean water for much of the world’s population leads to suffering and premature death for millions of people, and as population pressures increase, climate change starts to bite, and food supplies become tighter (perhaps exacerbated by an ill-considered move to biofuels), these problems will only intensify. It’s possible that nanotechnology may be able to contribute to solving these problems (see this earlier post, for example). A couple of weeks ago, Nature magazine ran a special issue on water, which included a very helpful review article: Science and technology for water purification in the coming decades. This article (which seems to be available without subscription) is all the more helpful for not focusing specifically on nanotechnology, instead making it clear where nanotechnology could fit into existing technologies to create affordable and workable solutions.

One sometimes hears the criticism that there’s no point worrying about the promise of new nanotechnological solutions when workable solutions are already known but aren’t being implemented, for political or economic reasons. That’s an argument that’s not without force, but the authors do begin to address it by outlining what’s wrong with existing technical solutions: “These treatment methods are often chemically, energetically and operationally intensive, focused on large systems, and thus require considerable infusion of capital, engineering expertise and infrastructure.” We should therefore be looking for decentralised solutions that can be installed easily, reliably and cheaply using local expertise, preferably without the need for large-scale industrial infrastructure.

Take first the problem of sterilising water to kill pathogens. The traditional approach starts with chlorine, which isn’t ideal: some pathogens are remarkably tolerant of it, and it can lead to toxic by-products. Ultraviolet sterilisation, on the other hand, offers a lot of promise – it’s good against bacteria, though less effective against viruses. In combination with photocatalytic surfaces of titanium dioxide nanoparticles, though, it could be very effective. What’s required here is either much cheaper sources of ultraviolet light (which could come from new nanostructured semiconductor light-emitting diodes) or new types of nanoparticles whose surfaces are excited by longer-wavelength light, including sunlight.
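
The reason titanium dioxide needs ultraviolet light, and what it would take to use sunlight instead, comes down to simple photon arithmetic: a photon can only excite the photocatalyst if its energy hc/λ exceeds the band gap. The sketch below works this out for the commonly quoted ~3.2 eV band gap of anatase titanium dioxide and for a hypothetical lower-gap nanoparticle; the 2.4 eV figure is simply an illustrative assumption.

```python
# Photon arithmetic for photocatalytic water sterilisation.
# A photon can only excite the catalyst if E = h*c/lambda exceeds the band gap.
H_C_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

def cutoff_wavelength_nm(band_gap_ev):
    """Longest wavelength that can still excite a given band gap."""
    return H_C_EV_NM / band_gap_ev

materials = [
    ("anatase TiO2, ~3.2 eV gap (commonly quoted value)", 3.2),
    ("hypothetical visible-light photocatalyst, 2.4 eV gap", 2.4),
]
for name, gap_ev in materials:
    print(f"{name}: absorbs only below ~{cutoff_wavelength_nm(gap_ev):.0f} nm")
# ~387 nm for TiO2 (ultraviolet only); a ~2.4 eV gap would reach ~517 nm,
# well into the visible part of the solar spectrum.
```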

Another problem is the removal of contamination by toxic chemicals, which can arise either naturally or through pollution. Problem contaminants include heavy metals, arsenic, pesticide residues, and endocrine disrupters; the difficulty is that these can have dangerous effects even at rather low concentrations, which can’t be detected without expensive laboratory-based analysis equipment. Here methods for robust, low cost chemical sensing would be very useful – perhaps a combination of molecular recognition elements integrated in nanofluidic devices could do the job.

The reuse of waste water presents hard problems because of the high content of organic matter that needs to be removed, in addition to other contaminants. Membrane bioreactors combine the sorts of microbes that are exploited in the activated sludge processes of conventional sewage treatment with ultrafiltration through a membrane, to get faster throughputs of waste water. The tighter the pores in this sort of membrane, the more effective it is at removing suspended material, but the problem is that tight membranes quickly get blocked up. One solution is to line the micro- and nano-pores of the membranes with a single layer of hairy molecules – one of the paper’s co-authors, MIT’s Anne Mayes, developed a particularly elegant scheme for doing this, exploiting the self-assembly of comb-shaped copolymers.

Of course, most of the water in the world is salty (97.5%, to be precise), so the ultimate solution to water shortages is desalination. Desalination costs energy – necessarily so, as the second law of thermodynamics puts a lower limit on the cost of separating pure water from the higher-entropy solution state. This theoretical limit is about 0.7 kWh per cubic metre, and to date the most efficient practical process uses a not unreasonable 4 kWh per cubic metre. Achieving these figures, and pushing them down further, is a matter of membrane engineering, achieving precisely nanostructured pores that resist fouling and yet are mechanically and chemically robust.
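
As a rough check on where the 0.7 kWh figure comes from, one can estimate the osmotic pressure of seawater with the van ’t Hoff relation and note that, in the limit of extracting only a small amount of fresh water, the minimum work per cubic metre equals that pressure. The sketch below treats seawater as an ideal 0.6 M NaCl solution, which is a crude assumption, so the answer should only be trusted to within tens of percent.

```python
# Order-of-magnitude estimate of the thermodynamic minimum for desalination.
# Assumption: seawater behaves as an ideal 0.6 mol/L NaCl solution (two ions
# per formula unit), and only a vanishing fraction of fresh water is extracted.
R = 8.314           # gas constant, J / (mol K)
T = 298.0           # temperature, K
C_NACL = 0.6e3      # NaCl concentration, mol per cubic metre (~35 g/L salinity)
IONS_PER_NACL = 2   # Na+ and Cl-

osmotic_pressure_pa = IONS_PER_NACL * C_NACL * R * T   # van 't Hoff relation
min_work_kwh_per_m3 = osmotic_pressure_pa / 3.6e6      # J/m^3 -> kWh/m^3

print(f"osmotic pressure  ~ {osmotic_pressure_pa / 1e5:.0f} bar")
print(f"minimum work      ~ {min_work_kwh_per_m3:.2f} kWh per cubic metre")
# ~30 bar and ~0.8 kWh per cubic metre: the same ballpark as the 0.7 kWh
# figure quoted above, with the difference coming from the ideal-solution
# approximation and the zero-recovery limit.
```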

A methanol economy?

Transport accounts for between a quarter and a third of primary energy use in developed economies, and currently this comes almost entirely from liquid hydrocarbon fuels. Anticipating a world with much more expensive oil and a need to dramatically reduce carbon dioxide emissions, many people have been promoting the idea of a hydrogen economy, in which hydrogen, generated in ways that minimise CO2 emissions, is used as a carrier of energy for transportation purposes. Despite its superficial attractiveness, and high profile political support, the hydrogen economy has many barriers to overcome before it becomes technically and economically feasible. Perhaps most pressing of these difficulties is the question of how this light, low energy density gas can be stored and transported. An entirely new pipeline infrastructure would be needed to move the hydrogen from the factories where it is made to filling stations, and, perhaps even more pressingly, new technologies for storing hydrogen in vehicles will need to be developed. Early hopes that nanotechnology would provide new and cost-effective solutions to these problems – for example, using carbon nanotubes to store hydrogen – don’t seem to be bearing fruit so far. Since using a gas as an energy carrier causes such problems, why don’t we stick with a flammable liquid? One very attractive candidate is methanol, whose benefits have been enthusiastically promoted by George Olah, a Nobel prize winning chemist from the University of Southern California, whose book Beyond Oil and Gas: The Methanol Economy describes his ideas in some technical detail.
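
To put some rough numbers on the storage problem, the sketch below compares approximate volumetric energy densities (lower heating values) for gasoline, methanol and hydrogen. The figures are rounded, textbook-style values and are meant only to be indicative of the relative scales.

```python
# Approximate volumetric energy densities (lower heating value), MJ per litre.
# Rounded, textbook-style figures, included only to show the scale of the
# hydrogen storage problem relative to liquid fuels.
FUELS_MJ_PER_LITRE = {
    "gasoline":                   32.0,
    "methanol":                   15.8,
    "liquid hydrogen (-253 C)":    8.5,
    "hydrogen gas at 700 bar":     4.7,
}

baseline = FUELS_MJ_PER_LITRE["gasoline"]
for fuel, density in FUELS_MJ_PER_LITRE.items():
    print(f"{fuel:26s} {density:5.1f} MJ/L  ({density / baseline:4.0%} of gasoline)")
```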

The advantage of methanol as a fuel is that it is entirely compatible with the existing infrastructure for distributing and using gasoline; pipes, pumps and tanks would simply need some gaskets changed to switch over to the new fuel. Methanol is an excellent fuel for internal combustion engines; even the most hardened petrol-head should be convinced by the performance figures of a recently launched methanol-powered Lotus Exige. However, in the future, greater fuel efficiency might be possible using direct methanol fuel cells, if that technology can be improved.

Currently methanol is made from natural gas, but in principle it should be possible to make it economically by reacting carbon dioxide with hydrogen. Given a clean source of energy to make the hydrogen (Olah is an evangelist for nuclear power, but if the scaling problems for solar energy were solved that would work too), one could recycle the carbon dioxide from fossil fuel power stations, in effect getting one more pass of energy out of it before releasing it into the atmosphere. Ultimately, it should be possible to extract carbon dioxide directly from the atmosphere, achieving in this way an almost completely carbon-neutral energy cycle. In addition to its use as a transportation fuel, methanol can also serve as a feedstock for the petrochemical industry. In this way we could, in effect, convert atmospheric carbon dioxide into plastic.
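
The underlying reaction is straightforward to write down: CO2 + 3H2 → CH3OH + H2O. A quick stoichiometric sketch, assuming perfect conversion and ignoring the real losses of any catalytic process, gives a feel for how much hydrogen and carbon dioxide a tonne of methanol would account for.

```python
# Ideal stoichiometry for methanol synthesis: CO2 + 3 H2 -> CH3OH + H2O.
# Assumes 100% conversion, so these are upper-bound (best-case) figures.
M_CO2, M_H2, M_CH3OH = 44.01, 2.016, 32.04   # molar masses, g/mol

def feed_per_tonne_methanol():
    moles_methanol = 1_000_000 / M_CH3OH       # mol of CH3OH in one tonne
    co2_kg = moles_methanol * M_CO2 / 1000     # 1 mol CO2 per mol methanol
    h2_kg = moles_methanol * 3 * M_H2 / 1000   # 3 mol H2 per mol methanol
    return co2_kg, h2_kg

co2_kg, h2_kg = feed_per_tonne_methanol()
print(f"Per tonne of methanol: ~{co2_kg:.0f} kg of CO2 recycled "
      f"and ~{h2_kg:.0f} kg of H2 consumed (ideal limit)")
# roughly 1370 kg of carbon dioxide and 190 kg of hydrogen per tonne of methanol
```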

To Canada

I’m off to Canada on Sunday, for a brief canter round Ontario. On Monday I’m in the MaRS centre in Toronto, where I’m speaking about nanotechnology in the UK as part of a meeting aimed at promoting UK-Canada collaboration in nanotechnology. On Tuesday I’m going to the University of Guelph, where I’m giving the Winegard lecture in Soft Matter Physics. On Wednesday and Thursday I’ll be at the University of Waterloo, visiting Jamie Forrest, and McMaster University, to congratulate Kari Dalnoki-Veress on winning the American Physical Society’s Dillon Medal. My thanks to Guelph’s John Dutcher for inviting me.

The right size for nanomedicine

One reason nanotechnology and medicine potentially make a good marriage is that the size of nano-objects closely matches the length scale of the basic operations of cell biology; nanomedicine, therefore, has the potential to make direct interventions on living systems at the sub-cellular level. A paper in the current issue of Nature Nanotechnology (abstract, subscription required for full article) gives a very specific example, showing that the size of a drug-nanoparticle assembly directly affects how well the drug works in controlling growth and death in tumour cells.

In this work, the authors bound a drug molecule to a nanoparticle and looked at the way the size of the nanoparticle affected the interaction of the drug with receptors on the surface of target cells. The drug was herceptin, a protein molecule which binds to a receptor molecule called ErbB2 on the surface of human breast cancer cells. Cancerous cells have too many of these receptors, and this disrupts the signalling between cells that tells them whether to grow or marks them for apoptosis – programmed cell death. What the authors found was that herceptin attached to gold nanoparticles was more effective than free herceptin at binding to the receptors; this in turn led to reduced growth rates for the treated tumour cells. But how well the effect works depends strongly on how big the nanoparticles are – the best results are found for nanoparticles 40 or 50 nm in size, with 100 nm nanoparticles being barely more effective than the free drug.

What the authors think is going on is connected to the process of endocytosis, by which nanoscale particles can be engulfed by the cell membrane. Very small nanoparticles typically have only one herceptin molecule attached, so they behave much like the free drug – one nanoparticle binds to one receptor. 50 nm nanoparticles have a number of herceptin molecules attached, so a single nanoparticle links together a number of receptors, and the entire complex, nanoparticle and receptors, is engulfed by the cell and taken out of the cell signalling process completely. 100 nm nanoparticles are too big to be engulfed, so only the fraction of the attached drug molecules in contact with the membrane can bind to receptors. A commentary (subscription required) by Mauro Ferrari sets this achievement in context, pointing out that a nanodrug needs to do four things: navigate successfully through the bloodstream, negotiate any biological barriers preventing it from getting where it needs to go, locate the cell that is its target, and then modify the pathological cellular processes that underlie the disease being treated. We already know that nanoparticle size is hugely important for the first three of these requirements, but this work directly connects size to the sub-cellular processes that are the target of nanomedicine.
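
A toy calculation makes the multivalency argument concrete: if herceptin decorates the gold surface at a roughly constant density, the number of molecules per particle scales with surface area, and hence with the square of the diameter. The surface density used below is a made-up illustrative value, not a number from the paper, so only the trend matters: a ~10 nm particle carries one or two molecules, a 40–50 nm particle carries tens (enough to cross-link receptors and trigger engulfment), and a 100 nm particle carries even more but is too large for efficient endocytosis.

```python
import math

# Toy multivalency estimate: herceptin molecules per gold nanoparticle, assuming
# a fixed number of molecules per unit surface area. The surface density is an
# invented illustrative value, not a figure from the paper.
SURFACE_DENSITY_PER_NM2 = 0.005   # hypothetical herceptin molecules per nm^2

def ligands_per_particle(diameter_nm):
    surface_area_nm2 = math.pi * diameter_nm ** 2   # surface area of a sphere
    return surface_area_nm2 * SURFACE_DENSITY_PER_NM2

for d in (10, 40, 50, 100):
    print(f"{d:4d} nm particle: ~{ligands_per_particle(d):6.1f} herceptin molecules")
# ~1.6, ~25, ~39 and ~157 molecules respectively: small particles act like free
# drug, 40-50 nm particles can cross-link many receptors and be engulfed, and
# 100 nm particles carry plenty of drug but are too big for efficient endocytosis.
```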

A culture of improvement

If one wants to comment on the future of technology, it’s a good idea to have some understanding of its history. A new book by Robert Friedel, A Culture of Improvement: Technology and the Western Millennium, takes on the ambitious task of telling the story of the development of technology in Europe and North America over the last thousand years.

The book is largely a very readable narrative history of technology, with some rather understated broader arguments. One theme is suggested by the title; in Friedel’s view the advance of technology has been driven not so much by the spectacular advances of the great inventors as by a mindset that continually seeks to make incremental improvements in existing technologies. The famous inventors, the James Watts and Alexander Graham Bells of history, certainly get due space, but there’s also an emphasis on placing the best-known inventions in the context of the less well-known precursor technologies from which they sprang, and on the way engineers and workers continuously improved the technologies once they were introduced. Another theme is the way in which the culture of improvement was locked into place, as it were, by the institutions that promoted technical and scientific education, and by the media that brought new scientific and technical ideas to a wide audience.

This provokes some revision of commonly held ideas about the relationship between science and engineering. In Friedel’s picture, the role of science has been less to provide fundamental discoveries that engineers can convert into practical devices, and more to provide the mental framework that permits the process of incremental improvement. Those who wish to de-emphasise the importance of science for innovation often point to the example of the development of the steam engine – “thermodynamics owes much more to the steam engine than the steam engine owes to thermodynamics”, the saying goes. This of course is true as far as it goes – the academic subject of thermodynamics was founded on Sadi Carnot’s analysis of the steam engines that were already in widespread use, and which had been extensively developed without the benefit of much theoretical knowledge. But it neglects the degree to which an understanding of formal thermodynamics underlay the development of the more sophisticated types of engines that are still in use today. Rudolf Diesel’s efforts to develop the engine that bears his name, and which is now so important, were based on an explicit project to use the thermodynamics he had learned from his professor, Carl von Linde (who also made huge contributions to the technology of refrigeration), to design the most efficient possible internal combustion engine.

Some aspects of the book are open to question. The focus on Europe, and the European offshoots in North America, is justified by the premise that there was something special in this culture that led to the “culture of improvement”; one could argue, though, that the period of unquestioned European technological advantage was a relatively short fraction of the millennium under study (it’s arguable, for example, that China’s medieval technological lead over Europe persisted well into the 18th century). And many will wonder whether technological advances always lead to “improvement”. A chapter on “the corruption of improvement” discusses the application of technology to weapons of mass destruction, but one feels that Friedel’s greatest revulsion is prompted by the outcome of the project to apply the culture of improvement to the human race itself. It’s useful to be reminded that the outcome of this earlier project for “human enhancement” was, particularly in the USA and Scandinavia, a programme of forced sterilisation of those deemed unfit to reproduce that persisted well into living memory. In Germany, of course, this “human enhancement” project moved beyond sterilisation to industrial-scale systematic murder of the disabled and those who were believed to be threats to “racial purity”.

Another UK government statement on nanotechnology

As I mentioned on Wednesday, the UK government took the opportunity of Thursday’s nano-summit organised by the consumer advocacy group Which? to release a statement about nanotechnology. The Science Minister’s speech didn’t announce anything new or dramatic – the minister did “confirm our commitment to keep nanotechnology as a Government priority”, though as the event’s chair, Nick Ross, observed, the Government has a great many priorities. The full statement (1.3 MB PDF) is at least a handy summary of what otherwise would be a rather disjointed set of measures and activities.

The other news from the Which? event was the release of the report from their Citizens’ Panel. Some summaries, as well as a complete report, are available from the Which? website. Some flavour of the results can be seen in this summary: “Panellists were generally excited about the potential that nanotechnologies offer and were keen to move ahead with developing them. However, they also recognised the need to balance this with the potential risks. Panellists identified many opportunities for nanotechnologies. They appreciated the range of possible applications and certain specific applications, particularly for health and medicine. The potential to increase consumer choice and to help the environment were also highlighted, along with the opportunity to ‘start again’ by designing new materials with more useful properties. Other opportunities they highlighted were potential economic developments for the UK (and the jobs this might create) and the potential to help developing countries (with food or cheaper energy).” Balanced against this generally positive attitude were concerns about safety, regulation, information, questions about the accessibility of the technology to the poor and the developing world, and worries about possible long-term environmental impacts.

The subject of nanotechnology was introduced at the meeting with this short film.

Which nanotechnology?

It seems likely that nanotechnology will move a little higher up the UK news agenda towards the end of this week – tomorrow sees the launch event for the results of a citizens’ panel run by the consumer group Which?. This will be quite a high-profile event, with a keynote speech by the Science Minister, Ian Pearson, outlining current UK nanotechnology policy. This will be the first full statement on nanotechnology at Ministerial level for some time. I’m one of the panel responding to the findings, which I will describe tomorrow.

Drew Endy on Engineering Biology

Martyn Amos draws our attention to a revealing interview with MIT’s Drew Endy about the future of synthetic biology. While Craig Venter has up to now monopolised the headlines about synthetic biology, Endy has an original and thought-provoking take on the subject.

Endy is quite clear about his goals: “The underlying goal of synthetic biology is to make biology easy to engineer.” In pursuing this, he looks to the history of engineering, recognising the importance of things like interchangeable parts and standard screw gauges, and seeks a similar library of modular components for biological systems. Of course, this approach must take for granted that when components are put together they behave in predictable ways: “Engineers hate complexity. I hate emergent properties. I like simplicity. I don’t want the plane I take tomorrow to have some emergent property while it’s flying.” Quite right, of course, but since many suspect that life itself is an emergent property one could wonder how much of biology will be left after you’ve taken the emergence out.

Many people will have misgivings about the synthetic biology enterprise, but Endy is an eloquent proponent of the benefits of applying hacker culture to biology: “Programming DNA is more cool, it’s more appealing, it’s more powerful than silicon. You have an actual living, reproducing machine; it’s nanotechnology that works. It’s not some Drexlarian (Eric Drexler) fantasy. And we get to program it. And it’s actually a pretty cheap technology. You don’t need a FAB Lab like you need for silicon wafers. You grow some stuff up in sugar water with a little bit of nutrients. My read on the world is that there is tremendous pressure that’s just started to be revealed around what heretofore has been extraordinarily limited access to biotechnology.”

His answer to societal worries about the technology, then, is a confidence in the power of open-source ideals, common rather than corporate ownership of the intellectual property, and an assurance that an open technology will automatically be applied to solve pressing societal problems.

There are legitimate questions about this vision of synthetic biology, both as to whether it is possible and whether it is wise. But to get some impression of the strength of the driving forces pushing this way, take a look at this recent summary of trends in DNA synthesis and sequencing. “Productivity of DNA synthesis technologies has increased approximately 7,000-fold over the past 15 years, doubling every 14 months. Costs of gene synthesis per base pair have fallen 50-fold, halving every 32 months.” Whether this leads to synthetic biology in the form anticipated by Drew Endy, the breakthrough of DNA nanotechnology into the mainstream, or something quite unexpected, it’s difficult to imagine this rapid technological development not having far-reaching consequences.
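
Those two pairs of numbers are self-consistent, as a couple of lines of arithmetic confirm: a 7,000-fold increase over 15 years corresponds to log2(7000) ≈ 12.8 doublings, or one roughly every 14 months, and a 50-fold fall over the same period corresponds to log2(50) ≈ 5.6 halvings, or one roughly every 32 months. The sketch below just redoes that calculation.

```python
import math

def months_per_doubling(overall_factor, years):
    """Months per doubling (or halving) implied by an overall change factor."""
    return years * 12 / math.log2(overall_factor)

print(f"7000x productivity gain in 15 years -> doubling every "
      f"{months_per_doubling(7000, 15):.0f} months")
print(f"50x cost fall in 15 years -> halving every "
      f"{months_per_doubling(50, 15):.0f} months")
# ~14 and ~32 months, matching the doubling and halving times quoted above.
```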