Disentangling thin polymer films

Many of the most characteristic properties of polymeric materials like plastics come from the fact that their long chain molecules get tangled up. Entanglements between different polymer chains act like knots, making a polymer liquid behave like a solid over quite perceptible time scales – silly putty is the familiar example. The results of a new experiment show that when you make a polymer film very thin – thinner than an individual polymer molecule – the chains become less entangled with each other, with significant effects on the film’s mechanical properties.
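How thin is “thinner than a molecule”? A back-of-the-envelope estimate, assuming ideal-chain statistics and illustrative values for polystyrene (a statistical segment length b of about 0.7 nm and a degree of polymerisation N of about 10^4), gives the size of a single coiled chain as

    R_g = b\sqrt{N/6} \approx 0.7\,\mathrm{nm}\times\sqrt{10^4/6} \approx 30\,\mathrm{nm}

so for a high molecular weight polymer, a film a few tens of nanometres thick is already comparable to, or thinner than, the natural size of one molecule.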

The experiments are published in this week’s Physical Review Letters; I’m a co-author, but the main credit lies with my colleagues Lun Si, Mike Massa and Kari Dalnoki-Veress at McMaster University, Canada. The abstract is here, and you can download the full paper as a PDF (the paper is copyright the American Physical Society, and is made available here under the author rights policy of the APS).

This is the latest in a whole series of discoveries of ways in which the properties of polymer films dramatically change when their thicknesses fall towards 10 nm and below. Another example is the discovery that the glass transition temperature – the temperature at which a polymer like polystyrene changes from a glassy solid to a gooey liquid – dramatically decreases in thin films. So a material that would in the bulk be a rigid solid may, in a thin enough film, turn into a much less rigid, liquid-like layer (see this technical presentation for more details). Why does this matter? Well, one reason is that, as feature sizes in the microelectronics industry fall below 100 nm, the sharpness with which one can define a line in a thin film of polymer resist could limit the perfection of the features one is making. So the fact that the mechanical properties of the polymer itself change, purely as a function of size, could lead to problems.
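One widely quoted empirical description of the thin-film glass transition, from early measurements on supported polystyrene films, fits the data with the form

    T_g(h) = T_g(\infty)\left[1 - (A/h)^{\delta}\right]

where h is the film thickness; for polystyrene the fitted constants are roughly A ≈ 3.2 nm and δ ≈ 1.8 (take these as indicative rather than definitive). On this fit the depression only becomes appreciable for films thinner than a few tens of nanometres.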

Nanoscience, small science, and big science

Quite apart from the obvious pun, it’s tempting to think of nanoscience as typical small science. Most of the big advances are made by small groups working in universities on research programs devised, not by vast committees, but by the individual professors who write the grant applications. Equipment is often quite cheap, by scientific standards – a state-of-the-art atomic force microscope might cost $200,000, and doesn’t need a great deal of expensive infrastructure to house it. If you have the expertise and the manpower, but not the money, you could build one yourself for perhaps a tenth of this price or less. This is an attractive option for scientists in developing countries, and is one reason why nanoscience has become such a popular field in countries like India and China. It’s all very different from the huge and expensive multinational collaborations that are necessary for progress in particle physics, where a single experiment may involve hundreds of scientists and hundreds of millions of dollars – the archetype of big science.

Big science does impact on the nanoworld, though. Techniques that use the highly intense beams of x-rays obtained from synchrotron sources, like the ESRF at Grenoble, France, and the APS, on the outskirts of Chicago, USA, have been vital in determining the structure, at the atomic level, of the complex and efficient nanomachines of cell biology. Neutron beams, too, are unique probes of the structure and dynamics of nanoscale objects like macromolecules. To get a beam of neutrons intense enough for this kind of study, you need either a research reactor, like the one at the Institut Laue-Langevin, in Grenoble (where I am writing this), or a spallation source, such as ISIS, near Oxford in the UK. The latter consists of a high-energy synchrotron, of the kind developed for particle physics, which smashes pulses of protons into a heavy metal target, producing showers of neutrons.

Synchrotron and neutron sources are run on a time-sharing basis; individual groups apply for time on a particular instrument, and the best applications are allocated a few days of (rather frenetic) experimentation. So in this sense, even these techniques have the character of small science. But the facilities themselves are expensive – the world’s most advanced spallation source for neutrons, the SNS, currently being built in Oak Ridge, TN, USA, will cost more than $1.4 billion, and the Japanese source J-PARC, a few years behind the SNS, has a budget of $1.8 billion. With this big money comes real politics. How do you set the priorities for the science that is going to be done, not next year, but in ten years’ time? Do you emphasise the incremental research that you are certain will produce results, or do you gamble on untested ideas that just might produce a spectacular pay-off? This is the sort of rather difficult and uncomfortable discussion I’ve been involved in for the last couple of days – I’m on the Scientific Council of the ILL, which has just held one of its twice-yearly meetings.

Cancer and nanotechnology

There’s a good review in Nature Reviews: Cancer (with free access) about the ways in which nanotechnology could help the fight against cancer – Cancer Nanotechnology: Opportunities and Challenges. The article, by Ohio State University’s Mauro Ferrari, concentrates on two themes – how nanotechnologies can help diagnose and monitor cancer, and how they could lead to more effective targeting and delivery of anti-cancer agents to tumours.

The extent to which we urgently need better ways of wrapping up therapeutic molecules and getting them safely to their targets is highlighted by a striking figure that the article quotes – if you inject monoclonal antibodies and monitor how many of these molecules get to a target within an organ, the fraction is less than 0.01%. The rest are wasted, which is bad news if these molecules are expensive and difficult to make, and even worse news if, like many anti-cancer drugs, they are highly toxic. How can we make sure that every one of these drug molecules gets to where it is needed? One answer is to stuff them into a nanovector, a nanoscale particle that protects the enclosed drug molecules and delivers them to where they are needed. The simplest example of this approach uses a liposome – a bag made from a lipid bilayer. Liposome-encapsulated anti-cancer drugs are now used clinically in the treatment of Kaposi’s sarcoma and breast and ovarian cancers. But lots of work remains to make nanovectors that are more robust, more resistant to non-specific protein adsorption, and, above all, specifically targeted to the cells they need to reach. Such specific targeting could be achieved by coating the nanovectors with antibodies that have specific molecular recognition properties for groups on the surface of the cancer cells. The article cites one cautionary tale that illustrates that this is all more complicated than it looks – a recent simulation suggests that targeting a drug precisely to a tumour can actually make the situation worse, by causing the tumour to break up. It may be necessary not just to target the drug carriers to a tumour, but to make sure that the spatial distribution of the drug through the tumour is right.
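A crude calculation of my own puts that fraction in perspective: if only 0.01% of the injected antibody reaches the target, then

    \frac{1\,\mathrm{mg\ delivered}}{10^{-4}} = 10\,\mathrm{g\ injected}

– delivering a single milligram of drug where it is needed requires injecting ten grams of it, with the rest free to do harm elsewhere.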

The future will probably see complex nanovectors engineered to perform multiple functions: protecting the drugs, getting them through all the barriers and pitfalls that lie between the point at which the drug is administered and the part of the body where it is needed, and releasing them at their target. The recently FDA-approved breast cancer drug Abraxane is a step in the right direction; one can think of it as a nanovector that combines two functions. The core of the nanovector consists of a nanoparticulate form of the drug itself; dispersing it so finely dispenses with the need for toxic solvents. And bound to the drug nanoparticle are protein molecules which help the nanoparticles get across the cells that line blood vessels. It’s clear that as more and more functions are designed into nanovectors, there’s a huge amount of scope for increases in drug effectiveness – increases that could amount to orders of magnitude.

Massive Change

Philip Ball’s column in the March edition of Nature Materials (subscription required) draws attention to the designer Bruce Mau’s Massive Change project.

Massive Change is an exhibition, currently on show at the Art Gallery of Ontario in Toronto, a website, and a book, all with the ambitious aim of showing how design can change the world for the better. Design here is interpreted broadly, to encompass town planning, architecture, and above all technology, and the aims are summarised in bold manifesto statements. These three examples give the flavour:

  • We will bring energy to the entire world
  • We will eradicate poverty
  • We will eliminate the need for raw material and banish all waste
Nanotechnology, in various guises, makes frequent appearances in support of these goals, though it’s the incremental and evolutionary versions rather than the Drexlerian kind that are invoked. Nonetheless, advances in materials science are described in these visionary terms:

    Material has traditionally been something to which design is applied. New methods in the fields of nanotechnology have rendered material as the object of design development. Instead of designing a thing, we design a designing thing. In the process we have created superhero substances endowed with superlative characteristics, from the hyperbolic to the almost human. Materials now have strength, agility, memory, intelligence. Mere matter no longer, materials have become active carriers of meaning and program.

One can quibble at the hyperbole and the lack of detail, but I can’t help applauding a project which is both idealistic and assertive, in the sense that it stresses the view that we aren’t simply helpless victims of the progress of technology, but that we can imagine the outcomes we want and decide how to use technology to get there.

New book on Nanoscale Science and Technology

Nanoscale Science and Technology is a new, graduate-level interdisciplinary textbook which has just been published by Wiley. It’s based on the Masters course in Nanoscale Science and Technology that we run jointly between the Universities of Leeds and Sheffield.

The book covers most aspects of modern nanoscale science and technology. It ranges from “hard” nanotechnologies, like the semiconductor nanotechnologies that underlie applications such as quantum dot lasers, and applications of nanomagnetism like giant magnetoresistance read-heads, via semiconducting polymers and molecular electronics, through to “soft” nanotechnologies such as self-assembling systems and bio-nanotechnology. I co-wrote a couple of chapters, but the heaviest work was done by my colleagues Mark Geoghegan, at Sheffield, and Ian Hamley and Rob Kelsall, at Leeds, who, as editors, have done a great job of knitting together the contributions of a number of authors with different backgrounds to make a coherent whole.

Nanotechnology and public engagement

The UK government today announced funding for work on public engagement in a number of technology areas, including nanotechnology, under its Sciencewise scheme. There are two schemes related to nanotechnology. One of these, “Nanodialogues”, will be run by the thinktank Demos, and will carry out four experiments in “upstream public engagement”. At the back of everybody’s mind as people design these schemes is a previous, not entirely happy, experiment in public engagement over the genetic modification of food, GM Nation. There’s a general will to learn from the shortcomings of that experience.

Entirely coincidentally, I was in London today, at the Greenpeace UK headquarters, for the first meeting of the steering group of a pilot experiment in public engagement with nanotechnology. This is a project to run a citizens’ jury on nanotechnology. The project is supported by Greenpeace UK, the Guardian newspaper, and the Cambridge University Nanoscience Centre, and operations will be run by an outfit at Newcastle University with experience of this sort of thing, Policy, Ethics and Life Sciences. I’m chairing the science advisory panel.

It’s too early to say much about the project yet, but I’ll be reporting on the process as it unfolds over the spring and summer. It’s unknown territory for me, but even this first meeting was fascinating. We had representatives from the NGOs Greenpeace and ETC, high-level representation from government and the research councils, and a few academics. Just getting this bunch round the table in the first place was impressive enough, but I was surprised at how easily the group was able to reach a consensus.

Debating nanotechnologies

To the newcomer, the nanotechnology debate must be very confusing. The idea of a debate implies two sides, but there are many actors debating nanotechnology, and they don’t even share a common understanding of what the word means. The following extended post summarises my view of this many-faceted discussion. Regular readers of Soft Machines will recognise all the themes, but I hope that newcomers will find it helpful to find them all in one place.

Nanotechnology has become associated with some very far-reaching claims. Its more enthusiastic adherents believe that it will be utterly transformational in its effects on the economy and society, making material goods of all sorts so abundant as to be essentially free, restoring the environment to a pristine condition, and revolutionising medicine to the point where death can be abolished. Nanotechnology has been embraced by governments all over the world as a source of new wealth, with the potential to take the place of information technology as a driver for rapid economic growth. Breathless extrapolations of a new, trillion-dollar nanotechnology industry arising from nowhere are commonplace. These optimistic visions have led to new funding being lavished on scientists working on nanotechnology, with the total amount being spent a subject of competition between governments across the developed world. As an antidote to all this optimism, NGOs and environmental groups have begun to mobilise against what they see as another example of excessive scientific and technological hubris, falling clearly in the tradition of nuclear energy and genetic modification – technologies which promised great things but delivered, in their view, environmental degradation and social injustice.

And yet, despite this superficial agreement on the transformational power of nanotechnology, whether for good or bad, there are profound disagreements not just about what the technology can deliver, but about what it actually is. The most radical visions originate from the writings of K. Eric Drexler, who wrote an influential and widely read book called “Engines of Creation”. This popularised the term “nanotechnology”, developing the idea that mechanical engineering principles could be applied on a molecular scale to create nano-machines which could build up any desired material or artefact with ultimate precision, atom by atom. It is this vision of nanotechnology, subsequently developed by Drexler in his more technical book Nanosystems, that has entered popular culture through films and science fiction books, perhaps most notably in Neal Stephenson’s novel “The Diamond Age”.

To many scientists, science fiction novels are where Drexler’s visions of nanotechnology should stay. In a falling-out which has become personally vituperative, leading scientific establishment figures, notably the Nobel Laureate Richard Smalley, have publicly ridiculed the Drexlerian project of shrinking mechanical engineering to molecular dimensions. What dominates the scientific research agenda is not the single Drexlerian vision, but instead a rather heterogeneous collection of technologies, whose common factor is simply a question of scale. These evolutionary nanotechnologies typically involve the shrinking down of existing technologies, notably in information technology, to smaller and smaller scales. Some of the products of these developments are already in the shops. The very small, high-density hard disk drives that are now found not just in computers, but in consumer electronics like MP3 players and digital video recorders, rely on the ability to create nanoscale multilayer structures which have entirely new physical properties, like giant magnetoresistance. Not yet escaped from the laboratory are new technologies like molecular electronics, in which individual molecules play the role of electronic components. Formidable obstacles remain before these technologies can be integrated to form practical devices that can be commercialised, but the promise is yet another dramatic increase in computing power. Medicine should also benefit from the development of more sophisticated drug delivery devices; this kind of nanotechnology will also play a major role in the development of tissue engineering.

What of the products that are already on shop shelves, boasting of their nanotechnological antecedents? There are two very well publicised examples. The active ingredient in some sunscreens consists of titanium dioxide crystals whose sizes are in the nanoscale range. In this size range, the crystals, and thus the sunscreen, are transparent to visible light, rather than having the intense white characteristic of the larger titanium dioxide crystals familiar in white emulsion paint. Another widely reported application of nanotechnology is in fabric treatments, which, by coating textile fibres with molecularly thin layers, give them properties such as stain resistance. These applications, although mundane, result from the principle that matter, when divided on this very fine scale, can have different properties to bulk matter. However, it has to be said that these kinds of products represent the further development of trends in materials science, colloid science and polymer science that have been in train for many years. This kind of incremental nanotechnology, then, does involve new and innovative science, but it isn’t different in character to other applications of materials science that may not carry the nano- label. To this extent, the decision to refer to these applications as nanotechnology involves marketing as much as science. But what we will see in the future is more and more applications of this kind making their way to the marketplace, offering real, if not revolutionary, advances over the products that have gone before. These developments won’t be introduced by a single “nanotechnology industry”; rather, these innovations will find their way into the products of all kinds of existing industries, often in a rather unobtrusive way.

The idea of a radical nanotechnology, along the lines mapped out by Drexler and his followers, has thus been marginalised on two fronts. Those interested in developing the immediate business applications of nanotechnology have concentrated on the incremental developments that are close to bringing products to market now, and are keen to downplay the radical visions because they detract from the immediate business credibility of their short-term offerings. Meanwhile the nano-science community is energetically pursuing a different evolutionary agenda. Is it possible that both scientists and the nanobusiness community are too eagerly dismissing Drexler’s ideas – could there be, after all, something in the idea of a radical nanotechnology?

My personal view is that while some of Smalley’s specific objections don’t hold up in detail, and it is difficult to dismiss the Drexlerian proposals out of hand as being contrary to the laws of nature, the practical obstacles they face are very large. To quote Philip Moriarty, an academic nanoscientist with a great deal of experience of manipulating single molecules, “the devil is in the details”, and as soon as one starts thinking through how one might experimentally implement the Drexlerian program a host of practical problems emerge.

But one aspect of Drexler’s argument is very important, and undoubtedly correct. We know that a radical nanotechnology, with sophisticated nanoscale machines operating on the molecular scale, can exist, because cell biology is full of such machines. This is beautifully illustrated in David Goodsell’s recent book Bionanotechnology: Lessons from Nature. But Drexler goes further. He argues that if nature can make effective nanomachines from soft and floppy materials, with the essentially random design processes of evolution, then the products of a synthetic nanotechnology, using the strongest materials and the insights of engineering, will be very much more effective. My own view (developed in my book “Soft Machines”) is that this underestimates the way in which biological nanotechnology exploits, and is optimised for, the peculiar features of the nanoscale world. To take just one example of a highly efficient biological nanomachine, ATP-synthase is a remarkable rotary motor which life-forms as different as bacteria and elephants all use to synthesise the energy storage molecule ATP. The efficiency with which it converts energy from one form to another is very close to 100%, a remarkable result when one considers that most human-engineered energy conversion devices, such as steam turbines and petrol engines, struggle to exceed 50% efficiency. This is one example, then, of a biological nanomachine that is close to optimal. The reason for this is that biology uses design principles very different to those we learn about in human-scale engineering, principles that exploit the special features of the nanoworld. There’s no reason in principle why we could not develop a radical nanotechnology that uses the same design principles as biology, but the result will look very different to the miniaturised cogs and gears of the Drexlerian vision. Radical nanotechnologies will be possible, then, but they will owe more to biology than to conventional engineering.
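Where does the near-100% figure come from? A rough version of the single-molecule argument, using commonly quoted values (treat the numbers as indicative rather than definitive): the F1 part of the motor generates a torque of about 40 pN nm and turns through 120° for each ATP, so the mechanical work per step is

    W \approx \tau\,\Delta\theta \approx 40\,\mathrm{pN\,nm}\times\frac{2\pi}{3} \approx 84\,\mathrm{pN\,nm}

while the free energy of ATP hydrolysis under cellular conditions, about 50 kJ/mol, comes to about 83 pN nm per molecule. Work out essentially equals free energy in – which is what “close to 100% efficiency” means here.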

Discussion of the possible impacts of nanotechnology, both positive and negative, has shown signs of becoming polarised along the same lines as the technical discussion. The followers of Drexler promise, on the one hand, a world of abundance of all material needs, and an end to disease and death. But they’ve also introduced perhaps the most persistent and gripping notion – the idea that artificial, self-replicating nanoscale robots would escape our control and reproduce indefinitely, consuming all the world’s resources, and rendering existing life extinct. The idea of this plague of “grey goo” has become firmly embedded in our cultural consciousness, despite some indications of regret from Drexler, who has more lately emphasised the idea that self-replication is neither a desirable nor a necessary feature of a nanoscale robot. The reaction of nano-scientists and business people to the idea of “grey goo” has been open ridicule. Actually, it is worth taking the idea seriously enough to give it a critical examination. Implicit in the notion of “grey goo” is the assumption that we will be able to engineer what is effectively a new form of life that is more fit, in a Darwinian sense, and better able to prosper in the earth’s environment, than existing life-forms. But if, as argued above, biology at the cell level is already close to optimal for the environment of the earth, the idea that synthetic nano-robots will have an effortless superiority over natural lifeforms is much more difficult to sustain.

Meanwhile, mainstream nanobusiness and nanoscience have concentrated on one very short-term danger: the possibility that new nanoparticles may be more toxic than their macroscale analogues and precursors. This fear is very far from groundless; since one of the major selling points of nanoparticles is that their properties may differ from those of the analogous matter in a less finely divided state, it isn’t at all unreasonable to worry that toxicity may be another property that depends on size. But I can’t help feeling that there is something odd about the way the debate has become so focused on this one issue; it’s an unlikely alliance of convenience between nanobusiness, nanoscience, government and the environmental movement, all of whom have different reasons for finding it a convenient focus. For the environmental movement, it fits a well-established narrative of reckless corporate interests releasing toxic agents into the environment without due care and attention. For nanoscientists, it’s a very contained problem which suggests a well-defined research agenda (and the need for more funding). By tinkering with regulatory frameworks, governments can be seen to be doing something, and nanobusiness can demonstrate its responsibility by active participation in the process.

The dominance of nanoparticle toxicity in the debate is a vivid illustration of a danger that James Wilsdon has drawn attention to – the tendency for all debates on the impact of science on society to end up exclusively focused on risk assessment. In the words of a pamphlet by Willis and Wilsdon – “See-through Science” – “in the ‘risk society’ perhaps the biggest risk is that we never get around to talking about anything else.” Nanotechnology – even in its evolutionary form – presents us with plenty of very serious things to talk about. How will privacy and civil liberties survive in a world in which every artefact, no matter how cheap, includes a networked computer? How will medical ethics deal with a blurring of the line between the human and the machine, and the line between remedying illness and enhancing human capabilities?

Some people argue that new technologies like nanotechnology are potentially so dehumanising that we should consciously relinquish them. Bill McKibben, for example, makes this case very eloquently in his book “Enough”. Although I have a great deal of sympathy with McKibben’s rejection of the values of the trans-humanists, who consciously seek to transcend humanity, I don’t think the basic premise of McKibben’s thesis is tenable. The technology we have already is not enough. Mankind depends on technology for its very existence at current population levels. To take just one example, our agriculture depends on the artificial fixation of nitrogen, which is made possible by the energy we derive from fossil fuels. And yet the shortcomings of our existing technologies are quite obvious, from the eutrophication caused by excessive use of synthetic fertilisers, to the prospect of global climate change as a result of our dependence on fossil fuels. As the population of the world begins to stabilise, we have the challenge of developing new technologies that will allow the whole population of the world to have decent standards of living on a sustainable basis. Nanotechnology could play an important role, for example by delivering cheap solar cells and the infrastructure for a hydrogen economy, together with cheap ways of providing clean water. But there will need to be real debates about how to set priorities so that the technology brings benefits to the poor as well as the rich.

Flat panel displays and organic semiconductors

Plastic electronics offers the possibility of making devices like flat screen displays, solar cells and logic circuits from semiconducting polymers, exploiting low-cost polymer processing techniques to make devices cheaply, in very large areas, on flexible substrates. But the existing technologies with which these would-be disruptive technologies are competing are also evolving very fast. It is in this context that the news that Covion Organic Semiconductors has been bought by the German chemical company Merck (press release here) for 50 million euros in cash is rather interesting. Covion is one of the few companies attempting to build a business making the semiconducting polymers that will be used in flat-panel displays based on light-emitting polymers, while Merck is the world’s largest producer of the liquid crystals used in flat-panel liquid crystal displays.

This news led to a little interest in the chemical industry trade press, as another step in the process by which the privately held speciality chemicals company Avecia is gradually being liquidated. Covion was wholly owned by Avecia, and the deal includes Avecia’s research effort in organic semiconductors. But there are a couple of lessons here for nanotechnology businesses.

The first is simply how difficult it can be for new technologies to catch up with the rapid, incremental development of existing ones. You don’t need a lot of research to see how rapidly the liquid crystal display industry has been developing; a few trips to the mall suffice to convince one that liquid crystal display TVs, which only a few years ago were an expensive curiosity, are plummeting in price and growing in area. Thus two of the three main potential advantages of polymer light-emitting diode displays – cost, and the ability to make large areas – are rapidly eroding. As the Merck chairman Bernhard Scheuble is quoted as saying in the press release: “It is apparent that liquid crystal displays will be the dominant flat-panel technology for some years to come … We see this acquisition as an opportunity to explore alternative technologies for the future, which is a prudent step for any market leader.”

It’s also interesting to look at Merck’s liquid crystal business, with which Covion will be integrated. Although this probably wouldn’t be generally recognised as a nanotechnology business, and it is certainly not described as one by Merck, it has some features that make it rather a good model for successful nanotech firms of the future. Its major product is a class of molecules whose value depends on the rather subtle nanoscale arrangements these molecules take up, and the way in which those arrangements are altered by interactions with a nanostructured surface and with applied fields. The business is large – sales are in excess of half a billion euros – but the physical quantity of material produced is tiny. I’d guess that the total annual world production of liquid crystals is less than one hundred tonnes, an amount that would fit into a couple of double garages. Most of us have some of this product in our houses, in our offices, in our cell-phones and laptops, but in minuscule quantities. And the business is stunningly profitable, with a return on sales of more than 50%. Why should it not be so? They’re selling combinations of mostly carbon, nitrogen, hydrogen and oxygen for a huge price which reflects, not the cost of the material or the cost of production, but the cost of the research and development, and the functional value that a tiny amount of the material can add to a desirable product.
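Taking the figures above at face value (my own rough arithmetic, for illustration only):

    \mathrm{price} \approx \frac{5\times10^{8}\ \text{euros}}{10^{5}\ \mathrm{kg}} = 5{,}000\ \text{euros per kg}

Commodity plastics, by contrast, sell for around a euro per kilogram; virtually all the value here is in the molecular design, not the tonnage.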

Another £200 million for nanotechnology in the UK

The UK government yesterday announced £200 million (US$380 million) of funding for nanotechnology over the next three years. The announcement came rather buried in yesterday’s press release accompanying the details of the breakdown of the science allocations from the 2004–2008 Comprehensive Spending Review.

There are a couple of caveats to be borne in mind when interpreting this figure. Firstly, as the precise wording is “Raising total DTI investment in nanotechnology research to £200 million”, we should probably assume that the £200m isn’t in addition to the £90m or so already announced – the new money is thus in the region of £110m. Secondly, this is only the spend on nanotechnology directly controlled by the Department of Trade and Industry. Most academic nanoscience is still supported by the research councils, particularly the EPSRC (whose roughly £0.5 billion annual budget sees healthy rises over the next few years, though these probably won’t be translated into a lot of new science).

I can’t say I look at this story without mixed feelings. It isn’t clear to me that the DTI has got its act together on its nanotechnology programme; the money spent so far seems to have gone on very short-term, rather niche, applications. The definition given in the press release doesn’t inspire confidence that there is much of a long-term vision: “Nanotechnology is the science of minute particles. Nanotechnology manipulates and controls these particles to create structures with unique properties, and promises advances in manufacturing, medicine and computing. Potential applications include medical dressings that kill off microbes, stain-free fabrics that repel liquids and self-cleaning windows.”

Directly reading DNA

As the success of the Human Genome Project has made clear, DNA stores information at very high density – about 15 atoms per bit of stored information. But, while biology has evolved some very sophisticated and compact ways of reading that information, we’re stuck with some clunky and expensive methods of sequencing DNA. Of course, driven by the Human Genome Project, the techniques have improved hugely, but it still costs about ten million dollars to sequence a mammal-sized genome (according to this recent press release from the National Institutes of Health). This needs to get much cheaper, not only to unlock the potential of personalised genomic medicine, but also if we are going to use DNA or analogous molecules as stores of information for more general purposes. One thousand dollars a genome is often mentioned as a target.
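Both figures are easy to sanity-check with rough arithmetic of my own (illustrative numbers only). Each base carries log₂4 = 2 bits, and a nucleotide residue contains roughly 30 atoms, giving about 15 atoms per bit. And at ten million dollars for a genome of roughly 3 × 10⁹ bases,

    \frac{\$10^{7}}{3\times10^{9}\ \text{bases}} \approx 0.3\ \text{cents per base}

so the thousand-dollar genome requires a cost reduction of about four orders of magnitude.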

Clearly, it would be great if we could simply manipulate a single DNA molecule and read out its sequence directly. One of the most promising approaches envisages threading the molecule through a nanoscale hole and measuring some property which changes according to which base is blocking the pore. A recent experiment shows that it is possible, in principle, to do this. The experiment is reported by Ashkenasy, Sanchez-Quesada and M. Reza Ghadiri, from Scripps, and Bayley, from Oxford, in a recent edition of Angewandte Chemie (Angew. Chem. Int. Ed. 44, p1401 (2005)) – the full paper can be downloaded as a PDF here. In this case the pore is formed by a natural pore-forming protein in a lipid membrane, and what is measured is the ion current across the membrane.
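To illustrate the principle (and only the principle), here is a toy sketch in Python. It assumes each base blocks the pore current to a distinct, well-separated level; the current values are entirely made up, and real nanopore signals are noisy, overlapping, and far harder to decode:

    # Toy nanopore base-caller: hypothetical blockade currents (pA) per base.
    LEVELS = {"A": 52.0, "C": 45.0, "G": 60.0, "T": 38.0}

    def call_base(current_pA):
        """Assign the base whose reference level is closest to the reading."""
        return min(LEVELS, key=lambda base: abs(LEVELS[base] - current_pA))

    def call_sequence(trace):
        """Translate one current reading per base into a sequence string."""
        return "".join(call_base(reading) for reading in trace)

    # A slightly noisy trace reads back as a five-base sequence.
    print(call_sequence([51.2, 37.5, 60.8, 44.1, 52.9]))  # -> ATGCA

The real difficulty, of course, is statistical: telling apart four overlapping current distributions, at speed, from a single molecule.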

This approach isn’t new; it originated with David Deamer at Santa Cruz and Dan Branton at Harvard (Branton’s website in particular is an excellent resource). A number of groups around the world are trying to do something similar, and several variations are possible, such as using an artificially engineered nanopore instead of a membrane protein, or using a probe other than the ion current. It feels to me as though this ought to work, and this latest demonstration is an important step along the path.