Nobels, Nanoscience and Nanotechnology

It’s interesting to see how various newspapers have reported the story of yesterday’s award of the physics Nobel prize to the discoverers of giant magnetoresistance (GMR). Most have picked up on the phrase used in the press release of the Nobel foundation, that this was “one of the first real applications of the promising field of nanotechnology”. Of course, this raises the question of what to make of all those things listed in the various databases of nanotechnology products, such as the famous sunscreens and stain-resistant fabrics.

References to iPods are compulsory, and this is entirely appropriate. It is quite clear that GMR is directly responsible for making possible the miniaturised hard disk drives on which entirely new product categories, such as hard disk MP3 players and digital video recorders, depend. The more informed papers (notably the Financial Times and the New York Times) have noticed that one name was missing from the award – Stuart Parkin – a physicist at IBM’s Almaden Research Center in California, who was arguably the person who took the basic discovery of GMR and did the demanding science and technology needed to make a product out of it.

The Nobel Prize for Chemistry announced today also highlights the relationship between nanoscience and nanotechnology. It went to Gerhard Ertl, of the Fritz-Haber-Institut in Berlin, for his contributions to surface chemistry. In particular, using the powerful tools of nanoscale surface science, he was able to elucidate the fundamental mechanisms operating in catalysis. For example, he worked out the basic steps of the Haber-Bosch process. A large proportion of the world’s population quite literally depends for their lives on the Haber-Bosch process, which artificially fixes nitrogen from the atmosphere to make the fertilizer on which the high crop yields that feed the world depend.

The two prizes illustrate the complexity of the interaction between science and technology. In the case of GMR, the discovery was one that came out of fundamental solid state physics. This illustrates how what might seem to the scientists involved to be very far removed from applications can, if the effect turns out to be useful, be very quickly exploited in products (though the science and technology needed to make this transition will itself often be highly demanding, and is perhaps not always appreciated enough). The surface science rewarded in the chemistry prize, by contrast, represents a case in which science is used, not to discover new effects or processes, but to understand better a process that is already technologically hugely important. This knowledge, in turn, can then underpin improvements to the process or the development of new, but analogous, processes.

Giant magnetoresistance – from the iPod to the Nobel Prize

This year’s Nobel Prize for Physics, it was announced today, has been awarded to Albert Fert, from Orsay, near Paris, and Peter Grünberg, from the Jülich research centre in Germany, for their discovery of giant magnetoresistance, an effect whereby a structure of layers of alternating magnetic and non-magnetic materials, each only a few atoms thick, has an electrical resistance that is very strongly changed by the presence of a magnetic field.
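The physics can be sketched with the standard two-current (Mott) resistor model, in which the two spin channels conduct through the multilayer independently. This is a toy illustration rather than a calculation for any real material, and the resistance values are made up for the example:

```python
# Toy two-current model of giant magnetoresistance: each spin channel
# is a series chain of layer resistances, and the two channels conduct
# in parallel. r_low and r_high are illustrative numbers only.

def parallel(a, b):
    """Combined resistance of two resistors in parallel."""
    return a * b / (a + b)

def gmr_ratio(r_low, r_high):
    """(R_AP - R_P) / R_P for a two-layer, two-spin-channel model."""
    # Parallel magnetisations: the majority-spin channel crosses two
    # low-scattering layers, the minority channel two high-scattering ones.
    r_p = parallel(2 * r_low, 2 * r_high)
    # Antiparallel magnetisations: each spin channel crosses one of each.
    r_ap = parallel(r_low + r_high, r_low + r_high)
    return (r_ap - r_p) / r_p

print(f"GMR ratio for a 5:1 scattering contrast: {gmr_ratio(1.0, 5.0):.2f}")
```

The model shows why the effect is “giant”: even a modest contrast between the scattering rates of the two spin species produces a large fractional change in resistance between the parallel and antiparallel states.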

The discovery was made in 1988, and at first seemed an interesting but obscure piece of solid state physics. But very soon it was realised that this effect would make it possible to make very sensitive magnetic read heads for hard disks. On a hard disk drive, information is stored as tiny patterns of magnetisation. The higher the density of information one is trying to store on a hard drive, the weaker the resulting magnetic field, and so the more sensitive the read head needs to be. The new technology was launched onto the market in 1997, and it is this technology that has made possible the ultra-high density disk drives that are used in MP3 players and digital video recorders, as well as in laptops.

The rapidity with which this discovery was commercialised is remarkable. One probably can’t rely on this happening very often, but this is a salutary reminder that sometimes discoveries can move from the laboratory to a truly industry-disrupting product very quickly indeed, if the right application can be found, and if the underlying technology (in this case the nanotechnology required for making highly uniform films only a few atoms thick) is in place.

Venter in the Guardian

The front page of yesterday’s edition of the UK newspaper the Guardian was, unusually, dominated by a science story: I am creating artificial life, declares US gene pioneer. The occasion for the headline was an interview with Craig Venter, who gave the paper advance word that his team had succeeded in transplanting a wholly synthetic genome into a stripped-down bacterium, replacing its natural genetic code with an artificial one. In the newspaper’s somewhat breathless words: “The Guardian can reveal that a team of 20 top scientists assembled by Mr Venter, led by the Nobel laureate Hamilton Smith, has already constructed a synthetic chromosome, a feat of virtuoso bio-engineering never previously achieved. Using lab-made chemicals, they have painstakingly stitched together a chromosome that is 381 genes long and contains 580,000 base pairs of genetic code.”

We’ll see what, in detail, has been achieved when the work is properly published. It’s significant, though, that this story was felt to be important enough to occupy most of the front page of a major UK newspaper at a time of some local political drama. Craig Venter is visiting the UK later this month, so we can expect the current mood of excitement or foreboding around synthetic biology to continue for a while yet.

George Whitesides interview in ACS Nano

The American Chemical Society has launched a new journal devoted to nanotechnology, ACS Nano, to accompany its existing, and very successful, letters journal, Nano Letters, about which I wrote a little while ago. In contrast to the short report format of Nano Letters, ACS Nano publishes full length papers about original research, together with some perspectives and editorial material. The journal is now on its second issue, and features an interesting interview (I think this is available without subscription) with one of the leading figures of US academic nanotechnology, Harvard’s George Whitesides.

The interview is worth reading in its entirety, but a few points are worth picking out. Firstly, contrary to the hype that has surrounded nanotechnology, Whitesides expresses some doubt that nanotechnology will ever have a revolutionary impact, in the sense of supplying a fundamentally new capability. He doesn’t doubt that it is “a big, big deal”, but sees its importance coming through enabling incremental developments in many different industries and sectors. His one possible exception to this pessimism lies in the far future: the ability, which nanoscale fabrication can facilitate, to exploit fundamentally quantum objects at room temperature. “we talk about quantum computation, and quantum entanglement, and quantum communications, and the concepts are there, but the realization is going to require nanotechnology to make it work. If there is something there (I don’t know whether there is), what we’re seeing now is the beginning of the materials base that will lead to that, and that could be revolutionary in some major way.”

Whitesides is famous, among other achievements, for inventing soft lithography, and he tells a rueful but instructive story about the original motivation for this new technology. In the mid-90’s, it was felt that the continued miniaturisation of electronic circuits was threatened by the limits on how much optical lithography could be scaled down. It turned out that this was a misconception, which greatly underestimated how effective the semiconductor industry would be at driving down the working length scale in incremental (though immensely clever) ways. Nonetheless, soft lithography found many other uses, exploiting its unique advantages. As Whitesides says, “you don’t know until you get into it, you find out what works”.

Finally, he has excellent advice for young scientists – whatever else you do, make sure the problems you are working on are the really important ones, even if they seem more difficult or challenging than less interesting ones on which you might feel you had a better chance of success. His logic is that it’s better to fail on an important problem than to succeed on a boring one.

Towards the $1000 human genome

It currently costs about a million dollars to sequence an individual human genome. One can expect incremental improvements in current technology to drop this price to around $100,000, but the need that current methods have to amplify the DNA will make it difficult for the price to drop further. So, to meet the widely publicised target of a $1000 genome, a fundamentally different technology is needed. One very promising approach uses the idea of threading a single DNA molecule through a nanopore in a membrane, and identifying each base by changes in the ion current flowing through the pore. I wrote about this a couple of years ago, and a talk I heard yesterday from one of the leaders in the field prompts me to give an update.

The original idea for this came from David Deamer and Dan Branton, who filed a patent for the general scheme in 1998. Hagan Bayley, from Oxford, whose talk I heard yesterday, has been collaborating with Reza Ghadiri from Scripps to implement this scheme using a naturally occurring pore-forming protein, alpha-hemolysin, as the reader.

The key issues are the need to get resolution at the single-base level, and the correct identification of the bases. They get extra selectivity through a combination of modifying the pore by genetic engineering and inserting small ring molecules – cyclodextrins – into the pore. At the moment reading speed is a problem – when the molecules are pulled through by an electric field they tend to go a little too fast. But in an alternative scheme, in which bases are chopped off the chain one by one and dropped into the pore sequentially, they are able to identify individual bases reliably.

Given that the human genome has about 6 billion bases (counting both copies of each chromosome), they estimate that at 1 millisecond reading time per base they’ll need to use 1000 pores in parallel to sequence a genome in under a day (taking into account the need for a certain amount of redundancy for error correction). To prepare the way for commercialisation of this technology, they have a start-up company – Oxford NanoLabs – which is working on making a miniaturised and rugged device, about the size of a palm-top computer, to do this kind of analysis.
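The estimate is easy to check with back-of-envelope arithmetic. The 6 billion figure is the diploid human genome; the tenfold coverage used below for the redundancy mentioned above is my own illustrative assumption, not a number from the talk:

```python
# Sanity check on the nanopore sequencing throughput estimate:
# total read time = (bases x coverage) x seconds per base / pores.

def sequencing_time_hours(bases=6e9, seconds_per_base=1e-3,
                          pores=1000, coverage=10):
    """Wall-clock hours to sequence a genome with the given parameters."""
    total_reads = bases * coverage  # each base read `coverage` times
    return total_reads * seconds_per_base / pores / 3600

print(f"Estimated time: {sequencing_time_hours():.1f} hours")
```

With these numbers the run comes in at well under a day, consistent with the claim in the talk.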

Stochastic sensor
Schematic of a DNA reader using the pore forming protein alpha-hemolysin. As the molecule is pulled through the pore, the ionic conduction through the pore varies, giving a readout of the sequence of bases. From the website of the Theoretical and Computational Biophysics group at the University of Illinois at Urbana-Champaign.

Soft Machines in Korean

Soft Machines Korean cover

My book “Soft Machines: nanotechnology and life” is now available in a Korean translation made by Dr Tae-Erk Kim, and published by Kungree, price 18,000 Won.

The publication of the English paperback version is imminent: in the UK, OUP is giving the publication date as October 2007 (OUP catalogue entry), with a price of £9.99. Readers in the USA will have to wait until December 17th, when their version will be priced at $17.99.

Three good reasons to do nanotechnology: 2. For healthcare and medical applications

Part 1 of this series of posts dealt with applications of nanotechnology for sustainable energy. Here I go on to describe why so many people are excited about the possibilities for applying nanotechnology in medicine and healthcare.

It should be no surprise that medical applications of nanotechnology are very prominent in many people’s research agendas. Despite near universal agreement about the desirability of more medical research, though, there are some tensions in the different visions people have of future nanomedicine. To the general public the driving force is often the very personal experience most people have of illness, in themselves or in people close to them, and there’s a lot of public support for more work aimed at the well-known killers of the western world, such as cardiovascular disease, cancer, and degenerative diseases like Alzheimer’s and Parkinson’s. Economic factors, though, are important for those responsible for supplying healthcare, whether that’s the government or a private sector insurer. Maybe it’s a slight exaggeration to say that the policy makers’ ideal would be for people to live in perfect health until they were 85 and then tidily drop dead, but it’s certainly true that the prospect of an ageing population demanding more and more expensive nursing care is one that is exercising policy-makers in a number of prosperous countries. In the developing world, there are many essentially political and economic issues which stand in the way of people being able to enjoy the levels of health we take for granted in Europe and the USA, and matters like the universal provision of clean water are very important. Important though the politics of public health is, the diseases that blight the developing world, such as AIDS, tuberculosis and malaria, still present major scientific challenges. Finally, back in the richest countries of the world, there’s a climate of higher expectations of medicine, in which people look to medicine to do more than fix obvious physical ailments, moving into the realm of human enhancement and the prolonging of life beyond what might formerly have been regarded as a “natural” lifespan.

So how can nanotechnology help? There are three broad areas.

1. Therapeutic applications of nanotechnology. An important area of focus for medical applications of nanotechnology has been drug delivery. This begins from the observation that when a patient takes a conventionally delivered drug, an overwhelmingly large proportion of the administered drug molecules don’t end up acting on the biological systems that they are designed to affect. This is a serious problem if the drug has side effects; the larger the dose that has to be administered to be sure that some of the molecule actually gets to the place where it is needed, the worse these side-effects will be. This is particularly obvious, and harrowing, for the intrinsically toxic molecules used as drugs for cancer chemotherapy. Another important driving force for improving delivery mechanisms is the fact that, rather than the simple and relatively robust small molecules that have been the main active ingredients in drugs to date, we are turning increasingly to biological molecules like proteins (such as monoclonal antibodies) and nucleic acids (for example, DNA for gene therapy and small interfering RNAs). These allow very specific interventions into biological processes, but the molecules are delicate, and are easily recognised and destroyed in the body. To deliver a drug, current approaches include attaching it to a large water-soluble polymer molecule which is essentially invisible to the body, or wrapping it up in a self-assembled nanoscale bag – a liposome – formed from soap-like molecules such as phospholipids or block copolymers. Attaching the drug to a dendrimer – a nanoscale treelike structure which may have a cavity in its centre – is conceptually midway between these two approaches.
The current examples of drug delivery devices that have made it into clinical use are fairly crude, but future generations of drug delivery vehicles can be expected to include “stealth” coatings to make them less visible to the body, mechanisms for targeting them to their destination tissue or organ and mechanisms for releasing their payload when they get there. They may also incorporate systems for reporting their progress back to the outside world, even if this is only the passive device of containing some agent that shows up strongly in a medical scanner.

Another area of therapeutics in which nanotechnology can make an impact is tissue engineering and regenerative medicine. Here it’s not so much a question of making artificial substitutes for tissues or organs as of providing the environment in which a patient’s own cells will develop in such a way as to generate new tissue. This is a question of persuading those cells to differentiate, taking up the specialised form of a particular organ. Our cells are social organisms, which respond to chemical and physical signals as they develop and differentiate to produce tissues and organs, and the role of nanotechnology here is to provide an environment (or scaffold) which gives the cells the right physical and chemical signals. Once again, self-assembly is one way forward here, providing soft gels which can be tagged with the right chemical signals to persuade the cells to do the right thing.

2. Diagnostics. Many disease states manifest themselves by the presence of specific molecules, so the ability to detect and identify these molecules quickly and reliably, even when they are present at very low concentrations, would be very helpful for the rapid diagnosis of many different conditions. The relevance of nanotechnology is that many of the most sensitive ways of detecting molecules rely on interactions between the molecule and a specially prepared surface; the much greater importance of the surface relative to the bulk in nanostructured materials makes it possible to make sensors of great sensitivity. Sensors for the levels of relatively simple chemicals, such as glucose or thyroxine, could be integrated with devices that release the chemicals needed to rectify any imbalances (these integrated devices go by the dreadful neologism of “theranostics”); recognising pathogens by recognising stretches of their DNA would give a powerful way of identifying infectious diseases without the need for time-consuming and expensive culturing steps. One obvious and much-pursued goal would be to find a way of reading a whole DNA sequence at the single-molecule level, making it possible to obtain an individual’s whole genome cheaply.
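The surface-to-volume argument is easy to quantify for an idealised spherical particle, for which the ratio of surface area to volume is 3/r; the particle sizes below are purely illustrative:

```python
# For a sphere, surface area / volume = (4*pi*r^2) / (4/3*pi*r^3) = 3/r,
# so shrinking the particle radius multiplies the relative surface area
# (and hence the number of surface binding sites per unit of material).

def surface_to_volume(radius_m):
    """Surface-to-volume ratio (per metre) of a sphere of given radius."""
    return 3.0 / radius_m

bulk = surface_to_volume(1e-3)   # a 1 mm grain
nano = surface_to_volume(10e-9)  # a 10 nm particle
print(f"Relative surface area gain on going nano: {nano / bulk:.0e}")
```

Going from a millimetre-sized grain to a 10 nm particle increases the surface-to-volume ratio a hundred-thousandfold, which is the underlying reason nanostructured sensor surfaces can reach such high sensitivities.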

3. Innovation and biomedical research. A contrarian point of view, which I’ve heard frequently and forcibly expressed by a senior figure from the UK’s pharmaceutical industry, is that the emphasis in nanomedicine on drug delivery is misguided, because fundamentally what it represents is an attempt to rescue bad drug candidates. In this view the place to apply nanotechnology is the drug discovery process itself. It’s a cause for concern for the industry that it seems to be getting harder and more expensive to find new drug candidates, and the hopes that were pinned a few years ago on the use of large scale combinatorial methods don’t seem to be working out. In this view, there should be a move away from these brute force approaches to more rational methods, but this time informed by the very detailed insights into cell biology offered by the single molecule methods of bionanotechnology.

Nanomandala

Martin Kemp’s “science in culture” column in this week’s Nature – Heaven in grains of sand (subscription required) – brings our attention to a collaboration between nanoscientists at UCLA and some Tibetan monks. This installation – Nanomandala – is based on a Tibetan mandala – a symbolic representation of the cosmos built up from individual grains of sand; nanoscientist Jim Gimzewski responds to the mandala by using optical and scanning electron microscopy to reveal its features on finer and finer scales, culminating in the molecular. In the resulting video installation by Victoria Vesna “visitors watch as images of a grain of sand are projected in evolving scale from the molecular structure of a single grain to the recognizable image of a pile of sand. On the atomic scale the sand particles are like atoms, but a thousand of times smaller. From a bottom-up method of visual image building, a sand mandala slowly emerges.”

Monks in nano-lab
Tibetan monks working with UCLA nanoscientist Jim Gimzewski

My own knowledge of Tibetan Buddhism (or indeed any other kind) is of the very superficial kind that came from growing up as a would-be bohemian teenager in provincial Britain – in Van Morrison’s words, “I went home and read my Christmas Humphreys book on Zen”. But I rather agree with Martin Kemp’s conclusion: “There is something very beautiful and moving in this holy alliance of Buddhist spiritual patience, founded on minute care and untiring repetition, and the unholy processes of iteration of which modern computers are capable. The mandala-makers and the nanoscientists share the wonder of scale, involving countless parts to compose the ordered whole.” The allusion to Blake that Kemp makes in the title of his piece makes the connection to Western mysticism too:

“To see a world in a grain of sand
And a heaven in a wild flower,
Hold infinity in the palm of your hand
And eternity in an hour”

Mandala
An 8ft sand mandala created at the Los Angeles County Museum of Art as part of the Nanomandala project.

Nanotechnology, water and development

What impact will nanotechnology make on the developing world? Some point to the possibility that nanotechnology might help solve pressing problems, such as the availability of clean water and more abundant renewable energy through cheap, nano-enabled solar cells. Others concede that these developments might be possible in principle, but that the political and economic barriers to development are more pressing than the technical ones.

An open meeting, to be held in London on November 7, will consider the issue. It’s organised by the thinktank Demos, and will involve NGOs, scientists and government representatives. Confirmed speakers include two scientists, Mark Welland, head of the Cambridge Nanoscience Centre, and me, as well as David Grimshaw, from the international development charity Practical Action, which was founded by E.F. Schumacher, the author of the famous book “Small is beautiful”.

In the meantime, a couple of interesting publications on the topic have appeared. The Demos project Nanodialogues had a section describing the results of a public engagement exercise carried out in Zimbabwe which explored the gulf between the reality of the water problems people face there and the more glib assurances that technical solutions will be easy. A much more detailed report, Nanotechnology, water and development, has been commissioned by the Meridian Institute, and written by Thembela Hillie and Mbhuti Hlope from South Africa, and Mohan Munasinghe and Yvani Deraniyagala from Sri Lanka. This explores a pair of case studies, and actually is quite positive in tone, concluding that “Developing countries are – on their own initiative – pursuing these technologies for both economic and humanitarian reasons. As the South African case study illustrates, developing countries are using existing nanotechnology products and are initiating nanotechnology projects to remove pollutants from water; the use of these technologies is not limited to developed countries.”

The new geography of innovation

One of the many interesting features of nanotechnology is that its development is taking place at a time when more and more research is being carried out in the fast developing countries of Asia. The extent of this shift is underlined by a recent piece of research publicised with the headline Western knowledge gap widens with shift to the East. The research, by Robert Huggins and Hiro Izushi, from the Universities of Sheffield and Aston, analysed the destination of the $50 billion invested in R&D by multinationals between 2002 and 2005. They found that 58% of this money was spent in Asia (concentrated in a few locations, such as Bangalore, Hyderabad, and Mumbai in India and Beijing, Guangzhou, Hangzhou and Shanghai in China), 22% in Europe and only 14% in North America. Since North America was the origin of 50% of this money, this prompts the authors to talk about a “net R&D investment deficit of US$18 billion” for North America (by the same definition, Europe has a smaller deficit of US$3 billion).
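The deficit figure follows directly from the shares quoted, as a one-line check confirms:

```python
# Reproducing the "net R&D investment deficit" arithmetic from the
# Huggins & Izushi figures: a $50bn total, with North America the
# origin of 50% of the spending but the destination of only 14%.
total_bn = 50.0
na_origin_share, na_dest_share = 0.50, 0.14
na_deficit_bn = (na_origin_share - na_dest_share) * total_bn
print(f"North America net R&D deficit: US${na_deficit_bn:.0f} billion")
```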

This strongly underlines the comment I made below in my article Nanotechnology and visions of the future (part 2): “Even the remaining large companies have embraced the concept of “open innovation”, in which research and development is regarded as a commodity to be purchased on the open market (and, indeed, outsourced to low cost countries) rather than a core function of the corporation.”