Against nanoethics

I spent a day the week before last in the decaying splendour of a small castle outside Edinburgh, in the first meeting of a working group considering the ethics of human enhancement. This is part of a European project on the ethics of nanotechnology and related technologies – Nanobioraise. It was a particular pleasure to meet Alfred Nordmann, of the Technical University of Darmstadt – a philosopher and historian of science who has written some thought-provoking things about nanotechnology and the debates surrounding it.

Nordmann’s somewhat surprising opening gambit was to say that he wasn’t really in favour of studying the ethics of human enhancement at all. To be more precise, he was very suspicious of efforts to spend a lot of time thinking about the ethics of putative long-term developments in science and technology, such as the transcendence of human limitations by human enhancement technologies, or an age of global abundance brought about by molecular nanotechnology. Among the reasons for his suspicion is a simple consideration of the opportunity cost of worrying about something that may never happen – “ethical concern is a scarce resource and must not be squandered on incredible futures, especially where on-going developments demand our attention.” But Nordmann also identifies some more fundamental problems with this way of thinking.

He identifies the central rhetorical trick of speculative ethics as an elision between “if” and “then”: we start out by identifying some futuristic possibility, along the lines of “if molecular nanotechnology (MNT) is possible”; then we derive some ethical consequence from it, “then we need to prepare for an age of global abundance, and adjust our economies accordingly”; and we take this as a mandate for action now, foreshortening the conditional. In this way, the demand for early ethical consideration lends credence to possible futures whose likelihood hasn’t yet been rigorously tested. This gives a false impression of inevitability, which shuts off the possibility that we can steer or choose the path that technology takes, and it distracts us from more pressing issues. It’s also notable that some of those most prone to this form of argument have a strong intellectual or emotional stake in the outcome in question.

His argument is partly developed in an unpublished article, “Ignorance at the Heart of Science? Incredible Narratives on Brain-Machine Interfaces”, which is well worth reading. It closes with a set of recommendations, referring back to an earlier EU report coordinated by Nordmann, Converging Technologies – Shaping the Future of European Societies, which recommends that:

  • “science policy attends also to the limits of technical feasibility, suggesting for example that one should scientifically scrutinize the all too naive assumptions, if not (citing Dan Sarewitz) ‘conceptual cluelessness’ about thought and cognition that underwrites the US report on NBIC convergence.
  • Along the same lines, a committee of historians and statisticians should produce a critical assessment of Ray Kurzweil’s thesis about exponential growth.
  • Also, as Jürgen Altmann has urged, we need an Academy report about the Drexlerian vision of nanotechnology – is molecular manufacturing a real possibility or not?
  • Finally and most generally, we need scientists and engineers who have the courage to publicly distinguish between what is physically possible and what is technically feasible.
  • As a citizen, I am overtaxed if I am to believe and even to prepare for the fact that humans will soon engineer everything that does not contradict outright a few laws of nature.”

    In short, Nordmann believes that nanoethics needs to be done more ethically.

    Five challenges for nano-safety

This week’s Nature has a Commentary piece (editor’s summary here, subscription required for the full article) from the great and good of nanoparticle toxicology, outlining what they believe needs to be done, in terms of research, to ensure that nanotechnology is developed safely. As they say, “fears over the possible dangers of some nanotechnologies may be exaggerated, but they are not necessarily unfounded,” and without targeted and strategic risk research, public confidence could be lost and innovation held up through fear of litigation.

    Their list of challenges is intended to form a framework for research over the next fifteen years; the wishlist is as follows:

  • Develop instruments to assess exposure to engineered nanomaterials in air and water, within the next 3–10 years.
  • Develop and validate methods to evaluate the toxicity of engineered nanomaterials, within the next 5–15 years.
  • Develop models for predicting the potential impact of engineered nanomaterials on the environment and human health, within the next 10 years.
  • Develop robust systems for evaluating the health and environmental impact of engineered nanomaterials over their entire life, within the next 5 years.
  • Develop strategic programmes that enable relevant risk-focused research, within the next 12 months.
Some might think it slightly odd that what amounts to a research proposal is being published in Nature. The authors give a positive reason for pressing this programme now: “Nanotechnology comes at an opportune time in the history of risk research. We have cautionary examples from genetically modified organisms and asbestos industries that motivate a real interest, from all stakeholders, to prevent, manage and reduce risk proactively.” Some indication of the potential downside of failing to be seen to move on this comes from the recent results of a citizens’ jury on nanotechnology in Germany, reported today here (my thanks to Niels Boeing for bringing this to my attention). These findings seem notably more sceptical than those of similar processes in the UK.

    Silicon and steel

Two of the most important materials underpinning our industrial society are silicon and steel. Without silicon, the material from which microprocessors and memory chips are made, there would be no cheap computers, and telecommunications would be hugely less powerful and more expensive. Steel is at the heart of most building and civil engineering, making possible both cars and trucks and the roads they run on. So I was struck, while reading Vaclav Smil’s latest book, Transforming the Twentieth Century (about which I may write more later), by some contrasting statistics for the two materials.

In the year 2000, around 846 million tonnes of steel was produced in the world, dwarfing the 20,000-tonne production of pure silicon. In terms of value, the comparison is a little closer – at around $600 a tonne, the annual production of steel was worth about $500 billion, compared with the $1 billion value of the silicon. Smil quotes a couple of other statistical nuggets, which may hold some valuable lessons for us when we’re considering the possible economic impacts of nanotechnology.

    Steel, of course, has been around a long time as a material, but it’s easy to overlook how significant technological progress in steel-making has been. In 1920, it took the equivalent of 3 hours of labour to make 1 tonne of steel, but by 1999, this figure had fallen to about 11 seconds – a one thousand-fold increase in labour productivity. When people suggest that advanced nanotechnologies may cause social dislocation, by throwing workers in manufacturing and primary industries out of work, they’re fighting yesterday’s battle – this change has already happened.
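As a quick sanity check, here is the arithmetic behind these comparisons, sketched in Python using only the figures quoted from Smil above (the script is my own illustration, not anything from the book):

```python
# Back-of-envelope check of the steel/silicon figures quoted from Smil (year 2000).
steel_tonnes = 846e6       # world steel production, tonnes
steel_price = 600.0        # dollars per tonne
print(f"steel output value: ${steel_tonnes * steel_price / 1e9:.0f} billion")  # ~$508 billion

silicon_tonnes = 20_000    # world pure-silicon production, tonnes
silicon_value = 1e9        # dollars, total
print(f"implied silicon price: ${silicon_value / silicon_tonnes / 1000:.0f} per kg")  # ~$50 per kg

# Labour productivity in steel-making: 3 hours per tonne (1920) vs about 11 seconds (1999).
print(f"productivity gain: {3 * 3600 / 11:.0f}-fold")  # ~982, i.e. roughly a thousand-fold
```

The implied $50 per kilo for silicon is, reassuringly, the same figure as the polysilicon price that appears in the production chain below.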

As for silicon, what’s remarkable about it is how costly it is, given that it’s made from sand. One can trace the addition of value through the production chain. Pure quartz costs around 1.7 cents a kilogram; after reduction to metallurgical-grade silicon the value has risen to $1.10 a kilo. This is transformed into trichlorosilane, at $3 a kilo, and then, after many purification processes, one has pure polycrystalline silicon at around $50 a kilo. Single-crystal silicon is then grown from this, leading to monocrystalline silicon rod worth more than $500 a kilo, which is then cut up into wafers. One of the predictions one sometimes hears about advanced nanotechnology is that it will be particularly economically disruptive, because it will allow anything to be made from abundant and cheap elements like carbon. But this example shows the extent to which the value of products doesn’t necessarily reflect the cost of the raw ingredients at all. In fact, in cases like this, involving complicated transformations carried out with high-tech equipment, it’s the capital cost of the plant that is most important in determining the cost of the product.
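Tabulating that chain makes the point starkly; again, a small Python sketch of the numbers just quoted:

```python
# Value added along the silicon production chain, in dollars per kg (figures as quoted).
chain = [
    ("pure quartz",            0.017),
    ("metallurgical-grade Si", 1.10),
    ("trichlorosilane",        3.00),
    ("polycrystalline Si",     50.0),
    ("monocrystalline rod",    500.0),  # "more than $500 a kilo"
]
for (_, before), (stage, after) in zip(chain, chain[1:]):
    print(f"{stage:>24}: x{after / before:6.1f} over the previous step")
print(f"{'quartz to rod, overall':>24}: x{chain[-1][1] / chain[0][1]:,.0f}")  # ~x29,400
```

Almost a thirty-thousand-fold markup between sand and wafer-ready rod, nearly all of it added by processing rather than by the raw material.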

    Nature Nanotechnology

I’ve been meaning to write for a while about the new journal from the Nature stable – Nature Nanotechnology (there’s complete free web access to this first edition). I’ve written before about the importance of scientific journals in helping relatively unformed scientific fields to crystallise, and the fact that this journal comes with the imprint of the very significant “Nature” brand means that its editorial policy will have a big impact on the way the field unfolds over the next few years.

Nature is, of course, one of the two rivals for the position of most important and influential science publication in the world. Its US rival is Science. While Science is published by the non-profit American Association for the Advancement of Science, Nature, for all its long history, is a ruthlessly commercial operation, run by the British publishing company Macmillan. As such, it has recently been expanding its franchise to include a number of single-subject journals, starting with biological titles like Nature Cell Biology, moving into the physical sciences with Nature Materials and Nature Physics, and now adding Nature Nanotechnology. Given that just about everybody is predicting the end of printed scientific journals in the face of web-based preprint servers and open access models, how, one might ask, do they expect to make money out of this? The answer is an interesting one: it is to emphasise some old-fashioned publishing values, like the importance of a strong editorial hand, the value of selectivity, and the role of design and variety. These journals are nice physical objects, printed on paper of good enough quality to read in the bath, and they have a thick front section, with general-interest articles and short reviews, in addition to the highly selective set of research papers at the back of the journal. What the subscriber pays for (and their marketing is heavily aimed at individual subscribers rather than research libraries) is the judgement of the editors in selecting the handful of outstanding papers in their field each month. The formula has, in the past, been successful, at least to the extent that the Nature journals have consistently climbed to the top of their subject league tables in the impact of the papers they publish.

So how is Nature Nanotechnology going about defining its field? This is an interesting question, in that at first sight there looks to be considerable overlap with existing Nature group journals. Nature Materials, in particular, has already emerged as a leading journal in areas like nanostructured materials and polymer electronics, which are often included in wider definitions of nanotechnology. It’s perhaps too early to be making strong judgements about editorial policy, but the first issue has a strong emphasis on truly nanoscale devices, with a review article on molecular machines, and the lead article describing a SQUID (superconducting quantum interference device) based on a single nanotube. The front material makes a clear statement about the importance of wider societal and environmental issues, with an article from Chris Toumey about the importance of public engagement, and a commentary from Vicki Stone and Ken Donaldson on the relationship between nanoparticle toxicity and oxidative stress.

    I should declare an interest, in that I have signed up to write a regular column for Nature Nanotechnology, with my first piece to appear in the November edition. The editor is clearly conscious enough of the importance of new media to give me a contract explicitly stating that my columns shouldn’t also appear on my blog.

    The Royal Society’s verdict on the UK government’s nanotech performance

The UK’s science and engineering academies – the Royal Society and the Royal Academy of Engineering – were widely praised for their 2004 report on nanotechnology – Nanoscience and nanotechnologies: opportunities and uncertainties – which was commissioned by the UK government. So it’s interesting to see, two years on, how they think the government is doing at implementing their suggestions. The answer is given in a surprisingly forthright document, published a couple of days ago, which is their formal submission to the review of UK nanotechnology policy by the Council for Science and Technology. The press release that accompanies the submission makes their position fairly clear. Ann Dowling, the chair of the 2004 working group, is quoted as saying “The UK Government was recognised internationally as having taken the lead in encouraging the responsible development of nanotechnologies when it commissioned our 2004 report. So it is disappointing that the lack of progress on our recommendations means that this early advantage has been lost.”

    Nanotechnology and the food industry

    The use of nanotechnology in the food industry seems to be creeping up the media agenda at the moment. The Times on Saturday published an extended article by Vivienne Parry in its “Body and Soul” supplement, called Food fight on a tiny scale. As the title indicates, the piece is framed around the idea that we are about to see a rerun of the battles about genetic modification of food in the new context of nano-engineered foodstuffs. Another article appeared in the New York Times a few weeks ago: Risks of engineering a better ice cream.

Actually, apart from the rather overdone references to a potential consumer backlash, both articles are fairly well informed. The body of Vivienne Parry’s piece, in particular, makes it clear why nanotechnology in food presents a confusingly indistinct and diffuse target. Applications in packaging, for example in improving the resistance of plastic bottles to gas permeation, are already with us and are relatively uncontroversial. Longer-range visions of “smart packaging” also offer potential consumer benefits, but may have downsides yet to be fully explored. More controversial, potentially, is the question of the addition of nanoscaled ingredients to food itself.

    But this issue is very problematic, simply because so much of food is made up of components which are naturally nanoscaled, and much of traditional cooking and food processing consists of manipulating this nanoscale structure. To give just one example, the traditional process of making whey cheeses like ricotta consists of persuading whey proteins like beta-lactoglobulin to form nanoparticles each containing a small number of molecules, and then getting those nanoparticles to aggregate in an open, gel structure, giving the cheese its characteristic mechanical properties. The first example in the NY Times article – controlling the fat particle size in ice cream to get richer feeling low fat ice cream – is best understood as simply an incremental development of conventional food science, which uses the instrumentation and methodology of nanoscience to better understand and control food nanostructure.

There is, perhaps, more apparent ground for concern with food additives that are prepared in a nanoscaled form and directly added to foods. The kinds of molecules we are talking about here are those which add colour, flavour and aroma, and, increasingly, those which seem to confer some kind of health benefit. One example is the substance lycopene, which is available from the chemical firm BASF as a dispersion of particles a few hundred nanometres in size. Lycopene is the naturally occurring dye molecule that makes tomatoes red, and there is increasing evidence of its health benefits (hence the unlikely-sounding claim that tomato ketchup is good for you). Like many other food component molecules, it is not soluble in water, but it is soluble in fat (as anyone who has cooked a tomato sauce based on olive oil or butter will know). Hence, if one wants to add it to a water-based product, like a drink, one needs to disperse it very finely for it to be available to be digested.

One can expect, then, more products of this kind, in which a nanoscaled preparation is used to deliver a water- or oil-soluble ingredient, often of natural origin, which on being swallowed will be processed by the digestive system in the normal way. What about engineered nanoparticles that are soluble in neither oil nor water – the kind that have raised toxicity concerns in other contexts? These are typically inorganic materials, like carbon in its fullerene forms, or titanium dioxide, as used in sunscreen, or silica. Some of these inorganic materials are used in the form of micron-scale particles as food additives. It is conceivable (though I don’t know of any examples) that nanoscaled versions might be used in food, and that these might fall within a regulatory gap in the current legal framework. I talked about the regulatory implications of this, in the UK, a few months ago in the context of a consultation document issued by the UK’s Food Standards Agency. The most recent research report from the UK government’s Nanotechnology Research Coordination Group reveals that the FSA has commissioned a couple of pieces of research on this, but the FSA informs me that it’s too early to say much about what these projects have found.

I’m guessing that the media interest in this area has arisen largely from some promotional activity from the nanobusiness end of things. The consultancy Cientifica recently released a report, Nanotechnologies in the food industry, and there’s a conference in Amsterdam this week on Nano and Microtechnologies in the Food and Healthfood Industries.

    I’m on my way to London right now, to take part in a press briefing on Nanotechnology in Food at the Science Media Centre. My family seems to be interacting a lot with the press at the moment, but I don’t suppose I’ll do as well as my wife, whose activities last week provoked this classic local newspaper headline in the Derbyshire Times: School Axe Threat Fury. And people complain about scientific writing being too fond of stacked nouns.

For Spanish-speaking readers

A couple of weeks ago, Spanish television broadcast an extended interview with me by the academic, writer, and broadcaster Eduardo Punset (bio in English here). This is the interview I gave on my visit to Sevilla a few months ago. A full transcript of the interview, in Spanish, is now available on the website of Radio Televisión Española.

    Does “Soft Machines” present arguments for Intelligent Design?

    I’m normally pretty pleased when my book Soft Machines gets any kind of notice, but a recent rather favourable review of it leaves me rather troubled. The review is on the website of a new organisation called Truth in Science, whose aim is “to promote good science education in the UK”. This sounds very worthy, but of course the real aim is to introduce creationist thinking into school science lessons, under the guise of “teaching the controversy”. The controversy in question is, of course, the suggestion that “intelligent design” is a real scientific alternative to the Darwinian theory of evolution as an explanation of the origin and development of life.

The review approvingly quotes a passage from Soft Machines about the lack of evidence for how the molecular machine ATP synthase developed, as evidence that Darwinian theory has difficulties. Luckily, my Darwinian credentials aren’t put in doubt – the review goes on to say “Despite the lack of hard evidence for how molecules are meant to have evolved via natural selection, Jones believes that evolution must have occurred because it is possible [to] re-create a sort of molecular evolution ‘in silico’ – or via computer simulation. However, as more is discovered about the immense complexity of molecular systems, such simulations become increasing[ly] difficult to swallow.” This is wrong on a couple of counts. Firstly, as Soft Machines describes, we have real experiments – not in-silico ones – notably from Sol Spiegelman, that show that molecules really can evolve. The second point is more subtle and interesting. Actually, there’s a strong argument that it is precisely in complex molecular systems that Darwinian evolution’s real power is seen. It’s in searching the huge, multidimensional conformational spaces that define the combinatorially vast number of possible protein conformations, for example, that evolution is so effective.
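To make that second point concrete, here is a deliberately minimal toy illustration of in-silico evolution – my own sketch in Python, with an arbitrary target sequence and a crude select-and-mutate scheme, and nothing to do with Spiegelman’s actual experiments. The space of ten-residue sequences holds about ten trillion possibilities, far beyond any exhaustive search, yet cumulative selection finds the target after sampling only a few tens of thousands of them:

```python
import random

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"   # the twenty amino-acid letters
TARGET = "MKTAYIAKQR"               # an arbitrary, hypothetical ten-residue "optimum"
# The search space holds 20**10 (about 10**13) sequences - hopeless to enumerate,
# but not hopeless for mutation plus selection, which rewards partial matches.

def fitness(seq):
    """Count the positions at which seq matches the target."""
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rate=0.1):
    """Randomly change each residue with the given probability."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in seq)

population = ["".join(random.choices(ALPHABET, k=len(TARGET))) for _ in range(100)]
for generation in range(1000):
    best = max(population, key=fitness)
    if fitness(best) == len(TARGET):
        print(f"found {best} after {generation} generations")
        break
    population = [mutate(best) for _ in range(100)]  # select the fittest, repopulate with mutants
```

Real fitness landscapes are, of course, vastly more rugged than this smooth toy one, but the combinatorial point stands: selection makes the search cumulative rather than random.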

The review signs off with a reiteration of a very old argument about design: “In the final chapter, ‘Our nanotechnological future’, Jones acknowledges that our ‘…only true example of a nanotechnology…is cell biology…’. Could that lead to an inference of design?” Maybe, like many scientists, I have brought this sort of comment on myself by talking extensively about “Nature’s design principles”. The point, though, is that evolution is a design method, and a very powerful one (so powerful that we’re seeing more use of it in entirely artificial contexts, such as software engineering). However, design doesn’t necessarily need a designer.

“Truth in Science” may present itself as simply wishing to encourage a critical approach to evaluating competing scientific theories, but a little research reveals the true motives of its sponsors. The first name on the Board of Directors is Andy McIntosh, Professor of Thermodynamics and Combustion Science at Leeds University. Far from being a disinterested student of purported controversies in evolutionary theory, this interview reveals him to be a young-earth creationist:
    “So you believe in a world created about 6,000 years ago, cursed on account of sin, then devastated by Noah’s Flood?
    “Absolutely. There’s nothing in real science (if you take all the assumptions into account) to contradict that view.”

    I don’t have a problem if people want to believe in the literal truth of either of the creation stories in Genesis. But I don’t think it is honest to pretend that a belief which, in reality, is based on faith, has any relationship to science, and I think it’s quite wrong to attempt to have these beliefs insinuated into science education in publicly funded schools.

    ETC makes the case against nanomedicine

The most vocal and unequivocal opponent of nanotechnology – the ETC group – has turned its attention to nanomedicine, with a new report, Nanotech Rx, taking a sceptical look at the recent shift of emphasis we’ve seen towards medical applications of nanotechnology. The report, though, makes more sense as a critique of modern medicine in general than as a specific analysis of nanotechnology. Particularly in the context of health in the third world, the main thrust of the case is that enthusiasts of technocentric medicine have systematically underplayed the importance of non-technological factors (hygiene, better food, and so on) in improving general health. As they say, “the global health crisis doesn’t stem from a lack of science innovation or medical technologies; the root problem is poverty and inequality. New medical technologies are irrelevant for poor people if they aren’t accessible or affordable.” However, in an important advance from ETC’s previous blanket opposition to nanotechnology, they do concede that “nanotech R&D related to water is potentially significant for the developing world. Access to clean water could make a greater contribution to global health than any single medical intervention.”

The debate about human enhancement also gets substantial discussion, with a point of view strongly influenced by the disability rights activist Gregor Wolbring. (Newcomers to this debate could do a lot worse than to start with the recent Demos pamphlet, Better Humans?, which collects essays from a variety of points of view, including Wolbring’s own.) ETC correctly identifies the crypto-transhumanist position taken in some recent government publications, and gets succinctly to the nub of the matter as follows: “Certain personality traits (e.g., shyness), physical traits (e.g., “average” strength or height), cognitive traits (e.g., “normal” intelligence) will be deemed undesirable and correctable (and gradually unacceptable, not to be tolerated). The line between enhancement and therapy – already blurry – will be completely obliterated.” I agree that there’s a lot to be concerned about here, but the issue as it now stands doesn’t have a lot to do with nanotechnology – current points of controversy include the use of SSRIs to “treat” shyness, and of modafinil to allow soldiers to go without sleep. However, in the future nanotechnology certainly will become increasingly important in permitting human enhancement, in areas such as the development of interfaces with the brain and in regenerative medicine, and so it’s not unreasonable to flag the area as one to watch.

Naturally, the evils of big pharma get a lot of play. There are the well-publicised difficulties big pharma seems to have in maintaining its accustomed level of innovation, the large marketing budgets and the concentration on “me-too” drugs for the ailments of the rich West, and the increasing trend to outsource clinical trials to third-world countries. Again, these are all very valid concerns, but they don’t seem to have a great deal of direct relevance to nanotechnology.

In the context of the third world, one of the most telling criticisms of the global pharmaceutical industry has been its lack of R&D spending on diseases that affect the poor. Things have recently changed greatly for the better, thanks to Bill and Melinda Gates and their ilk. ETC recognise the importance of public–private partnerships (PPPs) of the kind supported by organisations like the Bill and Melinda Gates Foundation, despite some evident distaste that this money has come from the disproportionately rich: “Ten years ago, there was not a single PPP devoted to the development of ‘orphan drugs’ – medicines to treat diseases with little or no financial profit potential – and today there are more than 63 drug development projects aimed at diseases prevalent in the global South.” As an example of a Gates-supported project, ETC cite one to develop a new synthetic route to the anti-malarial agent artemisinin. This is problematic for ETC, as the project uses synthetic biology, to which ETC is instinctively opposed; yet since artemisinin-based combination treatments seem to be the only effective way of overcoming the problem of drug-resistant malaria, it is difficult to argue that these treatments shouldn’t be universally available.

The sections of the report that are directly concerned with those areas of nanomedicine currently receiving the most emphasis seem rather weak. The section on the use of nanotechnology for drug delivery discusses only one example, a long way from the clinic, and doesn’t really comment at all on the current big drive to develop new anti-cancer therapies based on nanotechnology. I’m also surprised that ETC don’t talk more about the current hopes for the widespread application of nanotechnology in diagnostics and sensor devices, not least because this raises some important issues about the degree to which diagnosis can simply be equated to the presence or absence of some biochemical marker.

At the end of all this, ETC are still maintaining their demand for a “moratorium on nanotechnology”, though this seems at odds with statements like: “Nanotech R&D devoted to safe water and sustainable energy could be a more effective investment to address fundamental health issues.” I actually find more to agree with in this report than in previous ETC reports. And yet I’m left with the feeling that, even more than before, ETC has not managed to get to the essence of what makes nanotechnology special.

    Is nanoscience different from nanotechnology?

In discussions of nanotechnology, it has now become conventional to distinguish between nanoscience and nanotechnology. One definition that is very widely used is the one introduced by the 2004 Royal Society report, which defined the terms thus:

    “Nanoscience is the study of phenomena and manipulation of materials at atomic, molecular and macromolecular scales, where properties differ significantly from those at a larger scale. Nanotechnologies are the design, characterisation, production and application of structures, devices and systems by controlling shape and size at nanometre scale.”

    This echoed the definitions introduced earlier in the 2003 ESRC report, Social and Economic Challenges of Nanotechnology (PDF), which I coauthored, in which we wrote:

    “We should distinguish between nanoscience, which is here now and flourishing, and nanotechnology, which is still in its infancy. Nanoscience is a convergence of physics, chemistry, materials science and biology, which deals with the manipulation and characterisation of matter on length scales between the molecular and the micron-size. Nanotechnology is an emerging engineering discipline that applies methods from nanoscience to create usable, marketable, and economically viable products.”

And this formulation was itself certainly derivative; I was strongly influenced at the time by a very similar formulation from George Whitesides.

    Despite having played a part in propagating this conventional wisdom, I’m now beginning to wonder how valid or helpful the distinction between nanoscience and nanotechnology actually is. Increasingly, it seems to me that the distinction tends to presuppose a linear model of technology transfer. In this picture, which was very widely held in post-war science policy discussions, we imagine a simple progression from fundamental research, predominantly curiosity driven, through a process of applied research, by which possible applications of the knowledge derived from fundamental science are explored, to the technological development of these applications into products or industrial processes. What’s wrong with this picture is that it doesn’t really describe how innovations in the history of technology have actually occurred. In many cases, inventions have been put into use well before the science that explains how they work was developed (the steam engine being one of many examples of this), and in many others it is actually the technology that has facilitated the science.

Meanwhile, the way science and technology is organised has changed greatly from the situation of the 1950s, ’60s and ’70s. At that time, a central role both in the generation of pure science and in its commercialisation was played by the great corporate laboratories, like AT&T’s Bell Labs in the USA and, in the UK, the central laboratories of companies like ICI and GEC. For better or worse, these corporate labs have disappeared, or been reduced to shadows of their former size, as deregulation and global competition have stripped away the monopoly rents that ultimately financed them. Without the corporate laboratories to broker the process of taking innovation from the laboratory to the factory, we are left with a much more fluid and confusing situation, in which there’s much more pressure on universities to move beyond pure science, to find applications for their research, and to convert this research into intellectual property to provide future revenue streams. Small research-based companies spring up whose main assets are their intellectual property and the knowledge of their researchers, and bigger companies talk about “open innovation”, in which invention is just another function to be outsourced.

A useful concept for understanding the limitations of the linear model in this new environment is the idea of “Mode II knowledge production” (introduced, I believe, by Gibbons, M. et al. (1994) The New Production of Knowledge, London: Sage). Mode II science is fundamentally interdisciplinary, and motivated explicitly by applications rather than by the traditional discipline-based criteria of academic interest. These applications don’t necessarily have to be immediately convertible into something marketable; the distinction is that in this kind of science one is motivated not by exploring or explaining some fundamental phenomenon, but by the drive to make some device or gadget that does something interesting (nano-gizmology, as I’ve called this phenomenon in the past).

So in this view, nanotechnology isn’t simply the application of nanoscience. Its definition is as much sociological as scientific. Prompted, perhaps, by observing the material success of many academic biologists who’ve founded companies in the biotech sector, and motivated by changes in academic funding climates and the wider research environment, physicists, chemists and materials scientists have taken a much more aggressively application-driven and commercially oriented approach to their science. Or, to put it another way, nanotechnology is simply the natural outcome of an outbreak of biology envy amongst physical scientists.