It seems likely that nanotechnology will move a little higher up the UK news agenda towards the end of this week – tomorrow sees the launch event for the results of a citizens’ panel run by the consumer group Which?. This will be quite a high-profile event, with a keynote speech by the Science Minister, Ian Pearson, outlining current UK nanotechnology policy. This will be the first full statement on nanotechnology at Ministerial level for some time. I’m on the panel responding to the findings, which I will describe tomorrow.
Scooby Doo, nano too
Howard Lovy returns to his coverage of nanotechnology in popular culture with news of a forthcoming film, Nano Dogs the Movie, in which some lovable family pets acquire super abilities after scoffing some carelessly abandoned nanobots. Not to be outdone, I’ve been conducting my own in-depth cultural research, which has revealed that no less an icon of Saturday morning children’s TV than Scooby Doo has fully entered the nanotechnology age.
In the current retooling of this venerable cartoon, Shaggy and Scooby Doo Get a Clue, the traditional plot standbys (it was the janitor, back-projecting the ghostly figures onto the clouds, and he’d have got away with it if it hadn’t been for those meddling kids) have been swept away to be replaced by an evil, nanobot-wielding scientist. But the nanobots aren’t all bad; Scooby Doo’s traditionally energising Scooby snacks have themselves been fortified with nanobots, giving him a number of super-dog powers.
I wasn’t able to follow all the plot twists on Sunday morning, as I had to cook the children’s porridge, but it seems that the imprudent nano-scientist had attempted to misuse his nanobots in order to make his appearance (formerly plump, ageing, balding and with a bad haircut, as you’d expect) more, well, Californian. Naturally, this all ended badly. I’ve seen some less incisive commentaries on the human (or, indeed, canine) enhancement debate.
Decelerating change?
Everyone knows the first words spoken by a man on the moon, but what were the last words? This isn’t just a good pub quiz question, it’s also an affront to the notion that technological progress moves inexorably forward. To critics of the idea that technology is relentlessly accelerating, the fact that space travel now constitutes a technology that the world has essentially relinquished is a prime argument against the idea of inevitable technological progress. The latest of such critics is David Edgerton, whose book The Shock of the Old is now out in paperback.
Edgerton’s book has many good arguments, and serves as a useful corrective to the technological determinism that characterises quite a lot of discussion about technology. His aim is to give a history of innovation which de-emphasises the importance of invention, and to this end he helpfully draws attention to the importance of those innovations which occur during the use and adaptation of technologies, often quite old ones. One very important thing this emphasis on innovation in use does is bring into focus neglected innovations of the developing world, like the auto-rickshaw of India and Bangladesh and the long-tailed boat of Thailand.

This said, I couldn’t help finding the book frequently rather annoying. Its standard rhetorical starting point is to present, generally without any reference, a “standard view” of the history of technology that wouldn’t be shared by anyone who knows anything about the subject: a series of straw men, in other words. This isn’t to say that there aren’t a lot of naive views about technology in wide circulation, but to suggest, for example, that it is the “conventional story” that the atomic bomb was the product of academic science, rather than the gigantic military-industrial engineering activity of the Manhattan Project, seems particularly far-fetched.
The style of the book is essentially polemic and anecdotal, the statistics that buttress the argument tending to be of the factoid kind (such as the striking assertion that the UK is home to 3.8 million unused fondue sets). In this and many other respects I found it a much less satisfying book than Vaclav Smil’s excellent 2-volume history of modern technology, Transforming the Twentieth Century: Technical Innovations and Their Consequences and Creating the Twentieth Century: Technical Innovations of 1867-1914 and Their Lasting Impact. These books reach similar conclusions, though Smil’s arguments are supported by substantially more data and carry a greater impact for being less self-consciously contrarian.
Smil’s view – and I suspect that Edgerton would share it, though I don’t think he states it so explicitly – is that the period of history in which there was the greatest leap forward in technology wasn’t the present day, but the thirty or forty years of the late 19th and early 20th century that saw the invention of the telephone, the automobile, the aeroplane, electricity, mass production, and most important of all, the Haber-Bosch process.

What then of that symbol of what many people think of as the current period of accelerating change – Moore’s law? Moore’s law is an observation about exponential growth of computer power with time, and one should start with an obvious point about exponential growth – it doesn’t come from accelerating change, but from constant fractional change. If you are able to improve a process by x% a year, you get exponential growth. Moore’s law simply tells us that the semiconductor industry has been immensely successful at implementing incremental improvements to its technology, albeit at a rapid rate. Stated this way, Moore’s law doesn’t seem so out of place in Edgerton’s narrative of technology as being dominated, not by dramatic new inventions, but by many continuous small improvements in technologies old and new. This story, though, also makes clear how difficult it is to predict, before several generations of this kind of incremental improvement, which technologies are destined to have a major and lasting impact and which ones will peter out and disappoint their proponents. For me, therefore, the lesson to take away is not that new developments in science and technology might not have major and lasting impacts on society; it is simply that some humility is needed when one tries to identify in advance what will have lasting impact and what those impacts will end up being.
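To make that arithmetic concrete, here is a minimal sketch in Python (with illustrative numbers of my own, not industry figures) of how a constant fractional improvement each year compounds into exponential growth, and of the doubling time it implies:

```python
import math

def years_to_double(annual_improvement_pct):
    """Doubling time implied by a constant percentage improvement each year."""
    growth_factor = 1 + annual_improvement_pct / 100.0
    return math.log(2) / math.log(growth_factor)

def improvement_after(annual_improvement_pct, years):
    """Total improvement factor from compounding the same percentage gain every year."""
    return (1 + annual_improvement_pct / 100.0) ** years

# A steady improvement of roughly 41% a year doubles capability about every two
# years, which is the usual statement of Moore's law; no acceleration is required.
print(years_to_double(41))        # ~2.0 years
print(improvement_after(41, 10))  # ~31x over a decade
```

The point is simply that the headline exponential comes from compounding a fixed annual gain; nothing in it requires the rate of improvement itself to speed up.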
On December 14th, 1972, Eugene A. Cernan spoke the last words by a man on the moon: “OK Jack, let’s get this mutha outta here.”
The Tata Nano
The Tata Nano – the newly announced one lakh (100,000 rupees) car from India’s Tata group – hasn’t got a lot to do with nanotechnology (see this somewhat bemused and bemusing piece from the BBC), but since it raises some interesting issues I’ll use the name as an excuse to discuss it here.
The extensive media coverage in the Western media has been characterised by some fairly outrageous hypocrisy – for example, the UK’s Independent newspaper wonders “Can the world afford the Tata Nano?” (The answer, of course, is that what the world can’t afford are the much bigger cars parked outside all those prosperous Independent readers’ houses). With a visit to India fresh in my mind, it’s completely obvious to me why all those families one sees precariously perched on motor-bikes would want a small, cheap, economical car, and not at all obvious that those of us in the West, who are used to enjoying on average 11 times (for the UK) or 23 times (for the USA) more energy per head than the Indians, have any right to complain about the extra carbon dioxide emissions that will result. It’s almost certainly true that the world couldn’t sustain a situation in which all its 6.6 billion population used as much energy as the Americans and Europeans; the way that equation will be squared, though, ultimately must be by the rich countries getting by with less energy rather than by poorer countries being denied the opportunity to use more. It is to be hoped that this transformation takes place in a way that uses better technology to achieve the same or better living standards for everybody from a lot less energy; the probable alternative is the economic disruption and widespread involuntary cuts in living standards that will follow from a prolonged imbalance of energy supply and demand.
A more interesting question to ask about the Tata Nano is why it was not possible to leapfrog current technology to achieve something even more economical and sustainable – using, one hesitates to suggest, actual nanotechnology. Why is the Nano made from old-fashioned steel, with an internal combustion engine in the back, rather than, say, being made from advanced lightweight composites and propelled by an electric motor and a hydrogen fuel cell? The answers are actually fairly clear – cost, the technological capacity of this (or any other) company, and the requirement for maintainability. Aside from these questions, there’s the problem of infrastructure. The problems of creating an infrastructure for hydrogen as a fuel are huge for any country; liquid hydrocarbons are a very convenient store of energy, and, old though it is, the internal combustion engine is a pretty effective and robust device for converting energy. Of course, we can hope that new technologies will lead to new versions of the Tata Nano and similar cars of far greater efficiency, though realism demands that we understand that new technology needs to fit into existing techno-social systems to be viable.
Grand challenges for UK nanotechnology
The UK’s Engineering and Physical Sciences Research Council introduced a new strategy for nanotechnology last year, and some of the new measures proposed are beginning to come into effect (including, of course, my own appointment as the Senior Strategic Advisor for Nanotechnology). Just before Christmas the Science Minister announced the funding allocations for research for the next few years. Nanotechnology is one of six priority programmes that cut across all the Research Councils (to be precise, the cross-council programme has the imposing title: Nanoscience through Engineering to Application).
One strand of the strategy involves the funding of large scale integrated research programmes in areas where nanotechnology can contribute to issues of pressing societal or economic need. The first of these Grand Challenges – in the area of using nanotechnology to enable cheap, efficient and scalable ways to harvest solar energy – was launched last summer. An announcement on which proposals will be funded will be made within the next few months.
The second grand challenge will be launched next summer, and it will be in the general area of nanotechnology for healthcare. This is a very broad theme, of course – I discussed some of the potential areas, which include devices for delivering drugs and for rapid diagnosis, in an earlier post. To narrow the area down, there’s going to be an extensive process of consultation with researchers and people in the relevant industries – for details, see the EPSRC website. There’ll also be a role for public engagement; EPSRC is commissioning a citizens’ jury to consider the options and have an input into the decision of what area to focus on.
UK Government outlines nanorisk research needs
The UK government has released a second report reviewing progress and identifying knowledge gaps about the potential environmental and health risks arising from engineered nanoparticles. This is a comprehensive document, breaking down the problem into five areas. The first of these is the question of how you detect and measure nanoparticles, and the second considers the ways in which people and the environment might be exposed to them. The third area concerns the assessment of the degree to which some nanoparticles might be toxic to humans, while the fourth area considers potential environmental impacts. Finally, a fifth section considers wider social and economic dimensions of nanotechnology.
The document represents, in part, a response to the very critical verdict on the UK government’s response on nanotoxicology given by the Council for Science and Technology last March. It isn’t, of course, able to address the fundamental criticism: that the Government didn’t act on the recommendation of the Royal Society and set up a coordinated programme of research into the toxicology and health and environmental effects of nanomaterials, with dedicated funding, but instead relied on an ad-hoc process of waiting for proposals to come in through peer review with opportunistic funding from a number of sources. The response from the Royal Society reflects the continuing frustration at opportunities lost: “The Government has recognised the huge potential of nanotechnology and recognised what needs to be done to ensure that advances are realised safely, but by their own admission progress has been slow in some areas. Given the wealth of expertise in UK universities and industries we should be further ahead.”
That’s old ground now, of course, so perhaps it’s worth focusing on some of the positive outcomes the report describes. Quite a lot of work has been carried out or at least started. In the area of nanoparticles in the environment, for example, the Natural Environment Research Council has funded more than £2.3 million worth of projects, in areas ranging from studies of the toxicity of nanoparticles to fish and other aquatic organisms, to studies of the fate of silicon dioxide nanoparticles from pharmaceutical and cosmetic formulations in wastewaters and of the effect of silver nanoparticles on natural bacterial populations.
For another view of the positives and negatives of this report, it’s interesting to see the response of nanoparticle expert Andrew Maynard. More shocking is the way this report is mendaciously misquoted in an article in the Daily Mail: Alert over the march of the ‘grey goo’ in nanotechnology Frankenfoods (via TNTlog).
Less than Moore?
Some years ago, the once-admired BBC science documentary slot Horizon ran a programme on nanotechnology. This was preposterous in many ways, but one sequence stands out in my mind. Michio Kaku appeared in front of scenes of rioting and mayhem, opining that “the end of Moore’s Law is perhaps the single greatest economic threat to modern society, and unless we deal with it we could be facing economic ruin.” Moore’s law, of course, is the observation, or rather the self-fulfilling prophecy, that the number of transistors on an integrated circuit doubles about every two years, with corresponding exponential growth in computing power.
As Gordon Moore himself observes in a presentation linked from the Intel site, “No Exponential is Forever … but We can Delay Forever” (2 MB PDF). Many people have prematurely written off the semiconductor industry’s ability to maintain, over forty years, a record of delivering a nearly constant year-on-year percentage shrinkage in circuit dimensions and increase in computing power. Nonetheless, there will be limits to how far the current CMOS-based technology can be pushed. These limits could arise from fundamental constraints of physics or materials science, or from engineering problems like the difficulties of managing the increasingly problematic heat output of densely packed components, or simply from the economic difficulties of finding business models that can make money in the face of the exponentially increasing cost of plant. The question, then, is not if Moore’s law, for conventional CMOS devices, will run out, but when.
What has underpinned Moore’s law is the International Technology Roadmap for Semiconductors, a document which effectively choreographs the research and development required to deliver the continual incremental improvements on our current technology that are needed to keep Moore’s law on track. It’s a document that outlines the requirements for an increasingly demanding series of linked technological breakthroughs as time marches on; somewhere between 2015 and 2020 a crunch comes, with many problems for which solutions look very elusive. Beyond this time, then, there are three possible outcomes. It could be that these problems, intractable though they look now, will indeed be solved, and Moore’s law will continue through further incremental developments. The history of the semiconductor industry tells us that this possibility should not be lightly dismissed; Moore’s law has already been written off a number of times, only for the creativity and ingenuity of engineers and scientists to overcome what seemed like insuperable problems. The second possibility is that a fundamentally new architecture, quite different from CMOS, will be developed, giving Moore’s law a new lease of life, or even permitting a new jump in computer power. This, of course, is the motivation for a number of fields of nanotechnology. Perhaps spintronics, quantum computing, molecular electronics, or new carbon-based electronics using graphene or nanotubes will be developed to the point of commercialisation in time to save Moore’s law. For the first time, the most recent version of the semiconductor roadmap did raise this possibility, so it deserves to be taken seriously. There is much interesting physics coming out of laboratories around the world in this area. But none of these developments are very close to making it out of the lab into a process or a product, so we need to consider the third possibility: that no successor will arrive in time to save Moore’s law. So what happens if, for the sake of argument, Moore’s law peters out in about ten years’ time, leaving us with computers perhaps one hundred times more powerful than the ones we have now, and which take more than a few years to become obsolete? Will our economies collapse and our streets fill with rioters?
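Before answering, it’s worth a quick sanity check on that “one hundred times” figure – this is my own back-of-envelope arithmetic, not a number taken from the roadmap. The sketch below, in Python, simply compounds another decade of doublings at the historical cadence:

```python
def improvement_factor(years, years_per_doubling):
    """Overall improvement from a run of doublings at a fixed cadence."""
    return 2 ** (years / years_per_doubling)

print(improvement_factor(10, 2.0))  # ~32x if performance doubles every two years
print(improvement_factor(10, 1.5))  # ~100x if it doubles every eighteen months
```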
It seems unlikely. Undoubtedly, innovation is a major driver of economic growth, and the relentless pace of innovation in the semiconductor industry has contributed greatly to the growth we’ve seen in the last twenty years. But it’s a mistake to suppose that innovation is synonymous with invention; new ways of using existing inventions can be as great a source of innovation as new inventions themselves. We shouldn’t expect that a period of relatively slow innovation in hardware would mean that there would be no developments in software; on the contrary, as raw computing power gets less superabundant we’d expect ingenuity in making the most of available power to be greatly rewarded. The economics of the industry would change dramatically, of course. As the development cycle lengthened the time needed to amortise the huge capital cost of plant would stretch out and the business would become increasingly commoditised. Even as the performance of chips plateaued, their cost would drop, possibly quite precipitously; these would be the circumstances in which ubiquitous computing truly would take off.
For an analogy, one might want to look a century earlier. Vaclav Smil has argued, in his two-volume history of technology of the late nineteenth and twentieth century (Creating the Twentieth Century: Technical Innovations of 1867-1914 and Their Lasting Impact and Transforming the Twentieth Century: Technical Innovations and Their Consequences), that we should view the period 1867-1914 as a great technological saltation. Most of the significant inventions that underlay the technological achievements of the twentieth century – for example, electricity, the internal combustion engine, and powered flight – were made in this short period, with the rest of the twentieth century being dominated by the refinement and expansion of these inventions. Perhaps we will, in the future, look back on the period 1967-2014 in a similar way, as a huge spurt of invention in information and communication technology, followed by a long period in which the reach of these inventions continued to spread throughout the economy. Of course, this relatively benign scenario depends on our continued access to those things on which our industrial economy is truly existentially dependent – sources of cheap energy. Without that, we truly will see economic ruin.
Science journals take on poverty and human development
Science journals around the world are participating in a Global theme issue on poverty and human development; as part of this the Nature group journals are making all their contributions freely available on the web. Nature Nanotechnology is involved, and contributes three articles.
Nanotechnology and the challenge of clean water, by Thembela Hillie and Mbhuti Hlophe, gives a perspective from South Africa on this important theme. Also available is one of my own articles, this month’s opinion column, Thesis. I consider the arguments that are sometimes made that nanotechnology will lead to economic disruptions in developing countries that depend heavily on natural resources. Will, for example, the development of carbon nanotubes as electrical conductors impoverish countries like Zambia that depend on copper mining?
Quaint folk notions of nanotechnologists
Most of us get through our lives with the help of folk theories – generalisations about the world that may have some grounding in experience, but which are not systematically checked in the way that scientific theories might be. These theories can be widely shared amongst a group with common interests, and they serve both as lenses through which to view and interpret the world and as guides to action. Nanotechnologists aren’t exempt from the grip of such folk theories, and Arie Rip, from the University of Twente, one of the leading lights in European science studies, has recently published an analysis of these – Folk theories of nanotechnologists (PDF), Science as Culture 15, p349 (2006).
He identifies three clusters of folk theories. The first is the idea that new technologies inevitably follow a “wow-to-yuck” trajectory, in which initial public enthusiasm for the technology is followed by a backlash. The exemplar of this phenomenon is the reaction to genetically modified organisms, which, it is suggested, followed exactly this pattern, with widespread acceptance in the ’70s, then a backlash in the ’80s and ’90s. Rip suggests that this doesn’t at all represent the real story of GMOs, and questions the fundamental characterisation of the public as essentially fickle.
Another folk theory of nanotechnology implies a similar narrative of initial enthusiasm followed by subsequent disillusionment; this is the “cycle of hype” idea popularised by the Gartner Group. The idea is that all new technologies are initially accompanied by a flurry of publicity and unrealistic expectations, leading to a “peak of inflated expectations”. This is inevitably followed by disappointment and loss of public interest; the technology then falls into a “trough of disillusionment”. Only then does the technology start to deliver, with a “slope of enlightenment” leading to a “plateau of productivity”, in which the technology does deliver real benefits, albeit less dramatic than those initially promised in the first stage of the cycle. Rip regards this as a plausible storyline masquerading as an empirical finding. But the key issue he identifies at the core of this is the degree to which it is regarded as acceptable – or even necessary – to exaggerate claims about the impact of a technology. In Rip’s view, we have seen a divergence in strategies between the USA and Europe, with advocates of nanotechnology in Europe making much more modest claims (and thus perhaps positioning themselves better for the aftermath of a bubble bursting).
Rip’s final folk theory concerns how nanotechnologists view the public. In his view, nanotechnologists are excessively concerned about public concern, projecting onto the public a fear of the technology out of proportion to what empirical studies of public attitudes actually find. Of course, this is connected to the folk theory about GMOs implicit in the “wow-to-yuck” theory. The most telling example Rip offers is the widespread fear amongst nanotechnology insiders that a film of Michael Crichton’s thriller “Prey” would lead to a major backlash. Rip diagnoses a widespread outbreak of nanophobia-phobia.
Nobels, Nanoscience and Nanotechnology
It’s interesting to see how various newspapers have reported the story of yesterday’s award of the physics Nobel prize to the discoverers of giant magnetoresistance (GMR). Most have picked up on the phrase used in the press release of the Nobel foundation, that this was “one of the first real applications of the promising field of nanotechnology”. Of course, this raises the question of what’s in all those things listed in the various databases of nanotechnology products, such as the famous sunscreens and stain-resistant fabrics.
References to iPods are compulsory, and this is entirely appropriate. It is quite clear that GMR is directly responsible for making possible the miniaturised hard disk drives on which entirely new product categories, such as hard disk MP3 players and digital video recorders, depend. The more informed papers (notably the Financial Times and the New York Times) have noticed that one name was missing from the award – Stuart Parkin – a physicist working for IBM in Almaden, in California, who was arguably the person who took the basic discovery of GMR and did the demanding science and technology needed to make a product out of it.
The Nobel Prize for Chemistry announced today also highlights the relationship between nanoscience and nanotechnology. It went to Gerhard Ertl, of the Fritz-Haber-Institut in Berlin, for his contributions to surface chemistry. In particular, using the powerful tools of nanoscale surface science, he was able to elucidate the fundamental mechanisms operating in catalysis. For example, he worked out the basic steps of the Haber-Bosch process. A large proportion of the world’s population quite literally depends for its survival on the Haber-Bosch process, which artificially fixes nitrogen from the atmosphere to make the fertiliser on which the high crop yields that feed the world depend.
The two prizes illustrate the complexity of the interaction between science and technology. In the case of GMR, the discovery was one that came out of fundamental solid state physics. This illustrates how what might seem to the scientists involved to be very far removed from applications can, if the effect turns out to be useful, very quickly be exploited in products (though the science and technology needed to make this transition will itself often be highly demanding, and is perhaps not always appreciated enough). The surface science rewarded in the chemistry prize, by contrast, represents a case in which science is used, not to discover new effects or processes, but to understand better a process that is already technologically hugely important. This knowledge, in turn, can then underpin improvements to the process or the development of new, but analogous, processes.