Happy New Year

Here are a couple of nice nano-images for the New Year. The first depicts a nanoscale metal-oxide donut, whose synthesis is reported in a paper (abstract, subscription required for full article) in this week’s Science Magazine. The paper, whose first author is Haralampos Miras, comes from the group of Lee Cronin at the University of Glasgow. The object is made by templated self-assembly of molybdenum oxide units; the interesting feature is that the cluster templating the ring – the “hole” around which the donut assembles – appears as a transient precursor during the process, and is ejected from the ring once the ring is complete.

A molybdenum oxide nanowheel templated on a transient cluster. From Miras et al, Science 327 p 72 (2010).

The second image depicts the stages in reconstructing a high resolution electron micrograph of a self-assembled tetrahedron made from DNA. In an earlier blog post I described how Russell Goodman, a grad student in the group of Andrew Turberfield at Oxford, was able to make rigid tetrahedra of DNA less than 10 nm in size. Now, in collaboration with Takayuki Kato and Keiichi Namba’s group at Osaka University, they have been able to obtain remarkable electron micrographs of these structures. The work was published last summer in an article in Nano Letters (subscription required). The figure shows, from left to right, the predicted structure, a raw micrograph obtained from cryo-TEM (transmission electron microscopy on frozen, vitrified samples), a micrograph processed to enhance its contrast, and two three-dimensional image reconstructions obtained from a large number of such images. The sharpest image, on the right, is at 12 Å resolution; it is believed that this is the smallest object, natural or artificial, to have been imaged by cryo-TEM at this resolution, which is good enough to distinguish between the major and minor grooves of the DNA helices that form the struts of the tetrahedron.

Cryo-TEM reconstruction of a DNA tetrahedron, from Kato et al., Nano Letters, 9, p2747 (2009).
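The contrast enhancement and 3D reconstruction stages rest on a general principle of single-particle electron microscopy: any one raw image is dominated by noise, but averaging many images of identical particles, once they have been aligned, beats the noise down. Here is a minimal toy sketch of that principle in Python (my own illustration, not the method used in the paper; it takes the hard problem of aligning and orienting the particles as already solved):

```python
# Toy illustration of the averaging principle behind single-particle
# cryo-EM reconstruction; not the authors' pipeline. Alignment is assumed
# done, and we simply show that averaging N aligned noisy images improves
# the signal-to-noise ratio roughly as sqrt(N).
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical "particle": a soft bright ring standing in for one
# projection of the DNA tetrahedron.
n = 64
y, x = np.mgrid[:n, :n]
r = np.hypot(x - n / 2, y - n / 2)
template = np.exp(-((r - 15.0) ** 2) / 8.0)

# Simulate many micrographs of the same particle, each buried in noise.
num_images = 1000
noisy = template + rng.normal(0.0, 1.0, size=(num_images, n, n))

for k in (1, 10, 100, 1000):
    average = noisy[:k].mean(axis=0)
    residual = average - template  # the noise left after averaging
    print(f"averaged {k:5d} images -> SNR ~ {template.std() / residual.std():.2f}")
```

The real experiment is far harder, of course: each particle sits in a random, unknown orientation, so the reconstruction software must first classify and orient a large number of particle images before any averaging can happen.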

A happy New Year to all readers.

Why and how should governments fund basic research?

Yesterday I took part in a Policy Lab at the Royal Society, on the theme The public nature of science – Why and how should governments fund basic research? I responded to a presentation by Professor Helga Nowotny, the Vice-President of the European Research Council, saying something like the following:

My apologies to Helga, but my comments are going to be rather UK-centric, though I hope they illustrate some of the wider points she’s made.

This is a febrile time in British science policy.

We have an obsession amongst both the research councils and the HE funding bodies with the idea of impact – how can we define and measure the impact that research has on wider society? While these bodies are at pains to define impact widely, involving better policy outcomes, improvements in quality of life and broader culture, there is much suspicion that all that really counts is economic impact.

We have had a number of years in which the case that science produces direct and measurable effects on economic growth and jobs has been made very strongly, and has been rewarded by sustained increases in public science spending. There is a sense that these arguments are no longer as convincing as they were a few years ago, at least for the people in Treasury who are going to be making the crucial spending decisions at a time of fiscal stringency. As Helga argues, the relationship between economic growth in the short term, at a country level, and spending on scientific R&D is shaky, at best.

And in response to these developments, we have a deep unhappiness amongst the scientific community at what’s perceived as a shift from pure, curiosity driven, blue skies research into research and development.

What should our response to this be?

One response is to up the pressure on scientists to deliver economic benefits. This, to some extent, is what’s happening in the UK. One problem with this approach is that it probably overstates the importance of basic science in the innovation system. Scientists aren’t the only people who are innovators – innovation takes place in industry and in the public sector, and it can involve customers and users too. Maybe our innovation system does need fixing, but it’s not obvious that what most needs attention is what scientists do. But certainly, we should look at ways to open up the laboratory, as Helga puts it, and to look at the broader institutional and educational preconditions that allow science-based innovation to flourish.

Another response is to argue that the products of free scientific inquiry have intrinsic societal worth, and should be supported “as an ornament to civilisation”. Science is like the opera, something we support because we are civilised. One trouble with this argument is that it involves a certain degree of personal taste – I dislike opera greatly, and who’s to say that others won’t have the same feeling about astronomy? A more serious objection is that we don’t actually support the arts that much, in financial terms, in comparison to the science budget. On this argument we’d be employing a lot fewer scientists than we are now (and probably paying them less).

A third response is to emphasise science’s role in solving the problems of society, but emphasising the long-term nature of this project. The idea is to direct science towards broad societal goals. Of course, as soon as one has said this one has to ask “whose goals?” – that’s why public engagement, and indeed politics in the most general sense, becomes important. In Helga’s words, we need to “recontextualise” science for current times. It’s important to stress that, in this kind of “Grand Challenge” driven science, one should specify a problem – not a solution. It is important, as well, to think clearly about different time scales, to put in place possibilities for the long term as well as responding to the short term imperative.

For example, the problem of moving to low-carbon energy sources is top of everyone’s list of grand challenges. We’re seeing some consensus (albeit not a very enthusiastic one) around the immediate need to build new nuclear power stations, to implement carbon capture and storage and to expand wind power, and research is certainly needed to support this, for example to reduce the high cost and energy overheads of carbon capture and storage. But it’s important to recognise that many of these solutions will be at best stop-gap, interim solutions, and to make sure we’re putting the research in place to enable solutions that will be sustainable for the long-term. We don’t know, at the moment, what these solutions will be. Perhaps fusion will finally deliver, maybe a new generation of cellulosic biofuels will have a role, perhaps (as my personal view favours) large scale cheap photovoltaics will be the solution. It’s important to keep the possibilities open.

So, this kind of societally directed, “Grand Challenge” inspired research isn’t necessarily short term, applied research, and although the practicalities of production and scale-up need to be integrated at an early stage, it’s not necessarily driven by industry. It needs to preserve a diversity of approaches, to be robust in the face of our inevitable uncertainty.

One of Helga’s contributions to the understanding of modern techno-science has been the idea of “mode 2 knowledge production”, which she defined in an influential book with Michael Gibbons and others. In this new kind of science, problems are defined from the outset in the context of potential application, they are solved by bringing together transient, transdisciplinary networks, and their outcomes are judged by different criteria of quality than pure disciplinary research, including judgements of their likely economic viability or social acceptability.

This idea has been controversial. I think many people accept that it represents the direction of travel of recent science; what’s at issue is whether it is a good thing. Helga and her colleagues have been at pains to stress that their work is purely descriptive, and implies no judgement of the desirability of these changes. But many of my colleagues in academic science think they are very definitely undesirable (see my earlier post Mode 2 and its discontents). One interesting point, though, is that in arguing against more directed ways of managing science, many people point to the many very valuable discoveries that have been made serendipitously in the course of undirected, investigator driven research. Examples are manifold, from lasers to giant magneto-resistance, to restrict the examples to physics. It’s worth noting, though, that while this is often made as an argument against so-called “instrumental” science, it actually appeals to instrumental values. If you make this argument, you are already conceding that the purpose of science is to yield progress towards economic or political goals; you are simply arguing about the best way to organise science to achieve those goals.

Not that we should think this new. In the manifestos for modern science written by Francis Bacon, which were so important in defining the mission of this society at its foundation three hundred and fifty years ago, the goal of science is defined as “an improvement in man’s estate and an enlargement of his power over nature”. This was a very clear contextualisation of science for the seventeenth century; perhaps our recontextualisation of science for the 21st century won’t prove so very different.

Easing the transition to a new solar economy

In the run-up to the Copenhagen conference, a UK broadcaster has been soliciting opinions from scientists in response to the question “Which idea, policy or technology do you think holds the greatest promise or could deliver the greatest benefit for addressing climate change?”. Here’s the answer given by myself and my colleague Tony Ryan.

We think the single most important idea about climate change is the optimistic one, that, given global will and a lot of effort to develop the truly sustainable technologies we need, we could emerge from some difficult years to a much more positive future, in which a stable global population lives prosperously and sustainably, supported by the ample energy resources of the sun.

We know this is possible in principle, because the total energy arriving on the planet every day from the sun far exceeds any projection of what energy we might need, even if the earth’s whole population enjoys the standard of living that we in the developed world take for granted.
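To put rough numbers on this claim (a back-of-envelope sketch of my own, not a figure from the broadcaster’s question or our answer):

```python
# Back-of-envelope comparison of the solar power intercepted by the Earth
# with a generous projection of human energy demand. These are standard
# textbook values, not numbers from the text above.
import math

SOLAR_CONSTANT = 1361.0   # W/m^2, mean solar irradiance above the atmosphere
EARTH_RADIUS = 6.371e6    # m

# The Earth presents a disc of area pi*R^2 to the Sun.
intercepted_tw = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2 / 1e12
demand_tw = 30.0          # a generous round figure for mid-century demand

print(f"solar power intercepted: ~{intercepted_tw:,.0f} TW")
print(f"projected human demand: ~{demand_tw:.0f} TW")
print(f"ratio:                  ~{intercepted_tw / demand_tw:,.0f}x")
```

Even before allowing for reflection and absorption in the atmosphere, the incoming sunlight exceeds any plausible demand by a factor of several thousand.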

Our problem is that, since the industrial revolution, we have become dependent on energy in a highly concentrated form, from burning fossil fuels. It’s this that has led, not just to our prosperity in the developed world, but to our very ability to feed the world at its current population levels. Before the industrial revolution, the limits on the population were set by the sun and by the productivity of the land; fossil fuels broke that connection (initially through mechanisation and distribution, which led to a small increase in population, but in the last century by allowing us to greatly increase agricultural yields using nitrogen fertilizers made by the highly energy intensive Haber-Bosch process). Now we see that the last three hundred years have been a historical anomaly, powered by fossil fuels in a way that can’t continue. But we can’t go back to pre-industrial ways without mass starvation and a global disaster.

So the new technologies we need are those that will allow us to collect, concentrate, store and distribute energy derived from the sun with greater efficiency, and on a much bigger scale, than we manage at the moment. These will include new types of solar cells that can be made in very much bigger areas – in hectares and square kilometers, rather than the square meters we have now. We’ll need improvements in crops and agricultural technologies, allowing us to grow more food, and perhaps to use alternative algal crops in marginal environments for sustainable biofuels, without the need to bring a great deal of extra land into cultivation. And we’ll need new ways of moving energy around and storing it. Working technologies for renewable energy exist now; what’s important to understand is the problem of scale – they simply cannot be deployed on a big enough scale, in a short enough time, to meet our needs, and the needs of large and fast-developing countries like India and China, for plentiful energy in a concentrated form. That’s why new science and new technology are urgently needed.

This development will take time – with will and urgency, perhaps by 2030 we might see significant progress towards a world powered by renewable, sustainable energy. In the meantime, the climate crisis becomes urgent. That’s why we need interim technologies, which already exist in prototype, to carry us across the bridge to the new sunshine-powered world. These technologies need development if they aren’t themselves to store up problems for the future – we need to make carbon capture and storage affordable, to implement a new generation of nuclear power plants that maximise reliability and minimise waste, and to learn how to use the energy we have more efficiently.

The situation we are in is urgent, but not hopeless; there is a positive goal worth striving for. But it will need more than modest lifestyle changes and policy shifts to get there; we need new science and new technology, developed not in the spirit of a naive attempt to implement a “technological fix”, but accompanied by a deep understanding of the world’s social and economic realities.

A crisis of trust?

One sometimes hears it said that there’s a “crisis of trust in science” in the UK, though this seems to be based on impressions rather than evidence. So it’s interesting to see the latest in an annual series of opinion polls comparing the degree of public trust in various professional groups. The polls, carried out by Ipsos Mori, are commissioned by the Royal College of Physicians, who naturally welcome the news that, yet again, doctors are the most trusted profession, with 92% of those polled saying they would trust doctors to tell the truth. But, for all the talk of a crisis of trust in science, scientists as a profession don’t do so badly, either, with 70% of respondents trusting scientists to tell the truth. To put this in context, the professions at the bottom of the table, politics and journalism, are trusted by only 13% and 22% respectively.

The figure below puts this information in some kind of historical context. Since this type of survey began, in 1983, there’s been a remarkable consistency – doctors are at the top of the trust league, journalists and politicians vie for the bottom place, and scientists emerge in the top half. But there does seem to be a small but systematic upward trend for the proportion trusting both doctors and scientists. A headline that would be entirely sustainable on these figures would be “Trust in scientists close to all time high”.

One wrinkle that it would be interesting to see explored further is that there are some overlapping categories here. Professors score higher than scientists for trust, despite the fact that many scientists are themselves professors (me included). Presumably this reflects the fact that people lump into the “scientists” category those who work directly for government and for industry as well as academic scientists; it’s a reasonable guess that the degree to which the public trusts scientists varies according to who they work for. One feature in this set of figures that does interest me is the relatively high degree of trust attached to civil servants, in comparison to the very low levels of trust in politicians. It seems slightly paradoxical that people trust the people who operate the machinery of government more than they trust those entrusted to oversee it on behalf of the people, but this does emphasise that there is by no means a generalised crisis of trust in our institutions; instead we see a rather specific failure of trust in politics and journalism, and to a slightly lesser extent in business.

Trust in professions in the UK, as revealed by the annual Ipsos MORI survey carried out for the Royal College of Physicians.

Moral hazard and geo-engineering

Over the last year of financial instability, we’ve heard a lot about moral hazard. This term originally arose in the insurance industry; there it refers to the suggestion that if people are insured against some negative outcome, they may be more liable to behave in ways that increase the risk of that outcome arising. So, if your car is insured against all kinds of accident damage, you might be tempted to drive that bit more recklessly, knowing that you won’t have to pay for all the consequences of an accident. In the last year, it’s been all too apparent that the banking system has seen more than its fair share of recklessness, and here the role of moral hazard seems pretty clear – why worry about the possibility of a lucrative bet going sour if you believe the taxpayer will bail out your bank should it be in danger of going under? The importance of the concept of moral hazard in financial matters is obvious, but it may also be useful when we’re thinking about technological choices.

This issue is raised rather clearly in a report released last week by the UK’s national science academy, the Royal Society – Geoengineering the climate: science, governance and uncertainty. This is an excellent report, but judging by the way it’s been covered in the news, it’s in danger of pleasing no-one. Those environmentalists who regard any discussion of geo-engineering as anathema will be dismayed that the idea is gaining any traction at all (and this point of view is not at all out of the mainstream, as this commentary from the science editor of the Financial Times shows). Techno-optimists, on the other hand, will be impatient with the obvious serious reservations that the report has about the prospect of geo-engineering. The strongest endorsement of geo-engineering that the report makes is that we should think of it as a plan B, an insurance policy in case serious reductions in CO2 emissions don’t prove possible. But, if investigating geo-engineering is an insurance policy, the report asks, won’t it subject us to the precise problem of moral hazard?

Unquestionably, people unwilling to confront the need for the world to make serious reductions to CO2 emissions will take comfort in the idea that geo-engineering might offer another way of mitigating dangerous climate change; in this sense the parallel with moral hazard in insurance and banking is exact. There are parallels in the potentially catastrophic consequences of this moral hazard, as well. It’s likely that the largest costs won’t fall on the people who benefit most from the behaviour that’s encouraged by the belief that geo-engineering will be able to save them from the worst consequences of their actions. And in the event of the insurance policy being needed, it may not be able to pay out – the geo-engineering methods available may not end up being sufficient to avert disaster (and, indeed, through unanticipated consequences may make matters worse). On the other hand, the report wonders whether seeing geo-engineering being taken seriously might have the opposite effect – convincing some people that if such drastic measures are being contemplated, then urgent action to reduce emissions really is needed. I can’t say I’m hugely convinced by this last argument.

Mode 2 and its discontents

This essay was first published in the August 2008 issue of Nature Nanotechnology – Nature Nanotechnology 3 448 (2008) (subscription required for full online text).

A water-tight definition of nanotechnology remains elusive, at least if we try to construct one on a scientific or technical basis. Perhaps this means we are looking in the wrong place, and we should instead seek a definition that’s essentially sociological. Here’s one candidate: “Nanotechnology is the application of mode 2 values to the physical sciences”. The jargon here is a reference to the influential 1994 book, “The New Production of Knowledge”, by Michael Gibbons and coworkers. Their argument was that the traditional way in which knowledge is generated – in a disciplinary framework, with a research agenda set from within that discipline – was being increasingly supplemented by a new type of knowledge production which they called “Mode 2”. In mode 2 knowledge production, they argue, problems are defined from the outset in the context of potential application, they are solved by bringing together transient, transdisciplinary networks, and their outcomes are judged by different criteria of quality than pure disciplinary research, including judgements of their likely economic viability or social acceptability. It’s easy to argue that the difference between nanotechnology research as it is developing in countries across the world, and the traditional disciplines from which it has emerged, like solid state physics and physical chemistry, very much fits this pattern.

Gibbons and his coauthors always argued that what they were doing was simply observing, in a neutral way, how things were moving. But it’s a short journey from “is” to “ought”, and many observers have seized on these observations as a prescription for how science should change. Governments seeking to extract demonstrable and rapid economic returns from their taxpayers’ investments in publicly funded science, and companies and entrepreneurs seeking more opportunities to make money from scientific discoveries, look to this model as a blueprint for reorganising science the better to deliver what they want. On the other hand, those arguing for a different relationship between science and society, with more democratic accountability and greater social reflexivity on the part of scientists, see these trends as an opportunity to erase the distinction between “pure” and “applied” science and to press all scientists to take much more explicit consideration of the social context in which they operate.

It’s not surprising that some scientists, for whom the traditional values of science as a source of disinterested and objective knowledge are precious, regard these arguments as assaults by the barbarians at the gates of science. Philip Moriarty made the opposing case very eloquently in an earlier article in Nature Nanotechnology (Reclaiming academia from post-academia, abstract, subscription required for full article), arguing for the importance of “non-instrumental science” in the “post-academic world”. Here he follows John Ziman in contrasting instrumental science, directed to economic or political goals, with non-instrumental science, which, it is argued, has wider benefits to society in creating critical scenarios, promoting rational attitudes and developing a cadre of independent and enlightened experts.

What is striking, though, is that many of the key arguments made against instrumental science actually appeal to instrumental values. Thus, it is possible to make impressive lists of the discoveries made by undirected, investigator driven science that have led to major social and economic impacts – from the laser to the phenomenon of giant magneto-resistance that’s been so important in developing hard disk technology. This argument, then, is actually not an argument against the ends of instrumental science – it implicitly accepts that the development of economically or socially important products is an important outcome of science. Instead, it’s an argument about the best means by which instrumental science should be organised. The argument is not that science shouldn’t seek to produce direct impacts on society. It is that this goal can sometimes be more effectively, albeit unpredictably, reached by supporting gifted scientists as they follow their own instincts, rather than attempting to manage science from the top down. Likewise, arguments about the superiority of the “open-source” model of science, in which information is freely exchanged, to proprietary models of knowledge production in which intellectual property is closely guarded and its originators receive the maximum direct financial reward, don’t fundamentally question the proposition that science should lead to societal benefits, they simply question whether the proprietary model is the best way of delivering them. This argument was perhaps most pointed at the time of the sequencing of the human genome, where a public sector, open source project was in direct competition with Craig Venter’s private sector venture. Defenders of the public sector project, notably Sir John Sulston, were eloquent and idealistic in its defense, but what their argument rested on was the conviction that the societal benefits of this research would be maximised if its results remained in the public domain.

The key argument of the mode 2 proponents is that science needs to be recontextualised – placed into a closer relationship with society. But, how this is done is essentially a political question. For those who believe that market mechanisms offer the most reliable way by which societal needs are prioritised and met, the development of more effective paths from science to money-making products and services will be a necessary and sufficient condition for making science more closely aligned with society. But those who are less convinced by such market fundamentalism will seek other mechanisms, such as public engagement and direct government action, to create this alignment.

Perhaps the most telling criticism of the “Mode 2” concept is the suggestion that, rather than being a recent development, science being carried out in the context of application has actually been the rule for most of its history. Certainly, in the nineteenth century, there was a close, two-way interplay between science and application, well seen, for example, in the relationship between the developing science of thermodynamics and the introduction of new and refined heat engines. In this view, the idea of science as autonomous and discipline-based is an anomaly of the peculiar conditions of the second half of the twentieth century, driven by the expansion of higher education, the technology race of the cold war and the reflected glory of science’s contribution to allied victory in the second world war.

At each point in history, then, the relationship between science and society ends up being renegotiated according to the perceived demands of the time. What is clear, though, is that right from the beginning of modern science there has been this tension about its purpose. After all, for one of the founders of the ideology of the modern scientific project, Francis Bacon, its aims were “an improvement in man’s estate and an enlargement of his power over nature”. These are goals that, I think, many nanotechnologists would still sign up to.

Five years on from the Royal Society report

Five years ago this summer, the UK’s Royal Society and the Royal Academy of Engineering issued a report on Nanoscience and nanotechnologies: opportunities and uncertainties. The report had been commissioned by the Government, and has been widely praised and widely cited. Five years on, it’s worth asking what difference it has made, and what is left to be done. The Responsible Nanoforum has gathered a fascinating collection of answers to these questions – A beacon or just a landmark?. Reactions come from scientists, people in industry and representatives of NGOs, and the collection is introduced by the Woodrow Wilson Center’s Andrew Maynard. His piece is also to be found on his blog. Here’s what I wrote.

The Royal Society/Royal Academy of Engineering report was important in a number of respects. It signaled a new openness from the science community, a new willingness by scientists and engineers to engage more widely with society. This was reflected in the composition of the group itself, with its inclusion of representatives from philosophy, social science and NGOs in addition to distinguished scientists, as well as in its recommendations. It accepted the growing argument that the place for public engagement was “upstream” – ahead of any major impacts on society; in the words of the report, “a constructive and proactive debate about the future of nanotechnologies should be undertaken now – at a stage when it can inform key decisions about their development and before deeply entrenched or polarised positions appear.” Among its specific recommendations, its highlighting of potential issues of the toxicity and environmental impact of some classes of free, engineered nano-particles has shaped much of the debate around nanotechnologies in the subsequent five years.

The impact in the UK has been substantial. We have seen a serious effort to engage the public in a genuinely open way; the recent EPSRC public dialogue on nanotechnology in healthcare gives a demonstration that these ideas have gone beyond public relations to begin to make a real difference to the direction of science funding. The research that the report called for in nanoparticle toxicity and eco-toxicity has been slower to get going. The opportunity to make a relatively small, focused investment in this area, as recommended by the report, was not taken and this is to be regretted. Despite the slow start caused by this failure to act decisively, however, there is now in the UK a useful portfolio of research in toxicology and ecotoxicology.

One of the consequences of the late start in dealing with the nanoparticle toxicity issue has been that this has dominated the public dialogue about nanotechnology, crowding out discussion of the potential far-reaching consequences of these technologies in the longer term. We now need to learn the lessons of the Royal Society report and apply them to the development of the new generations of nanotechnology now being developed in laboratories around the world, as well as to other, potentially transformative, technologies. Synthetic biology, which has strong overlaps with bionanotechnology, is now receiving similar scrutiny, and we can expect the debates surrounding subjects such as neurotechnology, pervasive information technology, and geoengineering to grow in intensity. These discussions may be fraught and controversial, but the example of the Royal Society nanotechnology report, as a model for how to set the scene for a constructive debate about controversial science issues, will prove enduring.

Soft machines and robots

Robots is a website featuring regular podcasts about various aspects of robotics; currently it carries a podcast of an interview with me by Sabine Hauert, from EPFL’s Laboratory of Intelligent Systems. This was prompted by my talk at the IEEE Congress on Evolutionary Computing, which was essentially about how to build a nanobot. Regular readers of this blog will not be surprised to hear that a strong theme of both interview and talk is the need to take inspiration from biology when designing “soft machines”, which need to be optimised for the special, and to us very unfamiliar, physics of the nanoworld, rather than using inappropriate design principles derived from macroscopic engineering. For more on this, the interested reader might like to take a look at my earlier essay, “Right and wrong lessons from biology”.

Food nanotechnology – their Lordships deliberate

Today I found myself once again in Westminster, giving evidence to a House of Lords Select Committee, which is currently carrying out an inquiry into the use of nanotechnology in food. Readers not familiar with the intricacies of the British constitution need to know that the House of Lords is one of the branches of Parliament, the UK legislature, with powers to revise and scrutinise legislation and, through its select committees, hold the executive to account. Originally its membership was drawn from the hereditary peerage, with a few bishops thrown in; recently, as part of a slightly ramshackle program of constitutional reform, the influence of the hereditaries has been much reduced, with the majority of the chamber being made up of members appointed for life by the government. These are drawn from former politicians and others prominent in public life. Whatever the shortcomings of this system from the democratic point of view, it does mean that the membership includes some very well informed people. This inquiry, for example, is being chaired by Lord Krebs, a very distinguished scientist who previously chaired the Food Standards Agency.

All the evidence submitted to the committee is publicly available on their website; this includes submissions from NGOs, Industry Organisations, scientific organisations and individual scientists. There’s a lot of material there, but together it’s actually a pretty good overview of all sides of the debate. I’m looking forward to seeing their Lordships’ final report.

Environmentally beneficial nanotechnology

Today I’ve been at Parliament in London, at an event sponsored by the Parliamentary Office of Science and Technology to launch the second phase of the Environmental Nanoscience Initiative, a joint UK-USA research program led by the UK’s Natural Environment Research Council and the USA’s Environmental Protection Agency. This is a very welcome initiative, giving more focus to existing efforts to quantify possible detrimental effects of engineered nanoparticles on the environment. It’s important to put more effort into filling gaps in our knowledge about what happens to nanoparticles when they enter the environment and start entering ecosystems, but equally it’s important not to forget that a major motivation for doing research in nanotechnology in the first place is its potential to ameliorate the very serious environmental problems the world now faces. So I was very pleased to be asked to give a talk at the event highlighting some of the positive ways that nanotechnology could benefit the environment. Here are some of the key points I tried to make.

Firstly, we should ask why we need new technology at all. There is a view (eloquently expressed, for example, in Bill McKibben’s book “Enough”) that our lives in the West are comfortable enough, the technology we have now is enough to satisfy our needs without any more gadgets, and that the new technologies coming along – such as biotechnology, nanotechnology, robotics and neuro-technology – are so powerful and have such potential to cause harm that we should consciously relinquish them.

This argument is seductive to some, but it’s profoundly wrong. Currently the world supports more than six billion people; by the middle of the century that number may be starting to plateau out, perhaps between 8 and 10 billion people. It is technology that allows the planet to support these numbers; to give just one instance, our food supplies depend on the Haber-Bosch process, which uses fossil fuel energy to fix nitrogen to use in artificial fertilizers. It’s estimated that without Haber-Bosch nitrogen, more than half the world’s population would starve, even if everyone adopted a minimal, vegetarian diet. So we are existentially dependent on technology – but the technology we depend on isn’t sustainable. To escape from this bind, we must develop new, and more sustainable, technologies.

Energy is at the heart of all these issues; the availability of cheap and concentrated energy is what underlies our prosperity, and as the world’s population grows and becomes more prosperous, demand for energy will grow. It is important to appreciate the scale of these needs, which are measured in tens of terawatts (remember that a terawatt is a thousand gigawatts, with a gigawatt being the scale of a large coal-fired or nuclear power station). Currently the sources of this energy are dominated by fossil fuels, and it is the relentless growth of fossil fuel use since the late 18th century that has directly led to the rise in atmospheric carbon dioxide concentrations. This rise, together with that of other greenhouse gases, is leading to climate change, which in turn will directly lead to other problems, such as pressure on clean water supplies and growing insecurity of food supplies. It is this background which sets the agenda for the new technologies we need.

At the moment we don’t know for certain which of the many new technologies being developed to address these problems will work, either technically or socio-economically, so we need to pursue many different avenues, rather than imagining that some single solution will deliver us. Nanotechnology is at the heart of many of these potential solutions, in the broad areas of sustainable energy production, storage and distribution, in energy conservation, clean water, and environmental remediation. Let me focus on a couple of examples.

It’s well known that the energy we use is a small fraction of the total amount of energy arriving on the earth from the sun; in principle, solar energy could provide for all our energy needs. The problems are ones of cost and scale. Even in cloudy Britain, if we could cover every roof with solar cells we’d end up with a significant fraction of the 42.5 GW which represents the average rate of electricity use in the UK. We don’t do this, firstly because it would be too expensive, and secondly because the total world output of solar cells, at about 2 GW a year, is a couple of orders of magnitude too small. A variety of nanotechnology enabled potential solutions exist; for example plastic solar cells offer the possibility of using ultra-cheap, large area processing technologies to make solar cells on a very large scale. This is the area supported by EPSRC’s first nanotechnology grand challenge.
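To see where “a couple of orders of magnitude” comes from, here is a rough sketch of the arithmetic (the 42.5 GW and 2 GW per year figures are from the text above; the ~10% capacity factor for solar panels under British skies is my own assumption):

```python
# Rough scale arithmetic for UK solar supply; the capacity factor is an
# assumed figure, chosen to be plausible for cloudy Britain.
UK_AVG_DEMAND_GW = 42.5            # average UK electricity demand (see text)
WORLD_PV_OUTPUT_GW_PER_YR = 2.0    # annual world solar cell output (see text)
UK_CAPACITY_FACTOR = 0.10          # assumption: panels average ~10% of peak

peak_gw_needed = UK_AVG_DEMAND_GW / UK_CAPACITY_FACTOR
years_of_world_output = peak_gw_needed / WORLD_PV_OUTPUT_GW_PER_YR

print(f"peak PV capacity needed: ~{peak_gw_needed:.0f} GW")
print(f"years of entire world PV output, for the UK alone: ~{years_of_world_output:.0f}")
```

On these numbers the UK alone would consume roughly two centuries of the entire world’s current solar cell production, which is the scale gap that new, cheaper, large-area manufacturing technologies need to close.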

It’s important to recognise, though, that all these technologies still have major technical barriers to overcome; they are not going to come to market tomorrow. In the meantime, the continued large scale use of fossil fuels looks inevitable, so the need to mitigate their impact by carbon capture and storage is becoming increasingly compelling to politicians and policy-makers. This technology is do-able today, but the costs are frightening. Carbon capture and storage increases the price of coal-derived electricity by between 43% and 91%; this is a pure overhead. Nanotechnologies, in the form of new membranes and sorbents, could reduce this. Another contribution would be finding a use for carbon dioxide, perhaps using photocatalytic reduction to convert water and CO2 into hydrocarbons and methanol, which could be used as transport fuels or chemical feedstocks. Carbon capture and utilisation is the general area of the third nanotechnology grand challenge, whose call for proposals is open now.
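For a sense of what that overhead means in practice, a one-line illustration (only the 43-91% range comes from the text above; the 5p/kWh baseline for coal-fired generation is an assumption, purely for illustration):

```python
# Applying the quoted 43-91% CCS cost overhead to an assumed baseline
# price of coal-derived electricity. The baseline is illustrative only.
baseline_p_per_kwh = 5.0  # assumption

for overhead in (0.43, 0.91):
    print(f"+{overhead:.0%} overhead: {baseline_p_per_kwh * (1 + overhead):.1f} p/kWh")
```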

How can we make sure that our proposed innovations are responsible? The idea of the “precautionary principle” is often invoked in discussions of nanotechnology, but there are aspects of this notion which make me very uncomfortable. Certainly, we can all agree that we don’t want to implement “solutions” that bring their own, worse, problems. The potential impacts of any new technology are necessarily uncertain. But on the other hand, we know that there are near-certain negative consequences of failing to act. Not to actively seek new technologies is itself a decision that has impacts and consequences of its own, and in the situation we are now in these consequences are likely to be very bad ones.

Responsible innovation, then, means that we must speed up research to fill the knowledge gaps and reduce uncertainty; this is the role of the Environmental Nanoscience Initiative. We need to direct our search for new technologies towards areas of societal need, where public support is assured by a broad consensus about the desirability of the goals. This means increasing our efforts in the area of public engagement, and ensuring a direct connection between that public engagement and decisions about research priorities. We need to recognise that there will always be uncertainty about the actual impacts of new technologies, but we should do our best to choose directions that we won’t regret, even if things don’t turn out the way we first imagined.

To sum up, nanotechnologies, responsibly implemented, are part of the solution for our environmental difficulties.