On Descartes and nanobots

A couple of weeks ago I was interviewed for the Robots podcast special on 50 years of robotics, and predictions for the next half century. My brief was nanorobots, and you can hear the podcast here. My pitch was that on the nanoscale we’d be looking to nature for inspiration, exploiting design principles such as self-assembly and macromolecular shape change; as a particularly exciting current development I singled out progress in DNA nanotechnology, and in particular the possibility of using this to do molecular logic. As it happens, last week’s edition of Nature included two very interesting papers reporting further developments in this area – Molecular robots guided by prescriptive landscapes from Erik Winfree’s group at Caltech, and A proximity-based programmable DNA nanoscale assembly line from Ned Seeman’s group at NYU.

The context and significance of these advances is well described in a News and Views article (full text); the references to nanorobots and nanoscale assembly lines have led to considerable publicity. James Hayton (who reads the Daily Mail so the rest of us don’t have to) comments very pertinently in his 10e-9 blog on the misleading use of classical nanobot imagery to illustrate this story. The Daily Mail isn’t the only culprit here – even the venerable Nature uses a still from the film Fantastic Voyage to illustrate their story, with the caption “although such machines are still a fantasy, molecular ‘robots’ made of DNA are under development.”

What’s wrong with these illustrations is that they are graphic representations of bad metaphors. DNA nanotechnology falls squarely in the soft nanotechnology paradigm – it depends on the weak interactions by which complementary sequences recognise each other to enable the self-assembly of structures whose design is coded within the component molecules themselves, and on macromolecular shape changes under the influence of Brownian motion to effect motion. Soft machines aren’t mechanical engineering shrunk, as I’ve written about at length on this blog and elsewhere.

But there’s another, more subtle point here. Our classical conception of a robot is something with sensors feeding information into a central computer, which responds to this sensory input by a computation, which is then effected by the communication of commands to the actuators that drive the robot’s actions. This separation of the “thinking” function of the robot from its sensing and action is something that we find very appealing; we are irresistibly drawn to the analogy with the way we have come to think about human beings since Descartes – as machines animated by an intelligence largely separate from our bodies.
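
To make that classical picture concrete, here is a minimal sketch (in Python; every name, such as read_sensors and send_commands, is hypothetical) of the sense-plan-act architecture described above, with the “thinking” isolated in one central planning function, quite separate from the machinery that senses and moves.

```python
# A toy sense-plan-act loop, illustrating the classical robot architecture:
# sensing and actuation are peripheral, and all the "thinking" happens in
# one central planning function. Every name here is hypothetical.

from dataclasses import dataclass

@dataclass
class SensorReading:
    obstacle_distance_m: float   # how far away the nearest obstacle is
    target_bearing_deg: float    # direction to the goal, relative to heading

@dataclass
class Command:
    left_wheel_speed: float
    right_wheel_speed: float

def plan(reading: SensorReading) -> Command:
    """The central 'intelligence': all decisions are made here, in one place."""
    if reading.obstacle_distance_m < 0.5:
        return Command(0.0, 0.0)                  # stop if something is close
    turn = 0.01 * reading.target_bearing_deg      # steer gently towards the goal
    return Command(0.5 - turn, 0.5 + turn)

def control_loop(read_sensors, send_commands, steps=100):
    """Sense, think, act - three cleanly separated stages."""
    for _ in range(steps):
        reading = read_sensors()    # 1. sense
        command = plan(reading)     # 2. think, separately from the body
        send_commands(command)      # 3. act
```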

What is striking about these rudimentary DNA robots is that what “intelligence” they possess – their capacity to sense the environment and process this information to determine which of a limited set of outcomes will be effected – arises from the molecules from which the robot is made and their interaction with a (specially designed) environment. There’s no sense in which the robot’s “program” is loaded into it; the program is implicit in the construction of the robot and its interaction with the environment. In this robot, “thought” and “action” are inseparable; the same molecules both store and process information and drive its motion.
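
By way of contrast, here is a deliberately crude toy model – an analogy only, not the chemistry reported in the Nature papers – in which the “robot” has no planner at all: its single local rule is to step onto fresh substrate and consume it, so whatever “program” exists lives in the track the designer laid down.

```python
import random

# A caricature of a walker on a "prescriptive landscape" (an analogy only).
# The walker has no planner: its one rule is to step onto an adjacent,
# uncleaved substrate site and cleave the site it leaves. Which destinations
# are even reachable is decided entirely by the designed track.

def walk(track, start, rng=random.Random(0)):
    """track: dict mapping each site to the list of sites adjacent to it."""
    fresh = set(track) - {start}          # uncleaved substrate sites
    pos, path = start, [start]
    while True:
        options = [s for s in track[pos] if s in fresh]
        if not options:                   # no fresh substrate left: walker stalls
            return path
        pos = rng.choice(options)         # Brownian, unbiased local choice
        fresh.discard(pos)                # "cleave" the site so it can't go back
        path.append(pos)

# The "program" lives here, in the designed environment, not in the walker.
track = {
    "start": ["a"],
    "a": ["start", "b"],
    "b": ["a", "goal_1", "goal_2"],
    "goal_1": ["b"],
    "goal_2": ["b"],
}
print(walk(track, "start"))   # halts at whichever goal the track lets it reach
```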

In this, these proto-robots operate on similar general principles to bacteria, whose considerable information processing power arises from the interaction of many individual molecules with each other and with their physical environment (as beautifully described in Dennis Bray’s book Wetware: a computer in every living cell). Is this the only way to build a nanobot with the capacity to process and act on information about the environment? I’m not sure, but for the moment it seems to be the direction we’re moving in.

Science, Engineering and Innovation Summit at the Royal Society

I’ve been at the Science, Engineering and Innovation summit at the Royal Society this evening. This was an attempt to build on the reports on science and innovation published before the election (as discussed in my last post) with the new government. It was a packed meeting, involving just about anyone in the UK with any interest in science policy. Here are my notes, as taken at the meeting. My apologies for any inaccuracies, and to anyone in the questions whose name I didn’t catch.

Martin Rees welcomes a packed audience – David Willetts is late.

James Wilsdon takes the chair.

Aims – an opportunity for the new minister to set out the direction of travel in the new spending round, and to make sure the weighty set of reports published before the election are not forgotten about.

Martin Taylor – (Chair, The Scientific Century)

The recent election saw science with a higher profile than usual, partly as a result of the many reports we’re talking about. Of course science was still marginal, but our key arguments did register. This is partly because of the uncertainty we’re in – we anticipate tempestuous times. The next spending review will be the most important for a generation, not only setting national finances on an even keel but also setting the tone for many years to come. There is a lot that unites the flurry of reports about science we’re talking about – one common theme is the need for stability. Three big themes in the Scientific Century: 1. Science and innovation must be at the heart of any strategy for economic growth. 2. UK science is a marvellous asset for the UK, but there is a new global competitive environment for science, with major new investments in France, the USA and elsewhere. 3. If we don’t continue to invest, we’ll lose our place at science’s top table.

We must show the new government how science and engineering can help it both to overcome immediate challenges and to achieve its long term aspirations. There will be difficult decisions ahead – the Royal Society is ready to offer help and advice. But short-term decisions must not undermine the ability of science to help meet the long term global challenges we face, and the health of the nation.

John Browne (President of Royal Academy of Engineering, ex CEO of BP, author of forthcoming Browne review into the financing of higher education).

What can be done to retool the British economy for growth and innovation? We’re the world’s 6th biggest manufacturer, with leading areas in satellites, aerospace, pharmaceuticals and design-led manufacturing. But we don’t always turn ideas into business. Decisions about budget cuts must be made with an eye on the future. Businesses remain the main vehicle for wealth creation, but governments can help. This isn’t about picking winners, but supporting strategic sectors. Seven areas should be concentrated on:
1. ensuring we have the people with the right skills
2. keeping ideas flowing by funding the best researchers – then a debate about what other research we can afford
3. systems to bridge the gap between science and the market
4. stable environment with stable regulatory framework
5. incentives for small and large companies
6. government’s influence as a customer, with public procurement used as a tool for innovation
7. all the above to be put into a coherent framework, measured and assessed
Government doesn’t have to do everything – national academies, professional societies etc. are ready to help.

We must all take a firmer lead to communicate the excitement of science and engineering and the careers it leads to.

Janet Finch – (Chair, CST Vision for UK research)

The CST report had one headline message – the UK’s research is a great success story but it’s under threat because of global competition. This is the most important message, for the scientific community, to government and to business. The “Rising above the gathering storm” report from the USA made this point strongly in the US context, emphasising the growing importance of China. So the UK’s strong position has been admirable for the last decade, but we can’t assume it will stay that way. What we need to do is:
1. Government should adopt a clear long term vision – we need stable policy and stable policy directions – particularly to encourage private sector investment. In the current environment attracting private sector investment is going to be crucial, but we need government funding for upstream research, creative discovery based research across a range of disciplines (including social sciences).
2. We need to invest in people, more than in projects. This includes both home grown people and attracting the best from abroad. We can’t predict the future, but if we have the best people they will adapt and respond to new challenges as they arise.
3. We need an ambitious innovation strategy which is directed at major global and social challenges. There need to be more incentives for collaboration, both in the UK and internationally. Our highly competitive funding environment has served us well, but we need to balance competition with collaboration. This country is always going to be small compared to major world economies so we need to bring people together.

Hermann Hauser (Amadeus Capital Partners, author of Hauser review)

Let me make the case for Maxwell centres. The UK is second only to the USA in producing quality papers, and in papers per pound we’re number one. But we don’t use this excellence in research well, to make new companies and to make our large companies more successful. It used to be the case that researchers produced papers, then industrial scientists read them and a few years later might do something with them. But things aren’t so leisurely any more, and it’s a race to commercialise new ideas. The Americans are derogatory about Europe using too much state support, but DARPA is a gigantic government support mechanism for Silicon Valley. ITRI in Taiwan, the Fraunhofers in Germany, and others have all created new industries and supported existing ones. Fraunhofers have funding split 1/3 state, 1/3 private sector, 1/3 project based, and this is a good template.

In what areas should we establish a few such centres in the UK? Three criteria:
1. market size in the billions to tens of billions
2. demonstrable academic lead
3. a plan to keep most of the value in the UK – though we do need to recognise that in a global environment there will be international partners.

We should set up a small number of such centres, each funded at a level of 100 million over ten years, and we should support them with government procurement.

(David Willetts arrives)

Iain Gray (CEO, Technology Strategy Board)

A thought inspired by the portrait of Darwin: it’s not the strongest species that survives, nor the most intelligent, but the one that can adapt to change.

Science and research produces the ideas for the future, and the exploitation of these ideas produces the innovation that provides economic growth. Innovation is the key to recovering from the recession, and many countries across the world are investing heavily in this area.

Key points – the TSB budget is tiny compared to the government procurement budget – this is a huge opportunity for driving innovation. The low carbon vehicle program is an exemplar of where strategic investment can help overcome some of the big challenges society is going to face. Nissan’s investment in the UK came because of the science budget and the support for innovation. Other examples of strong businesses: the satellite business, with new business models; and Ceres Power, which through support for innovation won a key contract with British Gas. Innovation is crucial to economic growth. Regenerative medicine could be a game-changer for UK plc.

We need to provide the right ammunition to get the arguments across that innovation is what’s needed for the short and long term growth of the UK economy.

Helga Nowotny (President, European Research Council)

Research – Innovation – Education: the knowledge triangle is still valid, but we see some adjustments taking place. Innovation becomes the flagship of the plans for Europe – but Europe needs changes to increase the speed with which discoveries are taken to market. We know how to do this. We need
1. the spirit of entrepreneurship both inside and outside academia – intellectual property, venture capital, and public procurement. Less often talked about: every technological innovation needs social innovation. Not all innovation is based on research. But the kinds of innovation that will take us further are science based. As de Gennes has said, “By improving the candle, we are not going to invent electricity”.
2. So to research – the ERC will continue to be driven by excellence and a bottom-up approach; the researchers know best. The UK is very successful in the ERC – the UK is a winner, but so is Europe, because this develops healthy competition and a raising of standards of evaluation right across Europe. The ERC trusts in people and their talents – but we need the third arm of the triangle.
3. Education begins long before the University. Many countries have a leaky pipeline of talented youngsters, so in the national context this pipeline should be fixed.

We hear a great deal about the grand challenges, of energy, climate, etc. There is one grand challenge that needs to be addressed first – how to integrate the three arms of the knowledge triangle.

David Willetts (Minister for Science and Higher Education)

I think fondly of a visit to the RS a couple of years ago when Martin Rees let me handle first editions of Principia and Origin of Species. This is the excitement of science which we should never forget. I’m not the only minister here – Pauline Neville-Jones (Minister for Security) is also in the audience.

I have dual responsibility for Universities and Science – I think this is a very exciting connection. What does this imply for dual support system? Firstly it means there is clear responsibility – the research councils can’t pass off responsibility to HEFCE, and vice versa.

Impact – Martin Rees is eloquent on some of these issues. Most academics do hope for and aspire to work that has an impact – researchers in medicine want to make a drug that will save lives; if you are a historian you hope your work will change the perceptions of the nation. The issue, then, isn’t impact per se – we all agree that research needs impact – but the “impact agenda”. I am wary of clunky attempts to measure and fund impact through the REF; the impact agenda needs to be methodologically sound and to command widespread support among researchers. Blue skies research is still very important.

Innovation – this is often wrongly reduced to R&D. Coming from Birmingham, I start with Joseph Priestley and the Lunar Society. I would like to apologise retrospectively for the Tory riots in 1792 that burnt his house down! He discovered oxygen, but Lavoisier created the theoretical understanding, and it was the Swiss man Schweppe who made money out of it! I find the concept of the cluster a valuable way of thinking about how innovation arises, much better than the sausage machine idea where science goes in one end and innovation comes out the other. We need low risk environments for doing high risk ventures. One idea for strengthening the cluster agenda is the idea of reproducing the Fraunhofers – I am struck by the similarity of the Hauser and Dyson recommendations. Of course money is tight and some people will say you shouldn’t be thinking about making new institutions, but this is a very important area that I will be studying carefully.

Universities – Browne’s report on making the sector financially sustainable is very important, and the European agenda matters here too. Important arguments in Landes’s book – “The Wealth and Poverty of Nations” – point to the diversity of Europe compared to the monolithic nature of China as important in promoting innovation in Europe. Looking at the UK’s Nobel prize winners, so many of them had a moment of crisis or disjunction, moving disciplines or moving cultures. These shocks can create true intellectual greatness.

Questions now:

Imran Khan CASE – two key messages – it would be a false economy to cut support for science now, and we need a long-term plan

Someone from Imperial – medical research will die unless we pare back regulation

Alan Thorpe (CEO, NERC) – There’s much evidence on the economic benefits to the UK, showing factors like 15 in returns to the economy, and many case studies of innovation deriving from fundamental research. Research Councils are proud of their “excellence with impact” theme. Is our evidence persuasive? What else should we do?

David Willetts – the evidence is powerfully set out. The absorptive capacity of the economy allows you to benefit from advances around the world. There is a cash constraint, and that means that some things that are desirable are not affordable. I will make the arguments about the role of science in the economy and civilisation. But I won’t be a shop steward – I understand the arguments and will do my best to convey them. I am here to learn from the panellists, and to serve this community.

Martin Taylor. Please put science and innovation at the heart of your plans for the economy – the figures for foreign investment, money coming in with foreign students. Please develop a plan for science that has ambition and vision, and give it stability.

Mary Phillips, UCL. Pleasing to see the role of social sciences highlighted – Ss and arts and humanities shouldn’t be neglected.

David Cope, POST. You emphasised diversity at the European level. But you can scale this down to the national level. Diversity is important, but this is in tension with concentrating resources on

James Woodhouse. You were much less robotic than your predecessors – would you favour research on robots in the home and the hospital? What is your attitude to nuclear power, and will you spend more money on carbon capture?

ANO. Food security – there are estimates that we will need lots more food, but I don’t see a new green revolution on the way.

DW. A common thread through these questions – the argument of John Beddington that there are a set of global challenges from which we are not exempt. We are dumping serious problems on the next generation (see The Pinch!) but we have a repairing lease on the planet. The resource that scientists offer is invaluable. We must revisit agricultural research, energy is crucial, robotics – Japan is instructive, as robots seem to be their solution to the ageing population. And social science is very important – these challenges are about human behaviour. And I won’t say anything in the middle of a spending review about allocations to different institutions!

Janet Finch. The global challenges are getting much more severe; as China and India grow, we need to see much more collaboration between universities and this is much more important than having a debate about concentrating funding.

John Browne. The review on HE is taking public evidence this week – one question is trying to understand the difference between a set of world class institutions and a world class system. Carbon capture and storage will be debated for too long and action will be smaller than we expect. It’s a possibility, but as cost and scale mount, alternatives will intervene. The discovery of unconventional natural gas will defer the need for CCS.

Hermann Hauser. You should have as much diversity as possible when it comes to blue sky research, but for exploitation there are only a few sectors in which the UK can be world-class.

Mark Walport. When times get tight the temptation is to slash and burn. We must maintain excellence at critical mass. We need a stable policy environment if industry is going to innovate. With all the talk of the big society, you don’t have a stable environment – that needs strong central

Chi Onwurah (Labour MP for Newcastle Central). I am a chartered engineer; having been a (black, female) engineer for the last 20 years, Parliament is at least the most diverse environment I have worked in. We need to attract a very wide range of people into science. How can we attract less well-off people to professional jobs like this?

Joyce Tait (Edinburgh) Enormous opportunities for innovation in life sciences if we can adapt the regulatory environment – a small change in the regulatory system could yield big benefits.

Helga Nowotny. Diversity is a source of creativity. But as Hermann says, you have to look at what stage you mean. But diversity can turn into fragmentation. We need more gender diversity – more women in science, and for those we have not to leave. We have a majority of female students, but many leave because the postdoc lifestyle demands a mobility and insecurity inconsistent with family life. Too much measurement means that people become cynical and learn the rules of the game, at the expense of creativity. For the ERC we see a growing number of applicants. 14% is the fraction of women among professors/advanced grant holders, but things are better for younger women.

Iain Gray. The Big Bang science fair in Manchester brought in many underprivileged children. Regulation is a hugely important area. Maybe there are some special factors in Scotland we could look at.

DW. Stable policy environment – we are trying to operate on the basis of a strong coalition government to last five years. The PM made it clear that he didn’t want to reorganise Whitehall – so we have an opportunity to provide stability. I agree about diversity. On regulation, let’s have some concrete proposals. Many exciting discussions about the difference between risk and hazard and the regulation thereof remain!

Science in the British election

It’s now clear that our election has produced no winners, least of all science. But it’s worth reflecting on what’s worked and what’s not worked in the various efforts there’ve been to raise the profile of science in an election that, in the end, was always going to be dominated by other issues.

The Campaign for Science and Engineering did a great job in extracting statements about science from each of the main parties, which have been published on their excellent blog The Science Vote. The New Scientist blog The S Word has been another excellent source of information and commentary on the campaign to raise science’s profile in the election. Predictably, the parties’ commitments to science have been notably short on detail, particularly on maintaining current levels of science spending, but it’s progress even to have some warm words.

The background has been set with a few heavyweight reports earlier in the year. In March, the Royal Society released its contribution – The Scientific Century (I was on the advisory group for this, which was a fascinating experience), while the Government’s own highest level advisory body, the Council for Science and Technology, produced their own Vision for UK Research (PDF). Three big themes emerged from these reports; the excellence of the UK science base and of the best individual researchers within it, the importance of science and technology for economic growth and our future prosperity, and the need for science to solve the pressing problems the whole world faces, of dealing with climate change, moving to a low carbon economy and keeping a growing population healthy and fed.

Predictably, it has been the economic argument that’s gained the most political traction; the Labour government produced the Hauser review, calling for translational research centres along the lines of Germany’s Fraunhofer institutes, and the Conservatives have their Dyson review, with remarkably similar conclusions. Though the emphasis of both of these contributions is on near-market research, they both at least pay lip-service to the importance of having a strong science base.

We shall see, of course, how much these warm words translate into action. One has to worry, after an election campaign in which all sides have conspicuously failed to confront the really hard choices that a government will face in dealing with the deficit, that the science budget is going to be seen as a soft target, politically, compared to areas such as health, education or defence.

Is there a significant constituency for science, that might impose any political price on cutting science budgets? This election has seen high hopes for social media as a way of mobilising a science voting bloc – see #scivote on Twitter. Looking at this, one sees something that looks very much like an attempt to develop an identity politics for science – the idea that there might be a “science vote”, in the way that people talk (correctly or not) about a “gay vote” or a “Christian vote”. There’s a sense of a community of right-minded people, with leaders from politics and the media, and clear dividing lines from the forces of unreason. What’s obvious, though, is that this strategy hasn’t worked – a candidate standing on a single issue science platform ended up with 197 votes, which compares unfavourably with the 228 votes the Monster Raving Loony Party got in my own, nearby constituency. And Evan Harris, the Liberal Democrat science spokesman and #scivote favourite, lost his own seat.

I think that science is much too important to be treated as a sectional interest; identity politics will never work for science, simply because a serious interest in science for its own sake will only ever be shown by a minority. Instead, support for science must be built from a coalition of people with many different interests and outlooks. For some the intrinsic wonder of science will be enough to strongly support it, but for many others it will be the role of science in the economy, the appeal of medical research or the importance of science for making the transition to a low carbon economy, that persuades them to take the subject seriously.

My congratulations to Dr Julian Huppert, elected Liberal Democrat MP for Cambridge. He’s a research scientist in the Cavendish Laboratory, who will now have a little less time to spend thinking about theoretical biophysics, and a bit more time worrying about science policy, and, I’m sure, many other pressing issues.

Targeted delivery of siRNA by nanoparticles in humans

An important milestone in the use of nanoparticles to deliver therapeutic molecules is reported in this week’s Nature – full paper (subscription required), editors summary. See also this press release. The team, led by Mark Davis from Caltech, used polymer nanoparticles to deliver small interfering RNA (siRNA) molecules into tumour cells in humans, with the aim of preventing the growth of these tumours.

I wrote in more detail about siRNA back in 2005 here. If one can introduce the appropriate siRNA molecules into a cell, they can selectively turn off the expression of any gene in that cell’s genome, potentially giving us a new class of powerful drugs which would be an absolutely specific treatment both for viral diseases and for cancers. When I last wrote about this subject, it was clear that the problem of delivering these small strands of RNA to their target cells was going to be a major barrier to fulfilling the promise of this very exciting new technology. In this paper, we see that substantial progress has been made towards overcoming this barrier. In this study the RNA was incorporated in self-assembled polymer nanoparticles, the surfaces of which were decorated with groups that selectively bind to proteins that are found on the surfaces of the tumour cells being targeted.
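
As a purely illustrative sketch (the sequences below are invented, and real siRNA design involves much more than exact matching), the specificity comes from Watson–Crick base pairing: a guide strand marks a transcript for silencing only where its complement appears.

```python
# Toy illustration of where siRNA's specificity comes from: a transcript is
# marked for silencing only where the guide strand's Watson-Crick complement
# appears. The sequences below are invented for illustration.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    return "".join(COMPLEMENT[base] for base in reversed(rna))

def is_silenced(transcript: str, sirna_guide: str) -> bool:
    """True if the transcript contains the guide strand's binding site."""
    return reverse_complement(sirna_guide) in transcript

guide = "UAGCCGAUUACGGAUCCGAAU"                        # made-up 21-nt guide strand
target = "AGGC" + reverse_complement(guide) + "CCGUA"  # transcript with the site
bystander = "AGGCUUACGGAUGGCAUCCGUAUGCACG"             # similar, but no exact site

print(is_silenced(target, guide))      # True  - this gene gets knocked down
print(is_silenced(bystander, guide))   # False - this one is left alone
```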

The experiments were carried out as part of a phase 1 clinical trial on humans. What the Nature paper shows is that the nanoparticles do indeed accumulate at tumour cells and are incorporated within them (see the micrograph below), and that the siRNA does suppress the synthesis of the particular protein at which it is aimed, a protein which is necessary for the growth of the tumour. If this trial doesn’t demonstrate unacceptable harmful effects, further clinical trials will be needed to demonstrate whether the therapy works clinically to arrest the growth of these tumours.

Targeted nanoparticles carrying therapeutic siRNA molecules entering a tumor cell - Caltech/Swaroop Mishra

Supporting science is about intergenerational justice

Science and technology have not yet played a big part in the current UK general election campaign, and, to be realistic, they are probably not going to. This is despite some energetic and successful ongoing efforts to raise the profile of science in UK politics, chronicled in The Science Vote – a blog from the Campaign for Science and Engineering. The immediate problem is that the lack of discussion of science reflects a broader failure of the election debate to engage with any of the really big issues facing the country at the moment. This means that it hasn’t yet been possible to find a wider context in which to place a serious discussion of science policy. One very obvious context for that discussion is the debate we need to have about how to put our economy on a more sustainable footing. But there’s a more general way of framing the role of public support for science in terms of intergenerational justice – the legacy that each generation leaves to its successors.

One British politician who does have something of a reputation for being a deep thinker, particularly in the area of social policy, is David Willetts, currently the Conservative spokesman on Higher Education. He’s recently published a book called The Pinch – how the baby boomers took their children’s future – and why they should give it back. As its title suggests, this is about intergenerational justice – the obligations we have to our children and their contemporaries. Willetts’s thesis is that the great bulge of people born between 1945 and 1965 have rigged society for their own benefit, enjoying a level of comfort and prosperity that our children will only be able to envy. Inflated house prices have priced today’s young people out of the property market, they have to pay increasing sums of money for the university education their parents received for free, and they face a future of higher taxes to pay for the pensions and healthcare of their feckless elders.

One can argue about the economic details of this thesis, as is done in this critical review by Tim Congdon, and it’s worth noting as well that there are many people in this generation who have not benefitted from this prosperity. But there’s no doubt that the argument does capture a widespread feeling – by no means restricted to the conservative end of the political spectrum – that in recent years we’ve been living off capital, whether that’s measured by the recent growth of public and private debt, or by the increasing recognition of the environmental costs of our existing economic system.

What the baby-boomers indisputably did “pinch” from their children is the prosperity that came from the last fifty years of an economy based on cheap but unsustainable fossil-fuel-based energy. But there is another side of the balance sheet. Much of our prosperity and well-being now depends on the science that was done in the 1950s, 1960s and 1970s – for example the semiconductor physics that led to the information technology revolution, and the cell biology that permitted many developments in modern medicine. That science and technology brought its own problems, too, and we need to learn those lessons. But if people in the 2050s and 2060s are going to be able to live more prosperous and secure lives and solve the energy and environmental problems we are leaving them, it’s largely going to be because of the science we are doing now. We owe it to them to make sure we support that science.

Nanotechnology and food: what should we worry about?

The potential use of nanotechnologies in food gives rise to two distinct types of issue. There’s a narrow issue, about whether engineered nanoparticles are entering the food chain, and if so, whether this leads to any dangers to human health. But there’s also a rather broader set of issues that arise from the fact that we are now able to alter the nanoscale structure of food with much greater control and purpose than before. Of course, we should balance the positive benefits of such interventions against any potential risks. But this assessment will necessarily take place in a context which goes well beyond the technical issues that surround nanotechnology, and takes in people’s deeply held instincts about our proper relationship with our food.

The key worry about engineered nanoparticles was that, because approvals for food additives didn’t typically specify size, a nanoscale version of an existing additive, which might have new problems of toxicity, might slip through the regulatory net. In Europe, at least, this potential problem seems to have been headed off by last year’s amendment of the Novel Food Regulation, making clear that any food containing or consisting of engineered nanoparticles will need mandatory pre-market assessment and approval. There are still issues relating to the precise definition of a nanomaterial to be sorted out here; nanomaterials created from natural food substances will be a particular point of contention. Nonetheless, there is progress here – one welcome and concrete sign of this is that it seems no longer possible to buy nano-silver “food supplements” from health food shops, following a recent ruling by the European Food Safety Authority.

What about the broader applications of nanotechnology in food? In my evidence to the House of Lords inquiry, I picked out three key potential areas of food nanotechnology:

• Food science at the nanoscale. This is about using a combination of fairly conventional food processing techniques supported by the use of nanoscale analytical techniques to achieve desirable properties. A major driver here will be the use of sophisticated food structuring to achieve palatable products with low fat contents.
• Encapsulating ingredients and additives. The encapsulation of flavours and aromas at the microscale to protect delicate molecules and enable their triggered or otherwise controlled release is already widespread, and it is possible that decreasing the lengthscale of these systems to the nanoscale might be advantageous in some cases. We are also likely to see a range of “nutriceutical” molecules come into more general use.
• Water dispersible preparations of fat-soluble ingredients. Many food ingredients are fat-soluble; as a way of incorporating these in food and drink without fat, manufacturers have developed stable colloidal dispersions of these materials in water, with particle sizes in the range of hundreds of nanometers. For example, the substance lycopene, which is familiar as the molecule that makes tomatoes red and which is believed to offer substantial health benefits, is marketed in this form by the German company BASF.

The key issues here aren’t so much about safety. For some, there will be an instinctive recoiling from the idea of manipulating our food at such a fundamental level, while others will regard these methods as no different in principle from more traditional cooking and food processing. Some people will argue that, rather than using nanotechnology to make food less fattening and more nutritious, we should just eat more fresh fruit and vegetables. Countering this, others will say that the reality of the way people live now dictates that pre-prepared and processed food will be increasingly important in people’s diets, so food manufacturers have a moral obligation to use available technology to make their products as healthy as possible. As is the case for many debates that centre on nanotechnology, it will be values as much as safety that people argue about.

Natural nanomaterials in food – the strange case of foie gras

Here’s a footnote to my commentary on the House of Lords nanofood report and the government response to it. There’s a recommendation (14, para 5.32) that the definition of nanomaterials for regulatory purposes should exclude nanomaterials created from natural food substances, with which the government agrees. I accept that this distinction is a practical necessity, and I would go along with the report’s paragraph 5.31: “We acknowledge that nanomaterials created from naturally-occurring materials may pose a potential risk to human health. However, we also recognise also that it is impractical to include all natural nanomaterials present in food under the Novel Foods Regulation, and that many natural nanoscale substances have been consumed for many years with no ill effects reported.”

But I do think it is important to contest the general assertion that things that are natural are by definition harmless. There’s a long tradition of using food processing techniques to render harmless naturally occurring toxins, whether that’s simply the cooking process needed to destroy toxic lectins in kidney beans, or the more elaborate procedures needed to make some tropical tubers, like cassava, safe to eat.

There’s been a recent report of a situation where a potential link between eating naturally formed nanomaterials and disease has been identified. The nanomaterials in question are amyloid fibrils – nanoscale fibrous aggregates of misfolded proteins, of a kind that have been associated with a number of diseases, notably Alzheimer’s disease and Creutzfeldt-Jakob disease (see this earlier post for an overview of the good and bad sides of these materials – Death, life, and amyloids).

In a paper published in Proceedings of the National Academy of Sciences a couple of years ago, Solomon et al (2007) showed that foie gras – the highly fatty livers of force-fed geese – contains nanofibers (amyloid fibrils) of amyloid A protein, which when fed to mice susceptible to AA amyloidosis led to the development of that disease.

AA amyloidosis is a chronic, incurable condition often associated with rheumatoid arthritis. The suggestion is that, if AA amyloid fibrils enter the body, they will act as “seeds” to nucleate the conversion of more AA protein into the amyloid form. A more speculative suggestion is that AA fibrils could also nucleate the formation of amyloid fibrils by other susceptible proteins, leading to other kinds of amyloid diseases. The authors of the paper draw the conclusion that “it may be hazardous for individuals who are prone to develop other types of amyloid-associated disorders, e.g., Alzheimer’s disease or type II diabetes, to consume such products” (i.e. ones contaminated with amyloid protein A fibrils). It seems to me that it is stretching what the data shows too far to come to this conclusion at the moment, but it’s an area that would bear closer investigation.
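
To see why a small ingested “seed” could matter, here is a deliberately simplified kinetic sketch with arbitrary, unfitted parameters: soluble protein converts to the fibrillar form very slowly on its own, but at a rate proportional to the fibril already present, so even a 1% seed dramatically shortens the lag before aggregation takes off.

```python
# A deliberately simplified seeded-aggregation sketch (arbitrary parameters,
# not fitted to any data): d(fibril)/dt = (k_nuc + k_grow * fibril) * soluble.
# Spontaneous nucleation (k_nuc) is slow; growth on existing fibril is fast,
# so a small pre-formed "seed" removes the long lag phase.

def half_time(seed_fraction, k_nuc=1e-4, k_grow=0.5, dt=0.01, t_max=40.0):
    """Time for half the soluble protein to convert to the fibrillar form."""
    soluble, fibril, t = 1.0 - seed_fraction, seed_fraction, 0.0
    while t < t_max and soluble > 0.5:
        rate = (k_nuc + k_grow * fibril) * soluble
        fibril += rate * dt
        soluble -= rate * dt
        t += dt
    return t

print("half-time, no seed :", round(half_time(0.0), 1))   # long lag phase
print("half-time, 1% seed :", round(half_time(0.01), 1))  # lag largely abolished
```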

Nanotechnologies and food – the UK government responds to the House of Lords

Last week the UK government issued its response to a report on nanotechnologies and food from the House of Lords Select Committee on Science and Technology, which was published on 8 January this year.

The headlines arising from this report concentrated on a perceived lack of openness from the food industry (see for example, the BBC’s headline Food industry ‘too secretive’ over nanotechnology), and it’s this aspect that the consumer group Which? concentrates on in its reaction to the response – Government ignores call for openness about nano food. This refers to House of Lords recommendation 10, which calls for the establishment by the Food Standards Agency of a mandatory, but confidential, database of those nanomaterials being researched by the food industry. The government has rejected the proposal that this should be mandatory, on the grounds that this would drive research away from the UK. However, the government has accepted the recommendation (26) that the FSA maintains a publicly accessible list of food and food packaging products that contain nanomaterials. This will include, as recommended, all materials that have been approved by the European Food Safety Authority, but the FSA will explore including other materials that might be considered to have nanoscale elements, to allow for the uncertainties of definition of nanomaterials in the food context. Where their Lordships and the government agree (and disagree with Which?) is in rejecting the idea of compulsory labelling of nanomaterials on packaging.

The House of Lords report, together with all the written and oral evidence submitted to the inquiry, can be downloaded from here. For my own written evidence, see here – I mentioned my oral evidence in this blog post from last year.

Responsible innovation still needs innovation

The UK government’s policies for nanotechnology seem to unfold in a predictable and cyclical way – some august body releases a weighty report criticising some aspect of policy, the government responds in a way that disappoints the critics, and then the cycle is repeated with another critical report and a further response. The process began with the Royal Society/ Royal Academy of Engineering report in 2004, and several cycles on, last week we saw a new comprehensive government Nanotechnology Strategy launched (downloadable, if you’re lucky, from this somewhat flakey website). One might have thought that this process of dialectic would, by now, have led to a compelling and robust strategy, but that doesn’t seem to be the case here.

The immediate prompt for the strategy this time was the Royal Commission on Environmental Pollution (RCEP) report ‘Novel materials in the environment: the case of nanotechnology’, from 2008 (see Deja view all over again for my view of that report). As its title suggests, that report had much to say about the potential risks posed by nanomaterials in the environment; it also had some rather interesting general points to make about the problems of regulating new technologies in the face of inevitable uncertainty. Unfortunately, it’s the former rather than the latter that dominates the new Nanotechnology Strategy. The government has been criticised so much, ever since the Royal Society/Royal Academy of Engineering report, about the lack of action on the possibility of nanoparticle toxicity that it is defensiveness about this issue that dominates the strategy. Even then, the focus is narrowly on toxicology, missing yet again the important broader issues around life-cycle analysis that will determine the circumstances and extent of potential human exposure to nanomaterials.

Moving to the section on business, the stated aim is to have a transparent, integrated, responsible and skilled nanotechnologies industry. I can’t argue with transparent, responsible and skilled, but I wonder whether there’s an inherent contradiction in the idea of an integrated nanotechnologies industry. Maybe the clue as to why the industry is fragmented is in this phrase; the report talks about nanotechnologies, recognising that there are many technologies contained within this category, and it lists a dozen or more markets and sectors in which these technologies are being applied. Given that both the technologies and the markets are so diverse, why would one expect an integrated industry, or even think that is desirable?

The arm of government charged with promoting technological innovation in business and industry is the Technology Strategy Board (TSB), an agency of government which has an arms-length relationship with its sponsoring department, Business, Innovation and Skills. The TSB published its own strategy on nanotechnology last year – Nanoscale Technologies Strategy 2009-2012 (PDF here), and the discussion in the Nanotechnology Strategy draws extensively on this. This makes clear that TSB doesn’t really regard nanotechnology as something to be supported in itself – instead, they expect nanotechnology to contribute, where appropriate, to their challenge-led funding programs – the Fighting Infection through Detection competition is cited as a good example. One very visible funding initiative that TSB is responsible for, that is focused on nano- (and micro-) technologies, is the network of MNT capital facilities (though it should be noted that TSB only inherited this program, which was initiated in the late Department of Trade and Industry before the TSB was formed). It now seems that these facilities will receive little or no dedicated funding in the future; instead they will have to bid for project funding in open competition. There’s a hint that there might be an exception to this. Nanomedicine is an area identified for future investment, and this comment is tantalisingly juxtaposed to a reference to a forthcoming report to BIS from the prominent venture capitalist Hermann Hauser, which is expected to recommend (in a report due out today) that the government funds a handful of centres for translational research, modelled on the German Fraunhofer Institutes. I think it is fair to say, on the basis of reading this and the TSB Nanoscale Technologies Strategy, that TSB is at best ambivalent in its belief in a nanotechnology industry, looking instead for important applications of nanotechnology in a whole variety of different application areas.

The largest chunk of government funding going to nanotechnology in the UK – probably in the region of £40-50 million a year – comes through the research councils, and here the Nanotechnology Strategy is at its weakest. The lead agency for nanotechnology is the Engineering and Physical Sciences Research Council (EPSRC), and the only initiatives that are mentioned are ones that have already been launched, as part of the minimum fulfillment of the EPSRC’s most recent nanotechnology strategy, published in 2006 (available here as a Word document). It looks like the Research Councils UK priority theme Nanoscale Science: Engineering through Application has run its course, and nanotechnology funding from the research councils in the future will have to come either from standard, responsive mode proposals or as part of the other mission programmes, such as Sustainable Energy Systems, Ageing: lifelong health and wellbeing, or the widely trailed new priority theme Resilient Economy.

Essentially, then, with the exception of a possible new TSB-led initiative in nanomedicine, it looks like there will be no further targeted interventions specifically for nanotechnology in the UK. For this reason, the section in the strategy on public engagement is particularly unsatisfying. We’ve seen a growing consensus about public engagement with science in the UK, which is simply not reflected in this strategy. That consensus is that public engagement mustn’t simply be seen as a way of securing public acquiescence to new technology; instead it should be a genuine dialogue which aims to ensure that innovation is directed at widely accepted societal goals, carried out “upstream”, to use the word introduced by an influential report some years ago. But without some upstream innovation to engage with, you can’t have upstream engagement.

Feynman, Drexler, and the National Nanotechnology Initiative

It’s fifty years since Richard Feynman delivered his famous lecture “There’s Plenty of Room at the Bottom”, and this has been the signal for a number of articles reflecting on its significance. This lecture has achieved mythic importance in discussions of nanotechnology; to many, it is nothing less than the foundation of the field. This myth has been critically examined by Chris Toumey (see this earlier post), who finds that the significance of the lecture is something that’s been attached retrospectively, rather than being apparent as serious efforts in nanotechnology got underway.

There’s another narrative, though, that is popular with followers of Eric Drexler. According to this story, Feynman laid out in his lecture a coherent vision of a radical new technology; Drexler popularised this vision and gave it the name “nanotechnology”. Then, inspired by Drexler’s vision, the US government launched the National Nanotechnology Initiative. This was then hijacked by chemists and materials scientists, whose work had nothing to do with the radical vision. In this way, funding which had been obtained on the basis of the expansive promises of “molecular manufacturing”, the Feynman vision as popularized by Drexler, has been used to research useful but essentially mundane products like stain resistant trousers and germicidal washing machines. To add insult to injury, the materials scientists who had so successfully hijacked the funds then went on to belittle and ridicule Drexler and his theories. A recent article in the Wall Street Journal – “Feynman and the Futurists” – by Adam Keiper is written from this standpoint, in a piece that Drexler himself has expressed satisfaction with on his own blog. I think this account is misleading at almost every point; the reality is both more complex and more interesting.

To begin with, Feynman’s lecture didn’t present a coherent vision at all; instead it was an imaginative but disparate set of ideas linked only by the idea of control on a small scale. I discussed this in my article in the December issue of Nature Nanotechnology – Feynman’s unfinished business (subscription required), and for more details see this series of earlier posts on Soft Machines (Re-reading Feynman Part 1, Part 2, Part 3).

Of the ideas dealt with in “Plenty of Room”, some have already come to pass and have indeed proved economically and societally transformative. These include the idea of writing on very small scales, which underlies modern IT, and the idea of making layered materials with precisely controlled layer thicknesses on the atomic scale, which was realised in techniques like molecular beam epitaxy and CVD, whose results you see every time you use a white light emitting diode or a solid state laser of the kind your DVD player contains. I think there were two ideas in the lecture that did contribute to the vision popularized by Drexler – the idea of “a billion tiny factories, models of each other, which are manufacturing simultaneously, drilling holes, stamping parts, and so on”, and, linked to this, the idea of doing chemical synthesis by physical processes. The latter idea has been realised at the proof of principle level by doing chemical reactions using a scanning tunnelling microscope; there’s been a lot of work in this direction since Don Eigler’s demonstration of STM control of single atoms, no doubt some of it funded by the much-maligned NNI, but I think it’s fair to say that this approach has so far turned out to be more technically difficult and less useful (on foreseeable timescales) than people anticipated.

Strangely, the second part of the fable, which talks about Drexler popularising the Feynman vision, I think actually underestimates the originality of Drexler’s own contribution. The arguments that Drexler made in support of his radical vision of nanotechnology drew extensively on biology, an area that Feynman had touched on only very superficially. What’s striking if one re-reads Drexler’s original PNAS article and indeed Engines of Creation is how biologically inspired the vision is – the models he looks to are the protein and nucleic acid based machines of cell biology, like the ribosome. In Drexler’s writing now (see, for example, this recent entry on his blog), this biological inspiration is very much to the fore; he’s looking to the DNA-based nanotechnology of Ned Seeman, Paul Rothemund and others as the exemplar of the way forward to fully functional, atomic scale machines and devices. This work is building on the self-assembly paradigm that has been such a big part of academic work in nanotechnology around the world.

There’s an important missing link between the biological inspiration of ribosomes and molecular motors and the vision of “tiny factories” – the scaled-down mechanical engineering familiar from the simulations of atom-based cogs and gears from Drexler and his followers. What wasn’t fully recognised until after Drexler’s original work was that the fundamental operating principles of biological machines are quite different from the rules that govern macroscopic machines, simply because the way physics works in water at the nanoscale is quite different to the way it works in our familiar macroworld. I’ve argued at length on this blog, in my book “Soft Machines”, and elsewhere (see, for example, “Right and Wrong Lessons from Biology”) that this means the lessons one should draw from biological machines should be rather different to the ones Drexler originally drew.
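
One way to see why the macroscopic intuition fails is simply to put rough numbers in (back-of-envelope values only): for a protein-sized machine moving through water, the thermal energy kT is comparable to the weak interactions holding such a machine together, and the Reynolds number is so small that inertia is irrelevant – viscosity and Brownian motion dominate everything.

```python
# Back-of-envelope numbers for a ~10 nm "machine" in water at body temperature.
# Rough values, for illustration only.

k_B = 1.380649e-23    # Boltzmann constant, J/K
T = 310.0             # body temperature, K
eta = 1.0e-3          # viscosity of water, Pa*s
rho = 1.0e3           # density of water, kg/m^3
L = 10e-9             # characteristic size of the machine, m
v = 1e-6              # a generous speed for a molecular machine, m/s

thermal_energy = k_B * T          # energy of the ever-present Brownian kicks
reynolds = rho * v * L / eta      # ratio of inertial to viscous forces

print(f"thermal energy kT ~ {thermal_energy:.1e} J "
      f"(~{thermal_energy / 1.602e-19 * 1000:.0f} meV, comparable to weak bonds)")
print(f"Reynolds number   ~ {reynolds:.0e} (inertia is utterly negligible)")
```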

There is one final point that’s worth making. From the perspective of Washington-based writers like Keiper, one can understand that there is a focus on the interactions between academic scientists and business people in the USA, Drexler and his followers, and the machinations of the US Congress. But, from the point of view of the wider world, this is a rather parochial perspective. I’d estimate that somewhere between a quarter and a third of the nanotechnology in the world is being done in the USA. Perhaps for the first time in recent years, a major new technology is largely being developed outside the USA, in Europe to some extent, but with an unprecedented leading role being taken in places like China, Korea and Japan. In these places the “nanotech schism” that seems so important in the USA simply isn’t relevant; people are just pressing on to where the technology leads them.