A good way of assessing whether a writer knows what they are talking about when it comes to nanotechnology is to look at what they say about quantum mechanics. There’s a very widespread view that what makes the nanoscale different to the macroscale is that, whereas the macroscale is ruled by classical mechanics, the nanoscale is ruled by quantum mechanics. The reality, as usual, is more complicated than this. It’s true that there are some very interesting quantum size effects that can be exploited in things like quantum dots and semiconductor heterostructures. But then lots of interesting materials and devices on the nanoscale aren’t ruled by quantum mechanics at all; for anything to do with mechanical properties, for example, nanoscale size effects have quite classical origins, and with the exception of photosynthesis almost nothing in bionanotechnology has anything to do with quantum mechanics. Conversely, there are some very common macroscopic phenomena that simply can’t be explained except in terms of quantum mechanics – the behaviour of electrical conductors and semiconductors, and the origins of magnetic materials, come immediately to mind.
Here’s a fairly typical example of misleading writing about quantum effects: The “novel properties and functions” are derived from “quantum physics” effects that sometimes occur at the nanoscale, that are very different from the physical forces and properties we experience in our daily lives, and they are what make nanotechnology different from other really small stuff like proteins and other molecules. This is from NanoSavvy Journalism, an article by Nathan Tinker. Despite its seal of approval from David Berube, this is very misleading, as we can see if we look at his list of applications of nanotechnology and ask which depend on size-dependent quantum effects.
…right so far; the use of things like semiconductor heterostructures to make quantum wells certainly does depend on exploiting qm…
…here we are talking about systems with high surface to volume ratios, self-assembled structures and tailoring the interactions with biological macromolecules like proteins, all of which has nothing at all to do with qm…
…if we are talking liposomes, again we’re looking at self-assembly. To explain the transparency of nanoscale titania for sunscreen, we need the rather difficult, but entirely classical, theory of Mie scattering.
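To see roughly why particle size matters here, one can use the small-particle (Rayleigh) limit of Mie theory, in which the scattering cross-section of a particle grows as the sixth power of its diameter. The sketch below is illustrative only: the 50 nm and 500 nm diameters are assumed example values, not a claim about any real sunscreen formulation.

```python
# Illustrative sketch only: in the small-particle (Rayleigh) limit of Mie
# theory, the scattering cross-section of a particle scales as d**6 at fixed
# wavelength. The 50 nm / 500 nm diameters below are assumed example values.

def rayleigh_per_particle_ratio(d_small, d_large):
    """Ratio of scattered intensity per particle for two diameters
    (same material and wavelength), Rayleigh limit: sigma ~ d**6."""
    return (d_small / d_large) ** 6

def rayleigh_per_mass_ratio(d_small, d_large):
    """At fixed total particle mass, particle number scales as 1/d**3,
    so scattering per unit mass scales as d**3."""
    return (d_small / d_large) ** 3

# Shrinking particles from 500 nm to 50 nm:
print(rayleigh_per_particle_ratio(50, 500))  # each particle scatters ~1e6 times less
print(rayleigh_per_mass_ratio(50, 500))      # the same total mass scatters ~1000 times less
```

Nothing quantum appears anywhere in this estimate: the transparency of nanoscale titania follows from classical scattering theory alone.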
The list goes on, but I think the point is made. All sorts of interesting and potentially useful things happen at the nanoscale, only a fraction of which depend on quantum mechanics.
On the opposition side, the argument about the importance of quantum mechanical effects is pressed into service as a reason for anxiety; since everyone knows that quantum mechanics is mysterious and unpredictable, it must also be dangerous. I’ve commented before on the misguided use of this argument by ETC; here’s the Green Party member of the European Parliament, Caroline Lucas, writing in the Guardian: The commercial value of nanotech stems from the simple fact that the laws of physics don’t apply at the molecular level. Quantum physics kicks in, meaning the properties of materials change. This idea of the nanoscale as a lawless frontier in which anything can happen is rather attractive, but unfortunately quite untrue.
Of course, the great attraction of quantum mechanics is all the fascinating, and usually entirely irrelevant, metaphysics that surrounds it. This provides a trap for otherwise well-informed business people to fall into, exposing themselves to the serious danger of ridicule from TNTlog, (whose author, besides being a businessman, has had the unfair advantage of having a good physics education).
I know that it’s scientists who are to blame for this mess. Macroscopic=classical, nanoscale=quantum is such a simple and clear formula that it’s tempting for scientists communicating with the media and the public to use it even when they know it is not strictly true. But I think it’s now time to be a bit more accurate about the realities of nanoscale physics, even if this brings in a bit more complexity.
I agree with you Richard, maybe it is time to start a counter-meme. Much of the power and advantage of nanotechnology comes from simple geometrical relationships, not quantum mechanics.
A high school level understanding of geometry is all you need to grasp many of the advantages of nanotechnology. This way of thinking can really demystify many of the dangers and opportunities of nanotechnology. Just understanding the implications of how the surface area to volume ratio changes as you get smaller would go a long way in educating the public. Relatively simple scaling laws can give a good indication of how physical properties change as something gets smaller. (For example, inertia is proportional to the cube of an object’s length: take two spheres of the same material, one 1 cm across and the other 0.1 cm across, and the smaller sphere has 1,000 times less inertia.)
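Jim’s scaling-law point can be put in a few lines; the function and numbers below are purely illustrative:

```python
# Purely illustrative: simple geometric scaling laws, no quantum mechanics.

def scale_properties(length_ratio):
    """How basic properties change when every linear dimension of an
    object is multiplied by length_ratio (< 1 means shrinking)."""
    return {
        "surface_area": length_ratio ** 2,      # area scales as L**2
        "volume_and_mass": length_ratio ** 3,   # volume, mass, inertia as L**3
        "surface_to_volume": 1 / length_ratio,  # ratio scales as 1/L
    }

# Jim's example: shrink a sphere tenfold (1 cm -> 0.1 cm).
small = scale_properties(0.1)
print(small["volume_and_mass"])    # 1000 times less mass and inertia
print(small["surface_to_volume"])  # 10 times more surface per unit volume
```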
Another well-thought-out piece, Richard. Let me chime in with my usual paranoid-sounding rantings, though they are based on extensive field studies of the political animal in his natural habitat. Nothing against Nathan Tinker, since I come from the same kind of background as he does (except I suspect that he knows much more than I do about 16th century women’s poetry, unicorns and leprechauns), but what Nathan does is something completely different from what you do. He considers himself to be a public-relations person for nanotechnology. That means it’s his job to put a positive spin on everything. His attempt to set journalists on the correct path to nano wisdom is nothing but a press release (“Nano is responsible technology”) and no more time should be wasted on it than on any other press release that crosses your desk every day. What I find especially ridiculous is that when official nano spokespeople want to evoke a sense of wonder, they take out that whole “quantum effects” phrase for, well, effect. This isn’t just small tech, this is magical and mysterious because of those “quantum effects.” However, “quantum effects” only have the power to do good. There is no dark side to this mysterious force …
I have a slightly different outlook upon it, which is that quantum mechanics is at the basis of what you are talking about – it’s just that it’s unusable in many situations. Or maybe it’s a bit like some physicists have said, mostly jokingly, that chemistry is just physics. Sure, it depends upon the deeper physics for overall comprehension (e.g. orbitals etc.) but operates on a different level.
I think the real question is whether describing the system using a quantum mechanical framework is any more accurate than using classical theory. Take, for example, the difference between classical thermodynamics and quantum statistical mechanics. Both will give the same answer in the averaged, macroscopic world, but quantum stat mech provides a deeper understanding of what is happening on the molecular level. Using quantum mechanics doesn’t change what happens… it allows us to understand better what is happening.
An example: How does eukaryotic DNA Polymerase synthesize DNA with such high fidelity? It turns out that DNA Polymerase has multiple functions: it can catalyze the DNA polymerase reaction, it can cut DNA in between any base pair (endonuclease activity), and it can also cut at the ends of the DNA (exonuclease activity). Classically, we say that the enzyme performs its polymerase activity and, when a misaligned or incorrectly synthesized base pair is created, the incorrect end or incorrect ‘bubble’ in the DNA is cut out before the Polymerase continues to proceed. But what is really happening? There is always a non-zero probability of a correct end being cut, and always a non-zero probability of an incorrect bubble or base pairing not being repaired. The electron densities of the misaligned DNA shift the potential energy well governing the DNA’s motion through the polymerase to favor the exo- or endonuclease activity, and the electron densities of the correct base shift the potential energy back toward the polymerase activity. But, like all multidimensional potentials with multiple minima, there are transition probabilities from one minimum to another. You can measure the probability of incorrect repair, but with quantum mechanics you can _calculate it_. You can’t do that with classical physics.
(To be precise, because there’s reactions going on, you can’t use classical stat mech to exactly describe the system. If you were only dealing with the movement and motions of molecules, classical stat mech should do fine.)
-Howard Salis
Well, Howard (wonderful name, by the way. I’m sure you must be a handsome fella), the original point of Richard’s piece was to critique some advice being given to journalists covering nanotechnology.
We’ve established that quantum mechanics comes into play some of the time, but not as often as the media like to portray, and not always dependent on size.
What you describe above makes sense. You can talk about synthesizing DNA in terms of a straightforward, classical process, or you could up the “cool” factor for extra-geeky readers by introducing quantum mechanical concepts like potential energy wells and transition probabilities. And either description would be correct. However, if your local newspaper used the same language you used above, readers would be scratching their heads and flipping back to the THONG protest.
So, the question remains — and I’d really like an answer to this, since this is what I’m attempting to do for a living — how deep into quantum mechanics does a writer need to go when describing a nanotech concept for a general audience? What I would tell a reporter who asked me that question is that it all depends on the context of the story. How relevant to the story’s central theme are these distinctions? Is there a “need to know,” or would rambling off on a quantum tangent only confuse and lose your readers? On the Web, I have a perfect coward’s way out. I can describe things in simple terms and link to technical explanations or definitions for anybody who’s interested. But that is still not acceptable if those definitions are relevant within the context of the story.
The trick is to make nanotechnology understandable to a general audience by breaking it down into easily digestible chunks. Guide readers along through concepts. That is far different from “dumbing it down,” simplifying or using meaningless generalizations as Tinker suggested.
Howard
This post is quite interesting!
I have some questions. If present trends continue, we may have terahertz laptops by 2020, with supercomputers in the petahertz range. With this sort of computing power, lattice models become possible, which should fundamentally change how computational chemistry is done.
What impact if any will such models have regarding using quantum or classical approaches to Nanotech? In particular, how much trust do experimentalists have in computer modeling to help in Nanotech problems?
An amateur mathematician
Well, Howard (it’s like I’m talking to myself 😉 ),
Sometimes using quantum mechanics to describe a system will just give you a more accurate answer. If you can get a good answer using classical methods, then you might as well (because doing ab initio calculations is extremely computationally intensive). But the ‘cool factor’ comes in when the answer provided by using quantum mechanics is very different from using a classical description. That doesn’t happen that often. Even with very ‘small’ systems.
You can still describe self-assembly, hydrogen bonding, van der Waals forces, entropic effects, etc with classical stat mech methods. And it’s cheaper than calculating wave functions.
So I would use the mathematical description that the scientist, engineer, or mathematician used to describe the system to get their answer. But using quantum mechanics just to play up the ‘cool factor’ is pointless. You can use quantum mechanics to describe a macroscopic system, but it would give you the same answer as a classical description (and it would be nigh impossible to get the quantum answer due to the computational cost).
-Howard Salis
I was browsing through my journal update emails and I found a representative ab initio calculation study. The abstract is:
“An ab initio study of the electronic and vibrational properties of pyrazine⋯HX and XH,pyrazine⋯HX hydrogen-bonded complexes (X=F, NC, Cl, CN and CCH)
João Bosco P. da Silva, Mário R. Silva Júnior, Mozart N. Ramos and Sérgio E. Galembeck
Ab initio molecular orbital calculations have been performed at the MP2/6-31++G** theoretical level in order to obtain molecular properties of hydrogen-bonded complexes involving pyrazine and the HX linear acids with X=F, NC, Cl, CN, and CCH. Both the 1:1 and 1:2 adducts have been investigated, i.e., Pyz⋯HX and XH,Pyz⋯HX. The H-bond strength for these complexes can be interpreted in terms of the H–X stretching frequency downward displacement. This latter shows a good linear correlation with the intermolecular charge transfer and follows the order: X=F>X=NC>X=Cl>X=CN>X=CCH. As expected, the H–X stretching intensity is much enhanced upon H-bond formation. Here these effects on the HX infrared spectra are much more pronounced than those previously found in complexes with unsaturated hydrocarbons as proton acceptors. On the other hand, no important change has been verified in pyrazine, except the inhibition of the lone-pair trans effect from the lone pair of nitrogen to the σ* antibonding orbital of the CC chemical bond in pyrazine. The new vibrational modes arising from complexation show several interesting features, specially the bending modes of the proton-donor molecule and the intermolecular stretching mode.”
These guys used computers to calculate the electron densities of molecules with different properties…and found good correlation to macroscopically observed behavior. So is there a ‘cool factor’ or does QM simply give an accurate answer? For here, I think it’s the latter.
Jim, your point is a good one. If you’ve got a system that starts behaving oddly as it gets smaller, the first place to look for an explanation is simple geometrical scaling, particularly surface to volume ratios. Then you ask if the phenomenon you are looking at has a natural length scale that separates “small” from “large”. This could be quantum related, for example an electron mean free path if you are looking at the transition from normal to ballistic transport in a small semiconductor device. But it could equally be classical, like the Griffiths critical crack length for mechanical properties, or the wavelength of light for scattering behaviour. The appropriate length is going to be quite different for different properties (which is why, incidentally, it makes no sense at all to try and define “nanotechnology” with some arbitrary lengthscale like 100 nm).
Howard, it probably was unfair to single out Nathan Tinker as what he said has been said by many others. But I was struck by the juxtaposition of a very strong statement that nanotechnology must involve quantum size effects with a list of nanotechnologies that quite clearly don’t.
As regards politics, here’s how it might have happened (as I’m an academic, not a journalist, this isn’t a conspiracy theory, simply a hypothesis I throw out to my social scientist colleagues for rigorous testing). When it first became apparent that the “nanotechnology” label might be a bit of a funding magnet, the semiconductor physicists took the view that they ought to be first in line. After all, they had all kinds of cool kit, like molecular beam epitaxy, electron beam lithography and the rest, that allowed them to make new materials, not quite atom by atom, but certainly with nanoscale precision, with dramatic and useful new properties. The only fly in the ointment was the thought that those low-life scum materials scientists would try and muscle in on the act by tipping a mixture of clay and nylon into an extruder and calling the result a nanocomposite. The solution was obvious – make sure nanotechnology had to involve quantum mechanics, by definition, and those pesky materials scientists, not to mention (shudder!), those grubby chemical engineers would be thwarted. Then the dot com crash arrived; suddenly saying you were going to make a better laser for getting more bandwidth out of your optical fibres didn’t look such a compelling selling point and million-dollar molecular beam epitaxy machines could be had for a few thousands in garage sales all over the western world. The materials scientists were still standing, though. But no-one seemed that bothered about changing the definition; quantum mechanics conveyed mystery and glamour, and in any case few people really knew the difference between Mie scattering and quantum confinement.
T’other Howard: as Guthrie says, in a real sense all chemistry is quantum mechanics, so you certainly do need qm to calculate things from first principles. But in many situations the quantum mechanics can be hidden. Atoms and molecules interact through potentials which we need qm to calculate, but if we take the potentials as given (and if we can’t do the first principles calculations we can work them out from empirical evidence) then classical methods work fine. This contrasts with situations where the quantum mechanics can’t be hidden, because the concepts we are dealing with are fundamentally quantum in character and simply don’t have any meaning in classical mechanics. A relevant example here would be the band-gap of a semiconductor quantum dot. But even in this example, you can’t say that the difference between the macroscopic and the nanoscale is that classical mechanics work in the former but you need quantum mechanics for the latter. You still need quantum mechanics to understand what the electrons are doing in a brick of cadmium selenide; it’s just that as your lump of semiconductor gets small the size starts to affect the (still quantum mechanical) way in which the electrons are behaving, and you get the characteristic size dependence of the fluorescence of a quantum dot.
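As an illustration of the size dependence Richard describes, here is the crudest possible estimate of quantum confinement: an electron in a one-dimensional box, whose ground-state energy grows as 1/L². Real quantum-dot calculations use effective masses and corrections (e.g. the Brus equation); the bare electron mass below is an assumption made purely for clarity.

```python
import math

# Crudest possible illustration of quantum confinement: the ground-state
# energy of a particle in a 1D box of width L scales as 1/L**2. Real
# quantum-dot calculations use effective masses and corrections (e.g. the
# Brus equation); the bare electron mass here is an assumption for clarity.

H_BAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def confinement_energy_eV(box_nm):
    """Ground-state energy pi^2 hbar^2 / (2 m L^2) of an electron
    confined to a 1D box of width box_nm nanometres, in eV."""
    L = box_nm * 1e-9
    return (math.pi ** 2 * H_BAR ** 2) / (2 * M_E * L ** 2) / EV

print(confinement_energy_eV(1.0))     # roughly 0.38 eV for a 1 nm box
print(confinement_energy_eV(1000.0))  # negligible for a micron-sized crystal
```

The confinement term is vanishingly small for a macroscopic crystal but becomes significant at nanometre sizes, which is the origin of the size-tunable fluorescence of a quantum dot.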
Zelah, computer modelling is very powerful but there are still limitations, and it will remain so for some time. The point is that to understand from first principles the behaviour of a moderately complex nanomachine (the molecular motor ATP-synthase, for example) you need to account for not only a large number of atoms, but also a very wide range of timescales – from picoseconds out to seconds. To do this using ab-initio quantum chemistry will be out of the question for some time yet. Instead, what people do is really suggested by the argument above; you attack the problem hierarchically. Use ab-initio methods to give you parameters for the potential that you would use in a technique like classical molecular dynamics, and then use the results of MD to input parameters into an even more coarse-grained model, with which you can compute the problem out to experimentally relevant timescales.
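The hierarchical strategy Richard describes can be caricatured in a few lines. Here the Lennard-Jones parameters stand in for quantities that in real work would come out of ab initio quantum chemistry (the values below are standard textbook estimates for argon, assumed purely for illustration); the classical stage then uses that potential with no further reference to quantum mechanics.

```python
# Toy caricature of the hierarchical approach. The Lennard-Jones parameters
# stand in for quantities that, in real work, would come out of ab initio
# quantum chemistry; the values below are standard textbook estimates for
# argon, assumed here purely for illustration.

EPSILON = 1.65e-21  # J, depth of the pair-potential well ("ab initio" input)
SIGMA = 3.4e-10     # m, Lennard-Jones length scale ("ab initio" input)

def lj_potential(r):
    """Classical Lennard-Jones pair potential: all the quantum mechanics
    is hidden inside the two parameters."""
    sr6 = (SIGMA / r) ** 6
    return 4 * EPSILON * (sr6 ** 2 - sr6)

# The classical stage uses the potential with no further QM, e.g. to find
# the equilibrium pair separation a molecular dynamics code needs (scanned
# here in 1 pm steps; the analytic answer is 2**(1/6) * SIGMA):
r_min = min((r * 1e-12 for r in range(300, 600)), key=lj_potential)
print(r_min)  # close to 3.8e-10 m, i.e. about 0.38 nm
```

Results from such a classical stage would in turn feed an even more coarse-grained model, exactly as described above.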
This is fascinating on a number of levels, but when it comes to a journalist making sense of nanoTECHNOLOGY and describing it to the general public, the writer should simply back up a little bit and listen closely to his sources. If the nanotech “breakthrough” being discussed involves exploitation of quantum effects, then it’s important to say so. The knowledge base of the readership will determine exactly how detailed you need to be, but always with links or references to more detailed descriptions just in case you pick up a reader or two who really wants or needs to learn more.
But there comes a point in the discussion where you simply reach the limits of human knowledge, right? Unless I missed it, there is still no settled-upon theory that unites quantum and classical physics, and the person who finds one would likely be too busy claiming his Nobel to bother reading blogs.
Nanotechnology harnesses what is possible at the (yes, arbitrary) scale of 100 nanometers or less. It is not necessary for a scientist to know all the “whys” in order to apply his or her discovery to a practical application, only that it does work.
Richard, you’re much better than I am at weaving complex conspiracy theories. It’s very possible that from the point of view of the semiconductor industry, you’re correct. But the truth is that what is today called “nanotechnology” is a simultaneous convergence of all these fields — chemistry, materials science, life science, etc. — into this tiny realm. And, in almost all cases, reaching the nanometer scale enables the kind of targeted, personalized products that all industries have been longing for. The semiconductor industry is, in fact, a little late in the game. They’re calling their chips with features of 90 nanometers and less “nanotechnology,” but it’s still the old, clunky photolithography, and that’s why Intel and others were having such trouble at the factory. They finally got nano religion only recently, with attempts to launch a semiconductor nanotech initiative, and are looking at nanotech companies that can offer them some near-term gain. Ultimately, though, the entire industry is going to have to be torn down and built again from the bottom up.
Along the way, the semiconductor industry, along with everybody else along for the nano ride, will learn from one another how these quantum effects can both thwart and enhance their efforts. But the bottom line is that nanotechnology exploits what is possible. The whys and hows will still keep the theoretical physicists gainfully employed.
Howard
Howard said: “Richard, you’re much better than I am at weaving complex conspiracy theories” … of course, I’m an academic, it’s in the job description! But I must correct you on one thing. When I was talking about semiconductor physicists, I didn’t mean people in the semiconductor industry. In academia, semiconductor physicists have truly been exploiting quantum effects in nanostructures for more than twenty years (I’ll be writing more about this in Feynman part 3). Where these devices have become commercially important is not in microprocessor manufacturers like Intel, but in the optoelectronic equipment that makes the internet work. In the home, it’s the laser that makes your DVD work that derives from this technology.
As the author of the paper in question, a couple of points:
1) If I remember my Logic 101 course, to throw out an entire piece of work because of a single issue is a fallacy called Hasty Generalization. Perhaps my use of “quantum effects” was imprecise, but that single sentence is the only point Dr. Jones takes issue with in a 5-page paper. If that’s the only real issue a professor of physics and astronomy can find with the paper, I feel pretty good. (As an aside, the paper was vetted before publication by 3 eminent nano-scientists, including a Japan Prize winner, and none took issue with the “quantum effect” phrase.)
2) Howard Lovy is right in one thing he says about the paper, despite the rather disparaging way he put it: the paper was not intended for the nanotelligencia, but for journalists, most of whom have had little or no introduction to nano before being asked to write about it. Throughout the paper I encourage media to GO TO EXPERT SOURCES IN ACADEMIA for a reality check.
3) I never suggested “dumbing it down,” simplifying or using meaningless generalizations. Quite the opposite: I encourage talking to practitioners about the subject before going on about nanobots and Raquel Welch.
4) As for pons asinorum, I may indeed be a fool or even an ass, but I and my “public-relations for nanotechnology” colleagues have done a lot to keep the issue, the science, and the people in front of an audience that is increasingly skeptical of advanced science. Nano-savvy Journalism is simply another part of the endeavor to encourage continued support, attention, and funding of nanotechnologies.
As a former academic myself, I certainly understand the tendency to ridicule and brush off the encroachment of non-specialists upon one’s area of expertise. Others do it for more mundane reasons: TNTLog ridicules others in order to drive business their way. But, let’s face it, the scientific community has been pretty inept at getting media past nanobots and grey goo. We all have a vested interest in media understanding and communicating the full range of nanotech’s promise and even peril. If Nano-savvy Journalism is imprecise in its description, please consider putting out a release that addresses the issue more precisely.
Nathan, as I said in my comment to Howard it’s perhaps unfortunate that you were the sole victim of my opprobrium here, as what you said is no different to what has been said many times before (and indeed there’s very much that’s good about the rest of your article). And, as I say in my post, it is us scientists who are actually responsible for this misconception having gone unchecked for so long in the first place, so you should consider this to be primarily an attempt by one scientist to start to repair the damage caused by his colleagues.
I do insist, though, that the point at issue is a really central one, which goes to the heart of what nanotechnology actually is. I don’t think that what’s at issue is being imprecise about quantum effects; the fact is that defining nanotechnology as being that realm in which quantum size effects are important is just wrong, if that definition is to include many, or even most of the applications that currently are held to depend on nanotechnology. As I say in my post, this misconception has been positively unhelpful towards efforts to promote understanding of nanotechnology, and I think it’s time to start to try and correct it.
I am an academic, but believe me, I don’t at all underestimate either the importance or the difficulty of promoting public understanding of nanotechnology. I’ve tried to play my part in this, more than most of my colleagues, I think, via my book, journalism and lectures, and indeed this blog, and I certainly don’t want to ridicule or brush off non-specialists. At some point one needs to draw a line between the inevitable imprecision that follows when one simplifies very difficult concepts, and statements that are plain wrong. This is even more important when these statements are constantly repeated and are left uncorrected by those (scientists) who ought to know better.
The public doesn’t need to understand semiconductors in order to enjoy using their computer, their iPod, or any other consumer electronics.
Why the emphasis on selling nanotech to the public? Sell it to the scientists first. 😉