It’s widely believed that, whereas the macroscopic world is governed by the intuitive and predictable rules of classical mechanics, the nanoscale world operates in an anarchy of quantum weirdness. I explained here why this view isn’t right; many changes in material behaviour at small scales have their origin in completely classical physics. But there’s another way of approaching this question, which is to ask what you would have to do to be able to see a nanoscale particle behaving in a quantum mechanical way. In fact, this needn’t be a thought experiment; Anton Zeilinger at the University of Vienna specialises in experiments on the foundations of quantum mechanics, and one of the themes of his research is finding out how large an object he can persuade to behave quantum mechanically. In this context, the products of nanotechnology are large, not small, and among the biggest things he’s looked at are fullerene molecules – buckyballs. The results are described in this paper on the interference of C70 molecules.
What Zeilinger is looking for, as the signature of quantum mechanical behaviour, is interference. Quantum interference is the phenomenon which arises when the final position of a particle depends, not on the path it’s taken, but on all the paths it could have taken. Before the position of the particle is measured, the particle doesn’t exist at a single place and time; instead it exists in a quantum state which expresses all the places at which it could potentially be. But it isn’t just measurement which forces the particle (to anthropomorphise) to make up its mind where it is; if it collides with another particle, or interacts in some other way with its environment, then this leads to the phenomenon known as decoherence, by which the quantum weirdness is lost and the particle behaves like a classical object. To avoid decoherence, and see quantum behaviour, Zeilinger’s group had to use diffuse beams of particles in a high vacuum environment. How good a vacuum do they need? By adding gas back into the vacuum chamber, they can systematically observe the quantum interference effect being washed out by collisions. The pressures at which the quantum effects vanish are around one billionth of atmospheric pressure. Now we can see why nanoscale objects like buckyballs normally behave like classical objects, not quantum mechanical ones: the constant collisions with surrounding molecules completely wash out the quantum effects.
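To get a feel for the numbers, here’s a rough kinetic-theory sketch of how often a buckyball in the beam would hit a residual gas molecule at that pressure. The cross-section and the choice of nitrogen as the residual gas are illustrative assumptions, not figures from the paper:

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K
T = 300.0                # room temperature, K
P = 1e-9 * 101325.0      # ~one billionth of atmospheric pressure, Pa
sigma = 1e-18            # assumed collision cross-section, ~1 nm^2 (C70 is ~1 nm across)
m_gas = 4.8e-26          # mass of an N2 molecule, kg (assumed residual gas)

n = P / (k_B * T)                                    # number density of residual gas
v_mean = math.sqrt(8 * k_B * T / (math.pi * m_gas))  # mean speed of gas molecules
rate = n * sigma * v_mean                            # collisions per second

print(f"gas number density: {n:.2e} m^-3")
print(f"collision rate: {rate:.2e} s^-1")
```

Under these assumptions the answer comes out at roughly ten collisions per second, so a molecule that crosses the apparatus in milliseconds usually arrives unhit; scale the same estimate up to atmospheric pressure and you get billions of collisions per second, which is why the interference is hopeless outside high vacuum.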
What, then, of nanoscale objects like quantum dots, whose special properties do result from quantum size effects? What’s quantum mechanical about a quantum dot isn’t the dot itself, it’s the electrons inside it. Actually, electrons always behave in a quantum mechanical way (explaining why this is so is a major part of solid state physics), but the size of the quantum dot affects the quantum mechanical states that the electrons can take up. The nanoscale particle that is the quantum dot itself, in spite of its name, remains resolutely classical in its behaviour.
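The textbook way to see how the dot’s size shapes the electrons’ quantum states is the particle-in-a-box model, where the allowed energies go as E_n = n²h²/(8mL²) for a box of size L. The sketch below uses the free electron mass for simplicity; real quantum dots involve smaller effective masses, which makes the effect stronger, so the numbers are illustrative only:

```python
import math

h = 6.62607015e-34     # Planck constant, J*s
m_e = 9.1093837e-31    # free electron mass, kg (real dots: smaller effective mass)
e = 1.602176634e-19    # J per eV

def gap_eV(L):
    """Gap between the two lowest particle-in-a-box levels (n=1 -> n=2), in eV."""
    E1 = h**2 / (8 * m_e * L**2)
    return (4 - 1) * E1 / e   # E_n scales as n^2, so the gap is 3*E1

for L_nm in (2, 5, 10):
    print(f"L = {L_nm:2d} nm: gap = {gap_eV(L_nm * 1e-9):.3f} eV")
```

Shrinking the box from 10 nm to 2 nm widens the gap by a factor of 25, which is the origin of the size-tunable optical properties of quantum dots – but note that it is the electrons’ energy levels that shift, while the dot itself still moves classically.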
I have an historical question to ask. Then some computational chemistry questions.
First, when Eric Drexler came out in the late 1980s and early 1990s with Nanotech, one of the most persistent criticisms was that quantum rules were fundamental at the nanolevel. Why was this false argument ever put up by serious scientists?
Now for the technical question. The main difference between quantum systems and classical systems is their phase spaces. Each quantum particle has an enormous number of states, and to do quantum calculations for molecules one has to take into account the interactions of ALL of these states. For classical systems, you only have to do calculations at the particle level.
However, you have indicated that the properties of molecules are essentially classical unless one is working with isolated electrons!
My 3 questions are:
1. Why are so many chemists doing computational quantum chemistry if molecules are classical?
2. Classical systems are of course mostly nonlinear. However, the biggest reason you give for the failure of Drexlerian Nanotech is that the nanoworld is governed by Brownian motion. But Brownian motion occurs due to forces which are quantum in origin (van der Waals and London forces)! I understand that there is a classical theory of Brownian motion, but this was for the behaviour of pollen in solution, not nanoparticles.
3. Finally, in which domain do the main difficulties of carrying out nanomanipulations lie – the classical or the quantum?
An amateur mathematician
Good questions all.
Firstly, why do quantum chemistry? What sticks atoms together to make molecules are electrons, and as I said, electrons always are quantum mechanical objects. So, to understand the forces that hold molecules and solids together, you do need to use quantum mechanics.
Secondly, what is the origin of Brownian motion? This is classical, whether for pollen grains or any other nanoparticles – the physics was worked out by Einstein and Smoluchowski. There are quantum corrections to the classical theory of Brownian motion which are important at low temperature; how low is low here depends on the characteristic frequency of the vibrations. These corrections may be significant in calculating some of the higher frequency thermally excited vibrations of very stiff materials.
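The classical Einstein–Smoluchowski result can be written as the Stokes–Einstein relation, D = k_BT/(6πηr), and the same formula covers Perrin’s pollen-sized grains and a nanoparticle alike. A quick sketch (the viscosity and radii below are assumed round numbers for water, a pollen grain, and a buckyball):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
eta = 1.0e-3         # assumed viscosity of water, Pa*s

def diffusion_coeff(radius):
    """Stokes-Einstein diffusion coefficient for a sphere of given radius in water."""
    return k_B * T / (6 * math.pi * eta * radius)

for name, r in [("pollen grain", 10e-6), ("buckyball", 0.5e-9)]:
    D = diffusion_coeff(r)
    x_rms = math.sqrt(2 * D * 1.0)   # r.m.s. displacement in 1 s, from <x^2> = 2*D*t
    print(f"{name}: D = {D:.2e} m^2/s, rms displacement in 1 s = {x_rms * 1e6:.2f} um")
```

The same classical formula predicts that the buckyball diffuses tens of micrometres in a second while the pollen grain barely moves – the physics is identical, only the radius changes.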
However, you are right to say that the origins of van der Waals forces are quantum mechanical in character.
Now to phase space – actually, statistical mechanics can be done either on the basis of quantum mechanics or classical mechanics. In fact, the quantum version is easier, because you only have to sum over discrete quantum states. In classical statistical mechanics, you have to do integrals over a multidimensional phase space (how many dimensions is this space? For N particles, it’s 6N-dimensional – 3N for all the position coordinates and 3N for all the momenta). So it isn’t actually clear that life is much easier for classical systems, at least when they get to be non-linear.
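To make the comparison concrete, here is a sketch for a single harmonic oscillator (the vibrational frequency is an assumed round number). The quantum partition function is a plain sum over discrete levels, while the classical version is an integral over position and momentum, which for this potential can be done analytically to give Z = kT/ħω. At high temperature the two agree:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
k_B = 1.380649e-23       # Boltzmann constant, J/K
omega = 1.0e12           # assumed vibrational frequency, rad/s

def Z_quantum(T, n_max=10000):
    """Partition function as a direct sum over discrete oscillator levels."""
    beta = 1.0 / (k_B * T)
    return sum(math.exp(-beta * hbar * omega * (n + 0.5)) for n in range(n_max))

def Z_classical(T):
    """Phase-space integral over x and p, done analytically: Z = kT/(hbar*omega)."""
    return k_B * T / (hbar * omega)

for T in (1.0, 10.0, 1000.0):
    print(f"T = {T:7.1f} K: quantum {Z_quantum(T):.4f}, classical {Z_classical(T):.4f}")
```

At 1000 K the sum and the integral agree to better than a part in a thousand; at 1 K, where kT is well below ħω, the classical integral badly overestimates the partition function – the regime where the quantum corrections to thermal behaviour mentioned above matter.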
I can’t answer the historical question – I’m only responsible for my own criticisms of Drexler!
Thank you for your reply.
I have now had a good think about quantum versus classical representations.
I have come to the conclusion that the main problem is the atomic model of chemistry itself!
The reason that Drexlerians are so convinced of their position, is that the visual picture of atoms stuck together with elastic forces is so powerful that it leads to serious oversimplifications.
Now is this a correct description of your position on this?
Molecules are better thought of as many electron systems (as opposed to atomic systems) which can be modeled via classical or quantum methods given the problem at hand.
This means that Drexler’s naive picture of the nanoscale is only useful if one can ignore electron-electron interactions, which Drexler in Nanosystems seems to treat as a perturbation!
My final comments.
If the above is true, it then seems that the teaching of chemistry at the undergraduate level would need to be changed to give a more realistic description of molecular chemistry (the current teaching gives the appearance that ANY molecule can be synthesized).
Also, it appears that for complex molecular systems the atomic model has had its day, as it is too easy to fall into a Drexlerian trap!
An amateur mathematician
The models Drexler uses treat atoms as classical objects, but the interaction between the atoms is described by force fields which are quantum mechanical in origin. The parameters for the force fields need to be fixed by a combination of first principles calculations and fitting to experiment. This is a perfectly correct and above board method of proceeding, as long as one is aware of the limitations. One of the major limitations is that it is very difficult to find force fields that accurately represent what goes on when chemical reactions take place. Another limitation is that it isn’t particularly easy to correctly include the effects of finite temperature; this leads to a tendency to confuse mechanical stability with thermodynamic stability. Both of these problems are particularly pointed in systems with a lot of surface (as all nanoscale objects have).
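As a toy illustration of the kind of force field involved, here is a Lennard-Jones pair potential: a purely classical function standing in for the quantum-mechanical dispersion interaction, with a well depth and range that in practice are fitted to experiment or first-principles calculations. The parameter values below are assumed, argon-like numbers, not ones from Nanosystems:

```python
# Assumed Lennard-Jones parameters, roughly appropriate for a pair of argon atoms
epsilon = 0.0103  # well depth, eV
sigma = 3.40      # separation where the potential crosses zero, angstroms

def lj_potential(r):
    """Classical pair potential standing in for the quantum-origin vdW interaction."""
    sr6 = (sigma / r) ** 6
    return 4 * epsilon * (sr6 * sr6 - sr6)   # repulsive r^-12 term minus attractive r^-6

r_min = 2 ** (1 / 6) * sigma  # separation at the potential minimum
print(f"equilibrium separation: {r_min:.2f} angstroms")
print(f"well depth at minimum:  {lj_potential(r_min):.4f} eV")
```

A function like this captures the equilibrium geometry well, which is exactly why atoms-as-balls-on-springs pictures look so convincing; what it cannot capture is bond making and breaking, or, on its own, the finite-temperature effects discussed above.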