Tuesday, June 12, 2007

The role played by surface structure in breaking molecules

By Hirokazu Ueta
Many industrial chemicals are made by breaking other molecules apart and sticking the pieces back together in different ways. The problem is that the precursor chemicals (the ones used to make the end product) are quite stable, so a lot of energy is needed to break them up. To overcome this problem, catalysts are used. Often these are metallic surfaces, which lower the energy the desired reaction requires. However, catalysts, and their interactions with the chemicals that react on them, are often poorly understood. Thus, much theoretical and experimental research effort is devoted to understanding catalyst systems.

Some recent work has focused on the interaction between methane and nickel, which is important for steam reforming of natural gas (methane is reacted with water to generate hydrogen and carbon monoxide). Recent experimental work has shown that the efficiency of this reaction depends not only on how high the kinetic energy of methane is but also on how the methane molecule is vibrating as it hits the nickel surface. Unfortunately, this is a difficult problem to model because the quantum mechanical description becomes quite complex once the surface, the molecule, and its vibrations are included.
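For reference, the overall steam reforming reaction described above (methane reacted with water to give carbon monoxide and hydrogen) can be written as the standard textbook equation:

```latex
\mathrm{CH_4 + H_2O \;\rightleftharpoons\; CO + 3\,H_2}
```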

However, in a recent issue of Physical Review Letters, researchers report on the effect of nickel lattice motion and surface reconstruction on methane dissociation. These theoretical calculations used a combination of approaches in which the nickel surface atoms are described exactly by their quantum mechanical state and the methane is approximated as a quasi-diatomic, CH3-H, with the CH3 group treated as a single atomic entity. The simulations show that the atoms in the nickel surface rearrange themselves during methane bond breaking, and that this reconstruction changes the local environment of the methane and lowers the barrier, allowing the reaction to proceed. The results indicate that not only the excitation state of the methane but also the configuration of the surface atoms plays a role in methane dissociation.
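The bond-breaking step the simulations focus on is the dissociative chemisorption of methane at the surface, which can be written schematically (using the conventional * for a surface adsorption site; this notation is an illustration, not taken from the paper itself):

```latex
\mathrm{CH_4\,(g) \;+\; 2\,{*} \;\longrightarrow\; CH_3{*} \;+\; H{*}}
```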

These results need to be followed up with experiments utilizing molecules with a well-defined vibrational state. Such experiments can be used to observe the behavior of the surface during the interaction with the molecule.

Friday, June 8, 2007

An experimental test of non-local realism

By Hein Teunissen
Quantum mechanics (QM) seriously challenges classical, intuitive ideas of how nature works. Einstein’s objections to QM were based on his belief that any physical theory must obey the concept of ‘local realism’. Local realism assumes that the results of measurements on a system localized in space-time are fully determined by pre-existing properties carried along by that system (its physical reality) and cannot be instantaneously influenced by a distant event (locality). But a famous thought experiment on entangled particles by Einstein, Podolsky and Rosen showed that QM does not fulfill this condition (the EPR paradox).

In quantum mechanics, entangled particles are groups of particles that cannot be described independently even though they may be physically isolated from each other. Entangled particles form a state described by a single quantum mechanical wavefunction, which means that they have perfectly correlated quantum numbers, and this remains so when the particles are separated (in contrast to the apparent randomness of microscopic phenomena).
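As a concrete, textbook-style illustration (not taken from the paper discussed below), two photons can share the polarization-entangled state

```latex
|\Phi^{+}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|H\rangle_A|H\rangle_B + |V\rangle_A|V\rangle_B\bigr)
```

Measuring photon A in the horizontal/vertical basis gives H or V with equal probability, but whichever result is obtained, a measurement on photon B in the same basis is guaranteed to give the same outcome, no matter how far apart the photons are.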

In the ‘Copenhagen interpretation’ of QM, the act of measurement on a quantum system causes an instantaneous collapse of the wavefunction of the system, so measurement on one of the entangled particles must instantaneously alter the state of the other particle. According to Einstein, this was ‘spooky action at a distance’, which defies locality. In an attempt to preserve locality, it was proposed that the description of reality given by the quantum mechanical wavefunction was incomplete. It was thought that a more complete theory could be formulated based on local realism, in which the physical reality of a system could be fully described by a set of ‘local hidden-variables’ (a limitation of QM being that not all variables are known).

Many experiments have been performed on entangled particles, for which quantum mechanics provides very accurate predictions. In 1964, John Bell showed that the excellent predictions of QM could not be reproduced by any alternative theory based on local realism (Bell’s theorem). The observed accuracy of QM in so-called Bell test experiments has led to the acceptance that local realism is violated.

Because the argument put forth by Einstein and his colleagues is very powerful, scientists have made numerous attempts to find out more about the ‘correct’ interpretation of QM. In the paper ‘An experimental test of non-local realism’ (Nature, April 2007), Gröblacher et al. try to determine which of the separate concepts behind local realism may be the problem: ‘locality’ or ‘realism’. In the same vein, Anthony Leggett had previously proposed a broad class of theories that give up the concept of locality. These theories both explain all Bell-type experiments and model the perfect correlations of entangled states, while still being based on a plausible type of realism.

In the recent Nature article, Gröblacher and his co-workers show, both theoretically and experimentally, that the class of non-local realistic theories proposed by Leggett is incompatible with experimentally observable quantum correlations between entangled photons. The authors start from an inequality derived by Leggett for his proposed class of non-local realistic theories and extend it so that it applies to real experimental situations and also allows a simultaneous test of all local hidden-variable models. The experimental results thus allow one both to exclude, once more, all local hidden-variable models (via the CHSH inequality) and to test the mentioned class of non-local hidden-variable theories (via the Leggett inequality).
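For context, Bell’s inequality in its CHSH form (the one referred to above) bounds a particular combination of correlations E between measurement outcomes obtained at analyzer settings a, a' on one side and b, b' on the other:

```latex
S \;=\; E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2 \ \text{for any local hidden-variable theory}
```

Quantum mechanics predicts values of |S| up to 2√2 for suitably chosen settings. Leggett’s inequality plays the analogous role for his class of non-local realistic theories, but its form is more involved and is not reproduced here.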

To experimentally test the obtained inequalities, Gröblacher and his colleagues generate pairs of photons whose polarization states are entangled. They modified an experiment that was previously used to test Bell’s inequality, in which entangled photons are generated and then separated. In the experiment to test Leggett’s inequality, the polarization of one of the photons is converted to an elliptical state. Each photon is then passed through an analyzer that transmits or absorbs it depending on its polarization, and the inequalities are tested by counting the coincidences in which both entangled photons are transmitted. The experimental results are accurately predicted by quantum mechanics, and both the CHSH inequality and Leggett’s inequality are significantly violated.
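To make the idea of counting coincidences concrete, here is a minimal sketch (not the authors’ analysis; the entangled state, the analyzer model and the settings are illustrative choices) that evaluates the quantum mechanical prediction for the polarization correlations of an entangled photon pair and compares the resulting CHSH value with the local-realist bound:

```python
import numpy as np

# Polarization basis: |H> = [1, 0], |V> = [0, 1].
H = np.array([1.0, 0.0])
V = np.array([0.0, 1.0])

# Illustrative maximally entangled two-photon state |Phi+> = (|HH> + |VV>)/sqrt(2).
phi_plus = (np.kron(H, H) + np.kron(V, V)) / np.sqrt(2)

def analyzer(theta):
    """Projector onto linear polarization at angle theta (the transmitted port)."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

def correlation(a, b, state=phi_plus):
    """QM prediction for E(a, b) = P(++) + P(--) - P(+-) - P(-+),
    where '+' means the photon is transmitted and '-' that it is absorbed."""
    I = np.eye(2)
    Pa, Pb = analyzer(a), analyzer(b)
    E = 0.0
    for sa, Qa in [(+1, Pa), (-1, I - Pa)]:
        for sb, Qb in [(+1, Pb), (-1, I - Pb)]:
            prob = state @ np.kron(Qa, Qb) @ state  # coincidence probability
            E += sa * sb * prob
    return E

# Standard CHSH settings (radians): a = 0, a' = 45 deg, b = 22.5 deg, b' = 67.5 deg.
a, a2, b, b2 = 0.0, np.pi / 4, np.pi / 8, 3 * np.pi / 8
S = correlation(a, b) - correlation(a, b2) + correlation(a2, b) + correlation(a2, b2)

print(f"S = {S:.3f}  (local realism requires |S| <= 2; QM allows up to 2*sqrt(2) ~ 2.828)")
```

Running the sketch gives S ≈ 2.83, comfortably above the bound of 2 that any local hidden-variable theory must respect.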

The authors state:

Our result suggests that giving up the concept of locality is not sufficient to be consistent with quantum experiments, unless certain intuitive features of realism are abandoned…We believe that the experimental exclusion of this particular class [introduced by Leggett] indicates that any non-local extension of quantum theory has to be highly counterintuitive…We believe that our results lend strong support to the view that any future extension of quantum theory that is in agreement with experiments must abandon certain features of realistic descriptions.


The history of quantum mechanics is fascinating. Many notable physicists have worked to resolve the complications surrounding the interpretation of QM, and that history is characterized by a mixture of physical test cases and philosophical arguments. The claim following from Bell’s theorem is very powerful, namely that no physical theory based on both locality and realism can ever parallel the accuracy of quantum mechanical predictions. The claim following from the violation of Leggett’s inequality is more modest: realism combined with a certain type of non-locality is incompatible with QM. Useful in itself, this research also contributes to a better understanding of phenomena like entanglement, which may eventually lead to a new technological revolution, called ‘quantum computation’.