The Essence of the Concept of Decoherence
H. Dieter Zeh (www.zeh-hd.de)
- June 2010 (last revised June 2011)
Decoherence is now often identified with certain phenomena that have either been experimentally observed or are relevant in practice for some other reasons. This may be in accord with an operational approach to quantum physics, but it tells us nothing about the basis of this concept in the established formalism, from which it had originally been derived.
The second kind of phenomena is best known from attempts to realize quantum computers, where decoherence is regarded as an unwanted "distortion" of the qubits caused by the environment. This picture has led to various unsuccessful attempts to construct "error correction codes" by means of redundant information storage, as in classical computers. They can hardly ever be successful, since (1) quantum states cannot be cloned, while (2) genuine decoherence is an irreversible process. Only "virtual decoherence", defined by means of an unrealistic microscopic environment, could be reversed in practice. A similar reversibility is used in delayed choice measurements or so-called quantum erasers, where a virtual measurement is "undone" (cf. here).
The key experiments that have confirmed the phenomenon of decoherence, on the other hand, demonstrate the disappearance of an otherwise observed interference pattern, usually for mesoscopic objects whose effective environment can be varied so as to exhibit decoherence or not. Although this statistical phenomenon is indeed a consequence of a decoherence process that affects the individual objects, it can be observed only for ensembles of measurement results (such as many spots on a screen). In these experiments, decoherence does in fact affect the investigated mesoscopic objects twice: first while they pass the slits of an interferometer or while they live for a short time as "Schrödinger cats" in an isolated cavity, and a second time during the final measurement that leads to the "spontaneous" appearance of individual spots, bubbles, or clicks. Only the first decoherence process is discussed in these experiments, while for the measurement proper, experimentalists analyzing their results often forget what they have just demonstrated and thus return to a pragmatic statistical interpretation without referring to the second decoherence process. Some of them even regard their own inconsistency as support for a wave-particle dualism or for complementarity.
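The first of these two decoherence processes can be sketched in the simplest two-slit description: if the partial waves \psi_1 and \psi_2 emerging from the two slits become correlated with different states |E_1\rangle and |E_2\rangle of their effective environment, the statistics of spots on the screen are given by

  p(x) \;\propto\; |\psi_1(x)|^2 + |\psi_2(x)|^2 + 2\,\mathrm{Re}\big[\psi_1^*(x)\,\psi_2(x)\,\langle E_1|E_2\rangle\big] ,

so the interference term is suppressed exactly to the degree that the two environmental states become orthogonal.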
The reduced density matrix, derived from a complete description of the state of a global system by tracing out the environment, is a useful tool for describing the decoherence of a system under consideration. It can be used to investigate in detail how certain phase relations disappear from this system, thus transforming a pure state for it into a "mixed state", for example. Given a realistic environment, this tells us (very successfully) which variables must appear classically (lacking any superpositions), or in which situations we have to expect almost sudden "quantum jumps" or other stochastic "events" to occur. So it seems that we need neither fundamental classical variables any more nor any indeterministic dynamics. However, this success, which led to the early popularity of the decoherence concept, is partly based on an ambiguity of the concept of the density matrix of a "mixed state". (That for a pure state is unique, since it is just the dyadic product of the state vector or wave function with itself.)
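In the simplest notation, for a global state |\Psi\rangle of the considered system together with its environment,

  \rho_{\mathrm{sys}} \;=\; \mathrm{Tr}_{\mathrm{env}}\,|\Psi\rangle\langle\Psi| \;=\; \sum_k \langle E_k|\Psi\rangle\langle\Psi|E_k\rangle

for any orthonormal basis |E_k\rangle of environmental states, while an isolated system in a pure state is represented by \rho = |\psi\rangle\langle\psi| alone.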
The reason for this ambiguity is that the density matrix is defined only to describe the correct formal probabilities for all measurements that can be performed on this quantum system according to Born's statistical interpretation. This means, first of all, that it cannot itself be used to derive this statistical interpretation in terms of stochastic quantum jumps. (The concept of "systems" is again an arbitrary tool - similar to the choice of coordinates. An arbitrarily chosen system need not even be defined by spatial boundaries, for example, although such an assumption usually appears natural.) A density matrix may represent a pure state or a mixed state. The latter may then either represent a statistical ensemble of pure states of the system with given probabilities, or form the reduced density matrix with respect to the state of some global system that includes all other systems with which the subsystem is entangled. In the first case one could simply "select" a pure state from the ensemble by an increase of information (just as for a classical probability distribution); in the second case one would first have to apply a stochastic interpretation or process to the global state in order to obtain an ensemble to select from. (A mixed state for the whole universe can always be interpreted as representing lacking information about its pure state.) Although the difference between entanglement and lacking information should by now be well known to all quantum physicists, this confusion is still responsible for many misunderstandings of decoherence. The concept of decoherence did in fact arise from the insight that entanglement describes a generic nonlocality of quantum states rather than statistical correlations based on incomplete information.
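This ambiguity can be stated explicitly: one and the same density matrix

  \rho \;=\; \sum_i p_i\,|\psi_i\rangle\langle\psi_i|

may describe a genuine ensemble of system states |\psi_i\rangle occurring with probabilities p_i, or it may be the reduced density matrix of the entangled pure global state

  |\Psi\rangle \;=\; \sum_i \sqrt{p_i}\,|\psi_i\rangle|E_i\rangle \qquad (\langle E_i|E_j\rangle = \delta_{ij}) ,

which presupposes no ensemble at all. Both cases lead to exactly the same Born probabilities for all measurements restricted to the system.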
The situation can be described in a conceptually correct and complete way by means of the wave function or Hilbert space vector of the required global system. If either a measurement or an uncontrollable interaction with the environment occurs to some system, the latter becomes entangled with whatever it interacted with. A factorizing pure global state would thereby be transformed into a pure but entangled global state. An initial superposition is thus "dislocalized": it is neither in the system nor in the environment thereafter - something that can happen only in a kinematically nonlocal world.
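Schematically, such an uncontrollable interaction transforms an initially factorizing state according to

  (\alpha|1\rangle + \beta|2\rangle)\,|E_0\rangle \;\longrightarrow\; \alpha|1\rangle|E_1\rangle + \beta|2\rangle|E_2\rangle , \qquad \langle E_1|E_2\rangle \approx 0 ,

where |E_0\rangle, |E_1\rangle, |E_2\rangle are states of the environment. The phase relation between the two components has not been destroyed; it has merely ceased to be a property of the system (or of the environment) by itself, and now resides only in the entangled global state.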
It has always been known that the quantum formalism is nonlocal in this way - actually long before John Bell conceived of his arguments, which demonstrated convincingly that this nonlocality cannot be a statistical artifact due to incomplete knowledge about some as yet hidden local variables. (Note that in the literature one finds several popular but questionable "measures of entanglement" which quantify only "controllable" entanglement that can somehow be used, while they neglect precisely all the uncontrollable entanglement that leads to decoherence. One has to keep in mind that entanglement is defined as a property of pure quantum states, and is based on their phase relations. The latter are simply averaged over in a mixed state - whatever its origin - thus apparently reducing the entanglement. However, an entangled state cannot lose this property just because it is considered as a member of an ensemble of other states owing to incomplete information.) The concept of information is therefore better avoided when describing objective physical processes.
As I pointed out above, the reduced density matrix contains complete information about everything that can be observed at a certain system according to the Born rules. So decoherence describes an irreversible transition of the "system" state into an apparent ensemble for all practical purposes. This irreversibility is induced by the time arrow characterizing the environment. If a measurement apparatus could be treated as a (controllable) microscopic system, the measurement would be reversible (it could be "undone"). However, a macroscopic pointer must unavoidably interact with its uncontrollable environment in each individual measurement.
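The two situations can be contrasted in a schematic unitary measurement chain, with system states |n\rangle, pointer states |A_n\rangle of the apparatus, and states |E_n\rangle of its environment:

  \Big(\sum_n c_n|n\rangle\Big)|A_0\rangle|E_0\rangle \;\longrightarrow\; \Big(\sum_n c_n|n\rangle|A_n\rangle\Big)|E_0\rangle \;\longrightarrow\; \sum_n c_n|n\rangle|A_n\rangle|E_n\rangle .

The first step could in principle be reversed for a microscopic "apparatus"; the second one, which singles out the pointer basis and removes all phase relations from every subsystem, cannot be undone for all practical purposes.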
Therefore, it appears quite unmotivated to invent some fundamental irreversible process, such as a collapse of the wave function, or to assume fundamental classical concepts to apply, precisely where and when the observable or irreversible decoherence phenomena occur according to the unitary quantum formalism. In particular, classical concepts (often describing the pointer basis of a measurement device) emerge according to the objective irreversible process of decoherence, while there remain various possibilities to explain why we observe individual measurement outcomes. If no new physics is ever found to apply somewhere between apparatus and observer, we may have to accept the "many worlds" interpretation. Before decoherence was understood as a unitary process that includes the environment, its occurrence was usually interpreted as a breakdown of quantum mechanics that seemed to require the application of independently postulated classical concepts. However, while a collapse of the wave function would have to proceed with superluminal speed, the mere possibility of an interpretation in terms of branching observers based on unitary decoherence demonstrates a fortiori that decoherence cannot be used to send superluminal signals.
Several authors have claimed that the concept of decoherence has failed to explain the measurement process. They are all either (wrongly) assuming that decoherence has to be based on the mentioned confusion between different interpretations of the density matrix, or they are (tacitly) criticizing only that the solution is not of the kind they had expected. One has to realize that the success of decoherence is challenged not only by traditionalists who still believe in complementarity and similar non-concepts, such as "quantum information", but perhaps even more passionately by those "dissidents" who justify their search for a quite novel theory precisely by their dissatisfaction with this pragmatic approach.
The essence of decoherence is thus given by the permanent and uncontrollable increase of entanglement between all systems. It describes the realistic situation of our world, which is very far from equilibrium, and it thus leads to the permanent dislocalization of superpositions. Its time arrow is formally analogous to the creation of "irrelevant" statistical correlations by Boltzmann collisions. Neglecting these classical correlations, for example by using a µ-space distribution, would lead to an increase of ensemble entropy. This consequence remains true in quantum theory as well (in the sense of an "apparent ensemble entropy") if one neglects entanglement by relying on reduced density matrices for subsystems.
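In the quantum case, this apparent ensemble entropy may be measured by the von Neumann entropy S[\rho] = -\mathrm{Tr}(\rho\ln\rho) of the reduced density matrices: for a pure but entangled global state,

  S[\rho_{\mathrm{sys}}] + S[\rho_{\mathrm{env}}] \;>\; S[\rho_{\mathrm{global}}] \;=\; 0 ,

so the sum of the subsystem entropies grows with increasing entanglement, while the entropy of the global state remains zero under its unitary dynamics.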
However, one should keep in mind that entanglement represents individual properties of combined systems (such as total angular momentum) - hence not just incomplete information. Certain entangled states, such as Bell states, are even used as potential individual measurement outcomes in some experiments. In spite of the analogy with statistical correlations, the neglect of entanglement describes a change of the individual physical states. (In Everett's description, however, this can be understood as a change of the "relative state" with respect to the branching observer.) The arrow of time defined by the decoherence process requires a special initial condition for the universal wave function (namely: little or no initial entanglement). Evidently, this must be a physical condition - it cannot just be a condition for initial "human knowledge" or some kind of "information".