May 2005 - last revised October 2009
How decoherence can solve the measurement problem
H. D. Zeh
Decoherence may be defined as the uncontrollable dislocalization of
quantum mechanical
superpositions. It is an unavoidable
consequence of the interaction of all local systems with their
environments according to the
Schrödinger equation. Since the dislocalization propagates in
general without
bounds, this concept of decoherence does not depend on any precise
definition of (sub)systems. All
systems should thus
be
entangled with their growing environments, and generically cannot
possess pure quantum
states of their own. They may then formally be described by a reduced
density matrix \rho
that represents a "mixed state", with a von Neumann entropy -trace(\rho
ln\rho) that grows in time (unless there were "advanced entanglement").
This reduced density matrix is
operationally
indistinguishable (by means of local
operations) from that describing an ensemble
of states – as though some really existing pure state were just
incompletely
known. One of the states diagonalizing this density matrix could then
be selected by a mere increase of knowledge. For this reason, the mixed
state arising from entanglement is often erroneously
identified with such an
ensemble.
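For concreteness, a minimal two-state illustration (not contained in the text itself, and using only standard notation): let a system S become entangled with its environment E such that
|\psi> = c_1 |1>|E_1> + c_2 |2>|E_2>, with <E_1|E_2> = 0 after decoherence.
The reduced density matrix is then \rho = trace_E |\psi><\psi| = |c_1|^2 |1><1| + |c_2|^2 |2><2|, with von Neumann entropy -|c_1|^2 ln|c_1|^2 - |c_2|^2 ln|c_2|^2. This is exactly the matrix one would also write down for an ensemble in which S is in state |1> or |2> with probabilities |c_1|^2 and |c_2|^2, even though no such ensemble exists here; there is only the one global entangled superposition.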
Since the dynamical situation of increasing entanglement
applies in particular to systems representing macroscopic outcomes of
quantum measurements ("pointer positions"), decoherence has
occasionally been
claimed
to
explain
the probabilistic nature of quantum mechanics ("quantum
indeterminism").
However,
such a conclusion would evidently contradict the determinism of the
assumed global unitary dynamics. (Note that this claimed solution –
even if correct – would require decoherence to be
irreversible, as the
measurement could otherwise be
undone or
"erased" – see Quantum teleportation
and other quantum
misnomers). Although the claim would then be
operationally
unassailable, it is wrong. The very
concept of a
density
matrix is already based on local operations (measurements), which
presume the probability
interpretation to replace global unitarity at some point.
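The circularity may be recalled by means of the textbook (von Neumann) description of an ideal measurement, sketched here only as a reminder: if the interaction carries |n>|A_0> into |n>|A_n> for each eigenstate |n> of the measured variable, the linearity of the Schrödinger equation carries a superposition sum_n c_n |n>|A_0> into the entangled state sum_n c_n |n>|A_n>, and never into just one of its components. The weights |c_n|^2 acquire the meaning of probabilities only when the corresponding reduced density matrix is read as describing potential local outcomes, that is, only after the Born rule has already been presumed.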
Because of the popularity of this "naive" misinterpretation of
decoherence, I
have often emphasized that the latter does "not by itself solve
the
measurement problem". This remark has in turn been quoted to argue
that
decoherence is irrelevant for the understanding of quantum
measurements. This argument has mainly been used by
physicists who insist on a traditional solution: by means of a
stochastic interpretation that has to complement unitary dynamics.
Their hope indeed cannot be fulfilled by decoherence.
In particular, "epistemic" interpretations of the wave
function (as merely representing incomplete knowledge) usually
remain silent about what this missing knowledge is about, in
order
to avoid inconsistencies. A stochastic collapse of the
wave function,
on the other hand, would require
a fundamental
non-linear modification of the Schrödinger
equation. Since, in Tegmark's words,
decoherence "looks and
smells
like a
collapse", it is instructive first to ask in what
sense such
collapse theories would solve the measurement problem if
their prospective non-linear
dynamics were ever confirmed
empirically (for example, by studying systems that are completely
shielded
against decoherence – a very difficult condition to be achieved in
practice).
Some physicists prefer the questionable alternative that
the Schrödinger equation is exact but applicable only between the
"preparation" and "measurement" of a quantum state. However, it
appears absurd to assume that the wave function exists only for
the purpose of allowing experimental physicists to make predictions for their
experiments. It would then
remain completely open how macroscopic objects, including
preparation and
measurement devices themselves, could ever be consistently described as
physical
systems consisting of atoms.
It is well known that superpositions of two or more possible states
may represent (new) individual
physical properties as long as the system remains isolated, while they
seem to turn
into
statistical ensembles when measured and hence subjected
to
decoherence. (To my knowledge, no "real", that is, irreversible,
measurement
has ever been performed in the absence of decoherence.)
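A familiar example, added here merely as a reminder: the superposition (|up_z> + |down_z>)/sqrt(2) of two spin states is not an ensemble of "up" and "down"; it is itself a pure state, namely the eigenstate |up_x> of the spin component in the x-direction, and thus represents a new individual property as long as the spin remains isolated from decoherence.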
So what would it mean if
appropriate non-linear collapse terms in the dynamics were confirmed to
exist?
These theories require that any superposition of different positions of
a macroscopic pointer
(or any other macroscopic variables) indeterministically evolves or
jumps into
one of many possible
narrow wave
packets that may represent pointer states with an uncertainty
that is negligible but quite natural for wave packets. These wave
packets resemble
Schrödinger's coherent states, which he once used to
describe
quasi-classical
oscillators, and which he hoped to be representative for all
quasi-classical objects (apparent particles, in particular). His hope
failed for microscopic particles because of the dynamical
dispersion
of the wave packet under the
Schrödinger equation, while coherent states do successfully
describe
time-dependent quasi-classical states of electromagnetic field modes,
which interact very weakly with their environment. The ensemble of all
possible outcomes of the postulated collapse into such wave
packets, weighted by the empirical
Born probabilities, can be described by a density
matrix that is essentially the same as the reduced density matrix
arising
from
decoherence. The collapse assumption
would
mean, though, that no fundamental
classical concepts are needed
any more for an interpretation of
quantum mechanics.
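The comparison made above can be stated explicitly (in notation introduced here only for illustration): a collapse into narrow pointer packets |\chi_n> with Born weights would produce the ensemble density matrix \rho = sum_n |c_n|^2 |\chi_n><\chi_n|, while unitary decoherence produces the reduced density matrix \rho = trace_env |\Psi><\Psi|, which assumes essentially the same form sum_n |c_n|^2 |\chi_n><\chi_n| as soon as the corresponding environmental states have become orthogonal. No local operation can distinguish the two, although only the former would describe a genuine ensemble.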
Since macroscopic pointer states are assumed to collapse into
narrow wave
packets in their position
representation, there is no eigenvalue-eigenfunction link problem of the kind claimed to arise in epistemic interpretations. Observables are no longer fundamental concepts, as they can be derived from the specific measurement interaction Hamiltonian.
As an example, consider the particle track arising
in a Wilson or bubble chamber, described by a succession
of collapse events. All the
little droplets (or bubbles in a bubble chamber) can be interpreted as
macroscopic
"pointers" (or documents). They can themselves be observed without
being disturbed
by means of "ideal measurements". According to Mott's unitary
description, the state of
the
apparently observed
"particle" (its wave function) becomes entangled with all these pointer
states in a way
that describes a superposition of many
different tracks, each one consisting of a number of droplets at
correlated positions. This superposition
would disappear according to the collapse, which is assumed to
remove all but one of the tracks. Individual tracks are
globally described by wave packets that approximately factorize into
localized
final states of the
particle, droplet positions, and their environment. So one assumes
that
the kinematical concept of a
wave function is complete, which means that there are no particles. In
contrast,
many interpretations of quantum theory, such as the Copenhagen
interpretation or those based on Feynman
paths or Bohm
trajectories, are all
entertaining
the prejudice that
classical concepts are fundamental
at some
level.
As mentioned above, decoherence leads to the same reduced
density
matrix (for the combined system of droplets and "particle"), which
therefore seems to represent
an
ensemble of tracks. This was all known to Mott in the
early days of quantum mechanics, but he did not yet take into
account the subsequent and unavoidable process of decoherence of the
droplet positions by their
environment. Mott did not see the need to solve any measurement
problem, as he accepted the probability interpretation in terms of
classical variables. In a global unitary quantum
description, however,
there
is still just one
global
superposition of all "potential" tracks consisting of droplets,
entangled with the particle wave function and the environment: a
universal Schrödinger cat. Since one does not
obtain a genuine ensemble of pointer states,
one cannot select one of
its members by a mere increase of information. Since such a selection seems to occur in a measurement, it
is this apparent increase of
information that requires further analysis.
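Schematically, and with labels chosen here only for illustration, the Mott situation including decoherence reads
|\Psi> = sum_k c_k |particle deflected along track k> |droplets at the positions of track k> |E_k>,
where the environmental states |E_k> belonging to different tracks rapidly become mutually orthogonal. Tracing out the environment yields the apparent ensemble of tracks mentioned above, while the global state remains this one superposition, so the apparent increase of information in an observation still has to be explained.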
For this purpose, one has to include an observer of the pointer or the
Wilson tracks into the description.
According to the
Schrödinger equation, he, too, would necessarily become part
of the entanglement with the "particle", the device, and the
environment. Clearly, the phase relations originating from the initial
superposition have now been irreversibly dislocalized (become an
uncontrollable
property of
the state of the whole universe). They can never be experienced any
more by an observer
who is assumed to be local for dynamical reasons. This dynamical
locality also means that decohered
components
of the universal wave function are dynamically autonomous
(see Quantum
nonlocality vs. Einstein locality). The branches of the global wave function arising in this way form entirely independent "worlds", which may contain
different states of all
observers who are involved in the process.
If
we intend to associate
unique contents of consciousness with physical states of local
observers, we can do
this only separately with
their
thus
dynamically defined component states. The observed quantum
indeterminism must then
be
attributed
to the indeterministic history of these quasi-classical branch wave
functions
with their internal observers. No
indeterminism is required for the global
quantum state. This identification of observers with states existing
only in certain branching components of the global wave function
is the
only
novel element that
has to be added to the quantum formalism for a solution of the
measurement problem.
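In the same schematic notation (again introduced only for illustration), the global state after inclusion of the observer reads
|\Psi> = sum_n c_n |n> |A_n> |O_n> |E_n>,
where |O_n> is the observer state that has registered outcome n. The orthogonal environmental states |E_n> render each component dynamically autonomous; consciousness is associated separately with each |O_n>, so every branch contains an observer who remembers one definite outcome, while the global superposition evolves deterministically.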
Different observers of
the same measurement result living in the same branch world are
consistently
correlated with one
another in a way similar to the
positions of different droplets forming an individual track in the
Wilson chamber. Redefining the very concept of reality
operationally (that is, applying it only to "our" branch)
would eliminate from reality most of what we just concluded to exist
according to the unitary dynamics! The
picture of branching "worlds" perfectly
describes quantum measurements – although in an unconventional manner.
Decoherence may be
regarded as a "collapse without a collapse".
(Note, however, that decoherence occurring in quantum processes in the
brain must be expected to lead
to further indeterministic branching even after the information about a
measurement result has arrived at the sensorial system already in a
quasi-classical form.) Why should we reject the
consequence of the
Schrödinger equation that there must
be
myriads of (by us) unobserved quasi-classical worlds, or why should we
insist on the existence of fundamental classical objects that we seem
to observe, but that we don't need at all for a consistent physical description of our observations?
Collapse theories (when formulated by means of fundamental stochastic
quantum Langevin
equations)
would not only have to postulate
the indeterministic transition of
quantum states into definite component states, but
also their relative
probabilities according to the Born rule.
While, even without a
collapse, the relevant components (or robust "branches" of the wave
function) can be
dynamically justified by the
dislocalization of superpositions (decoherence) as described above,
the
probabilities themselves cannot. Since all outcomes
are assumed to exist in this picture, all attempts to
derive the empirical probabilities are doomed to remain
circular.
According to Graham, one may derive the observed
relative frequencies of measurement outcomes (their
statistical distribution) by merely
assuming that
our final (presently experienced)
branch of the universal wave function (in which "we" happen to live)
does
not
have an
extremely small norm in comparison to the others. Although the choice
of the norm is here
completely equivalent
to assuming the Born probabilities for all individual branchings, it is
a
natural choice for such a postulate, since the norm is
conserved under the Schrödinger equation (just as phase space is
conserved
in
classical theories, where it likewise serves as an appropriate
probability measure). Nonetheless, most physicists
seem to insist on a metaphysical (pre-Humean) concept of dynamical
probabilities, which would explain
the observed frequencies of measurement results in a "causal" manner.
However, this metaphysics seems to represent a prejudice
resulting from our causal experience of the classical world.
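Graham's argument can be paraphrased as follows (a standard paraphrase rather than a quotation): for N equivalent measurements on identically prepared systems, the global state branches into components labelled by the sequences of outcomes, and the combined norm of all components whose relative frequencies deviate by more than any fixed amount from the Born values goes to zero as N grows, in analogy to the law of large numbers. The Born frequencies observed in "our" branch then follow from the sole assumption that this branch is not one of extremely small norm.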
There is now a wealth of observed mesoscopic realizations of
"Schrödinger
cats", produced according to a general Schrödinger equation. They
include superpositions of different states of electromagnetic fields,
interference between partial waves describing
biomolecules passing through different slits of an appropriate device,
or
superpositions of currents consisting of millions of electrons moving
collectively in opposite directions. They can all be used to
demonstrate their gradual decoherence by interaction with the
environment (in contrast to previously assumed spontaneous quantum
jumps), while there is
so far no
indication whatsoever of a
genuine collapse. However, complex
biological systems (living beings)
can hardly ever be sufficiently isolated, since they have to permanently
get rid of entropy. Such systems depend essentially on the arrow of
time that is manifest in the growing
correlations
(most
importantly in the form of quantum entanglement, and hence
decoherence).
Only in a Gedankenexperiment
may we conceive of an isolated observer, who for some interval of
time
interacts with an also isolated measurement device, or even directly
with a microscopic system (by absorbing a single photon, for example).
One may similarly imagine an observer who is himself passing through an
interference device while being aware of the slit he passes through.
What
would that mean according to a universal Schrödinger
equation? Since the observer's internal state of knowledge must become
entangled with the variables that he has observed, or with his path
through the slits, he would subjectively believe himself to pass through one slit only.
Could we confirm such a prediction in principle? If we observed the
otherwise isolated
observer from
outside, he should behave just as any microscopic system – thus
allowing for interference when "erasing" his memory. So he would have
to
lose all his memory about what he experienced in order to restore the
complete superposition locally. Can we then not ask him
before this recoherence occurs? This would require him to
emit
information in some physical form, thereby preventing recoherence and
interference. An observer in a state that allows interference
could
never tell us which passage he was aware of! This demonstrates that the
Everett
branching is ultimately subjective,
although
we may always assume it to
happen objectively as soon as decoherence has become irreversible for
all
practical purposes. As this usually occurs in the apparatus of
measurement, this
description justifies the pragmatic Copenhagen rules – albeit in a
conceptually consistent manner and without
presuming classical terms.
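The argument of the preceding paragraphs may be summarized in formulas (with labels introduced here only for convenience): behind the slits, the observer is described by c_L |L>|O_L> + c_R |R>|O_R>, where |O_L> and |O_R> represent awareness of the left or right passage, respectively. Interference of the two components requires a unitary evolution that maps both |O_L> and |O_R> onto one and the same memory-free state; any record of the experienced passage emitted before this erasure would remain entangled with the path and thereby prevent recoherence.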
See also "Quantum discreteness is an
illusion" (in particular Sects. 3 and 4) or "Roots
and Fruits of
Decoherence" (in particular Sects. 3, 5 and 6).