Quantum
entanglement, a term coined by Erwin Schrödinger in 1935,
describes a condition of the separated parts of the same quantum system in which
each part can only be described by referencing the state of the other part.
This is one of the most counterintuitive aspects of quantum mechanics, because
classically one would expect system parts out of speed-of-light contact to be
completely independent. Thus, entanglement represents a kind
of quantum “connectedness” in which measurements on one isolated part of an
entangled quantum system have non-classical consequences for the outcome of
measurements performed on the other (possibly very distant) part of the same
system. This quantum connectedness that enforces the
measurement correlation and state-matching in entangled quantum systems has come
to be called quantum nonlocality.

Nonlocality was first highlighted by
Albert Einstein and his coworkers Boris Podolsky and Nathan Rosen in their
famous 1935 EPR paper. They argued that the nonlocal connectedness of quantum
systems requires a faster-than-light connection that appears to be in conflict
with special relativity. This criticism of quantum mechanics
was ignored by most of the physics community until 1964, when John S. Bell, a
theoretical physicist working at the CERN laboratory in Geneva, used the
formalism of quantum mechanics to show that certain experimental tests could
distinguish the predictions of quantum mechanics from those of alternative
theories that were “local”, in the sense that nonlocality was eliminated.
Bell
based his calculations not on measurements of position and momentum, the focus
of Einstein's arguments, but on measurements of the states of
polarization of photons of light.

In a propagating light wave, if the
electric field oscillates in the vertical or horizontal direction, the light is
said to be linearly polarized vertically or horizontally. If
the electric vector corkscrews through space in a counterclockwise or clockwise
direction, as viewed from the front, the light is said to be right or left
circularly polarized. A mixture of linear and circular
polarization is called elliptical polarization. As an
example, my sunglasses pass light that has vertical linear polarization and
block light that has horizontal linear polarization, because the latter is
produced by reflection and glare.

Bell showed, essentially, that when
pairs of polarization-entangled photons are measured for linear polarization in
particular directions, the coincidence rates
between detections of the entangled pair vs. the angle between the polarization
measurements can be used to construct a quantity that has a value of about 2.8
according to quantum mechanics, while "realistic local" theories predict that
the same quantity must be less than 2.0. This difference
occurs because the coincidence rate predicted by quantum mechanics (and the
classical Malus's Law of polarization) falls off as the square of the cosine
of the angle between the measured linear-polarization directions, while all local
theories predict a falloff that is linear in the angle.
The mathematical expression of this dichotomy is called Bell's
Inequalities or Bell's Theorem.
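The arithmetic behind those two numbers can be sketched in a few lines. This is only an illustrative sketch, assuming the standard CHSH form of Bell's quantity and the quantum correlation E(a,b) = cos 2(a-b) for polarization measurements; the particular analyzer angles below are the conventional choice that maximizes the quantum value:

```python
import math

def E(a, b):
    """Quantum correlation between the two +/-1-valued polarization
    measurements. The Malus's-Law cos^2 coincidence falloff translates
    into E(a, b) = cos(2 * (a - b)) for analyzer angles a and b."""
    return math.cos(2 * (a - b))

# Conventional CHSH analyzer settings (radians): 0 and 45 degrees on one
# side, 22.5 and 67.5 degrees on the other.
a1, a2 = 0.0, math.pi / 4
b1, b2 = math.pi / 8, 3 * math.pi / 8

# Bell's CHSH quantity S = E(a1,b1) - E(a1,b2) + E(a2,b1) + E(a2,b2).
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(S)  # quantum mechanics: 2*sqrt(2), about 2.83; local theories: |S| <= 2
```

Running this gives S = 2√2 ≈ 2.83, the "2.8" quoted above, while any local realistic assignment of outcomes keeps S at or below 2.0.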

Since the 1970s, experimentalists
have performed many "EPR experiments" based on
Bell's Theorem. (See my AV Column "Einstein's
Spooks and Bell's Theorem", Analog, January 1990.)
These
experiments have consistently found, with high statistical precision, that
Bell's quantity has a value of about 2.8, demonstrating the
validity of the nonlocal predictions of quantum mechanics and falsifying
alternative theories that are local and "realistic" (see below).

But do such EPR experiments actually
demonstrate the existence of quantum nonlocality? As it
turns out, there is more than one way of interpreting the implications of the
EPR experimental results, and there is dispute as to whether it is locality or
"realism" (the objective observer-independent reality of external events) that
has been refuted by the EPR measurements. To put it another
way, to accommodate the results of the EPR experiments, either one has to accept
that there is some mysterious "nonlocal influence" that acts across space-time
to force the results of separated measurements to be consistent with each other
and with conservation laws, or one has to relinquish the concept of objective
reality, the idea that the universe exists with well-defined properties,
independent of what we choose to observe and measure. Local
realistic theories have been falsified by the EPR experiments based on
Bell's Theorem, but is it locality
or realism (or both) that has been eliminated?

The reaction
of the general physics community to these Bell’s Inequality test results has
been either (a) to ignore them altogether (as the majority of working physicists
seem consistently to do) or (b) to assume objective reality is OK and to admit
grudgingly that nonlocality is perhaps an inseparable aspect of quantum
mechanics.

However,
Nobel Laureate Tony Leggett of the University of Illinois has recently
pushed this issue somewhat further. He has demonstrated that
by focusing on the falloff of correlations with elliptical
polarization rather than the linear polarization of the Bell Inequality EPR
experiments, one can compare the predictions of quantum mechanics with a class
of nonlocal realistic theories. The resulting Leggett
Inequalities can be used in the same way as the Bell Inequalities, but to test
nonlocal realism instead of local realism.

A group of experimentalists at the
Institute for Quantum Optics and Quantum Information (IQOQI) in
Vienna has now performed an EPR experiment that is a
definitive test of the Leggett Inequalities, and their results have recently
been published in the British science journal Nature. They show that in EPR
measurements with elliptically polarized entangled
photons, the Leggett Inequalities in two observables are violated by 3.6 and by
9 standard deviations. This is interpreted as a
statistically significant falsification of the whole class of nonlocal realistic
theories studied by Leggett.

The group summarizes the implications
of their results with the statement, "We believe that our results lend strong
support to the view that any future extension of quantum theory that is in
agreement with experiments must abandon certain features of
realistic descriptions." In other words, quantum
mechanics and reality appear to be incompatible.

Is the case against objective reality
truly so strong? To answer this question, we must examine in
more detail the nonlocal realistic theories that Leggett studied.
This class of theories assumes that when entangled photons emerge
from their emission source, they are in a definite state of polarization.
It is well known that when that assumption (and no others) is made, one
does not observe the quantum mechanical prediction of Malus's Law for the
correlations of the photon pair.
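A toy simulation can show why. This is my own hypothetical illustration, not Leggett's actual model: assume each pair leaves the source with a shared, definite polarization angle, and each detector answers deterministically according to whether that angle lies within 45 degrees of its analyzer axis. The resulting correlation then falls off linearly with the angle between the analyzers, rather than following the Malus's-Law cosine-squared behavior that quantum mechanics predicts:

```python
import math
import random

def local_model_E(a, b, trials=200_000):
    """Correlation for a toy 'definite polarization' model: each photon
    pair carries a shared polarization angle lam, uniform over [0, pi);
    each detector outputs +1 if lam is within 45 degrees of its analyzer
    axis, else -1. Known result: E = 1 - 4*|a-b|/pi, a linear falloff."""
    total = 0
    for _ in range(trials):
        lam = random.uniform(0.0, math.pi)
        out_a = 1 if math.cos(2 * (a - lam)) >= 0 else -1
        out_b = 1 if math.cos(2 * (b - lam)) >= 0 else -1
        total += out_a * out_b
    return total / trials

random.seed(2007)               # reproducible Monte Carlo run
theta = math.pi / 8             # 22.5 degrees between the two analyzers
print(local_model_E(0.0, theta))  # about 0.5 (linear: 1 - 4*theta/pi)
print(math.cos(2 * theta))        # about 0.707 (quantum cos^2 prediction)
```

At 22.5 degrees the definite-polarization model gives a correlation of 0.5 while quantum mechanics (and experiment) gives about 0.707, which is the discrepancy that Leggett's nonlocal mechanism, discussed next, is invoked to repair.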

However,
Leggett cures that problem by assuming an unspecified nonlocal connection
mechanism between the detection systems that fixes the discrepancy.
In effect, the two measurements talk to each other nonlocally in such a
way that the detected linearly polarized photons obey Malus's Law and produce the
same linear polarization correlations predicted by quantum mechanics
calculations. Leggett then shows that this nonlocal "fix"
cannot be extended into the realm of elliptical polarization, and that quantum
mechanics and this type of nonlocal realistic theory give differing
predictions for the elliptical polarization correlations. In
other words, the "reality" that is being tested is whether the photon source is
initially emitting the entangled photons in a definite state of polarization.
It is this version of reality that has been falsified by the
IQOQI measurements.

We can clarify what is going on in
these experimental tests by applying the Transactional Interpretation (TI) of
quantum mechanics (see my column "The Quantum Handshake"
in Analog, November 1986) to these Leggett Inequality tests. As some
of you readers may know, I originated the TI in 1986, and it is considered to be
one of the leading alternatives to the orthodox Copenhagen Interpretation of
quantum mechanics.

From the point of view of the TI,
Leggett's assumption that the entangled photons are emitted in definite states
of polarization is wrong. The "offer wave" for each photon
that emerges from the source includes all possible polarization states.
These offer waves travel to downstream detectors, and time-reversed
"confirmation waves" travel back up the time-stream to the source, arriving at
the instant of emission. A three-way transaction then forms
between the source and the two detections that matches the confirmation waves to
a mutually consistent overall state satisfying the appropriate conservation laws
(in this case, conservation of angular momentum). The final
result is a completed transaction with the two photons in definite states, but
this definite state was not present in the initial emission of the offer waves,
and that is the part of the process described in detail by the wave-mechanics
formalism of quantum mechanics. We note that the TI does not
in itself make any predictions about the linear or elliptical polarization
correlations of the entangled photon pair. It only describes
the quantum formalism that makes the predictions the IQOQI group
has observed to be consistent with
their experiment, but it clarifies what is going on in those predictions.

Does this mean that the TI (and the
quantum formalism it describes) are not "realistic", i.e., inconsistent with an
objective reality that is independent of the observer's choice of measurements?
I don't think so. The transactions that form in
quantum processes arise from a "handshake" between the past and future across
space-time, but they are not specifically the result of measurements or observer
choices. The latter are only a small subset of the
transactions that form as the universe evolves in space-time.
The message of the Leggett Inequality tests, from the point of view of
the TI, is that the assumption of emission in a definite polarization state is
too restrictive. I would argue that initial emission without
a definite polarization state is not inconsistent with objective reality and is
consistent with the quantum formalism.

The TI description of the quantum
formalism is realistic and nonlocal, in at least some definitions of those
terms, and it is completely consistent with the IQOQI results.
To put it another way, Leggett has set up a straw man that has been
demolished by the IQOQI tests, but that is only an indication that his version
of "realism" is too naïve. And this theory and experiment
can be viewed as another demonstration of the value and power of the TI in
understanding the peculiar predictions and intrinsic weirdness of quantum
mechanics.

Since this is an SF magazine, we
should as usual consider the science fictional implications of this work.
Ignoring my remarks above about the TI, these experimental results could
be viewed as reinforcement for the "observer-created quantum reality" (i.e.,
non-realism) that is an important theme in contemporary SF.
This theme goes back at least as far as Greg Bear's pivotal novella Blood
Music, a story that concludes with the formation of a
planet-wide group mind, an entity that acts as the proverbial 1,000-pound
gorilla and can collapse wave functions any way it damn well pleases.
Perhaps the "ansible" used by Le Guin, Card, and others might be
considered an SF use of nonlocality. With that exception, there has not been much
SF written that uses the weirdness of quantum nonlocality in a central way
(although I may write some of that soon). The IQOQI tests
could be viewed as calling quantum nonlocality into question, but I would argue
against that view using the transactional analysis of the IQOQI experiment
described above. But, as usual, SF authors are free to work
all sides of the street when it comes to quantum phenomena, and we readers reap
the benefits of that diversity.

References:

Reality Test:

"An experimental test of non-local realism", S. Gröblacher, T. Paterek, R.
Kaltenbaek, C. Brukner, M. Zukowski, M. Aspelmeyer, and A. Zeilinger,
Nature 446, 871-875 (2007); available online at
http://www.arxiv.org/pdf/0704.2529.

John Cramer's new book: a non-fiction
work describing his Transactional Interpretation of quantum mechanics, The
Quantum Handshake - Entanglement, Nonlocality, and Transactions
(Springer, January 2016), is available for purchase online as a printed book or
eBook at: http://www.springer.com/gp/book/9783319246406.

SF Novels by John Cramer: my two hard SF novels, Twistor and
Einstein's Bridge, are newly released as eBooks by Book View
Cafe and are available at: http://bookviewcafe.com/bookstore/?s=Cramer.

AV Columns Online: Electronic
reprints of about 177 "The Alternate View" columns by John G.
Cramer, previously published in Analog, are available online at:
http://www.npl.washington.edu/av.