Case Study 1: Aspect's Experiment — Testing Bell's Theorem in the Lab
Overview
In 1982, Alain Aspect and his collaborators at the Institut d'Optique in Orsay, France, performed what is widely regarded as the most important experiment in the foundations of quantum mechanics. Their work did not merely test a prediction — it tested the very fabric of physical reality, asking whether the universe obeys the principle of local realism that Einstein considered non-negotiable.
This case study traces Aspect's experiment from its physical design through data analysis to its profound implications, and then follows the story to the loophole-free tests of 2015 that closed the last escape routes for local hidden variable theories.
Part 1: The Physical Setup
The Source: Atomic Cascade in Calcium
Aspect's source of entangled photons exploited a radiative cascade in calcium atoms. When a calcium atom is excited to a specific energy level (the $4p^2 \, ^1S_0$ state) and decays through an intermediate state back to the ground state, it emits two photons in rapid succession:
$$4p^2 \, ^1S_0 \xrightarrow{\gamma_1 \; (551.3 \text{ nm})} 4s4p \, ^1P_1 \xrightarrow{\gamma_2 \; (422.7 \text{ nm})} 4s^2 \, ^1S_0$$
The first photon ($\gamma_1$, green) and the second photon ($\gamma_2$, violet) are emitted in a cascade. Because the initial and final atomic states both have $J = 0$ and the intermediate state has $J = 1$, angular momentum conservation forces the two photons into a polarization-entangled state:
$$|\Psi\rangle = \frac{1}{\sqrt{2}}\bigl(|H\rangle_1|H\rangle_2 + |V\rangle_1|V\rangle_2\bigr) = |\Phi^+\rangle$$
where $|H\rangle$ and $|V\rangle$ denote horizontal and vertical linear polarization. This is the $|\Phi^+\rangle$ Bell state — maximally entangled.
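As a quick numerical sanity check (a sketch in Python with NumPy, not part of the original apparatus), one can construct $|\Phi^+\rangle$ and confirm it is maximally entangled: tracing out one photon leaves the other in the maximally mixed state $I/2$, whose purity $\mathrm{Tr}(\rho_1^2) = 1/2$ is the minimum possible for a qubit.

```python
import numpy as np

H = np.array([1.0, 0.0])   # horizontal polarization basis vector
V = np.array([0.0, 1.0])   # vertical

phi_plus = (np.kron(H, H) + np.kron(V, V)) / np.sqrt(2)

# Reduced state of photon 1: partial trace of |Phi+><Phi+| over photon 2.
rho = np.outer(phi_plus, phi_plus)
rho1 = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

purity = np.trace(rho1 @ rho1).real
print(rho1)    # I/2: each photon on its own is completely unpolarized
print(purity)  # 0.5, the minimum for a qubit -> maximal entanglement
```

This is exactly why neither photon alone shows any polarization preference, even though the pair is perfectly correlated.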
🧪 Experiment: The calcium atoms were excited by two laser beams (a two-photon excitation scheme), achieving a pair production rate of about $5 \times 10^7$ pairs per second. This was a major improvement over earlier experiments (Freedman and Clauser, 1972) which used single-photon excitation and produced far fewer pairs.
The Detectors: Two-Channel Polarizers
Each photon was directed to a polarizing beam splitter — a device that transmits photons polarized along one axis and reflects photons polarized along the orthogonal axis. This gave a two-channel detection scheme:
- If photon 1 is transmitted (polarization along $\hat{a}$): outcome $A = +1$
- If photon 1 is reflected (polarization orthogonal to $\hat{a}$): outcome $A = -1$
Similarly for photon 2 with polarizer axis $\hat{b}$.
The two-channel scheme was a crucial improvement over single-channel polarizers (which simply block or transmit), because it eliminated the need for auxiliary "fair sampling" assumptions about undetected photons.
The Innovation: Rapid Switching
The key innovation in Aspect's third experiment (1982, with Dalibard and Roger) was the use of acousto-optic switches to change the polarizer orientations during the flight of the photons.
The switches were driven by independent acoustic oscillators at different frequencies, redirecting each photon to one of two polarizers with different orientations. The switching time was approximately 10 nanoseconds. The detectors were separated by 12 meters, corresponding to a light travel time of about 40 nanoseconds.
This meant that the polarizer setting on Alice's side was established after the photon pair was created, and before any signal traveling at the speed of light could reach Bob's detector from Alice's switch (or vice versa). Any correlation between the measurement settings and the hidden variables would require faster-than-light communication — a violation of special relativity.
📊 By the Numbers: Key parameters of Aspect's 1982 experiment:

- Source: calcium atomic cascade, two-photon laser excitation
- Pair rate: $\sim 5 \times 10^7$ pairs/second
- Detector separation: 12 meters (40 ns light travel time)
- Switching time: ~10 ns
- Integration time: several hours per setting
- Total coincidence events: ~16,000 per orientation pair per 100 s run
- Detector quantum efficiency: ~5%
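The timing argument reduces to one line of arithmetic (illustrative Python; the 12 m separation and 10 ns switching time are the figures quoted above):

```python
c = 2.998e8          # speed of light, m/s
separation = 12.0    # detector separation, m
switch_ns = 10.0     # acousto-optic switching time, ns

transit_ns = separation / c * 1e9
print(f"light travel time across the apparatus: {transit_ns:.0f} ns")  # ~40 ns

# The settings must change faster than light can cross the apparatus,
# so that neither side can "know" the other's setting in time:
assert switch_ns < transit_ns
```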
Part 2: The Data
Measuring the Correlation Function
For each pair of polarizer settings $(\hat{a}, \hat{b})$, Aspect recorded four coincidence rates:
- $N_{++}(\hat{a}, \hat{b})$: both photons transmitted
- $N_{+-}(\hat{a}, \hat{b})$: photon 1 transmitted, photon 2 reflected
- $N_{-+}(\hat{a}, \hat{b})$: photon 1 reflected, photon 2 transmitted
- $N_{--}(\hat{a}, \hat{b})$: both photons reflected
The correlation function is computed as:
$$E(\hat{a}, \hat{b}) = \frac{N_{++} + N_{--} - N_{+-} - N_{-+}}{N_{++} + N_{--} + N_{+-} + N_{-+}}$$
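In code, the estimator is a one-liner over the four coincidence counts (a minimal sketch; the counts below are invented for illustration, not Aspect's raw data):

```python
def correlation(n_pp, n_pm, n_mp, n_mm):
    """CHSH correlation E(a, b) estimated from the four coincidence counts."""
    total = n_pp + n_pm + n_mp + n_mm
    return (n_pp + n_mm - n_pm - n_mp) / total

# Hypothetical counts for one setting pair (invented for the example):
E = correlation(n_pp=4150, n_pm=720, n_mp=705, n_mm=4125)
print(round(E, 3))  # 0.706
```

$E = +1$ would mean perfectly correlated outcomes, $E = -1$ perfectly anticorrelated, and $E = 0$ no correlation.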
The CHSH Test
Aspect chose four polarizer settings optimized for maximum CHSH violation:
- Alice's settings: $\hat{a}_1 = 0°$, $\hat{a}_2 = 45°$
- Bob's settings: $\hat{b}_1 = 22.5°$, $\hat{b}_2 = 67.5°$
(These are the standard optimal CHSH angles. For $|\Phi^+\rangle$ the quantum correlation is $E(\hat{a}, \hat{b}) = \cos 2(\theta_a - \theta_b)$, rather than the $-\cos 2(\theta_a - \theta_b)$ of the singlet state $|\Psi^-\rangle$, so the signs of the individual correlations differ, but the maximal violation $|S| = 2\sqrt{2}$ is the same.)
The measured CHSH parameter:
$$S_{\exp} = E(\hat{a}_1, \hat{b}_1) - E(\hat{a}_1, \hat{b}_2) + E(\hat{a}_2, \hat{b}_1) + E(\hat{a}_2, \hat{b}_2)$$
Results
| Setting pair | $E_{\text{exp}}$ | $E_{\text{QM}}$ |
|---|---|---|
| $(\hat{a}_1, \hat{b}_1) = (0°, 22.5°)$ | $+0.685 \pm 0.006$ | $+0.707$ |
| $(\hat{a}_1, \hat{b}_2) = (0°, 67.5°)$ | $-0.680 \pm 0.006$ | $-0.707$ |
| $(\hat{a}_2, \hat{b}_1) = (45°, 22.5°)$ | $+0.677 \pm 0.006$ | $+0.707$ |
| $(\hat{a}_2, \hat{b}_2) = (45°, 67.5°)$ | $+0.655 \pm 0.006$ | $+0.707$ |

Note that no individual correlation conflicts with local realism; each satisfies $|E| \leq 1$, which any LHV model can reproduce. The conflict appears only in the CHSH combination $S$:
$$S_{\exp} = 0.685 - (-0.680) + 0.677 + 0.655 = 2.697 \pm 0.015$$
$$S_{\text{QM}} = 2\sqrt{2} \approx 2.828$$
$$S_{\text{LHV}} \leq 2$$
The measured $S$ exceeds the local hidden variable bound by $(2.697 - 2.000)/0.015 \approx 46$ standard deviations.
The discrepancy between $S_{\exp} = 2.697$ and $S_{\text{QM}} = 2.828$ is fully explained by experimental imperfections: finite angular resolution of polarizers, imperfect entanglement of the source, detector dark counts, and accidental coincidences.
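The quantum prediction $E(\hat{a}, \hat{b}) = \cos 2(\theta_a - \theta_b)$ for $|\Phi^+\rangle$ can be evaluated at Aspect's four angle pairs to recover $S_{\text{QM}} = 2\sqrt{2}$, and the measured correlations combine the same way to give $S_{\exp}$ (a quick check in Python, using the numbers quoted above):

```python
import math

def E_qm(a_deg, b_deg):
    """Quantum correlation for |Phi+> with linear polarizers at angles a, b (degrees)."""
    return math.cos(2 * math.radians(a_deg - b_deg))

a1, a2, b1, b2 = 0.0, 45.0, 22.5, 67.5

def chsh(E):
    """CHSH combination S for a given correlation function E."""
    return E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

S_qm = chsh(E_qm)
print(round(S_qm, 3))   # 2.828, i.e. 2*sqrt(2)

# Measured correlations from the table above:
measured = {(a1, b1): 0.685, (a1, b2): -0.680, (a2, b1): 0.677, (a2, b2): 0.655}
S_exp = chsh(lambda a, b: measured[(a, b)])
print(round(S_exp, 3))  # 2.697, well above the local bound of 2
```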
Part 3: The Loopholes
Locality Loophole
The concern: What if Alice's measurement device somehow communicates (at the speed of light or slower) with Bob's device or with the source, allowing the system to "conspire" to produce correlated results?
How Aspect addressed it: Rapid acoustic switching, with switching events space-like separated from the distant measurement events.
Remaining gap: Aspect's switches were periodic (driven by oscillators at known frequencies), not truly random. In principle, a deterministic hidden variable theory could have predicted the switch pattern.
How 2015 closed it: The Delft experiment used true random number generators (based on photon arrival times at remote locations) to choose settings. The choice events were space-like separated from the measurement events, and the random number generators were causally independent of the source.
Detection Loophole
The concern: Aspect's detectors captured only ~5% of photon pairs. If the hidden variables determined which photons were detected (and the detected sample was biased), an LHV model could produce an apparent violation even though the complete ensemble satisfied $|S| \leq 2$.
Critical efficiency: For the CHSH inequality with maximally entangled states, the detection efficiency must exceed $\eta_{\text{crit}} = 2(\sqrt{2} - 1) \approx 82.8\%$ (Garg and Mermin, 1987); Eberhard (1993) showed that non-maximally entangled states lower the threshold to $2/3$.
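A quick numeric check of these thresholds (illustrative Python; Eberhard's $2/3$ limit for non-maximally entangled states is the reason the 2015 photonic experiments could succeed at ~75% efficiency):

```python
import math

eta_crit_max = 2 * (math.sqrt(2) - 1)  # CHSH with maximally entangled states
eta_crit_eberhard = 2 / 3              # Eberhard limit, non-maximally entangled states

print(f"maximally entangled threshold: {eta_crit_max:.1%}")      # 82.8%
print(f"Eberhard threshold:            {eta_crit_eberhard:.1%}")  # 66.7%

# Comparison with efficiencies quoted in this case study:
assert 0.05 < eta_crit_eberhard   # Aspect 1982 (~5%): loophole wide open
assert 0.75 > eta_crit_eberhard   # Vienna/NIST 2015 (~75%): above the Eberhard bound
```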
How 2015 closed it: The Vienna experiment used superconducting nanowire single-photon detectors (SNSPDs) with system efficiency $\sim 75\%$, combined with an Eberhard inequality that has a lower efficiency threshold. The Delft experiment used matter qubits (nitrogen-vacancy centers) with near-unity detection efficiency (~96%).
Freedom-of-Choice Loophole
The concern: What if the random number generators choosing measurement settings are not truly free — what if they are correlated with the hidden variables through some shared past cause (perhaps all the way back to the Big Bang)?
This is "superdeterminism." It cannot be fully closed by any experiment, because it denies the possibility of free experimental choices. However:
- The 2018 "Cosmic Bell Test" (Rauch et al.) used light from distant quasars (billions of light-years away) as random number generators, pushing the shared past cause back to the very early universe.
- Most physicists reject superdeterminism because it would undermine the entire scientific method: if experimental settings cannot be freely chosen, no experiment tests anything.
Part 4: The 2015 Loophole-Free Tests
The Delft Experiment (Hensen et al., Nature 2015)
Platform: Electron spins in nitrogen-vacancy (NV) centers in diamond, separated by 1.3 km.
Protocol: Entanglement was established by entanglement swapping. Each NV center emitted a photon entangled with its electron spin; the two photons were sent to a central station and subjected to a joint Bell-state measurement. A successful Bell-state measurement projected the distant electron spins into an entangled state — "heralded" entanglement.
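The swapping step can be verified numerically. Below is a toy sketch in Python/NumPy with ideal states and a projective singlet-outcome Bell-state measurement (real NV photons additionally suffer loss and imperfect indistinguishability, which is why the heralding rate was so low):

```python
import numpy as np

zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Each node: electron spin entangled with its emitted photon, (|0,H> + |1,V>)/sqrt(2).
node = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Joint state over (spin1, photon1, spin2, photon2).
psi = np.kron(node, node).reshape(2, 2, 2, 2)

# Bell-state measurement on the photons: project onto the singlet outcome |Psi->.
singlet = ((np.kron(zero, one) - np.kron(one, zero)) / np.sqrt(2)).reshape(2, 2)

# Contract the photon indices (axes 1 and 3) with the singlet.
spins = np.einsum('ijkl,jl->ik', psi, singlet.conj())

prob = float(np.sum(np.abs(spins) ** 2))
spins = spins / np.sqrt(prob)   # heralded (post-measurement) two-spin state

print(round(prob, 6))  # 0.25: this BSM outcome occurs a quarter of the time
print(spins)           # the distant spins are left in the singlet state
```

The two spins never interacted directly; the entanglement is transferred entirely by the joint measurement on the photons.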
Key numbers:

- Heralded entanglement rate: ~1 event per hour (very low!)
- Detection efficiency: ~96% (near-unity electron spin readout)
- Space-like separation: 1.3 km (4.3 $\mu$s light travel time)
- Setting choice: quantum random number generators, space-like separated from measurements
- Total events: 245 heralded entanglement events over 220 hours
- Result: $S = 2.42 \pm 0.20$, $p$-value = 0.039
Significance: This was the first experiment to close all three loopholes simultaneously. The $p$-value of 0.039 is statistically significant (roughly $2\sigma$) but not overwhelming, a consequence of the extremely low event rate.
The Vienna Experiment (Giustina et al., PRL 2015)
Platform: Polarization-entangled photon pairs from SPDC (spontaneous parametric down-conversion) in a periodically poled KTP crystal.
Protocol: High-efficiency photon detection with SNSPDs, using an Eberhard inequality (a modified Bell inequality with lower efficiency requirements).
Key numbers:

- Pair rate: ~$6 \times 10^5$ entangled pairs per second
- System detection efficiency: ~75%
- Detector separation: 58 meters
- Total events: $> 10^9$ detection events
- Result: $p < 3.7 \times 10^{-31}$ (equivalent to $> 11\sigma$)
Significance: Overwhelmingly statistically significant, with all three loopholes closed.
The NIST Boulder Experiment (Shalm et al., PRL 2015)
Platform: Similar photonic setup to Vienna, with SNSPD detectors.
Key numbers:

- Detector separation: 184 meters
- System detection efficiency: ~75%
- Result: $p < 2.3 \times 10^{-7}$ (equivalent to $> 5\sigma$)
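For comparing the three experiments, a $p$-value can be converted to an equivalent one-sided Gaussian significance with the standard library (the $p$-values are the ones quoted above):

```python
from statistics import NormalDist

def p_to_sigma(p):
    """One-sided Gaussian significance equivalent to a p-value."""
    return -NormalDist().inv_cdf(p)

for name, p in [("Delft", 0.039), ("NIST", 2.3e-7), ("Vienna", 3.7e-31)]:
    print(f"{name:6s} p = {p:8.2g}  ->  {p_to_sigma(p):5.2f} sigma (one-sided)")
```

Note that Delft's $p = 0.039$ corresponds to about $1.8\sigma$ one-sided (about $2.1\sigma$ in the two-sided convention), which is why that result, though loophole-free, counts as significant but not overwhelming.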
Part 5: Implications and Legacy
What Has Been Proved
The 2015 experiments, taken together, establish the following:
- Local realism is false. No local hidden variable theory — regardless of its complexity — can reproduce the observed correlations. This is not a theoretical argument; it is an experimental fact.
- Quantum entanglement is real. The correlations between distant particles cannot be explained by any shared classical information. Something genuinely non-classical connects entangled particles.
- The violation is robust. It persists across different physical platforms (photons, electron spins, atomic ions), different Bell inequalities (CHSH, Eberhard, Mermin), and different experimental groups.
What Has NOT Been Proved
- Which interpretation is correct. The experiments are consistent with all standard interpretations (Copenhagen, many-worlds, Bohmian, QBism). They rule out local hidden variable theories but not non-local hidden variable theories (Bohmian mechanics) or interpretations that modify the concept of reality (QBism, many-worlds).
- That signals travel faster than light. Bell inequality violation does not enable FTL communication. The no-signaling theorem remains intact.
- That the measurement problem is solved. Bell tests confirm quantum predictions but do not explain how definite outcomes arise from superpositions.
The 2022 Nobel Prize
The 2022 Nobel Prize in Physics was awarded to Alain Aspect, John Clauser, and Anton Zeilinger "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science."
The Nobel Committee wrote: "Their results have cleared the way for new technology based upon quantum information... Being able to manipulate and manage quantum states and all their layers of properties gives us access to tools with unexpected potential."
Discussion Questions
1. Aspect's switches were periodic, not random. Design a thought experiment showing how a "superdeterministic" hidden variable theory could, in principle, exploit periodic switching to fake a Bell violation. Why is this considered conspiratorial?
2. The Delft experiment had only 245 events and a $p$-value of 0.039. The Vienna experiment had over $10^9$ events and a $p$-value of $10^{-31}$. Both closed all loopholes. Which do you consider more convincing, and why? What are the trade-offs between the two approaches?
3. If you had to explain the significance of Aspect's experiment to a non-physicist in 3 minutes, what would you say? Try writing it out.
4. Bell tests have now been done with photons, electrons in diamond, trapped ions, and superconducting qubits. Why is it important to test Bell inequalities on multiple physical platforms? What would it mean if one platform showed a violation and another did not?
5. The freedom-of-choice loophole can never be fully closed. Does this bother you? Should it bother physicists? What is the relationship between experimental physics and the assumption that measurement settings can be freely chosen?