Case Study 2: Quantum Hardware — The Race to Build a Useful Quantum Computer
Overview
Building a quantum computer is one of the greatest engineering challenges of the 21st century. The task is deceptively simple to state: construct a system of controllable qubits, apply unitary gates with sufficient precision, and read out the results — all while protecting fragile quantum states from environmental noise. In practice, this requires operating at temperatures colder than outer space, controlling individual atoms with lasers, or routing single photons through optical circuits. This case study surveys the leading hardware platforms, the milestone experiments that have defined the field, and the formidable challenges that remain on the path to fault-tolerant quantum computing.
Part 1: The DiVincenzo Criteria
In 2000, David DiVincenzo articulated five criteria that any physical system must satisfy to serve as a quantum computer:
1. A scalable physical system with well-characterized qubits. You need a two-level quantum system that you can reliably prepare and control, and a clear path to scaling to many such systems.
2. The ability to initialize the state to a simple fiducial state. You must be able to prepare $|00\ldots0\rangle$ at the start of every computation.
3. Long decoherence times, much longer than gate operation times. The ratio $T_2/t_{\text{gate}}$ (coherence time divided by gate time) determines how many operations you can perform before the quantum information is corrupted. This ratio must be much greater than 1.
4. A universal set of quantum gates. You must be able to implement a set of gates (e.g., $\{H, T, \text{CNOT}\}$) that can approximate any unitary.
5. A qubit-specific measurement capability. You must be able to measure individual qubits in the computational basis.
Two additional criteria were added for quantum communication:
6. The ability to interconvert stationary and flying qubits. Transfer quantum information from a processing qubit to a photon for transmission.
7. The ability to transmit flying qubits faithfully. Send photons over long distances without losing the quantum information.
These criteria provide a scorecard for evaluating hardware platforms. No single platform excels at all criteria; each makes different tradeoffs.
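The universal-gate-set criterion is easy to make concrete. A minimal numpy check (illustrative, not tied to any particular hardware) verifies that the standard set $\{H, T, \text{CNOT}\}$ consists of unitaries — the baseline requirement for any quantum gate — and shows that $T$ is an eighth root of the identity:

```python
import numpy as np

# The standard universal gate set {H, T, CNOT} as matrices
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
T = np.diag([1, np.exp(1j * np.pi / 4)])
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def is_unitary(U):
    """U is a valid quantum gate iff U†U = I."""
    return np.allclose(U.conj().T @ U, np.eye(U.shape[0]))

assert all(is_unitary(U) for U in (H, T, CNOT))
# T = diag(1, e^{i*pi/4}), so T^8 = I: eight T's walk once around the phase circle
assert np.allclose(np.linalg.matrix_power(T, 8), np.eye(2))
```

The deeper content of criterion 4 — that finite products of these gates approximate *any* unitary to arbitrary precision — is the Solovay-Kitaev theorem, which this check does not attempt to demonstrate.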
Part 2: Superconducting Qubits — The Microchip Approach
The Physics
A superconducting qubit is a macroscopic quantum object: an electrical circuit made of superconducting metals (aluminum, niobium) that behaves quantum mechanically when cooled to millikelvin temperatures. The key component is the Josephson junction — a thin insulating barrier between two superconductors, through which Cooper pairs tunnel.
The Josephson junction acts as a nonlinear inductor, creating an anharmonic oscillator. While a harmonic oscillator has evenly spaced energy levels (every transition has the same frequency), the Josephson junction creates unequal spacing, allowing the two lowest levels to be addressed individually without exciting higher levels. These two levels are $|0\rangle$ and $|1\rangle$.
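The effect of the anharmonicity can be sketched numerically. The parameter values below are illustrative (typical transmon orders of magnitude, not figures from any specific device):

```python
import numpy as np

# Illustrative transmon parameters (typical orders of magnitude)
f01 = 5.0      # |0> -> |1> transition frequency, GHz
alpha = -0.2   # anharmonicity, GHz (roughly -E_C/h for a transmon)

# In the Duffing approximation, E_n/h = f01*n + (alpha/2)*n*(n-1),
# so each successive transition frequency is shifted down by |alpha|.
levels = np.array([f01 * n + 0.5 * alpha * n * (n - 1) for n in range(4)])
transitions = np.diff(levels)
print(transitions)   # [5.0, 4.8, 4.6] GHz: unequal spacing isolates |0>, |1>

# A harmonic oscillator would give identical transitions, so a drive at
# 5.0 GHz would climb the whole ladder instead of addressing only |0> <-> |1>.
```

This is why the anharmonicity matters: a pulse resonant with the 5.0 GHz transition is 200 MHz detuned from the $|1\rangle \to |2\rangle$ transition and (for sufficiently slow pulses) leaves the higher levels unpopulated.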
Several types of superconducting qubits have been developed:
- Transmon (2007, Koch et al.): The dominant design. A capacitively shunted Josephson junction with reduced charge noise sensitivity. Used by IBM, Google, Rigetti, and most others.
- Fluxonium (2009, Manucharyan et al.): Uses a superinductance for longer coherence times. Emerging as a next-generation alternative.
- Cat qubit (Alice & Bob): Encodes information in superpositions of coherent states, providing built-in protection against bit flips.
Control and Measurement
Superconducting qubits are controlled by microwave pulses (typically 4-8 GHz). Single-qubit gates are implemented by applying shaped microwave pulses resonant with the qubit transition. Two-qubit gates (CNOT, CZ) are implemented by tuning qubits into and out of resonance with each other, or by using a microwave-activated cross-resonance interaction.
Measurement uses the dispersive readout technique: the qubit is coupled to a superconducting resonator, and the qubit state shifts the resonator frequency. By probing the resonator with a microwave tone, the qubit state can be inferred without directly disturbing it (in principle).
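The size of the state-dependent frequency shift can be estimated with the simplest two-level dispersive formula, $\chi = g^2/\Delta$. The numbers below are illustrative assumptions, and the sign convention for which state pulls the resonator up varies:

```python
# Dispersive readout sketch (two-level approximation; all numbers illustrative)
g = 0.1       # qubit-resonator coupling, GHz
delta = 1.0   # qubit-resonator detuning, GHz (dispersive regime requires g << delta)
f_r = 7.0     # bare readout-resonator frequency, GHz

chi = g**2 / delta        # dispersive shift
f_given_0 = f_r + chi     # resonator frequency with qubit in |0> (convention-dependent)
f_given_1 = f_r - chi     # resonator frequency with qubit in |1>
print(f_given_0, f_given_1)   # 7.01 6.99 -- a 20 MHz gap a probe tone can resolve
```

A microwave tone near 7 GHz reflects with a state-dependent phase, and distinguishing the two resonator responses reads out the qubit without driving its transition directly.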
Milestone Experiments
| Year | Milestone | Group | Significance |
|---|---|---|---|
| 1999 | First superconducting qubit (charge qubit) | NEC (Nakamura et al.) | Demonstrated quantum coherence in a circuit |
| 2007 | Transmon qubit | Yale (Koch et al.) | Reduced charge noise, became standard design |
| 2012 | Surface code error detection (1 qubit) | UCSB (Martinis group) | First step toward error correction |
| 2014 | 5-qubit processor | IBM | First cloud-accessible quantum computer |
| 2019 | "Quantum supremacy" | Google (Sycamore, 53 qubits) | First computation claimed beyond classical reach |
| 2023 | 1000+ qubit processor | IBM (Condor, 1121 qubits) | Largest qubit count |
| 2024 | Below-threshold error correction | Google (Willow, 105 qubits) | Error correction improves with more qubits |
The 2019 Quantum Supremacy Experiment
Google's Sycamore processor (53 transmon qubits, 2D grid connectivity) performed random circuit sampling — generating samples from the output distribution of a random quantum circuit of depth 20. Google claimed the task took Sycamore 200 seconds but would take Summit (then the world's most powerful supercomputer) 10,000 years.
The claim was contested by IBM, who argued that with 250 petabytes of disk storage and a different algorithm, Summit could complete the task in 2.5 days. Subsequent classical algorithms have further reduced the classical cost.
Nevertheless, the experiment was a watershed moment. It demonstrated that quantum hardware could operate in a regime where classical simulation is, at minimum, extremely expensive. The term "quantum supremacy" itself became controversial (for non-technical reasons), and many researchers now prefer "quantum advantage" or "quantum utility."
📊 By the Numbers (Sycamore, 2019): 53 qubits, 86 couplers, $T_1 \approx 15\,\mu$s coherence, single-qubit gate error $\sim 0.15\%$, two-qubit gate error $\sim 0.36\%$, readout error $\sim 3.8\%$. Circuit depth: 20 layers. Total number of gates: $\sim 1500$. Cross-entropy benchmarking fidelity: 0.2% — small, but statistically distinguishable from zero, which is what certifies that the output distribution is genuinely that of the quantum circuit.
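These numbers hang together: a crude model that simply multiplies the success probabilities of every operation reproduces the order of magnitude of the measured fidelity. The split of the ~1500 gates into single- and two-qubit counts below is an illustrative assumption, not a figure from the box above:

```python
# Rough multiplicative fidelity model for a Sycamore-scale circuit.
# Error rates from the box above; gate counts are illustrative assumptions.
e1, e2, er = 0.0015, 0.0036, 0.038   # single-qubit, two-qubit, readout error
n1, n2, nr = 1100, 430, 53           # assumed gate counts and one readout per qubit

F = (1 - e1)**n1 * (1 - e2)**n2 * (1 - er)**nr
print(F)   # ~0.005: a fraction of a percent, same order as the measured 0.2%
```

The lesson is general: circuit fidelity decays exponentially in the gate count, which is why per-gate error rates of $10^{-3}$ cap useful NISQ circuits at a few thousand operations.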
Current State (2025)
IBM, Google, Rigetti, IQM, and others are pursuing aggressive roadmaps:
- IBM: Heron processor (133 qubits, improved connectivity), moving toward modular architectures connecting multiple processors. Qiskit software ecosystem is the most mature.
- Google: Willow processor (105 qubits). Demonstrated error correction milestone: as the surface code distance increases, the error rate per logical qubit decreases — the first time this below-threshold behavior has been observed.
- Rigetti: Ankaa-2 (84 qubits) with tunable couplers and a focus on hybrid quantum-classical computing.
Part 3: Trapped Ion Qubits — Nature's Perfect Qubits
The Physics
Individual ions are trapped in electromagnetic potentials (Paul traps) and cooled to near their motional ground state using laser cooling. The qubit is encoded in two internal electronic states of the ion, typically:
- Hyperfine qubit ($^{171}$Yb$^+$): The qubit states are two hyperfine levels of the ground state, separated by 12.6 GHz. Controlled by microwave pulses.
- Optical qubit ($^{40}$Ca$^+$): The qubit states are the ground state and a metastable excited state, separated by an optical frequency. Controlled by laser pulses.
The key advantage of trapped ions: every ion of a given species is identical. There is no fabrication variability. A $^{171}$Yb$^+$ ion in Braunschweig is exactly the same as one in Boulder. Nature provides perfect quality control.
Two-Qubit Gates
Two-qubit gates in trapped ion systems exploit the shared motional modes of the ion crystal. The Cirac-Zoller gate (1995) and the Mølmer-Sørensen gate (1999) couple the internal states of two ions through their collective vibrations (phonons). The process is:
- A laser pulse entangles ion A's internal state with the motional mode.
- The motional mode mediates an interaction with ion B.
- A second laser pulse disentangles the motion, leaving an entangling gate between the two ions.
Because all ions in the trap share the same motional modes, any two ions can interact — giving all-to-all connectivity. This is a significant advantage over superconducting qubits, which typically have only nearest-neighbor connections on a 2D grid.
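Abstracting away the phonon dynamics, the net effect of a Mølmer-Sørensen interaction can be written as the effective two-qubit unitary $\exp(-i\theta\, X\otimes X)$. A short numpy check (a sketch — the ion physics is hidden inside this effective unitary) shows that at $\theta = \pi/4$ it turns a product state into a Bell state:

```python
import numpy as np

# Effective Mølmer-Sørensen unitary: MS(theta) = exp(-i * theta * X⊗X)
X = np.array([[0, 1], [1, 0]], dtype=complex)
XX = np.kron(X, X)
theta = np.pi / 4
# Since (X⊗X)^2 = I, the matrix exponential reduces to cos/sin terms:
MS = np.cos(theta) * np.eye(4) - 1j * np.sin(theta) * XX

ket00 = np.array([1, 0, 0, 0], dtype=complex)
out = MS @ ket00
print(out)   # (|00> - i|11>)/sqrt(2): a maximally entangled Bell state
```

Up to single-qubit rotations, this gate is equivalent to a CNOT, so it completes a universal gate set for the ion trap.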
Milestone Experiments
| Year | Milestone | Group | Significance |
|---|---|---|---|
| 1995 | First two-qubit gate proposal | Cirac and Zoller | Theoretical foundation |
| 2003 | CNOT gate in ion trap | NIST (Leibfried et al.) | First high-fidelity two-qubit gate |
| 2011 | 14-qubit GHZ state | Innsbruck | Largest entangled state at the time |
| 2016 | 5-qubit Shor's algorithm (factored 15) | MIT/Innsbruck | Shor on trapped ion hardware |
| 2020 | Quantum volume 128 | Honeywell (H1, 10 qubits) | Record quantum volume at the time |
| 2023 | 56-qubit processor | Quantinuum (H2) | Largest high-fidelity trapped ion system |
| 2024 | Two-qubit gate fidelity > 99.9% | Oxford/Quantinuum | Error rates approaching fault-tolerance threshold |
Current State (2025)
- Quantinuum (formerly Honeywell Quantum Solutions): H2 processor with 56 qubits, all-to-all connectivity, $>99.8\%$ two-qubit gate fidelity. Uses a QCCD (quantum charge-coupled device) architecture where ions are shuttled between different zones of a 2D trap.
- IonQ: Forte processor (36 algorithmic qubits), using $^{171}$Yb$^+$ ions in a linear Paul trap. Pursuing photonic interconnects for modular scaling.
- Alpine Quantum Technologies (AQT): Compact trapped ion systems with a focus on rack-mountable hardware.
The main challenge is scaling: individual traps become difficult to control beyond ~50-100 ions. The leading approach is modular architecture — multiple traps connected by photonic links or ion shuttling.
Part 4: Photonic Quantum Computing
The Physics
Photonic qubits use properties of individual photons as the qubit encoding:
- Polarization encoding: $|0\rangle = |H\rangle$ (horizontal), $|1\rangle = |V\rangle$ (vertical). Single-qubit gates are implemented with wave plates (half-wave, quarter-wave).
- Dual-rail encoding: $|0\rangle = |1_a, 0_b\rangle$ (photon in mode $a$), $|1\rangle = |0_a, 1_b\rangle$ (photon in mode $b$). Gates use beam splitters and phase shifters.
- Time-bin encoding: $|0\rangle$ = early arrival, $|1\rangle$ = late arrival. Useful for long-distance communication.
Single-qubit gates are straightforward: beam splitters and phase shifters are well-understood optical components. The challenge is two-qubit gates: photons do not naturally interact with each other (this is why light beams pass through each other).
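In the dual-rail encoding, for example, a 50:50 beam splitter acts on the two mode amplitudes as a Hadamard-like single-qubit gate. A minimal sketch, using one common symmetric beam-splitter convention (conventions differ by phases):

```python
import numpy as np

# 50:50 beam splitter on dual-rail mode amplitudes (one common convention;
# other conventions differ by phase factors on the off-diagonal elements)
BS = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # |0> = photon in mode a
out = BS @ ket0
print(out)                    # equal superposition across modes a and b

# Unitarity of the transformation = photon-number conservation (lossless case)
assert np.allclose(BS.T @ BS, np.eye(2))
```

This is exactly why single-qubit photonic gates are "easy": they are passive linear optics. Nothing analogous exists for two photons, which is the problem the next section addresses.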
Two-Photon Gates
Several approaches to the two-photon gate problem:
- KLM scheme (Knill, Laflamme, Milburn, 2001): Uses linear optics, single-photon sources, and photon detectors. The two-qubit gate is nondeterministic (succeeds with probability $<1$), but this can be boosted using teleportation and cluster states.
- Measurement-based quantum computing: Prepare a large entangled state (cluster state) of many photons, then perform the computation by adaptive single-photon measurements. This is the approach pursued by PsiQuantum.
- Fusion-based quantum computing (Xanadu, PsiQuantum): Generate small entangled photon groups, then "fuse" them into a larger cluster state using probabilistic fusion operations with feed-forward.
Advantages and Challenges
Advantages:
- Room-temperature operation (for the photons; detectors may need cooling)
- Natural for quantum networking — photons are the only viable carriers of quantum information over long distances
- Low decoherence — photons barely interact with the environment
- High clock speeds — photonic operations are fast ($\sim$ ns)
Challenges:
- Photon loss is the dominant error mechanism, and it accumulates with circuit depth
- Deterministic two-photon gates are extremely difficult
- Single-photon sources with high efficiency, purity, and indistinguishability are technically demanding
- The KLM/fusion approaches require very large resource overheads
Key Players
- PsiQuantum: Pursuing a million-qubit photonic quantum computer using silicon photonics fabrication. The strategy is to leverage semiconductor manufacturing infrastructure for massive scale.
- Xanadu: Borealis processor demonstrated "quantum advantage" in Gaussian boson sampling (2022). Develops the Strawberry Fields software platform.
- Quandela: French company specializing in high-quality semiconductor quantum dot single-photon sources.
Part 5: Emerging Platforms
Neutral Atoms
Individual neutral atoms (typically rubidium or cesium) are trapped in optical tweezers — tightly focused laser beams. Qubit states are encoded in hyperfine levels.
Key advantages:
- Massive scalability: arrays of $>1000$ atoms have been demonstrated
- Reconfigurable connectivity: atoms can be physically moved by rearranging the tweezer array
- Long coherence times ($\sim$ seconds)
- Two-qubit gates via Rydberg interactions (exciting atoms to high-energy states where they have large dipole moments and interact strongly)
Key players: Pasqal, QuEra, Atom Computing. QuEra demonstrated a 280-qubit array with error correction in 2023.
Topological Qubits
Microsoft's approach: encode qubits in topologically protected states of matter, specifically in non-Abelian anyons (Majorana fermions) at the ends of topological superconductor nanowires. The qubit information is stored in the global topology of the system, making it inherently resistant to local perturbations.
Advantages: Hardware-level error protection — errors require non-local perturbations that are exponentially suppressed.
Status: Microsoft announced in 2025 that it had demonstrated a topological qubit based on Majorana modes (the Majorana 1 chip), though the supporting evidence has drawn scrutiny from the research community. The platform is still in the earliest stages compared to superconducting and trapped ion systems. If topological qubits work as theorized, they could drastically reduce the overhead for error correction.
Silicon Spin Qubits
Individual electron or nuclear spins in silicon quantum dots. The qubit is controlled by microwave pulses and electric fields.
Advantages: Compatibility with existing semiconductor fabrication (CMOS technology), extremely small qubit footprint, long nuclear spin coherence times.
Challenges: Precision fabrication requirements (single-atom placement), charge noise, and inter-qubit coupling engineering.
Key players: Intel, Silicon Quantum Computing (UNSW), Diraq.
Part 6: The Road to Fault Tolerance
Error Rates and the Threshold Theorem
The threshold theorem (Aharonov, Ben-Or, 1997; Knill, Laflamme, Zurek, 1998) states: if the physical error rate per gate is below a threshold $p_{\text{th}}$ (typically $\sim 10^{-3}$ to $10^{-2}$, depending on the code and architecture), then arbitrarily long quantum computations can be performed reliably using quantum error correction.
The overhead is significant: each logical qubit requires many physical qubits, and each logical gate requires many physical gates. The dominant candidate for fault-tolerant quantum computing is the surface code, a topological error-correcting code that:
- Requires only nearest-neighbor connectivity (compatible with 2D grid architectures)
- Has a relatively high threshold ($\sim 1\%$)
- Has well-understood decoder algorithms
- Requires $O(d^2)$ physical qubits per logical qubit, where $d$ is the code distance
Resource Estimates
Realistic estimates for useful quantum computations:
| Application | Logical qubits | Circuit depth | Physical qubits (estimated) | Timeline |
|---|---|---|---|---|
| Quantum chemistry (small molecule) | 50-100 | $10^4$-$10^6$ | $10^4$-$10^6$ | 2028-2035 |
| Drug discovery (protein folding) | 200-500 | $10^6$-$10^8$ | $10^6$-$10^8$ | 2030-2040 |
| Shor's algorithm (RSA-2048) | 4,000-20,000 | $10^8$-$10^{10}$ | $10^6$-$10^{10}$ | 2035-2050+ |
| Materials design | 100-1,000 | $10^5$-$10^8$ | $10^5$-$10^8$ | 2030-2040 |
These estimates are approximate and depend heavily on algorithmic improvements, error rates, and code efficiency. The trend is toward lower resource requirements as algorithms and error correction improve.
The NISQ Era and Beyond
John Preskill coined the term NISQ (Noisy Intermediate-Scale Quantum) in 2018 to describe the current generation of quantum computers: 50-1000+ qubits, with error rates too high for full error correction. The central question of the NISQ era is: can useful computation be done without fault tolerance?
Proposed NISQ applications include:
- Variational quantum eigensolver (VQE): Hybrid quantum-classical algorithm for finding ground state energies of molecules.
- Quantum approximate optimization algorithm (QAOA): Hybrid algorithm for combinatorial optimization.
- Quantum machine learning: Using quantum circuits as feature maps or variational models.
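The VQE loop can be sketched in a few lines of classical simulation. The Hamiltonian and one-parameter ansatz below are toy choices for illustration; in real VQE the energy evaluation (not the parameter search) runs on the quantum processor:

```python
import numpy as np

# Toy VQE: find the ground-state energy of H = Z + 0.5*X
# with the one-parameter ansatz |psi(theta)> = Ry(theta)|0>.
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def energy(theta):
    """<psi(theta)| H |psi(theta)> -- the part hardware would estimate."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])   # Ry(theta)|0>
    return psi @ H @ psi

# Crude classical optimizer: a brute-force scan over the single parameter
thetas = np.linspace(0, 2 * np.pi, 2001)
best = min(energy(t) for t in thetas)
exact = np.linalg.eigvalsh(H)[0]
print(best, exact)   # both ~ -sqrt(1.25) ~ -1.118
```

The appeal of VQE for NISQ hardware is visible even in this toy: the quantum circuit is shallow (one rotation), and the heavy lifting of optimization stays classical. The difficulty, equally visible, is that noise in `energy` directly biases the minimum the optimizer finds.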
Results so far have been mixed. While these algorithms work in principle, noise limits the circuit depth and the problems that can be addressed. No NISQ algorithm has demonstrated a clear practical advantage over classical methods for a problem of practical interest.
The field is increasingly focused on the transition from NISQ to fault-tolerant quantum computing, with error correction as the key enabling technology.
Part 7: The Bigger Picture
The Quantum Computing Ecosystem
Quantum computing is not just hardware. A mature ecosystem includes:
- Software frameworks: Qiskit (IBM), Cirq (Google), PennyLane (Xanadu), Amazon Braket (AWS), Azure Quantum (Microsoft)
- Cloud access: All major quantum computers are accessible via the cloud, democratizing access
- Compilers: Translating high-level algorithms into hardware-specific gate sequences
- Error mitigation: Classical post-processing techniques to reduce the effect of noise (distinct from error correction)
- Benchmarking: Quantum volume, circuit layer operations per second (CLOPS), application-specific benchmarks
Investment and Hype
Global investment in quantum computing has exceeded $30 billion (public and private combined) as of 2025. Major technology companies (IBM, Google, Microsoft, Amazon, Intel), startups (IonQ, Quantinuum, PsiQuantum, Rigetti, Xanadu, Pasqal, QuEra), and governments (US, EU, China, UK, Australia, Japan, South Korea) are all investing heavily.
This investment brings both opportunity and risk. The opportunity is that sustained funding will eventually produce fault-tolerant quantum computers capable of solving important problems. The risk is that overpromising leads to a "quantum winter" — a period of disillusionment and defunding if practical applications do not materialize on the timelines advertised.
The Physicist's Role
Understanding quantum hardware requires deep knowledge of quantum mechanics — the physics you have been building throughout this textbook. Superconducting qubits are quantum harmonic oscillators (Ch 4) with Josephson junction anharmonicity. Trapped ions are driven two-level systems (Ch 13, 21) with phonon-mediated couplings. Photonic qubits exploit the quantum optics of single photons (Ch 27). Error correction requires entanglement (Ch 24), density matrices (Ch 23), and the theory of open quantum systems (Ch 33).
Physicists who understand both the fundamental quantum mechanics and the engineering constraints are uniquely positioned to advance the field. The race to build a useful quantum computer is as much a physics problem as an engineering one.
Discussion Questions
1. Platform selection: If you were starting a quantum computing company today, which hardware platform would you choose, and why? Consider not just current performance but also the scalability path.
2. The supremacy debate: Google's 2019 experiment was contested by IBM and subsequently by improved classical algorithms. What does "quantum supremacy" (or "quantum advantage") mean in practice? Is it a useful milestone, or is it misleading?
3. Error correction timeline: Current quantum computers have $\sim 1000$ physical qubits. Fault-tolerant quantum computing may require millions. Is this a steady engineering challenge (like Moore's law scaling in classical computing) or a potential showstopper?
4. Quantum winter risk: The dot-com bubble (1999-2001) saw massive investment in internet companies, followed by a crash. Could quantum computing experience a similar bubble? What would trigger a "quantum winter," and what could prevent it?
5. Societal impact: If a large-scale quantum computer is built, the first practical application may be breaking current cryptographic systems. Is this a net positive or negative for society? How should the transition to post-quantum cryptography be managed?
6. Convergence of platforms: Some researchers argue that the winning platform has not been invented yet. Others argue that superconducting qubits have an insurmountable lead due to fabrication maturity. What evidence would you need to see to decide between these positions?