45 min read

Learning Objectives

  • Derive Bloch's theorem and explain why periodicity constrains wavefunctions to the form $\psi_k(x) = e^{ikx} u_k(x)$
  • Calculate energy bands and band gaps using the Kronig-Penney model and the nearly free electron approximation
  • Classify metals, insulators, and semiconductors from quantum mechanical band structure considerations
  • Apply the tight-binding model to compute band structures for 1D chains and 2D lattices including graphene
  • Explain BCS superconductivity (Cooper pairs, energy gap) and the quantum Hall effect (Landau levels, quantized conductance) conceptually

Chapter 26: QM in Condensed Matter: Bands, Semiconductors, and Superconductivity

"The theory of solids is about electrons obeying quantum mechanics in the field of a lattice of nuclei." — Philip W. Anderson

"It is ironic that after struggling so hard to solve the Schrodinger equation for one atom, we must now confront $10^{23}$ of them." — Adapted from the lore of condensed matter physics

You have spent twenty-five chapters building the quantum mechanical toolkit. You can solve the hydrogen atom exactly. You can handle identical particles, perturbation theory, angular momentum coupling. You understand entanglement, decoherence, quantum information. But here is a question that might keep you up at night: Why does copper conduct electricity while diamond does not?

This is not a minor curiosity. It is a question about the macroscopic world — the world of electrical wires, solar cells, computer chips, MRI magnets, and everything plugged into a wall socket. And the answer is purely quantum mechanical. Classical physics cannot explain why some crystals conduct and others insulate. It cannot explain why silicon becomes a conductor when you add one arsenic atom per million. It certainly cannot explain why some materials lose all electrical resistance below a critical temperature, or why a two-dimensional electron gas in a magnetic field exhibits conductance quantized to parts per billion.

This chapter brings quantum mechanics to solids. We will derive the central theorem of solid-state physics — Bloch's theorem — and show how periodicity alone forces electron energy levels into bands separated by gaps. We will build models of increasing sophistication (free electron, nearly free electron, tight-binding) and use them to classify all solids into metals, insulators, and semiconductors. Finally, we will survey two of the most spectacular quantum phenomena in condensed matter: superconductivity and the quantum Hall effect.

🏃 Fast Track: If you are primarily interested in the conceptual framework, read Sections 26.1-26.5 for the band theory essentials, then skip to Section 26.9 for the technological payoff. Sections 26.7 and 26.8 on superconductivity and the quantum Hall effect are conceptual and can be read independently. The tight-binding derivation (Section 26.6) is essential for anyone continuing to Chapter 36 (topological phases).

🔗 Connection: This chapter draws on Dirac notation and inner products (Chapter 8), identical particles and the Pauli exclusion principle (Chapter 15), and perturbation theory (Chapter 17). The symmetry concepts from Chapter 10 (translation symmetry) appear throughout. Topological aspects of band theory return in Chapter 36.


26.1 From Atoms to Solids: The Many-Electron Problem

The Scale of the Problem

Consider a macroscopic crystal — say, a centimeter cube of copper. It contains roughly $N \sim 10^{23}$ atoms, each contributing one or more valence electrons. The full quantum mechanical description requires solving the Schrodinger equation for $\sim 10^{23}$ interacting electrons in the Coulomb field of $\sim 10^{23}$ nuclei.

The Hamiltonian is:

$$\hat{H} = \sum_{i=1}^{N_e} \left[ \frac{\hat{p}_i^2}{2m_e} + V_{\text{ion}}(\hat{\mathbf{r}}_i) \right] + \frac{1}{2} \sum_{i \neq j} \frac{e^2}{4\pi\epsilon_0 |\hat{\mathbf{r}}_i - \hat{\mathbf{r}}_j|}$$

where $V_{\text{ion}}(\mathbf{r})$ is the potential from all the ionic cores (nuclei plus inner-shell electrons) and the second sum is the electron-electron Coulomb repulsion. The wavefunction $\Psi(\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_{N_e})$ lives in a Hilbert space of absurd dimensionality.

This is, of course, impossible to solve exactly. But the physical properties of solids are remarkably well described by a series of brilliant approximations.

The Born-Oppenheimer Approximation

Nuclei are $\sim 2000$ times heavier than electrons. They move slowly; electrons move fast. In the Born-Oppenheimer approximation, we treat the nuclei as fixed point charges and solve only for the electronic motion in the static nuclear potential. The nuclei define the crystal lattice; the electrons respond to it.

Quantitatively, the ratio of nuclear to electronic kinetic energy scales as $(m_e/M_{\text{nuc}})^{1/2} \sim 10^{-2}$, so the nuclear motion is a tiny perturbation on the electronic timescale. The electrons adiabatically follow the nuclear positions, adjusting their wavefunctions instantaneously (on the electronic timescale) as the nuclei slowly vibrate. This is the same adiabatic approximation you encountered in the context of time-dependent perturbation theory — the system remains in its instantaneous eigenstate when external parameters change slowly.

This reduces our problem to $N_e$ interacting electrons in a fixed periodic potential. The nuclear vibrations (phonons) can be treated separately as small perturbations on the electronic ground state, and they become important primarily as a source of scattering that limits electrical conductivity in metals — and, paradoxically, as the mediator of attractive interactions that produce superconductivity (Section 26.7).

The Independent Electron Approximation

The electron-electron interaction term is still intractable for $10^{23}$ particles. The independent electron approximation replaces the full electron-electron interaction with an effective single-particle potential $V_{\text{eff}}(\mathbf{r})$ that each electron experiences independently. Each electron moves in the combined potential of the ions and the average field of all other electrons.

The many-body Schrodinger equation then separates into $N_e$ independent single-particle equations:

$$\hat{H}_{\text{eff}} |\psi_n\rangle = \left[ \frac{\hat{p}^2}{2m_e} + V_{\text{eff}}(\hat{\mathbf{r}}) \right] |\psi_n\rangle = E_n |\psi_n\rangle$$

The many-body wavefunction is a Slater determinant (Chapter 15) of these single-particle states, ensuring antisymmetry under particle exchange as required by the Pauli exclusion principle.

💡 Key Insight: The independent electron approximation is shockingly effective. It works because the Pauli exclusion principle dramatically suppresses electron-electron scattering — most electrons cannot scatter because all nearby states are already occupied. This "Pauli blocking" makes electrons in a metal behave far more independently than you might expect.

The Periodic Potential

In a crystal, atoms are arranged in a regular, repeating pattern — a lattice. The effective potential inherits this periodicity:

$$V_{\text{eff}}(\mathbf{r} + \mathbf{R}) = V_{\text{eff}}(\mathbf{r})$$

for every lattice translation vector $\mathbf{R} = n_1 \mathbf{a}_1 + n_2 \mathbf{a}_2 + n_3 \mathbf{a}_3$, where $\mathbf{a}_i$ are the primitive lattice vectors and $n_i$ are integers.

This single mathematical fact — the periodicity of the potential — has profound consequences. It is the key that unlocks the entire theory of electronic band structure.

🔵 Historical Note: The program of understanding solids from quantum mechanics began in the late 1920s, almost immediately after quantum mechanics itself was formulated. Arnold Sommerfeld applied Fermi-Dirac statistics to electrons in metals in 1927 — his "free electron" model explained the specific heat paradox (why electrons contribute little to the specific heat of metals, despite being numerous) by recognizing that most electrons in a metal are "frozen" deep inside the Fermi sea and cannot absorb thermal energy because all nearby states are already occupied. Bloch's 1929 theorem then showed how periodicity modifies the free-electron picture, and within a few years the basic framework of band theory was in place. The quantum theory of solids was one of the first great triumphs of applied quantum mechanics.

Checkpoint: Before proceeding, make sure you can answer: (1) Why is the independent electron approximation reasonable? (2) What symmetry does a crystal potential possess? (3) What role does the Pauli exclusion principle play in building up the many-electron ground state from single-particle solutions?


26.2 Bloch's Theorem: Periodicity Constrains Solutions

Statement of the Theorem

Bloch's theorem (Felix Bloch, 1929) is the fundamental theorem of solid-state physics. It states:

The eigenstates of a single-particle Hamiltonian with a periodic potential $V(\mathbf{r} + \mathbf{R}) = V(\mathbf{r})$ can be written in the form:

$$\psi_{n\mathbf{k}}(\mathbf{r}) = e^{i\mathbf{k} \cdot \mathbf{r}} \, u_{n\mathbf{k}}(\mathbf{r})$$

where $u_{n\mathbf{k}}(\mathbf{r})$ has the same periodicity as the lattice: $u_{n\mathbf{k}}(\mathbf{r} + \mathbf{R}) = u_{n\mathbf{k}}(\mathbf{r})$.

The index $n$ labels the band and $\mathbf{k}$ is the crystal momentum (or quasi-momentum). The wavefunction is a plane wave $e^{i\mathbf{k} \cdot \mathbf{r}}$ modulated by a periodic function — it oscillates freely through the crystal but is sculpted by the lattice at every unit cell.

🔵 Historical Note: Felix Bloch derived this result in his doctoral thesis under Werner Heisenberg at Leipzig in 1928. The mathematical content was actually proven earlier by the mathematician Gaston Floquet (1883) for ordinary differential equations with periodic coefficients. In one dimension, the result is sometimes called the Floquet-Bloch theorem.

Proof of Bloch's Theorem

We present two proofs — one using translation operators (elegant, general) and one using Fourier analysis (constructive, computational).

Proof 1: Translation Operator Approach

Define the lattice translation operator $\hat{T}_{\mathbf{R}}$ by its action on any function:

$$\hat{T}_{\mathbf{R}} f(\mathbf{r}) = f(\mathbf{r} + \mathbf{R})$$

Because $V(\mathbf{r} + \mathbf{R}) = V(\mathbf{r})$, the translation operator commutes with the Hamiltonian:

$$[\hat{H}, \hat{T}_{\mathbf{R}}] = 0 \quad \text{for all lattice vectors } \mathbf{R}$$

From Chapter 10, you know that commuting operators share simultaneous eigenstates. Moreover, the translation operators form an abelian group (they all commute with each other: $\hat{T}_{\mathbf{R}} \hat{T}_{\mathbf{R}'} = \hat{T}_{\mathbf{R}+\mathbf{R}'}$), so their simultaneous eigenstates satisfy:

$$\hat{T}_{\mathbf{R}} |\psi\rangle = \lambda(\mathbf{R}) |\psi\rangle$$

The eigenvalue $\lambda(\mathbf{R})$ must satisfy $\lambda(\mathbf{R} + \mathbf{R}') = \lambda(\mathbf{R})\lambda(\mathbf{R}')$ (from the group property). The most general continuous solution of this functional equation is:

$$\lambda(\mathbf{R}) = e^{i\mathbf{k} \cdot \mathbf{R}}$$

for some vector $\mathbf{k}$. (The eigenvalues must have modulus 1 because $\hat{T}_{\mathbf{R}}$ is unitary — translations preserve normalization.)

So any energy eigenstate satisfies:

$$\psi(\mathbf{r} + \mathbf{R}) = e^{i\mathbf{k} \cdot \mathbf{R}} \psi(\mathbf{r})$$

Now define $u_{\mathbf{k}}(\mathbf{r}) \equiv e^{-i\mathbf{k} \cdot \mathbf{r}} \psi(\mathbf{r})$. Check its periodicity:

$$u_{\mathbf{k}}(\mathbf{r} + \mathbf{R}) = e^{-i\mathbf{k} \cdot (\mathbf{r}+\mathbf{R})} \psi(\mathbf{r} + \mathbf{R}) = e^{-i\mathbf{k}\cdot\mathbf{r}} e^{-i\mathbf{k}\cdot\mathbf{R}} e^{i\mathbf{k}\cdot\mathbf{R}} \psi(\mathbf{r}) = e^{-i\mathbf{k}\cdot\mathbf{r}} \psi(\mathbf{r}) = u_{\mathbf{k}}(\mathbf{r})$$

Therefore $\psi(\mathbf{r}) = e^{i\mathbf{k}\cdot\mathbf{r}} u_{\mathbf{k}}(\mathbf{r})$ with $u_{\mathbf{k}}$ periodic. $\blacksquare$

Proof 2: Fourier Analysis Approach (1D)

For a 1D lattice with period $a$, expand the periodic potential in a Fourier series:

$$V(x) = \sum_G V_G \, e^{iGx}$$

where $G = 2\pi n/a$ ($n$ integer) are the reciprocal lattice vectors. Since $V$ is real, $V_{-G} = V_G^*$.

Expand the wavefunction in plane waves:

$$\psi(x) = \sum_q c_q \, e^{iqx}$$

Substituting into the Schrodinger equation $\hat{H}\psi = E\psi$:

$$\frac{\hbar^2 q^2}{2m} c_q + \sum_G V_G \, c_{q-G} = E \, c_q$$

This equation couples $c_q$ only to $c_{q-G}$ — coefficients that differ by a reciprocal lattice vector. So the set of coefficients $\{c_{k+G}\}$ for a given $k$ (with $G$ running over all reciprocal lattice vectors) decouples from the set $\{c_{k'+G}\}$ for $k' \neq k \pmod{2\pi/a}$.

The wavefunction built from the coupled set is:

$$\psi_k(x) = \sum_G c_{k+G} \, e^{i(k+G)x} = e^{ikx} \underbrace{\sum_G c_{k+G} \, e^{iGx}}_{u_k(x)}$$

Since $e^{iGx}$ is periodic with period $a$, the function $u_k(x)$ is periodic. This is Bloch's theorem. $\blacksquare$
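The Fourier-space decoupling above is also a practical recipe: truncate the coupled set $\{c_{k+G}\}$ at a finite number of reciprocal lattice vectors and diagonalize the resulting matrix (the "central equation"). A minimal numerical sketch, using the illustrative cosine potential $V(x) = 2V_1\cos(2\pi x/a)$ (so $V_{\pm G_1} = V_1$ and all other Fourier components vanish) in units where $\hbar = m = 1$:

```python
import numpy as np

def central_equation_bands(k, V1=0.1, a=1.0, n_G=11):
    """Diagonalize the truncated central equation at crystal momentum k.

    Basis: plane waves e^{i(k+G)x} with G = 2*pi*n/a for |n| <= n_G//2.
    Units: hbar = m = 1, so the free-electron energy is q^2 / 2.
    Potential: V(x) = 2*V1*cos(2*pi*x/a), i.e. V_{+G1} = V_{-G1} = V1.
    """
    ns = np.arange(-(n_G // 2), n_G // 2 + 1)
    G = 2 * np.pi * ns / a
    H = np.diag((k + G) ** 2 / 2.0)        # kinetic energy on the diagonal
    for i in range(n_G - 1):                # V1 couples c_{k+G} to c_{k+G +- G1}
        H[i, i + 1] = H[i + 1, i] = V1
    return np.linalg.eigvalsh(H)            # band energies E_n(k), ascending

# At the zone boundary k = pi/a, the lowest two bands should be split by ~2|V1|
E = central_equation_bands(k=np.pi, V1=0.1)
gap = E[1] - E[0]
print(f"gap at zone boundary: {gap:.4f}  (weak-potential prediction: 2|V1| = 0.2)")
```

Scanning `k` across the Brillouin zone and plotting the returned eigenvalues reproduces the full band structure, with gaps opening at each zone boundary.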

Crystal Momentum and the Brillouin Zone

The quantum number $\mathbf{k}$ is called the crystal momentum (strictly, $\hbar \mathbf{k}$ is the crystal momentum). It is not the true momentum of the electron — the true momentum is not a good quantum number because the periodic potential breaks continuous translation symmetry. Instead, $\mathbf{k}$ is a good quantum number associated with the discrete translation symmetry that remains.

A key consequence of Bloch's theorem is that $\mathbf{k}$ and $\mathbf{k} + \mathbf{G}$ label the same physical state (they give the same wavefunction, just with the Fourier coefficients reshuffled between $e^{i\mathbf{k}\cdot\mathbf{r}}$ and $u_{n\mathbf{k}}$). Therefore, all distinct $\mathbf{k}$ values lie within one Brillouin zone — the Wigner-Seitz cell of the reciprocal lattice.

In one dimension with lattice constant $a$, the first Brillouin zone is:

$$-\frac{\pi}{a} < k \leq \frac{\pi}{a}$$

The energy eigenvalue $E_n(\mathbf{k})$ is a continuous function of $\mathbf{k}$ within each band $n$. The set of all these energy-vs-$\mathbf{k}$ curves is the band structure — the central object of solid-state physics.

In three dimensions, the first Brillouin zone has a specific geometric shape determined by the crystal symmetry — a truncated octahedron for FCC lattices (relevant for silicon, aluminum, copper), a rhombic dodecahedron for BCC lattices (relevant for iron, tungsten), and a hexagonal prism for hexagonal lattices (relevant for graphene, GaN). The band structure is typically plotted along high-symmetry lines connecting special points of the Brillouin zone, labeled with standard notation: $\Gamma$ (center), X, L, K, M, etc. Learning to read these band structure plots — identifying the band gap, the Fermi energy, the effective masses at band extrema — is an essential skill in solid-state physics.

⚠️ Common Misconception: Crystal momentum $\hbar \mathbf{k}$ is not the electron's true momentum. It is the eigenvalue of the discrete translation operator, not the continuous momentum operator $\hat{p}$. In a perfect crystal, $\hbar \mathbf{k}$ is conserved (good quantum number), but $\hat{p}$ is not. Crystal momentum is conserved modulo a reciprocal lattice vector: in scattering processes, $\mathbf{k}_{\text{final}} = \mathbf{k}_{\text{initial}} + \mathbf{G}$ is allowed (these are called Umklapp processes).


26.3 The Nearly Free Electron Model

The simplest model of electrons in a crystal starts with free electrons and treats the periodic potential as a weak perturbation — exactly the machinery of Chapter 17.

Free Electrons First

For $V = 0$, the Schrodinger equation gives plane wave solutions with energies:

$$E^{(0)}_k = \frac{\hbar^2 k^2}{2m}$$

This is a simple parabola. In the extended zone scheme (plotting $E$ vs. $k$ over all $k$), there is a single band — the free electron parabola. This is the Sommerfeld model: treat the valence electrons as completely free particles in a box the size of the crystal, subject only to Fermi-Dirac statistics. Despite its simplicity, this model explains many metallic properties — the linear specific heat at low temperatures, the Wiedemann-Franz law (ratio of thermal to electrical conductivity), and the qualitative behavior of electrical conductivity.

But we can also fold this parabola into the first Brillouin zone using the periodicity of $k$ modulo $2\pi/a$. This is the reduced zone scheme: for each reciprocal lattice vector $G = 2\pi n/a$, the parabola $E = \hbar^2(k + G)^2/2m$ creates a new branch within the first Brillouin zone. The first branch ($G = 0$) runs from $E = 0$ at $k = 0$ up to $E = \hbar^2\pi^2/(2ma^2)$ at $k = \pi/a$. The second branch ($G = -2\pi/a$) starts at $k = -\pi/a$ with the same energy and curves upward. At special $k$-values (the zone boundary $k = \pm \pi/a$), branches from different $G$ values become degenerate — this degeneracy is where the periodic potential will have its most dramatic effect.

Turning On the Periodic Potential

Now turn on a weak periodic potential. Away from degeneracies, non-degenerate perturbation theory (Chapter 17) gives small energy corrections. But at the zone boundary $k = \pi/a$, the free-electron states $|k\rangle$ and $|k - 2\pi/a\rangle$ are degenerate. We must use degenerate perturbation theory.

The $2 \times 2$ secular equation at $k = \pi/a$ is:

$$\begin{pmatrix} E^{(0)}_{k} - E & V_G \\ V_G^* & E^{(0)}_{k-G} - E \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \end{pmatrix} = 0$$

where $G = 2\pi/a$. At the zone boundary, $E^{(0)}_k = E^{(0)}_{k-G}$ (the degeneracy), and the eigenvalues are:

$$E_\pm = E^{(0)}_k \pm |V_G|$$

The degeneracy is lifted. A band gap of magnitude $2|V_G|$ opens at the zone boundary.

The Physical Picture

The gap opening has a beautiful physical interpretation. At $k = \pi/a$, the two degenerate free-electron states (traveling waves $e^{i\pi x/a}$ and $e^{-i\pi x/a}$) mix to form standing waves:

$$\psi_+(x) \propto \cos(\pi x/a), \qquad \psi_-(x) \propto \sin(\pi x/a)$$

The $\cos$ solution ($\psi_+$) has its probability density $|\psi_+|^2$ peaked at the ion cores ($x = 0, a, 2a, \ldots$) where the potential is most negative, so its energy is lowered. The $\sin$ solution ($\psi_-$) has its density peaked between the ion cores, where the potential is less negative, so its energy is raised.

💡 Key Insight: Band gaps arise from Bragg reflection. At $k = \pi/a$, the electron wavelength satisfies the Bragg condition $2a \sin(90°) = \lambda = 2\pi/k = 2a$. The electron is Bragg-reflected by the lattice. Forward- and backward-traveling waves mix into standing waves, and the two standing waves have different energies because they sample different regions of the periodic potential. This energy splitting is the band gap.

Energy Bands: The Result

The nearly free electron model produces the essential qualitative picture:

  1. Allowed energy bands — ranges of energy where solutions exist.
  2. Forbidden gaps — ranges of energy where no Bloch state exists.
  3. Gaps occur at the Brillouin zone boundaries, with magnitudes determined by the Fourier components $V_G$ of the periodic potential.

For a weak periodic potential with only the first Fourier component $V_1 \neq 0$, the band structure near the zone boundary is:

$$E_{\pm}(k) = \frac{E^{(0)}_k + E^{(0)}_{k-G}}{2} \pm \sqrt{\left(\frac{E^{(0)}_k - E^{(0)}_{k-G}}{2}\right)^2 + |V_1|^2}$$

Away from the zone boundary, $|E^{(0)}_k - E^{(0)}_{k-G}| \gg |V_1|$ and the correction is negligible. Near the zone boundary, the square root opens the gap.
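The two-band formula is easy to check numerically: at the zone boundary the square-root term reduces to $|V_1|$ exactly, so the gap is $2|V_1|$. A quick sketch in units where $\hbar = m = a = 1$ (the value of $V_1$ is illustrative):

```python
import numpy as np

hbar = m = a = 1.0                  # natural units for illustration
G = 2 * np.pi / a                   # first reciprocal lattice vector
V1 = 0.3                            # illustrative first Fourier component

def E0(k):
    """Free-electron parabola."""
    return hbar**2 * k**2 / (2 * m)

def E_pm(k):
    """Two-band nearly-free-electron dispersion near the zone boundary."""
    avg = (E0(k) + E0(k - G)) / 2
    half = (E0(k) - E0(k - G)) / 2
    root = np.sqrt(half**2 + abs(V1) ** 2)
    return avg - root, avg + root

# At k = pi/a the free-electron energies are degenerate, so the gap is 2|V1|
Em, Ep = E_pm(np.pi / a)
print(f"gap at k = pi/a: {Ep - Em:.3f}  (2|V1| = 0.6)")
```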

📊 By the Numbers: In a typical semiconductor like silicon, the band gap is $E_g = 1.12$ eV at room temperature. In diamond (an insulator), $E_g = 5.47$ eV. In a metal like copper, there is no gap at the Fermi energy — the bands overlap. These numbers — set by the Fourier components of the crystal potential — determine whether a material conducts, insulates, or lies somewhere in between.


26.4 Energy Bands and Band Gaps

The General Band Structure Picture

Whether we approach from the nearly free electron side (Section 26.3) or the tight-binding side (Section 26.6), the result is the same: the allowed energies for electrons in a periodic potential form bands — continuous ranges of energy $E_n(k)$ parameterized by $k$ within the Brillouin zone and indexed by a band number $n$.

Key features of band structure:

  1. Bands are separated by gaps. There are ranges of energy where no Bloch state exists. These are the forbidden gaps (or band gaps).

  2. Each band accommodates a finite number of states. For a 1D crystal with $N$ unit cells and periodic boundary conditions, there are exactly $N$ allowed $k$-values in the Brillouin zone, spaced by $\Delta k = 2\pi/(Na)$. With spin, each band holds $2N$ electrons.

  3. $E_n(k)$ is periodic in $k$. It has the full symmetry of the reciprocal lattice: $E_n(\mathbf{k} + \mathbf{G}) = E_n(\mathbf{k})$.

  4. $E_n(\mathbf{k})$ is an even function: $E_n(-\mathbf{k}) = E_n(\mathbf{k})$. This is guaranteed by time-reversal symmetry (for spinless electrons, or in the absence of spin-orbit coupling) and, independently, by inversion symmetry when the crystal possesses it.

The Kronig-Penney Model

To compute band structure explicitly, we use the Kronig-Penney model — a 1D periodic array of rectangular potential barriers. It is exactly solvable and captures all the essential physics.

Consider a 1D lattice of period $a = b + w$, where each unit cell consists of a region of zero potential (width $b$) and a rectangular barrier (width $w$, height $V_0$):

$$V(x) = \begin{cases} 0 & 0 < x < b \\ V_0 & b < x < a \end{cases}$$

repeated periodically: $V(x + a) = V(x)$.

In the region $0 < x < b$ (free region), the Schrodinger equation gives:

$$\psi(x) = A e^{i\alpha x} + B e^{-i\alpha x}, \quad \alpha = \sqrt{2mE}/\hbar$$

In the barrier region $b < x < a$ (for $E < V_0$):

$$\psi(x) = C e^{\beta x} + D e^{-\beta x}, \quad \beta = \sqrt{2m(V_0 - E)}/\hbar$$

Applying continuity of $\psi$ and $\psi'$ at $x = b$ and using Bloch's theorem ($\psi(x + a) = e^{ika}\psi(x)$) at $x = 0$ and $x = a$, we arrive at the transcendental equation:

$$\cos(ka) = \cos(\alpha b)\cosh(\beta w) + \frac{\beta^2 - \alpha^2}{2\alpha\beta}\sin(\alpha b)\sinh(\beta w)$$

This is the Kronig-Penney dispersion relation. For a given energy $E$ (which determines $\alpha$ and $\beta$), the right-hand side is a definite number. A Bloch state exists only if that number lies between $-1$ and $+1$ (since $|\cos(ka)| \leq 1$). Energy ranges where the right-hand side exceeds this bound are the band gaps.

🧪 Experiment (Computational): The code file code/example-01-bands.py solves the Kronig-Penney model numerically and plots the band structure. Run it to see how bands and gaps emerge as you increase the barrier height from zero (free electrons, no gaps) to large values (isolated atoms, flat bands).

The Delta-Function Limit

In the limit of very narrow, very tall barriers ($w \to 0$, $V_0 \to \infty$ with $V_0 w \to \text{const} \equiv P\hbar^2/ma$), the Kronig-Penney model simplifies to the Dirac comb:

$$\cos(ka) = \cos(\alpha a) + \frac{P}{\alpha a}\sin(\alpha a)$$

where $P$ is a dimensionless measure of barrier strength. This version is analytically cleaner and is the one most often assigned as a homework problem. The physics is identical: bands where the right-hand side lies in $[-1, 1]$; gaps where it does not.
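The band condition is straightforward to scan numerically: evaluate the right-hand side on a fine energy grid and keep the energies where it lies in $[-1, 1]$. A minimal sketch of the Dirac-comb version, in units where $\hbar = m = 1$ (the barrier strength $P = 3$ is an illustrative choice):

```python
import numpy as np

def dirac_comb_rhs(E, P=3.0, a=1.0):
    """RHS of cos(ka) = cos(alpha*a) + (P/(alpha*a)) sin(alpha*a),
    with hbar = m = 1 so alpha = sqrt(2E)."""
    x = np.sqrt(2 * E) * a
    return np.cos(x) + P * np.sin(x) / x

# A Bloch state exists only where |RHS| <= 1; scan energies to find the bands.
E = np.linspace(1e-6, 60, 20000)
allowed = np.abs(dirac_comb_rhs(E)) <= 1.0

# Band edges sit where "allowed" switches on or off
edges = E[1:][np.diff(allowed.astype(int)) != 0]
print("first few band edges:", np.round(edges[:4], 2))
```

Note that at very low energy the right-hand side approaches $1 + P > 1$, so the spectrum starts with a forbidden region: even the lowest band is pushed up from $E = 0$ by the barriers.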

Effective Mass

Near a band extremum (say at $k = k_0$), the energy can be expanded as:

$$E_n(k) \approx E_n(k_0) + \frac{1}{2}\frac{d^2E_n}{dk^2}\bigg|_{k_0} (k - k_0)^2$$

This looks like a free particle with an effective mass:

$$m^* = \hbar^2 \left(\frac{d^2 E_n}{dk^2}\right)^{-1}$$

The effective mass captures how the lattice modifies the electron's inertial response. Near the bottom of a band, $d^2E/dk^2 > 0$ and $m^* > 0$ — the electron accelerates in the direction of the applied force, as expected. Near the top of a band, $d^2E/dk^2 < 0$ and $m^* < 0$ — it is often more convenient to describe these states as missing electrons (holes) with positive effective mass $m^*_h = -m^*_e > 0$ and positive charge.

💡 Key Insight: The effective mass is why quantum mechanics is essential to understanding solids. In silicon, the electron effective mass is $m^* \approx 0.26\,m_e$ near the conduction band minimum and the hole effective mass is $m^*_h \approx 0.36\,m_e$ near the valence band maximum. These are not small corrections — they fundamentally change the dynamics. The effective mass is a direct consequence of band curvature, which is a direct consequence of the periodic potential, which requires Bloch's theorem.
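The curvature formula can be evaluated for any dispersion by finite differences. A sketch using the illustrative cosine band $E(k) = -2t\cos(ka)$ (a tight-binding-style form, not a specific material), with $\hbar = 1$, so the exact answer at the band bottom is $m^* = \hbar^2/(2ta^2)$:

```python
import numpy as np

def effective_mass(E, k0, dk=1e-4, hbar=1.0):
    """m* = hbar^2 / (d^2 E/dk^2), with the curvature from a central difference."""
    d2E = (E(k0 + dk) - 2 * E(k0) + E(k0 - dk)) / dk**2
    return hbar**2 / d2E

t, a = 1.0, 1.0
E = lambda k: -2 * t * np.cos(k * a)        # illustrative cosine band

m_bottom = effective_mass(E, k0=0.0)         # band bottom: m* = +1/(2 t a^2)
m_top = effective_mass(E, k0=np.pi / a)      # band top: m* negative (hole-like)
print(m_bottom, m_top)
```

The sign flip between the band bottom and the band top is exactly the electron/hole distinction described above.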

Density of States

The density of states $g(E)$ counts how many single-particle states exist per unit energy:

$$g(E) = \sum_n \int_{\text{BZ}} \frac{d\mathbf{k}}{(2\pi)^d} \, \delta(E - E_n(\mathbf{k}))$$

In 1D, $g(E) \propto 1/|dE_n/dk|$, which diverges at band edges where the band is flat ($dE_n/dk = 0$). These Van Hove singularities produce peaks in the density of states that have observable consequences — for example, they enhance optical absorption at certain energies.

In 3D, the density of states near a band edge takes the form:

$$g(E) \propto \sqrt{E - E_{\text{edge}}}$$

for a parabolic band — the same $\sqrt{E}$ dependence as free electrons, but measured from the band edge rather than $E = 0$. In 2D (relevant for graphene and quantum well structures), the density of states is a step function at each band edge — constant within each subband. In 1D, $g(E) \propto 1/\sqrt{E - E_{\text{edge}}}$, which diverges at the band edge. These different dimensionality dependences have profound consequences for the optical and transport properties of nanostructures.

🔗 Connection: The density of states controls nearly every measurable property of a solid. The electronic specific heat, the optical absorption spectrum, the magnetic susceptibility, and the superconducting transition temperature all involve integrals over the density of states weighted by various physical quantities. Measuring these properties is how experimentalists probe the band structure — they are the observational windows through which the quantum mechanical energy levels become visible.


26.5 Metals, Insulators, and Semiconductors from Quantum Mechanics

Band Filling and the Fermi Energy

We now have the band structure — the allowed energy bands as a function of crystal momentum. To determine the electronic properties of a material, we must fill the bands with electrons according to the Pauli exclusion principle.

The rules are simple:

  1. Each state $(n, \mathbf{k}, \sigma)$ (band, crystal momentum, spin) holds at most one electron.
  2. At zero temperature, fill states from the lowest energy up to the Fermi energy $E_F$ — the energy of the highest occupied state.
  3. In a crystal with $N$ unit cells, each band holds $2N$ electrons (factor of 2 for spin).

The position of the Fermi energy relative to the band structure determines everything.

Metals

A metal has the Fermi energy inside a partially filled band. Electrons near $E_F$ can be excited to nearby empty states with arbitrarily small energy cost. An applied electric field shifts the occupied states slightly in $k$-space, producing a net current. Metals conduct.

There are two ways this happens:

  • Odd number of electrons per unit cell: Each band holds $2N$ electrons. With an odd number of valence electrons per cell, the topmost occupied band must be partially filled. Examples: sodium (1 valence electron per cell), copper (effectively 1), aluminum (3).
  • Band overlap: Even with an even number of electrons, if bands overlap in energy (the top of one band is higher than the bottom of the next), both bands are partially filled. Example: the alkaline earth metals (beryllium, magnesium).

Insulators

An insulator has the Fermi energy in a band gap, with all lower bands completely filled and all higher bands completely empty. To conduct, an electron would need to be excited across the gap, which requires energy $E_g$. If $E_g \gg k_B T$ (thermal energy), essentially no electrons are thermally excited and the material does not conduct.

Example: diamond has $E_g = 5.47$ eV. At room temperature, $k_BT \approx 0.026$ eV. The probability of thermal excitation across the gap is proportional to $e^{-E_g/2k_BT} \sim e^{-105} \approx 10^{-46}$. Diamond is an excellent insulator.

Semiconductors

A semiconductor is an insulator with a small band gap — small enough that thermal excitation creates a non-negligible number of electrons in the conduction band and holes in the valence band.

The intrinsic carrier concentration at temperature $T$ is:

$$n_i \propto T^{3/2} \exp\left(-\frac{E_g}{2k_BT}\right)$$

For silicon ($E_g = 1.12$ eV), at room temperature $n_i \approx 1.5 \times 10^{10}\,\text{cm}^{-3}$. This is tiny compared to the $\sim 5 \times 10^{22}$ atoms per cm$^3$, so pure silicon is a poor conductor. But the exponential sensitivity to $E_g/k_BT$ means that even small changes in $E_g$ or $T$ produce enormous changes in conductivity.
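The exponential sensitivity is easy to quantify with the numbers quoted above. A quick sketch comparing the dominant Boltzmann factor in $n_i$ for silicon at two temperatures (the $T^{3/2}$ prefactor contributes only an additional factor of $\sim 1.3$ over this range):

```python
import numpy as np

k_B = 8.617e-5          # Boltzmann constant, eV/K
E_g = 1.12              # silicon band gap, eV

def boltzmann_factor(T):
    """Dominant exponential factor exp(-E_g / 2 k_B T) in the intrinsic
    carrier density n_i (the T^{3/2} prefactor is ignored here)."""
    return np.exp(-E_g / (2 * k_B * T))

# Heating silicon from 300 K to 350 K multiplies this factor by roughly 20x
ratio = boltzmann_factor(350.0) / boltzmann_factor(300.0)
print(f"exp-factor ratio, 350 K vs 300 K: {ratio:.0f}")
```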

⚠️ Common Misconception: There is no sharp boundary between "insulator" and "semiconductor." The distinction is one of degree, not kind. By convention, materials with $E_g \lesssim 3$-4 eV are called semiconductors (silicon, germanium, gallium arsenide) and those with larger gaps are called insulators (diamond, quartz). But the physics is the same — it is all about the Boltzmann factor $e^{-E_g/2k_BT}$.

Doping: Engineering the Fermi Energy

The real power of semiconductors comes from doping — intentionally introducing impurity atoms that either donate extra electrons (n-type) or create holes (p-type).

n-type doping: Replace a silicon atom (4 valence electrons) with a phosphorus atom (5 valence electrons). The extra electron occupies a state just below the conduction band edge — a donor level at energy $E_d \approx E_c - 0.045$ eV. At room temperature, $k_BT \approx 0.026$ eV, so the donor is readily ionized and the extra electrons populate the conduction band.

p-type doping: Replace silicon with boron (3 valence electrons). The missing electron creates an acceptor level just above the valence band, at $E_a \approx E_v + 0.045$ eV. Electrons from the valence band are thermally excited into the acceptor, leaving mobile holes in the valence band. The physics is the same as for donors but with the roles of electrons and holes reversed.

The mathematics of doping is elegantly simple. A phosphorus impurity in silicon is essentially a hydrogen atom scaled by the effective mass and dielectric constant of the crystal: the binding energy is $E_d \approx (m^*/m_e)(1/\epsilon_r^2) \times 13.6\,\text{eV}$, giving $\sim 0.03$-$0.05$ eV for typical semiconductors. The Bohr radius of the donor state is correspondingly large: $a_d \approx (\epsilon_r / (m^*/m_e)) \times a_0 \approx 20$-$30\,\text{\AA}$, spanning many unit cells. This large extent justifies the effective mass approximation — the donor electron sees an averaged crystal, not individual atoms.
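The scaled-hydrogen estimate takes two lines to evaluate. A sketch using commonly quoted textbook values for silicon ($m^*/m_e \approx 0.26$, $\epsilon_r \approx 11.7$; these specific numbers are standard reference values, not taken from this chapter):

```python
Ry = 13.6        # hydrogen ground-state binding energy, eV
a0 = 0.529       # Bohr radius, angstrom

m_ratio = 0.26   # m*/m_e for silicon conduction electrons (textbook value)
eps_r = 11.7     # static dielectric constant of silicon (textbook value)

# Hydrogenic donor: binding energy scales down by m*/(m_e * eps_r^2),
# Bohr radius scales up by eps_r/(m*/m_e)
E_d = (m_ratio / eps_r**2) * Ry
a_d = (eps_r / m_ratio) * a0

print(f"E_d ~ {E_d * 1000:.0f} meV, a_d ~ {a_d:.0f} angstrom")
```

The resulting radius of a few nanometers, spanning dozens of unit cells, is what makes the effective mass picture self-consistent.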

In both cases, the Fermi energy moves — toward the conduction band for n-type, toward the valence band for p-type. The carrier concentration becomes:

$$n \approx N_d \quad (\text{n-type, fully ionized donors})$$

At a doping level of $N_d = 10^{17}\,\text{cm}^{-3}$ (one impurity per $5 \times 10^5$ silicon atoms), the conductivity increases by a factor of $\sim 10^7$ compared to intrinsic silicon.

| Property | Metal | Semiconductor | Insulator |
|---|---|---|---|
| Band gap at $E_F$ | None (partially filled band) | Small ($\sim 0.1$-$3$ eV) | Large ($\gtrsim 4$ eV) |
| Carriers at 300 K | $\sim 10^{22}\,\text{cm}^{-3}$ | $\sim 10^{10}\,\text{cm}^{-3}$ (intrinsic) | $\sim 0$ |
| Conductivity $\sigma$ | $\sim 10^{5}$-$10^{7}\,(\Omega\cdot\text{m})^{-1}$ | $10^{-8}$-$10^{3}\,(\Omega\cdot\text{m})^{-1}$ | $< 10^{-10}\,(\Omega\cdot\text{m})^{-1}$ |
| $\sigma(T)$ dependence | Decreases with $T$ | Increases with $T$ | Increases with $T$ |
| Dominant $T$ mechanism | Phonon scattering | Thermal activation across gap | Thermal activation across gap |

💡 Key Insight: The temperature dependence of conductivity is the experimental signature that most clearly distinguishes metals from semiconductors. Metals become worse conductors when heated (more phonon scattering). Semiconductors become better conductors when heated (more carriers excited across the gap). This qualitative difference is a direct consequence of band structure.

Checkpoint: You should now be able to explain: (1) Why does the Fermi energy location determine whether a material conducts? (2) Why does an odd number of valence electrons per unit cell guarantee metallic behavior? (3) Why does doping silicon with phosphorus increase its conductivity by seven orders of magnitude?


26.6 The Tight-Binding Model

Approach: Start from Atoms

The nearly free electron model starts from free electrons and adds a weak periodic potential. The tight-binding model takes the opposite limit: start from isolated atoms and allow them to interact weakly. This is the natural picture for systems where electrons are fairly well localized around atomic sites — transition metals, organic molecules, and (as we will see) graphene.

1D Chain: The Simplest Tight-Binding Model

Consider a 1D chain of identical atoms with lattice constant $a$. Each atom has a single relevant atomic orbital $|\phi_n\rangle$ localized at site $n$ (position $x_n = na$).

In Dirac notation, we write the Bloch state as a superposition of atomic orbitals:

$$|k\rangle = \frac{1}{\sqrt{N}} \sum_{n=1}^{N} e^{ikna} |\phi_n\rangle$$

This is called a Linear Combination of Atomic Orbitals (LCAO) and automatically satisfies Bloch's theorem — the phase factor $e^{ikna}$ ensures the correct transformation under lattice translation.

The energy is:

$$E(k) = \langle k | \hat{H} | k \rangle = \frac{1}{N} \sum_{n,m} e^{ik(n-m)a} \langle \phi_m | \hat{H} | \phi_n \rangle$$

We define the key parameters:

  • On-site energy: $\epsilon_0 = \langle \phi_n | \hat{H} | \phi_n \rangle$ (energy of the isolated atomic orbital)
  • Hopping parameter: $t = -\langle \phi_n | \hat{H} | \phi_{n\pm 1}\rangle$ (coupling between nearest neighbors; the minus sign is a convention so that $t > 0$ for typical atomic potentials)
  • Overlap: $\langle \phi_n | \phi_m \rangle \approx \delta_{nm}$ (neglect overlap between different sites — the simplest approximation)

Keeping only nearest-neighbor hopping ($m = n \pm 1$):

$$E(k) = \epsilon_0 - 2t\cos(ka)$$

This is the tight-binding band for a 1D chain. It has a bandwidth of $4t$ (from $\epsilon_0 - 2t$ at $k = 0$ to $\epsilon_0 + 2t$ at $k = \pm\pi/a$). The hopping parameter $t$ controls the bandwidth — more overlap between neighboring atoms means wider bands.

💡 Key Insight: The tight-binding result $E(k) = \epsilon_0 - 2t\cos(ka)$ tells a beautiful story. When atoms are far apart ($t \to 0$), the "band" is a single degenerate level at $\epsilon_0$ — the atomic energy level. As atoms are brought together and orbitals overlap ($t > 0$), the degenerate level broadens into a band of width $4t$. The discrete atomic levels have dissolved into a continuous band. This is the quantum mechanical origin of bands.

Worked Example: Consider a chain of sodium atoms. The 3s orbital of sodium has energy $\epsilon_0 \approx -5.14$ eV (the ionization energy). With a nearest-neighbor hopping parameter $t \approx 1.0$ eV (determined from the overlap of 3s orbitals at the equilibrium spacing of 3.66 Å), the band runs from $-5.14 - 2.0 = -7.14$ eV to $-5.14 + 2.0 = -3.14$ eV, a bandwidth of 4.0 eV. Sodium has one valence electron per atom, so the band is half-filled, and the Fermi energy lies at $E_F = \epsilon_0 = -5.14$ eV (the midpoint of the cosine band, where $\cos(k_Fa) = 0$, i.e., $k_F = \pi/(2a)$). This correctly predicts that sodium is a metal — a result that depends on band theory, not classical physics.
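The worked example can be reproduced numerically. A minimal sketch, using the sodium-like parameters quoted above and setting $a = 1$ (only the product $ka$ matters for the band shape):

```python
import numpy as np

def tight_binding_1d(eps0, t, a=1.0, n_k=201):
    """E(k) = eps0 - 2 t cos(ka), sampled across the first Brillouin zone."""
    k = np.linspace(-np.pi / a, np.pi / a, n_k)
    E = eps0 - 2.0 * t * np.cos(k * a)
    return k, E

# Sodium-like parameters from the worked example (energies in eV)
k, E = tight_binding_1d(eps0=-5.14, t=1.0)
print(f"band bottom {E.min():.2f} eV, top {E.max():.2f} eV, "
      f"width {E.max() - E.min():.2f} eV")
# A half-filled cosine band puts E_F at the band center, eps0 itself.
```

This prints a band running from $-7.14$ eV to $-3.14$ eV with width $4t = 4.0$ eV, matching the hand calculation.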

Graphene: The Tight-Binding Masterpiece

Now we apply the tight-binding model to one of the most celebrated materials of the 21st century: graphene, a single layer of carbon atoms arranged in a honeycomb lattice.

The honeycomb lattice is not a Bravais lattice — it has two atoms per unit cell, which we call sublattice A and sublattice B. The Bravais lattice is triangular, with primitive vectors:

$$\mathbf{a}_1 = a\left(\frac{\sqrt{3}}{2}, \frac{1}{2}\right), \qquad \mathbf{a}_2 = a\left(\frac{\sqrt{3}}{2}, -\frac{1}{2}\right)$$

where $a = |\mathbf{a}_1| = |\mathbf{a}_2| = 2.46\,\text{\AA}$ is the lattice constant ($\sqrt{3}$ times the carbon-carbon distance of $1.42\,\text{\AA}$).

Each carbon atom has one relevant $p_z$ orbital perpendicular to the graphene plane (the three $sp^2$ orbitals form the strong $\sigma$ bonds that hold the lattice together). The $p_z$ electrons are the mobile ones that determine the electronic properties.

The tight-binding Hamiltonian, considering only nearest-neighbor hopping $t \approx 2.7$ eV, is a $2 \times 2$ matrix in the sublattice basis $\{|A, \mathbf{k}\rangle, |B, \mathbf{k}\rangle\}$:

$$H(\mathbf{k}) = \begin{pmatrix} 0 & -t \, f(\mathbf{k}) \\ -t \, f^*(\mathbf{k}) & 0 \end{pmatrix}$$

where:

$$f(\mathbf{k}) = \sum_{j=1}^{3} e^{i\mathbf{k}\cdot\boldsymbol{\delta}_j}$$

and $\boldsymbol{\delta}_1, \boldsymbol{\delta}_2, \boldsymbol{\delta}_3$ are the three nearest-neighbor vectors connecting an A-site to its three B-site neighbors.

The eigenvalues are:

$$E_\pm(\mathbf{k}) = \pm t |f(\mathbf{k})| = \pm t\sqrt{3 + 2\cos(\mathbf{k}\cdot\mathbf{a}_1) + 2\cos(\mathbf{k}\cdot\mathbf{a}_2) + 2\cos(\mathbf{k}\cdot(\mathbf{a}_1 - \mathbf{a}_2))}$$

The two bands touch at the corners of the hexagonal Brillouin zone — the K and K' points. Near these points, expanding $f(\mathbf{k})$ to first order in $\mathbf{q} = \mathbf{k} - \mathbf{K}$:

$$E_\pm(\mathbf{q}) \approx \pm \hbar v_F |\mathbf{q}|$$

where $v_F = 3ta/(2\hbar) \approx 10^6\,\text{m/s}$ is the Fermi velocity — about 1/300 the speed of light.

This is a linear dispersion — just like massless relativistic particles! The low-energy electrons in graphene behave as if they were governed by the Dirac equation (Chapter 29) with zero mass. The K and K' points are called Dirac points, and the linear band crossings are called Dirac cones.
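A short numerical check of this dispersion, using the primitive vectors and parameters given above. The K-point location for this lattice orientation, $\mathbf{K} = (0, 4\pi/3a)$, and the finite-difference step are choices made for this sketch:

```python
import numpy as np

a = 2.46e-10          # lattice constant (m)
t = 2.7               # nearest-neighbor hopping (eV)
HBAR_EVS = 6.582e-16  # hbar (eV*s)

a1 = a * np.array([np.sqrt(3) / 2,  0.5])
a2 = a * np.array([np.sqrt(3) / 2, -0.5])

def E_plus(k):
    """Upper tight-binding band E_+(k) = t|f(k)| in eV."""
    s = (3 + 2 * np.cos(k @ a1) + 2 * np.cos(k @ a2)
           + 2 * np.cos(k @ (a1 - a2)))
    return t * np.sqrt(np.maximum(s, 0.0))  # clamp tiny negative rounding

# For this orientation one Dirac point sits at K = (0, 4*pi/(3a))
K = np.array([0.0, 4 * np.pi / (3 * a)])
print("E at K (eV):", E_plus(K))            # vanishes: the bands touch

# Fermi velocity from the slope just off the Dirac point
dq = 1e6                                    # small offset (1/m)
vF = E_plus(K + np.array([0.0, dq])) / (HBAR_EVS * dq)
print(f"v_F ~ {vF:.2e} m/s")                # of order 1e6 m/s, about c/300
```

The band energy at K comes out zero to machine precision, and the extracted slope reproduces the Fermi velocity of roughly $10^6$ m/s.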

🔵 Historical Note: The tight-binding band structure of graphene was calculated by P.R. Wallace in 1947, nearly six decades before the material was isolated experimentally. Wallace's original paper noted the unusual linear dispersion and its resemblance to relativistic physics. When Andre Geim and Konstantin Novoselov isolated single-layer graphene in 2004 (and observed these Dirac fermions experimentally), they won the 2010 Nobel Prize in Physics.

Why Graphene's Band Structure Matters

The linear dispersion near the Dirac points has extraordinary consequences:

  1. Zero effective mass: Near the Dirac points, the effective mass is zero (the curvature $d^2E/dk^2 = 0$). Electrons travel at a constant velocity $v_F$ regardless of energy — like photons, not like ordinary massive particles.

  2. Ambipolar transport: The valence band ($E_-$) and conduction band ($E_+$) are symmetric and touch at the Dirac points. Carriers can be continuously tuned from electrons to holes by shifting the Fermi energy with a gate voltage.

  3. Sublattice pseudospin: The two-component wavefunction (amplitudes on A and B sublattices) acts like a spin-1/2 degree of freedom. This pseudospin is locked to the direction of $\mathbf{k}$, producing a Berry phase of $\pi$ for paths encircling a Dirac point (this connects to Chapter 32 on geometric phases and Chapter 36 on topological phases).

  4. Half-integer quantum Hall effect: Graphene exhibits a quantum Hall effect (Section 26.8) with quantized conductance at half-integer values of $e^2/h$, directly confirming the Dirac fermion nature of its carriers.

Checkpoint: You should now be able to: (1) Write down the tight-binding Hamiltonian for a 1D chain and derive the cosine dispersion. (2) Explain why graphene has two bands (answer: two atoms per unit cell). (3) Explain what "Dirac cone" means and why it implies zero effective mass.


26.7 BCS Superconductivity

The Phenomenon

In 1911, Heike Kamerlingh Onnes discovered that the electrical resistance of mercury drops abruptly to zero below a critical temperature $T_c = 4.2\,\text{K}$. Not approximately zero — exactly zero, as far as anyone can measure. A current set flowing in a superconducting ring persists for years without measurable decay. In 1933, Meissner and Ochsenfeld discovered that superconductors also expel magnetic fields from their interior — the Meissner effect. A magnetic field cannot penetrate a superconductor (except within a thin surface layer of depth $\lambda \sim 50\,\text{nm}$).

These phenomena persisted as one of the great puzzles of physics for 46 years, until Bardeen, Cooper, and Schrieffer (BCS) provided the quantum mechanical explanation in 1957.

The Cooper Pair

Leon Cooper showed in 1956 that in the presence of even a weak attractive interaction between electrons, the Fermi sea is unstable to the formation of bound pairs of electrons with opposite momenta and opposite spins: $(k\uparrow, -k\downarrow)$.

But wait — how can electrons attract each other? They are both negatively charged and should repel via the Coulomb interaction. The answer involves phonons — quantized lattice vibrations.

The mechanism is this: a fast-moving electron slightly distorts the positive ionic lattice as it passes through. Because the ions are heavy, they respond slowly — the lattice distortion persists for a time $\sim 1/\omega_D$ (the inverse Debye frequency, about $10^{-13}$ seconds) after the electron has passed. This leaves behind a wake of enhanced positive charge density. A second electron, arriving within this time window, is attracted to the region of excess positive charge. The net effect is an attractive interaction between the two electrons, mediated by the exchange of a virtual phonon.

The characteristic energy scale of this attraction is the Debye energy $\hbar\omega_D \sim 10$-$30$ meV. This is tiny compared to the Fermi energy ($\sim$ several eV), which is why the pairing is a subtle effect operating only on electrons within a thin shell of thickness $\sim \hbar\omega_D$ around the Fermi surface.

🧪 Experiment: The isotope effect provides direct evidence for phonon-mediated pairing. The critical temperature $T_c$ of a superconductor scales as $T_c \propto M^{-1/2}$, where $M$ is the mass of the lattice ions. Heavier isotopes → lower phonon frequencies → weaker coupling → lower $T_c$. This dependence was predicted by BCS theory and confirmed experimentally.

The key quantum mechanical point is that this bound state — the Cooper pair — exists only because of the Fermi sea. In vacuum, the attractive interaction would be too weak to bind two electrons. But the Pauli exclusion principle blocks all the low-energy scattering channels (the states are already filled), and this "Pauli blocking" allows even a weak attraction to create a bound state.

The BCS Ground State

BCS theory constructs the superconducting ground state as a coherent superposition of Cooper pair configurations:

$$|\Psi_{\text{BCS}}\rangle = \prod_{\mathbf{k}} \left( u_{\mathbf{k}} + v_{\mathbf{k}} \, \hat{c}^{\dagger}_{\mathbf{k}\uparrow} \hat{c}^{\dagger}_{-\mathbf{k}\downarrow} \right) |0\rangle$$

where $|u_{\mathbf{k}}|^2 + |v_{\mathbf{k}}|^2 = 1$. The coefficient $v_{\mathbf{k}}$ is the probability amplitude for the pair $(k\uparrow, -k\downarrow)$ to be occupied, and $u_{\mathbf{k}}$ is the amplitude for it to be empty.

This is a remarkable quantum state. It is a coherent superposition over different numbers of Cooper pairs — it does not have a definite particle number. It exhibits macroscopic quantum coherence: the phase of the complex coefficients $v_{\mathbf{k}}$ is the same for all pairs. The entire superconductor is described by a single macroscopic wavefunction with a single well-defined phase — this phase coherence is the ultimate origin of zero resistance.

The Energy Gap

The BCS state has an energy gap $\Delta$ above the ground state. The minimum energy required to break a Cooper pair and create two independent quasiparticles is $2\Delta$. At zero temperature:

$$\Delta_0 \approx 1.76 \, k_B T_c$$

This gap suppresses all low-energy excitations. An electric current, which corresponds to a shift of the paired Fermi sea in $k$-space, cannot decay because there are no available low-energy states to scatter into. This is why the resistance is exactly zero — not just very small, but exactly zero.
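The gap relation is simple to evaluate. A small sketch using standard handbook $T_c$ values for a few elemental superconductors (the specific $T_c$ numbers are illustrative, not taken from this chapter):

```python
# BCS weak-coupling gap: Delta_0 = 1.76 k_B T_c.
K_B_EV = 8.617e-5  # Boltzmann constant (eV/K)

def bcs_gap_meV(Tc):
    """Zero-temperature gap in meV from the BCS weak-coupling relation."""
    return 1.76 * K_B_EV * Tc * 1000

# Representative handbook critical temperatures (K)
for name, Tc in [("Al", 1.2), ("Hg", 4.2), ("Nb", 9.3)]:
    print(f"{name}: Tc = {Tc} K  ->  Delta_0 ~ {bcs_gap_meV(Tc):.2f} meV")
```

The gaps come out in the 0.1-1.5 meV range, orders of magnitude below the Fermi energy, which is why superconductivity survives only at cryogenic temperatures in these materials.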

💡 Key Insight: Superconductivity is a macroscopic quantum phenomenon. The BCS ground state is a single quantum state occupied by $\sim 10^{23}$ electrons, all phase-coherent. The energy gap protects this state from thermal fluctuations. When $k_BT \ll \Delta$, the superconductor cannot absorb energy from scattering events, so the current flows forever. Resistance is not merely small — it is exactly zero because the gap is an absolute barrier to low-energy excitations.

Beyond BCS: High-Temperature Superconductors

The BCS mechanism explains superconductivity in conventional metals (aluminum, lead, niobium) with $T_c$ values up to about 30 K. In 1986, Bednorz and Muller discovered superconductivity in a copper oxide ceramic at $T_c \approx 35\,\text{K}$ — far above what BCS phonon-mediated pairing could easily explain. The race was on, and within a year, materials with $T_c > 90\,\text{K}$ (above the boiling point of liquid nitrogen, 77 K) were found.

The mechanism behind high-temperature superconductivity remains one of the major unsolved problems in physics. The pairing is believed to be mediated not by phonons but by magnetic (antiferromagnetic spin-fluctuation) interactions, but no universally accepted theory exists as of this writing.

⚖️ Interpretation: The BCS state challenges our notions of "particle." In a superconductor, the elementary excitations are not electrons but Bogoliubov quasiparticles — coherent superpositions of electron-like and hole-like excitations. The original electrons have "dissolved" into a collective quantum state. This is a profound example of emergence: the superconducting state has properties (zero resistance, Meissner effect, quantized flux) that no individual electron possesses.


26.8 The Quantum Hall Effect

Classical Hall Effect: A Quick Reminder

When a current-carrying conductor is placed in a perpendicular magnetic field $\mathbf{B}$, the Lorentz force $\mathbf{F} = -e\mathbf{v} \times \mathbf{B}$ deflects electrons sideways, building up a transverse voltage — the Hall voltage $V_H$. In steady state, the transverse electric field exactly cancels the magnetic force, and the Hall resistance is:

$$R_{xy} = \frac{V_H}{I} = \frac{B}{ne}$$

where $n$ is the 2D carrier density (charge per area). Classically, $R_{xy}$ is proportional to $B$ — a smooth, featureless line. Edwin Hall discovered this effect in 1879, and it was immediately useful: measuring $R_{xy}$ gives the carrier density and sign (electrons vs. holes) in any material. The classical Hall effect is routinely used to characterize semiconductors.

The Quantum Surprise

In 1980, Klaus von Klitzing performed Hall effect measurements on a two-dimensional electron gas (2DEG) in a silicon MOSFET at low temperature (1.5 K) and high magnetic field (18 T). He expected to see the smooth classical result. Instead, he found something astonishing: the Hall resistance is quantized, locked to exact integer submultiples of a fundamental constant:

$$R_{xy} = \frac{h}{ie^2} = \frac{25{,}812.807\ldots\,\Omega}{i}, \quad i = 1, 2, 3, \ldots$$

The quantization was exact to better than one part in $10^9$ — more precise than the values of $h$ and $e$ were independently known at the time. Simultaneously, the longitudinal resistance $R_{xx}$ drops to zero on the plateaus.

Von Klitzing won the 1985 Nobel Prize. His discovery — the integer quantum Hall effect (IQHE) — is now used as the international standard for electrical resistance.

Landau Levels: The Quantum Mechanical Origin

The IQHE arises from the quantization of electron orbits in a magnetic field. You saw in your study of angular momentum (Chapters 12-14) that orbital motion in a central potential is quantized. Something similar happens in a magnetic field.

An electron confined to 2D in a perpendicular magnetic field $B$ obeys the Schrodinger equation with the Hamiltonian:

$$\hat{H} = \frac{1}{2m^*}\left(\hat{\mathbf{p}} + e\mathbf{A}\right)^2$$

where $\mathbf{A}$ is the vector potential ($\nabla \times \mathbf{A} = B\hat{z}$). In the Landau gauge $\mathbf{A} = (0, Bx, 0)$, this can be separated and reduced to a 1D harmonic oscillator problem. The energy levels are:

$$E_n = \hbar\omega_c\left(n + \frac{1}{2}\right), \quad n = 0, 1, 2, \ldots$$

where $\omega_c = eB/m^*$ is the cyclotron frequency. These are Landau levels — equally spaced, just like the quantum harmonic oscillator, with spacing $\hbar\omega_c$.

Each Landau level is massively degenerate: it holds $N_\phi = eB/h$ states per unit area (one state per quantum of magnetic flux $\Phi_0 = h/e$). The total density of states, which was a smooth function of energy in zero field, collapses into a series of discrete delta functions at the Landau level energies.

Why the Hall Resistance Is Quantized

When exactly $i$ Landau levels are completely filled (and the Fermi energy lies in the gap between Landau levels $i$ and $i+1$), the carrier density is:

$$n = i \cdot \frac{eB}{h}$$

Substituting into the classical Hall formula:

$$R_{xy} = \frac{B}{ne} = \frac{B}{i \cdot \frac{eB}{h} \cdot e} = \frac{h}{ie^2}$$

The $B$-dependence cancels! The Hall resistance depends only on fundamental constants and the integer $i$.
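A quick numerical check of the plateau values and the Landau-level degeneracy, using the exact SI values of $h$ and $e$:

```python
# Quantized Hall resistance R_xy = h/(i e^2); the i = 1 value is the
# von Klitzing constant R_K ~ 25812.807 Ohm.
H = 6.62607015e-34   # Planck constant (J*s, exact since 2019)
E = 1.602176634e-19  # elementary charge (C, exact since 2019)

R_K = H / E**2
for i in range(1, 5):
    print(f"i = {i}: R_xy = {R_K / i:10.3f} Ohm")

# Landau-level degeneracy per unit area: one state per flux quantum h/e
B = 10.0             # field strength (T), an illustrative value
n_phi = E * B / H    # states per m^2 per level
print(f"degeneracy at {B} T: {n_phi:.2e} per m^2")
```

At 10 T each Landau level holds about $2.4 \times 10^{15}$ states per m$^2$, which is why a modest-density 2DEG can sit with its Fermi energy in a gap between levels.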

But this argument seems too simple — it is essentially just counting. Why is the quantization so remarkably precise? The deep answer involves topology (Chapter 36). The Hall conductance is related to a topological invariant — the Chern number — which is an integer by mathematical necessity. Just as you cannot have half a hole in a donut, you cannot have a non-integer Chern number. The quantization is protected by topology and is robust against disorder, impurities, sample geometry, and other imperfections.

📊 By the Numbers: The quantum Hall resistance $R_K = h/e^2 = 25{,}812.80745...\,\Omega$ is known to better than one part in $10^{10}$. Since 2019, it has been used (along with the Josephson effect) to define the SI units of electrical resistance and current. The quantum Hall effect is not just beautiful physics — it is literally the standard against which all electrical measurements are calibrated.

The Fractional Quantum Hall Effect

In 1982, Tsui, Stormer, and Gossard observed Hall resistance plateaus at fractional filling factors: $R_{xy} = h/(fe^2)$ with $f = 1/3, 2/5, 3/7, \ldots$ This fractional quantum Hall effect (FQHE) cannot be explained by filling Landau levels with non-interacting electrons — it requires electron-electron interactions.

Robert Laughlin proposed a trial wavefunction for the $f = 1/3$ state:

$$\Psi_{\text{Laughlin}}(z_1, z_2, \ldots, z_N) = \prod_{i < j} (z_i - z_j)^3 \, e^{-\sum_k |z_k|^2/4l_B^2}$$

where $z_j = x_j + iy_j$ and $l_B = \sqrt{\hbar/eB}$ is the magnetic length. The factor $(z_i - z_j)^3$ keeps electrons far apart (each pair separated by a third-order zero), minimizing the Coulomb repulsion energy while maintaining the antisymmetry required by Fermi statistics.
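The antisymmetry is easy to verify directly: exchanging two coordinates flips the sign of the odd-power Jastrow factor while leaving everything else unchanged. A minimal numerical check for three particles, with $l_B = 1$ and arbitrary test coordinates chosen for this sketch:

```python
import numpy as np

def laughlin(zs, m=3):
    """Unnormalized Laughlin wavefunction at filling 1/m (units with l_B = 1)."""
    jastrow = np.prod([(zs[i] - zs[j])**m
                       for i in range(len(zs)) for j in range(i + 1, len(zs))])
    gaussian = np.exp(-sum(abs(z)**2 for z in zs) / 4.0)
    return jastrow * gaussian

zs = [0.3 + 0.1j, -0.5 + 0.7j, 1.1 - 0.4j]
swapped = [zs[1], zs[0], zs[2]]   # exchange particles 1 and 2
print(laughlin(zs) + laughlin(swapped))  # ~0: odd under exchange
```

Because $m = 3$ is odd, $(z_1 - z_2)^3 \to -(z_1 - z_2)^3$ under the swap and the two evaluations cancel, confirming the fermionic antisymmetry; even $m$ would give a bosonic (symmetric) state instead.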

The excitations above the Laughlin state are one of the most exotic objects in physics: anyons — quasiparticles that are neither bosons nor fermions but obey fractional statistics. The quasiparticles carry fractional charge $e^* = e/3$. These anyons are the basis of proposals for topological quantum computing (Chapter 36).

⚖️ Interpretation: The FQHE is perhaps the most dramatic example of emergence in quantum mechanics. Start with ordinary electrons (charge $e$, spin-1/2 fermions) in a magnetic field. Add Coulomb interactions. The result: a new state of matter whose excitations carry fractional charge and obey statistics that exist in no fundamental particle. The whole is not merely more than the sum of its parts — it is qualitatively different from its parts.


26.9 QM Explains Your Phone

Let us make the connection to technology explicit. Your smartphone contains:

Processor (CPU/GPU): Billions of transistors, each a quantum mechanical device. The MOSFET (metal-oxide-semiconductor field-effect transistor) works by using a gate voltage to push the Fermi energy in a semiconductor channel from inside the band gap (insulating — "off") to inside the conduction band (conducting — "on"). The on/off ratio, switching speed, leakage current, and threshold voltage are all determined by the band structure and effective masses derived in this chapter.

Memory (Flash/DRAM): Flash memory stores data by trapping electrons on a floating gate via quantum mechanical tunneling (Chapter 3). The electrons tunnel through a thin SiO$_2$ barrier — classically impossible, quantum mechanically routine. The retention time (decades) and write speed (microseconds) are governed by the tunnel barrier height and width.

Display (OLED/LED): Light-emitting diodes exploit band gaps. When an electron in the conduction band recombines with a hole in the valence band, it emits a photon with energy $E \approx E_g$. The color of the LED is directly set by the semiconductor band gap: GaN ($E_g \approx 3.4$ eV, blue/UV), InGaN (tunable, green), AlGaInP (red). White LEDs use a blue GaN LED with a phosphor coating.
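The gap-to-color conversion is a one-liner: $\lambda = hc/E_g \approx 1240\,\text{nm}\cdot\text{eV}/E_g$. A sketch with representative gap values (the GaAs entry is added here for comparison and is not from the list above):

```python
# Emission wavelength from band gap: lambda = h*c / E_g.
HC_EV_NM = 1239.84  # h*c in eV*nm

def gap_to_wavelength_nm(E_g):
    """Photon wavelength (nm) for band-edge recombination across gap E_g (eV)."""
    return HC_EV_NM / E_g

for material, E_g in [("GaN", 3.4), ("GaAs", 1.42), ("Si", 1.12)]:
    print(f"{material}: E_g = {E_g} eV -> {gap_to_wavelength_nm(E_g):.0f} nm")
```

GaN's 3.4 eV gap lands near 365 nm (UV/violet edge), while silicon's 1.12 eV gap corresponds to about 1100 nm, which is why silicon photodiodes remain sensitive into the near-IR.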

Battery: Lithium-ion batteries involve the quantum mechanical band structures of electrode materials (typically transition metal oxides like LiCoO$_2$). The voltage, capacity, and charging rate depend on the electronic band structure and the energy levels of Li$^+$ insertion sites.

Wireless (antenna/RF): The transistors in the RF amplifier exploit the high electron mobility of GaAs or GaN — both direct consequences of their band structures and effective masses.

Camera sensor: Each pixel is a semiconductor photodiode. An incoming photon with $E > E_g$ excites an electron from the valence to the conduction band, generating a measurable current. The band gap determines the spectral sensitivity (silicon: sensitive to visible and near-IR, matching the solar spectrum — not a coincidence, but selection by application engineers).

Solar cells: A photovoltaic cell is essentially a large-area p-n junction photodiode. Photons with energy $E > E_g$ create electron-hole pairs; the built-in electric field of the p-n junction sweeps them apart, generating a voltage and current. The efficiency is maximized when the band gap matches the peak of the solar spectrum — the Shockley-Queisser limit of $\sim 33\%$ for a single-junction cell corresponds to $E_g \approx 1.34$ eV. Silicon ($E_g = 1.12$ eV) is close to this optimum, which is one reason it dominates the photovoltaic industry.

📊 By the Numbers: The semiconductor industry generates over $600 billion in annual revenue (as of 2024). Every dollar of that revenue depends on quantum mechanical band theory. The entire digital revolution — computers, internet, smartphones, AI — is an engineering application of the physics in this chapter. Solar energy generation exceeded 1,500 TWh in 2023, all from photovoltaic cells whose operation is governed by band gaps.

The quantum Hall effect, meanwhile, underpins the international system of electrical units. And MRI machines use superconducting magnets (BCS superconductors, typically NbTi or Nb$_3$Sn) to generate the strong, stable magnetic fields needed for medical imaging.

🔗 Connection: The technological applications of quantum mechanics continue to grow. Quantum dots (semiconductor nanocrystals with size-tunable band gaps) appear in next-generation displays. Topological insulators (Chapter 36) are being explored for dissipationless electronics. Superconducting qubits (Chapter 25, 35) are the leading platform for quantum computing. Condensed matter physics is where quantum mechanics meets the real world.


26.10 Summary and Progressive Project

Summary of Key Results

This chapter derived the quantum mechanical theory of electrons in periodic potentials and applied it to explain the electronic properties of solids.

Bloch's theorem (Section 26.2): Eigenstates in a periodic potential take the form $\psi_{n\mathbf{k}}(\mathbf{r}) = e^{i\mathbf{k}\cdot\mathbf{r}} u_{n\mathbf{k}}(\mathbf{r})$ with $u$ lattice-periodic. Crystal momentum $\hbar\mathbf{k}$ is conserved; the Brillouin zone contains all distinct $\mathbf{k}$ values.

Band structure (Sections 26.3-26.4): Allowed energies form bands separated by gaps. The nearly free electron model explains gaps as Bragg reflection at zone boundaries (perturbation theory). The Kronig-Penney model provides an exactly solvable example. Effective mass $m^* = \hbar^2/(d^2E/dk^2)$ captures how the lattice modifies electron dynamics.

Classification of solids (Section 26.5): Metals have partially filled bands. Insulators have full bands separated by large gaps. Semiconductors have full bands separated by small gaps, tunable by doping.

Tight-binding model (Section 26.6): Starting from atomic orbitals, the 1D chain gives $E(k) = \epsilon_0 - 2t\cos(ka)$. Graphene's honeycomb lattice yields Dirac cones at the K points — massless relativistic electrons at a fraction of the speed of light.

BCS superconductivity (Section 26.7): Phonon-mediated attraction creates Cooper pairs. The BCS ground state exhibits macroscopic quantum coherence with an energy gap $\Delta$ that protects the zero-resistance state.

Quantum Hall effect (Section 26.8): Landau level quantization in 2D plus a Fermi energy in a gap gives $R_{xy} = h/(ie^2)$, quantized by topology. The fractional QHE produces anyons with fractional charge.

Looking Ahead

The band theory developed here returns in Chapter 36, where we add topology to the story. The Berry phase (Chapter 32) applied to Bloch bands yields topological invariants that classify insulators into ordinary and topological varieties — materials that insulate in the bulk but conduct on their edges. The tight-binding model of graphene, with its sublattice pseudospin and Berry phase $\pi$, is the prototype.

Progressive Project: Condensed Matter Module

Add the following to your Quantum Simulation Toolkit:

Task 1: Kronig-Penney Band Structure

Implement a function kronig_penney(V0, b, w, N_energies) that:

  • Solves the Kronig-Penney transcendental equation numerically for $E$ given $k$
  • Returns band structure $E_n(k)$ for the first several bands
  • Identifies band gaps (forbidden energy regions)
  • Plots the band structure in the reduced zone scheme
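As a starting point, here is a minimal sketch of the band-finding step in the delta-function (Dirac-comb) limit of the Kronig-Penney model, rather than the full (V0, b, w) parameterization; the dimensionless strength P and the scan range are illustrative choices:

```python
import numpy as np

def kp_allowed(P=3.0, n=4000, qa_max=12.0):
    """Dirac-comb limit of the Kronig-Penney model: an energy
    E = (hbar*q)**2 / (2m) is allowed when |cos(qa) + P*sin(qa)/(qa)| <= 1,
    since the left side must equal cos(ka) for some real k."""
    qa = np.linspace(1e-6, qa_max, n)
    rhs = np.cos(qa) + P * np.sin(qa) / qa
    return qa, np.abs(rhs) <= 1.0

qa, allowed = kp_allowed()
edges = np.flatnonzero(np.diff(allowed.astype(int)))  # band-edge indices
print("band edges found:", len(edges))
print("first allowed band begins near qa =", round(float(qa[edges[0]]), 2))
```

Scanning the allowed mask reveals alternating bands and gaps; the full project task replaces this limit with the finite-barrier transcendental equation and inverts it for $E_n(k)$.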

Task 2: Tight-Binding Band Structure

Implement a function tight_binding_1d(epsilon_0, t, N_k) that:

  • Computes $E(k) = \epsilon_0 - 2t\cos(ka)$ on a grid of $k$ values in the first Brillouin zone
  • Returns energy, crystal momentum, effective mass, density of states
  • Plots the band structure, DOS, and effective mass vs. $k$

Task 3: Graphene Bands

Implement graphene_bands(t, N_k) that:

  • Computes the full 2D band structure $E_\pm(\mathbf{k})$
  • Plots the 3D band surface showing the Dirac cones
  • Computes and plots a cut along the high-symmetry path $\Gamma \to K \to M \to \Gamma$
  • Extracts the Fermi velocity near the K point

See code/project-checkpoint.py for a starter implementation and code/example-01-bands.py for a complete worked example of the Kronig-Penney band structure.

⚠️ Common Misconception: Students sometimes confuse the number of bands with the number of atoms in the crystal. The number of bands is determined by the number of orbitals per unit cell (including spin), not the total number of atoms. A crystal with $10^{23}$ atoms but 1 orbital per unit cell has 1 band (with $10^{23}$ states). A crystal with $10^{23}$ unit cells and 2 orbitals per cell has 2 bands (each with $10^{23}$ states). Graphene has 2 atoms per cell, hence 2 $\pi$ bands.

🔗 Connection: This chapter has shown how quantum mechanics governs the macroscopic properties of solids. In Chapter 27, we turn to the quantum mechanics of light itself — photon number states, coherent states, and quantum optics. In Chapter 32, we will learn about the Berry phase, and in Chapter 36, we will return to band theory with the powerful lens of topology, discovering entirely new states of matter.