Case Study 9.2: The Signal Protocol and the Encryption Countermovement

Overview

In the years following the Snowden revelations of 2013, a quiet but consequential technical revolution took place in the communications software that billions of people use daily. End-to-end encryption — previously available only to technically sophisticated users willing to use complex tools — was built into mainstream applications used by ordinary people: WhatsApp (2016), Apple's iMessage (end-to-end encrypted since its 2011 launch), Facebook Messenger (optional mode), and Google Messages. The technology enabling most of these implementations is the Signal Protocol, developed by the small organization Open Whisper Systems, whose work is now carried on by the nonprofit Signal Foundation. This case study examines the Signal Protocol as a study in the relationship between surveillance revelation, technical response, and the evolving landscape of democratic oversight of communications: what it means, in practice, for communications to be genuinely private from mass interception.


Background: The Surveillance Problem and the Encryption Response

Chapter 9 describes how intelligence agencies access communications at multiple points in the communications infrastructure: at the level of fiber-optic cables (TEMPORA, MUSCULAR), through legal orders to internet companies (PRISM), and through the analysis of metadata (Section 215 bulk collection). Each of these collection methods operates differently, and encryption addresses them differently.

Cable-level interception (TEMPORA, MUSCULAR): GCHQ and NSA tap cables or access data center connections to collect the data flowing through them. If communications are end-to-end encrypted, the data flowing through the cables is encrypted ciphertext — readable only by the intended recipient who holds the decryption key. The intercepted data is useless without the key.

Legal orders to companies (PRISM): When governments require companies to provide user communications content, companies can only provide what they have. If the company has plaintext content stored on its servers, it can provide that content. If the content is end-to-end encrypted and the company does not hold the decryption keys, it cannot provide readable content even if required to do so.

Metadata collection: End-to-end encryption does not protect metadata — who communicated with whom, when, from where. This is a significant limitation of encryption as a privacy protection. The Signal application has implemented several measures to minimize metadata collection (including "sealed sender" technology that obscures who sent a message), but the fundamental limitation remains: encryption protects content; metadata requires different approaches.
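The division of labor described above — content protected, metadata exposed — can be made concrete with a toy sketch. The example below is purely illustrative: it uses a one-time-pad XOR in place of the authenticated encryption a real messenger would use, and the names and envelope fields are invented for the illustration. The point is what an intermediary at a cable tap or server sees: an opaque payload, but fully readable routing metadata.

```python
import secrets

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """One-time-pad XOR: a toy stand-in for real authenticated encryption."""
    assert len(key) == len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

plaintext = b"meet at the usual place"
key = secrets.token_bytes(len(plaintext))  # held only by the two endpoints

# What an intermediary (cable tap, company server) actually observes:
envelope = {
    "sender": "alice@example.org",           # metadata: still visible
    "recipient": "bob@example.org",          # metadata: still visible
    "timestamp": 1700000000,                 # metadata: still visible
    "payload": toy_encrypt(key, plaintext),  # content: opaque without the key
}

assert envelope["payload"] != plaintext                    # ciphertext is unreadable
assert toy_encrypt(key, envelope["payload"]) == plaintext  # endpoint recovers it
```

Even in this toy, bulk collection of the envelopes alone would reveal the communication graph — which is exactly why metadata protection requires measures beyond content encryption.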


The Signal Protocol: Technical Architecture

The Signal Protocol (whose ratcheting design was originally called the Axolotl Ratchet, now the Double Ratchet, developed by Moxie Marlinspike and Trevor Perrin at the organization then called Open Whisper Systems) is a cryptographic protocol that provides:

End-to-end encryption: Messages are encrypted on the sender's device and decrypted only on the recipient's device. No intermediate party — not Signal, not Google, not the NSA — has access to the decryption keys.

Forward secrecy: The protocol generates new encryption keys for each message. If an adversary compromises a current session key, they cannot decrypt past messages — the keys used to encrypt past messages no longer exist. This property (called "perfect forward secrecy" in cryptography) means that mass collection of encrypted traffic is less valuable: even if a sophisticated adversary stores all encrypted traffic in hope of breaking the encryption later, they will find that each message requires breaking a separate key.

Break-in recovery: If a session key is compromised, the compromise does not extend to future messages — the protocol regularly "ratchets" to new keys.

Deniable authentication: The protocol allows parties to verify that they are talking to each other without creating a cryptographic proof of the conversation that could be produced in court or to a third party. Each party can verify the other's identity but cannot prove to others that the conversation occurred.
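The forward-secrecy property can be sketched with a minimal symmetric-key ratchet. This is a conceptual illustration, not the actual Signal implementation: it shows only the symmetric half of the design, using HMAC-SHA256 as the one-way step, and the class and label names are invented. Break-in recovery comes from the protocol's separate Diffie-Hellman ratchet, which this sketch omits.

```python
import hashlib
import hmac

def kdf(chain_key: bytes, label: bytes) -> bytes:
    """One-way key derivation step, here HMAC-SHA256."""
    return hmac.new(chain_key, label, hashlib.sha256).digest()

class SymmetricRatchet:
    """Toy symmetric-key ratchet illustrating forward secrecy.

    Each message gets a fresh key, and the chain key is immediately
    replaced by a one-way function of itself, so earlier keys cannot
    be recomputed from the current state.
    """
    def __init__(self, root_key: bytes):
        self._chain_key = root_key  # would come from an initial key agreement

    def next_message_key(self) -> bytes:
        message_key = kdf(self._chain_key, b"message")
        self._chain_key = kdf(self._chain_key, b"chain")  # old chain key discarded
        return message_key

# Both endpoints start from the same (hypothetical) agreed root key...
alice = SymmetricRatchet(b"shared-root-key-from-key-agreement")
bob = SymmetricRatchet(b"shared-root-key-from-key-agreement")

k1, k2 = alice.next_message_key(), alice.next_message_key()
assert k1 != k2                      # every message uses a distinct key
assert bob.next_message_key() == k1  # both endpoints derive the same sequence
```

Because the chain key is overwritten at each step, an adversary who captures a device's current state cannot run the ratchet backward to decrypt previously intercepted ciphertext — which is what makes stockpiling encrypted traffic for later decryption so much less valuable.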


Adoption and Scale

The Signal Protocol was initially implemented in the Signal application, which achieved modest adoption among privacy advocates and technically sophisticated users. The major inflection point came in 2016, when WhatsApp — then with approximately one billion users — implemented the Signal Protocol for all communications on its platform.

The scale of this transition is difficult to overstate. Before WhatsApp's Signal Protocol implementation, end-to-end encryption was used by a small fraction of global internet users. After the implementation, it was used by essentially anyone who used WhatsApp — a user base spanning continents and demographics that had never thought about encryption and wouldn't have described themselves as privacy-conscious.

Apple's iMessage had offered end-to-end encryption since its launch; subsequently, Google Messages and other major platforms implemented end-to-end encryption for significant portions of their communications. By 2023, a substantial majority of consumer messaging was end-to-end encrypted.


The State Response

Intelligence agencies and law enforcement responded to the encryption expansion through several channels:

Legal demands. The FBI and DOJ increased pressure on Congress to require technology companies to build "exceptional access" mechanisms — the backdoor proposals described in Chapter 9's main text. These proposals have not, as of this writing, been enacted into U.S. law.

Metadata exploitation. Unable to access communications content, intelligence and law enforcement agencies intensified focus on what they could access: metadata. The records of who communicated with whom, when, and how frequently — even without the content of those communications — remain accessible, and the agencies' metadata-analysis capabilities have become increasingly sophisticated.

Endpoint attacks. When content cannot be intercepted in transit, intelligence agencies can attempt to access it at the endpoint — the device on which it is decrypted. NSO Group's Pegasus spyware, discussed in Chapter 32, is designed exactly for this: rather than breaking encryption in transit, it compromises the device that decrypts the communications. The encryption is not broken; the attacker simply reads the plaintext before it is encrypted or after it is decrypted.

Lawful access to Signal and similar apps. Signal is designed to minimize what information it can provide even in response to lawful process. Its technical architecture means it cannot provide message content, and it retains minimal metadata. In documented cases where Signal received legal process from law enforcement, the company was able to provide only the date an account was created and the date it last connected to Signal's servers: no message content, no contact lists, no message timestamps, no group information.


The Going Dark Debate Revisited

The expansion of end-to-end encryption has made the "going dark" debate concrete rather than theoretical. Law enforcement agencies can demonstrate specific cases in which they have been unable to access communications content they would have been able to access ten years ago, because the communications were end-to-end encrypted.

The FBI has cited encrypted communications in child sexual abuse investigations, terrorism cases, and drug trafficking investigations. These cases are real, and the operational challenge is genuine. The question is whether the policy response — requiring backdoor access — is technically feasible without creating risks that outweigh the law enforcement benefit.

The technical case against backdoors has remained consistent since the Clipper chip debates of the 1990s:

  1. A government-accessible backdoor is a vulnerability that any sufficiently sophisticated adversary can attempt to exploit.

  2. The organizations that hold the keys to backdoor systems become extremely high-value targets for foreign intelligence services, criminal hackers, and corporate espionage.

  3. Bad actors who know that a backdoor exists can design around it — using homemade encryption tools, foreign-made applications, or other means of achieving secure communication.

  4. Law-abiding users bear the security cost of the backdoor while sophisticated criminals and terrorists avoid it.

The FBI's response is that perfect security is not the standard — it is enough that backdoor access works against the large majority of bad actors who use commercial platforms, even if sophisticated adversaries can evade it.


The Democratic Significance of Encryption

The Signal Protocol and the broader encryption expansion represent something that surveillance history rarely offers: a genuine correction in the privacy-protective direction. The Snowden revelations revealed the scale of state surveillance; the encryption countermovement meaningfully reduced state access to communications content for billions of ordinary users.

This is not a complete correction. Metadata remains accessible. Device-level attacks can bypass content encryption. Mass surveillance of communications metadata continues. And the political fight over encryption — backdoors, lawful access, going dark — continues. But the practical privacy of ordinary digital communications is meaningfully greater in 2024 than it was in 2013, specifically because a technical response to surveillance revelation was widely adopted.

For democratic theory, the encryption story has an interesting implication: democratic accountability for surveillance does not require only legislative reform or judicial oversight — it can emerge through technical countermeasures deployed by the market in response to revealed overreach.


Discussion Questions

  1. The Signal Protocol provides end-to-end encryption that is mathematically secure but does not protect metadata — who communicated with whom, when, and how frequently. Given the chapter's analysis of how much information metadata reveals (the Stanford health privacy study, the "we kill people based on metadata" discussion), is end-to-end content encryption a meaningful privacy protection if metadata remains accessible? What would genuine communications privacy require?

  2. Law enforcement agencies have cited specific cases — particularly child sexual abuse investigations — in which end-to-end encryption has impeded their ability to identify perpetrators. Evaluate this specific use case carefully. Is this a compelling argument for encryption backdoors? What factors distinguish it from the general going dark debate? Are there technical approaches that could address this specific use case without creating general backdoor vulnerabilities?

  3. The encryption countermovement — the deployment of end-to-end encryption by major consumer platforms in response to the Snowden revelations — is described in the case study as "a genuine correction that went in the privacy-protective direction." Evaluate this characterization. Is it accurate? What are the limits of encryption as a correction to surveillance overreach?

  4. Signal's architecture is designed to minimize what it can provide in response to lawful legal process. Evaluate this design choice. Is it appropriate for a private company to design its products specifically to resist lawful government access to data? What is the difference between Signal's approach and a company that simply refuses to comply with lawful process?

  5. The case study notes that bad actors aware of a backdoor can design around it — using non-backdoored tools — while law-abiding users bear the security cost. Evaluate this argument. If sophisticated criminals can evade backdoors, does this mean the FBI's "going dark" concern only affects ordinary (less sophisticated) criminals? Is there a distinction between the terrorists and serious organized criminals the FBI most wants to surveil and the bad actors who are least capable of evading backdoored systems?

  6. Jordan uses Signal to communicate with Yara about surveillance studies work and political organizing. Marcus thinks this is paranoid. Jordan says: "I'm not hiding anything. I just don't want my communications to be part of someone's bulk collection database." Evaluate Jordan's position. Is this a reasonable articulation of a privacy interest? Does using encrypted communications to avoid bulk collection reflect appropriate civic behavior or inappropriate distrust of legitimate governance?


Case Study 9.2 | Chapter 9: Intelligence Agencies and Mass Interception | Part 2: State Surveillance