Case Study 39-1: Signal — Privacy by Design as Structural Commitment

Background

Signal is an encrypted messaging application developed by the Signal Foundation, a nonprofit. It provides end-to-end encrypted text, voice, and video communications. As of the mid-2020s, it had over 100 million users globally, including heads of government, journalists, activists, attorneys, healthcare workers, and many millions of ordinary people who wanted private digital communications.

Signal is not the most popular messaging application. WhatsApp has over two billion users. iMessage has hundreds of millions. Signal's smaller market position reflects a deliberate choice: it is built to do one thing, provide the most private communications possible, rather than to maximize user engagement, monetize data, or integrate with advertising ecosystems.

This case study examines Signal as an example of privacy by design taken seriously: as a structural commitment embedded in technical architecture, business model, and organizational mission rather than as a feature or a compliance posture.

The Technical Architecture

Signal's privacy by design begins with its technical architecture. Several design choices distinguish Signal from virtually all competing communications platforms:

What Signal cannot see:

- Signal cannot read your messages. They are end-to-end encrypted from your device to your recipient's device, and Signal holds no keys to decrypt them.
- Signal cannot see who is messaging whom. Its "sealed sender" feature strips the sender's identity from messages in transit, so Signal's servers handle only opaque encrypted payloads addressed to a delivery mailbox.
- Signal cannot see your contact list. Private contact discovery uses cryptographic techniques to let your device check which of your contacts use Signal without uploading the contact list to Signal's servers.
- Signal cannot see when you were last online.
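The architectural idea behind these guarantees can be sketched in a few lines. The following is a toy model, not the Signal Protocol: the class and method names are invented, and the XOR one-time pad stands in for the real double-ratchet encryption. What it illustrates is the structural point: the server relays opaque ciphertext and accumulates almost nothing about its users.

```python
import os
import time

class RelayServer:
    """Toy relay modeled on the data-minimization idea: the server sees
    only opaque ciphertext and retains only a last-connection timestamp.
    Illustrative sketch only; all names are invented."""

    def __init__(self):
        self.mailboxes = {}   # recipient mailbox -> queued ciphertext blobs
        self.last_seen = {}   # the ONLY per-account data this toy retains

    def deliver(self, recipient_id, blob):
        # The server cannot distinguish sender, content, or conversation:
        # it receives an opaque blob addressed to a mailbox.
        self.mailboxes.setdefault(recipient_id, []).append(blob)

    def fetch(self, account_id):
        self.last_seen[account_id] = time.time()
        return self.mailboxes.pop(account_id, [])

def encrypt(key, plaintext):
    # One-time-pad stand-in for the real end-to-end encryption.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Alice and Bob share a key out of band (in Signal, via a key agreement
# protocol); the server never holds it.
msg = b"meet at noon"
key = os.urandom(len(msg))

server = RelayServer()
server.deliver("bob", encrypt(key, msg))

# The server stored only ciphertext; Bob decrypts locally.
assert decrypt(key, server.fetch("bob")[0]) == msg
```

The design choice the sketch makes visible: privacy here is not a policy layered on top of the server, it is the absence of any code path that would let the server learn more.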

What Signal retains about you:

Signal retains only the date your account was created and the date you last connected to Signal's servers. That is the complete list of user data that Signal maintains. This minimal data footprint is a design choice: Signal was built to have as little user data as possible, so that there is little to share, subpoena, or breach.
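To make concrete how little that is, the entire per-user record could be modeled as follows. This is a hypothetical sketch; the field and class names are invented, not Signal's actual schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AccountRecord:
    """Hypothetical model of the only per-user data the service retains."""
    account_created: date
    last_connected: date

    def subpoena_response(self):
        # Everything the service could produce under legal compulsion.
        return {
            "account_created": self.account_created.isoformat(),
            "last_connected": self.last_connected.isoformat(),
        }

record = AccountRecord(date(2021, 2, 10), date(2021, 6, 7))
assert set(record.subpoena_response()) == {"account_created", "last_connected"}
```

There is no message table, no contact graph, no log of who talked to whom: the data that cannot be produced is data that was never collected.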

What happens when law enforcement asks:

Signal has repeatedly been subpoenaed by U.S. law enforcement agencies seeking information about Signal users. Its responses are public. In every published case, Signal has responded to subpoenas by providing the only data it has: the date of account creation and the date of last connection. It cannot provide message content, because it doesn't have it. It cannot provide contact information, because it doesn't have it. It cannot provide communication metadata, because it doesn't have it.

Signal's response to a grand jury subpoena from the U.S. Eastern District of Virginia, one of the most detailed public disclosures of Signal's data minimization approach, produced exactly these two fields. It could not provide anything else, because nothing else existed to provide.

The Business Model

Signal's privacy architecture is sustainable because Signal's business model does not depend on data extraction. The Signal Foundation is a nonprofit; Signal's development is funded by donations, grants, and foundation support, not advertising revenue. It has received significant donations from the co-founders of WhatsApp (who left Meta specifically because of disagreements about privacy), from various technology philanthropies, and from a broad community of supporters.

This business model is not easily replicated by commercial competitors. A for-profit messaging platform that implemented Signal's data minimization architecture would sacrifice the advertising revenue and data monetization that generates commercial value in the current market. WhatsApp, despite using the Signal Protocol for message encryption, retains extensive metadata about users — who they communicate with, when, how frequently — because this information is valuable to Meta's advertising business. The technical protocol is private; the platform's relationship with user data is not.

Signal's nonprofit model is a feature, not merely a detail. The organizational structure that makes Signal sustainable — foundation funding, mission-driven development — is inseparable from the technical architecture that makes it private. An organization that needed advertising revenue to survive would not build what Signal built.

The Limits of Signal as a Model

Signal is an important existence proof: privacy by design, implemented seriously, is technically feasible. It provides private communications to over 100 million people. It has demonstrated through repeated law enforcement interactions that data minimization, as a structural commitment, produces systems that have genuinely little to surrender.

But Signal is not a blueprint for the entire communications ecosystem for several reasons:

Network effects: Signal's usefulness depends on whether the people you want to communicate with also use Signal. If your family, friends, and colleagues are on a different platform, Signal cannot secure those conversations. WhatsApp's two billion users create a network effect that Signal, with its hundred million, cannot match. For Signal to replace WhatsApp, the entire network would need to migrate: a coordination problem that market forces are unlikely to solve on their own.

Functionality trade-offs: Signal's data minimization comes at the cost of features that users expect from consumer messaging applications. No read receipts from outside your contacts. Limited media library. No algorithmic content surfacing. These are not bugs in Signal's design — they are the cost of privacy. Not all users are willing to pay that cost in reduced functionality.

Platform dependency: Signal runs on iOS and Android, which are controlled by Apple and Google respectively. Both companies have the technical ability to impose requirements on Signal, or remove it entirely, as a condition of distribution through their respective app stores. The privacy guarantees of a well-designed app cannot fully escape the platform layer on which it runs.

The metadata residual: Even with Signal's metadata protections, the fact that you use Signal at all is observable to your internet service provider. For people in authoritarian contexts, using an encrypted communications application can itself be a flag — "why do they need to hide their communications?" The chilling effect of being observed using privacy tools is a limit on privacy tools' effectiveness in high-risk contexts.

Discussion Questions

  1. Signal's privacy architecture depends on a nonprofit business model that does not require data monetization. Is this a scalable model for the broader technology industry? What would need to change in the technology economy for privacy-first design to be commercially sustainable at scale?

  2. Signal demonstrates that data minimization as a structural commitment means that law enforcement requests produce almost no responsive data. Does this concern you from a public safety perspective? How should society balance the privacy benefits of data minimization with the law enforcement costs?

  3. WhatsApp uses the Signal Protocol for message encryption but retains extensive metadata. What does this illustrate about the difference between implementing a privacy technology and adopting privacy by design as a structural commitment?

  4. The chapter argues that Apple's App Tracking Transparency framework demonstrated that when users are given meaningful choices about privacy, most choose privacy. Does the gap between Signal's 100 million users and WhatsApp's two billion contradict this? How do you reconcile the two observations?

  5. If you were advising a government agency considering adoption of Signal for internal communications, what questions would you want answered before recommending adoption? What additional protections would be appropriate for government use?