Case Study: The Encryption Wars — Apple vs. FBI
"If you put a key under the mat for the cops, a burglar can find it, too." — Tim Cook, CEO of Apple (2016)
Overview
On December 2, 2015, Syed Rizwan Farook and Tashfeen Malik carried out a mass shooting in San Bernardino, California, killing 14 people and injuring 22 others. In the investigation that followed, the FBI recovered Farook's iPhone 5C — issued by his employer, San Bernardino County — but could not access its contents because the phone was locked with a passcode and encrypted.
The FBI asked Apple for assistance. Apple provided data from Farook's iCloud backups (which Apple could access because iCloud data was stored on Apple's servers), but the most recent backup was six weeks old. The FBI wanted access to data on the phone itself — data that, due to Apple's encryption architecture, even Apple could not decrypt.
On February 16, 2016, a federal magistrate judge in California ordered Apple, under the All Writs Act of 1789, to create a specialized version of iOS — a tool Apple's lawyers would later dub "GovtOS" — that would disable the security features preventing the FBI from brute-forcing the phone's passcode. Apple refused. CEO Tim Cook published an open letter to customers, calling the order "an unprecedented step which threatens the security of our customers."
The confrontation between Apple and the FBI became the most visible battle in the broader "encryption wars" — the ongoing conflict between governments seeking access to encrypted communications and technology companies building systems designed to prevent exactly that access.
Skills Applied:
- Analyzing multi-stakeholder conflicts where legitimate interests collide
- Evaluating technical arguments about encryption and security architecture
- Applying governance frameworks to a case with no easy resolution
- Connecting a specific legal conflict to systemic data governance questions
The Technical Context
How iPhone Encryption Works
Understanding the dispute requires understanding the technical architecture:
Hardware-bound encryption. Starting with the iPhone 5S (2013), Apple devices included a Secure Enclave — a dedicated processor that manages encryption keys. The encryption key for the device's data is derived from a combination of the user's passcode and a unique device-specific key (the UID) that is fused into the Secure Enclave hardware during manufacturing. Apple does not have a copy of this key and cannot extract it from the device.
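A minimal sketch of the design's shape in Python may help: PBKDF2 stands in here for Apple's hardware tangling function (the real derivation runs inside the AES engine, and the UID is never visible to software), and the constants and names are illustrative assumptions, not Apple's values.

```python
import hashlib, os

# Stand-in for the 256-bit UID key fused into the silicon at
# manufacture; on a real device this value is not readable by software.
DEVICE_UID = os.urandom(32)

def derive_data_key(passcode: str) -> bytes:
    """Entangle the user's passcode with the device-unique secret."""
    # Because the UID never leaves the device, this derivation cannot be
    # moved to a data center and parallelized: every guess must be made
    # on the phone itself. The iteration count is calibrated so a single
    # derivation takes on the order of 80 ms on the device's hardware.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, 100_000, dklen=32
    )
```

The design choice matters for what follows: even a government with a warrant cannot derive the key anywhere except on the device, one guess at a time.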
Brute-force protections. iOS includes security features that prevent rapid guessing of the passcode: escalating time delays between incorrect attempts (up to one hour between attempts after multiple failures), and an optional setting that erases the device's encryption keys after ten failed attempts. On devices with a Secure Enclave, these protections are enforced in hardware; on the iPhone 5C at issue, which predated the Secure Enclave, they were enforced entirely by iOS software, which is precisely why a replacement operating system could defeat them.
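The policy itself is simple to express in code. The following Python sketch is illustrative only — the schedule values follow Apple's published iOS security documentation of the era, but the names, state handling, and `verify` hook are hypothetical — and it shows why the protections were only as strong as the software enforcing them:

```python
import time

# Approximate lockout schedule of the era: no delay for the first
# four failures, then escalating waits between attempts.
DELAY_AFTER_FAILURE = {5: 60, 6: 300, 7: 900, 8: 900}  # seconds
MAX_FAILURES = 10  # with the optional "Erase Data" setting enabled

class DeviceWiped(Exception):
    """Encryption keys destroyed; the data is permanently unrecoverable."""

def lockout_seconds(failures: int) -> int:
    """Delay imposed after the Nth consecutive failed attempt."""
    return 3600 if failures >= 9 else DELAY_AFTER_FAILURE.get(failures, 0)

def attempt_unlock(guess: str, state: dict, verify) -> bool:
    """One guarded passcode attempt; `verify` is the real key check."""
    if verify(guess):
        state["failures"] = 0
        return True
    state["failures"] += 1
    if state["failures"] >= MAX_FAILURES:
        raise DeviceWiped("keys erased after ten failed attempts")
    time.sleep(lockout_seconds(state["failures"]))  # enforce the lockout
    return False
```

GovtOS, in effect, was a request to ship a build in which `lockout_seconds` always returns zero and the erase branch never fires.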
What the FBI wanted. The FBI did not ask Apple to decrypt the phone. It asked Apple to create a custom version of iOS — loadable onto the specific phone via the Device Firmware Upgrade (DFU) mode — that would: (a) disable the auto-erase feature, (b) remove the escalating time delay between passcode attempts, and (c) allow the FBI to submit passcodes electronically rather than manually. With these modifications, the FBI could brute-force the four-digit passcode in minutes.
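The arithmetic behind "in minutes" is worth making explicit. With the delays and auto-erase removed, the only remaining cost per guess is the on-device key derivation itself, roughly 80 ms per attempt (treat the exact figure as an assumption for illustration):

```python
# Worst-case brute-force time once GovtOS-style modifications remove
# the lockouts and auto-erase; ~80 ms per guess is the hardware floor.
PER_GUESS_SECONDS = 0.08

for digits in (4, 6):
    worst_case = 10 ** digits * PER_GUESS_SECONDS  # seconds
    print(f"{digits}-digit passcode: ~{worst_case / 60:.0f} minutes worst case")

# Output:
# 4-digit passcode: ~13 minutes worst case
# 6-digit passcode: ~1333 minutes worst case
```

Each additional digit multiplies the worst case by ten, which is part of why iOS later moved to six-digit passcodes by default.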
What Apple Argued
Apple's resistance was based on several arguments:
The tool, once created, could not be confined to one phone. Apple argued that creating "GovtOS" would establish a precedent and a capability that could be demanded again — by US law enforcement in other cases, by foreign governments, and potentially by adversaries who obtained the tool through theft, legal compulsion, or espionage. "Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge," Cook wrote.
The request was technically a backdoor. Although the government framed its request narrowly (a tool for one phone in one investigation), the technical capability Apple was being asked to create — a version of iOS that disabled security protections — was functionally a master key. The government's assurance that the tool would be used "only once" was not technically enforceable.
Setting a global precedent. If the US government could compel Apple to undermine its security architecture, other governments would follow. China, Russia, and other nations would demand similar access — and would have the legal basis to do so, pointing to the US precedent. Apple's security architecture would become a tool of global government surveillance.
The All Writs Act was being stretched beyond its intent. The government invoked the All Writs Act of 1789 — a statute authorizing courts to issue orders "necessary or appropriate" to fulfill their functions. Apple argued that using a 227-year-old statute to compel a technology company to create a new software tool for the government was an unprecedented expansion of judicial power that should be decided by Congress, not by a magistrate judge.
The Government's Position
The FBI and the Department of Justice argued:
This is not about creating a backdoor — it is about executing a valid search warrant. The government had a lawful warrant to search the phone's contents. Apple was being asked to assist in executing that warrant, just as a landlord might be compelled to provide a key to a property subject to a search warrant.
The request is narrow and device-specific. The government emphasized that the order applied to one phone, in one investigation, related to a mass killing. The tool could be used under Apple's supervision and destroyed afterward.
Public safety demands access. Fourteen people were killed. The government had a legitimate — indeed, compelling — interest in investigating the attack and determining whether others were involved. Encryption that prevents law enforcement from accessing evidence in a terrorism investigation undermines public safety.
Apple chose to create this problem. Apple's encryption architecture was a design choice, not a law of nature. Apple could have designed its systems to allow lawful access. Having chosen to design systems that prevent even Apple from accessing data, Apple bore some responsibility for the consequences.
The Resolution — and What It Revealed
The FBI's Workaround
On March 21, 2016 — one day before a scheduled court hearing — the FBI asked the court to postpone the proceedings, informing it that an outside party had demonstrated a possible method for accessing the phone without Apple's assistance. A week later, on March 28, the government withdrew its legal motion entirely: the method had worked. The FBI reportedly paid an undisclosed vendor a sum Director James Comey later implied was around $1.3 million to exploit a vulnerability in the iPhone 5C's security architecture. The vendor was widely rumored at the time to be the Israeli firm Cellebrite; a 2021 Washington Post investigation identified it as the Australian firm Azimuth Security, though the government never officially confirmed who it was.
The phone's contents were ultimately accessed. The FBI reported that nothing of significant investigative value was found.
What the Resolution Revealed
The vulnerability market. The FBI's solution — purchasing an exploit from a private company — revealed the existence of a thriving market for security vulnerabilities. Companies like Cellebrite, NSO Group, and Zerodium discover (or purchase from independent researchers) vulnerabilities in consumer devices and sell access to law enforcement and intelligence agencies. This market creates perverse incentives: the existence of buyers for vulnerabilities reduces the incentive to report them to manufacturers for patching, leaving millions of devices exposed.
The specificity problem. The FBI's legal argument — that it needed Apple's help because there was no other way to access the phone — collapsed when a third party demonstrated that there was. This raised the question of whether the legal battle was genuinely about one phone, or whether the FBI had been seeking a legal precedent that would compel technology companies to build lawful access capabilities into their products.
The emptiness of the evidence. The fact that the phone contained nothing of investigative value was significant. The FBI had argued that accessing the phone was essential for public safety. The absence of useful evidence did not retroactively invalidate the argument — the phone might have contained crucial information — but it complicated the narrative that encryption was blocking vital intelligence.
The Broader Encryption Wars
The Apple vs. FBI case was one battle in a longer war. The encryption debate continued to intensify:
The EARN IT Act (US). Proposed legislation that would condition Section 230 liability protections on platforms' compliance with "best practices" for detecting child exploitation — practices that critics argued would be incompatible with end-to-end encryption.
The UK Online Safety Act (2023). Included provisions empowering the regulator (Ofcom) to require technology companies to use "accredited technology" to detect child exploitation material in encrypted messages. Signal and WhatsApp threatened to withdraw from the UK market rather than comply.
The EU Chat Control proposal. A proposed regulation requiring platforms to scan all messages — including encrypted ones — for child exploitation material. The proposal was highly controversial and faced opposition from privacy advocates, technologists, and several EU member states.
Australia's Assistance and Access Act (2018). Required technology companies to provide "technical assistance" to law enforcement, including the ability to modify systems to enable access to encrypted communications. The law was criticized by the technology industry as effectively mandating backdoors.
In each case, the pattern was the same: governments demanded access to encrypted communications, citing child safety or national security; technology companies and civil liberties organizations resisted, citing the technical impossibility of secure backdoors; and the debate remained unresolved because the underlying tension — between security through encryption and security through surveillance — admits no easy technical compromise.
Connecting to Data Governance Themes
The Apple vs. FBI case illuminates several of the textbook's recurring themes:
Power asymmetry. The case pitted one of the world's most powerful government agencies against one of the world's most powerful corporations. Ordinary citizens — whose data security depended on the outcome — had no seat at the table. The power asymmetry was between two powerful actors, but the consequences were borne by the powerless.
The consent fiction. Millions of iPhone users "consented" to Apple's security architecture by purchasing iPhones — but none of them consented to the risk that a government might compel Apple to undermine that architecture. Consent to a security model does not include consent to its dismantling.
The accountability gap. If the FBI had obtained the tool and it had leaked — exposing the security of every iPhone to exploitation — who would have been accountable? The FBI, for demanding the tool? Apple, for creating it under compulsion? The court, for ordering it? The accountability gap would have been total.
The VitraMed parallel. Mira noted the parallel to VitraMed's data governance challenges: "VitraMed built a system optimized for one purpose — health outcomes — and then faced pressure to use it for other purposes. Apple built a security system optimized for one purpose — user privacy — and then faced pressure to undermine it for another purpose. In both cases, the original design decisions created constraints that were later treated as obstacles rather than protections."
Discussion Questions
- The government argued that Apple "chose" to create an encryption architecture that prevents lawful access. Is this a valid framing? Should technology companies be held responsible for designing systems that comply with lawful search orders, or does the right to design secure systems override government convenience?
- The FBI ultimately accessed the phone through a purchased exploit. Does the existence of a commercial vulnerability market make the legal backdoor debate moot? Or does it make it more urgent?
- If you were an Apple engineer ordered to create the "GovtOS" tool, what would you do? How does the Practitioner's Oath (Chapter 40) apply to this scenario?
- The Apple vs. FBI case was about a dead suspect's employer-issued phone — arguably the strongest possible case for government access. Would you reach a different conclusion for a different scenario — such as a journalist's phone, or a political dissident's phone?
- Dr. Adeyemi asked: "If the government can compel a company to write code, what is the limiting principle? Can it compel a company to build a surveillance tool? A predictive policing algorithm? An algorithmic content filter?" Where should the line be drawn?
Further Investigation
- Read Tim Cook's open letter to Apple customers, "A Message to Our Customers" (February 16, 2016, available on Apple's website).
- Research the vulnerabilities and exploits market. What is the Wassenaar Arrangement, and how does it attempt to regulate the export of surveillance technologies?
- Compare the legal frameworks for compelling technology company assistance in the US (All Writs Act), UK (Investigatory Powers Act), and Australia (Assistance and Access Act). Which framework provides the strongest civil liberties protections?