Case Study: Privacy Norms in Crisis — COVID-19 Contact Tracing
"In a pandemic, there is no such thing as individual health. My health depends on your health, and your health depends on mine. The same is true of privacy." — A public health ethicist, 2020
Overview
When COVID-19 swept across the world in early 2020, governments, technology companies, and public health agencies faced an urgent question with profound privacy implications: Could digital contact tracing help slow the spread of a lethal virus — and if so, at what cost to informational privacy?
Within weeks, dozens of countries deployed or proposed contact tracing applications. The designs varied enormously, from centralized government databases that tracked citizens' movements in real time to decentralized cryptographic protocols that revealed no personal information to any central authority. The debates were intense, the stakes were life and death, and the decisions made under crisis conditions would shape the relationship between public health and digital privacy for years to come.
This case study examines the COVID-19 contact tracing episode through the privacy theories of Chapter 7 — asking not just whether contact tracing violated privacy, but how different designs navigated the tension between collective health and individual informational rights, and what the episode reveals about privacy norms under crisis.
Skills Applied:

- Applying Nissenbaum's contextual integrity framework to a rapidly evolving real-world situation
- Evaluating trade-offs between privacy and competing social values (public health, collective safety)
- Comparing centralized and decentralized technical architectures from a privacy perspective
- Analyzing how crisis conditions affect the negotiation of informational norms
The Situation: A Virus and a Dilemma
Traditional Contact Tracing
Contact tracing is not new. For over a century, public health authorities have used manual contact tracing to contain infectious disease outbreaks: when a person is diagnosed with a communicable disease, trained workers identify and notify people who may have been exposed, advising them to quarantine and get tested.
Manual contact tracing operates within well-established informational norms. A public health worker asks a diagnosed person: "Who were you in close contact with in the past 14 days?" The patient shares names and contact information — sensitive data, but shared within the healthcare/public health context, with a clear purpose (preventing further transmission), and governed by public health confidentiality laws. The informational norms are:
- Type: Exposure-relevant contact information
- Subject: The diagnosed individual and their contacts
- Sender: The patient (typically voluntarily)
- Recipient: Public health authorities
- Transmission principle: Containment of communicable disease, governed by public health law
These norms have been refined over decades and are well-understood by the populations they serve. They are contextually appropriate for the public health domain.
The Digital Turn
COVID-19 overwhelmed manual contact tracing. The virus spread faster than human tracers could work. Asymptomatic transmission meant that many infected individuals did not know they were infectious. And the scale — millions of cases per week at the pandemic's peak — exceeded the capacity of every country's public health workforce.
Digital contact tracing promised to fill this gap. Using smartphones' Bluetooth signals, GPS, or both, an app could automatically record proximity events between devices. When a user tested positive for COVID-19, the system could rapidly notify everyone who had been in close contact — potentially within minutes rather than the days required for manual tracing.
The promise was real. The privacy questions were immediate.
The Architectures: Centralized vs. Decentralized
The most consequential design decision was the choice between centralized and decentralized architectures. This was not merely a technical question — it was a privacy architecture question that determined what data was collected, by whom, and under whose control.
Centralized Models
In a centralized system, a government server collects and stores the proximity data. When a user tests positive, their contact data is uploaded to the central server, which identifies exposed contacts and sends notifications.
Examples:

- Singapore's TraceTogether (initial version): Users' Bluetooth contact logs were encrypted and stored locally, but when a user tested positive, the logs were uploaded to the Ministry of Health, which decrypted them and conducted contact tracing. The government held a central database of contact events.
- Australia's COVIDSafe: Built on Singapore's BlueTrace protocol. Contact data was encrypted and stored on users' phones for 21 days. When a user tested positive and consented, data was uploaded to a central server managed by the Digital Transformation Agency. State health authorities could then access the data for contact tracing.
- China's Health Code system: Integrated with the Alipay and WeChat platforms, assigning citizens a color-coded health status (green, yellow, red) based on location history, travel records, and self-reported symptoms. The system was mandatory for access to public transportation, workplaces, and many public spaces. Data flowed to government servers with minimal transparency about retention or secondary use.
Privacy characteristics of centralized models:

- The government holds a database of who was near whom and when.
- The data can potentially be repurposed for non-public-health purposes (law enforcement, immigration enforcement, political surveillance).
- The system requires trust in the government's commitment to purpose limitation.
- Contact data is identifiable — the government knows which specific individuals were exposed.
Decentralized Models
In a decentralized system, all proximity data stays on users' devices. No central authority ever receives a database of contact events. Instead, when a user tests positive, their device broadcasts anonymous cryptographic tokens; other users' devices check these tokens against their own stored contact logs and determine locally whether an exposure occurred.
Examples:

- Apple/Google Exposure Notification (GAEN) system: Developed jointly by Apple and Google in April 2020 and integrated into the iOS and Android operating systems. Phones exchange random, rotating Bluetooth identifiers. When a user reports a positive diagnosis, their recent identifiers (called "diagnosis keys") are uploaded to a server. Other phones download these keys and check locally whether they had been in proximity. No central authority learns who was exposed — the notification happens entirely on the device. The system was adopted by public health authorities in Germany, Switzerland, Ireland, the United Kingdom (later version), and many U.S. states.
- DP-3T (Decentralized Privacy-Preserving Proximity Tracing): An academic protocol developed by a European research consortium that influenced the GAEN design. DP-3T was explicitly designed to minimize data collection and prevent mission creep.
Privacy characteristics of decentralized models:

- No central database of contact events exists.
- The government or public health authority never learns who was exposed to whom.
- The system is resistant to repurposing because the data needed for surveillance is never collected.
- Individual users retain control — they must actively choose to upload their diagnosis keys.
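The decentralized flow can be sketched in simplified form. The code below is an illustrative model only, not the actual GAEN cryptography: real implementations derive rotating identifiers from daily keys using HKDF and AES, while this sketch substitutes a plain hash. The point it demonstrates is structural — the only thing ever published is a diagnosis key, and the exposure match happens entirely on the receiving device.

```python
import hashlib
import secrets

def rolling_ids(daily_key: bytes, intervals: int = 144):
    """Derive rotating proximity identifiers from a daily key.
    (Illustrative stand-in: real protocols use HKDF/AES, not SHA-256.)"""
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Each phone broadcasts its rotating IDs and records the IDs it hears.
alice_key = secrets.token_bytes(16)
bob_heard = set(rolling_ids(alice_key)[40:44])  # Bob was near Alice for 4 intervals

# When Alice tests positive, she uploads only her daily key ("diagnosis key").
# No server ever sees who heard it.
published_diagnosis_keys = [alice_key]

def check_exposure(heard: set, diagnosis_keys: list) -> bool:
    """Re-derive identifiers from published keys and match locally, on-device."""
    return any(rid in heard
               for key in diagnosis_keys
               for rid in rolling_ids(key))

print(check_exposure(bob_heard, published_diagnosis_keys))  # True: exposure detected locally
```

Note the asymmetry this design buys: the server learns that *someone* was diagnosed, but the "who was near whom" relation exists only transiently, distributed across users' devices.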
The Battle of the Architectures
The choice between centralized and decentralized models became one of the most heated technology policy debates of 2020. The arguments crystallized into competing priorities:
Proponents of centralization argued:

- Public health authorities need identifiable data to conduct effective contact tracing — not just anonymous notifications.
- Centralized systems allow epidemiologists to analyze contact patterns, identify super-spreader events, and allocate resources.
- Democratic governments can be trusted with contact data under appropriate legal safeguards.
Proponents of decentralization argued:

- Centralized databases create irresistible targets for repurposing. History demonstrates that data collected for one purpose is routinely used for others.
- Many populations — immigrants, political dissidents, marginalized communities — have well-founded reasons not to trust government databases.
- A system that protects privacy by design (rather than by policy) is more robust, because it does not depend on the ongoing good faith of those in power.
- High adoption rates — essential for contact tracing effectiveness — require public trust, and trust requires strong privacy protections.
The debate was resolved, in practice, by Apple and Google's decision to support only the decentralized model in their operating systems. Countries that wanted centralized systems found that their apps performed poorly on iPhones (which restrict background Bluetooth access for privacy reasons). Several countries — including the United Kingdom, Germany, and Australia — eventually abandoned centralized designs and adopted the GAEN framework.
Privacy Theory Analysis
Nissenbaum's Contextual Integrity
The COVID-19 contact tracing debate is a textbook application of Nissenbaum's framework, and it reveals both the framework's power and its complexity.
The prevailing context: Public health. The established informational norms of public health include: patients sharing health-relevant information with health authorities for disease containment, governed by public health confidentiality laws, with purpose limitation to the specific health threat.
The centralized model's information flow:

- Type: Proximity data (who was near whom, when, for how long) — far more granular than traditional contact tracing's "who were you with?"
- Subject: All app users, not just diagnosed individuals
- Sender: Smartphone (automatically, continuously)
- Recipient: Government health authority (central server)
- Transmission principle: Pandemic containment — but with the technical capability for repurposing
Does this violate contextual integrity? The answer depends on whether automatic, continuous proximity logging by a government server conforms to the norms of the public health context. Traditional contact tracing collects exposure data after diagnosis, from the diagnosed patient, about specific contacts. Centralized digital tracing collects proximity data from everyone, continuously, before anyone is diagnosed. The type (continuous proximity logs vs. recalled contacts), the scope (all users vs. diagnosed patients), and the transmission principle (automated surveillance vs. patient-initiated disclosure) all represent significant departures from established norms.
The decentralized model's information flow:

- Type: Anonymous rotating Bluetooth identifiers
- Subject: All app users — but anonymously
- Sender: Smartphone (automatically)
- Recipient: Other nearby smartphones (no central recipient)
- Transmission principle: Exposure notification without identification
The decentralized model hews much closer to the established norms of the public health context: it provides exposure-relevant notifications (consistent with the purpose of contact tracing) while minimizing the collection of identifiable data (consistent with confidentiality norms). It is not a perfect match — automated, continuous Bluetooth exchange is different from a human conversation with a contact tracer — but the contextual breach is far smaller.
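The parameter-by-parameter comparison running through this analysis lends itself to a simple data structure, which may also be useful for the Option A mini-project below. This is a sketch: the field values are paraphrases of the analysis above, and reducing each parameter to a short string is obviously a simplification of Nissenbaum's framework.

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class InformationFlow:
    """The five parameters of a flow in Nissenbaum's contextual integrity framework."""
    info_type: str
    subject: str
    sender: str
    recipient: str
    transmission_principle: str

# Parameter values paraphrased from the analysis in this section.
traditional = InformationFlow(
    "recalled contacts", "diagnosed patient and named contacts",
    "patient", "public health authority", "patient-initiated disclosure")

centralized = InformationFlow(
    "continuous proximity logs", "all app users",
    "smartphone, automatic", "government central server", "automated upload on diagnosis")

decentralized = InformationFlow(
    "anonymous rotating identifiers", "all app users, anonymous",
    "smartphone, automatic", "other nearby phones", "on-device exposure notification")

def compare(flows: dict) -> str:
    """Tabulate the flows parameter by parameter for side-by-side review."""
    lines = []
    for f in fields(InformationFlow):
        cells = " | ".join(getattr(flow, f.name) for flow in flows.values())
        lines.append(f"{f.name:22} | {cells}")
    return "\n".join(lines)

print(compare({"traditional": traditional,
               "centralized": centralized,
               "decentralized": decentralized}))
```

The table the sketch prints makes the qualitative judgment visible: both digital models depart formally from the traditional norms, but the decentralized column departs in ways that preserve the context's purpose (exposure notification) while dropping identifiability.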
Westin: Privacy Under Crisis
Westin's framework helps explain why contact tracing provoked such intense debate. All four states of privacy were implicated:
- Solitude: Centralized systems that tracked location (like China's Health Code) invaded solitude by monitoring individuals in their most private spaces.
- Intimacy: Contact tracing, by definition, reveals who you spend time with in close proximity — information that maps intimate relationships, social networks, and private associations.
- Anonymity: The fundamental question separating centralized from decentralized systems was whether users could be anonymous. In centralized systems, they could not. In decentralized systems, anonymity was the core design principle.
- Reserve: Individuals who wished to keep their health status private (whether diagnosed, exposed, or simply concerned) found that participation in contact tracing required disclosing — or risked revealing — information they preferred to withhold.
The "Nothing to Hide" Argument in Pandemic Context
The pandemic gave the "nothing to hide" argument new force. "If you don't have COVID, why do you care if your contacts are traced?" became a common refrain. The urgency of the crisis — hundreds of thousands of deaths — made privacy concerns seem secondary, even selfish.
But several of the seven responses from Section 7.4.2 apply directly:
- Response 1 (Privacy protects more than secrecy): Contact tracing data reveals social networks, intimate relationships, and daily patterns. A person may not be "hiding" anything and still have legitimate reasons to keep their social graph private.
- Response 2 (Future uses): Data collected for pandemic response could be retained and repurposed. Israel's Shin Bet security service used cell phone location data for COVID contact tracing — the same surveillance infrastructure previously used to track Palestinian militants. The precedent of crisis-era surveillance tools persisting beyond the crisis is well-documented.
- Response 3 (Power dynamics): Mandatory contact tracing disproportionately burdens populations who have historical reasons to distrust government surveillance — undocumented immigrants, political dissidents, religious minorities, and communities of color.
- Response 6 (Social value): Even if individuals are willing to be traced, the normalization of digital contact tracking as a public health tool creates infrastructure that can be repurposed for non-health surveillance, affecting society as a whole.
- Response 7 (Burden of proof): The question is not whether citizens must justify wanting privacy during a pandemic, but whether governments must justify the specific degree of surveillance they propose as necessary and proportionate.
Outcomes and Lessons
Effectiveness
The evidence on digital contact tracing effectiveness is sobering. Most contact tracing apps struggled with adoption — the systems required a significant percentage of the population to participate to be effective (estimates ranged from 60% to 80%), and most countries fell far short. In countries where adoption was voluntary, rates typically ranged from 10% to 40%. Factors limiting adoption included:
- Privacy concerns (the dominant reason in surveys across multiple countries)
- Lack of trust in government or technology companies
- Technical barriers (older phones, limited Bluetooth functionality)
- Skepticism about effectiveness
- Unequal access (not everyone owns a smartphone)
Countries with centralized, mandatory systems (notably China and South Korea) achieved higher adoption but at significant privacy costs — and under very different political conditions than liberal democracies.
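The sensitivity of effectiveness to adoption follows from simple arithmetic: a contact is traceable only if both parties run the app, so coverage of contacts scales roughly with the square of the adoption rate. The sketch below assumes uptake is independent and uniformly distributed across the population — an assumption the equity discussion in this case study complicates.

```python
def contact_coverage(adoption: float) -> float:
    """Fraction of person-to-person contacts in which BOTH parties run the app,
    assuming independent, uniform uptake across the population."""
    return adoption ** 2

for rate in (0.2, 0.4, 0.6, 0.8):
    print(f"{rate:.0%} adoption -> {contact_coverage(rate):.0%} of contacts traceable")
# 20% adoption -> 4% of contacts traceable
# 40% adoption -> 16% of contacts traceable
# 60% adoption -> 36% of contacts traceable
# 80% adoption -> 64% of contacts traceable
```

This quadratic relationship is why the 10-40% voluntary adoption rates observed in practice translated into such modest epidemiological impact: at 30% adoption, fewer than one in ten contacts could even potentially be notified.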
The Privacy-Trust Paradox
The contact tracing episode revealed a paradox: the more privacy-invasive the system, the less likely people were to trust and use it — undermining the public health effectiveness that justified the privacy invasion. Countries that adopted the privacy-preserving GAEN framework generally saw higher voluntary adoption rates than those that initially deployed centralized systems. Privacy protection and public health effectiveness were not, as many assumed, in opposition — they were mutually reinforcing.
This lesson has broader implications for data governance: systems that respect informational norms earn greater trust, and greater trust produces greater participation, which produces better outcomes for everyone.
Sunset Provisions and Mission Creep
Several countries included sunset provisions in their contact tracing legislation — requirements that the systems be dismantled and data deleted after the pandemic ended. Germany's Corona-Warn-App was shut down in June 2023. Australia's COVIDSafe was decommissioned in August 2022 with all data deleted from the central server.
Other countries were less diligent. China's Health Code system, initially deployed for pandemic containment, was repurposed in some jurisdictions to restrict the movement of citizens attempting to travel to Beijing to file petitions against local government officials — a use entirely unrelated to public health. This confirmed the fears of decentralization advocates: data collected under crisis conditions can be repurposed when the crisis passes, and the infrastructure of surveillance, once built, is rarely dismantled voluntarily.
Cross-Cultural Dimensions
The contact tracing debate exposed the cross-cultural privacy tensions described in Section 7.5 in vivid terms:
- European countries generally adopted or mandated the decentralized GAEN framework, consistent with the GDPR's emphasis on data minimization and purpose limitation. Germany was an early and vocal proponent of decentralization.
- East Asian countries deployed a wider range of approaches, from South Korea's aggressive (and effective) use of CCTV footage, credit card records, and phone location data to trace contacts, to Japan's more privacy-cautious COCOA app.
- The United States had no national contact tracing app; individual states deployed GAEN-based apps with varying levels of promotion and adoption, reflecting the American sectoral approach.
- African countries faced a different challenge entirely: in many nations, smartphone penetration was too low for app-based contact tracing to work, forcing reliance on manual methods and highlighting the digital divide's impact on pandemic response.
Each approach reflected the privacy norms, governance structures, and political cultures described in the chapter — and each produced different trade-offs between surveillance, effectiveness, and rights.
Discussion Questions
- The design question. If you were designing a contact tracing system for a future pandemic, would you choose a centralized or decentralized architecture? Justify your choice using Nissenbaum's contextual integrity framework. Under what conditions might you choose differently?
- The proportionality question. Public health emergencies create genuine tension between individual privacy and collective safety. Using the five reasons privacy matters from Section 7.6, evaluate whether any of those reasons should be temporarily overridden during a severe pandemic. If so, which ones, and under what safeguards?
- The trust paradox. The case study argues that privacy protection and public health effectiveness were "mutually reinforcing" rather than opposed. Does this finding settle the debate, or are there scenarios where a more privacy-invasive system would produce genuinely better health outcomes — and, if so, would the trade-off be justified?
- Mission creep. China's repurposing of the Health Code system for non-health surveillance confirms Response 2 to the "nothing to hide" argument: data collected for one purpose can be used for another. Should all crisis-era surveillance systems include mandatory sunset provisions? What mechanisms would enforce them?
- The equity gap. Digital contact tracing required a smartphone. In many countries, the people most vulnerable to COVID-19 — the elderly, the poor, essential workers — were least likely to own or regularly use smartphones. How does this affect the ethical evaluation of digital contact tracing as a public health tool? Connect your analysis to the equity argument in Section 7.6.
Your Turn: Mini-Project
Option A: Contextual Integrity Comparison. Select two countries' COVID-19 contact tracing apps (one centralized, one decentralized). Research each system's data flows in detail. Apply Nissenbaum's five-parameter framework to both systems and write a 750-1,000 word comparative analysis. Which system better preserved contextual integrity? What were the trade-offs?
Option B: Sunset Audit. Research what happened to the contact tracing infrastructure in a specific country after the acute phase of the pandemic ended. Was the app decommissioned? Was data deleted? Were sunset provisions enforced? If the infrastructure persists, what is it being used for? Write a 500-750 word audit report connecting your findings to the chapter's discussion of why privacy matters.
Option C: Future Pandemic Protocol. Design a privacy-respecting digital public health response protocol for a future pandemic. Your protocol should address: (1) what data is collected, (2) who holds it and for how long, (3) what technical architecture you would use (centralized, decentralized, or hybrid), (4) what legal safeguards are required, and (5) how you would address the equity gap for populations without smartphones. Write your protocol in 1,000-1,500 words, referencing at least two privacy theories from Chapter 7.
References
- Apple and Google. "Exposure Notification: Bluetooth Specification." April 2020. Available at https://covid19.apple.com/contacttracing.
- Bay, Jason, Joel Kek, Alvin Tan, et al. "BlueTrace: A Privacy-Preserving Protocol for Community-Driven Contact Tracing Across Borders." Government Technology Agency, Singapore, 2020.
- European Parliament. "COVID-19 Tracing Apps: Ensuring Privacy and Data Protection." Briefing, May 2020.
- Kahn, Jeffrey P., and the Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing. Digital Contact Tracing for Pandemic Response: Ethics and Governance Guidance. Baltimore: Johns Hopkins University Press, 2020.
- Mozur, Paul, Raymond Zhong, and Aaron Krolik. "In Coronavirus Fight, China Gives Citizens a Color Code, with Red Flags." The New York Times, March 1, 2020.
- Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford: Stanford University Press, 2010.
- Sharon, Tamar. "Blind-Sided by Privacy? Digital Contact Tracing, the Apple/Google API and Big Tech's Newfound Role as Global Health Policy Makers." Ethics and Information Technology 23, Supplement 1 (2021): 45-57.
- Troncoso, Carmela, et al. "Decentralized Privacy-Preserving Proximity Tracing (DP-3T)." White Paper, April 2020.
- Zastrow, Mark. "Coronavirus Contact-Tracing Apps: Can They Slow the Spread of COVID-19?" Nature 583 (2020): 17-18.