
Learning Objectives

  • Trace the evolution of privacy as a legal and philosophical concept from the 19th century to the present
  • Compare and contrast at least four major theories of privacy
  • Apply Nissenbaum's contextual integrity framework to evaluate a data practice
  • Distinguish between privacy as secrecy, privacy as control, privacy as contextual integrity, and privacy as a social value
  • Evaluate cross-cultural differences in privacy norms and expectations
  • Articulate why privacy matters even in an age of pervasive data collection

Chapter 7: What Is Privacy? Definitions and Debates

"They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety." — Benjamin Franklin (attributed), Pennsylvania Assembly: Reply to the Governor (1755)

Chapter Overview

At the end of Chapter 1, Dr. Adeyemi silenced a classroom by asking whether anyone would be comfortable having their complete search history projected on a screen. The discomfort was immediate and universal. But why?

If asked to define privacy, most people hesitate. They know they want it, they know when it's been violated, but articulating exactly what it is — and why it matters — proves surprisingly difficult. Is privacy the right to be left alone? The ability to control your personal information? A social norm that varies by context? A necessary condition for democracy?

The answer, as this chapter will demonstrate, is all of these and more. Privacy is not a single concept but a family of related concerns that cluster around a core insight: human flourishing requires some degree of control over the boundary between the self and the world.

In this chapter, you will learn to:

  • Navigate the major theories of privacy and identify their strengths and limitations
  • Apply contextual integrity to evaluate whether a data practice violates privacy norms
  • Respond substantively to the "nothing to hide" argument
  • Recognize how cultural context shapes privacy expectations
  • Connect privacy theory to the practical governance challenges of Parts 2-4


7.1 The Origin Story: Warren and Brandeis

7.1.1 "The Right to Privacy" (1890)

The modern legal concept of privacy in the United States begins with a law review article. In December 1890, Samuel Warren and Louis Brandeis (later a Supreme Court Justice) published "The Right to Privacy" in the Harvard Law Review. The catalyst was personal: Warren was annoyed by Boston society page reporters covering his family's social events.

Their argument was revolutionary for its time. Warren and Brandeis argued that existing tort law — designed for physical injuries and property violations — was inadequate to protect against a new kind of harm: the exposure of private life through the emerging technologies of the day (principally, portable cameras and mass-circulation newspapers).

They defined privacy as "the right to be let alone" — a right rooted not in property but in personality, in what they called "inviolate personality."

Connection: Notice how the pattern mirrors Chapter 2's historical dynamics: a new technology (cameras, newspapers) enables new forms of intrusion; existing governance mechanisms (property law) prove inadequate; and new legal concepts must be invented. Today's equivalent — smartphones, social media, AI — follows the same pattern at exponentially greater scale.

7.1.2 Limitations of "The Right to Be Let Alone"

Warren and Brandeis's definition captured something important, but it has significant limitations for the data age:

  • Passive framing. "Being let alone" suggests privacy is about withdrawal — building a wall between yourself and the world. But in an interconnected society, complete withdrawal is neither possible nor desirable. We want to participate in social media, use health apps, and access digital services — while still having some degree of privacy.
  • Individual focus. The framework treats privacy as an individual right against individual intrusion. It doesn't address structural surveillance, collective privacy (a community's privacy), or the asymmetric power dynamics that characterize modern data systems.
  • Binary framing. You're either "let alone" or you're not. There's no framework for partial disclosure, contextual appropriateness, or graduated access.

7.2 Privacy as Control: Westin's Framework

7.2.1 Four States of Privacy

Alan Westin, in Privacy and Freedom (1967), redefined privacy as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others."

This shifts the emphasis from being left alone to controlling information flow. Westin identified four states of privacy:

  1. Solitude: Being free from observation by others
  2. Intimacy: Being in a small, selected group where members can be candid
  3. Anonymity: Being in public but free from identification and surveillance
  4. Reserve: The psychological barrier against unwanted intrusion — the right to withhold aspects of yourself

7.2.2 Applying Westin to Digital Life

Westin's framework maps onto digital privacy concerns:

Westin's State | Digital Equivalent | Under Threat From
Solitude | Private digital spaces | Smart home devices, always-on microphones
Intimacy | Private messaging, small group chats | Platform data collection from "private" conversations
Anonymity | Browsing without tracking | Cookies, device fingerprinting, facial recognition in public
Reserve | Choosing what to disclose | Inference engines that deduce undisclosed attributes

The last row is particularly significant. In Westin's era, reserve meant you could choose not to tell someone you were pregnant. In the data age, Target's predictive analytics can infer pregnancy from purchase patterns — deducing what you chose not to disclose. The right of reserve is undermined not by direct surveillance but by inference.

Mira encountered this at VitraMed. "We can predict with 80% accuracy which patients will be diagnosed with depression within six months, based on their EHR visit patterns — even if they've never reported mental health symptoms," she told Eli. "That's useful for early intervention. But it also means we know something about them that they haven't chosen to share."

"And what happens when that prediction reaches their insurance company?" Eli asked.


7.3 Contextual Integrity: Nissenbaum's Revolution

7.3.1 The Core Insight

Helen Nissenbaum's contextual integrity framework, articulated in her 2010 book Privacy in Context, represents the most influential reconceptualization of privacy in the digital age.

Nissenbaum's insight is that privacy is not about secrecy or control — it's about appropriate flow of information within social contexts. Every social context (healthcare, education, friendship, commerce, law enforcement) has established informational norms that govern:

  1. What information is appropriate to share in that context (the type of information)
  2. About whom (the subject)
  3. By whom (the sender)
  4. To whom (the recipient)
  5. Under what conditions (the transmission principle)

A privacy violation occurs when information flows in ways that breach these contextual norms — even if the information is not "secret" in any absolute sense.

7.3.2 The Power of the Framework

Consider these examples:

Not a privacy violation: Your doctor shares your medical records with a specialist you've been referred to. The information (medical), the sender (your doctor), the recipient (a specialist), and the transmission principle (referral for treatment) all conform to the norms of the healthcare context.

A privacy violation: Your doctor shares your medical records with a marketing company. The same information, the same sender — but the recipient and the transmission principle violate the healthcare context's norms. Even if you signed a consent form buried in a stack of paperwork, the contextual integrity framework identifies this as a privacy violation because it breaches the expected norms of the doctor-patient relationship.

The digital puzzle: You share your location with a ride-sharing app so it can pick you up. The app stores your location history and sells it to a data broker, which sells it to a bail bond company that uses it to track people who have skipped court appearances. Each individual transaction may have technical "consent" — but the overall flow violates the contextual norms of the original context (transportation).

Intuition: Contextual integrity explains why data practices can feel wrong even when they're technically legal. When Facebook used data from your social graph to target political advertising, it didn't steal anything. But it moved information from the context of friendship (where you shared it) to the context of political persuasion (where you didn't intend it to go). The violation is the context breach, not the data collection per se.

7.3.3 Applying the Framework: A Step-by-Step Method

To evaluate a data practice using contextual integrity:

  1. Identify the prevailing context. What social domain is the data practice operating in? (Healthcare, education, commerce, law enforcement, friendship?)
  2. Identify the existing informational norms. What information flows are expected in this context? Who normally sends what to whom, under what conditions?
  3. Describe the new practice. What information is flowing? From whom to whom? Under what conditions?
  4. Compare. Does the new practice conform to established norms, or does it breach them?
  5. Evaluate the breach. If norms are breached, is the breach justified by compelling values or interests? Or does it undermine the social functions that the original norms served?
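The five parameters and the comparison step above can be modeled as a simple information-flow record checked against a set of contextual norms. The following sketch is illustrative only: the context names, norm entries, and the `Flow`/`conforms` helpers are hypothetical, not part of Nissenbaum's framework itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One information flow, described by the five parameters of contextual integrity."""
    context: str      # prevailing social context, e.g. "healthcare" (step 1)
    info_type: str    # what type of information is flowing
    subject: str      # whom the information is about
    sender: str       # who transmits it
    recipient: str    # who receives it
    principle: str    # transmission principle, e.g. "referral for treatment"

# Hypothetical norms for the healthcare context (step 2): each entry is an
# (info_type, sender, recipient, principle) combination treated as appropriate.
HEALTHCARE_NORMS = {
    ("medical record", "doctor", "specialist", "referral for treatment"),
    ("medical record", "doctor", "patient", "disclosure to subject"),
}

def conforms(flow: Flow, norms) -> bool:
    """Step 4: compare the described flow (step 3) against the established norms."""
    return (flow.info_type, flow.sender, flow.recipient, flow.principle) in norms

# The two examples from Section 7.3.2:
referral = Flow("healthcare", "medical record", "patient", "doctor",
                "specialist", "referral for treatment")
marketing = Flow("healthcare", "medical record", "patient", "doctor",
                 "marketing company", "commercial sale")

print(conforms(referral, HEALTHCARE_NORMS))   # True: flow matches an established norm
print(conforms(marketing, HEALTHCARE_NORMS))  # False: recipient and principle breach the norms
```

A non-conforming flow is only a candidate violation; step 5, deciding whether the breach is justified by compelling values, remains a normative judgment that no lookup table can automate.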

7.4 The "Nothing to Hide" Argument Revisited

7.4.1 The Argument and Its Appeal

We first encountered the "nothing to hide" argument in Chapter 1. It deserves a more thorough examination now that we have theoretical tools.

The argument takes several forms:

  • "I have nothing to hide, so I have nothing to fear from surveillance"
  • "Only people who are doing something wrong need privacy"
  • "If giving up some privacy makes us safer, it's worth it"

The argument appeals because it seems pragmatic and unselfish. It positions the speaker as a good citizen with a clear conscience.

7.4.2 Seven Responses

Drawing on the work of Daniel Solove (Nothing to Hide: The False Tradeoff Between Privacy and Security, 2011), here are seven responses:

1. Privacy is not just about hiding wrongdoing. Privacy protects:

  • Autonomy (the freedom to make choices without external pressure)
  • Intellectual freedom (the freedom to explore ideas without surveillance)
  • Intimate relationships (the freedom to be vulnerable with chosen people)
  • Political dissent (the freedom to hold and express unpopular views)
  • Personal development (the freedom to make mistakes and change without permanent records)

2. You cannot predict what will be considered wrongdoing in the future. Data collected today will be interpreted by governments and institutions that may hold very different values tomorrow. People who were "doing nothing wrong" have been persecuted by subsequent regimes using data collected under previous ones.

3. The argument ignores power dynamics. "Nothing to hide" assumes benevolent, competent institutions that use data fairly. History demonstrates that this assumption is often unwarranted.

4. Aggregation changes the equation. Each individual piece of data may seem innocuous. Your grocery list is not secret. Your location at 3 p.m. is not secret. Your phone records are not secret. But aggregated, these data points reveal intimate patterns of life that you would not voluntarily disclose.

5. The argument is self-refuting. As Snowden observed: "Arguing that you don't care about privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say."

6. Privacy is a social value, not just an individual preference. Even if you are comfortable being surveilled, pervasive surveillance affects society: it chills dissent, discourages whistleblowing, homogenizes behavior, and empowers authoritarian tendencies.

7. The burden of proof is backwards. In a democratic society, the government must justify its intrusions on liberty; citizens need not justify their desire for privacy.


7.5 Privacy Across Cultures

7.5.1 Not Universal, But Not Arbitrary

Privacy norms vary significantly across cultures, but this variation is not random. It follows the social structures, power dynamics, and historical experiences of different communities.

Cultural Context | Privacy Emphasis | Example
European | Strong individual data rights; privacy as a fundamental right | GDPR; German right to informational self-determination
American | Sectoral; privacy balanced against free speech and commercial interests | No comprehensive federal privacy law; First Amendment tensions
East Asian | Privacy balanced with social harmony and collective interests | Japan's APPI balances individual and group interests
Middle Eastern | Privacy tied to family honor and modesty norms | Different boundaries around personal vs. family data
African | Growing frameworks; emphasis on communal data governance | AU Data Policy Framework acknowledges community data rights
Indigenous | Collective rights; data as relational | CARE Principles (Ch. 3); data about a community belongs to that community

7.5.2 Cross-Cultural Tensions in Global Data Governance

These differences create real governance challenges. When a European user's data is stored on a U.S. server, which privacy norms apply? When an AI trained on Western data is deployed in an African context, whose informational norms govern? When a Chinese platform collects data from users worldwide, which regulatory framework takes precedence?

These questions — which we'll examine in depth in Chapters 20, 23, and 37 — cannot be resolved by asserting the universality of any single privacy tradition. They require cross-cultural negotiation, mutual recognition, and governance frameworks flexible enough to accommodate difference.


7.6 Why Privacy Matters: A Synthesis

Drawing on the theories examined in this chapter, privacy matters because:

  1. Autonomy. Without some control over personal information, individuals cannot make free choices — they are subject to manipulation, coercion, and judgment by those who know more about them than they know about themselves (Section 7.2).

  2. Democracy. Democratic self-governance requires spaces for private deliberation, political association, and dissent. Pervasive surveillance chills all three (Sections 7.4, 7.5).

  3. Social trust. Functioning social contexts — healthcare, education, friendship, commerce — depend on appropriate informational norms. When those norms are violated, trust erodes, and the social functions suffer (Section 7.3).

  4. Dignity. Being reduced to a data profile — having your life quantified, categorized, and predicted by systems you don't control — violates the dignity that ethical frameworks (Chapter 6) identify as central to moral personhood.

  5. Equity. Privacy violations disproportionately harm marginalized communities — the surveilled, the profiled, the scored — while privileging those with the resources to protect themselves (Chapter 5).


7.7 Chapter Summary

Key Concepts

  • Warren & Brandeis (1890): Privacy as "the right to be let alone" — influential but limited in the data age
  • Westin (1967): Privacy as informational control — four states: solitude, intimacy, anonymity, reserve
  • Nissenbaum (2010): Privacy as contextual integrity — violations occur when information flows breach established contextual norms
  • Informational self-determination (1983): German Constitutional Court's framing of data protection as a fundamental right
  • Privacy matters for autonomy, democracy, social trust, dignity, and equity

Key Debates

  • Is privacy a universal human right or a culturally contingent preference?
  • Is the "nothing to hide" argument ever valid?
  • Does contextual integrity adequately address contexts that are themselves unjust?
  • How should privacy be balanced against public safety, free speech, and innovation?

Applied Framework

To evaluate a data practice: (1) identify the context, (2) identify existing informational norms, (3) describe the data flow, (4) compare against norms, (5) evaluate whether any breach is justified.


What's Next

In Chapter 8: Surveillance — From Panopticon to Platform, we'll examine the mechanisms and technologies of surveillance in depth — from Bentham's original design through CCTV, NSA mass surveillance, facial recognition, and platform dataveillance. We'll see how the theoretical concepts from this chapter play out in the concrete practices of watching, tracking, and recording.

Before moving on, complete the exercises and quiz.


Chapter 7 Exercises → exercises.md

Chapter 7 Quiz → quiz.md

Case Study: The Carpenter v. United States Decision → case-study-01.md

Case Study: Privacy Norms in Crisis — COVID-19 Contact Tracing → case-study-02.md