In This Chapter
- Opening: The Problem with Rights You Can't Enforce
- 31.1 The Invention of Privacy as a Legal Right
- 31.2 The Fourth Amendment: Original Meaning and Digital Stress
- 31.3 The Third-Party Doctrine: Why It Erodes Privacy in the Digital Age
- 31.4 Statutory Privacy Frameworks: The Sectoral Approach
- 31.5 GDPR: The European Model
- 31.6 State Privacy Laws in the United States
- 31.7 International Human Rights Frameworks
- 31.8 Where U.S. Law Falls Short: The Case for Comprehensive Federal Privacy Legislation
- 31.9 How to Exercise Your Privacy Rights: A Practical Guide
- 31.10 Jordan's Story: Filing a Data Access Request
- 31.11 The Right to Privacy in the Digital Age: Where We Are and Where We're Going
- 31.12 Chapter Summary
- Key Terms
- Discussion Questions
Chapter 31: Privacy as a Right: Legal Frameworks and Constitutional Protections
"The right to be let alone — the most comprehensive of rights and the right most valued by civilized men."
— Justice Louis D. Brandeis, dissenting in Olmstead v. United States (1928)
Opening: The Problem with Rights You Can't Enforce
Jordan Ellis had been thinking about surveillance all semester. They had read about panopticons and data brokers, about mass interception programs and behavioral targeting, about how the warehouse where they worked tracked their movements to the nearest three feet. They understood the architecture — the cameras, the cookies, the corporate servers holding more personal information than any prior generation had ever surrendered.
But understanding the architecture is not the same as knowing your rights within it. One afternoon, after Dr. Osei's lecture on the legal history of privacy, Jordan pulled up a data broker website on their laptop. They searched their own name. Forty-seven results came back. Physical addresses going back five years. Estimated income. Relatives. Cell phone carrier. Social media profiles linked and compiled. Browsing habits inferred. A stranger could, for fourteen dollars, purchase a remarkably complete portrait of Jordan Ellis's life.
Jordan sat with this for a moment. Then they clicked on "opt out."
It took forty minutes, required separate forms on eleven different sites, demanded a photograph of Jordan's driver's license on three of them, and — as Jordan would discover — required repeating the process every ninety days because the data simply accumulated again.
This is the gap this chapter examines: the space between the legal concept of privacy and the lived reality of surveillance. That gap is not an accident. It is the product of specific legal decisions, legislative choices, and structural incentives that have shaped American and international privacy law over more than a century. Understanding that gap — its dimensions, its causes, its partial remedies — is essential preparation for the action chapters that follow.
31.1 The Invention of Privacy as a Legal Right
Privacy seems like it should be ancient. Every culture in human history has distinguished between public and private conduct. But privacy as a legal right — a claim you can bring before a court — is surprisingly recent. In the Anglo-American legal tradition, it dates precisely to 1890.
Warren and Brandeis: The Founding Document
In December 1890, Samuel D. Warren and Louis D. Brandeis published "The Right to Privacy" in the Harvard Law Review. The article is widely regarded as the most influential law review article ever written in the United States. Its argument was deceptively simple: the existing common law of property and contract could not adequately protect a new kind of harm that modern life was producing.
Warren, a wealthy Boston socialite, had reportedly grown furious at the Boston press's intrusive coverage of his family's social gatherings. He enlisted his law partner Brandeis — later a Supreme Court Justice — to write the legal architecture for a right that would protect personal dignity against unwanted exposure.
"The press is overstepping in every direction the obvious bounds of propriety and decency. Gossip is no longer the resource of the idle and of the vicious, but has become a trade, which is pursued with industry as well as effrontery. To satisfy a prurient taste the details of sexual relations are spread broadcast in the columns of the daily papers."
— Warren & Brandeis, "The Right to Privacy," Harvard Law Review, 1890
The article's central argument is elegant. Warren and Brandeis observed that the existing law of defamation protected people from false statements, and the law of contract protected people from breach of specific agreements. But there was no protection against the truthful, non-contractual disclosure of private matters — against someone simply publishing what you did in your own home, what you said at your dinner table, what you look like in an intimate moment.
Warren and Brandeis argued that this protection should be grounded not in property rights but in personhood — in the right to an "inviolate personality." Privacy, on their account, is what makes a person a person rather than an object of observation.
Their specific concerns seem almost quaint today: they worried about newspaper gossip columns and new portable cameras (the Kodak "snap camera" had been introduced in 1888). But the conceptual framework they constructed was durable enough to shape legal thinking about surveillance, social media, and data brokers more than a century later.
💡 Intuition Checkpoint: Warren and Brandeis were responding to technologies that felt invasive in their moment — portable cameras, mass-circulation newspapers. What technologies in your life feel similarly invasive in a way that existing law hasn't caught up with? Write a two-sentence version of the Warren/Brandeis argument applied to that technology.
The Four Privacy Torts
The Warren/Brandeis article eventually crystallized into four recognized privacy torts (civil wrongs) under American common law, systematized by William Prosser in 1960:
- Intrusion upon seclusion — physically or electronically intruding into someone's private space
- Public disclosure of private facts — publishing true but private information
- False light — publishing information that creates a misleading impression
- Appropriation — using someone's name or likeness for commercial benefit without consent
These torts matter for understanding the architecture of privacy law. They are civil remedies — you sue a person or company who wrongs you, after the harm has already occurred. They do not prevent surveillance in advance. They do not address the state. And they do not cover the massive-scale data collection that characterizes contemporary surveillance capitalism, because proving individual harm from data collection is extraordinarily difficult.
The four torts are a tool built for a world of newspaper gossip. They were not designed for CCTV networks, behavioral advertising ecosystems, or facial recognition databases.
31.2 The Fourth Amendment: Original Meaning and Digital Stress
The U.S. Constitution does not mention "privacy." What it does mention, in the Fourth Amendment, is the right of people to be "secure in their persons, houses, papers, and effects, against unreasonable searches and seizures." Under the warrant clause, warrants must be supported by probable cause and must particularly describe the place to be searched and the persons or things to be seized.
This was a response to British "general warrants" — writs of assistance that allowed royal officers to search any home or business for contraband without specifying what they were looking for. The Fourth Amendment was designed to prevent the surveillance state of colonial experience from reconstituting itself in the new republic.
For most of American history, Fourth Amendment doctrine was interpreted through a physical lens: did the government physically intrude on your property? This worked reasonably well in a world where evidence lived in physical spaces — papers in drawers, conversations in rooms, contraband in suitcases.
The twentieth century — and especially the digital age — put enormous pressure on this framework.
Katz v. United States (1967): The Reasonable Expectation of Privacy
Charles Katz was a gambler who used a public telephone booth in Los Angeles to place bets across state lines. The FBI attached a listening device to the outside of the phone booth — no physical intrusion into any space Katz owned — and recorded his conversations.
The Supreme Court held that the wiretap violated the Fourth Amendment. Justice Potter Stewart's majority opinion declared that "the Fourth Amendment protects people, not places." The physical trespass doctrine gave way to a two-part test articulated by Justice John Harlan in concurrence:
- The person must have exhibited a subjective expectation of privacy
- That expectation must be one that society is prepared to recognize as reasonable
The Katz test was a significant advance. It untethered Fourth Amendment analysis from pure property law. A person using a public phone booth has a reasonable expectation that their conversation is private, even though they're in a glass box on a public street.
But Katz also introduced a vulnerability: if your expectation of privacy is not "reasonable" — as defined by judges interpreting what society recognizes — then the Fourth Amendment doesn't apply. And what's "reasonable" can be shaped by what surveillance technologies become normalized.
🎓 Advanced Note: Legal scholars including Orin Kerr have argued that the Katz test is fundamentally circular: courts determine what expectations are "reasonable" partly by looking at what surveillance practices are common, but those practices become common partly because courts don't prohibit them. This creates a ratchet that can continuously erode privacy protections as surveillance becomes normalized. See Kerr, "Four Models of Fourth Amendment Protection" (2007).
Smith v. Maryland (1979): The Third-Party Doctrine
If Katz was the advance, Smith v. Maryland was the retreat. The case involved a man named Michael Lee Smith who had made threatening calls to a robbery victim, Patricia McDonough. Police installed a "pen register" — a device that records the phone numbers dialed from a telephone — at the phone company's offices. No warrant was obtained. Smith was identified and convicted.
The Supreme Court held that Smith had no Fourth Amendment protection for the numbers he dialed, because he had voluntarily conveyed that information to the phone company. The reasoning was straightforward: when you dial a number, you know the phone company records it for billing and routing purposes. By using the phone company's service, you assumed the risk that the company might share that information with the government.
This is the third-party doctrine: information voluntarily shared with a third party (a company, a service provider) loses Fourth Amendment protection. The government can obtain that information from the third party without a warrant.
The third-party doctrine made sense — or at least was defensible — in 1979, when few people had meaningful understanding of how phone companies worked and most records were either not kept or not useful. It has become catastrophically inadequate in the digital age.
📊 Real-World Application: Consider what the third-party doctrine means today. Every email you send is held by Google, Microsoft, or another email provider. Every text message passes through your carrier's servers. Your location is recorded by your phone company every time your phone pings a cell tower. Your financial transactions are held by banks and credit card companies. Your search queries are stored by search engines. Your health data may be held by apps. If the government can obtain all of this without a warrant — because you "voluntarily" shared it with third parties — then the Fourth Amendment provides almost no protection for digital life.
This is not a hypothetical. The NSA's PRISM program, revealed by Edward Snowden in 2013 and discussed in Chapter 9, was built in part on the logic of the third-party doctrine: if tech companies have this data, the government can compel them to hand it over.
Carpenter v. United States (2018): A Partial Correction
The Supreme Court's 2018 decision in Carpenter v. United States represents the most significant Fourth Amendment ruling in the digital age, and it shows both the promise and the limits of constitutional adaptation.
Timothy Carpenter was convicted of a series of armed robberies. The government had obtained, without a warrant, 127 days of his cell site location information (CSLI) from his mobile carrier — records showing which cell towers his phone had connected to, placing him near the robbery sites at the relevant times.
The question before the Court: did the third-party doctrine mean Carpenter had no Fourth Amendment protection for this location data?
Chief Justice John Roberts, writing for a 5-4 majority, held that it did not. Cell phone location data is different in kind from the phone numbers at issue in Smith, Roberts wrote, for several reasons:
- Comprehensiveness: CSLI provides a near-perfect record of a person's past movements over an extended period
- Intimacy: Location over time reveals the "privacies of life" — medical appointments, places of worship, political meetings, intimate relationships
- Involuntariness: People don't "voluntarily" share location data with carriers in any meaningful sense; the phone does it automatically as a function of using a cellular network
- Retrospective surveillance: Long-term CSLI enables the government to conduct retrospective surveillance on anyone, going back in time without any prior suspicion
The Court held that the government needed a warrant to obtain long-term CSLI.
"When the Government tracks the location of a cell phone it achieves near perfect surveillance, as if it had attached an ankle monitor to the phone's user."
— Chief Justice John Roberts, Carpenter v. United States (2018)
But the Carpenter decision is carefully narrow. Roberts explicitly declined to disturb Smith v. Maryland. He said the decision applies to "the seismic shifts in digital technology" and "the novel circumstance" of comprehensive location data, but the Court did not overrule the third-party doctrine more broadly. Justice Kennedy's dissent worried, with some justification, that the majority had offered no principled limit.
📝 Note: What does Carpenter leave unresolved? The Court did not answer whether warrants are required for shorter periods of location data, for data held by internet service providers, for financial records, for email metadata, or for numerous other categories of digital third-party records. Lower courts are actively litigating these questions. The Fourth Amendment doctrine of the digital age is still being written.
31.3 The Third-Party Doctrine: Why It Erodes Privacy in the Digital Age
The third-party doctrine deserves sustained attention because it sits at the center of the gap between constitutional promise and surveillance reality.
The doctrine's foundational assumption — that sharing information with a third party constitutes "voluntary" assumption of risk — was questionable even in 1979. It becomes nearly incoherent in 2026.
Consider what "voluntary" sharing means in the digital context:
Email: You can decline to use email. But doing so means you cannot communicate with employers, government agencies, banks, or most other institutions in contemporary society. The choice to "not share" with Google or Microsoft is, in practice, the choice to withdraw from modern economic and social life.
Mobile phones: You can decline to carry a mobile phone. But your employer may require you to be reachable. Emergency services assume phone access. Many government services, banking apps, and two-factor authentication systems require it. And if you do carry a phone, it automatically shares location data with carriers — you have no practical ability to use cellular service without this data collection.
Financial transactions: You can use cash for small purchases. But paying rent, receiving a paycheck, filing taxes, applying for housing — all require engagement with financial institutions that create records.
Health data: Many health apps and wearable devices share data with third parties by default. Opting out often requires technical sophistication that most users don't have, and even then, opting out of one app doesn't address data already collected.
The "voluntary" in the third-party doctrine assumes a world where people can meaningfully choose not to share information with third parties while still participating in ordinary life. In the contemporary surveillance economy, that world does not exist.
🔗 Connection: This connects to the concept of "consent as fiction" introduced in earlier chapters. The third-party doctrine is a legal codification of consent-as-fiction: it treats the formal act of using a service as meaningful consent to whatever data collection that service performs, regardless of whether users understand or can meaningfully refuse that collection.
31.4 Statutory Privacy Frameworks: The Sectoral Approach
In the absence of a comprehensive federal privacy law, the United States has developed what legal scholars call a "sectoral" approach to privacy: different laws protect different types of information in different contexts. This approach has produced a patchwork of protections with significant gaps.
HIPAA: Health Information
The Health Insurance Portability and Accountability Act (HIPAA) of 1996 establishes privacy and security standards for "protected health information" (PHI) held by "covered entities" — healthcare providers, health plans, and healthcare clearinghouses.
What HIPAA does well:
- Requires patient authorization for most uses of PHI
- Grants patients access to their medical records
- Requires breach notification
- Imposes significant civil and criminal penalties for violations
What HIPAA misses: HIPAA only applies to covered entities. A hospital's records are protected; a period-tracking app's records are not. A fitness wearable company, a genetic testing service, a mental health chatbot, a pharmacy loyalty program — none of these are HIPAA-covered entities. They may collect extraordinarily sensitive health information, but HIPAA does not apply to them.
After the Supreme Court's Dobbs decision (2022) eliminated the constitutional right to abortion, this gap became urgent: period-tracking apps and fitness trackers could be subpoenaed for information about pregnancies, and HIPAA offered no protection.
FERPA: Student Records
The Family Educational Rights and Privacy Act (FERPA) of 1974 protects the privacy of student education records. It gives parents (and students over 18) the right to inspect and correct records, and requires written consent before records are disclosed to third parties.
FERPA was designed for a world of paper records held by schools. It has struggled to adapt to:
- Learning management systems (Canvas, Blackboard) that track detailed student engagement data
- EdTech platforms provided by third-party vendors under contract with schools
- Proctoring software that records students' home environments during online exams
- AI tutoring systems that collect conversational data
COPPA: Children's Online Privacy
The Children's Online Privacy Protection Act (COPPA) of 1998 requires verifiable parental consent before collecting personal information from children under 13. It applies to websites and online services directed at children or with actual knowledge of users under 13.
COPPA explains why most social media platforms nominally require users to be 13 or older — not because they genuinely care about child welfare, but because complying with COPPA for younger users would require parental consent mechanisms that complicate user acquisition.
The age verification systems that services use to exclude under-13 users are notoriously weak: asking users to enter a birthdate and accepting whatever they type. The FTC has taken enforcement actions against YouTube ($170 million settlement in 2019) and other platforms, but COPPA's protections remain limited by weak enforcement and easy circumvention.
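The birthdate-entry age gate described above reduces to a few lines of code — which is precisely the problem. A minimal sketch (the function name and structure here are hypothetical, not any platform's actual implementation) shows that the check trusts whatever birthdate the user types:

```python
from datetime import date
from typing import Optional

def coppa_age_gate(claimed_birthdate: date, today: Optional[date] = None) -> bool:
    """Self-reported age check of the kind most platforms use.

    Returns True if the *claimed* birthdate makes the user 13 or older.
    Nothing verifies that the claim is true: a ten-year-old can pass
    simply by entering an earlier year.
    """
    today = today or date.today()
    # Standard age calculation: subtract years, then subtract one more
    # if this year's birthday hasn't happened yet.
    age = today.year - claimed_birthdate.year - (
        (today.month, today.day) < (claimed_birthdate.month, claimed_birthdate.day)
    )
    return age >= 13
```

Because the only input is self-reported, the gate enforces nothing; its main function is to document that the platform asked.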
ECPA: Electronic Communications
The Electronic Communications Privacy Act (1986) was written to extend Fourth Amendment-like protections to electronic communications. At the time of passage, it was considered forward-looking. By 2026, it is a relic.
ECPA's most notorious provision: the government could obtain email and other electronic communications older than 180 days without a warrant, because the drafters assumed that old communications had been abandoned. Most email providers now require warrants regardless, and Congress has repeatedly debated reform bills, but ECPA's framework still reflects 1980s assumptions about how electronic communication works.
FISA: National Security Exceptions
The Foreign Intelligence Surveillance Act (1978) created a separate legal framework for national security surveillance. The FISA Court — a secret court that meets in a secure facility and whose decisions are classified — reviews government applications for surveillance of foreign powers and their agents.
Chapter 6 and Chapter 9 examined FISA in detail. Key points for legal context:
- FISA Section 702 authorizes surveillance of non-U.S. persons located outside the United States, but in practice captures substantial amounts of U.S. persons' communications
- The third-party doctrine applies fully in the national security context
- The FISA Court has approved the vast majority of government surveillance requests
- Oversight mechanisms are limited by secrecy requirements that prevent public or judicial accountability
FISA represents the hardest edge of the surveillance-privacy tension: the national security exception carves out enormous space where ordinary Fourth Amendment and statutory protections do not apply.
31.5 GDPR: The European Model
The European Union's General Data Protection Regulation (GDPR), which took effect in May 2018, represents the most comprehensive privacy regulatory framework in the world. Understanding what it does — and what it doesn't — illuminates both the European approach and its contrast with U.S. law.
Key Principles
The GDPR establishes six principles for lawful data processing:
- Lawfulness, fairness, and transparency — Processing must have a legal basis; it must be done fairly; data subjects must be informed
- Purpose limitation — Data collected for one purpose cannot be used for incompatible purposes without new legal basis
- Data minimization — Only data necessary for the specified purpose should be collected
- Accuracy — Inaccurate data must be corrected or erased
- Storage limitation — Data should not be retained longer than necessary
- Integrity and confidentiality — Data must be protected against unauthorized access and accidental loss
These principles codify what critics of surveillance capitalism argue should be obvious: you should only collect what you need, tell people what you're doing, use it only for what you said you'd use it for, and delete it when you're done.
Legal Bases for Processing
One of the GDPR's central contributions is requiring that any processing of personal data have a legal basis. The regulation lists six:
- Consent — Freely given, specific, informed, and unambiguous
- Contract — Processing necessary to perform a contract with the data subject
- Legal obligation — Processing required by law
- Vital interests — Processing necessary to protect someone's life
- Public task — Processing for official government functions
- Legitimate interests — Processing serves a legitimate interest that is not overridden by the data subject's interests
The "legitimate interests" basis has been controversial. Organizations can invoke it broadly by claiming that their business interests outweigh individuals' privacy interests, creating an escape hatch from the consent requirement that surveillance capitalism companies have aggressively exploited.
Individual Rights Under GDPR
The GDPR grants data subjects (people whose data is processed) extensive rights:
- Right of access — You can request a copy of all personal data held about you
- Right to rectification — You can correct inaccurate data
- Right to erasure (right to be forgotten) — You can request deletion under certain circumstances
- Right to restriction — You can limit processing in certain circumstances
- Right to data portability — You can receive your data in a machine-readable format
- Right to object — You can object to processing based on legitimate interests or for direct marketing
- Rights related to automated decision-making — You can request human review of automated decisions that significantly affect you
Enforcement Record
The GDPR's enforcement record is mixed. Penalties can be enormous — up to 4% of annual global turnover or €20 million, whichever is higher. In practice:
- Meta (Facebook): €1.2 billion fine (Ireland DPA, 2023) for unlawful data transfers to the United States — the largest GDPR fine to date
- Google (Analytics): Multiple EU regulators ruled Google Analytics violated GDPR for transferring data to US servers subject to NSA access
- Amazon: €746 million fine (Luxembourg, 2021) for behavioral advertising violations
- WhatsApp/Meta: €225 million fine (Ireland, 2021) for transparency failures
But enforcement has been criticized as too slow (some investigations take years), too geographically concentrated in Ireland (where many tech companies have EU headquarters), and too deferential to corporate arguments about what constitutes "legitimate interests." Privacy advocacy organizations have filed thousands of complaints under GDPR; a large proportion remain unresolved.
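The Article 83 ceiling mentioned above — 4% of annual global turnover or €20 million, whichever is higher — is simple enough to state as a one-line calculation. A sketch (the function name is illustrative, and actual fines turn on many aggravating and mitigating factors well below this ceiling):

```python
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound for the most serious GDPR infringements (Art. 83(5)):
    4% of annual worldwide turnover or EUR 20 million, whichever is higher.
    The EUR 20 million floor means even small firms face large exposure."""
    return max(0.04 * annual_global_turnover_eur, 20_000_000)
```

For a company with roughly €110 billion in annual turnover, the ceiling is about €4.4 billion — so even the record €1.2 billion Meta fine sat well below the statutory maximum.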
The Right to Be Forgotten
The "right to erasure" — commonly called the right to be forgotten — is one of GDPR's most discussed provisions and most philosophically contested.
The right originates in Google Spain SL v. Agencia Española de Protección de Datos (2014), a Court of Justice of the EU decision that required Google to delist certain results about a Spanish man's past financial difficulties. The principle: people should be able to request removal of information that is no longer relevant or that they no longer wish to have widely circulated.
The right to erasure has been invoked to remove links about criminal convictions that were served, embarrassing past statements, and outdated information. Critics, including free speech advocates, argue that it allows people to rewrite history — to remove from public record information that is genuinely of public interest.
The right to be forgotten is unavailable under U.S. law. The First Amendment's strong free speech protections make removing truthful information from public circulation constitutionally suspect. U.S. courts have consistently held that the right to speak and publish truthful information outweighs privacy interests except in narrow circumstances.
🌍 Global Perspective: The EU/US divide on privacy reflects different foundational values. European human rights law treats privacy as a fundamental right in itself — the EU Charter of Fundamental Rights establishes privacy as Article 7 and data protection as Article 8. American constitutional law treats free speech as the paramount value; privacy protections, when they exist, are carved out as exceptions to the general presumption in favor of disclosure and expression. Neither system is simply "right" — they reflect genuine value trade-offs that democratic societies must negotiate.
31.6 State Privacy Laws in the United States
In the absence of comprehensive federal privacy legislation, U.S. states have increasingly stepped into the gap. The landscape is rapidly evolving.
California Consumer Privacy Act (CCPA) and CPRA
California's CCPA, effective January 2020 and strengthened by the California Privacy Rights Act (CPRA) in 2023, is the strongest state privacy law in the United States and the closest American analog to GDPR.
Key rights under CCPA/CPRA:
- Right to know — What personal information is collected, how it's used, and with whom it's shared
- Right to delete — Request deletion of personal information
- Right to opt out of sale/sharing — Businesses cannot sell or share your data without allowing you to opt out
- Right to correct — Request correction of inaccurate information
- Right to limit use of sensitive information — Restrict processing of sensitive categories (health, race, precise location, etc.)
- Right to non-discrimination — Businesses cannot penalize you for exercising privacy rights
CCPA applies to businesses that: (1) have annual gross revenues over $25 million, (2) buy, sell, or receive personal information of 100,000+ consumers/households, or (3) derive 50% or more of annual revenue from selling personal information.
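Read as a rule, the applicability test is a disjunction: meeting any one prong makes a business covered. A simplified sketch (the function and parameter names are illustrative; real coverage analysis turns on statutory definitions of "business," "consumer," and "sale" that this ignores):

```python
def ccpa_applies(annual_revenue_usd: float,
                 consumers_with_data: int,
                 share_of_revenue_from_selling_data: float) -> bool:
    """Sketch of the CCPA/CPRA applicability thresholds.

    A for-profit business operating in California is covered
    if it meets ANY one of the three prongs.
    """
    return (annual_revenue_usd > 25_000_000                 # prong 1: revenue
            or consumers_with_data >= 100_000               # prong 2: scale
            or share_of_revenue_from_selling_data >= 0.50)  # prong 3: data sales
```

Note the asymmetry this creates: a small company whose business model is selling data (prong 3) is covered even with tiny revenue, while a large company that merely collects data on fewer than 100,000 consumers may escape coverage.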
Significantly, CCPA includes a private right of action for data breaches — consumers can sue directly for certain violations, without waiting for government enforcement. This has generated substantial litigation.
Other State Laws
As of 2026, approximately twenty states have passed comprehensive privacy laws, creating a complex mosaic:
- Virginia (CDPA, 2021), Colorado (CPA, 2021), Connecticut (CTDPA, 2022) — generally follow CCPA model
- Texas, Indiana, Tennessee — more business-friendly versions with limited private rights of action
- Montana, Oregon — include stronger protections for sensitive data
The practical effect: businesses operating nationally must navigate dozens of different state requirements. Some advocates argue this creates pressure for federal preemption; others argue the variation allows states to serve as "laboratories of democracy" for privacy experimentation.
⚠️ Common Pitfall: State privacy laws narrow the gap, but none creates the equivalent of GDPR's comprehensive framework. They also typically apply only to commercial data processing — government surveillance operates under different legal standards. Don't assume that CCPA or similar laws protect you from law enforcement surveillance; they don't.
31.7 International Human Rights Frameworks
Privacy is recognized as a human right under international law, though enforcement mechanisms are weak.
Universal Declaration of Human Rights, Article 12
"No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks."
The UDHR was adopted by the UN General Assembly in 1948. Article 12 establishes privacy protection as a universal human right. The UDHR is not a binding treaty — it represents aspirational norms rather than enforceable law — but it has had enormous influence on constitutional and statutory law globally.
International Covenant on Civil and Political Rights, Article 17
The ICCPR, which is a binding treaty for signatory states, contains essentially identical privacy language in Article 17. The UN Human Rights Committee, which monitors ICCPR compliance, has interpreted Article 17 to require states to:
- Prevent arbitrary or unlawful interference with privacy by both state and private actors
- Ensure effective remedies for privacy violations
- Regulate surveillance so it is targeted, necessary, and proportionate
The UN General Assembly's 2013 resolution on "the right to privacy in the digital age" (adopted following the Snowden revelations) specifically applied Article 17 to digital surveillance, including metadata collection by intelligence agencies.
🎓 Advanced Note: The tension between national security surveillance and Article 17 obligations has been extensively analyzed by the UN Special Rapporteur on the Right to Privacy. Former Special Rapporteur Joseph Cannataci's reports have repeatedly found that mass surveillance programs operated by the U.S. NSA, UK GCHQ, and other "Five Eyes" agencies are incompatible with Article 17, even accounting for national security exceptions. These findings have no direct enforcement mechanism, but they shape diplomatic and legislative debates.
31.8 Where U.S. Law Falls Short: The Case for Comprehensive Federal Privacy Legislation
The United States stands almost alone among wealthy democracies in lacking a comprehensive federal privacy law. Every session of Congress produces proposals; most fail. Understanding why illuminates the political economy of surveillance.
What the Gap Looks Like
Consider a person living in the United States in 2026. What federal privacy law protects them?
- Health information: HIPAA — if held by a covered entity (not your fitness app, not your mental health chatbot)
- Financial records: GLBA (Gramm-Leach-Bliley Act) — for financial institutions, but allows extensive data sharing with "affiliates"
- Communications content: ECPA — if the messages are not too old (stored emails older than 180 days receive weaker protection under the Stored Communications Act) and not subject to national security exceptions
- Education records: FERPA — if held by an educational institution, not by EdTech vendors
- Children's information: COPPA — if they're under 13 and the platform knows it
- Everything else: Largely unprotected at the federal level
A data broker can collect your age, address, household income, political affiliation, shopping habits, browsing history, driving record, criminal record, social media activity, and hundreds of other data points — compile them into a profile — and sell that profile to anyone, for any purpose, with no federal law restricting the practice in any meaningful way.
Why Comprehensive Legislation Has Failed
Several structural factors explain congressional failure to pass federal privacy law:
Industry opposition: The data economy is enormously profitable. Google, Meta, Amazon, and data brokers spend heavily on lobbying to prevent legislation that would restrict data collection and monetization. Industry arguments typically emphasize compliance costs to small businesses, the innovation benefits of data, the burdens of opt-in requirements, and the claim that consumers actually prefer personalized services.
Preemption debates: Any federal law must decide whether it preempts stronger state laws. Industry generally prefers federal preemption (one national standard, ideally the weakest possible). Privacy advocates and states like California oppose preemption that would undermine stronger state protections. This disagreement has blocked numerous bills.
Partisan divisions: Privacy has both bipartisan appeal and partisan divide. Republicans tend to favor minimal regulation and oppose private rights of action. Democrats tend to support stronger protections and individual enforcement mechanisms. Neither faction has consistently been able to build a majority.
Defining "harm": Some legislative proposals require that individuals demonstrate concrete harm from privacy violations before suing. Industry favors narrow harm definitions; advocates argue that data collection itself, and the loss of control over personal information, constitute harm even without downstream misuse.
📊 Real-World Application: The American Data Privacy and Protection Act (ADPPA) came closer than any previous proposal to passing Congress — it advanced out of committee with bipartisan support in 2022 — but stalled over California's opposition to preemption of its stronger CCPA/CPRA standards. As of 2026, the legislative landscape remains contested.
31.9 How to Exercise Your Privacy Rights: A Practical Guide
Understanding rights matters only if you can exercise them. Here is a concrete guide to using the privacy rights that currently exist.
Exercising GDPR Rights (if you're subject to GDPR)
GDPR applies to any organization processing the personal data of people located in the EU, regardless of where the organization is based. This means that even if you're a U.S. person, GDPR may protect you when using European services — and European-based services have GDPR obligations toward you.
Step 1: Identify the data controller. The "data controller" is the organization responsible for your data. Usually this is the company you interact with directly.
Step 2: Submit a Subject Access Request (SAR). You have the right to request a copy of all personal data held about you. Under GDPR, controllers must respond within one month (extendable by two further months for complex requests). The request can typically be submitted:
- Through a dedicated privacy portal (most large tech companies have these)
- By email to the company's Data Protection Officer (DPO) — GDPR requires many organizations, including those whose core activities involve large-scale data processing, to designate a DPO, whose contact information must be published
- By letter to the company's registered address
Step 3: Review the response. Organizations must tell you: what data they hold, why they're processing it, where it came from, who they share it with, and how long they'll retain it.
Step 4: Exercise further rights. Based on what you learn, you may want to: request deletion (if no legitimate purpose remains), request correction (if data is inaccurate), request portability (if you want to move your data to another service), or object to processing (if you believe the legitimate interests basis doesn't apply).
Step 5: Complain to the supervisory authority. If the company doesn't respond appropriately, you can file a complaint with the relevant supervisory authority — your national data protection authority in the EU, or the Information Commissioner's Office (ICO) in the UK.
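The one-month response clock in Step 2 can be computed mechanically. A minimal sketch in Python (the month-end clamping rule — e.g., January 31 plus one month landing on the last day of February — is a common implementation convention, not something the regulation itself spells out; function names are illustrative):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of the target month."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30,
                     31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, days_in_month))

def sar_response_deadline(received: date, extended: bool = False) -> date:
    """GDPR Art. 12(3): one month to respond, extendable by two further months."""
    return add_months(received, 3 if extended else 1)
```

For example, a SAR received on January 31, 2026 would be due by February 28, 2026 (or April 30, 2026 if the controller invokes the extension).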
Exercising CCPA Rights (California residents)
Step 1: Submit a "Know" request. Visit the company's website and find the "Privacy" or "Do Not Sell My Personal Information" link (required by CCPA). Most companies have an online portal for submitting requests.
Step 2: Opt out of sale/sharing. California residents can enable the Global Privacy Control (GPC) signal — a browser setting that automatically communicates an opt-out to every website visited. Businesses are legally required to honor GPC under the CPRA.
Step 3: Submit a deletion request. Use the same privacy portal to request deletion. Companies have 45 days to respond (with possible 45-day extension).
Step 4: Verify identity. Companies will ask you to verify your identity to prevent fraudulent requests. This typically involves confirming email or account information.
Step 5: If you receive no response or an inadequate one. File a complaint with the California Privacy Protection Agency (CPPA) or the California Attorney General. For data breaches, you may have a private right of action — consult an attorney.
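Mechanically, GPC is simple: a browser with the signal enabled attaches a `Sec-GPC: 1` header to its HTTP requests (and exposes `navigator.globalPrivacyControl` to page scripts). A hedged sketch of how a website's server might detect the signal — the function name and plain-dict header interface are illustrative, not from any particular framework:

```python
def request_signals_gpc(headers: dict) -> bool:
    """Return True if the request carries the Global Privacy Control signal.

    Per the GPC specification, a browser with GPC enabled sends the
    "Sec-GPC: 1" request header. Under the CPRA, a covered business
    must treat this signal as a valid opt-out of sale/sharing.
    """
    # HTTP header names are case-insensitive; normalize before lookup.
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("sec-gpc", "").strip() == "1"
```

A compliant site would check this on each request and suppress data sale/sharing for users whose browsers send the signal — no account, form, or portal required.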
Data Broker Opt-Out Process
Data brokers are companies that collect and sell personal information. They are regulated loosely under CCPA in California and not at all federally. But most provide opt-out mechanisms — partly for legal compliance and partly for reputational reasons.
✅ Best Practice: Data Broker Opt-Out Process
1. Use a removal service. Services like DeleteMe, Privacy Bee, or Kanary (paid) automate the opt-out process across dozens of brokers. They are not perfect, and they require an ongoing subscription because data reaccumulates.
2. Manual opt-out. The major brokers requiring individual opt-outs include:
- Spokeo, WhitePages, BeenVerified, Intelius, and PeopleFinder (each has an opt-out form)
- Acxiom (use its "About the Data" portal)
- LexisNexis and Equifax, which have special processes for sensitive individuals (abuse survivors, law enforcement, etc.)
- Data brokers serving the financial industry (TransUnion, Equifax, Experian), which have opt-out processes separate from credit reporting
3. Expect to repeat. Data broker databases update constantly from public records and other sources. Opt-outs must typically be renewed every 90 days.
4. Prioritize. Focus first on brokers that aggregate physical addresses with other information, as these create risks to physical safety. Then focus on those whose products are used for background checks and tenant screening, which can affect housing and employment.
31.10 Jordan's Story: Filing a Data Access Request
The evening after Dr. Osei's lecture, Jordan went to their apartment and opened a browser. Yara, who had been through this before, sat with them.
"Start with Spokeo," Yara said. "It's got everything."
Jordan typed in their own name. The results were startling — not because anything was secret, exactly, but because of the aggregation. Alone, knowing someone's name is meaningless. Knowing their current city is not a secret. Knowing they went to high school in a different state is hardly private. But seeing all of it together — the three previous addresses, the names of relatives, the estimated household income, the associated social media profiles — felt like being observed from a great height.
"This is legal?" Jordan asked.
"Entirely legal," Yara said. "There's no federal law against it."
They found the opt-out page, which required a photograph of Jordan's driver's license — Jordan hesitated at this, the irony not lost on them — and submitted the request. Then they went to WhitePages. Then BeenVerified. Then Intelius.
"What about the data brokers I've never heard of?" Jordan asked. "The ones whose names I don't know?"
Yara pulled up a list someone had compiled on GitHub. "There are over three thousand registered data brokers in the US. You can't opt out of all of them manually."
Jordan felt a familiar frustration — the kind that comes from understanding a problem deeply enough to know how little control you actually have over it. "So what's the point?"
"The point," Yara said, "is to reduce exposure, not eliminate it. You'll never get to zero. But reducing is still worth doing." She paused. "And the point is also to know what's happening. Most people don't."
Jordan thought about their coworkers at the warehouse — people who would be shocked to know how comprehensively their information had been compiled and sold, who had never been told they had rights they could exercise, who were being data-brokered right now without their knowledge.
"The law should make this opt-in," Jordan said. "Not opt-out."
"Yes," Yara agreed. "It should."
This is the gap the law has not closed. The opt-in vs. opt-out distinction is not a technical matter — it is a political choice about who the law serves. An opt-out system assumes surveillance is the default and privacy requires affirmative action. An opt-in system would assume privacy is the default and surveillance requires affirmative consent. The United States, unlike the EU, has chosen opt-out.
💡 Intuition Checkpoint: Think about the last time you agreed to a privacy policy or clicked "accept" on a cookie notice. Did you read it? Did you understand it? Could you have meaningfully refused? What does your answer tell you about the nature of "consent" in contemporary digital life?
31.11 The Right to Privacy in the Digital Age: Where We Are and Where We're Going
The legal landscape of privacy in 2026 is characterized by a fundamental asymmetry: surveillance technology advances at the pace of software development — rapidly, continuously, iteratively. Law advances at the pace of legislation and judicial decision-making — slowly, incrementally, reactively.
Every major surveillance technology in this book was deployed and normalized before any meaningful legal regulation arrived. By the time courts and legislatures caught up to cookies (partially), to metadata collection (partially), to CCTV (barely), to facial recognition (slowly), to behavioral advertising (still unresolved), the technologies were already embedded in the infrastructure of daily life.
This isn't a criticism of law per se — legislative deliberation and judicial caution serve real values. It's a description of how the surveillance-privacy balance has evolved: surveillance leads, law follows, and the gap between them is where the actual lives of billions of people happen.
The Carpenter decision is genuinely significant because it represents the Supreme Court choosing to apply Fourth Amendment protection to digital data even at the cost of doctrinal complexity. It suggests that courts may be willing to engage more seriously with the privacy implications of digital surveillance. But Carpenter is narrow. It addresses one category of data for one category of government action. The third-party doctrine remains in place. There is no federal comprehensive privacy law. The national security exception remains vast.
What would a genuinely adequate legal framework for privacy in the digital age look like? That question will be addressed in Chapters 39 and 40. For now, it is enough to see what the law provides, where it falls short, and how those gaps can be partially bridged through individual rights exercise, state law, and international frameworks — while acknowledging that individual action alone cannot close the structural gap.
31.12 Chapter Summary
This chapter traced the legal history of privacy from its conceptual origins in the Warren/Brandeis article of 1890 through the current patchwork of constitutional doctrine, federal statutes, state laws, and international frameworks.
Key points:
Origins: Privacy as a legal right was invented in 1890 in response to new surveillance technologies of that era (cameras, mass-circulation newspapers). The conceptual architecture Warren and Brandeis built — privacy grounded in personhood rather than property — has shaped thinking ever since.
Fourth Amendment: The Katz "reasonable expectation of privacy" standard untethered Fourth Amendment analysis from physical trespass but introduced vulnerability to normalization. The third-party doctrine (Smith v. Maryland) gutted digital privacy protection by holding that information shared with third parties loses constitutional protection. Carpenter v. United States began correcting this for location data without overruling the broader doctrine.
Federal statutes: The U.S. sectoral approach — HIPAA for health, FERPA for education, COPPA for children, ECPA for communications — creates a patchwork with enormous gaps, particularly for commercial data brokers and emerging technologies.
GDPR: The EU's comprehensive framework establishes strong individual rights and imposes significant obligations on data processors, but enforcement is slow and the "legitimate interests" basis creates significant loopholes.
State laws: California's CCPA/CPRA represents the strongest U.S. analog to GDPR; an expanding group of states has followed with their own comprehensive privacy laws.
International: The UDHR and ICCPR establish privacy as a universal human right, but enforcement mechanisms are weak.
The gap: The United States lacks comprehensive federal privacy legislation. Data brokers operate largely unregulated. Consent is structured as opt-out rather than opt-in. The national security exception remains vast. The gap between privacy rights and surveillance practice is not a failure of law to keep up — it is a political choice about who the law serves.
The right to privacy, as Brandeis wrote, is "the right most valued by civilized men." Whether contemporary law honors that value — and what can be done when it doesn't — are the animating questions of the chapters that follow.
Key Terms
Fourth Amendment — Constitutional protection against unreasonable searches and seizures; requires warrants based on probable cause
Third-party doctrine — Legal principle that information voluntarily shared with third parties loses Fourth Amendment protection
Reasonable expectation of privacy — The Katz test: the Fourth Amendment protects information in which a person exhibits a subjective expectation of privacy that society is prepared to recognize as reasonable
GDPR — General Data Protection Regulation; EU comprehensive privacy law establishing individual rights and organizational obligations
CCPA/CPRA — California Consumer Privacy Act/California Privacy Rights Act; strongest U.S. state privacy law
Right to be forgotten — EU right to request removal of personal information from public circulation
Data broker — Company that collects and sells personal information without direct relationship with data subjects
HIPAA — Health Insurance Portability and Accountability Act; U.S. health information privacy law for covered entities
Sectoral approach — U.S. model of separate privacy laws for different sectors rather than comprehensive protection
Subject Access Request (SAR) — GDPR mechanism for requesting all personal data held by an organization
Discussion Questions
1. Warren and Brandeis were responding to what they saw as invasive journalism about wealthy social circles. Does their concern seem sympathetic today? What does it tell us that the foundational legal concept of privacy emerged from the grievances of the privileged?
2. The Carpenter decision held that long-term location data requires a warrant. What principle might the Court apply to determine which digital data requires a warrant? Is there a coherent limiting principle, or will courts have to decide case by case?
3. The third-party doctrine assumes that sharing information with a third party is "voluntary." Is it voluntary to use Gmail? To carry a smartphone? To have a bank account? How should courts think about structural compulsion in digital life?
4. GDPR's right to erasure conflicts with freedom of expression — the right to publish and circulate truthful information. Which value should prevail? Does your answer depend on what kind of information is involved?
5. The United States has failed to pass comprehensive federal privacy legislation despite repeated attempts. Is this a failure of democracy, a feature of it, or something else? Who benefits from the status quo?
6. Jordan's opt-out experience required photographing their driver's license — surrendering more personal information to protect personal information. What does this irony reveal about the design of contemporary privacy rights?
Next: Chapter 32 examines counter-surveillance in practice — the technical, behavioral, and artistic tools available to people who wish to reduce their surveillance exposure. Where this chapter examined legal rights, Chapter 32 examines practical methods for exercising those rights and going beyond them.