Appendix E: Legal Frameworks Quick Reference
Constitutional Rights, Federal Statutes, State Laws, and International Standards
Introduction
Understanding surveillance requires understanding law — not because law resolves all surveillance questions, but because law defines rights, establishes procedures, creates accountability mechanisms, and reflects (often imperfectly) societal values about privacy and state power. This appendix provides a concise reference to the major legal frameworks governing surveillance in the United States and abroad.
This appendix is not a substitute for legal advice. Law changes through court decisions and legislation; this reference reflects the state of law as of early 2026. For any specific legal situation, consult a qualified attorney.
A note on the structure of US surveillance law: the United States lacks a comprehensive federal privacy law applicable to all contexts. Instead, privacy and surveillance law derives from multiple sources — constitutional doctrine, federal statutes, state statutes, and common law — that together create a complex and incomplete framework with significant gaps.
Section 1: US Constitutional Framework
The Fourth Amendment
Text: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."
What It Does: The Fourth Amendment protects against unreasonable government searches and seizures. It applies to government actors (law enforcement, government agencies) and generally does not apply to private actors (corporations, private employers).
Key Doctrines:
Reasonable Expectation of Privacy (REP): Derived from Justice Harlan's concurrence in Katz v. United States (1967), the REP test asks whether: (1) the individual subjectively expected privacy in the object of the search, and (2) society recognizes that expectation as reasonable. The REP test replaced the older property-based trespass doctrine that limited Fourth Amendment protection to physical intrusions.
Third-Party Doctrine: Information voluntarily shared with third parties loses Fourth Amendment protection. Derived from United States v. Miller (1976) (bank records) and Smith v. Maryland (1979) (phone numbers dialed). The third-party doctrine has vast implications for digital privacy: most digital communications are mediated by third parties (ISPs, cloud services, social media platforms) and thus technically lack Fourth Amendment protection under this doctrine. Partially retreated from in Carpenter v. United States (2018).
Particularity Requirement: Warrants must particularly describe the place to be searched and things to be seized. General warrants — authorizing search of everything — are prohibited. This requirement creates tension with digital searches: a warrant for all files on a seized computer resembles a general warrant, yet courts have generally upheld such searches.
Exclusionary Rule: Evidence obtained through unconstitutional searches is generally inadmissible ("fruit of the poisonous tree"). This is the enforcement mechanism for Fourth Amendment rights.
Key Cases:
Katz v. United States (1967): Government installed listening device in a public phone booth without a warrant. Court held this violated the Fourth Amendment; established that "the Fourth Amendment protects people, not places." Justice Harlan's concurrence introduced the "reasonable expectation of privacy" standard.
Smith v. Maryland (1979): Police obtained records of phone numbers dialed by a suspect from the telephone company without a warrant. The Court held there was no Fourth Amendment violation, extending the third-party doctrine to phone records. This case's logic has since been applied to internet communications.
Riley v. California (2014): Police searched arrested individuals' smartphones incident to arrest without warrants. Unanimous Court held this violated the Fourth Amendment; digital phones are not like wallets or briefcases — they contain "the privacies of life." Riley established meaningful Fourth Amendment protection for smartphone content.
Carpenter v. United States (2018): Government obtained months of cell-site location information (CSLI) from cell carriers without a warrant. A 5-4 Court held this violated the Fourth Amendment, holding that accessing seven days of CSLI constitutes a search. CSLI is comprehensive, retrospective, and passively generated — the third-party doctrine applies "with less force" to this data. Carpenter is the first significant limitation of the third-party doctrine for digital data.
United States v. Jones (2012): Police attached a GPS tracker to a suspect's car without a valid warrant and monitored it for 28 days. A unanimous Court held this was a Fourth Amendment search; the majority reasoned from physical trespass. Concurrences by Justices Alito and Sotomayor addressed the mosaic theory of surveillance — that prolonged aggregate surveillance creates constitutional questions even absent trespass.
The First Amendment and Surveillance
The First Amendment protects freedom of speech, association, and religion. Its relevance to surveillance operates primarily through the chilling effect doctrine: government surveillance that discourages the exercise of First Amendment rights can violate the First Amendment even without direct punishment for speech.
Key applications:
- Government surveillance of political organizations implicates First Amendment associational rights (established in cases arising from COINTELPRO).
- Library surveillance — government monitoring of library records — implicates First Amendment interests in intellectual freedom.
- Surveillance that deters religious practice (documentation of mosque attendance, monitoring of religious communities) implicates free exercise rights.
- Mass surveillance programs may chill protected speech by making speakers uncertain whether their communications are monitored.
Laird v. Tatum (1972): The Supreme Court held that awareness of surveillance alone, without concrete injury, was insufficient to establish First Amendment standing. This ruling has been used to dismiss challenges to surveillance programs.
Section 2: US Federal Statutory Law
Electronic Communications Privacy Act (ECPA, 1986)
Summary: ECPA established standards for government access to electronic communications and stored data, comprising three titles: the Wiretap Act (Title I), the Stored Communications Act (Title II), and the Pen Register Act (Title III).
Title I — Wiretap Act: Requires a court order meeting a high standard ("super-warrant") for real-time interception of wire or electronic communications. Applies to content.
Title II — Stored Communications Act (SCA): Governs government access to stored electronic communications (email, cloud storage). The standards are different from, and lower than, those for real-time interception.
Title III — Pen Register Act: Governs government use of devices that capture call metadata (numbers dialed, IP addresses) without content. Requires a court order but at a lower standard than for content.
Why ECPA Is Outdated: ECPA was enacted in 1986 — before the World Wide Web existed, before email was widely used as a communication medium, and before cloud storage was conceived. Provisions such as the "180-day rule" — allowing government to access emails stored more than 180 days without a warrant, on the theory that they had been "abandoned" — reflect a 1986 technological context with no contemporary relevance. Reform efforts have repeatedly failed in Congress, in significant part due to law enforcement lobbying.
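The mechanics of the 180-day rule can be sketched in a few lines. This is a deliberate simplification for illustration only: the SCA's actual standards also turn on the type of service, whether mail has been opened, and other factors, and after United States v. Warshak (6th Cir. 2010) the government typically seeks warrants for all email content regardless of age.

```python
from datetime import date

def sca_access_standard(stored_on: date, requested_on: date) -> str:
    """Illustrative sketch of the SCA's original 180-day dividing line.

    Email in storage 180 days or less required a warrant; older email
    could be reached with a subpoena or a 2703(d) court order (with
    notice to the subscriber). Not a statement of current practice.
    """
    age_in_days = (requested_on - stored_on).days
    if age_in_days <= 180:
        return "warrant (probable cause)"
    return "subpoena or 2703(d) order (with notice)"

# An email two months old vs. one stored for over a year:
print(sca_access_standard(date(2025, 1, 1), date(2025, 3, 1)))  # warrant (probable cause)
print(sca_access_standard(date(2024, 1, 1), date(2025, 3, 1)))  # subpoena or 2703(d) order (with notice)
```

The arbitrariness of the cutoff is the point: nothing about an email changes at day 181 except the legal standard governing access to it.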
FISA and the FISA Court
Foreign Intelligence Surveillance Act (1978): FISA established a special federal court — the Foreign Intelligence Surveillance Court (FISC) — with jurisdiction to authorize surveillance for foreign intelligence purposes. FISA was enacted following the Church Committee's documentation of surveillance abuses, as a mechanism to provide judicial oversight for intelligence surveillance.
FISA Court: The FISC operates in secret; its judges are appointed by the Chief Justice without Senate confirmation; its proceedings are ex parte (only the government's side is presented). The FISC has been criticized for rubber-stamping government surveillance requests — it has approved the vast majority of requests brought before it.
Key FISA Sections:
- Title I (Traditional FISA Orders): Authority for surveillance of specific foreign powers or their agents, under relatively stringent procedural requirements.
- Section 215 (now amended): Authorized bulk collection of phone metadata under the theory that business records could be compelled if "relevant" to an authorized investigation. The Snowden revelations revealed this was used for mass collection of Americans' phone records. Section 215's phone records program was ended; metadata collection authority was modified by the USA FREEDOM Act (2015).
- Section 702: Authorizes collection of communications of non-US persons located outside the United States, including from US-based internet companies. PRISM operated under Section 702. This authority has been the most controversial in the post-Snowden era; its reauthorizations have prompted significant congressional debate.
USA PATRIOT Act (2001)
The Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act (yes, that's what it spells) dramatically expanded surveillance authorities following September 11, 2001. Key provisions:
- Section 215: Expanded "business records" authority for FISA investigations.
- Roving wiretaps: Authorized surveillance orders that follow an individual rather than a specific phone or device.
- "Sneak and peek" searches: Authorized delayed notification search warrants — searches without contemporaneous notice to the subject.
- National Security Letters (NSLs): Authorized FBI to compel disclosure of records from businesses without court approval, with attached gag orders prohibiting disclosure.
- Information sharing: Removed barriers between intelligence and law enforcement information sharing.
Many PATRIOT Act provisions were controversial; some were subject to sunset clauses requiring reauthorization, which has prompted periodic congressional debates.
HIPAA (Health Insurance Portability and Accountability Act, 1996)
What It Covers: HIPAA's privacy rule applies to "protected health information" held by "covered entities" (healthcare providers, health plans, healthcare clearinghouses) and their business associates. Covered entities must limit disclosure of PHI, provide patients access to their records, and implement security safeguards.
Critical Gap: HIPAA does not apply to health information held by non-covered entities — including fitness trackers, health apps, direct-to-consumer genetic testing services, and most employee wellness programs. Health data held by Apple Health, Fitbit, 23andMe, or similar services may not be HIPAA-protected.
FERPA (Family Educational Rights and Privacy Act, 1974)
What It Covers: FERPA applies to educational institutions that receive federal funding and protects the privacy of student education records. Parents have rights to access and correct their children's records; when students turn 18 or enroll in a postsecondary institution, these rights transfer to the student. Schools generally cannot disclose education records without consent.
Limitations: FERPA was enacted before educational technology platforms became ubiquitous. Ed tech platforms that collect behavioral data on students may not be clearly covered, because such data may not qualify as "education records" under the statute. FERPA provides no private right of action; enforcement rests with the US Department of Education and depends largely on schools' voluntary compliance.
COPPA (Children's Online Privacy Protection Act, 1998)
What It Covers: COPPA applies to websites and online services directed at children under 13, or that knowingly collect personal information from children under 13. Covered entities must obtain verifiable parental consent before collecting personal information, provide privacy notices, and allow parents to review and delete their children's information.
Critical Limitations: COPPA's age-13 threshold is routinely circumvented by having children falsely attest to being 13 or older. The FTC enforces COPPA; fines for violation have been issued against major platforms but critics argue they are insufficient deterrents.
Video Privacy Protection Act (1988)
What It Covers: The VPPA prohibits "video tape service providers" from disclosing rental or purchase records without informed written consent. Passed in the specific context of the disclosure of Supreme Court nominee Robert Bork's video rental history, the VPPA has been invoked in litigation against streaming services that shared viewing histories with third parties (including Facebook's share button). Courts have debated whether streaming services are "video tape service providers" under the 1988 definition.
Why It Matters: The VPPA illustrates how narrowly targeted privacy laws can have unexpected scope when applied to new technologies — and how they can also fail to cover new contexts.
Section 3: US State Privacy Laws
CCPA/CPRA — California
California Consumer Privacy Act (CCPA, effective 2020) and California Privacy Rights Act (CPRA, effective 2023): The most significant US state privacy laws. The CPRA substantially strengthened the CCPA and created the California Privacy Protection Agency (CPPA) as an independent enforcement agency.
Consumer Rights Under CCPA/CPRA:
- Right to Know: Right to know what personal information is collected, used, sold, or disclosed, and to whom.
- Right to Delete: Right to request deletion of personal information.
- Right to Opt Out: Right to opt out of the sale or sharing of personal information for cross-contextual behavioral advertising.
- Right to Correct: Right to correct inaccurate personal information.
- Right to Limit Use of Sensitive Personal Information: Sensitive categories (Social Security number, precise geolocation, racial/ethnic origin, religious beliefs, health information, sexual orientation) have additional protections.
- Right to Non-Discrimination: Covered businesses cannot discriminate against consumers who exercise privacy rights (e.g., by denying service or charging higher prices).
Scope: Applies to for-profit businesses that meet certain thresholds (annual gross revenue over $25 million; buy/sell/receive personal information of 100,000+ consumers; or derive 50%+ of revenues from selling/sharing personal information) and that do business in California.
Illinois Biometric Information Privacy Act (BIPA, 2008)
Why BIPA Is the Most Important Biometric Law: Illinois's BIPA is the most significant biometric privacy law in the United States. It requires informed written consent before collecting biometric identifiers (fingerprints, retina or iris scans, voiceprints, and scans of hand or face geometry), prohibits selling biometric data, requires a data retention and destruction schedule, and establishes a private right of action — meaning individuals can sue directly without waiting for government enforcement.
The private right of action is what makes BIPA powerful: major class action settlements under BIPA have reached hundreds of millions of dollars (including a $650 million settlement against Facebook for facial recognition data collection). BIPA has driven significant changes in how biometric technology companies operate in Illinois, including decisions by some companies not to deploy facial recognition in the state.
The Patchwork Challenge: Only a few states (Illinois, Texas, Washington, and a handful of others) have biometric privacy laws. Absent federal preemption, this creates a patchwork in which biometric protection depends heavily on state of residence.
State-Level Comprehensive Privacy Laws
Following California's lead, numerous states have enacted comprehensive consumer privacy laws: Virginia (VCDPA), Colorado (CPA), Connecticut (CTDPA), Utah (UCPA), and others. These laws vary on key dimensions including:
- Whether they provide a private right of action (most do not)
- Whether opt-in or opt-out is required for sensitive data processing
- Data minimization requirements
- Coverage thresholds
- Enforcement mechanisms
The lack of a federal comprehensive law means individuals' privacy rights depend significantly on their state of residence — a problematic inequity that privacy advocates continue to press Congress to address.
Section 4: European Framework
GDPR — General Data Protection Regulation (EU, 2018)
Overview: The GDPR is the European Union's comprehensive data protection regulation, applicable since May 25, 2018. It applies to the processing of personal data of individuals in the EU, regardless of where the processing entity is located. The GDPR's extraterritorial reach has made it a global reference standard.
Key Principles:
1. Lawfulness, fairness, and transparency: Processing must have a lawful basis; data subjects must be informed.
2. Purpose limitation: Data collected for one purpose cannot be used for incompatible purposes.
3. Data minimization: Only the data necessary for the stated purpose should be collected.
4. Accuracy: Inaccurate data must be corrected or deleted.
5. Storage limitation: Data should not be kept longer than necessary.
6. Integrity and confidentiality: Data must be processed securely.
7. Accountability: Controllers are responsible for demonstrating compliance.
Individual Rights:
- Right of access (know what data is held)
- Right to rectification (correct inaccurate data)
- Right to erasure ("right to be forgotten")
- Right to restriction of processing
- Right to data portability
- Right to object to processing
- Rights related to automated decision-making and profiling
Lawful Bases for Processing: Processing is only lawful if one of six lawful bases applies: consent; contract performance; legal obligation; vital interests; public task; or legitimate interests of the controller. Consent, under GDPR, must be freely given, specific, informed, and unambiguous.
Enforcement: Data protection authorities (DPAs) in each member state enforce the GDPR. Fines can reach €20 million or 4% of annual global turnover, whichever is higher. Major fines have been issued against Meta (€1.2 billion by Ireland DPC for data transfers), Amazon (€746 million by Luxembourg), and Google (multiple fines across jurisdictions).
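The fine ceiling described above is a simple maximum-of-two-terms rule. A minimal sketch of the upper-tier cap (Article 83(5)); the function name is illustrative, and actual fines are set case-by-case by DPAs well below the cap in most instances:

```python
def gdpr_max_fine_eur(annual_global_turnover_eur: float) -> float:
    """Upper-tier GDPR fine ceiling: the greater of EUR 20 million
    or 4% of total worldwide annual turnover of the preceding
    financial year. Illustrates the cap only, not fine-setting."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# For a firm with EUR 100M turnover, the 20M floor dominates:
print(gdpr_max_fine_eur(100_000_000))      # 20000000.0
# For a firm with EUR 50B turnover, 4% dominates:
print(gdpr_max_fine_eur(50_000_000_000))   # 2000000000.0
```

The turnover-based prong is what gives the GDPR teeth against the largest platforms: for a company the size of Meta or Amazon, the percentage cap runs into the billions of euros.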
Adequacy Decisions: The EU determines whether non-EU countries provide "adequate" data protection for the purpose of international data transfers. The US has not had a consistently adequate status; the EU-US Privacy Shield framework was invalidated by the CJEU in Schrems II (2020). The EU-US Data Privacy Framework (2023) is the current mechanism for US-EU data transfers, but its durability is uncertain given ongoing litigation.
EU AI Act (2024)
The EU AI Act establishes a risk-based framework for regulating AI systems, including those used for surveillance:
Prohibited AI Practices include:
- Real-time remote biometric identification in publicly accessible spaces by law enforcement (with narrow exceptions)
- AI systems that exploit vulnerabilities of specific groups
- Social scoring by public authorities (directly targeting Chinese-style social credit systems)
- Predictive policing based solely on profiling
High-Risk AI Systems (subject to requirements including registration, transparency, human oversight, and accuracy standards) include:
- Biometric identification and categorization
- AI used in critical infrastructure management
- AI used in employment and workers management
- AI used in education
- AI for law enforcement
- AI for border management and migration
The EU AI Act's restrictions on real-time biometric surveillance represent the most significant legal constraints on facial recognition in public spaces by any major jurisdiction.
EU Charter of Fundamental Rights, Article 8
Article 8 of the EU Charter of Fundamental Rights establishes the right to protection of personal data as a fundamental right, distinct from privacy. This foundational status means data protection is treated differently in the EU than in the US — not primarily as a consumer protection measure but as a fundamental rights issue.
Section 5: International Human Rights
Universal Declaration of Human Rights, Article 12
"No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks."
Article 12 UDHR (1948) is the foundational international human rights norm for privacy. It is not legally binding as a treaty but is widely understood as customary international law and has influenced subsequent binding instruments.
ICCPR, Article 17
The International Covenant on Civil and Political Rights (ICCPR), which the United States has ratified, provides legally binding privacy protection at Article 17: "No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation."
The UN Human Rights Committee's General Comments on Article 17 have interpreted it to require that surveillance be necessary, proportionate, and subject to independent oversight.
UN Special Rapporteur on Privacy
The United Nations Special Rapporteur on the Right to Privacy conducts investigations and issues reports on surveillance practices worldwide. Reports have addressed: commercial surveillance; mass surveillance by state actors; surveillance of journalists and activists; and surveillance technologies in conflict zones. These reports provide important international normative context for surveillance law.
Section 6: Practical Quick-Reference Table
| Law / Framework | What It Covers | Key Rights | How to Invoke |
|---|---|---|---|
| Fourth Amendment (US) | Government searches and seizures | Protection against unreasonable search; warrant requirement | Challenge in criminal proceedings; sue under 42 U.S.C. § 1983; civil rights litigation |
| First Amendment (US) | Government interference with speech, association, religion | Protection from surveillance that chills protected activity | Litigation; challenge to surveillance programs |
| ECPA — Wiretap Act | Real-time interception of electronic communications | Government must obtain wiretap order | Challenge evidence in criminal proceedings |
| ECPA — Stored Communications Act | Government access to stored email, messages | Government generally must obtain warrant for content (per United States v. Warshak and DOJ practice) | Challenge in criminal proceedings |
| FISA / Section 702 | Intelligence surveillance of foreign persons | Limited rights for US persons; notification rare | ACLU/EFF legal challenges; limited individual standing |
| HIPAA | Health data held by covered entities | Access, amendment, accounting of disclosures | File complaint with HHS Office for Civil Rights |
| FERPA | Student education records | Access, correction, limit disclosure | Complain to school; complain to US Dept. of Education |
| COPPA | Online services for children under 13 | Parental consent required; access and deletion | File complaint with FTC |
| CCPA/CPRA (CA) | Personal data held by covered businesses | Know, delete, opt out, correct, limit | Submit request to business; file complaint with CPPA |
| Illinois BIPA | Biometric identifiers | Consent; no sale; destruction schedule | Private right of action (lawsuit); class action |
| GDPR (EU) | Personal data of individuals in the EU | Access, erasure, portability, objection | Request to data controller; complaint to national DPA |
| EU AI Act | High-risk and prohibited AI systems | Transparency, human oversight, accuracy | Complaint to national AI supervisory authority |
| UDHR Article 12 / ICCPR Article 17 | State surveillance | Privacy from arbitrary government interference | UN complaints mechanisms; domestic court interpretation |
Legal frameworks change through court decisions, legislation, and regulatory action. This reference reflects the state of law as of early 2026. Students should consult current legal databases (Westlaw, LexisNexis), the Cornell Legal Information Institute (LII), and the EFF's surveillance law resources for current case law and statutory text. For legal questions in specific situations, consult a licensed attorney.