Glossary of Key Terms

The Architecture of Surveillance

This glossary defines the essential vocabulary of surveillance studies as used throughout this textbook. Terms appear in alphabetical order. Each entry includes a definition of two to four sentences and the chapter where the term is first introduced or most substantially discussed. Where a term carries multiple disciplinary meanings, the surveillance-studies usage is prioritized.


Accountability gap — The structural condition in which surveillance systems collect, process, and act on data while remaining insulated from meaningful oversight, legal challenge, or democratic accountability. Accountability gaps arise when surveillance programs are classified, when automated decisions are opaque, or when the entities responsible for harm are difficult to identify. The gap between what surveillance systems can be made to do and what legal frameworks require of them is a central problem in contemporary privacy law. (Chapter 31)

Actuarial surveillance — The use of statistical profiles derived from population-level data to make predictions and decisions about individuals. Actuarial surveillance treats people as instances of a risk category rather than as individuals with particular circumstances; insurance pricing, credit scoring, and pretrial risk assessment tools all employ actuarial logic. Critics argue that actuarial surveillance encodes historical inequalities into automated futures. (Chapter 29)

Administrative surveillance — The routine collection and processing of personal information by governments and institutions for bureaucratic purposes such as taxation, census-taking, and identification. David Lyon distinguishes administrative surveillance from more coercive forms, though the two categories often blur when administrative data is repurposed for law enforcement or social control. (Chapter 4)

Affective computing — Technologies designed to detect, interpret, and respond to human emotional states, typically through analysis of facial expressions, vocal tone, physiological signals, or behavioral patterns. Affective computing has been deployed in job interviews, classroom monitoring, and customer service contexts. Critics note that the underlying science — that emotions can be reliably read from facial expressions — is contested. (Chapter 29)

Algorithmic management — The use of software systems to monitor, direct, evaluate, and discipline workers in real time, replacing or supplementing human supervisors with automated decision-making. Algorithmic management systems can track keystrokes, monitor delivery routes, score customer interactions, and determine pay — often without worker understanding of how decisions are made. (Chapter 28)

Anonymization — The process of modifying a dataset so that individuals cannot be identified from the data alone or in combination with other reasonably available information. True anonymization is technically difficult; studies have repeatedly demonstrated that ostensibly anonymous datasets can be re-identified using auxiliary information. The gap between claimed anonymization and actual anonymization is a persistent feature of the data economy. (Chapter 32)
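
Anonymization failures are easier to see with a concrete measure. The sketch below computes k-anonymity, one common (and limited) test of whether quasi-identifiers single people out; the records and field names are invented for illustration.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Size of the smallest group of records sharing the same
    quasi-identifier values; k = 1 means someone is unique."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Names are removed, yet ZIP code plus birth year still isolates one record.
records = [
    {"zip": "02138", "birth_year": 1954, "diagnosis": "flu"},
    {"zip": "02138", "birth_year": 1954, "diagnosis": "asthma"},
    {"zip": "02139", "birth_year": 1961, "diagnosis": "diabetes"},
]
print(k_anonymity(records, ["zip", "birth_year"]))  # 1: not anonymous in practice
```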

Asymmetric visibility — See Visibility asymmetry.

Behavioral surplus — Shoshana Zuboff's term for the portion of behavioral data collected by technology platforms that exceeds what is needed to improve products or services, and which is instead used to train predictive models and sold in behavioral futures markets. Behavioral surplus is the raw material of surveillance capitalism; it is extracted without users' meaningful knowledge or compensation. (Chapter 11)

Biometrics — Physiological or behavioral characteristics that can be used to identify individuals, including fingerprints, iris patterns, facial geometry, gait, voice, and DNA. Unlike passwords, biometric identifiers cannot be changed if compromised, making biometric surveillance databases a particularly consequential form of data collection. (Chapter 7)

Body camera (bodycam) — A wearable recording device typically affixed to a law enforcement officer's uniform to document police-civilian interactions. While introduced as an accountability tool, body camera footage is often controlled by police departments, raising questions about selective activation, footage access, and whether the cameras genuinely deter misconduct or primarily extend surveillance of the civilians they record. (Chapter 8)


Boolean logic in surveillance — The use of "if-then" conditional rules to trigger surveillance responses, flag records, or generate alerts. Many algorithmic surveillance systems use Boolean or rule-based logic layered atop statistical models; understanding Boolean logic helps analysts and auditors trace why particular decisions were made. (Chapter 28)
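
The rule-based logic described here can be made concrete. The sketch below is a hypothetical flagging engine: each rule pairs a name with an if-then condition over one record, and the audit question "why was this flagged?" is answered by the list of rules that fired. All rule names and thresholds are invented.

```python
def fired_rules(record, rules):
    """Return the names of every if-then rule whose condition holds,
    giving auditors a trace of why a record was flagged."""
    return [name for name, condition in rules if condition(record)]

# Invented rules: each pairs a name with a boolean condition on one record.
rules = [
    ("large_cash", lambda r: r["amount"] > 10_000 and r["method"] == "cash"),
    ("odd_hours", lambda r: r["hour"] < 6 or r["hour"] > 22),
]

transaction = {"amount": 12_500, "method": "cash", "hour": 23}
print(fired_rules(transaction, rules))  # ['large_cash', 'odd_hours']
```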

Browser fingerprinting — A tracking technique that identifies individual browsers by compiling a unique profile of device attributes — screen resolution, installed fonts, browser plugins, time zone, and hundreds of other signals — without placing any file on the user's device. Browser fingerprinting is more persistent than cookies and more difficult to block. (Chapter 12)
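
The compilation step can be sketched in a few lines: attributes are serialized in a canonical order and hashed into a stable identifier, with nothing stored on the device. The attribute names and values below are illustrative, not an actual fingerprinting vector.

```python
import hashlib

def fingerprint(attributes):
    """Serialize device attributes in a canonical (sorted) order and hash
    them into a stable short ID; no file is placed on the user's device."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

browser_a = {"screen": "2560x1440", "fonts": "Arial;Helvetica", "tz": "UTC-5"}
browser_b = {"screen": "1920x1080", "fonts": "Arial;Helvetica", "tz": "UTC-5"}
print(fingerprint(browser_a) == fingerprint(browser_a))  # True: stable across visits
print(fingerprint(browser_a) == fingerprint(browser_b))  # False: one attribute differs
```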

Bulk collection — The mass interception or retention of communications data from entire populations or large segments of a population, rather than targeted collection directed at specific suspects. Bulk collection programs, such as those revealed by Edward Snowden in 2013, raise fundamental questions about whether mass surveillance is compatible with the presumption of innocence. (Chapter 9)

CCPA (California Consumer Privacy Act) — A California state law enacted in 2018 (substantially strengthened by the CPRA in 2020) that grants California residents rights including the right to know what personal information is collected about them, the right to delete personal information, and the right to opt out of the sale of personal information. The CCPA is the most significant state-level consumer privacy law in the United States, though enforcement has been uneven. (Chapter 31)

Chilling effect — The inhibition of legally protected behavior — speech, assembly, association, religious practice — caused by surveillance or the awareness of being watched. The chilling effect is a First Amendment doctrine but also a sociological phenomenon: people change their behavior when they believe they are observed, even when they have done nothing wrong. (Chapter 1)

COINTELPRO — The Federal Bureau of Investigation's covert program of surveillance, infiltration, and disruption targeting domestic political organizations, operating from 1956 to 1971. COINTELPRO targeted civil rights organizations, antiwar groups, and socialist organizations; its tactics included planting informants, sending anonymous threatening letters, and fabricating evidence. The program's exposure by the Citizens' Commission to Investigate the FBI in 1971 remains the definitive case study of how surveillance is weaponized against political dissent. (Chapter 6)

Commercial surveillance — The systematic collection, processing, and monetization of personal data by private corporations for advertising targeting, behavioral prediction, credit assessment, and related commercial purposes. Commercial surveillance has expanded dramatically with the rise of internet-connected devices, social media platforms, and data broker ecosystems. (Chapter 11)

Consent (informed) — A legal and ethical requirement that individuals must be meaningfully informed about data collection practices and must voluntarily agree to them, without coercion and with genuine understanding of what they are agreeing to. Most academic analysts argue that surveillance capitalism's consent mechanisms — lengthy terms-of-service agreements, "agree or don't use" ultimatums — fall far short of genuine informed consent. (Chapter 11)

Context collapse — The flattening of distinct social contexts in digital communication, such that information shared in one context (with friends) becomes visible in others (to employers or law enforcement). Defined by danah boyd, context collapse is a mechanism by which social media surveillance extracts information that speakers did not intend to make public. (Chapter 13)

Cookie (browser) — A small text file placed on a user's device by a website to store information about their visit, preferences, or identity. First-party cookies are set by the visited site; third-party cookies are set by other entities (typically advertisers) whose code is embedded in the page. Third-party cookies are the backbone of cross-site behavioral tracking, though browser makers and regulators have moved to restrict them. (Chapter 12)
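
The first-party/third-party distinction turns on whose domain sets the cookie relative to the page being visited. A minimal classifier, assuming a last-two-labels heuristic for the registrable domain (real browsers consult the Public Suffix List instead):

```python
def registrable(host):
    """Crude registrable-domain heuristic: keep the last two labels.
    Real browsers use the Public Suffix List for this."""
    return ".".join(host.split(".")[-2:])

def cookie_party(page_host, setter_host):
    """Classify a cookie by comparing the page's domain to the setter's."""
    same = registrable(page_host) == registrable(setter_host)
    return "first-party" if same else "third-party"

print(cookie_party("news.example.com", "example.com"))       # first-party
print(cookie_party("news.example.com", "pixel.adtech.net"))  # third-party
```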

COPPA (Children's Online Privacy Protection Act) — A US federal law enacted in 1998 that requires websites and online services directed at children under 13 to obtain verifiable parental consent before collecting personal information. COPPA's age-13 threshold is easily circumvented, and its enforcement has been criticized as inadequate relative to the scale of children's data collection. (Chapter 38)

Counter-surveillance — Practices, technologies, and strategies designed to detect, prevent, or disrupt surveillance. Counter-surveillance ranges from technical tools (Tor, Signal, ad blockers) to behavioral practices (wearing hats or sunglasses to defeat facial recognition) to activist interventions (documenting police surveillance at protests). (Chapter 33)

Credit scoring — The use of algorithms to generate numerical scores predicting the likelihood that an individual will repay debt, using inputs including payment history, account types, credit utilization, and inquiry history. Credit scores function as a surveillance infrastructure for financial institutions; what variables are included (and excluded) carries significant equity implications. (Chapter 29)
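
A toy linear scoring function illustrates the mechanics (weighted inputs shifted into a familiar range) without resembling any real scoring model; every weight and feature name below is invented.

```python
def credit_score(features, weights, floor=300, cap=850):
    """Toy linear model: weighted sum of inputs shifted into the familiar
    300-850 band. The weights are invented and resemble no real model."""
    raw = sum(weights[name] * value for name, value in features.items())
    return max(floor, min(cap, round(floor + raw)))

weights = {"on_time_ratio": 400, "utilization": -150, "history_years": 10}
applicant = {"on_time_ratio": 0.98, "utilization": 0.60, "history_years": 12}
print(credit_score(applicant, weights))  # 722
```

Which variables appear in `weights` (and which are left out) is exactly the equity question the entry raises.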

Data broker — A company that collects personal information from diverse sources — public records, loyalty programs, social media, transaction data — aggregates and analyzes it, and sells profiles to third parties, typically without direct relationships with the individuals profiled. Data brokers form a largely unregulated industry in the United States, one that underpins the advertising, insurance, and background-check sectors. (Chapter 11)

Data minimization — A privacy-by-design principle holding that systems should collect only the minimum data necessary for a specified purpose, retain it only for as long as necessary, and not repurpose it for other ends. Data minimization is a core principle of the GDPR and a counterweight to the "collect everything" logic of surveillance capitalism. (Chapter 39)
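
In code, data minimization often reduces to an allow-list: fields not declared necessary for a stated purpose are never stored. A minimal sketch, with an invented purpose specification:

```python
def minimize(record, allowed_fields):
    """Retain only the fields declared necessary for the stated purpose."""
    return {k: v for k, v in record.items() if k in allowed_fields}

SHIPPING_FIELDS = {"name", "street", "city", "postcode"}  # invented purpose spec

order = {
    "name": "A. Customer", "street": "1 Main St", "city": "Springfield",
    "postcode": "12345", "birth_date": "1990-01-01", "browsing_history": ["..."],
}
print(sorted(minimize(order, SHIPPING_FIELDS)))  # ['city', 'name', 'postcode', 'street']
```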

Dataveillance — Coined by Roger Clarke (1988), dataveillance refers to the systematic monitoring of individuals or populations through the analysis of their data trails, rather than through direct observation. Dataveillance includes credit monitoring, loyalty card tracking, internet browsing analysis, and any other practice that uses transactional or behavioral data records to surveil. (Chapter 1)

Deep fake — Synthetic audio or video content generated by machine learning models that depicts real individuals saying or doing things they did not say or do. Deep fakes have significant implications for surveillance: they undermine the evidentiary status of video footage, can be weaponized in harassment campaigns, and expose gaps in authentication infrastructure. (Chapter 40)

Differential privacy — A mathematical framework for releasing statistical information about a dataset while protecting individual privacy, achieved by adding calibrated random noise to query responses. Differential privacy, formalized by Cynthia Dwork (2006), allows aggregate analysis without exposing individual records; it is used by Apple, Google, and the US Census Bureau. (Chapter 39)
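
The core mechanism can be shown directly: a counting query has sensitivity 1, so adding Laplace noise with scale 1/epsilon yields an epsilon-differentially-private count. A standard-library-only sketch using inverse-CDF sampling:

```python
import math
import random

def laplace(scale):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon):
    """A counting query has sensitivity 1, so Laplace(1/epsilon) noise
    makes the released count epsilon-differentially private."""
    return true_count + laplace(1.0 / epsilon)

random.seed(0)  # seeded here only so the demonstration is reproducible
released = dp_count(true_count=1000, epsilon=0.5)
print(round(released, 1))  # near 1000; a smaller epsilon would add more noise
```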

Digital exhaust — The data generated incidentally as a byproduct of digital activities — location signals from phone towers, browsing history, purchase timestamps — which, while not deliberately shared, constitutes a detailed behavioral record. Digital exhaust forms a significant portion of the raw material collected by data brokers and advertising platforms. (Chapter 11)

Discriminatory surveillance — The disproportionate or intentional targeting of surveillance at particular communities based on race, ethnicity, religion, political affiliation, or other protected characteristics, often without individualized suspicion. Discriminatory surveillance is both a historical pattern (lantern laws, COINTELPRO) and a contemporary concern (predictive policing algorithms, mosque surveillance programs). (Chapter 36)

Disparate impact — A legal doctrine and analytical concept holding that a practice or policy can be discriminatory even if it was not designed with discriminatory intent, if its effect falls disproportionately on protected groups. Disparate impact analysis is central to evaluating algorithmic systems for racial, gender, or other bias. (Chapter 29)

Distributed surveillance — Surveillance conducted by multiple actors across a decentralized network rather than by a single central authority. Networked home cameras, neighborhood apps, and crowdsourced crime-reporting platforms distribute surveillance function across communities, creating a participatory surveillance infrastructure. (Chapter 16)

Drone surveillance — The use of unmanned aerial vehicles equipped with cameras, thermal sensors, or other instruments to monitor individuals, crowds, or geographic areas. Drone surveillance has been used by law enforcement for border patrol, protest monitoring, and crime investigation, often operating in legal gray zones regarding warrant requirements. (Chapter 9)

Electronic Communications Privacy Act (ECPA) — A US federal law enacted in 1986 that establishes standards for government access to electronic communications and stored data. ECPA was written before modern internet services existed and contains significant gaps and ambiguities that courts and Congress have struggled to resolve, including the controversial "180-day rule" for email. (Chapter 31)

Emotion recognition — See Affective computing.

Encryption — The process of encoding data so that it can only be read by parties with the appropriate decryption key. End-to-end encryption, used by Signal and similar applications, ensures that even the service provider cannot read message content. Encryption is a foundational privacy protection technology, and law enforcement debates about "backdoor" access represent a central tension in surveillance policy. (Chapter 33)
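
The symmetry of shared-key encryption can be illustrated with a one-time-pad XOR, in which applying the same key twice restores the plaintext. This is a conceptual sketch only; production systems use vetted ciphers such as AES-GCM through audited libraries.

```python
import secrets

def xor_bytes(data, key):
    """XOR each data byte with the matching key byte; applying the same
    key twice restores the original (the symmetric-key property)."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at dawn"
key = secrets.token_bytes(len(message))  # one-time pad: key as long as the message

ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)
print(recovered == message)  # True: only the key holder can invert the scrambling
```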

Environmental DNA (eDNA) — Genetic material shed by organisms into their environment — water, soil, air — that can be collected and analyzed to determine which species (including humans) are present in a location. Environmental DNA collection is an emerging form of passive biological surveillance with applications in ecology, law enforcement, and population monitoring. (Chapter 23)

Epidemiological surveillance — The systematic collection and analysis of health data to monitor the distribution and determinants of disease in populations. Epidemiological surveillance is public health's most essential tool, but it involves substantial collection of sensitive personal information, raising tensions between individual privacy and collective health. (Chapter 24)

FERPA (Family Educational Rights and Privacy Act) — A US federal law enacted in 1974 that protects the privacy of student education records and gives parents (and students over 18) rights to access, correct, and limit disclosure of those records. FERPA's protections have been tested by the expansion of educational technology platforms that collect detailed behavioral data. (Chapter 38)

FISA (Foreign Intelligence Surveillance Act) — A US federal law enacted in 1978 establishing a special court (the FISA Court) that authorizes surveillance for foreign intelligence purposes, operating under procedures substantially different from ordinary criminal court. FISA's broad authorities, and the secrecy surrounding FISA Court orders, were central to the Snowden revelations of 2013. (Chapter 9)

Forensic surveillance — Retrospective surveillance that uses stored data to reconstruct past behavior, movements, or communications, often in the context of criminal investigation. The availability of large volumes of historical digital data has dramatically expanded forensic surveillance capabilities; investigators can now reconstruct a suspect's movements, communications, and associations with high precision. (Chapter 18)

Function creep — The gradual expansion of a surveillance system's purpose beyond its original stated scope. Function creep is a near-universal feature of surveillance technologies: systems built for one purpose (tracking infectious disease contacts, monitoring highway traffic) are regularly repurposed for additional surveillance functions, often without new legal authorization or public debate. (Chapter 4)

Gait recognition — A biometric surveillance technique that identifies individuals based on the distinctive pattern of their walk, observable from video footage even when the subject's face is obscured. Gait recognition has been deployed in China and remains under active research in the United States and Europe. (Chapter 7)

GDPR (General Data Protection Regulation) — The European Union's comprehensive data protection law, in force since May 2018, which establishes rights for EU residents including the right of access, right to erasure, right to data portability, and right to object to automated decision-making. The GDPR's extraterritorial reach and substantial penalties have made it a global reference point for data protection regulation. (Chapter 31)

Geofence warrant — A type of search warrant that compels a technology company (typically Google) to provide law enforcement with records of all devices present within a defined geographic area during a specified time period. Geofence warrants are controversial because they are not targeted at specific suspects but rather sweep up data from everyone in an area. (Chapter 18)
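
The mechanics of the sweep can be sketched as a filter over a location database: every device seen inside a latitude/longitude bounding box during a time window is returned, whether or not it belongs to a suspect. All records below are invented.

```python
def geofence_hits(location_records, bbox, t_start, t_end):
    """Return device IDs seen inside a lat/lon bounding box during a time
    window, mimicking a geofence request's sweep of all devices present."""
    lat_min, lat_max, lon_min, lon_max = bbox
    return sorted({
        r["device"] for r in location_records
        if lat_min <= r["lat"] <= lat_max
        and lon_min <= r["lon"] <= lon_max
        and t_start <= r["t"] <= t_end
    })

records = [
    {"device": "A", "lat": 40.7130, "lon": -74.0060, "t": 1005},
    {"device": "B", "lat": 40.7128, "lon": -74.0059, "t": 2500},  # outside window
    {"device": "C", "lat": 41.0000, "lon": -74.0060, "t": 1010},  # outside box
    {"device": "D", "lat": 40.7129, "lon": -74.0061, "t": 1200},
]
print(geofence_hits(records, (40.712, 40.714, -74.007, -74.005), 1000, 2000))  # ['A', 'D']
```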

Hierarchical surveillance — Surveillance directed downward through organizational or social hierarchies — employers watching employees, governments watching citizens, parents watching children. Hierarchical surveillance is the dominant mode theorized by Foucault and Bentham; it is the panoptic gaze directed from positions of power toward those with less. (Chapter 1)

HIPAA (Health Insurance Portability and Accountability Act) — A US federal law enacted in 1996 that establishes privacy and security standards for protected health information held by "covered entities" (healthcare providers, insurers, and clearinghouses) and their business associates. HIPAA's applicability does not extend to most health apps, fitness trackers, or direct-to-consumer genetic testing services, creating significant gaps. (Chapter 19)

HUMINT (Human Intelligence) — Intelligence gathered through interpersonal contact, including informants, undercover agents, and human sources. HUMINT represents one pole of the surveillance spectrum, contrasted with SIGINT (signals intelligence); surveillance programs that rely on community informants — including COINTELPRO's use of domestic informants — are HUMINT operations. (Chapter 9)

Informed consent — See Consent (informed).

Internet of Things (IoT) — The expanding network of internet-connected physical objects embedded with sensors, software, and communication capabilities that collect and share data. IoT devices including smart speakers, connected appliances, fitness trackers, and industrial sensors generate continuous streams of behavioral data; security researchers have documented numerous vulnerabilities. (Chapter 15)

Lantern laws — Colonial and antebellum laws in New York City requiring enslaved Black people to carry lanterns when moving through the city after dark, making them perpetually visible and identifying them as subjects of suspicion. Simone Browne analyzes lantern laws as an early biometric surveillance technology applied along racial lines. (Chapter 36)

Lateral surveillance — Surveillance of peers by peers, often facilitated by digital platforms — neighbors watching neighbors, friends monitoring friends, romantic partners tracking each other. Lateral surveillance redistributes the surveillance gaze beyond institutional actors and implicates ordinary people as both watchers and watched. (Chapter 16)

Liquid surveillance — Zygmunt Bauman and David Lyon's concept of surveillance in late modernity as mobile, flexible, and permeating all social domains rather than residing in fixed institutional structures. Liquid surveillance flows between sectors (commercial, governmental, social) and adapts to changing technological conditions. (Chapter 4)

Metadata — Data about data — information describing the attributes of a communication or file rather than its content. In telecommunications, metadata includes who called whom, when, for how long, and from where. Security agencies and legal actors have long argued that collecting metadata is less intrusive than content; researchers have demonstrated that metadata enables detailed behavioral inference. (Chapter 9)

Mosaic theory — The principle that individually innocuous pieces of information, when assembled together, can constitute a comprehensive and invasive surveillance picture. Mosaic theory has been cited by courts and civil liberties advocates to argue that aggregated metadata warrants the same Fourth Amendment protections as the content of communications. (Chapter 9)

Network analysis — A methodological approach that maps and measures relationships and flows between entities (people, organizations, devices) in a network. Law enforcement agencies use network analysis to identify "associates" of suspects; marketing companies use it to identify influential social nodes; social scientists use it to study information diffusion. (Chapter 13)
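
The simplest network-analysis measure, degree centrality, just counts each node's connections; even this crude statistic surfaces hubs in a communication graph. A dependency-free sketch with invented names:

```python
from collections import defaultdict

def degree_centrality(edges):
    """Count each node's connections: the simplest 'who is central'
    measure used to spot well-connected individuals in a network."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return dict(degree)

edges = [("ana", "ben"), ("ana", "cho"), ("ana", "dev"), ("ben", "cho")]
centrality = degree_centrality(edges)
print(max(centrality, key=centrality.get))  # 'ana': the network hub
```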

Obfuscation — A privacy protection strategy involving the deliberate injection of misleading, irrelevant, or false information into surveillance data streams to make individual records harder to isolate or interpret. Obfuscation strategies include browser plugins that generate fake browsing traffic, burner phone use, and wearing camouflage makeup designed to confuse facial recognition. (Chapter 33)

Opt-in / Opt-out — The two paradigms for acquiring data subject consent: opt-in requires affirmative consent before data collection begins; opt-out places the burden on the individual to proactively refuse collection, with collection as the default. Most commercial surveillance systems use opt-out or "implied consent" frameworks; privacy advocates argue that opt-in is the appropriate standard for sensitive data. (Chapter 11)

Panopticism — Michel Foucault's theoretical analysis, derived from Jeremy Bentham's panopticon design, of a modern power mechanism in which the possibility of constant observation induces subjects to regulate their own behavior. Panopticism describes not just prisons but schools, hospitals, factories, and any institutional arrangement in which visibility itself functions as a mechanism of control. (Chapter 2)

Panopticon — Jeremy Bentham's late-18th-century architectural design for a prison in which a central observation tower is surrounded by a ring of cells, each visible from the tower. The panopticon's key feature is that inmates cannot know whether they are being observed at any given moment, inducing them to behave as if they always are. Foucault used the panopticon as a metaphor for modern disciplinary power. (Chapter 2)

Passive acoustic monitoring — The use of sensors and recording equipment to detect and analyze sounds in an environment — typically for wildlife monitoring — without active signal transmission. Passive acoustic monitoring applied to birdsong is a case study in how environmental surveillance technologies generate data about non-target subjects; the same infrastructure can record human activity. (Chapter 22)

People analytics — The application of data analysis to human resources functions, including hiring, performance management, retention, and workforce planning. People analytics uses employee data — productivity metrics, communication patterns, survey responses — to generate insights and recommendations, raising questions about employee privacy and algorithmic bias. (Chapter 29)

Performative surveillance — Surveillance that is not primarily about gathering information but about communicating power, demonstrating capacity, and inducing behavioral compliance through the knowledge that surveillance occurs. Security theater at airports and visible CCTV cameras in public spaces often function performatively. (Chapter 2)

Platform surveillance — The collection of user behavioral data by social media platforms, search engines, and other digital intermediaries through the operation of their services. Platform surveillance is distinguished by its scale, its integration with social life, and the degree to which users are unaware of or unable to contest data collection. (Chapter 13)

Pre-crime logic — The extension of surveillance and law enforcement to target individuals not for what they have done but for what statistical models predict they might do. Pre-crime logic is embedded in predictive policing systems, terrorist watch lists, and some forms of social benefit fraud detection. (Chapter 8)

Predictive policing — The use of algorithms to forecast where crimes are likely to occur or who is likely to commit crimes, used to allocate police resources. Predictive policing systems trained on historical crime data have been shown to encode and amplify existing racial disparities in law enforcement. (Chapter 8)

Privacy by design — An approach to systems engineering and policy development that builds privacy protections into the design of technologies, systems, and institutional practices from the outset, rather than treating privacy as an add-on compliance requirement. Privacy by design was developed by Ann Cavoukian and is now a GDPR requirement. (Chapter 39)

Privacy paradox — The observed tendency for individuals to express strong privacy preferences in surveys while behaving in ways that reveal little actual concern for privacy in practice — accepting data-intensive apps, posting personal information publicly, or trading data for convenience. Researchers debate whether the privacy paradox reflects genuine ambivalence, rational calculation, or structural constraints on meaningful privacy choice. (Chapter 20)

PRISM — A US National Security Agency program, disclosed by Edward Snowden in 2013, that collected internet communications — including email, chat, video, photos, and stored data — from the servers of major American technology companies including Google, Apple, Facebook, and Microsoft. PRISM operated under FISA Section 702 authority. (Chapter 9)

Profiling — The practice of constructing an analytical model of an individual or category of individuals from data, which is then used to predict behavior, assess risk, or make decisions. Profiling ranges from marketing segmentation to terrorist watch-listing; in all cases, it involves treating predictions about a category as applicable to an individual. (Chapter 12)

Quantified self — A movement and practice involving the continuous self-monitoring of bodily functions, activities, and behaviors using wearable sensors, apps, and other tracking technologies, with the goal of self-optimization. The quantified self represents a form of voluntary self-surveillance; the data generated is typically also collected by device manufacturers. (Chapter 20)

Real-time bidding (RTB) — An automated advertising auction system in which ad impressions are sold and purchased in milliseconds as a user loads a webpage, with bids informed by detailed data profiles of the user. Real-time bidding involves the simultaneous transmission of personal data to hundreds or thousands of advertising entities; privacy researchers have argued that RTB structurally violates GDPR. (Chapter 12)
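
The clearing rule historically associated with RTB is a second-price auction: the highest bidder wins but pays the second-highest bid (many exchanges have since moved to first-price rules). A minimal sketch with invented bidder names:

```python
def run_auction(bids, floor=0.0):
    """Second-price auction: the highest eligible bidder wins but pays
    the second-highest bid, or the floor price if no rival bid exists."""
    eligible = sorted(
        (b for b in bids if b["bid"] >= floor),
        key=lambda b: b["bid"], reverse=True,
    )
    if not eligible:
        return None  # impression goes unsold
    winner = eligible[0]
    price = eligible[1]["bid"] if len(eligible) > 1 else floor
    return winner["bidder"], price

# Each bid would be computed in milliseconds from that bidder's user profile.
bids = [
    {"bidder": "dsp_a", "bid": 2.40},
    {"bidder": "dsp_b", "bid": 1.90},
    {"bidder": "dsp_c", "bid": 0.30},
]
print(run_auction(bids, floor=0.50))  # ('dsp_a', 1.9)
```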

Re-identification — The process of linking anonymized data back to identifiable individuals, typically by combining the dataset with auxiliary information. Re-identification research has consistently demonstrated the fragility of anonymization; Netflix viewing records, hospital data, and location traces have all been re-identified in published studies. (Chapter 32)
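
A linkage attack of the kind these studies describe can be sketched as a join on quasi-identifiers: wherever the auxiliary dataset matches exactly one person, the "anonymous" record regains a name. The datasets below are invented, echoing Latanya Sweeney's ZIP-code and birth-date demonstrations.

```python
def link(anonymized, auxiliary, quasi_identifiers):
    """Join an 'anonymized' dataset to a public auxiliary dataset on
    quasi-identifiers; unique matches re-attach names to records."""
    index = {}
    for person in auxiliary:
        key = tuple(person[q] for q in quasi_identifiers)
        index.setdefault(key, []).append(person["name"])
    matches = {}
    for record in anonymized:
        key = tuple(record[q] for q in quasi_identifiers)
        names = index.get(key, [])
        if len(names) == 1:  # a unique match is a re-identification
            matches[names[0]] = record["diagnosis"]
    return matches

anonymized = [{"zip": "02139", "birth": "1961-07-02", "diagnosis": "diabetes"}]
voter_roll = [{"name": "J. Doe", "zip": "02139", "birth": "1961-07-02"}]
print(link(anonymized, voter_roll, ["zip", "birth"]))  # {'J. Doe': 'diabetes'}
```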

Rendition cycle — Shoshana Zuboff's term for the process by which surveillance capitalism converts human experience into behavioral data, translating the texture of lived life into quantifiable signals for predictive modeling. The rendition cycle describes how the digital representation of experience feeds back into and shapes that experience. (Chapter 11)

Responsibilization — A neoliberal governance strategy that shifts responsibility for managing social risks from institutions to individuals, often facilitated by surveillance technologies that monitor and evaluate individual compliance. Wellness programs that monitor employee health metrics, insurance pricing based on driving behavior, and self-reporting requirements all exemplify responsibilization. (Chapter 20)

Risk scoring — See Actuarial surveillance.

Satellite imagery surveillance — The use of satellite-mounted cameras and sensors to observe the earth's surface at various resolutions, used for military intelligence, environmental monitoring, urban planning, and commercial applications. The democratization of high-resolution satellite imagery (through commercial providers like Planet Labs) has extended surveillance capabilities to non-state actors. (Chapter 21)

Score systems (social credit) — See Social credit system.

Security theater — Surveillance or security measures that are visible and reassuring but that provide little actual security benefit. The term, coined by Bruce Schneier, describes TSA security screening, decorative CCTV cameras, and other measures whose primary function is to perform security rather than achieve it. (Chapter 8)

Shadow profile — A profile constructed and maintained by a platform about an individual who has not registered for or consented to that platform, typically built from contact lists uploaded by the individual's connections, tracking pixels, and data broker information. Facebook's shadow profiles were documented in legal proceedings following the Cambridge Analytica scandal. (Chapter 13)

SIGINT (Signals Intelligence) — Intelligence gathered by intercepting electronic signals, including communications (COMINT) and non-communications signals. SIGINT agencies — including the NSA in the United States and GCHQ in the United Kingdom — conduct bulk collection of communications data under authorities that have generated significant legal and political controversy. (Chapter 9)

Smart city — An urban environment in which networked sensors, data analytics, and automated systems are used to manage infrastructure, services, and governance. Smart city infrastructure generates pervasive ambient surveillance data; critics argue that smart city projects are often surveillance infrastructure projects marketed as efficiency improvements. (Chapter 25)

Social credit system — China's government-operated system of behavioral scoring and sanction that aggregates data from financial, legal, social, and behavioral records to generate scores that affect individuals' access to services, travel, employment, and social standing. The social credit system is the most comprehensive governmental behavior-management surveillance system currently operating at scale. (Chapter 10)

Social graph — The map of relationships between individuals on a social platform or in a social network, representing who knows whom and how. Social graph data is highly revealing even without communication content; analysis of social graphs enables inference about political beliefs, sexual orientation, health conditions, and other sensitive attributes. (Chapter 13)

Social sorting — David Lyon's concept of surveillance as a mechanism for sorting populations into categories that receive different treatments, opportunities, or restrictions. Social sorting transforms surveillance data into differential outcomes; it is the mechanism by which surveillance perpetuates and intensifies social stratification. (Chapter 4)

Sousveillance — Steve Mann's term for surveillance directed upward — recording those in positions of power by those subject to their authority. Citizen filming of police violence is a form of sousveillance; the Rodney King video was a landmark sousveillance moment. Sousveillance is one tool of accountability in asymmetric surveillance environments. (Chapter 33)

Stalkerware — Software secretly installed on a device, typically by a domestic partner or abuser, to covertly monitor the device user's communications, location, and activities. Stalkerware is a domestic violence technology and represents the darkest application of consumer-grade surveillance tools. (Chapter 19)

Structural surveillance — Surveillance that operates not through conscious intent to watch but through the design of environments, architectures, and systems that make certain populations perpetually legible to power. Structural surveillance includes the racial geography of CCTV deployment, the surveillance embedded in welfare administration, and the differential smartphone tracking of low-income versus affluent users. (Chapter 36)

Surveillance capitalism — Shoshana Zuboff's framework describing an economic logic in which human behavioral data — extracted without meaningful consent — is the primary raw material for generating predictions about human behavior, which are sold in behavioral futures markets. Surveillance capitalism names a specific mutation of capitalism, not merely the use of digital technology for commercial purposes. (Chapter 11)

Surveillance creep — See Function creep.

Surveillance studies — An interdisciplinary academic field examining the social, political, ethical, and technical dimensions of monitoring practices, with roots in sociology, political science, geography, law, computer science, and philosophy. Surveillance studies emerged as a distinct field in the 1990s through the work of David Lyon, Gary Marx, and others. (Chapter 1)

Synopticism — Thomas Mathiesen's concept, proposed as a complement to panopticism, describing the condition in which the many watch the few — as in mass media, celebrity culture, and reality television. Synopticism captures surveillance dynamics not addressed by Foucault's primarily vertical, hierarchical model. (Chapter 3)

Targeting — In surveillance contexts, the selection of specific individuals or communications for intensive collection and analysis. Targeted surveillance is distinguished from bulk collection; civil libertarians argue that surveillance programs should be limited to targeted collection predicated on individualized suspicion. (Chapter 9)

Third-party doctrine — A US legal doctrine, derived from United States v. Miller (1976) and Smith v. Maryland (1979), holding that information voluntarily shared with third parties (banks, telephone companies, service providers) loses Fourth Amendment protection. The third-party doctrine has been applied to dramatically limit privacy protections for digital communications; the Supreme Court's Carpenter v. United States decision (2018) marked a partial retreat from this doctrine. (Chapter 9)

Tracking pixel — An invisible one-pixel image embedded in emails or webpages that, when loaded, sends information about the user's IP address, device, email client, and behavior back to the tracking entity. Tracking pixels are widely used in email marketing and web analytics; they can reveal when an email is opened, whether it is forwarded, and on which device it is read. (Chapter 12)
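
The mechanism is simple enough to sketch. The hostname, recipient ID, and campaign name below are hypothetical; the point is that the image URL is unique per recipient, so the tracker's ordinary web server access log becomes an open-tracking database.

```python
import base64
from urllib.parse import urlencode

# The standard transparent 1x1 GIF, served in response to pixel requests.
PIXEL_GIF = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

def pixel_url(tracker_host, recipient_id, campaign):
    """Build a per-recipient pixel URL. When the mail client loads the image,
    the tracker's access log records the recipient ID, campaign, request
    time, IP address, and User-Agent string -- no script required."""
    query = urlencode({"r": recipient_id, "c": campaign})
    return f"https://{tracker_host}/open.gif?{query}"

# Hypothetical example: the HTML fragment embedded in a marketing email.
url = pixel_url("track.example.com", "user-4821", "spring-sale")
html = f'<img src="{url}" width="1" height="1" alt="">'
print(html)
```

Because the identifier is baked into the URL rather than into a cookie, pixel tracking works even in email clients that block cookies; blocking remote image loading is the main countermeasure.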

Transparency — In data governance, the principle that individuals should be able to know what information is collected about them, how it is processed, by whom, and for what purposes. Transparency is a necessary but insufficient condition for meaningful privacy protection; disclosing surveillance practices in opaque legal language or fifty-page terms of service does not constitute meaningful transparency. (Chapter 31)

Ubiquitous computing — A paradigm in which computing devices are embedded throughout environments and everyday objects, making computation continuous and ambient rather than confined to discrete devices. Ubiquitous computing enables ubiquitous surveillance; the smart home, smart city, and IoT all instantiate this paradigm. (Chapter 15)

USA PATRIOT Act — A sweeping post-September 11 US federal law (2001) that dramatically expanded government surveillance authorities, lowered standards for FISA warrants, authorized roving wiretaps, enabled access to library and bookstore records, and facilitated information sharing between intelligence and law enforcement agencies. Many provisions were controversial and were subject to ongoing congressional reauthorization debates. (Chapter 9)

Veillance — A portmanteau of "surveillance" and "sousveillance" coined by Steve Mann to describe the general field of watching and being watched, inclusive of all directions and modalities. (Chapter 1)

Video Privacy Protection Act (VPPA) — A US federal law enacted in 1988 prohibiting video rental services from disclosing customers' rental records without consent. Passed in response to the disclosure of Supreme Court nominee Robert Bork's video rental history, the VPPA has been invoked in litigation concerning streaming service data sharing. (Chapter 31)

Visibility asymmetry — The condition in which surveillance arrangements make some actors highly visible to others while rendering those doing the watching relatively invisible. Visibility asymmetry is a structural feature of hierarchical surveillance and is central to the power effects Foucault analyzes: those watched cannot see their watchers, cannot know when observation is occurring, and cannot contest or respond to the surveillance gaze. (Chapter 1)

VPN (Virtual Private Network) — A technology that encrypts internet traffic and routes it through a server in another location, obscuring the user's IP address and browsing activity from their ISP and local network. VPNs protect against certain forms of surveillance (network monitoring) but do not provide anonymity from the VPN provider itself or from websites that use other tracking techniques. (Chapter 33)

Warrant (general) — See Geofence warrant and Third-party doctrine.

Warrant canary — A periodic statement by a service provider that it has not received secret government orders compelling data disclosure, which functions by its absence: if the canary disappears, users can infer that a secret order has been received. Warrant canaries are a legal workaround to gag orders accompanying national security letters. (Chapter 33)
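
The signal-by-absence logic can be expressed in a few lines. This is an illustrative sketch, not a real monitoring tool: the dates, the 35-day publication cycle, and the status strings are all assumptions for the example.

```python
from datetime import date

def canary_status(statement_date, today, max_age_days=35):
    """Evaluate a warrant canary. The canary 'trips' by absence:
    statement_date is None when the statement has been removed, and a
    statement older than the expected publication cycle is treated as
    silently withdrawn."""
    if statement_date is None:
        return "tripped: canary removed"
    if (today - statement_date).days > max_age_days:
        return "tripped: canary stale"
    return "alive"

today = date(2024, 6, 1)
print(canary_status(date(2024, 5, 20), today))  # recently republished: alive
print(canary_status(date(2024, 3, 1), today))   # update long overdue: tripped
print(canary_status(None, today))               # statement gone: tripped
```

The design point is that the provider never affirmatively announces an order — which a gag order would forbid — it simply stops repeating a true statement, and watchers draw the inference.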

Watch list — A database of individuals flagged for enhanced scrutiny, restriction, or interdiction, maintained by governments or private entities. Watch lists — including the US Terrorist Screening Database, from which the "no-fly list" is drawn — often lack meaningful appeal processes, and inclusion criteria are frequently opaque. (Chapter 9)

Wearable technology — Consumer electronics worn on the body that continuously monitor physiological signals, movement, location, and other personal data, including fitness trackers, smartwatches, and health monitors. Wearable technology blurs the boundary between voluntary self-quantification and involuntary surveillance, particularly when the data is accessible to employers, insurers, or device manufacturers. (Chapter 20)

Whistleblowing — The disclosure of information about illegal, dangerous, or unethical practices by an insider to an external authority or the public, often at significant personal risk. Whistleblowers — including Edward Snowden, Chelsea Manning, and Frances Haugen — have been among the most important sources of information about surveillance programs and their social effects. (Chapter 30)

Zero-day vulnerability — A software flaw unknown to the software developer that can be exploited by attackers before a patch is issued. Intelligence agencies have been revealed to stockpile zero-day vulnerabilities for offensive surveillance capabilities; this practice is controversial because stockpiled vulnerabilities also expose civilian users to exploitation. (Chapter 9)

Zero-knowledge proof — A cryptographic method by which one party can prove to another that they know a value without conveying any information about the value itself. Zero-knowledge proofs enable privacy-preserving authentication and verification systems and are increasingly significant in privacy-by-design engineering. (Chapter 39)
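
The "prove without revealing" property can be illustrated with a toy Schnorr-style proof of knowledge of a discrete logarithm, made non-interactive with the Fiat-Shamir heuristic. The parameters below are illustrative assumptions — far too small and simple for real security — but the verification equation is the genuine one.

```python
import hashlib
import secrets

# Toy parameters (illustration only, not secure choices).
p = 2**127 - 1          # a Mersenne prime; we work in Z_p^*
g = 3                   # assumed generator for this sketch

def _challenge(y, t):
    """Fiat-Shamir: derive the challenge by hashing the public values."""
    data = f"{g}:{y}:{t}:{p}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % (p - 1)

def prove(x):
    """Prove knowledge of x satisfying y = g^x mod p, revealing nothing about x."""
    y = pow(g, x, p)
    r = secrets.randbelow(p - 1)
    t = pow(g, r, p)             # commitment to a random nonce
    c = _challenge(y, t)         # challenge (hash stands in for the verifier)
    s = (r + c * x) % (p - 1)    # response blends the secret with the nonce
    return y, t, s

def verify(y, t, s):
    """Check g^s == t * y^c mod p, which holds iff the prover knew x."""
    c = _challenge(y, t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

secret_x = secrets.randbelow(p - 1)
y, t, s = prove(secret_x)
print(verify(y, t, s))  # → True, yet (y, t, s) leaks nothing usable about x
```

The response s is the secret x masked by the random nonce r, so the verifier learns that the equation balances without learning x itself — the core trick behind privacy-preserving authentication.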


Terms in bold throughout the main text are defined in this glossary. Where terms have multiple technical meanings, the definition above reflects usage specific to surveillance studies. For the etymology and conceptual history of key terms, consult the primary sources listed in the Bibliography (Appendix).