Key Takeaways — Chapter 36: Racial Surveillance and the Discriminatory Gaze
Core Argument
Surveillance systems do not watch everyone equally. The distribution of surveillance burdens follows racial lines with remarkable historical consistency. This is not accidental or aberrational; it reflects the structural logic of societies organized around racial hierarchy, in which monitoring technologies have been consistently deployed to manage, control, and suppress populations racially marked as threatening, foreign, or requiring verification.
Historical Continuity
- New York City's 1713 Lantern Laws required enslaved and free Black people to carry lanterns when unaccompanied by white persons after dark — a visibility-control technology that made certain bodies legible to authority and subject to detention if they could not produce proper credentials.
- The slave pass system required enslaved people to carry written authorization to move through public space — an identification and travel permit system that is structurally continuous with modern ID requirements and biometric verification.
- Simone Browne's Dark Matters (2015) argues that modern surveillance is not merely analogous to historical racial control technologies but is continuous with them: the logic of making Black bodies verifiable and detainable persists across technological generations.
- The history of biometrics — including fingerprinting and anthropometric measurement — is inseparable from the history of scientific racism; the measurement of bodies for identification was developed by eugenicists and racial scientists.
Contemporary Expressions
- Stop and frisk as practiced by the NYPD (2003–2013) was mass surveillance of Black and Latino men: 685,724 stops at peak, over 88% Black or Latino, over 88% completely innocent. The UF-250 database constituted a surveillance registry of the innocent.
- Predictive policing systems like PredPol are trained on arrest data that reflects historical over-policing of communities of color, creating a feedback loop: over-policing generates arrest data → data trains algorithm to predict more policing in same areas → more policing generates more arrest data.
- ShotSpotter acoustic surveillance is deployed almost exclusively in majority-Black and Latino neighborhoods; a Chicago OIG audit found 89% of alerts led to no evidence of a gun crime.
- The Ring Neighbors platform distributes and amplifies racial bias in threat assessment: human reviewers operating through frameworks shaped by racialized crime coverage flag Black and Brown people as suspicious at disproportionate rates.
- Commercial facial recognition systems have documented racial bias: Gender Shades found error rates for darker-skinned women up to 34 percentage points higher than for lighter-skinned men. NIST confirmed disproportionate false-positive rates for Black, Indigenous, and East Asian faces. Wrongful arrests of Robert Williams, Michael Oliver, and Nijeer Parks are the human consequences.
- Immigration enforcement functions as comprehensive surveillance of Latino communities: E-Verify, CBP checkpoints, Secure Communities integration, and biometric collection at entry create a surveillance infrastructure that chills civic participation regardless of legal status.
- The NYPD Demographics Unit surveilled Muslim communities for over a decade without producing a single criminal lead — a case study in surveillance targeting an entire religious community based on identity rather than behavior.
- Surveillance of Indigenous land defenders at Standing Rock and elsewhere reflects the centuries-long pattern of monitoring Indigenous political resistance; private firms like TigerSwan apply counterterrorism frameworks to Indigenous protest, coordinating with law enforcement in ways that blur public and private accountability.
Conceptual Frameworks
- Racial formation (Omi and Winant): Race is socially constructed and continually remade through political, economic, and cultural processes. Surveillance participates in racial formation by producing, marking, and enforcing racial categories.
- Racializing surveillance (Browne): Surveillance processes that do not merely respond to existing racial identities but actively construct and enforce them.
- Structural racism vs. individual racism: The racial disparities in surveillance systems are produced by institutional structures and feedback dynamics rather than by the prejudiced intentions of individual actors — which is precisely why they are difficult to reform through individual-level interventions.
- Intersectionality (Crenshaw): Surveillance burdens fall not on race alone but at the intersection of race, class, gender, immigration status, religion, and other social categories. Analysis that disaggregates these dimensions misses the specific experiences at their intersections.
- Feedback loop: The self-reinforcing dynamic by which over-policing generates data that trains algorithms to predict more policing in the same locations, extending and encoding historical discrimination into technical form.
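The feedback-loop dynamic can be made concrete with a minimal expected-value sketch. All names and numbers here are illustrative, not from the chapter: two neighborhoods with identical underlying offense rates differ only in their initial patrol allocation, arrests scale with patrol presence, and the "predictive" step reallocates patrols in proportion to cumulative arrests. The disparity never corrects, because the training data records where police were sent, not where crime occurred.

```python
def simulate_feedback_loop(rounds=20, total_patrols=100):
    """Toy expected-value model of the predictive-policing feedback loop.

    Neighborhoods A and B have identical true offense rates; the only
    initial difference is patrol allocation. Each round, expected arrests
    accrue in proportion to patrol presence, and patrols are then
    reallocated in proportion to cumulative arrest counts.
    """
    true_offense_rate = 0.05              # identical in both neighborhoods
    patrols = {"A": 60.0, "B": 40.0}      # initial over-policing of A
    arrests = {"A": 0.0, "B": 0.0}

    for _ in range(rounds):
        for hood in patrols:
            # Expected arrests scale with patrols, NOT with the
            # (equal) underlying offense rate.
            arrests[hood] += patrols[hood] * true_offense_rate
        # "Predictive" step: send patrols where past arrests occurred.
        total = arrests["A"] + arrests["B"]
        patrols["A"] = total_patrols * arrests["A"] / total
        patrols["B"] = total_patrols - patrols["A"]

    return patrols, arrests
```

Running the sketch shows the initial 60/40 split reproduced indefinitely, and the arrest record skewed 3:2 against neighborhood A — a disparity manufactured entirely by deployment, since both neighborhoods offend at the same rate. A stochastic version of the same model drifts rather than holding steady, but the core point is identical: the algorithm has no access to crime, only to policing.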
The Five Themes in Chapter 36
- Visibility asymmetry: Visibility is racially distributed — certain populations are made more visible to surveillance systems than others, systematically and by structural design.
- Consent as fiction: The racialized surveillance of Black, Brown, Muslim, and Indigenous communities is categorical — it targets group membership rather than individual behavior, making individual consent not merely absent but structurally impossible.
- Normalization: Those who do not bear the surveillance burden often cannot perceive it; the racial surveillance of others is normalized as "public safety" or "reasonable precaution" from the perspective of the unwatched.
- Structural vs. individual: The racial bias in surveillance systems is produced by structures (historical data, feedback loops, deployment decisions) rather than by individual prejudice — which is why fixing individual bias does not resolve the structural problem.
- Historical continuity: The lantern laws, the slave pass, the Bertillon system, COINTELPRO, stop and frisk, PredPol, and facial recognition are not analogous; they are continuous — each extends the logic of making racially marked bodies verifiable and controllable.
What Yara's Experience Teaches
The chapter's character vignette — Yara describing her experience of surveillance uncertainty — illustrates a dimension that statistical analysis cannot fully capture: the subjective experience of racialized surveillance. The surveillance operates not only through actual watching but through the uncertainty of whether one is being watched, which leads people to modify their own behavior, restrict their expression, and maintain a permanent low-level awareness of potential monitoring that shapes how they move through the world.
Looking Forward
- Chapter 39 addresses design-level responses to discriminatory surveillance: privacy by design, algorithmic auditing, community oversight, and policy frameworks for restricting or banning the most harmful applications.
- Chapter 40 incorporates the racial analysis of this chapter into the book's full synthesis: surveillance as built environment that was not designed for everyone equally, and the question of what it means to live in — and resist — that environment.