Chapter 7 Key Takeaways: Border Control and Biometric Databases


Core Concept: The Border as Surveillance Chokepoint

The border concentrates surveillance because it combines: (1) reduced constitutional protections (border search exception), (2) absolute visibility asymmetry (traveler knows nothing about database queries), and (3) high-stakes consequences (denial of entry, detention, family separation). This concentration makes the border the clearest expression of the state's surveillance power.


Historical Arc of Border Documentation

| Era | Primary Technology | Key Development |
|---|---|---|
| Pre-WWI | Letters of introduction, informal documents | Movement relatively free for those with resources |
| 1882–1920s | Targeted biometric collection (fingerprinting of Chinese immigrants) | Race-specific biometric requirements established |
| WWI–1920s | Modern passport emerges | Emergency wartime measures become permanent |
| Post-9/11 | Biometric databases (US-VISIT, IDENT) | Universal biometric collection from non-citizen visitors |
| 2010s–present | AI-assisted risk scoring, facial recognition | Predictive assessment precedes arrival |

Key pattern: Emergency measures become permanent; targeted measures become universal; documentary systems become biometric systems.


Major U.S. Biometric Systems

| System | Operator | Function | Scale |
|---|---|---|---|
| IDENT | DHS | Biometric records for immigration/border interactions | 220+ million records |
| NGI | FBI | Criminal justice, immigration, law enforcement biometrics including facial recognition | Multiple biometric modalities |
| E-Verify | DHS | Employment eligibility verification | 900,000+ employers |
| ATS | CBP | Pre-arrival risk scoring for all travelers | All international travelers to U.S. |

EU Biometric Border Systems

| System | Function | Function Creep |
|---|---|---|
| Eurodac | Fingerprints of asylum seekers; enforces Dublin Regulation | 2013: expanded to law enforcement access |
| SIS II | Alert database for wanted persons, border exclusions | Includes biometric data; shared across 30+ countries |

Biometric Modalities: Key Trade-offs

  • Fingerprints: Established, reliable, some degradation with age/injury
  • Iris scans: Highly accurate, stable across lifetime
  • Facial recognition: Non-contact but significant accuracy disparities by race/gender (NIST 2019: 10–100x higher false positive rates for Black faces)
  • DNA: Most specific but invasive, expensive; proposed at U.S. borders

Critical limitation: All biometrics have false positive and false negative rates. Error rates vary across demographic groups. The "objectivity" of biometrics is a technology-facilitated illusion.
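The trade-off between the two error types can be made concrete with a small sketch. All scores and thresholds below are invented for illustration; they are not drawn from any real biometric system:

```python
# Hypothetical sketch of the false positive / false negative trade-off:
# tightening a biometric match threshold trades one error for the other.
# Comparison scores and threshold values are invented for illustration.

genuine_scores = [0.91, 0.85, 0.78, 0.72, 0.66]   # same-person comparisons
impostor_scores = [0.40, 0.52, 0.61, 0.69, 0.74]  # different-person comparisons

def error_rates(threshold: float) -> tuple[float, float]:
    """Return (false_negative_rate, false_positive_rate) at a threshold."""
    fn = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    fp = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return fn, fp

print(error_rates(0.60))  # lenient: no false rejections, many false matches
print(error_rates(0.80))  # strict: many false rejections, no false matches
```

No threshold eliminates both errors at once; a system tuned to catch more impostors necessarily rejects more genuine travelers, and vice versa.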


The Social Sorting Logic of Border Surveillance

Border surveillance distributes scrutiny along lines that map onto existing social hierarchies:

  • Trusted travelers (Global Entry, PreCheck): Fee-paying, background-checked travelers receive expedited, reduced-scrutiny processing
  • Standard documented travelers (passport holders): Routine biometric/documentary verification
  • Travelers from high-scrutiny countries: Nationality-based additional screening; profiling based on religion, name, appearance
  • Undocumented people: Invisible to routine documentary systems; hypervisible when detected by enforcement

The border does not treat all people as equal security risks — it sorts people into tiers and applies scrutiny accordingly.


The Predictive Turn: Risk Scoring

The Automated Targeting System (ATS) generates pre-arrival risk scores for all international travelers to the United States. Key concerns:

  • Algorithm criteria are classified — travelers cannot know why they are scored as they are
  • No effective mechanism to contest erroneous scores
  • System draws on commercial data broker databases as well as government records
  • Past discriminatory enforcement patterns are encoded in training data — algorithm can replicate discrimination without discriminatory intent
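The feedback-loop concern in the last bullet can be sketched in a few lines. Everything here is hypothetical (ATS's actual criteria are classified): the data, the proxy feature, and the scoring rule are invented to show the mechanism, not to describe the real system:

```python
# Hypothetical sketch: a risk score trained on biased historical
# enforcement data reproduces the bias, even though the score never
# uses a protected attribute directly.
from collections import Counter

# Assumed historical records: (itinerary_feature, was_flagged).
# Suppose past officers disproportionately flagged travelers whose
# itineraries included region A, regardless of actual risk.
history = [
    ("region_A_itinerary", True), ("region_A_itinerary", True),
    ("region_A_itinerary", False),
    ("region_B_itinerary", False), ("region_B_itinerary", False),
    ("region_B_itinerary", True),
]

flagged = Counter(feat for feat, hit in history if hit)
totals = Counter(feat for feat, _ in history)

def risk_score(feature: str) -> float:
    """Score = historical flag rate for this feature value."""
    return flagged[feature] / totals[feature]

print(risk_score("region_A_itinerary"))  # higher: inherits past over-flagging
print(risk_score("region_B_itinerary"))  # lower, for identical true risk
```

The score "learns" nothing but the past pattern of enforcement, so the discrimination persists with no discriminatory rule anywhere in the code.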

The Hypervisibility Paradox

Undocumented people experience:

  • Invisibility in documentary tracking systems (no legal entry record, no visa database entry)
  • Hypervisibility in enforcement contexts (when detected, subject to maximum coercive capacity with minimal legal protections)

This paradox produces strategic avoidance behaviors (not calling police when victimized, not seeking medical care) that harm individuals and communities.


E-Verify and Function Creep

E-Verify extends border surveillance logic into the employment domain:

  • Employers become de facto immigration enforcement agents
  • The "border" effectively moves into the workplace
  • Legal residents and citizens are also affected when database errors occur

This is function creep operating at policy scale: the border's verification logic migrates to a domain (employment) where it was not originally applied.


Recurring Themes in Chapter 7

| Theme | How It Appears |
|---|---|
| Visibility asymmetry | Traveler knows nothing about database queries, risk scores, or watchlist status |
| Consent as fiction | Biometric collection is required as condition of entry; refusal means exclusion |
| Normalization | Universal biometric collection from all non-citizen visitors now standard |
| Social sorting | Trusted traveler programs, ATS risk scoring, and nationality profiling sort travelers into scrutiny tiers |
| Historical continuity | Race-targeted biometric collection → universal collection; colonial documentation systems → biometric databases |

Key Research Finding

NIST (2019): Facial recognition algorithms tested against law enforcement and immigration databases showed false positive rates 10–100 times higher for Black faces than for white faces. Women and older people also showed higher error rates. These disparities mean that facial recognition in border surveillance flags innocent Black, Asian, and female travelers more often than white male travelers, for errors that originate in the system rather than in the travelers themselves.
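The practical weight of a 10–100x disparity is easiest to see with illustrative numbers. The baseline rate and traveler counts below are assumptions chosen for arithmetic clarity; only the multiplier range comes from the NIST finding:

```python
# Hypothetical illustration of how a false positive disparity translates
# into flagged innocent travelers. Baseline rate and group size are
# assumed; the 10x multiplier is the low end of the NIST-reported range.

baseline_fpr = 0.0001          # assumed FPR for the lowest-error group
disparity = 10                 # low end of the reported 10-100x range
travelers_per_group = 1_000_000

low_error_flags = int(travelers_per_group * baseline_fpr)
high_error_flags = int(travelers_per_group * baseline_fpr * disparity)

print(low_error_flags, high_error_flags)  # prints: 100 1000
```

Under these assumptions, the same million crossings produce 100 wrongly flagged travelers in one group and 1,000 in another; at the 100x end of the range the gap becomes 100 versus 10,000.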


What Jordan Learned

As a U.S. citizen, Jordan crosses borders with documents that confirm they belong. But Jordan's racialized appearance introduces elements of ambiguity that the documentary system cannot fully resolve, and the risk-scoring algorithms that precede Jordan's arrival make assessments Jordan knows nothing about. The distribution of border surveillance burdens — falling most heavily on travelers from Muslim-majority countries, on undocumented people, on racial minorities — reflects a system that sorts populations into scrutiny tiers rather than assessing individual risk. Jordan's relatively smooth crossing is not the norm; it is a position of relative privilege within a stratified system.


Forward Connections

  • Chapter 35 examines facial recognition technology in detail — technical architecture, accuracy challenges, and legal responses
  • Chapter 36 analyzes racial surveillance — how race shapes surveillance targeting across domains from border control to policing
  • Chapter 31 examines the legal frameworks governing surveillance, including the specific rights (and lack thereof) at borders

Chapter 7 Key Takeaways | Part 2: State Surveillance | The Architecture of Surveillance