Chapter 7 Exercises: Border Control and Biometric Databases


Bloom's Level 1 — Remember

Exercise 1.1 — System Identification

Match each biometric system to its correct description:

Systems:

  • IDENT
  • NGI
  • Eurodac
  • E-Verify
  • SIS II

Descriptions:

  A. EU fingerprint database for asylum seekers and irregular migrants
  B. Internet-based employment eligibility verification system
  C. FBI comprehensive biometric database integrating fingerprints, iris scans, and facial recognition
  D. DHS biometric database with records for over 220 million individuals
  E. EU-wide alert database for law enforcement and border management

Exercise 1.2 — Timeline of Border Documentation

Arrange the following developments in border documentation chronologically:

  • League of Nations standardizes passport formats
  • US-VISIT program begins biometric collection from visa holders
  • Eurodac established by the EU
  • Chinese Exclusion Act requires fingerprinting of Chinese immigrants
  • The modern concept of the "passport" emerges in early modern Europe
  • NIST publishes major study on facial recognition accuracy disparities
  • Immigration Act of 1924 establishes national origins quota system
  • E-Verify made mandatory for federal contractors

Bloom's Level 2 — Understand

Exercise 2.1 — Explaining the Border Search Exception

In your own words, explain:

  1. What the "border search exception" to the Fourth Amendment is
  2. Why courts have upheld this exception
  3. How the exception has expanded with digital technology
  4. What the Riley v. California (2014) decision held and why its application to border searches is contested

Write 300–350 words.

Exercise 2.2 — Biometric Modality Comparison

Create a comparison table for the following biometric modalities: fingerprints, iris scans, facial recognition, and DNA. For each modality, describe: (a) what is measured, (b) relative accuracy in identification, (c) invasiveness of collection, (d) current use in U.S. border systems, and (e) a specific vulnerability or limitation.

Exercise 2.3 — The Trusted Traveler Logic

Explain the Global Entry/TSA PreCheck "trusted traveler" model. What is the underlying logic — what determines who is "trusted"? What socioeconomic factors affect access to these programs? How does the trusted traveler model relate to the concept of social sorting introduced in Chapter 1?


Bloom's Level 3 — Apply

Exercise 3.1 — Document the Surveillance Experience

You are accompanying three different travelers through an international arrivals terminal at a major U.S. airport. Using the concepts from Chapter 7, describe the likely surveillance encounter for each:

Traveler A: A white American businessperson with Global Entry, returning from a business trip to the UK. Carries an official corporate laptop and has traveled to this destination 12 times in the past year.

Traveler B: A 28-year-old Somali-American woman, U.S. citizen, returning from visiting family in Kenya. Has no criminal record and no watchlist alerts. Has a name of Arabic origin.

Traveler C: A Brazilian national on a tourist visa, entering for the first time. Works in construction. Has a cousin who is undocumented in the United States (which is not in any database).

For each traveler, describe: (a) the documentary and biometric processing they likely experience, (b) the databases their information is checked against, (c) factors that might trigger additional scrutiny, and (d) the rights they have (or lack) to contest decisions made about them.

Exercise 3.2 — E-Verify's Function Creep

The chapter argues that E-Verify represents function creep — the migration of border surveillance logic into the employment domain. Write a 400-word analysis of this claim. Your analysis should: (a) explain what function creep means, (b) trace the specific path by which border surveillance logic migrated to employment verification, (c) identify at least two specific consequences of this migration that go beyond the employment context, and (d) evaluate whether there is a principled distinction between border verification and employment verification, or whether they represent the same surveillance logic in different settings.


Bloom's Level 4 — Analyze

Exercise 4.1 — Analyzing the NIST Study

Review the chapter's summary of the 2019 NIST facial recognition accuracy study. Then answer:

  1. What does "false positive rate" mean in the context of border surveillance? What happens to a traveler who is a false positive?

  2. The best-performing algorithms had false positive rates 10–100 times higher for Black faces. If an algorithm with a 0.001% overall false positive rate has a 0.1% false positive rate for Black travelers, and 50 million travelers per year pass through U.S. airports with roughly 14% being Black, how many Black travelers per year would be falsely flagged? Show your calculation.

  3. CBP argues that its operational context (comparing live faces to passport photos) is different from the law enforcement context (comparing crime scene photos to a database). Does this distinction affect the error-rate problem? Why or why not?

  4. What would an "adequate response" to the NIST findings look like from CBP? What would an "inadequate response" look like? Assess CBP's actual response.

Exercise 4.2 — Comparing U.S. and EU Border Surveillance

Compare the U.S. system (IDENT, ATS, E-Verify) with the EU system (Eurodac, SIS II) across the following dimensions. Write a 500-word comparative analysis:

  • Scope of biometric data collected
  • Primary populations targeted
  • Legal framework governing data use
  • Oversight and accountability mechanisms
  • Evidence of function creep

Exercise 4.3 — The Hypervisibility Paradox

The chapter describes undocumented people as experiencing both "hypervisibility" and "invisibility" in relation to surveillance systems. Analyze this paradox. In what specific contexts are undocumented people invisible to surveillance systems? In what contexts are they hypervisible? What behavioral adaptations does this asymmetric surveillance experience produce? What harms result from those adaptations?


Bloom's Level 5 — Evaluate

Exercise 5.1 — Evaluating the Risk Score

The Automated Targeting System generates a risk score for every traveler before they arrive at a U.S. port of entry. The score's criteria are classified. Travelers cannot review or contest their score. Evaluate the ATS risk score system using the following criteria:

  1. Accuracy: On what basis can we evaluate whether a classified algorithm is accurate? What is known about its track record?

  2. Fairness: What conditions would need to be met for a risk scoring system to be considered "fair" in its distribution of scrutiny? Are those conditions likely to be met when the algorithm incorporates data from historically discriminatory enforcement patterns?

  3. Accountability: What mechanisms exist for identifying and correcting errors in a classified risk assessment system? Are these mechanisms adequate?

  4. Proportionality: Is the level of surveillance — and the potential consequences for those flagged — proportionate to the security benefit provided?

Write a structured evaluation of 600–700 words, with a clear conclusion.

Exercise 5.2 — DNA at the Border

The Trump administration proposed — and piloted — collecting DNA from migrants and asylum seekers at the border, with the data to be entered into the FBI's CODIS criminal database. Evaluate this proposal:

  1. What is the rationale offered for DNA collection at the border?
  2. What is the legal basis? What legal challenges has it faced?
  3. What are the privacy and civil liberties concerns?
  4. How does this proposal relate to the historical examples of racially targeted biometric collection described in the chapter (e.g., the Chinese Exclusion Act's fingerprinting requirement)?
  5. Should DNA be treated differently from other biometrics? Is there a principled basis for this distinction?

Bloom's Level 6 — Create

Exercise 6.1 — Designing an Accountable Border Biometric System

Design a border biometric verification system that you believe adequately addresses the concerns raised in Chapter 7. Your design should specify:

  1. What biometric data is collected and from whom
  2. How the data is stored, secured, and eventually deleted
  3. What databases it is connected to and the rules governing those connections
  4. What transparency and notification travelers receive
  5. What mechanisms exist for challenging errors
  6. What independent oversight governs the system
  7. How accuracy disparities across demographic groups are identified and addressed

Write 700–900 words. Include an honest assessment of what your design cannot fully solve.

Exercise 6.2 — The Undocumented Student's Account

You are working on the college newspaper at Hartwell University. A student who is undocumented has agreed to be interviewed (anonymously) about their experience of surveillance systems. Draft the interview — 10 questions and plausible responses — that illuminates the specific ways in which surveillance systems (not just border surveillance, but E-Verify, driver's license systems, police encounters, healthcare access) affect daily life for an undocumented student in the United States. Draw on the concepts from Chapter 7 but also connect to themes from earlier chapters.


Reflection Questions

  1. The chapter traces the use of biometrics at borders from the Chinese Exclusion Act (which required fingerprinting specifically of Chinese immigrants) to the current universal biometric collection from non-citizen visitors. Does the historical specificity of biometric targeting by national origin or race tell us something important about the contemporary universal collection system? Or are these genuinely different practices?

  2. Jordan's crossing experience as a U.S. citizen is relatively smooth despite the racial dimensions of border encounters. What responsibilities, if any, does Jordan have regarding a border system that falls more heavily on others? More broadly: what responsibilities do those who pass through surveillance systems relatively unscathed have regarding those who bear greater burdens?

  3. The "trusted traveler" programs (Global Entry, TSA PreCheck) offer reduced surveillance in exchange for a background check, in-person interview, and fee. Some defenders argue this is efficient risk-based screening; some critics argue it creates a two-tier system that effectively privatizes smooth border passage for those who can afford it. Evaluate both positions.

  4. Biometric data is sometimes described as uniquely sensitive because it cannot be changed — unlike a password or a document, your fingerprints are permanent. But the chapter also discusses how biometric data can be compromised (NIST accuracy study) and how databases containing biometric data can be breached. Does the permanence of biometric data change the ethical calculus of collecting and storing it? If so, how?


Chapter 7 Exercises | Part 2: State Surveillance | The Architecture of Surveillance