Case Study 1.1: The Warehouse and the Algorithm — Productivity Monitoring at Fulfillment Centers

Overview

This case study examines the surveillance ecosystem of large logistics and e-commerce fulfillment warehouses — facilities that have become among the most intensively monitored workplaces in the American economy. It uses Jordan Ellis's part-time work experience as an entry point and expands outward to documented evidence from investigative reporting, labor litigation, and workers' testimony.

Estimated Reading and Analysis Time: 75–90 minutes


Background: Inside the Fulfillment Center

The modern fulfillment warehouse is a study in systematic visibility. From the moment a worker badges in, a surveillance architecture begins generating data: badge entry times, station assignment, scan rates, travel time between stations, restroom duration, idle time at any location, communication with supervisors, and error rates for items incorrectly packed or incorrectly labeled.

Major logistics and e-commerce companies have deployed variations of what is called "algorithmic management" — a system in which human supervisors are supplemented or partially replaced by automated systems that track worker performance in real time, compare it to benchmarked targets, generate coaching or disciplinary alerts, and in some cases make or recommend decisions about termination.

Workers are typically told their scan-rate targets at orientation. What they are often not told is the full architecture of what is being measured — the secondary metrics like idle time, "time off task," and movement efficiency that feed into their overall performance score.


The Data Architecture

Primary Metrics (usually disclosed):

  • Units processed per hour (UPH)
  • Accuracy rate (percentage of correct picks/packs)
  • Attendance and punctuality

Secondary Metrics (often undisclosed or partially disclosed):

  • "Time off task" (TOT): the duration and frequency of periods when a worker is not scanning, derived from gap analysis between scan events
  • Station dwell time: how long a worker stays at a given station
  • Restroom and break patterns: derived from badge swipes and time gaps
  • Movement efficiency: in some facilities, wristband-based sensors or camera systems track worker movement and calculate the efficiency of their physical paths through the warehouse
  • Communication patterns: time spent talking to supervisors or co-workers
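The "time off task" metric described above — derived from gap analysis between scan events — can be illustrated with a minimal sketch. All field names and the five-minute gap threshold are illustrative assumptions; real systems are proprietary.

```python
from datetime import datetime, timedelta

# Hypothetical threshold: any pause between scans longer than this
# counts as "time off task". Real thresholds are not disclosed.
GAP_THRESHOLD = timedelta(minutes=5)

def time_off_task(scan_times: list[datetime]) -> timedelta:
    """Sum the gaps between consecutive scan events that exceed the threshold.

    Note what the metric cannot see: a counted gap may be a restroom
    break, a jammed conveyor, or a conversation with a supervisor.
    The system records only the absence of scans.
    """
    total = timedelta()
    for prev, curr in zip(scan_times, scan_times[1:]):
        gap = curr - prev
        if gap > GAP_THRESHOLD:
            total += gap
    return total

shift = [
    datetime(2024, 3, 1, 9, 0),
    datetime(2024, 3, 1, 9, 2),
    datetime(2024, 3, 1, 9, 10),  # 8-minute gap -> counted as TOT
    datetime(2024, 3, 1, 9, 11),
]
print(time_off_task(shift))  # 0:08:00
```

The sketch makes the disclosure problem concrete: the metric is computed entirely from data the worker generates for another purpose (inventory scanning), and nothing in the computation distinguishes a bodily need from idleness.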

Downstream Uses:

Performance data feeds into:

  • Shift scheduling (high performers get better shifts)
  • Contract renewal decisions
  • Disciplinary processes ("coaching" sessions triggered by algorithm alerts)
  • In some documented cases, automated termination: the system generates termination paperwork when a worker's performance falls below a threshold, without human supervisor review


Documented Cases

Case A: Termination by Algorithm

Investigative reporting by The Verge (2019) examined Amazon fulfillment center operations and documented that the company's automated tracking system sent workers "productivity warnings" and termination notices without a human supervisor initiating or reviewing the decision. Workers received printed paperwork from an automated system. The rate of terminations driven by the system was high enough that some labor attorneys argued it constituted a pattern of unlawful dismissal — workers terminated without the human judgment that labor law implicitly assumes is present in employment decisions.

Amazon stated that humans make all termination decisions and that the system merely "supports" those decisions. Workers and advocates disputed this characterization, noting that supervisors rarely overrode system-generated recommendations and that the practical effect was algorithmic termination.

Case B: Restroom Monitoring and Worker Testimony

Workers at multiple fulfillment facilities have testified to avoiding restroom breaks during peak periods because their time off task (TOT) metrics would suffer, triggering coaching conversations or warnings. Some workers described wearing diapers or urinating in bottles to avoid TOT penalties — testimony documented by investigative journalists and by worker advocacy organizations including the Strategic Organizing Center.

This represents a direct physiological consequence of the visibility architecture: the body's needs become data points, and workers internalize the algorithm's gaze to the point of self-denial.

Case C: The New York Warehouse Worker Protection Act (2022)

In response to worker advocacy, and following California's AB 701 (2021), which regulated warehouse productivity quotas, New York in 2022 passed legislation specifically regulating algorithmic monitoring in warehouses. The Warehouse Worker Protection Act requires employers to disclose the work speed quotas they use, prohibits quotas that prevent workers from taking legally required breaks, and gives workers the right to request information about the performance metrics that govern their jobs.

The law represents a legislative attempt to reduce the visibility asymmetry built into algorithmic management: workers would know what was being measured and how, rather than being managed by an opaque system they could not interrogate.


Analysis: Applying Chapter 1 Concepts

Surveillance Category

The warehouse monitoring system is primarily commercial surveillance conducted in a workplace context. The agent is the employer (a private corporation). The stated purpose is efficiency management and quality control. Unstated purposes include labor cost reduction, union avoidance (workers who are constantly being measured have less time and energy for organizing), and data accumulation for future automation planning.

Visibility Asymmetry

The asymmetry in fulfillment warehouse monitoring is extreme:

What the Employer Sees                             | What the Worker Sees
---------------------------------------------------|---------------------------------------------------
All scan events, timestamped                       | Their own scan rate (sometimes in real time)
All badge movements                                | Their own badge entry/exit
Time off task, calculated automatically            | A coaching notification (but not necessarily the metric that triggered it)
Performance relative to all co-workers             | Their own performance (sometimes)
Historical trend analysis                          | Current shift data
Predictive flags for potential disciplinary action | Nothing, until the flag is acted on
Aggregate labor cost data                          | Their own hourly wage

Workers can see some of their own data, but they cannot see the full model used to evaluate them, cannot access co-workers' data for comparison, and cannot audit the algorithmic decisions that affect their employment.

Dataveillance and Aggregation

Each individual data point — a scan event, a badge swipe, a time gap — is unimportant in isolation. Aggregated across a shift, a week, a quarter, and compared to population-level benchmarks, these data points construct a detailed performance profile that is used to make decisions with significant consequences for the worker's livelihood.

This is dataveillance in its classic form: the transformation of behavioral traces into a model of the person, used to manage and evaluate them.
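The aggregation logic can be sketched in a few lines. All numbers, field names, and the choice of metric are hypothetical; the point is only that individually trivial records become a ranked profile once pooled and benchmarked against the population.

```python
from statistics import mean, stdev

# Hypothetical shift records: units per hour (UPH) per worker per shift.
# Real systems track many more metrics; this shows only the aggregation step.
shifts = {
    "worker_a": [110, 95, 102, 98],
    "worker_b": [140, 150, 145, 155],
    "worker_c": [80, 85, 78, 90],
}

def build_profiles(shifts: dict[str, list[float]]) -> dict[str, dict[str, float]]:
    """Roll raw shift data up into per-worker averages, then score each
    worker against the population mean. No single shift decides anything;
    the accumulated, benchmarked profile does."""
    averages = {w: mean(u) for w, u in shifts.items()}
    pop_mean = mean(averages.values())
    pop_sd = stdev(averages.values())
    return {
        w: {"avg_uph": avg, "z_score": (avg - pop_mean) / pop_sd}
        for w, avg in averages.items()
    }

for worker, profile in build_profiles(shifts).items():
    print(worker, profile)
```

Note the asymmetry embedded even in this toy version: the employer runs the comparison across all workers, while each worker sees at most their own row.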

Function Creep

The original stated purpose of barcode scanning in warehouses was inventory accuracy — ensuring that the right items were shipped to the right customers. The data generated by those scans has expanded into a comprehensive worker monitoring system. Workers did not agree to be monitored when they agreed to perform scanning tasks; the surveillance expanded from the package to the person.

The Consent Question

Workers at these facilities do consent — they accept employment contracts that include language about performance monitoring. But this consent has structural characteristics that Lyon's framework would call into question:

  1. Workers were not told the full scope of what would be measured
  2. Workers had no meaningful ability to negotiate the terms (take it or leave it)
  3. The alternatives — unemployment or lower-wage work — represent significant coercion
  4. Workers with the most limited alternatives (economic precarity, limited education credentials) face the greatest surveillance intensity

The Chilling Effect in Physical Space

The chilling effect is not only a behavioral modification in information-seeking (as in the Wikipedia study). It is also physical. Workers avoid restroom breaks, avoid talking to co-workers, avoid slowing down to work more carefully when accuracy requires it — because the algorithm is watching and will penalize them. The surveillance apparatus modifies bodily behavior, not just online activity.


Connecting to Broader Themes

The fulfillment warehouse is not an aberration. It is the leading edge of a transformation in workplace surveillance that is expanding into sectors previously less intensively monitored: call centers, delivery driving, nursing, teaching, and knowledge work. The tools change — from badge readers to keylogging software to AI-assisted performance review — but the structural logic is consistent.

Jordan's experience at the warehouse is, in this sense, not merely a personal narrative. It is a front-row view of the direction labor-management relations are moving across the economy.

🔗 Connection: Part 6 (Chapters 26–30) examines workplace surveillance in depth, including remote work monitoring (Chapter 27), gig economy surveillance (Chapter 28), and the labor law questions raised by algorithmic management (Chapter 30).


Discussion Questions

  1. Applying the Taxonomy: The case identifies warehouse monitoring as "commercial surveillance in a workplace context." Could it also be characterized as environmental surveillance (surveillance of a physical space)? What would be gained or lost analytically by reclassifying it?

  2. Consent Analysis: Workers signed employment contracts that included general language about performance monitoring. In what sense did they consent to algorithmic monitoring? In what sense did they not? Use the "consent as fiction" framework from the chapter.

  3. The Algorithm as Supervisor: When a termination decision is made by an automated system without human review, does that change its moral character? Who is responsible — the system's designers, the company that deployed it, the manager who did not override it, or all three? How does the visibility asymmetry affect this moral analysis?

  4. Worker Response: The New York Warehouse Worker Protection Act requires disclosure of quotas and prohibits quotas that prevent legally required breaks. Is disclosure sufficient to address the visibility asymmetry described in this case? What additional measures would you recommend, and on what grounds?

  5. Jordan's Position: Jordan works part-time in exactly this kind of facility. They are aware that their performance is being tracked. Jordan is also a sociology student who has now read Chapter 1 of a surveillance textbook. How might this knowledge change Jordan's experience of their next shift? Does knowledge dissolve the power asymmetry, partially address it, or fail to address it at all?

  6. The Equity Dimension: Fulfillment warehouses disproportionately employ workers of color, immigrants, and people with limited formal education credentials — people with fewer labor market alternatives. How does this demographic reality interact with the surveillance architecture described? Does the equity objection to "nothing to hide" apply here? How?

  7. Historical Continuity: Chapter 1 introduces the idea that surveillance is not a modern invention. Before reading Chapter 4 (The Industrial Eye), speculate: How is the algorithmic management of modern warehouses similar to, and different from, the factory management practices of the early twentieth century?


Extension Activity

If you have ever worked in a job where your performance was systematically tracked — in retail, food service, logistics, call centers, or elsewhere — write a 300-word reflection on your experience:

  • What data was collected about you?
  • Did you know the full scope of the monitoring?
  • Did the monitoring change your behavior? In what ways?
  • Did you consider the monitoring legitimate? Why or why not?

If you have not worked in such a job, interview someone who has and write your reflection based on their experience (with their permission).


Chapter 1 | Case Study 1.1 | Part 1: Foundations | The Architecture of Surveillance