Case Study 28-2: Amazon Warehouse Workers and Algorithmic Management
Overview
Amazon's fulfillment center network — the physical backbone of what is arguably the world's most sophisticated logistics operation — employs approximately 750,000 workers in the United States alone, making Amazon one of the country's largest private employers. Inside those warehouses, a system of continuous AI-driven performance monitoring, algorithmic productivity management, and automated discipline represents perhaps the most extensively documented example of algorithmic management in the contemporary economy. This case examines how that system works, what its consequences have been for workers, and what it reveals about the ethics of using AI to manage human labor.
How Amazon's Algorithmic Management System Works
The Rate System
At the core of Amazon warehouse management is "rate" — the pace at which workers must complete their tasks, measured in units per hour (UPH). A picker in a fulfillment center is expected to select a certain number of items per hour; a stower must stow a certain number of items per hour; a packer must pack a certain number of boxes per hour. These rates are not fixed by a human industrial engineer making a one-time assessment; they are continuously updated by machine learning algorithms that analyze historical performance data across thousands of workers to establish what the system calculates as achievable rates.
The handheld scanners that workers use for every task transmit continuous data: when a worker scans a product, when they move between locations, how long they spend at each station. This data feeds into the system's assessment of whether a worker is on track to meet rate. Workers can see their current rate progress on the scanner screen and on display monitors throughout the facility. The system tracks "time off task" (TOT) — any period during which the scanner is not recording productive activity — and this metric can trigger automated flags when it exceeds certain thresholds.
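The rate and TOT mechanics described above can be sketched as a simple flagging routine. This is a hypothetical illustration only, not Amazon's implementation: the function name, thresholds, grace period, and rate target are all invented for the example.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds: illustrative values only, not Amazon's real ones.
TOT_GRACE = timedelta(minutes=5)            # gaps shorter than this are ignored
TOT_FLAG_THRESHOLD = timedelta(minutes=30)  # cumulative TOT that triggers a flag
TARGET_UPH = 300                            # assumed units-per-hour target

def assess_shift(scan_times: list[datetime]) -> dict:
    """Summarize one worker-shift from its scanner timestamps."""
    time_off_task = timedelta(0)
    for earlier, later in zip(scan_times, scan_times[1:]):
        gap = later - earlier
        if gap > TOT_GRACE:                 # only long gaps accrue as TOT
            time_off_task += gap
    shift_hours = (scan_times[-1] - scan_times[0]).total_seconds() / 3600
    uph = (len(scan_times) - 1) / shift_hours if shift_hours else 0.0
    return {
        "uph": round(uph, 1),
        "time_off_task": time_off_task,
        "tot_flag": time_off_task > TOT_FLAG_THRESHOLD,
        "rate_flag": uph < TARGET_UPH,
    }
```

Note what such a routine cannot see: a worker idling and a worker waiting for inventory both appear as the same scan gap, which is exactly the exception-visibility problem workers describe below.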
Automated Discipline and Termination
Amazon has acknowledged using automated systems to generate discipline recommendations. Investigative reporting by The Verge in 2019, based on internal Amazon documents, revealed that the system could automatically generate warnings and termination paperwork for workers who failed to meet productivity targets — without any manager reviewing individual circumstances before the automated action was initiated. Amazon stated at the time that managers reviewed termination decisions before they were executed, but the scale of the operation made meaningful individual review questionable: with hundreds of thousands of workers, even a small percentage generating automated discipline flags would create a volume that exceeds realistic human review capacity.
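The review-capacity point is simple arithmetic. A minimal sketch, using the case's 750,000 headcount figure and assumed values for the daily flag rate and per-flag review time:

```python
workforce = 750_000           # approximate US warehouse headcount (from the case)
daily_flag_rate = 0.01        # assumed: 1% of workers flagged on a given day
review_minutes_per_flag = 10  # assumed time for a genuine individual review

flags_per_day = workforce * daily_flag_rate
reviewer_hours_per_day = flags_per_day * review_minutes_per_flag / 60
print(flags_per_day, reviewer_hours_per_day)  # 7500.0 1250.0
```

Even at a 1% daily flag rate, ten-minute reviews would consume roughly 1,250 reviewer-hours per day, the equivalent of more than 150 full-time reviewers doing nothing else.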
Workers interviewed by journalists described receiving automated warnings on their scanners for periods of low productivity during which they were, for example, waiting for inventory to arrive at a picking station — a situation outside the worker's control but not necessarily visible to the algorithm as a legitimate exception. Registering a legitimate exception required interacting with a supervisor, which itself consumed time that could register as TOT.
The Surveillance Infrastructure
The monitoring extends beyond scanners. Amazon has deployed a range of surveillance technologies in its fulfillment centers:
Camera networks. These provide continuous visual coverage of the warehouse floor. More recently, Amazon has deployed AI-powered cameras in its Amazon Logistics delivery vans (the Driveri system, supplied by Netradyne) that use four cameras and AI to monitor drivers' facial expressions, gaze direction, hand position on the steering wheel, speed, braking, and phone use. Drivers receive real-time audio alerts when the AI detects potential safety violations, and their driving performance feeds into a weekly score that can affect their contract status.
Wristband patents. Amazon received patents in 2018 for technology that would track workers' hand movements with haptic feedback wristbands — vibrating when a worker reached toward the wrong bin. Amazon stated at the time that this technology was not in use; critics argued the patents themselves revealed the direction of the company's thinking about worker monitoring.
Floor robots and their interaction with workers. Amazon's fulfillment network deploys hundreds of thousands of autonomous mobile robots (AMRs) that bring shelving units to stationary human workers for picking. While this reduces the walking burden on workers, it creates a different dynamic: the robot-driven system can deliver inventory at a pace that effectively sets the worker's rate, creating a situation where the human is pacing themselves to the machine rather than the other way around.
The Consequences: Injury Rates and Worker Experience
The Safety Data
The most serious documented consequence of Amazon's algorithmic management system is its correlation with high injury rates. Multiple investigations have found injury rates at Amazon fulfillment centers substantially above industry averages.
A 2021 investigation by the Strategic Organizing Center, a coalition of labor unions, analyzed Amazon's own OSHA Form 300 injury logs (which employers must submit to OSHA under its electronic recordkeeping rule). The analysis found that the serious injury rate at Amazon warehouses was more than twice the industry average for general warehousing. A follow-up 2022 report found that Amazon accounted for approximately half of all serious injuries in the US warehousing industry, despite employing roughly one-third of warehouse workers.
A Senate investigation led by Senator Bernie Sanders as chair of the HELP Committee, reported in 2024, found similar results and specifically linked productivity pressure to injury risk, citing worker testimony and internal Amazon safety data. The investigation also found that Amazon had been aware of the injury rate disparity for years without resolving it.
Amazon has consistently challenged these figures and pointed to its own safety investments, including ergonomics programs, robotic assistance that reduces walking, and its "WorkingWell" initiative focused on worker wellness. The company argues that its injury rate statistics are not directly comparable to industry averages because of differences in how injuries are categorized and recorded. Critics dispute these explanations and note that Amazon's own internal safety audits showed awareness of ergonomic risk.
Worker Testimony
Extensive worker testimony, collected by journalists, researchers, labor advocates, and congressional investigators, describes the experience of working under algorithmic management in consistent terms. Workers report:
Fear of falling below rate, which creates pressure to skip bathroom breaks, work through minor injuries, skip ergonomic precautions that slow pace, and avoid taking allowed breaks. One Amazon worker in Bessemer, Alabama, testified at a 2021 Senate hearing that workers often urinated in bottles rather than risk falling behind on rate by walking to the bathroom. Amazon publicly denied the claim at first, then acknowledged after leaked internal communications that the practice occurred, particularly among delivery drivers, while maintaining that it was not condoned.
The psychological pressure of continuous monitoring, which workers describe as dehumanizing — the experience of being evaluated not as a person but as a productivity metric. A former Amazon HR manager described the experience of managing workers under this system as "managing the algorithm" — that human managers had little discretion; their role was to enforce algorithmic productivity standards, not to exercise individual judgment.
Limited recourse when the algorithm is wrong. Workers who believed they had been incorrectly flagged for low productivity or excessive TOT described difficulty challenging automated assessments, particularly when the circumstances that caused the low productivity (inventory delays, equipment malfunction, understaffing at certain stations) were not visible to the system.
Turnover and Its Costs
Amazon's turnover rate among fulfillment center workers has been extraordinarily high — reportedly 150% annually before the pandemic, meaning the average tenure was approximately eight months. Amazon has acknowledged the turnover rate and attributed it to the physically demanding nature of the work and workers' preference for other opportunities as the labor market tightened.
Internal Amazon documents leaked to reporters in 2022 described concern at senior leadership levels about worker attrition — not primarily on grounds of worker welfare but because of the cost and difficulty of continuously recruiting and training replacement workers at the scale of hundreds of thousands. The phrase "burning through" the available labor pool in specific geographic markets appeared in internal materials. This suggests that even from a pure business optimization perspective, the combination of high productivity demands and low tenure was creating operational challenges — a potential constraint on Amazon's growth model.
Amazon's Counterarguments and Reforms
Amazon's Defense of Its System
Amazon has offered several defenses of its workforce management approach:
Safety investment. Amazon points to its investment in ergonomics, its robotics that reduce walking distances, its WorkingWell program, and its stated commitment to being "Earth's Best Employer." The company has made substantial investments in safety technology and argues that comparisons to industry averages do not account for the scale and intensity of its operations.
Worker agency and choice. Amazon argues that warehouse work is physically demanding by definition and that workers choose these positions with full knowledge of the requirements. The company pays above-minimum wage in most markets and provides benefits including health insurance for full-time workers, a $15 minimum hourly wage (above federal minimum), and tuition assistance through Career Choice.
Continuous improvement. Amazon argues that its monitoring systems allow it to identify and address safety issues faster than traditional observation-based approaches, pointing to specific examples where data-driven intervention reduced injury rates in specific areas.
Legal compliance. Amazon's management practices have been reviewed by OSHA and state safety agencies. While OSHA has issued citations at specific facilities for specific violations, Amazon has generally maintained that its practices are legally compliant.
Reforms Under Pressure
Amazon has made some modifications to its monitoring practices under public and regulatory pressure. Following reporting on TOT tracking and automatic discipline, Amazon indicated it had modified how time off task was counted in some circumstances. Following the 2022 union organizing efforts and congressional attention, Amazon added a "voice of the associate" feedback tool, an app through which workers can provide anonymous feedback about their experience. The company also expanded its safety programs and modified rate calculation in some facilities.
Critics have characterized these reforms as marginal adjustments that do not address the fundamental dynamic: an algorithmically-set productivity rate that creates continuous pressure, enforced by continuous surveillance, with automated consequences for performance below threshold.
The Bessemer Union Vote and Its Aftermath
The Bessemer, Alabama, fulfillment center became the focal point of the highest-profile Amazon unionization effort when workers there voted in April 2021 on whether to form the first Amazon union. The election resulted in a defeat for the Retail, Wholesale and Department Store Union (RWDSU), with Amazon winning by a substantial margin. An NLRB hearing officer subsequently found that Amazon had unlawfully interfered with the election, most prominently by arranging for a mail collection box to be installed at the facility entrance within view of Amazon's surveillance cameras, creating the impression that the company was monitoring the vote. The NLRB ordered a new election; the revote conducted in 2022 was extremely close, and the results were contested through formal challenges.
Separately, in April 2022, workers at Amazon's JFK8 facility on Staten Island voted to form the Amazon Labor Union — the first successful Amazon union election in US history. The organizing was led not by an established union but by warehouse workers, many of them former Amazon employees, who ran an independent campaign. The ALU won the election 2,654 to 2,131. Amazon challenged the results; the NLRB certified the union. Contract negotiations between Amazon and the ALU have been contentious, with Amazon raising objections to the ALU's representation and the parties far apart on economic demands as of 2024.
The algorithmic management system was central to organizing in both cases. Workers cited productivity rate pressure, TOT monitoring, and the dehumanizing experience of algorithmic oversight as primary motivations for seeking collective representation. The question of who controls the algorithm — who sets the rate, who reviews exceptions, who has discretion — emerged as a central bargaining demand.
Ethical Analysis
The Accountability Gap
The most fundamental ethical problem with Amazon's algorithmic management system is the accountability gap it creates. When a human manager disciplines a worker for poor performance, the manager is accountable for that decision — can be questioned, challenged, overridden, and held responsible. When an algorithm generates a discipline recommendation or termination, the accountability is diffuse: the engineers who designed the algorithm, the managers who approved its deployment, the executives who set the productivity standards the algorithm enforces — none of them are directly accountable for the specific decision affecting the specific worker.
This accountability gap is not an accident; it is a feature of algorithmic management from the deploying organization's perspective. An algorithm that generates a discipline recommendation distances the company from the decision in ways that reduce both legal liability and reputational exposure. "The system flagged you" is not the same as "I decided to discipline you" — even if the former is a consequence of choices made by the latter.
Power Asymmetry
The power asymmetry created by algorithmic management is profound. A fulfillment center worker interacting with Amazon's management system faces:
- A system that knows far more about their work performance than they do
- A system whose decision criteria they cannot fully understand
- A system whose outputs can be challenged only through processes the company controls
- Economic dependence on continued employment that limits their willingness to challenge the system
- Individual isolation — the system manages each worker individually, making collective action difficult
Amazon, by contrast, faces minimal accountability: limited regulatory oversight of algorithmic management practices, weak labor law protections for non-union workers, and the economic power to replace individual workers who leave or challenge the system.
The Human Dignity Question
Beyond the specific harms documented — high injury rates, dehumanizing working conditions, economic precarity — algorithmic management raises questions about human dignity at work. Work is not merely an economic transaction; it is a place where people spend significant portions of their lives, develop skills, build relationships, and often find meaning. A system that reduces workers to productivity metrics evaluated continuously by an algorithm, with little discretion, little understanding of context, and little accountability, treats workers as production inputs rather than as people.
This is not merely a philosophical concern — it has practical consequences. Workers who feel dehumanized by their conditions are more likely to disengage, to leave, and to organize. Amazon's extraordinarily high turnover rate is both a business cost and a signal that something in the working conditions is inconsistent with sustainable employment.
Lessons for Business Professionals
Efficiency gains and sustainability are not the same. Amazon's system generates impressive short-term productivity metrics but is associated with injury rates and turnover that represent substantial costs — both financial (recruiting and training replacement workers at scale) and human. Business professionals should distinguish between metrics that measure current output and metrics that measure sustainable, durable productivity.
Algorithmic management is a design choice, not a technological inevitability. The decision to set rates algorithmically, to track TOT automatically, to generate discipline recommendations without human review — each of these is a design choice. Different design choices (algorithmic support for human managers rather than algorithmic replacement of managerial discretion, for example) would produce different outcomes.
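The contrast between algorithmic support and algorithmic replacement can be made concrete. A minimal sketch of a decision-support pattern, with invented type and field names, in which the system may only recommend and a named human must supply the disposition before anything happens:

```python
from dataclasses import dataclass

@dataclass
class Flag:
    worker_id: str
    metric: str          # e.g. "rate" or "time_off_task"
    value: float
    context: list[str]   # exceptions visible to the system, e.g. "inventory_delay"

@dataclass
class Disposition:
    action: str          # "dismiss", "coach", "warn" -- chosen by a human
    reviewed_by: str     # accountable individual, recorded with the decision
    rationale: str

def route_flag(flag: Flag, manager_review) -> Disposition:
    """Decision-support pattern: the system recommends, a named human decides."""
    recommendation = "dismiss" if flag.context else "coach"
    disposition = manager_review(flag, recommendation)
    # The design guarantee: no action without an accountable reviewer on record.
    assert disposition.reviewed_by, "every action must have an accountable reviewer"
    return disposition
```

The design choice is in the signature: the system's output is an input to a human decision, and the reviewer's identity and rationale are recorded, which directly addresses the accountability gap discussed above.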
Worker voice improves outcomes. There is consistent evidence that involving workers in the design of work systems — including AI-assisted ones — produces better outcomes for both workers and organizations. Systems designed by engineers optimizing for productivity metrics alone tend to miss factors visible only to workers themselves.
Regulatory attention is increasing. The EU Platform Work Directive, New York City's AI in employment decisions law, and increasing OSHA and NLRB attention to algorithmic management practices signal that the regulatory environment is shifting. Organizations designing algorithmic management systems should anticipate legal requirements for transparency, worker notification, human review, and accountability.
Discussion Questions
- Amazon argues that its monitoring systems improve safety by providing data-driven insights. Critics argue the systems increase injury risk through productivity pressure. How should organizations evaluate these competing claims when making workforce management decisions?
- The accountability gap in algorithmic management — where the system makes the decision but no individual is clearly responsible — is a recurring ethical problem. What organizational structures and governance mechanisms could close this gap?
- Amazon's fulfillment center workers are formal employees, not gig workers, yet they experience many of the same algorithmic control features as Uber drivers. What does this suggest about whether employment classification is an adequate protection against algorithmic management harms?
- If you were advising Amazon on redesigning its fulfillment center management system to be both operationally effective and ethically sound, what specific changes would you recommend?
- The Amazon Labor Union's founding at JFK8 was largely driven by worker concerns about algorithmic management. What does this suggest about the relationship between AI deployment practices and worker organizing?