Chapter 28 Quiz: Algorithmic Management — When the Boss Is an AI
Answer all questions. Multiple choice questions have one best answer. Short answer questions should be 2–4 sentences.
Part I: Multiple Choice
1. The defining feature that distinguishes algorithmic management from performance monitoring systems is:
a) Algorithmic management uses more data
b) Performance monitoring creates data about worker behavior; algorithmic management uses data to actively direct worker behavior in real time
c) Algorithmic management is only used in warehouses
d) Performance monitoring involves human review; algorithmic management does not
2. Amazon's "rate" metric in its fulfillment centers is used algorithmically for:
a) Calculating annual bonuses only
b) Continuous real-time evaluation that automatically generates warnings and can initiate termination procedures when thresholds are crossed
c) Setting shift schedules
d) Determining which workers are eligible for promotions
3. What does the chapter mean by the "treadmill effect" in algorithmic task assignment?
a) Workers on treadmills in fulfillment center break rooms
b) The dynamic by which the highest-performing workers are consistently assigned the most demanding tasks, creating a self-reinforcing cycle of differential burden
c) The effect of rate metrics that keep increasing over time
d) A form of Goodhart's Law specific to warehouse work
4. In gig economy platforms, "deactivation" refers to:
a) The temporary suspension of a driver's account pending investigation
b) The platform-initiated removal of a worker's access to the platform — functionally equivalent to firing — which can occur automatically when rating thresholds are crossed
c) The process by which a worker voluntarily leaves the platform
d) A technical term for software maintenance
5. The "black box manager" problem refers to:
a) Managers who are physically invisible to workers
b) The situation in which algorithmic employment decisions are made by opaque systems whose logic workers cannot access, challenge through available channels, or identify a responsible human decision-maker for
c) Management software with a black interface
d) A type of performance review system used in some industries
6. Customer ratings in gig economy platforms function as a surveillance mechanism because:
a) Customers are paid to rate workers
b) They transform customers into distributed surveillance agents whose assessments feed directly into consequential algorithmic performance evaluations, with no mechanism for workers to contest individual ratings
c) Ratings are stored permanently in workers' records
d) The platform reviews every rating before it counts
7. Research has documented that customer ratings are affected by:
a) Only the quality of service provided
b) Factors including the worker's race and gender, the customer's mood, and external circumstances unrelated to service quality
c) Only the efficiency of the transaction
d) The price charged for the service
8. Amazon's "bathroom break controversy" is analytically significant primarily because:
a) It shows that Amazon treats workers cruelly
b) It illustrates that the algorithmic management system creates conditions in which normal human biological needs generate automated performance consequences — demonstrating the system's inability to distinguish between constrained and chosen behavior
c) It demonstrates that Amazon has insufficient bathroom facilities
d) It is the only documented case of worker mistreatment at Amazon
9. Sentiment analysis of customer calls monitors:
a) Only the content of what is said
b) The emotional dimensions of work — vocal patterns, word choice, response timing — extending measurement into the affective interior that workers have historically retained some autonomy over
c) Customer satisfaction only, not worker performance
d) The number of complaints per agent
10. The EU Platform Work Directive (2024) is significant because:
a) It bans gig economy work in EU member states
b) It creates a legal presumption that platform workers are employees and requires platforms to provide transparency about algorithmic management systems
c) It requires all gig workers to form unions
d) It limits the number of hours gig workers can work per week
11. GDPR Article 22 is relevant to algorithmic management because:
a) It requires all workplace algorithms to be publicly disclosed
b) It provides individuals with the right not to be subject to solely automated decisions that produce significant effects, requiring human review of automated employment decisions
c) It limits data retention to 30 days
d) It requires employers to share monitoring data with workers
12. The Python code in this chapter demonstrates that:
a) Algorithmic management systems are always unjust
b) The algorithm treats idle time caused by bathroom breaks identically to idle time caused by intentional inactivity — revealing embedded assumptions about what constitutes "legitimate" inactivity
c) Algorithmic systems are more accurate than human supervisors
d) Python is the most common language used in warehouse management systems
13. When an Amazon warehouse worker is "fired by algorithm," the situation is most notable because:
a) Amazon has too many workers to manage
b) There is no human who made the individual termination decision, creating a distributed responsibility problem in which no single person authorized the firing while the system processed it as routine
c) The algorithm is always wrong in its termination decisions
d) Workers prefer algorithmic decisions to human ones
14. The gig economy misclassification problem — classifying workers as "independent contractors" — creates what surveillance scholars call:
a) A benefit for workers who prefer flexibility
b) A situation in which workers are subject to comprehensive algorithmic control (direction, monitoring, evaluation, discipline) without the legal protections that accompany employment relationships
c) A technical legal distinction with no practical consequences
d) A problem only for workers who prefer employment status
15. The Amazon Labor Union's organizing demands at JFK8 included calls for:
a) Higher wages as their primary demand
b) Algorithmic due process — meaningful appeal processes, worker access to performance data, and human review of automated employment decisions
c) Elimination of all monitoring
d) Worker control over scheduling software
16. In the chapter's analysis, algorithmic management is described as "Taylor's dream automated" primarily because:
a) Both Taylor and Amazon are associated with efficiency consulting
b) Both share the goal of eliminating worker discretion through systematic direction — the algorithm simply does at millisecond scale and continuous operation what Taylor's time-and-motion men did at hourly scale with periodic studies
c) Taylor predicted that computers would eventually manage workers
d) Both systems use the same performance metrics
Part II: Short Answer
17. Explain the "distributed responsibility" problem in algorithmic management. Using a specific example from the chapter, describe how this distribution of responsibility affects worker ability to seek redress for an unfair automated decision.
Your answer:
18. The chapter argues that algorithmic management "encodes values" in its design. Using the Python code example, identify two specific design choices that encode management values, and explain what those values are and what alternative values might produce different design choices.
Your answer:
Answer Key (Instructor Version)
1. b
2. b
3. b
4. b
5. b
6. b
7. b
8. b
9. b
10. b
11. b
12. b
13. b
14. b
15. b
16. b
17. Distributed responsibility: the algorithmic management system fires workers without any individual human making the termination decision. The HR representative processes automated paperwork without reviewing the individual case; the manager oversees the facility without the ability to override the system; the engineers designed the algorithm in a different part of the company. The Amazon worker on medical leave who received an automated termination illustrates this: she was fired because the system was not updated with her leave status, and when she contacted HR, she was told the decision was "automated" with a multi-week appeal timeline. No one fired her; no one can easily unfire her; the responsibility for the error is distributed across the system's design, implementation, and oversight gaps in ways that make individual accountability nearly impossible to establish.
18. Design values in the code: (1) The performance score weights rate at 50%, idle time at 20%, errors at 30%. This encodes the management value that speed is the most important performance dimension. Alternative values: weighting error prevention more heavily (a quality-first organization) or making rate and quality equal would reflect different priorities. (2) The idle_warning_threshold is set to 5 minutes without accommodation for biological needs or facility conditions. This encodes the assumption that all idle time is discretionary worker behavior. Alternative design: excused idle categories (bathroom, facility wait, emergency assistance) would encode the value that context matters for performance evaluation.
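For instructors who want to demonstrate these two design choices concretely, the following is a minimal Python sketch reconstructing the logic the model answer describes. The weights, threshold, and function names here are illustrative assumptions based on the answer above, not the chapter's actual code; the contrast between `idle_warning` and `idle_warning_with_context` is the point.

```python
from typing import Optional

# Encoded values, as described in the model answer: speed dominates (50%),
# and every minute of idle time is treated as discretionary worker behavior.
RATE_WEIGHT, IDLE_WEIGHT, ERROR_WEIGHT = 0.5, 0.2, 0.3
IDLE_WARNING_THRESHOLD_MIN = 5  # minutes; no excused categories


def performance_score(rate_pct: float, idle_pct: float, error_pct: float) -> float:
    """Weighted score (higher is better). The 50/20/30 split encodes
    the value judgment that speed matters most."""
    return (RATE_WEIGHT * rate_pct
            + IDLE_WEIGHT * (100.0 - idle_pct)
            + ERROR_WEIGHT * (100.0 - error_pct))


def idle_warning(idle_minutes: float) -> bool:
    """Original design: any idle block over the threshold triggers a warning,
    whether the worker chose to be idle or a bathroom trip required it."""
    return idle_minutes > IDLE_WARNING_THRESHOLD_MIN


# Alternative design: excused idle categories encode the competing value
# that context matters for performance evaluation.
EXCUSED_IDLE = {"bathroom", "facility_wait", "emergency_assistance"}


def idle_warning_with_context(idle_minutes: float,
                              reason: Optional[str] = None) -> bool:
    """Same threshold, but idle time with an excused reason never
    generates a warning."""
    if reason in EXCUSED_IDLE:
        return False
    return idle_minutes > IDLE_WARNING_THRESHOLD_MIN
```

Under the original design, a six-minute bathroom break and six minutes of intentional inactivity produce identical warnings; the alternative distinguishes them with a single conditional, which is exactly the sense in which the design choice, not the data, encodes the value.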