Case Study: The Uber Self-Driving Car Fatality
"The probable cause of this crash was the failure of the vehicle operator to monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip by a personal cell phone." — National Transportation Safety Board, Final Report (2019)
"This crash was not just about one distracted operator. It was about an entire safety culture that had been systematically degraded." — NTSB Chairman Robert Sumwalt
Overview
On the night of March 18, 2018, at approximately 9:58 p.m., a modified Volvo XC90 SUV operated by Uber's Advanced Technologies Group struck and killed Elaine Herzberg, a 49-year-old woman who was walking a bicycle across a four-lane road in Tempe, Arizona. The vehicle was in autonomous mode at the time of the collision. A human safety operator, Rafaela Vasquez, was seated behind the wheel.
Elaine Herzberg became the first pedestrian killed by a self-driving vehicle on a public road.
The incident prompted a federal investigation, criminal proceedings, a reassessment of autonomous vehicle testing regulations, and a public reckoning with the governance gaps that had allowed a company to test experimental autonomous vehicles on public roads under minimal oversight. This case study examines what happened, what the investigation revealed, and what the incident tells us about the governance challenges of autonomous systems.
Skills Applied:
- Applying the levels of autonomy framework (Section 19.1) to a real-world incident
- Analyzing the vigilance problem and human oversight failures (Section 19.5)
- Evaluating accountability in a multi-actor system (Chapter 17)
- Assessing regulatory adequacy for autonomous system testing
The Night of the Crash
The Vehicle
The Uber test vehicle was a 2017 Volvo XC90 SUV modified with autonomous driving hardware and software. The vehicle was equipped with lidar (laser-based distance sensing), radar, and cameras — a sensor suite designed to detect objects around the vehicle in all directions. The autonomous system was Uber's proprietary software, designed to perceive the environment, classify objects, predict their movement, and control the vehicle accordingly.
At the time of the crash, the system was operating at approximately SAE Level 3 — the vehicle was controlling all driving functions, but a human safety operator was required to monitor the system and intervene if necessary.
The Road
The crash occurred on North Mill Avenue in Tempe, a divided four-lane road with a posted speed limit of 45 mph. The road was straight, flat, and dry. Lighting conditions were dark — the area was not well-lit — but the vehicle's sensor systems were designed to operate in darkness using lidar and radar, which do not depend on visible light.
Elaine Herzberg was walking her bicycle from left to right across the road at a location without a marked crosswalk, approximately 360 feet from the nearest crosswalk. She was wearing dark clothing.
The Collision
The NTSB investigation reconstructed the events in detail:
5.6 seconds before impact: The autonomous system's sensors first detected Herzberg. The system classified her as an "unknown object."
5.2 seconds before impact: The system reclassified her as a "vehicle."
4.2 seconds before impact: The system reclassified her as "other" — neither vehicle nor pedestrian.
2.7 seconds before impact: The system reclassified her as a "bicycle" — but projected her path as stationary, not crossing the road.
1.5 seconds before impact: The system determined that emergency braking was needed. However, Uber had disabled the vehicle's automatic emergency braking system — a factory-installed safety feature of the Volvo XC90 — to prevent what Uber engineers described as "erratic vehicle behavior" caused by false positives. The system was programmed instead to alert the safety operator, who would then brake manually.
0 seconds: The vehicle struck Herzberg at approximately 39 mph. Vasquez looked up from her phone approximately 0.5 seconds before impact. She did not brake.
Herzberg died of her injuries at a hospital that night.
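The reclassification sequence above points to a concrete failure mode: a tracker that discards an object's motion history whenever its class label changes can never accumulate the observations needed to estimate a crossing trajectory, which would explain a "stationary" path projection for a steadily moving pedestrian. The sketch below illustrates that mechanism; the class structure and reset behavior are illustrative assumptions, not Uber's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    """Minimal track: a class label plus recent lateral positions (meters)."""
    label: str
    history: list = field(default_factory=list)

    def observe(self, label: str, lateral_pos: float) -> None:
        if label != self.label:
            # Illustrative flaw: a class change discards the motion history,
            # so the crossing velocity must be re-learned from scratch.
            self.label = label
            self.history = []
        self.history.append(lateral_pos)

    def predicted_lateral_velocity(self) -> float:
        """Crude per-tick velocity estimate; needs two points since the last reset."""
        if len(self.history) < 2:
            return 0.0  # insufficient history: object treated as stationary
        return self.history[-1] - self.history[-2]

# Replay the NTSB sequence: steady lateral movement, but a new label each tick.
track = TrackedObject(label="unknown")
for label, pos in [("unknown", 0.0), ("vehicle", 0.8), ("other", 1.6), ("bicycle", 2.4)]:
    track.observe(label, pos)

print(track.predicted_lateral_velocity())  # 0.0: projected as stationary
```

A tracker that retained its history across label changes would return 0.8 here and project the crossing path correctly.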
What the Investigation Revealed
The Safety Operator
Rafaela Vasquez was the designated safety operator — the human in the loop whose role was to monitor the autonomous system and intervene when necessary. The NTSB investigation found that Vasquez was visually distracted by her personal cell phone for approximately 34% of the trip preceding the crash. In-vehicle camera footage showed her repeatedly looking down at her phone in the minutes before the collision.
The NTSB's finding was direct: the "probable cause" of the crash was the safety operator's failure to monitor the driving environment. But the investigation went further, identifying the conditions that made this failure predictable.
Uber's Safety Culture
The NTSB investigation documented systematic failures in Uber's safety culture:
Removal of the second safety operator. Earlier in the testing program, Uber had used two safety operators per vehicle — one to monitor the driving environment and one to monitor the autonomous system's status on a laptop. In 2017, Uber reduced this to a single operator to cut costs and increase the pace of testing. This meant one person was responsible for both monitoring the road and monitoring the system — a dual-task demand that human factors research consistently shows degrades performance.
Disabling the automatic emergency braking system. Uber disabled the Volvo XC90's factory-installed automatic emergency braking system because it caused the vehicle to brake unexpectedly when the autonomous system's perception generated false positives. Rather than improving the perception system to reduce false positives, Uber removed the safety net — a decision that directly contributed to the fatality. Had the factory AEB been active, the vehicle would have braked or at least slowed before impact.
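The claim that an active AEB would have braked or at least slowed is easy to sanity-check with basic kinematics. Using the timeline figures (39 mph, about 17.4 m/s, and roughly 1.5 seconds of warning) plus two illustrative assumptions that are not NTSB figures, a 0.25 s brake actuation delay and 7 m/s² of deceleration on dry pavement:

```python
import math

MPH_TO_MS = 0.44704

speed = 39 * MPH_TO_MS    # ~17.4 m/s when braking was deemed necessary
warning_time = 1.5        # seconds before impact (from the timeline above)
actuation_delay = 0.25    # assumed brake actuation delay (s), illustrative
decel = 7.0               # assumed deceleration (m/s^2) on dry pavement, illustrative

# Distance to the pedestrian at the moment braking was deemed necessary:
gap = speed * warning_time

# Distance still available once the brakes actually bite:
gap_after_delay = gap - speed * actuation_delay

# Speed remaining after braking over that distance (v^2 = u^2 - 2*a*d):
v_sq = speed**2 - 2 * decel * gap_after_delay
impact_speed = math.sqrt(v_sq) if v_sq > 0 else 0.0

print(f"gap when braking needed: {gap:.1f} m")
print(f"impact speed with braking: {impact_speed / MPH_TO_MS:.1f} mph")
```

Under these assumptions the vehicle stops just short of the pedestrian; with a longer delay or gentler braking it would still shed most of its speed before impact, consistent with the NTSB's assessment.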
Inadequate operator training and monitoring. The NTSB found that Uber's safety operator training was insufficiently rigorous and that the company did not have effective systems for monitoring operator attentiveness during test drives. There was no in-cab monitoring system to detect operator distraction in real time.
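The missing in-cab monitoring need not have been sophisticated. Here is a minimal sketch of real-time distraction detection, assuming a hypothetical upstream gaze classifier that reports once per frame whether the operator's eyes are on the road (the thresholds are illustrative, not taken from any deployed system):

```python
class AttentionMonitor:
    """Hypothetical in-cab monitor: escalate when gaze leaves the road too long."""

    def __init__(self, fps: int = 10, alert_after_s: float = 2.0):
        self.frames_allowed = int(fps * alert_after_s)
        self.off_road_frames = 0

    def update(self, eyes_on_road: bool) -> str:
        if eyes_on_road:
            self.off_road_frames = 0
            return "ok"
        self.off_road_frames += 1
        if self.off_road_frames >= self.frames_allowed:
            return "alert"  # trigger audible/haptic warning; log for review
        return "ok"

monitor = AttentionMonitor(fps=10, alert_after_s=2.0)
# 25 consecutive off-road frames at 10 fps = 2.5 s of looking away.
states = [monitor.update(False) for _ in range(25)]
print(states.count("alert"))  # 6: every frame past the 2-second threshold
```

Production driver-monitoring systems add head-pose and eyelid tracking, but even a simple timer of this kind could have flagged the sustained phone glances that the in-vehicle camera recorded.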
Aggressive testing pace. Uber was in an intense competitive race with Waymo, Cruise, and other autonomous vehicle developers. Internal communications revealed pressure to increase the number of autonomous miles driven — a metric that incentivized more testing with less oversight. Safety, in this context, was treated as a friction to be minimized rather than a constraint to be respected.
No formal safety management system. The NTSB found that Uber's Advanced Technologies Group did not have a formal safety management system — a structured framework for identifying, assessing, and mitigating risks. The absence of such a system meant that known risks (such as operator distraction, perception limitations, and the consequences of disabling the AEB) were not systematically tracked, evaluated, or addressed.
Arizona's Regulatory Framework
Arizona was, at the time, one of the most permissive jurisdictions in the United States for autonomous vehicle testing. Governor Doug Ducey had issued an executive order in 2015 welcoming autonomous vehicle companies to the state, and the regulatory framework was deliberately light:
- No state pre-approval was required to test autonomous vehicles on public roads.
- No minimum safety standards were mandated.
- No reporting of test incidents was required (though some voluntary reporting existed).
- No minimum training requirements for safety operators were specified.
This permissive environment was intentional — Arizona was competing with California, Michigan, and other states to attract autonomous vehicle companies and the jobs and investment they brought. The regulatory framework prioritized economic development over safety governance.
The NTSB concluded that the regulatory environment was inadequate: "The lack of federal safety standards for automated driving systems and the absence of adequate state oversight of automated vehicle testing contributed to the conditions that led to this crash."
Accountability Analysis
The Many Hands Problem
The Uber fatality is a textbook illustration of the many hands problem from Chapter 17. Multiple actors contributed to the conditions that led to Herzberg's death, but each can point to factors beyond their individual control:
| Actor | Contribution | Claim of Non-Liability |
|---|---|---|
| Rafaela Vasquez (safety operator) | Was distracted by her phone; failed to monitor | "I was doing the best I could in impossible conditions — monitoring both the road and the system for hours on end." |
| Uber ATG (developer/deployer) | Disabled AEB, removed second operator, inadequate training, no safety management system | "The safety operator was hired specifically to intervene. She failed in her primary duty." |
| Uber corporate (management) | Set aggressive testing pace, prioritized speed over safety | "Our Advanced Technologies Group had operational autonomy." |
| Arizona (regulator) | Created a permissive testing environment with minimal safety requirements | "We trusted the companies to test responsibly. We are not automotive engineers." |
| Volvo (vehicle manufacturer) | Built the base vehicle; its safety systems were disabled by Uber | "We designed a vehicle with AEB. The modifications were Uber's decision." |
Every actor has a plausible claim. The harm is real and fatal. And when responsibility is spread this thinly across the chain, the result, as Mira observed in Section 17.1.4, is "functionally the same as if nobody is."
The Legal Outcomes
Criminal prosecution: Vasquez was charged with negligent homicide in 2020. In 2023, she pleaded guilty to a reduced charge of endangerment and was sentenced to three years of supervised probation. Uber itself was not criminally charged; the Yavapai County Attorney declined to prosecute the company, stating that there was no basis for criminal liability under existing law.
Civil settlement: Uber reached a financial settlement with Herzberg's family shortly after the crash. The terms were not publicly disclosed.
Regulatory consequences: Arizona temporarily suspended Uber's autonomous vehicle testing permit. Uber voluntarily halted all autonomous testing for nine months. When testing resumed, Uber implemented two safety operators per vehicle, re-enabled AEB, installed driver-monitoring cameras, and established a formal safety management system. In 2020, Uber sold its Advanced Technologies Group to Aurora Innovation.
Governance Lessons
The Vigilance Problem in Practice
The Uber fatality is the most consequential real-world demonstration of the vigilance problem described in Section 19.5. Vasquez was a human in the loop — formally responsible for monitoring and intervening. But the cognitive demands of her role — sustained attention to a system that operated correctly most of the time, for hours on end, in a monotonous driving environment — were incompatible with human cognitive capabilities.
Human factors research consistently demonstrates that sustained vigilance degrades after 20-30 minutes. Vasquez's shift involved hours of monitoring. Her distraction was a failure of individual discipline, but it was also a predictable consequence of a system designed around a human capability that does not reliably exist.
Safety as a System Property
The NTSB investigation emphasized that safety is a system property, not an individual characteristic. The crash resulted not from a single failure but from the interaction of multiple failures: perception limitations, disabled safety systems, inadequate operator training, insufficient monitoring, aggressive organizational culture, and permissive regulation. Addressing any one of these factors in isolation would not have prevented the crash; addressing several in combination likely would have.
This systems perspective has direct implications for governance. Regulating autonomous vehicles by setting a single safety metric (e.g., "the vehicle must be safer than a human driver") misses the interdependencies among technical, organizational, and regulatory factors. Effective governance must address the system as a whole — technology, organization, and regulatory environment in combination.
The Cost of Regulatory Competition
Arizona's permissive regulatory approach illustrates the risks of regulatory competition — the "race to the bottom" in which jurisdictions compete to attract industry by offering the lightest regulatory burden. When Arizona competed with California by offering fewer safety requirements, the result was an environment in which a company could test experimental autonomous vehicles on public roads without adequate safety systems, operator training, or incident reporting. The cost of that competitive advantage was Elaine Herzberg's life.
Discussion Questions
- Responsibility allocation. If you were designing a liability framework for this incident, how would you allocate responsibility among Vasquez, Uber, Arizona, and Volvo? Would you apply strict liability, negligence, or another framework? Justify your choice.
- The AEB decision. Uber disabled the Volvo's automatic emergency braking system because it caused "erratic behavior." Evaluate this engineering decision using the ethical frameworks from Chapter 6. Under what circumstances, if any, is it acceptable to disable a safety system to improve ride comfort?
- The regulatory race. Arizona competed with other states by offering a permissive regulatory environment. Is regulatory competition inherently problematic, or can it be structured to promote both innovation and safety? Propose a regulatory framework for autonomous vehicle testing that balances these interests.
- The human factors question. Should autonomous vehicle testing ever rely on a single human safety operator for sustained monitoring? What does human factors research suggest about the feasibility of this model? What alternatives exist?
Your Turn: Mini-Project
Option A: NTSB Report Analysis. Read the NTSB's full investigation report on the Uber-Herzberg crash (NTSB/HAR-19/03, available online). Write a 600-word analysis focusing on one aspect the case study did not cover in detail — such as the technical details of the perception failure, the organizational dynamics within Uber ATG, or the NTSB's specific regulatory recommendations.
Option B: Comparative Regulation. Compare the autonomous vehicle testing regulations in three U.S. states (e.g., Arizona, California, and Michigan) or three countries (e.g., the US, UK, and Germany). For each jurisdiction, identify: the approval process, safety requirements, operator requirements, and incident reporting obligations. Write a two-page comparative analysis identifying which jurisdiction's approach best balances innovation and safety.
Option C: Redesign the Safety System. You are an engineer hired to redesign the safety architecture for an autonomous vehicle testing program after the Uber fatality. Propose a comprehensive safety system that addresses the failures identified in this case study: perception redundancy, emergency braking, operator monitoring, dual-operator requirements, and organizational safety culture. Present your proposal in a structured two-page design document.
References
- National Transportation Safety Board. "Collision Between Vehicle Controlled by Developmental Automated Driving System and Pedestrian, Tempe, Arizona, March 18, 2018." Accident Report NTSB/HAR-19/03. November 19, 2019.
- Wakabayashi, Daisuke. "Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam." The New York Times, March 19, 2018.
- Boudette, Neal E. "Despite Fatal Crash, Uber's Self-Driving Car Test Resumed." The New York Times, December 20, 2018.
- Metz, Cade, and Neal E. Boudette. "Inside the Self-Driving Car That Killed a Pedestrian." The New York Times, June 7, 2018.
- Arizona Governor's Office. "Executive Order 2015-09: Self-Driving Vehicles — Testing and Operating on Public Roads." August 25, 2015.
- Parasuraman, Raja, and Dietrich Manzey. "Complacency and Bias in Human Use of Automation: An Attentional Integration." Human Factors 52, no. 3 (2010): 381-410.
- Matthias, Andreas. "The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata." Ethics and Information Technology 6, no. 3 (2004): 175-183.
- Stilgoe, Jack. "Who's Driving Innovation? New Technologies and the Collaborative State." Palgrave Macmillan, 2020.