Case Study 22-1: ShotSpotter in Chicago — Acoustic Surveillance, Race, and the Evidentiary Chain

Background

Chicago has one of the highest rates of gun violence among major American cities, and it was one of the most extensive users of ShotSpotter technology. The Chicago Police Department (CPD) first contracted for ShotSpotter in 2012 and eventually deployed the system across large portions of the city's South and West Sides: neighborhoods that are predominantly Black and Latino, and that have historically experienced both high rates of gun violence and aggressive police presence.

Between 2016 and 2020, CPD received approximately 40,000 ShotSpotter alerts. The system's presence became deeply embedded in CPD's operational model: officers were dispatched to ShotSpotter alert locations as a matter of course, and ShotSpotter data was incorporated into crime reporting and analysis.

The system's defenders argued it was essential: gunshots are frequently not reported by community members who distrust police or fear retaliation, and ShotSpotter provided documentation of gunfire events that would otherwise go unrecorded. Critics argued it was a racially targeted surveillance system that wasted police resources, produced unreliable evidence, and subjected Black and Latino residents to disproportionate police contact.

In 2024, after a public controversy that had escalated from 2021 onward, the administration of Mayor Brandon Johnson declined to renew the ShotSpotter contract, and the system was decommissioned that September. The decision made Chicago the largest American city to drop the technology, a significant moment in the debate over algorithmic surveillance in law enforcement.

The Michael Williams Case

The most consequential single controversy involved the prosecution of Michael Williams, a 65-year-old Black Chicago resident, for the 2020 shooting death of a 25-year-old man named Safarian Herring.

Williams maintained he had not shot Herring — that he had driven Herring somewhere, that Herring had been shot by someone else in a separate incident, and that Herring had then gotten back into Williams's car, where he died. Williams was driving Herring to a hospital when he was stopped by police. He was held in pretrial detention for 11 months while prosecutors sought to build their case.

The ShotSpotter Evidence

ShotSpotter was central to the prosecution's theory. The system had registered a sound, ultimately classified as a gunshot, at a specific time and location, one consistent with Williams having been in the area in his car. Prosecutors argued that this alert placed a gunshot where Williams and Herring were present, supporting the theory that Williams had shot Herring inside the car.

The Reclassification

A 2021 investigation by Motherboard/Vice revealed a disturbing wrinkle. ShotSpotter's original automated analysis of the relevant sound had classified it not as a gunshot but as a firecracker. A human ShotSpotter analyst then reclassified the sound as a gunshot — after the fact, during the investigation of Herring's death. The reclassification was apparently made at the request of or in coordination with law enforcement.

The defense obtained documentation of this reclassification sequence. The ShotSpotter alert had not simply detected a gunshot and reported it; an analyst had retroactively decided the sound was a gunshot, and this reclassification became a cornerstone of the prosecution's case.

The charges against Michael Williams were ultimately dropped in July 2021, after roughly 11 months of pretrial detention. The dismissal came at prosecutors' own request: after the defense challenged the ShotSpotter evidence and sought an evidentiary hearing on its reliability, prosecutors withdrew the evidence and told the court they had insufficient proof to proceed.

Analysis

The Evidentiary Chain and Its Weaknesses

The Williams case illustrates a fundamental problem with algorithmic evidence in criminal proceedings: the algorithm is presented as objective, but its output passes through multiple human hands and decisions before it becomes evidence. In this case:

  1. A sound occurred in the environment
  2. ShotSpotter sensors captured it
  3. Automated algorithms classified it initially as a firecracker
  4. A human analyst, aware that the sound was associated with a specific criminal investigation, reclassified it as a gunshot
  5. The reclassified result was presented to prosecutors as ShotSpotter data
  6. Prosecutors incorporated it into a theory of the crime
  7. The theory was used to justify 11 months of pretrial detention

At step 4, the "objective" algorithmic evidence became a human judgment — one that may have been influenced, consciously or unconsciously, by knowledge of the investigation context. This contamination of algorithmic evidence by human interpretation is not unique to ShotSpotter; it is a general feature of how machine learning outputs are actually used in institutional settings.
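
The failure at step 4 is, at bottom, a provenance failure: the record presented to prosecutors as "ShotSpotter data" carried no visible trace of its classification history. The sketch below illustrates the kind of audit trail a court or defense team would need. The schema and field names (ClassificationEvent, source, context) are hypothetical, not ShotSpotter's actual data model, and the dates are invented; the point is that two simple provenance checks, comparing the final label to the initial automated label and flagging human changes made in the context of a specific investigation, would have surfaced the problem immediately.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Hypothetical schema for illustration only; this is not ShotSpotter's
# actual data model, and the dates below are invented.

@dataclass
class ClassificationEvent:
    label: str                     # e.g. "firecracker", "gunshot"
    source: str                    # "automated" or "human_analyst"
    timestamp: datetime
    context: Optional[str] = None  # e.g. "routine review", "investigation request"

@dataclass
class AcousticEvent:
    event_id: str
    history: list = field(default_factory=list)

    def current_label(self) -> str:
        return self.history[-1].label

    def red_flags(self) -> list:
        """Provenance checks a reviewing court or defense expert might apply."""
        flags = []
        first, last = self.history[0], self.history[-1]
        if first.label != last.label:
            flags.append(f"final label '{last.label}' differs from initial "
                         f"automated label '{first.label}'")
        if any(e.source == "human_analyst" and e.context == "investigation request"
               for e in self.history):
            flags.append("label changed by a human analyst in the context of "
                         "a specific investigation")
        return flags

# The Williams-case sequence, reconstructed from public reporting:
event = AcousticEvent("evt-001", [
    ClassificationEvent("firecracker", "automated", datetime(2020, 5, 31)),
    ClassificationEvent("gunshot", "human_analyst", datetime(2020, 6, 15),
                        context="investigation request"),
])

for flag in event.red_flags():
    print("RED FLAG:", flag)
```

Storing the full classification history rather than overwriting the label is the design decision that matters here: an overwrite destroys exactly the information the defense needed.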

Racial Asymmetry in Surveillance Effects

The ShotSpotter deployment in Chicago was geographically concentrated in majority-Black and majority-Latino neighborhoods. This meant that the errors of the system — both false positives (police dispatched for non-gunshot sounds) and problematic evidentiary reclassifications like the Williams case — fell disproportionately on Black and Latino residents.

Residents of predominantly white Chicago neighborhoods were not subject to ShotSpotter's error rate because ShotSpotter was not deployed where they lived. The risk of police being dispatched to your block on an erroneous ShotSpotter alert, or of having ShotSpotter data used in a prosecution against you, was not distributed equally across the city's population. It was concentrated in communities that already bore the heaviest burden of police contact and criminal justice system involvement.

This geographic concentration is not accidental. ShotSpotter is deployed in neighborhoods that police designate as "high-crime." Those designations themselves reflect decades of racially unequal policing — neighborhoods that have historically received more aggressive enforcement are recorded as having more crime, which justifies more surveillance, which produces more records of crime, in a self-reinforcing cycle.
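
A toy simulation can make this cycle concrete. The sketch below uses entirely invented parameters and is not calibrated to any Chicago data. It assumes two districts with identical true rates of gunfire, where sensor coverage inflates how much of that gunfire enters the official record, and where each budget cycle shifts sensors toward whichever district recorded more incidents. Under those assumptions, the district that starts with more coverage always appears to have more gunfire, so the initial deployment choice compounds into permanent lock-in.

```python
# Toy model of the self-reinforcing surveillance cycle described above.
# All parameters are invented for illustration; the point is the
# qualitative dynamic, not the numbers.

TRUE_RATE = 100.0    # actual gunfire incidents per district per period
                     # (identical in both districts by construction)
BASE_REPORT = 0.2    # fraction recorded without sensors (911 calls only)
SENSOR_BOOST = 0.6   # additional fraction recorded under full sensor coverage
REALLOCATE = 0.10    # share of coverage shifted each budget cycle

def recorded(coverage: float) -> float:
    """Incidents that enter the official record, given sensor coverage in [0, 1]."""
    return TRUE_RATE * (BASE_REPORT + SENSOR_BOOST * coverage)

coverage_a, coverage_b = 0.6, 0.4   # district A starts with a modest advantage

for period in range(8):
    rec_a, rec_b = recorded(coverage_a), recorded(coverage_b)
    print(f"period {period}: A coverage {coverage_a:.2f} records {rec_a:5.1f} | "
          f"B coverage {coverage_b:.2f} records {rec_b:5.1f}")
    # Budget rule: shift sensors toward whichever district *recorded* more.
    # Because coverage itself inflates the record, A always wins this
    # comparison, and the initial disparity compounds into lock-in.
    if rec_a >= rec_b:
        delta = min(REALLOCATE, coverage_b)
        coverage_a, coverage_b = coverage_a + delta, coverage_b - delta
    else:
        delta = min(REALLOCATE, coverage_a)
        coverage_a, coverage_b = coverage_a - delta, coverage_b + delta
```

Both districts experience 100 true incidents per period throughout; within a few periods the record shows district A with four times the recorded gunfire of district B, a statistic that then reads as justification for the very deployment that produced it.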

The 11 Months

An important element of the Williams case that abstract analysis can obscure: Michael Williams spent roughly 11 months in pretrial detention, incarcerated, separated from his family, and unable to work, based in significant part on evidence that was later shown to be unreliable and that had been generated through a process not disclosed to his defense. He was 65 years old. His case ended not in acquittal at trial but in dismissal, meaning the prosecution's case collapsed; no jury ever weighed his guilt or innocence.

The 11 months are not an abstraction. They represent the real-world cost of algorithmic surveillance systems deployed in contexts where their errors fall on people with the fewest resources to challenge them.

Discussion Questions

  1. The ShotSpotter company's response to the Williams case emphasized that the reclassification was performed by a trained human analyst and that all ShotSpotter data goes through human review. Does this response address the core problem identified in the case, or does it miss it? What is the core problem?

  2. If ShotSpotter had been deployed equally across all Chicago neighborhoods — including affluent white neighborhoods on the North Side — do you think the political controversy over its accuracy and its racial disparity effects would have emerged sooner, later, or not at all? What does your answer suggest about the relationship between who is surveilled and whether surveillance systems are scrutinized?

  3. The chapter discusses the concept of algorithmic authority — the tendency of decision-makers to treat algorithm outputs as objective and authoritative. In the Williams case, how was algorithmic authority mobilized in the prosecution's case? At what point did the "objectivity" of ShotSpotter data break down, and what should have been the warning signs?

  4. Chicago ended its ShotSpotter contract in 2024. Other cities, including some with similar demographics and gun violence patterns, have maintained or expanded their contracts. What factors explain this variation? What would constitute sufficient evidence to cause a city to discontinue a surveillance contract?

  5. Michael Williams's charges were dropped, but he cannot recover the 11 months of pretrial detention he served. What legal remedies, if any, should be available to people who suffer harm as a result of unreliable algorithmic evidence? Who should bear responsibility — the technology company, the police department that deployed the system, the prosecutors who used the evidence, or some combination?


This case study connects to Chapter 2 (social sorting), Chapter 25 (urban sensor systems), and Chapter 38 (predictive policing and algorithmic decision-making).