Case Study 26-1: Robert Williams and the Wrongful Arrest — Facial Recognition Accountability in Practice

Overview

On January 9, 2020, Robert Williams, a 42-year-old Black man living in Farmington Hills, Michigan, was arrested in his driveway, handcuffed in front of his wife and two young daughters, and transported to the Detroit Detention Center on a shoplifting charge he did not commit. He spent approximately 30 hours in custody before investigators, comparing the surveillance still with the man in front of them and his driver's license photograph, acknowledged that the suspect and Williams were clearly different people.

The arrest was made on the basis of a facial recognition match. The match was wrong.

Williams's case, which the American Civil Liberties Union subsequently litigated and publicized, became a landmark in the public understanding of facial recognition's risks. It was not the first wrongful identification by a facial recognition system, nor was it the last. But it was one of the first cases in which the algorithmic origin of the error was documented, the agency's process was subjected to detailed external scrutiny, and the individual wronged was represented by counsel capable of pressing for accountability. For these reasons, it illuminates the entire chain of failures — technical, procedural, legal, and political — that can transform an algorithm's error into a man's arrest record.


The Underlying Investigation

In October 2018, a man entered a Shinola watch store in Detroit and shoplifted approximately $3,800 worth of merchandise. Store security cameras captured video footage of the incident. Investigators obtained still images from that footage.

The quality of surveillance footage from retail environments is frequently poor: cameras are optimized for wide-area coverage rather than individual identification, and often operate at low resolutions and frame rates that yield blurry stills when individual frames are extracted. The still images from the Shinola store were consistent with this common limitation; the suspect's face was visible but not sharply resolved.

Detroit police submitted the still image to the Michigan State Police (MSP), which operates a facial recognition unit. The MSP ran the image through a system supplied by DataWorks Plus, a commercial vendor whose facial recognition product incorporates algorithms licensed from NEC and other providers. The system returned a list of candidate matches. Robert Williams's name appeared in the results.

It is not publicly known at what confidence score Williams appeared, or how many other candidates the system returned, or what criteria were used to select Williams for further investigation rather than other candidates. These details — which would be essential for any meaningful accountability — were not documented in materials that subsequently became public.
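The weight of these missing details can be illustrated with a back-of-the-envelope calculation. The sketch below is illustrative only: the gallery size, false match rate, and prior are hypothetical numbers chosen for arithmetic clarity, not the actual parameters of the DataWorks Plus system or the Michigan databases.

```python
# Illustrative sketch: why a "match" from a large one-to-many gallery search
# is weak evidence on its own. All numbers are hypothetical, not the actual
# parameters of any deployed system.

def expected_false_matches(gallery_size: int, false_match_rate: float) -> float:
    """Expected number of innocent people flagged in one gallery search."""
    return gallery_size * false_match_rate

def posterior_guilt(prior: float, true_match_rate: float,
                    gallery_size: int, false_match_rate: float) -> float:
    """Bayesian posterior that a flagged candidate is the actual suspect,
    assuming the suspect is in the gallery with probability `prior`."""
    hit = prior * true_match_rate          # chance the true suspect is flagged
    noise = gallery_size * false_match_rate  # expected innocent candidates
    return hit / (hit + noise)

# A gallery of 500,000 photos and a 1-in-100,000 per-comparison false match
# rate still yields several innocent candidates per search:
n, fmr = 500_000, 1e-5
print(expected_false_matches(n, fmr))             # 5.0
print(round(posterior_guilt(0.5, 0.99, n, fmr), 3))  # 0.09
```

Even with optimistic assumptions, the posterior probability that any single returned candidate is the true suspect is on the order of a few percent, which is why the confidence score, candidate count, and selection criteria matter so much for accountability.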

The Photo Array

Having identified Williams as a candidate, Detroit Police Department (DPD) investigators proceeded to a standard investigative step: constructing a photo array, or lineup, to present to a witness. Photo arrays are an eyewitness identification procedure developed in the forensic psychology literature to reduce suggestibility and confirmation bias. When properly conducted, they are administered by an officer who does not know which photograph depicts the suspect, with each photo shown individually, and without cues about the investigator's expectations.

In the Williams case, a DPD officer constructed a photo array that included Williams's photograph, taken from his driver's license. The array was presented to a loss prevention officer from the Shinola store — someone who had presumably seen the surveillance footage but had observed the suspect in the context of security review, not direct personal interaction.

Critically, the loss prevention officer was not told that the photo array had been constructed around a facial recognition match. This information would have been directly relevant: a witness who knew that the investigation began with an algorithmic identification, itself probabilistic, might have weighed the confidence of their selection differently. The loss prevention officer selected Williams's photograph.

With a facial recognition match and a photo array identification, investigators sought and obtained a warrant for Williams's arrest.


The Arrest

January 9, 2020. Robert Williams came home from work to find Detroit police officers waiting in his driveway. He was handcuffed and placed in a police car. His wife, Melissa Williams, witnessed the arrest along with their daughters, ages 2 and 5. Melissa Williams would later testify that she could not adequately explain to her daughters what was happening to their father.

Williams was transported to the Detroit Detention Center and booked. He was held overnight.

The following afternoon, Williams was taken to an interview room. An investigator placed two photographs on the table: a still from the Shinola surveillance footage and Williams's driver's license photograph. According to Williams's subsequent account, the investigator pointed to the surveillance still and asked if it was him. Williams looked at both photographs and said, "This is not me." He described the investigator's response as: "The computer must have made an error."

Williams was released, but not yet exonerated. He was freed on a personal bond with conditions that included surrendering his passport. The charge against him, second-degree retail fraud, remained active: Williams had a criminal case hanging over him, stemming from a crime he did not commit, based on a match he had been told almost immediately was likely an error.

It took several weeks for the Wayne County Prosecutor's Office to formally dismiss the charges.


ACLU Representation and Lawsuit

Williams was connected to the American Civil Liberties Union of Michigan. The ACLU provided legal representation and, in June 2020, filed a formal complaint with the Detroit Police Department on Williams's behalf, demanding policy reform. In April 2021, the ACLU filed a federal civil rights lawsuit on Williams's behalf against the City of Detroit and the relevant Detroit police officers.

The lawsuit alleged violations of Williams's Fourth Amendment rights (unlawful seizure), his Fourteenth Amendment rights (equal protection), and related claims under Michigan law. The complaint documented the investigative chain — facial recognition match, photo array, arrest warrant — and argued that law enforcement reliance on a facial recognition match as the initiating basis for the arrest, without corroborating evidence and given the known error rates of such systems, constituted an unreasonable seizure.

The City of Detroit settled the lawsuit in June 2024. The terms included $300,000 in compensation to Williams and commitments to specific policy reforms in the DPD's facial recognition practices.

Williams's Public Advocacy

Williams testified before the House Judiciary Subcommittee on Crime, Terrorism, and Homeland Security in July 2021, at a hearing on facial recognition technology and civil liberties. His testimony, delivered alongside testimony from ACLU attorneys and technology researchers, described the experience of the arrest and its effects on his family.

"It's like I'm being used as a guinea pig for this technology," Williams told lawmakers. He described the lasting psychological impact on his daughters, who had witnessed his arrest, and the professional and personal disruption of living under an active criminal charge while the case was pending.

His testimony was widely covered and contributed to legislative momentum — though, as of the time of this writing, no comprehensive federal facial recognition legislation has been enacted.


Detroit Police Department's Response

Initial Denial and Minimization

The Detroit Police Department's initial public statements, following the disclosure of Williams's case, minimized the significance of the incident. Department officials noted that facial recognition was used only as an "investigative lead" and that the subsequent photo array identification — a traditional eyewitness procedure — provided the basis for the arrest warrant. This framing placed the causal weight of the arrest on the eyewitness identification rather than the algorithmic match.

The framing was technically accurate but practically misleading. The photo array would not have been conducted without the facial recognition match; Williams would not have been in the array without being identified by the algorithm; the witness would not have seen his photograph without the algorithm selecting him from a database of potentially hundreds of thousands. The facial recognition match was not independent of the subsequent identification — it was the sole basis on which Williams entered the investigation at all.

Policy Reform

In June 2020, following the public disclosure of Williams's case, the Detroit Police Department announced a revised policy on facial recognition use. Key provisions included:

  • Facial recognition results may only be used as investigative leads and not as the sole basis for an arrest.
  • Any facial recognition match must be accompanied by additional corroborating evidence before an arrest warrant is sought.
  • Investigators must document when facial recognition was used in an investigation.
  • Facial recognition matches against a "most wanted" list will have enhanced human review requirements.

The policy changes were meaningful on their face (pun intended), but critics raised several concerns:

  • The policy was self-imposed rather than legislatively mandated, and compliance was not independently audited.
  • It did not specify what "corroborating evidence" means with sufficient precision to prevent a photo array identification — itself initiated by the facial recognition match — from being counted as independent corroboration.
  • It did not require disclosure to defendants when facial recognition was used in their investigation, limiting the ability of defense attorneys to challenge the chain of evidence.


The Constitutional and Legislative Dimensions

Fourth Amendment Questions

The Fourth Amendment prohibits unreasonable searches and seizures. Whether the use of facial recognition to generate an investigative lead constitutes a "search" within the meaning of the Fourth Amendment is an unresolved legal question, though under traditional doctrine the answer is probably no: submitting a photograph to a recognition system involves no physical intrusion and queries records that police are already entitled to access (DMV databases, prior arrest records).

The more directly relevant Fourth Amendment question in the Williams case is the reasonableness of the arrest warrant. Warrants must be supported by probable cause — a reasonable belief, grounded in specific facts, that a particular person committed a particular crime. The question is whether a facial recognition match (known to have significant false positive rates) plus a photo array identification (conducted without disclosing the facial recognition origin) constitutes probable cause.

Courts have not yet directly ruled on this question in a facial recognition context. The closest analogous case law comes from eyewitness identification cases, where courts have addressed the problem of suggestive identification procedures. When an identification procedure is impermissibly suggestive and unreliable, the resulting identification may be excluded from evidence. Applied to the Williams facts, presenting a photo array that the investigator knew was constructed around a facial recognition candidate, without disclosing this to the identifying witness, might constitute an impermissibly suggestive procedure.

This is not currently settled law, but it is a live area of constitutional litigation, and the Williams case's civil settlement, while providing remedy to one individual, did not produce the legal precedent that would constrain future departments.

What Legislative Reform Would Prevent Recurrence

Based on the documented facts of the Williams case, several legislative reforms would reduce the likelihood of similar incidents:

1. Accuracy Standards Before Deployment. Legislation should require that any facial recognition system used in law enforcement meet minimum accuracy standards across demographic groups, validated by independent testing, before deployment. Systems that demonstrate materially higher false positive rates for specific racial or gender groups should not be approved for law enforcement use without demonstrated mitigation measures.

2. Mandatory Documentation and Disclosure. Every use of facial recognition in an investigation should be documented in the investigative record. Defense attorneys should have the right to discover when facial recognition was used, the confidence score produced, the size and composition of the database searched, and the algorithm version used. In Williams's case, the absence of documentation requirements meant that the investigative chain was not clear even to the people who became responsible for challenging it.

3. Independent Corroboration Requirements. Statutes should define what constitutes adequate corroborating evidence before a facial recognition match can support an arrest warrant — and should specify that eyewitness identifications arranged on the basis of the facial recognition match itself do not qualify as independent corroboration. The photo array in the Williams case was initiated by the facial recognition result; it cannot logically provide independent confirmation of that result.

4. Notice to Witnesses. Witnesses presented with photo arrays in facial recognition-initiated investigations should be informed that the array was developed with algorithmic assistance. This information is relevant to the witness's epistemic confidence and should not be withheld.

5. Private Right of Action. Legislation modeled on the Illinois Biometric Information Privacy Act (BIPA) should create a private right of action for individuals wrongly arrested based on facial recognition matches below specified accuracy standards, enabling enforcement through civil suits rather than depending on a government enforcement agency to act.

6. Demographic Bias Audits. Any agency using facial recognition should be required to conduct and publicly report annual audits of the demographic distribution of matches generated, investigations initiated, and arrests made — creating a paper trail that can detect disparate impact patterns before they accumulate to the scale documented across the Williams, Parks, and Oliver cases.
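The audit described in the final reform could, at its simplest, be a tabulation of outcomes by demographic group with a disparity flag. The sketch below is a minimal illustration of that idea; the record fields, group labels, and the 2x disparity threshold are assumptions for the example, not statutory language or any agency's actual reporting format.

```python
# Minimal sketch of a demographic disparity check for a facial recognition
# audit report. Field names, group labels, and the 2x threshold are
# illustrative assumptions only.
from collections import Counter

def disparity_report(events: list[dict], stage: str) -> dict[str, float]:
    """Rate at which matches progress to `stage` (e.g. 'arrest'), by group."""
    matches = Counter(e["group"] for e in events)
    outcomes = Counter(e["group"] for e in events if e[stage])
    return {g: outcomes[g] / matches[g] for g in matches}

def flag_disparities(rates: dict[str, float], ratio: float = 2.0) -> list[str]:
    """Groups whose rate exceeds `ratio` times the lowest nonzero group rate."""
    floor = min(r for r in rates.values() if r > 0)
    return [g for g, r in rates.items() if r > ratio * floor]

# Hypothetical one-year log: each record is one facial recognition match and
# whether it led to an arrest.
events = [
    {"group": "A", "arrest": True},  {"group": "A", "arrest": False},
    {"group": "A", "arrest": False}, {"group": "A", "arrest": False},
    {"group": "B", "arrest": True},  {"group": "B", "arrest": True},
    {"group": "B", "arrest": True},  {"group": "B", "arrest": False},
]
rates = disparity_report(events, "arrest")
print(rates)                    # {'A': 0.25, 'B': 0.75}
print(flag_disparities(rates))  # ['B']
```

Even this crude arithmetic shows why a mandated paper trail matters: disparate impact of the kind documented in the Williams, Parks, and Oliver cases is only detectable if matches, investigations, and arrests are recorded by group in the first place.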


Broader Context: The Pattern

Robert Williams, Nijeer Parks, and Michael Oliver represent three documented wrongful arrests stemming from facial recognition misidentifications. But there is no national registry of facial recognition use in criminal investigations; no agency is required to report when facial recognition was used; and many agencies actively resist disclosing this information, citing law enforcement investigative privilege.

The documented cases are almost certainly the visible fraction of a larger number. All three individuals are Black men. All three were matched by algorithms tested and found to have significantly higher false positive rates for darker-skinned faces. All three were in jurisdictions where police departments had adopted facial recognition without comprehensive policies governing its investigative use. All three faced significant burdens in establishing that the technology was responsible for the investigation of them, because the information was not routinely disclosed.

The Williams case is thus not merely a story about one man and one wrong match. It is a case study in what a system without adequate governance looks like: a technology adopted without accuracy standards, used without documentation requirements, generating leads without disclosure obligations, producing arrests without independent corroboration requirements, and creating harms concentrated in communities that historically face the steepest barriers to legal accountability.


Discussion Questions

  1. The Detroit Police Department framed the facial recognition match as an "investigative lead" and pointed to the photo array as the actual basis for the arrest. Evaluate this framing. Does it adequately describe the causal chain that led to Williams's arrest?

  2. What would "independent corroboration" mean in a facial recognition investigation context? Propose a definition specific enough to inform a legislative standard.

  3. Williams's case became public because he had the personal resources and the legal connections to be represented by the ACLU. How many similar cases might exist that have not been documented? What systemic changes would produce visibility into the scope of the problem?

  4. Detroit's post-incident policy reform was self-imposed. What is the argument for self-imposed reform as sufficient? What is the argument for legislative mandate? Which do you find more persuasive, and why?

  5. Consider the photo array procedure. What is the ethical obligation of a law enforcement officer conducting a photo array when they know the array was constructed from a facial recognition match? Should that information be disclosed to the identifying witness?


This case study connects to Chapter 30 (AI in the Criminal Justice System) and Chapter 20 (Liability Frameworks for AI).