Case Study: Auditing Airbnb: Racial Discrimination in Platform Marketplaces

"Airbnb is a platform, and platforms are never neutral. They encode choices — about whose identity is visible, whose is hidden, and who gets to participate on equal terms." — Adapted from Benjamin Edelman, Michael Luca, and Dan Svirsky, researchers

Overview

In 2014, a Black man named Gregory Selden booked an Airbnb listing in Philadelphia. The host accepted the booking. Selden then added a profile photo — revealing that he was Black. The host cancelled. Selden created a new profile with a white man's photo and booked the same listing for the same dates. The host accepted immediately.

Selden's experience was not an isolated incident. It was one data point in a pattern of racial discrimination on the Airbnb platform that became the subject of rigorous academic research, a viral social media campaign (#AirbnbWhileBlack), a lawsuit, a corporate reckoning, and a case study in how algorithmic auditing can expose discrimination in platform marketplaces.

This case study examines the landmark research that documented racial discrimination on Airbnb, the platform's response, and what the episode reveals about the challenges of accountability and audit in systems where algorithms and human decisions interact.

Skills Applied:

  • Applying audit study methodology (Section 17.2) to a real-world platform
  • Analyzing the interaction between algorithmic design and human discrimination
  • Evaluating corporate accountability responses
  • Connecting platform governance to the accountability gap framework


The Research: A Correspondence Audit at Scale

The Edelman, Luca, and Svirsky Study (2017)

Researchers Benjamin Edelman, Michael Luca (Harvard Business School), and Dan Svirsky designed what the chapter describes as an audit study: a controlled experiment in which matched test subjects, identical in all respects except for a single variable of interest (here, perceived race), interact with a system so that any differential treatment can be attributed to that variable.

The researchers created 20 Airbnb guest profiles. The profiles were identical in every respect — age, gender, booking history, review scores, and profile descriptions — except for one variable: the guest names. Ten profiles used distinctively African American names (e.g., Darnell Jackson, Tamika Williams). Ten used distinctively white names (e.g., Brett Murphy, Kristen O'Brien). The names were drawn from earlier correspondence-audit research, in which they had been shown to reliably signal perceived race to readers.

The researchers sent approximately 6,400 booking requests to hosts in five major U.S. cities: Baltimore, Dallas, Los Angeles, St. Louis, and Washington, D.C. Each request was identical in text, booking dates, and party size. The only difference was the guest's name on the profile.
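The matched-request design can be sketched in a few lines of code. This is an illustrative reconstruction, not the researchers' actual instrumentation: the name lists reuse the examples above, and the request text, field names, and assignment scheme are assumptions.

```python
import random

# Name lists are illustrative, reusing the examples cited in the study.
AFRICAN_AMERICAN_NAMES = ["Darnell Jackson", "Tamika Williams"]
WHITE_NAMES = ["Brett Murphy", "Kristen O'Brien"]

# Identical message text for every request (a hypothetical stand-in).
REQUEST_TEXT = "Hi, I'd like to book your place for these dates."

def assign_requests(listings, dates, party_size=2, seed=0):
    """Randomly assign each listing one guest name from one of the two
    groups. Every other field is held constant, so any difference in
    acceptance rates across groups is attributable to the name."""
    rng = random.Random(seed)
    requests = []
    for listing in listings:
        group = rng.choice(["black", "white"])
        name = rng.choice(
            AFRICAN_AMERICAN_NAMES if group == "black" else WHITE_NAMES
        )
        requests.append({
            "listing_id": listing,
            "guest_name": name,
            "group": group,
            "text": REQUEST_TEXT,
            "dates": dates,
            "party_size": party_size,
        })
    return requests
```

Randomizing the name per listing, rather than sending both versions to the same host, mirrors standard correspondence-audit practice: duplicate near-identical requests would risk alerting hosts to the experiment.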

The Findings

The results were stark:

Metric             | White Names | African American Names | Gap
Acceptance rate    | 50%         | 42%                    | 8 percentage points
Host response rate | 56%         | 49%                    | 7 percentage points

Guests with distinctively African American names were roughly 16% less likely to have their booking requests accepted than guests with distinctively white names.
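The headline figures follow from the table by simple arithmetic, and their statistical strength can be checked with a standard two-proportion z-test. A sketch, assuming the roughly 6,400 requests were split evenly between the two name groups (the summary above does not break out per-group counts):

```python
import math

# Published acceptance rates; per-group n is an assumption (even split).
p_white, p_black = 0.50, 0.42
n_white = n_black = 3200

# Absolute gap: 8 percentage points. Relative gap: 0.08 / 0.50 = 16%,
# i.e. guests with African American names were 16% less likely to be accepted.
abs_gap = p_white - p_black
rel_gap = abs_gap / p_white

# Two-proportion z-test under the pooled null of equal acceptance rates.
p_pool = (p_white * n_white + p_black * n_black) / (n_white + n_black)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_white + 1 / n_black))
z = abs_gap / se  # comes out above 6, far beyond conventional thresholds
```

At these sample sizes the z statistic is so large that an 8-point gap is essentially impossible to attribute to chance, which is what the study's "stark" framing reflects.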

The discrimination was pervasive across cities, property types, and host demographics. It persisted regardless of:

  • Whether the host was Black or white
  • Whether the property was a shared room, private room, or entire apartment
  • Whether the host had many reviews or few
  • Whether the listing was in a racially diverse neighborhood or a predominantly white one

The researchers also found that the discrimination affected both male and female guests — though the gap was somewhat larger for male guests with African American names.

The Platform Design Choice

A critical detail in the Airbnb study was that the discrimination was facilitated by a platform design choice: Airbnb required guests to display their real names and profile photos when requesting a booking. Hosts saw the guest's name and photo before deciding whether to accept or decline.

This design choice was not algorithmic in the narrow sense — no machine learning model was discriminating. But it was an architectural choice that created the conditions for human discrimination to operate at scale. By making guest race visible to hosts at the moment of decision, the platform's design enabled racial discrimination across thousands of transactions simultaneously.

This is a critical insight for the accountability framework presented in Chapter 17: the accountability gap does not require algorithmic decision-making to exist. It can arise whenever a platform's architecture distributes decision-making authority while obscuring the accountability for discriminatory patterns in those decisions.

Each individual host making a booking decision is a "human in the loop." But the platform's design aggregates thousands of individual discriminatory decisions into a systemic pattern. And no single actor — not the host, not the platform, not the guest — has clear accountability for the aggregate effect.


#AirbnbWhileBlack: The Social Media Reckoning

The academic research confirmed statistically what many Black travelers already knew from lived experience. In 2016, the hashtag #AirbnbWhileBlack went viral on Twitter (now X), as users shared stories of booking requests denied, reservations cancelled after profile photos were posted, and discriminatory messages received from hosts.

The stories were strikingly consistent:

  • A Black woman in Detroit booked a listing. The host accepted. When the host met her at the door, the host said the listing was "no longer available" and shut the door. The listing remained active on the platform the next day.

  • A Black man in San Francisco found that his booking requests were consistently declined. He created a second profile with a white friend's photo. Using the same text, same dates, same listings — requests were accepted.

  • A Black couple travelling in a rural area had their confirmed reservation cancelled hours before check-in. No explanation was provided. They spent the night in their car.

The #AirbnbWhileBlack campaign did something that the academic study alone could not: it gave the statistical pattern a human face. The combination of rigorous empirical evidence and compelling personal testimony created pressure that Airbnb could not ignore.


Airbnb's Response

The Murphy Report

In September 2016, Airbnb commissioned former U.S. Attorney General Eric Holder and Laura Murphy, former director of the ACLU's Washington Legislative Office, to conduct an independent review of the company's anti-discrimination policies. The resulting Murphy Report (September 2016) made recommendations including:

  1. Reduce the prominence of guest photos and names in the booking process — delaying identity disclosure until after the booking is confirmed.
  2. Implement an "Instant Book" feature that allows guests to book without host pre-approval, removing the opportunity for screening.
  3. Strengthen the non-discrimination policy with clear language, mandatory host agreement, and enforcement mechanisms.
  4. Expand the "Open Doors" policy — when a guest reports discrimination, Airbnb finds them an alternative listing or hotel room at Airbnb's expense.
  5. Increase diversity within Airbnb's workforce and leadership.
  6. Partner with civil rights organizations for ongoing monitoring and feedback.

What Changed

Airbnb implemented many of the recommendations, though with notable limitations:

Instant Book: Airbnb expanded the Instant Book feature, which allows guests to book without host pre-approval. By 2023, the majority of Airbnb listings used Instant Book. However, Instant Book remains optional for hosts — and hosts who opt out can still screen guests by name and photo before accepting.

Photo delay: Airbnb adjusted the booking flow so that guest profile photos are less prominent during the initial booking inquiry. However, photos remain visible on guest profiles, and hosts can view them before responding to booking requests in non-Instant Book listings.

Non-discrimination policy: Airbnb added a mandatory non-discrimination agreement to the host sign-up process. Hosts must click "I agree" to a policy stating they will not discriminate based on race, religion, national origin, gender, sexual orientation, or other protected characteristics. The policy is clear and visible — though enforcement relies on guest complaints, and the evidentiary standard for confirming discrimination in individual cases is high.

Enforcement: Airbnb committed to removing hosts who are found to have engaged in discrimination. However, proving discrimination in any single case is difficult — a host can offer pretextual reasons for declining a booking (scheduling conflict, maintenance issue, personal plans). The systemic pattern documented by the audit study is visible only in aggregate.

The Gap Between Policy and Practice

Subsequent research tested whether Airbnb's reforms actually reduced discrimination. A 2020 follow-up study by Edelman and colleagues found that the acceptance gap for guests with African American names had narrowed — but not disappeared. In markets where Instant Book was widely adopted, the gap was smaller. In markets where hosts retained screening authority, it persisted.

This finding illustrates a key tension in platform governance: the most effective intervention (removing hosts' ability to screen by race) conflicts with hosts' desire for control over who enters their home. Airbnb has navigated the tension by promoting Instant Book as the default, encouraged option while stopping short of requiring it for all hosts.


Analysis Through Chapter 17 Frameworks

The Accountability Chain

Actor | Role | Accountability Claim
Individual hosts | Make booking accept/decline decisions | "It's my home. I have the right to choose my guests."
Airbnb (platform) | Designed the system, sets the rules, profits from transactions | "We're a platform, not a hotel. Hosts make their own decisions."
Researchers (Edelman et al.) | Conducted the audit | Identified the problem; no authority to mandate change
Regulators | Oversee housing and civil rights law | Enforcement limited; the platform business model does not fit neatly into existing fair housing categories

This is the accountability gap in a platform context. The discrimination is real, measurable, and harmful. But responsibility is distributed among millions of individual hosts (each making "personal" decisions), a platform (that designed the architecture enabling those decisions), and a regulatory apparatus (built for hotels and landlords, not peer-to-peer digital marketplaces).

Audit Methodology

The Edelman, Luca, and Svirsky study is a textbook implementation of the audit study methodology described in Section 17.2. Its strengths include:

  • Controlled design: By holding all variables constant except perceived race (signaled by name), the researchers isolated the effect of race on booking outcomes.
  • Scale: 6,400 booking requests across five cities provided sufficient statistical power to draw robust conclusions.
  • External validity: The study tested real behavior on the actual platform, not hypothetical responses in a laboratory.
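The scale point can be made concrete with a back-of-envelope power calculation: what is the smallest acceptance-rate gap a sample of this size could reliably detect? A sketch using the normal approximation, with an assumed even split of requests across name groups (the study's own power analysis, if any, is not reproduced here):

```python
import math
from statistics import NormalDist

def mde_two_proportions(n_per_group, p_base=0.5, alpha=0.05, power=0.80):
    """Minimum detectable difference between two proportions for a
    two-sided test, using the standard normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided
    z_beta = NormalDist().inv_cdf(power)           # power requirement
    se = math.sqrt(2 * p_base * (1 - p_base) / n_per_group)
    return (z_alpha + z_beta) * se

# ~3,200 requests per group, with a baseline acceptance rate near 50%.
mde = mde_two_proportions(3200)  # about 0.035, i.e. roughly 3.5 points
```

With roughly 3,200 requests per group, gaps as small as about 3.5 percentage points are detectable at conventional significance and power levels, so the observed 8-point gap sits far outside what sampling noise could produce.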

Its limitations — also predicted by the chapter's discussion — include:

  • Mechanism opacity: The study demonstrates that discrimination occurs but cannot explain why individual hosts discriminate or what cognitive processes are involved.
  • Snapshot in time: The study captures behavior at one moment. Host behavior, platform design, and social norms all evolve.
  • Researcher ethics: The study involved deception (creating fictional profiles), raising methodological ethics questions about audit research.

The Many Hands Problem

The Airbnb case is a vivid illustration of the many hands problem. No single host's individual discrimination is sufficient to create the systemic pattern — it emerges from the aggregation of millions of independent decisions. The platform's design choice (making race visible) creates the conditions for discrimination, but the platform itself does not discriminate. The result is a pattern of harm with no clear single accountable party.


The Broader Implications

Platform Responsibility

The Airbnb case raises a fundamental question about platform accountability: When a platform designs a system that foreseeably enables discrimination, is the platform responsible for the resulting discriminatory outcomes — even if the discriminatory decisions are made by individual users?

The answer has implications far beyond Airbnb. Ride-sharing platforms where drivers can see passenger names and photos before accepting rides. Freelance marketplaces where clients can screen by name and location. Rental platforms where landlords can view applicant demographics before responding. Any platform that surfaces identity information in a decision-making context creates the conditions for discrimination — and must grapple with the accountability question.

The Limits of Design Solutions

Airbnb's response — Instant Book, reduced photo prominence, non-discrimination policies — represents a design-based approach to discrimination. Design solutions can reduce discrimination by limiting the information available at the moment of decision. But they cannot eliminate the discriminatory intent of individual users. A host who is determined to discriminate can find ways to do so within any design framework — by cancelling after meeting the guest, by setting prices that exclude certain populations, or by operating only in markets with low demographic diversity.

Design solutions are necessary. They are not sufficient.

The Audit as Catalyst

Perhaps the most significant lesson of the Airbnb case is the power of the independent audit as a catalyst for change. Without the Edelman, Luca, and Svirsky study, the #AirbnbWhileBlack stories would have remained anecdotal — powerful but dismissible by a platform that could claim individual incidents did not represent a pattern. The audit provided the statistical evidence that transformed individual stories into a documented systemic problem. The combination of empirical evidence and public pressure created the conditions for corporate response.

This vindicates the chapter's argument that external audits serve a function that internal accountability mechanisms cannot. Airbnb had no incentive to conduct this research on itself. The audit happened because independent researchers asked a question the platform would rather not have answered.


Discussion Questions

  1. Design vs. policy. Airbnb could have required Instant Book for all listings, eliminating hosts' ability to screen by race entirely. They chose not to, citing host preferences. Evaluate this decision. Is host autonomy a legitimate competing interest? How should platforms balance user autonomy with anti-discrimination obligations?

  2. The platform defense. Airbnb argues it is a platform, not a hotel chain — hosts, not Airbnb, make booking decisions. Evaluate this argument using the accountability frameworks from Section 17.1. At what point does a platform's design of the decision environment make it accountable for the decisions made within that environment?

  3. Audit ethics. The audit study involved researchers creating fictional profiles and booking listings under false pretenses. Is this ethically justified? What constraints should apply to audit research that involves deception? Consider the trade-off between the deception involved and the social good of exposing discrimination.

  4. Generalization. Apply the Airbnb audit methodology to a different platform marketplace — for example, a ride-hailing service, a freelance work platform, or an online lending marketplace. What would a well-designed audit study look like? What practical obstacles would you face?


Your Turn: Mini-Project

Option A: Platform Audit Design. Select a platform marketplace you use (e.g., a food delivery app, a freelance marketplace, a dating app, a ride-hailing service). Design an audit study to test for racial or gender discrimination on that platform. Specify: (a) the test variable, (b) the matched-pair design, (c) the sample size needed, (d) the outcome metric, and (e) the ethical considerations. You do not need to conduct the audit — just design it rigorously.

Option B: Policy Evaluation. Research Airbnb's current anti-discrimination policies (available on Airbnb's website). Evaluate: (a) what the policies cover, (b) how they are enforced, (c) what design changes have been implemented, and (d) whether you believe the policies are sufficient to address the discrimination documented in the audit study. Write a two-page evaluation with specific recommendations.

Option C: The Platform Accountability Debate. Write a 600-word position paper on the question: "Should platforms be held legally liable for discriminatory outcomes that result from their design choices, even when the individual discriminatory decisions are made by users?" Argue for or against, drawing on the Airbnb case and the accountability frameworks from Chapter 17.


References

  • Edelman, Benjamin, Michael Luca, and Dan Svirsky. "Racial Discrimination in the Sharing Economy: Evidence from a Field Experiment." American Economic Journal: Applied Economics 9, no. 2 (2017): 1-22.

  • Murphy, Laura W. "Airbnb's Work to Fight Discrimination and Build Inclusion: A Report Submitted to Airbnb." September 8, 2016.

  • Cui, Ruomeng, Jun Li, and Dennis J. Zhang. "Reducing Discrimination with Reviews in the Sharing Economy: Evidence from Field Experiments on Airbnb." Management Science 66, no. 3 (2020): 1071-1094.

  • Kakar, Venoo, et al. "The Visible Host: Does Race Guide Airbnb Rental Rates in San Francisco?" Journal of Housing Economics 40 (2018): 25-40.

  • Ge, Yanbo, et al. "Racial and Gender Discrimination in Transportation Network Companies." NBER Working Paper No. 22776, 2016.

  • Hannák, Anikó, et al. "Bias in Online Freelance Marketplaces: Evidence from TaskRabbit and Fiverr." Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW), 1914-1933. ACM, 2017.

  • Benner, Katie. "Airbnb Adopts Rules to Fight Discrimination by Its Hosts." The New York Times, September 8, 2016.

  • Leong, Nancy, and Aaron Belzer. "The New Public Accommodations: Race Discrimination in the Platform Economy." Georgetown Law Journal 105, no. 5 (2017): 1271-1322.