Case Study 30-1: Frances Haugen and the Facebook Papers — A Surveillance Studies Analysis

Overview

In September 2021, an anonymous source provided the Wall Street Journal with thousands of pages of Facebook's internal research documents — research showing that the company knew its Instagram platform was harmful to teenage girls' mental health, that it knew its news feed algorithm amplified divisive and harmful content, and that it had repeatedly prioritized growth and engagement over safety when the two came into conflict.

The source revealed herself in October 2021: Frances Haugen, a former product manager in Facebook's civic integrity unit. Haugen subsequently testified before the Senate Commerce Committee, provided documents to regulatory agencies in multiple countries, and met with lawmakers across Europe. Her disclosures triggered Congressional hearings, regulatory investigations, and a fundamental reappraisal of social media platform accountability.

This case study examines the Haugen disclosures through the analytical lens of surveillance studies, focusing on three dimensions: the surveillance by Facebook that Haugen documented; the surveillance techniques she used to gather and preserve evidence; and the organizational surveillance directed at Haugen and her disclosures.


The Surveillance Haugen Documented

Haugen's disclosures revealed that Facebook was engaged in an intensive program of internal surveillance of its own platform's effects — research that the company systematically obscured from public view.

Internal research on mental health: Facebook's internal research team, whose work Haugen disclosed, had conducted studies finding that Instagram made body-image issues worse for roughly one in three teenage girls who already felt bad about their bodies; that among teenage girls who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire for self-harm to Instagram; and that internal researchers described the platform as capable of "deepening" depression and anxiety for vulnerable users.

Facebook had not published this research. The company publicly stated that its own research showed Instagram was "safe." The internal research, visible to senior leadership including Mark Zuckerberg, contradicted the public position.

Algorithm manipulation research: Haugen's documents included research on Facebook's news feed algorithm, showing that the algorithm prioritized content that generated strong emotional reactions — including anger and outrage — because such content produced higher engagement metrics. Internal researchers had proposed reducing this amplification of emotionally charged content; the proposals were rejected or not implemented because of concerns about reducing engagement.

Coordinated inauthentic behavior research: Documents showed Facebook's internal awareness of coordinated campaigns using the platform for political manipulation and the company's decisions about how much to invest in countermeasures.

These internal research programs constitute surveillance in the original sense: observation producing data that is retained and used for decision-making. The company was intensively watching the effects of its own platform, gathering detailed behavioral data about users' psychological responses, and using this data for internal decision-making while withholding it from the public, from regulators, and from the users themselves.


The Evidence Gathering: DLP Risk and Methodology

Haugen gathered approximately 22,000 pages of internal documents before leaving Facebook. Her method was straightforward: she accessed documents through her regular work at the company and preserved copies for later disclosure.

Haugen has been careful not to provide complete details about exactly how she preserved the documents, but what is known publicly:

- She accessed documents through her regular work access as a product manager
- She copied or preserved documents before resigning from Facebook
- She worked with a lawyer experienced in whistleblower cases before approaching the Wall Street Journal

From a data loss prevention (DLP) perspective, Haugen's data transfers fit exactly the pattern that DLP systems are designed to detect. Several factors may explain why she was not identified before disclosing:

Access legitimacy: Because Haugen had legitimate work access to the documents as a product manager, her usage patterns may not have been anomalous enough to trigger user and entity behavior analytics (UEBA) alerts.

Timing and resignation: She resigned before disclosing — meaning any investigation initiated after identifying anomalous behavior would have encountered an ex-employee rather than a current employee subject to immediate adverse action.

The specific nature of the documents: The documents included internal research reports and presentations — not code, financial data, or customer data, which are more commonly the targets of DLP monitoring.
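The access-legitimacy point can be made concrete with a toy detector of the kind UEBA products build on: score each user-day against that user's own baseline on other days, and alert on large deviations. The log schema, user names, and numbers below are invented for illustration; real systems add many more signals (time of day, destination, document sensitivity).

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical per-user, per-day document access counts (invented data).
access_log = (
    [("analyst_a", day, n) for day, n in enumerate([12, 9, 14, 11, 80])]  # day 4: bulk pull
    + [("pm_b", day, n) for day, n in enumerate([30, 28, 35, 31, 33])]    # steady, habitual volume
)

def flag_anomalies(log, z_threshold=3.0):
    """Flag (user, day, count) tuples whose access volume deviates sharply
    from that user's baseline on the *other* days (leave-one-out z-score)."""
    by_user = defaultdict(list)
    for user, day, count in log:
        by_user[user].append((day, count))

    flagged = []
    for user, rows in by_user.items():
        for day, count in rows:
            baseline = [c for d, c in rows if d != day]
            if len(baseline) < 2:
                continue  # not enough history to form a baseline
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and (count - mu) / sigma > z_threshold:
                flagged.append((user, day, count))
    return flagged

print(flag_anomalies(access_log))  # the bulk-access day stands out; the steady user does not
```

Note how pm_b's consistently high volume never triggers: because the baseline is per-user, access that is legitimate and habitual looks normal. That is one plausible mechanism behind the factor above, since gradual collection within an ordinary work pattern produces no spike to score.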

Facebook's subsequent investigation into Haugen's document collection was conducted after her public disclosure, by which time she had already testified before Congress (creating significant additional legal protection).


Legal Protections

Haugen's disclosures fell into multiple legal protection categories:

SEC Whistleblower Program (Dodd-Frank): Haugen filed a whistleblower complaint with the SEC, alleging that Facebook's public statements about its platform's safety contradicted its internal research — a potential securities fraud issue (companies cannot knowingly make material misrepresentations to investors). SEC whistleblower filings receive strong legal protection against retaliation.

Congressional testimony: Employees who provide testimony to Congress are protected from retaliation under federal law (18 U.S.C. § 1513). Haugen's Senate testimony was the most publicly visible element of her disclosures.

First Amendment and media law: Disclosures to journalists about matters of public concern receive strong protection under First Amendment doctrine and the public-interest defenses built on it, which makes employer lawsuits over such disclosures legally risky. While Facebook technically could have sued Haugen for breaching her confidentiality obligations (she had signed an NDA), a public-interest defense in such a suit would have been strong.

The NLRA: Haugen's disclosures touching on working conditions and organizational culture could also receive protection under the National Labor Relations Act (NLRA) as discussion of working conditions, though this protection would be secondary to the stronger protections under securities law and the statutes covering Congressional testimony.


Facebook's Response and the Surveillance Turn

Facebook's initial response to the Wall Street Journal's reporting was to characterize the documents as misrepresenting the company's practices. After Haugen revealed herself, Facebook's communications shifted to challenging her credibility and emphasizing her limited scope of access.

The company also launched an internal investigation to determine how she had obtained the documents — using its own security and access monitoring infrastructure to trace the disclosures back through the data access patterns. This investigation used the same UEBA and access logging systems that the company uses for security monitoring — turned retroactively on a now-departed employee who had publicly disclosed wrongdoing under legal protection.
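A retroactive trace of this kind is, at its core, a set intersection over access logs: take the set of documents known to have leaked and rank users by how much of that set their access history covers. A minimal sketch, with users, document IDs, and records all invented for illustration:

```python
from collections import defaultdict

# Hypothetical access records: (user, document_id). Invented data.
access_records = [
    ("pm_b", "doc_001"), ("pm_b", "doc_002"), ("pm_b", "doc_003"),
    ("pm_b", "doc_004"), ("eng_c", "doc_002"), ("eng_c", "doc_007"),
    ("analyst_a", "doc_001"), ("analyst_a", "doc_009"),
]

leaked_docs = {"doc_001", "doc_002", "doc_003", "doc_004"}

def rank_by_coverage(records, leaked):
    """Rank users by the fraction of the known-leaked document set
    they accessed, highest coverage first."""
    seen = defaultdict(set)
    for user, doc in records:
        if doc in leaked:
            seen[user].add(doc)
    return sorted(
        ((user, len(docs) / len(leaked)) for user, docs in seen.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

print(rank_by_coverage(access_records, leaked_docs))
# pm_b covers 100% of the leaked set; the other users cover far less
```

The asymmetry is worth noting: the same logging that failed to flag the collection in real time makes the collector easy to identify after the fact, once the leaked set is known.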

Facebook reportedly considered filing suit against Haugen for breaching her confidentiality agreement. The suit was never filed, likely because:

- The legal risk of losing (and of exposing additional material through discovery) was high
- The reputational risk of suing a whistleblower who had testified before Congressional committees was significant
- Haugen's legal team had structured her disclosures to maximize legal protection

The non-suit decision illustrates an important dynamic in corporate whistleblowing: the legal protections available to well-prepared whistleblowers, combined with the reputational risks of retaliating against someone testifying before Congress, can effectively constrain organizational retaliation even against a highly consequential disclosure.


Impact and Limitations

The Facebook Papers disclosures had significant immediate effects: multiple Congressional hearings, regulatory investigations in the EU and UK, accelerated work on the EU's Digital Services Act, and sustained public discourse about social media platform accountability.

The limitations of the impact are also instructive:

- Facebook (subsequently renamed Meta) did not fundamentally change its news feed algorithm or its approach to engagement metrics
- The SEC whistleblower proceeding remains ongoing, with no public resolution
- The most significant regulatory response came in the EU, not the U.S.
- Facebook's stock declined significantly in the weeks after the disclosures but subsequently recovered

The Haugen case illustrates both the potential and the limits of whistleblowing as a mechanism for corporate accountability: the disclosures generated enormous public attention and regulatory response, but the structural conditions that produced the documented harms — prioritizing engagement metrics over user wellbeing — remained substantially intact. Whistleblowing exposed the problem; it did not fix it.


Discussion Questions

  1. Haugen's disclosures revealed that Facebook was conducting intensive surveillance of its platform's mental health effects — research it concealed from users, parents, and regulators. In what sense is this internal research itself a surveillance problem? Who should have had access to this research?

  2. Haugen worked with an attorney before approaching journalists. This allowed her to structure her disclosures to maximize legal protection. What does the fact that well-prepared whistleblowers (with access to legal counsel) are substantially better protected than unprepared whistleblowers tell us about the equity of whistleblower protection?

  3. Facebook's post-disclosure investigation used the company's own monitoring infrastructure to trace Haugen's document collection. This investigation was conducted against a former employee who had already testified before Congress under legal protection. What are the limits on organizational investigation of post-employment disclosures?

  4. The most significant regulatory response to Haugen's disclosures came in the EU through the Digital Services Act. What does this geographic distribution of regulatory response suggest about the relationship between regulatory framework and whistleblowing effectiveness?

  5. Haugen described her motivation as wanting to give the public the information they needed to make informed decisions about social media, rather than leaving these decisions to internal Facebook deliberations. Apply the proportionality, exhaustion, and necessity principles to Haugen's decision. Does her disclosure satisfy each principle?