Case Study 02: Frances Haugen and The Facebook Papers
Facebook's News Feed: A Decade of Optimization Against Users
Background
In September 2021, the Wall Street Journal began publishing a series of investigative articles, later known as "The Facebook Files," based on thousands of internal Facebook documents provided by a whistleblower. The whistleblower identified herself shortly after publication began: Frances Haugen, a former product manager on Facebook's civic integrity team, who had left the company in May 2021 after copying the documents she later disclosed.
The disclosure was the largest internal leak in Facebook's history. The documents — subsequently shared with a consortium of news organizations that published additional investigations, and with the US Congress — provided an unprecedented view into Facebook's internal research on the harms produced by its platforms, the company's knowledge of those harms, and the decisions the company had made about how to respond to that knowledge.
This case study examines what the Facebook Papers revealed, what the disclosure shows about the relationship between internal knowledge and public accountability, and what Haugen's congressional testimony established about the gap between Facebook's public claims and its internal understanding of its own products.
Who Is Frances Haugen?
Frances Haugen joined Facebook in June 2019 as a product manager on the civic integrity team — the group responsible for studying and addressing threats to democratic processes on Facebook's platforms. She had previously worked at Google and Pinterest. She was recruited to Facebook specifically because of her background in algorithmic product management and her interest in the problem of misinformation.
At Facebook, Haugen worked on issues related to election integrity, content ranking, and the recommendation algorithm. In this role, she had access to internal research documents, cross-functional meeting notes, and product decision memos that were not visible outside the company. She became increasingly concerned, she later testified, about a pattern she observed: internal research consistently identified harms, the integrity team consistently proposed interventions, and those interventions were consistently blocked or deprioritized when they conflicted with engagement metrics.
In early 2021, Facebook disbanded the civic integrity team, citing the end of the 2020 election cycle as the reason. Haugen viewed the disbanding as evidence that the company had concluded that integrity research was an election-specific expenditure rather than an ongoing operational necessity. She decided to leave the company and to copy the internal documents she had access to as the basis for a disclosure.
She spent several months working with a lawyer specializing in whistleblower cases to determine the appropriate channels for disclosure. She filed complaints with the US Securities and Exchange Commission, alleging that Facebook had made materially misleading public statements about the known harms of its platforms. She provided documents to the Wall Street Journal and, subsequently, to the consortium of news organizations.
Timeline of the Disclosure
September 2021: The Wall Street Journal Series
The WSJ published "The Facebook Files" beginning September 13, 2021. The series ran over several weeks and covered multiple topics. The most widely discussed installments included:
"Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show" — reporting on the internal "Teen Mental Health Deep Dive" research, including the finding that 32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse, that the platform's algorithmic amplification of idealized body content drove social comparison, and that the company had been aware of these findings since at least 2019 while continuing to publicly minimize concerns about Instagram's effects on teens.
"Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead." — reporting on the MSI algorithm change, the internal warnings that the change would amplify outrage, the evidence that it had done exactly that, and the subsequent decisions not to implement interventions that would have mitigated the problem.
The series also returned to Facebook's internal research culture around emotional effects, placing the company's earlier emotional contagion findings in the broader context of what the documents revealed about internal attitudes toward emotional manipulation.
October 2021: Haugen Goes Public
On October 3, 2021, the night before she was scheduled to testify before the US Senate, Haugen identified herself as the whistleblower in a "60 Minutes" interview. The interview was broadcast to a national audience and was widely discussed.
On October 5, 2021, Haugen testified before the Senate Commerce Committee's Subcommittee on Consumer Protection, Product Safety, and Data Security. The testimony was notable for several reasons: it was unusually substantive (senators had been briefed on the documents and asked specific, informed questions), Haugen was a credible witness with direct professional experience of the systems she described, and the hearing achieved bipartisan engagement in a Senate that had found little common ground on technology regulation.
Haugen's central argument in her testimony: "The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed."
October–November 2021: The Facebook Papers
Following Haugen's congressional testimony, the documents she had provided were shared with a consortium of news organizations including the Associated Press, ABC News, the Atlantic, the Daily Beast, the Guardian, the Washington Post, and others. This consortium published a wave of additional reporting — collectively known as the Facebook Papers — that covered topics beyond the WSJ series, including:
- Algorithmic amplification of political violence and its role in international contexts including Ethiopia and Myanmar
- Failures of content moderation in non-English languages
- Internal debates about whether Facebook's platforms caused democratic harm
- The company's knowledge of coordinated inauthentic behavior campaigns and the limits of its responses
- Internal metrics showing the declining engagement of teen users and the company's strategies to respond
Key Revelations and Their Significance
The Angry Emoji Weighting
One of the most widely reported findings from the Papers was confirmation that Facebook's algorithm assigned the "angry" emoji reaction five times the weight of a "Like" in determining which content to amplify. This meant that content provoking anger was systematically surfaced in more users' feeds than content that provoked approval or positive reaction.
This weighting had not been designed with malicious intent. It had been implemented because internal research showed that anger drove more subsequent engagement — more comments, more shares, more time-on-platform — than passive approval. In terms of the metrics the algorithm was optimizing for, anger was genuinely more valuable than satisfaction.
The internal documents showed that researchers had subsequently found that the anger weighting was amplifying outrage-producing content, which was disproportionately likely to include misinformation and toxicity, and was contributing to political polarization. They recommended reducing the weighting. The recommendation was resisted for an extended period; the weight was eventually cut and, in 2020, set to zero, but only after prolonged internal debate over the engagement cost.
The anger-weighting story illustrates, in a single technical design choice, the entire dynamic this book describes: a design choice made to optimize an engagement metric, found to produce harm, and retained for years after the harm was documented because changing it would cost engagement.
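The mechanics of that design choice are simple enough to sketch. The following is an illustrative toy model, not Facebook's actual ranking code: the 5x "angry" weight comes from the disclosed documents, but the scoring function, post data, and other weights are hypothetical, chosen only to show how an unequal reaction weighting changes which content rises.

```python
# Toy sketch of reaction-weighted ranking (illustrative, not Facebook's code).
# The 5x weight on "angry" is from the disclosed documents; everything else
# here is a hypothetical simplification.

REACTION_WEIGHTS = {"like": 1.0, "angry": 5.0}

def engagement_score(reactions):
    """Sum weighted reaction counts into a single ranking score."""
    return sum(REACTION_WEIGHTS.get(kind, 1.0) * count
               for kind, count in reactions.items())

calm_post = {"like": 1000}      # broadly approved content
outrage_post = {"angry": 250}   # angers far fewer users

# Despite receiving a quarter of the interactions, the outrage post
# outscores the calm one (1250.0 vs 1000.0) and is amplified further.
assert engagement_score(outrage_post) > engagement_score(calm_post)
```

Setting the angry weight back to 1.0, the direction the internal researchers recommended, reverses the ordering in this example, which is exactly why the change carried an engagement cost.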
The MSI Backfire
Internal documents confirmed in detail what critics had argued from the outside: the 2018 "meaningful social interactions" algorithm change had not produced meaningful social interaction. It had amplified politically divisive content, increased the prominence of outrage-optimized Pages, and made the platform more polarizing rather than more connecting.
A 2019 internal analysis, part of the Haugen disclosure, described the problem in terms that were remarkably candid: the MSI change had made Facebook's civic information environment worse. The analysis proposed several remediation strategies. Each was evaluated for its engagement cost. None was implemented.
The most striking detail in the internal documents about the MSI backfire was the apparent awareness at senior levels of the company that the public framing of the MSI change — as a wellbeing-driven initiative responsive to academic research on passive consumption — did not match what the company's own researchers were finding. The change had been publicly framed as beneficial to users; internally it was known to have made the platform more divisive.
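The backfire dynamic can also be sketched in miniature. The weights and posts below are hypothetical, chosen only to illustrate the direction of the MSI change as described above: active interactions (comments, reshares) counted far more than passive approval, so content that starts arguments outranks content that is quietly appreciated.

```python
# Hypothetical sketch of the MSI reweighting (illustrative values only).
# Before MSI, all interactions count roughly equally; after, active
# interactions dominate the score.

PRE_MSI = {"like": 1, "comment": 1, "reshare": 1}
POST_MSI = {"like": 1, "comment": 15, "reshare": 15}  # assumed weights

def score(post, weights):
    """Weighted interaction count for a post under a given weighting."""
    return sum(weights[kind] * post.get(kind, 0) for kind in weights)

photo_post = {"like": 500, "comment": 10}                     # quietly liked
divisive_post = {"like": 50, "comment": 120, "reshare": 40}   # argument thread

# Under equal weights the photo wins; under the MSI-style weights the
# divisive post wins by a wide margin.
assert score(photo_post, PRE_MSI) > score(divisive_post, PRE_MSI)
assert score(photo_post, POST_MSI) < score(divisive_post, POST_MSI)
```

Nothing in this sketch targets divisive content directly; the divisive post wins simply because arguments generate comments and reshares, which is the metric the reweighting rewards.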
Teen Mental Health Research
The internal teen mental health research disclosed through the Papers was significant both for its content and for its existence. The content was alarming: internal research had found that Instagram was associated with body image harm in teen girls, with increases in anxiety and depression, and with social comparison dynamics driven by the platform's algorithmic amplification of idealized images.
The existence of this research contradicted Facebook's public posture toward questions about teen mental health. When researchers, journalists, and parents raised concerns about Instagram's effects on teens, Facebook's public response consistently minimized those concerns and called for more research. The Papers revealed that more research existed — Facebook's own research — and that it confirmed the concerns that the company was publicly minimizing.
In congressional testimony on September 30, 2021, days before Haugen's own Senate appearance, Facebook's head of global safety, Antigone Davis, told senators that Facebook was "not able to conclude" that Instagram caused harm to teen girls. The internal research, prepared for company leadership, reached different conclusions.
The Resource Allocation Problem
The Papers also documented a systematic imbalance in how Facebook allocated resources between the US market and international markets. The vast majority of the company's content moderation, integrity research, and safety investment was concentrated on English-language content and on the US political context. Markets in the Global South, including countries experiencing active political violence, received a fraction of these resources.
This imbalance had concrete consequences. In Myanmar, Facebook's platforms had been used to coordinate violence against the Rohingya minority — a crisis that had been extensively reported. In Ethiopia, the Papers documented internal warnings that Facebook's algorithm was amplifying content promoting ethnic violence. In each case, the documented response was inadequate relative to the scale of the harm, and the inadequacy was connected to deliberate resource allocation decisions.
The resource imbalance reveals an important dimension of the algorithmic harm story: the harms of engagement-optimization systems fall disproportionately on the most vulnerable users, in the least-resourced markets, where the company has invested least in mitigation.
Analysis: What the Haugen Disclosure Reveals
The Gap Between Public Knowledge and Internal Knowledge
The most significant implication of the Haugen disclosure is structural: it demonstrates that a large and publicly active organization can maintain a systematic gap between its internal knowledge and its public representations over a sustained period, without any individual act of dishonesty needing to be coordinated or centrally directed.
Facebook's external communications — press releases, blog posts, congressional testimony, media statements — consistently presented a picture of a company working in good faith on difficult problems, uncertain about the causal relationship between its platforms and documented harms, and committed to user safety as a core value.
Facebook's internal documents presented a picture of a company that had studied the harms in detail, reached conclusions about their causes, developed interventions that would have addressed them, and declined to implement those interventions because they would have reduced engagement.
These two pictures are not irreconcilable if one is willing to accept that corporations can simultaneously know things internally and not know them publicly — that knowledge can be compartmentalized within an organization so that different parts of the organization operate with different information. But the compartmentalization, in this case, systematically served the company's commercial interests: the knowledge that would have prompted external accountability was kept internal; the knowledge that defended the company against criticism was shared externally.
The Whistleblower as an Accountability Mechanism
The Haugen disclosure raises important questions about whistleblowing as an accountability mechanism. On one hand, the disclosure was effective: it brought internal knowledge into the public domain, prompted congressional action, generated regulatory investigation, and materially damaged Facebook's reputation in ways that may have influenced subsequent corporate behavior.
On the other hand, the disclosure was entirely contingent on Haugen's individual decision to take extraordinary personal and legal risk. She faced potential civil liability. She accepted reputational attacks from Facebook and its defenders. She acted, as she described it, because she concluded that internal accountability mechanisms had failed and that external accountability was necessary.
Relying on individual whistleblowers as the primary mechanism for corporate accountability over harmful algorithmic systems is inadequate. It places extraordinary demands on individuals; it depends on individuals having access to information; it is reactive rather than preventive; and it produces accountability only in cases where the individual chooses to act and has the resources to do so.
The Haugen case demonstrates both the value of whistleblowing and its limitations as the primary check on corporate power over algorithmic systems.
The Role of Congressional Testimony
Haugen's Senate testimony was notably more substantive than most tech-company hearings before Congress, in part because senators had been briefed on specific documents and could ask specific questions, and in part because Haugen was a credible, technically sophisticated witness who could explain algorithmic systems in accessible terms.
The hearing generated bipartisan agreement that regulation was necessary — a rare area of consensus in the Senate at the time. It did not, however, produce legislation. The political dynamics of technology regulation in the US, including industry lobbying and disagreements about the specific form regulation should take, continued to impede legislative action.
The contrast between the hearing's rhetorical consensus and its legislative outcome illustrates a broader pattern: the production of accountability in democratic systems requires not just knowledge and public outrage, but organized political pressure sustained over time in the face of well-resourced industry opposition.
The Velocity Media Parallel
The Haugen disclosure gives us a way to think about what transparency would look like at Velocity Media — and what the absence of transparency enables.
In the Velocity Media scenario, Dr. Aisha Johnson's internal research reaches conclusions that parallel what Facebook's integrity team found: the recommendation algorithm is amplifying content associated with user harm, and interventions exist that would reduce that amplification at a cost to engagement metrics. The decision is made not to implement those interventions.
If this decision is never disclosed — if there is no Velocity Media equivalent of Frances Haugen — what is the accountability mechanism? The users affected by the harmful content do not know that internal research documented the harm. They do not know that interventions were modeled and declined. They experience the harm, but they cannot attribute it to a specific corporate decision because they have no access to the decision-making process.
This is the epistemic situation of most social media users with respect to most platforms' internal deliberations. The systems affecting them are opaque. The decisions shaping those systems are internal. The research documenting the effects of those decisions is proprietary. The gap between what users experience and what companies know about what users experience is, in ordinary circumstances, unbridgeable.
The Haugen disclosure temporarily bridged that gap for Facebook. It produced a moment of accountability that would not have been possible without the disclosure. The question it leaves open is: what structures, beyond individual whistleblowing, can make that kind of transparency routine rather than exceptional?
Discussion Questions
- Frances Haugen described her decision to disclose internal documents as a choice she made after concluding that internal accountability mechanisms had failed. What are the conditions under which internal accountability mechanisms can be expected to work? What structural features of a company's decision-making process are necessary for internal dissent to be effective?
- Facebook's head of global safety testified to the Senate that the company was "not able to conclude" that Instagram caused harm to teen girls, even as the company's own internal research reached different conclusions. What does this contrast reveal about the nature of corporate truthfulness? Is there a meaningful distinction between lying and strategic omission?
- The Papers documented that Facebook allocated the vast majority of its integrity resources to English-language, US-focused content. How should we evaluate this resource allocation decision? Is it a neutral business decision about where to focus scarce resources, or does it constitute an ethical choice with identifiable victims?
- The Haugen disclosure generated legislative attention to social media regulation, but did not produce major legislation in the US. What does this outcome reveal about the relationship between public knowledge of harm and political accountability? What political conditions would be necessary for the knowledge produced by the Papers to translate into regulatory change?
- The case study describes whistleblowing as "inadequate as the primary mechanism for corporate accountability over harmful algorithmic systems." What alternative mechanisms would you propose? Design a regulatory framework in which accountability does not depend on individual whistleblowers, and evaluate the trade-offs of your design.
What This Means for Users
The Haugen disclosure and the Facebook Papers have several direct implications for users of algorithmic social media platforms.
The information asymmetry is greater than you thought. Most users understand, in an abstract sense, that platforms know more about them than they know about platforms. The Papers reveal the concrete form of that asymmetry: platforms have detailed research about the effects of their systems on your wellbeing, research they have not shared with you, research that in some cases documents specific harms while company communications deny or minimize those harms.
Public messaging from platforms about user safety should be evaluated critically. When a platform says it is committed to user safety, that commitment should be evaluated against evidence — not against the platform's own characterizations. The Papers document a gap between public safety commitments and internal research findings that should make users skeptical of corporate self-assessment on safety questions.
The "we're still learning" defense has limits. A common corporate response to evidence of platform harms is to acknowledge uncertainty and commit to further research. The Papers reveal that this response can be deployed strategically — as a way to delay accountability — even when internal research has already produced clear findings. Further research is not always what is needed; sometimes the research is done and the findings are being withheld.
Regulatory attention to platforms matters. The Haugen disclosure prompted legislative interest and regulatory investigation in multiple jurisdictions. Users who care about the harms documented in the Papers can engage with regulatory processes — public comment periods, constituent communication with legislators, support for advocacy organizations working on platform accountability — as a way to translate public knowledge into structural change.
The systems affecting Maya affect you. The teen mental health research disclosed in the Papers is about Instagram specifically and adolescents specifically, but the mechanisms it documents — algorithmic amplification of content that drives social comparison, body image anxiety, and emotional distress — are not limited to adolescents or to Instagram. They are properties of engagement-optimization systems applied to social content. Understanding how these systems work is the prerequisite for navigating them with greater awareness.