Case Study 01: The Frances Haugen Disclosure — What Facebook Knew About Instagram and Teen Girls
Background
On September 13, 2021, the Wall Street Journal published the first story in what would become "The Facebook Files" — a landmark series of investigative reports based on thousands of pages of internal Facebook documents provided by a whistleblower. That whistleblower, Frances Haugen, a data scientist who had worked on Facebook's civic integrity team from 2019 to 2021, also filed complaints with the Securities and Exchange Commission and testified before the United States Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security.
The documents Haugen released covered a wide range of Facebook and Instagram behaviors and internal research findings. But the disclosures that captured the most sustained public attention — and that most directly concerned Instagram as a platform — were the internal studies on Instagram's effects on the mental health of teenage girls. These findings revealed that Facebook's own researchers had documented, in specific terms, the psychological harms Instagram caused to adolescent users. More consequentially, the disclosures revealed the gap between what Facebook knew internally and what it communicated publicly.
This case study examines the Haugen disclosures in detail: the research findings that were revealed, the timeline of when Facebook knew what, the congressional testimony and its aftermath, and what the episode means for our understanding of the relationship between platform companies, internal research, and public accountability.
The Whistleblower
Frances Haugen's professional background made her a uniquely credible source. A Harvard-educated data scientist with a background in algorithmic product management, she had worked at Google, Pinterest, Yelp, and Change.org before joining Facebook in 2019. At Facebook, she was assigned to the civic integrity team — the unit responsible for studying and mitigating Facebook's role in political manipulation, election interference, and the spread of misinformation. She described her experience as marked by repeated encounters with research showing platform harms and repeated decisions not to act on that research.
Haugen began collecting internal documents before her departure from the company in May 2021. The documents she gathered — tens of thousands of pages of internal research, presentations, memos, and discussions — were not obtained through any unauthorized access. They were documents she encountered in the normal course of her work. The decision to release them was, by her account, motivated by the belief that the public could not meaningfully participate in democratic deliberation about social media policy without knowing what platform companies knew internally.
Her decision to approach the SEC before the media was strategic. Haugen believed that framing her disclosures as a securities compliance matter — specifically, that Facebook had made material misrepresentations to investors about the company's knowledge of its harms — would ensure legal protection and force regulatory attention. The simultaneous release to the Wall Street Journal's investigative team ensured broad public reach.
Timeline of Key Events
2019 — Internal Research and Presentation
Internal Facebook researchers conduct surveys and qualitative interviews with teenage Instagram users. The research produces the finding, later publicized by Haugen, that "we make body image issues worse for one in three teen girls." A presentation slide states: "Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse." The research also documents associations between Instagram use and increases in anxiety and depression rates among adolescent girls.
2019 — Like Count Experiment Begins
Instagram begins testing the removal of public like counts in Canada, framing the experiment as a measure to reduce social pressure. Internal documents, later revealed in the Haugen files, suggest that this experiment was partly motivated by the body image research — the recognition that quantified social approval was a mechanism of harm.
2020 — "Teen Mental Health Deep Dives"
Facebook launches an internal initiative described in the documents as "Teen Mental Health Deep Dives," intended to study the mechanisms of Instagram's effects on adolescent users in greater depth. The initiative produces additional research confirming and extending the 2019 findings.
May 2021 — Haugen Departs Facebook
Haugen leaves the company, having spent the preceding months documenting the internal research she intended to release.
September 2021 — SEC Complaints Filed
Haugen files complaints with the SEC, alleging that Facebook made material misrepresentations about its knowledge of the platform's harms.
September 13, 2021 — "The Facebook Files" Begins
The Wall Street Journal publishes the first installment of "The Facebook Files." The following day, it publishes a story titled "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show." The headline's use of the word "toxic" reflected the internal language of the documents themselves.
October 3, 2021 — Haugen Identified
Haugen's identity is revealed on CBS's 60 Minutes, where she gives her first on-camera interview. She describes her experience at Facebook, the specific research she encountered, and her reasons for disclosing.
October 5, 2021 — Senate Commerce Subcommittee Testimony
Haugen testifies before the United States Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security. Her testimony is notable for its technical specificity about algorithmic design, its characterization of Facebook's internal culture, and its argument that Facebook's choices constituted a deliberate prioritization of engagement over user welfare.
October 2021 — Facebook's Response
Adam Mosseri, head of Instagram, responds publicly, acknowledging that "Instagram can have a negative effect on some teens" while disputing the framing that the internal research established that Instagram makes body image issues worse for a significant proportion of teen girls. Facebook's corporate communications team releases statements questioning the characterization of the research.
2021-2022 — Congressional Investigations
Multiple congressional committees hold hearings on Facebook and Instagram's effects on teen mental health. Proposed legislation includes various platform transparency and teen protection measures, most of which do not pass in the immediate aftermath but sustain legislative pressure on the issue.
The Research Findings in Detail
The internal Facebook documents on teen mental health were not a single study but a collection of surveys, qualitative research, and analysis conducted over several years. Key findings documented by Haugen and reported by the Wall Street Journal include:
Body image findings: Thirty-two percent of teenage girls who felt bad about their bodies reported that Instagram made them feel worse. This was not a marginal finding embedded in a large dataset; it was a headline finding of research specifically designed to understand Instagram's body image effects.
Comparative platform effects: Facebook's internal research found that Instagram had worse effects on teen mental health than other social media platforms on certain measures. Teens themselves, in qualitative interviews, named Instagram specifically as a source of body image anxiety and comparison distress in ways that they did not name other platforms.
Awareness of mechanism: The documents showed that Facebook's researchers understood the specific mechanisms by which Instagram produced these effects: the exposure to idealized body images, the social comparison enabled by visible follower and like counts, and the algorithm's tendency to surface body-focused content to users who had engaged with it once. The internal research was not vague about causation; it identified pathways with considerable specificity.
Mental health trends: The documents included analysis suggesting that Instagram use was associated with increases in rates of anxiety and depression among teen girls during the period of the platform's growth. Whether the association was causal was contested internally, but the association itself was documented.
Failure of existing interventions: The documents showed that Instagram's existing wellbeing features — including the "You're All Caught Up" notification, time usage dashboards, and content moderation of eating disorder imagery — were insufficient to materially change the outcomes measured. Researchers identified more radical interventions but these were not implemented.
The Congressional Testimony
Haugen's Senate testimony on October 5, 2021, lasted approximately three and a half hours and was notable for the unusual degree of bipartisan agreement it produced. Senators from both parties expressed alarm at the research Haugen described and frustration at Facebook's self-regulation record.
Haugen's core arguments before the Senate were:
- Facebook's ranking algorithms amplified content that generated engagement, and the most engaging content was frequently harmful — divisive, emotionally intense, and in the case of Instagram, aspirationally distorted.
- Facebook had access to detailed research documenting these harms and had consistently chosen not to implement the changes that would reduce them, because those changes would also reduce engagement and revenue.
- The company's public statements minimized and qualified the harms its internal research documented, creating a systematic gap between internal knowledge and public communication.
- Self-regulation had failed and would continue to fail because the incentive structure — advertising revenue tied to engagement — was incompatible with the interventions needed to protect users.
- The remedy she proposed was transparency: requiring platforms to share their algorithmic and internal research data with independent researchers and regulators, enabling external accountability rather than relying on voluntary disclosure.
Analysis: The Gap Between Knowledge and Action
The Haugen disclosures raise fundamental questions about corporate accountability in the platform economy. Facebook's internal research capacity was sophisticated. The company employed hundreds of researchers whose job was to understand the platform's effects on users. Those researchers identified, with specificity, mechanisms of harm to adolescent girls.
The gap between this internal knowledge and the company's public communications exemplifies what might be called "knowledge without accountability" — a situation in which a corporation possesses detailed information about the harms it causes but faces no legal obligation to disclose it, correct it, or compensate those harmed.
The existing legal framework for product liability is poorly equipped for this situation. Section 230 of the Communications Decency Act, which provides platforms broad immunity for third-party content, has been interpreted to limit legal claims that platform design choices produce harm. The SEC complaints Haugen filed represented an attempt to use investor protection law — a body of law where the disclosure obligations are clearer — to compel accountability. Whether this theory would succeed in litigation was unclear at the time of writing.
The episode also reveals the limits of self-regulation. Facebook did conduct research. It did identify harms. It did implement some partial interventions. But the commercial incentives that made Instagram's comparison-generating design so profitable also made comprehensive remediation prohibitively costly. When every dollar of revenue is traceable to engagement, and engagement is generated partly by the mechanisms that cause harm, the self-regulatory impulse is structurally limited.
What This Means for Users
The Haugen disclosures matter for users in several ways that extend beyond the specific findings about Instagram and teen girls.
First, they establish that platform companies possess — and use — detailed information about how their products affect users' psychological states. This knowledge is not used primarily to improve user welfare; it is used to optimize engagement. Users who assume that platforms are primarily interested in their satisfaction should understand that engagement and satisfaction are related but distinct objectives.
Second, they establish the inadequacy of current transparency requirements. In the absence of mandatory disclosure obligations for internal research, users and the public are dependent on whistleblowers like Haugen for access to information that directly affects their health and wellbeing. This is not a sustainable accountability mechanism.
Third, they provide empirical grounding for the psychological experiences many users, particularly teenage girls, have reported but struggled to have taken seriously. The internal research confirms what users described experientially — that Instagram made them feel worse about their bodies — and establishes that the company knew this. This confirmation matters for the many users who were told that their distress was a personal problem rather than a design-induced harm.
Finally, the Haugen episode illustrates the importance of insider testimony and whistleblowing in the regulation of powerful technological institutions. Many of the most significant accountability moments in platform company history — Cambridge Analytica, the Facebook election integrity issues, the Instagram teen mental health research — have come through insider disclosure rather than through regulatory examination or journalistic investigation alone. This dependence on insiders creates a fragile accountability system.
Discussion Questions
- Haugen framed her disclosures partly as a securities compliance matter, arguing that Facebook made material misrepresentations to investors. What ethical and legal arguments support this framing? What are its limitations?
- Facebook's researchers documented the harms but were not, individually, responsible for the product decisions that maintained those harms. How should moral responsibility be distributed among corporate researchers who document harm, product managers who make design decisions, and executives who set commercial priorities?
- The case study describes existing interventions (like count hiding, time-use dashboards) as insufficient. What would a genuinely effective intervention look like? What would it cost the platform commercially, and would that cost ever be voluntarily borne?
- Haugen's testimony generated strong bipartisan agreement in the Senate but produced limited legislation. What structural factors prevent legislative response to documented platform harms?
- If you were an Instagram user who believed you or someone you know had experienced body image harm attributable to the platform's design, what legal remedies, if any, might be available to you? What changes in law would be needed to improve access to remedies?