Case Study 13.2: Instagram, Adolescents, and the Internal Research Problem

When the Company Knows and Does Not Tell


Background: The Internal Research Documents

In September 2021, Wall Street Journal reporters Georgia Wells, Jeff Horwitz, and Deepa Seetharaman published a story with a headline that would prompt Congressional hearings, regulatory attention, and major public debate: "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show."

The documents behind the story were internal Facebook research presentations (slide decks prepared by the company's own data scientists and researchers) that had been disclosed to the Journal as part of a broader collection of internal documents by Facebook whistleblower Frances Haugen, who simultaneously provided them to the SEC and to Congress. Together, the documents revealed a systematic pattern: Facebook had conducted extensive internal research on Instagram's effects on adolescent users; had found significant negative effects on mental health, particularly for teenage girls; had not disclosed those findings publicly; had not acted on them to change the platform; and had, in some cases, actively considered expanding the platform to younger users.

The case is particularly instructive for surveillance studies because it involves a second form of visibility asymmetry beyond the user/platform divide: an asymmetry between what the company knows about its own effects and what the public, regulators, parents, and users know about those effects.

What the Research Found

The internal research documents revealed several specific findings:

On body image: A 2019 slide presentation stated: "Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse." The research identified "social comparison" as the primary mechanism — Instagram's visual, curated-image format encouraged users to compare themselves to others, with predictable effects on self-esteem, particularly for users already vulnerable to body-image concerns.

On mental health and suicidal ideation: Internal documents contained the finding that "13.5% of teen girls say Instagram makes thoughts of suicide and self-injury worse." The research noted that this was not a small or marginal effect: applied to a user base that includes tens of millions of adolescents worldwide, the percentage translates into harm at a very large absolute scale.

On awareness and inability to stop: The research found that adolescent users were aware that Instagram was making them feel bad yet reported feeling unable to reduce their usage, a pattern consistent with psychological research on the addictive properties of variable-reward feedback systems, the intermittent-reinforcement mechanism on which social media engagement runs.

On algorithm design: Internal research confirmed that the recommendation algorithm amplified emotional, often negative, content because such content drove engagement — consistent with the emotional contagion finding from 2014 and the pattern of engagement optimization described in Section 13.10 of the chapter.
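The logic of the engagement-optimization finding can be made concrete with a toy ranking sketch. The code below is a minimal, hypothetical illustration: the feature names, weights, and example posts are invented for this sketch and do not represent Facebook's actual model. What it shows is that when emotionally charged content reliably earns more clicks and comments, an objective that mentions only engagement still promotes that content as a side effect.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        predicted_clicks: float    # model's estimate of click probability
        predicted_comments: float  # comments correlate with emotional arousal
        emotional_intensity: float # 0.0 (neutral) to 1.0 (highly charged)

    def engagement_score(post: Post) -> float:
        # Toy objective: rank purely by predicted engagement. Nothing in
        # this function mentions emotion or well-being; emotionally
        # charged content wins only because it drives the inputs.
        return 2.0 * post.predicted_comments + 1.0 * post.predicted_clicks

    feed = [
        Post("calm landscape photo", 0.05, 0.01, 0.1),
        Post("outrage-bait headline", 0.12, 0.09, 0.9),
        Post("friend's vacation album", 0.07, 0.02, 0.2),
    ]

    # The emotionally intense post takes the top slot even though the
    # objective never references emotion: it is selected for indirectly.
    for post in sorted(feed, key=engagement_score, reverse=True):
        print(f"{engagement_score(post):.2f}  {post.text}")

The design point is the one the internal research makes: no one has to choose negative content for it to be amplified; optimizing for engagement is sufficient.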

The Company's Response: Internal and External

The internal response to these research findings, as documented in the disclosure, is instructive for what it reveals about the priorities of a surveillance capitalism business model.

What did not change: The algorithm design that amplified negative content. The "Like" and engagement-metric system that drove social comparison. The core product experience for adolescent users. The expansion plans that, at various points, included developing an "Instagram Kids" product for users under 13.

What was considered: Internal memos and presentations discussed options for addressing the mental health concerns, including reducing social-comparison features, adjusting recommendation algorithms, and changing how engagement metrics were displayed. A few of these options were piloted; most were never implemented in a form that addressed the structural issue.

The external posture: When external researchers, journalists, and parents raised concerns about Instagram's effects on adolescents, Facebook's public response consistently minimized those concerns. Company representatives testified before Congress that the research on social media's mental health effects was "mixed" and did not clearly establish harm. The contrast between the internal research findings and the external public statements formed the basis of subsequent regulatory attention and legal proceedings.

In September 2021, under mounting public and Congressional pressure, Facebook announced that it was pausing development of "Instagram Kids." In March 2022, Meta (as Facebook had been renamed in October 2021) disclosed in SEC filings that Haugen had provided documents to the SEC and that the company was the subject of a securities investigation into whether it had misled investors about its knowledge of Instagram's harms.

The Surveillance Asymmetry at the Institutional Level

The Instagram adolescent case illustrates a form of visibility asymmetry that operates not between users and platforms (the primary focus of Chapter 13) but between institutions and the public they affect. Facebook had extensive visibility into the harm its platform was producing — more visibility, arguably, than any external researcher, because it had access to behavioral data at a scale no independent researcher could match. The users, parents, regulators, and policymakers who might have acted on this information had no access to it.

This institutional visibility asymmetry is a direct consequence of the surveillance capitalism data model. Because Facebook collects behavioral data from hundreds of millions of users, it can study its own effects with statistical power and methodological access that external researchers cannot replicate. The same surveillance infrastructure that supports its advertising business also supports behavioral research at a population scale that no university, government agency, or independent organization can match.
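The claim about statistical power can be illustrated with a back-of-the-envelope calculation. The sketch below uses the standard two-proportion approximation for the minimum detectable difference at conventional thresholds (alpha = 0.05, power = 0.80); the baseline rate and sample sizes are illustrative assumptions, not figures from the Facebook research.

    from math import sqrt
    from statistics import NormalDist

    def min_detectable_diff(p: float, n: int,
                            alpha: float = 0.05, power: float = 0.80) -> float:
        # Textbook approximation for a two-sided, two-proportion z-test
        # with equal group sizes n and baseline proportion p.
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96
        z_beta = NormalDist().inv_cdf(power)           # ~0.84
        return (z_alpha + z_beta) * sqrt(2 * p * (1 - p) / n)

    baseline = 0.10  # assume 10% of respondents report a given harm
    for n in (1_000, 100_000, 10_000_000):  # survey scale vs. platform scale
        print(f"n={n:>10,}: detectable difference ~ {min_detectable_diff(baseline, n):.4%}")

At n = 1,000 (a large academic survey), only differences of roughly four percentage points are detectable; at n = 10,000,000 (routine platform scale), differences a hundred times smaller are. The platform can therefore see effects that remain statistically invisible to everyone outside it.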

The irony is deep: the surveillance system that enables Facebook to understand its users' behavior better than any outside researcher is the same system that lets it produce harm at population scale with full knowledge of that harm.

The Regulatory Aftermath

The Haugen disclosures prompted significant regulatory and legislative activity:

Congressional hearings: Frances Haugen testified before the Senate Commerce Committee's consumer protection subcommittee in October 2021, days after senators had questioned Facebook's global head of safety, Antigone Davis, about the same documents. The hearings pressed the company on the internal research findings, the discrepancy between internal research and public statements, and its obligations to adolescent users. Zuckerberg, in a public statement responding to the testimony, defended the company's research practices and disputed characterizations that the company had concealed harm.

FTC investigation: The FTC expanded its existing investigation of Facebook to include the Instagram adolescent research findings and the company's practices with respect to children's privacy.

State attorneys general: A coalition of state attorneys general launched an investigation into Facebook/Meta and its practices with respect to minors on Instagram, culminating in a coordinated multistate lawsuit filed in 2023 alleging that Instagram's design was addictive and harmful to adolescents.

UK Children's Code enforcement: The UK's Information Commissioner's Office had already adopted an "Age Appropriate Design Code" (Children's Code) requiring platforms to assess and mitigate risks to children from their design choices. Instagram faced scrutiny under this code.

The regulatory response illustrated a pattern: social media harm, visible to the company through its surveillance of users, became visible to regulators only through disclosure — accidental, whistleblower-driven, or legally compelled. The information asymmetry between platforms and regulators is a structural feature of the surveillance capitalism model, and it systematically advantages platforms in regulatory interactions.

Analysis Questions

  1. Facebook's internal research documented significant harm from Instagram to adolescent users. The company did not disclose this research publicly or act on it to change the platform. What legal and ethical obligations, if any, do companies have to disclose internal research findings about harms their products cause? How should these obligations be structured?

  2. The surveillance capitalism model gives platforms extraordinary research capabilities — the ability to study population-level behavioral effects with statistical power that no external researcher can match. This capability produces both commercial intelligence (for advertising) and knowledge about harm (like the Instagram adolescent findings). Should the regulatory community have access to this research capacity? How might such access be structured?

  3. The contrast between Facebook's public statements (minimizing mental health research) and its internal research findings (documenting harm) is at the center of the SEC investigation. What legal theory supports the claim that this constitutes securities fraud or deceptive corporate practice? What does it suggest about the relationship between corporate surveillance of consumers and corporate disclosure to investors and regulators?

  4. The proposed "Instagram Kids" product — a version of Instagram for users under 13 — was paused after public pressure but not formally abandoned. Apply the framework of the chapter to this proposal. What surveillance data would such a platform collect? Who would the platform's primary commercial relationships be with (young users, their parents, advertisers)? What ethical analysis framework should govern the design of surveillance systems that target children?

  5. Frances Haugen's decision to become a whistleblower — to disclose internal Facebook documents to journalists, Congress, and regulators — is itself an act of counter-surveillance: making the corporation visible to the public in the way the corporation makes users visible to itself. Evaluate the ethics of whistleblowing in this context. What obligations and risks did Haugen face? What are the limits of whistleblowing as a mechanism for platform accountability?


Case Study 13.2 | Chapter 13 | Part 3: Commercial Surveillance