Chapter 24 Key Takeaways: Digital Disinformation — The 2016–2020 Campaigns


Core Concepts

1. The 2016–2020 period was qualitatively different from prior disinformation eras. The combination of social media algorithmic amplification, platform architecture not designed for adversarial use, and the speed of information spread created conditions in which a state-sponsored influence operation could reach 126 million Americans without traditional media access. This is not a quantitative difference from prior influence operations — it is a structural difference that changed what was possible.

2. The IRA's documented primary goal was division, not a specific electoral outcome. The Internet Research Agency ran operations simultaneously on multiple sides of American social divisions — building Black civil rights communities and conservative white patriot communities, Muslim American communities and anti-Muslim communities. The strategic objective was to deepen existing American social fractures and undermine trust in democratic institutions, not to elect any particular candidate. This distinction matters for understanding the operation's design and for evaluating claims about its effects.

3. Foreign disinformation exploits genuine social divisions; it does not create them. The IRA's operations were built on real American community concerns — real racial injustice, real economic anxiety, real religious identity. The grievances were authentic. The community members who participated were expressing genuine feelings. Only the organizer was fabricated. This means that countering the disinformation without addressing the underlying divisions it exploits is insufficient.

4. The domestic disinformation ecosystem was larger than the IRA's operation. The IRA accounted for a small fraction of total disinformation circulating in 2016. Domestic partisan media, content farms (including the Macedonian profit-motivated operations), and individual viral sharing produced substantially more false and misleading content in total. The foreign and domestic ecosystems were mutually reinforcing. Focusing exclusively on foreign disinformation misrepresents the actual information environment.

5. Disinformation exposure was concentrated, not universal. Guess, Nagler, and Tucker (2020) found that only 8.5% of Americans visited a fake news website in the month before the 2016 election, and that visits were concentrated among older, highly partisan Facebook users. The alarming claims about universal exposure to disinformation were overstated. But the concentrated exposure among politically active demographics was itself significant. The problem was depth in a specific population, not breadth across all populations.

6. The COVID-19 infodemic demonstrated that health disinformation can be lethal. Hotez et al. (2022) estimated approximately 318,000 preventable deaths from vaccine hesitancy driven by disinformation in the second half of 2021 alone. The COVID-19 infodemic was not primarily a foreign operation — it was the activation of pre-existing domestic anti-vaccine infrastructure in a pandemic information environment. Platform architecture, epistemic damage from prior disinformation, and the psychological dynamics of health risk decisions all contributed to the infodemic's scale and lethality.

7. The Stop the Steal campaign was disinformation about verifiable facts. What distinguishes election denial from contested political claims is its subject matter: vote counts, legal procedures, and audit results are verifiable facts, not matters of interpretation or values. More than 60 courts — including courts with Trump-appointed judges — rejected the fraud claims. The DOJ found no evidence of fraud sufficient to change the result. Republican election officials certified the results. The campaign's persistence despite its thorough factual refutation illustrates the resilience of politically motivated disinformation in the presence of in-group social proof.

8. Platform response was reactive, incomplete, and architecturally inadequate. Platform interventions — account removals, content labels, fact-checking partnerships — addressed specific instances of disinformation but did not address the algorithmic architecture that advantages divisive, emotionally engaging content over accurate content. The EU's Digital Services Act (DSA) represents the most systematic attempt to address structural platform responsibility, but its effectiveness remains to be demonstrated.


Key Terms

Infodemic: The WHO's term for an overabundance of information — accurate and inaccurate — that makes it difficult for people to find reliable guidance, particularly during a public health emergency. First applied specifically to the COVID-19 pandemic in February 2020.

Internet Research Agency (IRA): A private Russian company, founded in 2013 and funded by Yevgeny Prigozhin, that operated a large-scale social media influence operation targeting U.S. audiences. At peak operation, employed 400+ people in functionally organized departments, spending approximately $1.25 million monthly.

Coordinated inauthentic behavior (CIB): Facebook's term, developed in response to the IRA, for the systematic use of fake or false-front accounts to manipulate public discourse. CIB involves networks of actors coordinating to amplify specific content while concealing the artificial or inauthentic nature of the coordination.

Stop the Steal / Big Lie: The disinformation campaign, beginning immediately after the 2020 election, that promoted the false claim that Donald Trump had won the election and that the results had been fraudulently changed. The campaign's claims were rejected by more than 60 courts, by the DOJ, and by Republican election officials in key states. The campaign culminated in the January 6, 2021 attack on the U.S. Capitol.

Domestic disinformation ecosystem: The network of American-produced, American-distributed false and misleading political content, including partisan media outlets (Breitbart, Infowars, Gateway Pundit), content farms (including the Macedonian operation), and individual viral sharing. Distinguished from foreign-origin operations by the domestic actors' ideological motivation, commercial incentives, or genuine (if mistaken) belief in the content.

Content farm: A website or network of websites that produces high volumes of content optimized for algorithmic engagement and advertising revenue, without primary concern for accuracy. The Macedonian content farms in Veles produced pro-Trump content in 2016 not from ideological conviction but because engagement with that content was monetizable.

Election denial: The category of political disinformation that specifically targets trust in democratic electoral processes, either through claims of specific fraud (the Stop the Steal model) or through broader claims of systemic electoral corruption. Distinguished from normal political contestation by its focus on verifiable facts about vote counts and legal processes rather than policy preferences or interpretive claims.

Platform moderation: The set of policies and practices by which social media platforms determine what content is permissible, what content receives reduced distribution, and what accounts are suspended or removed. Platform moderation during the 2016–2020 period was reactive, inconsistently applied, and unable to address the domestic disinformation ecosystem.

Digital Services Act (DSA): The European Union's 2022 regulation requiring very large online platforms (more than 45 million EU users) to assess and mitigate systemic risks from their platforms' contribution to information disorder, to provide researchers with data access, and to submit to independent algorithmic audits. The most systematic regulatory response to the disinformation environment internationally.


Connections to Previous Chapters

Chapter 9 (Manufactured Consensus): The IRA's community-building strategy is a direct application of manufactured consensus techniques at scale. By creating Facebook pages that appeared to represent organic community consensus, the IRA manufactured the impression of widespread grassroots support for specific political positions. The "Heart of Texas" operation manufactured a community consensus about Texas identity and federal overreach that did not organically exist at the scale the page implied.

Chapter 16 (Social Media Architecture): This chapter completes the analytical treatment begun in Chapter 16 by showing that architecture's effects under maximum adversarial exploitation. The algorithmic amplification, community formation tools, and targeted advertising systems introduced in Chapter 16 are the specific mechanisms through which the IRA's operations achieved their documented reach. Chapter 24 is Chapter 16's case study.

Chapter 17 (Algorithmic Amplification): The IRA's reach of 126 million Americans — achieved largely through free organic algorithmic distribution, supplemented by roughly $100,000 in paid advertising — is the definitive quantitative illustration of algorithmic amplification's role in disinformation spread. The Vosoughi, Roy, and Aral (2018) finding about false news spreading faster than true news, introduced in Chapter 17, is the underlying mechanism that made the IRA's free-content strategy viable.

Chapter 21 (Cold War Dezinformatsiya): The IRA's operations are the direct descendant of Cold War dezinformatsiya, updated for social media. The ancestral operation planted stories in foreign newspapers and let them migrate back into Western media through legitimate channels. The IRA created social media communities that operated as the equivalent of those foreign newspapers — trusted-seeming, locally-voiced, capable of generating content that would migrate through algorithmic channels into mainstream discourse. The technique is the same; the medium is new.

Chapter 22 (Big Tobacco and Manufactured Doubt): The COVID-19 treatment disinformation — particularly the ivermectin campaign — follows the Big Tobacco playbook identified in Chapter 22 with fidelity. Genuine preliminary scientific uncertainty is exploited and amplified, contrarian expert voices are promoted, the appearance of scientific debate is manufactured to discourage decisive public health action, and the burden of proof is shifted from those promoting a new claim to those defending established consensus. The medium is social media rather than scientific journals, but the strategic logic is identical.


The Recurring Themes in Chapter 24

Message/Medium: The 2016–2020 period is the definitive demonstration that medium shapes propagandistic possibility. Every technique the IRA deployed depended on specific features of social media platforms — algorithmic amplification, community formation, pseudonymous identity, viral sharing. The same campaign run through prior media (print, radio, television) would have required either significant media access or massive financial investment. Social media made it free. The medium did not just carry the message; it made the message's unprecedented reach possible.

Truth/Deception Spectrum: The most effective IRA operations were not primarily built on false content — they were built on accurate content, curated and framed to serve strategic goals. Blacktivist posted accurate information about police violence and civil rights. Heart of Texas posted accurate Texas history. The deception was not in the content but in the source — the fabricated community identity that gave the content its apparent authenticity. This places these operations in a complex position on the Truth/Deception spectrum: true content in a false frame.

Us vs. Them: The IRA's entire operational strategy was organized around Us/Them divisions — not to create them, but to find existing divisions and make them more intense. The Houston competing rallies are the starkest illustration: two groups of real Americans, each genuinely committed to their community and its concerns, organized into physical confrontation by a third party exploiting the gap between them.

Power/Voice: The 2016–2020 period created an unprecedented situation in which a foreign state actor acquired, at low cost, a voice in American democratic discourse commensurate with that of major domestic media organizations. The IRA's organic reach exceeded that of many mainstream media outlets. This represents a fundamental disruption of the Power/Voice relationship that prior communication systems had embedded: voice required resources, resources required legitimacy, legitimacy required accountability. Social media eliminated the first step.

Resistance/Resilience: The Guess et al. finding that fake news consumption was concentrated rather than universal suggests that significant numbers of Americans were, in some sense, resistant to the 2016 disinformation environment — either through media literacy practices, information environments that provided reliable alternatives, or social network effects that insulated them from high-volume disinformation exposure. Understanding what made those people resistant is as important as understanding what made others vulnerable.
