Chapter 16: Further Reading
Digital Media, Social Networks, and Viral Spread
Propaganda, Power, and Persuasion: A Critical Study of Influence, Disinformation, and Resistance
Primary Sources
Vosoughi, S., Roy, D., and Aral, S. (2018). "The Spread of True and False News Online." Science, 359(6380), 1146–1151.
The landmark quantitative study of true and false news spread on Twitter, analyzing 126,000 news stories across eleven years. The finding that false news spreads faster, further, and more broadly than true news — and that humans rather than bots are primarily responsible — is among the most cited findings in misinformation research. Essential reading for understanding the empirical foundation of viral spread dynamics. Access through major research libraries or doi:10.1126/science.aap9559.
United States Senate Select Committee on Intelligence. (2019). Report of the Select Committee on Intelligence on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election, Volume 2: Russia's Use of Social Media. Washington, D.C.: U.S. Government Publishing Office.
The most comprehensive public account of the Internet Research Agency's social media operations, based on materials provided by Facebook, Twitter, Google, and other platforms. Volume 2 focuses specifically on social media and includes detailed analysis of the Blacktivist account, the IRA's operations targeting Black Americans, and its targeting of other American communities. Freely available at intelligence.senate.gov.
United Nations Human Rights Council. (2018). Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar. A/HRC/39/CRP.2.
The UN's investigation into the Myanmar genocide, including the specific finding that Facebook had played a "determining role" in spreading hate speech. Contains granular documentation of specific propaganda content, the anti-Rohingya campaign's methods, and Facebook's documented warnings and responses. Essential for Case Study 1. Available at ohchr.org.
Pennycook, G., McPhetres, J., Zhang, Y., Lu, J. G., and Rand, D. G. (2020). "Fighting COVID-19 Misinformation on Social Media: Experimental Evidence for a Scalable Accuracy-Nudge Intervention." Psychological Science, 31(7), 770–780.
The accuracy-nudge study demonstrating that brief accuracy priming significantly improves sharing decisions. Accessible and clearly written; the method section is worth reading for the elegance of its experimental design. The finding has generated a productive research program that students interested in intervention design should explore further. Available through university library databases.
Secondary Sources and Investigations
Wardle, C., and Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Council of Europe Report DGI(2017)09.
The foundational framework paper distinguishing misinformation (false content shared without intent to harm), disinformation (false content shared with intent to harm), and malinformation (true content shared with intent to harm). The typology is now standard in the field and provides vocabulary that disambiguates the many uses of "disinformation." Free download from the Council of Europe website. Required reading for anyone working in this field.
Haugen, F. (2021). "The Facebook Whistleblower: Testimony Before the United States Senate Committee on Commerce, Science, and Transportation" (October 5, 2021).
Frances Haugen's Senate testimony, accompanied by the Wall Street Journal's "Facebook Files" series (September–October 2021), provides the primary documented account of Facebook's internal knowledge about engagement optimization, the "angry" reaction multiplier, and algorithmic amplification of civic misinformation. The testimony transcript is publicly available. The WSJ series (paywalled but available through libraries) is the journalistic primary source.
Silverman, C., and Alexander, L. (2016, November 3). "How Teens in the Balkans Are Duping Trump Supporters With Fake News." BuzzFeed News.
The original investigative report documenting the Macedonian content farm ecosystem and its business model of financially motivated disinformation production. Essential for understanding how content farm economics interact with platform engagement optimization. Available free at buzzfeednews.com.
Mozur, P. (2018, October 15). "A Genocide Incited on Facebook, With Posts From Myanmar's Military." The New York Times.
The New York Times investigation into Myanmar military personnel's use of Facebook to spread propaganda targeting Rohingya communities. Based on original reporting and documentation of specific accounts and content. Pairs with Case Study 1. Available at nytimes.com.
BBC News India. (2018). "WhatsApp Rumours and Mob Lynching." Series of investigative reports, April–August 2018.
The BBC's documentary and investigative reporting on specific cases of WhatsApp-linked mob violence in India, including the Rainpada case. Includes verified documentation of specific forwarding chains and the false content's origins. Available at bbc.com/news/world-asia-india.
Academic Literature
Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. B. F., Lee, J., Mann, M., Merhout, F., and Volfovsky, A. (2018). "Exposure to Opposing Views on Social Media Can Increase Political Polarization." Proceedings of the National Academy of Sciences, 115(37), 9216–9221.
Research finding that exposure to opposing political views on social media — far from promoting the exchange of perspectives that platforms' democratic framing suggests — can increase rather than decrease polarization. Challenges simple assumptions about social media as a vehicle for political dialogue and introduces the "backfire effect" dynamics relevant to understanding how correction attempts can fail.
Berger, J., and Milkman, K. L. (2012). "What Makes Online Content Viral?" Journal of Marketing Research, 49(2), 192–205.
The original publication of the STEPPS framework, derived from analysis of New York Times articles and their sharing behavior. Applicable well beyond its original marketing context. Students interested in the psychology of content virality will find this paper accessible and analytically rich.
Pennycook, G., and Rand, D. G. (2019). "Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning Than by Motivated Reasoning." Cognition, 188, 39–50.
Challenges the popular assumption that partisan motivated reasoning (people believe false things because they confirm their political biases) is the primary mechanism of fake news credulity. Finds instead that failure to engage analytical thinking — not partisan bias — better predicts susceptibility to false news. Consequential for intervention design: if the problem is analytical disengagement rather than bias, accuracy nudges that restore analytical orientation may be more effective than bias-correction approaches.
Watts, D. J., and Dodds, P. S. (2007). "Influentials, Networks, and Public Opinion Formation." Journal of Consumer Research, 34(4), 441–458.
Classic network science paper challenging the "influentials" model (the idea that a few influential individuals drive opinion cascades) and finding that large-scale social cascades are more often driven by the susceptibility of the overall network than by the influence of specific nodes. Relevant to understanding why viral propaganda spreads in the ways it does and why targeting "influencers" as the primary intervention point may be insufficient.
Freelon, D., and Wells, C. (2020). "Disinformation as Political Communication." Political Communication, 37(2), 145–156.
Analytical framework arguing that disinformation should be understood primarily as a political communication practice rather than an information disorder problem, an understanding that implies different responses: political accountability and democratic reform rather than narrowly technical or epistemic interventions. Provides useful pushback against purely technical framings of the disinformation problem.
Reports and Policy Documents
Freedom House. (Annual). Freedom on the Net. Washington, D.C.: Freedom House.
Annual global assessment of internet freedom, with country-specific reports documenting social media censorship, government-directed disinformation, and platform access restrictions. Essential reference for understanding the global landscape of online political manipulation. Available free at freedomhouse.org.
Election Integrity Partnership. (2021). The Long Fuse: Misinformation and the 2020 Election. Stanford, CA: Stanford Internet Observatory.
Comprehensive analysis of misinformation circulating about the 2020 U.S. presidential election, tracking over a million pieces of content. Produced by a consortium of research organizations with access to platform data. Available free at stacks.stanford.edu.
Reuters Institute for the Study of Journalism. (Annual). Digital News Report. Oxford: Reuters Institute.
Annual survey of news consumption habits across multiple countries, including data on social media as a news source, trust in different information channels, and cross-national variation in disinformation vulnerability. Available free at reutersinstitute.politics.ox.ac.uk.
Documentaries
"The Social Dilemma" (2020). Directed by Jeff Orlowski. Netflix.
Interviews with former social media executives and engineers about the design choices behind engagement optimization, the attention economy, and the mental health implications of platform architecture. Somewhat polemical in framing but provides accessible first-person accounts of platform design decisions from people who made them. Useful as introductory material before more rigorous academic reading.
"Myanmar's Killing Fields" (2018). Frontline / ProPublica.
Documentary investigation into the Myanmar genocide and Facebook's role. Includes original reporting and documentation of specific anti-Rohingya content and Facebook's delayed response. Pairs with Case Study 1.