Chapter 33: Further Reading
Misinformation and Engagement Optimization: The Epistemic Crisis
1. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.
The foundational empirical study of misinformation spread on social media. Analyzed roughly 126,000 fact-checked rumor cascades on Twitter from 2006 to 2017, finding that false news spreads faster, further, and more broadly than true news, and that the differential is driven by human sharing behavior rather than bots. The novelty and emotional arousal hypotheses are carefully tested. Required reading for anyone studying misinformation at any level.
2. Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Council of Europe.
The report that introduced the misinformation/disinformation/malinformation tripartite framework that has become standard in the field. Provides both definitional clarity and a comprehensive analysis of the actors, messages, and interpreters involved in information disorder. Available free from the Council of Europe website and essential for policy-focused analysis.
3. Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388-402.
A comprehensive review of psychological research on why people believe and share misinformation. Covers cognitive mechanisms (motivated reasoning, inattention to accuracy), research on correction effects, and the evidence base for nudge-based interventions. Written accessibly for non-psychologists and provides an excellent bridge between empirical findings and practical intervention design.
4. van der Linden, S. (2023). Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. W. W. Norton.
The leading popular science account of inoculation theory applied to misinformation. Van der Linden, the Cambridge psychologist who has led much of the prebunking research, explains the theory and evidence in accessible terms. Essential reading for understanding why prebunking may be among the most promising counter-misinformation interventions and what scaling it would require.
5. DiResta, R., Schafer, J. S., Ruppel, B., et al. (2019). The Tactics and Tropes of the Internet Research Agency. New Knowledge.
The most comprehensive analysis of the Internet Research Agency's social media influence operations. Based on data released by the Senate Intelligence Committee following the 2016 U.S. election. Documents the specific tactics used, the platforms targeted, the narratives promoted, and the scale of the operation. Essential for understanding state-sponsored disinformation.
6. Starbird, K. (2019). Disinformation's spread: Bots, trolls and all of us. Nature, 571(7766), 449-450.
A short, essential essay by one of the leading researchers in crisis misinformation. Challenges the dominant framing of misinformation as a problem of bots and trolls, arguing that ordinary people are central to misinformation spread. Reframes the intervention problem accordingly.
7. Abernathy, P. M. (2020). News Deserts and Ghost Newspapers: Will Local News Survive? University of North Carolina at Chapel Hill, Hussman School of Journalism and Media.
The most comprehensive documentation of local journalism collapse in the United States. Maps newspaper closures, analyzes structural causes, and examines the consequences for communities left without local journalism. The "news deserts" terminology originates here. Essential context for understanding the information environment into which social media misinformation expands.
8. Woolley, S., & Howard, P. N. (Eds.) (2018). Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media. Oxford University Press.
An edited volume collecting research from the Oxford Internet Institute's Computational Propaganda Project. Documents bot activity, coordinated inauthentic behavior, and political manipulation across multiple countries and platforms. Provides both specific country case studies and cross-platform comparative analysis.
9. Loomba, S., de Figueiredo, A., Piatek, S. J., et al. (2021). Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA. Nature Human Behaviour, 5(3), 337-348.
Documents the measurable impact of exposure to specific COVID-19 vaccine misinformation narratives on vaccination intention. Using randomized exposure experiments, finds that misinformation reduces intent to vaccinate by roughly six percentage points in both the UK and USA relative to factual control information, with effects varying across sociodemographic groups. Critical evidence for quantifying the public health consequences of vaccine misinformation.
10. Chesney, R., & Citron, D. K. (2019). Deep fakes: A looming challenge for privacy, democracy, and national security. California Law Review, 107, 1753-1820.
The paper that introduced the "liar's dividend" concept — the ability to use the existence of deepfake technology to dismiss genuine evidence as fabricated. Provides a comprehensive legal and policy analysis of deepfakes' implications. Essential for understanding how AI-generated synthetic media changes the epistemic landscape.
11. Pennycook, G., Epstein, Z., Mosleh, M., et al. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592(7855), 590-595.
Documents the "accuracy nudge" intervention: simply prompting users to think about accuracy before sharing reduces the sharing of false headlines without reducing sharing of accurate ones. A low-cost, psychologically grounded intervention with significant implications for platform design.
12. Clayton, K., Blair, S., Busam, J. A., et al. (2020). Real solutions for fake news? Measuring the effectiveness of general warnings and fact-check tags in reducing belief in false stories on social media. Political Behavior, 42(4), 1073-1095.
Documents both the positive effects of fact-check tags ("Disputed" and, more strongly, "Rated false" labels reduce belief in labeled content) and the costs of general misinformation warnings, which reduce belief in true headlines as well. Required reading for a nuanced understanding of label-based interventions.
13. Roose, K. (2019, June 8). The making of a YouTube radical. The New York Times.
A landmark piece of narrative journalism documenting one person's radicalization pathway through YouTube recommendations. Reconstructed from the subject's YouTube watch history, it provided concrete evidence of the recommendation pipeline from mainstream to extremist content. More accessible than academic studies and highly effective at communicating the dynamics to general audiences.
14. Hayes, D., & Lawless, J. L. (2021). News Hole: The Demise of Local Journalism and Political Engagement. Cambridge University Press.
Rigorous empirical analysis of the political consequences of local journalism collapse, including effects on civic knowledge, political participation, and the quality of democratic accountability. Provides essential context for understanding why news deserts are a democratic governance problem, not merely an information access problem.
15. Broniatowski, D. A., Jamison, A. M., Qi, S., et al. (2018). Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. American Journal of Public Health, 108(10), 1378-1384.
Documents Russian IRA and bot activity in amplifying vaccine-related content on Twitter, including amplification of both pro- and anti-vaccination content in ways designed to maximize polarization rather than promote a specific health outcome. Significant for demonstrating that the goal of some disinformation campaigns is not to promote a specific belief but to deepen distrust and polarization.
16. Guess, A. M., & Lyons, B. A. (2020). Misinformation, disinformation, and online propaganda. In N. Persily & J. A. Tucker (Eds.), Social Media and Democracy: The State of the Field, Prospects for Reform (pp. 10-33). Cambridge University Press.
An authoritative review of the academic literature on misinformation and propaganda, situating the research within debates about methodology, effect sizes, and policy implications. Particularly valuable for its critical assessment of what is and is not established by the research literature.
17. Benkler, Y., Faris, R., & Roberts, H. (2018). Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Oxford University Press.
A comprehensive analysis of the U.S. political information ecosystem focusing on asymmetric polarization, the role of right-wing media networks, and the relationship between mainstream media and political misinformation. Argues that structural features of the media ecosystem — not just social media — explain the information disorder of contemporary American politics.
18. Rogers, K. (2020). In Bad Faith: The Science of Conspiracy Theories and What to Do About Them. MIT Press.
Examines the psychological and social mechanisms underlying conspiracy theory belief and spread. Particularly useful for understanding why corrections often fail and what alternatives (including prebunking and inoculation) are more likely to be effective. Bridges academic research and practical application accessibly.