Chapter 40 Further Reading: Global and Cross-Cultural Perspectives on Misinformation
Foundational Works
1. Henrich, J., Heine, S. J., & Norenzayan, A. (2010). "The weirdest people in the world?" Behavioral and Brain Sciences, 33(2–3), 61–83.
The foundational paper introducing the WEIRD critique to behavioral science. Henrich and colleagues document the systematic over-representation of Western, Educated, Industrialized, Rich, and Democratic populations in behavioral research and the extent to which findings from these populations do not generalize globally. Essential background for understanding why the WEIRD critique applies with particular force to misinformation research, which has followed the same sampling patterns as the psychology research this paper critiques.
2. Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Council of Europe.
The foundational typology of information disorder (misinformation, disinformation, malinformation) that has shaped the field. While primarily developed in a Western context, Wardle and Derakhshan's framework has been applied globally and provides the conceptual vocabulary for this chapter's analysis. Essential reading for understanding the conceptual foundations of misinformation research before engaging with global variations. Available at: https://rm.coe.int/information-disorder-report-november-2017/1680764666
South and Southeast Asia
3. Banaji, S., Bhat, R., Agarwal, A., Andreou, N., & Vis, F. (2019). WhatsApp Vigilantes: An Exploration of Citizen Reception and Forwarding of WhatsApp Misinformation Linked to Mob Violence in India. London School of Economics and Political Science.
The most comprehensive academic study of the connection between WhatsApp misinformation and mob violence in India. Banaji and colleagues conducted qualitative research in communities that had experienced WhatsApp-linked violence, documenting how misinformation circulated, why it was believed, and what local dynamics enabled it to trigger collective action. Essential for understanding the mechanism of harm, not just the phenomenon. Available at the LSE website.
4. Ressa, M. (2022). How to Stand Up to a Dictator: The Fight for Our Future. HarperCollins.
While primarily a Philippines case study, Ressa's memoir provides essential first-person documentation of state-affiliated social media influence operations, algorithmic amplification, and the personal cost of fact-checking in the face of politically motivated legal action. Ressa's description of the Philippine keyboard army's mechanism — coordinated engagement triggering algorithmic amplification — is the clearest available account of this dynamic, applicable beyond the Philippines context.
Africa
5. Wasserman, H. (Ed.) (2011). Popular Media, Democracy and Development in Africa. Routledge.
A foundational academic collection on African media ecosystems, examining how popular media forms (radio, mobile communications, community media) serve democratic functions in African contexts. Although the collection predates the WhatsApp era, its analysis of African media trust dynamics, oral information culture, and the relationship between colonial legacy and contemporary media practice provides essential context for understanding contemporary African misinformation ecosystems.
6. Hitchen, J., & Hassan, I. (2023). Africa Check: How a Fact-Checking Organisation Navigates Verification in Low-Resource Environments. Reuters Institute for the Study of Journalism.
A detailed institutional case study of Africa Check's methodology development, funding challenges, language coverage strategy, and impact assessment. The Reuters Institute study provides the most thorough academic analysis of Africa Check as an institution and is essential for understanding what fact-checking in African contexts actually involves. Available at the Reuters Institute website.
7. Posetti, J., Bell, E., & Brown, P. (2020). Journalism and the Pandemic: A Global Snapshot of Impacts. ICFJ/Reuters Institute.
Documents the pandemic's effects on journalism and misinformation globally, with specific attention to Global South contexts where health misinformation intersected with weak healthcare infrastructure and low institutional trust. The Africa and Asia sections are particularly relevant for understanding how health misinformation dynamics differ from political misinformation in the same environments.
Latin America
8. Venturini, T., & Rogers, R. (2019). "'API-Based Research' or How Can Digital Sociology and Journalism Studies Learn from the Facebook and Cambridge Analytica Data Breach." Digital Journalism, 7(4), 532–540.
While methodological rather than regional, this paper is essential for understanding the constraints on researching WhatsApp-centered misinformation ecosystems like Brazil's. Venturini and Rogers' analysis of API access and data politics explains why WhatsApp political communication has been less systematically studied than Twitter or Facebook political communication — a gap with direct implications for understanding Latin American election misinformation.
9. Benkler, Y., Roberts, H., Faris, R., Solow-Niederman, A., & Etling, B. (2015). Social Mobilization and the Networked Public Sphere: Mapping the SOPA-PIPA Debate. Berkman Klein Center.
While focused on a US case, this research provides the analytical framework of "networked public sphere" analysis that Brazilian researchers have adapted for studying WhatsApp political ecosystems. Applying network mapping to closed messaging ecosystems, rather than open social media, requires significant adaptation, which Brazilian researchers have pioneered while building on this foundational work.
Post-Soviet Space and Europe
10. Roozenbeek, J., van der Linden, S., & Nygren, T. (2020). "Prebunking interventions based on 'inoculation' theory can reduce susceptibility to misinformation across cultures." Harvard Kennedy School Misinformation Review, 1(2).
An empirical test of inoculation/pre-bunking methodology across multiple European countries, including Eastern European contexts with Russian disinformation exposure. Roozenbeek et al. find evidence for cross-cultural effectiveness of pre-bunking, but with variation across contexts. Essential for assessing which Western-developed interventions can be expected to generalize to non-Western contexts, and which cannot.
11. European External Action Service. EUvsDisinfo Database. (Ongoing, available at euvsdisinfo.eu).
The essential empirical resource for documented Russian state-affiliated disinformation cases. The database's thousands of cases, spanning multiple languages and years, provide the most comprehensive available record of Russian information operations. The methodology notes and FAQ on the site explain how attribution and documentation decisions are made. Indispensable for any research on Russian disinformation targeting European and post-Soviet audiences.
Cross-Cultural and Methodological
12. Edelman. (Annual). Edelman Trust Barometer.
The most widely cited cross-national survey of trust in institutions — media, government, business, NGOs. Essential empirical foundation for the chapter's discussion of institutional trust variation and its implications for fact-checking effectiveness. Annual editions allow tracking of trust trends; global reports disaggregated by country provide the country-level data needed for comparative analysis. Available at edelman.com/trust.
13. Reuters Institute for the Study of Journalism. (Annual). Digital News Report. Oxford University.
The most comprehensive annual cross-national survey of digital news consumption, trust in news, and platform use. The global edition covers 40+ countries and provides essential comparative data on: primary news platforms by country, social media and messaging app use for news, trust in news media, and willingness to pay for news. Essential for comparing information ecosystem characteristics across regions and for tracking changes over time. Available at reutersinstitute.politics.ox.ac.uk.
14. IFCN. (2023). State of Fact-Checking. International Fact-Checking Network, Poynter Institute.
The IFCN's periodic assessment of the global fact-checking ecosystem, covering: the number and distribution of fact-checking organizations by region, funding models and sustainability, methodology standards, and challenges facing the sector. The most comprehensive available overview of the global fact-checking ecosystem's geographic distribution, capacity, and resource gaps. Available at poynter.org/ifcn.
15. Guess, A., Nagler, J., & Tucker, J. (2019). "Less than you think: Prevalence and predictors of fake news dissemination on Facebook." Science Advances, 5(1), eaau4586.
A rigorous empirical study finding that actual sharing of misinformation on Facebook is much lower than public concern suggests, and that older Americans are disproportionately likely to share misinformation. This US-focused research provides an important benchmark for comparative analysis: what does the evidence show about actual sharing rates in Western contexts, and how do Global South contexts compare? The paper's methodology is also useful as a model for research designs that could be adapted to non-Western contexts. Available at advances.sciencemag.org.