Case Study 37-1: The Bad News Game — Inoculation at Scale
How a 15-Minute Game Changes What People See
The Problem the Game Was Designed to Solve
By 2017, the research consensus on debunking — correcting misinformation after exposure — was pessimistic. The continued influence effect was well established: false information continues to influence beliefs even after it has been explicitly corrected. Corrections work imperfectly, require reaching people who have already been exposed, and fight against the human tendency to prefer consistent narratives over revised ones.
Sander van der Linden and his collaborators at the Cambridge Social Decision-Making Lab were working on the prebunking alternative — building resistance before exposure rather than administering corrections after. But prebunking in the classic experimental form (reading a brief text warning about manipulation techniques) had limitations: it was passive, potentially boring, and difficult to deliver at the scale needed to matter.
The Bad News game was designed to solve this delivery problem. If inoculation requires exposing people to weakened versions of manipulation techniques, what better delivery mechanism than a game where players actively practice producing those techniques? Learning by doing — producing misinformation and seeing its effects — is substantially more memorable than reading about it.
The game emerged from a collaboration between van der Linden's Cambridge team and the Dutch media company DROG, which had been developing creative approaches to misinformation education. The initial version launched in 2018. It is free to play at getbadnews.com and has since been updated, translated, and expanded multiple times.
How the Game Works
Bad News places the player in the role of an aspiring social media influencer who wants to build a massive following by producing viral misinformation. Players progress through six "badges" corresponding to six manipulation techniques that the researchers identified as the most commonly used in real-world misinformation:
Badge 1: Impersonation. Players learn to impersonate authoritative accounts — government agencies, news organizations, scientists — to lend false credibility to their content.
Badge 2: Emotional appeals. Players learn to use fear, outrage, and anxiety to bypass analytical thinking and trigger automatic sharing behavior.
Badge 3: Polarization. Players learn to exploit and amplify existing social divisions, framing every issue as an us-versus-them conflict to maximize emotional engagement.
Badge 4: Conspiracy theories. Players learn the structural features of conspiracy theories — unfalsifiability, explanations for counter-evidence, appeals to special knowledge — that make them resistant to correction.
Badge 5: Discrediting. Players learn to undermine trust in credible sources rather than engaging with the substance of what those sources report.
Badge 6: Trolling. Players learn to use social pressure, ridicule, and pile-ons rather than argument to discredit opponents and silence critics.
The game is narrative-based: players make choices about tweets, posts, and responses, and the game provides feedback based on the effectiveness of their choices in building followers while maintaining credibility. Players experience the mechanics of viral misinformation production as an interactive decision process rather than as an abstract description.
Each "badge" achievement is accompanied by explicit labeling: "You've unlocked the Impersonation badge. Here's how impersonation works in real misinformation..." This explicit labeling is the key inoculation element — the game doesn't just expose players to manipulation; it names the technique and explains its mechanism.
The Evidence: What the Research Shows
The 2019 study by Jan Roozenbeek and Sander van der Linden, published in Palgrave Communications, is the primary evaluation of the Bad News game.
Design: 15,000 participants recruited via online sampling were randomly assigned to play Bad News or an active control condition (playing a game unrelated to misinformation). Before and after the game, participants rated the credibility of a set of real social media posts, some of which used the manipulation techniques targeted by the game and some of which were accurate and not manipulative.
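The analysis logic of this design can be sketched in a few lines: each participant contributes a pre-to-post change in their reliability ratings of manipulative posts, and the treatment effect is the difference in mean change between conditions. A minimal illustration with invented numbers (not the study's data, and a simplification of its actual statistical model):

```python
# Hypothetical pre/post reliability ratings (1-7 scale) for manipulative posts.
# The estimand is a difference-in-differences: how much more the Bad News
# group's ratings dropped than the control group's.
bad_news = {"pre": [4.2, 4.5, 3.9, 4.1], "post": [3.5, 3.8, 3.4, 3.6]}
control  = {"pre": [4.3, 4.0, 4.2, 4.4], "post": [4.2, 4.1, 4.1, 4.3]}

def mean_change(group):
    """Average pre-to-post shift in rated reliability across participants."""
    return sum(b - a for a, b in zip(group["pre"], group["post"])) / len(group["pre"])

effect = mean_change(bad_news) - mean_change(control)
print(round(effect, 2))  # negative: a larger drop in rated reliability after playing
```

A negative difference here means the game group lowered its reliability ratings of manipulative content more than the control group did, which is the pattern the study reports.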
Key finding 1 — Manipulation recognition improved. After playing Bad News, participants were significantly better at identifying manipulation techniques in real social media posts. They assigned lower reliability scores to manipulative content and were better at identifying the specific technique being used. Effect sizes were small to medium (Cohen's d approximately 0.18-0.22) — meaningful but not enormous.
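For readers unfamiliar with the metric, Cohen's d is the difference between two group means divided by their pooled standard deviation. A short illustration with hypothetical ratings (the numbers below are invented to produce a small-to-medium d; they are not the study's data):

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: standardized mean difference using the pooled sample SD."""
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    n_a, n_b = len(group_a), len(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical mean reliability ratings (1-7 scale) of manipulative posts:
control = [4.6, 3.5, 4.9, 3.8, 4.2, 4.5, 3.4, 4.1]
played  = [4.4, 3.3, 4.8, 3.7, 4.0, 4.2, 3.2, 4.0]
print(round(cohens_d(control, played), 2))
```

Values around 0.2 mean the groups' rating distributions overlap substantially; the effect is real at scale but modest for any individual, which is why the text calls it meaningful but not enormous.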
Key finding 2 — Confidence in accurate content was maintained. Critically, the game did not produce generalized skepticism. Participants' ratings of accurate, non-manipulative content were not significantly different between the Bad News group and the control group. The inoculation was "specific" — it built resistance to manipulation without creating blanket distrust.
Key finding 3 — Effects were consistent across demographics. The researchers found no significant moderating effect of political orientation, age, education level, or prior media literacy knowledge. The game worked similarly for conservatives and liberals, older and younger adults, more and less educated participants. This cross-demographic effectiveness is one of the game's most important properties.
Key finding 4 — Effects persisted at delayed assessment. A subset of participants was reassessed three days after playing the game. The effects had not significantly decayed over this short follow-up period. However, longer follow-up was not conducted in this study, leaving open the question of longer-term persistence.
Scale and Deployment
Beyond the experimental evaluation, the Bad News game has achieved genuine scale:
By 2020, the game had been played over 1.5 million times across approximately 150 countries.
A UK government partnership (2020-2022) promoted the game as part of its response to COVID-19 misinformation, reaching hundreds of thousands of UK residents.
Sequel games have been deployed with specific misinformation targets:
- Go Viral! (COVID-19 misinformation; 5 minutes to play)
- Harmony Square (election manipulation misinformation)
- Cranky Uncle (science denial techniques specifically)
- Bad News Junior (age-appropriate version for ages 8-11)
The scale of deployment opens the door to natural experiments: in countries where the game was heavily promoted, misinformation-sharing rates could be compared against the period before promotion. This kind of real-world effectiveness data is harder to collect than laboratory effects but more meaningful for policy.
A 2022 study (Roozenbeek et al.) used a pre-registered experimental design to test the effectiveness of 90-second prebunking videos (drawing on the same principles as Bad News) deployed on YouTube, Facebook, and Twitter before exposure to specific misinformation narratives. The study found significant improvements in manipulation recognition across platforms, suggesting that the inoculation mechanism works in non-game formats as well and at genuine platform scale.
Limitations and Open Questions
The Bad News game and related inoculation research face several important open questions:
Durability. The evidence on long-term persistence of effects is limited. The three-day follow-up in the primary study is not enough to assess whether effects last weeks or months. Inoculation theory predicts decay and recommends boosters; the optimal booster schedule for different populations is not established.
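The decay-and-booster logic can be made concrete with a toy model. Everything here is an assumption for illustration — exponential decay of the inoculation effect with a guessed half-life, and boosters that restore it to full strength; none of these parameters comes from the research:

```python
import math

def effect_over_time(d0, half_life_days, booster_days, horizon_days):
    """Toy model: effect decays exponentially; each booster resets it to d0."""
    decay = math.log(2) / half_life_days
    last_dose = 0
    effects = []
    for t in range(horizon_days + 1):
        if t in booster_days:
            last_dose = t  # booster restores the effect to full strength
        effects.append(d0 * math.exp(-decay * (t - last_dose)))
    return effects

# Assumed parameters: initial effect d = 0.20, 30-day half-life,
# a single booster administered on day 60.
curve = effect_over_time(0.20, 30, {60}, 90)
print(round(curve[59], 3), round(curve[60], 3))
```

Under these assumed parameters, the effect has fallen to roughly a quarter of its initial size by day 59 before the booster restores it, which is why the spacing of boosters, and not just their existence, is the open design question.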
Ecological validity. Laboratory assessments of manipulation recognition involve carefully selected examples. Whether improved recognition in a research task translates to improved behavior in the real-world information environment — where emotional investment is higher, sharing is socially rewarded, and there is no explicit framing as an evaluation task — is less established.
Selection bias in deployment. The millions of people who have voluntarily played Bad News are not a random sample of social media users. People who choose to play a game about misinformation are likely already more media-literate and manipulation-aware than average. Reaching less engaged populations requires embedding the game in mandated contexts (school curricula) rather than voluntary deployment.
Platform complement. The game builds individual skills; it doesn't change platform architecture. A person who has played Bad News is better at recognizing impersonation in a post — but they're still encountering that post because an algorithm served it to them. Individual inoculation works within an unchanged platform environment that continues to prioritize engagement.
Transfer across technique types. The six techniques covered in Bad News don't exhaust the space of manipulation. Novel misinformation may use techniques not covered in the game. Evidence on transfer — whether inoculation to known techniques generalizes to unknown ones — is limited.
What the Bad News Game Teaches Us About Prebunking
Beyond its own effectiveness, the Bad News game is instructive for what it reveals about the properties of effective inoculation:
Active production beats passive recognition. The game's key design choice — having players produce misinformation rather than just identify it — appears to be important for effectiveness. Understanding a technique well enough to deploy it is a deeper form of knowledge than understanding it well enough to recognize it.
Explicit labeling matters. The game doesn't just give players experience with manipulation; it names the techniques and explains their mechanisms. The combination of experiential learning and explicit conceptual labeling appears to be what produces lasting recognition skills.
Short interventions can have meaningful effects. At approximately 15-25 minutes to complete, Bad News is a short intervention relative to traditional media literacy curricula. The fact that it shows meaningful effects in such a compressed timeframe suggests that the inoculation mechanism is efficient when well-designed.
Scale is achievable. The game's deployment to 1.5 million users demonstrates that inoculation interventions can reach genuinely large numbers of people. The question is whether voluntary, self-selected scale is sufficient, or whether institutional deployment (schools, platforms) is necessary for population-level effects.
The Bad News game represents the most promising evidence to date that targeted prebunking can be delivered at scale with measurable effects on manipulation resistance. It is not a complete solution — but it is a genuine tool.
This case study draws on: Roozenbeek, J., & van der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5, 65. Also: Roozenbeek, J., & van der Linden, S. (2019). The fake news game: Actively inoculating against the risk of misinformation. Journal of Risk Research, 22(5), 570-580. And: Roozenbeek, J., van der Linden, S., & Nygren, T. (2020). Prebunking interventions based on the psychological theory of "inoculation" can reduce susceptibility to misinformation across cultures. Harvard Kennedy School Misinformation Review.