Chapter 31 Exercises: State-Sponsored Disinformation and Information Warfare

Conceptual and Analytical Exercises

Exercise 1: Taxonomy Application

Classify each of the following activities as: (a) disinformation, (b) misinformation, (c) malinformation, (d) propaganda, or (e) legitimate public communication. Explain your reasoning for each, and note where more than one category might apply.

a. A state-funded news organization reports accurately that a foreign country's government has a high rate of political imprisonment, but only runs such stories about adversary governments while ignoring comparable practices by allied governments.

b. A government spokesman claims that a leaked document is a forgery, when in fact it is genuine.

c. An intelligence agency releases genuine but selectively chosen private communications from a foreign politician to embarrass them before an election.

d. A state-funded troll farm creates fake accounts portraying ordinary citizens expressing support for a government policy.

e. A government issues a press release claiming its health statistics show a lower disease mortality rate than independent researchers have calculated, using a non-standard methodology.

f. A foreign government's official spokesman accurately cites a genuine news article critical of a target country's government policies.


Exercise 2: Bezmenov's Framework Evaluation

Yuri Bezmenov described a four-stage model of Soviet subversion: Demoralization, Destabilization, Crisis, Normalization.

Part A: Identify three specific features of the contemporary information environment (social media dynamics, algorithmic amplification, platform business models, etc.) that might accelerate or facilitate any of Bezmenov's four stages compared to the Cold War era.

Part B: Identify two significant ways in which contemporary information warfare differs from Bezmenov's model — aspects of current operations that his framework does not adequately capture.

Part C: Bezmenov was a defector with ideological motivations for presenting Soviet operations as maximally threatening. How should this context affect how we evaluate and use his account?


Exercise 3: Operation INFEKTION Case Analysis

Read the following description of Operation INFEKTION's timeline:

  • July 1983: Letter published in Indian newspaper Patriot by "anonymous American scientist"
  • October 1985: Literaturnaya Gazeta amplifies the claim
  • 1986-87: Global spread through Soviet-aligned media; translation into 30 languages
  • 1987: Soviet government acknowledges operation; agrees to discontinue
  • 1990s onwards: Narrative persists independently in Black American communities

Part A: Map this timeline onto the concept of "narrative laundering." At each stage, who are the amplifiers and what gives the narrative additional credibility?

Part B: Why did the narrative persist after the Soviet government acknowledged and discontinued it? What does this tell us about the "half-life" of successfully planted disinformation?

Part C: Structural parallels have been drawn between Operation INFEKTION and COVID-19 lab-leak discourse. Write a careful paragraph identifying the genuine parallels and a second paragraph identifying the important differences. What analytical conclusions follow?


Exercise 4: The Gerasimov Misreading

Researcher Mark Galeotti publicly acknowledged that his coining of the phrase "Gerasimov Doctrine" was a misreading that took on a life of its own.

Part A: What is the difference between describing what an adversary is doing (descriptive) and announcing what one's own side intends to do (prescriptive)? Why does confusing these produce analytical errors?

Part B: What were the real-world analytical and policy consequences of the "Gerasimov Doctrine" misreading becoming influential in Western security discourse?

Part C: This case illustrates how secondary sources can distort primary sources. Design a verification protocol that analysts or journalists could use to avoid similar errors when analyzing foreign government documents or statements.


Exercise 5: RT Editorial Analysis

Find and analyze three recent segments, articles, or social media posts from RT (Russia Today). For each, identify:

a. The specific narrative being promoted
b. The specific RT techniques used (false balance, whataboutism, strategic amplification of division, etc.)
c. The target audience and what emotional responses the content seems designed to elicit
d. Whether the content contains outright false claims, misleading framing, or material that is technically accurate but strategically selected

Write a 500-word analytical summary of what the three examples reveal about RT's current editorial strategy.


Exercise 6: The Firehose of Falsehood Response Problem

The RAND "Firehose of Falsehood" model describes a propaganda strategy that overwhelms fact-checkers through sheer volume, rapid repetition across many channels, and a lack of commitment to consistency or truth. This creates a structural problem for democratic information ecosystems.

Part A: Explain why traditional "debunking" strategies (publishing corrections to specific false claims) may be insufficient responses to the Firehose strategy. Draw on psychological research on the "illusory truth effect" and "backfire effect" in your explanation.

Part B: Design an alternative response strategy for a major news organization facing a Firehose campaign targeting their coverage. The strategy should be realistic given the organization's resources and journalistic norms.

Part C: What systemic or regulatory changes to the information ecosystem might make it more resistant to Firehose strategies? Evaluate each proposal for potential unintended consequences.


Exercise 7: China vs. Russia Comparison

Compare and contrast China's and Russia's information operations across the following dimensions:

Dimension China Russia
Primary target audiences
Primary strategic objectives
Characteristic tactics
Role of state media
Relationship to domestic politics
Capacity for plausible deniability
Effectiveness in democratic countries

After completing the table, write a 300-word analytical summary of the most important differences between the two countries' approaches and what those differences reveal about each country's strategic priorities.


Exercise 8: Narrative Laundering Mapping

Choose one contemporary political narrative that you believe may have undergone narrative laundering (examples: specific immigration claims, election integrity claims, COVID origin narratives, NATO expansion narratives).

Part A: Map the pathway of the narrative through the fringe-to-mainstream pipeline. For each stage, identify:
- The specific sources or platform types at that stage
- What gave the narrative additional credibility at that stage
- Any transformations in the narrative's form or framing as it moved up the pipeline

Part B: At what stage did the narrative become impossible to suppress even if the original source was discredited? What does this "tipping point" reveal about how narrative laundering works?

Part C: To what extent can you determine whether the narrative's origin was state-sponsored, domestically organic, or some combination? What evidence would you need to make a confident attribution?


Exercise 9: Helsinki Model Applicability Assessment

The "Helsinki model" of societal resilience to information operations has four key elements: media literacy in schools, cross-sector coordination, historical memory of external pressure, and a prebunking-over-debunking orientation.

Evaluate the applicability of the Helsinki model to your own country's context. For each element:
- What analogous institutions or practices currently exist?
- What would need to change to implement the element?
- What cultural, political, or institutional obstacles would need to be overcome?
- Are there aspects of your country's specific history or context that might serve as functional equivalents to Finland's historical memory of Soviet pressure?


Exercise 10: Attribution Standards Design

You are advising a government facing a suspected foreign influence operation targeting an upcoming national election. The operation appears to involve coordinated inauthentic behavior on social platforms that amplifies divisive narratives, but the technical evidence linking it to a specific foreign state is probabilistic rather than certain.
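What "probabilistic rather than certain" means can be given concrete shape with a toy Bayesian update. Everything below is invented for illustration: the prior, the three evidence signals, and their likelihood ratios are assumptions, not values from the chapter or from any real attribution practice.

```python
# Toy Bayesian attribution update -- all numbers are illustrative assumptions.
# Each signal carries an assumed likelihood ratio:
#   P(signal observed | state X is responsible) / P(signal observed | X is not).

def posterior(prior, likelihood_ratios):
    """Update a prior probability of attribution with independent evidence signals."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr  # assumes signals are conditionally independent (a strong assumption)
    return odds / (1 + odds)

# Assumed inputs: a 10% base rate, then three hypothetical signals
# (infrastructure overlap, language artifacts, timing correlation).
p = posterior(0.10, [4.0, 2.5, 1.5])
print(round(p, 3))  # 0.625 -- suggestive, well short of certainty
```

The point of the sketch is that several individually weak signals can raise confidence substantially yet still leave it far from the certainty that actions (iv)-(vi) might demand, which is exactly the calibration problem Parts A and C ask you to work through.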

Part A: Design an evidentiary framework specifying what standard of attribution evidence would be required before the government takes each of the following actions: (i) public attribution; (ii) private diplomatic communication to the suspected state; (iii) social media platform notification; (iv) targeted sanctions; (v) expulsion of diplomatic personnel; (vi) counter-operation.

Part B: Who should make these determinations — intelligence agencies, elected officials, independent courts, or some combination? Justify your answer.

Part C: What risks arise from setting the attribution standard too high? Too low? How should these competing risks be balanced?


Applied Technical Exercises

Exercise 11: Sockpuppet Network Detection

Suppose you are given hypothetical data about 500 social media accounts: posting frequency, mutual follower relationships, account creation dates, content similarity scores, and geographic signals. Design a methodology for identifying accounts that are likely part of a coordinated inauthentic network.

a. What specific behavioral signals would you look for?
b. How would you weight different signals relative to each other?
c. What false positive rate would be acceptable, and why?
d. How would you validate your detection methodology?
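One way to start on (a) and (b) is a weighted scoring sketch. Everything here is illustrative: the record fields, the weights, and the thresholds are assumptions to be tuned and validated (part d), not a method prescribed by the chapter.

```python
# Hypothetical account records; field names and values are illustrative only.
accounts = [
    {"id": "a1", "created_day": 100, "posts_per_day": 45.0, "content_sim": 0.91},
    {"id": "a2", "created_day": 101, "posts_per_day": 47.5, "content_sim": 0.88},
    {"id": "a3", "created_day": 400, "posts_per_day": 2.1,  "content_sim": 0.12},
]

# Signal weights -- an analyst-chosen assumption, to be tuned on labeled data.
WEIGHTS = {"burst_creation": 0.4, "high_volume": 0.25, "content_overlap": 0.35}

def coordination_score(acct, cohort_created_days, burst_window=7,
                       volume_threshold=30.0):
    """Combine weighted behavioral signals into one suspicion score in [0, 1]."""
    # Signal 1: account created within a narrow window shared by many others.
    peers = sum(1 for d in cohort_created_days
                if abs(d - acct["created_day"]) <= burst_window)
    burst = min(1.0, (peers - 1) / 10)  # saturates at 10+ co-created peers

    # Signal 2: posting frequency far above a typical-user threshold.
    volume = 1.0 if acct["posts_per_day"] >= volume_threshold else 0.0

    # Signal 3: precomputed similarity of this account's content to the network's.
    overlap = acct["content_sim"]

    return (WEIGHTS["burst_creation"] * burst
            + WEIGHTS["high_volume"] * volume
            + WEIGHTS["content_overlap"] * overlap)

days = [a["created_day"] for a in accounts]
flagged = [a["id"] for a in accounts if coordination_score(a, days) >= 0.5]
print(flagged)  # ['a1', 'a2'] with these illustrative records
```

Note that the 0.5 cutoff is where part (c) lives: raising it trades false positives for false negatives, and only validation against labeled takedown data (part d) tells you where it should sit.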


Exercise 12: Counter-Narrative Design

You have been hired by a civil society organization to design a counter-narrative campaign responding to a state-sponsored disinformation narrative claiming that your country's election system is fundamentally corrupt and cannot produce legitimate results.

a. Should you directly debunk the claim? What are the risks of direct debunking?
b. Should you prebunk future variations of the claim? How would you design a prebunking intervention?
c. How would you handle the fact that the narrative has some basis in genuine (if exaggerated) concerns about specific electoral vulnerabilities?
d. How would you measure the effectiveness of your counter-narrative campaign?


Exercise 13: Influence Operation Red Team Exercise

Working in groups of 4-6 students, design a hypothetical influence operation targeting a specific policy debate in a democratic country. Your design should specify:

a. Strategic objectives (what beliefs or behaviors do you want to change?)
b. Target audiences (who are you trying to influence, and why?)
c. Key narratives (what claims will you promote?)
d. Platform and channel strategy (where will you distribute content?)
e. Persona strategy (what fake personas, if any, will you use?)
f. Timeline and escalation plan

After completing the design, analyze your own operation: What are its vulnerabilities? How could it be detected? What counter-measures would be most effective against it? What ethical issues does this exercise raise?


Exercise 14: EU East StratCom Analysis

Visit the EUvsDisinfo database (euvsdisinfo.eu) and examine 10 recent entries.

a. What are the most common narrative themes in the 10 entries?
b. What sources and platforms most frequently distribute the disinformation?
c. What countries are most frequently targeted?
d. Evaluate the quality of the debunking provided for 3 of the entries: Is it accurate? Is it persuasive? Could it backfire by amplifying the claim?
e. What are the limitations of the EUvsDisinfo approach as a counter-disinformation strategy?


Exercise 15: Wumao vs. IRA Comparative Tactics

Based on the chapter reading and any additional research, compare the tactical approaches of China's wumao (50-cent army) and Russia's Internet Research Agency across these dimensions:

a. Content strategy: argument-based vs. distraction-based
b. Target audiences: domestic vs. foreign, specific demographics
c. Operational security: degree of concealment of state sponsorship
d. Platform strategy: preferred social media platforms and why
e. Scale of operation
f. Measurable effectiveness

Write a comparative analysis (approximately 600 words) explaining which approach poses greater challenges for democratic counter-measures and why.


Research and Writing Exercises

Exercise 16: Primary Source Analysis

Read a selection from the Mueller Report (available publicly), specifically Volume I sections dealing with Internet Research Agency operations. Identify:

a. Three specific IRA tactics that are described in the report
b. The evidentiary basis for the report's attribution of the operation to Russia
c. What the report identifies as the operation's primary strategic objectives
d. Limitations in the report's analysis that you can identify

Write a 400-word critical analysis of the report as a source for understanding state-sponsored influence operations.


Exercise 17: RAND Firehose of Falsehood Reading Response

Read the original 2016 RAND publication "The Russian 'Firehose of Falsehood' Propaganda Model" by Christopher Paul and Miriam Matthews (available at rand.org).

a. What is the authors' main analytical claim?
b. What psychological research do they cite to support their claims about why the model is effective?
c. What counter-measures do they propose?
d. In what ways does the model remain relevant to contemporary information operations? In what ways has the information environment changed since 2016 in ways that might require updating the model?


Exercise 18: Policy Brief Writing

Write a 750-word policy brief for a government minister responsible for national security, summarizing the key threats posed by state-sponsored disinformation and recommending a set of counter-measures. Your brief should:

a. Prioritize the most significant threats
b. Recommend specific, actionable measures
c. Address both short-term (12-month) and long-term (5-year) policy horizons
d. Acknowledge trade-offs and potential unintended consequences
e. Be written in accessible language for a non-technical audience


Exercise 19: Historical Research

Choose one Cold War active measures operation NOT covered in the chapter (examples: Soviet disinformation surrounding the Korean Air Lines 007 shootdown, fabricated documents targeting specific Western politicians, operations surrounding the Solidarity movement in Poland). Research the operation and write a 600-word case study that:

a. Describes the operation's objectives and methods
b. Assesses its effectiveness
c. Identifies what techniques, if any, have been adapted in contemporary operations
d. Notes what primary sources or declassified documents are available


Exercise 20: Interview Design

Design a semi-structured interview protocol that a researcher could use to understand how ordinary news consumers encounter and evaluate content from state-sponsored media or influence operations.

a. What specific research questions is the interview designed to answer?
b. What are the most important 8-10 interview questions, and why?
c. What ethical considerations should govern the recruitment and interview process?
d. What are the limitations of interview methodology for studying this topic? What alternative methods might complement it?


Advanced and Synthesis Exercises

Exercise 21: Synthesizing the Attribution Problem

The chapter discusses attribution challenges in influence operations. Consider the following scenario:

A major social media platform's security team identifies a network of 3,000 accounts engaging in coordinated inauthentic behavior during your country's election campaign. The accounts predominantly promote narratives favorable to one political party and unfavorable to another. Technical analysis shows the accounts were created in a short window, post on similar schedules, and share content in patterns suggestive of automation. Some accounts' IP addresses are associated with a foreign country; others connect through domestic VPNs.
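The creation-window signal in the scenario can be made concrete with a toy check. The timestamps and the lifetime proxy below are invented for illustration; they stand in for the kind of evidence the security team would actually hold.

```python
# Hypothetical account-creation timestamps (days since platform launch).
suspect_days = [730, 730, 731, 731, 732, 732, 733, 733]   # tight burst
baseline_days = [100, 260, 415, 520, 610, 700, 805, 900]  # organic spread

def creation_window_ratio(days):
    """Fraction of observed platform lifetime spanned by a cohort's creation dates.

    Values near 0 indicate burst creation -- one of the scenario's technical
    signals. The lifetime proxy and any cutoff are illustrative assumptions."""
    span = max(days) - min(days)
    lifetime = max(days)  # crude proxy for platform age
    return span / lifetime

print(creation_window_ratio(suspect_days))   # near 0: burst-like
print(creation_window_ratio(baseline_days))  # near 1: organic-looking
```

A signal like this supports coordination, not attribution: it says nothing about *who* created the accounts, which is precisely the gap Part A asks you to specify evidence for.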

Part A: What additional information would you need to make a confident attribution to a specific foreign state?

Part B: What would a responsible public disclosure of this information look like? Who should make the disclosure — the platform, government agencies, independent researchers?

Part C: What are the risks of premature disclosure? Of delayed disclosure? How should these risks be balanced?

Part D: Write two versions of a public statement about this finding: one that appropriately communicates the uncertainty in the attribution, and one that inappropriately overstates the certainty. Explain what makes each version appropriate or inappropriate.


Exercise 22: Designing a Media Literacy Curriculum

Based on the chapter's discussion of the Helsinki model and other counter-disinformation approaches, design a 6-session media literacy curriculum for high school students (ages 14-17) that addresses state-sponsored disinformation.

For each session, specify:
- Learning objectives
- Main content and key concepts
- One hands-on activity
- Assessment approach

Then write a justification (approximately 300 words) explaining how your curriculum reflects research on effective inoculation against manipulation rather than simple debunking.


Exercise 23: Ethical Dimensions of Counter-Operations

Democratic governments facing state-sponsored information operations must decide whether to conduct counter-operations — active measures of their own designed to undermine adversary governments' influence. This raises profound ethical questions.

Write a structured philosophical argument (approximately 800 words) addressing the ethics of democratic governments conducting offensive information operations. Your argument should:

a. Identify the strongest case for conducting counter-operations (strategic necessity, tu quoque, deterrence)
b. Identify the strongest case against (hypocrisy, escalation, rule of law, domestic risk)
c. Distinguish between different types of counter-operations on ethical grounds
d. Reach a defensible conclusion about what limits, if any, should apply to democratic governments' information operations against adversaries


Exercise 24: Future Scenarios Analysis

AI-generated content, deepfakes, and autonomous bot networks are rapidly transforming the technical landscape of information operations. Write a 600-word analysis of how three specific technological developments will affect the challenge of state-sponsored disinformation over the next decade:

a. Generative AI capable of producing highly convincing text, images, audio, and video on demand
b. AI-powered micro-targeting capable of customizing disinformation to individual psychological profiles
c. Decentralized social media platforms that lack centralized moderation capacity

For each development, assess: How does it change the threat? How does it change the available countermeasures? Does it favor attackers or defenders?


Exercise 25: Cross-Chapter Synthesis

Connect the material in this chapter to Chapter 6 (Cognitive Biases and Susceptibility to Misinformation) and Chapter 15 (Algorithmic Amplification). Write a 500-word synthesis essay answering the following question:

"State-sponsored information operations are effective not primarily because of their technical sophistication but because they exploit known features of human cognition and the structural incentives of digital platforms. Evaluate this claim using specific evidence from all three chapters."


Exercise 26: The Domestic Responsibility Problem

The chapter argues that foreign information operations primarily work by amplifying existing domestic grievances and divisions, with domestic political actors — knowingly or unknowingly — serving as amplifiers.

Write a 700-word essay addressing the following ethical question: "Do domestic political figures, media organizations, or social media users who amplify foreign disinformation bear moral responsibility for its effects, even if they are unaware of its foreign origin? Does this moral responsibility vary based on whether they could reasonably have known the origin?"

Your essay should engage with concepts from political philosophy (responsibility, complicity, reasonable care) as well as the empirical evidence about how narrative laundering works.


Exercise 27: Comparative Regulatory Analysis

Different democracies have adopted very different regulatory approaches to state-sponsored disinformation:

  • Germany: NetzDG law requiring rapid removal of illegal content (including some disinformation) by platforms, under threat of significant fines
  • France: Loi contre la manipulation de l'information, allowing courts to order rapid removal of false information during election campaigns
  • Australia: Foreign interference legislation targeting agents of foreign principals
  • United States: Primarily relying on FARA registration requirements and platform self-regulation
  • Singapore: POFMA (Protection from Online Falsehoods and Manipulation Act), allowing government ministers to order corrections or removals

For each approach, evaluate: What problem is it designed to solve? What are the risks of government overreach or abuse? What is the evidence of effectiveness? Then, comparing across the five, argue which approach best balances free expression with protection against manipulation.


Exercise 28: Mock Intelligence Assessment

You are an analyst at a fictional national intelligence agency. Write a 500-word mock intelligence assessment (modeled on the format of declassified IC assessments) evaluating foreign influence operation threats to your country's upcoming election.

Your assessment should:
- State your confidence level in key judgments (high/moderate/low confidence)
- Distinguish between what is known, what is assessed, and what remains uncertain
- Identify specific threat actors and their likely objectives
- Recommend specific protective measures

Include a brief methodological note explaining your evidentiary basis and the limitations of your assessment.