Case Study 29.2: NATO StratCom and Countering Russian Disinformation
"The Firehose That Cannot Be Turned Off"
Introduction
On the morning of February 24, 2022 — hours after Russian forces began their large-scale invasion of Ukraine — the information environment surrounding the conflict lit up simultaneously across dozens of platforms, in dozens of languages, with contradictory and self-undermining claims: the invasion was not happening; the invasion was happening but was justified by NATO aggression; Ukraine was a failed state that had requested Russian intervention; Ukrainian soldiers were committing atrocities against Russian speakers; the Western media was fabricating evidence; the Western media was suppressing the truth. Within hours, some of these narratives had been amplified millions of times through social media networks that included both organic users and coordinated inauthentic accounts.
The institutional actors responsible for countering Russian disinformation at the European and NATO level had been tracking and documenting Russian information operations for eight years. They had published research, trained practitioners, built databases, and designed counter-narrative frameworks. On the morning of February 24, 2022, the volume, velocity, and internal contradiction of the information operations they faced exceeded the capacity of any institutional counter-propaganda apparatus.
This case study examines those institutional actors — NATO's Strategic Communications Centre of Excellence and the EU External Action Service's East StratCom Task Force — their mandates, their methods, their genuine achievements, and the structural challenges that limit their effectiveness. It is a case study not in failure but in the realistic limits of institutional counter-propaganda confronting an adversary that operates without comparable constraints.
NATO StratCom CoE: Origins and Mandate
The NATO Strategic Communications Centre of Excellence (StratCom CoE) was established in 2014 in Riga, Latvia. The timing was not coincidental: Russia's annexation of Crimea in March 2014, accompanied by some of the most sophisticated information operations in modern military history, made the gap in NATO's institutional capacity for understanding and countering information warfare impossible to ignore.
StratCom CoE is an accredited NATO Center of Excellence — meaning it has been formally recognized by NATO as a center of specialized expertise, but it is not a NATO command structure. It does not issue orders, it does not run operations, and it does not make policy for NATO member states or for the Alliance. Its role is research, analysis, training, and advisory — producing the knowledge that enables policymakers and communicators to act more effectively.
The Centre is funded by its framework nations (currently Latvia, Germany, Italy, Lithuania, Estonia, the United Kingdom, the United States, Poland, Turkey, Slovakia, the Netherlands, Canada, and several others). Its staff includes military officers, civilian researchers, communications professionals, and technology specialists from member nations.
Its core outputs include:
Annual and thematic research reports: StratCom CoE has published comprehensive studies of Russian, Chinese, and other state-sponsored information operations. Reports have examined troll farm operations, the use of social media bots for influence amplification, coordinated inauthentic behavior on Facebook and Twitter/X, the specific techniques of the "firehose of falsehood" strategy, and the intersection of cyber operations and information operations.
Practitioner training: StratCom CoE runs training programs for military communicators, government public affairs professionals, and civilian practitioners across member nations — teaching the principles of strategic communication, the techniques of disinformation operations, and the methods of counter-disinformation practice.
Strategic communication advisory: StratCom CoE advises Allied nations on communication strategy for specific operations and contexts, providing analysis-based recommendations for how to communicate effectively in contested information environments.
Technology assessment: StratCom has invested significantly in technical research on automated propaganda (bot networks, coordinated amplification), AI-generated disinformation, and deepfakes as an emerging disinformation vector.
The "Firehose of Falsehood": StratCom's Analytical Contribution
One of the most influential analytical frameworks for understanding Russian information operations is the "firehose of falsehood" model, articulated by Christopher Paul and Miriam Matthews at RAND and developed further in StratCom CoE's own research on Russian propaganda techniques.
Traditional models of propaganda assume a relatively small number of coherent, consistent messages, carefully designed for maximum credibility and impact. Soviet propaganda during the Cold War generally followed this model: it told lies, but it tried to tell consistent, believable lies.
Contemporary Russian information operations, StratCom researchers documented, follow a fundamentally different model:
High volume: The operations produce an enormous volume of content — not a few carefully crafted messages but hundreds or thousands of messages daily, across dozens of platforms and in multiple languages.
High speed: Content is produced and disseminated at speeds that outpace any institutional fact-checking or counter-messaging capacity.
Deliberate internal contradiction: The operations frequently produce simultaneously contradictory narratives — the plane was shot down by Ukraine; the plane was shot down by the CIA; the plane crash was a Western provocation; the plane crash never happened. The goal is not to convince audiences of any specific alternative narrative but to produce confusion and epistemic paralysis — the sense that "no one can know the truth."
Disregard for truth or consistency: Unlike traditional propaganda, the firehose model does not require maintaining consistent positions over time. Claims can be abandoned and replaced as needed; internal contradiction is not a bug but a feature.
The practical implication for counter-propaganda is profound. The firehose model cannot be countered claim-by-claim: the volume, speed, and internal contradiction of the operations make that approach untenable. It must be countered at the level of technique (inoculation against the confusion strategy itself) and at the level of structural resilience (media literacy that makes audiences resistant to epistemic paralysis).
EU vs. Disinfo: The EUvsDisinfo Database
The EU External Action Service's East StratCom Task Force was established in 2015 at the direction of the European Council, responding to concerns about Russian-linked disinformation targeting EU member states. Its most public-facing product is the EUvsDisinfo website and database — a publicly accessible, continuously updated catalog of documented cases of disinformation originating from Kremlin-connected sources.
As of 2024, the EUvsDisinfo database contains over 15,000 documented cases, spanning more than 30 languages. Each case entry includes:
- The specific false claim
- The source(s) where it was documented
- The factual correction with evidence and source citations
- Analysis of how the claim relates to known Kremlin disinformation patterns and narratives
The database serves multiple functions: it is a research resource for journalists and researchers tracking specific narratives; it is a public awareness tool for citizens who want to recognize documented disinformation; it is an accountability mechanism that publicly names specific false claims and their sources; and it is a historical record of the scope and character of Kremlin-connected information operations over a decade.
The East StratCom Task Force also produces weekly disinformation review newsletters, which identify and analyze trending disinformation narratives across the EU information space, and thematic reports on specific disinformation campaigns and techniques.
What StratCom and EUvsDisinfo Have Documented
The scale and specificity of what these institutions have documented are important to understand. Over a decade of systematic monitoring, they have documented:
The scale of disinformation operations: State-linked Russian disinformation is not marginal or occasional — it is a sustained, well-resourced industrial operation. The EUvsDisinfo database documents thousands of cases annually, with coordinated narratives appearing simultaneously in multiple languages and across multiple platforms.
Key recurring narrative themes: Kremlin-connected disinformation consistently advances a relatively small number of strategic narratives: NATO is the aggressor in Russia's neighborhood; Western democracy is hypocritical and failing; Western media is propaganda while Russian state media tells the truth; Russia is defending "traditional values" against a decadent West; Ukraine is a failed state controlled by Nazis and Western puppeteers. These themes repeat across thousands of specific false claims, suggesting coordinated strategic intent rather than organic information disorder.
Attribution challenges: Despite systematic monitoring, formal attribution — definitively connecting specific disinformation content to specific Russian government actors — remains difficult. The operations use multiple layers of obfuscation: proxies, third-party distributors, organic-looking social media accounts, and media organizations with deliberately obscured ownership structures. StratCom and EUvsDisinfo attribute content to "Kremlin-connected" or "pro-Kremlin" sources rather than to specific government agencies in most cases.
Specific operations: Several major operations have been documented in sufficient detail for near-definitive attribution:
- The "Secondary Infektion" operation (documented by the EU DisinfoLab), which operated for approximately a decade using hundreds of fake websites and social media accounts across 30+ countries
- The Internet Research Agency (IRA) troll farm operations, exposed through Facebook and Twitter transparency reports and US Senate Intelligence Committee investigations
- The "Ghostwriter" operation targeting Germany, Poland, and the Baltic states, which involved compromising journalists' and politicians' online accounts to post fabricated content
What Institutional Counter-Disinformation Cannot Do
StratCom and EUvsDisinfo represent the most sophisticated and well-resourced institutional counter-disinformation efforts in the democratic world. Honest assessment requires acknowledging what they have not been able to achieve.
Reach the most vulnerable audiences: The audience for EUvsDisinfo's database and weekly briefings is primarily journalists, researchers, policy professionals, and already-engaged citizens. The populations most exposed to and persuaded by Kremlin-connected disinformation — including citizens in EU member states with significant Russian-language media consumption and citizens in countries with lower institutional media literacy — are largely not reading EUvsDisinfo. The counter-narrative reaches those who need it least.
Match the volume and speed of the adversary: Institutional counter-propaganda produces carefully sourced, accurately documented counter-narratives. The adversary produces high-volume, high-speed contradictory narratives that overwhelm any claim-by-claim counter-messaging capacity. One analyst at StratCom described it privately as "trying to drain a bathtub while the tap is open at full pressure." Systematic documentation takes time; at scale, there is always more disinformation than there are resources to document and correct it.
Attribute at the speed of news: The time required to responsibly attribute specific disinformation to specific state actors often exceeds the news cycle in which the disinformation is most influential. By the time attribution can be publicly stated with confidence, the emotional response to the original disinformation has already been formed in much of the target audience.
Counter the adversary's structural advantages: Democratic governments operating transparent counter-propaganda face constraints that the adversary does not: legal accountability, democratic oversight, free press scrutiny of government communication, and the credibility cost of any perceived deviation from accuracy or transparency. These constraints are features, not bugs — they are what makes democratic government legitimate. But they impose asymmetric limits on counter-propaganda capacity.
Overcome politically motivated disbelief: In EU member states where significant domestic political actors have made alignment with Russian narratives a partisan identity marker, institutional counter-disinformation is systematically dismissed by the audiences it most needs to reach. In this context, the institutional source itself has been discredited in advance: fact-checks from the EU External Action Service are interpreted by targeted audiences as EU propaganda.
The Structural Challenge: Democratic Values vs. Information War Asymmetry
The deepest challenge facing NATO StratCom and EUvsDisinfo is structural: democratic governments operating under rule-of-law constraints, with transparent sources and accountable communications, face adversary information operations that deliberately operate without these constraints.
Russia's state-linked information operations do not need to be transparent about their source — they use proxies and false-front organizations. They do not need to be accurate — they deploy the firehose of falsehood precisely because internal contradiction serves their goals. They do not need to serve their audiences' genuine interests — the goal is strategic disruption, not public information.
Democratic counter-propaganda cannot adopt these methods without ceasing to be democratic. A democratic government that runs covert disinformation operations undermines the transparency and accountability that make it democratically legitimate. There is no covert information operation that, if discovered, does not cause significantly more reputational damage than the operation was designed to prevent.
This is the asymmetry that Tariq identified in the seminar — and it is real. But the historical record also suggests that democratic governments that have attempted to meet disinformation with disinformation have consistently done more harm than good. The Office of Strategic Influence (OSI), established by the United States Department of Defense in 2001 to conduct "information operations" abroad, was shut down within months after press reports revealed it planned to distribute false stories through foreign news organizations. The revelation undermined US credibility in precisely the regions it was designed to influence.
The available evidence suggests that the correct response to the structural asymmetry is not to adopt the adversary's methods but to invest in the structural resilience that makes democratic populations less vulnerable to those methods: media literacy education, transparent and trustworthy public institutions, well-resourced independent journalism, and platform-level interventions that reduce the amplification of disinformation.
Specific Case: Countering the Ukraine Biolabs Narrative (2022)
The "secret NATO biolabs in Ukraine" narrative illustrates both the achievements and the limits of institutional counter-disinformation in practice.
The disinformation: Beginning in February 2022, Kremlin-connected sources — primarily Russian state media and the Russian Foreign Ministry's official social media accounts — began amplifying the claim that the United States had established secret biological weapons laboratories in Ukraine, in preparation for an attack on Russia. This claim cited real US government-funded laboratories as its evidence, grossly misrepresenting their purpose (biosafety work under the Nunn-Lugar Cooperative Threat Reduction Program).
The amplification: The narrative was picked up by Chinese state media and amplified to global audiences. It circulated on Twitter/X, Facebook, YouTube, Telegram, and TikTok, accumulating billions of views across platforms. US senators received constituent inquiries about it. It was amplified by domestic US political actors.
The counter-narrative response: EUvsDisinfo published a detailed fact-check of the claim within days. The Biden White House published a detailed fact sheet. Multiple investigative journalists reported the factual background. The US State Department published a public memorandum. NATO StratCom tracked and analyzed the amplification network.
Assessment: Among journalists, policy researchers, and engaged citizens who sought out fact-checking, the counter-narrative was effective. But among the audiences most exposed to the disinformation — those consuming Russian and Chinese state media, those in partisan online communities that had amplified the narrative, those who had not sought fact-checking but encountered the claim organically in their social feeds — the counter-narrative had limited reach.
A 2023 YouGov survey found that significant minorities in multiple European countries (ranging from 8% to 22% depending on country) believed or were uncertain about whether US biological weapons laboratories existed in Ukraine. The counter-narrative did not erase the disinformation from the information environment.
Lessons from the Institutional Experience
The decade-plus experience of StratCom CoE and EUvsDisinfo produces several durable lessons for understanding the role of institutional counter-propaganda:
Attribution, documentation, and public accountability are genuinely valuable. The EUvsDisinfo database is a unique public record of the scope and character of Kremlin-connected information operations. It has informed journalism, policy, and research in ways that would not have been possible without it. Public attribution of specific operations — when the evidence supports it — does raise the political and reputational cost of disinformation for the actors who conduct it.
Technique-level inoculation is more scalable than claim-level correction. StratCom's research on the firehose of falsehood has contributed to technique-based inoculation programs that reach far more people than specific fact-checks. The intellectual contribution of systematically documenting how information operations work is arguably more valuable than any specific counter-narrative.
Institutional counter-propaganda cannot substitute for structural resilience. The Finnish model — population-level media literacy built over decades — has produced more durable resistance to Russian disinformation than any institutional counter-narrative campaign. The resources invested in reactive counter-messaging would produce greater long-term impact if invested in proactive media literacy education.
Democratic values are not a handicap in the long run. In the short term, the asymmetry between transparent democratic counter-propaganda and opaque adversary disinformation is real and creates genuine disadvantages. In the longer term, the trustworthiness of transparent, accurate institutions is itself a strategic asset — and democratic governments that compromise that trustworthiness by adopting adversary methods destroy the asset they were trying to protect.
Conclusion
NATO StratCom CoE and the EU East StratCom Task Force represent serious, well-resourced, and genuinely valuable institutional counter-disinformation efforts. Their research has advanced collective understanding of how information operations work. Their documentation has created accountability records that would not otherwise exist. Their training programs have improved practitioner capacity across member nations.
And they have not solved the problem.
The information operations they face are conducted by a state actor with virtually unlimited resources, no democratic accountability, and no commitment to accuracy or transparency. They operate at a volume and speed that overwhelm reactive counter-messaging capacity. The narratives they advance are designed not to be believed but to confuse — making claim-by-claim correction a structurally inadequate response.
What StratCom and EUvsDisinfo cannot do, Finland's media literacy system begins to address: building population-level inoculation against the techniques before the claims arrive. The institutional and the educational approaches are not alternatives but complements — each addressing dimensions of the problem that the other cannot reach.
The firehose cannot be turned off. But the people standing in its path can be taught to recognize that they are getting wet.
Discussion Questions
- NATO StratCom CoE does not issue orders or make policy — it produces research and training. Is this an appropriate institutional model for counter-disinformation, or should NATO have a more operational counter-propaganda capacity? What are the risks of a more operational approach?
- The EUvsDisinfo database documents over 15,000 cases of Kremlin-connected disinformation. Who uses this database, and who does not? What would need to change for it to reach more vulnerable audiences?
- The "firehose of falsehood" strategy does not aim to convince audiences of specific alternative narratives — it aims to produce confusion and epistemic paralysis. What implications does this strategic goal have for counter-propaganda design?
- StratCom's experience suggests that claim-by-claim counter-messaging cannot match the volume and speed of the adversary's operations. What alternative counter-disinformation strategies does this failure point toward?
- Should democratic governments run covert information operations as part of their counter-disinformation response? What does the historical record — including the US Office of Strategic Influence case — suggest about the likely effects?