Glossary of Key Terms
Propaganda, Power, and Persuasion: A Critical Study of Influence, Disinformation, and Resistance
This glossary defines the key terms introduced and developed throughout the textbook. Entries are arranged alphabetically. Chapter references indicate where a term is first introduced or most fully developed. Cross-references direct readers to related concepts. Terms marked with (FLICC) belong to the FLICC taxonomy of science denial techniques: Fake experts, Logical fallacies, Impossible expectations, Cherry picking, and Conspiracy theories.
A
Agenda-setting The process by which media organizations and political actors determine which topics receive public attention, thereby shaping what the public considers important — even if they do not explicitly tell audiences what to think. First theorized by Maxwell McCombs and Donald Shaw in their 1972 study of the Chapel Hill presidential campaign, agenda-setting distinguishes between the media's limited power to change opinions directly and its considerable power to assign salience to issues. A closely related phenomenon, framing, determines not just which issues are prominent but how they are interpreted. (→ Ch.7)
See also: Framing, Priming
Agnotology The study of culturally induced ignorance or doubt — the deliberate production and maintenance of not-knowing as a political and commercial strategy. The term was coined by historian Robert Proctor to describe how the tobacco industry manufactured uncertainty about the health harms of smoking. Agnotology broadens the study of propaganda beyond the spread of false beliefs to encompass the strategic suppression of true ones. It underpins concepts such as manufactured doubt and the doubt manufacturing playbook. (→ Ch.5)
See also: Doubt manufacturing, Manufactured doubt
Algorithmic amplification The process by which recommendation systems — including social media feeds, search engines, and video platforms — preferentially surface and redistribute content based on engagement signals, rather than accuracy, importance, or representativeness. Because outrage, fear, and novelty generate high engagement, algorithmic amplification structurally advantages sensational and emotionally provocative content, including disinformation. This dynamic can occur without any deliberate intent to spread false information, making it a systemic rather than merely intentional problem. (→ Ch.16)
See also: Filter bubble, Echo chamber
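The structural dynamic described above — ranking by engagement with no reference to accuracy — can be sketched in a few lines. This is a toy illustration only: the posts, field names, and scores are invented, and no real platform's ranker is this simple.

```python
# Toy model of engagement-based ranking (illustrative only; not any
# real platform's algorithm). Each post carries an engagement signal
# (clicks, shares, reactions) and an accuracy score the ranker never reads.
posts = [
    {"title": "Measured report on new study",     "engagement": 120, "accuracy": 0.95},
    {"title": "Outrage-bait misreading of study", "engagement": 980, "accuracy": 0.20},
    {"title": "Careful fact-check",               "engagement": 45,  "accuracy": 0.99},
]

def rank_feed(posts):
    """Order posts purely by engagement, as a naive feed ranker might."""
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

feed = rank_feed(posts)
print([p["title"] for p in feed])
# The least accurate, most provocative post tops the feed, even though
# nothing in the code "intends" to promote misinformation.
```

The point of the sketch is that the amplification of low-accuracy content emerges from the objective function alone, which is why the entry describes the problem as systemic rather than intentional.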
Anchoring bias A cognitive tendency in which an individual's judgments are disproportionately influenced by the first piece of information encountered on a given topic — the "anchor." In propaganda and advertising contexts, anchoring is exploited by establishing extreme or favorable reference points early (e.g., an inflated original price, an exaggerated threat) so that subsequent information is evaluated relative to that anchor. Even when the anchor is known to be arbitrary, its influence on final judgment persists. (→ Ch.4)
See also: Heuristic processing, Priming
Astroturfing The creation of an artificial appearance of grassroots public support for a political position, product, or movement. The term derives from AstroTurf, the synthetic grass brand, as a metaphor for fake organic growth. Astroturfing operations are conducted by governments, corporations, and political parties using networks of fake accounts, paid commenters, and front organizations that disguise their sponsorship. Distinguished from organic advocacy by the coordinated, inauthentic nature of the activity. (→ Ch.17)
See also: Coordinated inauthentic behavior (CIB), Sockpuppet
Attitude inoculation A psychological intervention based on inoculation theory in which individuals are exposed to weakened forms of persuasive attacks — along with refutations — before encountering the real attack. The analogy to biological vaccination is direct: a small, neutralized dose of the pathogen builds resistance to a full infection. Attitude inoculation has been shown to reduce susceptibility to misinformation even when the specific false claims differ from those used in the inoculation. (→ Ch.33)
See also: Inoculation theory, Prebunking, Technique-based inoculation
Authority appeal (false expertise) A persuasion technique that invokes the credibility of an apparent expert or authority figure to lend legitimacy to a claim, even when that figure lacks genuine relevant expertise or has conflicts of interest. This technique is exploited across commercial, political, and scientific domains — for instance, using a celebrity doctor to endorse a dubious medical product, or citing a credential-holding outlier to suggest scientific controversy where none exists. Distinct from legitimate appeals to authority, which involve genuine and unbiased expertise. (→ Ch.8)
See also: Fake experts (FLICC), Epistemic authority
B
Backfire effect A phenomenon, initially reported by Brendan Nyhan and Jason Reifler, in which providing factual corrections to a person's false belief not only fails to correct the belief but actually strengthens it. The effect was theorized to arise from identity-protective cognition: when a correction threatens a deeply held identity or worldview, the corrected individual doubles down to defend it. Subsequent research has found the effect to be less robust and less universal than originally reported, but it remains relevant as a theoretical warning about the limits of simple fact-checking. (→ Ch.34)
See also: Identity-protective cognition, Correction paradox, Motivated reasoning
Bandwagon effect A persuasion technique that encourages adoption of a belief or behavior on the grounds that "everyone" is already doing or believing it. By appealing to the desire for social conformity and the fear of exclusion, bandwagon propaganda reduces the perceived cost of compliance and raises the perceived social cost of dissent. Used extensively in electoral propaganda, advertising, and totalitarian mass mobilization campaigns in which visible participation signals loyalty. (→ Ch.8)
See also: In-group/out-group dynamics, Glittering generalities
Big Lie (große Lüge) A propaganda technique, first described by Adolf Hitler in Mein Kampf, premised on the observation that people find it easier to believe an enormous falsehood than a small one, because ordinary people tell small lies and so do not assume that anyone could fabricate something on a colossal scale. The Big Lie functions not merely as deception but as a framework that displaces reality wholesale, making it a cornerstone of totalitarian information environments. The term has since been applied to contemporary political disinformation campaigns that assert sweeping false narratives with great repetition and confidence. (→ Ch.21)
See also: Firehose of falsehood, Repetition, Illusory truth effect
Bounded choice A concept developed by sociologist Janja Lalich to describe the way coercive groups — cults, authoritarian movements, totalitarian states — create an illusion of free decision-making while systematically narrowing the range of permissible options. Within a bounded-choice system, individuals appear to choose freely among alternatives, but all alternatives are defined, permitted, and monitored by the controlling authority, and exit from the system is made costly or dangerous. (→ Ch.29)
See also: Coercive persuasion, Milieu control, Sacred science
Brandenburg standard The legal threshold established by the U.S. Supreme Court in Brandenburg v. Ohio (1969) for when government may restrict speech that advocates illegal action. Under the Brandenburg test, speech may only be restricted if it is directed to inciting or producing imminent lawless action and is likely to produce such action. The standard provides strong constitutional protection for political speech, including inflammatory speech, and is a key reference point in discussions of the legal limits of anti-propaganda regulation in liberal democracies. (→ Ch.35)
See also: NetzDG, Section 230
C
Cherry picking (FLICC) The selective use of evidence in which data or examples that support a desired conclusion are prominently cited while contradictory data or examples are ignored, suppressed, or minimized. One of the five main techniques catalogued in the FLICC taxonomy of science denial, cherry picking exploits the availability of some supporting data in virtually any body of research to manufacture an appearance of empirical legitimacy. It is distinct from legitimate selective emphasis, which acknowledges contrary evidence while arguing it is outweighed. (→ Ch.9)
See also: FLICC, Manufactured doubt
Coercive persuasion A system of social influence and behavior control developed in high-demand groups and totalitarian states that employs psychological rather than purely physical means to alter beliefs, identities, and behaviors. The concept grows out of work by psychiatrist Robert Jay Lifton and social psychologist Edgar Schein on the "thought reform" programs used in Chinese Communist reeducation campaigns. Techniques typically include isolation, sleep and food deprivation, love bombing followed by punishment, confession rituals, and systematic deconstruction of prior identity. (→ Ch.29)
See also: Milieu control, Bounded choice, Sacred science, Thought-stopping clichés
Confirmation bias The tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses, while giving disproportionately less attention to information that contradicts them. One of the most robust findings in cognitive psychology, confirmation bias is systematically exploited by propaganda that provides its audience with a stream of confirming evidence while discrediting disconfirming sources. Social media algorithms may amplify confirmation bias by filtering feeds toward content the user has previously engaged with. (→ Ch.4)
See also: Echo chamber, Filter bubble, Motivated reasoning
Conspiracy framing (FLICC) The use of conspiracy narratives to preemptively discredit criticism or contradicting evidence by asserting that apparent refutation is itself part of a coordinated cover-up. One of the five FLICC techniques, conspiracy framing is logically unfalsifiable: any evidence against the conspiracy is reinterpreted as evidence of the conspiracy's power and reach. It functions to insulate a belief system from normal epistemic correction and is a common feature of both science denial and political extremism. (→ Ch.9)
See also: FLICC, Manufactured doubt, Doubt manufacturing
Content analysis A systematic, replicable research methodology for analyzing the manifest and latent content of communications — including text, images, audio, and video. In the study of propaganda, content analysis is used to quantify the presence of specific themes, symbols, techniques, and frames across large bodies of material, enabling comparative and longitudinal study. The method requires explicit coding protocols, reliability testing between coders, and care in moving from description to inference about effects. (→ Ch.36)
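The reliability testing between coders mentioned above is commonly quantified with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. The sketch below is a minimal illustration with invented labels; real coding protocols use larger samples and often additional statistics such as Krippendorff's alpha.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement between two coders, corrected for chance."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labeled identically.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    categories = set(freq_a) | set(freq_b)
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: two coders label the same ten posters for the
# presence of a "fear appeal".
coder_a = ["fear", "fear", "none", "fear", "none", "none", "fear", "none", "fear", "none"]
coder_b = ["fear", "fear", "none", "none", "none", "none", "fear", "none", "fear", "fear"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # → 0.6
```

Here the coders agree on 8 of 10 items (80%), but because each label is used half the time, 50% agreement is expected by chance, yielding kappa = (0.8 − 0.5) / (1 − 0.5) = 0.6, conventionally read as moderate-to-substantial agreement.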
Coordinated inauthentic behavior (CIB) A term coined by Facebook/Meta to describe activity on digital platforms in which multiple accounts or pages work together to promote content or narratives while misrepresenting their origin, coordination, or identity. CIB may involve fake accounts, compromised real accounts, or real accounts acting in a coordinated way at the direction of a government, political party, or private contractor. The definition focuses on inauthenticity of behavior rather than falsity of content. (→ Ch.17)
See also: Astroturfing, Sockpuppet
Correction paradox The documented tendency of factual corrections to fall short of fully reversing the effects of misinformation, even when the correction is accepted and the original false claim is no longer believed. The correction paradox encompasses the backfire effect (belief strengthening under correction), continued influence effects (in which corrected claims continue to influence reasoning and recall), and various asymmetries between the ease of spreading false information and the difficulty of correcting it. (→ Ch.34)
See also: Backfire effect, Illusory truth effect, Truth sandwich
Counter-propaganda The systematic effort to identify, analyze, expose, discredit, and neutralize propaganda from adversarial sources. Counter-propaganda operations may be offensive (attacking the credibility of adversary sources), defensive (building audiences' resistance to manipulation), or structural (reforming the information environment to reduce propaganda effectiveness). Counter-propaganda is distinct from propaganda in its orientation toward accuracy and transparency, though the boundary can become blurred in practice. (→ Ch.31)
See also: Inoculation theory, Media literacy, Prebunking
CPI (Committee on Public Information) The U.S. government agency established by President Woodrow Wilson in 1917 to manage domestic propaganda during World War I. Headed by journalist George Creel, the CPI employed thousands of volunteers (the Four-Minute Men), produced posters, pamphlets, films, and news content, and used advertising techniques adapted from commercial practice to mobilize public support for the war. The CPI is a foundational case study in state-organized mass persuasion campaigns in a democratic context. (→ Ch.12)
See also: Four-Minute Men
D
Deception gradient A conceptual framework for classifying forms of information manipulation along a spectrum from outright fabrication to subtle framing and omission. At one extreme lies deliberate disinformation — content known by its producer to be false. At the other extreme are techniques such as strategic omission, misleading framing, and technically true but misleading statements. The deception gradient is useful for analysis because much effective propaganda does not involve outright lies but rather the selective presentation and contextualization of true information. (→ Ch.3)
See also: Disinformation, Misinformation, Strategic omission, White propaganda / gray propaganda / black propaganda
Deepfake Synthetic audio or video media in which a person's face, voice, or likeness is replaced or generated by a computational model, typically using deep learning techniques such as generative adversarial networks (GANs) or diffusion models. Deepfakes have significant implications for propaganda and disinformation because they can fabricate apparently authentic evidence — a political leader giving a speech they never delivered, or a private individual appearing in compromising situations. The existence of deepfake technology also generates a "liar's dividend" in which genuine recordings can be credibly dismissed as fabricated. (→ Ch.38)
See also: Liar's dividend, AI-generated content, Information warfare
Democratic backsliding The gradual erosion of democratic norms, institutions, and practices in states that have established formal democratic systems, typically through legal and procedural means rather than outright coups. Propaganda and information manipulation are consistently identified as tools in democratic backsliding, as incumbents use media capture, strategic disinformation, and electoral manipulation to entrench their power while maintaining a formal democratic facade. The concept of the "spin dictatorship" (Guriev and Treisman) describes a contemporary subtype. (→ Ch.32)
See also: Spin dictatorship, Media capture
Disinformation False or misleading information that is created and spread with the deliberate intent to deceive, typically to advance a political, commercial, or ideological agenda. Disinformation is distinguished from misinformation by the element of intent: disinformation is a weapon, while misinformation may result from honest error. The distinction matters for policy responses — misinformation may be addressed through correction, while disinformation requires attention to the actors and incentive structures producing it. (→ Ch.1)
See also: Misinformation, Deception gradient, White propaganda / gray propaganda / black propaganda
Dog whistle A coded political message that carries a covert meaning understood by a specific target audience while remaining innocuous or ambiguous to the broader public. Dog whistles allow speakers to mobilize in-group sentiment — particularly on racially or culturally charged issues — while maintaining plausible deniability. The term acknowledges that effective political communication often operates on multiple registers simultaneously, with surface meaning diverging from subtext. (→ Ch.8)
See also: Loaded language, In-group/out-group dynamics
Doubt manufacturing A strategic communications approach, documented most thoroughly in the tobacco, fossil fuel, and pharmaceutical industries, in which organizations fund and publicize research, commentary, and advocacy designed not to prove an alternative claim but simply to create the impression that scientific consensus is uncertain or contested. The goal is paralysis of regulatory action rather than persuasion of the public to a specific alternative position. The phrase "doubt is our product" — from an internal tobacco industry memorandum — has become a defining statement of the strategy. (→ Ch.5)
See also: Agnotology, Manufactured doubt, Cherry picking (FLICC)
E
Echo chamber A communication environment — whether digital, social, or institutional — in which a person encounters primarily information and opinions that reinforce their existing beliefs, because dissenting voices are absent, marginalized, or dismissed. Echo chambers are distinguished from filter bubbles by their social dimension: in an echo chamber, members actively endorse and amplify conforming views rather than merely receiving algorithmically curated content. Both phenomena are theorized to contribute to political polarization and susceptibility to propaganda. (→ Ch.16)
See also: Filter bubble, Confirmation bias, Algorithmic amplification
Emotional appeal The use of emotionally charged content — including fear, anger, pride, disgust, hope, and empathy — to persuade audiences, typically bypassing or short-circuiting careful analytical reasoning. Among the most ancient and effective tools in the propagandist's repertoire, emotional appeals mobilize action, create in-group cohesion, and dehumanize out-groups. They are not inherently manipulative — all effective communication contains emotional elements — but they become propaganda techniques when used to bypass rather than complement reasoned argument. (→ Ch.8)
See also: Heuristic processing, Anchoring bias
Epistemic authority The socially recognized capacity of an individual, institution, or source to produce and certify knowledge claims that others accept as credible. Epistemic authority is distributed across societies through education, credentialing, professional norms, and track records of reliability. Propaganda frequently targets epistemic authority — either by claiming it falsely (through false experts), by undermining the legitimate epistemic authority of scientists, journalists, and courts, or by concentrating epistemic authority within a single party or leader. (→ Ch.6)
See also: Authority appeal (false expertise), Epistemic infrastructure
Epistemic infrastructure The set of institutions, norms, practices, and technologies through which a society produces, evaluates, distributes, and corrects knowledge. Epistemic infrastructure includes universities, scientific peer review, journalism standards, independent courts, and cultural norms of evidence-based reasoning. Propaganda and disinformation operations frequently target epistemic infrastructure because degrading the mechanisms of truth-telling is more powerful than contesting individual facts. Defending epistemic infrastructure is therefore central to democratic resilience. (→ Ch.6)
See also: Epistemic authority, Democratic backsliding, Habermas public sphere
F
Fake experts (FLICC) The recruitment and promotion of individuals with credentials — often genuine, but in an adjacent field — to lend apparent scientific or expert legitimacy to claims that lack genuine expert consensus. One of the five FLICC techniques, fake experts exploit the public's reliance on credentialed expertise and the media's norm of "both-sides" coverage to manufacture the impression of scientific controversy. The technique was documented extensively in climate denial and tobacco science campaigns. (→ Ch.9)
See also: Authority appeal (false expertise), FLICC, Manufactured doubt
False equivalence A rhetorical and journalistic practice in which two positions, sources, or claims are presented as if they carry equal evidential weight or legitimacy when they do not. False equivalence enables minority or fringe positions to gain credibility through association with mainstream positions, and it creates the appearance of balanced coverage while distorting the actual state of evidence. False equivalence is distinct from genuine intellectual balance, which acknowledges where consensus lies while still giving attention to dissent. (→ Ch.7)
See also: Fake experts (FLICC), Manufactured doubt
Filter bubble A state of intellectual isolation produced when personalization algorithms on digital platforms curate content so that users primarily encounter information that reflects and reinforces their existing preferences and worldviews. The term was introduced by internet activist Eli Pariser in his 2011 book of the same name. Filter bubbles differ from echo chambers in that the isolation is primarily algorithmic rather than socially chosen. Empirical research has produced mixed findings about the actual magnitude of filter bubble effects. (→ Ch.16)
See also: Echo chamber, Algorithmic amplification, Confirmation bias
Firehose of falsehood A model of information warfare, most closely associated with contemporary Russian state communications strategy, that involves the high-volume, rapid-fire dissemination of multiple mutually contradictory false claims with no apparent commitment to consistency. The strategy aims not to persuade audiences of any specific alternative narrative but to overwhelm them with competing claims, destroy their confidence in the possibility of knowing the truth, and produce political paralysis. RAND Corporation analysts coined the phrase in their 2016 study of Russian propaganda techniques. (→ Ch.24)
See also: Reflexive control, Big Lie, Disinformation
FLICC An acronym developed by John Cook and colleagues as a taxonomy of the five major techniques used to manufacture doubt about scientific consensus: Fake experts, Logical fallacies, Impossible expectations, Cherry picking, and Conspiracy theories. FLICC provides a useful framework for identifying science denial rhetoric across domains including tobacco, climate change, vaccines, and evolution, and it forms the basis for technique-based inoculation approaches to media literacy. (→ Ch.9)
See also: Fake experts, Cherry picking, Conspiracy framing, Manufactured doubt, Inoculation theory
Four-Minute Men A network of approximately 75,000 volunteer speakers organized by the U.S. Committee on Public Information during World War I to deliver brief pro-war speeches in movie theaters, churches, and public spaces during natural pauses in entertainment — typically the roughly four minutes required to change reels in silent-film theaters. The Four-Minute Men represented an early application of mass advertising techniques to political propaganda and demonstrated the reach that organized volunteer networks could achieve before electronic media. (→ Ch.12)
See also: CPI (Committee on Public Information)
Framing The process by which communicators — deliberately or unconsciously — present information within a particular context, structure, or set of assumptions that shapes how audiences understand and evaluate it. Framing effects have been extensively documented in political communication: the same policy can attract majority support or opposition depending on whether it is framed as a gain or a loss, a right or a privilege, a cost or an investment. Framing is among the most pervasive and difficult-to-detect propaganda techniques because it operates through what is treated as background assumption rather than explicit assertion. (→ Ch.7)
See also: Agenda-setting, Priming, Strategic omission
G
Glittering generalities Emotionally appealing, vague, and universally valued concepts — freedom, family, tradition, progress, security — deployed in propaganda to associate a message, movement, or leader with these positive values without making specific, falsifiable claims. The Institute for Propaganda Analysis, which coined the term in 1937, identified glittering generalities as one of the seven core propaganda devices. Because glittering generalities are too abstract to be meaningfully contested, they insulate propaganda claims from refutation while building affective association. (→ Ch.8)
See also: Emotional appeal, Loaded language
H
Habermas public sphere Jürgen Habermas's theoretical concept of a domain of social life — distinct from both the state and the private sphere — in which citizens assemble to form public opinion through rational-critical discourse. The ideal public sphere, as Habermas described it, is characterized by openness, equal access, and the force of the better argument rather than status or power. Habermas's theory provides a normative benchmark for evaluating how propaganda, media concentration, and algorithmic curation distort genuinely democratic communication. His later work acknowledges that the ideal was always partial and that structural conditions of the media systematically advantage powerful actors. (→ Ch.2)
See also: Epistemic infrastructure, Lippmann-Dewey debate, Democratic backsliding
Heuristic processing A mode of information processing, described in dual-process theories of cognition, in which judgments are made quickly using mental shortcuts (heuristics) rather than through careful analytical evaluation. Heuristics include source credibility, emotional tone, familiarity, and consensus cues. Propaganda and advertising frequently design messages to trigger heuristic rather than analytical processing — for instance, by using attractive spokespeople, emotionally resonant imagery, or simple repetition — because heuristic judgments are less likely to detect logical flaws or factual errors. (→ Ch.4)
See also: Anchoring bias, Illusory truth effect, Motivated reasoning
I
Identity-protective cognition A pattern of motivated reasoning in which individuals assess evidence not primarily for its truth value but for its consistency with the beliefs, values, and identities of their social group. Developed by Dan Kahan and colleagues, identity-protective cognition explains why providing more information to politically polarized audiences often does not reduce disagreement and may increase it: greater cognitive sophistication enables more skilled rationalization. The concept is central to understanding why factual corrections often fail and why inoculation approaches focus on technique rather than content. (→ Ch.4)
See also: Motivated reasoning, Backfire effect, Confirmation bias
Illusory truth effect The empirically robust finding that repeated exposure to a statement increases the likelihood that it will be judged true, regardless of its actual truth value. First documented by Hasher, Goldstein, and Toppino in 1977, the illusory truth effect arises from processing fluency: a statement that has been encountered before is processed more easily, and this ease of processing is misattributed to accuracy. The effect operates even when subjects are informed that a claim is false before rating it, and even among highly educated individuals. (→ Ch.4)
See also: Repetition, Big Lie, Heuristic processing
In-group/out-group dynamics The social psychological processes by which individuals categorize themselves and others into groups (in-groups and out-groups), generating differential evaluations, trust, and behavior toward members of each. Propaganda systematically exploits in-group/out-group dynamics to build group solidarity, dehumanize perceived enemies, and mobilize collective action including violence. Techniques including name-calling, us-vs.-them framing, and the identification of scapegoats all rely on activating and intensifying in-group/out-group distinctions. (→ Ch.4)
See also: Name-calling, Dog whistle, Emotional appeal
Information diet The totality of information sources — news outlets, social media platforms, search engines, personal conversations, books, and other media — that an individual regularly consumes. The concept is used in media literacy discourse by analogy to nutritional diet: just as a poor nutritional diet can impair physical health, a poor information diet — one that is narrow, partisan, or heavily populated by unreliable sources — may impair epistemic health and democratic citizenship. The SIFT method provides practical tools for improving individual information diets. (→ Ch.33)
See also: SIFT method, Filter bubble, Media literacy
Information warfare The use of information and communication systems as instruments of conflict — to degrade an adversary's decision-making capacity, sow confusion and division within enemy populations, influence public opinion in neutral or allied countries, and shape the information environment to favor one's own strategic objectives. Information warfare encompasses PSYOP/MISO operations, cyberattacks on communication infrastructure, disinformation campaigns, and the manipulation of digital platforms. Modern information warfare blurs the boundaries between wartime and peacetime and between domestic and foreign audiences. (→ Ch.27)
See also: PSYOP/MISO, Reflexive control, Firehose of falsehood
Inoculation theory A social psychological theory, developed by William McGuire in the 1960s and extended by Sander van der Linden and others, that predicts that exposure to weakened forms of persuasive attacks, together with refutation of those attacks, builds resistance to future full-strength persuasion attempts. The biological metaphor of vaccination is central: as a vaccine introduces a weakened pathogen to stimulate immune response, inoculation messaging exposes individuals to the logic and technique of manipulation so that they recognize and resist it when encountered. (→ Ch.33)
See also: Attitude inoculation, Prebunking, Technique-based inoculation, FLICC
J
Juche ideology The official state ideology of North Korea, developed by Kim Il-sung and subsequently elaborated by Kim Jong-il, centering on the principles of self-reliance (juche), independence, and the absolute primacy of the party and leader. Juche functions simultaneously as political philosophy, civic religion, and propaganda framework, providing the ideological scaffolding for the North Korean personality cult and the total information control system. It represents one of the most thoroughly documented examples of a state ideology designed to foreclose alternative worldviews. (→ Ch.22)
See also: Kim family cult of personality, Total information control, Kwangmyong
K
Kim family cult of personality The systematic construction and perpetuation of an extreme reverence for the Kim dynasty — Kim Il-sung, Kim Jong-il, and Kim Jong-un — as the organizing principle of North Korean political and cultural life. The cult employs every available medium, including monuments, murals, film, music, school curricula, and obligatory public rituals, to present the Kim family as quasi-divine figures whose personal greatness is the source of national identity and survival. The North Korean case represents the most extensively documented contemporary example of a totalitarian personality cult. (→ Ch.22)
See also: Juche ideology, Total information control, Kwangmyong
Kwangmyong The North Korean domestic intranet — a closed, state-controlled network accessible to North Korean citizens that provides content entirely approved by the state, with no connections to the global internet. Kwangmyong exemplifies the total information control strategy of the Kim regime: citizens can access a curated version of digital communication while being systematically prevented from accessing external information. The network illustrates how information architecture itself can be a propaganda instrument. (→ Ch.22)
See also: Total information control, Kim family cult of personality, Media capture
L
Liar's dividend The strategic benefit accrued by bad actors from the mere existence of deepfake and synthetic media technology, even without deploying that technology. Because audiences know that convincing fakes are possible, genuine recordings of real events can now be credibly dismissed as fabricated. The liar's dividend thus empowers such actors to deny authentic evidence against them, compounding the direct disinformation threat of deepfakes with an indirect threat to the evidentiary value of authentic documentation. (→ Ch.38)
See also: Deepfake, Disinformation
Loaded language Words and phrases that carry strong emotional connotations — positive or negative — beyond their literal denotative meanings, used to prime an audience's emotional response and frame an argument without explicit argument. Examples include the difference between "freedom fighter" and "terrorist," "pro-life" and "anti-abortion," or "collateral damage" and "civilian deaths." Loaded language is among the most pervasive and difficult-to-detect propaganda techniques because it operates within apparently neutral descriptive sentences. (→ Ch.7)
See also: Framing, Dog whistle, Glittering generalities
Love bombing A technique used in high-demand religious groups, cults, and coercive relationships in which a recruit is overwhelmed with affection, attention, praise, and a sense of belonging in the early stages of recruitment. Love bombing creates strong emotional bonds and a sense of debt and gratitude that are subsequently leveraged to demand compliance and discourage exit. The sudden withdrawal of love bombing — and its replacement with criticism or punishment — is used as a behavioral control tool. (→ Ch.29)
See also: Coercive persuasion, Bounded choice, Milieu control
Lippmann-Dewey debate A foundational controversy in democratic theory, played out primarily in the writings of Walter Lippmann and John Dewey in the 1920s, about whether democratic citizens are capable of the self-governance democracy requires. Lippmann, in Public Opinion (1922) and The Phantom Public (1925), argued that modern societies are too complex for ordinary citizens to understand and that expert management of public affairs was necessary. Dewey, in The Public and Its Problems (1927), rejected this technocratic conclusion, arguing that the problem was inadequate communication and civic education rather than inherent citizen incapacity. The debate frames enduring questions about propaganda, media literacy, and democratic theory. (→ Ch.2)
See also: Habermas public sphere, Media literacy, Rationality assumption
M
Manufactured doubt The strategic production of uncertainty about established scientific knowledge, typically funded by industries whose products would be regulated if the evidence were accepted at face value. Distinguished from genuine scientific uncertainty by the fact that it is produced not by new empirical findings but by communications strategies including fake experts, cherry-picked data, and the amplification of outlier research. The tobacco industry's campaign against evidence of smoking's health harms is the canonical case; the same playbook has been applied in climate change denial, pharmaceutical regulation debates, and vaccine safety controversies. (→ Ch.5)
See also: Doubt manufacturing, FLICC, Agnotology
Media capture The process by which private economic interests or political actors come to control the editorial independence of media organizations, such that coverage systematically reflects the interests of the controlling party rather than serving the public's information needs. Media capture may occur through direct ownership, through regulatory control, through advertising dependence, or through the cultivation of access journalism. Captured media is a key instrument of democratic backsliding and is distinguished from state media by the maintenance of a formally independent appearance. (→ Ch.32)
See also: Democratic backsliding, Spin dictatorship
Media literacy The capacity to access, analyze, evaluate, and create media content with a critical awareness of how media is produced, distributed, and consumed. Media literacy encompasses understanding the economic and institutional structures of media industries, the techniques of persuasion and propaganda, the epistemological principles of evaluating evidence, and the practical skills of source verification. Media literacy education is broadly recommended as a democratic resilience tool, though debate continues about its effectiveness and limitations. (→ Ch.33)
See also: SIFT method, Inoculation theory, Counter-propaganda
Milieu control One of Robert Lifton's eight criteria of ideological totalism: the control of all social and physical environments to eliminate outside information and reinforce the group's reality. In a milieu-controlled environment, all relationships, communications, and experiences are filtered through the organization's framework. Information that contradicts the group's worldview is either excluded from the environment or systematically reframed as error, temptation, or evidence of the outside world's corruption. (→ Ch.29)
See also: Coercive persuasion, Bounded choice, Sacred science, Total information control
Misinformation False or inaccurate information that is spread regardless of intent to deceive. Misinformation encompasses honest mistakes, misremembering, and misunderstanding as well as the unintentional spread of disinformation by people who believe it to be true. The distinction between misinformation and disinformation is analytically important because the two phenomena require different interventions: misinformation may be addressed through correction and education, while disinformation requires attention to the actors and institutions producing it. (→ Ch.1)
See also: Disinformation, Deception gradient
Motivated reasoning A cognitive process in which an individual's reasoning is shaped by a desired conclusion rather than by evidence and logic alone. In motivated reasoning, the individual unconsciously searches for evidence supporting the desired conclusion, evaluates supporting evidence more credulously than challenging evidence, and constructs post-hoc rationalizations for conclusions reached on non-epistemic grounds. Motivated reasoning differs from simple irrationality in that the mechanisms involved are often sophisticated: highly intelligent individuals may engage in more elaborate motivated reasoning than less sophisticated ones. (→ Ch.4)
See also: Identity-protective cognition, Confirmation bias, Backfire effect
N
Name-calling A propaganda technique that attaches negative labels to individuals, groups, policies, or ideas in order to trigger rejection without requiring substantive argument. One of the seven propaganda techniques identified by the Institute for Propaganda Analysis (1937), name-calling works by activating emotional and identity-based responses rather than analytical evaluation. Contemporary forms include the use of ideological labels as terms of abuse, dehumanizing metaphors, and the association of opponents with widely reviled figures or movements. (→ Ch.8)
See also: Loaded language, In-group/out-group dynamics, Glittering generalities
NetzDG (Network Enforcement Act) The German Netzwerkdurchsetzungsgesetz, enacted in 2017 and subsequently revised, which requires large social media platforms to remove "obviously illegal" content — including hate speech, incitement, and defamation under German law — within 24 hours of a valid complaint, or face substantial fines. NetzDG represents one of the most significant early attempts by a liberal democracy to impose legal obligations on platforms to address harmful speech, and it has become a reference point in international debates about the appropriate legal framework for platform content moderation. (→ Ch.35)
See also: Section 230, Brandenburg standard
O
On-the-spot guidance (KCNA) The North Korean practice in which the Supreme Leader's personal visits to sites of production, education, or military activity — and his spoken instructions during those visits — are treated as authoritative policy directives and published as official guidance by the Korean Central News Agency (KCNA). The practice embeds the leader's personal authority directly into the management of all state institutions and represents a structural feature of the personality cult. It also ensures that the KCNA output reflects the leader's personal presence in all domains of national life. (→ Ch.22)
See also: Kim family cult of personality, Juche ideology
P
Prebunking An application of inoculation theory in which misinformation or manipulation techniques are exposed and refuted before audiences encounter them in the wild, rather than after. Prebunking focuses on the techniques of manipulation (logical fallacies, emotional exploitation, source distortion) rather than specific false claims, making it more transferable across topics and more durable as claims evolve. Empirical research by van der Linden, Lewandowsky, and others suggests prebunking outperforms post-hoc fact-checking in reducing misinformation acceptance. (→ Ch.33)
See also: Inoculation theory, Attitude inoculation, Technique-based inoculation, Fact-checking
Priming The process by which exposure to one stimulus influences the response to a subsequent stimulus by activating related concepts in memory. In political communication, priming describes how media coverage increases the weight audiences assign to particular considerations when evaluating political figures and issues — for instance, a period of heavy crime coverage may lead audiences to weight crime performance more heavily when evaluating an incumbent. Priming and agenda-setting operate in related but distinct ways: agenda-setting determines what is salient; priming determines which considerations are activated in judgment. (→ Ch.7)
See also: Agenda-setting, Framing, Anchoring bias
Propaganda (multiple definitions) The concept of propaganda has been defined in multiple, sometimes competing ways across its long history of study. Key formulations include: (1) Harold Lasswell's early definition of the management of collective attitudes by the manipulation of significant symbols; (2) Jacques Ellul's broad sociological definition of propaganda as a technique for producing active social conformity, encompassing both overt political messaging and the diffuse "sociological propaganda" of consumer culture; (3) definitions emphasizing intent, which restrict propaganda to messages designed to bypass critical reasoning rather than engage it; and (4) neutral social-scientific definitions focused on the systematic, large-scale nature of persuasive communication. The definitional question matters because it shapes which phenomena fall under the category and what ethical and legal frameworks apply. (→ Ch.1)
See also: Disinformation, Strategic communication, Deception gradient
Propaganda-democracy feedback loop The mutually reinforcing relationship between propaganda and democratic erosion: effective propaganda weakens the epistemic conditions (shared facts, independent media, civic trust) necessary for democratic function; and weakened democracy reduces the institutional checks that constrain propaganda. The feedback loop can be self-reinforcing, contributing to democratic backsliding even in long-established democracies. Understanding the loop's mechanisms is central to designing effective democratic resilience strategies. (→ Ch.2)
See also: Democratic backsliding, Epistemic infrastructure, Habermas public sphere
PSYOP/MISO Psychological Operations (PSYOP), now officially redesignated Military Information Support Operations (MISO) in U.S. military doctrine, refers to planned operations using communications to influence the attitudes, beliefs, and behaviors of foreign target audiences in support of national objectives. MISO is a formal instrument of state power, governed by legal and policy frameworks that restrict it to foreign audiences (U.S. law prohibits domestic application). Such operations may employ any communication medium and range from tactical battlefield applications to strategic, long-duration influence campaigns. (→ Ch.27)
See also: Information warfare, Strategic communication, Reflexive control
Public sphere The domain of social life in which citizens assemble and engage in discussion to form public opinion, mediating between civil society and the state. The concept, central to Jürgen Habermas's theoretical framework, has become a key normative reference point in discussions of propaganda, media, and democracy. A healthy public sphere requires conditions including freedom of expression, universal access, independence from state and commercial interests, and norms of evidence-based deliberation. Propaganda, media concentration, and algorithmic curation all represent threats to the conditions of a functional public sphere. (→ Ch.2)
See also: Habermas public sphere, Epistemic infrastructure, Lippmann-Dewey debate
R
Rationality assumption The contested assumption, derived from classical liberal political philosophy and standard economic theory, that individuals are capable of and typically engage in rational, evidence-based evaluation of political and commercial information. The rationality assumption underpins legal frameworks for free speech (counter-speech will correct false speech) and market regulation (informed consumers will choose appropriately). Propaganda studies, behavioral economics, and social psychology have all produced extensive evidence that the rationality assumption is an idealized approximation rather than a description of typical human cognition. (→ Ch.2)
See also: Heuristic processing, Motivated reasoning, Lippmann-Dewey debate
Reflexive control A Russian military and strategic communications concept, developed in the 1960s and 1970s, describing the ability to induce an adversary to make decisions that serve the controller's interests rather than the adversary's own — by shaping the information environment in which they deliberate, distorting the adversary's model of reality, or creating the conditions under which the adversary's "rational" decision-making leads to self-defeating conclusions. Reflexive control is broader than disinformation: it encompasses all measures to shape an adversary's decision-making process, including deception, feints, and the manipulation of perceptions of cost and risk. (→ Ch.24)
See also: Information warfare, Firehose of falsehood, PSYOP/MISO
Repetition The communication strategy of repeating a message, claim, or symbol frequently and across multiple channels in order to increase familiarity, memorability, and perceived truth. Repetition is among the most well-documented propaganda mechanisms, operating through the illusory truth effect (familiarity increases perceived truth), the availability heuristic (frequently encountered information is more easily recalled and weighted), and classical conditioning of emotional associations. The strategy is central to advertising, political messaging, and totalitarian propaganda alike. (→ Ch.4)
See also: Illusory truth effect, Big Lie, Anchoring bias
S
Sacred science One of Robert Lifton's eight criteria of ideological totalism: the treatment of the group's ideology as a sacred and inviolable truth that is beyond question, criticism, or empirical challenge. Sacred science combines the moral authority of religion with the epistemological authority claims of science, delegitimizing any questioning of the ideology as both moral failure and intellectual error. It is a feature of totalist movements because it forecloses the possibility of internal reform and demands unconditional intellectual loyalty. (→ Ch.29)
See also: Coercive persuasion, Milieu control, Bounded choice
Section 230 Section 230 of the U.S. Communications Decency Act (1996), which provides that interactive computer services (including social media platforms) are not treated as publishers or speakers of third-party content and are therefore broadly shielded from liability for that content. Section 230 also protects platforms' good-faith efforts to moderate content. The provision has been widely credited with enabling the growth of the internet economy by protecting platforms from potentially unlimited defamation and other liability, while critics argue it reduces platforms' incentives to address harmful and false content. (→ Ch.35)
See also: NetzDG, Brandenburg standard
Significance quest theory A social psychological theory, developed by Arie Kruglanski, positing that extremist radicalization is driven by a fundamental human need for personal significance — a sense of mattering, of having worth and meaning in the world. Propaganda and recruitment communications from extremist movements systematically target individuals whose sense of significance has been threatened or diminished (by humiliation, failure, or marginalization) and offer them a path to restored or elevated significance through membership in the cause. (→ Ch.29)
See also: In-group/out-group dynamics, Emotional appeal
SIFT method A four-step framework for evaluating information online, developed by information literacy educator Mike Caulfield: Stop, Investigate the source, Find better coverage, Trace claims to original context. SIFT is designed as a practical, accessible alternative to comprehensive fact-checking that can be applied in real-time browsing. The framework emphasizes lateral reading — opening new tabs to investigate sources using external checks — rather than attempting to evaluate credibility from within a single source. (→ Ch.33)
See also: Media literacy, Truth sandwich, Prebunking
Simplification The propaganda technique of reducing complex, multidimensional political, social, or scientific issues to a simple binary or a single explanatory factor. Simplification makes messages easier to understand and emotionally compelling but obscures the genuine complexity that effective policy and civic reasoning require. Extremist propaganda is particularly reliant on simplification (e.g., reducing economic grievance to ethnic scapegoating) because simple narratives are more emotionally satisfying and more resistant to nuanced correction. (→ Ch.8)
See also: Emotional appeal, Framing
Sockpuppet A fake online identity, typically an individual account rather than a mass-automated bot, created to make astroturfing activity appear to come from independent members of the public. Sockpuppets may be entirely fabricated personas or stolen/adapted identities of real people. Unlike automated bots, sockpuppets are typically operated by human beings with strategic intent, making their behavior more adaptive and harder to detect by automated content moderation systems. (→ Ch.17)
See also: Astroturfing, Coordinated inauthentic behavior (CIB)
Spin dictatorship A model of contemporary authoritarianism, developed by political scientists Sergei Guriev and Daniel Treisman, in which leaders maintain power not through mass terror (as in classical totalitarianism) but through the manipulation of public information to manufacture consent. Spin dictators use propaganda, media capture, and strategic co-optation of independent voices to create a favorable image of competent governance, while reserving targeted repression for political opponents rather than applying it broadly. The model is relevant to cases including contemporary Russia, Hungary, and Turkey. (→ Ch.32)
See also: Democratic backsliding, Media capture
Strategic communication The deliberate use of communication to advance organizational or national objectives, encompassing public diplomacy, public affairs, information operations, and persuasion campaigns. Strategic communication is a neutral term used across government, military, corporate, and non-profit contexts. It becomes a topic of ethical concern when it crosses into deception, manipulation, or coercion — a boundary that is contested in practice and varies across legal jurisdictions and professional ethical codes. (→ Ch.27)
See also: Propaganda, PSYOP/MISO, Counter-propaganda
Strategic omission The deliberate exclusion of relevant information from a communication in order to create a misleading impression without making explicitly false statements. Strategic omission is a primary tool of propaganda because it enables technically honest claims to function as deception, making it difficult to challenge on factual grounds. The deception gradient framework situates strategic omission near the less extreme end of information manipulation, but its practical effects can be as significant as outright fabrication. (→ Ch.3)
See also: Deception gradient, Framing, Cherry picking (FLICC)
Symbols In propaganda analysis, visual, linguistic, or ritual objects that carry condensed emotional, ideological, and identity meanings beyond their literal content. Propaganda makes systematic use of symbols — flags, uniforms, slogans, architectural forms, ritual gestures — because they activate affective responses and group identifications more efficiently than propositional argument. Harold Lasswell defined propaganda as the management of collective attitudes through "significant symbols," placing symbolism at the heart of the theoretical framework. (→ Ch.3)
See also: Glittering generalities, Loaded language, Emotional appeal
T
Technique-based inoculation A form of inoculation that focuses on familiarizing audiences with the general techniques of manipulation — logical fallacies, emotional exploitation, false authority — rather than with the specific false claims currently in circulation. Because specific disinformation claims evolve rapidly, technique-based inoculation is more transferable and durable than claim-specific prebunking. Research by Sander van der Linden and colleagues using the "Bad News" game has demonstrated significant technique-based inoculation effects. (→ Ch.33)
See also: Inoculation theory, Prebunking, FLICC
Thought-stopping clichés One of Robert Lifton's eight criteria of ideological totalism: the use of formulaic phrases, mantras, slogans, or stock responses that terminate analytical reasoning when ideologically challenging thoughts or questions arise. Thought-stopping clichés function as cognitive circuit breakers, short-circuiting independent evaluation and replacing it with group-approved templates. Examples include religious formulas, political slogans, and the loaded jargon of high-demand groups, all of which provide members with ready-made responses that foreclose further questioning. (→ Ch.29)
See also: Coercive persuasion, Sacred science, Milieu control
Total information control A system of censorship, surveillance, and controlled information supply in which a state or organization attempts to prevent its population from accessing any information not approved by the controlling authority, while providing a curated alternative information environment. Total information control is the information-architectural expression of totalitarianism: it aims not merely to suppress dissent but to prevent the formation of the independent knowledge base that would make dissent possible. North Korea's system — combining Kwangmyong, signal jamming, travel restrictions, and severe penalties for consuming foreign media — represents the most thoroughly documented contemporary example. (→ Ch.22)
See also: Kwangmyong, Milieu control, Kim family cult of personality
Transfer technique A propaganda technique that creates positive or negative associations for a person, idea, or product by visually, linguistically, or symbolically linking it to something already invested with positive or negative value. Identified by the Institute for Propaganda Analysis (1937), transfer operates through the mechanism of classical conditioning: by repeatedly associating a leader with revered national symbols, or an opponent with reviled images, propagandists transfer emotional valence without making propositional claims. Advertising relies on transfer extensively (pairing products with attractive, aspirational lifestyles). (→ Ch.8)
See also: Symbols, Emotional appeal, Glittering generalities
Transparency test A criterion for evaluating whether a persuasion attempt crosses the ethical line from legitimate persuasion to manipulation: would the communicator be willing to fully disclose, to the audience, the techniques, intent, and funding behind the communication? Propaganda techniques that rely on concealment of sponsorship (astroturfing), manufactured origin (sockpuppets), or deliberate emotional manipulation typically fail the transparency test. The transparency test is one of several ethical principles proposed for distinguishing ethical persuasion from manipulation. (→ Ch.36)
See also: Ethical persuasion, White propaganda / gray propaganda / black propaganda
Truth sandwich A strategic communication recommendation for journalists and communicators addressing false claims: frame the report by stating the truth first, briefly describe the false claim and debunk it, then return to and reinforce the truth. The approach is designed to counter the tendency of conventional "myth-busting" formats to inadvertently amplify falsehoods by placing them prominently in headlines and ledes. The truth sandwich is intuitively appealing, but empirical support for it remains an active area of research. (→ Ch.34)
See also: Correction paradox, Prebunking, Illusory truth effect
U
Us vs. them The fundamental binary of propagandistic communication: the division of the social world into an in-group (virtuous, threatened, deserving of solidarity) and an out-group (dangerous, alien, deserving of suspicion or hostility). Us-vs.-them framing reduces political and social complexity to a single axis of conflict, activates tribal identity psychology, and makes cooperation across group boundaries cognitively and emotionally difficult. It is a structural feature of authoritarian and totalitarian propaganda and a recurrent element of extremist recruitment communications.
See also: In-group/out-group dynamics, Name-calling, Simplification
V
Vaccine hesitancy Delay in acceptance or refusal of vaccines despite the availability of vaccine services, as defined by the World Health Organization. Vaccine hesitancy is a major public health consequence of health disinformation, driven by a complex mix of manufactured doubt (deliberate anti-vaccine campaigns), genuine concerns amplified beyond their evidentiary base, institutional distrust, and identity-protective cognition. It illustrates the real-world stakes of propaganda and disinformation analysis — the WHO identified vaccine hesitancy as one of the ten greatest threats to global health in 2019. (→ Ch.28)
See also: Manufactured doubt, Doubt manufacturing, Identity-protective cognition, Inoculation theory
W
Whataboutism A rhetorical technique of deflecting criticism by responding not to the substance of the criticism but by pointing to comparable (or allegedly comparable) wrongdoing by the critic or their allies. Whataboutism is identified with Soviet propaganda, where it was a standard response to Western criticism of human rights abuses, but is prevalent across ideological contexts. It operates by changing the subject, implying hypocrisy, and invoking an appeal to consistency, a form of the tu quoque fallacy, to suggest that criticism is invalid unless equally applied. (→ Ch.24)
See also: False equivalence, Reflexive control
White propaganda / gray propaganda / black propaganda A classification scheme for propaganda based on source transparency. White propaganda is openly attributed to its true source, with acknowledged intent to persuade. Gray propaganda has no clear attribution, leaving its source ambiguous. Black propaganda falsely attributes its content to a source other than its true origin — for instance, a government fabricating documents purportedly produced by an adversary. The classification matters because the ethical and legal status of propaganda operations varies significantly across the three categories, with black propaganda generally considered the most ethically problematic and legally actionable. (→ Ch.3)
See also: Deception gradient, Transparency test, Coordinated inauthentic behavior (CIB)
End of Glossary
Terms are cross-referenced throughout the text. For further reading on individual concepts, consult the bibliography and the further reading sections of the relevant chapters. For an alphabetical index of persons, institutions, and historical events, see the Index.