In This Chapter
- Learning Objectives
- Introduction
- Section 12.1: Defining Propaganda — From Bernays to Ellul
- Section 12.2: The Institute for Propaganda Analysis — The Classic Seven
- Section 12.3: Totalitarian Propaganda — Nazi Germany and Soviet Dezinformatsiya
- Section 12.4: Cold War Propaganda — Psychological Operations
- Section 12.5: Modern Political Propaganda — Cambridge Analytica and Micro-Targeting
- Section 12.6: Visual Propaganda — Image Manipulation and Meme Warfare
- Section 12.7: The RESIST Counter-Propaganda Framework
- Section 12.8: Propaganda vs. Persuasion vs. Education — The Ethical Line
- Key Terms
- Callout Boxes
- Discussion Questions
- Summary
Chapter 12: Propaganda — Historical Techniques and Modern Applications
"The most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly — it must confine itself to a few points and repeat them over and over." — Adolf Hitler, Mein Kampf (1925)
"The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society." — Edward Bernays, Propaganda (1928)
Learning Objectives
By the end of this chapter, students will be able to:
- Define propaganda and distinguish it from related concepts including persuasion, education, and advertising.
- Identify and apply the seven classic propaganda techniques identified by the Institute for Propaganda Analysis (1938).
- Analyze historical cases of totalitarian propaganda, including Nazi and Soviet techniques, and explain their psychological mechanisms.
- Trace the evolution of propaganda from Cold War psychological operations to contemporary data-driven micro-targeting.
- Recognize visual propaganda techniques and explain how images and memes function as propaganda carriers.
- Apply the RESIST framework to evaluate contemporary content for propaganda techniques.
- Navigate the ethical continuum from legitimate persuasion through advocacy to manipulative propaganda.
Introduction
In the 1920s, two books appeared that together defined the modern understanding of propaganda — and set the terms for debates that continue to this day. The first was Edward Bernays's Propaganda (1928), a frank celebration of mass persuasion as a necessary technique of democratic governance. The second was the second volume of Adolf Hitler's Mein Kampf (1926), which included detailed instruction on propaganda as an instrument of political domination. Both men were students of the same phenomenon: the power of systematic communication to shape what millions of people believe, feel, and do.
The distance between Bernays and Goebbels is often treated as the distance between democratic persuasion and totalitarian manipulation. But this easy distinction obscures important continuities. Bernays himself was the nephew of Sigmund Freud, and he applied psychoanalytic insights to commercial and political communication with the explicit goal of bypassing rational deliberation — of reaching people's unconscious desires and anxieties in ways they would not recognize as manipulation. As a member of the Creel Committee, he helped sell World War I to the American public; he later helped the United Fruit Company overthrow a democratically elected Guatemalan government, and he pioneered techniques of mass persuasion that his admirers and his critics both recognized as foundational to modern advertising, public relations, and political communication.
This chapter traces propaganda from its historical roots through the systematic techniques of the 20th century's most destructive ideological movements, through Cold War psychological operations, and into the data-driven, algorithmically amplified propaganda of the 21st century. Understanding these techniques is not merely academic: it is a survival skill for citizens of digital democracies.
Section 12.1: Defining Propaganda — From Bernays to Ellul
12.1.1 Etymology and Early Meanings
The word "propaganda" comes from the Latin propagare — to propagate, to spread, to cultivate. The Congregation for the Propagation of the Faith (Congregatio de Propaganda Fide), established by Pope Gregory XV in 1622, was a committee of cardinals tasked with spreading Catholic doctrine to non-Christian peoples. The word initially carried no negative connotation: it simply meant the organized effort to propagate a set of beliefs.
The negative connotation accumulated through the 20th century, as propaganda became associated with the systematic deceptions of totalitarian states. By mid-century, "propaganda" in common usage meant roughly what it means today: manipulative communication designed to serve the interests of the communicator at the expense of the recipient's rational autonomy.
12.1.2 Bernays: Propaganda as Governance
Edward Bernays is often called the "father of public relations" — a term he partly invented as a more palatable synonym for "propaganda" when the latter became disreputable. His 1928 book Propaganda opens with a frank acknowledgment that the manipulation of mass opinion is not a pathology of democracy but one of its structural features.
"The conscious and intelligent manipulation of the organized habits and opinions of the masses," Bernays wrote, "is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country."
Bernays's framework has several defining features:
Manipulation is necessary: Bernays argued that the complexity of modern society makes direct rational deliberation by citizens impossible. Elites must simplify and manage the information environment to enable governance.
Emotions over reason: Drawing on his uncle Freud's psychoanalytic insights, Bernays argued that most human behavior is driven by unconscious desires and anxieties rather than rational calculation. Effective propaganda should target these unconscious drivers.
Engineering consent: Bernays's most famous phrase — the "engineering of consent" — treats public opinion as a technical problem to be solved by specialists. Democracy becomes not a system of citizen deliberation but a system of elite management of citizen attitudes.
Commercial and political applications: Bernays is credited with several propaganda innovations: linking cigarette smoking to women's liberation (the "Torches of Freedom" campaign, 1929), popularizing bacon and eggs as the hearty American breakfast on behalf of a bacon producer, and engineering public acceptance of the 1954 CIA-backed coup in Guatemala by manufacturing an apparent consensus of expert opinion.
Bernays's work is foundational not because it represents an extreme but because it makes explicit what is usually implicit: that organized persuasion in mass societies necessarily operates through techniques that fall well short of the rational deliberation that liberal democratic theory requires.
12.1.3 Ellul: Propaganda as Total Social Phenomenon
The French philosopher and sociologist Jacques Ellul offered the most comprehensive theoretical account of propaganda in his 1962 work Propaganda: The Formation of Men's Attitudes. Ellul's analysis was more pessimistic and structurally oriented than Bernays's:
Propaganda is unavoidable: For Ellul, propaganda is not a technique deployed by cynical elites but a structural feature of modern mass societies. Mass media, mass education, and mass politics all require simplified, emotionally compelling messages that inevitably take on propagandistic character.
Sociological propaganda: Ellul distinguished between "political propaganda" (the organized campaigns of states and parties) and "sociological propaganda" (the diffuse, largely unintended propaganda embedded in advertising, entertainment, and everyday cultural products). Sociological propaganda is, in Ellul's view, more powerful and more difficult to resist precisely because it operates below the threshold of conscious recognition.
Integration propaganda: Modern propaganda, Ellul argued, does not simply implant specific beliefs but integrates individuals into social totalities — making them feel they belong to a movement, a people, a cause larger than themselves. This integration function explains the social dimensions of propaganda: why propaganda produces collective behavior and not just individual belief change.
Informed audiences are more susceptible: Counter-intuitively, Ellul argued that educated, information-consuming audiences are more susceptible to propaganda than less-informed ones, because they consume more information that can be propagandistically shaped, and because they believe themselves to be immune to manipulation — a belief that propaganda exploits.
12.1.4 The Persuasion-Manipulation Distinction
The most practically important conceptual question for a chapter on propaganda is the distinction between legitimate persuasion and manipulative propaganda. This distinction is contested but important.
Legitimate persuasion works by providing evidence, arguments, and accurate information that allow recipients to update their beliefs through rational processes. The persuader respects the recipient's rational agency — their capacity to evaluate evidence and reach independent conclusions. Persuasion aims to bring the recipient to conclusions that the evidence actually supports.
Propaganda/manipulation works by bypassing or short-circuiting rational deliberation. It exploits cognitive biases, emotional vulnerabilities, group loyalties, and social pressures to produce belief change that the recipient would reject if they could evaluate the communicative act from the outside. Manipulation does not rely on the recipient's rational capacity — it circumvents it.
This distinction is easier to state than to apply. All communication involves selection and framing; no speaker presents a view-from-nowhere. Emotional appeals are not inherently manipulative — they may accurately communicate the emotional significance of genuine facts. And the line between persuasion and manipulation is often a matter of degree rather than kind.
Nonetheless, several markers reliably distinguish propaganda from legitimate persuasion:
- Accuracy: Does the communication accurately represent the facts it cites? Propaganda routinely misrepresents, exaggerates, or fabricates.
- Attribution: Does the communication accurately represent its source and motivation? Propaganda typically conceals its origins and interests.
- Targeting of vulnerabilities: Does the communication exploit specific psychological vulnerabilities — fear of out-groups, desire for in-group belonging, cognitive biases — rather than appealing to evidence?
- Resistance to correction: Does the communication acknowledge uncertainty and respond to counter-evidence? Propaganda typically does not.
Section 12.2: The Institute for Propaganda Analysis — The Classic Seven
12.2.1 Historical Context
In 1937, a group of American academics and public intellectuals founded the Institute for Propaganda Analysis (IPA), motivated by concern about the rising tide of fascist and communist propaganda both in Europe and in the United States. The IPA published a newsletter and several books aimed at helping ordinary citizens recognize propaganda techniques in the political communication they encountered daily.
The IPA's work was itself a product of a particular historical moment — between World War I's manipulation of public opinion and the gathering catastrophe of World War II. Its founders had witnessed how effectively the Creel Committee's systematic management of public opinion had mobilized American support for World War I, and they were watching the Nazi propaganda machine reshape German society with terrifying efficiency.
The IPA identified seven propaganda techniques, which remain the most widely cited taxonomy in media literacy education:
12.2.2 The Seven Classic Techniques
Technique 1: Name-Calling
Name-calling attaches a negative label to a person, group, idea, or institution, substituting emotional reaction for rational evaluation. Rather than engaging with the substance of an opponent's position, the propagandist names the opponent with a label designed to trigger disgust, fear, or contempt.
The power of name-calling lies in its efficiency: a single emotionally loaded label can activate a complex network of negative associations without requiring any argument to be made or evidence to be presented. The label does the work — the audience's prior associations do the rest.
Historical examples range from the obviously vile (Nazi characterizations of Jewish people as "rats" and "vermin," explicitly dehumanizing language that facilitated violence) to the commonplace in contemporary political discourse ("radical," "socialist," "RINO," "fascist," "snowflake"). The technique is not limited to extreme contexts; it is pervasive in everyday political communication.
The counter-technique: identify the label; question what argument it is substituting for; ask whether engaging with the substance of the position would change your assessment.
Technique 2: Glittering Generalities
The mirror image of name-calling, glittering generalities associate a person, group, policy, or idea with positively loaded abstract words — "freedom," "democracy," "patriotism," "our values," "the American way." The words are too abstract to have precise meaning and too emotionally loaded to be evaluated critically.
Like name-calling, glittering generalities bypass rational evaluation by triggering positive emotional responses. The audience's prior positive associations with "freedom" or "the people" are transferred to whatever is being associated with these terms, without any argument being made for why the association is appropriate.
Glittering generalities are particularly effective because they are structurally difficult to oppose. Who could oppose "freedom"? The word's very abstraction makes it impossible to argue against. The propagandist can apply the label to almost anything — including proposals that, if described accurately, would be widely opposed.
Contemporary political communication is saturated with glittering generalities. "America First," "Hope and Change," "Make [X] Great Again," "Build Back Better" — these slogans function through exactly the mechanism the IPA identified in 1938.
Technique 3: Transfer
Transfer exploits the authority or prestige of respected institutions, symbols, or figures by associating them with a person, idea, or product. By connecting the propaganda target to something the audience already respects, the propagandist transfers that respect to the target without making any argument.
Common transfer carriers include:
- Religious symbols (the cross, Star of David) associated with political positions
- National symbols (flags, national monuments, patriotic imagery) associated with political figures or products
- Scientific authority ("studies show," "experts agree") falsely claimed for non-scientific claims
- Historical heroes associated with contemporary political movements
- Sports and entertainment celebrities' reputations attached to political causes
Transfer does not require explicit endorsement — mere association is often enough. Photographs of politicians at church; campaign events at national monuments; political advertisements using military imagery — all deploy transfer.
Technique 4: Testimonial
Testimonial involves having a respected or well-known figure endorse a product, person, or idea — whether or not the endorsing figure has any relevant expertise or genuine knowledge of the product. The audience is expected to transfer their trust in the endorser to the endorsed.
The IPA noted that testimonial is distinct from legitimate expert opinion: a doctor's opinion on medical matters is not testimonial but evidence. Testimonial occurs when the endorser's credibility is being borrowed in a domain where they have no relevant expertise — a celebrity endorsing a pharmaceutical product; a retired general endorsing a political candidate; an actor endorsing a financial product.
Modern influencer marketing is essentially the systematic commercial application of testimonial propaganda. The celebrity or influencer endorses products, ideas, and lifestyles in which they often have no expertise and from which they receive undisclosed financial compensation — precisely the combination that the IPA identified as propagandistically problematic.
Technique 5: Plain Folks
Plain Folks is the technique of associating a politician, product, or idea with ordinary, working-class people, suggesting that it represents the authentic concerns of "real" people as opposed to an elite or establishment. The propagandist (often an elite) presents themselves or their positions as reflecting the authentic wisdom of ordinary people.
"Plain Folks" functions by activating anti-elitist sentiment and creating identification between the audience and the propagandist. If a candidate can credibly project the image of someone who "understands ordinary people" because they dress casually, eat at diners, speak informally, and reference their humble origins, the audience is more likely to trust them and support their positions.
Contemporary manifestations include: politicians photographed in diners, at NASCAR races, or at faith community events; wealthy political candidates emphasizing their families' immigrant or working-class origins; corporate leaders presenting themselves as regular guys who happen to have started companies.
Technique 6: Card Stacking
Card stacking involves presenting only the evidence that supports the propagandist's position while suppressing or ignoring contrary evidence. The individual facts cited may be accurate; the overall impression created — that the evidence overwhelmingly supports one conclusion — is false through selective presentation.
Card stacking is perhaps the most intellectually sophisticated of the seven techniques because it can be executed entirely through accurate information. The propagandist need not fabricate anything; they simply choose which accurate information to present.
Card stacking differs from normal argument in an important way: in a well-functioning deliberative context, an advocate is expected to anticipate and respond to the strongest objections to their position. Card stacking avoids even acknowledging contrary evidence — it presents a systematically one-sided picture as if it were comprehensive.
Contemporary examples abound: political advertisements that selectively cite economic statistics; medical practitioners who present only studies supporting a preferred treatment while ignoring contrary evidence; politicians who selectively cite crime statistics to support policy positions; advocates who present only worst-case scenarios of opponent policies.
Technique 7: Bandwagon
Bandwagon appeals invoke the power of social conformity: everyone is doing it, so you should too. The propagandist appeals not to the merits of a position but to the social consequences of holding or not holding it. The implicit message is: don't be left behind; your group, your people, real [Americans/patriots/progressives] all believe this.
Bandwagon is particularly powerful because social conformity is not merely a cognitive bias but an evolutionary adaptation. Humans are deeply social animals for whom belonging and social approval are fundamental motivations. Appeals to what "everyone believes" or "your side believes" bypass rational evaluation by activating social belonging needs.
Contemporary social media amplifies bandwagon effects through visible like counts, share numbers, trending topics, and follower counts — all of which function as social proof signals that trigger bandwagon effects without requiring any explicit argument.
12.2.3 Evaluating the IPA Framework
The IPA framework has endured because its seven techniques capture genuinely pervasive patterns in manipulative communication. But the framework also has significant limitations:
- The techniques are not mutually exclusive or exhaustive — most propaganda deploys multiple techniques simultaneously.
- The framework focuses on discrete rhetorical techniques without adequately addressing the structural and institutional contexts that give propaganda its power.
- The framework implies that once you recognize a technique, you are immune to it — but psychological research suggests that awareness of manipulation does not always prevent its effects.
- The framework was developed in an era of mass broadcast media and does not fully account for the personalized, algorithmically targeted propaganda of the digital age.
Section 12.3: Totalitarian Propaganda — Nazi Germany and Soviet Dezinformatsiya
12.3.1 Nazi Propaganda: Goebbels's Principles
Joseph Goebbels, Reich Minister of Public Enlightenment and Propaganda from 1933 to 1945, was the most systematic practitioner of state propaganda in the 20th century. His propaganda apparatus achieved effects that no previous government had managed: the rapid transformation of a pluralistic democratic society into a totalitarian state within a few years, sustained by popular mobilization rather than pure coercion.
Goebbels left behind an extensive record of his principles through his diaries, speeches, and memoranda. Several key principles stand out:
Simplification and repetition: Goebbels consistently emphasized that effective propaganda must reduce complex realities to simple, memorable slogans and themes. "The essence of propaganda consists in winning people over to an idea so sincerely, so vitally, that in the end they succumb to it utterly and can never escape from it." The message must be simple enough to be retained and repeated, and it must be repeated relentlessly.
Emotional activation over rational argument: Goebbels explicitly rejected rational argumentation as the primary mode of propaganda. "Propaganda has only one object — to conquer the masses. Every means that furthers this aim is good; every means that hinders it is bad." Emotional appeals, spectacle, music, ritual, and visual imagery were prioritized over reasoned argument.
The Big Lie: Often associated with Nazi propaganda (though the phrase originated as Hitler's description of alleged Jewish propaganda), the big lie principle holds that audiences are more likely to believe a colossal falsehood than a small one, because they cannot imagine anyone fabricating something so large. A small lie is easier to disprove; a big lie is almost too large to challenge because it seems impossible that anyone would have the audacity to fabricate it.
Control of the information environment: Goebbels understood that propaganda requires not only the projection of a specific message but the elimination of competing messages. Nazi propaganda systematically destroyed independent media, banned opposition publications, burned books, and expelled or murdered journalists and intellectuals. The propaganda message was most effective not as one voice among many but as the only voice.
The Enemy: Effective propaganda, for Goebbels, required a clearly defined enemy — a scapegoat onto whom all the anxieties and frustrations of a disoriented population could be projected. The Jews served this function in Nazi propaganda with terrifying efficiency. The enemy must be simultaneously dangerous (to justify extreme measures) and contemptible (to deny them moral standing).
Spectacle and aesthetics: Nazi rallies, films, architecture, and visual design were deliberately aestheticized — turned into overwhelming sensory experiences that bypassed critical thought. Leni Riefenstahl's Triumph of the Will (1935) represents the apex of this approach: a film that turns a political event into an aesthetic experience, creating emotional identification with the Nazi regime through pure cinematic power.
12.3.2 Psychological Mechanisms of Totalitarian Propaganda
The psychological mechanisms through which Nazi propaganda achieved its effects have been studied extensively by historians and psychologists. Several mechanisms deserve emphasis:
Identity fusion: Totalitarian propaganda fuses individual identity with collective identity — being German becomes inseparable from being Nazi. Any criticism of the regime becomes a self-criticism, creating psychological resistance to outside information.
Dehumanization and moral exclusion: Propaganda that successfully dehumanizes the target group — representing Jews as rats, parasites, or viruses — makes violence against them psychologically manageable for ordinary people. Moral exclusion (defining the target group as outside the moral community) is a prerequisite for atrocity.
Social proof and conformity pressure: When everyone around you appears to believe the propaganda, dissent becomes psychologically costly. The public performance of belief — the required salute, the mandatory participation in rallies — creates social pressure that reinforces conformity even among those who are privately skeptical.
Fear: Totalitarian propaganda combined positive messages (national greatness, Aryan community) with fear — of the enemy, of social exclusion, of the state. Fear narrows cognitive focus, reduces critical capacity, and increases receptivity to simple, authoritarian messages.
12.3.3 Soviet Dezinformatsiya
Soviet propaganda operated on two registers: domestic propaganda (shaping the beliefs of Soviet citizens) and foreign disinformation (dezinformatsiya — active measures targeting foreign populations).
Domestic Soviet propaganda shared many features with Nazi propaganda: simplification, repetition, elimination of competing media, construction of enemies (kulaks, imperialists, Trotskyites). But Soviet propaganda was distinctive in its emphasis on pseudo-scientific legitimacy — presenting Marxist-Leninist ideology as a scientific system, making opposition to it equivalent to opposition to science itself.
Foreign dezinformatsiya was the Soviet KGB's active measures program directed at foreign populations. Key techniques included:
- Document forgery: Creating fake documents attributed to foreign governments, intelligence agencies, or prominent figures to damage their credibility or manipulate policy.
- Media manipulation: Planting false stories in foreign media, often beginning in developing-world outlets and amplifying them until they reached major Western publications.
- Front organizations: Creating apparently independent organizations — peace movements, human rights groups, academic institutes — secretly controlled and funded by Soviet intelligence.
- Defector exploitation: Using genuine defectors' authentic knowledge of Western intelligence practices to lend credibility to false or exaggerated claims.
- Conspiracy narratives: Constructing and circulating conspiracy theories about Western governments that, even if not widely believed, served to muddy the information environment and reduce trust in Western institutions.
The most extensively documented Soviet disinformation operation — Operation INFEKTION — involved the systematic spread of the false claim that the United States government had created the AIDS virus at Fort Detrick, Maryland. Beginning in 1983 with a planted story in the Indian newspaper Patriot, the claim spread through developing-world and then Western media and came to be believed by millions of people. Contemporary variants of this claim persist in online disinformation ecosystems today.
Section 12.4: Cold War Propaganda — Psychological Operations
12.4.1 US Information Warfare
The United States created its own sophisticated propaganda apparatus during the Cold War, operating through both official and covert channels.
The United States Information Agency (USIA), established in 1953 under the Eisenhower administration, was the official US government body responsible for public diplomacy — communicating US values and policies to foreign audiences. The USIA produced films, magazines, exhibitions, and cultural exchanges designed to create positive associations with American democracy and free-market capitalism.
Voice of America (VOA), established during World War II and expanded during the Cold War, broadcast news and cultural programming to audiences behind the Iron Curtain. VOA's journalistic independence (formally maintained, if imperfectly) distinguished it from Soviet propaganda outlets in terms of credibility.
Radio Free Europe and Radio Liberty, funded covertly by the CIA until the early 1970s (and subsequently through congressional appropriations), were designed to provide news to Eastern European and Soviet audiences that their domestic state media suppressed. The services cultivated real credibility among their audiences through relatively accurate journalism — though critics noted that they also served US foreign policy interests.
Covert propaganda: Beyond official channels, the CIA ran numerous covert propaganda operations, including funding of cultural organizations (the Congress for Cultural Freedom), academic journals, artistic movements, and foreign political parties. The revelation of these operations in the 1960s and 1970s significantly damaged US credibility.
12.4.2 The Psychological Operations Toolkit
Both US and Soviet psychological operations ("psyops") drew on a common toolkit of techniques developed over decades of study and practice:
White, gray, and black propaganda: "White" propaganda is attributed propaganda — the communicator is identified. "Gray" propaganda has no clear attribution. "Black" propaganda is falsely attributed — it appears to come from a source other than the actual producer. Black propaganda is the most deceptive and the most powerful when it works.
Leaflet warfare: During World War II and Korea, millions of leaflets were dropped over enemy territory by both sides, containing appeals to surrender, claims about the war's progress, and targeted messages designed to undermine morale.
Radio warfare: Wartime radio operations included broadcasts from fake "enemy" stations, broadcasting demoralizing content attributed to enemy governments that had never produced it.
Rumor and disinformation: Strategic planting of rumors in enemy populations — about food shortages, military defeats, leader health, sexual scandals — designed to undermine morale and institutional trust.
Section 12.5: Modern Political Propaganda — Cambridge Analytica and Micro-Targeting
12.5.1 The Data-Driven Turn
Modern political propaganda represents a fundamental evolution in the techniques of political manipulation. The core innovation of the post-2010 period is the marriage of mass behavioral data (collected through social media platforms, consumer databases, and voter records) with the psychological insights of academic personality research, enabling propaganda to be precisely targeted to individual psychological vulnerabilities at mass scale.
12.5.2 Psychographic Targeting
The OCEAN model of personality (Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism) provides a framework for characterizing individual psychological profiles in ways that predict political attitudes and behavioral tendencies. Research by Michal Kosinski and colleagues at Cambridge demonstrated that OCEAN profiles could be predicted with high accuracy from Facebook "likes" — enabling the construction of personality profiles for hundreds of millions of people without their awareness or consent.
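To make the Kosinski-style approach concrete, the sketch below shows the core statistical idea in miniature: treating each user's "likes" as a binary feature vector and fitting a regularized linear model to predict a trait score. Everything here is synthetic and simplified — the page set, weights, and sample size are invented for illustration, and the actual research worked at vastly larger scale with dimensionality reduction before regression.

```python
# Toy illustration (not the actual Kosinski et al. pipeline): predicting a
# personality-trait score from binary "like" indicators via ridge regression.
import numpy as np

rng = np.random.default_rng(0)

n_users, n_pages = 500, 40
# Each row is one user; each column is 1 if the user "liked" that page.
likes = rng.integers(0, 2, size=(n_users, n_pages)).astype(float)

# Hypothetical ground truth: only a few pages carry signal about the trait.
true_weights = np.zeros(n_pages)
true_weights[:5] = [0.8, -0.6, 0.5, 0.4, -0.3]
trait = likes @ true_weights + rng.normal(0.0, 0.5, n_users)  # noisy observations

# Ridge regression, closed form: w = (X^T X + lambda*I)^{-1} X^T y
lam = 1.0
w = np.linalg.solve(likes.T @ likes + lam * np.eye(n_pages), likes.T @ trait)

pred = likes @ w
corr = np.corrcoef(pred, trait)[0, 1]  # how well likes predict the trait
```

The unsettling point the research made is that nothing in this pipeline requires the user's cooperation: the feature vector is assembled from public behavior, and the model generalizes to anyone whose likes are observable.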
Cambridge Analytica, a political data firm, claimed to have used psychographic profiles to deliver precisely targeted political advertising to individual voters during the 2016 US presidential election and the Brexit referendum. The firm's claimed capabilities — delivering different messages to high-neuroticism, low-agreeableness voters versus high-openness, high-conscientiousness voters — represented a fundamental evolution in propaganda technique: mass personalization.
The full scope of what Cambridge Analytica actually delivered (as opposed to what it claimed) remains contested. The firm's claims about its capabilities were themselves propagandistic — overstated for commercial purposes. What is established:
- The firm harvested Facebook user data for tens of millions of users without adequate consent, through a personality quiz app that collected data not just from users but from their social networks.
- The firm used this data to construct psychological profiles and targeting models for political advertising.
- The firm worked for the Trump campaign, the Brexit Leave campaign, and numerous other political clients.
- The firm's CEO, Alexander Nix, boasted in secretly recorded footage about using entrapment operations, fabricated "deep state" opposition material, and coordinated fake grassroots campaigns on behalf of political clients.
The Cambridge Analytica scandal prompted widespread regulatory and policy responses, including GDPR enforcement actions, platform data access restrictions, and congressional hearings.
12.5.3 The Structure of Modern Political Advertising
Modern political advertising operates in an information environment fundamentally different from the broadcast era:
Micro-targeting: Political advertisements can be targeted to specific demographic slices — age, location, income, political history, consumer behavior — far more precisely than broadcast or even cable television allowed.
Dark ads: On platforms like Facebook, political advertisers can purchase "dark ads" — advertisements visible only to their targets and not to researchers, journalists, or regulators. This advertising opacity makes systematic study and accountability very difficult.
A/B testing at scale: Political campaigns routinely run dozens or hundreds of variants of the same message simultaneously, using algorithmic optimization to identify the most engaging version. This iterative refinement process means that political propaganda is continuously optimized for emotional impact.
Algorithmic amplification: Social media algorithms that prioritize high-engagement content systematically amplify political content that generates strong emotional responses — anger, fear, outrage — regardless of its accuracy. Political content creators have learned to craft messages that maximize algorithmic amplification.
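The optimization loop described above can be sketched as a simple epsilon-greedy bandit: show each message variant, record engagement, and steer future impressions toward whatever engages most. The variant names and engagement rates below are invented for illustration; production systems run far more sophisticated versions of this logic at much larger scale.

```python
import random

# Minimal epsilon-greedy sketch of engagement-optimized message testing.
class VariantOptimizer:
    def __init__(self, variants, epsilon=0.1, rng=None):
        self.stats = {v: {"shown": 0, "engaged": 0} for v in variants}
        self.epsilon = epsilon
        self.rng = rng or random.Random()

    def choose(self):
        # Explore occasionally; otherwise exploit the best observed rate.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.stats))
        return max(self.stats, key=self._rate)

    def record(self, variant, engaged):
        self.stats[variant]["shown"] += 1
        self.stats[variant]["engaged"] += int(engaged)

    def _rate(self, v):
        s = self.stats[v]
        # Optimistic default so every variant gets tried at least once.
        return s["engaged"] / s["shown"] if s["shown"] else 1.0

opt = VariantOptimizer(
    ["fear_framing", "hope_framing", "outrage_framing"],
    rng=random.Random(42),
)
# Simulated "true" engagement rates, invented for the example: the
# outrage variant happens to engage most, so it gets amplified.
true_rates = {"fear_framing": 0.05, "hope_framing": 0.03, "outrage_framing": 0.12}
for _ in range(2000):
    v = opt.choose()
    opt.record(v, opt.rng.random() < true_rates[v])
best = max(opt.stats, key=lambda v: opt.stats[v]["shown"])
```

Note what the loop optimizes: engagement, and only engagement. Nothing in the objective rewards accuracy, which is the structural point the section makes about algorithmic amplification.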
Section 12.6: Visual Propaganda — Image Manipulation and Meme Warfare
12.6.1 The Semiotics of Propaganda Imagery
Visual propaganda exploits the distinctive features of image-based communication:
Speed: Images are processed faster than text. An emotionally charged image can trigger a response — fear, outrage, desire, disgust — before conscious evaluation can intervene.
Apparent indexicality: Photographs carry a cultural presumption of "being there" — of showing what actually happened. This presumption of evidential status makes photographic imagery particularly powerful as propaganda, since false or decontextualized images are automatically granted the credibility of "evidence."
Non-propositional content: Images convey meaning without making explicit propositional claims. A photograph can create a powerful impression without making any statement that could be factually evaluated or rebutted. This non-propositional character makes visual propaganda legally and logically difficult to challenge.
Aesthetic experience: Visual propaganda, particularly at its most sophisticated (Riefenstahl's films, Soviet poster art, Nazi architecture), creates aesthetic experiences that embed political content in a framework of beauty, grandeur, or sublimity that activates emotional rather than rational responses.
12.6.2 Classic Visual Propaganda Techniques
Juxtaposition: Placing images of a political enemy alongside images of disease, vermin, filth, or decay to create subconscious negative associations without making explicit claims.
Scale and perspective: Nazi architects deliberately designed buildings and spaces to dwarf individual human figures, creating a visual rhetoric of state power overwhelming individual agency.
Selective cropping: Cropping photographs to remove context (protestors surrounding an aggressor, for example) to change their apparent meaning. This is the visual equivalent of false context (Type 5 in the Wardle-Derakhshan taxonomy).
Color and light: Nazi and Soviet propaganda both made systematic use of light — particularly back-lighting and golden light — to create associations of transcendence, divinity, and power around political figures.
Idealized typification: Propaganda imagery typically depicts not real individuals but idealized types — the noble worker, the pure-bred soldier, the dangerous enemy. Typification strips out individuality and complexity, reducing people to category memberships.
12.6.3 Meme Warfare in the Digital Age
The internet meme is the contemporary form of visual propaganda: a combination of image and text designed for rapid sharing, easy comprehension, and high emotional impact. Political memes function as propaganda through several mechanisms:
Humor as Trojan horse: Comedy lowers the audience's critical guard. Funny political memes activate pleasure responses that create positive associations with the political content, while the humor format signals that the content is "just a joke" — making it harder to criticize.
In-group signaling: Sharing a political meme is a social act — a declaration of group membership. Memes create shared cultural references that mark the boundaries of political communities, making them more about identity than argument.
Iterative amplification: Successful memes are modified, adapted, and re-shared by many users, with each iteration spreading the underlying political message to new audiences while appearing to be independent, organic content.
False equivalences: Memes routinely create false equivalences through visual juxtaposition — placing images of two unrelated things side by side to imply a comparison or connection that would not survive verbal articulation.
Decontextualization: Like false context (Type 5) in the information disorder taxonomy, political memes routinely strip images of their original context and recaption them with false attributions.
The "Great Meme War" — a term used by alt-right internet communities to describe their memetic propaganda campaign during the 2016 US election — illustrated that meme production and distribution could constitute a form of organized political propaganda with real-world effects.
Section 12.7: The RESIST Counter-Propaganda Framework
12.7.1 From Recognition to Action
Recognizing propaganda techniques is necessary but not sufficient for resisting propaganda. The RESIST framework, developed by the UK Government Communication Service (GCS) and adapted by various media literacy organizations, provides a structured approach to countering propaganda at both the individual and systemic levels.
R — Recognize: The first step is recognizing content as propaganda — identifying the specific techniques being deployed. This requires familiarity with the IPA seven techniques, an understanding of visual propaganda, and awareness of modern digital manipulation tactics. Recognition alone activates the analytical thinking that propaganda attempts to bypass.
E — Examine: Critically examine the content's claims, sources, and techniques. Who created this? What evidence is cited? Are the sources verifiable? What techniques are being used? What is the communicator's interest in persuading you? Examination involves the full toolkit of source verification: lateral reading, reverse image search, domain analysis, and fact-checking.
S — Source-check: Verify the source of the content independently. A claim is not more credible because the source appears authoritative; apparent authority must be verified. This includes checking whether a source is genuine (vs. imposter) and whether it has a track record of accuracy.
I — Identify techniques: Explicitly identify which propaganda techniques are present. Naming the technique — "this is bandwagon," "this is name-calling," "this is card stacking" — activates the metacognitive awareness that reduces the technique's effectiveness. Research suggests that explicitly labeling a manipulation technique reduces its impact.
S — Stop: Stop before sharing. The most impactful individual action in the information ecosystem is the decision not to amplify propaganda content. Each share decision is an opportunity to break the propagandistic chain. Pausing before sharing — even briefly — significantly reduces impulsive re-sharing of emotionally charged content.
T — Teach: Share your knowledge of propaganda techniques with others. Propaganda thrives in information environments where audiences are credulous and uncritical. Each person who understands and can recognize propaganda techniques becomes a local inoculation node, spreading resistance rather than credulity through their social networks.
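The six steps above can be encoded as a simple pre-sharing checklist that reports which steps remain unaddressed. The prompt wording and the pass/fail simplification below are written for illustration and are not part of the framework itself.

```python
# The RESIST steps as a reviewable checklist. Prompt text is an
# illustrative paraphrase of the framework's six steps.
RESIST_STEPS = [
    ("Recognize", "Did you identify the content as potential propaganda?"),
    ("Examine", "Did you check the claims, the evidence, and the author's interest?"),
    ("Source-check", "Did you independently verify the source?"),
    ("Identify techniques", "Did you name the specific techniques in use?"),
    ("Stop", "Did you pause before sharing?"),
    ("Teach", "Have you shared what you found with others?"),
]

def resist_review(answers):
    """Return the step names still unaddressed, given {step: bool} answers."""
    return [name for name, _prompt in RESIST_STEPS if not answers.get(name, False)]

answers = {"Recognize": True, "Examine": True, "Stop": True}
remaining = resist_review(answers)
# remaining → ["Source-check", "Identify techniques", "Teach"]
```

Even as a toy, the checklist makes the section's later point concrete: applying all six steps to every item in a feed of hundreds per day is not a realistic cognitive budget.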
12.7.2 Limitations of Individual Counter-Propaganda Approaches
The RESIST framework correctly emphasizes individual agency in countering propaganda. But individual counter-propaganda capacity is limited by structural factors that individual awareness cannot overcome:
Time and cognitive load: The deliberate evaluation that RESIST recommends is cognitively demanding. In an information environment that presents hundreds of items daily, comprehensive application of the framework is impossible. Cognitive resources are finite.
The asymmetry of creation and evaluation: Creating propaganda is cheap; evaluating it is expensive. A meme that takes seconds to create can take minutes to evaluate properly. At scale, this asymmetry overwhelmingly favors propaganda producers.
Social context effects: Individual counter-propaganda capacity is undermined in social contexts where propaganda belief is the norm. When your social network, community, or family all believe a propaganda narrative, individual resistance is socially costly.
Platform design: The design of social media platforms — algorithmic amplification of emotionally engaging content, default-on sharing features, like and share counts as social proof — is systematically aligned with propaganda spread rather than critical evaluation.
These structural factors suggest that individual media literacy, while necessary, is insufficient without complementary structural interventions: platform design changes, algorithmic transparency requirements, political advertising disclosure rules, and media regulation.
Section 12.8: Propaganda vs. Persuasion vs. Education — The Ethical Line
12.8.1 The Spectrum of Influence
The concepts of propaganda, persuasion, education, advocacy, and public relations exist on a spectrum of influence rather than in sharply defined separate categories. Understanding where any particular communication falls on this spectrum requires attending to several dimensions:
Accuracy: Does the communication accurately represent the facts, including inconvenient facts that complicate the message?
Transparency: Is the communicator's identity, interest, and motivation disclosed?
Respect for autonomy: Does the communication engage the recipient's rational capacity, or does it seek to bypass it through emotional manipulation, cognitive bias exploitation, or social pressure?
Reciprocity: Would the communicator apply the same communication norms to messages directed at themselves? (Would the propagandist accept the same techniques if applied by the opposing side?)
Public interest: Does the communication serve a genuine public interest, or does it primarily serve the interests of the communicator at the expense of the recipient?
12.8.2 Public Relations and Advocacy
Public relations — the organized management of communication between an organization and its publics — occupies an ambiguous position on the propaganda-persuasion spectrum. At its best, PR involves accurate communication that honestly represents an organization's position and addresses legitimate concerns. At its worst, PR is sophisticated propaganda — manufacturing consent through Bernays's techniques of agenda-setting, front organization creation, and media management.
The tobacco industry's funding of "scientific controversy" about smoking's health effects is the paradigm case of propaganda disguised as public interest research. The industry funded scientists to produce doubt rather than knowledge, giving the appearance of genuine scientific debate where consensus existed, in order to delay regulatory action. This model — creating manufactured controversy — has been explicitly replicated by industries facing regulation on climate change, chemical safety, pharmaceutical side effects, and other public health issues.
Advocacy — communicating on behalf of a cause or position one genuinely supports — occupies different moral space than PR for hire. Advocacy is not inherently propaganda, even when it is rhetorically sophisticated. The key question is whether advocacy accurately represents its position and its evidence, discloses its nature and motivation, and respects the rational agency of its audience.
Political advertising is perhaps the most contested domain. Most political advertising is card stacking at minimum, and much is name-calling, false transfer, and emotional manipulation. Yet political advertising is protected as core political speech in many democratic systems, on the theory that the marketplace of competing political messages provides an adequate check. The adequacy of this check in the era of dark ads, micro-targeting, and algorithmic amplification is increasingly questioned.
12.8.3 Education and Propaganda
The distinction between education and propaganda is fundamental to democratic theory but genuinely difficult to maintain in practice. Legitimate education:
- Presents multiple perspectives and genuine disagreements in scholarly understanding
- Encourages students to evaluate evidence and reach their own conclusions
- Distinguishes between established fact and contested interpretation
- Maintains epistemic humility about its own limitations
Propaganda disguised as education:
- Presents a single perspective as obvious truth
- Discourages questioning of foundational premises
- Conflates factual claims with ideological positions
- Treats the educator's conclusions as the natural result of learning rather than one position among several
The line between education and propaganda is contested by different political communities, who often accuse each other's educational content of being propagandistic. A useful test: does the educational content prepare students to evaluate the curriculum itself critically? Genuine education enables critique of itself; propaganda is resistant to the critical tools it claims to provide.
Key Terms
Propaganda: Organized communication designed to shape beliefs and behavior, typically by bypassing rational deliberation through emotional manipulation, cognitive bias exploitation, or information control.
Name-Calling: Attaching a negative label to a person or position to substitute emotional reaction for rational evaluation.
Glittering Generalities: Associating a person or position with positively loaded abstract words that trigger approval without argument.
Transfer: Exploiting the authority of respected symbols, institutions, or figures by associating them with a propaganda target.
Testimonial: Having a respected figure endorse a product or position outside their area of expertise.
Plain Folks: Associating a political figure or position with ordinary working people to create identification.
Card Stacking: Presenting only evidence that supports one's position while suppressing contrary evidence.
Bandwagon: Appealing to social conformity — everyone believes this, so you should too.
Dezinformatsiya: Soviet/Russian intelligence term for active measures operations involving false information.
Psychographic targeting: Using psychological profile data (personality traits, values, interests) to deliver highly personalized political messages.
Dark ads: Political advertising visible only to targeted recipients and not publicly visible or archived.
Meme warfare: The systematic production and distribution of political memes as propaganda.
Manufacturing controversy: The industry technique of funding scientific-seeming research to create the appearance of debate about settled scientific questions.
RESIST framework: A counter-propaganda framework: Recognize, Examine, Source-check, Identify techniques, Stop (before sharing), Teach others.
The engineering of consent: Bernays's phrase for the organized management of public opinion in mass society.
Inoculation theory: The theory that pre-exposing audiences to weakened propaganda techniques builds resistance to actual propaganda.
Callout Boxes
Primary Source 12.1: Goebbels on Propaganda "We do not talk to say something, but to obtain a certain effect." — Joseph Goebbels, 1934. This statement captures the essence of Goebbels's approach to communication: the goal is effect, not truth. It stands in direct contrast to the norms of scientific discourse, journalism, and education — all of which treat accuracy as a constraint on communication rather than an optional feature.
Research Spotlight 12.1: The Illusory Truth Effect and Propaganda Experimental research consistently finds that repeated exposure to political propaganda claims increases their perceived truth even when subjects are told the claims are false. This "illusory truth effect" has important implications for counter-propaganda strategies: simply repeating a claim in order to debunk it may inadvertently reinforce it. Research by Pennycook et al. (2020) suggests that "accuracy nudges" — prompts that activate critical thinking before exposure — are more effective than after-the-fact corrections.
Ethics Corner 12.1: Persuasion or Manipulation? The line between persuasion and propaganda is easier to state in principle than to apply in practice. Consider: Is a political campaign advertisement that emphasizes a candidate's military service record "transfer" propaganda, or legitimate evidence of relevant experience? Is a non-profit advocacy organization that uses emotionally compelling imagery of suffering children to fundraise engaging in emotional manipulation or legitimate communication of genuine facts? These questions do not have simple answers. The key diagnostic questions are: Is the information accurate? Is the communicator's motivation disclosed? Is the recipient's rational agency being engaged or bypassed?
Discussion Questions
- Edward Bernays argued that the "engineering of consent" is a necessary feature of mass democracy — that complex societies cannot function through genuine citizen deliberation. Do you agree? Is there a version of this argument that is compatible with democratic values? What would "genuine deliberation" look like in a mass society?
- The seven IPA techniques were identified in 1938. Which of the seven do you believe is most prevalent in contemporary political communication? Which has evolved most significantly in the digital era? Which (if any) do you believe can be used legitimately?
- Compare Nazi propaganda's techniques with contemporary political propaganda. What are the important similarities? What are the important differences? Does the comparison trivialize contemporary propaganda, or does it illuminate genuine continuities?
- Cambridge Analytica's psychographic targeting represented (or claimed to represent) a new frontier in propaganda precision. Does individualized targeting change the moral character of propaganda? Is "personalized" manipulation more or less harmful than broadcast manipulation?
- The RESIST framework asks individuals to "stop before sharing." Is individual restraint an adequate response to the structural propaganda dynamics created by social media platforms? What structural interventions would complement individual counter-propaganda capacity?
- Where would you draw the line between legitimate political advocacy and propaganda? Consider a cause you strongly support — could your own communication about that cause be characterized as propaganda by some of the criteria discussed in this chapter?
Summary
This chapter traced propaganda from its etymological roots through the systematic techniques of the 20th century's most destructive ideological movements, the Cold War's institutional propaganda apparatus, and the data-driven, algorithmically amplified propaganda of the digital age.
The foundational conceptual distinction is between legitimate persuasion — which engages the recipient's rational agency — and propaganda/manipulation — which bypasses it through emotional exploitation, cognitive bias targeting, and information control. This distinction is contested in specific cases but provides the necessary ethical framework for evaluating communication.
The IPA's seven techniques — name-calling, glittering generalities, transfer, testimonial, plain folks, card stacking, and bandwagon — remain the foundational taxonomy, even as the digital environment has created new vectors for each.
Totalitarian propaganda — particularly the Nazi model under Goebbels — demonstrated that systematic, state-controlled propaganda could transform democratic societies with terrifying speed. Its psychological mechanisms — identity fusion, dehumanization, social proof, and fear — are not historically unique but are available to any sufficiently motivated and resourced actor.
Modern propaganda has evolved into a data-driven, personalized, algorithmically amplified apparatus that operates at previously impossible scale and precision. The Cambridge Analytica case illustrated the convergence of academic personality research, social media data, and political consulting into something genuinely new in the history of political manipulation.
The RESIST framework provides a starting point for individual counter-propaganda practice, while acknowledging that individual awareness is insufficient without structural reform of the information environments in which propaganda operates.
Next: Chapter 13 examines conspiracy theories — their psychological roots, structural features, and the particular challenges they pose for debunking and counter-messaging.