> "Falsehood flies, and the truth comes limping after it." — Jonathan Swift, The Examiner, 1710
In This Chapter
- Learning Objectives
- Introduction
- Section 2.1: Ancient Misinformation — Rumor, Propaganda, and the Early State
- Section 2.2: The Printing Press Revolution
- Section 2.3: Yellow Journalism and Mass Media
- Section 2.4: Radio, Film, and Totalitarian Propaganda
- Section 2.5: The Television Age
- Section 2.6: The Internet Era
- Section 2.7: Social Media and the Modern Infodemic
- Section 2.8: Historical Patterns and Lessons
- Key Terms Glossary
- Discussion Questions
- References and Notes
Chapter 2: The History of Misinformation — From Rumor to the Internet Age
Learning Objectives
By the end of this chapter, students will be able to:
- Trace the history of misinformation from ancient civilizations through the digital age, identifying continuities and discontinuities across historical periods.
- Analyze how changes in communication technology — from oral culture to print, broadcast, and digital media — transformed the scale, speed, and character of misinformation.
- Describe key historical examples of organized propaganda, including state-sponsored campaigns, religious misinformation, and wartime deception.
- Explain how yellow journalism, sensationalism, and media competition created conditions for misinformation spread in the nineteenth and twentieth centuries.
- Evaluate the role of totalitarian regimes in developing systematic propaganda techniques and assess the mechanisms by which propaganda shapes public belief.
- Compare the mechanisms and reach of misinformation in different media eras: print, radio/film, television, and internet.
- Identify the structural features of social media platforms that enable the "infodemic" — the rapid, global spread of false and misleading health and political information.
- Apply historical perspective to understand current misinformation challenges, recognizing both the novelty of the digital information environment and the persistence of underlying human vulnerabilities.
- Construct historical arguments about patterns in misinformation, drawing on evidence from multiple periods and geographies.
Introduction
Misinformation is not a product of the internet age. It is not a symptom of declining education, deteriorating public discourse, or the uniquely dysfunctional character of contemporary politics. Misinformation is, in fact, as old as human communication itself. Where there are people with competing interests, imperfect information, and the capacity to speak and write, there will be false, misleading, and strategically deceptive claims.
What changes across history is not whether misinformation exists but how it spreads, who spreads it, at what scale and speed, and with what effects. Each revolution in communication technology — the invention of writing, the printing press, the telegraph, radio, television, and the internet — has transformed the misinformation landscape, typically by dramatically expanding reach and reducing the friction of spreading false claims, while simultaneously creating new opportunities for verification and correction.
Understanding this history is essential for several reasons. First, it provides perspective: the current moment, while genuinely challenging, is not categorically unprecedented. Societies have navigated serious misinformation crises before, and the historical record offers lessons about what has and hasn't worked in response. Second, it reveals patterns: certain conditions consistently generate misinformation (political conflict, war, economic anxiety, technological disruption), certain techniques recur across centuries (false attribution, manufacturing consensus, exploiting fear), and certain institutions consistently mitigate or amplify misinformation (free press, public education, credible authority). Third, history reveals the deep roots of our current crisis: many of the most important misinformation vectors of the digital age — coordinated inauthentic behavior, manufactured doubt, propaganda — were not invented online. They were refined over decades and centuries before being digitized.
This chapter surveys the history of misinformation from antiquity to the present. It is necessarily selective — a complete history would fill multiple volumes — but aims to trace the most significant developments, highlight the most instructive cases, and identify the underlying patterns that illuminate our current challenges.
Section 2.1: Ancient Misinformation — Rumor, Propaganda, and the Early State
The Antiquity of False Claims
The oldest written records already contain propaganda. The Behistun Inscription (circa 515 BCE), a massive trilingual monument carved into a cliff in present-day Iran by Darius I of Persia, is one of the earliest large-scale examples of what we would now call political propaganda. Darius used the inscription — placed where caravans and armies would see it — to legitimize his seizure of power after a period of dynastic chaos, presenting his conquest of rivals as the restoration of rightful order. The inscription is not straightforwardly false, but it is highly selective: it frames events to glorify Darius and delegitimize opponents in ways that modern historians recognize as self-serving distortion.
Ancient Egypt provides even earlier examples. Pharaohs routinely erased their predecessors' names from monuments and replaced them with their own, rewriting history in stone. Rameses II's accounts of the Battle of Kadesh (circa 1274 BCE) against the Hittites are narrated as a glorious Egyptian victory — when the historical evidence strongly suggests it was at best a draw, with a treaty negotiated on relatively even terms.
2.1.1 Rumors in Ancient Greece and Rome
Ancient Greek and Roman culture generated rich documentation of misinformation, primarily in the form of rumor (pheme in Greek, fama in Latin). Greek historians like Thucydides — himself concerned with distinguishing accurate history from myth and unreliable report — noted how rumors in Athens distorted public understanding of distant military campaigns.
The Roman republic and early empire were particularly sophisticated environments for political misinformation. Roman political life was characterized by intense competition for public office and imperial favor, creating strong incentives for defamation, rumor-mongering, and propaganda. Mark Antony and Octavian (later Augustus Caesar) waged extended campaigns of character assassination against each other in the years after Julius Caesar's assassination. Octavian deployed the poet Virgil, the historian Livy, and the satirist Horace to advance a cultural program presenting his rule as the restoration of traditional Roman values — a form of soft propaganda that shaped Roman culture for generations.
The Roman satirist Juvenal coined the phrase panem et circenses (bread and circuses) to describe how the Roman ruling class distracted the populace from political realities with food distributions and public spectacles. While we might debate whether this constitutes "misinformation" in a narrow sense, it describes a sophisticated strategy of managing public attention and perception through spectacle rather than truth.
💡 Intuition Pump: The Persistent Propaganda Formula
Note the elements of Roman political propaganda: appeal to tradition (we are restoring old values, not doing something new); character assassination of opponents; control of cultural production (poets, historians, artists); and management of public attention through spectacle. Each of these elements appears in recognizable form in contemporary political communication. The technology changes; the basic strategies persist.
2.1.2 Athenian Courts and the Art of Persuasion
The Athenian legal system depended on jury persuasion, and this created a market for professional speechwriters (logographoi) and for the study of rhetoric. The sophist movement — associated with figures like Protagoras, Gorgias, and Thrasymachus — developed highly sophisticated techniques for making weak arguments appear strong, a capability Plato found deeply threatening to truth and democratic governance.
Plato's dialogues, especially Gorgias, Phaedrus, and the Republic, are in significant part a response to the problem of persuasion without truth. Plato was deeply concerned that the democratic city would be governed by whoever was best at public performance rather than whoever knew the truth. His critique of the sophists is, in some respects, an ancient critique of what we would now call post-truth politics: the prioritization of persuasion over accuracy.
Aristotle's Rhetoric took a more pragmatic approach: understanding rhetorical technique was necessary precisely in order to defend against it. He distinguished between legitimate rhetorical appeals (to character, to emotion appropriately deployed, and to argument) and illegitimate manipulation. This distinction — between honest persuasion and deceptive manipulation — is as relevant today as it was in fourth-century BCE Athens.
2.1.3 Wartime Deception and Propaganda in the Ancient World
Military deception is as old as warfare. Sun Tzu's The Art of War (circa 5th century BCE) places deception at the center of military strategy: "All warfare is based on deception." The Trojan Horse story — whatever its historical basis — represents the Western world's archetypal image of strategic deception.
Ancient armies routinely used psychological operations alongside military force: false rumors about the size of forces, fabricated intelligence about supply routes, and deliberate misinformation about commanders' intentions. Julius Caesar's Gallic Wars describes multiple instances of strategic misinformation in military campaigns.
What distinguished ancient from modern misinformation was primarily scale and speed. In the ancient world, misinformation traveled at the speed of the fastest messenger. A false rumor could spread across a city in hours but across an empire in weeks or months. The audiences for propaganda were limited by illiteracy, geographic constraint, and the bottleneck of manuscript reproduction. These were not trivial limitations.
Section 2.2: The Printing Press Revolution
Gutenberg and the Information Explosion
Johannes Gutenberg's development of movable type printing in Europe around 1440–1450 is often described as the most important communication revolution before the internet. It dramatically reduced the cost of producing texts, allowed identical copies to be produced in large numbers, and eventually broke the Church's near-monopoly on literate culture.
The printing press enabled the Reformation. Martin Luther's 95 Theses (1517) spread across Germany in weeks through print — a dissemination speed previously unimaginable. By 1520, his writings had been read across much of Europe. The Reformation could not have happened at the scale it did without print.
But print also enabled spectacular misinformation campaigns. The same technology that spread Luther's theological arguments spread vicious anti-Semitic woodcuts, false accounts of Papal corruption, and counter-Reformation propaganda that the Church produced in response. From the beginning, the democratization of information was inseparable from the democratization of misinformation.
2.2.1 Pamphlet Wars and Religious Propaganda
The sixteenth and seventeenth centuries saw the emergence of the pamphlet as the primary vehicle for political and religious argument — and misinformation. Pamphlets were cheap to produce, anonymous or pseudonymous, and could be printed, distributed, and read without institutional mediation. They were, in many respects, the social media of their era: participatory, low-cost, high-volume, and largely unfiltered.
The pamphlet wars of the English Civil War (1642–1651) are a particularly well-documented example. Both Royalists and Parliamentarians produced torrents of pamphlets containing not only argument but fabricated speeches, forged documents, invented atrocities, and character assassination. The newsbooks and mercuries of the period — early newspapers — mixed genuine reporting with partisan invention in ways that made them unreliable witnesses even by contemporaries' standards.
Religious propaganda was often indistinguishable from religious argument. Each side in the Reformation and Counter-Reformation accused the other of heresy, conspiracy, and moral depravity, frequently on fabricated evidence. The Blood Libel — the false accusation that Jewish communities used the blood of Christian children in religious rituals — was given new life by print technology, which allowed the slander to spread far faster and more widely than manuscript transmission had permitted.
📊 Real-World Application: The Origins of Fake News
The term "fake news" is often treated as a modern coinage, but deliberately fabricated news stories are as old as the news industry itself. In the late 18th and early 19th centuries, newspapers commonly fabricated letters, interviews, and reports, often without any acknowledgment that they were fictional. In 1782, Benjamin Franklin printed a forged supplement to a real Boston newspaper, the Independent Chronicle, describing invented atrocities committed by Britain's American Indian allies. The line between opinion, satire, and deliberate fabrication was extremely porous in early print culture.
2.2.2 The Role of Cheap Print in Democratizing Misinformation
By the eighteenth century, falling print costs had created a mass reading public in Europe and North America, and with it, a mass market for sensation, rumor, and partisan fabrication. Revolutionary France saw an extraordinary explosion of pamphlet literature attacking the monarchy, aristocracy, and Church — much of it consisting of invented scandals and defamation.
The pamphleteer as a figure occupied an ambiguous position: a potential vector for both political liberation and irresponsible invention. Jean-Paul Marat's L'Ami du peuple (The Friend of the People) mixed genuine radical political argument with incitement and false accusations that contributed to the Terror's political violence. The revolutionary press both enabled democratic participation and generated the atmosphere of suspicion and accusation that made mass political violence possible.
This historical pattern — cheap, accessible media enabling both democratic participation and dangerous misinformation — recurs at every subsequent stage of media technology development. The internet is not exceptional in combining these features; it is the most recent and most powerful instance of a recurring dynamic.
Section 2.3: Yellow Journalism and Mass Media
The Penny Press and the Birth of Mass Readership
The mid-nineteenth century saw another revolution in print: the development of cheap, high-circulation newspapers aimed at mass urban audiences. The "penny press" (newspapers sold for one cent) transformed journalism from an elite to a popular institution. Papers like Benjamin Day's New York Sun (founded 1833) and James Gordon Bennett's New York Herald competed for mass readership by emphasizing crime, scandal, entertainment, and sensation.
The pressure of competition for mass audiences created powerful incentives for sensationalism that remain recognizable today: dramatic headlines, emotionally compelling narratives, simple moral frames, and a preference for conflict over complexity. The penny press was not simply irresponsible; it also democratized access to information, covered politics for audiences who had previously been excluded, and introduced investigative reporting as a journalistic form. But the competitive pressure for circulation created an environment in which accuracy was frequently sacrificed for appeal.
2.3.1 Yellow Journalism and the Spanish-American War
The most celebrated (and debated) episode in the history of sensationalist journalism is the role of William Randolph Hearst's New York Journal and Joseph Pulitzer's New York World in the events leading to the Spanish-American War of 1898.
The two newspapers were locked in fierce circulation competition and both covered the Cuban struggle for independence from Spain with vivid, emotionally charged, and often inaccurate reporting. Stories of Spanish brutality were real — the Spanish colonial administration under Governor-General Valeriano Weyler ("Butcher Weyler" in the American press) did engage in brutal counterinsurgency tactics — but were often embellished or fabricated for maximum emotional impact.
The most famous story from this period is probably apocryphal: Hearst's illustrator Frederic Remington allegedly cabled from Cuba that nothing was happening, and Hearst supposedly replied, "You furnish the pictures, and I'll furnish the war." No evidence that this exchange occurred has ever been found, and historians debate how far the press actually caused the war rather than reflecting and amplifying public sentiment that was already tilting toward intervention. What is not in dispute is that the yellow press created a political atmosphere in which the explosion of the USS Maine in Havana Harbor in February 1898 could be exploited to agitate for war. "REMEMBER THE MAINE! TO HELL WITH SPAIN!" became a rallying cry despite the absence, then or since, of evidence that Spain was responsible for the explosion.
⚠️ Common Pitfall: The "Media Made It Happen" Fallacy
The Hearst-Pulitzer story is often used to argue that media can single-handedly cause wars and other major events. This oversimplifies a complex causal picture. Media shapes the information environment in which publics and governments make decisions, but it does so in interaction with genuine events, existing political pressures, economic interests, and institutional dynamics. Attributing the Spanish-American War wholly to yellow journalism ignores the genuine political pressures for intervention and treats the public as infinitely malleable. The truth is more nuanced: media can amplify, distort, and accelerate trends, but it typically does not act alone.
2.3.2 The Professionalization of Journalism and Its Limits
The excesses of yellow journalism contributed to a counter-movement: the professionalization of journalism in the early twentieth century. Organizations like the American Society of Newspaper Editors (ASNE, founded 1922) developed professional codes of ethics emphasizing accuracy, fairness, and independence. Schools of journalism were established at universities. The professional journalist as a figure — trained, credentialed, institutionally accountable — emerged as a counter to the sensationalist penny-press tradition.
This professionalization was genuine and consequential. American journalism in the mid-twentieth century, for all its limitations, maintained standards that meaningfully constrained the most egregious fabrication. But professionalization also had limits: it sometimes produced its own orthodoxies and blind spots, constrained certain voices and perspectives, and was vulnerable to manipulation by sophisticated public relations practitioners — particularly those working for government and corporate interests.
Edward Bernays, nephew of Sigmund Freud and founder of the modern public relations industry, argued explicitly in his 1928 book Propaganda that public opinion could and should be engineered by experts. Bernays developed techniques for manufacturing apparent public consensus, placing stories with journalists, and using third-party validators to make corporate and government interests appear to be organic public sentiment. The PR industry he pioneered created systematic mechanisms for introducing false or misleading information into ostensibly legitimate media channels — a vector that remains highly effective.
Section 2.4: Radio, Film, and Totalitarian Propaganda
The New Mass Media
Radio and cinema emerged as mass media in the early twentieth century with properties fundamentally different from print. Print required literacy, active engagement, and at least minimal individual processing time. Radio and film reached audiences with emotional immediacy, combining verbal and visual or sonic persuasion in ways that bypassed some of print's cognitive mediation. They were also broadcast media — information flowed from a small number of centralized producers to vast audiences, with no mechanism for audience response or correction.
These properties made radio and film enormously powerful instruments for both democratic communication and totalitarian control. Franklin D. Roosevelt's radio "fireside chats" built a sense of intimate connection with millions of Americans who had never been near Washington. Adolf Hitler's amplified mass rallies, broadcast to millions of Germans who couldn't attend, created the experience of participating in a massive unified movement — even for listeners sitting alone in their living rooms.
2.4.1 Goebbels and the Nazi Propaganda Machine
Joseph Goebbels, appointed Minister of Public Enlightenment and Propaganda by Hitler in 1933, created the most sophisticated state propaganda apparatus in history to that point. The Nazi propaganda machine controlled newspapers, radio, film, theater, visual art, music, and public education — creating an information environment in which contrary perspectives were not merely suppressed but made nearly unimaginable.
Goebbels' approach combined several elements that remain instructive:
The Big Lie (Große Lüge): A concept Hitler described in Mein Kampf (where, in a characteristic inversion, he accused Jews of practicing it), the "big lie" refers to a falsehood so large that the ordinary person cannot believe anyone would dare invent something so enormous. The Nazi use of massive, audacious falsehoods — particularly about Jewish people and about Germany's historical grievances — exploited the human tendency to find credibility in proportion to a claim's apparent magnitude.
Repetition and saturation: Goebbels understood that repeated exposure increases perceived credibility (what psychologists would later call the "illusory truth effect"). Key Nazi messages were repeated across all media, in all contexts, at every level of culture — until they became the water in which Germans swam.
Emotional engagement over rational argument: Goebbels was explicit that propaganda should aim at emotion rather than intellect. Mass rallies, martial music, visual spectacle, and simple powerful slogans engaged feelings of belonging, pride, and threat rather than inviting critical analysis.
Enemy construction: The propaganda machine constantly emphasized threats — from Jews, Communists, Western decadence, and Slavs branded as Untermenschen ("subhumans") — creating a permanent state of siege consciousness that made critical analysis of government claims feel disloyal and dangerous.
Control of all information channels: The Reich Press Chamber (Reichspressekammer) required all journalists, editors, and publishers to be members, and membership required demonstrating "racial purity" and political reliability. Jewish journalists and editors were expelled. Foreign news was controlled. The result was a near-complete capture of the information environment.
🎓 Advanced: Jacques Ellul's Theory of Propaganda
French philosopher Jacques Ellul's Propaganda: The Formation of Men's Attitudes (1962; English translation 1965) remains one of the most rigorous theoretical accounts of totalitarian propaganda. Ellul argued that modern propaganda is not merely about lying to people; it works by integrating individuals into a movement so completely that they become propagandists for themselves — true believers who propagate the system's messages organically. He identified "sociological propaganda" — the diffuse, ambient propaganda of a culture — as potentially more powerful than direct political messaging. His framework remains highly relevant to understanding how social media ecosystems can function as distributed propaganda machines without central direction.
2.4.2 Leni Riefenstahl and the Aestheticization of Power
Leni Riefenstahl's Triumph of the Will (1935), a documentary film of the 1934 Nuremberg Rally, is the most analyzed propaganda film in history. Riefenstahl's use of innovative cinematographic techniques — multiple cameras, dramatic angles, aerial shots, careful editing — transformed a political rally into an aesthetic spectacle of overwhelming power and unity.
Triumph of the Will illustrates a dimension of propaganda that goes beyond factual falsity: the aestheticization of politics, the use of beauty, grandeur, and emotional resonance to make political power appear natural, inevitable, and sublime. This form of propaganda does not require specific false propositions; it works by shaping affect, by making certain political arrangements feel overwhelmingly right.
The film raises profound questions about the ethics of aesthetics in political communication that remain unresolved. Political advertising, campaign imagery, and visual branding all use aesthetic techniques to shape political affect in ways that bypass explicit argument. The line between legitimate political communication and manipulative aestheticization is genuinely difficult to draw.
2.4.3 Soviet Propaganda
The Soviet Union developed a parallel propaganda system, ideologically opposite but structurally similar to the Nazi model. Socialist Realism — the mandated artistic style for Soviet culture from the 1930s — required all art, literature, and film to depict the triumph of socialist revolution in heroic, optimistic terms. Posters, films, novels, and music were produced by state-supported artists and deployed through all available channels to maintain the ideological orientation of Soviet citizens.
Soviet propaganda is instructive in its use of active measures (aktivnyye meropriyatiya) — intelligence operations designed to influence foreign opinion. Soviet active measures included forged documents, fabricated news stories planted in foreign newspapers, front organizations presenting Soviet positions as independent civil society views, and orchestrated disinformation campaigns targeting foreign leaders and publics.
The KGB's documented use of these techniques — including the fabrication of stories claiming the CIA developed HIV as a biological weapon (Operation INFEKTION, circa 1983–1987) — provides direct historical precedent for contemporary Russian disinformation campaigns. The techniques were not invented for the internet age; they were adapted to it.
Section 2.5: The Television Age
Television and the Power of the Visual
Television, which became a mass medium in Western countries in the 1950s and globally by the 1970s, added another dimension to media's role in politics and information: the live, intimate visual presence of political leaders and events in the home. Television transformed political communication more profoundly than any medium since the printing press.
The Kennedy-Nixon debates of 1960 are the canonical illustration: according to oft-cited (though methodologically contested) surveys, radio listeners tended to score Nixon as the winner, while television viewers, struck by Kennedy's ease and Nixon's pallor, gave the victory to Kennedy. Appearance, demeanor, and performance became as politically consequential as argument and policy.
Television's visual power created new opportunities for misinformation:
- Out-of-context footage: A crowd shown in proximity to an unrelated event could appear to be reacting to it.
- Misleading visual framing: The same event could appear orderly or violent depending on camera angle and editing.
- False authority through production quality: Professional-looking video conveyed credibility independent of content accuracy.
- The "seeing is believing" fallacy: Viewers' tendency to trust visual evidence made fabricated or misleadingly edited footage particularly effective.
2.5.1 The Gulf of Tonkin and Media Management
The Gulf of Tonkin incident (August 1964) illustrates how governments can use media systems to generate false public understanding of major events. President Lyndon Johnson's administration reported that North Vietnamese torpedo boats had attacked US destroyers in the Gulf of Tonkin on August 4, 1964. A genuine skirmish had occurred on August 2, but subsequently declassified evidence indicates that the reported August 4 attack almost certainly never happened. Congress nonetheless voted overwhelmingly for the Gulf of Tonkin Resolution authorizing escalation of the Vietnam War, based substantially on a fabricated or, at minimum, wildly misrepresented incident.
The American press largely accepted the administration's account. This illustrates a recurring pattern: during periods of national security crisis, professional journalism's reliance on official sources creates vulnerability to deliberate government deception. The mechanisms for independent verification were limited, the presumption of good faith was high, and the professional norm of "objectivity" — reporting both sides — was itself exploited, since one "side" was the government presenting fabricated information.
The eventual disclosure of the Gulf of Tonkin deception, the release of the Pentagon Papers (1971), and the broader disillusionment of the Vietnam era fundamentally changed American journalism's relationship to government authority — toward a more adversarial model that remains culturally significant, though also not without its own distortions.
2.5.2 Tobacco Advertising and the Manufactured Credibility Crisis
One of the most consequential misinformation campaigns of the television age was the tobacco industry's response to accumulating evidence linking smoking to cancer. By the early 1950s, the epidemiological evidence was strong enough that major tobacco companies understood internally that their products caused cancer. Their response, documented in internal company documents released through litigation, was to commission public relations campaigns explicitly designed to create the appearance of scientific controversy.
The tobacco companies' strategy — funding alternative scientists, emphasizing "uncertainty," lobbying against regulation, and placing favorable stories in mainstream media — was not unique to tobacco. It was adapted by industries facing regulatory scrutiny throughout the late twentieth century and became the template for manufactured doubt campaigns on leaded gasoline, asbestos, climate change, and other issues.
Television advertising was central to tobacco's cultural reach. The Marlboro Man, the Virginia Slims "You've come a long way, baby" campaign, and celebrity endorsements embedded smoking in images of masculinity, independence, and sophistication. When such advertising was eventually banned in the United States (TV and radio ban effective 1971), the industry pivoted to other visual and cultural channels.
Section 2.6: The Internet Era
The Early Internet: Utopian Hopes and Early Disillusionment
The internet's early development in the 1990s was accompanied by extraordinary optimism about its potential to democratize information, empower citizens, and undermine authoritarian control. The network's architecture — decentralized, resistant to censorship, allowing anyone to publish — seemed to promise a revolution in the economics of truth: for the first time, anyone could reach anyone, with any information, instantaneously.
These hopes were not baseless. The internet did enable remarkable expansions of access to information, political organizing, whistleblowing, and cross-cultural communication. But the same architecture that empowered citizen journalists and human rights organizations also empowered hoaxers, conspiracy theorists, and propagandists. And the assumption that more information necessarily equals better-informed citizens proved naively optimistic.
2.6.1 Early Internet Hoaxes and Chain Emails
The first significant misinformation vector of the internet era was the chain email. Chain emails — messages urging recipients to forward them to all their contacts, typically containing false claims, urban legends, or emotional manipulation — spread through email networks with minimal friction and no verification. The form borrowed from earlier paper chain letter traditions but achieved unprecedented reach, because each forwarding step could multiply the audience at essentially zero cost.
Common early chain email misinformation included: false health claims (drinking water from plastic bottles causes cancer), false urban legends (gang initiations involving highway driving), false political slanders, and emotional manipulation for attention ("This dying child's last wish is to receive one million emails"). The mechanisms making these effective — emotional urgency, appeal to concern for others, social proof from sender's apparent endorsement — remain the same mechanisms that drive viral misinformation sharing on contemporary social media.
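The forwarding dynamic described above can be sketched as a simple branching-process model. All parameters below (forwarding probability, contact-list size, number of generations) are hypothetical, chosen only to illustrate how a modest per-recipient forwarding rate compounds into exponential reach:

```python
def expected_chain_reach(initial_recipients=10, forward_prob=0.2,
                         contacts_per_forwarder=15, generations=8):
    """Expected recipients per forwarding generation in a toy branching
    model: each recipient forwards with probability `forward_prob` to
    `contacts_per_forwarder` contacts. Reach grows exponentially whenever
    forward_prob * contacts_per_forwarder exceeds 1."""
    reach = [float(initial_recipients)]
    for _ in range(generations):
        reach.append(reach[-1] * (forward_prob * contacts_per_forwarder))
    return reach

reach = expected_chain_reach()
print([int(r) for r in reach])  # 10, 30, 90, ... growing 3x per generation
```

With these illustrative numbers each generation multiplies reach by forward_prob × contacts = 3, so the chain either explodes (when that product exceeds 1) or dies out — the same threshold logic that governs epidemic models.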
📊 Real-World Application: From Chain Emails to WhatsApp Forwards
Many of the false claims that spread virally on WhatsApp, Facebook Messenger, and other encrypted messaging platforms in the 2010s and 2020s were direct descendants of 1990s chain email hoaxes — often the same claims with updated formatting. Research on viral WhatsApp misinformation in Brazil, India, and several African countries found many "new" health and political rumors that were recognizable descendants of earlier internet-era false claims. The medium changed; the social mechanics did not.
2.6.2 Early Online Conspiracy Communities
The mid-to-late 1990s saw the formation of early online communities organized around conspiracy theories. Usenet groups (alt.conspiracy, alt.assassination.jfk) and early web forums created spaces where people with minority beliefs — JFK assassination theories, UFO claims, anti-government extremism — could find each other, share "evidence," and develop increasingly elaborate theoretical frameworks.
These communities were the precursors to the Reddit communities, 4chan boards, and Discord servers that would become major conspiracy incubators in the 2010s. What changed with the transition to social media was scale, speed, and algorithmic amplification — not the basic social dynamics of conspiracy community formation, which were already visible in early internet communities.
The late 1990s and early 2000s also saw the development of early partisan news and blog ecosystems — the "blogosphere" — that began the fragmentation of the shared information environment that had characterized the broadcast era. Political blogs created spaces for audience self-selection around political perspective in ways that prefigured the more extreme fragmentation of social media.
Section 2.7: Social Media and the Modern Infodemic
The Algorithmic Information Environment
The 2010s transformed the information landscape in ways that differed qualitatively, not just quantitatively, from earlier internet development. The key developments were:
- Social media scale: Facebook reached 1 billion users in 2012, 2 billion by 2017. Twitter, YouTube, Instagram, and later TikTok created platforms through which individual posts could reach global audiences within hours.
- Algorithmic curation: Rather than users actively seeking information, algorithms curated information feeds based on engagement signals — likes, comments, shares, watch time. Because emotionally arousing content generates more engagement than calm, accurate reporting, algorithms systematically amplified content with high emotional valence.
- The attention economy: Platform business models depended on capturing and monetizing user attention. The incentive was not to inform users accurately but to maximize time on platform. Accuracy and engagement were not perfectly correlated; outrage, fear, and novelty often outperformed accuracy.
- Participatory architecture: Unlike broadcast media, social platforms allowed users to easily share, comment, remix, and amplify content — creating viral distribution mechanisms that no earlier medium had.
2.7.1 The 2016 Election and Disinformation
The 2016 United States presidential election brought several distinct misinformation dynamics into sharp relief:
Russian information operations: The Internet Research Agency (IRA), a Russian organization with state links, ran coordinated networks of fake social media accounts targeting American audiences. Senate Intelligence Committee reports documented thousands of social media posts, ads, and accounts designed to amplify divisive content, suppress minority voter turnout, and support specific candidates. The operation's sophistication — adapting content to micro-targeted audiences across multiple platforms — represented a new scale of state-sponsored disinformation.
Domestic political misinformation: Fabricated news stories — "Pope Francis Endorses Donald Trump," "Hillary Clinton Sold Weapons to ISIS" — were produced by a variety of actors including Macedonian teenagers running monetized clickbait farms, domestic partisan content producers, and established conspiracy outlets. Studies found these stories generated millions of Facebook shares.
Platform amplification: Facebook's algorithmic amplification of emotionally engaging content gave misinformation an organic distribution advantage over accurate reporting. Internal Facebook research (partially revealed through whistleblower Frances Haugen in 2021) showed that the company's algorithms were recommending increasingly extreme content to users who engaged with politically charged posts.
Filter bubbles and echo chambers: Research found that social media use was associated with selective exposure — people encountering primarily content consistent with their existing views — though the magnitude and causal significance of this effect have been debated in subsequent research.
2.7.2 COVID-19 and the Infodemic
The COVID-19 pandemic beginning in 2020 generated what the World Health Organization (WHO) called an "infodemic" — an overabundance of information, including false and misleading information, spreading alongside the virus itself. The infodemic presented extraordinary public health challenges: false claims about virus origins, unproven treatments, vaccine misinformation, and conspiracy theories about government responses spread at scale on social media, creating genuine obstacles to effective pandemic response.
COVID-19 misinformation operated at multiple levels:
False treatment claims: Claims that bleach, ultraviolet light, hydroxychloroquine, ivermectin, and other substances could prevent or cure COVID-19 spread widely on social media, leading to cases of self-poisoning, misallocation of resources, and the undermining of evidence-based treatment.
Vaccine misinformation: False claims about COVID-19 vaccines — that they caused infertility, contained microchips, altered DNA, or were tested on children — spread rapidly through anti-vaccine networks that had been active long before the pandemic. Research found that exposure to COVID-19 vaccine misinformation significantly reduced stated willingness to be vaccinated.
Origin misinformation: Claims about COVID-19's origin — including claims it was a bioweapon created in a Chinese laboratory — varied enormously in their evidentiary basis, from entirely baseless claims to legitimate scientific debates about zoonotic transmission versus laboratory origins. The politicization of the origin question made it difficult to distinguish manufactured political propaganda from genuine scientific uncertainty.
Institutional trust and the infodemic: The infodemic was amplified by pre-existing erosion of trust in medical and government institutions. People who already distrusted the CDC, WHO, or mainstream media were more likely to turn to alternative information sources during the pandemic crisis — and alternative information sources were often significantly less reliable.
✅ Best Practice: WHO's Infodemic Management
In response to the COVID-19 infodemic, the World Health Organization developed a structured "infodemic management" framework including: fact-checking partnerships with media organizations, collaboration with social media platforms to reduce algorithmic amplification of misinformation, inoculation campaigns explaining common manipulation techniques, and "myth buster" content addressing specific false claims. The WHO's approach represents an institutionalized application of the epistemological principles discussed in Chapter 1: identifying false claims, explaining the accurate alternative, and building resistance to future manipulation.
2.7.3 Algorithmic Amplification and Structural Misinformation
A distinctive feature of social media-era misinformation is that much of it does not require deliberate deception. Algorithmic amplification systematically promotes content with high emotional engagement regardless of accuracy. This creates structural misinformation: a system in which false, misleading, or distorted content is promoted not because anyone intends to deceive but because the system's incentive structures favor engagement over accuracy.
Research by MIT Media Lab scholars Soroush Vosoughi, Deb Roy, and Sinan Aral found that on Twitter, false news spread faster, farther, and to more people than true news — and that human behavior (not bots) was primarily responsible for this asymmetry. False news was more novel and provoked more emotional reactions (particularly disgust and fear), and novelty and emotion predict sharing behavior.
This finding has profound implications: improving the accuracy of the information environment cannot rely solely on defeating deliberate deception. It requires changing the structural incentives of platforms that systematically advantage emotional content over accurate content.
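The asymmetry documented by Vosoughi, Roy, and Aral can be sketched as a simple expected-cascade model. The share rates and follower counts below are hypothetical, chosen only to show how a small per-exposure sharing advantage — of the kind that novelty and emotional arousal confer — compounds hop by hop into a large difference in total reach:

```python
def expected_cascade_size(share_prob, avg_followers=50, max_hops=6, seeds=1):
    """Expected number of users reached when each exposed user reshares
    with probability `share_prob` to `avg_followers` followers.
    A small edge in share probability compounds at every hop."""
    exposed, frontier = float(seeds), float(seeds)
    for _ in range(max_hops):
        frontier = frontier * share_prob * avg_followers
        exposed += frontier
    return exposed

# Hypothetical per-exposure share rates: emotionally novel false claims
# are shared slightly more often than calm, accurate reporting
true_reach = expected_cascade_size(share_prob=0.018)   # R ~ 0.9: cascade fades
false_reach = expected_cascade_size(share_prob=0.024)  # R ~ 1.2: cascade grows
print(round(false_reach / true_reach, 1))  # a modest edge yields ~2.5x the reach
```

The mechanism mirrors the structural-misinformation argument above: no bot network or deliberate deceiver is required — a per-share advantage on the order of a fraction of a percentage point, multiplied across hops, reproduces the "faster, farther, to more people" pattern.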
Section 2.8: Historical Patterns and Lessons
What History Teaches Us
After surveying two and a half millennia of misinformation, several persistent patterns emerge:
Pattern 1: Communication technology revolutions consistently enable misinformation scale-up
Each major expansion in communication reach — the printing press, cheap newspapers, radio, television, the internet — initially enables dramatic expansion in the reach and impact of misinformation before counter-mechanisms develop. The printing press enabled pamphlet wars and religious propaganda before norms of journalistic credibility developed. Radio enabled totalitarian propaganda before broadcast regulation emerged. The internet is in this phase now: dramatically expanding misinformation reach before effective counter-institutions have developed.
Pattern 2: Misinformation thrives in conditions of political crisis and social anxiety
Across all periods, misinformation proliferates most rapidly during wars, revolutions, epidemics, economic crises, and periods of intense political competition. These are precisely the conditions in which the stakes of accurate information are highest and the demand for simple, explanatory narratives is greatest. Misinformation offers psychological resolution to anxiety by providing clear enemies, clear causes, and a legible world — even when that legibility is false.
Pattern 3: Institutional counter-forces do matter
The presence of credible, independent epistemic institutions — free press, public education, strong scientific communities, independent judiciary — consistently constrains misinformation's worst effects. Nazi Germany and the Soviet Union required the destruction of these institutions to achieve maximum propaganda effectiveness. The historical evidence supports investment in epistemic institutions as a structural defense against misinformation, not just individual media literacy as a personal skill.
Pattern 4: Misinformation is rarely innocent
Historical misinformation has almost always served someone's interests: state power, military objectives, commercial profit, religious authority, or political advantage. Understanding who benefits from a particular piece of misinformation — cui bono — is a crucial analytical tool. This doesn't mean all misinformation is deliberately produced by cynical actors; much spreads organically through motivated reasoning and social sharing. But the content of misinformation is rarely random: it tends to serve identifiable interests.
Pattern 5: Corrections work imperfectly but are better than silence
The historical record shows that corrections, fact-checks, and counter-narratives are imperfect tools — they rarely eliminate false beliefs once established. But they do reduce the spread and durability of misinformation when delivered promptly, by credible sources, and with alternative explanations. The Reformation-era Catholic Church's failure to respond effectively to early Lutheran propaganda through print — partly from institutional reluctance to engage with the new medium — contributed to the Reformation's rapid spread.
Pattern 6: The distinction between sincere and strategic misinformation is crucial
Some misinformation is produced by people who genuinely believe it (sincere misinformation). Some is produced by people who know it is false and deploy it strategically (deliberate disinformation). Some is produced by people who neither know nor care whether it is true (what philosopher Harry Frankfurt called "bullshit"). These different types of misinformation call for different responses: belief correction works for sincere misinformation; legal and regulatory tools may be needed for deliberate disinformation; changing the epistemic culture of truth-seeking matters for all three.
🎓 Advanced: The Limits of the "Information Disorder" Framework
The Wardle-Derakhshan typology (misinformation / disinformation / malinformation) has become the standard framework in the field, but critics note its limitations. It implies that the primary problem is individual bad actors producing false content, rather than systemic features of information environments. It may underemphasize how structurally generated misinformation (through algorithmic amplification of emotional content) differs from intentionally produced disinformation. And it may overemphasize the role of falsity per se, when selective truth (presenting accurate facts in misleading contexts) and omission (failing to report crucial context) may be equally important vectors of public misinformation.
Key Terms Glossary
Active Measures (Aktivnyye meropriyatiya): Soviet/Russian intelligence term for influence operations including forgeries, planted news stories, front organizations, and disinformation designed to shape foreign opinion.
Astroturfing: Creating the appearance of grassroots public support for a position that is actually organized by corporate, government, or other institutional interests.
Big Lie (Große Lüge): A propaganda technique, associated with Nazi Germany, involving audacious falsehoods too large for audiences to believe anyone would fabricate.
Disinformation: False information deliberately created and spread to deceive; distinguished from misinformation by intentionality.
Framing: The selection and emphasis of certain aspects of reality over others in presenting information, shaping how audiences interpret and evaluate that information.
Illusory Truth Effect: The tendency for repeated exposure to a claim to increase its perceived credibility, regardless of actual truth value.
Infodemic: Term coined by the WHO for the overabundance of information — accurate and inaccurate — during the COVID-19 pandemic, creating public health challenges.
Malinformation: Accurate information spread with harmful intent, such as sharing a private individual's address to enable harassment.
Manufactured Doubt: Deliberate strategy of creating the appearance of scientific controversy about established findings, pioneered by the tobacco industry.
Misinformation: False or inaccurate information spread regardless of intent to deceive.
Pamphlet War: Sustained political or religious debate conducted through the rapid production and distribution of cheap, often anonymous or pseudonymous printed pamphlets; characteristic of 16th–17th century European conflicts.
Propaganda: Information, especially of a biased or misleading nature, used to promote a political cause or point of view. The term itself has both neutral (organized persuasion) and pejorative (manipulative deception) uses.
Public Relations: The professional practice of managing how an organization or individual is perceived by the public, including through media placement, third-party validation, and narrative management.
Socialist Realism: The mandated artistic style in the Soviet Union (and related states) from the 1930s onward, requiring art to depict the triumph of socialism in heroic, optimistic terms.
Structural Misinformation: The systematic promotion of false or misleading content that results from platform incentive structures (algorithmically favoring emotional engagement) rather than deliberate deception by individual actors.
Yellow Journalism: Sensationalist journalism characterized by exaggeration, dramatic headlines, and emotional content prioritized over accuracy; associated with late 19th-century American newspapers competing for mass circulation.
Discussion Questions
- The printing press enabled both the Reformation and extensive religious propaganda and violence. Social media has enabled both remarkable expansions of human connection and dangerous misinformation. Does the historical analogy suggest that information technology revolutions inevitably pass through a dangerous phase before stabilizing? What would be needed to accelerate the stabilization?
- Goebbels argued that propaganda should target emotion rather than intellect. Modern platform algorithms reward emotional engagement over informational accuracy. Is there a meaningful ethical distinction between Goebbels' intentional emotional manipulation and the unintentional emotional optimization of social media algorithms? Does intent matter for the harm caused?
- American journalism developed a professional ethics of "objectivity" as a response to the perceived excesses of yellow journalism. This norm — presenting "both sides" — has been criticized for creating false equivalence between well-supported and poorly-supported claims. How should journalism navigate the tension between procedural fairness (presenting multiple perspectives) and epistemic responsibility (not treating all perspectives as equally valid)?
- The Gulf of Tonkin incident shows how government deception can succeed for years when journalists rely primarily on official sources. What structural features of the relationship between journalism and government authority enabled this deception? What would a more epistemically robust relationship look like?
- Comparing Nazi propaganda in the 1930s with contemporary social media-era misinformation: what are the most important structural similarities and differences? What does this comparison imply for understanding the current information environment?
- The WHO's "infodemic" framing treats COVID-19 misinformation as an emergency requiring active management, analogous to the virus itself. Critics argue this framing creates risks for freedom of expression if governments claim authority to suppress "misinformation." How should democratic societies balance the epistemic harms of misinformation against the political risks of government information control?
- The historical record suggests misinformation consistently thrives in conditions of political crisis and social anxiety. If this is correct, what does it imply about the relationship between addressing misinformation and addressing the underlying political and social conditions that make populations susceptible to it?
References and Notes
Bernays, Edward. Propaganda. New York: Horace Liveright, 1928.
Ellul, Jacques. Propaganda: The Formation of Men's Attitudes. Translated by Konrad Kellen and Jean Lerner. New York: Alfred A. Knopf, 1965.
Goebbels, Joseph. Die Tagebücher von Joseph Goebbels [The Goebbels Diaries]. Edited by Elke Fröhlich. Munich: K.G. Saur, 1987–2006.
Oreskes, Naomi, and Erik M. Conway. Merchants of Doubt. New York: Bloomsbury Publishing, 2010.
Roberts, Andrew. Napoleon: A Life. New York: Viking, 2014.
Starr, Paul. The Creation of the Media: Political Origins of Modern Communications. New York: Basic Books, 2004.
Vosoughi, Soroush, Deb Roy, and Sinan Aral. "The Spread of True and False News Online." Science 359, no. 6380 (2018): 1146–1151.
Wardle, Claire, and Hossein Derakhshan. "Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making." Council of Europe Report DGI(2017)09, 2017.
Wu, Tim. The Attention Merchants: The Epic Scramble to Get Inside Our Heads. New York: Alfred A. Knopf, 2016.