
Chapter 16: Scientific Misinformation — Climate, Vaccines, and GMOs

Learning Objectives

By the end of this chapter, students will be able to:

  1. Explain how scientific consensus forms and distinguish between uncertainty at the research frontier and manufactured doubt about established science.
  2. Describe the "Merchants of Doubt" framework developed by Oreskes and Conway, and trace the tobacco strategy's application across multiple scientific controversies.
  3. Identify the specific false claims made by climate change denialists, evaluate the evidence against each claim, and explain the psychological inoculation approach to building resilience.
  4. Analyze vaccine safety misinformation beyond the Wakefield fraud, including VAERS misuse and mRNA-specific COVID-19 claims.
  5. Evaluate the science of GMO safety and the specific ways the Séralini affair and organic industry interests have shaped public perception.
  6. Describe the evolution denial movement, including the strategic shift from creationism to intelligent design documented in the Wedge Document.
  7. Apply cultural cognition theory (Kahan) and identity threat models to explain why scientific misinformation persists even among well-educated populations.
  8. Evaluate science communication strategies using evidence from research on the failure of the deficit model and success of framing and inoculation approaches.

Section 16.1: The Nature of Scientific Consensus

How Scientific Knowledge Differs from Opinion

Scientific knowledge is not simply a collection of facts but a structured process of hypothesis formation, experimental testing, peer review, replication, and accumulating consensus. This process is self-correcting over time: errors are identified and corrected, though not always quickly or painlessly. The process is also social — it depends on communities of researchers checking each other's work, on institutions that organize knowledge production, and on norms of transparency that allow scrutiny.

Understanding how scientific consensus forms is essential for distinguishing genuine scientific uncertainty from manufactured doubt. Several features are critical:

Consensus forms gradually through accumulation of evidence. No single study establishes scientific consensus. The evidentiary basis for major scientific consensus positions — evolution, anthropogenic climate change, vaccine safety, the age of the universe — consists of thousands of independent lines of evidence from multiple research traditions and methods. This redundancy is not a weakness but a strength: it means the consensus is robust to errors in any individual study.

The frontier is always uncertain; the core is highly reliable. Research papers report findings at the frontier of knowledge, where uncertainty is highest. Headlines reporting "scientists discover X" frequently represent work that future research will refine, qualify, or overturn. This frontier uncertainty is normal and healthy. It is categorically different from uncertainty about established findings that have been replicated, tested from multiple angles, and held up for decades.

Scientific consensus can be measured. Studies by John Cook and colleagues have documented the scientific consensus on human-caused climate change at approximately 97% of publishing climate scientists. Similar consensus levels exist for vaccine safety, evolution, and the efficacy of antibiotics. These consensus levels are not votes — they reflect the proportion of scientific work that actively supports the consensus position when the research literature is systematically analyzed.

Distinguishing Frontier Uncertainty from Manufactured Doubt

The most consequential confusion exploited by science denialists is the distinction between:

  • Frontier uncertainty: Genuine uncertainty about specific questions at the edge of what science currently knows (e.g., the precise value of climate sensitivity, the specific mechanisms of the mRNA vaccine immune response).
  • Manufactured doubt: Strategic amplification of uncertainty about well-established science to prevent policy action.

Manufactured doubt deliberately exploits the public's inability to distinguish these two types of uncertainty. When, in the 1970s, tobacco-funded scientists published studies questioning whether smoking causes cancer, the goal was not to advance scientific knowledge but to create the impression of scientific controversy where none existed among researchers actually studying tobacco's health effects. This strategy — the tobacco strategy — has since been applied to climate change, ozone depletion, acid rain, and other scientific questions with inconvenient policy implications.


Section 16.2: Merchants of Doubt — The Tobacco Strategy Applied

Oreskes and Conway's Framework

Historians of science Naomi Oreskes and Erik Conway's 2010 book "Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming" is the foundational text for understanding how organized doubt manufacturing works. Their research drew on internal documents from the tobacco industry, revealed in litigation, to trace a consistent strategy applied across multiple scientific controversies.

The core insight of Oreskes and Conway is that the same small network of scientists, PR firms, and ideologically motivated organizations worked to manufacture doubt about the scientific consensus on tobacco and cancer, second-hand smoke, acid rain, the ozone hole, and climate change. The strategy was not to produce better science but to exploit the public's assumption that a scientific controversy must exist if scientists are publicly disagreeing.

The Key Actors

Several figures appear repeatedly in the "Merchants of Doubt" narrative:

Frederick Seitz, a distinguished physicist and former National Academy of Sciences president, was paid by R.J. Reynolds Tobacco to administer a research program in the 1970s and 80s, then worked through the George C. Marshall Institute (a think tank he helped found) to contest climate science. Seitz was a genuine scientist of distinction whose late-career work on behalf of industry, in denial of scientific consensus, marked a sharp departure from his earlier research.

S. Fred Singer, a physicist, publicly challenged scientific consensus on acid rain, the ozone hole, and climate change across four decades. Singer was associated with multiple industry-funded organizations and produced scientific commentary that was funded by industries with financial interests in the outcomes.

The George C. Marshall Institute, founded in 1984 ostensibly to support Reagan's Strategic Defense Initiative, became a primary vehicle for manufacturing doubt about climate science. It closed in 2015; its climate program was carried on by the CO2 Coalition.

The Heartland Institute has hosted annual "International Conference on Climate Change" events that bring together climate denialists to create the appearance of an alternative scientific community.

PR Firms and the Media Strategy

The tobacco industry's internal documents, made public through litigation, revealed sophisticated media strategies for creating doubt:

The "controversy strategy": If scientific controversy could not be manufactured among scientists, it could be manufactured in media coverage. Journalists trained to present "both sides" would present a tobacco spokesperson alongside a cancer researcher, creating the impression of a scientific debate that did not exist in the scientific literature.

Op-ed placement: Industry-funded scientists placed op-eds in major newspapers creating an appearance of scientific dissent. Mainstream media published these pieces without adequately investigating the authors' funding relationships.

Letter campaigns: Coordinated letter campaigns to scientific journals and newspaper editors, presenting a paper-trail impression of widespread dissent.

The application to climate change: When internal oil industry documents came to light through litigation and investigative reporting, they showed that oil companies including ExxonMobil had conducted their own internal scientific research on climate change in the 1970s and 80s that reached conclusions consistent with the mainstream scientific consensus — and then funded public campaigns to manufacture doubt about those conclusions.


Section 16.3: Climate Change Misinformation

The Scientific Consensus

The scientific consensus on human-caused climate change is among the most thoroughly documented in modern science. Key elements:

  • The Earth's average surface temperature has increased approximately 1.1°C since the pre-industrial era (1850-1900 baseline).
  • The primary driver is increased atmospheric concentrations of greenhouse gases, principally CO2, resulting from fossil fuel combustion, deforestation, and industrial processes.
  • This consensus is supported by multiple independent lines of evidence: direct temperature measurement, satellite data, sea level rise, ice core records, ocean heat content, species range shifts, glacier retreat, and more.
  • 97% of actively publishing climate scientists agree that recent climate change is primarily human-caused (Cook et al. 2013, Lynas et al. 2021).

Specific False Claims and Their Debunking

Climate change denial takes multiple forms. John Cook and colleagues have developed a taxonomy of denial types:

Trend denial (Type 1): "The Earth is not actually warming." Rebuttal: Global mean surface temperature measurements from NASA, NOAA, Berkeley Earth, and the UK Met Office all independently confirm warming of approximately 1.1°C since pre-industrial times.

Attribution denial (Type 2): "The warming is natural, not human-caused." Rebuttal: Natural factors (solar variability, volcanic activity) cannot explain the observed warming pattern; in fact, natural factors would have produced slight cooling over the past 50 years. Only the addition of human greenhouse gas emissions explains the observed warming.

Impact denial (Type 3): "Even if it's warming and human-caused, it won't be that bad." Rebuttal: Current warming is already producing documented impacts on ecosystems, sea levels, extreme weather events, and human communities. Climate models consistently project increasingly severe impacts at higher temperature levels.

Solution denial (Type 4): "We can't or shouldn't do anything about it; solutions are too costly." Rebuttal: This is a policy argument, not a scientific claim, though it is sometimes dressed in pseudo-scientific language about cost-benefit analysis.

Science process denial (Type 5): "Climate scientists are engaged in a conspiracy; scientific institutions cannot be trusted." Rebuttal: The climate change consensus involves researchers across dozens of countries, thousands of independent research institutions, and multiple competing research programs and methodologies. The conspiracy claim requires implausible coordination.

The 97% Consensus and How It Is Misrepresented

John Cook and colleagues' 2013 study systematically examined approximately 12,000 peer-reviewed papers on climate change and found that 97% of those expressing a position endorsed the consensus that human activity was causing recent warming. This finding has been replicated in multiple independent analyses.
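The arithmetic behind the 97% figure is easy to misread if the wrong denominator is used. A minimal sketch, using approximate abstract counts as commonly reported from Cook et al. (2013) — treat them as illustrative rather than exact — shows why the consensus share is computed over position-expressing abstracts rather than all abstracts:

```python
# Approximate abstract counts as reported from Cook et al. (2013);
# illustrative figures, not the exact published totals.
abstracts = {
    "endorse": 3896,      # endorse human-caused warming
    "reject": 78,         # reject it
    "uncertain": 40,      # express uncertainty about attribution
    "no_position": 7930,  # take no position on attribution
}

expressing = abstracts["endorse"] + abstracts["reject"] + abstracts["uncertain"]
consensus_share = abstracts["endorse"] / expressing           # correct denominator
naive_share = abstracts["endorse"] / sum(abstracts.values())  # misleading denominator

print(f"Among position-expressing abstracts: {consensus_share:.1%}")
print(f"Among all abstracts (misleading):    {naive_share:.1%}")
```

The misleading "only a third of papers endorse the consensus" talking point circulates precisely because most abstracts do not restate attribution at all; treating "no position" as dissent inverts the study's method.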

Common misrepresentations of the consensus:

  • "The 97% number was made up." The methodology was transparent, and the study was peer-reviewed and published in Environmental Research Letters. Multiple independent studies reach similar conclusions.
  • "Scientists disagree about whether climate change is real." This conflates frontier disagreements about specific parameters with core disagreements about the basic phenomenon.
  • "Many scientists have signed a petition against the consensus." The Oregon Petition Project, which claimed 31,000 scientist signatures challenging climate science, included signatories with no relevant credentials and was not peer-reviewed.

Psychological Inoculation Against Climate Denial

Researchers at the University of Cambridge and the Yale Program on Climate Communication have developed inoculation approaches to building resilience against climate denial techniques. The approach follows inoculation theory: expose people to weakened versions of the manipulation techniques used in climate denial before they encounter full-strength denial arguments.

Research by Sander van der Linden and colleagues demonstrated that brief "inoculation" messages that warned about climate denial manipulation techniques (combined with affirmation of the 97% consensus) were more effective at maintaining accurate beliefs when subjects subsequently encountered denial arguments than simple correction alone. This work underlies the "Inoculation Science" project and has been applied in large-scale social media interventions.


Section 16.4: Vaccine Safety Misinformation

Beyond Wakefield: The Long History

Andrew Wakefield's 1998 Lancet paper claiming a link between the MMR vaccine and autism is the most studied and documented case of vaccine misinformation, but it is far from the only one. Wakefield's paper was formally retracted by The Lancet in 2010 after a General Medical Council investigation found that Wakefield had engaged in ethical violations and had undisclosed financial conflicts of interest (he was paid by a law firm seeking to sue vaccine manufacturers). Wakefield lost his medical license. However, the anti-vaccine movement predates Wakefield's paper and continues to draw on a broader repertoire of claims.

VAERS Misuse

The Vaccine Adverse Event Reporting System (VAERS), maintained jointly by the CDC and FDA, is a passive surveillance system designed to detect signals of potential vaccine adverse events. VAERS accepts reports from anyone — healthcare providers, patients, or others — regardless of whether the reporter believes the vaccine caused the event. The system is designed to capture potential signals for further investigation; it is not designed to establish causation.

During the COVID-19 pandemic, VAERS data was systematically misused to support claims that vaccines were causing widespread death and injury. The misuse pattern:

  • Reports of deaths occurring after vaccination were presented as evidence that vaccines caused those deaths. VAERS explicitly states that listing an event in VAERS does not establish that the vaccine caused it.
  • Raw VAERS numbers were compared to adverse event rates without baseline comparison to the underlying population's event rates.
  • VAERS was designed to be a sensitive system that over-reports rather than under-reports; interpreting its raw numbers as evidence of causation inverts its purpose.

The distinction between adverse events "following" vaccination and those "caused by" vaccination is essential. Because vaccines are given to large populations, random events that would have occurred anyway will occur after vaccination in many people. A useful benchmark: deaths from lightning strikes, heart attacks, and strokes occur in the population continuously; some fraction of the population that receives a vaccine will experience these events in the following days by chance.
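The base-rate point can be made concrete with a back-of-envelope calculation. The numbers below are assumed round figures chosen for illustration, not real surveillance data:

```python
# Back-of-envelope: deaths expected within a week of vaccination purely by
# chance, given background all-cause mortality (assumed round numbers).
population_vaccinated = 200_000_000   # assumed number of people vaccinated
annual_mortality_rate = 0.009         # assumed all-cause deaths per person-year (~0.9%)
window_days = 7                       # observation window after vaccination

daily_rate = annual_mortality_rate / 365
expected_deaths = population_vaccinated * daily_rate * window_days

print(f"~{expected_deaths:,.0f} deaths expected within {window_days} days by chance alone")
```

Every one of those deaths "follows" vaccination and could be reported to VAERS, which is why raw report counts say nothing about causation without denominator data and a baseline comparison.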

Natural Immunity Claims

Claims that natural immunity from prior infection is superior to vaccine-induced immunity became widespread during the COVID-19 pandemic. The scientific evidence on this is genuinely complex:

  • Natural immunity after COVID-19 infection provides significant protection against reinfection, sometimes comparable to or exceeding vaccine-induced immunity for some variants.
  • Natural immunity's durability varies with the severity of the original infection, the individual immune response, and the variant encountered.
  • The comparison between natural and vaccine-induced immunity must account for the cost of acquiring natural immunity: COVID-19 infection caused significant morbidity, mortality, and long COVID in millions.
  • For diseases where natural infection poses lower risks (some childhood diseases), the natural immunity argument is more compelling, but for serious diseases including COVID-19, the risk-benefit calculation favors vaccination.

The scientific reality is nuanced; the misinformation version presents a false dichotomy in which "natural immunity" is categorically superior and vaccines are therefore unnecessary.

mRNA Vaccine Misinformation

COVID-19 mRNA vaccines (Pfizer-BioNTech and Moderna) prompted a wave of misinformation specific to the technology:

"mRNA vaccines alter your DNA." False. mRNA is a messenger molecule that carries instructions from DNA to ribosomes for protein synthesis. It does not enter the cell nucleus where DNA is located, cannot be incorporated into DNA, and is rapidly degraded after delivering its instructions. The biological mechanism by which mRNA could alter DNA does not exist.

"The spike protein is uniquely toxic." Evidence does not support the claim that spike protein produced by vaccination causes the organ damage claimed by vaccine skeptics. The concentration of spike protein produced by mRNA vaccines is orders of magnitude lower than that produced by COVID-19 infection itself.

"These vaccines were not adequately tested." mRNA vaccine trials enrolled tens of thousands of participants, and the review process was intensive. "Emergency Use Authorization" did not bypass the clinical trial process; it expedited the review of trial data that met established safety and efficacy standards.


Section 16.5: GMO Misinformation

The Science of GMO Safety

Genetically modified organisms (GMOs) have been the subject of extensive safety research and regulatory review. The scientific consensus on the safety of currently approved GMOs for human consumption is clear. A 2016 National Academies of Sciences report analyzed hundreds of studies and concluded that GMOs present no unique risks to human health. The WHO, the American Medical Association, and scientific bodies in Europe and elsewhere have reached similar conclusions.

The genetic modification of crops has been practiced through selective breeding for millennia. Modern GMO techniques introduce specific genetic changes with greater precision than traditional breeding. The safety concern — that inserting foreign genes could create toxic proteins or allergens — is a legitimate scientific question that has been extensively researched. The regulatory framework for GMO approval (in the US through USDA, EPA, and FDA) requires demonstration that introduced proteins are not toxic and not similar to known allergens.

This does not mean that all possible GMO applications are safe; each new application must be evaluated on its own merits. But the blanket claim that "GMOs are dangerous" is not supported by the available evidence.

The Séralini Affair

The most consequential scientific controversy in the GMO debate was the 2012 paper by French molecular biologist Gilles-Éric Séralini and colleagues, published in Food and Chemical Toxicology, claiming to show that Roundup-tolerant GM corn caused tumors in rats. The paper was accompanied by dramatic photographs of rats with large tumors and received enormous media coverage.

The scientific response was devastating:

  • The Sprague-Dawley rat strain used in the study is prone to spontaneous tumor development, making the control-group comparison essential and the study design problematic.
  • The study used too few animals per group for statistical reliability.
  • Séralini cherry-picked results to show the most alarming images without proper statistical analysis.
  • The European Food Safety Authority, Germany's BfR, and other scientific bodies reviewed the paper and found its methodology fundamentally flawed.
  • Food and Chemical Toxicology retracted the paper in 2013.
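The small-sample problem is easy to demonstrate by simulation. A minimal sketch, assuming a hypothetical 50% spontaneous tumor rate and groups of 10 animals; both groups are drawn from the same underlying rate, so any difference between them is pure chance:

```python
import random

random.seed(0)  # reproducible illustration

SPONTANEOUS_RATE = 0.5  # assumed lifetime spontaneous tumor probability
GROUP_SIZE = 10         # animals per group, comparable to the 2012 study
TRIALS = 10_000         # number of simulated experiments

def tumor_count(n: int, p: float) -> int:
    """Number of animals in a group of n that develop tumors spontaneously."""
    return sum(random.random() < p for _ in range(n))

# Fraction of simulated experiments in which the "treated" group shows at
# least three more tumors than its control, despite identical underlying rates.
worse_by_three = sum(
    tumor_count(GROUP_SIZE, SPONTANEOUS_RATE)
    - tumor_count(GROUP_SIZE, SPONTANEOUS_RATE) >= 3
    for _ in range(TRIALS)
)
print(f"'Treated' group worse by >=3 tumors in {worse_by_three / TRIALS:.0%} of runs")
```

With a tumor-prone strain and groups this small, alarming-looking differences arise by chance in a substantial fraction of runs, which is why adequate group sizes and proper statistical testing are essential.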

Séralini and colleagues claimed the retraction was the result of industry pressure, pointing to the journal's appointment of a former Monsanto employee to its editorial board. The retraction decision has remained controversial in part because Food and Chemical Toxicology's editors acknowledged that the paper was not fraudulent, only that the data were "inconclusive" — an unusual basis for retraction.

The affair illustrates several dynamics in scientific misinformation: a sensational study with serious methodological flaws receives massive media coverage; correction receives far less; accusations of industry bias allow the original study's proponents to maintain narrative credibility despite retraction.

The Precautionary Principle and Organic Industry Interests

The precautionary principle — that in cases of uncertainty, caution should guide decisions that could cause harm — is a legitimate scientific and policy norm. Its misapplication in the GMO debate involves invoking precaution not in proportion to evidence of risk but categorically, treating GMOs as uniquely deserving of precaution while failing to apply the same standard to alternatives.

The organic food industry has a significant financial interest in GMO skepticism. Organic certification in most countries prohibits GMO ingredients, making GMO safety claims directly competitive with organic market positioning. Multiple studies of media coverage of GMOs have found that anti-GMO messaging is often funded or amplified by organic food producers and that the scientific consensus on GMO safety is poorly communicated relative to anti-GMO concerns.

This does not make GMO opposition invalid per se — legitimate concerns about agricultural consolidation, corporate control of the seed supply, pesticide use patterns, and biodiversity exist independently of human health safety questions. However, conflating these legitimate policy concerns with fabricated or misrepresented health safety claims harms both public understanding and the quality of policy debates.


Section 16.6: Evolution Denial

The Historical Arc: Creationism to Intelligent Design

Opposition to evolutionary biology in the United States follows a traceable historical arc shaped partly by legal constraints. Early 20th century opposition to teaching evolution in public schools reached its famous moment with the 1925 Scopes Trial. The Balanced Treatment for Creation-Science and Evolution-Science Act, passed in Louisiana in 1981, was struck down by the Supreme Court in Edwards v. Aguillard (1987) as unconstitutional establishment of religion.

After Edwards, the creationist movement developed "intelligent design" (ID) as an ostensibly non-religious alternative. ID claims that biological complexity — the "irreducible complexity" of some biological systems, as articulated by biochemist Michael Behe in "Darwin's Black Box" (1996) — cannot be explained by natural selection and therefore implies an unspecified "designer."

The Wedge Document

In 1999, an internal document from the Discovery Institute's Center for Science and Culture (the leading ID advocacy organization) was leaked and became known as the "Wedge Document." The document revealed that the Discovery Institute's ID program had explicitly religious goals — "to replace materialistic explanations with the theistic understanding that nature and human beings are created by God" — and outlined a 20-year plan for achieving this through science, media, and political strategy.

The Wedge Document's exposure was significant because it contradicted the Discovery Institute's public claims that ID was a scientific rather than religious enterprise. This became directly relevant in the 2005 Kitzmiller v. Dover Area School District case.

Kitzmiller v. Dover

The 2005 federal trial in Kitzmiller v. Dover Area School District is the most comprehensive judicial examination of intelligent design's scientific status. Judge John E. Jones III, appointed by Republican President George W. Bush, ruled after a six-week trial that:

  • Intelligent design is not science; it is a religious proposition.
  • Teaching ID in public school science classes violates the Establishment Clause of the First Amendment.
  • The Discovery Institute's expert witnesses had significantly misrepresented the scientific literature.
  • Claims of "irreducible complexity" (particularly of the bacterial flagellum, cited by Behe) had been answered by the existing evolutionary biology literature.

Jones's 139-page opinion was widely praised as a comprehensive and carefully documented examination of the scientific and legal issues. The Discovery Institute, which had initially encouraged the Dover school board and then distanced itself from the case when it appeared likely to go badly, claimed Jones had simply copied from the ACLU's proposed findings of fact.

The Scientific Status of Evolution

Evolution is among the most thoroughly confirmed theories in biology. Evolutionary biology is supported by:

  • The fossil record, showing gradual modification over time
  • Comparative anatomy and the existence of vestigial structures
  • Biogeographic patterns of species distribution
  • Direct observation of natural selection acting on populations
  • Molecular evidence, including shared DNA sequences and endogenous retroviruses
  • The entire field of genetics, which provides the mechanistic basis Darwin lacked

Describing evolution as "just a theory" reflects a common misunderstanding of scientific usage of the word "theory." In science, a theory is an explanatory framework supported by substantial evidence — a much stronger claim than everyday usage of "theory" as speculation.


Section 16.7: Why Scientific Misinformation Persists

Cultural Cognition Theory

Dan Kahan, a Yale law professor who leads the Cultural Cognition Project, has produced research challenging the intuitive explanation for why scientific misinformation persists. The intuitive explanation is the "deficit model" — people believe false things about science because they lack information. The remedy would be better science communication.

Kahan's research directly challenges this. In a series of studies, he found that scientific literacy and numeracy do not reduce belief in false scientific claims; in some cases, they increase polarization. High-numeracy individuals who identify as politically conservative are more likely to reach incorrect conclusions about politically contested scientific topics than low-numeracy conservatives — because their greater cognitive capacity allows them to better rationalize preexisting views.

The explanation Kahan offers is "cultural cognition": people's factual beliefs about politically contested empirical questions are shaped by their cultural worldviews. Accepting the scientific consensus on climate change, for example, requires accepting policies (carbon regulation, government intervention) that threaten core commitments of some conservative cultural worldviews. The motivated reasoning is not conscious but is a consequence of identity-protective cognition.

This finding has profound implications for science communication. It suggests that simply providing more accurate information will not change minds of high-engagement deniers, because the problem is not information deficit but motivated identity protection.

Identity Threat and Motivated Skepticism

Kahan's work builds on a broader psychological literature on motivated reasoning. When information threatens a person's identity or social group membership, people engage in motivated skepticism — applying much higher scrutiny to evidence that conflicts with their group identity than to evidence that confirms it.

In domains where scientific consensus has become politically polarized — climate change, vaccine safety, GMO safety, evolution — motivated skepticism creates systematic divergence between scientific consensus and public belief that tracks political identity more than education level.

Research by Troy Campbell and Aaron Kay demonstrated an important asymmetry: people are more likely to reject scientific consensus when they dislike the policy implications than when the same evidence has neutral or favorable policy implications. In one study, conservatives were more likely to accept climate change evidence when the proposed solution was nuclear power (consistent with some conservative policy preferences) than carbon taxes.

This "solution aversion" suggests that framing effects — how the science is presented and what policy responses are associated with it — are as important as the scientific content itself.

The Social Dimension

Scientific misinformation is not primarily an individual cognitive failure but a social phenomenon. Beliefs that are normative within one's community tend to be adopted regardless of their relationship to evidence. Research by Gordon Pennycook and David Rand suggests that what they call "lazy thinking" — relying on intuition and social cues rather than deliberate analysis — is the primary driver of misinformation susceptibility.

The social enforcement of scientific misinformation is particularly powerful in closed communities. In communities where vaccine skepticism is normative, parents who vaccinate risk social ostracism; in communities where evolution is rejected, students who accept it face family conflict. The social costs of aligning with scientific consensus can exceed the benefits.


Section 16.8: Science Communication Best Practices

The Failure of the Deficit Model

The dominant assumption underlying much science communication has been the "deficit model": the public believes false things about science because they lack information (have a knowledge "deficit"), and the remedy is to provide them with accurate information. This model has driven museum exhibits, popular science books, outreach programs, and public science communication for decades.

Research consistently finds that the deficit model fails in politically contested domains. Providing more information:

  • Rarely changes the minds of those already committed to false beliefs
  • Can backfire (the "backfire effect") in some contexts, though this effect has proven more fragile in replication than originally claimed
  • Creates "implied truth" effects when some false claims are corrected and others are not
  • Fails to account for the social and identity dimensions of belief

The deficit model's failure does not mean information is irrelevant. For people with genuine information gaps — those who hold scientific misinformation not from identity reasons but from simple lack of exposure to the evidence — informational interventions can work. The problem is that this population is not the one most in need of intervention.

Framing and Narrative

Research by the FrameWorks Institute and others has documented that the frame in which scientific information is presented significantly affects acceptance. Key findings:

Emphasize shared values: Climate change acceptance among conservatives increases when framed around national security, economic opportunity, and technological innovation rather than environmentalism and regulation.

Use trusted messengers: Physicians are among the most trusted sources for vaccine information. Military leaders are trusted on climate. Farmers are trusted on GMO information. Peer communities are often more effective than external experts.

Narrative over statistics: Narrative case studies of real people affected by climate change, vaccine-preventable diseases, or GMO-related policy failures are more persuasive than statistical summaries.

The "fact + fallacy" approach: Research by Ullrich Ecker and colleagues finds that corrections are more effective when they follow the structure: present the accurate fact, explain the fallacy used in the false claim, then reinforce the accurate fact. This structure preempts motivated use of the false claim.

Inoculation Theory in Science Communication

The most promising development in science communication research is the application of inoculation theory — originally developed in social psychology to study resistance to persuasion — to building resilience against scientific misinformation.

Inoculation theory, associated with William McGuire's original work in the 1960s and more recently with Sander van der Linden and Jon Roozenbeek's application to misinformation, proposes that exposing people to weakened versions of misleading arguments before they encounter full-strength versions makes them resistant to those arguments. The mechanism is analogous to vaccination: a weakened exposure primes the cognitive "immune system."

Applied to science communication, inoculation involves:

1. Warning that manipulation attempts are coming
2. Providing a "microdose" of the misleading argument technique (not the specific false claim)
3. Explaining the manipulation technique being used
4. Providing the accurate information

This approach has been validated in multiple studies and deployed at scale. The "Bad News" game (now joined by "Harmony Square," "Go Viral," and related games) allows players to experience creating misinformation firsthand, providing inoculation against the techniques used. Research on "prebunking" YouTube advertisements based on inoculation principles showed significant reductions in susceptibility to misinformation manipulation techniques among the millions of users exposed.


Callout Boxes

PRIMARY SOURCE: The Wedge Document The Discovery Institute's internal "Wedge Document," leaked in 1999, stated: "Design theory promises to reverse the stifling dominance of the materialist worldview, and to replace it with a science consonant with Christian and theistic convictions." This statement proved crucial in the Kitzmiller trial because it directly contradicted the Discovery Institute's public claim that intelligent design was a purely scientific proposition with no religious content or purpose.

THE VAERS PROBLEM IN PRACTICE Between 2020 and 2022, anti-vaccine social media accounts shared thousands of posts presenting VAERS data as evidence that COVID-19 vaccines were causing widespread death. A common format: "VAERS shows X deaths after COVID vaccination — more than all other vaccines combined!" This is technically true and deeply misleading. VAERS numbers surged because 1) more vaccines were being given than ever before, 2) public awareness of VAERS led to higher reporting rates, and 3) the pandemic context meant many deaths occurred near the time of vaccination. VAERS data cannot be used to calculate vaccine-caused death rates without denominator data and baseline comparison — which consistently shows no excess mortality attributable to COVID-19 vaccines in healthy populations.
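The denominator problem described in the callout above can be made concrete with a back-of-the-envelope calculation. All figures below are invented round numbers chosen for illustration, not actual VAERS or mortality data; the point is only that a raw count of "deaths after vaccination" means nothing until it is compared against the background deaths expected in the same window with no vaccine effect at all.

```python
# Hypothetical illustration of the VAERS denominator problem.
# Every number here is an invented round figure for teaching purposes.

vaccinated = 200_000_000            # people vaccinated (hypothetical)
window_days = 42                    # post-vaccination window often cited in reports
annual_mortality = 0.009            # ~0.9% all-cause deaths/person/year (hypothetical)

# Expected deaths from ALL causes within the window, assuming the
# vaccine has zero effect on mortality:
expected_deaths = vaccinated * annual_mortality * (window_days / 365)
print(f"Expected background deaths in window: {expected_deaths:,.0f}")

# A raw VAERS-style count of reported deaths after vaccination (hypothetical):
reported_deaths = 15_000
print(f"Reported deaths: {reported_deaths:,}")

# The relevant comparison is reported vs. expected, not reported vs. zero.
# Here the reports are a small fraction of the deaths that would occur
# in that window anyway:
print(f"Reports as share of expected background deaths: "
      f"{reported_deaths / expected_deaths:.1%}")
```

With these hypothetical inputs, roughly 200,000 all-cause deaths would be expected among the vaccinated during the window by chance alone, so a seemingly alarming report count can sit well below the background rate. Real safety assessment requires exactly this kind of expected-versus-observed comparison, plus investigation of individual reports.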

THE 97% CONSENSUS: WHAT IT MEANS AND DOESN'T MEAN The 97% scientific consensus on human-caused climate change does not mean that 97% of all people who have ever studied science agree. It means that 97% of published peer-reviewed studies that express a position on the cause of recent warming attribute it primarily to human activity. The consensus figure has been reproduced in multiple independent analyses using different methodologies. Prominent "dissenting scientists" frequently do not publish research in relevant peer-reviewed journals; their dissent is expressed primarily in op-eds, think tank reports, and congressional testimony.
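The "among studies that express a position" qualifier in the callout above is itself a denominator choice, and it can be shown with simple arithmetic. The percentages below approximate those reported in the Cook et al. (2013) abstract-rating study and are rounded for illustration; treat them as indicative rather than exact.

```python
# Approximate, rounded figures in the spirit of Cook et al. (2013).
# Fractions of climate-paper abstracts by stated position on the
# cause of recent warming (illustrative, not exact):

no_position = 0.664   # abstract takes no position on attribution
endorse = 0.326       # endorses primarily human causation
reject = 0.007        # rejects human causation
uncertain = 0.003     # explicitly uncertain

# The consensus figure is computed over position-expressing abstracts only:
position_expressing = endorse + reject + uncertain
consensus = endorse / position_expressing
print(f"Consensus among position-expressing abstracts: {consensus:.1%}")

# Computed over ALL abstracts, the same data gives a much smaller number --
# which is how the figure is sometimes misleadingly reframed:
print(f"Endorsement share of all abstracts: {endorse:.1%}")
```

The same underlying data yields roughly 97% on one denominator and roughly 33% on the other, which is why stating the denominator explicitly matters when the consensus figure is cited or attacked.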


Key Terms

  • Manufactured doubt: Strategic creation of the appearance of scientific controversy about well-established findings, typically by industry actors with financial interests in preventing regulatory action.
  • Deficit model: The assumption that scientific misinformation persists because people lack information; challenged by cultural cognition research.
  • Cultural cognition: Dan Kahan's theory that factual beliefs about politically contested scientific questions are shaped by cultural worldview rather than simply by scientific literacy.
  • VAERS: Vaccine Adverse Event Reporting System; a passive surveillance database that captures reported events following vaccination but cannot establish causation.
  • Precautionary principle: The norm that in cases of uncertainty about serious risks, caution should guide decisions; frequently misapplied to GMOs.
  • Inoculation theory: The approach to building resilience against misinformation by exposing people to weakened versions of manipulation techniques before full-strength encounters.
  • Merchants of doubt: Oreskes and Conway's term for the network of scientists, PR firms, and organizations that manufactured doubt across multiple scientific controversies.
  • Irreducible complexity: Michael Behe's concept that some biological systems could not have evolved through natural selection; rejected by evolutionary biology.
  • Solution aversion: The tendency to reject scientific consensus when the associated policy solutions conflict with one's values.
  • Prebunking: Inoculation-based intervention that teaches manipulation techniques before exposure to specific false claims.

Discussion Questions

  1. The cultural cognition research suggests that providing more information to high-engagement climate change denialists may actually increase their resistance. If true, what follows for science communication strategy? Should scientists simply give up on persuading committed denialists and focus on different audiences?

  2. The Séralini affair involved a paper that was retracted for methodological inadequacy rather than fraud, and in which accusations of industry bias had some factual basis: a former Monsanto scientist had joined the retracting journal's editorial board shortly before the retraction. How should science communicators handle retracted research whose retraction is contested?

  3. Oreskes and Conway document that the same small network of scientists and PR firms worked on tobacco, acid rain, ozone, and climate denial. What does this pattern tell us about the relationship between scientific expertise and corporate interests? Are there structural reforms that might prevent this?

  4. VAERS was designed as a sensitive signal-detection system, not a causation database. Given that this design has been systematically exploited for misinformation, should VAERS be redesigned? What would the public health cost of making VAERS less accessible be?

  5. Evolution denial in the US is closely tied to religious identity for many people. How should science educators and communicators approach evolution for audiences for whom accepting it poses an identity threat? Is "inoculation" appropriate in this context?

  6. The precautionary principle is a legitimate scientific and policy norm. How can science communicators support appropriate precautionary reasoning about genuine risks while resisting its misapplication to manufactured risks?


Summary

Scientific misinformation about climate change, vaccines, GMOs, and evolution shares common structural features: it exploits legitimate frontier uncertainty to manufacture doubt about established findings; it deploys motivated skepticism amplified by identity threat; and it resists correction through additional information alone.

The "Merchants of Doubt" framework reveals that industry-funded science denial is not a collection of independent controversies but a coordinated strategy reusing the same actors, techniques, and PR approaches across multiple issues over decades. Understanding this history is essential for recognizing similar operations when they occur.

Cultural cognition research challenges the deficit model's assumption that information provision solves scientific misinformation. The more promising approaches — inoculation, trusted messengers, value-consistent framing — address the social and identity dimensions of belief rather than assuming that facts speak for themselves.

Chapter 17 extends this analysis to health misinformation more broadly, including alternative medicine, nutritional pseudoscience, and the COVID-19 infodemic.