Chapter 33: Exercises

Misinformation and Engagement Optimization: The Epistemic Crisis


Section A: Comprehension and Analysis

Exercise 1 [Definitional] Define misinformation, disinformation, and malinformation in your own words. For each category, provide an original example not used in the chapter and explain how your example fits the category's defining characteristics.

Exercise 2 [Analysis] The Vosoughi et al. (2018) study found that bots amplified both true and false news at roughly equal rates, and that the differential spread of false news was driven primarily by humans. Write a 300-word analysis explaining why this finding matters for policy design. What types of interventions does this finding argue against, and what types does it support?

Exercise 3 [Critical Reading] The chapter states that the engagement-optimization algorithm amplifies misinformation "not through any deliberate intent to spread falsehood, but through the indifferent pursuit of engagement." Write a 250-word response arguing either: (a) the absence of intent substantially reduces platform moral responsibility, or (b) the absence of intent does not substantially reduce platform moral responsibility. Use at least one analogical argument.

Exercise 4 [Application] Identify a piece of misinformation that has circulated on social media in the past year (you may find examples through news coverage of misinformation events). Apply the Vosoughi et al. framework to explain why this particular content likely spread: how was it novel? What emotions did it generate? How did it compare to accurate information on the same topic in terms of these dimensions?

Exercise 5 [Conceptual] Explain the "liar's dividend" in your own words. Then identify one historical example (from before the deepfake era) where a comparable dynamic operated — where uncertainty about the authenticity of evidence was exploited to avoid accountability. What does this historical precedent suggest about whether the liar's dividend is a new problem or a new manifestation of an old one?

Exercise 6 [Comparative] Compare and contrast the mechanisms by which misinformation spreads on a feed-based platform (like Facebook or Instagram) versus a search-and-recommendation platform (like YouTube). How do the different architectures create different misinformation dynamics?

Exercise 7 [Evidence Evaluation] A researcher claims: "Platform fact-checking labels don't work because they have no effect on user beliefs." A second researcher claims: "Platform fact-checking labels do work because they reduce sharing of labeled content." Are these claims contradictory? Write a 200-word response that reconciles them or explains why they cannot be reconciled. What would "working" mean for a fact-checking label?


Section B: Research and Investigation

Exercise 8 [Research] Using the Internet Archive's Wayback Machine or news archives, research the history of a specific anti-vaccination narrative that predates social media (e.g., the Andrew Wakefield MMR-autism paper from 1998). Write a 400-word analysis of how the narrative spread differently before and after social media amplification. What does this comparison tell you about the role of social media as an amplifier versus an originator of misinformation?

Exercise 9 [Data Collection] Choose a specific health topic currently generating social media discussion (e.g., a new supplement, treatment, or health concern). Search for content about this topic on at least two social media platforms. Document: (a) the accuracy of the content you find, (b) the engagement metrics on accurate versus inaccurate content, (c) what the recommendation system suggests you watch or read next. Write a 350-word report on your findings.
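For part (b) of Exercise 9, a small script can help you summarize the engagement data you collect by hand. The sketch below (plain Python, with purely illustrative numbers — replace them with your own records and accuracy judgments) compares median likes and shares across the two groups; medians are used because a single viral post can distort averages.

```python
from statistics import median

# Hypothetical hand-collected records: each post you documented, with your
# accuracy judgment and its like/share counts at the time of collection.
posts = [
    {"accurate": True,  "likes": 120, "shares": 14},
    {"accurate": True,  "likes": 85,  "shares": 9},
    {"accurate": False, "likes": 960, "shares": 210},
    {"accurate": False, "likes": 430, "shares": 75},
]

def engagement_summary(posts):
    """Median likes and shares, split by accuracy judgment."""
    summary = {}
    for label in (True, False):
        group = [p for p in posts if p["accurate"] == label]
        summary["accurate" if label else "inaccurate"] = {
            "median_likes": median(p["likes"] for p in group),
            "median_shares": median(p["shares"] for p in group),
        }
    return summary

print(engagement_summary(posts))
```

Nothing here depends on platform APIs; the point is simply to make the accurate-versus-inaccurate comparison in your report explicit and reproducible.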

Exercise 10 [Field Research] Identify a community in your region (a neighborhood, small town, or county) that has experienced a significant reduction in local journalism coverage in the past decade — whether through a newspaper closure, major staff cuts, or the loss of a local TV station. Research what information sources community members now rely on. Write a 500-word analysis of the relationship between the local journalism collapse and information ecosystem quality in that community.

Exercise 11 [Media Analysis] Find three examples of media coverage of misinformation events from the past three years. Analyze how each news outlet describes the relationship between the platform and the misinformation: does the coverage frame the platform as responsible? As a passive conduit? As an active amplifier? Write a 300-word comparison of the framings.

Exercise 12 [Platform Investigation] Using AlgoTransparency or similar third-party tools that analyze platform recommendations, investigate what a hypothetical new user interested in a specific topic (e.g., natural health, political content of your choice) would be recommended after initial engagement. Document the recommendation pathway over at least five steps. Write a 400-word analysis of what you observe.


Section C: Design and Application

Exercise 13 [Design Challenge] Design a prebunking intervention for one specific misinformation narrative currently circulating in your community or social networks. Your design should: (a) identify the specific techniques used in the misinformation, (b) design content that inoculates against those techniques, (c) consider how to make the content engaging enough to spread through the same algorithmic channels as the misinformation. Present your design in 400 words with a brief prototype (sketch, script, or outline).

Exercise 14 [Policy Design] You are advising a mid-sized social media platform on its misinformation policy. The platform has a budget that allows for either (a) hiring 50 additional content moderators to manually review flagged content, (b) implementing a fact-checking partnership with three established organizations, or (c) investing in engineering to develop a "friction" feature that adds reading prompts before sharing. Based on the evidence in this chapter, which would you recommend and why? Write a 400-word policy memo.

Exercise 15 [Scenario Analysis] A public health department wants to counter COVID-19 booster shot misinformation spreading through community Facebook groups. They have no relationship with the platform and cannot require enforcement action. Design a multi-component counter-misinformation strategy using only tools available to them: their own content creation, community partnerships, media outreach, etc. What does the research suggest would be most effective, given their constraints?

Exercise 16 [UX Design] Mock up (sketch, wireframe, or describe in detail) a redesigned social media sharing interface that incorporates friction-based interventions based on the evidence in this chapter. Your design should address: accuracy nudges, reading prompts, emotional temperature indicators, and/or delay mechanisms. Write a 300-word design rationale explaining what each element is intended to accomplish and what evidence supports it.


Section D: Perspective-Taking and Ethics

Exercise 17 [Perspective-Taking] Write a 400-word first-person account from the perspective of a sincere COVID-19 vaccine skeptic who believes they are sharing accurate health information that mainstream institutions are suppressing. How does this person evaluate the information they encounter? What makes them trust the sources they trust? This exercise requires you to take the perspective seriously without endorsing it.

Exercise 18 [Stakeholder Analysis] Identify all major stakeholders in the vaccine misinformation ecosystem: platforms, advertisers, creators of misinformation content, amplifiers/sharers, people who become vaccine hesitant, people who contract vaccine-preventable diseases, public health systems, etc. Map the incentives of each stakeholder. Where do incentive misalignments create structural openings for misinformation to spread?

Exercise 19 [Ethical Analysis] A platform discovers that one of its highest-revenue creators is responsible for a significant volume of health misinformation. The creator's content has not violated any existing platform policies, but a public health analysis estimates it has contributed to measurable reductions in vaccination rates. Write a 400-word ethical analysis of the platform's options: creating new policy retroactively applicable to the creator, removing the creator despite no policy violation, doing nothing, or other alternatives. What ethical frameworks support each option?

Exercise 20 [Counter-Arguments] The chapter argues that platforms' engagement-optimization systems structurally amplify misinformation. Write the strongest possible counter-argument to this claim: what evidence would an advocate for platforms emphasize? What limitations of the research would they highlight? After presenting the strongest counter-argument, evaluate it: what does it establish, and what does it leave unaddressed?


Section E: Extended Projects

Exercise 21 [Extended Research Paper] Research and write a 1,500-word paper on the relationship between misinformation and one specific public health outcome (e.g., measles vaccination rates, COVID-19 mortality, opioid treatment uptake). Your paper should: (a) document the specific misinformation narratives relevant to your topic, (b) examine evidence for their spread via social media, (c) assess the evidence connecting social media misinformation to the health outcome, (d) evaluate platform responses, and (e) propose structural interventions.

Exercise 22 [Comparative Analysis] Write a 1,200-word comparative analysis of misinformation dynamics during two different political events: one from the pre-social-media era (before 2004) and one from the social media era. How did false information spread in each case? What institutions countered it? What determined whether correction was effective? What does the comparison reveal about what is structurally new about social media misinformation?

Exercise 23 [Policy Brief] Write a 1,000-word policy brief for a national legislature considering platform misinformation legislation. The brief should: (a) explain the structural problem in accessible terms, (b) review the evidence on existing interventions, (c) recommend two to three specific regulatory requirements with evidence-based rationale, and (d) address likely objections (free speech, overreach, competitive disadvantage).

Exercise 24 [Experimental Design] Design a randomized controlled trial to test the effectiveness of one specific misinformation intervention — of your choice — that has not yet been tested in the literature. Your design should specify: the research question, the intervention, the control condition, the sample, the outcome measures, the timeline, and the primary ethical considerations. Present your design in 800 words.
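When specifying the sample for Exercise 24, you will need a rough power calculation. If your primary outcome is a proportion (for example, the share of participants who repost a false item), the standard two-proportion formula gives a per-group sample size. The sketch below implements it in plain Python; the 30% and 25% sharing rates are hypothetical placeholders, not figures from the literature.

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided two-proportion z-test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the test
    z_beta = z.inv_cdf(power)            # quantile for the desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (p1 - p2) ** 2
    return ceil(n)

# Hypothetical effect size: the intervention cuts sharing of false posts
# from 30% of participants to 25%.
print(sample_size_two_proportions(0.30, 0.25))
```

Note how sensitive the result is to the assumed effect: halving the expected difference roughly quadruples the required sample, which is worth addressing explicitly in your design's feasibility discussion.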

Exercise 25 [Synthesis Essay] Write a 1,000-word synthesis essay arguing for or against the following claim: "The misinformation crisis is primarily a technological problem requiring technological solutions." Your essay should engage with evidence from this chapter and demonstrate understanding of both technological and non-technological dimensions of the problem.


Section F: Practical Media Literacy

Exercise 26 [Skill Building] Practice lateral reading: take three social media posts that make specific factual claims (find real examples from your own feed or through a media literacy resource). For each, open multiple browser tabs to search for corroborating or contradicting information from independent sources before engaging with the original content. Write a 200-word reflection on how this process affected your assessment of each claim.

Exercise 27 [Source Evaluation] Evaluate five social media accounts that regularly share health information. For each, assess: (a) the credentials of the account holder, (b) the sources cited for claims, (c) whether claims can be verified through independent sources, (d) whether the account has been fact-checked and found inaccurate, (e) what the recommendation algorithm suggests to users who follow this account. Write a 300-word comparative evaluation.

Exercise 28 [Personal Audit] Conduct a personal media diet audit. For one week, track every source from which you receive news or information about current events. Note: how many were social media posts? News websites? Podcasts? Personal conversations? For social media sources, assess what percentage were algorithmically recommended versus sought out directly. Write a 300-word reflection on what the audit reveals about your own information ecosystem.

Exercise 29 [Correction Practice] Find three examples of misinformation you have personally shared, liked, or commented on in the past year (or, if you cannot identify personal examples, use three cases of public figures who publicly shared misinformation and later corrected it). For each, assess: what made the content believable? What should have triggered skepticism? What would have been needed to identify it as false before sharing? Write a 300-word reflective analysis.

Exercise 30 [Community Engagement] Design and execute a 30-minute media literacy workshop for a community group (family members, neighbors, classmates, community organization). Focus on one specific misinformation risk relevant to your community. After running the workshop, write a 400-word reflection: what worked, what didn't, and how did participants respond to learning about algorithmic misinformation amplification?


Section G: Synthesis and Advanced Analysis

Exercise 31 [Systems Analysis] Map the misinformation ecosystem as a system: identify the actors (creators, amplifiers, platforms, fact-checkers, journalists, regulators), the relationships between them, the feedback loops that reinforce misinformation spread, and the points where interventions could disrupt those feedback loops. Present your map visually and write a 400-word explanation of the system dynamics.

Exercise 32 [Historical Analogy] The chapter notes that misinformation, propaganda, and false information have existed throughout human history. Research one pre-digital information crisis (e.g., yellow journalism and the Spanish-American War, the "War of the Worlds" radio broadcast, Cold War propaganda campaigns). Write a 600-word analysis of similarities and differences with contemporary social media misinformation, focusing on: scale, speed, structural amplification, and institutional response.

Exercise 33 [Cross-Chapter Synthesis] The engagement-optimization mechanisms described in Part II of this book (variable reward schedules, social validation, infinite scroll) are also implicated in misinformation spread. Write a 500-word analysis connecting what you learned about behavioral design in those chapters to what you have learned about misinformation in this chapter. How do the psychological vulnerabilities exploited by dark patterns relate to the psychological vulnerabilities exploited by misinformation?

Exercise 34 [Legal Analysis] Section 230 of the Communications Decency Act currently provides platforms with broad immunity for user-generated content, including misinformation. Research the current legislative debate around Section 230 reform. Write a 600-word analysis of: (a) the current protection Section 230 provides, (b) specific reform proposals that have been advanced, (c) the arguments for and against each reform, and (d) which reforms, if any, you find most likely to address misinformation without creating unacceptable costs to free expression.

Exercise 35 [Integrative Project] Develop a comprehensive anti-misinformation strategy for a specific institution — a university, a hospital system, a public library, a faith community, or another institution of your choice. Your strategy should address: (a) internal communication practices that reduce misinformation risk, (b) education programs for community members, (c) relationships with credible information sources, and (d) protocols for responding when members of your community encounter or spread misinformation. Present your strategy in 1,000 words.