Chapter 32: Exercises — Political Polarization and Algorithmic Amplification
Reflection Exercises
Exercise 1 [Reflection] Track your own political content consumption on social media for one week. Keep a log of political posts you encounter, noting: the platform, the political direction of the content, whether it generated positive or negative affect in you, and whether you engaged with it (liked, shared, commented). At the end of the week, analyze the log. Does your social media political diet reflect genuine diversity of perspective, or does it tilt in a particular direction? Did any of the content change your mind, or did it primarily confirm existing views?
Exercise 2 [Reflection] Think about your political views on a specific contested policy issue—immigration, healthcare, climate change, tax policy, or any other you care about. How did you form those views? What sources of information were most influential? How much of your political information consumption occurs through social media versus other sources (television, print journalism, conversations, books)? Do you think your information environment has made you more or less understanding of people who hold different views?
Exercise 3 [Reflection] The chapter describes affective polarization as mutual hostility and distrust between political groups. Honestly assess your own affective polarization: How do you feel about people who vote for the party opposite to your own preferences? Do you consider them uninformed, stupid, or morally wrong, or do you see them as people with genuinely different values who have arrived at different conclusions? Has this changed over the course of your lifetime?
Exercise 4 [Reflection] Have you ever changed your mind about a significant political question? If so, what prompted the change? Was it a social media encounter, a personal conversation, a book or documentary, a personal experience, or something else? What does your answer suggest about which channels of political information are most effective for genuinely shifting beliefs?
Exercise 5 [Reflection] The chapter describes the social media incentive structure as rewarding outrage over nuance. Reflect on your own social media posting behavior: Have you ever posted something more extreme or provocative than you actually believe because you knew it would generate more engagement? Have you ever softened or added nuance to a post even though you knew the more extreme version would generate more engagement, because the extreme version felt dishonest? What does this reveal about how platform incentives shape political communication?
Exercise 6 [Reflection] Consider a social or political issue on which you have a strongly held view. Now try to articulate the most charitable version of the opposing position—the strongest case that could be made by a thoughtful person who disagrees with you. How difficult is this exercise? How well does your social media feed prepare you for it?
Exercise 7 [Reflection] Imagine you are a news editor in a social media environment where clicks and shares determine whether your journalism reaches readers. What pressures would you face? Would your journalism be more or less likely to be nuanced, measured, and accurate under these conditions compared to a traditional subscription model? How does this thought experiment illuminate the structural pressures on political information quality?
Research Exercises
Exercise 8 [Research] Find and read Bail et al.'s 2018 paper "Exposure to opposing views on social media can increase political polarization" (Proceedings of the National Academy of Sciences). What was the exact research design? What did the liberal and conservative participants see? What were the main findings for each group? Why did exposure to opposing views increase rather than decrease polarization? How do the authors interpret their findings?
Exercise 9 [Research] Research the Pew Research Center's Political Polarization survey series. Find the data on "very unfavorable" views of the opposing party from 1994 to the most recent available survey year. Create a timeline chart of the trend. What is the overall trajectory? Are there particular periods of especially rapid increase? Does the timeline support the hypothesis that social media caused the trend, or does it suggest other explanations?
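Before charting the Pew data, it can help to tabulate it and identify the period of fastest change programmatically. A minimal sketch, using placeholder percentages that you must replace with the actual figures from the Pew reports:

```python
# Sketch for Exercise 9: finding the period of fastest growth in
# "very unfavorable" views of the opposing party.
# The percentages below are PLACEHOLDERS, not real Pew figures --
# substitute the actual survey values before drawing any conclusions.
pew_unfavorable = {
    1994: 18,  # year -> % with "very unfavorable" view (placeholder)
    2004: 25,
    2014: 38,
    2022: 55,
}

def steepest_period(series):
    """Return the pair of consecutive survey years with the largest
    increase in percentage points per year, plus that rate."""
    years = sorted(series)
    best, best_rate = None, float("-inf")
    for start, end in zip(years, years[1:]):
        rate = (series[end] - series[start]) / (end - start)
        if rate > best_rate:
            best_rate, best = rate, (start, end)
    return best, best_rate

period, rate = steepest_period(pew_unfavorable)
print(f"Fastest increase: {period[0]}-{period[1]} "
      f"({rate:.2f} points per year)")
```

With real data, comparing the steepest period against the adoption timeline of social media (roughly 2006 onward) speaks directly to the exercise's causal question.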
Exercise 10 [Research] Find and read the 2018 UN Fact-Finding Mission report on Myanmar (or the summary document). What specific findings did the report make about Facebook's role in the Rohingya genocide? What recommendations did the mission make to Facebook? What did Facebook do in response? Based on the report, what changes in Facebook's operations would have been necessary to prevent the harm?
Exercise 11 [Research] Research Boxell, Gentzkow, and Shapiro's 2017 paper on internet use and political polarization. What was their main finding about which demographic groups showed the largest polarization increases? What do they propose as an explanation? What implications does their finding have for attributing polarization primarily to social media?
Exercise 12 [Research] Find and examine Facebook's "Civic Integrity Policy" and its evolution from 2020 to the present. What specific policies did Facebook implement before and during the 2020 election? How have these policies changed since the election? What was the stated rationale for each change?
Exercise 13 [Research] Research the 2023 Science studies on algorithmic curation and political polarization during the 2020 U.S. election, published as part of the Facebook and Instagram Election Study (Guess et al.). What were the experimental conditions, and how did the chronological feed differ from the default algorithmic feed? What did the researchers find about polarization and attitudes? Why did the authors interpret the results as evidence against the filter bubble theory?
Exercise 14 [Research] Research WhatsApp's lynching problem in India (2017-2019). Find at least three independent news reports or academic papers documenting specific incidents. What were the circumstances of each incident? What content spread through WhatsApp groups to incite the violence? What did WhatsApp do in response? What has research found about the effectiveness of WhatsApp's forwarding limits?
Analysis Exercises
Exercise 15 [Analysis] Apply the affective vs. ideological polarization distinction to the following scenarios and determine which type of polarization each primarily illustrates: (a) Democrats and Republicans have diverging views on whether climate change is a serious threat (50% of Republicans vs. 85% of Democrats in a survey saying yes); (b) Democratic and Republican voters are more likely than ever before to say they would be unhappy if their child married someone from the other party; (c) The median Democrat in the House of Representatives has moved left and the median Republican has moved right over 30 years; (d) 70% of Republicans and 70% of Democrats agree that $15 minimum wage is a reasonable policy, but each party thinks the other party opposes it. For each, what does the type of polarization suggest about possible interventions?
Exercise 16 [Analysis] Facebook's "angry emoji" weighting decision is documented in internal research as having amplified divisive political content. Analyze this decision using the "intent vs. effect" framework from the book. What was the likely intent behind the weighting decision? What were the mechanisms connecting the decision to the harmful effect? What information did Facebook have, or should have had, about the likely effects? Does the unintentionality of the harmful effect reduce Facebook's moral or legal responsibility for it?
Exercise 17 [Analysis] The chapter presents three different mechanisms by which social media might contribute to polarization: (1) filter bubbles reducing cross-cutting exposure, (2) algorithmic amplification of outrage-generating content, and (3) counterproductive effects of encountering opposing views in a hostile social media environment (Bail et al.). For each mechanism, assess: What type of evidence would best test this mechanism? Does the evidence cited in the chapter adequately test it? Which mechanism do you think is best-supported and most significant?
Exercise 18 [Analysis] Compare the Myanmar case to the India WhatsApp lynching cases along the following dimensions: the nature of the harmful content, the platform involved, the role of the algorithm, the role of human actors deliberately spreading content, the speed of escalation from online content to offline violence, the platform's prior knowledge of the risk, and the platform's response. What do the similarities suggest about the common enabling conditions for social media-facilitated political violence?
Exercise 19 [Analysis] Meta's decision to reduce algorithmic amplification of political content in 2022-2023 can be interpreted in multiple ways: as a genuine attempt to reduce polarization, as a commercially motivated decision to avoid regulatory risk, as a de facto political choice (since reducing political content may benefit incumbents over challengers), or as a structural response to user preferences. Develop the strongest version of each interpretation. Which do you find most convincing, and why?
Exercise 20 [Analysis] The chapter notes that researchers have found that polarization increased more among groups with low internet use (older Americans) than among high-internet-use groups. Design a study that would test whether this finding is evidence that social media is not the primary driver of polarization, or whether it might be explained by other factors (for example, that older Americans are heavy cable news consumers). What controls would you need? What would the strongest confound be?
Exercise 21 [Analysis] Evaluate the claim that social media companies bear significant responsibility for political polarization. Consider: What specific choices did companies make that contributed to polarization? What information did they have about these choices' effects? Were the choices commercially motivated? Were alternative choices available? How does the answer to each of these questions affect your assessment of responsibility? How should that responsibility be allocated between companies, users, political elites, and structural factors?
Creative Exercises
Exercise 22 [Creative] Design an alternative Facebook News Feed algorithm specifically optimized for democratic deliberation rather than engagement. What signals would it use? What content would it prioritize? What would it deprioritize? What trade-offs would this involve for user experience and platform revenue? Present your design as a product specification with specific algorithmic changes and their rationale.
Exercise 23 [Creative] Write a 1,000-word first-person account from the perspective of a social media user who has undergone the radicalization pathway documented in research—someone who started with mainstream political content and was algorithmically directed toward increasingly extreme positions. The account should be sympathetic, non-caricatured, and psychologically realistic. Include specific platform features and content types that would have been encountered along the way.
Exercise 24 [Creative] You are the chief policy officer at a social media company preparing for a major national election in six months. Write a 1,500-word memo to the executive team outlining the specific risks the company faces, the policy measures you recommend, the expected costs of those measures (in revenue, user experience, and employee relations), and the expected benefits. What are the most difficult trade-offs you face, and how do you recommend resolving them?
Exercise 25 [Creative] Imagine it is 2015 and you are a researcher at Facebook who has just analyzed internal data showing that the angry emoji weighting decision is amplifying divisive political content. Write the internal memo you would send to product leadership. What evidence would you present? What changes would you recommend? How would you anticipate and respond to the commercial objections your recommendation would face?
Exercise 26 [Creative] Write a comparative essay (1,500-2,000 words) comparing the social conditions that made the Myanmar genocide possible to a current or emerging situation elsewhere in the world where social media platforms are operating in environments with weak institutions, intense ethnic or religious conflict, and inadequate content moderation infrastructure. What warning signs should platform companies be monitoring? What preventive actions should they take?
Group Exercises
Exercise 27 [Group] "Red team / Blue team" exercise: Divide the group into two teams. Both teams receive the same set of political news headlines from a fictional country with a contentious election. Team A is told they are members of the ruling party. Team B is told they are members of the opposition. Each team discusses and interprets the headlines. Then compare interpretations. What divergences emerged from identical information, based solely on partisan framing? What does this exercise suggest about the limits of information as a solution to polarization?
Exercise 28 [Group] "Filter bubble audit": Each participant identifies three political news stories they have seen on social media in the past week. In groups, discuss: Did different participants see different stories? Did the same story appear differently in different feeds? What sources did the stories come from? What emotions did they evoke? Collectively map the group's social media political information environment. How diverse is it? Who is likely missing?
Exercise 29 [Group] Mock congressional hearing: Assign roles — a senator questioning a social media CEO about the company's role in political polarization, the CEO defending the company's record, a researcher presenting academic evidence, a civil society advocate presenting harms, and a communications expert. Role-play the hearing, then debrief: What questions were hardest to answer? What responses were most effective? What does the exercise reveal about the challenges of platform accountability?
Exercise 30 [Group] "What would we need to know?" exercise: As a group, identify the three most important empirical questions about social media and political polarization that the chapter cannot definitively answer. For each question, design the research study that would best answer it, including: study design, sample, what you would measure, ethical challenges, and what results would tell you. Which of the three questions is most tractable, and which is most important?
Exercise 31 [Group] International comparison exercise: Each small group is assigned one country (Brazil, India, Myanmar, US, or another country with documented social media and political violence connections). Groups research the specific case and present: the political context, the social media platforms involved, the nature of the harm, the platform's response, and the remaining unresolved issues. After all presentations, identify patterns across cases: What enabling conditions appear consistently? What platform responses were effective?
Exercise 32 [Group] "Deliberation by design" exercise: Imagine you are tasked with redesigning one social media platform specifically to support democratic deliberation. In groups, develop: the platform's core design principles, three specific feature decisions, two specific algorithmic design choices, one content moderation policy, and one transparency mechanism. Present your design and have other groups critique it. What would be most challenging to implement? What might go wrong?
Exercise 33 [Group] "Structural vs. cultural" policy debate: Divide into two groups. Group A argues that the primary interventions for addressing political polarization should target structural factors (economic inequality, geographic sorting, electoral reform). Group B argues that cultural and media environment interventions (including social media regulation) should be the primary focus. Each group presents for 10 minutes, then engages in debate. What would a policy approach look like that genuinely integrated both sets of interventions?
Applied and Extended Exercises
Exercise 34 [Applied] Conduct a "cross-cutting exposure experiment": For one week, deliberately seek out and read political commentary from sources on the opposite side of the political spectrum from your usual consumption. Keep a daily journal noting: what you read, your emotional reactions, whether you found anything persuasive, whether you found anything surprising, and whether the experience changed any of your views. Write a reflection at the end of the week analyzing your experience in light of the Bail et al. finding that cross-cutting exposure can harden polarization.
Exercise 35 [Applied] Analyze the political content on your most-used social media platform over a one-week period. Create a structured log tracking: the political direction of content that appears in your feed (without seeking it), the emotional register of that content (calm/analytical vs. emotionally charged), the source type (mainstream news, partisan outlet, individual commentator), whether the content is primarily positive/constructive or negative/attacking, and your engagement with it. Summarize your findings. Does your platform's political content environment match the characteristics described in the chapter? What platform-specific features might explain any differences from the general pattern?
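For the structured log in this exercise, a small script can keep the coding consistent across the week and tally the results at the end. A minimal sketch; the field names, categories, and sample entries are illustrative assumptions, to be replaced with your own observations:

```python
from collections import Counter

# Illustrative log entries for Exercise 35 -- the field names and values
# are assumptions for demonstration, not data from any real feed.
log = [
    {"direction": "left", "register": "charged",
     "source": "partisan outlet", "tone": "attacking", "engaged": True},
    {"direction": "right", "register": "calm",
     "source": "mainstream news", "tone": "constructive", "engaged": False},
    {"direction": "left", "register": "charged",
     "source": "individual commentator", "tone": "attacking", "engaged": True},
]

def summarize(entries):
    """Tally each coded dimension so the week's feed can be characterized
    (e.g., how much content was emotionally charged vs. calm)."""
    return {
        field: Counter(entry[field] for entry in entries)
        for field in ("direction", "register", "source", "tone")
    }

summary = summarize(log)
for field, counts in summary.items():
    print(field, dict(counts))
```

The resulting tallies map directly onto the questions in the exercise: the "register" and "tone" counts test whether your feed matches the outrage-rewarding pattern the chapter describes, and the "source" counts show where that content originates.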