Chapter 17 Quiz: Algorithms, the Attention Economy, and Filter Bubbles


Instructions: Answer all ten questions. For multiple-choice questions, select the single best answer. For short-answer questions, aim for 2–4 sentences; the closing essay question calls for a longer response. Total point value: 84 points.


Questions 1–4: Multiple Choice (7 points each)

Question 1

Herbert Simon's foundational 1971 insight about information abundance holds that:

A) More information always produces better-informed citizens.
B) In a world of information abundance, the scarce resource is attention, not information.
C) The printing press was the primary driver of political propaganda.
D) Advertising-supported media is inherently unreliable.


Question 2

The "feedback loop" in recommendation algorithms describes:

A) The process by which users consciously choose to follow accounts with similar views.
B) The mechanism by which engagement with content signals interest, producing more similar content, reinforcing the signal, and escalating toward more engaging content.
C) The practice of social media companies sharing user data with political campaigns.
D) The cycle by which propaganda is created, distributed, and subsequently debunked.


Question 3

Ribeiro et al.'s (2019) study of YouTube's recommendation algorithm found:

A) That YouTube's algorithm randomly recommended content from across the political spectrum.
B) That explicit extremist content was never recommended to users who started with mainstream content.
C) Systematic pathways from mainstream political content through the "alternative influence network" to explicitly extremist content, driven by engagement optimization.
D) That YouTube's 2019 policy changes fully resolved the radicalization pathway problem.


Question 4

The key finding of Bail et al. (2018) — "Exposure to Opposing Views on Social Media Can Increase Political Polarization" — was:

A) That people who were shown more diverse content immediately adopted more moderate views.
B) That Republicans who followed a liberal bot became more conservative, contrary to the prediction that cross-cutting exposure reduces polarization.
C) That Twitter's algorithm was more prone to radicalization than YouTube's.
D) That filter bubbles had no measurable effect on political attitudes.


Questions 5–7: Multiple Choice/Short Answer (10 points each)

Question 5

Distinguish between a filter bubble and an echo chamber. Your answer should specify: (a) which is algorithmically created and which is socially maintained; (b) what the empirical evidence says about the relative size of each effect; and (c) why the distinction matters for propaganda analysis.


Question 6

The Frances Haugen disclosures revealed that Facebook's internal research had found that the "angry" reaction generates approximately how much more algorithmic distribution than the "like" reaction?

A) Twice as much
B) Three times as much
C) Five times as much
D) Ten times as much

Why is this finding significant for understanding the relationship between Facebook's design choices and the amplification of outrage-generating political content?


Question 7

What were Cambridge Analytica's claimed capabilities versus their documented activities? Why does this distinction matter, and what is the "dark ad" problem that makes even conventional behavioral micro-targeting analytically significant?


Questions 8–10: Short Answer/Essay (varies)

Question 8 (8 points)

What does Tim Wu's The Attention Merchants argue about the history of advertising-supported media, and how does his argument explain why the attention economy of the internet systematically advantages propaganda?


Question 9 (8 points)

The EU Digital Services Act represents one of three major regulatory approaches to algorithmic design discussed in this chapter. Briefly describe the DSA approach (what it requires, what it does not require) and identify one significant strength and one significant weakness of this approach as a response to algorithmic amplification of disinformation.


Question 10 (10 points — Essay)

The chapter argues that engagement optimization systematically advantages propaganda over accurate information. Using at least three specific pieces of evidence from this chapter (research findings, documented cases, or internal documents), develop the argument that the attention economy creates structural conditions favorable to propaganda — and then identify the strongest objection to this argument and respond to it.

Your answer should demonstrate understanding of the structural analysis (not just list examples), engage seriously with the objection, and reach a clear conclusion.


Answer Key

Question 1: B — Simon's foundational insight was precisely that information abundance shifts the bottleneck from information scarcity to attention scarcity.

Question 2: B — The feedback loop is the core mechanism: engagement signals interest, which produces more similar content, reinforcing the signal and escalating toward progressively more engaging content.
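The dynamic can be made concrete with a minimal toy simulation. This sketch is purely illustrative: the intensity scale, the similarity window, and the assumption that the user engages with whatever is served are simplifying assumptions, not a description of any real platform's ranking system.

    # Toy model of an engagement-driven feedback loop (illustrative only).
    # Items are graded by emotional intensity: 0.0 = mild, 1.0 = maximally provocative.
    catalog = [round(i / 10, 1) for i in range(11)]

    def recommend(interest):
        """Serve the most intense item that is still 'similar enough' to the
        user's current interest signal: similarity keeps the user engaged,
        intensity maximizes predicted engagement."""
        candidates = [x for x in catalog if abs(x - interest) <= 0.15]
        return max(candidates)

    interest = 0.2  # the user starts with fairly mild preferences
    for step in range(1, 9):
        served = recommend(interest)
        # The user engages with what is served; that engagement becomes the
        # new interest signal, which the recommender then optimizes against.
        interest = served
        print(f"step {step}: served intensity {served:.1f}")

Run as written, the served intensity climbs from 0.3 to 1.0 over eight steps: each round of engagement shifts the interest signal, and the recommender then optimizes against the shifted signal.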

Question 3: C — Ribeiro et al. documented systematic pathways, driven by engagement optimization, from mainstream to extremist content. The 2019 changes were partial and addressed specific channels without resolving the underlying structural incentive.

Question 4: B — The counterintuitive finding was that Republican participants became more conservative after following a liberal bot — cross-cutting exposure without relational context triggers defensive identity protection, not moderation.

Question 5: Full credit requires: (a) filter bubble = algorithmically created; echo chamber = socially maintained / human choice-driven; (b) empirical research (Bakshy et al. 2015) found filter bubble effects are real but smaller than Pariser claimed, and human choice is a larger driver; (c) the distinction matters because the remedies are different — algorithmic redesign addresses filter bubbles but may not address echo chambers, and the Bail et al. finding suggests that algorithmic cross-cutting exposure may backfire.

Question 6: C — Five times as much. Significance: Facebook's engagement optimization directly rewarded content that produced anger reactions — an amplification premium for outrage-generating content that is characteristic of propaganda. This is a designed structural advantage for the most emotionally provocative content, not a neutral distribution system.
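A small sketch shows how a reaction-weighted scoring rule produces this premium. The only figure taken from the chapter is the five-to-one angry-to-like ratio; the scoring function, the posts, and the reaction counts are invented for illustration.

    # Illustrative reaction-weighted scoring. The 5:1 angry-to-like weighting
    # is the figure discussed in the chapter; the posts and counts are invented.
    REACTION_WEIGHTS = {"like": 1, "angry": 5}

    def distribution_score(reactions):
        """Sum reactions, weighted by type, to get a ranking score."""
        return sum(REACTION_WEIGHTS[kind] * count for kind, count in reactions.items())

    calm_post = {"like": 1000, "angry": 10}      # 1,010 total reactions
    outrage_post = {"like": 200, "angry": 400}   # 600 total reactions

    print(distribution_score(calm_post))     # 1050
    print(distribution_score(outrage_post))  # 2200

Under these assumed numbers, the outrage-heavy post earns roughly double the distribution score of the calm post despite receiving far fewer total reactions, which is the structural advantage the answer describes.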

Question 7: Claimed capabilities: psychographic profiling at scale using the OCEAN model to deliver customized persuasion messages exploiting psychological vulnerabilities, described as a revolution in political communication. Documented activities: improper harvesting of Facebook data on 87 million users, and conventional political micro-targeting using behavioral and demographic data. The distinction matters because debunking the inflated claims should not produce complacency about the documented capabilities. The "dark ad" problem: behavioral micro-targeting allows simultaneous, invisible messaging to different segments — different political messages to different audiences with no visibility to journalists, opponents, or regulators. This is a structural propaganda capability regardless of psychographic sophistication.

Question 8: Wu argues that advertising-supported media has always been in the business of harvesting and selling human attention to advertisers — content is bait; audience attention is the product sold. The internet intensified this logic by enabling real-time, granular tracking of attention and optimizing distribution accordingly. The consequence for propaganda: content that captures attention most reliably is emotionally engaging content, which is precisely what propaganda is engineered to produce. Propaganda is therefore structurally advantaged in any engagement-optimized information distribution system.

Question 9: DSA requires: very large platforms to conduct systemic risk assessments of their design choices, implement mitigation measures for identified risks, and submit to independent audit. It does not prescribe specific algorithmic designs or prohibit specific content. Strength: it responds directly to the Haugen disclosures by creating an external requirement to act on internally identified harms, without requiring government control of editorial decisions. Weakness: effectiveness depends on the quality and independence of the risk-assessment process; platforms conducting their own assessments, with regulators lacking the technical capacity to challenge them, may produce a compliance ritual without substance.

Question 10: Full credit requires: at least three specific pieces of evidence (e.g., Vosoughi et al. 2018 finding that false news spreads faster; Haugen disclosure of 5x angry reaction weighting; Ribeiro et al. radicalization pathway; Bail et al. showing ineffectiveness of simple counter-exposure); structural analysis showing why the architecture favors propaganda (not just listing examples); serious engagement with an objection (e.g., "platforms also surface accurate viral content" or "users exercise choice"); and a clear, defended conclusion.

