Chapter 21 Quiz: Personalization, Filter Bubbles, and the Algorithmic Self
Instructions
Select the best answer for each question. After completing all 22 questions, check your answers against the answer key at the end.
1. Eli Pariser's original observation that led to the filter bubble concept was:
A) That political advertising on Facebook targeted users with personally relevant messages
B) That his conservative Facebook friends had been algorithmically filtered out of his News Feed without his explicit choice
C) That Facebook's News Feed showed more advertising than organic social content over time
D) That users who engaged with political content received increasingly extreme recommendations
2. Which of the following is NOT one of the three features Pariser identifies as distinguishing algorithmic filter bubbles from ordinary human information selectivity?
A) Individualization — each person's bubble is unique to them
B) Invisibility — you don't see what is filtered out
C) Non-consensuality — you didn't choose to be in it
D) Ideological bias — the algorithm systematically favors particular political views
3. The "feedback loop" in social media personalization is best described as:
A) The way user complaints feed back into platform design decisions
B) The engagement-signal-to-content-recommendation-to-engagement cycle that progressively narrows personalized information environments
C) The technical process by which advertising revenue feeds back into algorithm development budgets
D) The social process by which content goes viral through multiple waves of sharing
4. Collaborative filtering personalizes content recommendations based on:
A) Professional editors' judgments about what content is highest quality
B) Legal requirements for content diversity in different geographic markets
C) The behavior of users with similar engagement histories to the target user
D) Real-time trending topics across the platform's entire user base
5. What is the key analytical distinction between a filter bubble and an echo chamber?
A) Filter bubbles affect news consumption; echo chambers affect entertainment consumption
B) Filter bubbles are more extreme than echo chambers in their information restriction
C) Filter bubbles are created by algorithmic curation; echo chambers are created by social selection
D) Filter bubbles are temporary phenomena; echo chambers are permanent social structures
6. Bail et al.'s (2018) finding about exposing Twitter users to opposing political content was:
A) That cross-cutting exposure significantly reduced political polarization among all users
B) That cross-cutting exposure had no measurable effect on political views either way
C) That cross-cutting exposure increased political polarization — liberals became more liberal, conservatives more conservative
D) That cross-cutting exposure only affected users who were initially moderately engaged with political content
7. The Bakshy et al. (2015) Facebook study found that the algorithm reduced exposure to ideologically cross-cutting content by approximately how much compared to showing users their full News Feed?
A) 2 percent
B) 8 percent
C) 25 percent
D) 45 percent
8. The "personalization paradox" described in the chapter refers to:
A) The contradictory finding that personalization increases engagement but decreases satisfaction
B) The tension between the genuine value of relevant recommendations and the epistemic cost of reduced serendipitous discovery
C) The paradox that platforms claim to serve user interests while actually optimizing for advertiser interests
D) The observation that users who receive the most personalized experience trust the platform least
9. The chapter's concept of "identity lock-in" refers to:
A) Users being unable to change their username or profile information after account creation
B) Platforms storing user data in ways that prevent account deletion
C) The progressive divergence between the algorithm's model of who a user is and who they actually are or are becoming
D) The process by which users develop strong platform-specific identities that prevent them from migrating to competitors
10. According to the chapter, which type of behavioral signal carries the STRONGEST weight in personalization?
A) Passively scrolling past content
B) Dwell time (the amount of time spent on a page)
C) Explicit engagement actions such as clicking, sharing, or commenting
D) Hovering over content without clicking
11. "Filter tightening" occurs when:
A) Platforms reduce the amount of advertising content shown to users over time
B) The personalization feedback loop progressively narrows the information environment as the algorithm becomes more confident about user preferences
C) Regulatory requirements force platforms to restrict certain types of content in specific markets
D) Users actively configure their own content filters to reduce exposure to unwanted material
12. The term "algocracy" as used in the chapter refers to:
A) A political system in which governments govern through algorithmic systems rather than elected representatives
B) The academic discipline studying algorithmic systems and their social effects
C) The governance of information environments by commercial algorithms that make decisions about what people know
D) A proposed regulatory framework for algorithmic accountability developed by European regulators
13. How does location data contribute to social media personalization?
A) It allows platforms to show users content from accounts physically near them
B) It allows platforms to generate behavioral profiles through ambient data — your physical movements signal interests, affiliations, and activities without any active platform engagement
C) It allows platforms to comply with geographic content restrictions and copyright laws
D) It is used primarily for emergency alert systems rather than personalization
14. Cross-platform data sharing by Meta (across Facebook, Instagram, WhatsApp, and Messenger) creates personalization profiles that are:
A) More limited than single-platform profiles because privacy regulations restrict data sharing
B) Substantially richer than single-platform profiles because behavioral data from each platform informs personalization on all others
C) Identical to single-platform profiles because each Meta platform uses separate algorithms
D) Less accurate than single-platform profiles because combining data from different contexts introduces noise
15. Maya's experience, as described in the chapter, illustrates which specific filter bubble pattern?
A) High exposure to local civic information, low awareness of national political issues
B) High awareness of national and global political issues, very low awareness of local civic information about her actual community
C) Extremely homogeneous political content, with complete absence of any content outside her political perspective
D) Heavy exposure to misinformation about national elections, combined with accurate local news
16. According to Velocity Media's experience with "serendipity injection" (introducing content outside users' engagement patterns):
A) It significantly increased engagement across all content categories
B) It reduced polarization in political categories and improved satisfaction in entertainment categories
C) It reduced engagement initially but improved satisfaction in entertainment, while cross-cutting political content showed the same polarization-increasing effect found by Bail et al.
D) It had no measurable effect on either engagement or political polarization
17. The chapter states that social media platforms have partially replaced local news. What is the key problem with this replacement?
A) Social media platforms charge users for local news content that was previously free
B) Social media algorithms are optimized for engagement, not for the epistemic breadth and civic information that local news traditionally provided
C) Social media companies do not employ professional journalists and therefore cannot cover local news accurately
D) Federal regulations prohibit social media platforms from distributing local political news
18. The "naive realism" of social media described in the chapter refers to:
A) The false belief that social media platforms are neutral technology with no political effects
B) The tendency to treat one's personalized feed as a window onto social reality rather than an algorithmically curated selection
C) Research participants who significantly underestimate how much time they spend on social media
D) Platform executives who genuinely believe their products create social benefit
19. Research by Andrew Guess, referenced in the chapter's "Voices from the Field" section, found that during the 2020 US election:
A) Both liberal and conservative social media users were equally exposed to misinformation
B) Facebook users showed no significant political information environment differences between partisan groups
C) Different partisan groups were effectively operating with different factual realities — not just different interpretations of shared facts
D) Conservative users showed higher engagement with political content, but not different informational environments
20. Which of the following practices does the chapter identify as most directly addressing the local civic information gap created by personalization?
A) Using privacy browsers and VPN services to mask behavioral signals from personalization systems
B) Deliberately investing in local news sources through subscriptions and regular consultation rather than waiting for local news to appear in social feeds
C) Following a more diverse set of accounts on existing social media platforms
D) Using ad blockers to remove personalized advertising from news websites
21. The "engagement-relevance conflation" described in the chapter refers to:
A) The mistake that users make in assuming that popular content is relevant to their lives
B) The platform design error of counting engagement signals as evidence of advertising effectiveness
C) Platforms operationalizing "relevance" as "what you will engage with" — which may differ significantly from content that is genuinely informative or epistemically valuable
D) The academic conflation of engagement metrics with meaningful civic participation
22. The Velocity Media case study concludes that the honest answer about filter bubble solutions is:
A) That aggressive cross-cutting content exposure is the most effective technical fix for filter bubble effects
B) That government regulation is the only tool with sufficient power to address personalization-driven information fragmentation
C) That there is no simple technical fix for the epistemic problems personalization creates
D) That users who are educated about filter bubbles consistently make better information choices than uninformed users
Answer Key
1. B
2. D
3. B
4. C
5. C
6. C
7. B
8. B
9. C
10. C
11. B
12. C
13. B
14. B
15. B
16. C
17. B
18. B
19. C
20. B
21. C
22. C