Chapter 7 Quiz: The Rise of Digital and Social Media
Instructions
This quiz covers the material from Chapter 7. Questions vary in format: multiple choice, true/false, short answer, and matching. Answers are hidden in collapsible sections — attempt each question before revealing the answer.
Part 1: Multiple Choice
Question 1
The "Good Times" email virus hoax of 1994 is significant in the history of digital misinformation primarily because it demonstrated:
A) That computer viruses could spread through email attachments
B) That warning messages are more shareable than corrections, regardless of accuracy
C) That the government was monitoring private email communications
D) That early internet users were uniquely credulous compared to later generations
Reveal Answer
**Correct Answer: B** The Good Times hoax warned users about a technically impossible threat. Its significance is that it spread far faster than any authoritative debunking, demonstrating the "warning is more shareable than the correction" principle that has recurred across every subsequent platform. Warnings are emotionally salient, actionable, and prosocial in intent — corrections lack these properties.
Question 2
Which of the following best describes the "social graph" as a technical concept?
A) A chart showing the growth of social media user numbers over time
B) The network of people and connections that defines a social network's structure
C) An algorithm that determines which content appears in users' feeds
D) A visual representation of how often users share content from different sources
Reveal Answer
**Correct Answer: B** The social graph is the technical term for the network structure comprising nodes (individuals) and edges (connections between individuals) that defines the architecture of a social network. Facebook's core innovation was the systematization and mining of the social graph at scale, enabling information to flow through preexisting personal relationships.
Question 3
The concept of "network effects" explains why Facebook was difficult to displace after achieving critical mass. Network effects refers to:
A) The psychological effect of seeing many people use a platform, which makes it seem more trustworthy
B) The increase in a network's value as more people join, creating incentives to adopt the dominant platform
C) The phenomenon by which information spreads faster in larger networks
D) The tendency of users to form echo chambers within social networks
Reveal Answer
**Correct Answer: B** Network effects describe how a network's value increases with the number of participants. A social network where all your friends are present is more valuable than one where only some are, which creates powerful incentives to join — and stay on — the dominant platform. This also means the potential reach of any viral content on a dominant platform is enormous.
Question 4
During the 2013 Boston Marathon bombing investigation on Reddit, Sunil Tripathi was falsely identified as a suspect. Which of the following was NOT a structural factor contributing to this error?
A) Confirmation bias operating at scale within the crowdsourced investigation
B) A reward structure that prioritized being first with an identification over being accurate
C) Reddit's algorithm actively promoting false identification posts
D) The illusion of expertise among amateur photograph analysts
Reveal Answer
**Correct Answer: C** Reddit did not have an algorithm actively promoting the false identification posts — this was a case of human-driven crowdsourced investigation, not algorithmic amplification. The other three options are genuine structural factors: confirmation bias (once a suspect was nominated, searchers found confirming evidence), speed incentives (the platform rewarded being first), and the illusion of expertise (amateurs mimicking the form of forensic analysis without the substance).
Question 5
The term "verification collapse" (coined by scholar Mathew Ingram) describes:
A) The deliberate suppression of fact-checking by social media platforms
B) The systematic failure of verification norms under conditions of real-time social media
C) The process by which verified accounts lose their verification status for posting misinformation
D) The collapse of traditional news organizations' fact-checking departments due to budget cuts
Reveal Answer
**Correct Answer: B** Verification collapse describes the structural failure of information verification under real-time social media conditions. It is not about deliberate suppression or institutional budget decisions. The problem is that real-time platforms reward speed, while verification takes time — creating a structural incentive that systematically privileges speed over accuracy.
Question 6
What distinguishes "private virality" in encrypted messaging apps from misinformation spread on public social media platforms?
A) Private virality spreads faster because algorithms amplify it more aggressively
B) Private virality is resistant to algorithmic downranking, labeling, and outside monitoring because content is encrypted
C) Private virality only affects small audiences because groups are limited in size
D) Private virality is less harmful because it cannot include images or videos
Reveal Answer
**Correct Answer: B** The defining feature of private virality is its invisibility to outside monitoring, fact-checking, and algorithmic intervention. Because WhatsApp messages are end-to-end encrypted, the platform itself cannot read their content, journalists cannot screenshot and report on them, and fact-checkers cannot label them. The result is a category of misinformation spread that resists the standard interventions developed for public platforms.
Question 7
Which of the following best describes the "parasocial relationship" as it relates to influencer misinformation?
A) A fake relationship between two influencers who pretend to be friends for marketing purposes
B) A one-sided relationship in which audience members feel genuine personal connection to a media figure, creating credibility transfer
C) A professional partnership between influencers and brands that is not disclosed to audiences
D) The relationship between social media platforms and the advertisers who fund them
Reveal Answer
**Correct Answer: B** Parasocial relationships (theorized by Horton and Wohl in 1956) are one-sided — the audience feels a genuine personal connection to the media figure, but the media figure does not know them. In the context of influencer misinformation, this one-sided connection creates credibility transfer: audiences trust influencers' recommendations (including health claims) as they would trust a friend, applying less critical scrutiny than they would to a stranger's claims.
Question 8
Research by Pennycook and Rand (2019) on the Like button showed that:
A) Users who liked content were less likely to share false information
B) Social proof signals like like counts significantly affect judgments of news accuracy
C) The Like button had no measurable effect on information credibility assessments
D) Removing the Like button caused users to share more content overall
Reveal Answer
**Correct Answer: B** Pennycook and Rand's research showed that social proof signals — including like counts — significantly affect accuracy judgments, even for detectably false headlines. A false headline with many likes appears more credible than the same headline with few likes. Since false, emotionally arousing content systematically out-engages accurate content, like counts are often negatively correlated with accuracy in the domain of controversial claims.
Question 9
The "Adpocalypse" of 2017 refers to:
A) The mass deletion of extremist content by YouTube's algorithm
B) A major advertiser boycott triggered by brands discovering their ads appeared alongside extremist content
C) The collapse of digital advertising revenue following a major data privacy scandal
D) The introduction of YouTube's unskippable advertisement format
Reveal Answer
**Correct Answer: B** The Adpocalypse was a major advertiser boycott of YouTube in 2017, triggered by reporting that major brand advertisements were appearing alongside extremist, violent, and otherwise inappropriate content. YouTube's response included demonetizing many categories of controversial content — a policy that had the unintended consequence of disproportionately affecting legitimate creators discussing sensitive topics.
Question 10
What specific electoral law did the WhatsApp campaign for Jair Bolsonaro in Brazil's 2018 election violate?
A) Laws requiring political advertising to be approved by an election commission
B) Laws prohibiting paid mass messaging campaigns
C) Laws requiring campaign spending to be publicly disclosed
D) Laws limiting the number of messages any candidate campaign could send
Reveal Answer
**Correct Answer: B** Brazilian electoral law prohibits paid mass messaging campaigns. The WhatsApp campaigns for Bolsonaro — which involved businesses and individuals paying for mass messaging — violated this law. The campaigns were particularly difficult to detect and counter because WhatsApp's private architecture made organized campaigns essentially invisible to regulatory authorities.
Part 2: True or False
Question 11
The "continued influence effect" describes the phenomenon in which false information continues to affect belief and decision-making even after explicit, unambiguous correction.
Reveal Answer
**TRUE** The continued influence effect (Lewandowsky et al., 2012) is a well-documented cognitive phenomenon. False information persists in influencing judgment even after correction, partly because the correction cannot undo the neural encoding of the original information, and partly because corrections often receive less attention and emotional engagement than the original false claim.
Question 12
MySpace was founded before Facebook and was the dominant social network in the United States before Facebook surpassed it.
Reveal Answer
**TRUE** MySpace was founded in 2003 and was the dominant US social network from approximately 2005 to 2008, before Facebook — which opened to the general public in 2006 — surpassed it. Facebook's greater structural consistency (unlike MySpace, which allowed chaotic user customization) contributed to its eventual dominance.
Question 13
The "blogosphere as fact-checker" theory holds that because anyone can immediately and publicly respond to any claim, false information will always be rapidly and completely corrected.
Reveal Answer
**FALSE** The theory holds that false information would be rapidly *challenged*, but this does not mean corrections will be complete or will reach the same audience as the original false claim. Moreover, the Rathergate case showed that the blogosphere's "fact-checking" function was driven in part by politically motivated actors, not by disinterested truth-seekers. The theory's optimistic version was substantially undermined by empirical experience.
Question 14
WhatsApp's end-to-end encryption means that WhatsApp itself can read and moderate the content of messages sent through the platform.
Reveal Answer
**FALSE** End-to-end encryption means that messages are encrypted on the sender's device and can only be decrypted on the recipient's device. WhatsApp itself cannot read the content of messages in transit. This is precisely what makes private virality through WhatsApp so difficult to address — the platform cannot identify and remove false content in the same way that public platforms can.
Question 15
YouTube's recommendation algorithm was initially designed to maximize click-through rate (the number of videos users clicked on), and later switched to maximizing watch time.
Reveal Answer
**TRUE** YouTube did switch its primary optimization metric from clicks to watch time, based on internal research showing that watch time correlated better with user satisfaction than click-through rate. This switch also had the unintended consequence of favoring progressively more extreme content, since extreme content tends to hold viewer attention longer than moderate content.
Question 16
Variable reward schedules — the pattern of intermittent, unpredictable rewards — are among the weakest known operant conditioning mechanisms and produce minimal persistent behavior.
Reveal Answer
**FALSE** Variable reward schedules are among the *most powerful* known operant conditioning mechanisms, not the weakest. Like slot machines, they produce persistent behavior precisely because rewards are unpredictable — the possibility of a reward maintains behavior even during unrewarded intervals. Social media notification systems exploit this mechanism to maintain compulsive checking behavior.
Question 17
The term "citizen journalism" refers exclusively to journalism produced by professional journalists working outside mainstream media organizations.
Reveal Answer
**FALSE** Citizen journalism refers to journalism produced by ordinary citizens — non-professionals — who gather and publish news content using accessible web platforms. The "citizen" qualifier specifically distinguishes it from professional journalism. Its democratic promise is precisely that it does not require professional credentials — but this also means it lacks the institutional accountability structures that constrain professional journalism.
Question 18
Aza Raskin, who invented the infinite scroll feature used in social media feeds, has expressed regret about the feature's behavioral consequences.
Reveal Answer
**TRUE** Aza Raskin has publicly expressed regret about the infinite scroll, estimating that it wastes approximately 200,000 human lifetimes per day in unintended social media use. Similarly, Justin Rosenstein (Facebook Like button co-creator) has expressed regret about his feature's contribution to addictive platform behaviors and "false positivity."
Part 3: Short Answer
Question 19
Explain in two to three sentences why email chain hoaxes — like the Good Times virus warning — exploited "strong ties" rather than "weak ties," and why this made them more effective at spreading misinformation.
Reveal Answer
**Model Answer:** Email chains typically travel through personal networks — family members and close friends — which sociologists classify as "strong ties." Information from strong-tie sources receives less critical scrutiny than information from strangers because we have generally found our close contacts to be reliable in other domains. When a family member forwards a health scare warning, the implicit personal vouching of the relationship transfers to the content of the message, making recipients less likely to verify the claim independently.
Question 20
What is the "first mover advantage" in misinformation, and why does the "continued influence effect" make it so consequential?
Reveal Answer
**Model Answer:** The "first mover advantage" refers to the disproportionate persistence of the first account of an event, even if subsequent investigation reveals it to be false. The entity that publishes first — even incorrectly — reaches a large audience before corrections emerge. The "continued influence effect" makes this consequential because, as research by Lewandowsky et al. demonstrates, false information continues to affect belief and decision-making even after explicit correction. A correction cannot simply "overwrite" the false impression left by the original claim; the false information continues to exert influence even after people know it was wrong.
Question 21
Why did WhatsApp's responses to mob violence in India (limiting forwarding, adding "forwarded" labels) address the misinformation problem only partially rather than solving it?
Reveal Answer
**Model Answer:** WhatsApp's interventions were inherently limited by the platform's fundamental architecture: end-to-end encryption prevents content-based moderation. Limiting forwarding reduced the mechanical ease of mass distribution but did not prevent motivated actors from manually copying and re-sending content, or from creating new forwarding chains. The "forwarded" label reduced the implied personal endorsement of forwarded content, but research on correction effects suggests this label may have had limited impact on receivers who were already emotionally primed by the content. Neither intervention addressed the underlying community conditions — fear, distrust of outsiders, lack of local verification resources — that made communities susceptible to acting on unverified warnings.
Question 22
Distinguish between the "social graph" and the "interest graph" as organizational principles for social media. Why does this distinction matter for misinformation spread?
Reveal Answer
**Model Answer:** The social graph organizes information flow through personal relationships — you see content because people you know shared it. The interest graph organizes information flow through predicted interests — you see content because an algorithm predicts you will engage with it, regardless of who produced it. Social-graph-based platforms (like early Facebook) inherit credibility from personal relationships: content feels endorsed by friends. Interest-graph-based platforms (like TikTok's For You Page) can surface content from total strangers if behavioral data predicts engagement. For misinformation, the social graph creates credibility from relationship trust; the interest graph creates vulnerability by exposing users to high-engagement (potentially false) content from unknown sources with no relationship-based accountability.
Question 23
What is the "creator economy" and how does its monetization structure create incentives that may promote health misinformation?
Reveal Answer
**Model Answer:** The creator economy is the ecosystem in which individuals monetize content creation through platform revenue sharing, sponsorships, merchandise, and direct audience support, without institutional affiliation. The monetization structure creates misinformation incentives in several ways:
1. Platform revenue correlates with engagement, and emotionally arousing health content (fear of vaccines, excitement about miracle cures) generates more engagement than calm, accurate health reporting.
2. Sponsorships from supplement companies, alternative health brands, and wellness product manufacturers reward creators who cultivate audience skepticism of mainstream medicine — the larger the audience that distrusts conventional healthcare, the larger the potential customer base for alternative products.
3. The parasocial relationships cultivated by successful creators make audiences less critical of sponsored health claims than they would be of advertising encountered in other contexts.
Part 4: Matching
Question 24
Match each concept (left column) with its correct definition (right column):
| Concept | Definition |
|---|---|
| A. Rathergate | 1. The phenomenon in which network value increases as more participants join |
| B. Network effects | 2. The spread of content through closed, encrypted channels invisible to outside monitoring |
| C. Autoplay | 3. A 2004 controversy in which bloggers questioned the authenticity of CBS News documents about George W. Bush's National Guard service |
| D. Private virality | 4. A platform feature that automatically loads the next video when the current one ends, reducing user decision points |
| E. Narrative transportation | 5. The psychological state of being absorbed in a story in ways that suspend critical evaluation |
Reveal Answer
**Correct Matching:**
- A → 3 (Rathergate: the 2004 CBS/Bush document controversy)
- B → 1 (Network effects: value increases with participants)
- C → 4 (Autoplay: automatic next video loading)
- D → 2 (Private virality: spread through encrypted channels)
- E → 5 (Narrative transportation: story absorption that suspends critical evaluation)
Question 25
Match each platform feature (left column) with the specific misinformation mechanism it enables (right column):
| Feature | Misinformation Mechanism |
|---|---|
| A. Like button | 1. Extends content reach beyond original followers, amplifying both true and false information |
| B. Share/Retweet button | 2. Creates a false social proof signal suggesting popular content is accurate |
| C. Notification system | 3. Maintains compulsive checking behavior through variable reward scheduling |
| D. Algorithmic recommendation | 4. Surfaces high-engagement (often emotionally extreme) content regardless of accuracy |
| E. Group messaging | 5. Embeds misinformation in personal network credibility, making it feel individually endorsed |
Reveal Answer
**Correct Matching:**
- A → 2 (Like button: false social proof, popular = credible)
- B → 1 (Share button: extends reach beyond original followers)
- C → 3 (Notifications: variable reward scheduling, compulsive checking)
- D → 4 (Algorithm: high-engagement content regardless of accuracy)
- E → 5 (Group messaging: network credibility embedding)
End of Chapter 7 Quiz
Total questions: 25 | Estimated completion time: 45-60 minutes