Chapter 14 Quiz: What Are Dark Patterns?

Instructions: Select the best answer for each question. Answer key appears at the end.


Question 1. Harry Brignull coined the term "dark patterns" in what year?

A) 2004 B) 2007 C) 2010 D) 2013


Question 2. According to Brignull's original definition, which element is essential to classifying a design choice as a dark pattern (as opposed to merely bad design)?

A) The design must cause measurable financial harm to the user B) The design must have been tested using A/B experimentation C) The design must benefit the company at the user's expense, with some awareness in the design chain that it works against the user D) The design must violate at least one existing consumer protection regulation


Question 3. The "roach motel" dark pattern is best described as:

A) A design that makes users feel ashamed for choosing the opt-out option B) A design where entering a commitment is easy but exiting is deliberately difficult C) A design that reveals hidden costs only at the final step of a transaction D) A design that disguises advertisements as organic user content


Question 4. Which of the following is the best example of the "trick questions" dark pattern?

A) A "Cancel My Membership" button that is colored gray and placed in small font B) An opt-out checkbox worded as "Uncheck this box if you do not want to NOT receive emails" C) A notification that reads "Your network is waiting for you!" D) A social media feed that shows outrage-inducing content more frequently than neutral content


Question 5. "Confirmshaming" was a term coined primarily in reference to:

A) Cookie consent banners that make accepting tracking the default option B) Opt-out labels that make users feel foolish or ashamed for not opting in C) Platform onboarding flows that collect data before the user understands what they are consenting to D) Algorithmic suppression of content that contradicts a user's prior engagement patterns


Question 6. "Privacy zuckering" refers to:

A) Mark Zuckerberg's personal data collection practices B) The practice of making users share more personal information than they intended C) Facebook's practice of selling user data to third-party advertisers D) The deceptive labeling of data collection practices in privacy policies


Question 7. The "bait and switch" pattern in social media is best illustrated by:

A) TikTok's use of ephemeral content that disappears after 24 hours B) Facebook's evolution from a private friend network to an advertising-driven platform that deprioritizes friends' posts C) Instagram's use of "Sponsored" labels in small gray text above paid posts D) LinkedIn's automatic notification of profile views to the person whose profile was viewed


Question 8. Which of the following social media dark patterns is described as operating at the algorithmic level rather than the interface level?

A) Confirmshaming in notification permission prompts B) Roach motel design in account deletion flows C) Algorithmic amplification of outrage D) Frictionless sharing defaults


Question 9. The "frictionless sharing" pattern, exemplified by Facebook's 2011 Open Graph system, was problematic primarily because:

A) It made it technically impossible to share content with specific audiences B) It shared users' activities without requiring an explicit sharing decision for each activity C) It collected users' contact lists without disclosing how they would be used D) It disguised advertisements as shared user activities


Question 10. Ephemeral content formats (such as Instagram Stories) are considered a dark pattern because:

A) They collect more user data than non-ephemeral content B) They prevent users from accessing their own content after it disappears C) They create artificial time pressure that drives compulsive check-in behavior D) They are used exclusively to display advertising without disclosure


Question 11. The "intent-effect gap" refers to:

A) The gap between what users say they want from social media and what they actually do on it B) The space between what designers consciously intend and what their optimized systems produce C) The difference between a platform's stated privacy policy and its actual data practices D) The lag between when a dark pattern is deployed and when regulators become aware of it


Question 12. Philosopher Evan Selinger's concept of "systemic responsibility" suggests that the ethics of large technological systems should be evaluated primarily by:

A) The conscious intentions of the individual engineers who built them B) The presence or absence of regulatory violations at the time of deployment C) Outcomes and power differentials rather than individual intent D) The financial harm caused to users who were deceived


Question 13. The chapter describes three levels of the expertise asymmetry between platform designers and users. Which of the following is NOT listed as one of these levels?

A) Professional training versus everyday navigation B) Scale and iteration capacity of platforms versus individual user learning C) Differential access to legal representation in disputes D) Cognitive bandwidth limitations and vulnerable populations


Question 14. Shoshana Zuboff's concept of "behavioral surplus" refers to:

A) The excess engagement that platforms achieve beyond users' intentions B) A vast excess of knowledge about human behavioral patterns that users do not possess about themselves C) The additional advertising revenue generated by dark pattern designs D) The surplus of design expertise held by platform teams relative to regulatory bodies


Question 15. Under the EU's General Data Protection Regulation (GDPR), which of the following conditions must consent to personal data processing satisfy?

A) Only freely given and specific B) Only informed and unambiguous C) Freely given, specific, informed, and unambiguous D) Freely given and informed only


Question 16. The EU's Digital Services Act (DSA) requires platforms to:

A) Remove all advertising from social media feeds B) Obtain explicit consent before any algorithmic curation of content C) Refrain from designing interfaces that deceive or manipulate users or materially impair their ability to make free and informed decisions D) Publish all A/B test results that affect user behavior within 60 days


Question 17. In the Velocity Media meeting scenario, Dr. Aisha Johnson's core objection to asking for contact list access on day one was:

A) That the practice violates iOS developer guidelines B) That consent obtained before the user understands the product is not meaningfully informed consent C) That contact list access is never appropriate for a social media application D) That the company's privacy policy does not authorize collection of contact data


Question 18. According to research cited in the chapter, TikTok's notification system (when left at default settings) sent users approximately how many notifications per day?

A) 5 B) 17 C) 34 D) 52


Question 19. The EU's Dark Patterns Taskforce (2022) found that what percentage of the most popular websites deployed at least one dark pattern?

A) 47% B) 68% C) 83% D) 97%


Question 20. The FTC sued Amazon in 2023 over which dark pattern?

A) Hidden advertising fees in Amazon Prime Video B) The Prime subscription cancellation flow, which used multiple steps and emotional framing to discourage customers from canceling C) Amazon's collection of voice data through Alexa without disclosure D) Amazon's use of confirmshaming in product review prompts


Question 21. The chapter's diagnostic question "Would users endorse this design choice if they understood how it works?" is best understood as a test for:

A) Regulatory compliance B) Aesthetic quality of the interface C) Whether consent that depends on ignorance is meaningful consent D) Whether the design achieves its intended engagement metrics


Question 22. Which of the following organizations was founded by former Google design ethicist Tristan Harris and is mentioned in the chapter as advocating for design standards that prioritize user wellbeing?

A) The Electronic Frontier Foundation B) The Center for Humane Technology C) The Oxford Internet Institute D) The European Data Protection Board


Answer Key

  1. C
  2. C
  3. B
  4. B
  5. B
  6. B
  7. B
  8. C
  9. B
  10. C
  11. B
  12. C
  13. C
  14. B
  15. C
  16. C
  17. B
  18. B
  19. D
  20. B
  21. C
  22. B