Chapter 14 Exercises: What Are Dark Patterns?
Reflection Exercises
Exercise 1 [Reflection] Recall the last time you encountered a charge, subscription, or data disclosure you did not expect when you started using a product or service. Reconstruct the experience step by step: at what point did you commit to the process, and at what point did the unexpected element appear? Using the vocabulary of this chapter, identify which dark pattern (if any) was at work.
Exercise 2 [Reflection] Think about a social media app you use daily. Without opening it, write down: what you think the app's notification settings look like, how many separate categories of notifications you could toggle, and what your current settings are. Then open the app and check. How accurate was your mental model? What does the gap between your model and reality suggest about the notification architecture?
Exercise 3 [Reflection] The chapter describes a spectrum from "bad design" to "unethical design" to "predatory design." From your own recent digital life, identify one interface experience you would place at each point on this spectrum, and write a paragraph for each explaining your classification.
Exercise 4 [Reflection] Consider the concept of "meaningful consent." Write two paragraphs: one arguing that the standard consent mechanisms in social media onboarding (clicking "I Agree") constitute meaningful consent, and one arguing that they do not. Then write a third paragraph stating which argument you find more convincing and why.
Exercise 5 [Reflection] The chapter discusses the "intent-effect gap" — the space between what designers intend and what their systems produce. Recall a time when you did something that had unintended negative consequences for someone else. How did your awareness of the intent-effect gap in that personal situation change (or not change) your sense of moral responsibility? Apply this reflection to the platform design context.
Exercise 6 [Reflection] After reading about privacy zuckering, review the privacy settings on two social media accounts you hold. What are the defaults? What would a user who never touched settings share? Write a one-page reflection on what you found and how you feel about it.
Exercise 7 [Reflection] The Velocity Media scenario presents Marcus Webb as someone who is not malicious but whose optimization goals lead to manipulative outcomes. Have you ever been in a situation — at work, in school, in a group project — where you pursued a goal in a way that, in retrospect, worked against someone else's interests without you fully recognizing it at the time? What does that experience teach you about systemic responsibility?
Research Exercises
Exercise 8 [Research] Visit darkpatterns.org (now deceptive.design) and browse the Hall of Shame for five examples. For each, write a brief analysis identifying: (a) which category from Brignull's taxonomy it falls under, (b) who benefits and at whose expense, and (c) whether it is better classified as bad design, unethical design, or predatory design.
Exercise 9 [Research] Find and read the FTC's 2022 report "Bringing Dark Patterns to Light." Identify three cases the FTC highlights as enforcement examples. For each case, summarize the dark pattern alleged and the regulatory action taken. Evaluate whether you think the remedy was proportionate to the harm.
Exercise 10 [Research] Research the EU's Digital Services Act provisions on dark patterns (Article 25). Find one news report or academic commentary analyzing how the DSA is being enforced as of the current year. Write a 500-word summary of the enforcement landscape, noting any major cases or gaps in enforcement you discovered.
Exercise 11 [Research] Research the history of Facebook's privacy settings changes between 2005 and 2015. Create a timeline of major changes to default sharing settings, noting each time the default moved toward more sharing rather than less. Analyze this timeline as evidence for or against the "bait and switch" characterization used in this chapter.
Exercise 12 [Research] Find the academic paper by Narayanan et al. (2020) "Dark Patterns: Past, Present, and Future" or a comparable systematic review of dark patterns research. Summarize what the researchers found about the prevalence of dark patterns across different sectors and the state of empirical evidence for their harms.
Exercise 13 [Research] Research California's Age-Appropriate Design Code Act (AADC). What design practices does it require or prohibit? Find one example of a platform that has changed its design in response to the AADC and describe the change. Assess whether the change represents genuine improvement in user protection or surface-level compliance.
Exercise 14 [Research] Look up Harry Brignull's original 2010 presentation or the darkpatterns.org website from its early years (using the Wayback Machine if necessary). How has the dark patterns taxonomy evolved from its original form to current formulations? What patterns have been added, removed, or redefined?
Analysis Exercises
Exercise 15 [Analysis] Take a screenshot of a cookie consent banner from three different websites you visit regularly. For each, analyze: (a) the visual hierarchy of "accept" versus "reject/manage" options, (b) the number of steps required to reject all non-essential cookies versus accept all, (c) the language used, and (d) whether the banner complies with GDPR as you understand it. Compile your analysis into a comparative table.
Exercise 16 [Analysis] Choose a social media platform's account deletion or deactivation flow and document every step required to permanently delete an account. Map each step against the dark pattern categories in this chapter. Calculate the total number of clicks and decisions required. Compare this to the number of steps required to create an account on the same platform.
Exercise 17 [Analysis] Collect five examples of "confirmshaming" language from email newsletters, app notifications, or pop-up prompts. For each, identify: (a) the self-concept being targeted, (b) the emotion being evoked, and (c) an alternative, neutral phrasing that would present the same choice without shame. Analyze what pattern of values the confirmshaming language reveals about how platforms construct their users.
Exercise 18 [Analysis] The chapter argues that algorithmic amplification of outrage is a dark pattern, even though it is not a visible UI element. Develop criteria for classifying algorithmic behaviors (rather than interface elements) as dark patterns. Does your framework require intent? Does it require harm? Does it require deception? Apply your criteria to three specific algorithmic behaviors: outrage amplification, ephemeral content recommendation, and personalized advertising.
Exercise 19 [Analysis] Analyze the Velocity Media meeting scenario through the lens of organizational ethics rather than individual ethics. Draw an organizational chart of who in the company had knowledge of, or responsibility for, each element of the proposed onboarding flow. For each role, assess: what did they know, what were their incentives, and what ethical obligations did they bear? Who, if anyone, was most responsible for the outcome?
Exercise 20 [Analysis] The chapter describes the asymmetry of expertise between platform designers and users. Develop a "power audit" framework for any given dark pattern that quantifies this asymmetry: how much specialized knowledge does a user need to recognize and resist this pattern? Apply your framework to three of the patterns discussed in this chapter and rank them from most to least coercive.
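One way to make the "power audit" in Exercise 20 concrete is to score each pattern on a few dimensions and rank the totals. The sketch below is purely illustrative: the dimensions, the 1-to-5 scales, the additive scoring, and the scores assigned to each pattern are assumptions you would need to justify in your own framework, not findings from this chapter.

```python
# Hypothetical "power audit" scorecard. Dimensions, scales, and the
# example scores are illustrative assumptions, not data from the chapter.
from dataclasses import dataclass

@dataclass
class PatternAudit:
    name: str
    knowledge_required: int  # 1 (obvious to any user) .. 5 (needs expert knowledge)
    visibility: int          # 1 (clearly disclosed) .. 5 (completely hidden)
    reversibility: int       # 1 (one-click undo) .. 5 (effectively irreversible)

    def coercion_score(self) -> int:
        # Simple unweighted sum; a real framework would argue for its weights.
        return self.knowledge_required + self.visibility + self.reversibility

# Example scores (author's own guesses for illustration only).
audits = [
    PatternAudit("confirmshaming", knowledge_required=2, visibility=1, reversibility=1),
    PatternAudit("privacy zuckering", knowledge_required=4, visibility=4, reversibility=2),
    PatternAudit("roach motel", knowledge_required=3, visibility=3, reversibility=5),
]

# Rank from most to least coercive under this toy scoring.
for audit in sorted(audits, key=lambda a: a.coercion_score(), reverse=True):
    print(f"{audit.name}: {audit.coercion_score()}")
```

Even a toy scorecard like this forces the judgment calls into the open: whether the dimensions are independent, whether a sum is the right aggregation, and whose perspective the scores reflect.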
Exercise 21 [Analysis] Read LinkedIn's terms of service and privacy policy, focusing on the sections related to contact importing and network expansion. Identify all the mechanisms by which LinkedIn uses your data and your network connections to grow its own network. For each mechanism, determine whether it meets Brignull's definition of a dark pattern and why.
Creative Exercises
Exercise 22 [Creative] Design a "light pattern" — a design choice that is the deliberate opposite of a dark pattern, making the user's interests the primary design constraint even at cost to the platform's engagement metrics. Choose one common dark pattern from this chapter and redesign it as a light pattern. Mock up the interface (sketched or digital) and write a one-page rationale explaining your design choices.
Exercise 23 [Creative] Write a fictional internal design document from the perspective of a Velocity Media product designer who has been tasked with redesigning the notification system. The document should include: the design problem, proposed solutions with UX rationale, anticipated impact on engagement metrics, and an ethics review section responding to concerns Dr. Johnson might raise. The document should feel authentic to how design documents are actually written.
Exercise 24 [Creative] You are a UX designer at a startup that has just been told by the CEO that the company needs to implement three dark patterns to hit its quarterly growth targets. Write a letter to the CEO making the business, ethical, and legal case against this directive. Use specific arguments from this chapter and reference specific regulatory risks.
Exercise 25 [Creative] Create an "Onboarding Bill of Rights" — a document that specifies what every app user has a right to know and decide during the onboarding process. Write it as a formal document with at least eight specific rights and brief explanations of each. Consider: what information must be disclosed, what choices must be genuine, and what defaults are impermissible.
Exercise 26 [Creative] Rewrite the onboarding experience Maya had with TikTok (described in the sidebar) as it would look if every element were designed as a "light pattern." Write out each step of the redesigned onboarding, noting what information is provided, what choices are offered, what the defaults are, and how the experience differs from the original. Then write a brief analysis of what the platform would likely lose (in engagement terms) and gain (in user trust terms) from this redesign.
Group Exercises
Exercise 27 [Group] Dark pattern audit: As a group, use five different social media platforms for one week, taking notes on every dark pattern you observe. Compile your observations into a collaborative database, categorizing each observation by pattern type, platform, and severity. At the end of the week, present your findings to the class, including any patterns that appeared across multiple platforms and any that were unique to a single platform.
Exercise 28 [Group] Simulate a Velocity Media board meeting where the board is reviewing the ethics audit of the company's top five user-facing features. Each group member takes a role: CEO (Sarah Chen), Head of Product (Marcus Webb), Head of Ethics (Dr. Aisha Johnson), Legal Counsel, and an External Investor. For each feature, the board must decide whether to continue, modify, or discontinue it. After the simulation, debrief: whose arguments were most persuasive, and what structural factors determined the outcome?
Exercise 29 [Group] Divide into two teams and debate the following proposition: "Platform companies that deploy dark patterns should face individual criminal liability for the executives who approved them, not just corporate fines." One team argues for individual liability, one team argues against. Draw on the chapter's material on intent, the intent-effect gap, and systemic responsibility to make your arguments.
Exercise 30 [Group] Collaborative design sprint: Your group has been hired to redesign one social media platform's notification system to eliminate all dark patterns while maintaining commercial viability. You have 90 minutes. Produce: (a) a diagnosis of the existing system's dark patterns, (b) a set of design principles for the redesign, (c) wireframes or written specifications for the redesigned notification center, and (d) a business case for why this redesign is commercially sustainable.
Exercise 31 [Group] Role-play a congressional hearing on dark patterns in social media. One group member plays the senator chairing the hearing. Others play: a platform CEO defending their onboarding design, a UX researcher presenting evidence of harm, a user advocate, and a child safety expert. The hearing should last at least 30 minutes. Debrief afterward: what arguments were most effective, and what would a fair regulatory outcome have looked like?
Exercise 32 [Group] Media literacy exercise: Find five advertisements in social media feeds and classify each as (a) clearly identified advertising, (b) disguised advertising, or (c) ambiguous. Present your classifications to the class and explain your reasoning. Discuss: what design elements made the distinction clear or unclear, and what regulatory standard should apply?
Extended Research Projects
Exercise 33 [Research/Analysis] Conduct a longitudinal dark pattern audit: Document every dark pattern you encounter across all digital products you use for one full month. Record the platform, the pattern type, the context, and your response. At the end of the month, analyze your data: Which platforms were most and least problematic? Which pattern types were most common? Did your awareness of dark patterns change your behavior? Write a 1,500-word report on your findings.
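For the analysis step in Exercise 33, a simple tally over your log is enough to answer "which platform was most problematic" and "which pattern was most common." The sketch below assumes a log format of (platform, pattern type, severity) tuples; the platform names and entries are invented placeholders, and your own log will of course be much larger.

```python
# Illustrative tally of a month-long dark-pattern log. The log format
# (platform, pattern type, severity 1-5) and all entries are placeholders.
from collections import Counter

log = [
    ("AppA", "confirmshaming", 2),
    ("AppA", "roach motel", 4),
    ("AppB", "privacy zuckering", 5),
    ("AppB", "confirmshaming", 1),
    ("AppB", "forced continuity", 3),
    ("AppC", "confirmshaming", 3),
]

# Count observations per platform and per pattern type.
by_platform = Counter(platform for platform, _, _ in log)
by_pattern = Counter(pattern for _, pattern, _ in log)

print("Most observations:", by_platform.most_common(1)[0])
print("Most common pattern:", by_pattern.most_common(1)[0])
```

Raw counts are only a starting point: weighting by severity, or normalizing by how much time you spent on each platform, would change the rankings, and your written report should say which normalization you chose and why.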
Exercise 34 [Research/Creative] Investigate the history of persuasive design in pre-digital contexts: direct mail marketing, casino design, tobacco advertising, and political propaganda. Write a 1,500-word essay arguing either (a) that digital dark patterns are continuous with a long tradition of manipulative commercial design, or (b) that digital dark patterns represent a qualitative break from previous persuasion technologies because of their scale, personalization, and invisibility.
Exercise 35 [Research/Analysis] Comparative regulatory analysis: Compare the EU Digital Services Act, the UK Online Safety Act, and any U.S. federal legislation on dark patterns (including any bills introduced but not passed). For each regulatory instrument: (a) What dark patterns does it address? (b) What enforcement mechanisms does it create? (c) How does it balance platform liability against free expression? Write a 2,000-word comparative analysis and conclude with a recommendation for which approach is most effective and why.