Exercises: The Attention Economy
These exercises progress from concept checks to challenging applications. Estimated completion time: 3-4 hours.
Difficulty Guide:
- ⭐ Foundational (5-10 min each)
- ⭐⭐ Intermediate (10-20 min each)
- ⭐⭐⭐ Challenging (20-40 min each)
- ⭐⭐⭐⭐ Advanced/Research (40+ min each)
Part A: Conceptual Understanding ⭐
Test your grasp of core concepts from Chapter 4.
A.1. Herbert Simon wrote that "a wealth of information creates a poverty of attention." In your own words, explain how this insight reframes the economics of the internet. Why is attention, rather than information, the scarce resource that drives platform business models?
A.2. Tim Wu traces the attention merchant model from the penny press through radio, television, and digital platforms (Section 4.1.2). Identify the key innovation at each stage that allowed advertisers to capture attention more effectively. What pattern do you observe across the progression?
A.3. Section 4.1.3 describes a five-step platform business model. The chapter warns that engagement optimization is "not the same as satisfaction, happiness, or informed citizenship." Using a specific platform you use regularly, explain why the algorithm's optimization for engagement might produce outcomes that run counter to your own interests.
A.4. Define the following terms from Section 4.2 and explain how each functions within the Fogg Behavior Model (motivation, ability, trigger): - (a) Variable reward schedule - (b) Infinite scroll - (c) Push notification
A.5. Distinguish between persuasive design and dark patterns as described in Section 4.3.1. Why does the chapter argue that this distinction matters — even if both involve influencing user behavior?
A.6. Section 4.4.1 describes Zuboff's concept of "behavioral surplus." In two to three sentences, explain how behavioral surplus differs from the data needed to improve a service. Why does Zuboff argue that this distinction is central to understanding surveillance capitalism?
A.7. The chapter presents three ethical frameworks — utilitarian, Kantian, and virtue ethics — for evaluating the attention economy (Section 4.5.3). Choose one framework and explain, in three to four sentences, how it would evaluate the practice of a platform using engagement-maximizing algorithms that increase user anxiety.
Part B: Applied Analysis ⭐⭐
Analyze scenarios, arguments, and real-world situations using concepts from Chapter 4.
B.1. Consider the following scenario:
A popular news app redesigns its interface. Previously, it showed articles in a chronological list with a "Load More" button at the bottom of each page. After the redesign, articles appear in an algorithmically ranked infinite scroll, with push notifications for "trending" stories. The app also introduces a "streak" feature: users who open the app seven consecutive days receive a badge on their profile.
Identify at least four specific persuasive design techniques in this redesign. For each, explain (a) which element of the Fogg Behavior Model it targets and (b) whose interests it primarily serves — the user's or the platform's.
B.2. Eli's experience with the 14-step Smart City opt-out process (Section 4.3.2) illustrates the dark pattern Brignull calls "roach motel." Analyze this example through Zuboff's surveillance capitalism framework. How does making opt-out difficult serve the cycle of behavioral surplus extraction? What would a genuinely user-serving opt-out process look like?
B.3. Read the following argument:
"People choose to spend time on social media. No one forces them to scroll Instagram or watch TikTok. If they find it addictive, they can just put their phones down. Calling design features 'manipulative' is paternalistic — it denies people agency over their own choices."
Identify at least three assumptions in this argument. For each, explain why it may be incomplete or flawed, drawing on concepts from Sections 4.2, 4.3, and 4.5.3.
B.4. The chapter describes how YouTube's auto-play recommendation algorithm tends to push users toward increasingly extreme content because extreme content generates stronger emotional reactions (Section 4.2.3). Using the attention economy framework, explain why this is a structural incentive rather than a deliberate editorial decision. What does this distinction mean for governance — who is responsible?
B.5. Mira's question about VitraMed — "the model is the same: collect data, extract surplus, build predictions, sell the predictions" (Section 4.4.2) — suggests a parallel between surveillance capitalism and health-tech. Eli responds that the key question is "who controls the predictions." Evaluate Eli's criterion. Is it sufficient to distinguish ethical from unethical data use, or are additional criteria needed? If so, propose at least two.
B.6. Section 4.6.1 describes the EU Digital Services Act's requirement that platforms offer non-personalized alternatives to algorithmic recommendations. Analyze the potential effects of this requirement from three perspectives: (a) the user, (b) the platform, and (c) a content creator who depends on algorithmic amplification for their livelihood. Do all three stakeholders benefit? Where do their interests conflict?
Part C: Real-World Application Challenges ⭐⭐-⭐⭐⭐
These exercises ask you to investigate your own data environment. Complete them with your actual devices, services, and experiences.
C.1. ⭐⭐ Attention Audit. Over the next 24 hours, track every time you pick up your phone or open a social media app. For each session, record: (a) what prompted you to pick up the phone (notification, boredom, habit, intentional task), (b) what app you used, (c) how long you spent, and (d) whether you did what you intended to do or were redirected by the app's design. After 24 hours, calculate: What percentage of your sessions were initiated by external triggers (notifications)? What percentage involved unintended use (you went in for one thing and ended up doing something else)? Write a one-paragraph reflection connecting your findings to the concepts in Sections 4.1 and 4.2.
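If you log your sessions in a structured form, the two percentages are straightforward to compute. The sketch below is one possible approach in Python; the session fields and example data are illustrative assumptions, not part of the exercise.

```python
# Attention audit tally: a minimal sketch. The session format
# (prompt, app, minutes, stayed_on_task) is an illustrative
# assumption -- adapt it to however you logged your 24 hours.

EXTERNAL_TRIGGERS = {"notification"}  # prompts counted as external

sessions = [
    # (prompt, app, minutes, did you do what you intended?)
    ("notification", "messaging", 2, True),
    ("boredom", "social", 14, False),
    ("habit", "social", 6, False),
    ("intentional task", "maps", 3, True),
]

total = len(sessions)
externally_triggered = sum(
    1 for prompt, _, _, _ in sessions if prompt in EXTERNAL_TRIGGERS
)
unintended = sum(1 for _, _, _, on_task in sessions if not on_task)

print(f"Sessions logged:      {total}")
print(f"Externally triggered: {externally_triggered / total:.0%}")
print(f"Unintended use:       {unintended / total:.0%}")
```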
C.2. ⭐⭐ Dark Pattern Safari. Spend 30 minutes browsing websites or apps with the explicit goal of identifying dark patterns. Find at least five examples and classify each using Brignull's taxonomy from Section 4.3.2 (forced continuity, roach motel, confirmshaming, trick questions, disguised ads, friend spam, hidden costs, privacy zuckering, misdirection). For each, take a screenshot or describe the interface element, name the dark pattern type, and explain how it works against the user's interests. Suggested hunting grounds: subscription services, e-commerce checkout flows, cookie consent banners, free-to-play games, or social media privacy settings.
C.3. ⭐⭐⭐ Notification Detox Experiment. Turn off all non-essential push notifications on your phone for 48 hours (keep only calls, texts, and any genuinely urgent apps). During those 48 hours, journal briefly each evening: How many times did you check your phone? Did you feel more or less anxious? Were you more productive, less productive, or about the same? After the experiment, write a one-page reflection addressing (a) which notifications you missed, if any, and (b) what the experience reveals about the role of triggers (Fogg's model) in your daily technology use.
C.4. ⭐⭐⭐ Autoplay Resistance Test. Choose a platform with an auto-play feature (YouTube, Netflix, TikTok, or similar). In your first session, use the platform normally with auto-play enabled and note how many videos or episodes you watch and for how long. In a second session (on a different day), disable auto-play and note the same metrics. Compare the two sessions. How much additional content did auto-play cause you to consume? Was the additional content valuable to you? Write a short analysis connecting your findings to Section 4.2.3's discussion of decision points and frictionless design.
C.5. ⭐⭐⭐ Cookie Consent Comparison. Visit ten different websites and interact with their cookie consent banners. For each, record: (a) how many clicks it takes to "Accept All," (b) how many clicks it takes to reject all non-essential cookies, (c) whether the "Reject" or "Manage Preferences" option is as visually prominent as the "Accept All" button, and (d) how long the process takes for each option. Present your findings in a table. Calculate the average "friction ratio" — the number of clicks required to reject versus accept. Write a paragraph analyzing your findings through the lens of dark patterns and the consent fiction (Section 4.3.3).
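Once your table is complete, the friction ratio is simple arithmetic. The sketch below shows one way to compute per-site and average ratios; the site names and click counts are placeholders for your own observations.

```python
# Friction ratio sketch for the cookie consent comparison.
# friction ratio = clicks to reject all non-essential / clicks to accept all
# The numbers below are placeholders -- substitute what you recorded.

observations = {
    # site: (clicks_to_accept_all, clicks_to_reject_all)
    "site-a.example": (1, 4),
    "site-b.example": (1, 1),
    "site-c.example": (2, 7),
}

ratios = {
    site: reject / accept
    for site, (accept, reject) in observations.items()
}

for site, ratio in ratios.items():
    print(f"{site}: friction ratio {ratio:.1f}")

average = sum(ratios.values()) / len(ratios)
# A ratio above 1 means rejecting costs more clicks than accepting.
print(f"Average friction ratio: {average:.1f}")
```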
Part D: Synthesis & Critical Thinking ⭐⭐⭐
These questions require you to integrate multiple concepts from Chapter 4 and think beyond the material presented.
D.1. Section 4.5.1 presents the Haidt and Twenge research on social media and adolescent mental health alongside critiques from Przybylski and Orben. The chapter intentionally does not take a definitive position on causation. Write a 300-400 word analysis that: (a) explains why establishing causation in this domain is particularly difficult, (b) identifies at least two confounding variables that complicate the research, and (c) argues whether the uncertainty about causation should increase or decrease the urgency of governance responses. Justify your position.
D.2. The chapter's final "Common Pitfall" box (Section 4.6.3) warns against placing the burden of resisting the attention economy entirely on individuals. But the chapter also offers an "Action Checklist" of individual strategies. Is this a contradiction? Write a two-paragraph analysis that reconciles these two positions. In your analysis, distinguish between what individuals can and should do and what requires systemic solutions (regulation, design reform, business model change). Use at least one analogy from another domain (e.g., environmental regulation, public health, workplace safety) to support your argument.
D.3. Zuboff's surveillance capitalism framework (Section 4.4) and the attention economy model (Section 4.1) describe overlapping but distinct phenomena. In a 300-500 word essay, explain how these two frameworks relate to each other. Where do they overlap? Where do they diverge? Is the attention economy a subset of surveillance capitalism, or is surveillance capitalism a consequence of the attention economy? Defend your position with evidence from the chapter.
D.4. The chapter describes how engagement-maximizing algorithms preferentially surface content that triggers outrage and moral indignation (Section 4.5.2). Connect this to the concept of autonomy in Section 4.5.3. Can a person make autonomous political decisions in an information environment systematically shaped to provoke emotional reactions? What would Kant say? What would a virtue ethicist say? Where do these two perspectives agree, and where do they diverge?
Part E: Research & Extension ⭐⭐⭐⭐
These are open-ended projects for students seeking deeper engagement. Each requires independent research beyond the textbook.
E.1. The Fogg Network. Research B.J. Fogg's Persuasive Technology Lab at Stanford and identify at least three former students or affiliates who went on to work at major technology companies. For each, trace what they built: what persuasive design features did they implement, and what was the measurable effect on user engagement? Write a 1,000-word report addressing: (a) whether Fogg bears responsibility for how his students applied his research, (b) whether there is a meaningful ethical distinction between studying persuasion and building persuasive systems, and (c) what obligations, if any, academic researchers have when their work is commercialized in ways they did not intend. Use at least three sources beyond this textbook.
E.2. Dark Patterns and the Law. Research how dark patterns are being addressed in current legislation. Compare at least two legal frameworks (e.g., the EU Digital Services Act, the California Privacy Rights Act, India's Digital Personal Data Protection Act, or the FTC's enforcement actions). Write an 800-1,200 word analysis covering: (a) how each framework defines or addresses dark patterns, (b) what remedies or penalties are available, (c) how enforcement has worked in practice (with at least one example), and (d) whether existing legal tools are adequate to address the taxonomy of dark patterns described in Section 4.3.2.
E.3. Designing for Wellbeing. The Center for Humane Technology (Section 4.6.2) advocates for design that serves human wellbeing rather than engagement metrics. Research three specific design alternatives that have been proposed or implemented — for example, chronological feeds, screen time dashboards, demetrication (removing like counts), or friction-based interventions. For each, evaluate: (a) what problem it addresses, (b) whether evidence suggests it works, (c) what trade-offs it introduces, and (d) why platforms have been slow to adopt it (or have adopted and then reversed it). Write a 1,000-1,500 word report. Conclude with your own design proposal for a single feature that you believe would meaningfully improve user wellbeing on a platform you use daily.
Solutions
Selected solutions are available in appendices/answers-to-selected.md.