
Chapter 4: The Attention Economy

> "It is the consumer who is consumed. You are the product of t.v." — Richard Serra and Carlota Fay Schoolman, Television Delivers People (1973)

Learning Objectives

  • Define the attention economy and explain how human attention became a tradeable commodity
  • Describe how engagement optimization algorithms work and why they incentivize extreme content
  • Identify and classify dark patterns in digital interfaces
  • Analyze the concept of behavioral surplus and its role in surveillance capitalism
  • Evaluate the relationship between the attention economy and mental health, political polarization, and democratic discourse
  • Assess emerging governance responses including platform regulation, design standards, and digital wellbeing tools


Chapter Overview

You probably didn't plan to spend 47 minutes on your phone before getting out of bed this morning. And yet, there you were — checking notifications, scrolling a feed, watching a video that auto-played after the one you actually wanted to see. By the time you looked up, nearly an hour had passed.

This is not an accident. It is the product of one of the most sophisticated systems of behavioral engineering ever created — the attention economy, an economic model in which human attention is treated as a scarce resource to be captured, measured, and sold to advertisers.

In Chapters 1 through 3, we explored what data is, how its collection has evolved historically, and who has claims over it. This chapter examines why so much data is collected in the first place: because the dominant business model of the internet depends on maximizing the time, engagement, and emotional investment that users spend on platforms — and selling that attention to the highest bidder.

In this chapter, you will learn to:

  • Recognize how platform business models create incentives to capture and hold attention
  • Identify specific design techniques used to manipulate user behavior
  • Evaluate the social costs of attention-economy incentives — for mental health, democracy, and autonomy
  • Analyze proposed solutions, from regulation to design reform to individual resistance


4.1 The Economics of Attention

4.1.1 Herbert Simon's Insight

In 1971, the economist and cognitive scientist Herbert Simon — who would later win the Nobel Prize in Economics — wrote what may be the most prescient observation about the information age:

"In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention."

Simon's insight reframes the digital economy. The scarce resource is not information — there is an abundance of it. The scarce resource is your attention: the finite cognitive bandwidth you have for processing information, making decisions, and engaging with the world.

The attention economy is the economic system organized around the capture and monetization of this scarce resource.

4.1.2 The Attention Merchant Model

Tim Wu, in his 2016 book The Attention Merchants, traces this model back to the 19th century — to the moment Benjamin Day realized he could sell the New York Sun for a penny (below cost) by making up the revenue through advertising. The newspaper wasn't really selling news; it was selling its readers' attention to advertisers.

This model has been replicated and refined through every subsequent media technology:

| Era | Medium | Attention Capture Method | Revenue Model |
| --- | --- | --- | --- |
| 1830s | Penny press | Sensational headlines | Advertising |
| 1920s | Radio | Entertainment programming | Advertising |
| 1950s | Television | Primetime scheduling | Advertising |
| 2000s | Search/social | Algorithmic personalization | Targeted advertising |
| 2020s | Short-form video | AI recommendation engines | Targeted advertising + commerce |

The progression tells a story: each new medium captures attention more effectively, personalizes the experience more precisely, and extracts more data from the interaction.

4.1.3 The Platform Business Model

Modern platforms — Meta, Google, TikTok, X — operate on a specific variation of the attention merchant model:

  1. Offer a free service (search, social networking, video) to attract users
  2. Collect data about user behavior, preferences, relationships, and emotional states
  3. Use algorithms to optimize the service for maximum engagement (time spent, interactions generated)
  4. Sell targeted advertising based on the behavioral profiles built from that data
  5. Reinvest profits into further engagement optimization, creating a feedback loop

The key innovation is step 3: algorithmic engagement optimization. Unlike a newspaper editor who selects the same front page for every reader, a platform algorithm personalizes the experience for each user, moment by moment, to maximize the probability that they will continue engaging.

Common Pitfall: Many people assume that platform algorithms show them "what they want to see." This is partially true but misleading. The algorithms optimize for engagement — and engagement is not the same as satisfaction, happiness, or informed citizenship. Content that makes you angry often generates more engagement than content that makes you happy. Content that confirms your existing beliefs generates more engagement than content that challenges them. The algorithm isn't trying to make you happy. It's trying to keep you scrolling.
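The pitfall is easy to see in code. Below is a minimal, hypothetical sketch of an engagement-optimized ranker (the weights, field names, and structure are invented for illustration, not taken from any real platform): the objective that orders the feed is built entirely from predicted interaction probabilities, and nothing in it measures whether the user is informed, satisfied, or better off.

```typescript
// Hypothetical engagement ranker. All names and weights are illustrative.
interface CandidatePost {
  id: string;
  pClick: number;     // predicted probability the user taps the post
  pComment: number;   // predicted probability of a comment
  pReshare: number;   // predicted probability of a reshare
  pDwellLong: number; // predicted probability of a long dwell time
}

// The objective is a weighted sum of predicted interactions.
// Note what is absent: no term for accuracy, wellbeing, or regret.
function engagementScore(post: CandidatePost): number {
  return (
    1.0 * post.pClick +
    4.0 * post.pComment +  // comments, including angry ones, weigh heavily
    6.0 * post.pReshare +
    2.0 * post.pDwellLong
  );
}

// Rank the feed: highest predicted engagement first.
function rankFeed(candidates: CandidatePost[]): CandidatePost[] {
  return [...candidates].sort((a, b) => engagementScore(b) - engagementScore(a));
}
```

Under this objective, an inflammatory post with high predicted comment and reshare probabilities will outrank a calm, accurate one every time: the pitfall in miniature.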


4.2 The Architecture of Persuasion

4.2.1 Persuasive Design and B.J. Fogg

The design techniques that platforms use to capture attention didn't emerge by accident. Many were developed in academic settings and deliberately applied.

B.J. Fogg, a Stanford psychologist, founded the Persuasive Technology Lab in 1998 and developed what he called "captology" — the study of computers as persuasive technologies. His Fogg Behavior Model holds that a behavior occurs when three elements converge: motivation, ability, and a trigger (now called a "prompt").

Platforms operationalize this model with precision (see the sketch following this list):

  • Motivation: Social validation (likes, comments), fear of missing out, curiosity, outrage
  • Ability: Frictionless interfaces, one-click interactions, auto-play, infinite scroll
  • Triggers: Push notifications, red badge counts, email reminders, "someone you might know" suggestions
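Fogg's model is often summarized as B = MAP: a behavior (B) occurs when motivation (M), ability (A), and a prompt (P) converge above an action threshold. The toy formalization below is our own illustration of that idea; the numeric scales and threshold are invented, not drawn from Fogg's work.

```typescript
// Toy B = MAP model. Scales and threshold are illustrative, not Fogg's.
interface MomentState {
  motivation: number; // 0..1, e.g., curiosity, fear of missing out
  ability: number;    // 0..1, where 1 = frictionless (one tap, infinite scroll)
  prompt: boolean;    // did a trigger fire (push notification, red badge)?
}

const ACTION_THRESHOLD = 0.25; // hypothetical

// A behavior fires only when a prompt arrives while motivation and
// ability together clear the threshold. A platform can work on all three.
function behaviorOccurs(s: MomentState): boolean {
  return s.prompt && s.motivation * s.ability > ACTION_THRESHOLD;
}

// Even modest motivation suffices when friction is near zero:
console.log(behaviorOccurs({ motivation: 0.3, ability: 0.9, prompt: true }));  // true
console.log(behaviorOccurs({ motivation: 0.3, ability: 0.9, prompt: false })); // false
```

The design lesson platforms drew from the model: if you cannot raise motivation, lower the required ability (remove friction) and fire more prompts.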

Several of Fogg's former students went on to design core engagement features at major platforms. Nir Eyal, author of Hooked: How to Build Habit-Forming Products (2014), explicitly taught companies how to create products that users return to compulsively — without external prompts.

4.2.2 Variable Reward Schedules

One of the most powerful engagement techniques draws directly from behavioral psychology. B.F. Skinner discovered in the 1950s that pigeons would peck a response key most compulsively not when they received a reward every time, but when rewards came at unpredictable intervals — a variable reward schedule.

Social media feeds exploit this principle. When you pull down to refresh your feed, you don't know what you'll find. Sometimes there's an exciting notification — a message from a friend, a viral post, breaking news. Sometimes there's nothing interesting. The unpredictability itself is what drives the behavior, just as it drives slot machine gambling.
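A short simulation makes the mechanism concrete. In this hypothetical sketch, each pull-to-refresh pays off with some fixed probability; the 10% rate is arbitrary. The point is that no individual pull is predictable, and that unpredictability, not the rewards themselves, is what sustains the behavior.

```typescript
// Simulate feed refreshes under a variable reward schedule.
// The 10% payoff rate is an arbitrary illustration.
function simulateRefreshes(pulls: number, rewardProbability = 0.1): number {
  let rewards = 0;
  for (let i = 0; i < pulls; i++) {
    // Each refresh is a gamble: maybe a thrilling notification, maybe nothing.
    if (Math.random() < rewardProbability) rewards++;
  }
  return rewards;
}

// 200 refreshes yield roughly 20 rewards on average, but you never know
// WHICH pull will pay off, so every pull feels like it might be the one.
console.log(`rewards: ${simulateRefreshes(200)}`);
```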

Tristan Harris, a former Google design ethicist, has called the smartphone "a slot machine in your pocket." The analogy is more precise than it first appears: both slot machines and social media feeds use variable reward schedules, both are designed to be difficult to stop, and both generate enormous revenue from the attention they capture.

4.2.3 The Infinite Scroll and Auto-Play

Two design innovations deserve specific attention for their role in attention capture:

Infinite scroll — pioneered by Aza Raskin in 2006 — eliminates the natural stopping point that exists in paginated content. When a web page has a "next" button, the button creates a moment of decision: continue or stop. Infinite scroll removes that moment, turning content consumption into a continuous, frictionless flow.

Raskin later expressed regret: "It's as if they've taken behaviorism and weaponized it. It's one of my biggest regrets."
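Mechanically, infinite scroll is simple to build, which is part of why it spread so quickly. The sketch below uses the browser's standard IntersectionObserver API to load more content whenever a sentinel element near the bottom of the feed becomes visible. The fetchNextPage function and the #feed and #sentinel elements are hypothetical placeholders.

```typescript
// Minimal infinite scroll via the browser's IntersectionObserver API.
// fetchNextPage and the #feed / #sentinel elements are hypothetical.
declare function fetchNextPage(): Promise<HTMLElement[]>;

const feed = document.querySelector("#feed")!;
const sentinel = document.querySelector("#sentinel")!; // last child of #feed

const observer = new IntersectionObserver(async (entries) => {
  // When the sentinel scrolls into view, append more posts above it.
  if (entries.some((entry) => entry.isIntersecting)) {
    const posts = await fetchNextPage();
    posts.forEach((post) => feed.insertBefore(post, sentinel));
    // The sentinel is pushed back down, so the page never "ends":
    // the natural stopping cue a footer provides simply never arrives.
  }
});
observer.observe(sentinel);
```

Notice that nothing in the code decides when to stop; that decision has been engineered out of the interface entirely.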

Auto-play — used by YouTube, Netflix, and TikTok — eliminates the decision point between videos. When one video ends, the next begins automatically. YouTube's recommendation algorithm selects the next video to maximize the probability that you'll keep watching. Research suggests that this system can push users toward increasingly extreme content, because extreme content generates stronger emotional reactions, which generate higher engagement; as with other algorithmic harms discussed in this chapter, the strength of this effect remains contested.

Reflection: Open your phone's screen time settings (Settings > Screen Time on iPhone; Settings > Digital Wellbeing on Android). Look at your daily averages. Are they higher or lower than you expected? Pick the app where you spend the most time. Can you identify specific design elements — notifications, infinite scroll, auto-play, variable rewards — that keep you engaged?


4.3 Dark Patterns

4.3.1 Defining Dark Patterns

Dark patterns — a term coined by UX designer Harry Brignull in 2010 — are user interface design choices that manipulate users into actions they didn't intend or wouldn't choose if fully informed. Unlike persuasive design, which might argue it helps users achieve their goals more easily, dark patterns work against the user's interests.

4.3.2 A Taxonomy of Dark Patterns

| Dark Pattern | Description | Example |
| --- | --- | --- |
| Forced continuity | Making it easy to sign up for a free trial but deliberately difficult to cancel | Requiring a phone call to cancel a subscription that was started with one click |
| Roach motel | Easy to get into, hard to get out of | Social media accounts that require navigating buried menus to delete |
| Confirmshaming | Using guilt or shame to steer choices | "No thanks, I don't want to save money" as the opt-out text |
| Trick questions | Confusing wording designed to produce unintended consent | Double negatives in privacy settings ("Uncheck this box to not opt out") |
| Disguised ads | Ads designed to look like content or navigation | "Download" buttons on freeware sites that are actually ads |
| Friend spam | Requesting contact access and then messaging contacts without clear consent | LinkedIn's early growth strategy of emailing users' entire contact lists |
| Hidden costs | Concealing charges until late in the checkout process | Service fees, "convenience fees," and taxes revealed only at final checkout |
| Privacy zuckering | Default settings that share more data than users realize | Facebook's repeated changes to default privacy settings toward greater openness |
| Misdirection | Drawing attention away from important information | Making "Accept All" cookies prominent while hiding "Manage Preferences" |

Eli encountered dark patterns when he tried to opt out of the Smart City sensor data collection in his neighborhood. "The city website had a 14-step opt-out process," he reported to the class. "You had to create an account, verify your address, list each type of data individually, and confirm via email. The whole thing took 35 minutes. Opting in was automatic — you didn't have to do anything."

"And how many of your neighbors went through that process?" Dr. Adeyemi asked.

"I'm guessing approximately zero," Eli said. "Which is the point."

Dark patterns are closely connected to one of our recurring themes: the consent fiction. Platforms present users with "choices" — accept terms of service, manage cookie preferences, adjust privacy settings — that technically constitute consent but are designed to ensure that the vast majority of users accept the most data-extractive option.

A 2019 study by researchers at Ruhr University Bochum found that when cookie consent banners offered "Accept All" as a prominent button and buried "Reject All" in secondary menus, 90% of users clicked "Accept All." When both options were equally prominent, acceptance dropped to 50%. The "consent" was an artifact of design, not an expression of informed preference.
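The asymmetry the Bochum researchers measured can be expressed in a few lines of interface configuration. The sketch below is a hypothetical banner definition, not code from any real consent-management product; it shows how a "choice" can be tilted before any user ever sees it.

```typescript
// Hypothetical cookie-banner configs illustrating misdirection by design.
interface BannerButton {
  label: string;
  style: "primary" | "secondary"; // primary = large, colorful, prominent
  clicksRequired: number;         // interactions needed to reach this outcome
}

// Dark-pattern variant: accepting is one bright click; rejecting means
// spotting a plain link, then toggling categories in a nested submenu.
const darkPatternBanner: BannerButton[] = [
  { label: "Accept All", style: "primary", clicksRequired: 1 },
  { label: "Manage Preferences", style: "secondary", clicksRequired: 4 },
];

// Symmetric variant: equal prominence, equal effort. As the Bochum study
// suggests, acceptance rates fall sharply once the tilt is removed.
const fairBanner: BannerButton[] = [
  { label: "Accept All", style: "primary", clicksRequired: 1 },
  { label: "Reject All", style: "primary", clicksRequired: 1 },
];
```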

Connection: We'll examine the full scope of the consent problem in Chapter 9 (Data Collection and Consent). For now, note how dark patterns transform consent from a protection for users into a mechanism for extracting permission that users would not grant if the process were fair.


4.4 Behavioral Surplus and Surveillance Capitalism

4.4.1 Zuboff's Framework

Shoshana Zuboff's The Age of Surveillance Capitalism (2019) provides the most comprehensive theoretical framework for understanding the attention economy's data dynamics.

Zuboff argues that Google discovered a new form of surplus value that she calls behavioral surplus. When a user searches Google, some of the data generated is used to improve the search service (making results more relevant). But much of the data — the patterns of clicking, hovering, scrolling, and navigating — exceeds what is needed for service improvement. This excess — the behavioral surplus — is fed into prediction algorithms that anticipate what users will do, think, want, and buy.

These predictions are packaged into prediction products and sold on behavioral futures markets — otherwise known as the advertising market. Advertisers pay not just for your attention but for the probability that you will take a specific action (click, purchase, vote, believe).
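In schematic form, the pipeline Zuboff describes runs from raw behavioral traces to a probability sold to an advertiser. The sketch below is an illustrative caricature (every field name, signal weight, and price is invented), but it captures the structural point: the product is a prediction about you, and you are not a party to the sale.

```typescript
// Illustrative caricature of a "prediction product" pipeline.
// Every name and number here is invented; no real system is described.

interface BehavioralSurplus {
  hoverDurationsMs: number[]; // traces beyond what the service itself needs
  lateNightSessions: number;
  dwellOnAdCategories: Record<string, number>;
}

interface PredictionProduct {
  userId: string;
  action: "click" | "purchase";
  probability: number; // what the advertiser is actually buying
}

// A stand-in for a trained model: surplus in, probability out.
function predict(userId: string, s: BehavioralSurplus): PredictionProduct {
  const signal =
    s.lateNightSessions * 0.02 + (s.dwellOnAdCategories["fitness"] ?? 0) * 0.1;
  return { userId, action: "click", probability: Math.min(1, 0.3 + signal) };
}

// The "behavioral futures market": higher certainty commands a higher price.
function priceForAdvertiser(p: PredictionProduct, bidPerClick: number): number {
  return p.probability * bidPerClick; // expected value of the predicted action
}
```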

The logic of surveillance capitalism, Zuboff argues, creates an imperative for ever-deeper data extraction:

More behavioral data → Better predictions → Higher advertising revenue
                                          ↓
                              More investment in data extraction
                                          ↓
                              More invasive collection methods
                                          ↓
                              More behavioral data → (cycle repeats)

4.4.2 Beyond Advertising: Behavioral Modification

Zuboff's most unsettling claim is that surveillance capitalism has evolved beyond predicting behavior to modifying it. If a platform can predict that you will click on an ad with 70% probability, it can earn more money by designing interventions that push that probability to 85%.

These interventions can be subtle — timing a notification to arrive when you're most vulnerable to engagement, surfacing content that triggers anxiety (which increases scrolling), or creating social comparison dynamics that drive insecurity and compensatory posting.
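The economics of that push are plain expected-value arithmetic. Using the chapter's illustrative numbers and a hypothetical per-click payment, moving a user's click probability from 70% to 85% raises the expected revenue of the very same ad impression by about 21%:

```typescript
// Expected revenue of one impression = P(click) x payment per click.
const bidPerClick = 0.5; // hypothetical advertiser payment, in dollars

const predictedOnly = 0.70 * bidPerClick; // $0.350: merely predicting behavior
const engineered = 0.85 * bidPerClick;    // $0.425: after manipulative nudges

// Same user, same ad, same moment; ~21% more revenue, earned not by
// observing behavior but by modifying it.
console.log(((engineered - predictedOnly) / predictedOnly).toFixed(2)); // "0.21"
```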

The distinction matters for governance: predicting behavior based on observed data is one thing; engineering behavior through manipulative design is another. The former can arguably be governed through transparency and consent; the latter raises questions of autonomy and dignity that go beyond data protection.

Mira found Zuboff's framework transformative — and personally uncomfortable. "I keep thinking about VitraMed," she told Eli. "We're predicting health risks, which is supposed to be beneficial. But the model is the same: collect data, extract surplus, build predictions, sell the predictions. How do we make sure we stay on the right side of this?"

"By asking who controls the predictions," Eli replied. "If the patients control their predictions and decide what to do with them, that's one thing. If VitraMed sells the predictions to insurance companies without the patients knowing, that's surveillance capitalism with a lab coat."


4.5 Social Costs

4.5.1 Mental Health

The attention economy's social costs are significant and increasingly well-documented, though causation remains contested.

The relationship between social media use and mental health — particularly among adolescents — has been intensely studied since the mid-2010s. Key findings include:

Research Spotlight: Haidt and Twenge's Analysis (2017-2023)

Question: Is there a causal relationship between smartphone/social media adoption and the rise in adolescent depression and anxiety since 2012?

Method: Large-scale epidemiological data analysis (Monitoring the Future, CDC Youth Risk Behavior Survey) combined with international comparisons and natural experiments.

Key Finding: Adolescent depression, anxiety, self-harm, and suicide rates began rising sharply around 2012 — the year smartphone ownership among American teens surpassed 50%. The pattern is more pronounced among girls. International data shows similar timing in countries where smartphone adoption occurred around the same period.

Why It Matters: If the attention economy is contributing to a mental health crisis among young people, this represents a social cost of the business model that market mechanisms alone will not address.

Limitations: Correlation is not causation. Other factors (economic anxiety, academic pressure, political polarization, COVID-19) may contribute. Some researchers, including Andrew Przybylski and Amy Orben, argue that the effect sizes are small and the evidence insufficient to establish causation.

Note

The mental health debate is ongoing and politically charged. This textbook does not take a definitive position on causation because the science is genuinely contested. What is clear is that the design choices of the attention economy — variable rewards, social comparison mechanisms, engagement-maximizing algorithms — are intentional, and their effects on vulnerable populations deserve scrutiny regardless of where the causal debate settles. We'll return to this topic in depth in Chapter 35 (Children, Teens, and Digital Vulnerability).

4.5.2 Political Polarization and Democratic Discourse

Engagement-maximizing algorithms have documented effects on political discourse:

  • Outrage amplification: Content expressing moral outrage receives significantly more engagement (shares, comments) than neutral content. Algorithms learn this and preferentially surface outrage-inducing content.
  • Filter bubbles and echo chambers: Personalization algorithms can create "filter bubbles" (Eli Pariser's term) where users are exposed primarily to information that confirms their existing beliefs. The evidence on whether this actually increases polarization is mixed — but the potential for algorithmic reinforcement of existing biases is clear.
  • Misinformation spread: A 2018 MIT study found that false news stories on Twitter spread six times faster than true stories — not because of bots, but because human users found false stories more novel and emotionally arousing. Algorithms that optimize for engagement thus systematically advantage falsehoods.

4.5.3 Autonomy and the Problem of Manipulation

Perhaps the deepest cost of the attention economy is its effect on human autonomy — the capacity to make free, informed, self-directed choices.

The philosophical concern is not that platforms force users to do things against their will (they don't), but that they systematically manipulate the conditions under which choices are made — the information presented, the options available, the emotional state of the user — in ways that serve the platform's interests rather than the user's.

This raises questions that connect to Chapter 6's ethical frameworks:

  • Under a utilitarian analysis, does the pleasure of social media use outweigh the harm of manipulation?
  • Under a Kantian analysis, is treating users as means to advertising revenue (rather than ends in themselves) inherently wrong?
  • Under a virtue ethics analysis, does the attention economy cultivate or undermine the virtues of self-control, reflection, and genuine connection?


4.6 Governance Responses

4.6.1 Regulatory Approaches

Governments are beginning to respond to the harms of the attention economy:

  • The EU's Digital Services Act (DSA) requires large platforms to disclose how their recommendation algorithms work and to offer non-personalized alternatives, and it prohibits targeted advertising to minors based on profiling.
  • The UK's Age Appropriate Design Code requires platforms to default to the most privacy-protective settings for users under 18 and prohibits "nudge techniques" that encourage children to weaken their privacy settings.
  • California's Age-Appropriate Design Code (modeled on the UK version) requires data protection impact assessments for services likely to be used by children.
  • The EU AI Act classifies certain AI systems that exploit vulnerabilities to manipulate behavior as "prohibited" — though the scope of this prohibition is still being defined through implementation.

4.6.2 Design Reform

A growing movement advocates for design reform from within the technology industry:

  • The Center for Humane Technology (founded by Tristan Harris) advocates for design standards that align technology with human wellbeing rather than engagement metrics.
  • The concept of "calm technology" (coined by Mark Weiser and John Seely Brown) envisions technology that informs without demanding attention.
  • Time Well Spent metrics — pioneered by some platforms under public pressure — attempt to measure whether users feel their time on the platform was valuable, not just whether they spent a lot of it.

4.6.3 Individual Strategies

While systemic change requires collective action, individuals can take steps to resist the attention economy:

Action Checklist: Reclaiming Your Attention

  • [ ] Audit your notifications. Turn off all non-essential push notifications. If an app doesn't need to interrupt you in real time, it shouldn't.
  • [ ] Set screen time limits. Use built-in tools (iOS Screen Time, Android Digital Wellbeing) to set daily limits for attention-capturing apps.
  • [ ] Disable autoplay. Turn off auto-play on YouTube, Netflix, and social media feeds.
  • [ ] Use chronological feeds. Where available, switch from algorithmic to chronological feeds.
  • [ ] Batch your checking. Instead of responding to every notification in real time, check social media at designated times.
  • [ ] Recognize dark patterns. When you notice a design choice that seems to manipulate you, name it. Awareness is the first defense.

Common Pitfall: Individual strategies are necessary but insufficient. Placing the burden of resisting the attention economy entirely on users is like telling people to swim harder while the current is designed to carry them in the opposite direction. Systemic problems require systemic solutions — regulation, design standards, and business model reform.


4.7 Chapter Summary

Key Concepts

  • The attention economy treats human attention as a scarce, tradeable commodity
  • Platform business models depend on engagement optimization — algorithmic maximization of time spent and interactions generated
  • Dark patterns are design choices that manipulate users against their interests
  • Behavioral surplus (Zuboff) is the data extracted beyond what is needed for service improvement, used to build prediction products
  • The social costs include effects on mental health, political polarization, misinformation spread, and individual autonomy
  • Governance responses span regulation (DSA, UK AADC), design reform, and individual resistance

Key Debates

  • Is the attention economy fundamentally different from previous advertising-supported media, or just more efficient?
  • Does algorithmic engagement optimization cross the line from persuasion to manipulation?
  • Should individuals bear responsibility for managing their attention, or is systemic reform required?
  • Can the attention economy be reformed without abandoning the advertising business model entirely?

Applied Framework

When evaluating a platform's design, ask:

  1. What is this platform optimizing for? (Engagement? Satisfaction? Learning?)
  2. What design techniques does it use to capture attention?
  3. Whose interests are served by these designs?
  4. What would a version of this platform designed for user wellbeing look like?


What's Next

In Chapter 5: Power, Knowledge, and Data, we'll step back to examine the theoretical foundations of data power — drawing on Foucault's analysis of power/knowledge, contemporary theories of information asymmetry, and the structural dynamics that make data governance a question not just of technology or law but of power.

Before moving on, complete the exercises and quiz to solidify your understanding of the attention economy.


Chapter 4 Exercises → exercises.md

Chapter 4 Quiz → quiz.md

Case Study: TikTok's Recommendation Algorithm → case-study-01.md

Case Study: The Facebook Emotional Contagion Experiment → case-study-02.md