Further Reading: The Attention Economy
The sources below provide deeper engagement with the themes introduced in Chapter 4. They are organized by topic and include a mix of foundational texts, empirical research, accessible popular works, and policy reports. Annotations describe what each source covers and why it is relevant to the chapter's core questions.
The Attention Economy and Platform Business Models
Wu, Tim. The Attention Merchants: The Epic Scramble to Get Inside Our Heads. New York: Knopf, 2016. The definitive history of the attention economy, tracing the advertising-supported model from Benjamin Day's penny press through radio, television, and the internet. Wu demonstrates that the commodification of attention is not a creation of Silicon Valley but a recurring pattern in media history — one that has intensified with each new technology. Essential reading for understanding the historical arc described in Section 4.1.
Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs, 2019. The foundational text for Section 4.4's discussion of behavioral surplus, prediction products, and behavioral modification. Zuboff argues that Google pioneered a new economic logic that claims human experience as free raw material for commercial extraction. The book is ambitious, sprawling, and at times polemical, but its framework for understanding how data extraction drives the attention economy is indispensable. Readers should complement it with critical responses, such as Evgeny Morozov's essay "Capitalism's New Clothes" (The Baffler, 2019).
Vaidhyanathan, Siva. Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. New York: Oxford University Press, 2018. A focused analysis of Facebook's specific role in the attention economy, with particular emphasis on the platform's effects on democratic discourse, political polarization, and public knowledge. Vaidhyanathan argues that Facebook's problems are structural, rooted in its business model rather than in individual decisions or bad actors. A strong companion to the chapter's discussion of engagement optimization.
Persuasive Design and Digital Manipulation
Eyal, Nir. Hooked: How to Build Habit-Forming Products. New York: Portfolio/Penguin, 2014. A practitioner's guide to building products that users return to compulsively. Eyal's "Hook Model" — trigger, action, variable reward, investment — operationalizes the behavioral psychology principles described in Section 4.2 and is widely used in the technology industry. The book is valuable precisely because it makes the engineering of compulsive behavior explicit and systematic. Reading it alongside the chapter's critique of persuasive design reveals the tension between innovation and manipulation.
Harris, Tristan. "How Technology Is Hijacking Your Mind — from a Magician and Google Design Ethicist." Medium, May 18, 2016. The essay that launched the "time well spent" movement and eventually the Center for Humane Technology. Harris, a former Google design ethicist, catalogs specific techniques platforms use to exploit cognitive vulnerabilities — from variable rewards to social reciprocity to the illusion of choice. The essay is accessible, concrete, and directly relevant to Section 4.2's discussion of the architecture of persuasion and Section 4.6.2's coverage of design reform.
Fogg, B.J. Persuasive Technology: Using Computers to Change What We Think and Do. San Francisco: Morgan Kaufmann, 2003. The academic foundation for the persuasive design field. Fogg's Behavior Model (motivation, ability, trigger) is the framework described in Section 4.2.1, and this book is where it was first systematically articulated. While the book predates social media, its principles were directly applied by Fogg's students at companies including Instagram and Facebook. Essential for understanding the intellectual origins of engagement engineering.
Dark Patterns and Manipulative Design
Brignull, Harry. Deceptive Patterns: Exposing the Tricks Tech Companies Use to Control You. London: Testimonium, 2023. The definitive work on dark patterns by the UX designer who coined the term in 2010. Brignull provides a comprehensive taxonomy of deceptive design practices (expanded from his original classification), documents real-world examples with screenshots and analysis, and proposes design ethics principles. Directly relevant to Section 4.3's taxonomy and essential for students who want to identify dark patterns in their own digital environments.
Gray, Colin M., Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs. "The Dark (Patterns) Side of UX Design." Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Paper 534. ACM, 2018. An empirical study of how UX practitioners understand and justify the use of dark patterns in their work. The researchers found that dark patterns are often rationalized as "business requirements" or "what users expect," and that organizational culture plays a larger role than individual ethics in determining whether manipulative designs are deployed. This paper adds institutional and sociological depth to the chapter's treatment of dark patterns as a design phenomenon.
Mental Health, Social Media, and Social Costs
Haidt, Jonathan. The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness. New York: Penguin Press, 2024. The culmination of Haidt's research with Jean Twenge on the relationship between smartphone adoption, social media use, and adolescent mental health decline. Haidt argues that the shift from "play-based childhood" to "phone-based childhood" is a primary driver of the post-2012 rise in anxiety, depression, and self-harm among teens. Directly relevant to Section 4.5.1's Research Spotlight on Haidt and Twenge. Readers should pair this with Orben and Przybylski's critiques (below) for a balanced perspective.
Orben, Amy, and Andrew K. Przybylski. "The Association between Adolescent Well-Being and Digital Technology Use." Nature Human Behaviour 3, no. 2 (2019): 173-182. The most prominent counterpoint to the Haidt/Twenge thesis. Orben and Przybylski analyze large-scale datasets and find that the association between digital technology use and adolescent wellbeing is negative but very small — comparable in magnitude to the effect of wearing glasses or eating potatoes. This paper is essential for understanding the scientific debate that Section 4.5.1 intentionally leaves open. Students should read it alongside Haidt's work and form their own assessment.
Vaidhyanathan, Siva. "The Attention Economy and the Problem of Scale." In Antisocial Media, Chapter 3. Oxford University Press, 2018. A focused chapter-length treatment of how engagement optimization operates at scale to amplify outrage, misinformation, and polarization. Vaidhyanathan's analysis connects the business model (selling attention to advertisers) to the social outcomes (degraded public discourse) in a way that complements the chapter's discussion in Section 4.5.2.
Governance, Regulation, and Alternatives
Center for Humane Technology. Ledger of Harms. Online resource, continuously updated. An annotated catalog of documented harms associated with attention-economy platforms, organized by category (mental health, democracy, relationships, attention). The Ledger functions as a living evidence base for the social costs discussed in Section 4.5 and is a useful starting point for research projects and policy analysis. Freely available at humanetech.com.
Yeung, Karen. "'Hypernudge': Big Data as a Mode of Regulation by Design." Information, Communication & Society 20, no. 1 (2017): 118-136. A legal scholar's analysis of how algorithmic personalization functions as a form of regulation — shaping behavior not through explicit rules but through the continuous manipulation of information environments. Yeung's concept of the "hypernudge" connects the attention economy's architecture of persuasion to governance theory, arguing that algorithmic nudges are qualitatively different from traditional nudges because they are dynamic, opaque, and operate at scale. Directly relevant to the chapter's discussion of autonomy and behavioral modification.
European Commission. "Digital Services Act: Regulation (EU) 2022/2065." Official Journal of the European Union, 2022. The full text of the DSA, which requires large platforms to disclose recommendation algorithm logic, offer non-personalized alternatives, prohibit targeted advertising to minors, and conduct systemic risk assessments. The most significant piece of attention-economy governance enacted to date. Essential primary source for Section 4.6.1's discussion of regulatory approaches.
Information Commissioner's Office (UK). "Age Appropriate Design: A Code of Practice for Online Services." ICO, 2020. The UK's landmark code requiring platforms to default to the most privacy-protective settings for under-18 users and prohibiting "nudge techniques" that encourage children to weaken their privacy protections. A model for child-focused attention-economy governance that has influenced legislation in California, Ireland, and the EU. Directly referenced in Section 4.6.1.
These readings are starting points, not endpoints. As subsequent chapters examine privacy (Part 2), algorithmic systems (Part 3), and governance (Part 4), the concepts introduced here — attention as commodity, design as manipulation, behavioral surplus as fuel — will recur as structural themes. The attention economy is not a single chapter's concern; it is the economic foundation on which nearly every other issue in this textbook rests.