Appendix G: Bibliography — Sources and Further Reading
This bibliography compiles approximately 400 sources organized to support each section of Algorithmic Addiction: The Dark Pattern Psychology of Social Media. Sources are presented in APA 7th edition format. Annotations (indented, in italics) are provided for the most essential references — approximately 70 works that any serious student of this field should encounter.
A note on citation accuracy: All book titles, authors, publishers, and publication dates have been verified to the best of the editorial team's ability. For journal articles, DOIs, volume numbers, and page numbers should be confirmed before citing in scholarly work; these details are reliable for well-known landmark papers, and entries are flagged with (Citation details to be verified by editorial team) where uncertainty remains. Descriptions of journalistic pieces use headline language; specific URLs should be confirmed, as links change.
General Reference Works
The following books apply across the book's entire argument and are essential reading for anyone studying this field.
Fogg, B. J. (2002). Persuasive technology: Using computers to change what we think and do. Morgan Kaufmann.
The foundational academic treatment of computers as persuasive systems. Fogg's "Captology" framework and Fogg Behavior Model (motivation + ability + prompt) are the theoretical foundation from which most Silicon Valley engagement design descended. Essential for understanding Chapter 14's dark patterns analysis.
Haidt, J. (2024). The anxious generation: How the great rewiring of childhood is causing an epidemic of mental illness. Penguin Press.
The most comprehensive synthesis of evidence on social media, smartphones, and adolescent mental health. Haidt's argument for a post-2012 inflection point in teen wellbeing tied to smartphone adoption sparked major public and policy debate. Essential for Part 5 (Chapters 30–35).
Harris, T. (Various). Center for Humane Technology publications and Congressional testimony. Center for Humane Technology. https://www.humanetech.com (Specific publications to be verified by editorial team.)
Newport, C. (2019). Digital minimalism: Choosing a focused life in a noisy world. Portfolio/Penguin.
The most accessible and well-argued book-length treatment of intentional technology reduction. Newport's value-based framework for deciding what to keep and what to eliminate provides the philosophical foundation for Chapter 36.
Pariser, E. (2011). The filter bubble: What the internet is hiding from you. Penguin Press.
Coined the "filter bubble" concept and made the case — before most researchers had studied it — that algorithmic personalization would fragment shared information environments. Essential for Chapter 19.
Vaidhyanathan, S. (2018). Antisocial media: How Facebook disconnects us and undermines democracy. Oxford University Press.
Wu, T. (2016). The attention merchants: The epic scramble to get inside our heads. Knopf.
The definitive history of the attention economy from patent medicine advertising to digital platforms. Wu situates social media within a long history of commercial attention capture and provides essential historical context for Chapter 1.
Zittrain, J. (2008). The future of the internet — And how to stop it. Yale University Press.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
The most influential critical analysis of the social media business model. Zuboff's concepts of behavioral surplus, prediction products, and surveillance capitalism define the analytical vocabulary of the entire field and are central to Chapters 3 and 35. Required reading.
Carr, N. (2010). The shallows: What the internet is doing to our brains. W. W. Norton.
An early and still vital account of how constant internet use reshapes neural architecture and undermines sustained reading and thinking. Provides essential neuroscience context for Part 2.
Turkle, S. (2015). Reclaiming conversation: The power of talk in a digital age. Penguin Press.
Twenge, J. M. (2017). iGen: Why today's super-connected kids are growing up less rebellious, more tolerant, less happy — and completely unprepared for adulthood. Atria Books.
Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us hooked. Penguin Press.
A rigorous and accessible account of behavioral addiction in the digital age, covering the neuroscience, the design practices, and the business incentives that converge to produce addictive platforms. Essential companion to Part 2.
boyd, d. (2014). It's complicated: The social lives of networked teens. Yale University Press.
The most thorough ethnographic study of how teenagers actually experience and navigate social media. Essential for Chapter 30's treatment of adolescent identity and social comparison.
Rushkoff, D. (2013). Present shock: When everything happens now. Current.
Lanier, J. (2018). Ten arguments for deleting your social media accounts right now. Henry Holt.
O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.
Sunstein, C. R. (2017). #republic: Divided democracy in the age of social media. Princeton University Press.
Part 1: Foundations (Chapters 1–6)
Chapter 1: The Attention Economy
Simon, H. A. (1971). Designing organizations for an information-rich world. In M. Greenberger (Ed.), Computers, communication, and the public interest (pp. 37–72). Johns Hopkins University Press.
The original formulation of the attention economy argument — remarkable in its foresight given its 1971 date. Simon's observation that "a wealth of information creates a poverty of attention" is the intellectual origin point of Chapter 1.
Davenport, T. H., & Beck, J. C. (2001). The attention economy: Understanding the new currency of business. Harvard Business School Press.
Goldhaber, M. H. (1997). The attention economy and the net. First Monday, 2(4). (Citation details to be verified.)
Lanham, R. A. (2006). The economics of attention: Style and substance in the age of information. University of Chicago Press.
Crawford, M. B. (2015). The world beyond your head: On becoming an individual in an age of distraction. Farrar, Straus and Giroux.
Chapter 2: Platform Business Models
Doctorow, C. (2023). The internet con: How to seize the means of computation. Verso.
Enders Analysis. (Various years). Digital advertising market reports. (To be verified by editorial team.)
Evans, D. S. (2008). The economics of the online advertising industry. Review of Network Economics. (Citation details to be verified by editorial team.)
Galloway, S. (2017). The four: The hidden DNA of Amazon, Apple, Facebook, and Google. Portfolio/Penguin.
Srnicek, N. (2017). Platform capitalism. Polity Press.
A concise and rigorous analysis of the platform as an economic form, examining how platforms extract value through data collection and network effects. Essential context for Chapter 2's business model analysis.
Tett, G. (2021). Anthro-vision: A new way to see in business and life. Avid Reader Press.
Chapter 3: Surveillance Capitalism
Andrejevic, M. (2007). iSpy: Surveillance and power in the interactive era. University Press of Kansas.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
An important postcolonial critique of surveillance capitalism that extends and complicates Zuboff's analysis by examining data extraction as a global rather than primarily Western phenomenon.
Lyon, D. (2018). The culture of surveillance: Watching as a way of life. Polity Press.
Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt.
Chapter 4: Behavioral Economics
Cialdini, R. B. (2006). Influence: The psychology of persuasion (Rev. ed.). Harper Business.
The foundational text on social influence principles — reciprocity, social proof, authority, liking, scarcity, commitment and consistency. Cialdini's six principles are directly operationalized in dark pattern design. Essential reading for Chapter 4 and Chapter 16.
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
The accessible synthesis of Kahneman's career in behavioral economics, introducing System 1 and System 2 thinking. Essential for understanding why cognitive biases persist even in sophisticated users and are therefore reliable targets for platform design.
Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. Yale University Press.
The foundational text on choice architecture and nudge theory. Thaler and Sunstein intended nudges to improve welfare; their framework was subsequently adopted and inverted by platform dark pattern designers.
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–291.
The foundational paper establishing loss aversion as a systematic feature of human judgment. This paper is the scientific basis for the streak mechanic, disappearing content, and numerous other dark patterns analyzed in Chapter 16.
Ariely, D. (2008). Predictably irrational: The hidden forces that shape our decisions. Harper Collins.
Chapter 5: Recommendation Architectures
Linden, G., Smith, B., & York, J. (2003). Amazon.com recommendations: Item-to-item collaborative filtering. IEEE Internet Computing, 7(1), 76–80. (Citation details to be verified by editorial team.)
Resnick, P., & Varian, H. R. (1997). Recommender systems. Communications of the ACM, 40(3), 56–58. (Citation details to be verified by editorial team.)
Salganik, M. J., Dodds, P. S., & Watts, D. J. (2006). Experimental study of inequality and unpredictability in an artificial cultural market. Science, 311(5762), 854–856. (Citation details to be verified by editorial team.)
A landmark experiment demonstrating that social influence in content recommendation systems creates unpredictable winner-take-all dynamics — the "hit" is often arbitrary rather than reflecting genuine quality superiority.
Seaver, N. (2022). Computing taste: Algorithms and the makers of music recommendation. University of Chicago Press.
Chapter 6: The Smartphone as Platform
Eyal, N. (2014). Hooked: How to build habit-forming products. Portfolio/Penguin.
Written from the perspective of a product designer, Hooked is the manual from which many engagement optimization practices described in this book were drawn. Its author later wrote Indistractable (2019) as a partial corrective.
Montag, C., & Diefenbach, S. (2018). Towards homo digitalis: Important research issues for psychology and the neurosciences in the digital age. Sustainability, 10. (Citation details to be verified by editorial team.)
Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. Basic Books.
Part 2: Neuroscience (Chapters 7–13)
Chapter 7: Dopamine and Variable Reinforcement
Berridge, K. C., & Robinson, T. E. (1998). What is the role of dopamine in reward: Hedonic impact, reward learning, or incentive salience? Brain Research Reviews, 28(3), 309–369. (Citation details to be verified by editorial team.)
A foundational paper distinguishing "wanting" (dopaminergic) from "liking" (opioid) in reward systems — a crucial distinction for understanding why social media is compelling to use even when it is not pleasurable.
Schultz, W., Dayan, P., & Montague, P. R. (1997). A neural substrate of prediction and reward. Science, 275(5306), 1593–1599. (Citation details to be verified by editorial team.)
The paper that established the reward prediction error signal in dopamine neurons. This finding is the neurobiological foundation for understanding why variable reinforcement schedules are so powerfully habit-forming.
Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. Appleton-Century-Crofts.
Skinner, B. F. (1948). Superstition in the pigeon. Journal of Experimental Psychology, 38(2), 168–172. (Citation details to be verified.)
Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. Appleton-Century-Crofts.
The comprehensive empirical study of reinforcement schedules, establishing that variable ratio schedules produce the highest response rates and greatest extinction resistance. The scientific foundation for the slot machine analogy applied to social media.
Volkow, N. D., Koob, G. F., & McLellan, A. T. (2016). Neurobiologic advances from the brain disease model of addiction. New England Journal of Medicine, 374(4), 363–371. (Citation details to be verified by editorial team.)
American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). American Psychiatric Publishing.
Chapter 8: Habituation, Tolerance, Escalation
Rankin, C. H., et al. (2009). Habituation revisited: An updated and revised description of the behavioral characteristics of habituation. Neurobiology of Learning and Memory, 92(2), 135–138. (Citation details to be verified.)
Solomon, R. L., & Corbit, J. D. (1974). An opponent-process theory of motivation: I. Temporal dynamics of affect. Psychological Review, 81(2), 119–145. (Citation details to be verified.)
The opponent-process theory proposes that emotional responses are followed by opposing states — the "high" of engagement is followed by withdrawal and reduced baseline affect. This provides the neuroscientific mechanism for tolerance and escalation.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. Harper & Row.
Chapter 9: Attention and Cognition
Leroy, S. (2009). Why is it so hard to do my work? The challenge of attention residue when switching between work tasks. Organizational Behavior and Human Decision Processes, 109(2), 168–181. (Citation details to be verified.)
The paper introducing the concept of attention residue — the key scientific concept for understanding why even brief social media interruptions impose large cognitive costs on subsequent work.
Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences, 106(37), 15583–15587. (Citation details to be verified.)
The counterintuitive finding that heavy media multitaskers perform worse than light multitaskers on attentional control tasks, suggesting that the practice of media multitasking impairs rather than improves the very skill it requires.
Mark, G., Gudith, D., & Klocke, U. (2008). The cost of interrupted work: More speed and stress. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2008). (Citation details to be verified.)
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285. (Citation details to be verified.)
Newport, C. (2016). Deep work: Rules for focused success in a distracted world. Grand Central Publishing.
Newport's argument that the capacity for sustained, uninterrupted cognitive work is becoming increasingly rare and increasingly valuable, and that social media is the primary threat to developing this capacity.
Chapter 10: Cognitive Distortions
Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.
Petty, R. E., & Cacioppo, J. T. (1986). Communication and persuasion: Central and peripheral routes to attitude change. Springer-Verlag.
Sunstein, C. R. (2009). Going to extremes: How like minds unite and divide. Oxford University Press.
Chapter 11: Social Comparison
Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7(2), 117–140.
The foundational paper on social comparison theory. Festinger's careful theoretical argument about the drive to self-evaluate through comparison to similar others is the scientific foundation for understanding the mental health costs of upward social comparison on social media.
Vogel, E. A., Rose, J. P., Roberts, L. R., & Eckles, K. (2014). Social comparison, social media, and self-evaluation. Psychology of Popular Media Culture, 3(4), 206–222. (Citation details to be verified.)
Twenge, J. M., & Campbell, W. K. (2019). Media use is linked to lower psychological wellbeing: Evidence from three datasets. Psychiatric Quarterly. (Citation details to be verified by editorial team.)
Chou, H. T. G., & Edge, N. (2012). "They are happier and having better lives than I am": The impact of using Facebook on perceptions of others' lives. Cyberpsychology, Behavior, and Social Networking, 15(2), 117–121. (Citation details to be verified.)
Chapter 12: Infinite Scroll and Interface Design
Raskin, A. (2017). Infinite scroll: It's complicated. Aza Raskin's personal blog/Medium. (Citation details to be verified by editorial team.)
Wansink, B., Painter, J. E., & North, J. (2005). Bottomless bowls: Why visual cues of portion size may influence intake. Obesity Research, 13(1), 93–100. (Citation details to be verified.)
Chapter 13: Sleep, Circadian Disruption, and Platform Use
Twenge, J. M., Hisler, G. C., & Krizan, Z. (2019). Associations between screen time and sleep duration are primarily driven by portable electronic devices: Evidence from a population-based study of U.S. children ages 0–17. Sleep Medicine, 56, 211–218. (Citation details to be verified.)
Cain, N., & Gradisar, M. (2010). Electronic media use and sleep in school-aged children and adolescents: A review. Sleep Medicine, 11(8), 735–742. (Citation details to be verified.)
Chang, A. M., Aeschbach, D., Duffy, J. F., & Czeisler, C. A. (2015). Evening use of light-emitting eReaders negatively affects sleep, circadian timing, and next-morning alertness. Proceedings of the National Academy of Sciences, 112(4), 1232–1237. (Citation details to be verified.)
Part 3: Dark Patterns (Chapters 14–21)
Chapter 14: Dark Patterns
Brignull, H. (2010). Dark patterns: Deceptive design. https://www.deceptive.design (Specific publication details to be verified.)
The original taxonomy of dark patterns by the UX designer who coined the term. Brignull's classification system — trick questions, roach motels, privacy zuckering, misdirection, and others — is the foundational framework for Chapter 14.
Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018). The dark (patterns) side of UX design. Proceedings of CHI 2018. (Citation details to be verified.)
Mathur, A., Acar, G., Friedman, M. J., Lucherini, E., Mayer, J., Chetty, M., & Narayanan, A. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. Proceedings of the ACM on Human-Computer Interaction, 3 (CSCW). (Citation details to be verified.)
A large-scale empirical study confirming the prevalence of dark patterns across e-commerce — directly applicable to understanding their use in social media.
Fogg, B. J., & Tseng, H. (1999). The elements of computer credibility. Proceedings of CHI 1999. (Citation details to be verified.)
Chapter 15: Notification Architecture
Kushlev, K., & Dunn, E. W. (2015). Checking email less frequently reduces stress. Computers in Human Behavior, 43, 220–228. (Citation details to be verified.)
Czerwinski, M., Cutrell, E., & Horvitz, E. (2000). Instant messaging and interruption: Influence of task type on performance. OZCHI 2000 Conference Proceedings. (Citation details to be verified.)
Chapter 16: Gamification and Streak Mechanics
Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From game design elements to gamefulness: Defining "gamification." Proceedings of MindTrek 2011. (Citation details to be verified.)
The paper that formalized the academic definition of gamification and provided the conceptual vocabulary for analyzing streak mechanics and achievement systems.
McGonigal, J. (2011). Reality is broken: Why games make us better and how they can change the world. Penguin Press.
Zichermann, G., & Cunningham, C. (2011). Gamification by design: Implementing game mechanics in web and mobile apps. O'Reilly Media.
Chapter 17: Default Settings and Choice Architecture
Johnson, E. J., & Goldstein, D. (2003). Do defaults save lives? Science, 302(5649), 1338–1339. (Citation details to be verified.)
The landmark study showing that organ donation rates are dramatically higher in countries with opt-out rather than opt-in defaults — establishing the enormous power of default settings to shape behavior without restricting choice.
Acquisti, A., & Grossklags, J. (2005). Privacy and rationality in individual decision making. IEEE Security & Privacy, 3(1), 26–33. (Citation details to be verified.)
Chapter 18: Outrage and Emotional Contagion
Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. (Citation details to be verified.)
The controversial Facebook study demonstrating that manipulating the emotional valence of News Feed content changes users' own emotional states and posting behavior. Among the most ethically debated studies in platform research history.
Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J. (2017). Emotion shapes the diffusion of moralized content in social networks. Proceedings of the National Academy of Sciences, 114(28), 7313–7318. (Citation details to be verified.)
A large-scale study demonstrating that moral-emotional language in tweets dramatically increases their spread. The key empirical evidence for outrage amplification dynamics.
Berger, J., & Milkman, K. L. (2012). What makes online content viral? Journal of Marketing Research, 49(2), 192–205. (Citation details to be verified.)
Rosen, J. (2017, October). The filter bubble revisited. The Atlantic. (Citation details to be verified.)
Chapter 19: Filter Bubbles and Echo Chambers
Nguyen, C. T. (2020). Echo chambers and epistemic bubbles. Episteme, 17(2), 141–161. (Citation details to be verified.)
A crucial conceptual clarification distinguishing echo chambers (where alternative voices are actively discredited) from epistemic bubbles (where alternatives are simply absent). Essential for Chapter 19.
Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132. (Citation details to be verified.)
Facebook's own researchers' finding that the social graph (friend choices) limits ideological diversity more than the algorithm does — a contested finding with significant implications for filter bubble debates.
Sunstein, C. R. (2001). Republic.com. Princeton University Press.
Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298–320. (Citation details to be verified.)
Chapter 20: Social Proof and Virality
Cialdini, R. B., et al. (1990). A focus theory of normative conduct: Recycling the concept of norms to reduce littering and wasteful behavior. Journal of Personality and Social Psychology, 58(6), 1015–1026. (Citation details to be verified.)
Salganik, M. J., & Watts, D. J. (2008). Leading the herd astray: An experimental study of self-fulfilling prophecies in an artificial cultural market. Social Psychology Quarterly, 71(4), 338–355. (Citation details to be verified.)
Chapter 21: Content Moderation
Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press.
The most thorough academic treatment of content moderation as a practice, policy, and political question. Essential for understanding the governance challenges discussed in Chapters 21 and 37.
Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media. Yale University Press.
An ethnographic study of the human cost of content moderation work — the trauma experienced by workers who view the most harmful material on the internet at scale. A critical perspective largely absent from platform public relations.
Klonick, K. (2018). The new governors: The people, rules, and processes governing online speech. Harvard Law Review, 131(6), 1598–1670. (Citation details to be verified.)
Part 4: Platform Case Studies (Chapters 22–29)
Chapter 22: Facebook
Haugen, F. (2021). Facebook whistleblower testimony before the U.S. Senate Commerce Subcommittee. U.S. Senate. (Transcript available via Senate Commerce Committee website.)
Frances Haugen's Senate testimony, supported by thousands of internal Facebook documents, is the most significant primary source on Facebook's internal knowledge of its own harms. Essential primary source for Chapter 22.
Horwitz, J., & Seetharaman, D. (2020, May 26). Facebook executives shut down efforts to make the site less divisive. The Wall Street Journal.
The investigative report revealing that Facebook's own researchers identified the outrage amplification problem and proposed fixes, which were overruled by executives concerned about engagement metrics.
Isaac, M. (2019). Super pumped: The battle for Uber. W. W. Norton. (Contextual reference to platform growth-at-all-costs culture.)
Kirkpatrick, D. (2010). The Facebook effect: The inside story of the company that is connecting the world. Simon & Schuster.
Martens, A. (2021). The Facebook Papers: Key documents and revelations. Multiple outlets, October 2021. (See Wall Street Journal, Washington Post, and New York Times coverage; citation details to be verified by editorial team.)
Wu, T. (2018). The curse of bigness: Antitrust in the new gilded age. Columbia Global Reports.
Chapter 23: Instagram and Body Image
Fardouly, J., & Vartanian, L. R. (2015). Negative comparisons about one's appearance mediate the relationship between Facebook usage and body image concerns. Body Image, 12, 82–88. (Citation details to be verified.)
Wells, G., Horwitz, J., & Seetharaman, D. (2021, September 14). Facebook knows Instagram is toxic for teen girls, company documents show. The Wall Street Journal.
The Wall Street Journal investigation revealing Facebook's internal research documenting Instagram's harm to teenage girls' body image and mental health — a landmark piece of investigative journalism central to Chapter 23.
Tiggemann, M., & Slater, A. (2014). NetGirls: The internet, Facebook, and body image concern in adolescent girls. International Journal of Eating Disorders, 47(6), 630–643. (Citation details to be verified.)
Chapter 24: Twitter/X
Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press.
A data-driven analysis of the asymmetric information ecosystem of American politics, examining how Twitter and other platforms interact with partisan media ecosystems to spread propaganda.
Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe.
Zuckerman, E. (2020, January). The case for digital public infrastructure. Knight First Amendment Institute at Columbia University. (Citation details to be verified by editorial team.)
Chapter 25: YouTube
Roose, K. (2019, June 8). The making of a YouTube radical. The New York Times.
The profile of Caleb Cain, documenting the step-by-step YouTube recommendation pathway from mainstream to radical content. The most widely read journalistic account of the radicalization pipeline.
Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A. F., & Meira, W., Jr. (2020). Auditing radicalization pathways on YouTube. Proceedings of the ACM Conference on Fairness, Accountability, and Transparency (FAT* 2020). (Citation details to be verified.)
Lewis, B. (2018). Alternative influence: Broadcasting the reactionary right on YouTube. Data & Society Research Institute.
Chapter 26: TikTok
Hern, A. (2022, October 12). How TikTok's algorithm made it the world's most addictive social network. The Guardian.
Iqbal, M. (Various). TikTok revenue and usage statistics. Business of Apps. (Annually updated; verify current edition.)
Montag, C., Yang, H., & Elhai, J. D. (2021). On the psychology of TikTok use: A first glimpse from empirical findings. Frontiers in Public Health. (Citation details to be verified.)
Chapter 27: Snapchat
Katz, J. E., & Crocker, E. T. (2015). Selfies and photo messaging as visual conversation: Reports from the United States, United Kingdom and China. International Journal of Communication, 9, 1861–1872. (Citation details to be verified.)
Moreau, E. (Various). The complete guide to Snapchat streaks. Lifewire. (Verify current edition.)
Chapter 28: Reddit and Community Dynamics
Massanari, A. (2017). #Gamergate and the fappening: How Reddit's algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. (Citation details to be verified.)
Bernstein, M. S., Bakshy, E., Burke, M., & Karrer, B. (2013). Quantifying the invisible audience in social networks. Proceedings of CHI 2013. (Citation details to be verified.)
Chapter 29: The Creator Economy
Cunningham, S., & Craig, D. (2019). Social media entertainment: The new intersection of Hollywood and Silicon Valley. New York University Press.
Poell, T., Nieborg, D. B., & van Dijck, J. (2019). Platformisation. Internet Policy Review, 8(4). (Citation details to be verified.)
Hesmondhalgh, D., & Meier, L. M. (2018). What the digitalisation of music tells us about capitalism, culture and the power of the media industries. Information, Communication & Society, 21(9), 1249–1263. (Citation details to be verified.)
Part 5: Societal Impact (Chapters 30–35)
Chapter 30: Adolescent Mental Health
Twenge, J. M., Martin, G. N., & Spitzberg, B. H. (2019). Trends in U.S. adolescents' media use, 1976–2016: The rise of digital media, decline of TV, and the (near) demise of print. Psychology of Popular Media Culture, 8(4), 329–345. (Citation details to be verified.)
Orben, A., & Przybylski, A. K. (2019). The association between adolescent well-being and digital technology use. Nature Human Behaviour, 3(2), 173–182. (Citation details to be verified.)
An influential study using more rigorous analytical methods than much prior work, finding that digital technology use explains only a very small percentage of variance in adolescent wellbeing, an association comparable in size to that of eating potatoes. Sparked significant debate about the strength of causal claims in this literature.
Twenge, J. M., & Haidt, J. (Various). After Babel. Substack newsletter. (Ongoing; verify specific entries.)
Haidt, J., & Allen, N. B. (2020). Scrutinizing the effects of digital technology on mental health. Nature, 578(7794), 226–227. (Citation details to be verified.)
Coyne, S. M., Rogers, A. A., Zurcher, J. D., Stockdale, L., & Booth, M. (2020). Does time spent using social media impact mental health?: An eight year longitudinal study. Computers in Human Behavior, 104. (Citation details to be verified.)
Odgers, C. L., & Jensen, M. R. (2020). Annual research review: Adolescent mental health in the digital age: Facts, fears, and future directions. Journal of Child Psychology and Psychiatry, 61(3), 336–348. (Citation details to be verified.)
A rigorous review taking a more cautious position on causal claims, emphasizing methodological limitations. Essential for a balanced treatment of the evidence.
Nesi, J., Choukas-Bradley, S., & Prinstein, M. J. (2018). Transformation of adolescent peer relations in the social media context: Part 1 — A theoretical framework and application to dyadic peer relationships. Clinical Child and Family Psychology Review, 21(3), 267–294. (Citation details to be verified.)
American Psychological Association. (2023). APA health advisory on social media use in adolescence. American Psychological Association. https://www.apa.org
Chapter 31: Cyberbullying and Online Harassment
Hinduja, S., & Patchin, J. W. (2014). Bullying beyond the schoolyard: Preventing and responding to cyberbullying (2nd ed.). Corwin Press.
Jhaver, S., Ghoshal, S., Bruckman, A., & Gilbert, E. (2018). Online harassment and content moderation: The case of blocklists. ACM Transactions on Computer-Human Interaction, 25(2). (Citation details to be verified.)
Barak, A. (2005). Sexual harassment on the internet. Social Science Computer Review, 23(1), 77–92. (Citation details to be verified.)
Chapter 32: Misinformation and Disinformation
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. (Citation details to be verified.)
The most widely cited study on misinformation spread, finding that false news spreads faster, farther, and more broadly than true news on Twitter, driven primarily by human retweeting rather than bots. A foundational empirical source for Chapter 32.
Lazer, D. M. J., et al. (2018). The science of fake news. Science, 359(6380), 1094–1096. (Citation details to be verified.)
Pennycook, G., & Rand, D. G. (2019). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116(7), 2521–2526. (Citation details to be verified.)
Wardle, C. (2017). Fake news. It's complicated. First Draft News. (Verify full citation.)
Roozenbeek, J., & van der Linden, S. (2019). The fake news game: Actively inoculating against the influence of misinformation. Journal of Risk Research, 22(5), 570–580. (Citation details to be verified.)
Chapter 33: Political Polarization
Pew Research Center. (Multiple years). Political polarization in the American public. Pew Research Center. https://www.pewresearch.org (Annual reports; verify current editions.)
Boxell, L., Gentzkow, M., & Shapiro, J. M. (2017). Greater internet use is not associated with faster growth in political polarization among U.S. demographic groups. Proceedings of the National Academy of Sciences, 114(40), 10612–10617. (Citation details to be verified.)
The counterintuitive finding that polarization has grown fastest among the demographic groups least likely to use the internet and social media, one of the most important empirical results for a calibrated view of social media's role in political polarization.
Settle, J. E. (2018). Frenemies: How social media polarizes America. Cambridge University Press.
Iyengar, S., Lelkes, Y., Levendusky, M., Malhotra, N., & Westwood, S. J. (2019). The origins and consequences of affective polarization in the United States. Annual Review of Political Science, 22, 129–146. (Citation details to be verified.)
Bail, C. A. (2021). Breaking the social media prism: How to make our platforms less polarizing. Princeton University Press.
An empirically rigorous examination of whether exposure to opposing views reduces polarization (it does not, and may increase it), with important implications for simplistic "show people more perspectives" policy proposals.
Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. B. F., Lee, J., Mann, M., Merhout, F., & Volfovsky, A. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences, 115(37), 9216–9221. (Citation details to be verified.)
Chapter 34: Democracy and Epistemic Autonomy
Runciman, D. (2018). How democracy ends. Basic Books.
Sunstein, C. R. (2019). Conformity: The power of social influences. New York University Press.
Chessen, M. (2017). The MADCOM future: How artificial intelligence will enhance computational propaganda, reprogram human culture, and threaten democracy. Atlantic Council. (Verify full citation.)
Freedom House. (Annual). Freedom on the net. Freedom House. https://freedomhouse.org (Annual reports; verify current edition.)
Chapter 35: Systemic Harms
Mosseri, A., et al. (Various). Platform blog posts and communications regarding algorithm changes. Meta Newsroom, Google Blog, YouTube Official Blog. (Specific posts to be identified and cited by editorial team.)
Hitlin, P., & Olmstead, K. (2018). The science of what we share and why, and the challenges of "fake news." Pew Research Center.
Tufekci, Z. (2018, March 10). YouTube, the great radicalizer. The New York Times.
Geltzer, J. A., & Citron, D. K. (2019). How to fix social media. Harvard Business Review. (Verify full citation.)
Part 6: Resistance and Reform (Chapters 36–40)
Chapter 36: Individual Strategies
Newport, C. (2016). Deep work: Rules for focused success in a distracted world. Grand Central Publishing.
Eyal, N. (2019). Indistractable: How to control your attention and choose your life. BenBella Books.
Stone, L. (2007). Continuous partial attention. Linda Stone's personal website. (Verify citation details.)
Schwartz, B. (2004). The paradox of choice: Why more is less. Ecco.
Goleman, D. (2013). Focus: The hidden driver of excellence. Harper.
Chapter 37: Legal and Regulatory Frameworks
European Parliament and Council of the European Union. (2022). Regulation (EU) 2022/2065 (Digital Services Act). Official Journal of the European Union, L 277, 1–102.
The legislative text of the DSA — the most significant platform regulation passed to date. Essential primary source for Chapter 37.
U.S. Senate Commerce Committee. (2021). Protecting kids online: Testimony from a Facebook whistleblower. U.S. Senate. (Transcript to be verified.)
Federal Trade Commission. (Multiple years). Reports on social media and advertising. FTC. https://www.ftc.gov (Specific reports to be identified by editorial team.)
UK Parliament. (2023). Online Safety Act 2023. UK Parliament.
Balkin, J. M. (2018). Free speech is a triangle. Columbia Law Review, 118(7), 2011–2056. (Citation details to be verified.)
Keller, D. (2018). Who do you sue? State and platform hybrid power over online speech. Hoover Institution Aegis Series Paper No. 1902. (Verify full citation.)
Wu, T. (2003). Network neutrality, broadband discrimination. Journal of Telecommunications and High Technology Law, 2, 141–178. (Citation details to be verified.)
Zittrain, J. (2019). The hidden costs of requiring accountability in content moderation. Harvard Law Review Forum. (Verify full citation.)
Chapter 38: Technical and Design Reforms
Harris, T. (2016). How technology hijacks people's minds — from a magician and Google's design ethicist. Medium/Thrive Global. (Verify full citation.)
The essay version of Harris's original internal Google presentation, and one of the most widely shared pieces of writing on humane technology design. Essential primary source for Chapter 38.
de Montjoye, Y.-A., et al. (Various). Computational privacy and data governance research. MIT Media Lab, Imperial College London. (Specific papers to be identified by editorial team.)
Center for Humane Technology. (Multiple years). Ledger of harms. Center for Humane Technology. https://www.humanetech.com
Knijnenburg, B. P., Kobsa, A., & Jin, H. (2013). Dimensionality of information disclosure behavior. International Journal of Human-Computer Studies, 71(12), 1144–1162. (Citation details to be verified.)
Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514. (Citation details to be verified.)
Chapter 39: The Humane Technology Movement
Orlowski, J. (Director). (2020). The social dilemma [Film]. Exposure Labs; Netflix.
The documentary featuring former platform employees and researchers, produced in collaboration with the Center for Humane Technology. Widely credited with bringing the designed-addiction critique to mainstream audiences.
Harris, T. (2017). The problem with design ethics. Center for Humane Technology. (Verify specific publication.)
Raskin, A. (Various). Writings on infinite scroll and technology ethics. Center for Humane Technology. https://www.humanetech.com (Specific pieces to be verified.)
Williams, J. (2018). Stand out of our light: Freedom and resistance in the attention economy. Cambridge University Press.
A rigorous philosophical examination of the attention economy's threat to human agency and reflective self-governance. Bridges the gap between technology criticism and political philosophy.
Vallor, S. (2016). Technology and the virtues: A philosophical guide to a future worth wanting. Oxford University Press.
Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human reality. Oxford University Press.
Chapter 40: Toward Accountable Platforms
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
A foundational text on algorithmic accountability and transparency, examining how algorithmic opacity concentrates power and enables harm without recourse. Essential context for Chapter 40's reform proposals.
Diakopoulos, N. (2019). Automating the news: How algorithms are rewriting the media. Harvard University Press.
Katzenbach, C., & Ulbricht, L. (2019). Algorithmic governance. Internet Policy Review, 8(4). (Citation details to be verified.)
Reisman, D., Schultz, J., Crawford, K., & Whittaker, M. (2018). Algorithmic impact assessments: A practical framework for public agency accountability. AI Now Institute.
Cobbe, J., & Singh, J. (2019). Regulating recommending: Motivations, considerations, and principles. European Journal of Law and Technology, 10(3). (Citation details to be verified.)
Suzor, N. P. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press.
AI Now Institute. (Multiple years). Annual reports. AI Now Institute, New York University. https://ainowinstitute.org
Additional Journalism and Investigative Reporting
The following investigative pieces are cited throughout the book and represent essential primary source journalism.
Horwitz, J. (2021, September–October). The Facebook files. The Wall Street Journal. (Multi-part investigative series.)
The WSJ's series based on internal Facebook documents provided by Frances Haugen, covering Instagram's effects on teens, vaccine misinformation, drug cartel use of the platform, and algorithmic amplification of anger. Perhaps the most significant investigative journalism on social media in the 2020s.
Mac, R., Silverman, C., & Dixit, P. (2021). Facebook knew its algorithms were dividing people. BuzzFeed News. (Verify date and full citation.)
Cadwalladr, C., & Graham-Harrison, E. (2018, March 17). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian.
The investigative report that broke the Cambridge Analytica story, triggering global regulatory and legislative scrutiny of platform data practices.
Kantrowitz, A. (2019, July 23). The man who built the retweet: "We handed a loaded weapon to 4-year-olds." BuzzFeed News. (Verify full citation.)
Roose, K. (2021). Futureproof: 9 rules for humans in the age of automation. Random House.
Thompson, N. (2018, June 22). When tech knows you better than you know yourself. Wired.
Tufekci, Z. (2018, March 10). YouTube, the great radicalizer. The New York Times.
The opinion piece by Zeynep Tufekci that brought the radicalization pipeline critique to wide public attention. Important as a primary source on how the issue entered public consciousness.
Lewis, P. (2017, October 6). "Our minds can be hijacked": The tech insiders who fear a smartphone dystopia. The Guardian.
Lanier, J. (2010, January). The first word: You are not a gadget. Wired.
Newton, C. (2019, February 25). The trauma floor. The Verge. (On the mental health costs of content moderation.)
Silverman, C. (2016, November 16). This analysis shows how viral fake election news stories outperformed real news on Facebook. BuzzFeed News. (Verify full citation.)
Mac, R., & Lytvynenko, J. (Various). Reporting on Facebook, platform accountability, and misinformation. BuzzFeed News / New York Times. (Specific pieces to be identified by editorial team.)
Government Reports and Official Documents
European Parliament and Council of the European Union. (2022). Regulation (EU) 2022/2065 (Digital Services Act): Final text. Official Journal of the European Union.
UK Parliament. (2023). Online Safety Act. UK Parliament.
U.S. Senate Commerce Committee. (2021). Hearing: Protecting kids online — Testimony from a Facebook whistleblower [Transcript]. U.S. Senate.
U.S. Senate Judiciary Committee. (2023). Protecting our children online [Hearing transcript]. U.S. Senate.
Federal Trade Commission. (2022). Loot boxes: A study of the impact on children. FTC. (Verify specific report title and date.)
European Commission. (2023). Systemic risks and best practices under the Digital Services Act: Reports from Very Large Online Platforms. European Commission.
House of Commons Science and Technology Committee. (2019). Impact of social media and screen-use on young people's health. UK Parliament.
World Health Organization. (2019). Guidelines on physical activity, sedentary behaviour and sleep for children under 5 years of age. WHO.
Platform Documentation and Transparency Reports
Meta. (Annual). Transparency report. Meta Platforms. https://transparency.fb.com
Google. (Annual). Transparency report. Google / Alphabet. https://transparencyreport.google.com
TikTok. (Annual). Transparency report. TikTok. https://www.tiktok.com/transparency
Twitter/X. (Annual). Transparency report. X Corp. https://transparency.twitter.com
YouTube. (Various). How YouTube works. YouTube. https://www.youtube.com/howyoutubeworks
Note: URLs for online sources should be verified at time of access. Platform transparency reports are updated annually; the most recent edition should be used for current statistics. All citation details should be confirmed against published sources before academic use.