Appendix E: Glossary — 200+ Terms Defined

This glossary compiles over 200 terms introduced and developed throughout Algorithmic Addiction: The Dark Pattern Psychology of Social Media. Entries are alphabetized, with the chapter of first significant use noted in parentheses. Cross-references appear in italics. Where a term spans multiple chapters, the chapter of primary treatment is listed; secondary appearances are not exhaustively catalogued here.

Readers are encouraged to use this glossary not only as a reference tool but as a conceptual map of the field — the relationships between terms illuminate the interlocking systems that constitute the attention economy.


A

A/B testing (Chapter 14) — An experimental method in which two or more variants of a design element, interface, or algorithm are deployed to different user subsets simultaneously to measure which version produces superior engagement outcomes. Social media platforms run thousands of A/B tests continuously, treating users as unwitting experimental subjects in behavioral optimization trials. The cumulative effect of A/B testing is the systematic selection of interface designs that maximize addictive engagement over user wellbeing. See also: engagement optimization, friction, dark patterns.
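
Because an A/B comparison ultimately reduces to comparing engagement rates between two user cohorts, the underlying statistics fit in a few lines. A minimal sketch in Python (the counts and the scenario are hypothetical), using a standard two-proportion z-test on click rates:

```python
import math

def two_proportion_ztest(clicks_a, views_a, clicks_b, views_b):
    """z-test for whether variant B's engagement rate differs from variant A's."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test of a redesigned notification badge
z, p = two_proportion_ztest(clicks_a=480, views_a=10_000,
                            clicks_b=540, views_b=10_000)
```

Platforms run comparisons like this continuously across thousands of concurrent experiments; the point here is only that the machinery for selecting the more engaging variant is statistically trivial.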

Addiction, behavioral (Chapter 7) — A pattern of compulsive engagement with a rewarding behavior — such as gambling, gaming, or social media use — that persists despite negative consequences and shares neurobiological features with substance addiction, including tolerance, withdrawal, craving, and loss of control. Behavioral addiction to social media is increasingly recognized in clinical literature, though formal diagnostic criteria remain debated. See also: compulsive use, problematic use, tolerance, withdrawal.

Adolescent brain development (Chapter 30) — The ongoing maturation of neural structures, particularly the prefrontal cortex, that continues throughout adolescence and into the mid-twenties. Because the reward-seeking limbic system develops ahead of the regulatory prefrontal cortex, adolescents are neurobiologically more susceptible to reward-based manipulation, social comparison pressure, and addictive use patterns. See also: prefrontal cortex, nucleus accumbens, identity formation, social comparison theory.

Affective computing (Chapter 16) — A field of computer science focused on systems that can detect, interpret, and simulate human emotions, often using facial recognition, physiological sensors, or text analysis. Platforms increasingly employ affective computing to tailor content delivery to users' emotional states in real time. See also: emotional contagion, engagement optimization.

Affordance (Chapter 14) — In design theory, any feature of an object or system that signals or enables a particular action by the user. Digital affordances such as the Like button, the share icon, or the pull-to-refresh gesture shape user behavior by making certain actions easy, visible, and socially rewarding while obscuring others. See also: dark patterns, friction, default settings.

Aggregated behavioral data (Chapter 3) — Large datasets compiled from the tracked behaviors of many users, used to identify patterns, train machine learning models, and predict individual behavior. Aggregated behavioral data is the raw material of surveillance capitalism, converted into products sold to advertisers and other third parties. See also: behavioral surplus, surveillance capitalism, data exhaust.

Algorithmic amplification (Chapter 18) — The process by which recommendation and ranking algorithms preferentially surface certain content — typically that which drives higher engagement — over other content. Algorithmic amplification has been implicated in the spread of misinformation, political radicalization, and outrage-driven discourse. See also: algorithmic curation, outrage amplification, radicalization pipeline, filter bubble.

Algorithmic curation (Chapter 5) — The automated selection, ranking, and presentation of content to individual users based on predicted engagement, past behavior, and demographic inferences. Unlike editorial curation by humans, algorithmic curation operates at scale and is optimized for platform-defined metrics rather than journalistic or social values. See also: collaborative filtering, engagement optimization, EdgeRank, TikTok For You Page.

Algorithmic transparency (Chapter 38) — The degree to which the logic, training data, and optimization targets of a recommendation algorithm are disclosed to users, regulators, and researchers. Algorithmic transparency is a contested regulatory demand: platforms resist disclosure on grounds of trade secrecy and security, while critics argue opacity enables harm without accountability. See also: design ethics, Digital Services Act, consent architecture.

Attention budget (Chapter 36) — A framework for treating daily attentional capacity as a finite resource to be deliberately allocated rather than surrendered to platform defaults. The concept of an attention budget, popularized in digital wellbeing discourse, encourages users to set intentional limits on platform engagement and protect deep-focus time. See also: digital minimalism, cognitive load, Time Well Spent.

Attention economy (Chapter 1) — An economic model in which human attention is treated as a scarce resource to be captured, measured, and sold to advertisers. First theorized by Herbert Simon in 1971, the attention economy became the dominant logic of digital platforms in the 2010s, transforming what had appeared to be free services into elaborate systems for monetizing user time and focus. See also: engagement metric, surveillance capitalism, behavioral surplus, programmatic advertising.

Attention residue (Chapter 9) — The cognitive phenomenon in which partial activation of a prior task continues to consume working memory resources after a person has shifted to a new task. Social media interruptions generate attention residue that degrades performance on subsequent activities, even after the phone is put away. See also: task-switching cost, cognitive load, lateral inhibition.

Autonomy-preserving design (Chapter 39) — An approach to product design that prioritizes the user's capacity to make free, informed, and deliberate choices about their technology use, rather than nudging behavior toward platform-preferred outcomes. Autonomy-preserving design stands in explicit contrast to persuasive technology and dark patterns. See also: ethical design, humane technology, consent architecture, design ethics.

Autoplay (Chapter 12) — A platform feature that automatically begins the next video, podcast, or content item without requiring any user action. Autoplay removes friction from continued consumption, exploits the path-of-least-resistance principle, and transforms passive continuation into the default state. Netflix, YouTube, and TikTok all employ autoplay, with significant effects on time-on-platform. See also: friction, infinite scroll, default settings, time-on-platform.


B

Behavioral addiction — See addiction, behavioral.

Behavioral economics (Chapter 4) — A field combining psychology and economics that studies how cognitive biases and heuristics cause human decision-making to deviate from the rational-actor model. Social media platforms systematically exploit behavioral economics insights — loss aversion, social proof, default effects — to maximize engagement. See also: loss aversion, cognitive bias, dark patterns.

Behavioral surplus (Chapter 3) — Shoshana Zuboff's term for the data generated by user behavior that exceeds what is needed to improve platform services and is instead extracted, analyzed, and sold as a raw material for predicting and modifying future behavior. Behavioral surplus is the foundational input of surveillance capitalism. See also: surveillance capitalism, data exhaust, aggregated behavioral data.

Binge-watching (Chapter 12) — The practice of watching multiple episodes of a television series in rapid succession, typically facilitated by autoplay and the removal of natural stopping cues. Platform design actively engineers binge-watching through countdown timers, cliffhanger content sequencing, and the absence of completion cues. See also: autoplay, infinite scroll, streak mechanic.

Bottomless bowl effect (Chapter 12) — A phenomenon, studied by Brian Wansink, in which the absence of clear endpoint cues leads to overconsumption; applied to digital media, it describes how infinite scroll and autoplay remove the natural stopping points that would otherwise terminate a session. See also: infinite scroll, autoplay, friction.

Bundled notifications (Chapter 15) — A notification delivery strategy that collects multiple alerts and delivers them together at intervals, rather than sending each individually. While sometimes framed as a user-friendly feature reducing interruptions, bundled notifications can also be timed for maximum re-engagement effect. Contrast with notification timing. See also: notification timing, variable reinforcement, engagement optimization.


C

Cambridge Analytica scandal (Chapter 22) — The 2018 revelation that political consulting firm Cambridge Analytica had harvested the Facebook profile data of approximately 87 million users without their knowledge, using it to build psychographic targeting profiles for political advertising. The scandal catalyzed regulatory scrutiny of platform data practices globally. See also: surveillance capitalism, behavioral surplus, GDPR, Section 230.

Captology (Chapter 14) — B.J. Fogg's term for the study of computers as persuasive technologies — the intersection of computing and persuasion. Captology provided the academic framework subsequently operationalized by Silicon Valley product designers to build systems that change user beliefs and behaviors. See also: persuasive technology, dark patterns, engagement optimization.

Churn (Chapter 2) — The rate at which users disengage from or abandon a platform over a given period. Reducing churn is a primary platform objective, driving features designed to increase switching costs, deepen habitual use, and maximize the social cost of leaving. See also: DAU, MAU, platform dependency, lock-in.
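
As a back-of-envelope illustration (the numbers are invented), churn over a period can be computed from the user counts at the start and end of the period plus sign-ups:

```python
def churn_rate(start_users, end_users, new_users):
    """Fraction of the starting user base lost over the period.
    Users lost = start + sign-ups - end."""
    lost = start_users + new_users - end_users
    return lost / start_users

# Hypothetical month: 10,000 users at the start, 1,200 sign-ups, 10,400 at the end
rate = churn_rate(10_000, 10_400, 1_200)  # 800 lost / 10,000 = 0.08
```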

Click-through rate (CTR) (Chapter 2) — A metric measuring the proportion of users who click on a link, advertisement, or piece of content relative to the total who viewed it. CTR is a foundational engagement metric in digital advertising but can incentivize sensationalist, misleading, or emotionally provocative content that maximizes clicks. See also: engagement metric, programmatic advertising, outrage amplification.
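
CTR itself is simple arithmetic; the headline numbers below are invented, but they illustrate how the metric mechanically rewards provocation:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks as a fraction of impressions."""
    return clicks / impressions if impressions else 0.0

# Hypothetical headline test: the provocative framing wins on CTR
neutral = ctr(clicks=210, impressions=10_000)      # 0.021
provocative = ctr(clicks=480, impressions=10_000)  # 0.048
```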

Cognitive bias (Chapter 4) — A systematic deviation from rational judgment arising from the brain's use of mental shortcuts (heuristics). Social media platforms are specifically engineered to exploit cognitive biases including confirmation bias, availability heuristic, social proof, and loss aversion. See also: behavioral economics, dark patterns, loss aversion, social proof.

Cognitive dissonance (Chapter 10) — The psychological discomfort experienced when a person holds two contradictory beliefs or when their behavior conflicts with their stated values. Platforms exploit cognitive dissonance reduction — users rationalize excessive use rather than changing behavior — to sustain engagement despite users' own stated preferences to use less. See also: behavioral addiction, compulsive use, commitment device.

Cognitive load (Chapter 9) — The total amount of mental effort required to process information and complete tasks using working memory. Platform interfaces are often designed to minimize immediate cognitive load to ease continued use, while simultaneously imposing hidden long-term cognitive costs through fragmentation of attention. See also: attention residue, task-switching cost, lateral inhibition.

Collaborative filtering (Chapter 5) — A recommendation algorithm technique that predicts a user's preferences by identifying similar patterns across large numbers of users and surfacing content that comparable users engaged with. Collaborative filtering is the foundation of most platform recommendation systems. See also: algorithmic curation, engagement optimization, filter bubble.
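
A toy user-based collaborative filter can be sketched in a few lines: score each item the target user has not seen by the similarity-weighted ratings of other users. The usernames and ratings below are invented, and production systems use matrix factorization or deep models rather than this naive loop, but the logic is the same:

```python
import math

def cosine(a, b):
    """Cosine similarity between two users' rating dictionaries."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

def recommend(target, ratings, k=1):
    """Score items the target hasn't seen by similarity-weighted ratings."""
    scores = {}
    for user, user_ratings in ratings.items():
        if user == target:
            continue
        sim = cosine(ratings[target], user_ratings)
        for item, rating in user_ratings.items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Invented ratings: u1's tastes resemble u2's, so u2's video_c surfaces for u1
ratings = {
    "u1": {"video_a": 5, "video_b": 4},
    "u2": {"video_a": 5, "video_b": 5, "video_c": 4},
    "u3": {"video_d": 5},
}
picks = recommend("u1", ratings)  # ["video_c"]
```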

Commitment device (Chapter 16) — A behavioral economics mechanism in which a person makes a binding advance commitment to a future action, exploiting their tendency toward consistency. Platforms use inverted commitment devices — public declarations of use (posting histories, follower counts) that increase the social cost of disengagement. See also: loss aversion, cognitive dissonance, reciprocity principle.

Compulsive use (Chapter 7) — A pattern of social media engagement driven by urges that feel difficult or impossible to resist, often persisting despite the user's recognition that the behavior is harming their wellbeing. Compulsive use is distinguished from problematic use by the presence of urge-driven automaticity. See also: behavioral addiction, problematic use, withdrawal, variable reinforcement.

Confirmation bias (Chapter 10) — The tendency to seek out, interpret, and remember information that confirms pre-existing beliefs while discounting contradictory evidence. Algorithmic curation amplifies confirmation bias by learning user preferences and selectively surfacing confirming content. See also: filter bubble, echo chamber, epistemic bubble.

Consent architecture (Chapter 38) — The design of systems that govern how users are informed about and asked to agree to data collection, use tracking, and personalization. Consent architecture is frequently manipulated through dark patterns — pre-ticked boxes, buried opt-outs, false urgency — to extract nominal consent while obscuring the full scope of data extraction. See also: dark patterns, GDPR, default settings, autonomy-preserving design.

Content moderation (Chapter 21) — The practice of reviewing, filtering, removing, or labeling user-generated content that violates platform policies or legal requirements. Content moderation at scale involves a combination of automated detection systems and human reviewers, with significant consequences for both free expression and the mental health of moderation workers. See also: Digital Services Act, Section 230, algorithmic amplification.

Conversion rate (Chapter 2) — A metric measuring the proportion of users who complete a desired action — making a purchase, signing up for a service, clicking an advertisement — relative to total users exposed to the opportunity. Conversion rate optimization drives much of the persuasive design embedded in commercial platforms. See also: engagement metric, dark patterns, programmatic advertising.

CPM (Cost Per Mille) (Chapter 2) — The cost an advertiser pays per one thousand impressions of their advertisement. CPM is a foundational metric in programmatic advertising and creates direct financial incentives for platforms to maximize the total volume of content served to users. See also: programmatic advertising, real-time bidding, attention economy.
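
The arithmetic behind the incentive is direct: at a fixed CPM, advertiser spend (and platform revenue) scales linearly with impressions served. The figures below are hypothetical:

```python
def campaign_cost(impressions, cpm):
    """Advertiser spend at a given CPM (price per 1,000 impressions)."""
    return impressions / 1000 * cpm

# Hypothetical campaign: 2.5M impressions at a $4.50 CPM
cost = campaign_cost(impressions=2_500_000, cpm=4.50)  # $11,250.00
```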

Creator economy (Chapter 29) — The ecosystem of individuals who produce content professionally or semi-professionally for distribution on social media platforms, deriving income from advertising revenue shares, brand partnerships, subscriptions, and tips. The creator economy deepens platform dependency and complicates critiques of platform harm, since many creators have substantial economic stakes in the platforms' continued operation. See also: platform dependency, engagement metric, algorithmic curation.


D

Dark patterns (Chapter 14) — User interface design choices that manipulate users into taking actions they did not intend or would not choose if fully informed, typically by exploiting cognitive biases, obscuring information, or creating artificial friction for opt-out paths. Coined by UX designer Harry Brignull in 2010, dark patterns are now subject to regulatory scrutiny under the GDPR and Digital Services Act. See also: consent architecture, friction, default settings, autonomy-preserving design.

Data broker (Chapter 3) — A commercial entity that collects, aggregates, and sells personal data about individuals, typically without direct consumer relationships. Data brokers operate downstream of social media platforms, purchasing or licensing behavioral surplus to enrich third-party advertiser profiles. See also: surveillance capitalism, behavioral surplus, programmatic advertising.

Data exhaust (Chapter 3) — The incidental data generated as a by-product of users' online activities — clicks, dwell time, scroll depth, cursor movements — that can be collected and monetized even when users are not consciously producing content. Data exhaust is a core input to behavioral surplus. See also: behavioral surplus, surveillance capitalism, aggregated behavioral data.

DAU (Daily Active Users) (Chapter 2) — A platform metric counting the number of unique users who engage with a platform on any given day. DAU is a primary indicator of platform health in investor reporting and drives design decisions aimed at converting occasional visitors into habitual daily users. See also: MAU, engagement metric, time-on-platform.
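
Computed from an event log, DAU is just a distinct-user count per day; the widely reported DAU/MAU "stickiness" ratio then measures how much of the monthly audience returns on a given day. A minimal sketch with an invented event log:

```python
def daily_active_users(events, day):
    """DAU: unique users with at least one event on the given day."""
    return len({user for user, event_day in events if event_day == day})

def stickiness(dau, mau):
    """DAU/MAU ratio: the share of the monthly audience active on a given day."""
    return dau / mau

# Hypothetical event log: (user, day-of-month) pairs
events = [("alice", 1), ("alice", 1), ("bob", 1), ("carol", 2)]
dau_day1 = daily_active_users(events, day=1)       # 2 unique users
monthly_users = len({user for user, _ in events})  # 3 unique users this month
ratio = stickiness(dau_day1, monthly_users)        # ~0.67
```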

Default settings (Chapter 14) — The pre-configured options that apply to a user's account or experience absent active choice to change them. Because most users never change defaults, default settings function as de facto platform policy. Platforms systematically set defaults to maximize data collection, engagement, and notification frequency rather than user wellbeing. See also: dark patterns, friction, consent architecture, autonomy-preserving design.

Design ethics (Chapter 39) — The application of ethical reasoning to the practice of product and interface design, asking not only whether a design is effective at achieving its goals but whether those goals and methods are justifiable from the perspective of user and social wellbeing. Design ethics has gained institutional traction through movements like the Center for Humane Technology. See also: humane technology, autonomy-preserving design, ethical design, Center for Humane Technology.

Digital minimalism (Chapter 36) — A philosophy, popularized by Cal Newport, advocating for the intentional and selective use of digital tools based on deeply held values, rather than passive adoption of all available technology. Digital minimalism prescribes radically reducing optional social media use and filling recovered time with high-value offline activities. See also: attention budget, Time Well Spent, screen time, humane technology.

Digital Services Act (DSA) (Chapter 38) — A European Union regulation (entered into force in 2022, applicable to large platforms from 2023) that imposes obligations on online platforms regarding illegal content, transparency of algorithmic systems, advertising practices, and the protection of minors. The DSA is the most comprehensive platform regulation to date and has become a reference point globally. See also: GDPR, Online Safety Act, Section 230, algorithmic transparency.

Dopamine (Chapter 7) — A neurotransmitter centrally involved in the brain's reward and motivation systems. Dopamine is released not primarily in response to reward itself but in anticipation of reward — particularly unpredictable or variable reward — making it central to understanding why variable-ratio reinforcement schedules are powerfully habit-forming. See also: reward prediction error, nucleus accumbens, variable reinforcement, Skinner box.
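
The reward-prediction-error logic can be illustrated with a Rescorla-Wagner-style update, in which a learned value V moves toward each outcome by a fraction of the prediction error. This is a didactic model, not a claim about any platform's code: with a certain reward the error decays toward zero, while with a 50% variable reward it never does, which is the computational core of why variable schedules remain compelling:

```python
import random

def learn_value(reward_prob, alpha=0.1, trials=500, seed=0):
    """Rescorla-Wagner-style learning: the estimate V moves toward each
    outcome by alpha times the reward prediction error (reward - V)."""
    rng = random.Random(seed)
    v, errors = 0.0, []
    for _ in range(trials):
        reward = 1.0 if rng.random() < reward_prob else 0.0
        delta = reward - v            # the reward prediction error
        v += alpha * delta
        errors.append(abs(delta))
    return v, sum(errors[-100:]) / 100  # learned value, late-trial mean |error|

# Certain reward: the prediction error decays to ~0, and anticipation with it
_, err_certain = learn_value(reward_prob=1.0)
# 50% variable reward: the prediction error never extinguishes
_, err_variable = learn_value(reward_prob=0.5)
```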

Doomscrolling (Chapter 9) — The compulsive consumption of negative news and distressing content through social media feeds, often continuing despite increasing distress. Doomscrolling is partly driven by evolutionary threat-detection instincts and partly by algorithmic amplification of outrage and fear, which reliably drive engagement. See also: outrage amplification, infinite scroll, compulsive use, emotional contagion.

Dwell time (Chapter 5) — The amount of time a user spends viewing a specific piece of content before scrolling or navigating away. Dwell time is used by recommendation algorithms as a signal of content relevance and quality, though it conflates genuine interest with confused, horrified, or compulsive lingering. See also: engagement metric, algorithmic curation, time-on-platform.


E

Echo chamber (Chapter 19) — A media environment in which individuals are primarily exposed to opinions and information that reinforce their pre-existing beliefs, with few encounters with contrary viewpoints. Echo chambers can arise from both algorithmic curation and from voluntary self-selection into like-minded communities. Distinguished from filter bubble by the greater role of user agency. See also: filter bubble, epistemic bubble, confirmation bias, polarization.

EdgeRank (Chapter 22) — Facebook's original content ranking algorithm, publicly described in 2010, which scored News Feed items based on affinity (relationship between user and content creator), weight (content type), and time decay. EdgeRank was a precursor to the more complex machine learning systems that replaced it. See also: algorithmic curation, News Feed, engagement optimization.
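
The published description reduces to a weighted sum over a story's edges. The decay function and the numbers below are illustrative assumptions, not Facebook's actual values:

```python
import time

def edgerank_score(edges, now=None):
    """Toy EdgeRank: sum over a story's edges of
    affinity * weight * time_decay. Decay form and values are assumed."""
    if now is None:
        now = time.time()
    total = 0.0
    for affinity, weight, created_at in edges:
        age_hours = (now - created_at) / 3600
        decay = 1.0 / (1.0 + age_hours)   # hyperbolic decay, illustrative only
        total += affinity * weight * decay
    return total

# One edge: a close friend's photo (affinity 0.8, weight 1.0), posted an hour ago
score = edgerank_score([(0.8, 1.0, 0.0)], now=3600.0)  # 0.8 * 1.0 * 0.5 = 0.4
```

Ranking a feed is then just sorting candidate stories by this score in descending order.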

Emotional contagion (Chapter 18) — The phenomenon whereby emotions spread from person to person through exposure to emotional expressions and content. Facebook's 2014 study demonstrated that algorithmically manipulating the emotional valence of News Feed content changed users' own emotional states and posting behavior — a finding that raised profound ethical questions about platform power. See also: outrage amplification, algorithmic curation, affective computing.

Engagement metric (Chapter 2) — Any quantitative measure of user interaction with a platform or piece of content, including likes, comments, shares, views, click-through rates, and time-on-platform. Engagement metrics serve as proxies for advertising value but their optimization incentivizes platforms to maximize behavioral responses regardless of whether those responses reflect positive user experiences. See also: attention economy, DAU, time-on-platform, A/B testing.

Engagement optimization (Chapter 5) — The systematic application of data analysis, machine learning, and design modification to maximize user engagement with a platform. Engagement optimization is the central technical practice of social media platforms, driving both algorithmic and interface design decisions. See also: A/B testing, algorithmic curation, engagement metric, dark patterns.

Epistemic bubble (Chapter 19) — C. Thi Nguyen's term for an information environment that excludes contrary viewpoints not through active suppression but through the simple absence of diverse sources. Epistemic bubbles are distinguished from echo chambers by the lack of active reinforcement; users simply have not encountered other views. See also: echo chamber, filter bubble, confirmation bias.

Ethical design (Chapter 39) — The practice of designing digital products with explicit consideration of user wellbeing, societal impact, and ethical obligations, rather than optimizing purely for engagement or revenue. Ethical design encompasses autonomy-preserving design, consent architecture, and the rejection of dark patterns. See also: design ethics, autonomy-preserving design, humane technology.


F

Fear of missing out (FOMO) (Chapter 11) — The anxiety that others are having rewarding experiences from which one is absent, amplified by social media's curated presentation of peers' highlight reels. FOMO is a significant driver of compulsive checking behavior and functions as a form of social anxiety that platforms are specifically designed to exploit and intensify. See also: social comparison theory, parasocial relationship, variable reinforcement.

Feature phone (Chapter 6) — A mobile phone that has basic functionality beyond calls and texts (such as a camera and internet browsing) but lacks the full application ecosystem of a smartphone. Users of feature phones are less susceptible to social media addiction because the devices offer fewer engagement surfaces. See also: smartphone, screen time.

Filter bubble (Chapter 19) — Eli Pariser's term for the personalized information environment created by algorithmic curation, in which users are insulated from content that challenges their worldviews because algorithms have learned to serve only confirming content. Distinguished from echo chamber by its predominantly algorithmic rather than voluntary origin. See also: echo chamber, epistemic bubble, algorithmic curation, collaborative filtering.

Flow state (Chapter 8) — Mihaly Csikszentmihalyi's concept of a state of deep, effortless immersion in a challenging activity. Social media platforms deliberately mimic the conditions of flow through variable reward and continuous novelty, but produce a degraded version that lacks the productive challenge and skill development of genuine flow. See also: variable reinforcement, engagement optimization.

Friction (Chapter 14) — In design, any element that adds difficulty, time, or cognitive effort to a user action. Platforms strategically eliminate friction from engagement behaviors (scrolling, liking, sharing) while adding friction to disengagement behaviors (account deletion, notification disabling), creating an asymmetric environment that systematically favors continued use. See also: default settings, dark patterns, autonomy-preserving design.


G

Gamification (Chapter 16) — The application of game-design elements — points, streaks, badges, leaderboards, rewards — to non-game contexts to drive behavior. Social media platforms extensively use gamification, particularly through like counts, follower numbers, and streak mechanics, to create engagement loops similar to those found in video games. See also: streak mechanic, variable reinforcement, engagement optimization.

GDPR (General Data Protection Regulation) (Chapter 38) — The European Union's comprehensive data protection law, effective from May 2018, which establishes rights for individuals over their personal data including rights of access, erasure, and portability, and imposes significant obligations and penalties on data controllers. GDPR has had global impact, prompting platform policy changes worldwide. See also: consent architecture, Digital Services Act, Online Safety Act, Section 230.

Ghost followers (Chapter 20) — Inactive or bot accounts that follow a user but do not engage with their content, artificially inflating follower counts while reducing genuine engagement rates. Ghost followers distort the social proof dynamics of follower counts and create misleading signals of social status. See also: social proof, engagement metric, vanity metrics.


H

Habituation (Chapter 8) — A basic form of neural learning in which the response to a repeated stimulus diminishes over time. In the context of social media, habituation drives the need for escalating novelty, content intensity, or social validation to achieve the same engagement response, contributing to tolerance. See also: tolerance, sensitization, variable reinforcement, dopamine.

Haptic feedback (Chapter 15) — Physical vibration or tactile responses generated by a device, used in smartphone notifications to give incoming social information an embodied, physical signal. Haptic feedback strengthens the conditioned response to notification cues, making the phone itself a persistent behavioral trigger. See also: notification timing, variable reinforcement, conditioned response.

Humane technology (Chapter 39) — A movement and design philosophy, associated primarily with Tristan Harris and the Center for Humane Technology, advocating for the redesign of digital platforms around human flourishing rather than engagement maximization. Humane technology calls for aligning platform incentives with user wellbeing through both design changes and regulatory reform. See also: Center for Humane Technology, Time Well Spent, design ethics, ethical design.


I

Identity formation (Chapter 30) — The developmental process through which individuals — particularly adolescents — construct a coherent sense of self through exploration, social comparison, and the consolidation of values and roles. Social media platforms disrupt identity formation by substituting the external validation of likes and followers for internal identity consolidation, and by exposing adolescents to unrelenting social comparison at a developmentally vulnerable moment. See also: adolescent brain development, social comparison theory, social identity theory.

Infinite scroll (Chapter 12) — A user interface pattern in which content loads continuously as the user scrolls, with no natural endpoint or pagination. Infinite scroll was invented by Aza Raskin in 2006 and subsequently adopted by most major social platforms; Raskin has since publicly expressed regret about its social costs. Infinite scroll eliminates the natural stopping cues that would otherwise end a browsing session. See also: autoplay, friction, default settings, bottomless bowl effect.

Information overload (Chapter 9) — A state in which the volume of incoming information exceeds an individual's capacity to process it meaningfully, leading to decision fatigue, cognitive overwhelm, and reduced quality of judgment. Social media platforms produce chronic information overload, degrading users' capacity for sustained, deliberate thought. See also: cognitive load, attention residue, task-switching cost.


J

Junk sleep (Chapter 30) — Sleep that is physiologically degraded in quality due to nighttime phone use, blue light exposure, or late-night social media engagement. Junk sleep is associated with increased rates of depression and anxiety, particularly in adolescents, creating a feedback loop between social media use and mental health deterioration. See also: adolescent brain development, screen time, problematic use.


K

Key performance indicator (KPI) (Chapter 2) — A quantitative metric used to evaluate the success of a business or product objective. In social media platforms, KPIs typically center on engagement, retention, and advertising revenue, creating organizational incentives that systematically prioritize addictive design over user wellbeing. See also: engagement metric, DAU, MAU, time-on-platform.


L

Lateral inhibition (Chapter 9) — A neural mechanism by which active neurons suppress the activity of neighboring neurons, helping to sharpen sensory signals and focus attention. In cognitive terms, engagement with social media activates attention pathways that laterally inhibit sustained focus on competing tasks, contributing to the difficulty of deep concentration after phone use. See also: attention residue, cognitive load, task-switching cost.

Like button (Chapter 14) — The social feedback mechanism, first deployed by Facebook in 2009, that allows users to express positive reactions to content with a single click, generating quantified social approval scores. The Like button has become a cultural symbol of social media's quantification of social life and is a central mechanism of variable reinforcement on social platforms. See also: variable reinforcement, social proof, engagement metric, dopamine.

Lock-in (Chapter 29) — The degree to which switching away from a platform is made difficult or costly by technical barriers (data portability limitations), social barriers (losing one's social graph), or economic barriers (creator revenue streams tied to a specific platform). Lock-in is a deliberate design strategy to reduce churn. See also: platform dependency, churn, switching cost.

Loss aversion (Chapter 4) — The cognitive bias, established by Kahneman and Tversky, in which losses are psychologically weighted more heavily than equivalent gains. Platforms exploit loss aversion through streak mechanics (fear of breaking a streak), disappearing content (Snapchat Stories, Instagram Stories), and notification patterns that emphasize social activity missed. See also: streak mechanic, dark patterns, behavioral economics, FOMO.
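
Kahneman and Tversky's value function makes the asymmetry concrete. Using their 1992 median parameter estimates (alpha of roughly 0.88, lambda of roughly 2.25), a loss is weighted about 2.25 times as heavily as an equal gain:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function (1992 median parameter estimates):
    gains are compressed by alpha; losses are scaled up by lam."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = prospect_value(100)   # subjective value of gaining 100
loss = prospect_value(-100)  # losing 100 feels ~2.25x worse than the gain feels good
```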


M

MAU (Monthly Active Users) (Chapter 2) — A platform metric counting unique users who engage at least once in a thirty-day period. MAU is a standard reporting metric in platform financial disclosures and is a foundational input to advertising rate calculations. See also: DAU, engagement metric, time-on-platform.
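
A commonly derived "stickiness" ratio divides DAU by MAU to estimate what fraction of monthly users show up on a given day; a minimal sketch (the function name and sample figures here are illustrative, not drawn from any platform's actual disclosures):

```python
def stickiness(dau: int, mau: int) -> float:
    """DAU/MAU ratio: the fraction of monthly users active on a given day.

    A higher ratio indicates more habitual, often daily, use -- which is
    why engagement-driven platforms optimize for it so aggressively.
    """
    if mau <= 0:
        raise ValueError("MAU must be positive")
    return dau / mau

# Illustrative figures only: 500M daily users out of 1B monthly users.
print(f"{stickiness(500_000_000, 1_000_000_000):.0%}")  # → 50%
```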

Meaningful interactions (Chapter 22) — A phrase from Facebook's 2018 algorithm update announcement, in which the company claimed to be prioritizing content that generated "meaningful" social interactions over passive consumption. Critics noted that the update disproportionately amplified outrage and divisive content, since these reliably generate high comment and reaction counts. See also: engagement optimization, outrage amplification, EdgeRank, algorithmic curation.

Metatransparency (Chapter 38) — The principle that even when individual platform interactions are opaque, users have theoretical access to high-level platform policies through published terms of service. Critics argue metatransparency is inadequate because policies are written in inaccessible language and do not reflect actual data practices. See also: consent architecture, algorithmic transparency, GDPR.

Misinformation (Chapter 32) — False or inaccurate information spread regardless of intent to deceive, distinguished from disinformation which involves deliberate deception. Social media platforms have been shown to amplify misinformation because it frequently generates high engagement responses. See also: outrage amplification, algorithmic amplification, epistemic bubble, filter bubble.

Moral disengagement (Chapter 21) — Albert Bandura's concept describing the psychological mechanisms by which individuals and organizations deactivate moral self-regulation and justify harmful behaviors. Platform companies employ moral disengagement at the institutional level by framing harms as user responsibility, emphasizing platform neutrality, and fragmenting accountability across technical systems. See also: design ethics, surveillance capitalism.


N

Networked public (Chapter 20) — danah boyd's term for the social environments constructed through networked technologies, characterized by persistence, searchability, replicability, and invisible audiences. Networked publics transform the social dynamics of self-presentation and create novel pressures around reputation management. See also: parasocial relationship, social identity theory, identity formation.

News Feed (Chapter 22) — Facebook's primary content display system, launched in 2006, which aggregates and ranks posts from a user's social connections and followed pages using algorithmic curation. The News Feed was among the first large-scale algorithmic content systems and became the template for subsequent social media feed designs. See also: EdgeRank, algorithmic curation, engagement optimization.

Notification timing (Chapter 15) — The strategic scheduling of push notifications to maximize the probability of immediate user re-engagement. Platforms optimize notification timing based on individual behavioral data, sending alerts at moments when users are most likely to be accessible and responsive. See also: bundled notifications, variable reinforcement, engagement optimization.

Nucleus accumbens (Chapter 7) — A structure in the basal ganglia that plays a central role in the reward circuit, receiving dopaminergic input from the ventral tegmental area and integrating reward prediction signals. The nucleus accumbens is implicated in both the rewarding effects of social validation and the craving states associated with behavioral addiction. See also: dopamine, reward prediction error, prefrontal cortex, variable reinforcement.


O

Online Safety Act (Chapter 38) — United Kingdom legislation (passed 2023) that imposes duties of care on online platforms to protect users — particularly children — from harmful content, with enforcement powers for the regulator Ofcom. The Online Safety Act represents a significant shift toward platform liability for user harm in UK law. See also: Digital Services Act, GDPR, Section 230, content moderation.

Operant conditioning (Chapter 7) — B.F. Skinner's framework for understanding how voluntary behaviors are shaped by their consequences: behaviors followed by rewards are strengthened; behaviors followed by punishment or non-reward are weakened. Social media platforms function as massive operant conditioning environments, using variable reward schedules to shape compulsive engagement. See also: Skinner box, variable reinforcement, dopamine, behavioral addiction.

Outrage amplification (Chapter 18) — The tendency of algorithmic ranking systems to disproportionately surface content that triggers anger, moral indignation, or outrage, because such content generates reliably high engagement responses. Outrage amplification has been linked to political polarization, misinformation spread, and the degradation of online discourse. See also: algorithmic amplification, emotional contagion, polarization, misinformation.


P

Parasocial relationship (Chapter 11) — A one-sided emotional bond in which a media consumer feels they know and have a relationship with a public figure, celebrity, or content creator who is unaware of their existence. Social media platforms intensify parasocial relationships through the intimate, personal presentation styles of creators and the illusion of direct interaction through comments and replies. See also: social comparison theory, creator economy, identity formation.

Persuasive technology (Chapter 14) — Technology deliberately designed to change user attitudes or behaviors, applying principles from behavioral psychology and persuasion research. B.J. Fogg's foundational work on persuasive technology provided the theoretical basis for many social media engagement design practices. See also: captology, dark patterns, engagement optimization, design ethics.

Platform dependency (Chapter 29) — The degree to which individuals, businesses, or communities have become reliant on a specific platform for social connection, information, economic activity, or political voice, making exit prohibitively costly. Platform dependency is both a consequence of network effects and a deliberate design objective. See also: creator economy, lock-in, churn, network effects.

Polarization, political (Chapter 33) — The process by which political views and identities become increasingly extreme and tribal, with reduced engagement across ideological lines. Research on whether social media causes polarization is contested, but platforms' incentives to amplify outrage, along with the filter bubbles they create, are identified as contributing factors. See also: echo chamber, filter bubble, outrage amplification, algorithmic amplification.

Prefrontal cortex (Chapter 7) — The anterior region of the frontal lobe, responsible for executive functions including planning, impulse control, working memory, and decision-making. The prefrontal cortex is the neural substrate of deliberate self-regulation and is the primary structure that users must recruit to resist addictive platform engagement. Its late maturation in adolescents increases vulnerability to behavioral addiction. See also: adolescent brain development, nucleus accumbens, impulse control, cognitive load.

Problematic use (Chapter 7) — Social media use that is associated with negative outcomes — reduced wellbeing, impaired academic or occupational functioning, relationship problems — without necessarily meeting clinical criteria for addiction. Problematic use is the broader category of which compulsive use and behavioral addiction are more severe subsets. See also: compulsive use, behavioral addiction, screen time.

Programmatic advertising (Chapter 2) — The automated buying and selling of digital advertising space using software and algorithms, with individual ad placements determined in milliseconds through real-time bidding auctions. Programmatic advertising created the economic infrastructure that made behavioral surveillance the dominant business model of the internet. See also: real-time bidding, CPM, surveillance capitalism, behavioral surplus.

Psychographics (Chapter 3) — Audience segmentation methods based on psychological characteristics, values, interests, and personality traits, as distinguished from demographic segmentation. Psychographic targeting, made possible by social media behavioral data, enables advertisers to reach individuals based on inferred psychological vulnerabilities. See also: Cambridge Analytica scandal, behavioral surplus, programmatic advertising.


R

Radicalization pipeline (Chapter 33) — The hypothesized process by which recommendation algorithms progressively expose users to increasingly extreme content, nudging them from mainstream views toward radical ideologies. The radicalization pipeline is most extensively documented on YouTube, where autoplay recommendations have been observed to trend toward more extreme content over successive viewing sessions. See also: algorithmic amplification, filter bubble, echo chamber, outrage amplification.

Real-time bidding (RTB) (Chapter 2) — The auction mechanism underlying programmatic advertising, in which advertising impressions are bought and sold through automated bidding in the milliseconds between a page load beginning and completing. RTB enables granular behavioral targeting at enormous scale. See also: programmatic advertising, CPM, surveillance capitalism.
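
The auction logic can be sketched in a few lines. This toy model (bidder labels and CPM figures invented) uses the second-price rule that was historically standard on RTB exchanges: the highest bidder wins but pays only the second-highest bid.

```python
def run_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, clearing price) under second-price rules:
    the highest bidder wins but pays the second-highest bid."""
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    _, clearing_price = ranked[1]
    return winner, clearing_price

# Invented CPM bids (dollars per thousand impressions) from three ad buyers.
winner, price = run_auction({"dsp_a": 4.20, "dsp_b": 3.80, "dsp_c": 2.10})
print(winner, price)  # → dsp_a 3.8
```

Real exchanges run this entire process, plus behavioral targeting lookups, in the milliseconds the definition describes; many have since moved to first-price rules.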

Reciprocity principle (Chapter 16) — The social psychology principle that people feel obligated to return favors, gifts, or social attention. Platforms exploit reciprocity through follow-back norms, comment reply expectations, and social notification systems that create perceived social obligations to engage. See also: commitment device, social proof, dark patterns.

Recommendation algorithm (Chapter 5) — An automated system that selects and ranks content for individual users based on predicted preference or engagement, using techniques including collaborative filtering, content-based filtering, and reinforcement learning. Recommendation algorithms are the primary mechanism through which platforms execute engagement optimization. See also: algorithmic curation, collaborative filtering, TikTok For You Page, EdgeRank.
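
Collaborative filtering, one of the techniques named above, can be illustrated with a toy user-based example (data invented; production systems combine far more signals at vastly greater scale):

```python
import math

def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two users' engagement vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Rows are users; columns are engagement with items A-D (invented toy data).
alice = [1, 1, 0, 0]
bob   = [1, 1, 1, 0]
carol = [0, 0, 1, 1]

# Bob's history resembles Alice's more than Carol's does, so a user-based
# collaborative filter would recommend to Alice the item Bob engaged with
# that she has not yet seen (item C).
print(cosine(alice, bob) > cosine(alice, carol))  # → True
```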

Reward prediction error (Chapter 7) — A neural signal, encoded in dopaminergic activity, that represents the discrepancy between expected and actual reward. When a reward is better than predicted, dopamine activity increases; when it is worse, dopamine activity decreases; when it is exactly as predicted, no signal is generated. Variable reinforcement exploits reward prediction error by ensuring rewards are systematically unpredictable. See also: dopamine, variable reinforcement, nucleus accumbens, operant conditioning.
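
The three cases in this definition reduce to a single subtraction. A minimal sketch, pairing the error with a simple learning-rate update in the style of the Rescorla-Wagner model (the learning rate and reward values are illustrative):

```python
def update(expected: float, actual: float, lr: float = 0.1) -> tuple[float, float]:
    """Compute the reward prediction error and the updated expectation.

    error > 0: reward better than predicted (dopamine activity rises)
    error < 0: reward worse than predicted (dopamine activity dips)
    error = 0: reward exactly as predicted (no teaching signal)
    """
    error = actual - expected
    return error, expected + lr * error

# A like notification worth 1.0 arrives when only 0.4 was expected:
error, new_expectation = update(expected=0.4, actual=1.0)
print(round(error, 2), round(new_expectation, 2))  # → 0.6 0.46
```

Because expectations converge on whatever is predictable, only unpredictable rewards keep generating a positive teaching signal, which is exactly the property variable reinforcement exploits.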


S

Screen time (Chapter 36) — A general term for the total duration of use of digital devices, particularly smartphones and tablets. Screen time has become a primary metric in public discourse about digital wellbeing, though researchers note it is a crude measure that fails to distinguish between types, contexts, and quality of device use. See also: digital minimalism, problematic use, attention budget.

Section 230 (Chapter 37) — Section 230 of the US Communications Decency Act (1996), which provides broad immunity to online platforms from legal liability for user-generated content. Section 230 is the legal foundation enabling the scale of social media, and its reform or repeal is a contested regulatory debate. See also: Digital Services Act, Online Safety Act, content moderation, GDPR.

Sensitization (Chapter 8) — A neural process opposite to habituation, in which repeated exposure to a stimulus increases rather than decreases the response. Sensitization to social validation cues — a notification about a like or comment — can increase craving over time, contributing to the compulsive checking cycle. See also: habituation, tolerance, variable reinforcement.

Skinner box (Chapter 7) — An experimental apparatus developed by B.F. Skinner to study operant conditioning, in which an animal presses a lever to receive rewards on a variable schedule. The Skinner box has become a widely used metaphor for social media platforms, which function as highly sophisticated variable-reward environments where users are the experimental subjects. See also: operant conditioning, variable reinforcement, dopamine, behavioral addiction.

Social comparison theory (Chapter 11) — Leon Festinger's 1954 theory that humans have a fundamental drive to evaluate their abilities, opinions, and circumstances by comparing themselves to others. Social media provides an unprecedented constant stream of upward social comparison targets — curated highlights of others' lives — that reliably generate negative affect. See also: FOMO, identity formation, parasocial relationship, social identity theory.

Social graph (Chapter 22) — The network of relationships between users on a social media platform, represented as a graph of nodes (users) and edges (connections). The social graph is a core platform asset and a primary source of lock-in, since users cannot easily export their connections to competitor platforms. See also: network effects, lock-in, platform dependency.

Social identity theory (Chapter 11) — Henri Tajfel and John Turner's framework proposing that individuals derive a significant portion of their self-concept from membership in social groups, and that in-group favoritism and out-group derogation are natural consequences of social categorization. Social media amplifies social identity dynamics, intensifying in-group/out-group distinctions and contributing to polarization. See also: social comparison theory, polarization, echo chamber, identity formation.

Social proof (Chapter 16) — The tendency to look to others' behavior as evidence of correct action in situations of uncertainty, making high-engagement content appear more credible or worthwhile simply because many others have engaged with it. Platforms quantify social proof through visible like counts, share tallies, and follower numbers. See also: reciprocity principle, cognitive bias, dark patterns, engagement metric.

Streak mechanic (Chapter 16) — A gamification feature that rewards users for consecutive-day engagement with a platform or activity and penalizes any missed day by visibly breaking the streak. Streak mechanics are a particularly powerful implementation of loss aversion, making non-use feel like a loss rather than neutral. Prominently used by Snapchat and Duolingo. See also: loss aversion, gamification, variable reinforcement, dark patterns.

Surveillance capitalism (Chapter 3) — Shoshana Zuboff's term for an economic logic in which behavioral data harvested from users is treated as a free raw material, processed into predictive products, and sold to businesses seeking to influence future behavior. Surveillance capitalism describes the business model of Google, Facebook, and most major social media platforms. See also: behavioral surplus, data exhaust, programmatic advertising, attention economy.

Switching cost (Chapter 29) — The economic, social, or psychological costs incurred when a user moves from one platform to another, including the loss of social connections, content history, and learned behaviors. High switching costs are a deliberate platform design objective. See also: lock-in, platform dependency, churn.


T

Task-switching cost (Chapter 9) — The cognitive performance penalty incurred when shifting attention from one task to another, manifesting as slower response times and higher error rates for a period after the switch. Frequent social media interruptions impose substantial cumulative task-switching costs on users who attempt to combine platform use with demanding cognitive work. See also: attention residue, cognitive load, lateral inhibition.

Time Well Spent (Chapter 39) — A movement and reframing of platform design goals, originated by Tristan Harris, that advocates for measuring platform value by whether users feel their time was well spent rather than by raw engagement duration. Time Well Spent provided the conceptual foundation for the Center for Humane Technology. See also: humane technology, Center for Humane Technology, digital minimalism, engagement metric.

TikTok For You Page (FYP) (Chapter 26) — TikTok's primary content discovery feed, which presents algorithmically curated videos to users based on engagement signals rather than social connections. The FYP is widely regarded as the most effective recommendation algorithm yet deployed in social media, demonstrating that compelling content from strangers can be more engaging than content from known connections. See also: algorithmic curation, collaborative filtering, recommendation algorithm, engagement optimization.

Tolerance (Chapter 8) — In behavioral addiction, the phenomenon whereby increasing amounts of a rewarding behavior are required to achieve the same subjective effect. Social media users may exhibit tolerance through increasing time-on-platform, escalating content intensity needs, or reduced satisfaction from the same social validation signals that previously produced reward. See also: habituation, sensitization, behavioral addiction, withdrawal.

Tristan Harris (Chapter 39) — Former Google design ethicist turned technology critic and co-founder of the Center for Humane Technology, whose 2013 internal Google presentation "A Call to Minimize Distraction and Respect Users' Attention" catalyzed the humane technology movement. Harris's testimony before the US Senate and his appearances in the documentary The Social Dilemma brought persuasive technology design to the attention of mainstream audiences.


U

UX (User Experience) (Chapter 14) — The totality of a user's interaction with and perception of a product or system, encompassing usability, aesthetics, emotional response, and functional effectiveness. UX design has become a domain in which dark patterns and persuasive technology are systematically applied to maximize engagement at the expense of genuine positive experience. See also: dark patterns, ethical design, persuasive technology.


V

Vanity metrics (Chapter 20) — Engagement metrics that look impressive — follower counts, total views, page likes — but do not necessarily correlate with meaningful outcomes. Platforms surface vanity metrics prominently because they drive social comparison and motivate continued content production and engagement, regardless of actual value. See also: engagement metric, social proof, social comparison theory.

Variable reinforcement (Chapter 7) — A reward schedule in which reinforcement is delivered unpredictably — sometimes after many responses, sometimes after few — rather than on a fixed schedule. Variable ratio schedules produce the highest rates of responding and the greatest resistance to extinction in both animal and human subjects, making them the most powerful and habit-forming reinforcement schedule known. Social media notifications, likes, and content feeds all implement variable reinforcement. See also: Skinner box, operant conditioning, dopamine, reward prediction error.
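
A variable-ratio-like schedule can be simulated in a few lines (probabilities and session length are invented; real platforms tune reward delivery per user). Strictly speaking, giving each response an independent payoff probability is a random-ratio schedule, a standard approximation of variable ratio:

```python
import random

def variable_ratio_session(p_reward: float, checks: int, seed: int = 0) -> int:
    """Simulate a checking session under a variable-ratio-like schedule:
    each check (opening the app) pays off with independent probability
    p_reward, so the user can never predict which check will be rewarded."""
    rng = random.Random(seed)
    return sum(rng.random() < p_reward for _ in range(checks))

# With a 1-in-4 payoff rate, 100 checks yield roughly 25 rewards -- but it
# is the unpredictability of *which* check pays off, not the total payout,
# that sustains the behavior.
rewards = variable_ratio_session(p_reward=0.25, checks=100)
print(0 <= rewards <= 100)  # → True
```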

Viral coefficient (Chapter 20) — A metric measuring the number of new users each existing user generates through sharing or referral behavior. A viral coefficient above 1 produces exponential growth. Content designed for virality typically exploits emotional extremity, social identity, or outrage rather than accuracy or nuance. See also: outrage amplification, algorithmic amplification, engagement optimization.
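
The growth arithmetic implied by this definition can be sketched as follows (the function name and figures are invented for illustration):

```python
def users_after(initial: int, k: float, cycles: int) -> float:
    """Total users after n referral cycles, where each existing user
    brings in k new users per cycle (k = invites sent x conversion rate)."""
    total, cohort = float(initial), float(initial)
    for _ in range(cycles):
        cohort *= k      # new users generated by the latest cohort
        total += cohort
    return total

# k = 1.2 (each user yields 1.2 new users): growth compounds exponentially.
# k = 0.8: each referral cohort shrinks and growth stalls.
print(users_after(1000, 1.2, 5) > users_after(1000, 0.8, 5))  # → True
```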

Viral loop (Chapter 20) — A product design pattern in which the normal use of a product inherently generates invitations or exposures that bring in new users. Social media sharing mechanics, tagging features, and public posts all function as components of viral loops. See also: viral coefficient, network effects, engagement optimization.


W

Withdrawal (Chapter 7) — In behavioral addiction, negative psychological and physiological states — anxiety, irritability, restlessness — experienced when a habitual behavior is interrupted or reduced. Social media withdrawal symptoms, including anxiety, boredom, and preoccupation with the platform, are well-documented in experimental abstinence studies. See also: behavioral addiction, compulsive use, tolerance, variable reinforcement.

Working memory (Chapter 9) — The cognitive system responsible for temporarily holding and manipulating a limited amount of information during active cognitive tasks. Social media's fragmentation of attention and the cognitive residue it generates can significantly impair working memory capacity during subsequent cognitively demanding activities. See also: cognitive load, attention residue, task-switching cost.


Z

Zero-sum attention (Chapter 1) — The property of attention that makes it a genuinely scarce resource: time spent engaging with one platform or content item is time unavailable for all others, including offline relationships, creative work, rest, and civic participation. Zero-sum attention is the foundational economic logic of the attention economy. See also: attention economy, attention budget, digital minimalism.


Supplementary Entries

Affinity score (Chapter 5) — A measure, used in Facebook's EdgeRank and successor systems, of the strength of a user's relationship with a content source, based on interaction history. Higher affinity scores increase the probability that content from a given connection appears in a user's feed. See also: EdgeRank, algorithmic curation.
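
Facebook's publicly described EdgeRank formula combined exactly these factors: a story's score was the sum, over its interactions, of affinity times edge weight times time decay. A schematic sketch (field values invented; the production system used many more signals):

```python
from dataclasses import dataclass

@dataclass
class Edge:
    affinity: float   # u_e: strength of the viewer-creator relationship
    weight: float     # w_e: value of the interaction type (comment > like)
    decay: float      # d_e: time decay, approaching 0 as the story ages

def edgerank(edges: list[Edge]) -> float:
    """Schematic EdgeRank: sum of affinity x weight x time decay over a
    story's edges, per Facebook's public description of the system."""
    return sum(e.affinity * e.weight * e.decay for e in edges)

# Invented values: a close friend's fresh comment outranks a stranger's old like.
fresh_comment = Edge(affinity=0.9, weight=2.0, decay=0.95)
stale_like = Edge(affinity=0.1, weight=1.0, decay=0.20)
print(edgerank([fresh_comment]) > edgerank([stale_like]))  # → True
```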

Agency, user (Chapter 36) — The capacity of individuals to make genuine, unmanipulated choices about their technology use. The erosion of user agency through dark patterns, persuasive design, and addictive mechanics is a central ethical concern of the book. See also: autonomy-preserving design, consent architecture, ethical design.

Anxiety, social media-related (Chapter 30) — A pattern of platform-specific anxiety encompassing fear of missing out, social comparison distress, notification anticipation anxiety, and approval-seeking behavior. Social media-related anxiety is distinct from but interacts with generalized anxiety disorder. See also: FOMO, social comparison theory, problematic use.

Audit, algorithmic (Chapter 38) — An independent technical examination of a platform's algorithmic systems to assess compliance, fairness, and societal impact. Algorithmic audits are proposed or mandated under the Digital Services Act and represent a key tool for regulatory oversight of AI-driven curation systems. See also: algorithmic transparency, Digital Services Act, engagement optimization.

Behavioral design (Chapter 14) — The application of behavioral science — psychology, behavioral economics, neuroscience — to the design of products and systems in order to produce desired user behaviors. Behavioral design is the field from which dark patterns and engagement optimization techniques are drawn. See also: dark patterns, persuasive technology, captology.

Biometric surveillance (Chapter 3) — The collection of physiological or behavioral data — including facial expressions, eye movements, heart rate, and gait — from users, increasingly enabled by device sensors and camera systems. Biometric surveillance represents a frontier of behavioral data extraction beyond self-reported and click-stream data. See also: affective computing, behavioral surplus, surveillance capitalism.

Bot (Chapter 20) — An automated account that mimics human user behavior on social media platforms, used for amplifying content, creating false impressions of social consensus, spreading misinformation, or inflating engagement metrics. Bots represent a significant corruption of the social proof signals that underpin platform engagement logic. See also: social proof, misinformation, astroturfing.

Breakup fee (Chapter 29) — Colloquially, the social and practical costs of leaving a platform — lost connections, content, and communities — that function like a penalty for platform exit. Breakup fees are the experiential dimension of switching costs. See also: switching cost, lock-in, platform dependency.

Candy Crush variable reward (Chapter 16) — A reference to mobile games' implementation of variable reinforcement, used in academic and design discourse as a shorthand for the engagement pattern adopted by social media platforms. The term underlines the deliberate borrowing of game addiction mechanics by social media designers. See also: variable reinforcement, gamification, Skinner box.

Center for Humane Technology (CHT) (Chapter 39) — A nonprofit organization co-founded by Tristan Harris and Aza Raskin in 2018 (evolving from the earlier Time Well Spent initiative) that advocates for fundamental reform of the technology industry's design and business model practices. The CHT is featured prominently in the documentary The Social Dilemma (2020) and has submitted testimony to multiple legislative bodies. See also: Time Well Spent, humane technology, design ethics, Tristan Harris.

Chilling effect (Chapter 34) — The suppression of speech or behavior that results from the knowledge or fear of surveillance. Pervasive social media surveillance may produce chilling effects on political expression, dissent, and identity exploration. See also: surveillance capitalism, Section 230.

Color psychology (Chapter 14) — The application of research on color's psychological effects to interface design, particularly in choosing notification colors, like-button colors, and platform color schemes to maximize emotional response and engagement. The near-universal use of red for notification badges, a color associated with urgency and alarm, is a commonly cited example of engagement-oriented color choice. See also: dark patterns, behavioral design.

Compulsion loop (Chapter 16) — A cyclical behavior pattern, adapted from game design, consisting of a trigger, an action, and a reward, which is repeated to generate persistent engagement. Social media feeds are structured as compulsion loops: a notification (trigger) prompts opening the app (action) which delivers variable social reward (reward), reinforcing the loop. See also: variable reinforcement, gamification, operant conditioning, dark patterns.

Conditioned response (Chapter 8) — A learned behavioral or physiological response triggered by a stimulus that has been repeatedly associated with a primary reward. Smartphone notification sounds and vibrations function as conditioned stimuli that trigger anticipatory dopamine release and compulsive checking. See also: variable reinforcement, dopamine, habituation.

Context collapse (Chapter 20) — danah boyd's term for the phenomenon in which content produced for one social context (friends, family, colleagues) is exposed to multiple unintended audiences simultaneously on social media. Context collapse creates chronic social performance anxiety and complicates authentic self-expression. See also: identity formation, parasocial relationship, networked public.

Cyberbullying (Chapter 31) — The use of digital communications platforms to bully, harass, or intimidate others, particularly among adolescents. Platform design choices — public comment sections, anonymous accounts, audience amplification — make social media environments uniquely conducive to cyberbullying. See also: adolescent brain development, content moderation, Online Safety Act.

Data minimization (Chapter 38) — A privacy principle, mandated under GDPR, requiring that data collection be limited to what is strictly necessary for the stated purpose. Data minimization is in direct tension with the behavioral surplus extraction logic of surveillance capitalism. See also: GDPR, surveillance capitalism, consent architecture.

Deepfake (Chapter 32) — Synthetic media, generated by deep learning systems, that convincingly depicts people saying or doing things they never said or did. Deepfakes represent an emerging threat to epistemic security and interact with social media's misinformation amplification dynamics. See also: misinformation, algorithmic amplification.

Deliberate practice (Chapter 36) — Anders Ericsson's concept of highly focused, feedback-driven skill development. Digital minimalism advocates contrast deliberate practice with the passive, fragmented consumption that social media promotes. See also: digital minimalism, attention budget, flow state.

Digital detox (Chapter 36) — An intentional, time-limited abstinence from social media or digital devices, intended to break habitual use patterns and recalibrate baseline attention and wellbeing. Research on digital detox efficacy shows mixed results, suggesting that systemic design changes may be more effective than reliance on individual willpower alone. See also: digital minimalism, withdrawal, screen time.

Digital literacy (Chapter 40) — The ability to critically evaluate, navigate, and use digital media and information systems. Digital literacy education is increasingly proposed as a countermeasure to the manipulative design of social media platforms. See also: media literacy, design ethics, consent architecture.

Disappearing content (Chapter 15) — Content — particularly Stories on Instagram and Snapchat — that is automatically deleted after a fixed time period (typically 24 hours). Disappearing content exploits loss aversion and FOMO, creating urgency to view content before it vanishes. See also: loss aversion, FOMO, streak mechanic.

Dopaminergic pathway (Chapter 7) — The neural circuits that use dopamine as their primary neurotransmitter, including the mesolimbic pathway (involved in reward and addiction) and the mesocortical pathway (involved in motivation and cognition). Social media engagement activates dopaminergic pathways in ways structurally similar to other behavioral addictions. See also: dopamine, nucleus accumbens, reward prediction error.

Dunbar's number (Chapter 20) — Robin Dunbar's hypothesis that humans can maintain stable social relationships with approximately 150 individuals, due to cognitive constraints. Social media friends lists and follower counts routinely exceed Dunbar's number by orders of magnitude, potentially degrading the quality of social connection while increasing its quantity. See also: parasocial relationship, social graph, social comparison theory.

Engagement bait (Chapter 18) — Content specifically designed to generate high engagement metrics — typically through controversy, outrage, emotional provocation, or artificial calls for interaction — rather than genuine informational or social value. Engagement bait is a direct consequence of algorithmic systems that reward engagement volume regardless of quality. See also: outrage amplification, engagement metric, algorithmic amplification.

Epistemic autonomy (Chapter 34) — The capacity of individuals to form beliefs through their own reasoning processes, based on diverse information sources. Platforms threaten epistemic autonomy through filter bubbles, algorithmic curation, and the amplification of misinformation. See also: epistemic bubble, autonomy-preserving design, filter bubble, algorithmic transparency.

Externality, negative (Chapter 35) — An economic term for costs imposed on parties who are not involved in a transaction, not reflected in market prices. Social media's harms — to mental health, democracy, and public discourse — are negative externalities of the attention economy business model. See also: surveillance capitalism, attention economy, polarization.

Fogg Behavior Model (Chapter 14) — B.J. Fogg's framework proposing that behavior is a function of motivation, ability, and prompt: behavior occurs when sufficient motivation and ability coincide with a prompt at the same moment. The Fogg Behavior Model is the theoretical foundation of most persuasive technology design. See also: persuasive technology, captology, dark patterns, Fogg, B.J.

Friction, artificial (Chapter 14) — Obstacles deliberately placed in the path of user actions that the platform wishes to discourage — such as multi-step account deletion processes, buried privacy settings, or repeated confirmation prompts for disengagement. Artificial friction is an asymmetric dark pattern that biases user behavior toward continued engagement. See also: friction, dark patterns, default settings.

Influencer (Chapter 29) — An individual who has developed a significant audience on social media platforms and leverages that audience for commercial endorsement, brand partnerships, or direct monetization. Influencers are simultaneously beneficiaries of platform engagement systems and dependent on them, creating complex relationships with platform regulation. See also: creator economy, parasocial relationship, platform dependency.

Intermittent reinforcement (Chapter 7) — Synonymous with variable reinforcement in many contexts; specifically emphasizes the non-continuous, unpredictable delivery of reward. Intermittent reinforcement produces particularly strong behavioral conditioning and is the principle underlying the addictive power of social media notifications. See also: variable reinforcement, operant conditioning, Skinner box.

Media literacy (Chapter 40) — The ability to access, analyze, evaluate, and create media. Media literacy is proposed as both an educational intervention against platform manipulation and a democratic capacity requiring protection. See also: digital literacy, epistemic autonomy, filter bubble.

Mental model (Chapter 14) — A user's internal representation of how a system works. Platforms exploit gaps between users' mental models and actual algorithmic behavior to generate engagement, data disclosures, and consent that users did not intend to give. See also: consent architecture, dark patterns, algorithmic transparency.

Minimization, data — See data minimization.

Mood board effect (Chapter 11) — The distortion of social reality arising from consuming social media feeds composed primarily of others' most flattering, exciting, and successful moments. The mood board effect contributes to upward social comparison and reduced wellbeing. See also: social comparison theory, FOMO, identity formation.

Network effects (Chapter 29) — The phenomenon in which the value of a platform increases as more users join it, creating a self-reinforcing growth dynamic and significant barriers to competitive entry. Network effects are the primary structural source of platform power and monopolization. See also: platform dependency, lock-in, social graph.

Notification fatigue (Chapter 15) — The diminished responsiveness to notifications that occurs after prolonged exposure to high notification volumes, analogous to habituation. Notification fatigue complicates the variable reinforcement model — but platforms counter it through personalization, relevance optimization, and bundling strategies. See also: habituation, notification timing, bundled notifications.

Nudge (Chapter 14) — Richard Thaler and Cass Sunstein's term for a behavioral intervention that alters behavior through choice architecture without restricting options or changing economic incentives. Platform dark patterns are often technically nudges, though they operate in the opposite direction from the welfare-promoting nudges Thaler and Sunstein envisioned. See also: dark patterns, behavioral economics, default settings.

Oversharing (Chapter 20) — The disclosure of more personal information than social norms or one's own interests would warrant, often driven by social validation dynamics and the interface design of social platforms. Oversharing is facilitated by platforms' streamlined posting interfaces and social rewards for personal disclosure. See also: surveillance capitalism, behavioral surplus, context collapse.

Phantom vibration syndrome (Chapter 8) — The perceived sensation that a smartphone is vibrating when it is not, indicating the degree to which anticipatory attention to device notifications has been neurologically conditioned. Phantom vibrations represent one of the clearest behavioral markers of conditioned notification anxiety. See also: conditioned response, variable reinforcement, notification timing.

Polarization, affective (Chapter 33) — A form of political polarization characterized by increasing emotional hostility toward political opponents rather than (or in addition to) growing ideological distance between them. Social media outrage amplification is more directly linked to affective polarization than to ideological polarization. See also: outrage amplification, polarization, political, echo chamber.

Pull-to-refresh (Chapter 12) — A mobile interface gesture in which pulling downward on a feed and releasing triggers a content refresh, mimicking the physical action of a slot machine lever. Aza Raskin and others have noted that the slot machine analogy is not incidental — the gesture was designed to exploit variable reinforcement. See also: variable reinforcement, Skinner box, infinite scroll.

Regulatory capture (Chapter 37) — The process by which regulatory agencies become dominated by the industries they are supposed to regulate, prioritizing industry interests over public welfare. Regulatory capture is a significant obstacle to effective platform regulation. See also: Section 230, Digital Services Act, surveillance capitalism.

Self-regulation, digital (Chapter 36) — Individual strategies for managing one's own social media use, including screen time limits, notification management, and intentional phone-free periods. Research consistently shows that self-regulation is insufficient as a systemic response to designed addiction, though it remains valuable at the individual level. See also: digital minimalism, attention budget, screen time.

Slot machine effect (Chapter 7) — The analogy, popularized by Tristan Harris, between the pull-to-refresh gesture on a smartphone and the lever pull of a slot machine, emphasizing that both are variable-ratio reinforcement schedules delivering unpredictable reward. The slot machine effect captures the addictive mechanism at the core of social media feed design. See also: variable reinforcement, pull-to-refresh, Skinner box, operant conditioning.

Social dilemma (Chapter 35) — The title of a 2020 Netflix documentary featuring former platform employees and researchers discussing the designed addictiveness of social media. More broadly, the "social dilemma" refers to the structural conflict between platform incentives to maximize engagement and societal interests in wellbeing and democracy. See also: Center for Humane Technology, humane technology, surveillance capitalism.

Technoference (Chapter 30) — The interference of mobile device use with face-to-face interactions and relationships. Technoference is associated with reduced relationship quality, impaired parent-child interactions, and negative developmental outcomes for children raised in high-technoference environments. See also: screen time, adolescent brain development, problematic use.

Time-on-platform (Chapter 2) — A metric tracking the total duration of a user's active engagement with a platform in a given period. Time-on-platform is a foundational engagement metric that directly drives advertising inventory and revenue, creating strong incentives to maximize all forms of engagement regardless of quality. See also: engagement metric, DAU, attention economy.

Turing test for addiction (Chapter 7) — The informal proposition that if a platform's engagement mechanisms are indistinguishable from those of a clinically recognized addictive product, they should be regulated as such. The concept challenges platforms' framing of addictive design as merely "engaging." See also: behavioral addiction, variable reinforcement, design ethics.

Unfriending (Chapter 20) — The act of removing a social media connection. The social cost of unfriending — including the risk of perceived offense and social awkwardness — is part of what makes social media connections stickier than their genuine value warrants, contributing to lock-in. See also: lock-in, switching cost, social graph.

Upstream design (Chapter 39) — An approach that addresses the root causes of harmful platform behaviors at the level of system architecture, business model, and incentive structure, rather than through downstream interventions like content moderation or user education. Upstream design is the preferred approach of humane technology advocates. See also: humane technology, ethical design, design ethics.

Variable reward ratio (Chapter 7) — The specific form of variable reinforcement in which the number of responses required for a reward varies unpredictably. Variable ratio schedules are the most resistant to extinction and the most powerfully habit-forming reinforcement patterns. Social media feeds implement variable reward ratios: users must scroll through variable amounts of uninteresting content before encountering something rewarding. See also: variable reinforcement, operant conditioning, Skinner box.
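The schedule described above can be simulated in a few lines. This is a hypothetical sketch, not a model of any actual platform: it assumes, for illustration, that each feed item is independently rewarding with a fixed probability, so the number of items scrolled before a reward follows a geometric distribution — unpredictable on any single session, yet stable on average.

```python
import random
from typing import Optional

def scrolls_until_reward(mean_ratio: int = 8,
                         rng: Optional[random.Random] = None) -> int:
    """Simulate one session on a variable-ratio schedule.

    Each item is rewarding with probability 1/mean_ratio (an assumed,
    illustrative parameter), so the count of items scrolled before a
    reward varies unpredictably around mean_ratio — the unpredictability
    that makes the schedule resistant to extinction.
    """
    rng = rng or random.Random()
    count = 1
    while rng.random() >= 1 / mean_ratio:
        count += 1
    return count

# Any single session is unpredictable, but many sessions average out:
rng = random.Random(0)
sessions = [scrolls_until_reward(mean_ratio=8, rng=rng) for _ in range(10_000)]
print(sum(sessions) / len(sessions))  # roughly 8, with wide per-session spread
```

Running the simulation shows the signature of the schedule: a user can never predict whether the next swipe pays off, yet the long-run payout rate is precisely tuned — the same property that makes slot machines, and feeds, so hard to put down.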


This glossary will be updated in subsequent editions as the field evolves. Readers who identify omissions or inaccuracies are encouraged to contact the editorial team.