Glossary
A comprehensive reference of key terms used throughout Cross-Domain Pattern Recognition: The View From Everywhere. Each entry gives a concise definition and notes the chapter where the term is first introduced. Terms are organized alphabetically under letter headings.
A
Abstraction ladder — A hierarchy of representations ranging from concrete details to highly general principles; moving up the ladder reveals cross-domain patterns while moving down reveals domain-specific mechanics. First introduced: Ch. 1
Adaptive landscape — A metaphorical surface where peaks represent high-fitness solutions and valleys represent low-fitness ones; organisms, organizations, and algorithms navigate this landscape in search of better outcomes. First introduced: Ch. 7
Adjacent possible — The set of all things that are one step away from what currently exists; coined by Stuart Kauffman in biology and extended by Steven Johnson to innovation, it explains why certain discoveries can only happen when preconditions are met. First introduced: Ch. 25
Agent-based modeling — A computational approach that simulates the behavior of autonomous agents to understand emergent system-level phenomena; used in ecology, economics, and social science. First introduced: Ch. 3
Alignment problem — The challenge of ensuring that an optimizing system (whether AI, bureaucracy, or incentive structure) actually pursues the goals its designers intended rather than proxy metrics. First introduced: Ch. 15
Allometric scaling — The systematic relationship between body size and biological properties (metabolic rate, lifespan, heart rate), following power-law relationships; extended by Geoffrey West to cities and companies. First introduced: Ch. 29
Analogy — A structural correspondence between two systems from different domains; the primary cognitive mechanism enabling cross-domain pattern recognition, but also a source of error when surface similarities mask deep differences. First introduced: Ch. 1
Annealing — A process borrowed from metallurgy in which a system is heated (randomized) and then slowly cooled (constrained) to find optimal configurations; applied in optimization, career strategy, and institutional reform. First introduced: Ch. 13
Antifragility — A property of systems that gain strength from stressors, shocks, and volatility, beyond mere robustness or resilience; coined by Nassim Nicholas Taleb. First introduced: Ch. 17
Apophenia — The human tendency to perceive meaningful patterns in random or unrelated data; the shadow side of cross-domain pattern recognition. First introduced: Ch. 14
Attractor — A state or set of states toward which a dynamic system tends to evolve over time, regardless of starting conditions; includes point attractors, limit cycles, and strange attractors. First introduced: Ch. 2
Autocatalysis — A process in which the products of a reaction accelerate the reaction itself; a form of positive feedback found in chemistry, economics, and social movements. First introduced: Ch. 2
B
Balancing loop — See negative feedback loop. A feedback mechanism that counteracts change, pushing a system back toward equilibrium. First introduced: Ch. 2
Base rate — The underlying frequency of an event in a population, often neglected in human reasoning (base rate neglect); crucial for proper Bayesian updating. First introduced: Ch. 10
Bayesian reasoning — A method of updating beliefs by combining prior probability with new evidence, following Bayes' theorem; appears as a pattern in science, medicine, machine learning, and everyday judgment. First introduced: Ch. 10
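The update rule behind this entry fits in a few lines of Python. A minimal sketch; the disease-screening numbers below are illustrative, not drawn from the book:

```python
def bayes_update(prior, sensitivity, false_positive_rate):
    """Posterior probability of a hypothesis after one positive test result."""
    # P(H|E) = P(E|H) P(H) / P(E), with P(E) expanded over both hypotheses.
    evidence = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / evidence

# Illustrative screening: 1% base rate, 90% sensitivity, 5% false positives.
posterior = bayes_update(prior=0.01, sensitivity=0.90, false_positive_rate=0.05)
print(round(posterior, 3))  # 0.154 -- far below the intuitive 90%
```

Note how the low base rate dominates the strong test, which is exactly the point of the base rate entry above.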
Bias-variance tradeoff — The fundamental tension between fitting known data closely (low bias but high variance) and generalizing reliably to new data (accepting some bias to reduce variance); applies far beyond machine learning to policy, education, and personal decision-making. First introduced: Ch. 14
Black swan — A high-impact, low-probability event that is unpredictable from within existing frameworks and is retrospectively rationalized; term popularized by Nassim Nicholas Taleb. First introduced: Ch. 4
Boundary object — An artifact, concept, or practice that is shared across communities and interpreted differently by each, yet maintains enough common identity to facilitate coordination; coined by Susan Leigh Star and James Griesemer. First introduced: Ch. 27
Bounded rationality — Herbert Simon's concept that human decision-making is constrained by limited information, limited cognitive capacity, and limited time, leading to satisficing rather than optimizing. First introduced: Ch. 12
Brittleness — The property of a system that functions well under normal conditions but fails catastrophically under stress, typically due to over-optimization and lack of redundancy. First introduced: Ch. 17
C
Carrying capacity — The maximum population or activity level that an environment can sustain indefinitely; the upper asymptote of the S-curve in ecological and technological contexts. First introduced: Ch. 33
Cascade — A chain reaction in which one event triggers subsequent events, potentially amplifying through a network; appears in financial systems, power grids, ecosystems, and social contagion. First introduced: Ch. 18
Causal opacity — The difficulty of identifying cause-and-effect relationships in complex systems where multiple variables interact nonlinearly with time delays. First introduced: Ch. 28
Central limit theorem — The statistical principle that the sum of many independent random variables tends toward a normal distribution; explains why Gaussian statistics work for many phenomena but fail for fat-tailed ones. First introduced: Ch. 4
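A quick simulation, sketched here with only the standard library and illustrative parameters, shows the theorem at work: sums of uniform draws, which are individually flat, pile up into a bell shape with predictable mean and spread:

```python
import random
import statistics

random.seed(0)
# Each observation is the sum of 50 independent uniform(0, 1) draws.
sums = [sum(random.random() for _ in range(50)) for _ in range(10_000)]

# Theory: mean = 50 * 0.5 = 25, stdev = sqrt(50 / 12), about 2.04.
print(round(statistics.mean(sums), 1))
print(round(statistics.stdev(sums), 1))
```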
Chesterton's fence — The principle that one should not remove a fence (rule, institution, or practice) until one understands why it was put there; a call for epistemic humility before reform. First introduced: Ch. 38
Cobra effect — An unintended consequence where a solution to a problem actually makes the problem worse, often through perverse incentives; named after the apocryphal British bounty on cobras in colonial India. First introduced: Ch. 21
Coevolution — The process by which two or more interacting entities (species, technologies, institutions) shape each other's development over time. First introduced: Ch. 8
Complex adaptive system (CAS) — A system composed of many interacting agents that adapt to each other, producing emergent behavior that cannot be predicted from the properties of individual components. First introduced: Ch. 3
Complexity — The study of systems with many interacting parts whose collective behavior is more than the sum of their parts; distinguished from mere complication by the presence of feedback, emergence, and adaptation. First introduced: Ch. 1
Confirmation bias — The tendency to seek, interpret, and remember information that confirms pre-existing beliefs while discounting contradictory evidence. First introduced: Ch. 10
Conservation law — A principle stating that certain quantities remain constant within a closed system even as the system changes; extended metaphorically from physics to attention, trust, and complexity in human systems. First introduced: Ch. 41
Convergent evolution — The independent evolution of similar traits in unrelated lineages facing similar environmental pressures; used as a metaphor for why similar patterns arise independently across domains. First introduced: Ch. 1
Cooperation without trust — Mechanisms that enable collaborative outcomes among self-interested agents without requiring mutual goodwill, including iterated games, reputation systems, and institutional design. First introduced: Ch. 11
Critical mass — The minimum amount of something (fissile material, adopters, participants) needed to sustain a self-reinforcing process; a threshold concept related to phase transitions. First introduced: Ch. 5
Cross-domain pattern — A structural or dynamic regularity that appears across multiple, seemingly unrelated fields; the central subject of this book. First introduced: Ch. 1
Curse of dimensionality — The phenomenon whereby the computational or data requirements for solving a problem grow exponentially with the number of variables (dimensions) involved. First introduced: Ch. 7
D
Dark knowledge — Knowledge that exists within a system but is not documented, formalized, or easily communicated; includes institutional memory, craft traditions, and embodied expertise. First introduced: Ch. 28
Debt (as pattern) — The cross-domain pattern of borrowing from the future to fund the present, appearing as financial debt, technical debt, sleep debt, ecological debt, and social debt. First introduced: Ch. 30
Degeneracy — The ability of structurally different components to perform the same function; a form of redundancy found in genetic codes, neural circuits, and organizational roles that enhances robustness. First introduced: Ch. 17
Diminishing returns — The principle that incremental inputs yield progressively smaller incremental outputs; the slope-flattening portion of the S-curve. First introduced: Ch. 33
Distributed system — A system in which processing, decision-making, or control is spread across many nodes rather than concentrated in a central authority; contrasted with centralized systems. First introduced: Ch. 9
Domain — A distinct field of knowledge or practice (e.g., biology, economics, music) with its own vocabulary, methods, and community; the boundaries that cross-domain thinking traverses. First introduced: Ch. 1
E
Ecological succession — The predictable sequence of community changes following a disturbance, from pioneer species to climax community; applied as a pattern to technological platforms, genres, and neighborhoods. First introduced: Ch. 32
Edge of chaos — The narrow zone between rigid order and complete randomness where complex adaptive systems are most creative and adaptive; a concept from complexity science. First introduced: Ch. 5
Emergence — The phenomenon whereby collective behavior arises from the interactions of simpler components in ways that cannot be predicted or deduced from the components alone. First introduced: Ch. 3
Entrenchment — The process by which early choices become increasingly difficult to reverse as more systems and dependencies build upon them; related to path dependence and lock-in. First introduced: Ch. 30
Entropy — A measure of disorder or uncertainty in a system; in information theory, a measure of the average information content of a message; central to understanding both physical and informational systems. First introduced: Ch. 6
Epistemic humility — The recognition that one's knowledge is always incomplete and potentially wrong; a core virtue of cross-domain thinking. First introduced: Ch. 22
Equilibrium — A state in which opposing forces or processes balance each other, producing no net change; may be stable (returning after perturbation) or unstable (departing after perturbation). First introduced: Ch. 2
Ergodicity — The property of a system in which time averages equal ensemble averages; many human systems are non-ergodic, meaning what happens to the average is not what happens to the individual. First introduced: Ch. 4
Explore/exploit tradeoff — The fundamental tension between gathering new information (exploring) and using existing knowledge for known rewards (exploiting); appears in evolution, business strategy, learning, and algorithm design. First introduced: Ch. 8
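One standard way to operationalize this tradeoff is the epsilon-greedy rule from bandit algorithms. The sketch below is illustrative rather than anything prescribed by the book: it explores at random a fraction epsilon of the time and exploits the best-known option otherwise:

```python
import random

def epsilon_greedy(estimates, epsilon=0.1):
    """Pick an option: explore uniformly with probability epsilon,
    otherwise exploit the option with the highest current estimate."""
    if random.random() < epsilon:
        return random.randrange(len(estimates))               # explore
    return max(range(len(estimates)), key=estimates.__getitem__)  # exploit

# With epsilon = 0 the rule always exploits the current best estimate.
choice = epsilon_greedy([0.2, 0.8, 0.5], epsilon=0.0)
print(choice)  # 1 -- the index of the best estimate
```

Tuning epsilon is the tradeoff itself: too low and you never discover better options, too high and you never cash in on what you know.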
Externality — A cost or benefit that affects parties not directly involved in a transaction or decision; a source of system-level dysfunction when individual optimization diverges from collective welfare. First introduced: Ch. 21
F
False negative (Type II error) — Failing to detect a signal or pattern that is actually present; the cost of being too conservative in filtering. First introduced: Ch. 6
False positive (Type I error) — Detecting a signal or pattern that is not actually present; the cost of being too liberal in filtering. First introduced: Ch. 6
Fat tails — A property of probability distributions where extreme events are far more likely than a normal (Gaussian) distribution would predict; characteristic of earthquakes, financial crashes, pandemics, and wars. First introduced: Ch. 4
Feedback loop — A circular causal pathway in which the output of a system feeds back as input, either amplifying change (positive/reinforcing) or dampening it (negative/balancing). First introduced: Ch. 2
Fitness landscape — See adaptive landscape. The mapping of possible configurations to their fitness or performance; organisms and organizations navigate these landscapes in search of better outcomes. First introduced: Ch. 7
Fragility — The property of being harmed by volatility, uncertainty, and stressors; the opposite of antifragility. First introduced: Ch. 17
Free rider problem — A situation in which individuals benefit from a collective resource without contributing to its maintenance, threatening the sustainability of cooperation. First introduced: Ch. 11
G
Gaussian distribution — The normal (bell curve) distribution that describes many natural phenomena with thin tails; dangerous when applied to fat-tailed systems. First introduced: Ch. 4
Goodhart's Law — "When a measure becomes a target, it ceases to be a good measure." The principle that optimizing for a proxy metric distorts the system being measured. First introduced: Ch. 15
Graceful degradation — The property of a system that loses functionality gradually rather than catastrophically when components fail; achieved through redundancy and loose coupling. First introduced: Ch. 17
Gradient — The direction and rate of steepest change in a function or landscape; the information used by gradient descent to navigate toward optima. First introduced: Ch. 7
Gradient descent — An optimization process in which a system moves incrementally in the direction of greatest improvement; appears in machine learning, evolution, market pricing, and river formation. First introduced: Ch. 7
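A minimal sketch of the procedure on a one-dimensional function; the function and step size are illustrative:

```python
def gradient_descent(grad, x0, step=0.1, iters=100):
    """Repeatedly move against the gradient to descend toward a minimum."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

# Minimize f(x) = (x - 3)^2; its gradient is 2 * (x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # 3.0
```

On a multi-peaked function the same loop stalls at the nearest valley, which is the failure mode described under local optimum below.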
H
Hayekian knowledge problem — Friedrich Hayek's insight that economically relevant knowledge is dispersed among millions of individuals and cannot be aggregated by any central planner; an argument for distributed systems. First introduced: Ch. 9
Heuristic — A mental shortcut or rule of thumb that enables quick decisions under uncertainty; effective in many situations but vulnerable to systematic biases. First introduced: Ch. 12
Homomorphism — See isomorphism. A structural correspondence between systems that preserves some (but not all) relationships; a weaker but more common form of cross-domain analogy. First introduced: Ch. 1
Hysteresis — The phenomenon whereby a system's state depends on its history, not just its current conditions; for example, a phase transition may occur at a different threshold on the way up than on the way down. First introduced: Ch. 5
I
Iatrogenesis — Harm caused by the healer; originally a medical term, extended to any situation where well-intentioned intervention makes things worse. First introduced: Ch. 19
Information asymmetry — A situation in which one party in a transaction or relationship possesses more or better information than the other, creating potential for exploitation. First introduced: Ch. 34
Information (as currency) — The concept that information is the fundamental medium of exchange across all domains — physical, biological, and social — underlying everything from DNA to markets to neural processing. First introduced: Ch. 39
Institutional memory — The accumulated knowledge, procedures, and cultural practices that enable an organization to function, often carried in the minds of long-term members rather than in documents. First introduced: Ch. 28
Invariance — A property that remains unchanged under a transformation; the mathematical heart of symmetry and conservation laws. First introduced: Ch. 40
Isomorphism — A structural correspondence between two systems that preserves all relevant relationships; the deepest form of cross-domain pattern, suggesting a shared underlying structure. First introduced: Ch. 1
K
Kleiber's law — The scaling relationship between an animal's metabolic rate and its body mass, following a 3/4 power law; one of the most robust scaling laws in biology. First introduced: Ch. 29
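The 3/4 exponent has concrete consequences, as a one-line calculation with an illustrative mass ratio shows:

```python
def metabolic_scaling(mass_ratio, exponent=0.75):
    """Factor by which metabolic rate grows when body mass grows by mass_ratio."""
    return mass_ratio ** exponent

# A 10,000-fold increase in mass raises metabolic rate only 1,000-fold,
# so energy use per gram falls as animals get larger.
print(round(metabolic_scaling(10_000)))  # 1000
```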
Knowledge transfer — The process of applying insights or methods from one domain to another; the practical skill at the heart of cross-domain thinking. First introduced: Ch. 1
L
Legibility — The degree to which a system can be seen, understood, and measured by an outside observer (typically the state or management); coined by James C. Scott. First introduced: Ch. 16
Legibility trap — The systematic error of preferring legible (measurable, standardized) information over illegible (tacit, local, contextual) information, leading to impoverished decision-making. First introduced: Ch. 20
Leverage point — A place in a system where a small change can produce large effects; identified by Donella Meadows as ranging from parameters (weak) to paradigms (strong). First introduced: Ch. 2
Lindy effect — The observation that for non-perishable entities (ideas, technologies, books), the longer something has survived, the longer its remaining expected lifespan; future life expectancy is proportional to current age. First introduced: Ch. 31
Local optimum — A solution that is better than all nearby alternatives but not the globally best solution; the trap that gradient descent can fall into without mechanisms for escape. First introduced: Ch. 7
Lock-in — A state where a system becomes committed to a particular path, technology, or institutional arrangement, making change increasingly costly even when better alternatives exist. First introduced: Ch. 30
Log-normal distribution — A probability distribution whose logarithm is normally distributed; often found in size distributions (income, city populations) and sometimes confused with power laws. First introduced: Ch. 4
Loose coupling — A system design in which components interact through well-defined interfaces with limited mutual dependence, allowing failure in one component to be contained. First introduced: Ch. 18
M
Map-territory relation — Alfred Korzybski's insight that all representations (models, maps, theories, metrics) necessarily simplify reality and should not be confused with the territory they represent. First introduced: Ch. 22
Mean reversion — The tendency of a variable to return toward its long-term average over time; commonly observed in financial markets, sports statistics, and biological systems. First introduced: Ch. 4
Mechanism design — The engineering of rules, incentives, and institutions to produce desired outcomes from self-interested agents; sometimes called "reverse game theory." First introduced: Ch. 11
Meta-pattern — A pattern about patterns; a higher-order regularity governing how cross-domain patterns relate to each other. First introduced: Ch. 42
Metis — Practical, local, experience-based knowledge that cannot be easily codified or communicated in formal terms; from Greek, used by James C. Scott to contrast with top-down technical knowledge. First introduced: Ch. 16
Model — A simplified representation of reality that captures some features while ignoring others; all models are wrong, but some are useful (George Box). First introduced: Ch. 22
Modularity — The degree to which a system's components can be separated and recombined; modular systems are more evolvable and more resistant to cascading failure. First introduced: Ch. 17
Moral hazard — A situation in which one party takes risks because another party bears the consequences; a failure mode of systems that separate decision-making from consequences. First introduced: Ch. 34
Multiple discovery — The phenomenon whereby the same invention or discovery is made independently by two or more people at roughly the same time; evidence that innovations are products of their era's adjacent possible. First introduced: Ch. 26
Multiplex network — A network with multiple types of connections between the same nodes; more realistic than simple networks for representing social, biological, and infrastructure systems. First introduced: Ch. 18
N
Narrative capture — The cognitive trap of becoming so attached to a story or explanation that one ignores or distorts evidence that contradicts it; affects individuals, organizations, and cultures. First introduced: Ch. 36
Narrative fallacy — The tendency to construct post-hoc stories that create an illusion of understanding and inevitability around events that were actually unpredictable. First introduced: Ch. 36
Negative feedback loop — A feedback mechanism that counteracts change, pushing a system back toward equilibrium or a set point; the basis of homeostasis and regulation. First introduced: Ch. 2
Network effects — The phenomenon whereby the value of a product or service increases as the number of users grows; a powerful positive feedback loop in platform economics. First introduced: Ch. 5
Noether's theorem — Emmy Noether's proof that every continuous symmetry of a physical system corresponds to a conservation law; extended metaphorically to suggest that human systems also have conserved quantities linked to their symmetries. First introduced: Ch. 41
Noise — Random variation or unwanted signal that obscures meaningful information; distinguishing signal from noise is a fundamental challenge across all domains. First introduced: Ch. 6
Non-ergodicity — See ergodicity. The property of a system in which time averages do not equal ensemble averages, meaning the experience of a typical individual differs from the statistical average across the population. First introduced: Ch. 4
Nonlinearity — A relationship in which the output is not proportional to the input; small inputs may produce large outputs (or vice versa), making prediction difficult. First introduced: Ch. 2
Normal accident — Charles Perrow's concept that certain kinds of accidents are inevitable in tightly coupled, complex systems — not because of component failure but because of interaction effects. First introduced: Ch. 18
O
Observation selection effect — A bias that arises from the conditions necessary for observation itself; we can only observe situations compatible with our existence as observers. First introduced: Ch. 37
Optimization — The process of finding the best solution according to a defined objective function; powerful but dangerous when the objective function fails to capture what actually matters. First introduced: Ch. 7
Order parameter — A quantity that characterizes the state of a system near a phase transition, changing from zero to nonzero (or vice versa) at the critical point. First introduced: Ch. 5
Overfitting — The error of tailoring a model too closely to particular data, capturing noise rather than signal, and thus failing to generalize; the pattern-matching equivalent of seeing faces in clouds. First introduced: Ch. 14
P
Paradigm — A set of shared assumptions, methods, and values that define a scientific community's approach to problems; coined by Thomas Kuhn. First introduced: Ch. 24
Paradigm shift — A fundamental change in the basic concepts and experimental practices of a scientific discipline; extended beyond science to any domain where the governing framework transforms. First introduced: Ch. 24
Path dependence — The phenomenon whereby earlier events and decisions constrain later possibilities, so that history matters and initial conditions shape long-term outcomes. First introduced: Ch. 25
Perverse incentive — An incentive structure that produces behavior opposite to what was intended; the mechanism underlying the cobra effect and many Goodhart's Law violations. First introduced: Ch. 21
Phase transition — An abrupt, qualitative change in a system's behavior at a critical threshold, analogous to water turning to ice; appears in physics, social movements, epidemics, and technology adoption. First introduced: Ch. 5
Polanyi's paradox — Michael Polanyi's observation that "we know more than we can tell" — much human knowledge is tacit and cannot be fully articulated or automated. First introduced: Ch. 23
Positive feedback loop — A feedback mechanism that amplifies change, pushing a system further from its starting point; the engine of exponential growth, bubbles, and tipping points. First introduced: Ch. 2
Power law — A mathematical relationship where one quantity varies as a power of another (y = kx^a); produces scale-free distributions with no characteristic scale and fat tails. First introduced: Ch. 4
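Scale-freeness is easy to see numerically: for y = kx^a, rescaling x multiplies y by a factor that does not depend on where you started. A sketch with illustrative constants:

```python
def power_law(x, k=1.0, a=-2.0):
    return k * x ** a

# Doubling x always multiplies the output by 2^a, at any scale.
ratio_small = power_law(2) / power_law(1)
ratio_large = power_law(200) / power_law(100)
print(round(ratio_small, 6), round(ratio_large, 6))  # 0.25 0.25
```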
Precautionary principle — The principle that when an action raises potential threats of harm, precautionary measures should be taken even if cause-and-effect relationships are not fully established; particularly important in fat-tailed domains. First introduced: Ch. 4
Preferential attachment — The mechanism ("rich get richer") by which nodes with more connections attract even more connections, generating power-law degree distributions in networks. First introduced: Ch. 4
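The mechanism can be simulated in a few lines. This is a sketch of a Barabási–Albert-style growth process with illustrative parameters: because each endpoint of every edge goes into a pool, sampling from the pool is automatically degree-proportional:

```python
import random

random.seed(1)

def preferential_attachment(n_nodes):
    """Grow a network one node at a time; each newcomer links to an
    existing node chosen with probability proportional to its degree."""
    endpoints = [0, 1]  # one seed edge between nodes 0 and 1
    for new_node in range(2, n_nodes):
        partner = random.choice(endpoints)  # degree-proportional pick
        endpoints += [new_node, partner]
    degrees = [0] * n_nodes
    for node in endpoints:
        degrees[node] += 1
    return degrees

degrees = preferential_attachment(2000)
# The rich get richer: the best-connected hub dwarfs the typical node.
print(max(degrees), sorted(degrees)[len(degrees) // 2])
```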
Prior (probability) — In Bayesian reasoning, the initial estimate of the probability of a hypothesis before considering new evidence. First introduced: Ch. 10
Prisoner's dilemma — A game theory scenario in which two rational agents may not cooperate even though cooperation would benefit both; the canonical model for cooperation problems. First introduced: Ch. 11
R
Redundancy — The inclusion of extra components, pathways, or capacity beyond what is strictly necessary for normal operation; provides resilience at the cost of efficiency. First introduced: Ch. 17
Reinforcing loop — See positive feedback loop. A feedback mechanism that amplifies change. First introduced: Ch. 2
Replication crisis — The ongoing discovery that many published scientific findings cannot be reproduced, raising fundamental questions about research methods, incentives, and the reliability of knowledge. First introduced: Ch. 14
Resilience — The ability of a system to absorb disturbance and reorganize while retaining its essential structure and function; distinguished from robustness (resisting change) and antifragility (benefiting from change). First introduced: Ch. 17
Robustness — The ability of a system to maintain its function despite perturbation; achieved through redundancy, modularity, and diversity. First introduced: Ch. 17
S
S-curve — A sigmoid curve describing the typical lifecycle of growth: slow start, rapid acceleration, and eventual plateau; appears in technology adoption, biological growth, learning, and empire expansion. First introduced: Ch. 33
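The logistic function is the standard mathematical form of the S-curve; a minimal sketch with illustrative parameters:

```python
import math

def s_curve(t, capacity=1.0, rate=1.0, midpoint=0.0):
    """Logistic growth: slow start, fast middle, plateau at capacity."""
    return capacity / (1 + math.exp(-rate * (t - midpoint)))

# Growth is negligible far below the midpoint, fastest at it, saturated beyond.
print(round(s_curve(-6), 3), round(s_curve(0), 3), round(s_curve(6), 3))
# 0.002 0.5 0.998
```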
Satisficing — Herbert Simon's term for choosing an option that meets a minimum threshold of acceptability rather than searching for the optimal solution; rational behavior under bounded rationality. First introduced: Ch. 12
Scale invariance — The property of looking the same at every level of magnification; characteristic of fractals, power laws, and many natural phenomena. First introduced: Ch. 4
Scaling law — A mathematical relationship between two quantities that holds across many orders of magnitude; reveals deep structural regularities in how systems change with size. First introduced: Ch. 29
Selection bias — A systematic error introduced by the non-random selection of data or subjects, leading to conclusions that do not generalize to the full population. First introduced: Ch. 37
Self-organization — The spontaneous emergence of order from the interactions of components without external direction or central control. First introduced: Ch. 3
Self-organized criticality — Per Bak's concept that many complex systems naturally evolve toward a critical state where small perturbations can cause events of any size, following power-law distributions. First introduced: Ch. 5
Senescence — The process of aging and deterioration in biological organisms; extended as a pattern to institutions, technologies, codebases, and empires. First introduced: Ch. 31
Signal — Meaningful information embedded in a background of noise; detecting genuine signals is a universal challenge across science, medicine, intelligence, and everyday life. First introduced: Ch. 6
Signal-to-noise ratio (SNR) — The ratio of meaningful information to background noise in a communication channel or dataset; a measure of information quality applicable far beyond engineering. First introduced: Ch. 6
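In engineering the ratio is usually expressed in decibels; a minimal helper, with illustrative power values:

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels: 10 * log10 of the power ratio."""
    return 10 * math.log10(signal_power / noise_power)

print(round(snr_db(100, 1), 1))  # 20.0 -- clear signal
print(round(snr_db(1, 100), 1))  # -20.0 -- noise swamps the signal
```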
Simulated annealing — A computational optimization technique inspired by metallurgical annealing, using controlled randomness to escape local optima. First introduced: Ch. 13
Skin in the game — The condition of bearing the consequences of one's own decisions; Taleb's argument that systems function better when decision-makers are exposed to both the upside and downside of their choices. First introduced: Ch. 34
Spontaneous order — Order that emerges from the voluntary interactions of individuals without being imposed by a central authority; examples include language, markets, and common law. First introduced: Ch. 3
Streetlight effect — The observational bias of searching for answers only where it is easy to look (under the streetlight) rather than where answers are most likely to be found; a pervasive problem in research, policing, and data analysis. First introduced: Ch. 35
Strong ties — Close, frequent connections between individuals (friends, family, close colleagues); important for trust and support but limited in reach. First introduced: Ch. 9
Structural coupling — The mutual influence between a system and its environment, each shaping the other over time through ongoing interaction. First introduced: Ch. 32
Substrate independence — The property of a pattern or process that functions identically regardless of the physical medium in which it is implemented; the reason the same algorithms appear in silicon and neurons, or the same organizational structures appear in cells and companies. First introduced: Ch. 1
Succession (ecological) — See ecological succession. The predictable sequence of changes in community composition after a disturbance. First introduced: Ch. 32
Survivorship bias — The logical error of focusing on entities that passed some selection process while overlooking those that did not, leading to false conclusions about what causes success. First introduced: Ch. 37
Symmetry — An invariance under transformation; a property that remains unchanged when the system is rotated, translated, or otherwise transformed. First introduced: Ch. 40
Symmetry-breaking — The process by which a symmetric state gives way to an asymmetric one, creating differentiation, structure, and complexity; appears in physics, biology, and social systems. First introduced: Ch. 40
System dynamics — A methodology for studying complex systems using feedback loops, stocks, and flows, pioneered by Jay Forrester at MIT. First introduced: Ch. 2
T
Tacit knowledge — Knowledge that is difficult or impossible to articulate in words or formulas; includes skills, intuitions, and practical know-how. First introduced: Ch. 23
Technical debt — Ward Cunningham's metaphor for the accumulated cost of shortcuts and deferred maintenance in software development; the canonical example of debt as a cross-domain pattern. First introduced: Ch. 30
Threshold — A critical value beyond which a system undergoes qualitative change; synonymous with tipping point in many contexts. First introduced: Ch. 5
Threshold concept — A concept that, once understood, fundamentally and irreversibly transforms a learner's perception of a subject; acts as a portal to a new way of thinking. First introduced: Ch. 24
Tight coupling — A system design in which components are closely interdependent with little slack, buffer, or redundancy; efficient but vulnerable to cascading failure. First introduced: Ch. 18
Tipping point — The critical threshold at which a small additional input triggers a large, often irreversible change in a system's behavior; popularized by Malcolm Gladwell but rooted in the physics of phase transitions. First introduced: Ch. 5
Tit for Tat — A simple strategy for the iterated prisoner's dilemma: cooperate first, then mirror the opponent's previous move; Robert Axelrod's tournaments showed it to be remarkably effective. First introduced: Ch. 11
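The strategy itself is a one-liner. The small tournament harness below is a sketch using the standard prisoner's-dilemma payoffs (T=5, R=3, P=1, S=0):

```python
def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Iterated prisoner's dilemma with payoffs T=5, R=3, P=1, S=0."""
    payoffs = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
               ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strategy_a(hist_b), strategy_b(hist_a)
        pa, pb = payoffs[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (30, 30): sustained cooperation
print(play(tit_for_tat, always_defect))  # (9, 14): exploited once, then retaliates
```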
Transfer learning — The application of knowledge gained in one context to a different but structurally similar context; a key mechanism of cross-domain pattern recognition. First introduced: Ch. 1
Tragedy of the commons — Garrett Hardin's scenario in which individuals acting in rational self-interest deplete a shared resource, harming all; challenged by Elinor Ostrom's work showing communities can self-govern commons. First introduced: Ch. 11
U
Uncertainty — The state of not knowing, which may be reducible (through more data) or irreducible (fundamental to the system); distinguishing between the two is crucial for appropriate action. First introduced: Ch. 6
Underfitting — The error of using a model too simple to capture the genuine patterns in data; the opposite of overfitting, resulting in missed signal. First introduced: Ch. 14
Universality — The phenomenon in physics where different systems exhibit the same behavior near phase transitions, regardless of their microscopic details; the deepest justification for why cross-domain patterns exist. First introduced: Ch. 5
V
Via negativa — The practice of improving by removing (subtracting harmful inputs) rather than adding (introducing new interventions); often more effective and less risky than via positiva. First introduced: Ch. 19
Volatility — The degree of variation or unpredictability in a system over time; can be harmful (fragile systems) or beneficial (antifragile systems). First introduced: Ch. 13
W
Weak ties — Loose, infrequent connections between individuals (acquaintances, contacts); Mark Granovetter showed these are disproportionately valuable for spreading novel information and bridging communities. First introduced: Ch. 9
Wicked problem — A problem that is difficult or impossible to solve because of incomplete, contradictory, and changing requirements that are often hard to recognize; coined by Rittel and Webber. First introduced: Ch. 22
Z
Zipf's law — The empirical observation that in many datasets (word frequencies, city sizes, website visits), the frequency of an item is inversely proportional to its rank, producing a power-law distribution. First introduced: Ch. 4
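The rank-frequency rule is simple enough to state as code; the word counts below are illustrative:

```python
def zipf_frequency(rank, top_frequency):
    """Predicted frequency of the item at a given rank under Zipf's law."""
    return top_frequency / rank

# If the most frequent word appears 60,000 times, Zipf's law predicts
# roughly 30,000 for rank 2, 20,000 for rank 3, and 6,000 for rank 10.
print([zipf_frequency(r, 60_000) for r in (1, 2, 3, 10)])
# [60000.0, 30000.0, 20000.0, 6000.0]
```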
This glossary contains more than 150 terms. For additional context on any term, consult the chapter of first introduction and the Index for all chapters where the term appears.