Learning Objectives
- Define institutional sunk cost and explain how it differs from individual sunk cost fallacy
- Identify the five components of consensus switching cost: career investment, reputational capital, textbook infrastructure, funding commitments, and identity investment
- Analyze how sunk cost dynamics kept specific wrong consensuses in place for decades
- Apply the switching cost analysis to your own field as part of the Epistemic Audit
- Distinguish between rational conservatism (healthy resistance to change) and pathological entrenchment (sunk cost maintaining error)
In This Chapter
- Chapter Overview
- 9.1 From Individual Sunk Cost to Institutional Sunk Cost
- 9.2 The Lobotomy Story: A Sunk Cost Tragedy
- 9.3 The Five Components of Switching Cost
- 9.4 The Dietary Fat Hypothesis: Fifty Years of Sunk Cost
- 9.5 The Caloric Theory of Heat: A Historical Parallel
- 9.6 What It Looked Like From Inside
- 9.7 Active Right Now: Where Sunk Cost May Be Maintaining Wrong Answers
- 9.8 Distinguishing Healthy Conservatism from Pathological Entrenchment
- 9.9 Practical Considerations: Reducing Switching Costs
- 9.10 Chapter Summary
- Spaced Review
- What's Next
- Chapter 9 Exercises → exercises.md
- Chapter 9 Quiz → quiz.md
- Case Study: The Lobotomy — When a Nobel Prize Became a Sunk Cost → case-study-01.md
- Case Study: The Dietary Fat Consensus — Fifty Years of Switching Cost → case-study-02.md
Chapter 9: The Sunk Cost of Consensus
"It is difficult to get a man to understand something, when his salary depends on his not understanding it." — Upton Sinclair
Chapter Overview
In 1949, the Nobel Prize in Physiology or Medicine was awarded to António Egas Moniz for developing the prefrontal lobotomy — a surgical procedure that involved severing connections in the brain's frontal lobe to treat mental illness. The procedure had been performed on tens of thousands of patients worldwide. It was endorsed by the most prestigious medical institutions. It was featured in the popular press as a miracle cure for mental illness. Moniz was celebrated as a pioneer who had transformed psychiatry.
By the mid-1950s, the evidence was devastating. Lobotomized patients showed severe personality changes, emotional blunting, intellectual impairment, and in many cases a zombie-like passivity that was worse than the original condition. Follow-up studies revealed that the procedure's benefits had been grossly overstated and its harms grotesquely underreported. The introduction of chlorpromazine (Thorazine) in 1954 provided a pharmaceutical alternative that made lobotomy's crude approach obsolete.
And yet the practice did not stop immediately. It continued through the late 1950s and into the 1960s in some institutions, with an estimated 40,000-50,000 lobotomies performed in the United States alone. Walter Freeman, the most prolific lobotomist in America, continued performing the procedure — including his notorious "ice pick" transorbital technique — into the 1960s, even as the medical establishment was turning against it.
Why did it take so long? The evidence against lobotomy was not ambiguous. The harm was visible, documented, and heartbreaking. The alternative (medication) was available. And yet the practice persisted for years after the case against it was clear.
The answer lies in what this chapter examines: the sunk cost of consensus. By the time the evidence turned against lobotomy, an enormous infrastructure had been built on it. Surgeons had spent years mastering the technique. Hospitals had invested in surgical facilities and protocols. Psychiatric institutions had organized their treatment programs around lobotomy as a primary intervention. The Nobel Prize had validated it at the highest possible level. Careers, reputations, and institutional identities were invested in the procedure.
Abandoning lobotomy didn't just mean acknowledging that the procedure was harmful. It meant acknowledging that thousands of patients had been brain-damaged unnecessarily. It meant that the Nobel Prize had been awarded for a medical atrocity. It meant that the careers built on performing and advocating for lobotomies had been built on error. The switching cost was not just intellectual — it was existential.
The Moral Dimension
The lobotomy case reveals something that the abstract language of "sunk cost" obscures: the moral weight of the investment. When we talk about "switching costs" in business — abandoning a failing product line, writing off a bad investment — the costs are financial. Painful but recoverable. When the "investment" being abandoned is a medical procedure performed on tens of thousands of people, the switching cost includes confronting an irreversible moral catastrophe.
The lobotomized patients could not be un-lobotomized. Their cognitive damage was permanent. Their families had consented to the procedure based on the recommendations of trusted physicians. Acknowledging the error meant acknowledging that the trust had been betrayed — not through malice, but through the structural dynamics of consensus-building in a field that should have been more careful.
This moral dimension amplifies the sunk cost beyond anything a rational cost-benefit analysis can capture. The psychological mechanism of cognitive dissonance (Festinger, 1957) predicts that the greater the harm caused by a decision, the stronger the motivation to justify that decision — because the alternative (accepting that you caused irreversible harm) is psychologically unbearable. The sunk cost is not just economic or reputational. It is moral. And moral sunk costs may be the most resistant of all.
This is Part II of the book: The Persistence Engine. Part I examined how wrong ideas enter fields. Part II examines something arguably more consequential: how wrong ideas stay — why they persist for decades or centuries even after the evidence turns against them. The sunk cost of consensus is the first and most fundamental persistence mechanism.
In this chapter, you will learn to:
- Recognize the five components of consensus switching cost
- Analyze how sunk cost dynamics maintain wrong answers across multiple fields
- Distinguish between healthy conservatism and pathological entrenchment
- Apply the switching cost analysis to your own field
- Begin Part II's expansion of the Epistemic Audit with persistence diagnostics
🏃 Fast Track: If you're familiar with the sunk cost fallacy at the individual level, start at section 9.3 (The Five Components of Switching Cost) for the institutional-scale analysis.
🔬 Deep Dive: After this chapter, read El-Hai's The Lobotomist for the definitive account of Walter Freeman, and Tavris & Aronson's Mistakes Were Made (But Not by Me) for the psychology of institutional self-justification.
9.1 From Individual Sunk Cost to Institutional Sunk Cost
The sunk cost fallacy in individual psychology is well-known: people continue investing in failing projects because they've already invested so much. The money spent on a bad movie ticket makes you stay for the second half. The years invested in a failing relationship make you reluctant to leave. The career investment in a declining industry makes you resist retraining.
This individual-level phenomenon is real but relatively easy to overcome. You can learn about it, recognize it in yourself, and make better decisions. The institutional version is far more powerful — and far harder to escape — because it is not located in any individual's mind. It is an emergent property of the system.
Why Institutional Sunk Cost Is Different
When an individual commits the sunk cost fallacy, they can change their mind by recognizing the fallacy. But when an institution is locked into a wrong consensus by sunk cost, no individual's recognition of the fallacy is sufficient to change the outcome. The lock-in is structural:
- The surgeon who recognizes that lobotomy is harmful can stop performing it — but the institutional infrastructure (surgical suites, training programs, referral networks, insurance codes) continues to support it.
- The economist who recognizes that equilibrium models are inadequate can publish critiques — but the funding agencies, journals, and tenure committees that evaluate research within the equilibrium framework continue to reward it.
- The teacher who recognizes that standardized testing distorts education can teach differently in their classroom — but the accountability system, the school board, and the state legislature continue to require the tests.
Individual recognition of the sunk cost problem is necessary but not sufficient for correction. The switching cost must be overcome collectively — which requires coordinating the abandonment of investments across many actors simultaneously. This coordination problem is why institutional sunk costs are so much more persistent than individual ones.
🔄 Check Your Understanding (try to answer without scrolling up)
- How does institutional sunk cost differ from individual sunk cost?
- Why is individual recognition of the sunk cost fallacy insufficient to overcome institutional sunk cost?
Verify
1. Individual sunk cost is located in one person's mind and can be overcome by recognizing the fallacy. Institutional sunk cost is an emergent property of the system — it's distributed across careers, infrastructure, training, funding, and identity — and cannot be overcome by any single individual's recognition.
2. Because the lock-in is structural: even if every individual recognizes the problem, the infrastructure (funding, training, journals, regulations, professional standards) continues to support the status quo. Changing requires coordinated collective action, which faces its own barriers.
9.2 The Lobotomy Story: A Sunk Cost Tragedy
Let's trace the sunk cost dynamics through the lobotomy case in full detail, because the pattern is sickeningly clear.
The Investment Phase (1935-1950)
Moniz performed the first prefrontal leucotomy in 1935. Walter Freeman adapted the technique for American practice and became its most aggressive promoter. By the late 1940s, lobotomy was being performed at major medical institutions across the United States and Europe. The investment was enormous:
- Career investment: Dozens of neurosurgeons specialized in the procedure. Freeman alone performed or supervised an estimated 3,500 lobotomies. Other surgeons built careers, reputations, and academic positions on the procedure.
- Institutional investment: Hospitals built surgical facilities, developed protocols, trained staff, and organized patient care around lobotomy as a standard treatment.
- Prestige investment: The 1949 Nobel Prize represented the highest possible validation. Textbooks incorporated lobotomy as a treatment option. Medical schools taught the technique. Professional societies endorsed it.
- Narrative investment: The public story was that lobotomy was a breakthrough — a humane alternative to permanent institutionalization. This narrative was emotionally powerful and had been communicated through major media (including a laudatory Saturday Evening Post article about Freeman).
The Counter-Evidence Phase (1950-1960)
By the early 1950s, the evidence against lobotomy was accumulating rapidly:
- Follow-up studies revealed that many patients had severe cognitive deficits, personality changes, and loss of emotional depth
- The initial claims of improvement were found to be based on superficial assessments (patients were "quieter" — which was treated as improvement rather than recognized as brain damage)
- The introduction of chlorpromazine (1954) provided a less destructive alternative
- Patient advocacy organizations and some psychiatrists began publicly opposing the procedure
- Some lobotomized patients, or their families, began speaking out about the devastating consequences
The Sunk Cost Resistance Phase (1955-1967)
Despite the mounting evidence, the practice continued — not because anyone was evaluating the evidence and concluding that lobotomy was beneficial, but because the cost of acknowledging the error was catastrophic:
- For the surgeons: Admitting that lobotomy was harmful meant admitting that they had brain-damaged thousands of patients. This was not merely a career inconvenience — it was a moral catastrophe. The psychological mechanism of cognitive dissonance (Festinger, 1957) predicts that people will resist information that would require them to acknowledge having caused serious harm.
- For the institutions: Hospitals that had performed lobotomies faced potential liability. Professional organizations that had endorsed the procedure faced credibility damage. The Nobel Committee faced the embarrassing question of whether to revoke the prize (they never did).
- For the field of psychiatry: Lobotomy was psychiatry's most visible surgical intervention. Acknowledging its failure meant acknowledging that psychiatry had subjected patients to an ineffective, damaging, irreversible procedure for two decades. The damage to the field's credibility — already fragile compared to other medical specialties — would be severe.
Walter Freeman continued performing lobotomies until 1967, when a patient died on the operating table during her third lobotomy. His hospital finally revoked his surgical privileges. He had been performing the procedure for 31 years, despite at least a decade of clear evidence against it. His personal sunk cost — his career, his identity, his legacy — was insurmountable.
The Institutional Response
What makes the lobotomy case a pure sunk cost study, rather than merely a story of one rogue physician, is the institutional response. Freeman was the most extreme case, but the broader pattern was systemic:
- Professional societies that had endorsed lobotomy were slow to withdraw their endorsement. The American Medical Association never formally condemned the procedure; lobotomy simply stopped being mentioned as the field moved on.
- Hospitals that had performed thousands of lobotomies did not conduct retrospective reviews of their outcomes. The patients — many of whom were institutionalized and unable to advocate for themselves — were largely forgotten.
- The Nobel Prize was never revoked or officially reconsidered. To this day, Moniz's Nobel remains on the record, without any formal acknowledgment from the Nobel Committee that the prize was awarded for a procedure that harmed more patients than it helped.
- Textbook treatment shifted from advocacy to silence. Rather than explicitly acknowledging the error and analyzing what went wrong, most medical textbooks simply stopped discussing lobotomy. This is the revision myth (Chapter 1, Stage 7) in action: the messy, harmful history was sanitized through omission rather than confronted through analysis.
The institutional response — silence rather than reckoning — is itself a product of sunk cost dynamics. A formal acknowledgment would have required institutions to absorb the reputational, legal, and moral costs of the error. Silence avoided those costs by allowing the error to fade from memory. But silence also prevented the field from learning the structural lessons: Why did this happen? What institutional features allowed it? How can we prevent the next lobotomy?
These questions were never systematically addressed. And so the structural conditions that produced the lobotomy disaster — prestige-driven adoption, inadequate evidence standards, career-invested practitioners, institutional resistance to admitting harm — remain present in medicine today. The specific procedure is gone. The structural vulnerability is intact.
📜 Historical Context: The lobotomy story is sometimes presented as an isolated aberration — a uniquely dark chapter that "could never happen today." This is the revision myth (Chapter 1, Stage 7). The structural dynamics that maintained lobotomy — career investment, institutional prestige, Nobel validation, narrative momentum, and the psychological impossibility of acknowledging having caused mass harm — are present in every field today. The specific procedure is obsolete; the structural forces are permanent.
9.3 The Five Components of Switching Cost
The lobotomy case illustrates all five components of what we might call the consensus switching cost equation:
Component 1: Career Investment
This is the most immediately tangible component. Researchers, practitioners, and educators who have built their careers on the current consensus face career destruction if the consensus changes. This is not abstract: changing your field's fundamental framework can mean that your publications are obsolete, your expertise is devalued, your tenure case is weakened, and your professional identity is invalidated.
The career investment creates a structural asymmetry: the people most qualified to evaluate whether the consensus is wrong are the same people whose careers depend on it being right. This is not a conflict of interest in the conventional sense (no one is being paid to maintain the error). It is a structural feature of how expertise is organized: deep expertise in a paradigm and investment in that paradigm are inseparable.
Consider the numbers. A tenured professor at a major research university has typically spent: 4 years in undergraduate education, 5-7 years in graduate school, 2-4 years in postdoctoral positions, and 6-7 years as a junior faculty member building a tenure case. By the time they achieve tenure — roughly 17-22 years after starting their education — their entire professional identity is organized around a specific framework, methodology, and set of assumptions. Asking them to abandon that framework is asking them to acknowledge that two decades of work was built on a flawed foundation. The psychic cost is immense, even if the career mechanics (tenure provides job security) would survive the transition.
For untenured faculty, the calculus is even worse. Without the security of tenure, challenging the dominant framework is a career-ending move in most institutions. The junior faculty member who publishes work contradicting the senior faculty's framework will be evaluated for tenure by those same senior faculty. The incentive structure ensures that the people with the freshest perspective (junior researchers who haven't yet accumulated sunk costs) are the ones least able to act on it.
Component 2: Reputational Capital
Beyond individual careers, the reputation of the field is invested in the consensus. When a field changes its mind, the public and other fields notice — and the narrative is rarely flattering. "Psychiatry spent two decades brain-damaging patients." "Nutrition science spent fifty years giving wrong dietary advice." "Economics failed to predict the worst financial crisis in eighty years."
Each of these narratives damages the field's credibility, its funding prospects, and its authority on future questions. Fields are understandably reluctant to provide ammunition for these narratives — even when the correction would ultimately improve their credibility.
Component 3: Textbook Infrastructure
When a consensus is embedded in textbooks, training curricula, certification exams, and clinical guidelines, changing the consensus requires changing all of these simultaneously. This is an enormous coordination problem: textbooks take years to revise, curricula take years to reform, certification exams must be rewritten, and clinical guidelines must go through lengthy review processes.
During the transition, practitioners trained under the old consensus and practitioners trained under the new one will coexist — creating confusion, inconsistency, and the possibility of harm. The transition itself is costly, even when the destination is correct.
Component 4: Funding Commitments
Research funding is allocated based on the current consensus. Grants are awarded, laboratories are equipped, graduate students are admitted, and multi-year projects are initiated — all within the existing framework. Changing the consensus means that current grants may become irrelevant, laboratory equipment may become obsolete, and graduate students may find their dissertations unsalvageable.
Funding agencies are also invested: they evaluated and approved the current research programs. Acknowledging that the consensus was wrong implies that their funding decisions were wrong — a conclusion that funding agencies are structurally motivated to resist.
The funding component creates a particularly vicious cycle: research funding flows toward the dominant paradigm, which produces evidence within that paradigm, which justifies continued funding. Researchers who want to investigate alternatives cannot get funded because the funding criteria are defined by the paradigm they're trying to challenge. The paradigm maintains itself through the funding structure, and the funding structure is shaped by the paradigm. Breaking this cycle typically requires either external funding (from a source not committed to the current paradigm) or a crisis dramatic enough to force funding agencies to reconsider their priorities (Chapter 19).
Component 5: Identity Investment
Perhaps the most powerful component, and the one most resistant to rational override: for many practitioners, the consensus is not just a belief — it is part of their professional identity. A gastroenterologist who has spent twenty years treating ulcers with acid-suppression drugs doesn't just believe that ulcers are caused by acid. That belief is woven into their professional identity — their sense of who they are and what they do. Changing the belief requires changing the identity, which is psychologically the most costly form of change.
This identity component explains why some practitioners resist correction even after they privately acknowledge the evidence: the intellectual acknowledgment and the identity adjustment operate on different timescales. You can change your mind in an afternoon. Changing your professional identity takes years.
Social psychologists have documented this gap between private belief and public behavior extensively. Leon Festinger's original work on cognitive dissonance showed that people who invest heavily in a belief become more committed to it when evidence contradicts it — not less. The investment raises the psychological cost of disconfirmation. A gastroenterologist who privately suspects the bacterial hypothesis might be correct but has twenty years invested in the acid model faces an identity crisis that the evidence alone cannot resolve. The intellectual question ("Is H. pylori the cause?") has an answer the evidence supports. The identity question ("Who am I if my career's work was wrong?") does not have an easy answer — and it is the identity question, not the intellectual question, that determines behavior.
This is why the most effective correction strategies (Chapter 33 will explore this in detail) address identity rather than evidence. Offering a practitioner new evidence is necessary but insufficient. Offering them a new identity — a way to be a hero of the correction rather than a victim of it — is what enables actual change.
🧩 Productive Struggle
Consider your own field and your own professional identity. If the core consensus in your domain were shown to be wrong tomorrow, what would the switching cost be for you personally? Not just intellectually — what would it cost in terms of career, reputation, identity, and the investments you've already made?
The discomfort this question generates is itself a data point. If changing your mind would be costly, you have a structural incentive to resist — regardless of the evidence.
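The five-component analysis above can be turned into a rough self-audit. The sketch below is an illustrative scoring exercise, not a validated instrument: the component names come from this chapter, but the 0-10 rating scale and the equal weighting of components are arbitrary assumptions.

```python
# Toy switching-cost audit for the five components named in section 9.3.
# The 0-10 scale and equal weights are illustrative assumptions, not a
# validated measurement instrument.

COMPONENTS = [
    "career_investment",
    "reputational_capital",
    "textbook_infrastructure",
    "funding_commitments",
    "identity_investment",
]

def switching_cost(scores: dict) -> float:
    """Sum the five component ratings (each 0-10); higher = more entrenched."""
    missing = [c for c in COMPONENTS if c not in scores]
    if missing:
        raise ValueError(f"missing components: {missing}")
    return float(sum(scores[c] for c in COMPONENTS))

# Hypothetical ratings for the 1950s lobotomy consensus, as one reading
# of the chapter's account (the numbers themselves are assumptions).
lobotomy = {
    "career_investment": 9,
    "reputational_capital": 9,
    "textbook_infrastructure": 7,
    "funding_commitments": 6,
    "identity_investment": 10,
}
print(switching_cost(lobotomy))  # → 41.0
```

Rating your own field component by component, rather than producing a single gut-feel judgment, is the point of the exercise: the discomfort tends to concentrate in one or two components, which tells you where your personal resistance would come from.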
9.4 The Dietary Fat Hypothesis: Fifty Years of Sunk Cost
We've encountered the dietary fat hypothesis in multiple previous chapters (authority cascade in Ch.2, survivorship bias in Ch.5, anchoring in Ch.7). Now we can examine it through the sunk cost lens — and see how the switching cost components interacted to maintain a wrong consensus for decades.
The Investment
By the 1980s, the low-fat dietary consensus had accumulated massive sunk costs:
| Component | Investment |
|---|---|
| Career | Thousands of nutrition researchers built careers on the fat-heart hypothesis |
| Reputation | The American Heart Association, USDA, and WHO had endorsed the fat guidelines |
| Textbook | Medical school nutrition curricula were organized around the fat model |
| Funding | Billions in research funding had been allocated to studying fat and heart disease |
| Identity | An entire generation of nutritionists identified as advocates of low-fat eating |
| Economic | The low-fat food industry was worth tens of billions of dollars |
The Counter-Evidence
The counter-evidence against the fat hypothesis didn't trickle in — it accumulated in waves, each one meeting the same wall of resistance:
Counter-evidence accumulated throughout the 1970s-2000s:
- Studies failed to demonstrate a clear causal link between dietary saturated fat and heart disease
- The sugar industry's manipulation of research was exposed
- Populations that consumed high-fat diets (Mediterranean, French) showed lower heart disease rates
- The obesity and diabetes epidemics accelerated after the adoption of low-fat guidelines
- Meta-analyses in the 2010s concluded that the evidence for the fat-heart hypothesis was much weaker than previously claimed
The Sunk Cost Resistance
Despite this evidence, the consensus held for decades. The switching cost dynamics are clear:
- Career investment: Researchers who had spent careers studying fat-heart disease faced devaluation of their life's work.
- Institutional reputation: The AHA, USDA, and WHO would face enormous credibility damage if they admitted their guidelines had been wrong for fifty years.
- Textbook inertia: Medical school curricula had been teaching the fat model for decades. Rewriting required acknowledging decades of wrong instruction.
- Industry: The low-fat food industry had invested billions in products designed to meet the guidelines. Reversing the guidelines would devastate the industry.
- Identity: Nutritionists who had spent careers advising patients to reduce fat intake would need to tell those patients — and themselves — that the advice was misguided.
The correction has been slow and partial. Current dietary guidelines are more nuanced than the original low-fat recommendations, but the full correction — a clear acknowledgment that the fifty-year fat consensus was substantially wrong — has not occurred at the institutional level. The sunk cost is too large.
Compare the dietary fat case to the peptic ulcer case (our anchor example from Chapter 2). Both involved decades-long wrong consensuses maintained by sunk cost. But the ulcer case was corrected more completely because: (a) the alternative (antibiotics) was clearly superior and easily adopted, (b) the switching cost for individual gastroenterologists was lower (they just prescribed different drugs, not a different philosophical framework), and (c) the evidence was decisive rather than probabilistic. The dietary fat case is harder because: (a) the alternative is "it's complicated" (not a simple, actionable recommendation), (b) the switching cost involves rewriting national dietary policy, and (c) the evidence is epidemiological and statistical rather than mechanistic and decisive.
This comparison reveals an important principle: the speed of correction is inversely proportional to the switching cost. The higher the switching cost, the longer the wrong consensus persists — regardless of the evidence. We'll formalize this relationship in Chapter 22 (The Speed of Truth).
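The inverse relationship stated above can be sketched as a one-line model. This is only a placeholder: the linear form (time proportional to cost divided by evidence strength) and the constant `k` are assumptions for illustration, not the formalization Chapter 22 promises.

```python
# Minimal sketch of "speed of correction is inversely proportional to
# switching cost." The functional form and constant k are illustrative
# assumptions, not the Chapter 22 formalization.

def years_to_correction(switching_cost: float,
                        evidence_strength: float,
                        k: float = 1.0) -> float:
    """Higher switching cost delays correction; stronger evidence speeds it."""
    if evidence_strength <= 0:
        raise ValueError("evidence_strength must be positive")
    return k * switching_cost / evidence_strength

# Same evidence, different switching costs: a lower-cost correction
# (the ulcer case) resolves faster than a higher-cost one (dietary fat).
print(years_to_correction(10, 2.0))  # → 5.0
print(years_to_correction(50, 2.0))  # → 25.0
```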
🔗 Connection: The dietary fat case demonstrates how persistence mechanisms (sunk cost) interact with entry mechanisms (authority cascade, survivorship bias, anchoring). The entry mechanisms got the wrong idea in. The sunk cost kept it there. Breaking free required overcoming both sets of forces simultaneously — which is why the correction took so long and remains incomplete.
9.5 The Caloric Theory of Heat: A Historical Parallel
To show that sunk cost dynamics are not unique to modern fields, consider a case from the history of physics.
In the late 18th and early 19th century, the dominant theory of heat was the caloric theory — the idea that heat was a fluid ("caloric") that flowed from hot objects to cold objects. The theory was endorsed by the most prestigious scientists of the era, including Lavoisier. It explained many observations, generated productive research, and was embedded in the vocabulary of chemistry and physics.
Counter-evidence appeared early. Benjamin Thompson (Count Rumford) demonstrated in 1798 that boring a cannon produced unlimited amounts of heat — which was impossible if heat were a conserved fluid. The caloric theory should have been abandoned in the early 1800s.
It persisted for another fifty years.
Why? Sunk cost. Lavoisier's chemical nomenclature — built around the caloric concept — was the standard. Textbooks used caloric language. Research programs were designed within the caloric framework. Young scientists who proposed the kinetic theory of heat (that heat is molecular motion) faced resistance not because their evidence was weak, but because the switching cost was high. The correction came gradually, driven by the accumulated work of multiple researchers (Joule, Clausius, Boltzmann), and wasn't complete until the 1860s — more than sixty years after Rumford's decisive experiment.
The caloric theory case demonstrates that sunk cost dynamics are not a modern phenomenon. They are a structural feature of how knowledge-producing institutions work, and they have operated in every field in every century.
Einstein's Static Universe: When Even Geniuses Can't Escape
We briefly encountered Einstein's cosmological constant in Chapter 2 (authority cascade). Now we can see the sunk cost dimension.
Einstein introduced the cosmological constant in 1917 to maintain a static universe — the prevailing consensus. When Hubble's observations in the late 1920s demonstrated expansion, Einstein could have celebrated: his own equations had predicted it. Instead, he reportedly called the cosmological constant his "greatest blunder" and removed it.
But the deeper question is why Einstein added the constant in the first place. His own equations predicted an expanding or contracting universe. The mathematics was clear. But the consensus was that the universe was static, and Einstein — despite being the most brilliant physicist of his era — adjusted his mathematics to fit the consensus rather than trusting his equations to challenge it.
This is sunk cost operating at the individual level within a communal context. The physics community's investment in a static universe was so deep — philosophically, aesthetically, and conceptually — that even Einstein was reluctant to challenge it. The switching cost for the community (abandoning the static universe worldview) was high enough to deter even the one person best positioned to initiate the change.
Bloodletting: Millennia of Sunk Cost
For the most extreme example of sunk cost persistence, consider bloodletting — the practice of withdrawing blood from patients to treat virtually any illness. Bloodletting was practiced for over 2,000 years, from ancient Greece through the 19th century. Galen's humoral theory (2nd century CE) provided the theoretical justification: illness was caused by an imbalance of bodily humors, and removing blood restored the balance.
The practice was harmful — weakening already sick patients, causing infections, and occasionally killing them directly (George Washington's death in 1799 was likely hastened by the removal of approximately 40% of his blood by his physicians). Counter-evidence accumulated over centuries: physicians who did NOT bleed their patients often saw better outcomes.
Yet bloodletting persisted for roughly 2,000 years. The sunk cost was enormous: an entire medical infrastructure built on humoral theory, centuries of textbooks, generations of training, the professional identity of physicians as practitioners of heroic medicine, and the deep philosophical commitment to the idea that illness required active, aggressive intervention.
The scale of this sunk cost — measured not in years but in millennia — demonstrates something important: there is no natural limit to how long sunk cost can maintain a wrong consensus. Without a specific correction mechanism (in this case, the development of germ theory and controlled clinical trials), the wrong answer can persist indefinitely.
The bloodletting case also reveals a feature of sunk cost that the modern cases sometimes obscure: the sunk cost grows with time. Each generation that practices bloodletting adds to the investment: more physicians trained in the technique, more textbooks incorporating it, more institutional tradition supporting it. The switching cost at any given moment includes not just the current generation's investment but the accumulated investment of every previous generation. This accumulation explains why very old wrong ideas are harder to correct than recent ones — the sunk cost has had more time to compound.
This has a disturbing implication for any field: the longer a wrong consensus persists, the harder it becomes to correct — not because the evidence weakens, but because the investment deepens. Time is the enemy of correction. Every year that a wrong consensus remains in place, it becomes more entrenched, more expensive to abandon, and more deeply woven into the identities and institutions that sustain it.
💡 Intuition: Think of sunk cost as compound interest — but compounding against correction. A wrong consensus that has persisted for ten years has accumulated a certain switching cost. After twenty years, the cost hasn't merely doubled — it has compounded, because each year of additional investment builds on the previous years' investment. After fifty years (the dietary fat case), the accumulated switching cost is so enormous that even decisive evidence may be insufficient to trigger correction without an external crisis.
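The compounding analogy above can be made concrete with a toy model. This is an illustrative sketch only: the growth rate, the normalization, and the function itself are hypothetical assumptions chosen to show the structure of the argument, not measurements of any real field.

```python
# Toy model of the "compound interest" intuition: each year's new careers,
# textbooks, and funding build on the existing stock of investment.
# The 7% annual rate is a hypothetical assumption for illustration.

def switching_cost(initial_cost: float, rate: float, years: int) -> float:
    """Accumulated switching cost after `years` of compounding investment."""
    return initial_cost * (1 + rate) ** years

base = 1.0  # normalized cost at the moment the consensus forms
print(round(switching_cost(base, 0.07, 10), 2))  # prints 1.97  (roughly doubled)
print(round(switching_cost(base, 0.07, 50), 2))  # prints 29.46 (the fifty-year case)
```

The point of the sketch is the shape, not the numbers: growth is multiplicative, so a consensus five times older is not five times harder to dislodge but exponentially harder.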
9.6 What It Looked Like From Inside
Consider the perspective of a mid-career gastroenterologist in 1990, seven years after Marshall and Warren's initial publication:
- You have spent fifteen years treating ulcers with acid-suppression drugs. You have performed endoscopies, prescribed medications, managed patients through relapses, and occasionally referred the most severe cases for surgery.
- You have published papers on acid secretion and its relationship to ulcer disease. Your tenure case was built on this work. Your professional reputation is as an expert in acid-related disorders.
- You have heard about the H. pylori hypothesis. You have read Marshall and Warren's papers. The evidence is stronger than you expected.
- But accepting the bacterial hypothesis means accepting that for fifteen years, you have been treating the symptom (excess acid) rather than the cause (bacterial infection). It means that the surgical referrals you made — patients whose stomachs were permanently altered — were unnecessary. It means your publications on acid secretion, while not wrong, were focused on the wrong question.
- The professional cost of publicly switching positions would be significant. Your acid-secretion colleagues would view you as a traitor. Your publications would be reframed as "pre-bacterial era" work. Your expertise in acid management would be devalued.
- The personal cost is even higher. You would need to look at yourself in the mirror and acknowledge that some of your patients were harmed by your failure to adopt a correct diagnosis earlier. This is not an abstract question — these are specific people with names and faces. Mrs. Rodriguez, who underwent a vagotomy in 1988 that left her with chronic digestive problems. Mr. Chen, who has been taking acid-suppression medication for twelve years. The surgical referrals, the medication prescriptions, the reassurances you gave to patients — all of these become morally freighted the moment you accept that the underlying cause was a bacterium curable with two weeks of antibiotics.
From this perspective, continuing with the acid model for another few years — until the evidence becomes so overwhelming that everyone switches simultaneously — is the rational strategy. The switching cost for an individual early mover is devastating. The switching cost for someone who changes when the whole field changes is minimal (you just go with the new consensus).
This explains why fields change en masse rather than one person at a time: the sunk cost structure incentivizes waiting until collective switching becomes inevitable. Economists would call this a coordination game: each individual's optimal strategy depends on what everyone else does. If everyone else is going to switch, switching early is advantageous (you're a leader). If everyone else is going to stay, switching early is catastrophic (you're a heretic). Since no one knows what everyone else will do, the rational strategy is to wait — and the consensus persists until an external shock or generational turnover triggers simultaneous switching.
This coordination problem also explains the often-observed phenomenon of "sudden" paradigm shifts. From the outside, it looks like a field changed its mind overnight. From the inside, what happened is that a critical mass of practitioners simultaneously concluded that the switching cost of maintaining the old consensus had exceeded the switching cost of abandoning it — and once the first few switched publicly, the coordination problem resolved rapidly as everyone followed.
🌍 Global Perspective: Sunk cost dynamics operate differently across academic cultures. In the United States, where academic careers are highly competitive and tenure depends on publication productivity within established frameworks, the career component of switching cost is particularly severe. In European systems with more secure academic positions, the career component may be lower — but the reputational and institutional components remain. In East Asian academic cultures, where deference to senior scholars is deeply embedded, the career cost of challenging a senior figure's framework can be even higher than in the West.
🔍 Why Does This Work?
The sunk cost of consensus works because the cost of being wrong is distributed asymmetrically across time. The cost of maintaining a wrong consensus is borne by patients, students, the public, or other downstream victims — and it's spread over years, making it invisible to any single decision-maker. The cost of acknowledging a wrong consensus is borne immediately and personally by the practitioners who switch — and it's concentrated, visible, and career-threatening. This asymmetry ensures that the wrong consensus is maintained long past its expiration date: the people who would need to change bear the cost, while the people who are harmed by the status quo have no voice in the decision.
9.7 Active Right Now: Where Sunk Cost May Be Maintaining Wrong Answers
Opioid prescribing practices. For two decades, the medical establishment endorsed aggressive opioid prescribing for chronic non-cancer pain — based partly on pharmaceutical company influence and partly on a genuine (if misguided) desire to address undertreated pain. The sunk cost includes: clinical guidelines embedded in practice, physician training in opioid prescribing, patient expectations for pain medication, pharmaceutical company investments, and the career investments of pain medicine specialists who championed the approach. The correction is underway but faces enormous resistance — partly because the alternative (multimodal pain management) is more complex, more expensive, and less amenable to the 15-minute clinical encounter.
Academic publishing model. The current system — where publicly funded researchers give their work to private publishers, who charge libraries and individuals for access — is widely recognized as inefficient and inequitable. The open-access alternative is available and growing. Yet the old model persists because of sunk cost: journal reputations (built over decades), career evaluation systems (built on journal prestige rankings), library subscription budgets (committed for years in advance), and publisher business models (generating billions in annual revenue). Every stakeholder acknowledges the problem; the switching cost prevents the solution.
Standardized testing in education. As discussed in Chapter 4, standardized testing has known problems (Goodhart's Law, curriculum narrowing, cultural bias). Alternatives exist (portfolio assessment, competency-based evaluation, project-based learning). Yet the testing infrastructure persists because the sunk cost includes: test development companies (multi-billion-dollar industry), school accountability systems (built on test scores), university admissions processes (dependent on standardized metrics), and decades of legislative framework mandating testing.
GDP as a primary economic indicator. Despite decades of criticism (Chapter 4), GDP remains the dominant measure of economic health. The sunk cost includes: international institutional infrastructure (IMF, World Bank, OECD), domestic policy frameworks, media reporting conventions, political rhetoric, and the careers of economists specializing in GDP analysis. Alternative indicators exist (HDI, GPI, SPI) but face the coordination problem: switching would require simultaneous change across institutions worldwide.
In each case, the pattern is the same: the problem is identified, the alternative is available, and the sunk cost prevents the switch. The consensus persists not because anyone defends it on the merits but because the cost of changing exceeds any individual actor's incentive to absorb it.
9.8 Distinguishing Healthy Conservatism from Pathological Entrenchment
Not all resistance to new ideas is pathological sunk cost. Some resistance is healthy scientific conservatism — the appropriate skepticism that protects fields from adopting every new claim that comes along. How do you tell the difference?
Healthy Conservatism
- Resistance is based on evidence: "The new claim hasn't been independently replicated yet"
- The same evidential standards are applied to the new claim AND the existing consensus
- Dissenters are engaged on the merits of their evidence, not punished personally
- A specific, articulable evidential threshold exists for changing the consensus
- The field actively investigates the new claim rather than dismissing it
Pathological Entrenchment (Sunk Cost)
- Resistance is based on investment: "We've built too much on the current framework to abandon it"
- Higher evidential standards are applied to the new claim than were ever applied to the existing consensus
- Dissenters face career consequences, social ostracism, or professional marginalization
- No specific evidential threshold exists — the goalposts move as new evidence appears
- The field does not investigate the new claim and discourages others from doing so
The diagnostic table:
| Feature | Healthy Conservatism | Pathological Entrenchment |
|---|---|---|
| Basis of resistance | Evidence | Investment |
| Evidential symmetry | Same standards for both | Asymmetric — higher for challenger |
| Treatment of dissenters | Engagement | Punishment |
| Correction threshold | Specific and stated | Vague or shifting |
| Investigation of alternative | Active | Discouraged |
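As a rough self-assessment aid, the diagnostic table above can be turned into a checklist. This is a hypothetical scoring sketch, not a validated instrument: the five questions restate the table's rows, and the thresholds are arbitrary assumptions.

```python
# Crude diagnostic sketch of the table above (hypothetical scoring).
# Answer each question True if your field matches the "healthy
# conservatism" column for that row.

HEALTHY_SIGNS = [
    "Resistance is grounded in evidence rather than prior investment",
    "The same evidential standards apply to challenger and incumbent",
    "Dissenters are engaged on the merits, not punished",
    "A specific threshold for changing the consensus is stated",
    "The field actively investigates the new claim",
]

def diagnose(answers: list) -> str:
    """Verdict from five True/False answers, one per row of the table."""
    score = sum(answers)
    if score >= 4:
        return "healthy conservatism"
    if score <= 1:
        return "pathological entrenchment"
    return "mixed signals: examine the asymmetric items"

print(diagnose([True, True, True, False, True]))     # prints "healthy conservatism"
print(diagnose([False, False, True, False, False]))  # prints "pathological entrenchment"
```

A mixed score is the interesting case: it usually means the field applies healthy standards to some challenges and entrenchment tactics to the ones that threaten the largest investments.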
📐 Project Checkpoint
Your Epistemic Audit — Chapter 9 Addition (Part II begins)
Return to your audit target and analyze the sunk cost structure:
1. Map the switching costs. For the core consensus in your field, identify the career investment, reputational capital, textbook infrastructure, funding commitments, and identity investment that maintain it.
2. Identify the cost asymmetry. Who bears the cost of maintaining the wrong consensus (if it is wrong)? Who bears the cost of acknowledging the error? Are these the same people?
3. Apply the diagnostic table. Does your field's resistance to challenges look more like healthy conservatism or pathological entrenchment?
4. Estimate the switching cost. If the core consensus in your field were shown to be wrong, what would need to change? Careers? Textbooks? Funding? Regulations? Identities?
Add 300–500 words to your Epistemic Audit document.
9.9 Practical Considerations: Reducing Switching Costs
If sunk cost is a structural force that maintains wrong answers, the solution must also be structural: designing institutions that reduce the cost of being wrong.
Strategy 1: Normalize Error Acknowledgment
The most powerful sunk cost reducer is a culture that treats changing one's mind as a sign of strength rather than a sign of weakness. If the profession celebrates corrections — "Dr. X courageously revised their position when the evidence demanded it" — the identity cost of switching drops dramatically.
Some fields are beginning to experiment with this: "loss of confidence" retractions in science, "postmortem" cultures in technology companies, and "lessons learned" reviews in the military. But in most fields, changing your mind publicly is still stigmatized.
Strategy 2: Separate Assessment from Identity
Encourage practitioners to think of their professional contributions as assessments (which can be updated) rather than commitments (which must be defended). "My best assessment, given current evidence, is X" is a formulation that allows revision without identity threat. "I believe X" makes revision feel like betrayal.
Strategy 3: Build Sunset Clauses into Consensus
When a field adopts a new framework, build in explicit review dates: "We will re-evaluate this consensus after 10 years of additional evidence." Sunset clauses prevent the indefinite accumulation of sunk cost by forcing periodic reassessment.
Strategy 4: Fund Transitions, Not Just Research
When a consensus changes, the field needs transition support: retraining programs, updated curricula, transitional funding for researchers who need to redirect their work. If the switching cost is partly economic, the solution must also be partly economic.
Consider the analogy to industrial policy. When an economy transitions from manufacturing to services, governments provide retraining programs, unemployment insurance, and transition support for displaced workers. Fields undergoing paradigm shifts need the equivalent: "paradigm transition support" that cushions the personal cost of changing one's mind. Without it, individual practitioners bear the full switching cost — and rationally resist.
Strategy 5: Celebrate Corrections as Discoveries
If a field treated the correction of a long-standing error as a discovery — on par with the original finding — the reputational cost of correction would drop substantially. Imagine a world where "Dr. X demonstrated that the 30-year consensus on Y was wrong, enabling more effective treatment of Z" was celebrated as a Nobel-worthy achievement. In such a world, the incentive structure would favor correction rather than resistance.
Some fields are moving in this direction. The replication crisis in psychology has produced a culture (in some quarters) where identifying failed replications is treated as a contribution rather than an attack. But in most fields, the correction of error is still treated as an embarrassment rather than an achievement — which is itself a structural failure that maintains sunk cost dynamics.
🪞 Learning Check-In
Pause and reflect:
- Have you ever held onto a belief, strategy, or approach longer than the evidence warranted because you'd invested too much to change?
- In your field, what would it cost — professionally, reputationally, personally — to publicly admit that a core consensus was wrong?
- Can you imagine a field culture where admitting error was celebrated? What would that culture look like?
✅ Best Practice: When presenting your own work, include an explicit "conditions for revision" section: "I would change my conclusion if I saw evidence of X, Y, or Z." This practice reduces your future switching cost by building revision into the original commitment. It also signals to others that you are a scientist, not an advocate.
9.10 Chapter Summary
Key Arguments
- The sunk cost of consensus is the first and most fundamental persistence mechanism — the force that keeps wrong answers in place after counter-evidence appears
- Institutional sunk cost differs from individual sunk cost because it is distributed across careers, infrastructure, funding, and identity, making it impossible for any single individual to overcome
- The five components of switching cost — career investment, reputational capital, textbook infrastructure, funding commitments, and identity investment — interact to create a barrier that is much larger than any individual component
- The cost of maintaining a wrong consensus is distributed and invisible; the cost of acknowledging it is concentrated and personal — this asymmetry is the engine of persistence
- The lobotomy case, the dietary fat hypothesis, and the caloric theory of heat all demonstrate identical sunk cost dynamics across different fields and centuries
Key Debates
- Can institutional sunk cost ever be fully overcome without generational turnover?
- Is it ethical to maintain a consensus you suspect is wrong because the switching cost is too high?
- How do you balance the real costs of transition against the real costs of perpetuating error?
Analytical Framework
- The five-component switching cost equation
- The diagnostic table for healthy conservatism vs. pathological entrenchment
- The cost asymmetry principle (distributed/invisible costs of maintaining error vs. concentrated/personal costs of acknowledging it)
Spaced Review
Revisiting earlier material to strengthen retention.
- (From Chapter 8) How does imported error interact with sunk cost? When a field has borrowed a framework from another field, does the dual investment (in the import AND in the source discipline) increase the switching cost?
- (From Chapter 2) The authority cascade gets wrong ideas IN. The sunk cost of consensus keeps them THERE. Trace how these two mechanisms interact in the dietary fat hypothesis.
- (From Chapter 1) Which stages of the lifecycle of a wrong idea correspond to sunk cost dynamics?
Answers
1. Yes — imported frameworks carry double sunk cost. The field has invested in the framework (career, textbook, funding) AND in the relationship to the source discipline (prestige borrowing, cross-field credibility). Abandoning the framework costs both investments simultaneously.
2. Authority cascade (Ancel Keys's prestige) installed the fat hypothesis. Sunk cost (career investment in fat-heart research, AHA guidelines, low-fat food industry, nutritionist identities) maintained it. Breaking free required overcoming both: new evidence strong enough to bypass the authority cascade AND a new generation willing to absorb the switching cost.
3. Stages 3 (Entrenchment), 4 (Counter-evidence — sunk cost determines the response to counter-evidence), and 5 (Resistance — sunk cost powers the institutional resistance).

What's Next
In Chapter 10: The Replication Problem, we'll examine the second persistence mechanism: what happens when nobody checks whether the foundational evidence is actually true. You'll encounter psychology's replication crisis, the shocking unreliability of preclinical cancer research, and the structural reasons why "checking the homework" is disincentivized in every field.
Before moving on, complete the exercises and quiz to solidify your understanding.
Chapter 9 Exercises → exercises.md
Chapter 9 Quiz → quiz.md
Case Study: The Lobotomy — When a Nobel Prize Became a Sunk Cost → case-study-01.md
Case Study: The Dietary Fat Consensus — Fifty Years of Switching Cost → case-study-02.md
Related Reading
Explore this topic in other books:
- Propaganda → Psychology of Persuasion
- Propaganda → Bandwagon and Social Proof
- Applied Psychology → Persuasion, Influence, and Social Pressure
- Algorithmic Addiction → Social Proof and Manufactured Consensus