Case Study: Scoring Your Own Field — A Guided Health Assessment
Purpose
This case study guides you through a complete Epistemic Health Checklist assessment of your own field. It provides structured prompts for each dimension, common pitfalls in scoring, and a framework for interpreting your results.
Step 1: Define Your Field
Before scoring, define what you're assessing. "Your field" might be:
- An academic discipline (psychology, economics, computer science)
- A professional practice (medicine, law, education, engineering)
- An industry (finance, technology, pharmaceutical)
- An organization (your company, your department, your research group)
Be specific. "Science" is too broad. "Computational neuroscience at research universities" is appropriately specific. The more precisely you define the boundary, the more useful the assessment.
My field: ___
Step 2: Score Each Dimension
For each dimension, answer the guiding questions, then assign a score from 1-10.
Dimension 1: Dissent Tolerance (Score: ___/10)
Guiding questions:
- Can you name a prominent dissenter in your field who challenged the consensus? What happened to them?
- If you published a paper challenging a core assumption of your field tomorrow, what would happen to your career?
- Does your field have formal mechanisms for structured dissent (red teams, devil's advocates, adversarial reviews)?
- When was the last time a dissenter changed the field's consensus? How long did it take?
Common pitfall: Scoring too high because you personally would tolerate dissent. The question is about the field's structures, not your personal openness.
Dimension 2: Replication Culture (Score: ___/10)
Guiding questions:
- When was the last time a major finding in your field was independently replicated?
- Can replications be published in your field's top journals?
- Are there incentives (career, funding, prestige) for conducting replications?
- Does your field use pre-registration or registered reports?
Common pitfall: Conflating in-principle replicability with actual replication. Many fields claim to be replicable but have never systematically tested it.
Dimension 3: Incentive Alignment (Score: ___/10)
Guiding questions:
- What behavior does the funding structure reward? Does it reward truth-seeking or specific outcomes?
- What gets you promoted in this field? Is it accuracy, novelty, or volume?
- Do conflicts of interest exist? Are they disclosed and managed?
- Can you acknowledge a mistake publicly without career consequences?
Dimension 4: Measurement Validity (Score: ___/10)
Guiding questions:
- Are the things you measure good proxies for the things you care about?
- Could you game the metrics without improving the actual outcome?
- Are there important outcomes in your field that are not measured at all?
- Do the numbers in your field look more precise than the underlying knowledge justifies?
Dimension 5: Outsider Access (Score: ___/10)
Guiding questions:
- Can someone without traditional credentials contribute to your field?
- Are there examples of important insights that came from outside the field?
- How does the field respond to criticism from adjacent disciplines?
- What barriers (credential requirements, jargon, gatekeeping) prevent outsiders from participating?
Dimension 6: Correction Speed (Score: ___/10)
Guiding questions:
- How long does it typically take for a known error to be corrected in practice?
- Are there mechanisms for rapid correction (guidelines, retractions, standards updates)?
- Can you name an error that was corrected quickly? One that took decades?
- What is the typical lag between evidence and practice change?
Dimension 7: History Awareness (Score: ___/10)
Guiding questions:
- Does your field's training include honest discussion of past errors?
- Can you name three things your field was wrong about? How did the correction happen?
- Does the field present its history as smooth, inevitable progress?
- Do current practitioners know about the field's historical wrong turns?
Dimension 8: Claim Falsifiability (Score: ___/10)
Guiding questions:
- Are the field's core claims structured so they could, in principle, be disproven?
- Can proponents specify what evidence would change their minds?
- When predictions fail, does the field revise its theories or add epicycles?
- Are there unfalsifiable assumptions that the field treats as settled?
Dimension 9: Method Diversity (Score: ___/10)
Guiding questions:
- How many different methods does your field use to investigate the same questions?
- Is there a dominant methodology? If so, what blind spots does it create?
- Are alternative methods valued or marginalized?
- Does the field triangulate — check findings using independent methods?
Dimension 10: Process Transparency (Score: ___/10)
Guiding questions:
- Are data, methods, and analysis publicly available?
- Is peer review open or anonymous?
- Are funding sources fully disclosed?
- Can an outsider observe and evaluate the field's decision-making processes?
Step 3: Compile and Interpret
Fill in the profile table and calculate the average. Then answer:
- Which three dimensions scored lowest? These are your field's specific vulnerabilities.
- Is the profile even or uneven? If uneven, the high-scoring dimensions may be masking the vulnerabilities created by the low-scoring ones.
- Compare to the worked examples. Which field (medicine, nutrition, software engineering) does yours most resemble?
- What would the profile predict? Based on the failure mode framework, what types of error would you expect your field to be most susceptible to?
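The compilation step above is simple arithmetic, and a short sketch can make the interpretation mechanical: compute the average, pull out the three lowest dimensions, and check whether the profile is even or uneven. The scores and the spread threshold below are illustrative assumptions, not values from the checklist itself.

```python
# Hypothetical scores for the ten dimensions (1-10 each); replace with your own.
scores = {
    "Dissent Tolerance": 4,
    "Replication Culture": 3,
    "Incentive Alignment": 5,
    "Measurement Validity": 6,
    "Outsider Access": 7,
    "Correction Speed": 4,
    "History Awareness": 5,
    "Claim Falsifiability": 6,
    "Method Diversity": 7,
    "Process Transparency": 5,
}

# Overall epistemic health: the mean across all ten dimensions.
average = sum(scores.values()) / len(scores)

# The three lowest-scoring dimensions are the field's specific vulnerabilities.
lowest_three = sorted(scores, key=scores.get)[:3]

# Evenness check: the gap between the best and worst dimension. A threshold
# of 3 points for "uneven" is an arbitrary assumption for illustration.
spread = max(scores.values()) - min(scores.values())
profile = "uneven" if spread >= 3 else "even"

print(f"Average: {average:.1f}/10")
print(f"Lowest three: {lowest_three}")
print(f"Spread: {spread} ({profile} profile)")
```

With the hypothetical scores above, this prints an average of 5.2, flags Replication Culture, Dissent Tolerance, and Correction Speed as the vulnerabilities, and labels the profile uneven, which is the signal that the stronger dimensions may be masking the weaker ones.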
Analysis Questions
1. After completing the assessment, show it to a colleague in your field and ask them to score independently. Where do your scores agree? Where do they diverge? What does the divergence reveal about subjective vs. objective dimensions?
2. Identify the one dimension where improvement would have the largest impact on your field's overall epistemic health. What structural change would produce that improvement? What resistance would you face?
3. Score your field as it was 20 years ago on the same 10 dimensions. Has the profile improved, worsened, or stayed the same? What events or structural changes explain the trajectory?