Further Reading: Field Autopsy — Psychology
Tier 1: Verified Sources
Ritchie, Stuart. Science Fictions: How Fraud, Bias, Negligence, and Hype Undermine the Search for Truth. Metropolitan Books, 2020. The most accessible overview of the replication crisis across science, with extensive coverage of psychology. Ritchie documents fraud, bias, negligence, and hype as structural features of modern science, not aberrations.
Open Science Collaboration. "Estimating the Reproducibility of Psychological Science." Science, 2015. The Reproducibility Project's landmark paper. The study attempted to replicate 100 published psychology findings and found that only 36% of the replications produced statistically significant results, with replication effect sizes averaging about half those originally reported. The paper that quantified the crisis.
Simmons, Joseph P., Leif D. Nelson, and Uri Simonsohn. "False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant." Psychological Science, 2011. The paper that demonstrated how researcher degrees of freedom could produce false-positive results. Using real data, the authors showed that standard analytical flexibility could "prove" that listening to a Beatles song makes you younger. A devastating and entertaining demonstration.
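The mechanism Simmons et al. describe can be seen in a few lines of simulation. The sketch below is illustrative only, not the authors' design: it assumes a hypothetical study where the null hypothesis is true, then grants the "flexible" researcher just two degrees of freedom (a second outcome measure and optional stopping) and compares false-positive rates against a disciplined single test. All sample sizes and the |t| > 2.0 significance cutoff are assumptions chosen for simplicity.

```python
import math
import random
import statistics

def is_significant(group_a, group_b, threshold=2.0):
    """Welch's t statistic; |t| > ~2.0 approximates p < .05 at these sample sizes."""
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    se = math.sqrt(va / len(group_a) + vb / len(group_b))
    t = (statistics.mean(group_a) - statistics.mean(group_b)) / se
    return abs(t) > threshold

def one_study(rng, flexible):
    """One two-group study in which the null hypothesis is TRUE:
    every measurement is drawn from the same N(0, 1) distribution."""
    draw = lambda n: [rng.gauss(0, 1) for _ in range(n)]
    if not flexible:
        # Disciplined researcher: one outcome, fixed n = 30 per group, one test.
        return is_significant(draw(30), draw(30))
    # Flexible researcher: two outcome measures, n = 20 per group to start.
    # (Real dependent variables are correlated; independence is a simplification.)
    a1, b1 = draw(20), draw(20)   # outcome 1
    a2, b2 = draw(20), draw(20)   # outcome 2
    if is_significant(a1, b1) or is_significant(a2, b2):
        return True               # report whichever outcome "worked"
    # Not significant yet: collect 10 more per group and test everything again.
    a1 += draw(10); b1 += draw(10); a2 += draw(10); b2 += draw(10)
    return is_significant(a1, b1) or is_significant(a2, b2)

rng = random.Random(42)
sims = 4000
simple_rate = sum(one_study(rng, flexible=False) for _ in range(sims)) / sims
flexible_rate = sum(one_study(rng, flexible=True) for _ in range(sims)) / sims
print(f"nominal alpha ~ .05 | disciplined: {simple_rate:.3f} | flexible: {flexible_rate:.3f}")
```

Even this modest flexibility roughly doubles or triples the false-positive rate relative to the nominal 5%, which is the paper's core point: each undisclosed choice looks innocent, but together they let noise pass as discovery.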
Henrich, Joseph, Steven J. Heine, and Ara Norenzayan. "The Weirdest People in the World?" Behavioral and Brain Sciences, 2010. The paper that defined the WEIRD problem: most psychology research is conducted on a narrow population, Western, Educated, Industrialized, Rich, and Democratic, that may not represent human psychology in general.
Nosek, Brian A., et al. "Promoting an Open Research Culture." Science, 2015. The Transparency and Openness Promotion (TOP) guidelines — the institutional framework for Open Science in psychology. A blueprint for other fields.
Kahneman, Daniel. "A New Etiquette for Replication." Social Psychology, 2014. Kahneman's thoughtful proposal for how the replication movement and original researchers should interact — written by someone who initially supported social priming and then recognized the field's problems.
Tier 2: Attributed Claims
The Many Labs replication projects (Many Labs 1, 2, 3) have been published in multiple journals and provide the most systematic evidence about which psychology effects replicate and which don't. The general finding: some classic effects replicate robustly across labs and populations; others fail completely; many show smaller effects than originally reported.
Research on the adoption of Open Science practices is ongoing. Pre-registration rates have increased significantly since 2013, but estimates of what proportion of published studies are pre-registered vary by field and journal.
The debate about overcorrection in post-crisis psychology is active. Researchers including Paul Rozin, Sanjay Srivastava, and others have written thoughtfully about the tension between methodological reform and the risk of constraining scientific creativity.
Recommended Reading Sequence
- Start with Simmons et al. (2011) — short, devastating, accessible demonstration of the problem
- Then Open Science Collaboration (2015) — the empirical evidence for the crisis
- Then Ritchie (Science Fictions) — the full narrative across science
- Then Henrich et al. (2010) — the WEIRD problem that compounds the crisis
- Then Nosek et al. (2015) — the reform framework