Case Study: Red Teams That Failed — When Institutional Design Meets Institutional Culture

The Promise

Red teams are among the most intuitively appealing tools for reducing institutional error. The concept is simple: designate a group whose job is to find flaws before they become failures. Give them permission to disagree. Institutionalize dissent.

The U.S. military has the most developed red team culture of any institution. The Army's University of Foreign Military and Cultural Studies (UFMCS) at Fort Leavenworth, established in 2004, trains officers specifically in red team thinking — challenging assumptions, questioning plans, and arguing the enemy's perspective.

Yet Chapter 28 documented that the military continues to repeat errors despite this investment. The counterinsurgency amnesia cycle, the body count problem, and the doctrinal lock-in that led to the Iraq insurgency all occurred in an institution with extensive red team capability.

What went wrong?

The Structural Failures

Failure 1: Authority Override

Red teams make recommendations. Commanders make decisions. When a red team's findings conflict with the commander's plan — or with the institutional consensus — the commander can override them. And frequently does.

This is the fundamental limitation of advisory dissent: it is structurally subordinate to the authority it advises. A red team that tells a four-star general his plan has a critical flaw is giving advice to someone who has the power, the institutional support, and the career incentive to ignore it.

The pre-Iraq intelligence assessments included dissenting views from multiple agencies. The intelligence community's process did surface alternative analyses. But the decision-makers selected the analysis that supported their preferred policy. The red team function worked — the dissenting views were generated and documented. The institutional response failed — the dissenting views were not acted upon.

Failure 2: Performative Compliance

Over time, red teams can become a compliance exercise — a box to check rather than a genuine effort to think differently. When red teaming is required by policy but not valued by leadership, the exercise degenerates:

  • Red teams are staffed with officers who aren't needed elsewhere (the weakest, not the strongest)
  • Red team exercises are scheduled at the end of the planning process, when changes are expensive and unwelcome
  • Red team findings are documented but not discussed — they satisfy the requirement without influencing the decision
  • Red team members learn that producing uncomfortable findings is career-limiting, and self-censor accordingly

This is the normalization of deviance (Chapter 19) applied to the correction mechanism itself. The red team process gradually drifts from genuine challenge to institutional theater — and no single drift is dramatic enough to trigger alarm.

Failure 3: Cultural Incompatibility

Red teaming requires an institutional culture that values learning it is wrong — that treats finding a flaw before deployment as a success rather than an embarrassment. Military culture, despite its investment in red teams, also values decisiveness, confidence, and commitment to the plan. These values are in tension with red team culture, which values doubt, alternative thinking, and willingness to say "this plan might not work."

The cultural incompatibility is not unique to the military. Corporate red teams face the same tension: the company values confidence in its strategy, and a red team that undermines that confidence is culturally threatening even when it's analytically correct.

The Lesson: Tools Are Necessary but Not Sufficient

The red team case illustrates the central lesson of this chapter: institutional tools for error reduction work only when the institutional culture supports them.

This maps directly onto the Epistemic Health Checklist (Chapter 32). Red teams improve Dimension 1 (Dissent Tolerance) — but only if the institution already scores at least moderately on that dimension. In an institution with very low dissent tolerance, a red team will be suppressed, co-opted, or rendered performative. The tool cannot create the conditions for its own effectiveness.

This creates a paradox: the institutions that most need red teams are the institutions least capable of using them effectively. An institution that already values dissent doesn't need a formal red team structure — it has organic dissent. An institution that doesn't value dissent will neutralize the red team through structural mechanisms (authority override, performative compliance, cultural incompatibility).

When Red Teams Work

Red teams do work under specific conditions:

  1. Leadership genuinely wants to be challenged. This is rare but real. Some leaders actively seek out contrary views because they understand that unchallenged plans are fragile plans.

  2. Red team findings have structural consequences. If the red team can trigger a formal review, delay a deployment, or require a documented response from leadership, it has institutional teeth. Without consequences, recommendations are suggestions that can be ignored.

  3. Red team members are protected. If red team duty is career-enhancing (or at least career-neutral) rather than career-limiting, the best people will volunteer and will produce genuine challenges.

  4. The red team is external. External red teams — consultants, academic reviewers, cross-agency teams — face less pressure to conform than internal teams do. Structural independence enables genuine challenge.

Analysis Questions

1. The chapter identifies a paradox: the institutions that most need red teams are least capable of using them. Is this paradox resolvable? Design an institutional structure in which a red team could be effective even in a low-dissent-tolerance organization.

2. Compare the military's red team experience with psychology's registered reports. Both are institutional tools for error reduction. Why have registered reports been more effective? What structural features make the registered report model more robust against institutional neutralization?

3. Apply the red team concept to a non-military, non-corporate setting — an academic department, a hospital, a school district. What would a red team look like in this context? What specific failure modes would it address? What cultural barriers would it face?