Case Study 28-2: Psychological Safety and the Silence Tax — What Organizations Pay When People Don't Speak Up
Category B: Research Study Deep Dive
This case study examines Amy Edmondson's foundational research on psychological safety, its implications for workplace confrontation, and the measurable organizational costs of environments where people don't speak up. We draw on Edmondson's original studies, subsequent replications, and the practical literature on building candid workplace cultures.
Introduction: The Paradox of Silence
In 1996, a Harvard Business School researcher named Amy Edmondson was studying medication errors in hospital nursing teams. She had a hypothesis: better-performing teams would make fewer errors. She developed a measure of team performance and a measure of error rates, collected the data, and was startled by what she found.
The better-performing teams had higher error rates.
Edmondson spent considerable time trying to figure out what had gone wrong with her methodology. It turned out nothing had gone wrong. What she'd stumbled onto was the paradox of organizational silence: in the high-performing teams, nurses were more comfortable reporting errors and near-misses — and so more errors showed up in the data. In the lower-performing teams, nurses were afraid to report, and so the errors that happened never got recorded.
The high-performing teams didn't make more mistakes. They were more honest about the mistakes they made.
This seemingly counterintuitive finding launched one of the most influential research programs in organizational behavior of the past three decades. Edmondson named the underlying variable "psychological safety" — the belief that one can speak up, ask questions, raise concerns, or admit mistakes without fear of punishment or humiliation — and spent the next twenty years documenting its relationship to every form of organizational performance that matters.
The implications for workplace confrontation are direct, empirical, and somewhat uncomfortable: organizations pay an enormous, measurable cost when people don't speak up. The "silence tax" is real.
Part 1: What Edmondson Found
The Original Research
Edmondson's foundational study, published in Administrative Science Quarterly in 1999, examined 51 work teams in a manufacturing company. She measured both psychological safety and team performance. The findings were consistent with the hospital data: psychological safety was a significant predictor of team learning — and team learning was a significant predictor of team performance.
But the mechanism was specific. Psychological safety didn't make teams perform better by making people feel good. It made them perform better by enabling a specific set of behaviors:
- Error reporting and discussion — teams that could talk about mistakes could correct them; teams that couldn't, couldn't
- Speaking up with concerns — teams where people felt safe raising concerns caught problems early; teams where they didn't, caught them late or not at all
- Risk-taking and experimentation — teams where failure was punished didn't try new things; teams where failure was treated as information did
- Asking for help — teams where admitting you didn't know something was safe asked for help and learned; teams where it wasn't, performed below their potential indefinitely
The common thread: all of these behaviors involve some form of vulnerability — expressing something that might be wrong, might be criticized, might reflect poorly. Psychological safety is the organizational condition that makes vulnerability safe.
The Google Replication
In 2012, Google launched an internal research project called Project Aristotle to identify what distinguished high-performing teams from lower-performing ones. They examined two years of performance data across 180 teams and found that team composition — who was on the team, their skills, their backgrounds — was much less predictive of performance than expected.
The most significant predictor of team performance? Psychological safety.
The finding matched Edmondson's: the best-performing teams at Google were not those with the most talented individuals. They were those in which individuals felt safe taking interpersonal risks — including the risk of raising concerns, disagreeing with each other, and saying "I don't know."
Project Aristotle's lead researcher, Julia Rozovsky, wrote: "The safer team members feel with one another, the more likely they are to admit mistakes, to partner and collaborate effectively, and to take on new roles."
This is psychological safety — not as a soft value but as a performance driver.
The Organizational Silence Literature
Edmondson's work sits within a broader literature on organizational silence — the systematic tendency of employees to withhold information about problems, concerns, and mistakes from those who have authority to act on them.
Frances Milliken, Elizabeth Morrison, and Patricia Hewlin (2003) surveyed over 200 employees about organizational silence and found that 85% had withheld at least one work-related concern in the previous year, specifically because they feared the consequences of speaking up. The costs they anticipated: being seen as a troublemaker, damaging relationships, or experiencing retaliation.
Elizabeth Morrison and Frances Milliken (2000) identified the organizational conditions that produce silence: a history of punishment for dissent, centralized decision-making that signals input isn't valued, a climate where the bearer of bad news is blamed for the news, and managers who are threatened rather than grateful when employees raise concerns.
Part 2: The Silence Tax
What does silence actually cost organizations? The research permits specific answers.
The Catastrophic Failure Evidence
Some of the most compelling evidence for the cost of organizational silence comes from high-profile failures where the silence was later documented.
The Challenger space shuttle disaster (1986). Engineers at Morton Thiokol had raised concerns about O-ring performance at low temperatures — the failure mechanism that caused the explosion — before the launch. Their concerns were suppressed through schedule pressure, hierarchical authority, and the implicit message that further concerns were unwelcome. The political calculations that produced silence had catastrophic physical consequences.
The Boeing 737 MAX crashes (2018-2019). Investigations found that engineers and pilots had raised concerns about the MCAS system before the two crashes that killed 346 people. The organizational culture — which prioritized schedule and cost over safety concerns — created conditions where speaking up was not rewarded. Concerns that reached some levels of management were not adequately surfaced to decision-makers.
The Wells Fargo fraudulent accounts scandal (2016). Employees who raised concerns about pressure to open fraudulent accounts were documented to have been terminated or subjected to adverse action. This suppressed accurate information flow to senior leadership and regulators long past the point where the problem could have been addressed at lower cost.
These are extreme cases. But the mechanism they illustrate — a silence-producing culture that prevents accurate information from reaching decision-makers — operates at smaller scale in most organizations, most of the time, producing costs that are diffuse, chronic, and unmeasured.
The Innovation Suppression Evidence
Edmondson and colleagues have documented the relationship between psychological safety and innovation. Teams with higher psychological safety generated more new ideas, were more willing to challenge existing practices, and were more likely to successfully implement innovations. Teams with lower psychological safety showed the classic pattern: individuals had ideas, calculated the risk of raising them, and either kept them to themselves or softened them to irrelevance.
The mechanism: innovation requires proposing something that might be wrong, defending it against skepticism, and being willing to be publicly associated with a failure. In low-psychological-safety environments, the rational calculation is to generate ideas privately and share them only when they've already succeeded elsewhere. By then, the competitive advantage is gone.
The Turnover and Disengagement Evidence
A 2017 Gallup survey found that only 30% of U.S. employees were "engaged" at work. The single strongest predictor of engagement: whether employees felt their opinions were valued and that they could speak up. High-disengagement, low-psychological-safety organizations don't see the cost as a line item. They see it as average performance — output that looks normal because they've never seen their organization at full capacity.
Part 3: The Organizational Bystander Effect
Psychological safety and the individual bystander problem are deeply connected. The bystander dynamic that Latané and Darley documented — diffusion of responsibility, assumption that someone else will act — operates in organizations through the additional mechanism of career risk.
When an employee witnesses misconduct, error, or a significant problem, they face the calculation that every bystander faces: will I be the one who says something? In organizations, the career cost of being the one who acts is real and observable.
Diffusion of responsibility + career risk. Each bystander assumes someone else will act. In organizations, the career cost of acting makes the calculation even more adverse: not only is it someone else's problem, but acting makes it your problem.
Pluralistic ignorance. Nobody speaks up; each individual takes the collective silence as evidence the situation isn't as serious as they privately believe. The silence of others produces false reassurance — even as each individual is producing the same silence for the same reason.
The bad news messenger problem. In many cultures, the person who reports a problem is treated as though they caused it. When the messenger is punished, the message stops arriving. Managers who respond to bad news with anger systematically train their organizations not to bring them bad news — while believing they are well-informed because nothing alarming has been reported.
What Builds Psychological Safety
Edmondson's research identifies specific behaviors:
What builds it:
- Responding to bad news with curiosity: "Tell me more about this" rather than "Why did this happen?"
- Publicly acknowledging your own uncertainty and mistakes
- Explicitly appreciating when people raise uncomfortable concerns
- Following up when concerns are raised, even to say "we considered it and decided not to change"
What destroys it:
- Punishing or marginalizing people who raise concerns
- Responding to errors with blame rather than inquiry
- Being visibly threatened by challenges to your position
- Rewarding people who tell you what you want to hear
The manager's behavior is the primary driver of team-level psychological safety. This is both the bad news (each manager creates their team's climate) and the good news (it's within individual managerial control to change).
Part 4: Confrontation as Psychological Safety Practice
Here is the argument that connects this case study to the chapter directly: individuals who have the skills to raise concerns effectively create psychological safety not just for themselves but for the people around them.
When Sam raises the attribution issue with Elena rather than stewing or going to the boss, he demonstrates to everyone watching that direct, respectful peer confrontation is possible and productive. When Marcus Chen tells Diane he needs more preparation time, he demonstrates to every paralegal that junior employees can advocate for themselves professionally. When Dr. Priya names the near-miss and requests a specific decision from Harmon, she demonstrates that patient safety concerns will be raised even when politically inconvenient.
None of these individuals set out to create psychological safety. They set out to address specific, concrete problems. But the cultural effect — the signal that honest communication is survivable and productive — radiates outward from individual acts of workplace confrontation.
The confrontation skills in this chapter are not only personally valuable. They are organizationally valuable. The employee who can raise concerns clearly, directly, and without making the other party feel attacked is not just solving their individual problem. They are contributing to the kind of organization that catches errors early, surfaces innovations, and retains engaged talent.
The silence tax is paid by everyone in the organization. These skills are, in part, how it gets reduced.
Part 5: When Psychological Safety Is Low
What do you do when you're operating in a low-psychological-safety environment?
Distinguish what you can influence from what you can't. You cannot, as an individual employee, change the psychological safety culture of a large organization. You can influence the climate of your immediate team and your own behavior as a model.
Calibrate your courage to the stakes. Not all concerns warrant the same level of risk. A significant safety concern, a pattern of discrimination, a serious ethical violation — these may warrant speaking up even in environments where speaking up is risky. A preference for a different process — probably not worth the career cost.
Find allies before you speak. In low-psychological-safety environments, concerns raised individually are more vulnerable than concerns with visible support.
Know your legal protections. When concerns involve discrimination, harassment, or safety violations, legal protections exist independent of the organizational culture.
Have a BATNA. If your organization is genuinely not safe to work in — if the psychological cost is severe and unchanging — your alternative is to leave. This is not defeat. It is a reasonable calculation that some environments are not improvable from the inside, and that your wellbeing is worth protecting.
Discussion Questions
- Edmondson's original finding — that higher-performing teams had higher measured error rates — was initially counterintuitive. Explain the mechanism she identified. What does this finding imply about how organizations should respond when error reports go up after an intervention to improve psychological safety?
- The cases of the Challenger and the Boeing 737 MAX both involved engineers whose concerns about safety were suppressed. What specific management behaviors would have changed the outcome? Are these behaviors within reach of individual managers who are themselves operating in cultures that suppress dissent?
- "Pluralistic ignorance" describes each individual taking collective silence as evidence that the situation isn't serious. Have you observed this in an organization, team, or group? What broke the silence — if anything? What enabled the first person to speak?
- The case study argues that individuals with confrontation skills create psychological safety for people around them, not just for themselves. Is this a realistic claim, or an overstatement? Under what conditions is it most likely to be true, and under what conditions would individual confrontation skill make no difference to the broader culture?
- The final section suggests that in low-psychological-safety environments, the appropriate response varies by stake size. How do you make this calculation? What factors determine whether a concern is worth the career risk in a hostile environment — and who has the privilege of making this calculation safely?
Case Study 28-2: Amy Edmondson and the Hierarchy Effect — Why People Don't Speak Up
Overview
Amy Edmondson is a professor at Harvard Business School whose research on organizational behavior has produced some of the most practically important findings in the management sciences. Her core body of work — developed over more than two decades, across industries from healthcare to aviation to technology to manufacturing — addresses a phenomenon that most organizations experience but struggle to understand: why people consistently fail to share information that matters, even when they know it would help.
The central finding, which Edmondson has replicated across hundreds of organizations and thousands of participants: in virtually every organization she has studied, people at lower levels of the hierarchy consistently underestimate how much their information and concerns would be welcomed by those above them — and overestimate the risks of sharing that information. The result is a systematic, predictable, and enormously costly failure of upward information flow.
This case study examines Edmondson's research in detail and applies its findings to the specific challenge of managing-up confrontations.
The Research: How Silence Became a Research Question
The Hospital Study That Started It All
Edmondson's foundational research began not as a study of silence but as a study of medical error. In the late 1990s, she was conducting research in hospital intensive care units, tracking medication error rates across nursing units and comparing them to team dynamics — specifically, to whether the units had what she was calling "psychological safety" (the shared belief that the team is safe for interpersonal risk-taking).
Her hypothesis going in was straightforward: units with better team dynamics and higher psychological safety would make fewer errors. What she found was the opposite. Units with the highest psychological safety reported the most errors.
This was confusing — until Edmondson realized she had been measuring something different than she thought. The units with high psychological safety were not making more errors. They were reporting more errors — because the team environment felt safe enough for people to acknowledge and discuss mistakes rather than hiding or minimizing them. The units with low psychological safety were almost certainly making at least as many errors, but those errors were staying invisible because the team culture made error disclosure feel too risky.
This observation reoriented Edmondson's research agenda. The question was no longer "what makes teams perform better?" It was "why do people in organizations consistently fail to share information that matters — and what can be done about it?"
The Hierarchy Effect
Edmondson's subsequent research, including large-scale studies across hospitals, airlines, the manufacturing sector, and technology firms, documented what she came to call the "hierarchy effect": a consistent, measurable relationship between a person's position in an organizational hierarchy and their willingness to speak up about concerns.
The finding, in brief: the lower your position in the hierarchy relative to the person you would need to tell, the less likely you are to speak up, regardless of the importance of the information.
The Calculus of Silence
Edmondson's research identified the implicit cost-benefit calculus that drives silence in hierarchical organizations. When an employee perceives a problem that would require speaking up to someone above them, they are — usually unconsciously — running an analysis:
Perceived costs of speaking up:
- Being seen as incompetent ("I don't understand the reasoning behind the decision")
- Being seen as disruptive or difficult ("This person is always raising problems")
- Being seen as insubordinate ("Who are they to challenge the decision?")
- Damaging the relationship with the supervisor or leader
- Experiencing retaliation, whether formal or informal
Perceived benefits of speaking up:
- The problem gets addressed
- The organization improves
- The person who spoke up may be recognized for their contribution
In Edmondson's analysis, the costs almost always feel more certain and more immediate than the benefits. The costs are personal, specific, and proximate — they will happen to me, soon. The benefits are diffuse, uncertain, and delayed — maybe the organization improves, maybe someone notices, maybe something changes. This calculus consistently favors silence.
What makes this especially destructive is that the analysis is partly self-fulfilling: in organizations where silence is the norm, the few people who do speak up are often met with the very responses they feared (being seen as difficult, experiencing informal social cost), which confirms the logic of silence for everyone else.
The Hierarchy Effect in Numbers
Across Edmondson's research:
- Nurses consistently underestimate the likelihood that doctors want to hear their concerns
- Pilots in the co-pilot seat are significantly less likely to raise safety concerns with the captain than with another pilot of equal rank — even in simulated emergency scenarios where the stakes are immediately life-threatening
- In a study of manufacturing workers, employees could identify quality problems that managers were unaware of — and chose not to raise them because they did not believe they would be welcomed
- In organizational surveys, when employees are asked "would your manager welcome concerns?", they typically predict significantly lower openness than managers actually report
The consistent theme: people systematically overestimate the risk and underestimate the welcome. The gap between perceived and actual openness is not small. In many organizations, it is the width of an entire cultural divide.
What Leaders Can Do: Creating Conditions for Upward Information Flow
Edmondson's research has moved, in its later iterations, toward intervention: what can leaders and organizations actually do to change the calculus?
Model Fallibility
Leaders who openly acknowledge their own uncertainty, limitations, and mistakes create environments in which others' uncertainty, limitations, and mistakes are normalized. This is not performative humility — it is a genuine signal about what the organizational culture treats as acceptable.
Sam's boss Marcus Webb, who manages by presenting certainty and holding information close, is the opposite model. His behavior communicates that information is power, that uncertainty is weakness, and that the appropriate posture is confident authority. These signals trickle down into the team's behavior: in a culture modeled by Webb, the implicit message is that raising concerns is risky.
Explicitly Invite Input
Leaders who simply want input often assume their openness is visible. It is not. The hierarchy effect is robust: even when a leader is genuinely open to concerns, subordinates will read the silence of non-invitation as an implicit signal not to speak. Explicit invitations — "I want to hear concerns, even if they challenge what I've just said" — are not merely polite. They are essential for counteracting the default calculus.
Edmondson's research shows that explicit invitations significantly increase speaking-up rates, even controlling for other organizational factors. The words matter because the hierarchy effect is partly a matter of perceived permission, and explicit permission changes the perception.
Respond Productively to Concerns When Raised
The most important thing a leader can do to create an environment where concerns are raised is to respond well to concerns when they are raised. If an employee raises a concern and the response is defensiveness, dismissal, or informal retaliation — even if subtle — that response is observed by everyone in the environment and immediately updates the collective calculus toward silence.
Edmondson describes this as the "leader response" variable: in her studies, the single most predictive factor for whether psychological safety is high or low in a team is not how much the leader explicitly says they want input but how they actually respond when they receive it.
This is a direct implication for managing up: the people who successfully manage up once and are received well make the entire organization slightly safer for the next person who tries. The people who are received badly do the opposite.
What Individuals Can Do: Speaking Up More Effectively
Edmondson's research is not only prescriptive for leaders. It also has direct implications for individuals who need to speak up in organizational hierarchies.
Reframe the Calculus Explicitly
The first move for anyone facing the decision of whether to speak up is to examine the calculus they are running and check it against reality. Are the perceived costs as certain as they feel? Are the perceived benefits as uncertain as they seem?
In Edmondson's research, the consistent finding is that people overestimate the risk. This doesn't mean the risk is zero — it means that the subjective perception of risk is reliably inflated. Explicitly checking the calculus ("What is the actual likelihood that this kills my career vs. the actual likelihood that this gets received as useful?") frequently produces a revised assessment that supports speaking up.
Frame as Mission-Critical
Concerns raised in terms of organizational values and mission are more consistently welcomed than concerns raised in personal terms. "I'm raising this because I think it puts the project at risk" lands differently than "I'm raising this because it's affecting me personally" — even when both are true. The first frame positions the speaker as a steward of organizational interests. The second positions them as a person with a grievance.
Sam applied this principle in his conversation with Marcus Webb: he framed the information-withholding concern in terms of its organizational impact (decision quality, operational effectiveness, hiring recommendations) rather than in terms of his own frustration or experience of disrespect.
Use Tentative Language Strategically
Edmondson identifies a specific language strategy that reliably reduces the perceived threat of speaking up to authority: the use of what she calls "tentative language" that signals intellectual humility rather than confrontation.
Instead of: "This approach is wrong and here's why."
Try: "I could be missing something, but I'm worried about [specific concern]. Can you help me understand the reasoning?"
This framing accomplishes two things simultaneously: it raises the concern (which is the point) and it signals respect for the leader's judgment and position. It is not dishonest — it is a genuine acknowledgment that the speaker doesn't have the full picture. And it is dramatically more likely to result in the concern being heard.
The "yes, and" approach from Chapter 28 is a specific application of this strategy.
Pick the Right Moment
Edmondson's research shows that the timing and setting of speaking up matters significantly. Concerns raised in public (in meetings, in front of a group) activate defensiveness in ways that private conversations do not. Concerns raised when the leader is under immediate pressure are less likely to be received than concerns raised in a calmer moment. Concerns raised as an ambush — without any advance signal that a difficult topic is coming — generate more defensiveness than concerns raised with a brief advance notice ("I've been thinking about something and I'd like to find a time to talk about it this week").
Sam requested time with Webb in advance ("I'd like 20–30 minutes to discuss something operational with you, this week if possible") rather than raising the concern in a chance hallway encounter or in an existing meeting. This gave Webb the time to mentally prepare — to arrive at the conversation without the specific defensiveness of ambush.
Application to Managing-Up Confrontations
Edmondson's research has direct application to the specific challenge of managing-up confrontations in organizational settings. The hierarchy effect is not an immutable law — it is a pattern that emerges from specific organizational conditions and individual calculus errors, and that can be significantly modified by the right preparation and framing.
The key applications:
Don't let the hierarchy effect tell you what is possible. The consistent finding is that people overestimate the risk of speaking up. Before concluding that raising a concern with your supervisor is too risky, explicitly check your risk assessment against available evidence. How has this supervisor responded to concerns in the past? What is the organizational culture around raising issues? What would the actual consequences of a well-prepared, professionally framed concern realistically be?
Use the solution-presenting frame as a calculus correction. The solution-presenting frame directly addresses the most common perceived costs of speaking up: being seen as a complainer, being seen as not understanding the reasoning, being seen as insubordinate. A speaker who arrives with preliminary thinking about solutions has already demonstrated competence and organizational commitment, which counteracts the most threatening perceived costs.
Recognize the organizational cost of your silence. Edmondson's research makes clear that individuals' silence is not cost-free for organizations. When the nursing staff don't report medication errors, patients are harmed. When co-pilots don't challenge captains' incorrect approaches, planes crash. When operations managers don't raise information flow problems to their VPs, organizational decisions get made with worse information. Your silence has a cost, and it is not yours alone to bear.
Create conditions in your own team. For managers — even those who are managing up in some relationships — the research has direct implications for managing down. The leaders in Edmondson's studies who modeled fallibility, explicitly invited input, and responded productively to concerns when raised consistently produced teams that performed better, caught more errors, and surfaced more concerns before those concerns became crises. This is not just good ethics. It is good operations.
Key Research Findings Summary
| Finding | Practical Implication |
|---|---|
| People consistently overestimate the risk of speaking up | Check your calculus explicitly before deciding not to raise a concern |
| The perceived costs of speaking up feel more certain and immediate than the benefits | Reframe: organizational costs of silence are also real, certain, and often more serious |
| The hierarchy effect is consistent across industries and cultures | It is a default pattern, not an immutable law — it can be modified |
| Explicit invitations to speak up significantly increase speaking-up rates | Leaders: don't assume your openness is visible; say it out loud |
| Leader response to concerns is the strongest predictor of team psychological safety | Managers: the way you receive concerns when they are raised matters more than anything you say about wanting them |
| Tentative language reduces the perceived threat of upward challenges | Frame concerns as questions and partial perspectives, not as certainties |
| Timing and setting affect reception | Choose private, calm, pre-announced conversations over ambushes |
Discussion Questions
- Edmondson's research found that units with high psychological safety reported more errors — not fewer. Why did this finding initially seem paradoxical, and what does the correct interpretation reveal about how organizations should think about error disclosure?
- The "hierarchy effect" calculus consistently overestimates costs and underestimates benefits. Why do you think this bias is so robust? Is there an evolutionary or developmental explanation for why we might be wired to overweight the costs of challenging authority?
- Edmondson identifies the leader's response to concerns as the strongest predictor of psychological safety. What specific behaviors in a leader's response to a concern would increase future speaking-up? What specific behaviors would decrease it?
- The research distinguishes between leaders who say they want input and leaders who actually respond well when they receive it. Can you think of a leader (in your experience or in public life) whose stated openness and actual receptiveness diverged significantly? What was the effect on the organization or group?
- Sam applied several of Edmondson's principles — the solution-presenting frame, advance notice, private setting, organizational framing — in his conversation with Webb. Which of these do you think was most important to the conversation's success? Why?
- Edmondson's research suggests that individual silence in organizations is not cost-free — it has organizational consequences. Does this change how you think about the ethics of choosing not to raise a concern at work? What responsibilities do individuals have to speak up, and where are the limits of that responsibility?