Case Study 2: Trust and Complexity Conservation -- The Currencies You Cannot Print

"Trust is like an eraser -- it gets smaller after every mistake." -- Anonymous


Two Invisible Currencies

This case study examines two of the most consequential conserved quantities in human systems: trust and complexity. Neither appears on any balance sheet. Neither is measured in standard units. Neither is tracked by any formal accounting system. Yet both behave, in their respective domains, like quantities that are conserved -- and both produce devastating consequences when leaders pretend they can be created from nothing or destroyed into nothing.

The case study traces each through a detailed real-world example, then examines the structural parallel between them.


Part I: Conservation of Trust -- The Wells Fargo Account Scandal

The Setup

For decades, Wells Fargo was considered one of the most trustworthy banks in the United States. It had survived the 2008 financial crisis in better shape than most of its peers. It was known for conservative lending practices, a community-banking ethos, and the folksy image cultivated by its stagecoach logo and its reputation as "the bank that puts customers first." Warren Buffett, perhaps the most famous investor in the world, held Wells Fargo as one of his largest positions and praised its management and culture.

Behind this image, something very different was happening.

Beginning in at least 2002 and continuing until 2016, Wells Fargo employees opened millions of unauthorized bank and credit card accounts in existing customers' names. They transferred customers' money into new accounts that the customers had not requested, signed customers up for online banking services they did not know about, and enrolled customers in bill-pay programs without their knowledge. The motivation was a high-pressure sales culture that demanded employees meet aggressive cross-selling targets -- the number of products sold per customer. Employees who failed to meet targets faced termination. Employees who met targets earned bonuses.

Over this period, Wells Fargo employees created approximately 3.5 million unauthorized accounts. Customers were charged fees on accounts they did not know existed. Their credit scores were damaged by inquiries they had not authorized. Their financial lives were disrupted by a bank they trusted to protect them.

The Trust Transfer

When the scandal became public in 2016, Wells Fargo's response demonstrated the conservation of trust with painful clarity.

Phase 1: Trust destruction. The revelation destroyed trust between Wells Fargo and its customers. Customers who had trusted the bank to act in their interest discovered that the bank had been acting against their interest -- not through a single error, but through a systematic, years-long program of deception. The trust that had taken decades to build was destroyed in weeks.

Phase 2: Trust transfer to suspicion. The destroyed trust did not vanish. It was converted into suspicion -- not just of Wells Fargo, but of the banking industry as a whole. Surveys conducted after the scandal found that consumer trust in banks declined across the board, not just for Wells Fargo. The betrayal by one institution was transferred to the entire category. Customers who had never had an account at Wells Fargo became more suspicious of their own banks, more vigilant about checking their account statements, more skeptical of banks' claims to act in their interest.

This is conservation of trust in action. The trust that Wells Fargo destroyed was not lost to the universe. It was transformed into a generalized suspicion that was distributed across the banking system. The total amount of "trust-plus-suspicion" in the system was conserved; what changed was the ratio.

Phase 3: Trust redistribution. As trust in Wells Fargo declined, some of it was redistributed to competitors. Credit unions and community banks reported increased deposits and new account openings in the months after the scandal. The trust was not created from nothing -- it was transferred from Wells Fargo to institutions perceived as more trustworthy. The conservation law operated across the competitive landscape: trust lost by one institution became trust gained by others.
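The first three phases can be made concrete with a toy model. This is an illustrative sketch only -- the institutions, numbers, and the "redistributed share" parameter are invented for the example, not drawn from any survey data. It treats trust-plus-suspicion as a fixed pool: a betrayal converts an institution's trust partly into generalized suspicion (Phases 1 and 2) and partly into trust gained by competitors (Phase 3).

```python
# Toy model of trust conservation across a banking ecosystem.
# All names, amounts, and parameters are illustrative, not empirical.

TOTAL = 100.0  # conserved quantity: trust + suspicion, in arbitrary units

# Trust held by each institution; the remainder is ambient suspicion.
trust = {"Wells Fargo": 40.0, "Community Bank": 20.0, "Credit Union": 15.0}
suspicion = TOTAL - sum(trust.values())

def betrayal(victim, amount, redistributed_share=0.3):
    """Phases 1-3: the betraying institution loses trust; a fraction
    transfers to competitors, and the rest becomes suspicion."""
    global suspicion
    lost = min(amount, trust[victim])
    trust[victim] -= lost
    others = [name for name in trust if name != victim]
    gain = lost * redistributed_share / len(others)
    for other in others:
        trust[other] += gain                    # Phase 3: redistribution
    suspicion += lost * (1 - redistributed_share)  # Phase 2: transformation

betrayal("Wells Fargo", 30.0)
# Conservation check: the trust-plus-suspicion total is unchanged;
# only the ratio and the distribution across nodes have moved.
assert abs(sum(trust.values()) + suspicion - TOTAL) < 1e-9
```

The point of the sketch is the invariant in the last line: no matter how the parameters are set, nothing leaves the system. The betrayal only moves the balance between trust and suspicion and between nodes.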

Phase 4: Trust bankruptcy. Wells Fargo entered a state of trust bankruptcy. Its public statements were met with skepticism regardless of their content. Its reform efforts were assumed to be cosmetic. Its new CEO was given less benefit of the doubt than a CEO at any other major bank would have received. The bank had exhausted its trust capital, and every subsequent interaction operated under a cloud of suspicion that increased the cost and reduced the effectiveness of everything the bank tried to do.

The bank was fined billions of dollars. Its CEO was forced to resign. The Federal Reserve imposed unprecedented restrictions on the bank's growth. But the most lasting consequence was the trust damage -- damage that no fine, no leadership change, and no reform program could quickly undo, because trust cannot be manufactured. It can only be built slowly through consistent trustworthy behavior, and rebuilding begins from a deficit that the conservation law says must be worked off before the balance turns positive again.

The Conservation Logic

The Wells Fargo case illustrates three features of trust conservation.

First, trust creation is slow but trust destruction is fast. This asymmetry is a feature of the conservation dynamics, not an aberration. Building trust requires many consistent interactions that each add a small increment. Destroying trust requires only a single betrayal that removes the entire accumulated balance. The asymmetry is analogous to the physical asymmetry between building a structure (which requires careful, sustained effort) and destroying one (which requires only sufficient force applied once).

Second, destroyed trust does not disappear but transforms. Trust that is destroyed becomes suspicion, cynicism, and vigilance -- active negative states that affect behavior. A customer who has been betrayed by a bank does not return to a neutral state of no opinion. They enter a negative state of active distrust that affects their interactions with all banks. The conserved quantity has changed sign, not magnitude.

Third, trust transfers across boundaries. When one institution betrays trust, the trust damage spreads to similar institutions. This network effect means that trust conservation operates not just within a single relationship but across an entire ecosystem of relationships. The total trust-plus-suspicion in the system is conserved, but a betrayal in one node redistributes the balance across all nodes.


Part II: Conservation of Complexity -- The Healthcare.gov Launch

The Setup

In 2013, the United States government launched Healthcare.gov, the online marketplace where Americans could enroll in health insurance under the Affordable Care Act. The launch was one of the most visible technology failures in American history. On its first day, only six people successfully enrolled. The site crashed repeatedly. Users encountered error messages, infinite loading screens, and data loss. Over the following weeks, millions of Americans who were legally required to obtain health insurance could not access the system designed to provide it.

What went wrong? The standard narrative focuses on technical failures: bad code, insufficient testing, poor project management. But from a conservation-of-complexity perspective, the failure illustrates something deeper: the illusion that complexity can be eliminated through project management rather than merely managed.

The Complexity That Could Not Be Removed

The Affordable Care Act required Healthcare.gov to perform an extraordinarily complex task. The system had to verify users' identities, determine their eligibility for subsidies based on income, connect to databases at the IRS and the Social Security Administration, present insurance plans from dozens of different insurers in each state, calculate premiums based on age, location, family size, and tobacco use, and process enrollment -- all in real time, for millions of simultaneous users.

This complexity was not a design choice. It was a consequence of the underlying policy. The Affordable Care Act was a complex law because it attempted to reform a complex system (American healthcare) while preserving most of the existing system's structure (employer-based insurance, state-level regulation, multiple competing private insurers). Every feature of the law that preserved compatibility with the existing system added complexity to the technology that implemented it.

The leaders overseeing the project did what leaders frequently do with complexity: they demanded simplicity. They wanted a user experience as simple as shopping on Amazon. They wanted clean screens, minimal clicks, and an intuitive flow. They did not want to see the complexity. And they got what they wanted -- on the surface.

The user interface was indeed simple. But the complexity had not been eliminated. It had been pushed downward -- into the backend systems, into the integrations with federal databases, into the business logic that determined eligibility and pricing. The complexity that was hidden from the user had to be handled by the technology, and the technology was not ready to handle it.

Tesler's Law in Action

The Healthcare.gov failure is a textbook demonstration of Tesler's Law. The application had an inherent amount of irreducible complexity determined by the nature of the problem it was solving. That complexity could be distributed between the user (who could be asked to navigate complex choices and provide detailed information) and the system (which could be asked to automate the complex choices and gather the information from other sources). The decision to make the user experience simple was, simultaneously, a decision to make the backend enormously complex. Conservation of complexity demanded it.

The failure occurred because the decision-makers treated the user-experience simplicity as evidence that the overall problem was simple. They set timelines and budgets appropriate for a simple project. They staffed the project as if the simplicity of the interface reflected the simplicity of the system. They were, in effect, treating the conservation of complexity as if it were a law that could be repealed by executive decision.

It could not. The complexity was conserved. It lived in the backend, in the integrations, in the edge cases, in the concurrency challenges of millions of simultaneous users. When the system launched, the hidden complexity asserted itself with devastating effect.
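The allocation choice at the heart of Tesler's Law can be sketched in a few lines of code. The rules and numbers below are simplified placeholders, not the real ACA subsidy formula, and the record fields stand in for the IRS and SSA lookups; the point is only that the same eligibility logic must run either way -- what moves is who assembles its inputs.

```python
# Illustrative sketch of Tesler's Law: the same irreducible logic can sit
# with the user or with the system, but it must sit somewhere.
# The formula and fields are simplified placeholders, not the real ACA rules.

def subsidy_user_facing(answers: dict) -> float:
    """Allocation A: the user shoulders the complexity by filling out a
    long form (income, household size, and so on)."""
    income = answers["household_income"]
    size = answers["household_size"]
    # Placeholder rule: subsidy phases out as income rises relative to
    # a per-person threshold.
    threshold = 13_000 * size
    return max(0.0, (4 * threshold - income) * 0.1)

def subsidy_system_facing(user_id: str, records: dict) -> float:
    """Allocation B: the system shoulders the complexity by assembling the
    same inputs from back-end records (standing in for IRS/SSA lookups)."""
    record = records[user_id]  # in the real system: remote calls, retries,
                               # identity proofing, stale-data reconciliation
    answers = {
        "household_income": record["agi"],
        "household_size": record["dependents"] + 1,
    }
    return subsidy_user_facing(answers)
```

Both allocations produce the same subsidy for the same household. Allocation B gives the user the Amazon-like experience the decision-makers demanded, but every line of the form the user no longer fills out becomes an integration, a retry policy, and an edge case the backend must now own.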

The Reconstruction

The rescue of Healthcare.gov, led by a team of Silicon Valley technologists brought in after the initial failure, is equally instructive from a conservation perspective. The rescue team did not eliminate the complexity. They redistributed it. They broke the system into smaller components, each of which handled a manageable piece of the total complexity. They added caching layers that reduced the real-time demand on backend systems. They introduced queuing mechanisms that spread the load across time rather than handling it all simultaneously.

None of these techniques reduced the total complexity. They redistributed it across time (queuing), across components (microservices), and across layers (caching). The conservation law held throughout. What changed was not the amount of complexity but the skill with which it was managed -- the recognition that complexity must be allocated deliberately rather than wished away.
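The queuing technique in particular can be sketched in miniature. This is a minimal, assumed model of the idea rather than Healthcare.gov's actual implementation: instead of processing every request the instant it arrives, the system admits work at a fixed rate and lets the rest wait. The total work is unchanged; it is spread across time.

```python
# Minimal sketch of load-spreading via a queue: the total work (complexity)
# is conserved, but it is redistributed across time instead of hitting the
# backend all at once.
from collections import deque

def process_with_queue(arrivals, capacity_per_tick):
    """arrivals[t] = number of requests arriving at tick t.
    Returns (completed_per_tick, max_queue_depth)."""
    queue = deque()
    completed = []
    max_depth = 0
    t = 0
    while t < len(arrivals) or queue:
        if t < len(arrivals):
            queue.extend(range(arrivals[t]))  # enqueue this tick's arrivals
        done = 0
        while queue and done < capacity_per_tick:
            queue.popleft()                   # serve up to capacity per tick
            done += 1
        completed.append(done)
        max_depth = max(max_depth, len(queue))
        t += 1
    return completed, max_depth

# A burst of 1,000 requests against a backend that handles 100 per tick:
# all 1,000 are served, but over 10 ticks instead of one.
completed, depth = process_with_queue([1000], capacity_per_tick=100)
```

The backend never sees more than its capacity in any tick; the cost of that protection is the queue depth and the waiting time, which is exactly where the redistributed complexity now lives.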


Part III: The Structural Parallel

Trust and complexity are different quantities. One is social, the other is technical. One lives in relationships, the other in systems. But they share a conservation structure that is worth making explicit.

What Can Be Done with Each

Created from nothing?
  Trust: No -- trust requires sustained trustworthy behavior.
  Complexity: No -- irreducible complexity is determined by the nature of the problem.

Destroyed into nothing?
  Trust: No -- destroyed trust becomes suspicion.
  Complexity: No -- removed complexity reappears as bugs, edge cases, or failures.

Transferred?
  Trust: Yes -- between relationships, between institutions, between categories.
  Complexity: Yes -- between layers, between components, between users and developers.

Transformed?
  Trust: Yes -- trust transforms into suspicion and back.
  Complexity: Yes -- complexity transforms between types (user complexity, code complexity, operational complexity).

Measured precisely?
  Trust: No -- trust has no standard unit.
  Complexity: No -- complexity has no universal metric.

Managed skillfully?
  Trust: Yes -- trust can be allocated and invested wisely.
  Complexity: Yes -- complexity can be allocated to where it is handled most effectively.

The Common Pattern

In both cases, leaders who fail to recognize the conservation law make the same error: they confuse the removal of the visible quantity with the elimination of the quantity itself. A CEO who suppresses negative feedback does not eliminate the organizational distrust that generated it. They move it underground, where it festers and grows. A project manager who demands a simple interface does not eliminate the underlying complexity. They move it to the backend, where it accumulates until the system cannot handle it.

In both cases, the conservation law provides the same diagnostic tool: when something seems to have disappeared (trust seems high when it should be low, a system seems simple when the problem is complex), look for where the conserved quantity has been moved. It is there. It is always there. The question is not whether it exists but where it is hiding.

The Common Consequence of Ignoring Conservation

In both cases, ignoring conservation produces the same outcome: a reckoning. Wells Fargo's trust debt came due when the scandal became public. Healthcare.gov's complexity debt came due on launch day. In both cases, the reckoning was more severe than it would have been if the conservation law had been respected from the beginning -- because the hidden accumulation had compounded over time, just as financial debt compounds (Ch. 30).

The common lesson is this: conservation laws in human systems are patient. They do not punish violations immediately. The CEO who ignores trust conservation may enjoy years of apparent success. The project manager who ignores complexity conservation may hit milestones and earn praise right up until launch day. The violation produces no immediate feedback, which encourages the violator to believe they have found an exception to the rule. But the violation is accumulating. The conserved quantity is building up somewhere, out of view, and when it finally becomes visible, the cost of the accumulated violation is far greater than the cost of respecting the conservation law would have been.


Part IV: Building Conservation-Aware Institutions

The cases suggest practical principles for institutions that want to respect the conservation of trust and complexity.

For Trust

  1. Treat trust as a balance sheet item. Even though trust cannot be precisely measured, it can be tracked qualitatively. Every interaction with customers, employees, or the public either deposits into or withdraws from the trust account. Know which you are doing.

  2. Invest in trust before you need it. Trust capital, like financial capital, is most valuable in a crisis. An institution that has built deep trust reserves can survive a mistake. An institution that has depleted its trust reserves cannot. The time to invest in trust is before the crisis, not during it.

  3. Trace trust transfers. When trust is lost in one part of the organization, ask where it has gone. Has it become employee cynicism? Customer suspicion? Regulatory scrutiny? Following the transfer reveals the full cost of the breach.

  4. Accept the asymmetry. Trust builds slowly and breaks quickly. This is not unfair. It is the conservation dynamic. Design your institution's behavior around this asymmetry: be patient in building trust, be vigilant in protecting it.

For Complexity

  1. Budget complexity as you budget money. Every project has a complexity budget determined by the nature of the problem. Allocate it deliberately: who handles which complexity? Users? Developers? Operations? Support? The allocation is a design decision with consequences.

  2. Do not confuse hiding complexity with eliminating it. A simple interface is not a simple system. When evaluating a project, look at where the complexity lives, not just whether the surface appears clean.

  3. Trace complexity transfers. When a redesign simplifies one component, ask where the complexity moved. To another component? To the users? To the documentation? To the support team? The complexity went somewhere. Find it before it finds you.

  4. Respect irreducible complexity. Some problems are inherently complex. No amount of clever design will make them simple. The goal is not simplicity but appropriate complexity -- complexity allocated to the places where it can be handled most effectively.


Questions for Reflection

  1. Identify an institution in your experience that has entered trust bankruptcy. Trace the conservation dynamics: where did the destroyed trust go? How did it transform? How did it affect adjacent institutions?

  2. Consider a software product or service you use regularly. Apply Tesler's Law: identify the complexity that has been hidden from you as a user. Where does it live? Who manages it? What happens when that management fails?

  3. The chapter argues that trust conservation operates across a network -- betrayal by one institution reduces trust in similar institutions. Can you identify a case where this network effect worked in the positive direction -- where trustworthy behavior by one institution increased trust in the whole category?

  4. Healthcare.gov's failure is often attributed to poor project management. How does the conservation-of-complexity analysis change or deepen that diagnosis? What would the project managers have done differently if they had understood Tesler's Law?

  5. Both trust and complexity are hard to measure precisely. Does this make the conservation analysis less useful, or does the conservation framework actually provide a way to reason about these quantities that compensates for the difficulty of measuring them?