Case Study 1: The Presentation That Changed a Policy — Data Storytelling in Government
Tier 2 — Attributed Narrative: This case study is a fictionalized composite inspired by documented real-world examples of data-driven policy change at the local government level. The specific characters, county, and exact figures are constructed for pedagogical purposes, but the dynamics — how data presentations succeed or fail in government settings — are based on widely reported patterns in public policy communication. The CDC vaccination data patterns referenced are consistent with published trends.
The Analyst and the Audience
Maria Gutierrez had been a data analyst at the Riverside County Health Department for three years. In that time, she had produced dozens of reports — meticulous documents full of tables, statistical tests, and carefully worded methodology sections. She was proud of her rigor. Her reports were technically impeccable.
And nobody read them.
Well, that is not entirely fair. Her supervisor read them. The department's biostatistician read them. But the people who actually made funding decisions — the county commissioners — had never, as far as Maria could tell, opened a single one. Her reports went into a shared drive folder that, judging by the access logs, received roughly the same amount of traffic as the department's archived fax instructions from 1997.
Then, in the spring of 2023, something changed. The county was facing a decision: whether to continue funding the mobile vaccination clinic program or cut it to close a budget gap. The program cost $1.8 million per year. The budget shortfall was $3.2 million. The mobile clinics were on the chopping block.
Maria's supervisor asked her to prepare "some data" for the commissioners' budget hearing. The usual request. The usual expectation: a report that would be technically sound, bureaucratically appropriate, and thoroughly ignored.
This time, Maria decided to try something different.
The Old Approach (What Maria Would Have Done Before)
Maria's typical report would have looked something like this:
Title: "Analysis of Mobile Vaccination Clinic Program Effectiveness: A Retrospective Evaluation of County-Level Immunization Data, 2018-2023"
Structure: 28 pages. Abstract. Literature review. Methodology section describing the quasi-experimental design. Regression tables with standard errors and confidence intervals. Twelve charts, most without annotations. A conclusion section that used phrases like "the results suggest a statistically significant positive association" and "further research is warranted."
The report was accurate. It was thorough. It was also nearly unreadable for anyone who did not hold a degree in biostatistics.
Maria had watched commissioners receive reports like this before. They would glance at the cover page, flip to the executive summary (if there was one), skim for a number that confirmed or contradicted their prior beliefs, and set it aside. The 28 pages of careful analysis might as well have been written in invisible ink.
The New Approach: Thinking Like a Storyteller
This time, Maria started not with her data, but with five questions:
- Who is my audience? Seven county commissioners. Two had science backgrounds. Five did not. All were elected officials facing budget pressure and constituent complaints. Their primary concern was not public health — it was fiscal responsibility. They needed to cut $3.2 million, and cutting programs was politically painful.
- What do they need to decide? Whether to continue, reduce, or eliminate the mobile clinic program.
- What are they worried about? Spending money on programs that do not work. Being criticized by voters for wasteful spending. But also: being blamed if children get sick because of cuts.
- How much time do I have? Each budget item got 10 minutes. Maria would have 5 minutes to present, with 5 minutes for questions.
- What is the ONE thing they need to remember? Maria thought about this for a long time. She had dozens of findings. But if the commissioners walked out of the room remembering only one thing, what should it be?
She settled on this: Cutting the mobile clinics would cost the county more money than keeping them, because the preventable disease outbreaks that would likely follow would be far more expensive than the program itself.
That was her insight. Everything else was supporting evidence.
Building the Presentation
Maria built an 8-slide deck. Here is what each slide contained:
Slide 1: Title. "Protecting Children, Protecting Budgets: The Case for Mobile Vaccination Clinics"
She chose this title deliberately. It addressed both concerns — children's health and fiscal responsibility. She knew that leading with "health" alone would make this sound like a plea. Leading with both made it sound like a business case.
Slide 2: The situation. "Riverside County achieved a 91% childhood vaccination rate in 2019 — one of the highest in the state."
One number. One sentence. A chart showing the county's rate compared to the state average, with the county highlighted in green above the state line. The point: we were doing well. This was the baseline.
Slide 3: The complication. "Since 2020, our rate has dropped to 78% — below the herd immunity threshold for measles."
Same chart, now extended to 2023, with the line dropping below a clearly marked "85% herd immunity threshold" reference line. A red shaded zone below the threshold. The visual impact was immediate: we were in the danger zone.
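For readers who want to reproduce this kind of slide, here is a minimal matplotlib sketch of the chart: a single rate line, a labeled reference line, and a shaded zone below the threshold. Matplotlib is an assumed tool choice, and the intermediate-year values are placeholders; the case only gives the 2019 (91%) and 2023 (78%) endpoints and the 85% reference line.

```python
import matplotlib.pyplot as plt

# Only the 2019 and 2023 figures come from the case; the intermediate
# years are placeholder values for this sketch.
years = [2019, 2020, 2021, 2022, 2023]
rates = [91, 88, 84, 81, 78]   # childhood vaccination rate, percent

fig, ax = plt.subplots(figsize=(7, 4))
ax.plot(years, rates, marker="o", color="#1f4e79", linewidth=2)

# Reference line and shaded zone below the herd immunity threshold
threshold = 85
ax.axhline(threshold, color="gray", linestyle="--", linewidth=1)
ax.axhspan(60, threshold, color="red", alpha=0.12)
ax.text(2019.05, threshold + 0.7, "85% herd immunity threshold",
        fontsize=9, color="gray")

# Annotate the takeaway on the chart itself rather than in a legend
ax.annotate("78% in 2023", xy=(2023, 78), xytext=(2021.6, 72),
            arrowprops=dict(arrowstyle="->", color="#1f4e79"))

ax.set_ylim(60, 95)
ax.set_xticks(years)
ax.set_ylabel("Childhood vaccination rate (%)")
ax.set_title("Our rate has dropped below the herd immunity threshold")
fig.tight_layout()
plt.show()
```

The design choice worth noticing is that the message lives on the chart itself: the threshold is labeled where it sits, and the key number is annotated, so the slide works even if the audience never reads a legend.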
Slide 4: Where the decline happened. "The decline is concentrated in rural areas without clinic access. Urban rates have held steady."
A simple two-line chart: urban (stable) vs. rural (declining). Annotated with "Mobile clinic service area" above the urban line and "No mobile clinic access" below the rural line. The visual told the story: the clinics were working.
Slide 5: The key insight. "Mobile clinics are not just a health program — they are a cost-avoidance program."
This slide contained a single comparison:
| Item | Cost per year |
|---|---|
| Mobile clinic program | $1.8 million |
| Single measles outbreak (estimated) | $4.2 million |
Below the table: "In 2019, a measles outbreak in Clark County, Washington, cost the county an estimated $3.4 million in emergency response, hospitalization, and school closures. Our county's lower vaccination rate puts us at higher risk."
Maria sourced the Clark County figure from published reports. It was not a hypothetical — it was a documented real-world cost.
Slide 6: What happens if we cut. "If we eliminate mobile clinics, our model projects vaccination rates will fall to 68% within two years — the level at which outbreaks become likely."
A projection chart showing the expected trajectory without clinics, with a shaded uncertainty range. An annotation marking the "outbreak risk zone."
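A sketch of how such a projection chart might be assembled is below, again assuming matplotlib. The case gives only the best estimate (68% within two years) and the plausible range (62% to 74%); the straight-line path and band shape here are illustrative, not Maria's actual model.

```python
import numpy as np
import matplotlib.pyplot as plt

# Observed rates (2019 and 2023 from the case; intermediate years illustrative)
hist_years = np.array([2019, 2020, 2021, 2022, 2023])
hist_rates = np.array([91, 88, 84, 81, 78])

# Hypothetical no-clinic projection: best estimate 68% within two years,
# plausible range 62-74% (figures from the case). The linear path between
# 2023 and 2025 is purely illustrative.
proj_years = np.array([2023, 2024, 2025])
proj_best = np.array([78, 73, 68])
proj_low = np.array([78, 70, 62])
proj_high = np.array([78, 76, 74])

fig, ax = plt.subplots(figsize=(7, 4))
ax.plot(hist_years, hist_rates, marker="o", color="#1f4e79", label="Observed")
ax.plot(proj_years, proj_best, linestyle="--", color="#c0392b",
        label="Projected, no mobile clinics")
ax.fill_between(proj_years, proj_low, proj_high, color="#c0392b", alpha=0.15,
                label="Plausible range")

# Threshold reference and risk-zone annotation
ax.axhline(85, color="gray", linestyle=":", linewidth=1)
ax.text(2019.05, 85.7, "85% herd immunity threshold", fontsize=9, color="gray")
ax.text(2023.6, 63, "Outbreak\nrisk zone", fontsize=9, color="#c0392b")

ax.set_ylim(55, 95)
ax.set_ylabel("Childhood vaccination rate (%)")
ax.set_title("Projected vaccination rates without mobile clinics")
ax.legend(loc="lower left", frameon=False)
fig.tight_layout()
plt.show()
```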
Slide 7: The recommendation. "We recommend maintaining the mobile clinic program and offsetting the cost through three specific measures."
She listed three budget alternatives that could close the same gap without cutting the health program.
Slide 8: The closing image. A photograph of a mother holding her baby at a mobile clinic event (used with permission), with a single sentence: "24,000 children in Riverside County were vaccinated through mobile clinics last year."
No chart. No numbers. Just a reminder that behind the budget line items were real families.
The Presentation
Maria practiced her talk six times. She timed it: 4 minutes and 40 seconds, leaving room for a brief pause between her recommendation and the close. She knew her material well enough that she did not read from notes.
When she presented, she did not start with "Thank you for having me" or "Today I'm going to talk about." She started with a question:
"Can anyone tell me what a measles outbreak costs?"
Silence. One commissioner guessed: "A lot?"
Maria said: "In Clark County, Washington, it cost $3.4 million — almost twice what our entire mobile clinic program costs per year. And their vaccination rate, before the outbreak, was higher than ours is right now."
She had the room's attention.
She walked through each slide, spending about 30 seconds per slide. She did not read the slides — she told the story. The slides provided visual evidence while her words provided narrative and emotion.
When she reached the cost comparison on Slide 5, she paused. She let the numbers sit on the screen for a full five seconds before speaking. Then she said: "The mobile clinic program does not cost $1.8 million. It *saves* us $1.8 million — because the alternative is more expensive."
That reframing — from "cost" to "savings" — was the turning point.
The Questions
The commissioners had five minutes for questions. Here is what they asked and how Maria responded:
Commissioner Davis: "How do we know the mobile clinics are the reason for the difference? Maybe rural areas are just different."
Maria had anticipated this. "That's a fair question, and I'm glad you asked it. We controlled for income, education, and distance to the nearest hospital. Even after accounting for those differences, communities with mobile clinic access had vaccination rates 8 points higher. I can't prove causation with certainty — that would require a randomized trial — but the evidence is consistent across multiple analyses."
She was honest about uncertainty. She did not overstate. But she also did not understate.
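What "controlling for" means here can be shown with a small regression sketch. The data below are synthetic and the plain OLS adjustment is a stand-in for the department's actual analysis (the old report mentioned a quasi-experimental design); the point is that including income, education, and hospital distance in the model makes the clinic-access coefficient an estimate of the difference between otherwise-similar communities.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic community-level data, for illustration only. Each row is a
# community; the real analysis would use the county's actual records.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "clinic_access": rng.integers(0, 2, n),               # 1 = served by mobile clinics
    "median_income": rng.normal(55, 12, n),                # thousands of dollars
    "pct_college": rng.normal(28, 8, n),                   # % adults with a degree
    "hospital_miles": rng.normal(18, 9, n).clip(1, None),  # distance to nearest hospital
})

# Construct an outcome where clinic access adds about 8 points, mirroring
# Maria's finding, plus noise.
df["vax_rate"] = (
    70
    + 8 * df["clinic_access"]
    + 0.15 * df["median_income"]
    + 0.10 * df["pct_college"]
    - 0.20 * df["hospital_miles"]
    + rng.normal(0, 3, n)
)

# "Controlling for" income, education, and hospital distance means including
# them as regressors, so the clinic_access coefficient reflects the gap
# between communities that are otherwise similar on those variables.
model = smf.ols(
    "vax_rate ~ clinic_access + median_income + pct_college + hospital_miles",
    data=df,
).fit()
print(model.summary().tables[1])  # clinic_access coefficient lands near 8 by construction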
Commissioner Park: "You said 'projected.' How reliable is that projection?"
Maria: "Projections have uncertainty, and I want to be upfront about that. The shaded area on the chart shows the range of plausible outcomes. Our best estimate is a rate of 68%, but it could be as low as 62% or as high as 74%. Even at the high end, we'd still be below the herd immunity threshold."
Commissioner Walsh: "What if we cut the program in half instead of eliminating it?"
Maria had backup slides for this. She showed a map of the county with clinic routes, demonstrating which communities would lose access under various cut scenarios. "Cutting in half means choosing which children don't get served. Here's what that looks like geographically." The map made the tradeoff concrete.
The Outcome
The commissioners voted 5-2 to maintain the mobile clinic program at full funding. Commissioner Davis — the one who had asked the tough question about causation — was quoted in the local newspaper: "The data made it clear that this program pays for itself. Sometimes the most fiscally responsible thing to do is not to cut."
Maria's analysis had not changed from previous years. The data was the same. The statistical methods were the same. What changed was the communication.
What Made It Work: Lessons
Looking back, several specific communication choices made Maria's presentation effective:
She started with her audience, not her data. She designed the entire presentation around what the commissioners needed to decide, not what she had analyzed. The 28-page report contained more analysis; the 8-slide deck contained more impact.
She led with the insight, not the methodology. The commissioners never saw a regression table. They saw costs, rates, and projections — the outputs of the analysis, translated into their language. The methodology was available if asked (and it was asked, once), but it was not the centerpiece.
She framed the decision in their terms. "Fiscal responsibility" is commissioner language. "Statistically significant positive association" is analyst language. Maria translated.
She used the narrative arc. Situation (we were doing well), Complication (now we're not), Resolution (clinics are the difference), Call to Action (keep them and offset the cost elsewhere). The commissioners followed the story naturally.
She anticipated objections. Every tough question got a prepared, honest answer. She did not get defensive when challenged. She acknowledged uncertainty while still making a clear recommendation.
She made it human. The closing photograph was not decoration — it was a reminder that data represents people. In a budget hearing, it is easy to think in abstractions. The image made it personal.
She practiced. Six times. Out loud. Timed. This is the most underrated communication skill: rehearsal.
The Broader Lesson
Maria's experience illustrates a truth that applies far beyond government budget hearings: the quality of your analysis is necessary but not sufficient. The quality of your communication determines whether your analysis has impact.
This is not a comfortable truth for many data scientists. We would prefer to believe that good work speaks for itself — that if the analysis is rigorous, the numbers will persuade. But numbers do not persuade. Stories persuade. Evidence presented in context persuades. Clear, honest, audience-centered communication persuades.
Maria did not manipulate the commissioners. She did not exaggerate her findings or hide uncertainty. She was scrupulously honest. What she did was translate her honest analysis into a form that her audience could understand, evaluate, and act on.
That is not spin. That is the highest form of data communication.
Discussion Questions
- Maria chose to include a photograph in her closing slide. Some data communication experts would argue that emotional images have no place in a data presentation. Do you agree or disagree? Where is the line between humanizing data and manipulating the audience?
- Commissioner Davis asked about causation. Maria acknowledged she could not prove it. If she had been less honest — if she had said "Yes, the clinics definitely caused the higher rates" — would the outcome have been different? What are the risks of overstating certainty?
- Maria had 5 minutes. If she had 30 minutes instead, how should she have used the additional time? Should she have added more slides, more data, and more detail? Or would the same 8 slides with more discussion be more effective?
- Think about a time when you tried to explain something to someone who did not have your background. What worked? What did not? How does that experience connect to the principles in this chapter?
- Maria chose not to show the 28-page report during her presentation. But she had it available. In what situations would showing more methodology be appropriate — even for a non-technical audience?