Case Study 01: From Idea to App in an Afternoon

A non-developer builds a customer feedback dashboard using vibe coding


Background

Elena Vasquez is a marketing manager at a mid-size consumer electronics company called BrightWave Technologies. She has a degree in business administration, over eight years of experience in marketing, and a strong understanding of customer behavior — but she has never written a line of code. Her technical skills extend to spreadsheets, presentation software, and marketing analytics platforms, but programming has always felt like someone else's job.

For months, Elena had been frustrated by a recurring problem. Her company collected customer feedback from multiple channels: post-purchase surveys via email, product review forms on the website, social media comments scraped into spreadsheets, and customer support ticket summaries exported from their helpdesk system. Every week, she spent hours manually compiling this feedback into a PowerPoint presentation for the product team. She copied numbers from spreadsheets, created charts in Excel, and pasted everything together into slides that were outdated almost as soon as she finished them.

Elena had asked the IT department for a dashboard, but they were backlogged with higher-priority projects. She had evaluated several commercial dashboard tools, but none of them handled the specific combination of data sources and metrics her team needed without expensive customization. She was stuck — until a colleague mentioned vibe coding.

"Have you tried just asking Claude to build it for you?" her colleague said during a lunch break in March 2025. "My cousin used it to build an inventory tracker for his restaurant. He doesn't know how to code either."

That conversation planted the seed. The following Saturday, Elena decided to try.


The Vibe Coding Journey

Step 1: Defining the Goal (30 minutes)

Elena sat down with a notebook — a physical one — and wrote out what she wanted before touching any technology. She had read an article about vibe coding that emphasized the importance of knowing what you want before asking the AI to build it.

Her notes read:

  • What: A web-based dashboard that shows customer feedback trends. Should pull data from CSV files (the format all her sources exported to). Should show overall satisfaction scores, trending topics in feedback, sentiment breakdown, and volume of feedback over time.
  • Who: Herself and the product team (about 8 people). Not customer-facing.
  • Problem: Manually compiling feedback data takes 4-6 hours per week and is always outdated. The product team needs real-time visibility into customer sentiment.
  • Success criteria: She can upload a CSV file and immediately see a visual dashboard with charts. It should take less than 5 minutes to update the dashboard with new data, compared to the current 4-6 hours.

Step 2: The First Conversation (20 minutes)

Elena opened Claude in her web browser and typed her first prompt:

"I want to build a customer feedback dashboard for my marketing team. I'm not a developer — I've never coded before. I need a web page where I can upload a CSV file containing customer feedback data. The CSV has columns for: date, source (survey, review, support, social), rating (1-5), category (product quality, shipping, customer service, pricing, other), and comment (text). After uploading, I want to see: 1) An overall average rating displayed prominently, 2) A line chart showing average rating by month, 3) A bar chart showing feedback volume by source, 4) A pie chart showing feedback by category, 5) A sentiment breakdown showing positive/neutral/negative percentages based on rating (4-5 positive, 3 neutral, 1-2 negative). Make it look professional and clean, with a blue and white color scheme. I want to be able to run this on my laptop."

She was nervous about the prompt being too long, but it captured everything she needed.

Claude responded with a complete Python application using Flask (a web framework) and Chart.js (a charting library). The response included three files: a Python script, an HTML template, and instructions for how to run it.
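The actual files Claude produced are in the companion code, but the overall shape of such an app can be sketched in a few lines. The following is a minimal, hypothetical Flask skeleton: one route that serves an upload form and, after a CSV is posted, computes the headline average rating. The inline template and the "rating" column name are assumptions matching Elena's schema.

```python
# Minimal sketch of a Flask upload-and-summarize app (not Elena's actual code).
from flask import Flask, request, render_template_string
import pandas as pd

app = Flask(__name__)

# Inline template standing in for the separate HTML file Claude generated
PAGE = """
<h1>Customer Feedback Dashboard</h1>
<form method="post" enctype="multipart/form-data">
  <input type="file" name="file"><button>Upload</button>
</form>
{% if avg is not none %}<p>Average rating: {{ avg }}</p>{% endif %}
"""

@app.route("/", methods=["GET", "POST"])
def dashboard():
    avg = None
    if request.method == "POST":
        df = pd.read_csv(request.files["file"])  # FileStorage is file-like
        avg = round(df["rating"].mean(), 1)
    return render_template_string(PAGE, avg=avg)

if __name__ == "__main__":
    # Port 8080 rather than Flask's default 5000, which Elena found occupied
    app.run(port=8080)
```

In the real project, the summary numbers would be passed to Chart.js in the template rather than printed as plain text.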

Step 3: Getting It Running (45 minutes)

This was the most challenging part for Elena. She had never used a terminal or installed Python. Claude walked her through it step by step:

  • Installing Python from python.org
  • Opening the terminal (Command Prompt on her Windows laptop)
  • Creating a folder for the project
  • Saving the files Claude had generated
  • Installing the required Python packages with pip install flask pandas
  • Running the application with python app.py

She hit two errors during this process. The first was a ModuleNotFoundError because she had forgotten to install pandas. She copied the error message back into Claude, and it told her exactly what to do. The second was a port conflict — another application was using port 5000. Again, she pasted the error, and Claude modified the code to use port 8080 instead.

When she finally opened http://localhost:8080 in her browser and saw a clean, professional-looking upload page with the BrightWave blue color scheme, she felt a rush of genuine excitement.

Step 4: Testing with Real Data (30 minutes)

Elena had prepared a test CSV file with 200 rows of real feedback data from the past quarter. She uploaded it and — to her surprise — the dashboard rendered immediately. The charts appeared, the average rating showed 3.7 out of 5, and the category breakdown matched what she knew from her manual analysis.

But she noticed issues:

  1. The monthly trend chart was hard to read because it showed individual data points rather than a smooth trend line.
  2. The source names from her CSV ("email_survey," "web_review") were not formatted nicely — she wanted them to appear as "Email Survey" and "Web Review."
  3. There was no way to filter by date range, which she needed to compare different periods.
  4. The page had no title or branding.

Step 5: Iterative Refinement (2 hours)

This is where Elena spent the bulk of her time, and it was also where she found the vibe coding process most rewarding. She went through several rounds of conversation with Claude:

Round 1 — Visual improvements:

"The monthly trend chart shows individual data points and looks jagged. Can you make it a smooth line? Also, the source names from my CSV are formatted like 'email_survey' — can you make them display as 'Email Survey' with proper capitalization and spaces?"

Claude updated both issues and also added hover tooltips to the charts (something Elena had not asked for but immediately appreciated).
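The label cleanup from Round 1 amounts to a one-line transformation. A plausible sketch, assuming raw codes use underscores as Elena's did:

```python
def pretty_label(raw):
    """Turn a raw source code like 'email_survey' into 'Email Survey'."""
    return raw.replace("_", " ").title()

print(pretty_label("email_survey"))  # Email Survey
print(pretty_label("web_review"))    # Web Review
```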

Round 2 — Date filtering:

"I need to filter the data by date range. Can you add a date picker where I select a start and end date, and all the charts update to show only that period?"

Claude added a date range selector at the top of the dashboard. It worked on the first try.
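On the server side, a date-range filter like the one Round 2 describes typically reduces to a boolean mask over the parsed date column. A hedged sketch with pandas (the function name and sample data are illustrative):

```python
import pandas as pd

def filter_by_range(df, start, end):
    """Keep only rows whose date falls within [start, end], inclusive."""
    dates = pd.to_datetime(df["date"])
    mask = (dates >= pd.Timestamp(start)) & (dates <= pd.Timestamp(end))
    return df[mask]

feedback = pd.DataFrame({
    "date": ["2025-01-05", "2025-02-10", "2025-03-20"],
    "rating": [4, 2, 5],
})
february = filter_by_range(feedback, "2025-02-01", "2025-02-28")
print(len(february))  # 1
```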

Round 3 — Export functionality:

"This is great. One more thing — my boss will want to see these charts in our Monday meetings. Can you add a button that exports the dashboard as a PDF?"

Claude added a "Download PDF" button using a JavaScript library. The generated PDF looked almost identical to the web page.

Round 4 — Multiple file handling:

"Right now I can only upload one CSV at a time. But I get feedback from four different systems, each with their own CSV. Can I upload multiple CSVs and have them all combine into one dashboard?"

This required a more significant change, and Claude restructured the upload page to accept multiple files and merge them before generating the dashboard. There was a bug where duplicate entries appeared when the same file was uploaded twice, and Elena caught it during testing. She described the problem, Claude fixed it by adding deduplication logic, and the dashboard worked correctly.
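The merge-plus-deduplication fix from Round 4 can be sketched as follows; this assumes exact duplicate rows are the only kind of duplication that matters, which matches the bug Elena observed (the same file uploaded twice):

```python
import pandas as pd

def combine_csvs(files):
    """Merge several feedback CSVs into one frame, dropping exact
    duplicate rows so re-uploading a file doesn't double-count it."""
    frames = [pd.read_csv(f) for f in files]
    combined = pd.concat(frames, ignore_index=True)
    return combined.drop_duplicates().reset_index(drop=True)
```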

Round 5 — Branding and polish:

"Can you add 'BrightWave Customer Insights' as the page title, with our logo? I'll upload the logo file. Also, the font is a bit small — can you increase it and add a professional footer?"

Claude added the branding, increased the font sizes, and created a clean footer with a timestamp showing when the data was last updated.

Step 6: Validation (1 hour)

Elena methodically tested the dashboard:

  • Uploaded an empty CSV: the dashboard showed a helpful "No data to display" message instead of crashing.
  • Uploaded a CSV with missing columns: the app showed a clear error message identifying which columns were missing.
  • Uploaded a very large CSV (5,000 rows): the dashboard loaded in about 3 seconds, which was acceptable.
  • Tested on her phone's browser: the charts were readable but the layout was cramped. She asked Claude to make it responsive, and it adjusted the layout for mobile screens.
  • Had a colleague try using it without any instructions: the colleague figured it out in under a minute.
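The first two checks above (empty file, missing columns) correspond to input validation the app must perform before rendering anything. A minimal sketch of that guard, assuming the five-column schema from Elena's prompt:

```python
import pandas as pd

REQUIRED = {"date", "source", "rating", "category", "comment"}

def validate_feedback(df):
    """Return a user-facing error string, or None if the data is usable."""
    missing = REQUIRED - set(df.columns)
    if missing:
        return "Missing columns: " + ", ".join(sorted(missing))
    if df.empty:
        return "No data to display"
    return None
```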

Step 7: Sharing with the Team (30 minutes)

Elena's dashboard was running on her laptop, which was fine for development but not practical for team use. She asked Claude about options and learned she could deploy it to a free tier of a cloud platform. Claude walked her through deploying to PythonAnywhere, a hosting service with a free tier that was sufficient for her team's internal use.

By Saturday evening, the Customer Insights Dashboard was live and accessible to her team via a URL.


The Result

The following Monday, Elena demonstrated the dashboard at the weekly product team meeting. The reaction was immediate and enthusiastic:

  • The product manager said it was "exactly the kind of visibility we've been asking for"
  • The VP of Product asked if it could be expanded to include Net Promoter Score data
  • A colleague from the data team was impressed by the chart quality and asked what developer had built it

Elena explained that she had built it herself over a single afternoon using vibe coding. The room was a mix of surprise, curiosity, and — from the IT department representative — a slightly uncomfortable silence.

By the Numbers

Metric                                 | Before                             | After
Time to compile weekly feedback report | 4-6 hours                          | 5 minutes (upload new CSVs)
Report freshness                       | Weekly (always outdated)           | Real-time (updated on upload)
Charts and visualizations              | Static PowerPoint                  | Interactive web dashboard
Accessible to team                     | Via email attachment               | Via shared URL
Development cost                       | Est. $5,000-$15,000 if built by IT | $0 (plus one Saturday)

Three Months Later

Three months after the initial build, Elena had continued to iterate on the dashboard. She had added:

  • A keyword extraction feature that showed the most frequently mentioned words in feedback comments
  • A comparison mode that let the team see two time periods side by side
  • An automatic weekly email summary generated from the dashboard data
  • Color-coded alerts when satisfaction dropped below a threshold
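Of the later additions, the keyword extraction feature is the easiest to sketch. One common lightweight approach (not necessarily what Elena's version used) is to count word frequencies after filtering out stopwords:

```python
from collections import Counter
import re

# A tiny illustrative stopword list; a real tool would use a fuller one
STOPWORDS = {"the", "a", "and", "is", "it", "to", "was", "i", "of", "be"}

def top_keywords(comments, n=5):
    """Return the n most frequent non-stopword words across comments."""
    words = []
    for text in comments:
        words += [w for w in re.findall(r"[a-z']+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(n)

print(top_keywords([
    "Battery life is great",
    "Great sound, battery could be better",
]))
```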

Each feature was added through the same conversational process: describe what she wanted, review the result, test it, refine it. Her total time investment since the initial build was about 10 additional hours spread over three months.

Perhaps most significantly, Elena's success had inspired three other non-technical team members to try vibe coding for their own projects. The head of HR built an employee satisfaction survey analyzer. A sales operations coordinator built a lead scoring tool. A facilities manager built an office space booking system.


Lessons Learned

Elena shared her reflections in a company blog post, and they capture several key lessons for aspiring vibe coders:

  1. Preparation matters more than technical skill. "The time I spent writing down what I wanted before touching any AI tool was the most valuable 30 minutes of the whole project. When I was clear about what I needed, the AI delivered. When I was vague, the results were vague."

  2. Errors are not failures — they are conversations. "Every error message I got felt scary at first, but I learned to just copy it into the chat and ask what went wrong. The AI always explained it and fixed it. Errors became part of the normal workflow, not a sign that something was going terribly wrong."

  3. Test like a user, not a developer. "I don't know how to read code for bugs, but I know how to use software. I tested by using the dashboard the way I would use it for real work — uploading real data, trying unusual situations, checking if the numbers made sense. That was enough to catch every important issue."

  4. Start small and expand. "I didn't try to build the final, perfect dashboard on the first attempt. I started with the basic charts and added features one at a time. Each round of improvements was quick because the foundation was already solid."

  5. Know the limits. "I wouldn't use this approach for something that handles sensitive customer data directly — that would need professional security review. But for an internal analytics tool using aggregated, anonymized data? Vibe coding was perfect."


Discussion Questions

  1. What aspects of Elena's preparation contributed most to her success?
  2. At what points in the process would a professional developer's involvement have been beneficial?
  3. How does Elena's testing approach (testing like a user) compare to formal software testing practices?
  4. What are the risks of non-technical team members building tools that become business-critical, as the Customer Insights Dashboard did?
  5. How would you evaluate whether Elena's dashboard is reliable enough for the business decisions being made from its data?

This case study illustrates the concepts from Sections 1.6, 1.7, and 1.9 of Chapter 1. The code from Elena's project is available in code/case-study-code.py.