Exercises: Analytics Decoded — Reading Your Numbers Like a Scientist
Part A: Understanding Your Metrics
Exercise 1: The Metric Audit Open your analytics dashboard for any platform you use. For each of the following metrics, find where it lives (or whether your platform shows it at all), write down your current average, and classify it as Tier 1 (Growth Signal), Tier 2 (Quality Signal), or Tier 3 (Context): views, completion rate, share rate, save rate, engagement rate, click-through rate, impressions. Which metrics can you not find? Which are most prominent in the dashboard (hint: usually the vanity ones)?
Exercise 2: Vanity vs. Real Look at your five most recent videos. Rank them by (a) total views and (b) share rate. Does the ranking change? Which video performed best on growth-predictive metrics but not on vanity metrics? Write a hypothesis for why the two rankings differ.
Exercise 3: The Frequency Log For one week, track every time you open your analytics (even just to check quickly). At the end of the week, total the count. Then answer: Did any of those checks change your behavior in a productive way? Or were they primarily emotional check-ins? Based on this, design a realistic analytics review schedule for yourself.
Exercise 4: The Metric Benchmark Research typical performance benchmarks for your platform and content category. For TikTok, what's a good completion rate? For YouTube educational content, what's a strong CTR? Compare your current averages to these benchmarks. Where are you above average? Where are there clear gaps?
Part B: Reading Retention Curves
Exercise 5: The Retention Curve Diagnosis If you have access to YouTube Studio retention curves, open the three most recent videos that have at least 500 views. For each video: (a) identify which curve shape it most resembles (Cliff, Slope, Plateau, Mountain), (b) identify the top two drop-off spike points — the exact timestamps where viewership drops sharply, (c) watch those timestamps in your video. What's happening there? Write a diagnosis and proposed fix for each.
Exercise 6: The Rising Spike Hunt In the same three retention curves, find every moment where the curve rises (viewers are rewatching). For each rising spike: watch that exact moment. What's happening? Is it a visual reveal? A surprising fact? A funny moment? A callback? Build a list of "what makes my audience rewatch" — this is a recipe card for your strongest content moments.
Exercise 7: The 30-Second Test For five recent videos, record: what percentage of viewers are still watching at 30 seconds? Compare this to your completion rate. If the 30-second retention is significantly higher than your completion rate (e.g., 60% at 30 seconds, 25% completion), you have mid-video content problems. If the 30-second retention is low (below 40%), you have a hook-to-commitment gap. Diagnose your most common pattern.
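The two diagnostic rules in Exercise 7 can be written as a small decision function. This is a minimal sketch: the function name and the exact thresholds (40% floor, 30-point gap) are taken from the examples above, but you should tune them to your own baseline averages.

```python
def diagnose_retention(retention_30s: float, completion: float) -> str:
    """Classify the Exercise 7 pattern. Rates are fractions (0-1).

    Thresholds mirror the examples in the text; adjust to your baseline.
    """
    if retention_30s < 0.40:
        return "hook-to-commitment gap"      # viewers leave before 30 seconds
    if retention_30s - completion > 0.30:
        return "mid-video content problem"   # strong start, large later drop
    return "no dominant pattern"

# The example from the text: 60% at 30 seconds, 25% completion
print(diagnose_retention(0.60, 0.25))  # → mid-video content problem
```

Run it across your five recent videos and count which label appears most often; that is your "most common pattern" for the exercise.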
Exercise 8: The Prediction Test Before your next video goes live, write down your prediction: what completion rate do you expect? What share rate? What engagement rate? After one week, compare your prediction to reality. Where were you right? Where were you wrong? Over time, this builds your intuitive analytics model — you start to predict performance before publishing.
Part C: Engagement Quality Analysis
Exercise 9: The Comment Audit Read every comment on your three best-performing videos (by growth score, not by views). Categorize each comment into: question, quote, share statement, testimonial, pushback/correction, simple reaction (emoji/one word). What percentage are in each category? What topics generate the most questions? What lines get quoted back most often?
Exercise 10: The Share Context Map When people say "I shared this with ___," who did they send it to, and why? Look through your share statements in comments to identify: Who is the person your viewers are sharing with? What's the reason they give ("my friend who ___," "my mom because ___")? This is your secondary audience — the person your viewer has in mind when they share. Does your content speak to them?
Exercise 11: The Save Motivation Question For videos with above-average save rates: what is the viewer planning to do with this saved content? Will they rewatch it? Share it later? Use it as reference? The answer changes how you should design "save-worthy" content. Look at what's being saved and hypothesize the motivation. Then test: add a verbal prompt ("save this for when you need it") and see if save rate increases.
Part D: A/B Testing Practice
Exercise 12: Design a Thumbnail Test Choose one of your existing videos and design two thumbnail versions: Thumbnail A (your current thumbnail, or a baseline concept) and Thumbnail B (a specific change — different expression, different text, different color scheme). Write out exactly what's different and your hypothesis for which will perform better and why. If you have YouTube Studio access, implement the test.
Exercise 13: The Hook Type Tracker For your next 10 videos, before you post each one, record: video topic, hook type used (curiosity/challenge/emotional/value/direct engagement — Ch. 16 categories), your prediction for completion rate. After 10 videos, calculate average completion rate by hook type. Which hook type is working best for your specific audience?
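The Exercise 13 calculation — average completion rate by hook type over 10 videos — is a simple grouped average. The sketch below uses hypothetical tracker rows (the numbers are illustrative, not benchmarks); replace `log` with your own recorded pairs.

```python
from collections import defaultdict

# Hypothetical tracker rows: (hook_type, completion_rate) for 10 videos
log = [
    ("curiosity", 0.31), ("value", 0.27), ("curiosity", 0.35),
    ("challenge", 0.22), ("value", 0.29), ("curiosity", 0.28),
    ("emotional", 0.33), ("challenge", 0.25), ("value", 0.30),
    ("emotional", 0.26),
]

by_hook = defaultdict(list)
for hook, completion in log:
    by_hook[hook].append(completion)

# Average completion rate per hook type, best first
averages = sorted(
    ((hook, sum(rates) / len(rates)) for hook, rates in by_hook.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for hook, avg in averages:
    print(f"{hook:10s} {avg:.0%}")
```

The top row of the output answers the exercise's question: which hook type is working best for your specific audience.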
Exercise 14: Posting Time Experiment If you currently post at the same time every week, run a 6-week test: three weeks at your usual time, three weeks at a different time (opposite time of day, or different day). Keep content type constant as much as possible. Compare average engagement rate and share rate between the two time slots.
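Comparing the two time slots in Exercise 14 comes down to averaging each slot's weekly rates and looking at the difference. A minimal sketch with hypothetical numbers:

```python
from statistics import mean

# Hypothetical weekly engagement rates (%) from the 6-week test
usual_slot = [4.8, 5.1, 4.5]   # weeks 1-3, your usual posting time
test_slot = [5.6, 5.2, 5.9]    # weeks 4-6, the alternate time

diff = mean(test_slot) - mean(usual_slot)
print(f"usual: {mean(usual_slot):.2f}%  test: {mean(test_slot):.2f}%  diff: {diff:+.2f}%")
```

Three data points per slot is a very small sample, so treat the difference as directional evidence, not proof — repeat the test before permanently changing your schedule.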
Part E: Building Your Analytics System
Exercise 15: Your Analytics Spreadsheet Create a spreadsheet with the following columns for every video going forward: date posted, title, hook type, content category, views (7-day), completion rate, share rate, save rate, engagement rate, growth score (calculate using the formula: (share rate × 2) + (save rate × 1.5) + engagement rate), notes. Fill in your last 10 videos and establish your baseline averages.
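The growth score column from Exercise 15 can be computed directly from the formula given: (share rate × 2) + (save rate × 1.5) + engagement rate. A one-function sketch, using hypothetical rates expressed as percentages:

```python
def growth_score(share_rate: float, save_rate: float, engagement_rate: float) -> float:
    """Growth score formula from Exercise 15:
    (share rate × 2) + (save rate × 1.5) + engagement rate."""
    return share_rate * 2 + save_rate * 1.5 + engagement_rate

# Hypothetical video: 1.2% share, 2.0% save, 5.5% engagement
print(round(growth_score(1.2, 2.0, 5.5), 1))  # → 10.9
```

The same formula works as a spreadsheet cell (e.g., `=G2*2 + H2*1.5 + I2` if those are your share, save, and engagement columns); just keep the units consistent across all rows so scores are comparable.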
Exercise 16: The Monthly Review Ritual Design your personal monthly analytics review process. Using the five-step framework from Section 34.6 (Gather, Identify, Hypothesize, Decide, Note Non-Quantifiables), write out what specifically you'll do in each step, how long you'll spend on each, and what tools you'll use. Then schedule your first monthly review.
Exercise 17: The Data Mindset Letter Write a short letter (one paragraph) to yourself about what data is and isn't allowed to tell you. What decisions will you let analytics influence? What decisions will you keep protected from data pressure? What's your line between being evidence-informed and being evidence-controlled? Keep this letter in your analytics folder and reread it when you feel like a metric is driving your creative choices.