Case Study 14-2: Morning Brew — How a Newsletter Built Its Audience on Research, Not Assumption
Background
In 2015, two University of Michigan students named Alex Lieberman and Austin Rief started a business news newsletter. The premise was simple: they wanted to read business news that was not dry and dense, so they would write the kind of newsletter they wished existed.
That founding instinct — create content you yourself want to consume — is one of the most common stories in creator media. But what separated Morning Brew from the thousands of newsletters that follow the same logic and fail is that Lieberman and Rief did not stay in their own heads. From early on, they built systematic feedback loops that kept them calibrated to what their actual readers wanted, not what they assumed.
By 2020, Morning Brew had 2.5 million subscribers and was acquired by Insider (now Business Insider) for $75 million. By 2023, the broader Morning Brew brand had expanded to seven newsletters across different industry verticals, a podcast network, and a media operation with over 200 staff.
This is a story about how audience research at the newsletter level parallels — and in some ways exceeds — what platform-based creators can do.
The Early Research: Who Actually Is Reading This?
In the first year, Morning Brew was a small campus newsletter, distributed manually to Michigan students. At that scale, Lieberman and Rief had something that would later be harder to replicate: direct, frequent conversations with their readers. They would see people reading the newsletter at coffee shops and stop to talk to them. They attended student organization meetings and asked members what they liked and what they would change.
This is an important point: the research methods that work at scale are usually refined versions of the research methods that work at zero. Direct conversation is always the most information-rich research, and the most successful creators are the ones who never fully stop having those conversations even after scale makes them harder to arrange.
What they learned from early conversations shaped everything:
- Their readers were ambitious, career-oriented students who felt alienated by traditional business news (the Wall Street Journal was seen as "for old people in finance")
- The specific struggle was not lack of interest in business news but the perception that understanding it required prior knowledge they did not have
- The format breakthrough: short paragraphs, pop culture references, and genuine jokes made the content feel like something a smart friend wrote, not a corporate press release
This "smart friend" framing — which the team explicitly articulated internally — became the editorial voice that Morning Brew has maintained for nearly a decade. It did not come from brainstorming. It came from listening to how their readers described what they wanted.
Email Analytics as Audience Research
Once Morning Brew was operating at scale, direct conversation became less feasible as the primary research mechanism. But newsletters have a research advantage that most social platforms do not: rich, individual-level behavioral data.
Every newsletter platform (Mailchimp, ConvertKit, Beehiiv, Substack) shows you open rate, click-through rate on individual links, and unsubscribe rate per issue. Morning Brew used this data obsessively.
What they tracked and why:
Open rate by subject line: Morning Brew ran constant informal A/B tests, sending different subject lines to different portions of the list and measuring which earned more opens. This gave them a near-continuous feedback loop on what kinds of framing their audience responded to. Over time, patterns emerged: curiosity gaps outperformed news summaries ("Why is everyone suddenly talking about X?" outperformed "X Company Reports Q3 Earnings").
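The mechanics of this kind of split test are simple enough to sketch. The numbers below are invented, and the 1.96 cutoff (roughly a 95% confidence level) is a standard statistical convention, not anything Morning Brew has described using:

```python
import math

def two_proportion_z(opens_a: int, sends_a: int, opens_b: int, sends_b: int) -> float:
    """Z-statistic for the difference between two open rates.
    |z| > 1.96 means the gap is unlikely to be noise at roughly the 95% level."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Invented numbers: curiosity-gap subject vs. plain news-summary subject
z = two_proportion_z(opens_a=2_100, sends_a=5_000,   # variant A: 42% opened
                     opens_b=1_900, sends_b=5_000)   # variant B: 38% opened
print(f"z = {z:.2f}")  # comfortably above 1.96, so variant A is a real winner
```

The point of the significance check is to avoid crowning a "winner" on a gap that a list of this size would produce by chance anyway.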
Click-through rate by story type: Which stories did readers actually click through to read more? This separated what readers said they cared about from what they actually engaged with — an important distinction. Readers in surveys said they wanted more macroeconomic policy coverage; their actual click behavior showed they engaged most with company-specific stories about brands they recognized.
Unsubscribe rate by issue: When an issue produced an above-average unsubscribe rate, it was a leading indicator that something in that issue had missed the mark. Morning Brew investigated each spike — sometimes it was a topic that alienated a segment of readers, sometimes it was a joke that landed wrong.
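Spotting an unsubscribe spike worth investigating is a one-function job once you have per-issue numbers. The data and the twice-the-median threshold below are illustrative assumptions, not Morning Brew's actual rule:

```python
from statistics import median

def flag_spikes(issues, multiplier=2.0):
    """Return dates of issues whose unsubscribe rate exceeds `multiplier` times
    the median rate. The median is robust to the spike itself; the 2x
    multiplier is an arbitrary illustrative threshold."""
    rates = {date: unsubs / sent for date, unsubs, sent in issues}
    baseline = median(rates.values())
    return [date for date, rate in rates.items() if rate > multiplier * baseline]

# Hypothetical per-issue stats: (issue date, unsubscribes, emails delivered)
issues = [
    ("2023-03-01", 120, 100_000),
    ("2023-03-02", 110, 100_500),
    ("2023-03-03", 430, 101_000),  # a topic or joke that missed the mark
    ("2023-03-04", 130, 101_200),
    ("2023-03-05", 125, 101_400),
]
print(flag_spikes(issues))  # -> ['2023-03-03']
```

The flag is only the starting point; as the case study notes, the real work is opening the flagged issue and figuring out which story or joke drove readers away.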
The Reader Survey as a Twice-Annual Event
Morning Brew ran formal reader surveys twice a year. These were not optional additions to the newsletter — they were treated as editorial events, promoted heavily and designed carefully.
Key elements of their survey design that translate directly to the creator context:
They surveyed for behavior, not opinion. Instead of asking "Do you enjoy our finance coverage?", they asked "In the last month, have you taken any action based on something you read in Morning Brew? If yes, what?" Behavioral questions produce more honest and more actionable data than opinion questions.
They used surveys to surface new verticals. One of their surveys revealed strong reader demand for retail and e-commerce coverage specifically. That research directly led to the creation of Retail Brew, one of Morning Brew's industry-specific newsletters, which launched in 2020 and grew to hundreds of thousands of subscribers. The product idea came from listening to readers, not from internal brainstorming.
They published the results. Morning Brew regularly shared what they learned from surveys back with their readership: "You told us X. Here's what we're doing about it." This closed the feedback loop publicly, building trust with readers that their input actually mattered.
Comment Mining in a Newsletter Context
Most newsletters do not have public comment sections. Morning Brew's version of comment mining was reply mining: reading every reply their newsletter received (they encouraged readers to reply directly), categorizing the content, and feeding the patterns back into editorial decisions.
When readers replied in bulk about a particular story — either positively (expanding on it, sharing their own experience) or negatively (correcting an error, expressing disagreement) — it was treated as an editorial data point. Topics that generated high reply volume became recurring features. Stories that generated corrections or pushback were examined carefully for accuracy and framing issues.
For solo creators operating email lists, this same practice is completely replicable: every email you send should invite a reply, and every reply is a data point worth categorizing and tracking.
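A minimal version of that categorizing-and-tracking step can be sketched with keyword matching. The categories and keywords below are invented placeholders; in practice the taxonomy should emerge from reading the replies, not from guessing up front:

```python
from collections import Counter

# Invented category -> keyword map; replace with categories that actually
# recur in your own reply inbox.
CATEGORIES = {
    "correction":   ["incorrect", "actually", "error", "wrong"],
    "story_idea":   ["you should cover", "would love to see"],
    "appreciation": ["love", "thanks", "great issue"],
    "pushback":     ["disagree", "misleading", "unfair"],
}

def categorize(reply: str) -> list[str]:
    """Tag a reply with every category whose keywords appear in it."""
    text = reply.lower()
    tags = [cat for cat, kws in CATEGORIES.items() if any(kw in text for kw in kws)]
    return tags or ["other"]

replies = [  # invented examples
    "Love the newsletter, but the Q3 number was actually wrong.",
    "You should cover the creator economy more.",
    "Thanks for the great issue!",
]
counts = Counter(tag for r in replies for tag in categorize(r))
print(counts.most_common())
```

Even a crude tagger like this turns a pile of replies into a trend line: when "correction" or "pushback" counts jump for one issue, that issue gets a second look.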
The Gap Analysis That Built a Media Company
Morning Brew's most consequential act of audience research was not about improving the daily newsletter — it was about identifying adjacent audience needs that no existing publication was serving well.
The team noticed a pattern in their readership data: significant segments of their audience were concentrated in specific industries (marketing, retail, healthcare, tech). These readers were engaged with the general business news but were also hungry for news specific to their industry.
The competitive landscape in B2B industry newsletters was dominated by expensive, dense trade publications aimed at executives — not the same voice-and-format approach that had made Morning Brew successful with young professionals.
That gap — high audience demand for accessible, voice-driven industry news + no well-positioned competitor serving it the way Morning Brew did — was the strategic insight that drove the vertical newsletter expansion.
Marketing Brew launched in 2020. Retail Brew launched the same year. Each identified a segment of the existing Morning Brew readership with a specific, unmet informational need and built a product to serve it.
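The segment-level pattern behind that expansion can be sketched as a filter over subscriber records: find segments that are both a large share of the list and unusually engaged with segment-specific stories. The data shape, thresholds, and industries here are illustrative assumptions, not Morning Brew's actual criteria:

```python
from collections import defaultdict

def candidate_verticals(subscribers, min_share=0.20, min_engagement=0.50):
    """Return industries that are both a large share of the list and highly
    engaged with industry-specific stories. Both thresholds are illustrative
    assumptions."""
    by_industry = defaultdict(lambda: [0, 0])  # industry -> [clicks, readers]
    for industry, clicked in subscribers:
        by_industry[industry][0] += int(clicked)
        by_industry[industry][1] += 1
    total = sum(n for _, n in by_industry.values())
    return sorted(
        industry for industry, (clicks, n) in by_industry.items()
        if n / total >= min_share and clicks / n >= min_engagement
    )

# Hypothetical records: (subscriber's industry, clicked an industry story?)
subscribers = [
    ("marketing", True), ("marketing", True), ("marketing", False),
    ("retail", True), ("retail", True),
    ("healthcare", False), ("tech", True), ("tech", False),
]
print(candidate_verticals(subscribers))  # -> ['marketing', 'retail', 'tech']
```

The size filter matters as much as the engagement filter: a tiny segment that clicks everything is a loyal niche, not a viable second newsletter.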
Lessons for Individual Creators
Morning Brew's research practices scale down to individual creators with smaller audiences:
Use your email reply rate as a leading indicator. If you run a newsletter or send email updates, the reply rate tells you which topics activate your audience enough to break the friction of responding. Track it.
Ask about behavior, not opinion. "Have you done anything differently because of my content in the past 30 days?" tells you more than "Do you enjoy my content?"
Publish your findings. When you run a survey and change something because of it, tell your audience. "You asked for more beginner-friendly content, so I'm changing my format" builds more trust than silently pivoting.
Look for the vertical gap. What segment of your current audience has needs that go beyond what your main channel addresses? That is your expansion opportunity — and it comes from research, not intuition.
Analysis Questions
- Morning Brew's founding insight — "create content you yourself want to consume" — eventually had to be supplemented by systematic audience research as the company scaled. At what point do you think this shift became necessary? What happens when a creator's personal preferences and their audience's needs diverge?
- Morning Brew used click-through behavior to reveal a gap between what readers said they wanted (macroeconomic policy coverage) and what they actually clicked on (company-specific stories). This is a common discrepancy between stated preferences and revealed preferences. Why might people consistently say they want one thing and do another? How should creators respond to this?
- The Retail Brew expansion came directly from reading audience data and identifying a gap. Compare this to how most individual creators decide to launch a second channel or expand to a new format. What is typically different about how those decisions get made, and what could be improved?
- Morning Brew shares survey results with readers publicly, creating a visible feedback loop. What are the advantages of this transparency? Are there situations where publishing survey results could be a disadvantage?