Case Study 10.1: The Macedonian Fake News Farms and the 2016 US Election

How Teenagers Monetized Political Misinformation Through Google AdSense


Overview

In the autumn of 2016, journalists and researchers investigating the ecosystem of false political information circulating on Facebook began to notice an unusual pattern: many of the most viral fake news stories were published on websites with American-sounding names but with server registrations pointing to an unexpected location — the small city of Veles in the Former Yugoslav Republic of Macedonia (now the Republic of North Macedonia), a country of approximately two million people with an average monthly wage of about $400.

BuzzFeed News journalists Craig Silverman and Lawrence Alexander broke the story in November 2016, and subsequent reporting by the Washington Post, the Guardian, and many other outlets built a comprehensive picture of a miniature fake news industry operated by teenagers and young adults who were motivated almost entirely by one thing: money.

The Macedonian fake news case has become one of the most frequently cited examples in discussions of misinformation, media literacy, and the economics of digital advertising. But as with many frequently cited cases, its significance is sometimes misunderstood. This case study examines what actually happened in Veles, what the evidence shows about the operations' economic model, what their actual impact on the 2016 election may have been, and what lessons the case holds for understanding the political economy of misinformation.


Background: The Digital Advertising Ecosystem and Its Vulnerabilities

How AdSense Works

Google AdSense is a program that enables website owners to earn revenue by displaying Google ads on their websites. The program is central to the economic infrastructure of a large portion of the web: millions of websites generate their primary revenue through AdSense.

The program works through programmatic advertising: Google matches advertisers seeking to reach specific audience characteristics with websites where those audiences are present. When a user visits an AdSense-enabled website, an automated auction occurs in milliseconds, advertisers bid to show their ads to that user based on their inferred characteristics (political interest, demographic profile, geographic location, browsing history), and the winning ad is displayed. The website owner receives approximately 68% of the advertising revenue; Google retains approximately 32%.
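The auction-and-split mechanics described above can be sketched in a few lines. This is a deliberately simplified illustration, not Google's actual implementation: the bid values and advertiser names are invented, and the second-price rule reflects how programmatic display auctions commonly worked in the 2016 era (Google later moved to first-price auctions).

```python
# Simplified sketch of a programmatic ad auction with an AdSense-style
# revenue split. Bids and advertiser names are illustrative only.

PUBLISHER_SHARE = 0.68  # approximate share paid to the website owner

def run_auction(bids):
    """Second-price auction: highest bidder wins, pays the runner-up's bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1]  # second-highest bid sets the price
    return winner, clearing_price

# Hypothetical CPM bids (dollars per thousand impressions) for one user visit
bids = {"advertiser_a": 5.00, "advertiser_b": 4.20, "advertiser_c": 2.75}
winner, cpm = run_auction(bids)
publisher_cpm = cpm * PUBLISHER_SHARE  # what the site owner keeps per 1,000 impressions
```

The key structural point survives the simplification: the publisher's revenue depends only on who visits and what advertisers will pay for that audience, not on what the page actually says.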

Crucially, Google AdSense historically did not require editorial review of website content as a condition of program participation. Any website owner could apply to join AdSense and begin displaying ads. Google's content policies prohibited a number of content categories (pornography, dangerous content, deceptive content), but the enforcement of these policies relied primarily on automated detection and user complaints rather than proactive editorial review.

CPM Premiums for American Political Content

American political content commands disproportionately high CPM rates in digital advertising for several reasons:

  • American elections attract enormous advertising spending from campaigns, advocacy organizations, and issue advertisers.
  • American political audiences tend to be disproportionately affluent and educated — demographics that command premium CPM rates from many advertisers.
  • The emotional intensity of political content generates higher engagement, which advertisers and platforms associate with more valuable audience attention.

For Macedonian operators with access to cheap labor and low infrastructure costs, this CPM premium created a significant profit opportunity: produce content that attracts American political audiences, collect advertising revenue at American CPM rates, and retain the difference between American revenues and Macedonian costs.


The Veles Operations: What Actually Happened

Discovery and Scale

Silverman and Alexander's 2016 reporting identified more than 100 active US-politics websites registered in Veles alone. Analysis of the websites' content and traffic patterns revealed a highly systematic, profit-oriented operation:

  • Websites had names like "WorldPoliticus.com," "USADailyPolitics.com," "TrumpVision365.com," and "DonaldTrumpNews.co" — designed to look like American political news outlets.
  • Content was almost entirely plagiarized from American political websites or fabricated from whole cloth.
  • The most viral stories made false or dramatically exaggerated claims about presidential candidates, particularly pro-Trump and anti-Clinton stories.
  • Social media distribution was primarily through Facebook pages that aggregated large audiences by posting a mix of real political news and fabricated stories.

Individual operators reported varying levels of revenue. Silverman's reporting quoted one young Macedonian describing earning approximately $5,000 per month — more than ten times the average Macedonian wage. Other operators described more modest but still substantial earnings of $1,000-$3,000 per month.

Content Strategy: Following the Revenue

Perhaps the most revealing feature of the Macedonian operations is what determined content strategy: revenue data.

When Silverman interviewed Macedonian operators, they described making explicit content decisions based on what generated the most Facebook shares and, consequently, the most advertising revenue. Pro-Clinton content had been tried but generated less engagement and less revenue than pro-Trump content. Anti-Trump content also existed but generated less engagement than anti-Clinton content in the target audience (American conservatives and Trump supporters).

This content strategy was not ideological — it was algorithmic. The operators tested different types of content, measured engagement, and optimized toward whatever generated the most shares. In the American political landscape of mid-2016, pro-Trump content turned out to be the most profitable niche, and that is what the Veles operations produced.

One operator quoted in Silverman's reporting explicitly described having "no interest in US politics" and viewing the operations as a business rather than political advocacy. The content was a means to an end — generating advertising revenue.

The Role of Facebook

Facebook's distribution infrastructure was essential to the Macedonian operations' business model. Without Facebook's sharing mechanism, their websites would have had no effective way to drive large volumes of American traffic.

The mechanism worked as follows:

  1. Operators created or purchased Facebook pages with large followings of American conservatives and Trump supporters, built by posting real (engaging) political news.
  2. Fake news stories were mixed into the genuine content stream, appearing to come from the same trusted page as legitimate news.
  3. When users engaged with (liked, shared, commented on) fake stories, Facebook's algorithm interpreted this engagement as a signal that the content was valuable and showed it to more of the page's followers and their networks.
  4. High-sharing fake stories drove large volumes of traffic to the Macedonian websites.
  5. Advertising revenue flowed back to the operators.

Facebook's algorithm did not distinguish between true and false content — both generated identical engagement signals. This was not a design flaw in the traditional sense; the algorithm was working as designed, maximizing content engagement. The problem was that the design did not include accuracy as a ranking criterion.
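The engagement-only amplification loop described above can be illustrated with a toy simulation. All numbers here are invented for illustration; the point is structural: a ranker that consults only engagement amplifies a high-engagement false story exactly as it would a true one.

```python
# Toy simulation of an engagement-only distribution loop. The "ranker"
# never consults the "true" field — accuracy is simply not a signal.
# Engagement rates, audience size, and amplification factor are illustrative.

stories = [
    {"title": "accurate report",  "true": True,  "engagement_rate": 0.02},
    {"title": "fabricated claim", "true": False, "engagement_rate": 0.08},
]

def distribute(stories, audience=10_000, rounds=3, amplification=5):
    """Each round, every engagement exposes the story to more users."""
    reach = {s["title"]: float(audience) for s in stories}
    for _ in range(rounds):
        for s in stories:
            engagements = reach[s["title"]] * s["engagement_rate"]
            reach[s["title"]] += engagements * amplification
    return reach

reach = distribute(stories)
```

Because the fabricated story's higher engagement rate compounds each round, it ends with roughly twice the reach of the accurate one — an outcome the ranker produced without ever evaluating truth.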


The Economic Analysis: Revenue, Costs, and Profit

Revenue Calculation

Working through the economics:

A moderate-scale Macedonian operation producing 20-30 articles per day could attract 2-4 million monthly page views if content achieved regular viral sharing on Facebook.

At a CPM of $3-5 (conservative for American political audiences):

  • 3,000,000 page views × 2 ad impressions per view = 6,000,000 impressions
  • 6,000,000 / 1,000 × $4 CPM = $24,000/month gross
  • Website owner's share under AdSense (68%): ~$16,320/month

In the Macedonian wage context:

  • Average monthly wage in North Macedonia (2016): ~$380
  • A monthly profit of ~$16,000 represents approximately 42 average monthly wages
  • Annual revenue: ~$195,000, equivalent to roughly 42 years of average Macedonian wages

These calculations illustrate why the operations attracted so many operators: the profit-to-labor ratio was extraordinary.
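The back-of-the-envelope figures above can be reproduced directly. Every input below comes from the text's own stated assumptions (mid-range page views, $4 CPM, two impressions per view, 68% publisher share, ~$380 average wage); the code only makes the arithmetic explicit.

```python
# Reproducing the revenue calculation from the text's stated assumptions.

page_views = 3_000_000          # monthly page views (mid-range estimate)
ads_per_view = 2                # ad impressions served per page view
cpm = 4.0                       # dollars per 1,000 impressions
publisher_share = 0.68          # AdSense share paid to the site owner
avg_monthly_wage = 380          # approx. Macedonian monthly wage, 2016

impressions = page_views * ads_per_view       # 6,000,000
gross = impressions / 1000 * cpm              # $24,000/month gross
net = gross * publisher_share                 # ~$16,320/month to the operator
wage_multiple = net / avg_monthly_wage        # ~42-43 average monthly wages
annual = net * 12                             # ~$196,000/year
```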

Cost Structure

Costs for a typical operation were minimal:

  • Website hosting: $5-20/month on shared hosting
  • Domain registration: $10-15/year
  • Labor: operators' own time (at near-zero opportunity cost for students)
  • Content production: essentially free (plagiarism or simple fabrication)
  • Distribution: free (Facebook pages)
  • Advertising management: automated through AdSense (no cost)

The entire revenue stream was essentially pure profit after negligible startup costs.
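Summing the cost items above against the ~$16,320/month net revenue derived in the Revenue Calculation section makes the margin concrete. The specific cost figures are mid-range picks from the text's stated ranges, amortized monthly.

```python
# Illustrative monthly profit margin, using mid-range costs from the text.

monthly_costs = {
    "hosting": 12,        # mid-range of the $5-20/month shared hosting estimate
    "domains": 1,         # ~$12/year domain registration, amortized monthly
    "content": 0,         # plagiarized or fabricated, so effectively free
    "distribution": 0,    # Facebook pages cost nothing to post to
}
net_revenue = 16_320      # monthly net AdSense revenue from the text's example

profit = net_revenue - sum(monthly_costs.values())
margin = profit / net_revenue   # profit margin above 99.9%
```

A margin this close to 1.0 is what "essentially pure profit" means in practice: the only real input was the operators' time.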

The Advertising Arbitrage in Practice

The advertising arbitrage is particularly stark in this case:

The New York Times in 2016 employed approximately 1,300 journalists and editorial staff, maintained bureaus in 30 countries, and invested hundreds of millions annually in journalism. Its digital advertising generated revenue on the order of $2-3 per thousand page views in CPM-equivalent terms.

A Macedonian fake news site produced zero original journalism, employed no reporters or editors, and generated on the order of $2-4 per thousand page views in CPM-equivalent terms: similar revenue per page view, at essentially zero production cost.

The arbitrage profit — the gap between the production cost of legitimate journalism and the equivalent production cost of fake news — was captured by the Macedonian operators at the expense of the overall information ecosystem.


What the Evidence Shows About Electoral Impact

The Reach of Macedonian Fake News

BuzzFeed News's analysis found that the top 20 fake election news stories (many from Macedonian or similar operations) generated more total Facebook engagement in the three months before the election than the top 20 stories from major mainstream news outlets. Individual fake stories generated hundreds of thousands or millions of Facebook shares.

The most viral individual fake story — claiming that Pope Francis had endorsed Donald Trump — was estimated to have generated approximately 960,000 Facebook engagements (shares, reactions, and comments). Other high-viral stories included false claims about the FBI investigation of Hillary Clinton and fabricated quotes attributed to politicians.

Why Impact Is Difficult to Establish

Despite this reach, establishing a causal link between Macedonian fake news consumption and vote choices is methodologically very difficult:

Self-selection: As established by Guess et al. (2019), the users most likely to consume fake news were already heavily partisan Trump supporters who had low probability of voting otherwise. The people most exposed to pro-Trump fake news were the least likely to need persuading.

Limited persuasion evidence: Research on political persuasion consistently finds that strong partisan identifiers are highly resistant to attitude change from political content. Allcott and Gentzkow (2017) estimated that fake news would need to be approximately 36 times as persuasive as a 30-second television advertisement to have determined the election outcome.

Counterfactual difficulty: We cannot know what the information environment would have looked like without Macedonian fake news — other forms of misinformation, partisan media, and organic false information would still have been present.

Activation vs. persuasion: If fake news had any electoral effect, it may have operated through increased turnout (activating existing Trump supporters) rather than through persuasion — a different mechanism that would require different evidence.


Platform Responses and Their Limits

Google's Response

Following media coverage, Google announced in November 2016 that it would update its AdSense policies to prohibit fake news sites from the program. In practice, this was difficult to implement: defining "fake news" in a policy context that could be automated was challenging, and determined operators could create new websites faster than automated systems could identify and remove them.

Google's subsequent efforts included improved content quality signals for AdSense participation, improved detection of spam and low-quality content, and expanded human review teams. The effectiveness of these measures was difficult to assess publicly, as Google did not release detailed data on the program.

Facebook's Response

Facebook implemented several responses to the fake news problem following the 2016 election:

  • Partnership with independent fact-checking organizations to label disputed content.
  • Algorithmic changes to reduce the distribution of "repeat offenders" — pages that repeatedly share false content.
  • Demoting links that generate "link baiting" behavior (high shares relative to likes/comments on the original link).
  • Reducing the algorithmic advantage of sensational and inflammatory content.

Research on the effectiveness of Facebook's fact-checking labels found modest effects on individual story sharing. Research on the algorithmic changes found significant reductions in viral news content sharing overall, though not specifically in the false-to-true content ratio.


Lessons and Implications

The Structural Lesson

The Macedonian case most clearly teaches that the monetization of misinformation does not require ideological motivation, foreign government direction, or organizational sophistication. It requires only:

  • A digital advertising ecosystem willing to monetize audience attention at scale without content review
  • A social media distribution mechanism that spreads emotionally engaging content without accuracy filtering
  • Low-cost labor and infrastructure in any country with internet access
  • Content that generates strong engagement responses among a target audience

These conditions exist globally and persistently. The Macedonian case is a symptom of systemic conditions rather than an isolated anomaly.

The Policy Lesson

The case suggests that addressing misinformation as an advertising revenue problem requires structural changes to the advertising ecosystem — not just content moderation improvements. Specifically:

  • Advertiser liability frameworks that make advertising networks accountable for content quality on websites where they place ads
  • Transparency requirements enabling independent verification of where advertising dollars flow
  • Quality review requirements for AdSense or equivalent programs — not just content policy enforcement but positive quality assessment

The Media Literacy Lesson

The Macedonian case is a powerful teaching tool because it demonstrates that misinformation need not be ideologically motivated to be harmful. This challenges the intuitive assumption that misinformation is primarily produced by people who believe what they write. It suggests that media literacy education needs to address the economic infrastructure of misinformation — teaching audiences to ask "who benefits financially from this content?" alongside "is this content accurate?"


Discussion Questions

  1. The Macedonian operators were motivated by profit rather than by any political conviction. Does this change your moral assessment of their actions? Is profit-motivated misinformation more or less problematic than ideologically motivated misinformation?

  2. Google AdSense did not review content before accepting websites into its program. Given what you now know about the consequences, should advertising networks bear legal responsibility for the content on which they place advertisements? What are the practical and legal challenges of implementing such responsibility?

  3. The Macedonian case demonstrates that misinformation production can be outsourced to low-wage countries, exploiting both global wage differentials and the CPM premiums of affluent-country audiences. What does this imply for the effectiveness of national-level content moderation approaches?

  4. If the electoral impact of Macedonian fake news was likely small (as Allcott and Gentzkow's analysis suggests), does that make the case less important as a policy concern? Or are there harms from misinformation that persist even when electoral impact is limited?

  5. The case study notes that pro-Clinton fake news generated less engagement and revenue than pro-Trump fake news in 2016. How would you interpret this asymmetry? Does it suggest that Trump supporters are more susceptible to misinformation, or does it reflect something else about the political context?


Further Reading

  • Silverman, C., & Alexander, L. (2016, November 3). How teens in the Balkans are duping Trump supporters with fake news. BuzzFeed News.
  • Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236.
  • Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5(1).
  • European Regulators Group for Audiovisual Media Services. (2021). Report on Disinformation: Assessment of the Implementation of the Code of Practice.