Case Study 02: Wikipedia at 25: The Platform That Refused to Manipulate You

An Anomaly Worth Examining

Type any factual question into a search engine and Wikipedia is likely to appear in the top results. For hundreds of millions of people, it is the first stop for information about anything — historical events, scientific concepts, biographical subjects, current affairs. It is available in 332 languages. Its English edition alone contains over 6.7 million articles. Approximately 1.8 billion unique devices access it every month. By every measure of reach and influence, Wikipedia is one of the most significant information resources in human history.

Wikipedia turned twenty-five years old in January 2026. It was founded in January 2001 by Jimmy Wales and Larry Sanger, initially as a complement to a more traditionally edited online encyclopedia called Nupedia. Within a few years, the volunteer-edited Wikipedia had dwarfed its curated predecessor, and the model of open, community-governed collaborative knowledge had proven itself at a scale that no one in 2001 had anticipated.

What makes Wikipedia remarkable in the context of this chapter is not primarily its scale, though the scale matters. It is the specific ways in which Wikipedia has refused to become what the attention economy would have made it.

Wikipedia has no advertising. It has no algorithmic feed. It does not track your reading behavior to build a behavioral profile. It does not send you notifications. It has no engagement optimization — no mechanisms designed to keep you on the platform longer, to pull you back when you leave, or to exploit psychological vulnerabilities to drive more page views. Its interface is, by the standards of contemporary digital design, almost aggressively plain.

And it is the fifth most-visited website in the world.

This is not an accident. It is a series of deliberate choices, made and repeatedly reaffirmed over twenty-five years, about what Wikipedia is for and how that purpose should shape its design.

The Structural Choice That Made Everything Else Possible

Wikipedia's refusal to run advertising is the foundational decision that makes all its other design choices possible. To understand why, consider the counterfactual.

If Wikipedia ran advertising — specifically, the behavioral surveillance advertising that dominates the contemporary digital landscape — it would need to collect behavioral data on its users. It would need to optimize for time-on-page and engagement metrics, because those are what advertising buyers pay for. The algorithms that determined what content appeared most prominently would need to prioritize content that kept users engaged, which in Wikipedia's context would mean controversial topics, emotionally resonant narratives, and content that provoked social sharing. The feedback loops that make content appear more prominently when it generates engagement would create incentives to write attention-grabbing rather than accurate articles. The design of the platform would shift toward keeping users on the site rather than helping them find what they came for and leave.

Wikipedia's co-founder Jimmy Wales made the decision not to run advertising in 2002, in Wikipedia's second year. The decision was controversial at the time — advertising seemed like the obvious path to financial sustainability — but Wales and the nascent Wikimedia community rejected it on the grounds that advertising would compromise Wikipedia's neutrality and introduce commercial incentives that were incompatible with the project's goal of creating free, unbiased knowledge.

This was not a naively idealistic decision. It was a structurally sophisticated one. Wales and others understood that the revenue model would shape the product. If Wikipedia's revenue depended on advertisers, Wikipedia's product decisions would eventually be shaped by what advertisers wanted — more engagement, more time-on-site, more granular user profiles. The only way to prevent this was to choose a different revenue model before the advertising path became locked in.

The Wikimedia Foundation, the nonprofit organization that owns and operates Wikipedia and its sister projects, was established in 2003 to hold Wikipedia's assets in a structure that would protect it from commercial acquisition. This structural choice — placing Wikipedia inside a nonprofit that cannot be bought by a for-profit company and cannot distribute profits to shareholders — is what has allowed all of Wikipedia's subsequent design choices to be made on the basis of mission rather than monetization.

How Wikipedia Actually Funds Itself

The question that inevitably follows the "no advertising" decision is: then how does it pay the bills?

The answer is donations — specifically, small donations from large numbers of individual readers. Several times a year, Wikipedia displays donation banners at the top of its pages, appealing to readers who find Wikipedia useful to contribute. The banners are, by most accounts, slightly annoying. They are also remarkably effective.

In its 2022-2023 fiscal year, the Wikimedia Foundation raised approximately $180 million from over 8 million individual donors. The median donation was approximately $13. This distribution reflects the power of a genuine user community: millions of people contributing small amounts because they genuinely value what Wikipedia provides.

For context, $180 million is a substantial sum for a nonprofit but tiny compared to the revenues of major technology companies. Facebook's parent Meta Platforms generated approximately $117 billion in revenue in 2022. YouTube alone generated approximately $29 billion in advertising revenue in 2022. Wikipedia operates one of the world's most important information resources on roughly 0.6% of YouTube's advertising revenue.
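The ratios above are easy to verify. A quick sanity check, using the approximate revenue figures quoted in this section (the exact figures vary slightly by reporting source):

```python
# Approximate revenue figures quoted in the text (USD).
wikimedia_revenue = 180e6   # Wikimedia Foundation, FY 2022-23
youtube_ad_revenue = 29e9   # YouTube advertising revenue, 2022
meta_revenue = 117e9        # Meta Platforms total revenue, 2022

# Wikipedia's budget as a fraction of each comparison figure.
ratio_vs_youtube = wikimedia_revenue / youtube_ad_revenue
ratio_vs_meta = wikimedia_revenue / meta_revenue

print(f"Wikimedia / YouTube ads: {ratio_vs_youtube:.1%}")  # roughly 0.6%
print(f"Wikimedia / Meta total:  {ratio_vs_meta:.2%}")     # roughly 0.15%
```

The "roughly 0.6% of YouTube's advertising revenue" figure in the text checks out; against Meta's total revenue the fraction is smaller still.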

This funding model has important implications for Wikipedia's design. Because the Foundation's revenue comes from user donations — because users are the customer rather than the product — the Foundation's incentives are aligned with user value rather than advertiser value. A Wikipedia that users found annoying or untrustworthy or biased would generate fewer donations. The donation model creates a direct feedback loop between user satisfaction and organizational sustainability.

It also creates constraints. The Wikimedia Foundation employs approximately 700 people and has faced criticism that its spending on overhead — staff, conferences, technical infrastructure — is disproportionate relative to the sums that reach the Wikipedia project directly. There have been ongoing governance disputes between the Foundation and the volunteer editor community about who controls decisions over Wikipedia's development. These are real tensions, and we will return to them.

Community Governance and the Absence of Algorithmic Curation

The second structural feature that distinguishes Wikipedia from attention economy platforms is its governance model. Wikipedia's content is created and moderated by a community of volunteer editors — as of 2024, approximately 45,000 people contribute to the English edition in any given month, with a much smaller core of highly active editors who do the majority of the work.

This community-governed model has direct design implications. There is no editorial algorithm determining which articles appear most prominently. There is no engagement-optimization system rewarding articles that generate more clicks. There is no behavioral targeting system delivering different content to different users based on their profiles. Wikipedia's front page — the "Main Page" in the English edition — features "Did You Know," "On This Day," and "Featured Article" sections whose contents are selected by editors according to quality criteria, not engagement prediction. The selection criteria are public, disputed, and revised through community discussion.

The implications of this for information quality are significant. To the extent that any mechanism determines content visibility on Wikipedia, it does so through processes that are transparent, publicly documented, and subject to community oversight. The criteria are "is this article accurate and well-sourced" rather than "does this article generate outrage and social sharing." This distinction is not trivial. The attention economy's algorithmic curation has demonstrably amplified misinformation because misinformation is emotionally engaging and engagement drives algorithmic promotion. Wikipedia's community curation, imperfect as it is, is not subject to this feedback loop.

Wikipedia's "neutral point of view" (NPOV) policy is a formal written commitment to representing all significant viewpoints on a topic fairly, without editorial advocacy. The policy is enforced through community discussion and, in contested cases, through formal arbitration processes. It is frequently contested, sometimes subverted, and occasionally produces articles that satisfy no one because they are trying to balance legitimately incompatible perspectives. But as a structural commitment — embedded in the platform's governance documents and enforced by a community with reputational investment in the policy's integrity — it represents something qualitatively different from a platform whose algorithmic feed prioritizes the most emotionally engaging version of contested claims.

What Wikipedia Sacrifices

Any honest account of Wikipedia as a case study in non-manipulative design must acknowledge what the model sacrifices.

Editor demographics. Wikipedia's volunteer editor base is heavily skewed: estimated at approximately 85-90% male, drawn predominantly from wealthy English-speaking countries, with significant underrepresentation of the Global South, women, and marginalized communities. This demographic imbalance produces content gaps — articles on topics of primary interest to non-Western populations, women's history, and marginalized communities are systematically less developed than articles on topics of primary interest to English-speaking Western men. Multiple initiatives have attempted to address this gap with mixed results.

The demographic skew is not incidental to the model. Volunteer editor communities naturally reflect the demographics of people who have the time, internet access, and cultural confidence to contribute unpaid labor to a collaborative project. Addressing this requires either paying editors (which changes the model) or sustained, expensive outreach efforts that have not yet produced demographic parity.

Coverage gaps. The volunteer model means that coverage is uneven in ways that reflect editor interest rather than reader need. Articles on certain topics — major Western cities, popular scientific subjects, subjects of fan interest — are extensive and well-maintained. Articles on specialized topics, niche subjects, and rapidly changing current events may be inadequate, outdated, or absent. There is no mechanism for commissioning coverage of topics that need it; coverage exists only where volunteer editors choose to create it.

The reliability question. Wikipedia's reliability varies significantly by topic. Studies have found that Wikipedia's science articles compare reasonably well to established reference sources in accuracy; a landmark 2005 study in Nature found that the English Wikipedia's science articles contained an average of four errors each, compared to three in Encyclopaedia Britannica. But accuracy is harder to maintain in politically contested topics, in areas where determined bad actors coordinate to introduce bias, and in rapidly changing situations where information changes faster than the volunteer editor community can update.

The reliability limitation is often overstated by Wikipedia's critics and sometimes understated by its defenders. The honest assessment is that Wikipedia is highly reliable on many topics and significantly less reliable on others, and that users benefit from understanding the difference.

Speed. The community deliberation model is slow. Adding a new policy, redesigning a major feature, or responding to a fast-moving crisis requires consultation with a volunteer community that is distributed across time zones and has no formal hierarchy that can make binding decisions quickly. This is not always a disadvantage — slow deliberation produces more considered outcomes than fast algorithmic decisions in many cases. But it means Wikipedia cannot respond to emerging information at the speed of algorithmically optimized platforms.

What Wikipedia Proves

These limitations noted, Wikipedia's existence and sustained operation at top-ten global scale for twenty-five years demonstrates several things that are directly relevant to the argument of this chapter.

Large-scale platforms do not require advertising. This is the most straightforward thing Wikipedia demonstrates. Social media companies, when confronted with demands for design reform, frequently claim that advertising is the only viable revenue model for a large-scale platform. Wikipedia shows that claim to be false: it operates at a scale and level of public trust that most advertising-supported platforms can only aspire to, with revenue from voluntary reader donations.

Algorithmic feed optimization is not required to attract and retain users at scale. Wikipedia has no recommendation engine, no engagement-optimized feed, no algorithmic content curation. Users come to Wikipedia because they want specific information, find it, and leave. The average session length on Wikipedia is quite short — users find what they came for and go. This is not a failure of engagement strategy. It is evidence that genuine utility does not require manipulation to attract users.

Community governance, with all its dysfunction, produces better information quality than engagement-optimized algorithmic curation. Wikipedia's content moderation is slow, inconsistent, and demographically skewed. It is also substantially more accurate and less manipulative than the algorithmically curated content feeds of major social media platforms. This is not coincidental. When content quality rather than engagement is the optimization target — even imperfectly — the outcome is better information.

Non-manipulation is itself a form of user value. Wikipedia's users trust it in a way that users of advertising-supported platforms demonstrably do not. A 2022 Reuters Institute Digital News Report survey found Wikipedia among the most trusted information sources across multiple countries, consistently outranking social media platforms. This trust is not accidental. It is the product of twenty-five years of consistent commitment to accuracy, neutrality, and the absence of commercial manipulation. Trust is a form of value that advertising-supported platforms sacrifice in exchange for engagement metrics — and the long-term costs of that sacrifice are becoming increasingly visible.

The Question the Case Study Raises

Wikipedia's success does not answer the hardest question this chapter raises: can a social platform — one whose primary purpose is human communication and community rather than information retrieval — operate at scale without the engagement mechanics of the attention economy?

Wikipedia is, in an important sense, easier than Facebook or TikTok. Users come to Wikipedia to find information. They have a clear goal. The relationship between user and platform is transactional: I want to know something, Wikipedia tells me. The incentive to manipulate is present but weaker, because Wikipedia is not competing for the same kind of diffuse time-and-attention that social platforms seek.

A social platform is harder. The value it provides is inherently more open-ended — connection, entertainment, community, identity expression. These are things that are easier to manipulate than information retrieval, because there is no clear stopping condition, no "I found what I was looking for." The platforms that have most successfully captured social attention have done so by exploiting exactly this open-endedness, turning the absence of a clear stopping condition into an infinite scroll of stimulation.

But the question is not whether Wikipedia's exact model can be transposed onto a social platform. The question is what Wikipedia demonstrates about the assumptions of the attention economy. It demonstrates that scale does not require advertising. That governance by community rather than algorithm produces better outcomes on quality dimensions that matter to users. That trust — built through consistent non-manipulation over long time periods — is worth something that engagement metrics do not capture. That the money required to operate even a very large platform is, in principle, obtainable from users who genuinely value what the platform provides.

These are not complete answers. They are necessary components of an answer. And they have been demonstrated at scale, over twenty-five years, by a platform that began as a wild experiment and became, quietly, one of the most important institutions of the digital age.


Discussion questions for this case study appear in the exercises section. Students who wish to read more about Wikipedia's governance may consult the Further Reading list, which includes several works on the Wikimedia community and its challenges.