Case Study 35.2: WhatsApp and Election Misinformation in Brazil (2018 and 2022)
Background
Brazil is one of the world's most populous countries, with approximately 215 million people. It is also one of the world's most active social media markets: as of 2018, Brazil had one of the largest WhatsApp user bases of any country, second only to India, with approximately 120 million users. For many Brazilians, WhatsApp — not Facebook, not Twitter, not news websites — was the primary medium for daily communication, news sharing, and community organization.
WhatsApp's character in Brazil was shaped by specific local conditions. Brazilian communication culture is highly social and group-oriented; WhatsApp's group functionality — allowing up to 256 members to share messages simultaneously — fit naturally with how Brazilians already communicated. The spread of low-cost smartphones and affordable mobile data plans in the early 2010s had made WhatsApp accessible to a broad cross-section of the population, including many who had never had home internet access. Religious communities, neighborhood associations, family networks, professional groups, and political communities organized through WhatsApp.
This infrastructure, in which WhatsApp functioned as the essential medium of Brazilian civil society, became the foundation for an unprecedented experiment in digital political manipulation during the 2018 presidential election. The combination of WhatsApp's ubiquity, its end-to-end encryption (which prevented monitoring), its group structure (which facilitated coordinated distribution), and Brazil's deep political polarization created conditions in which political misinformation could spread at industrial scale through a medium that was virtually impossible to monitor, moderate, or counter in real time.
Timeline
2016: The impeachment of President Dilma Rousseff and the subsequent political crisis polarize Brazil intensely. WhatsApp becomes a major medium for political communication and the spread of political content, including misinformation, about the crisis.
2017: Former army captain Jair Bolsonaro, a right-wing politician with a consistent presence on social media, begins building his WhatsApp presence aggressively. His communication style — direct, provocative, anti-establishment — translates effectively into WhatsApp content that people forward. His political operation develops early sophistication in WhatsApp distribution.
July 2018: The Brazilian fact-checking organization Agência Lupa and the newspaper Folha de S.Paulo begin documenting the scale of political misinformation spreading through WhatsApp in the lead-up to the October election. Initial findings show false stories about the Workers' Party (PT) circulating at scale, including the persistent false narrative that the PT had introduced a "gay kit" of sex-education materials into kindergartens and schools, and false claims about PT corruption. (Fernando Haddad would formally replace the barred Lula da Silva as the PT candidate only in September.)
August 2018: Folha de S.Paulo reports on what it describes as a "mass misinformation machine" — businesses selling WhatsApp group lists and bulk message distribution services to political campaigns and their supporters. The scale of the operation described is extraordinary: individual contractors claim to be able to distribute messages to millions of WhatsApp group members. The Bolsonaro campaign denies systematic use of these services.
September 2018: Jair Bolsonaro is stabbed at a campaign rally, generating enormous public attention and sympathy. WhatsApp becomes a primary channel for information and misinformation about the attack and his recovery. The incident, and the WhatsApp traffic it generates, is seen by analysts as significant in consolidating Bolsonaro's image as a martyr and outsider.
October 7, 2018: First-round presidential election. Bolsonaro receives 46% of the vote, far more than polls had predicted, while Haddad receives 29.3%. The unexpected margin triggers intensive analysis of WhatsApp's role. Researchers at Brazilian universities and independent monitoring projects begin systematic analysis of misinformation spread through WhatsApp during the campaign.
October 12-18, 2018: Between the first and second round, WhatsApp misinformation intensifies. Analysis by Brazilian fact-checkers, using a sample of the most-forwarded content in monitored WhatsApp groups, finds that a significant proportion of the most-distributed political content is false or misleading. Content about Haddad includes fabricated videos, out-of-context images, and false statements attributed to him.
October 28, 2018: Second-round election. Bolsonaro wins with 55.1% of the vote, defeating Haddad. International and domestic observers widely describe the election as the first "WhatsApp election" — an election in which an encrypted messaging platform played a decisive role in information distribution, with minimal ability for real-time monitoring or counter-misinformation.
November 2018 — 2019: Academic research on the 2018 Brazilian election WhatsApp dynamics is published. Studies by researchers at the University of São Paulo, at the Oxford Internet Institute's Brazilian research program, and at multiple other institutions document the scale of misinformation distribution, the evidence of coordination, and the challenges of attribution and accountability under WhatsApp's encryption model.
2019 — 2021: The Brazilian Electoral Court (TSE) and federal prosecutors investigate the WhatsApp campaign. Investigations focus on whether the coordinated distribution of campaign content through purchased WhatsApp lists constitutes illegal campaign finance (the purchase of bulk messaging services could constitute undisclosed campaign expenditure). The investigations produce some enforcement actions but do not result in comprehensive accountability.
2021: Bolsonaro, having served as president since 2019, begins using WhatsApp to spread misinformation about the 2022 election well before it takes place — claiming that the voting machine system is rigged, a narrative for which no evidence exists but which is distributed at scale through the same networks developed in 2018.
2022, January — October: The pre-election period sees systematic effort to undermine confidence in Brazil's electronic voting system through WhatsApp. Bolsonaro and allies distribute content claiming that only paper ballots can prevent fraud. Independent experts and the TSE repeatedly and publicly debunk these claims. The misinformation campaign is documented as explicitly pre-positioning supporters to reject an electoral loss.
October 2, 2022: First round of the 2022 presidential election. Lula da Silva (PT) receives 48.4%; Bolsonaro receives 43.2%. No candidate reaches the 50% threshold required for first-round victory.
October 30, 2022: Second round. Lula wins with 50.9% of the vote — the closest presidential election in Brazilian history. Bolsonaro does not concede. His WhatsApp networks immediately distribute content framing the election as stolen.
January 8, 2023: Bolsonaro supporters storm and vandalize the Presidential Palace, Congress, and the Supreme Court in Brasília — widely compared to the January 6, 2021 U.S. Capitol attack, and directly fueled by the "stolen election" narrative distributed through WhatsApp for over a year.
2023: Brazilian authorities investigate the January 8 attacks. The TSE and legislature develop new regulatory frameworks for digital political communication. Meta (Facebook/WhatsApp) faces significant political pressure from the Lula government and the TSE regarding platform governance.
Analysis
The WhatsApp Architecture and Its Political Consequences
WhatsApp's architectural choices — end-to-end encryption, group messaging, forwarding functionality — were designed with legitimate purposes: protecting private communication from surveillance, enabling group coordination, and allowing easy sharing. In the Brazilian political context, these features combined to create a misinformation distribution infrastructure that was effectively invisible to platforms, regulators, and fact-checkers until its effects were measured after the fact.
Encryption as an accountability barrier: WhatsApp's end-to-end encryption means that Meta (WhatsApp's parent company) cannot see message content. This genuine privacy protection also means that coordinated misinformation campaigns distributing identical or near-identical false content to thousands of groups simultaneously are invisible to the platform until independently researched. The platform's anti-manipulation policies cannot be applied to content the platform cannot see.
Groups as distribution networks: WhatsApp's group functionality, with groups of up to 256 members, allows a single message to reach thousands of people in parallel, bypassing the "social spread" dynamic that limits organic content distribution on open social media. When a political operation creates or infiltrates thousands of WhatsApp groups and distributes content to them simultaneously, the distribution pattern resembles traditional broadcast media more than social media — except it is completely unregulated and entirely invisible.
Forwarding as viral amplification: The forward button allows content to move rapidly from group to group, person to person, without losing the appearance of personal endorsement that makes social media sharing persuasive. When a family member or trusted community leader forwards content, the implicit trust signal of personal endorsement accompanies it. The recipient cannot easily distinguish between content their contact genuinely believes and content that arrived in their contact's inbox through a coordinated political operation.
The role of trusted intermediaries: Research on WhatsApp misinformation in Brazil consistently emphasizes the importance of trusted social intermediaries — religious leaders, community figures, family matriarchs and patriarchs — who receive content through political distribution networks and forward it to their own networks with implicit personal endorsement. The misinformation travels through trusted relationship networks in ways that are both more persuasive and more difficult to counter than content encountered through impersonal algorithmic feeds.
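The broadcast and forwarding mechanics described above can be put in rough numbers. The sketch below uses purely illustrative assumptions (the group count, average membership, forwarding probability, and fan-out are invented for the example, not measurements from the Brazilian campaigns) to show why seeding thousands of groups in parallel dwarfs what person-to-person forwarding alone achieves.

```python
# Back-of-the-envelope model of the two distribution mechanisms:
# parallel group broadcast, then person-to-person forwarding.
# All parameter values are illustrative assumptions.

GROUP_CAP = 256  # WhatsApp's group size limit in the 2018 era


def broadcast_reach(num_groups: int, avg_members: int = 200) -> int:
    """Recipients reached in one pass when the same message is posted
    to num_groups groups simultaneously (ignoring membership overlap)."""
    return num_groups * min(avg_members, GROUP_CAP)


def forwarding_reach(seed: int, p_forward: float, fan_out: int, hops: int) -> int:
    """Expected cumulative audience after `hops` forwarding rounds,
    assuming each recipient forwards with probability p_forward to
    fan_out contacts."""
    total, wave = seed, float(seed)
    for _ in range(hops):
        wave *= p_forward * fan_out  # expected size of the next wave
        total += int(wave)
    return total


seed = broadcast_reach(num_groups=1_000)  # 1,000 groups -> 200,000 recipients
print(seed)
print(forwarding_reach(seed, p_forward=0.05, fan_out=8, hops=3))
```

With these assumed parameters the per-hop growth factor is 0.05 × 8 = 0.4, so forwarding alone decays rather than going viral; sustained reach depends on continually re-seeding groups, which is exactly what coordinated broadcast operations provide.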
The Industrial-Scale Distribution Problem
The 2018 Folha de S.Paulo reporting on bulk WhatsApp distribution services documented a dimension of the Brazilian WhatsApp election that distinguished it from organic political misinformation: the explicit industrialization of distribution. If political operations were purchasing WhatsApp group lists and bulk message sending services at the scale reported, the misinformation was not primarily spread organically (person to person, based on genuine engagement) but through coordinated mass distribution that mimicked organic spread.
This distinction matters for both legal and analytical reasons. If misinformation was distributed through coordinated campaign operations using purchased services, it may constitute illegal campaign expenditure under Brazilian electoral law. It also means that engagement patterns on WhatsApp misinformation do not reflect organic audience interest but coordinated distribution — undermining any inference that the content reflected what Brazilians genuinely wanted to share and discuss.
Subsequent research has been careful to distinguish between two phenomena: genuine organic spread of political content through social networks (people sharing content they believe and find important) and coordinated inauthentic distribution (content appearing to spread organically but actually being mass-distributed through coordinated operations). The 2018 Brazilian case likely involved both, in proportions that are still being debated.
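One reason researchers could document coordination despite encryption is that many Brazilian political WhatsApp groups were publicly joinable, so their traffic could be collected and compared. The following is a minimal sketch of that kind of near-duplicate analysis; the normalization rule, the three-group threshold, and the sample messages are assumptions for illustration, not any specific research team's method.

```python
# Sketch: flag candidate coordinated distribution in monitored public
# groups by finding near-identical messages posted across many groups.
import hashlib
import re
from collections import defaultdict


def fingerprint(text: str) -> str:
    """Collapse case, punctuation, and whitespace so trivially edited
    copies of the same message map to the same hash."""
    normalized = re.sub(r"[^\w]+", " ", text.lower()).strip()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def flag_coordinated(messages, min_groups: int = 3):
    """messages: iterable of (group_id, text) pairs. Returns one
    representative text per fingerprint seen in at least min_groups
    distinct groups."""
    groups_by_fp = defaultdict(set)
    texts_by_fp = {}
    for group_id, text in messages:
        fp = fingerprint(text)
        groups_by_fp[fp].add(group_id)
        texts_by_fp[fp] = text
    return {texts_by_fp[fp] for fp, gs in groups_by_fp.items()
            if len(gs) >= min_groups}


# Example: the same appeal posted to three groups with cosmetic edits.
sample = [
    ("group_a", "Urgente!! Compartilhe com todos."),
    ("group_b", "urgente compartilhe com todos"),
    ("group_c", "URGENTE, compartilhe com todos!"),
    ("group_a", "Bom dia, família!"),
]
print(flag_coordinated(sample))
```

Flagging is only a signal, not proof: genuinely popular organic content also crosses many groups, which is precisely the ambiguity between organic spread and coordinated inauthentic distribution described above.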
Regulatory Response and Its Limits
Brazil's regulatory response to the WhatsApp election was more active than the regulatory responses in most countries that have experienced similar dynamics. The TSE developed partnerships with platforms for election monitoring; legislation targeted the purchase of bulk messaging services; fact-checking initiatives were established with platform cooperation; and media literacy campaigns were funded.
These responses improved the accountability environment for the 2022 election without fundamentally resolving the structural problem. WhatsApp's architecture — encryption, groups, forwarding — remained unchanged because changing it would require fundamentally altering the platform's privacy model. The coordinated distribution operations identified in 2018 continued in more sophisticated forms in 2022. The "stolen election" narrative that preceded the 2022 vote could not be effectively pre-bunked because it existed primarily within encrypted channels that fact-checkers could not reach.
The Brazilian case suggests a fundamental tension between privacy-protective encryption and accountability for coordinated political manipulation. There is no simple resolution: weakening encryption to allow monitoring creates privacy risks with serious human rights implications in authoritarian contexts; maintaining encryption while strengthening external accountability mechanisms provides inadequate protection against coordinated inauthentic distribution.
The January 8 Connection
The storming of the Brazilian seat of government on January 8, 2023, directly followed from the WhatsApp misinformation campaign that had been building since 2021. The "stolen election" narrative — which had been distributed through WhatsApp to millions of Brazilians for over a year — had been explicitly designed to produce non-acceptance of electoral loss. When the loss occurred, the prepared narrative was available to mobilize supporters.
The January 8 events illustrate the delayed costs of allowing political misinformation campaigns to operate at scale in encrypted media: the consequences of campaigns that cannot be monitored or addressed in real time may only become visible when they materialize in offline violence, and by then the causal chain is difficult to reconstruct, accountability is difficult to assign, and the damage is already done.
Discussion Questions
- WhatsApp's end-to-end encryption is a genuine privacy protection that benefits users in many contexts, including authoritarian contexts where political dissent is persecuted. How should policymakers weigh the benefits of encryption against the documented costs of encrypted platforms for political misinformation? Is there a middle ground between full encryption and full monitoring?
- The "trusted intermediary" dynamic — misinformation traveling through trusted personal relationships — is particularly powerful because personal endorsement is persuasive and difficult to counter. What counter-misinformation strategies would be effective against misinformation that arrives through trusted social relationships?
- The January 8, 2023 attack in Brasília and the January 6, 2021 attack in Washington both followed from sustained social media misinformation campaigns about stolen elections. What does this temporal pattern — months or years of misinformation leading to specific political violence — suggest about the timeframe of accountability that platform governance needs to operate on?
- Brazil's regulatory response was more active than most countries' responses to platform-mediated election misinformation. Evaluate its effectiveness. What would a more adequate regulatory response have looked like?
What This Means for Users
The Brazil WhatsApp case provides practical lessons for navigating political communication in encrypted messaging environments:
Message forwarding is not an endorsement of accuracy. When content arrives through trusted social relationships, the trust you have in the relationship can inappropriately transfer to the content. Political content that arrives through family or community WhatsApp groups deserves the same skepticism as political content from any other source — perhaps more, if you understand that coordinated distribution operations specifically target trusted community networks.
The "forwarded many times" label matters. WhatsApp marks messages that have passed through a chain of five or more chats with a "Forwarded many times" label and a double-arrow icon. This signals that content has traveled through many hands before reaching you — and research shows that heavily forwarded content is more likely to be misinformation than content shared from first-hand knowledge. The label is a genuine risk signal.
Verify before sharing. "Think before you share" advice is particularly important in encrypted messaging environments, where corrections are unlikely to reach the people who received the original false content. The correction cannot follow the misinformation through the same channels; the only way to prevent misinformation from spreading further is not to share it in the first place. A quick search of a claim on a fact-checking site before forwarding is often enough to catch it.
Coordinated campaigns target trusted networks. Political operations specifically target trusted community leaders — pastors, community organizers, family elders — because content that arrives through them carries implicit endorsement. Being aware that your trusted community networks may themselves have been targeted for distribution of politically motivated content is a form of structural awareness that protects against the specific vulnerability that encrypted political misinformation exploits.