Case Study 14.2: Cookie Consent Dark Patterns — The Post-GDPR Manipulation Industry
Background
When the European Union's General Data Protection Regulation came into force on May 25, 2018, it was supposed to fundamentally change the relationship between internet users and the tracking technologies that follow them across the web. The GDPR's consent provisions required that users actively agree to the use of cookies and similar tracking technologies for purposes beyond strictly necessary site functioning. For the first time, European users had a legal right to say no to tracking — and platforms, advertisers, and publishers were legally required to obtain meaningful consent before deploying it.
What happened instead was one of the most thoroughgoing demonstrations of dark pattern design in internet history. Within months of GDPR's implementation, a cottage industry had emerged to help publishers and advertisers design Consent Management Platforms (CMPs) — the cookie banners and consent dialogs that appear on nearly every website in the EU — in ways that maximized the rate of user consent without actually providing users with meaningful choice. The result was a systematic perversion of the regulation's intent: a regime nominally requiring consent that produced consent rates exceeding 90 percent through design that made genuine non-consent practically impossible.
This case study examines how cookie consent dark patterns work, the research that documented their prevalence, the regulatory response, and what the episode reveals about the limits of consent-based regulation when platforms control the interface through which consent is exercised.
How Cookie Consent Banners Became Dark Pattern Laboratories
The core architecture of cookie consent dark patterns can be understood through a single design asymmetry: the path to "Accept All" is always shorter, more prominent, and more attractive than the path to "Reject All" or "Manage Preferences." This asymmetry is not accidental. It is the product of deliberate A/B testing, professional design expertise, and commercial incentives that reward maximum tracking consent rates.
The specific dark patterns deployed in cookie consent banners fall into identifiable categories, documented in academic research from MIT, Carnegie Mellon, and Oxford, and in industry analyses by the Norwegian Consumer Council and the Electronic Frontier Foundation.
The "Accept" Button vs. the "Manage" Labyrinth
The most fundamental cookie consent dark pattern involves visual hierarchy: the "Accept All" option is presented as a large, brightly colored button (typically green or the site's primary brand color), while rejecting tracking requires navigating to a "Manage Preferences," "Cookie Settings," or "Privacy Center" option, rendered in gray, smaller font, positioned away from the primary action zone of the interface.
Research by Mathur et al. (2019) analyzed 53,000 cookie consent notices and found that the "accept" button was more prominent (in color, size, or position) than the reject option in the vast majority of cases examined. The authors calculated that this asymmetry significantly increased consent rates compared to interfaces where both options were presented with equal visual weight.
Pre-Ticked Boxes and Reversed Defaults
The GDPR explicitly prohibits pre-ticked consent boxes: consent for non-essential data processing must be opt-in, not opt-out. But many early post-GDPR consent managers deployed a workaround: rather than a single "Accept All" button, they presented a list of consent categories with individual toggles, many pre-set to "on." Users who scrolled through and clicked "Save Preferences" without changing anything had technically "configured" their preferences — but had done so through a design that defaulted to maximum consent.
Variations on this pattern persisted for years after GDPR implementation. A 2019 study by the Norwegian Consumer Council found that pre-ticked boxes were among the three most commonly deployed dark patterns in cookie consent interfaces on major Norwegian websites, despite the GDPR's explicit prohibition.
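The mechanics of the reversed-default pattern can be sketched in a few lines of Python. This is an illustrative model, not the API of any real consent management platform: the category names and default values are hypothetical, chosen to show how a user who clicks "Save Preferences" without touching anything ends up with a maximal consent record.

```python
# Illustrative sketch of the reversed-default pattern (hypothetical
# category names; not the code of any real consent management platform).

# GDPR-compliant defaults: every non-essential category starts OFF.
COMPLIANT_DEFAULTS = {
    "strictly_necessary": True,   # exempt from the consent requirement
    "analytics": False,
    "advertising": False,
    "social_media": False,
}

# The dark-pattern variant: non-essential toggles pre-set to ON.
DARK_PATTERN_DEFAULTS = {
    "strictly_necessary": True,
    "analytics": True,
    "advertising": True,
    "social_media": True,
}

def save_preferences(defaults, user_changes=None):
    """Return the consent record stored when the user clicks 'Save'.

    A user who clicks through without changing anything submits no
    changes -- so the defaults become their recorded 'choices'.
    """
    prefs = dict(defaults)
    prefs.update(user_changes or {})
    return prefs

# The same passive user under each regime:
compliant = save_preferences(COMPLIANT_DEFAULTS)
dark = save_preferences(DARK_PATTERN_DEFAULTS)
assert compliant["advertising"] is False  # no consent unless actively given
assert dark["advertising"] is True        # "consent" manufactured by the default
```

The point of the sketch is that the code path is identical in both cases; only the initial state differs, which is exactly why the GDPR's prohibition targets the default rather than the mechanism.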
The Unavoidable Banner and the Consent Wall
Some websites deployed what researchers termed "consent walls": the website content was entirely blocked until the user clicked "Accept All." This pattern, sometimes called a "take it or leave it" model, presents users with a binary choice: accept all tracking or leave the site. The GDPR's requirement that consent be "freely given" is incompatible with a consent wall — if the alternative to consenting is being denied access to the service, the consent is coerced rather than free.
Regulatory guidance from the European Data Protection Board addressed consent walls in 2020, concluding that they generally do not produce valid GDPR consent. Enforcement actions followed in several EU member states, though implementation has been inconsistent.
Infinite Layers of the Preferences Interface
For users determined to reject non-essential cookies, many consent management platforms deployed what researchers called the "labyrinth" pattern: a "Manage Preferences" flow that required navigating through multiple screens, each presenting additional categories (strictly necessary, performance, functional, targeting, social media, analytics...), each with individual toggles, each requiring a separate interaction. Some implementations had users navigate through seven or eight screens before reaching a final "Save" button.
The cognitive burden of this navigation was not accidental. Each additional click in the rejection path reduced the proportion of users who completed it. The practical effect was that "Reject All" was a theoretical option that few users successfully exercised, not because of a single dramatic deception but because of accumulated friction so substantial that most users gave up.
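The cumulative effect of this friction can be sketched with a simple drop-off model: if each additional screen independently loses some fraction of users, completion falls geometrically with the number of screens. The per-screen continuation rate below is a hypothetical illustration, not an empirical estimate from the studies discussed in this chapter.

```python
# Sketch of how accumulated friction erodes the rejection path.
# The 80% per-screen continuation rate is a hypothetical parameter
# for illustration, not a figure from the research literature.

def completion_rate(screens: int, continue_prob: float = 0.8) -> float:
    """Fraction of users who finish a flow of `screens` steps,
    assuming each extra screen independently loses some users."""
    return continue_prob ** screens

one_click_accept = completion_rate(1)   # "Accept All": a single interaction
labyrinth_reject = completion_rate(8)   # eight screens to reach "Reject All"

print(f"Accept path completed by {one_click_accept:.0%} of users")
print(f"Reject path completed by {labyrinth_reject:.0%} of users")
```

Under these illustrative numbers the eight-screen rejection path is completed by under a fifth of the users who attempt it, while the one-click acceptance path loses almost no one: no single screen is decisive, but the product of small losses is.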
Timeline of GDPR Cookie Consent: From Promise to Failure to Partial Reform
May 2018: GDPR enters into force. Data protection authorities across the EU begin issuing guidance on cookie consent. The initial guidance is interpreted by many publishers as requiring explicit consent for analytics and advertising cookies.
May–December 2018: A proliferation of cookie banners appears across EU websites. Industry organizations scramble to build CMPs. Early versions often deploy obvious dark patterns, partly from genuine uncertainty about requirements and partly from commercial pressure to maintain tracking consent rates.
January 2019: French data protection authority (CNIL) launches an investigation into major platforms' cookie consent practices, focusing on deceptive design. CNIL issues its first major cookie-related fine (against Google, €50 million) for inadequate consent.
May 2019: Academic researchers publish the first systematic empirical study of cookie consent dark patterns (Utz et al., 2019, ACM CCS conference), documenting the consent rate differential between banners with prominent "Accept" only versus banners offering equal-weight accept/reject options. The finding: prominently positioned opt-out options reduced consent rates by roughly 23 percentage points compared to interfaces that only offered "Accept All" prominently.
2020: The European Data Protection Board issues Guidelines 05/2020 on consent, explicitly addressing several cookie consent dark patterns including pre-ticked boxes, deceptive visual design, and bundled consent. Enforcement authority rests with individual member-state DPAs, creating inconsistent application.
2021: A landmark study by Nouwens et al. at the University of Oxford examines 10,000 UK websites and finds that only 11.8% use a consent interface that is potentially GDPR-compliant. The study documents that dark patterns are not edge cases but the dominant mode of cookie consent implementation.
2022: The French DPA (CNIL) begins requiring that websites offer an "Accept All/Reject All" binary at the first level of the consent interface — not buried in preferences — effectively requiring the same prominence for rejection as for acceptance. Google, Facebook, and other major platforms operating in France modify their cookie consent UIs in response. Consent rates for tracking cookies on major platforms in France drop significantly.
2022–2023: The Irish Data Protection Commission (the lead regulator for most major US tech companies in the EU, due to their European headquarters being in Dublin) issues decisions against Meta's consent practices, ultimately resulting in a €390 million fine for requiring users to accept personalized advertising as a condition of using Facebook and Instagram. The case effectively challenged Meta's attempt to bypass GDPR consent requirements by reframing advertising as a "contractual necessity."
2024: The EU's Digital Services Act becomes fully applicable to all platforms (its obligations for very large online platforms began in August 2023). The DSA explicitly prohibits dark patterns and requires "easy and effective" mechanisms for users to withdraw consent. Enforcement actions under the DSA against cookie-related manipulation begin.
The Empirical Record: What Research Found
The systematic study of cookie consent dark patterns produced some of the clearest empirical evidence in the dark patterns literature, because the same consent choice (tracking or no tracking) was presented to users through wildly varying interface designs, allowing researchers to directly measure the effect of design choices on user behavior.
Key findings from the research literature:
Default effects are large. When tracking cookies were opt-in (off by default), consent rates were dramatically lower than when they were opt-out (on by default). Estimates from multiple studies suggest the difference is 20–40 percentage points, depending on context. The implication is stark: the vast majority of users who appear to have "consented" to tracking in opt-out systems would not have consented in an opt-in system.
Visual hierarchy matters enormously. Mathur et al. (2019) found that consent rates were significantly higher when the "Accept" button was visually prominent compared to conditions of equal visual weight. The design choice that gets credit for millions of "consent" actions is font size, button color, and spatial position — not user preference.
Information provision does not significantly improve genuine decision-making. Studies that presented users with information about what tracking entails found only modest effects on consent rates, suggesting that most users are making decisions primarily on the basis of interface friction rather than informed deliberation about their privacy preferences.
Most users do not prefer to be tracked. Surveys consistently find that when users are asked about their privacy preferences in the abstract, large majorities say they prefer not to be tracked across the web. The high tracking consent rates produced by CMP dark patterns systematically contradict stated user preferences, providing strong evidence that these rates reflect design manipulation rather than genuine consent.
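The default effect described above implies a simple counterfactual calculation. Taking the 90 percent consent rate mentioned earlier in this case study as an illustrative opt-out baseline, and applying the 20–40 percentage-point range from the research findings, gives an estimate of what consent would have looked like under opt-in defaults. The baseline figure is illustrative, not drawn from any single study.

```python
# Back-of-envelope counterfactual using the 20-40 percentage-point
# default effect reported in the research findings above.
# The 90% opt-out baseline is an illustrative figure.

optout_consent = 0.90          # observed rate under opt-out defaults
default_effect = (0.20, 0.40)  # range of the default effect (low, high)

# Implied consent rates had the same users faced an opt-in design:
optin_low = optout_consent - default_effect[1]
optin_high = optout_consent - default_effect[0]

print(f"Implied opt-in consent: {optin_low:.0%} to {optin_high:.0%}")
```

Even at the low end of the range, roughly a quarter of recorded "consents" would not exist under opt-in defaults; at the high end, nearly half would not.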
What This Means for Users
The cookie consent case study is, in some ways, the purest possible demonstration of dark pattern ethics, because the gap between what the law intended (genuine user choice about privacy) and what the industry produced (an infrastructure of manufactured consent) is starkly visible.
Regulatory intent can be systematically subverted through interface design. The GDPR was drafted by people who understood that consent could be manipulated, which is why it includes requirements that consent be freely given, specific, informed, and unambiguous. These requirements were not vague or ambiguous. But they do not prescribe the specific interface through which consent must be obtained. That gap — between the substantive requirement and the interface implementation — was the space that the CMP industry colonized.
The consent-based model has structural limits. The deeper lesson of the cookie consent episode may be that consent-based privacy regulation is inadequate when the entity seeking consent also controls the interface through which consent is expressed. When the platform designs the consent banner, there is no reason to expect that the banner will be designed to facilitate genuine refusal. Regulatory models that require specific interface outcomes (equal prominence for accept and reject, mandatory "Reject All" at the first level) are more effective than models that specify only the quality of consent obtained.
The commercial stakes make manipulation rational. Tracking consent drives advertising revenue. A website that maintains an 85% tracking consent rate generates substantially more advertising revenue than one with a 45% rate. The financial incentive to deploy dark patterns in consent interfaces is direct and substantial. Without regulatory frameworks that make dark pattern deployment more costly than it is profitable — either through fines, liability, or market consequences — the manipulation will continue.
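The revenue arithmetic behind this incentive is easy to sketch. The model below blends a personalized and a contextual CPM by the tracking consent rate; the pageview count, CPM figures, and the uplift from personalized advertising are hypothetical placeholders for illustration, not measured industry values.

```python
# Hypothetical revenue arithmetic behind the consent-rate incentive.
# Pageview count, CPM figures, and the personalization uplift are
# illustrative placeholders, not measured industry values.

PAGEVIEWS = 10_000_000    # monthly pageviews of a hypothetical publisher
CPM_CONTEXTUAL = 1.00     # dollars per 1,000 impressions without tracking
CPM_PERSONALIZED = 3.00   # dollars per 1,000 impressions with tracking

def monthly_revenue(consent_rate: float) -> float:
    """Blend personalized and contextual CPMs by the tracking consent rate."""
    tracked = PAGEVIEWS * consent_rate
    untracked = PAGEVIEWS * (1 - consent_rate)
    return (tracked * CPM_PERSONALIZED + untracked * CPM_CONTEXTUAL) / 1000

dark_pattern = monthly_revenue(0.85)  # consent rate under manipulative design
neutral = monthly_revenue(0.45)       # consent rate under a neutral interface

print(f"85% consent: ${dark_pattern:,.0f}/month")
print(f"45% consent: ${neutral:,.0f}/month")
print(f"Uplift attributable to the interface: ${dark_pattern - neutral:,.0f}/month")
```

Under these assumed figures the interface design alone is worth a meaningful fraction of total advertising revenue every month, which is why a fine has to be large, likely, and repeatable before it changes the calculation.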
Users have largely habituated to non-decision. The proliferation of cookie banners in the post-GDPR environment has produced a phenomenon researchers call "consent fatigue": users click "Accept" on cookie banners reflexively, without reading or engaging with the choice, because they encounter dozens of such banners per week and cannot afford the cognitive resources to engage with each one. This fatigue was predictable, was arguably intended by the CMP industry, and represents a failure of regulation that addressed the form of consent while leaving the underlying power dynamic intact.
Discussion Questions
- The GDPR requires consent to be "freely given." The research evidence suggests that consent obtained through a consent wall (where the alternative is being denied access to the site) is not freely given. But some publishers argue that they need tracking revenue to survive and that users are making a genuine economic exchange: free content in return for tracking. Evaluate this argument. Is this a meaningful form of free choice?
- The research evidence shows that most users do not prefer to be tracked, but that most users also click "Accept" on cookie consent banners. What is the ethical significance of this gap? Does it mean that users' clicking behavior does not represent their actual preferences? What would a consent system need to look like to accurately capture users' preferences?
- The CMP industry exists specifically to help websites comply with GDPR's consent requirement while maximizing consent rates. Some CMP vendors openly market their products as tools for maintaining high tracking consent rates. Is this industry providing a legitimate compliance service, or is it an industry built around systematic legal evasion? What obligations do the lawyers and engineers who work for CMP companies bear?
- France's requirement that websites offer an "Accept All/Reject All" option at the first level of the consent interface reduced tracking consent rates significantly when implemented. Evaluate the trade-offs: what are the costs and benefits of this design requirement for users, publishers, and advertisers? Is this the right regulatory solution, or would you design a different intervention?
- Consider the concept of "consent fatigue" — users clicking "Accept" reflexively on cookie banners without genuine deliberation. Does fatigue-based non-deliberation undermine the moral validity of consent? If a user clicks "Accept" automatically without reading, are they responsible for the tracking that follows? How should regulation account for the cognitive realities of how people actually engage with consent interfaces?