Case Study 12.2: The GDPR Cookie Wars — European Consent Compliance in Practice


Introduction: A Law That Changed the Web (Sort Of)

When the General Data Protection Regulation took effect on May 25, 2018, European internet users immediately noticed something new: the web was suddenly full of consent dialogs. Every website — or so it seemed — was asking for permission to set cookies. The consent banner had arrived.

Within weeks, it was clear that not all consent banners were created equal. Privacy researchers, consumer advocates, and regulators began systematically documenting the gap between what GDPR's consent requirements demanded and what the majority of consent implementations actually provided. What emerged was a richly documented case study in how commercial incentives, legal ambiguity, and regulatory capacity interact to shape privacy outcomes in practice.

What GDPR Actually Required

GDPR's requirements for consent are straightforward to state and complex to implement. Recital 32 of the regulation specifies that consent "should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement." Article 7 adds that "the data subject shall have the right to withdraw his or her consent at any time" and that "it shall be as easy to withdraw as to give consent."

Applied to cookie consent, this means:

  • Pre-checked boxes do not constitute consent (boxes pre-checked to "accept advertising cookies" are illegal under GDPR)
  • Continuing to use a website does not constitute consent (no "by using this site you agree")
  • Consent must be specific — agreeing to "analytics" is not consent to "advertising"
  • The option to decline must be presented with equal prominence to the option to accept
  • Withdrawing consent (later opting out) must be as easy as granting it
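These requirements can be read as a checklist. As a purely illustrative sketch — the record fields and the `violations` function below are invented for this example, not part of GDPR or of any compliance tool — a consent record might be audited like this:

```python
from dataclasses import dataclass

# Hypothetical record of how a consent banner obtained a user's choice.
# Field names are illustrative -- they come from this sketch, not from any standard.
@dataclass
class ConsentRecord:
    purposes_opted_in: set            # per-purpose choices, e.g. {"analytics"}
    obtained_via_prechecked_box: bool
    inferred_from_continued_use: bool
    accept_clicks: int                # interaction cost of accepting
    decline_clicks: int               # interaction cost of declining
    withdraw_clicks: int              # interaction cost of withdrawing later

def violations(record: ConsentRecord) -> list:
    """Return which of the requirements listed above the record fails."""
    problems = []
    if record.obtained_via_prechecked_box:
        problems.append("pre-checked boxes do not constitute consent")
    if record.inferred_from_continued_use:
        problems.append("continued use does not constitute consent")
    if record.purposes_opted_in == {"all"}:
        problems.append("consent must be specific to each purpose, not blanket")
    if record.decline_clicks > record.accept_clicks:
        problems.append("declining must be as easy as accepting")
    if record.withdraw_clicks > record.accept_clicks:
        problems.append("withdrawal must be as easy as granting")
    return problems

# A banner that takes one click to accept but nine steps to decline fails
# both asymmetry checks, whatever else it gets right:
asymmetric = ConsentRecord({"advertising"}, False, False,
                           accept_clicks=1, decline_clicks=9, withdraw_clicks=9)
assert len(violations(asymmetric)) == 2
```

The point of the sketch is that most of these requirements are mechanically checkable — which is precisely what made the documented non-compliance so easy for researchers to measure at scale.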

These requirements were legally clear. What was not clear was whether the hundreds of thousands of companies operating websites with EU users would comply with them — and whether European data protection authorities had the capacity and will to enforce them.

The Norwegian Consumer Council's Investigation

In June 2018, the Norwegian Consumer Council (Forbrukerrådet) published a report titled "Deceived by Design" that systematically documented dark patterns in consent interfaces used by major digital platforms. The investigation analyzed the consent mechanisms of Google, Facebook, and Windows 10, measuring the friction and choice architecture applied to privacy-protective choices versus data-sharing choices.

The findings were striking:

Google: Enabling personalized advertising took one click. Disabling it took nine steps, including navigation through multiple menu levels and a separate confirmation page.

Facebook: The "Accept" option for using Facebook's data for advertising required one click. The option to limit data use required multiple clicks, navigation to settings, and separate toggles for multiple data categories — with some options unavailable entirely (Facebook claimed "legitimate interest" rather than consent for some processing, making opt-out legally distinct from consent withdrawal).

Windows 10: During setup, options to disable Microsoft's data collection required navigating through screens where the default was always maximum data sharing, with the privacy-protective alternative requiring active selection of a less prominent button.

The Council concluded that all three companies were using interface design to systematically undermine GDPR's consent requirements — achieving the legal form of a consent mechanism while ensuring that the actual consent rate for privacy-protective choices remained artificially low.

The Industry's Answer: The Transparency and Consent Framework

The digital advertising industry recognized that GDPR created a potential crisis for the behavioral advertising ecosystem. Without a compliant consent mechanism, the entire real-time bidding infrastructure — built on behavioral profiles assembled without GDPR-compliant consent — was potentially illegal.

IAB Europe, the European arm of the Interactive Advertising Bureau (IAB), developed the Transparency and Consent Framework (TCF) as an industry standard for managing cookie consent across the advertising ecosystem. The framework created a standardized consent signal: a consent management platform (CMP) deployed on a publisher's website collects user choices and encodes them into a "TC string" — a data structure that specifies which vendors (among the hundreds listed in a Global Vendor List) have the user's consent to receive data.
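The real TC string follows a precise bit-level layout defined in IAB Europe's TCF specification; the sketch below is a deliberately simplified illustration of the underlying idea — per-vendor consent flags packed into a compact, URL-safe string that every downstream vendor can read. The function names and byte layout here are invented for exposition and do not match the actual format:

```python
import base64

def encode_consents(vendor_consents: dict, max_vendor_id: int) -> str:
    """Pack per-vendor consent bits into a URL-safe base64 string.

    Simplified illustration of the TC-string idea: one bit per vendor ID,
    True = the user consented to that vendor receiving their data.
    """
    bits = 0
    for vendor_id, consented in vendor_consents.items():
        if consented:
            bits |= 1 << (vendor_id - 1)       # vendor IDs are 1-based
    n_bytes = (max_vendor_id + 7) // 8         # enough bytes for every vendor bit
    raw = bits.to_bytes(n_bytes, "little")
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

def decode_consent(tc_string: str, vendor_id: int) -> bool:
    """Check whether a given vendor ID has consent in the packed string."""
    padded = tc_string + "=" * (-len(tc_string) % 4)   # restore base64 padding
    bits = int.from_bytes(base64.urlsafe_b64decode(padded), "little")
    return bool((bits >> (vendor_id - 1)) & 1)

# The CMP encodes the user's choices once; each vendor reads its own bit.
choices = {1: True, 2: False, 755: True}
tc = encode_consents(choices, max_vendor_id=1000)
assert decode_consent(tc, 755) and not decode_consent(tc, 2)
```

The design choice worth noticing is that the signal only records what the user supposedly chose — nothing in the data structure itself can verify how that choice was obtained or prevent a vendor from processing data regardless of its bit, which is exactly the gap the Belgian DPA's findings turned on.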

The TCF was widely adopted. By 2020, it was used by thousands of publishers across Europe, and hundreds of consent management platform vendors had built tools implementing it.

It was also, according to a comprehensive 2022 ruling by the Belgian Data Protection Authority, deeply non-compliant with GDPR.

The Belgian DPA's investigation of IAB Europe (the organization that administers the TCF) found that:

  • The TC string constituted personal data under GDPR (because it was linkable to specific users)
  • IAB Europe was a data controller of that personal data but had not fulfilled controller obligations
  • The TCF did not ensure that consent was validly obtained
  • The TCF enabled the transmission of user data to hundreds of vendors before consent was obtained and verified
  • The framework lacked adequate mechanisms to prevent processing in cases of invalid consent

IAB Europe was fined €250,000 and required to bring the framework into compliance — a significant regulatory moment that confirmed what privacy advocates had argued for years: the industry's primary self-regulatory consent mechanism did not meet GDPR's legal standard.

The Pattern Across Jurisdictions

As GDPR enforcement accumulated, a pattern emerged. Major enforcement actions included:

Google (2022) — €150 million fine (French CNIL): Google made it straightforward to accept all cookies (one click) and much harder to reject them (multiple steps). The CNIL found this violated the requirement that withdrawing consent be as easy as giving it.

Facebook/Meta (2023) — €390 million fine (Irish DPC): Meta had relied on "contractual necessity" as the legal basis for behavioral advertising in Facebook and Instagram — arguing that users agreed to receive personalized ads as a condition of using the service. The EDPB (European Data Protection Board) rejected this basis, finding that behavioral advertising was not necessary to the service contract and that valid consent was required.

Google Analytics (2022) — Multiple national DPAs: Several European data protection authorities ruled that using Google Analytics (which transfers data to U.S. servers) violated GDPR's data transfer requirements, as U.S. law did not provide equivalent protection to EU law. These rulings created significant uncertainty for the large share of European websites that relied on Google Analytics.

TikTok (2023) — €345 million fine (Irish DPC): TikTok's handling of accounts belonging to users under 18 was found to violate GDPR, including public-by-default settings and interface designs — dark patterns — that nudged minor users toward more privacy-intrusive choices.

What Changed — and What Didn't

The cumulative effect of GDPR enforcement has produced real changes in the consent landscape. Most major platforms revised their cookie consent interfaces after enforcement actions. The "accept all / reject all" binary became more common, replacing the asymmetric "accept all / manage settings" pattern. Several companies developed genuinely neutral cookie interfaces.

What did not change was the underlying commercial logic. Behavioral advertising remained the dominant business model of the web. Companies sought alternative legal bases — legitimate interest — where consent was unavailable. First-party data collection (directly from users logged into accounts) became more valuable as third-party tracking faced headwinds. The Privacy Sandbox and other post-cookie alternatives emerged to replace the tracking capability that GDPR was beginning to constrain.

Perhaps most significantly, the consent banner ecosystem became an enormous industry in itself. Consent Management Platform vendors — companies whose entire business was building compliant (or compliant-appearing) consent interfaces — proliferated. The market for CMPs grew substantially, with companies selling increasingly sophisticated tools for managing consent at scale. The regulatory requirement to obtain consent had generated a consent management industry, which had its own commercial interests in the design and architecture of consent mechanisms.

Analysis Questions

  1. The GDPR's consent requirements are clear in principle — freely given, specific, informed, unambiguous — but compliance has been systematically undermined by dark patterns. What does this gap between legal requirement and commercial practice reveal about the relationship between privacy law and corporate behavior?

  2. The Norwegian Consumer Council found that Google required nine steps to disable personalized advertising and one step to enable it. Without using the term "dark pattern" (since your reader may not be familiar with it), explain to a non-specialist why this asymmetry is legally and ethically problematic.

  3. The IAB Transparency and Consent Framework was developed as an industry self-regulatory solution to GDPR compliance. The Belgian DPA found it was fundamentally non-compliant. What does this outcome suggest about the limits of industry self-regulation as a privacy governance mechanism?

  4. The cumulative GDPR enforcement fines against major platforms (€150M, €390M, €345M) are large in absolute terms but small relative to those companies' annual revenues. Meta, for example, earned approximately $135 billion in 2023 revenue against a €390M GDPR fine. At what fine level does enforcement become an effective deterrent? What other enforcement mechanisms might be more effective?

  5. The chapter concludes that the data economy's problems require structural rather than individual responses. Does the GDPR represent the right kind of structural response? What has it succeeded at? What has it failed to address?


Connections

  • Dark patterns in consent design (Section 12.7)
  • GDPR and third-party cookies (Section 12.8)
  • Consent as fiction (Part 3 recurring theme)
  • Regulatory frameworks for commercial surveillance (Chapter 28)
  • The behavioral advertising ecosystem (Chapter 14)

Case Study 12.2 | Chapter 12 | Part 3: Commercial Surveillance