Case Study 31-2: The GDPR Enforcement Gap
Ireland, Meta, and the Limits of the World's Strongest Privacy Law
Background: The Promise of GDPR
When the GDPR took effect on May 25, 2018, it was heralded as a transformative moment for digital privacy. The regulation imposed comprehensive obligations on any organization processing the personal data of EU residents. Violations could result in fines of up to €20 million or 4% of annual global turnover — whichever was higher. For companies like Meta, 4% of global turnover could mean billions of euros.
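The fine cap is a simple maximum of two quantities. A minimal sketch of the arithmetic (the turnover figure below is illustrative, not Meta's actual turnover):

```python
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound for the most serious GDPR infringements:
    the greater of EUR 20 million or 4% of annual global turnover."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

# For a company with EUR 100 billion in turnover (an illustrative figure),
# the 4% prong dominates and the cap is EUR 4 billion:
print(gdpr_max_fine(100_000_000_000))  # 4000000000.0

# For a small firm with EUR 100 million in turnover, 4% would be only
# EUR 4 million, so the EUR 20 million floor applies instead:
print(gdpr_max_fine(100_000_000))  # 20000000
```

The "whichever is higher" structure is deliberate: the fixed floor gives the fine teeth against small processors, while the percentage prong scales with the largest firms.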
Privacy advocates celebrated. Tech companies invested heavily in compliance. Cookie consent banners appeared on websites worldwide. Data protection officers were hired. Privacy policies grew longer and more elaborate.
But between the promise of GDPR and its enforcement, a gap emerged, one that reveals how legal frameworks can be ambitious on paper yet constrained in practice.
The Ireland Problem
Meta (then Facebook) chose Ireland as its European headquarters in 2008, part of a broader pattern of large technology companies establishing EU operations in low-tax jurisdictions. Under GDPR's "one-stop shop" mechanism, the regulatory authority in a company's "main establishment" in the EU is its lead supervisory authority — responsible for overseeing that company's EU-wide data processing.
The Irish Data Protection Commission (DPC) became the lead authority for some of the world's most powerful data companies: Meta, Google, Apple, Twitter/X, LinkedIn, PayPal. Ireland, a small country with a technology sector constituting a substantial portion of its economy, became the de facto privacy regulator for a significant portion of the internet.
Critics quickly identified the structural problem: Ireland had strong economic incentives to attract and retain technology companies. Large technology companies generated jobs, tax revenue, and economic activity. A regulatory authority that was aggressive in enforcing GDPR against those companies risked undermining Ireland's economic model.
Privacy advocates, led by Austrian activist Max Schrems and his organization NOYB (None of Your Business), began filing complaints with the DPC in 2018 — the day GDPR took effect — about Meta's data processing practices. What followed was a years-long process that exposed both the ambition and the limits of GDPR enforcement.
The Schrems Cases: Transatlantic Data Transfers
The most consequential GDPR battles have concerned data transfers between the EU and the United States. Under GDPR, personal data can only be transferred to countries outside the EU if those countries provide "adequate" protection — meaning protection essentially equivalent to GDPR.
The United States does not have adequate protection, for reasons this chapter has examined: no comprehensive federal privacy law, the third-party doctrine gutting Fourth Amendment protections, and FISA Section 702 allowing intelligence agencies to access data held by U.S. tech companies without meaningful oversight.
Schrems I (2015): Max Schrems filed a complaint in 2013 arguing that Facebook's transfer of EU users' data to U.S. servers violated EU law because U.S. surveillance programs (revealed by Snowden) meant U.S. law didn't adequately protect that data. The Court of Justice of the EU invalidated the "Safe Harbor" framework — the legal mechanism companies used to justify EU-US data transfers — finding that U.S. surveillance law was incompatible with EU privacy rights.
Schrems II (2020): After Safe Harbor was replaced by "Privacy Shield," Schrems challenged that too. In July 2020 the CJEU again agreed, invalidating Privacy Shield on the same grounds: FISA Section 702 programs allowed surveillance of EU persons' data without adequate redress, incompatible with the GDPR's adequacy requirements. The CJEU left the Standard Contractual Clauses (SCCs) — the main alternative transfer mechanism — formally valid, but held that data exporters must verify, case by case, whether the destination country's surveillance law undermines those clauses, and suspend transfers where it does. Contractual promises, in other words, cannot substitute for genuine legal protection against government surveillance.
The Aftermath: The invalidation of Privacy Shield created a legal crisis for transatlantic data flows. Thousands of companies were certified under Privacy Shield, and hundreds of thousands more relied on SCCs to transfer data to U.S. service providers. After Schrems II, the legal basis for many of those transfers became uncertain. The DPC, in theory, had to review all of Meta's EU-US data transfers and potentially order them to stop.
The Enforcement Journey: Five Years of Delay
Max Schrems filed his complaint with the DPC in 2018. The DPC did not issue a final decision until 2023 — five years later.
During those five years:
- The DPC issued draft decisions that were reviewed by other EU data protection authorities
- Other EU DPAs repeatedly found the DPC's proposed decisions too lenient and pushed for stronger action
- The "consistency mechanism" — GDPR's process for resolving disagreements among national DPAs — was invoked repeatedly
- Meta challenged multiple procedural aspects of the proceedings
- Privacy advocates accused the DPC of "regulatory capture" — being too deferential to the company it was supposed to regulate
In May 2023, the DPC issued its final decision: Meta had unlawfully transferred EU users' data to U.S. servers in violation of GDPR. The fine: €1.2 billion — the largest GDPR fine ever imposed.
Meta was also ordered to suspend future transfers of personal data to the U.S. within five months and to bring its processing into compliance, including data already transferred, within six months.
The Aftermath and the EU-US Data Privacy Framework
Simultaneous with the DPC enforcement action, the EU and U.S. were negotiating a new legal framework for transatlantic data transfers. In October 2022, President Biden signed Executive Order 14086, the legal foundation of the EU-US Data Privacy Framework (DPF), which:
- Tasked the ODNI's Civil Liberties Protection Officer with handling EU citizens' complaints about U.S. intelligence activities
- Created a new Data Protection Review Court to adjudicate those complaints on appeal
- Required intelligence agencies to apply necessity and proportionality standards to signals intelligence collection
The European Commission declared the DPF adequate in July 2023, just weeks after the €1.2 billion Meta fine. Companies could once again use the DPF as a legal basis for EU-US data transfers.
Max Schrems announced his intention to challenge the DPF — arguing that the new mechanisms don't genuinely address the fundamental problem that FISA Section 702 allows surveillance incompatible with EU privacy rights. As of 2026, that challenge is working through the courts. Privacy advocates describe the cycle as a "merry-go-round" — frameworks are created, challenged, invalidated, replaced, and challenged again, while data flows continue throughout.
What This Case Reveals About Legal Frameworks
The GDPR/Meta/Ireland saga illuminates several dynamics that apply to privacy law generally:
1. Enforcement is political. GDPR has strong formal powers but its enforcement depends on regulatory authorities that exist within political and economic structures. Ireland's relationship with the tech industry shaped the DPC's approach, regardless of formal independence requirements. Legal frameworks are only as strong as the institutions enforcing them.
2. Regulatory arbitrage persists. Even under GDPR, companies can exploit the "one-stop shop" mechanism by headquartering in the most permissive jurisdiction. This is a structural feature of international regulatory frameworks generally.
3. The transatlantic surveillance gap is real. The repeated invalidation of EU-US data transfer frameworks reflects a genuine incompatibility: GDPR requires adequate protection for EU persons' data, and U.S. surveillance law — particularly FISA Section 702 — provides surveillance access that is incompatible with that standard. The legal workarounds (Privacy Shield, DPF) are attempts to paper over a structural difference in values, not to resolve it.
4. Individual complainants can move mountains, slowly. Max Schrems, a law student when he first challenged Facebook, has spent more than a decade in litigation and advocacy that ultimately produced a €1.2 billion fine and multiple invalidations of major legal frameworks. Individual legal action, sustained over time, can produce significant systemic change — but the timeline is measured in years and decades, not months.
5. The law follows technology, slowly. GDPR was adopted in 2016, applied from 2018, and is still being interpreted and enforced six-plus years later. Technology companies innovate continuously during that period; regulators respond to yesterday's practices while today's practices become the new normal.
Analysis Questions
1. GDPR's "one-stop shop" mechanism was designed to simplify compliance for pan-European businesses. It has had the unintended consequence of concentrating enforcement in Ireland. What regulatory design changes might address this problem without destroying the benefits of the one-stop shop mechanism?
2. The Court of Justice of the EU has twice found that U.S. surveillance law is incompatible with EU privacy requirements. Is the solution: (a) reform U.S. surveillance law, (b) create better legal mechanisms for limiting surveillance of EU persons, (c) accept that EU-US data transfers simply cannot meet GDPR requirements, or (d) something else? What trade-offs does each option involve?
3. Max Schrems filed his first complaint in 2013. The first major CJEU decision came in 2015. His second complaint produced a CJEU decision in 2020. The DPC's enforcement decision came in 2023. What does this timeline suggest about the practical adequacy of legal frameworks for addressing fast-moving technology issues?
4. Meta's €1.2 billion fine is the largest ever imposed under GDPR. But Meta's annual revenue exceeds $100 billion, so the fine amounts to roughly 1% of a single year's revenue. Does a fine of that scale constitute meaningful deterrence? What would meaningful deterrence look like for a company of Meta's size?
5. Privacy advocates describe the EU-US data transfer negotiation cycle as a "merry-go-round." At what point does legal advocacy become complicit in a system that appears to resolve problems without actually resolving them? When should advocates pursue alternatives to legal challenge?
6. GDPR's compliance costs fall most heavily on small businesses (which have fewer resources to build compliance infrastructure) while providing the most protection against large platforms (which can absorb compliance costs). Is this a feature or a bug? How might GDPR be reformed to better distribute its costs and benefits?
Connecting to the Broader Framework
This case study illustrates the gap that Chapter 31 maps: between the formal existence of rights and the practical ability to exercise or enforce them. GDPR is genuinely the world's strongest comprehensive privacy framework. Its enforcement record includes the largest privacy fines ever imposed. And yet:
- The most important cases took five or more years to resolve
- The fundamental problem — U.S. surveillance access to EU persons' data — remains unresolved
- Corporate behavior continues during enforcement proceedings
- Regulatory capture is a persistent risk
Legal rights are necessary but not sufficient. They require enforcement institutions with independence, resources, and political support. They require individuals willing to bring complaints and sustain litigation over years. They require political will to address structural incompatibilities rather than paper over them.
As Jordan Ellis learned from their data broker opt-out experience: the right exists on paper, but exercising it is designed to be difficult.
This case study connects to Chapter 31 Section 31.5 (GDPR analysis) and backward to Chapter 9 (mass interception and the Snowden revelations that prompted Schrems I). It connects forward to Chapter 32 (individual counter-surveillance) and Chapter 39 (designing for privacy).