
Learning Objectives

  • Describe the structure, powers, and resource constraints of data protection authorities across jurisdictions
  • Analyze patterns in GDPR enforcement actions, including the largest fines and the reasoning behind them
  • Explain regulatory capture and identify its manifestations in data protection governance
  • Distinguish between compliance (following rules) and ethics (doing right), connecting to the ethical frameworks of Chapter 6
  • Evaluate what law can and cannot achieve in governing data practices
  • Assess the effectiveness of self-regulation and industry codes of conduct

Chapter 25: Enforcement, Compliance, and the Limits of Law

"A regulation without enforcement is a suggestion. A suggestion without consequences is a wish." — Wojciech Wiewiórowski, European Data Protection Supervisor

Chapter Overview

By now, we have surveyed the global regulatory landscape (Chapter 20), examined the EU AI Act's risk-based approach (Chapter 21), explored organizational data governance frameworks (Chapter 22), navigated cross-border data flows (Chapter 23), and dissected sector-specific governance regimes (Chapter 24). The architecture of data governance law is extensive, detailed, and growing.

But law on the books is not law in practice. A regulation is only as powerful as the institution that enforces it — and enforcement requires resources, expertise, political will, and independence. A compliance obligation is only as meaningful as the consequences of non-compliance.

This chapter closes Part 4 by examining the enforcement gap: the distance between what data protection laws require and what actually happens. We'll look at how data protection authorities work, what they've achieved, and where they've fallen short. We'll examine regulatory capture — the phenomenon in which regulators become too close to the industries they regulate. We'll confront the uncomfortable gap between legal compliance and genuine ethical behavior. And we'll assess the fundamental limits of what law can accomplish in governing the data practices that shape modern life.

This is also the chapter in which Eli testifies before the Detroit city council, and Sofia Reyes brings the DataRights Alliance's enforcement advocacy to the foreground.

In this chapter, you will learn to:

  • Evaluate data protection authority effectiveness based on structure, resources, and independence
  • Analyze enforcement patterns and identify what they reveal about regulatory priorities
  • Recognize and critique regulatory capture in data governance contexts
  • Articulate the gap between compliance and ethics with reference to concrete examples
  • Assess the limits of legal approaches and evaluate alternatives


25.1 Data Protection Authorities: Structure, Powers, and Limitations

25.1.1 What Is a Data Protection Authority?

A data protection authority (DPA) is an independent public body responsible for monitoring and enforcing data protection law within its jurisdiction. The GDPR requires each EU member state to establish at least one independent supervisory authority. Similar bodies exist in most countries with data protection laws — though their independence, powers, and resources vary enormously.

25.1.2 Powers of EU Data Protection Authorities

Under the GDPR, DPAs possess three categories of powers:

Investigative powers (Article 58(1)):

  • Order controllers and processors to provide information
  • Conduct data protection audits
  • Carry out investigations, including on-site inspections
  • Obtain access to premises, including data processing equipment

Corrective powers (Article 58(2)):

  • Issue warnings and reprimands
  • Order compliance with data subject requests
  • Order controllers or processors to bring processing into compliance
  • Impose temporary or definitive bans on processing
  • Order the suspension of cross-border data flows
  • Impose administrative fines

Advisory powers (Article 58(3)):

  • Issue opinions on processing operations
  • Advise parliament, government, and other institutions on data protection matters
  • Approve codes of conduct and certification mechanisms

25.1.3 The Resource Problem

On paper, DPA powers are formidable. In practice, many DPAs are chronically under-resourced.

| DPA | Country | Approximate Staff (2024) | Population Served | Budget (approx.) |
|---------|-------------------|------|------------|------|
| ICO | UK | ~900 | 67 million | $85M |
| CNIL | France | ~270 | 67 million | $27M |
| BfDI | Germany (federal) | ~320 | 84 million | $40M |
| DPC | Ireland | ~210 | 5 million | $25M |
| AEPD | Spain | ~200 | 47 million | $20M |
| Garante | Italy | ~170 | 60 million | $30M |
| UODO | Poland | ~250 | 38 million | $12M |
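One way to make the resource picture concrete is a naive capacity metric: staff per million residents served, computed from the approximate figures above. A minimal illustrative sketch (figures are the rough 2024 estimates from this table, nothing more):

```python
# Staff per million residents served, using the approximate 2024
# figures from the table above (illustrative only).
dpas = {
    "ICO (UK)": (900, 67),
    "CNIL (France)": (270, 67),
    "BfDI (Germany, federal)": (320, 84),
    "DPC (Ireland)": (210, 5),
    "AEPD (Spain)": (200, 47),
    "Garante (Italy)": (170, 60),
    "UODO (Poland)": (250, 38),
}

# Sort from highest to lowest per-capita staffing.
for name, (staff, pop_millions) in sorted(
    dpas.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True
):
    print(f"{name}: {staff / pop_millions:.1f} staff per million residents")
```

Note that by this per-capita measure the Irish DPC looks comparatively well staffed (42 staff per million Irish residents). The structural deficit discussed below emerges only when the denominator is the EU-wide user base of the companies the DPC supervises under the one-stop-shop mechanism, not Ireland's own population.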

The Irish Data Protection Commission (DPC) is particularly significant because Ireland hosts the European headquarters of Meta, Google, Apple, Microsoft, TikTok, and many other major tech companies. Under the GDPR's one-stop-shop mechanism, the DPC serves as the lead supervisory authority for cross-border processing by these companies — making it the de facto primary enforcer of data protection for hundreds of millions of Europeans.

With roughly 210 staff and a budget of 25 million euros, the DPC is responsible for overseeing some of the most powerful technology companies in history. Critics have argued that this creates a structural enforcement deficit — a mismatch between the scope of the regulatory challenge and the resources available to address it.

Accountability Gap: The GDPR's one-stop-shop mechanism was designed for efficiency — ensuring that companies operating across multiple EU member states would deal with a single lead authority rather than 27 different regulators. But the effect has been to concentrate enforcement responsibility in a small number of DPAs (particularly Ireland and Luxembourg) that are not adequately resourced for the task. The Accountability Gap exists not only between organizations and individuals but between the law's ambitions and the institutions charged with realizing them.

25.1.4 Independence: Theory and Practice

DPA independence is essential to effective enforcement. A data protection authority that answers to the government cannot effectively investigate government data practices. A DPA whose budget is controlled by the ministry it must regulate cannot be truly independent.

The GDPR requires DPAs to act with "complete independence" (Article 52). In practice, independence exists on a spectrum:

  • Structural independence: The DPA is established as an independent body, not a division of a government ministry.
  • Financial independence: The DPA controls its own budget, with funding sufficient for its mandate.
  • Operational independence: The DPA sets its own priorities, investigates without political direction, and makes enforcement decisions without government approval.
  • Personal independence: DPA leadership is appointed for fixed terms, protected from removal except for serious misconduct, and free from conflicts of interest.

Some DPAs meet all these criteria robustly. Others — particularly in newer democracies or countries with weak institutional traditions — may have formal independence while operating under practical constraints: inadequate budgets, political pressure, or leadership appointments that reward loyalty over competence.


25.2 Enforcement in Practice: Landmark Cases

25.2.1 GDPR Enforcement Patterns

Since the GDPR became applicable in May 2018, DPAs have imposed over 4 billion euros in fines through thousands of enforcement actions. But the pattern of enforcement reveals important dynamics.

The largest GDPR fines (as of early 2026):

| Year | Company | DPA | Fine | Violation |
|------|---------|-----|------|-----------|
| 2023 | Meta (Facebook) | Ireland DPC | 1.2B euros | Unlawful cross-border data transfers to the US |
| 2021 | Amazon | Luxembourg CNPD | 746M euros | Non-compliant targeted advertising practices |
| 2022 | Meta (Instagram) | Ireland DPC | 405M euros | Processing children's data without adequate protections |
| 2024 | LinkedIn (Microsoft) | Ireland DPC | 310M euros | Unlawful advertising targeting |
| 2022 | Meta (Facebook) | Ireland DPC | 265M euros | Data scraping — failure to protect user data from being scraped |
| 2021 | Meta (WhatsApp) | Ireland DPC | 225M euros | Transparency failures in privacy policy |
| 2022 | Google | France CNIL | 150M euros | Cookie consent violations |
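The concentration of fines in a few corporate groups can be seen by aggregating the table above. A minimal sketch (these totals cover only the fines listed in this chapter; Meta's full total across all DPA decisions, including smaller actions not shown here, is higher):

```python
# Aggregate the GDPR fines listed in the table above by corporate group.
# Amounts are in millions of euros and cover only the fines shown here.
fines_millions_eur = [
    ("Meta", 1200),      # cross-border transfers to the US
    ("Amazon", 746),     # targeted advertising practices
    ("Meta", 405),       # children's data (Instagram)
    ("Microsoft", 310),  # advertising targeting (LinkedIn)
    ("Meta", 265),       # data scraping
    ("Meta", 225),       # transparency failures (WhatsApp)
    ("Google", 150),     # cookie consent
]

totals: dict[str, int] = {}
for company, amount in fines_millions_eur:
    totals[company] = totals.get(company, 0) + amount

for company, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{company}: {total:,}M euros")
```

Even within this short list, Meta accounts for roughly 2.1 billion of the roughly 3.3 billion euros in fines shown.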

Patterns:

  1. Big Tech dominance. The largest fines are overwhelmingly against a small number of major tech companies — Meta alone accounts for over 2.5 billion euros in fines. This reflects both the scale of these companies' data processing and the regulatory focus on the most visible actors.

  2. Ireland as bottleneck. The concentration of enforcement through the Irish DPC has been controversial. Other DPAs (particularly Germany, France, and Austria) have publicly criticized the DPC's pace and approach, arguing that it is too slow, too procedural, and too lenient.

  3. Consent and transparency. Many of the largest fines relate to consent violations — companies processing data without valid consent or without adequately informing users about their data practices. This reflects the GDPR's elevation of consent as a governance principle.

  4. Limited individual redress. Despite the large fines, individual data subjects have seen limited direct benefit. Fines go to the state, not to affected individuals. The GDPR provides for individual compensation through courts (Article 82), but few cases have resulted in meaningful awards.

25.2.2 FTC Enforcement in the United States

The Federal Trade Commission operates under a fundamentally different model from EU DPAs. The FTC lacks specific data protection authority — it uses its general authority over "unfair or deceptive acts or practices" (Section 5 of the FTC Act) to address data privacy and security violations.

Key FTC enforcement actions:

Facebook/Meta (2019): $5 billion. The largest FTC privacy penalty in history at the time, settled through a consent decree that required Facebook to undergo periodic independent privacy assessments, establish an independent privacy committee of its board of directors, and designate compliance officers.

Amazon/Ring (2023): $5.8 million. Penalty for allowing employees and contractors to access customers' private Ring camera footage and failing to adequately secure video data.

Epic Games/Fortnite (2022): $520 million (a $275 million COPPA civil penalty plus $245 million in consumer refunds). The largest FTC enforcement involving children's privacy, addressing COPPA consent violations and dark patterns that tricked players (many under 13) into making unintended purchases.

Key limitation: consent decrees. The FTC's primary enforcement tool is the consent decree — a negotiated agreement in which a company agrees to specific practices without admitting wrongdoing. Consent decrees typically last 20 years and include monitoring requirements.

But consent decrees have been criticized as insufficient. Facebook entered into a consent decree with the FTC in 2012 over deceptive privacy practices. In 2018, the Cambridge Analytica scandal revealed that Facebook had continued the very practices the consent decree was supposed to address. The 2019 $5 billion settlement was a response to the violation of the 2012 consent decree — raising the question of whether consent decrees serve as genuine accountability mechanisms or merely as the cost of doing business.

Consent Fiction: The FTC's consent decree model exhibits the Consent Fiction in regulatory form. Companies "consent" to changed behavior through a legal agreement, much as users "consent" to data practices through a click. In both cases, the consent may be formal rather than substantive — the company calculating that the cost of compliance is manageable, just as the user clicks "accept" because the alternative is losing access to the service.


25.3 Regulatory Capture

25.3.1 The Theory

Regulatory capture is the phenomenon in which a regulatory agency, created to act in the public interest, comes to act primarily in the interest of the industry it regulates. The concept, developed by economist George Stigler and building on Mancur Olson's work on collective action, identifies several mechanisms through which capture occurs:

Revolving door. Staff and leadership move between the regulatory agency and the regulated industry. Former regulators become industry lobbyists or executives; former industry executives become regulators. This movement creates relationships, sympathies, and conflicts of interest that can bias regulatory decisions.

Information asymmetry. The regulated industry has more information about its own operations than the regulator does. Regulators depend on industry cooperation to understand the practices they regulate — creating a dynamic in which the industry shapes the regulator's understanding of reality.

Resource disparity. Regulated industries can spend far more on lobbying, legal defense, and public relations than regulators can spend on enforcement. A company with a $500 million legal budget facing a DPA with a $25 million total budget has a structural advantage in any adversarial proceeding.

Cultural proximity. Regulators and regulated entities often share professional backgrounds, educational institutions, and social networks. This proximity can create an "insider/outsider" dynamic in which the regulator identifies more with industry perspectives than with the public interest.

25.3.2 Capture in Data Protection

Evidence of regulatory capture in data protection is contested but concerning:

The Irish DPC. The concentration of tech industry presence in Ireland — and the corresponding importance of the tech sector to the Irish economy — creates structural incentives for the DPC to take an approach that is perceived as friendly to industry. Whether this constitutes capture or simply reflects resource constraints and procedural caution is debated.

The FTC's revolving door. Multiple former FTC commissioners and senior staff have joined technology companies, law firms representing technology companies, or technology industry trade groups. While this is not unique to the tech sector, it raises questions about whether enforcement decisions are influenced by the prospect of future industry employment.

Industry participation in standard-setting. The GDPR's co-regulatory mechanisms — approved codes of conduct, certification schemes — allow industry to shape the operational standards that define compliance. This can be valuable (industry expertise improves standard quality) or captured (industry writes standards that favor its interests).

Sofia Reyes had spent years studying regulatory capture from her position at DataRights Alliance. She presented her analysis to Dr. Adeyemi's class during a video guest lecture:

"Capture doesn't always look like corruption. It doesn't require bad actors. It operates through perfectly legal channels — industry conferences where regulators and executives socialize, comment processes where companies submit hundreds of pages of technical analysis that under-resourced DPAs cannot independently verify, revolving doors that create invisible loyalties. The result is not a regulator that is bought. It's a regulator that is slowly, incrementally, unconsciously aligned with the perspective of the industry it oversees."

Power Asymmetry: Regulatory capture is the Power Asymmetry operating within the governance system itself. The asymmetry between a multinational technology company and a data protection authority — in resources, expertise, legal capacity, political influence — means that even well-intentioned regulators face structural disadvantages. The people with the most data power have the most ability to shape the rules governing that power.

25.3.3 Mitigating Capture

Mechanisms to reduce regulatory capture include:

  • Adequate funding. DPAs need budgets proportional to their mandate. The EU should consider centralized funding mechanisms to prevent national budget constraints from undermining enforcement.
  • Cooling-off periods. Mandatory waiting periods between leaving a regulatory position and joining a regulated company.
  • Civil society participation. Ensuring that public interest organizations have standing, resources, and access to participate in regulatory proceedings on equal footing with industry.
  • Transparency. Public disclosure of meetings between regulators and industry representatives, lobbying expenditures, and enforcement decision rationale.
  • Multi-DPA enforcement. The EDPB's dispute resolution mechanism, which allows other DPAs to challenge a lead authority's decision, provides a check on single-DPA capture.

25.4 Compliance vs. Ethics: The Gap

25.4.1 The Distinction

Chapter 6 introduced the gap between law and ethics — Dr. Adeyemi's distinction between the "floor" (what the law requires) and the "ceiling" (what ethics demands). In the context of data governance, this gap is not abstract. It manifests in specific, consequential ways:

| Compliance Question | Ethics Question |
|---------------------|-----------------|
| Does our privacy policy comply with GDPR Article 13? | Does anyone actually understand our privacy policy? |
| Did we obtain consent as required by law? | Was the consent meaningful — could the person have reasonably refused? |
| Are our data practices within the letter of the law? | Would the people whose data we collect be comfortable if they truly understood what we do? |
| Have we documented our data protection impact assessment? | Did we genuinely consider the impact, or did we fill out the template? |
| Did we meet the 72-hour breach notification deadline? | Should we have designed the system to prevent the breach in the first place? |

25.4.2 The Compliance Mindset

Organizations that adopt a compliance-first approach — treating legal requirements as the goal rather than the floor — exhibit predictable patterns:

Checkbox culture. Compliance becomes a documentation exercise. The Data Protection Impact Assessment is filled out, filed, and forgotten. The consent banner is deployed, and the boxes are checked. The Data Protection Officer is appointed, and the organizational chart is updated. The substance of these mechanisms — genuinely assessing impact, genuinely informing users, genuinely empowering the DPO — may be absent.

Optimization at the margins. Compliance-minded organizations seek the most favorable interpretation of ambiguous requirements. If the law says "reasonable security measures," the compliance team asks: "What is the minimum we can do that would be considered 'reasonable' if we were audited?" An ethics-minded organization asks: "What security measures would we want in place if this were our data?"

Jurisdictional arbitrage. Compliance-minded organizations may locate data processing in jurisdictions with the weakest requirements, establish lead supervisory authority in the most lenient DPA's territory, or structure activities to fall outside regulatory scope. All of this may be perfectly legal. None of it is ethical.

25.4.3 Moving From Compliance to Ethics

Ray Zhao described NovaCorp's journey from compliance to ethics during his guest lecture:

"When I arrived at NovaCorp, the company's data governance was entirely compliance-driven. We had a GLBA privacy notice because GLBA required it. We had PCI-DSS controls because the card brands required them. We had a SOX compliance program because the auditors required it. Every data governance activity was tied to a regulatory requirement. There was no initiative to do anything beyond what regulators demanded.

"The shift happened when we started asking different questions. Not 'what does the regulation require?' but 'what do our customers expect and deserve?' Not 'will this pass an audit?' but 'is this the right thing to do with someone's financial data?' Those questions led us to stronger protections, more transparent practices, and better relationships with our customers — not because the law changed, but because our perspective changed."

Connection to Chapter 6: This shift maps directly onto the ethical frameworks from Chapter 6. Compliance is a rules-based approach — close to deontological ethics in its adherence to prescriptive requirements. The move to ethics incorporates virtue ethics (what kind of organization do we want to be?), care ethics (what do we owe to the people whose data we hold?), and justice theory (are our practices fair to all stakeholders, including the most vulnerable?).

Dr. Adeyemi crystallized the distinction: "Compliance asks: 'Are we following the rules?' Ethics asks: 'Are the rules sufficient — and if not, what more should we do?' The best organizations do not wait for regulators to tell them what is right. They figure it out for themselves, using ethical reasoning, stakeholder engagement, and a genuine commitment to the well-being of the people whose data they hold. That is the subject of Part 5."


25.5 The Limits of Law

25.5.1 What Law Can Do

Data protection law has achieved significant things:

  • Established norms. The GDPR has made "privacy by design," "data minimization," and "purpose limitation" part of the global vocabulary of responsible data management, even in jurisdictions where they are not legally required.
  • Created accountability mechanisms. DPAs, data protection officers, data protection impact assessments, and breach notification requirements create institutional structures for accountability that did not exist before.
  • Imposed costs on bad behavior. Multi-billion-euro fines, consent decrees, and enforcement actions have made data protection violations financially consequential for even the largest companies.
  • Empowered individuals. Data subject rights — access, portability, erasure, objection — give individuals tools to understand and control how their data is used.
  • Catalyzed organizational change. Regulatory requirements have driven investment in data governance programs, privacy engineering, and ethics infrastructure that many organizations would not have undertaken voluntarily.

25.5.2 What Law Cannot Do

Despite these achievements, law faces inherent limitations in governing data:

Speed. Legislative processes are slow. Technology moves fast. The GDPR took four years to negotiate and two more to implement. By the time it became applicable in 2018, the data practices it was designed to govern had already evolved. This regulatory lag is structural — law will always follow technology, never lead it.

Complexity. Data systems are extraordinarily complex. Understanding what an algorithm does, what data it uses, and what biases it contains requires technical expertise that most regulators do not possess in sufficient depth. Regulating what you cannot fully understand is inherently limited.

Extraterritoriality. Data crosses borders effortlessly; jurisdiction stops at them. A law enacted in Brussels cannot be enforced in Beijing without the cooperation of Chinese authorities. The GDPR's extraterritorial scope is a partial solution, but enforcement against companies with no physical EU presence remains challenging.

Culture. Law can mandate practices, but it cannot mandate values. An organization that complies with data protection law out of fear of fines will behave differently from one that complies out of genuine respect for privacy. When enforcement attention shifts elsewhere, the fear-based organization will cut corners. The values-based organization will not. Law can create floors, but it cannot create ceilings.

Power. The most fundamental limitation: data protection law operates within existing power structures. It can constrain those structures at the margins — imposing fines, requiring impact assessments, creating individual rights — but it cannot fundamentally redistribute the power that data confers. The companies that collect data on billions of people, the governments that conduct mass surveillance, the data brokers that trade in personal information — all remain powerful actors that law can modulate but not transform.

25.5.3 What Fills the Gap?

If law alone is insufficient, what else is needed?

  • Organizational ethics programs — the subject of Chapter 26
  • Professional norms — ethical standards within data science, engineering, and product management communities
  • Market pressure — consumers choosing privacy-respecting products and services
  • Civil society — organizations like DataRights Alliance that advocate, research, and hold power accountable
  • Technology design — privacy by design, security by default, and ethical AI development practices
  • Democratic participation — citizens engaging in data governance through public consultations, citizen assemblies, and political action

Law is necessary but not sufficient. The chapters ahead explore what else is needed.


25.6 Self-Regulation and Industry Codes of Conduct

25.6.1 The Appeal of Self-Regulation

Industry self-regulation holds persistent appeal for several reasons:

  • Expertise. Industry understands its own practices better than external regulators.
  • Speed. Industry can develop standards faster than legislative processes.
  • Flexibility. Self-regulatory standards can be updated quickly as technology evolves.
  • Buy-in. Standards developed by industry may be implemented more willingly than externally imposed requirements.

25.6.2 The Track Record

The track record of self-regulation in data protection is, to put it charitably, uneven.

The 1990s US experiment. In the late 1990s, the Clinton administration adopted a policy of "industry self-regulation" for online privacy. Industry groups developed voluntary privacy frameworks (TRUSTe, BBBOnline) and committed to self-governance. The result was privacy policies that were lengthy, legalistic, and incomprehensible; data practices that were invasive and opaque; and the growth of a surveillance advertising industry that monetized personal data on an unprecedented scale. The self-regulatory approach was widely judged a failure.

Google's AI ethics. In 2018, Google published its AI Principles — a voluntary ethical framework for AI development. In 2019, Google established an external AI ethics advisory council, which was disbanded within a week amid controversy over its membership. In 2020 and 2021, Google forced out two prominent AI ethics researchers who had published research critical of the company's large language models. The gap between Google's stated principles and its operational behavior illustrates the limits of voluntary ethics commitments.

Industry codes under the GDPR. The GDPR provides for approved codes of conduct (Article 40) developed by industry associations and approved by DPAs. To date, relatively few codes have been approved, and their impact on actual data practices remains uncertain.

25.6.3 When Self-Regulation Works

Self-regulation is most effective when:

  1. The incentives align. When poor data practices genuinely harm the company's business — through reputational damage, customer churn, or legal liability — self-regulation can be effective because the company has a business reason to comply.
  2. There is a credible regulatory backstop. Self-regulation works better when the implicit threat of government regulation provides motivation. The prospect of legislation can drive genuine self-regulatory efforts.
  3. Standards are independently verified. Self-regulation with third-party auditing is more credible than self-regulation with self-assessment.
  4. Civil society has a seat at the table. Multi-stakeholder self-regulatory processes — including industry, civil society, and academic representatives — produce better outcomes than industry-only processes.

When these conditions are absent — when the incentives favor data exploitation, when there is no regulatory backstop, when assessment is self-conducted, and when civil society is excluded — self-regulation is a fig leaf.


25.7 Eli Testifies: The Detroit Data Governance Ordinance

25.7.1 The Context

In the autumn of 2025, the Detroit City Council considered a proposed Data Governance Ordinance — a local law that would require city agencies and their contractors to:

  • Conduct community impact assessments before deploying surveillance technology or algorithmic decision-making systems
  • Establish a public registry of all data-intensive systems operated by or on behalf of the city
  • Create a Community Data Advisory Board with seats for residents of communities most affected by data practices
  • Require annual algorithmic audits for systems used in policing, public benefits, and housing

The ordinance was modeled on similar initiatives in Oakland, San Francisco, and New York City — and it was opposed by city agencies that viewed it as bureaucratic, by technology vendors that viewed it as burdensome, and by police unions that viewed it as obstructive.

25.7.2 Eli's Testimony

Eli testified before the city council during the public comment period. His testimony drew on everything he had learned in Dr. Adeyemi's course — and everything he had experienced growing up in a neighborhood that had been subjected to data-intensive governance without data governance protections.

"Council members, my name is Eli Okonkwo. I'm a senior at [the university], and I grew up in the Brightmoor neighborhood.

"Three years ago, Smart City sensors were installed on lampposts in my neighborhood. Nobody asked us. Nobody told us what data was being collected. Nobody explained that the 'traffic optimization' sensors would also collect WiFi probe requests from our phones, ambient audio, and license plate data. We found out from a journalism student's FOIA request.

"Two years ago, the Detroit Police Department began using a predictive policing algorithm — ShotSpotter-enhanced patrol deployment — that disproportionately directed police resources to Black neighborhoods. The algorithm was trained on historical crime data, which reflected decades of over-policing in communities like mine. The result was a feedback loop: we were policed more because the algorithm said we should be, and we generated more data because we were policed more.

"I am not here to argue that data systems are inherently evil. I am here to argue that data systems deployed without community input, without transparency, without accountability, and without independent oversight are a form of governance that is incompatible with democracy.

"This ordinance is not about stopping technology. It is about governing technology — the same way we govern every other form of power in a democratic society. You would not allow a corporation to operate a police force in my neighborhood without oversight. You should not allow a corporation to operate an algorithmic system that functions like a police force without oversight either.

"I ask you to pass this ordinance. Not because it is perfect — it is not. But because the alternative — the status quo, in which data systems are deployed in my community with no transparency, no accountability, and no community voice — is unacceptable."

The council chamber was quiet for a moment after Eli finished. Several council members asked follow-up questions. The ordinance passed 7-2 three weeks later.

25.7.3 The Significance

Eli's testimony — and the ordinance's passage — illustrate several themes that thread through this chapter:

Local governance fills federal gaps. In the absence of comprehensive federal data protection legislation, cities and states are developing their own frameworks. These local efforts are imperfect, inconsistent, and limited in scope — but they represent democratic communities asserting control over the data systems that govern their lives.

Affected communities must have voice. The ordinance's Community Data Advisory Board ensures that the people most affected by data systems — not just the technologists who build them or the administrators who deploy them — have a role in governance decisions. This is not a courtesy. It is a democratic requirement.

Enforcement requires institutional capacity. The ordinance created obligations, but whether Detroit has the resources to enforce them — to conduct algorithmic audits, to maintain a public registry, to staff a community advisory board — remains an open question. As we've seen throughout this chapter, the gap between law and practice is often a resource gap.

Power Asymmetry: Eli's testimony is a direct confrontation with the Power Asymmetry: a 23-year-old college student stands in a city council chamber, armed with data literacy and democratic conviction, challenging the deployment of algorithmic systems backed by multinational technology vendors and institutional inertia. The asymmetry is vast. But the democratic process — public testimony, elected representatives, binding law — provides a mechanism for channeling that asymmetry into accountability. Imperfect, slow, incomplete — but real.


25.8 Sofia Reyes: The Advocacy Perspective

Sofia brought the DataRights Alliance's perspective to the class through a video lecture that complemented Ray Zhao's corporate view:

"At DataRights Alliance, we spend most of our time on enforcement. Not because enforcement is the most interesting part of data governance — it isn't — but because it's the part where theory meets reality.

"Here's what I've learned: the most beautifully written law is worthless if no one enforces it. And enforcement depends on three things: resources, independence, and political will. Resources are the simplest to measure and the hardest to secure — every DPA budget debate is a fight. Independence is harder to measure but equally important — a DPA that is formally independent but practically constrained by industry relationships or government pressure is independent in name only. And political will — the decision to pursue powerful entities, to impose meaningful penalties, to risk the political backlash — is the rarest and most important.

"We advocate for all three. We file complaints. We intervene in cases. We publish enforcement scorecards that evaluate DPA performance. We lobby for increased budgets. We support whistleblowers. And we try to shift the narrative from 'data protection is a burden on business' to 'data protection is a precondition for democratic society.'

"I'll leave you with one thought: every person who says 'regulation doesn't work' should ask themselves who benefits from that conclusion. Because the companies with the most to gain from unregulated data practices are often the loudest voices arguing that regulation is ineffective. If regulation truly didn't work, they wouldn't spend hundreds of millions of dollars trying to weaken it."


25.9 Closing Part 4: From Rules to Responsibility

This chapter — and Part 4 — ends with a tension that cannot be resolved by law alone.

We have spent six chapters mapping the governance landscape: regulatory models, AI regulation, organizational governance, cross-border flows, sector-specific frameworks, and enforcement. The architecture is impressive in its scope and ambition. It is also, as we've seen, incomplete, under-resourced, politically contested, and structurally lagging behind the technology it seeks to govern.

The fundamental insight of Part 4 is this: governance is necessary but not sufficient. Laws provide the floor. Enforcement makes the floor real. But the ceiling — the aspiration to handle data not just legally but well — requires something more than rules. It requires responsibility.

That is the subject of Part 5.


25.10 Chapter Summary

Key Concepts

  • Data protection authorities vary enormously in resources, independence, and effectiveness. The GDPR's one-stop-shop mechanism concentrates enforcement responsibility in a small number of DPAs.
  • GDPR enforcement patterns show a focus on Big Tech, consent violations, and transparency failures, with multi-billion-euro fines but limited individual redress.
  • FTC enforcement relies on Section 5 authority and consent decrees, which have been criticized as insufficient to change corporate behavior.
  • Regulatory capture occurs through revolving doors, information asymmetry, resource disparity, and cultural proximity — undermining regulatory independence without requiring corruption.
  • Compliance vs. ethics is a critical distinction: compliance means following rules; ethics means doing right, even when rules are absent, ambiguous, or insufficient.
  • Law has inherent limits — regulatory lag, technical complexity, jurisdictional boundaries, the inability to mandate values, and the persistence of structural power.
  • Self-regulation has a poor track record in data protection but can be effective with aligned incentives, credible regulatory backstops, independent verification, and multi-stakeholder participation.

Key Debates

  • Is the GDPR's enforcement regime effective, or do large fines against Big Tech mask systemic under-enforcement?
  • Does the one-stop-shop mechanism create efficiency or capture?
  • Can the gap between compliance and ethics be closed through regulation, or does it require cultural and organizational change?
  • Is self-regulation a legitimate governance approach or an industry strategy to forestall real regulation?
  • What role should affected communities play in data governance — advisory, consultative, or decision-making?

Applied Framework

When evaluating data governance enforcement:

1. Authority: Does the enforcer have adequate legal authority to investigate and penalize?
2. Resources: Are staffing and budget proportional to the scope of the regulatory mandate?
3. Independence: Is the enforcer structurally, financially, operationally, and personally independent?
4. Effectiveness: Do enforcement actions lead to actual changes in behavior, or merely to financial settlements?
5. Beyond compliance: Does the enforcement regime encourage organizations to go beyond minimum legal requirements?
6. Community voice: Are the people affected by data practices involved in governance decisions?
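The six questions of the Applied Framework can be turned into a simple assessment rubric. The sketch below is one minimal way to do so in Python; the 0–2 scoring scale and the example scores are illustrative assumptions, not data from the chapter, and `EnforcementAssessment` is a hypothetical name.

```python
# A minimal sketch of the six-question enforcement checklist as a scoring
# rubric. Dimension names follow the Applied Framework; the 0-2 scale
# (0 = absent, 1 = partial, 2 = strong) is an illustrative assumption.
from dataclasses import dataclass, fields


@dataclass
class EnforcementAssessment:
    authority: int          # adequate legal authority to investigate/penalize?
    resources: int          # staffing and budget proportional to mandate?
    independence: int       # structural, financial, operational, personal?
    effectiveness: int      # behavior change, or merely settlements?
    beyond_compliance: int  # encourages going beyond legal minimums?
    community_voice: int    # affected people involved in governance?

    def total(self) -> int:
        # Sum across all six dimensions (maximum 12).
        return sum(getattr(self, f.name) for f in fields(self))

    def weakest(self) -> list:
        # Dimensions tied for the lowest score: where to focus first.
        low = min(getattr(self, f.name) for f in fields(self))
        return [f.name for f in fields(self) if getattr(self, f.name) == low]


# Hypothetical regime with strong legal authority but an under-funded
# enforcer -- the "resource gap" the chapter describes.
regime = EnforcementAssessment(
    authority=2, resources=0, independence=1,
    effectiveness=1, beyond_compliance=0, community_voice=1,
)
print(regime.total())    # → 5
print(regime.weakest())  # → ['resources', 'beyond_compliance']
```

The point of the rubric is not the arithmetic but the diagnosis: a regime can score well on authority while failing on resources or community voice, which is exactly the law-versus-practice gap this chapter traces.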


What's Next

Part 4 has mapped the landscape of rules. Part 5: Corporate Responsibility and Data Ethics in Practice examines what happens inside organizations that take data governance seriously — not because the law demands it, but because responsibility demands it. In Chapter 26: Building a Data Ethics Program, we'll explore how organizations can move from compliance to ethics, establishing ethics committees, operationalizing ethical frameworks, and changing organizational culture. Ray Zhao and VitraMed will feature prominently — because the transition from rules to responsibility is ultimately an organizational journey.

Before moving on, complete the exercises and quiz to solidify your understanding of enforcement, compliance, and the limits of law.


Chapter 25 Exercises → exercises.md

Chapter 25 Quiz → quiz.md

Case Study: GDPR's Largest Fines: Enforcement Patterns → case-study-01.md