Case Study 38-1: The EU Digital Services Act — The World's Most Ambitious Platform Regulation
What It Requires, How It Works, and What It Reveals About Regulatory Capacity
Background: Why the DSA Was Necessary
By the time the EU's Digital Services Act was proposed in December 2020, the case for a new regulatory framework was well established. The existing legal infrastructure — primarily the E-Commerce Directive (2000) and GDPR (2018) — had been designed for a different information environment. The E-Commerce Directive's notice-and-takedown framework assumed that platform harms consisted of specific illegal content that, once identified, could be removed. GDPR addressed data privacy but said nothing about algorithmic amplification or platform design manipulation.
The problems documented by researchers, journalists, civil society organizations, and whistleblowers by 2020 were different in kind from what the E-Commerce Directive contemplated:
- Algorithmic recommendation systems that amplified harmful, false, and extremist content not because platforms chose to amplify it but because engagement optimization selected for it
- Design patterns specifically engineered to override users' own preferences and extend time on platform
- Structural features that concentrated enormous market power in a handful of US-based platforms
- The systematic disadvantage of independent researchers who lacked access to data needed to assess platform effects
The DSA was designed to address these problems directly, not through the lens of individual content moderation but through the lens of platform accountability and systemic risk.
What the DSA Actually Requires
The DSA has a tiered structure: more requirements for larger platforms, with the most demanding obligations falling on Very Large Online Platforms (VLOPs) with 45+ million monthly active EU users (roughly 10% of the EU population).
For all platforms:
- Clear and transparent terms of service
- Notice-and-takedown mechanisms for illegal content
- Transparency about advertising (who placed it, and on what basis it was targeted)
- Protection-of-minors obligations, including a prohibition on showing minors advertising based on profiling
For online platforms (the DSA's middle tier):
- Notice-and-action mechanisms with user appeals
- Additional transparency reporting
- Cooperation with trusted flaggers (organizations given elevated reporting status)
For VLOPs (the tier that matters most):
Algorithmic transparency and user control:
- Users must be able to opt out of recommendation systems based on profiling
- Platforms must provide at least one recommendation system not based on profiling (a "chronological" or "non-personalized" alternative)
- Platforms must explain, upon request, why specific content was recommended to a user
- The main parameters of recommendation systems must be publicly disclosed
Dark pattern prohibition:
- Article 25 explicitly prohibits "dark patterns," defined as practices that "materially distort or impair the ability of recipients of the service to make free and informed decisions" (a prohibition that in fact applies to all online platforms, not only VLOPs)
- Specific examples include: consent interfaces using deceptive framing, subscription cancellation made more difficult than enrollment, repeated requests for consent after initial refusal, and hiding privacy-protective options
Systemic risk assessment:
- Annual risk assessments addressing risks to fundamental rights, civic discourse, electoral integrity, public health, gender-based violence, and minors' wellbeing
- Assessments must consider algorithm effects, advertising targeting, data access policies, and content moderation practices
- Assessments must be independently audited
Researcher data access:
- Platforms must provide vetted researchers (certified through a national Digital Services Coordinator) with access to data necessary to assess systemic risks
- Researchers cannot be given access to data that would violate individual privacy, but may receive anonymized or aggregated data, and in some cases more detailed access with appropriate safeguards
Emergency mechanisms:
- The European Commission can require specific mitigation measures during crises (defined as events posing a serious threat to public security or public health) on 48 hours' notice
Enforcement Architecture
A key design choice in the DSA concerned enforcement. GDPR had delegated enforcement to national Data Protection Authorities; because most major platforms have their European headquarters in Ireland, the chronically under-resourced Irish DPA became their lead supervisory authority and was slow to act. The DSA corrected this by:
- Giving the European Commission direct enforcement authority over VLOPs
- Creating national Digital Services Coordinators (DSCs) for smaller platforms
- Establishing a European Board for Digital Services to coordinate
- Setting penalties at 6% of global annual turnover (rather than GDPR's 4%)
The Commission's direct enforcement over VLOPs means that enforcement is not dependent on the capacity of the country where a platform has its European headquarters.
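The gap between the two fine ceilings can be made concrete with back-of-the-envelope arithmetic. The sketch below is illustrative only; the turnover figure is hypothetical and not drawn from any actual enforcement action:

```python
def max_fine(turnover_eur: float, cap_rate: float) -> float:
    """Upper bound on a turnover-based fine: turnover times the cap rate."""
    return turnover_eur * cap_rate

# Hypothetical platform with EUR 100 billion in global annual turnover.
turnover = 100e9

print(f"DSA cap (6%):  EUR {max_fine(turnover, 0.06):,.0f}")   # EUR 6,000,000,000
print(f"GDPR cap (4%): EUR {max_fine(turnover, 0.04):,.0f}")   # EUR 4,000,000,000
```

At this scale, the two percentage points between the GDPR and DSA ceilings amount to billions of euros of additional exposure for the largest platforms.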
Early Implementation: What Has Actually Changed
The DSA formally applied to VLOPs from August 2023 and to all platforms from February 2024. The first year of implementation produced several noteworthy developments:
Chronological feeds. In direct response to the requirement for a non-personalized recommendation option, Meta launched chronological feed options on Facebook and Instagram. TikTok launched a "Following" feed in Europe that shows only content from accounts the user has chosen to follow, without algorithmic recommendations. YouTube launched a "Latest" tab showing recent videos without algorithmic amplification. These are genuine user-empowering changes attributable to regulation.
Formal proceedings. The European Commission opened formal investigations into:
- X (formerly Twitter), over dark patterns, misleading interfaces, and failure to adequately address risks from illegal content and disinformation
- TikTok, over protection of minors and over the launch of its TikTok Lite product (a lighter version of the app) without a prior risk assessment
- Meta, over advertising transparency and over whether its "pay or consent" model (EU users can pay for Meta products without ads instead of consenting to targeted ads) satisfies the requirement that consent be freely given
Algorithmic transparency reports. VLOPs published their first algorithmic transparency documents, disclosing (at varying levels of detail) the parameters of their recommendation systems. These are a significant information gain compared to what was publicly available before, though critics have noted they are sometimes quite general.
Researcher access. The researcher data access framework began to take shape, with national DSCs developing processes for certifying researchers. Early research projects began accessing platform data under DSA provisions; previously, such access would have required direct platform partnerships, which platforms could and did decline.
What the DSA Has Not Changed
An honest assessment requires acknowledging what the DSA has not achieved in its early implementation:
Business model fundamentals are unchanged. Platforms' core business model — maximizing engagement to sell advertising — is not prohibited or directly constrained by the DSA. Platforms that conduct risk assessments acknowledging that engagement maximization harms users can satisfy DSA requirements while maintaining the practices that cause harm, as long as they also implement "mitigation measures" they determine are adequate.
The dark pattern prohibition is proving complex to enforce. What exactly constitutes a "dark pattern" under DSA Article 25 is still being defined through enforcement actions and legal interpretation. Platforms have significant latitude to argue that their designs fall outside the prohibition, and litigation will take years.
Compliance theater risk is real. Transparency reports can be technically accurate while obscuring operationally significant information. Risk assessments can acknowledge risks while underestimating them or proposing inadequate mitigations. The information asymmetry between platforms (which know how their systems actually work) and regulators (who must evaluate what platforms tell them) is substantial.
Enforcement capacity is a genuine constraint. The European Commission has taken on responsibility for directly enforcing against dozens of designated VLOPs with staff that is adequate for initial proceedings but will be stretched as the enforcement workload grows.
What the DSA Reveals About Regulatory Capacity
Beyond its specific provisions, the DSA is instructive about the requirements and limits of platform regulation more generally:
Specificity matters. The DSA's explicit prohibition of dark patterns, specific requirements for recommendation system alternatives, and specific researcher data access mechanisms are more actionable than GDPR's more general principles. Regulations that specify required and prohibited behaviors are more effective than regulations that specify goals and leave implementation to platforms.
Enforcement architecture is as important as substantive requirements. GDPR had good requirements; the national enforcement delegation undermined them. The DSA's direct Commission enforcement for VLOPs addresses this. Regulatory capacity — funding, expertise, independence — is necessary for requirements to translate into behavior change.
International regulatory leadership has effects beyond jurisdiction. The DSA is producing changes in platform behavior that extend beyond the EU, through the Brussels Effect and through platforms choosing not to create EU-specific versions of features. This gives the regulation a broader reach than its text alone suggests.
The fundamental tension remains. The DSA is the world's most ambitious platform regulation, and it does not directly address the fundamental misalignment between engagement-maximizing business models and user wellbeing. It requires platforms to assess the risks from their practices; it does not prohibit the practices that cause those risks if platforms implement mitigation measures. Whether the mitigation measure framework is adequate to address serious systemic harms remains the central open question in DSA implementation.
Conclusion: The DSA as a Proof of Concept
The EU Digital Services Act demonstrates that comprehensive platform regulation is achievable — that legislative coalitions can be built, technical requirements can be specified, and enforcement architecture can be designed. This matters because it refutes the claim that platform regulation is inherently impossible.
It also demonstrates the limits of what regulation can accomplish when it leaves fundamental business model incentives unchanged. The DSA is the most ambitious regulation enacted anywhere, and it addresses symptoms of the misalignment between platform incentives and user welfare without addressing the misalignment itself.
This is not a failure specific to the DSA. It reflects a genuine political constraint: directly regulating advertising business models, or imposing liability for engagement-amplified harms, faces more powerful political opposition than transparency and accountability requirements. The DSA achieves what was politically achievable. Whether it is sufficient for the harms documented in this book is a different question, and the honest answer is: probably not fully, but substantially more than what preceded it.
This case study draws on: Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services (Digital Services Act). European Commission enforcement decisions and statements, 2023-2024. Researchers' assessments including: Leerssen, P. et al. (2023). "DSA Implementation Challenges." Internet Policy Review; and ongoing reporting by Politico Europe, TechCrunch, and the EURACTIV platform on DSA enforcement developments.