Chapter 38: Key Takeaways — Regulatory Approaches


  1. Regulations are slower than platforms but more durable. Individual behavior change and platform self-regulation have not adequately constrained algorithmic harms. Regulation — though slower to develop and enforce — creates baseline expectations and accountability structures that persist across platform iterations.

  2. Most existing regulation was designed for a different information environment. Section 230 (1996) and COPPA (1998) predate the smartphone era entirely, and even GDPR (drafted 2012–2016, in force 2018) was designed before short-form video and engagement-maximizing algorithmic recommendation reached their current scale. Applying old frameworks to new problems produces inadequate results.

  3. Section 230's immunity for user-generated content was designed for passive hosting, not active algorithmic amplification. The legal question of whether Section 230 protects platforms' decisions to algorithmically amplify specific content remains contested. Courts have generally extended immunity, but the Supreme Court has not directly resolved the algorithmic amplification question.

  4. Section 230 reform is complicated by the fact that left and right critics want opposite outcomes. Progressive critics want platforms to moderate more (and face liability for harmful amplified content). Conservative critics want platforms to moderate less (and face liability for politically biased moderation). These contradictory demands prevent coalition-building for any specific reform.

  5. GDPR is the world's most comprehensive data protection regulation, but it has significant enforcement gaps. The consent framework is functionally broken by dark patterns in consent interfaces. Enforcement has been undermined by under-resourced national data protection authorities (DPAs), particularly Ireland's, which oversees most major platforms' EU headquarters. Substantial fines have been levied but have not fundamentally changed platform business models.

  6. The EU Digital Services Act is the world's most ambitious platform regulation. The DSA requires algorithmic transparency, explicitly prohibits dark patterns, mandates systemic risk assessments with independent auditing, and grants researchers access to platform data, creating for the first time genuine accountability for platform design rather than just data collection.

  7. Early DSA implementation has produced concrete design changes. Meta, TikTok, and YouTube have launched chronological or non-personalized feed alternatives in direct response to DSA requirements. Formal enforcement proceedings have been opened against X and TikTok, and the researcher data access provisions are beginning to be operationalized.

  8. The DSA does not address the fundamental incentive misalignment. Platforms can satisfy DSA requirements while maintaining engagement-maximizing business models if they document risks and implement self-assessed "mitigation measures." The regulation addresses accountability for harmful practices without prohibiting the incentive to engage in them.

  9. The UK Online Safety Act's "duty of care" approach focuses on harmful content categories rather than design manipulation. The OSA is more controversial than the DSA, with significant concerns about the privacy tradeoffs of age verification, the treatment of end-to-end encryption, and the scope of government-defined harmful content: the "legal but harmful" duties for adults were dropped from the final Act, but analogous obligations remain for content deemed harmful to children.

  10. COPPA has largely failed to protect children online. The age verification mechanism (self-certification checkboxes) is systematically circumvented. Major platforms have large under-13 user bases despite nominal age minimums. Enforcement has produced significant fines but has not changed business models.

  11. Proposed successors to COPPA (KOSA) have bipartisan support but raise significant civil liberties concerns. The Kids Online Safety Act's duty of care framework for minors has broad political support but has been criticized for potentially requiring platforms to restrict legal content that adults have a right to access, and for the privacy costs of age verification.

  12. Algorithmic accountability requires more than transparency. Requiring platforms to publish algorithm descriptions (DSA transparency provisions) is valuable but insufficient because machine learning systems can be accurately described in general terms while their operational behavior remains effectively opaque. Meaningful accountability requires third-party auditing with data access, pre-deployment testing, and liability for documented harms.

  13. The EU's "Brussels Effect" extends EU regulations globally. Multinationals often implement EU standards worldwide rather than maintaining geographic variants, so EU rules reach well beyond EU borders. This makes the EU the most consequential jurisdiction for global platform governance despite not having the largest user base.

  14. Three regulatory philosophies produce different outcomes. The EU's precautionary principle justifies regulation before harms are fully proven and produces the most comprehensive framework. The US speech-protective tradition constrains content-adjacent regulation through First Amendment concerns and produces a patchwork of sector-specific rules. Chinese state control demonstrates what is technically possible while illustrating why such control is incompatible with civil liberties.

  15. The FTC has meaningful but limited authority to address dark patterns. Its Section 5 authority over unfair and deceptive practices covers specific dark patterns, and enforcement is increasing (for example, the FTC's 2023 suit over Amazon Prime's enrollment and cancellation flows). But comprehensive platform regulation, including mandatory algorithmic impact assessments and liability for engagement-amplified harms, requires new legislation the FTC cannot create unilaterally.

  16. Effective regulation would need to address incentive structures, not just practices. The fundamental problem is the misalignment between platform incentives (maximize engagement to sell advertising) and user interests (wellbeing, accurate information). Regulation that changes forms of engagement maximization without changing the incentive to maximize engagement addresses symptoms rather than causes. Liability for harm, advertising market regulation, and interoperability requirements would reach deeper.

  17. Regulation is necessary but not sufficient. No single regulatory intervention addresses all dimensions of platform harm. Effective governance requires a combination of approaches: data protection (GDPR-type), design accountability (DSA-type), content safety (OSA-type), children's protection (COPPA successor), competition (interoperability and antitrust), and algorithmic liability — alongside individual behavior change and platform self-reform.