Case Study: The UK Age Appropriate Design Code in Practice
"If a service is likely to be accessed by children, it should be designed for children." — Elizabeth Denham, UK Information Commissioner (2020)
Overview
On September 2, 2021, the UK Age Appropriate Design Code (AADC), better known as the Children's Code, came into full effect at the end of its one-year transition period. It was the most ambitious attempt by any jurisdiction to shift the burden of children's data protection from individual consent to organizational design. Rather than asking whether a child (or parent) had clicked "I agree," the AADC asked whether the service itself was designed to protect children.
The result was a global earthquake in technology regulation. Platforms with hundreds of millions of users — Instagram, TikTok, YouTube, Snapchat, and dozens of others — changed their default settings, redesigned their notification systems, and altered their data collection practices. Some of these changes applied worldwide, not just in the UK. A single regulatory code, issued by the Information Commissioner's Office of a country with 67 million people, reshaped the digital experience of children everywhere.
This case study examines how the AADC works, how platforms responded, what it achieved, and where its limitations lie.
Skills Applied:
- Analyzing regulatory design choices and their consequences
- Evaluating corporate compliance strategies (genuine vs. performative)
- Comparing regulatory approaches across jurisdictions
- Assessing the relationship between design obligations and user outcomes
The AADC's Design
The Fifteen Standards
The AADC contains fifteen standards of age-appropriate design. Each standard specifies a requirement for online services "likely to be accessed by children," with a child defined as anyone under 18. The fifteen standards are:
- Best interests of the child — The best interests of the child should be a primary consideration when designing and developing online services likely to be accessed by children.
- Data protection impact assessments — Undertake a DPIA that considers the specific risks to children.
- Age-appropriate application — Take a risk-based approach to recognizing the age of individual users and ensure you effectively apply the standards in this code to child users.
- Transparency — Provide clear information to children and their parents about data practices, using language appropriate to the child's age.
- Detrimental use of data — Do not use children's personal data in ways that are detrimental to their well-being or that run counter to industry codes of practice, government guidance, or the commissioner's advice.
- Policies and community standards — Uphold published terms, policies, and community standards.
- Default settings — Settings must be "high privacy" by default, unless there is a compelling reason for a different default and that reason takes account of the best interests of the child.
- Data minimization — Collect and retain only the minimum amount of personal data necessary.
- Data sharing — Do not disclose children's data unless there is a compelling reason to do so, taking account of the best interests of the child.
- Geolocation — Switch geolocation options off by default and provide an obvious sign to children when location tracking is active.
- Parental controls — If parental controls are provided, give the child age-appropriate information about this.
- Profiling — Switch options that use profiling off by default, unless there is a compelling reason for profiling that takes account of the best interests of the child.
- Nudge techniques — Do not use nudge techniques to lead or encourage children to provide unnecessary personal data, weaken privacy protections, or extend their use beyond what they intended.
- Connected toys and devices — If providing a connected toy or device, include effective tools to enable conformance with this code.
- Online tools — Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.
What makes this framework distinctive is its orientation. COPPA asks: "Did the parent consent?" The AADC asks: "Is the service designed to protect the child?" This is a paradigm shift — from consent-based governance to design-based governance.
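To make this design orientation concrete, the sketch below shows in Python how Standards 7 (default settings), 10 (geolocation), and 12 (profiling) might translate into age-keyed account defaults. It is a minimal illustration: the class, field, and function names are invented for this example and do not reflect any platform's actual code.

```python
# Hypothetical sketch (not any platform's actual code): "high privacy by
# default" (Standard 7), geolocation off by default (Standard 10), and
# profiling off by default (Standard 12) expressed as account defaults.
from dataclasses import dataclass


@dataclass
class AccountDefaults:
    account_private: bool
    geolocation_enabled: bool
    profiling_enabled: bool
    personalized_ads: bool


def defaults_for_age(age: int) -> AccountDefaults:
    """Return default settings for a new account, keyed on the user's age.

    The AADC treats anyone under 18 as a child, so the protective defaults
    apply to the full under-18 range, not only to younger children.
    """
    if age < 18:
        # High-privacy defaults: the user may change some of these later, but
        # the burden of weakening protection is not placed on them at sign-up.
        return AccountDefaults(
            account_private=True,
            geolocation_enabled=False,
            profiling_enabled=False,
            personalized_ads=False,
        )
    return AccountDefaults(
        account_private=False,
        geolocation_enabled=False,
        profiling_enabled=True,
        personalized_ads=True,
    )


print(defaults_for_age(14))  # account_private=True, profiling_enabled=False, ...
```

The point of the sketch is the shift in burden: under a design-based code, protection is the starting state, and any weaker configuration requires either the user's later choice or a documented "compelling reason."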
The "Likely to Be Accessed" Standard
The AADC's most consequential design choice may be its scope. It applies not only to services "directed at children" (the COPPA standard) but to services "likely to be accessed by children." This means that mainstream platforms — social media, video streaming, gaming, search engines, messaging — must comply, even if their primary audience is adults.
The ICO provided guidance on how to assess whether a service is "likely to be accessed by children," including:
- The nature of the content
- The way the service is marketed
- Whether similar services attract child users
- Available audience data and industry research
- The design and functionality of the service
This broad scope closed the loophole that allowed platforms to avoid children's data protection obligations by claiming they were "not a children's service."
Platform Responses
Instagram
In September 2021, coinciding with the AADC's enforcement date, Instagram announced several changes:
- Default to private accounts for users under 16. New accounts created by users under 16 (or under 18 in certain countries) were set to private by default.
- Restrictions on adult messaging. Adults who had not previously been connected to a teen user could not send them direct messages.
- Advertising restrictions. Instagram disabled certain advertising targeting options for users under 18, including interest-based and activity-based targeting. Advertisers could only target teens based on age, gender, and location (see the sketch after this list).
- "Take a Break" reminders. Instagram introduced optional reminders encouraging users to take breaks after extended sessions.
Instagram explicitly stated that some of these changes were influenced by the AADC and similar regulatory developments. The changes were applied globally — not just to UK users — a phenomenon known as the "Brussels effect" (or in this case, the "London effect"), where a single jurisdiction's regulation influences global platform behavior because it is more efficient to implement uniform changes than to maintain separate systems for different countries.
Assessment: The changes were substantive but incomplete. Default-to-private addressed Standard 7 (default settings). Messaging restrictions addressed safety concerns. But the fundamental business model — algorithmically amplified content feeds designed for maximum engagement — remained unchanged. Instagram's recommendation algorithm continued to operate on the same engagement-optimization logic for teen users, even if some of the most harmful content categories were subject to additional restrictions.
TikTok
TikTok implemented changes including:
- Disabling direct messages for users under 16.
- Limiting notifications for younger users (no push notifications after 9 p.m. for users under 16, and after 10 p.m. for users 16-17; see the sketch after this list).
- Disabling duets and stitches (features that allow users to create content alongside strangers' content) for users under 16.
- Default screen time limits — a 60-minute daily screen time limit for users under 18, after which users must actively choose to continue.
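The notification curfew and screen-time limit reduce to simple age-banded rules. The sketch below expresses that logic in Python using the cutoff times from the list above; the function names and structure are illustrative assumptions, not TikTok's implementation.

```python
# Hypothetical sketch of age-banded notification curfews and a default screen
# time limit, modeled loosely on the rules described above.
from datetime import time
from typing import Optional


def push_notifications_allowed(age: int, local_time: time) -> bool:
    """Apply the evening curfew: 9 p.m. for under-16s, 10 p.m. for 16-17."""
    if age < 16:
        curfew = time(21, 0)
    elif age < 18:
        curfew = time(22, 0)
    else:
        return True  # no curfew for adults
    # A fuller implementation would also handle the morning end of the curfew.
    return local_time < curfew


def default_daily_limit_minutes(age: int) -> Optional[int]:
    """Under-18 accounts get a 60-minute default limit; adults get none."""
    return 60 if age < 18 else None


print(push_notifications_allowed(15, time(21, 30)))  # False
print(default_daily_limit_minutes(17))               # 60
```

Note that, as with the high-privacy defaults, these are defaults rather than hard blocks: the user can choose to continue past the screen-time prompt, which is why the assessment below turns on what the defaults do not reach.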
Assessment: TikTok's changes went further than most platforms on notifications and screen time. But the company's core challenge — that its algorithm is explicitly designed to learn individual preferences and deliver content that maximizes watch time — remained unaddressed by the AADC. A design code focused on data collection and settings cannot easily reach the algorithmic architecture that determines what children see.
YouTube
YouTube's response predated the AADC. In 2019, following an FTC settlement over COPPA violations, YouTube required creators to designate children's content as "made for kids"; on such content, personalized ads, comments, and live chat are disabled and data collection is limited. YouTube Kids, its separate children's app, applied stricter controls.
Assessment: YouTube's approach was structural — separating children's content into a distinct regulatory category. But the effectiveness depends on accurate content classification. Content creators self-designate whether their videos are "made for kids," and gaming the classification (to retain advertising revenue) is an ongoing challenge. Children who use the main YouTube platform (rather than YouTube Kids) encounter the same algorithmic recommendation system as adult users.
What the AADC Achieved
The Positive Case
The AADC achieved something that consent-based regulation had failed to achieve for over two decades: it compelled major platforms to change their default settings, restrict data collection, and redesign features for child users. The "likely to be accessed" standard ensured that mainstream platforms could not avoid compliance. The design-orientation of the code meant that the burden fell on companies rather than on parents or children.
The AADC also demonstrated the "Brussels effect" in action — a single jurisdiction's regulation changing global platform behavior. When Instagram changed its default privacy settings for teen users worldwide, it did so because maintaining separate systems for UK users and non-UK users was impractical. The AADC thus had an extraterritorial impact far exceeding the UK's population.
Perhaps most importantly, the AADC shifted the policy conversation. Before the code, debates about children's online safety focused primarily on parental responsibility and content moderation. After the code, the conversation expanded to include platform design — the recognition that the architecture of digital services, not just their content, shapes children's experiences.
The Limitations
Design vs. Architecture. The AADC addresses data collection, default settings, and specific features. It does not directly regulate the algorithmic recommendation systems that determine what children see. A platform can comply with every AADC standard while still running an engagement-optimization algorithm that surfaces content harmful to children's mental health. Standard 12 (profiling) requires profiling to be switched off "by default," but platforms can invoke the code's "compelling reason" exception by arguing that their recommendation-driven services cannot function without some form of profiling.
Enforcement Challenges. The ICO has limited resources. Investigating whether a platform's compliance is genuine or performative requires deep technical analysis of platform architecture — the kind of access that regulators rarely have. The ICO can issue enforcement notices and fines, but the gap between the code's ambition and the regulator's capacity is significant.
Age Verification Remains Unsolved. The AADC requires platforms to take a "risk-based approach to recognizing the age of individual users." But the age verification paradox (Section 35.2) remains: how do you verify age without collecting data that creates new privacy risks? Most platforms rely on self-reported age during registration — a system easily circumvented by any child who knows to enter a false birth date.
Industry Framing. Some platforms framed their AADC compliance as corporate social responsibility rather than regulatory obligation, using their changes as marketing tools ("we care about young people's safety") while lobbying against more stringent regulation. This is the familiar pattern from Chapter 26: corporate ethics as both genuine commitment and strategic positioning.
The Global Ripple Effect
The AADC inspired similar initiatives worldwide:
- California's Age-Appropriate Design Code Act (2022) — modeled directly on the UK AADC, requiring businesses to configure privacy settings to the highest level for children. (Its implementation has been subject to legal challenge on First Amendment grounds.)
- The EU Digital Services Act (2022) — includes provisions requiring platforms to protect minors, drawing on AADC principles.
- Australia's Online Safety Act (2021) — established online safety expectations for social media services, informed by the AADC model.
The AADC did not create a perfect regulatory framework. But it created a proof of concept — demonstrating that design-based regulation of children's digital environments is feasible, enforceable (if imperfectly), and capable of compelling meaningful changes in platform behavior.
Discussion Questions
- The AADC applies to services "likely to be accessed by children." Should this standard also apply to services that are occasionally accessed by children, even if children are a tiny fraction of the user base? Where should the line be drawn?
- Instagram applied some of its AADC-inspired changes globally rather than only in the UK. Is this an argument for or against individual countries pursuing their own regulatory approaches? What are the risks of a single jurisdiction's regulation shaping global platform behavior?
- The AADC's fifteen standards focus on data collection and design but do not directly address algorithmic recommendation systems. If you were drafting an update to the code, what standard would you add to address algorithmic amplification? How would you define it?
- Consider the enforcement gap: the AADC's ambitions are broad, but the ICO's resources are limited. What enforcement strategies could maximize the code's impact given resource constraints?
- Connect the AADC to the concept of "prefigurative governance" from Chapter 39. In what ways does the AADC represent a governance model that other jurisdictions can build upon? What must change for design-based children's data protection to become a global standard?
Further Investigation
- Read the full text of the UK Age Appropriate Design Code on the ICO's website.
- Compare the California Age-Appropriate Design Code Act with the UK AADC. What provisions are identical? What differs?
- Research the legal challenge to the California AADC on First Amendment grounds. What does this challenge reveal about the differences between US and UK approaches to regulating platform design?