Case Study 37-2: YouTube Kids and the Monetization of Childhood Attention
Background
In September 2019, the Federal Trade Commission and the New York State Attorney General announced a $170 million settlement with Google over violations of the Children's Online Privacy Protection Act (COPPA). The settlement was the largest civil penalty ever imposed under COPPA, and it emerged from a finding that Google had collected personal information from children under 13 who watched child-directed content on YouTube — without obtaining the verifiable parental consent that COPPA requires.
The case revealed something that industry insiders had known for years and that parents were only beginning to understand: YouTube, one of the most powerful media platforms in the world, had built a business model around child viewership that was systematically incompatible with the legal requirements for children's data privacy. And YouTube Kids — the nominally child-safe version of the platform launched in 2015 — had not, in practice, separated child viewers from adult surveillance infrastructure as thoroughly as its branding implied.
How YouTube's Business Model Collides with Children's Privacy
YouTube's business model depends on behavioral advertising: showing users advertisements targeted on the basis of their demonstrated interests, demographic characteristics, and behavioral patterns. To deliver behavioral advertising, YouTube must collect and analyze data about user behavior: what users watch, how long they watch, what they engage with, what they search for, and what they watch next.
This model is enormously valuable and, for adult users who have made at least nominal choices about privacy, governed by YouTube's (extensive) terms of service and Google's (vast) privacy infrastructure. For children under 13, it is directly prohibited by COPPA.
The COPPA problem for YouTube was structural: the platform did not have a reliable mechanism for distinguishing content watched by children from content watched by adults. "Baby Shark" — which became the most-viewed YouTube video in history — was on regular YouTube, watched by millions of small children, served with behavioral advertising based on data that, under COPPA, could not legally be collected from those viewers.
The content creators who made child-directed content for YouTube were aware of this dynamic, and many of them actively resisted YouTube's requests to label their content as "made for kids." Under Google's implementation of the COPPA settlement, the "made for kids" designation meant losing behavioral advertising revenue and accepting contextual advertising at dramatically lower rates. The financial incentives created by YouTube's advertising model therefore pushed creators away from labeling children's content as such, because the alternative was financially devastating.
YouTube Kids — The False Sanctuary
YouTube Kids was launched in 2015 as a response to parent and regulatory concern about children's access to inappropriate content on regular YouTube. The platform offered curated, age-appropriate content in a simplified interface, and was marketed to parents as a safe environment for young children.
Several dimensions of YouTube Kids' privacy practices were, despite its positioning, continuous with the surveillance practices of regular YouTube. First, YouTube Kids collected device identifiers even from users who were not signed in, enabling tracking of viewing behavior across sessions without account registration. Second, the content available on YouTube Kids was not comprehensively moderated: investigative reporting and parent accounts documented inappropriate, disturbing, or violent content that had slipped past the platform's curation mechanisms. Third, the behavioral data generated by children using YouTube Kids was used to improve YouTube's broader recommendation algorithms, even if it was not used for direct behavioral advertising to the child viewer.
The 2019 settlement required Google to pay the $170 million civil penalty, implement age screening for YouTube Kids, and provide notice to parents about data collection. It did not require YouTube Kids to become a non-tracking platform. The post-settlement architecture of YouTube Kids still involves data collection; it just involves data collection with somewhat improved parental notice and reduced advertising targeting.
The Recommendation Algorithm as Developmental Environment
Beyond the specific COPPA compliance issues, the YouTube Kids case raises a deeper question about what it means to subject children's cognitive and emotional development to algorithmic curation. YouTube's recommendation algorithm is designed to maximize watch time — to keep users watching as long as possible by continuously recommending content that is likely to maintain engagement.
For adult users, this produces the "rabbit hole" phenomenon documented by researchers who have tracked how YouTube's recommendations can move users from moderate to increasingly extreme content. For children, the implications are different but no less significant. An algorithm designed to maximize a child's engagement will consistently select the content that is most stimulating: the most colorful, most surprising, most emotionally intense material available, regardless of whether it is educationally appropriate or developmentally beneficial. The algorithm is not trained in child development. It is trained on engagement metrics.
Pediatricians and child development researchers have raised concerns about the effects of algorithm-curated content on children's cognitive development, attention patterns, and emotional regulation. The American Academy of Pediatrics' screen time guidance has evolved over time, but has consistently emphasized the quality of content and the presence of co-viewing caregivers as more important than raw screen time — considerations that algorithmic recommendation systems are not designed to optimize for.
The YouTube Kids algorithm, optimized for watch time in an environment where parents often leave children with devices unsupervised, is a developmental environment that prioritizes engagement over learning, stimulation over reflection, and continuous consumption over pause and processing. This is not a COPPA violation. It is a design choice with developmental consequences that the platform's business model makes it difficult to modify.
Implications for COPPA Reform
The YouTube Kids case illustrated the limitations of COPPA as a framework for protecting children's privacy in a behavioral advertising economy. COPPA's parental consent requirement, designed for a world of website registrations, is difficult to operationalize for a world of ambient digital media consumption by young children. The penalty, while a regulatory record, was modest relative to the value of the data Google had collected. And the settlement's remedies — improved notice, age screening, a fine — did not fundamentally change the relationship between child viewership and data collection on the platform.
Several proposals for strengthening COPPA have been advanced, including extending its protections to teenagers (COPPA currently applies only to children under 13), requiring data minimization (collecting only what is necessary) rather than merely parental consent, prohibiting behavioral advertising to minors regardless of consent, and establishing a private right of action for families whose children's data has been misused.
Whether such reforms are enacted will reflect not merely a policy debate but a power asymmetry: the companies that benefit from children's data surveillance are enormously powerful actors in the political economy, while the parents and children whose interests reform would serve are, individually, weak. The structural interest in the continued monetization of childhood attention is significant.
Discussion Questions
- YouTube's content creators faced financial incentives that pushed against labeling children's content as "made for kids." What does this reveal about the relationship between platform architecture, financial incentives, and COPPA compliance?
- The case study describes YouTube Kids as a "false sanctuary" — a platform that positioned itself as child-safe while maintaining data collection practices continuous with regular YouTube. How does this relate to the concept of consent as fiction from the book's recurring themes?
- The recommendation algorithm optimizes for engagement rather than developmental appropriateness. Is this a problem that regulation can address, or is it intrinsic to the economic model of free, ad-supported media?
- COPPA's parental consent mechanism requires parents to make informed choices about their children's data. In practice, how informed can parents be? What would meaningfully informed parental consent for children's media platforms look like?
- Should children's media platforms be permitted to use behavioral advertising at all? Defend your answer with reference to both child development research and the practical economics of media production.