Exercises: Children, Teens, and Digital Vulnerability

These exercises progress from concept checks to challenging applications. Estimated completion time: 3-4 hours.

Difficulty Guide:

  • ⭐ Foundational (5-10 min each)
  • ⭐⭐ Intermediate (10-20 min each)
  • ⭐⭐⭐ Challenging (20-40 min each)
  • ⭐⭐⭐⭐ Advanced/Research (40+ min each)


Part A: Conceptual Understanding ⭐

Test your grasp of core concepts from Chapter 35.

A.1. COPPA applies to operators of websites and online services directed at children under a specific age, or operators who have "actual knowledge" they are collecting data from children under that age. What is the age threshold, and why do critics argue it is both too low and too high?

A.2. Explain the difference between COPPA's approach to children's data protection and the UK Age Appropriate Design Code's approach. Which focuses more on organizational behavior, and which focuses more on individual consent mechanisms? What are the practical implications of this difference?

A.3. Define the "age verification paradox" as discussed in Section 35.2. In your own words, explain why verifying a child's age to protect their privacy can itself create privacy risks.

A.4. GDPR Article 8 allows member states to set the digital age of consent anywhere between 13 and 16. List three countries and their chosen age thresholds. Why might this variation create challenges for platforms operating across Europe?

A.5. Section 35.3 distinguishes between correlational and causal evidence regarding social media and youth mental health. In two to three sentences, explain why this distinction matters for policymaking. What would a policymaker risk by treating correlational evidence as if it were causal?

A.6. What is "privacy by default" as it applies to children's digital services under the UK AADC? How does it differ from the standard "privacy by design" concept introduced in Chapter 10?

A.7. Define "dark patterns" in the context of children's digital services. Provide two examples not mentioned in the chapter of design choices that could function as dark patterns when directed at minors.


Part B: Applied Analysis ⭐⭐

Analyze scenarios, arguments, and real-world situations using concepts from Chapter 35.

B.1. Consider the following scenario:

A popular educational math app for children ages 6-12 offers a free version supported by in-app advertising. The app collects the child's first name, age, grade level, math performance scores, time spent on each problem, device identifiers, and coarse location data. The app's privacy policy states that data is shared with "trusted advertising partners" to deliver "age-appropriate advertisements." A parent consented to the terms during initial setup.

Analyze this scenario using COPPA requirements. Identify at least three potential compliance violations. Then analyze the same scenario under the UK Age Appropriate Design Code. Would the outcome differ? Why?

B.2. Section 35.4 discusses the rapid expansion of educational technology (EdTech) during the COVID-19 pandemic. Analyze the following claim:

"The pandemic proved that EdTech surveillance was acceptable because students continued to use the platforms without complaint."

Identify at least three assumptions in this argument and explain why each is problematic, drawing on concepts of consent fiction, power asymmetry, and the special status of children.

B.3. A social media company proposes the following age verification system: users under 18 must upload a government-issued ID, which is verified by a third-party age verification service and then "immediately deleted." Evaluate this proposal from three perspectives: (a) effectiveness at achieving its goal, (b) privacy risks, and (c) equity implications for young people who lack government-issued ID.
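When evaluating part (b), it can help to contrast the "upload ID, verify, then delete" flow with a data-minimizing alternative in which the platform never receives the ID at all: the verification service inspects the document locally and issues only a signed over/under-threshold claim. The sketch below is a hypothetical illustration, not a real protocol; the function names, the shared HMAC key, and the claim format are all assumptions made for the example (a production design would use asymmetric signatures from a trusted issuer, among other things).

```python
# Hypothetical sketch: an age-attestation flow in which the platform sees only
# a boolean claim, never the underlying government ID. Illustrative only.
import hmac
import hashlib
import json
import time

# Assumption for this sketch: issuer and platform share an HMAC key. A real
# system would use the issuer's public-key signature instead.
ISSUER_KEY = b"demo-key-not-for-production"

def issue_attestation(over_18: bool, ttl_seconds: int = 3600) -> dict:
    """Run by the verification service after checking the ID locally.
    Only the boolean claim and an expiry leave the service."""
    claim = {"over_18": over_18, "expires": int(time.time()) + ttl_seconds}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_accepts(attestation: dict) -> bool:
    """Run by the platform: validate signature and expiry. The platform
    learns nothing beyond the over/under-threshold bit."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, attestation["sig"]):
        return False  # tampered or forged claim
    return attestation["claim"]["expires"] > time.time()

att = issue_attestation(over_18=True)
print(platform_accepts(att))  # a valid, unexpired attestation is accepted
```

Note what the sketch does and does not buy: it removes the ID from the platform's hands, but the verification service still sees the document, and the equity concern in part (c) (young people without government ID) is untouched. That gap is part of what your evaluation should surface.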

B.4. The VitraMed thread in this chapter raises the issue of children's health data collected through a pediatric predictive analytics module. Mira discovers that the module's training data over-represents children from higher-income families with consistent primary care access. Analyze the fairness implications of this bias. Who is harmed, and how? What governance mechanisms should be in place for pediatric health data systems?

B.5. Section 35.1.3 notes that the UK AADC applies not just to services "directed at children" but to services "likely to be accessed by children." Explain why this broader scope matters. Identify three mainstream services (not specifically designed for children) that would fall under this broader standard and analyze the governance implications for each.

B.6. A school district implements a student monitoring system that tracks all student activity on school-issued devices, including browsing history, keystrokes, and social media posts — even when students use the devices at home after school hours. The district argues this is necessary for "student safety." Apply the six-question framework from Chapter 1 to analyze this system. Where does the balance between safety and privacy fall?


Part C: Real-World Application Challenges ⭐⭐-⭐⭐⭐

These exercises ask you to investigate your own environment and apply chapter concepts to current events.

C.1. ⭐⭐ Children's App Audit. Select a popular children's app or game (e.g., a learning app, a children's messaging platform, or a game rated for children). Download and review its privacy policy and terms of service. Answer: (a) What data does it collect? (b) Does it comply with COPPA requirements for verifiable parental consent? (c) How does it handle data retention and deletion? (d) Are there any dark patterns in the user interface? Present your findings in a one-page analysis.

C.2. ⭐⭐ EdTech Privacy Review. Identify two educational technology platforms used in your school, university, or local school district. For each, determine: (a) what student data is collected, (b) whether the platform sells or shares data with third parties, (c) what data is retained after a student leaves the institution, and (d) whether students or parents can request data deletion. Compare the two platforms and assess which provides stronger data governance.

C.3. ⭐⭐⭐ Age Verification Landscape. Research three different approaches to online age verification currently deployed or proposed (e.g., age estimation using facial analysis, digital ID verification, credit card verification, parental attestation). For each, create a brief assessment covering: (a) accuracy, (b) privacy impact, (c) equity implications, (d) technical feasibility, and (e) regulatory compliance. Which approach best balances child protection with privacy?

C.4. ⭐⭐⭐ Youth Mental Health Evidence Review. Locate two peer-reviewed studies on the relationship between social media use and adolescent mental health. For each study, identify: (a) the research design (correlational, longitudinal, experimental), (b) the sample size and demographics, (c) the main findings, and (d) the limitations acknowledged by the authors. Write a one-page assessment of whether the current evidence base is sufficient to justify specific regulatory interventions.


Part D: Synthesis & Critical Thinking ⭐⭐⭐

These questions require you to integrate multiple concepts from Chapter 35 and earlier chapters.

D.1. Section 35.5 discusses the tension between protecting children and respecting their developing autonomy. A 16-year-old activist uses social media to organize political protests. Her government wants to surveil her online activity. Her parents want to restrict her social media use. The platform wants to collect her data for advertising. The child protection framework says she needs protection. The political participation framework says she has rights. Analyze this scenario using at least two ethical frameworks from Chapter 6 (e.g., deontological, utilitarian, virtue ethics, care ethics). How should the competing interests be balanced?

D.2. The chapter argues that children's data governance cannot be solved through consent alone because children cannot meaningfully consent. But the alternative — having parents or guardians consent on children's behalf — has its own problems: parents may not understand the technology, may not share the child's interests, or may themselves be the source of risk (as in cases of domestic abuse where a parent uses tracking technology to control a child).

Write a proposal (300-500 words) for a data governance framework for children that does not rely primarily on either children's consent or parental consent. What alternative governance mechanisms could protect children while respecting their developing autonomy?

D.3. Connect the themes of Chapter 35 to the broader arc of the textbook's four recurring threads:

  • How does the power asymmetry manifest specifically in the context of children's data?
  • Where does the consent fiction appear in children's digital environments?
  • What does the accountability gap look like when children are harmed by data systems?
  • How does the VitraMed thread (ethical debt accumulation) apply to companies that have been collecting children's data for years?

Write a synthesis (400-600 words) that integrates these four threads as they apply to Chapter 35.

D.4. Imagine you are advising a national government that is drafting a new children's data protection law. Drawing on the strengths and weaknesses of COPPA, GDPR Article 8, and the UK AADC as analyzed in this chapter, draft a one-page outline of your recommended framework. Your outline should specify: (a) the age threshold(s), (b) the consent mechanism, (c) the design obligations on companies, (d) the enforcement mechanism, and (e) the treatment of EdTech.


Part E: Research & Extension ⭐⭐⭐⭐

Open-ended projects for students seeking deeper engagement. Each requires independent research beyond the textbook.

E.1. COPPA Enforcement History. Research FTC enforcement actions under COPPA from 2019 to the present. Select three cases and write a 1,000-word report covering: (a) the company involved and its data practices, (b) the specific COPPA violations found, (c) the penalties imposed, and (d) whether the enforcement action led to meaningful changes in the company's behavior. Assess whether COPPA enforcement is sufficient to protect children in the current digital landscape.

E.2. Global Children's Data Protection. Select two countries not discussed in Chapter 35 (e.g., South Korea, Australia, Brazil, Japan, Nigeria) and research their approaches to children's data protection. Write a comparative analysis (1,000-1,200 words) covering: (a) the legal framework, (b) the age thresholds, (c) enforcement mechanisms, and (d) notable gaps. How do these approaches compare to the COPPA/GDPR/AADC models?

E.3. Designing for Children: A Case Study. Select a digital service that has made public commitments to child-safe design (e.g., YouTube Kids, Messenger Kids, a major EdTech platform). Research the service's actual practices. Write an analysis (800-1,200 words) evaluating the gap between the company's stated commitments and its observed practices. Use the UK AADC's fifteen standards as your evaluation framework.


Solutions

Selected solutions are available in appendices/answers-to-selected.md.