Quiz: Children, Teens, and Digital Vulnerability

Test your understanding before moving to the next chapter. Target: 70% or higher to proceed.


Section 1: Multiple Choice (1 point each)

1. COPPA applies to operators of websites and online services that are directed at children under what age?

  • A) 16
  • B) 18
  • C) 13
  • D) 10
Answer **C)** 13. *Explanation:* Section 35.1.1 explains that COPPA applies to children under 13. This threshold has been criticized as arbitrary — it provides no protection for teenagers aged 13-17, who are the heaviest users of social media and among the most vulnerable to its harms. The EU's GDPR Article 8 allows member states to set the digital age of consent between 13 and 16, and the UK AADC takes a fundamentally different approach by regulating design rather than relying solely on an age threshold.

2. Which of the following best describes the UK Age Appropriate Design Code (AADC)?

  • A) A law that requires parental consent before any data can be collected from children under 16.
  • B) A set of fifteen standards that require online services likely to be accessed by children to provide high privacy protections by default.
  • C) A voluntary industry code that encourages companies to consider children's safety.
  • D) A regulation that bans all advertising targeted at children under 13.
Answer **B)** A set of fifteen standards that require online services likely to be accessed by children to provide high privacy protections by default. *Explanation:* Section 35.1.3 describes the AADC as a statutory code of practice issued by the UK Information Commissioner's Office (ICO) containing fifteen standards, including "high privacy by default," data minimization, and transparency appropriate to the age of the child. The AADC's distinctive feature is that it applies to services "likely to be accessed by children" — not just services directed at children — and focuses on organizational design obligations rather than individual consent.

3. The "age verification paradox" refers to:

  • A) The fact that older children are harder to verify than younger children.
  • B) The tension between verifying a child's age to protect their privacy and the privacy risks created by the verification process itself.
  • C) The observation that children who lie about their age are more tech-savvy and thus less vulnerable.
  • D) The legal conflict between COPPA and GDPR regarding age thresholds.
Answer **B)** The tension between verifying a child's age to protect their privacy and the privacy risks created by the verification process itself. *Explanation:* Section 35.2 explains this paradox in detail. Effective age verification often requires collecting sensitive data — government ID, biometric data (facial analysis), or parental information — which itself creates privacy risks, especially for children. The paradox illustrates a broader theme in data governance: protective mechanisms can themselves generate the very harms they seek to prevent.

4. Which of the following is an accurate characterization of the evidence on social media and youth mental health?

  • A) Multiple large-scale randomized controlled trials have established a clear causal link between social media use and depression in adolescents.
  • B) There is no evidence of any association between social media use and youth mental health outcomes.
  • C) The evidence is predominantly correlational, with some longitudinal studies suggesting modest associations, but causal claims remain contested.
  • D) The evidence clearly shows that social media improves mental health for most adolescents but harms a small minority.
Answer **C)** The evidence is predominantly correlational, with some longitudinal studies suggesting modest associations, but causal claims remain contested. *Explanation:* Section 35.3 reviews the evidence base carefully, noting that while correlational studies consistently find associations between heavy social media use and indicators of poor mental health (anxiety, depression, body dissatisfaction), the causal direction is unclear, effect sizes are often modest, and individual variation is substantial. The chapter emphasizes the importance of distinguishing between correlational and causal evidence for policymaking.

5. The COVID-19 pandemic's impact on educational technology and student data is best described as:

  • A) Temporary — all pandemic-era data practices were reversed when schools reopened.
  • B) An acceleration of pre-existing trends, with EdTech platforms gaining access to unprecedented volumes of student data under emergency conditions that weakened normal governance oversight.
  • C) Negligible — schools already used extensive EdTech before the pandemic.
  • D) Positive — the pandemic proved that EdTech surveillance systems work effectively and should be made permanent.
Answer **B)** An acceleration of pre-existing trends, with EdTech platforms gaining access to unprecedented volumes of student data under emergency conditions that weakened normal governance oversight. *Explanation:* Section 35.4 documents how the pandemic did not create new EdTech data governance problems but dramatically accelerated existing ones. Schools adopted platforms rapidly, often without adequate privacy review. Student data collection expanded into home environments. Emergency conditions compressed procurement timelines, reducing governance scrutiny. Many of these practices persisted after the emergency ended, illustrating how temporary exceptions can become permanent defaults.

6. Which of the following best illustrates a "dark pattern" as it applies to children's digital services?

  • A) A children's app that uses bright colors to make the interface engaging.
  • B) A game that requires children to share the app with five friends to unlock the next level, without clear disclosure that this shares contact information.
  • C) An educational platform that includes a progress tracker showing completed lessons.
  • D) A children's video service that autoplays related content after the current video ends, with a clearly visible "stop" button.
Answer **B)** A game that requires children to share the app with five friends to unlock the next level, without clear disclosure that this shares contact information. *Explanation:* Dark patterns are design choices that manipulate users into actions they would not otherwise take. When applied to children, dark patterns exploit developmental vulnerabilities — children's desire for social approval, their limited understanding of data consequences, and their susceptibility to manipulative game mechanics. Option B combines social pressure with hidden data sharing. Option A describes aesthetic design. Option C is a standard educational feature. Option D, while potentially concerning, includes a visible opt-out mechanism.

7. FERPA (the Family Educational Rights and Privacy Act) differs from COPPA primarily in that:

  • A) FERPA applies only to children under 13, while COPPA applies to all minors.
  • B) FERPA governs educational records held by institutions receiving federal funding, while COPPA governs commercial online services that collect data from children.
  • C) FERPA provides stronger protections than COPPA in all circumstances.
  • D) FERPA was enacted after COPPA to address gaps in children's online privacy.
Answer **B)** FERPA governs educational records held by institutions receiving federal funding, while COPPA governs commercial online services that collect data from children. *Explanation:* Section 35.4 discusses the intersection of FERPA and COPPA in the EdTech context. FERPA (1974) protects the privacy of student educational records and applies to educational institutions that receive federal funding. COPPA (1998) applies to commercial operators of websites and online services. The regulatory gap appears when schools use commercial EdTech platforms — it can be unclear whether FERPA or COPPA governs, and neither may be adequate.

8. Mira's work on VitraMed's pediatric module raises which of the following concerns about children's health data?

  • A) Children's health data is identical to adult health data and requires no special treatment.
  • B) Predictive analytics trained on biased data may systematically under-serve children from lower-income communities, and children cannot advocate for themselves when algorithmic decisions affect their care.
  • C) VitraMed should not collect children's health data under any circumstances.
  • D) HIPAA adequately protects all children's health data in all contexts.
Answer **B)** Predictive analytics trained on biased data may systematically under-serve children from lower-income communities, and children cannot advocate for themselves when algorithmic decisions affect their care. *Explanation:* The VitraMed thread in this chapter extends the bias concerns from earlier chapters to the pediatric context. The compounding factor is that children are doubly vulnerable: they are subject to algorithmic decisions they cannot understand or challenge, and they depend on adults — parents, clinicians, institutions — to advocate for them. When the algorithm itself is biased, the adults in the system may not recognize the failure.

Section 2: True/False with Justification (1 point each)

For each statement, determine whether it is true or false and provide a brief justification.

9. "Under COPPA, a children's app that collects only persistent device identifiers (not names or addresses) does not need to obtain parental consent."

Answer **False.** *Explanation:* Section 35.1.1 notes that COPPA's definition of "personal information" includes persistent identifiers that can be used to recognize a user over time and across different websites or online services. Device identifiers fall within this definition. The FTC has consistently held that persistent identifiers used for behavioral advertising purposes require parental consent under COPPA.

10. "The UK AADC applies only to services specifically designed for and marketed to children."

Answer **False.** *Explanation:* Section 35.1.3 emphasizes that the AADC applies to services "likely to be accessed by children" — a significantly broader standard than "directed at" children. This means mainstream platforms like social media services, search engines, and video streaming sites must comply with the AADC's fifteen standards if children are likely to use them, even if children are not the primary audience. This broader scope was a deliberate design choice to prevent the "we're not a children's service" defense.

11. "Frances Haugen's testimony revealed that Facebook's internal research showed Instagram was harmful to teenage girls' mental health, and the company acted promptly on these findings."

Answer **False.** *Explanation:* The chapter's opening overview describes Haugen's testimony as revealing that Facebook knew Instagram was harmful to teenage girls' mental health but *had not acted on that knowledge*. Haugen stated that the company prioritized "astronomical profits before people." The case illustrates the accountability gap: the company possessed the evidence of harm but lacked the institutional will — or the governance mechanisms — to compel action.

12. "Age estimation using facial analysis technology is a privacy-neutral alternative to traditional age verification methods."

Answer **False.** *Explanation:* Section 35.2 discusses facial analysis as one of several age verification approaches and notes that it raises significant privacy concerns. Facial analysis requires processing biometric data — among the most sensitive data categories identified in Chapter 12. Even if the facial data is "not stored," the processing itself constitutes a privacy-relevant event. Additionally, facial analysis systems have documented accuracy disparities across racial and ethnic groups, creating equity concerns.

13. "The concept of 'developmental appropriateness' in children's data governance means that the same data practice may be appropriate for a 16-year-old but inappropriate for a 7-year-old."

Answer **True.** *Explanation:* Section 35.5 discusses developmental appropriateness as a governance principle that recognizes children's cognitive, emotional, and social capacities change with age. A binary child/adult distinction fails to capture these developmental differences. The UK AADC explicitly requires services to consider the range of ages in their likely audience and design protections appropriate to each developmental stage.

Section 3: Short Answer (2 points each)

14. Explain why COPPA's "actual knowledge" standard creates a loophole for platforms. How could this standard be strengthened?

Answer COPPA requires operators to comply with its provisions when they have "actual knowledge" that they are collecting data from children under 13. This standard creates a loophole because platforms can avoid COPPA obligations by choosing not to ask users their age — if they don't know, they don't have "actual knowledge." A platform can thus claim ignorance even when its content, design, and user demographics strongly suggest that children are present. The standard could be strengthened by adopting a "constructive knowledge" or "should have known" standard, similar to the UK AADC's "likely to be accessed by children" approach. This would require platforms to assess whether their services attract child users based on content, design features, marketing practices, and available audience data, regardless of whether they formally ask for age information.

15. Describe two specific ways in which the pandemic-era expansion of EdTech created data governance challenges that persist after the emergency ended.

Answer First, platforms adopted during the pandemic under emergency conditions — often without full privacy impact assessments or competitive procurement processes — became institutionally entrenched. Schools invested in training, curriculum integration, and workflows built around specific platforms, creating switching costs that make it difficult to move to more privacy-protective alternatives even when the emergency justification has expired. Second, the scope of student monitoring expanded during remote learning to include home environments — webcam feeds during class, keystroke logging on school devices used at home, and browser history tracking outside school hours. These expanded surveillance capabilities were normalized during the pandemic and in many cases have not been rolled back, even though the original justification (ensuring students are engaged during remote classes) no longer applies.

16. What is the difference between "privacy by default" as applied to children under the UK AADC and general "privacy by design" as discussed in Chapter 10?

Answer Privacy by design (Chapter 10) requires organizations to embed privacy considerations into the design of systems and processes from the outset — building privacy protections into architecture rather than adding them as an afterthought. It applies broadly and allows flexibility in how privacy is implemented. Privacy by default under the UK AADC goes further by requiring that the *highest* privacy settings must be the default configuration for services likely to be accessed by children. Users (or their parents) would need to actively choose to lower their privacy protections, rather than being required to opt into higher protections. This shifts the default from "collect unless refused" to "protect unless explicitly released" — a more protective starting position that accounts for children's limited ability to navigate privacy settings.
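The "protect unless explicitly released" starting position can be sketched as a minimal, hypothetical settings object (all field names and values are illustrative, not drawn from any real platform): every field defaults to its most protective value, so lowering protection requires an explicit, active choice rather than the reverse.

```python
from dataclasses import dataclass

@dataclass
class ChildProfileSettings:
    """Hypothetical settings for a child's profile under privacy by default.

    Each field's default is the *highest* privacy option; any less
    protective value must be set explicitly (e.g., by a parent).
    """
    personalized_ads: bool = False       # behavioral advertising off by default
    profile_visibility: str = "private"  # not discoverable by default
    geolocation_sharing: bool = False    # location off by default
    data_retention_days: int = 30        # only what the service minimally needs

# A newly created child profile starts fully protected:
settings = ChildProfileSettings()
assert settings.personalized_ads is False
assert settings.profile_visibility == "private"

# Lowering protection is a deliberate opt-down, never the starting state:
opted_down = ChildProfileSettings(profile_visibility="friends")
```

The design point is that the constructor, not the user, carries the protective default — a child who never touches the settings screen still gets the most protective configuration.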

Section 4: Scenario Analysis (3 points each)

17. A major video streaming platform popular with children (ages 5-12) introduces a "family sharing" feature that allows up to five family members to use the same account. The platform collects viewing history, search queries, and engagement metrics for each profile. A child's profile is linked to a parent's payment account. The platform uses the child's viewing data to recommend content and to inform advertising on the parent's profile. Analyze this scenario under COPPA and the UK AADC. What violations might exist, and what changes would you recommend?

Answer **Under COPPA:** The platform may be in violation if it uses the child's viewing data for advertising purposes without verifiable parental consent specifically for that use. COPPA requires that data collection from children under 13 be limited to what is reasonably necessary for the child's participation in the activity. Using a child's viewing patterns to inform advertising on a parent's profile extends beyond what is necessary for the child's use of the platform. Additionally, the linked-account structure may make it difficult for parents to exercise their COPPA rights (review, deletion) specifically for the child's data. **Under the UK AADC:** The platform would likely violate several standards: (1) "High privacy by default" — the child's data should not be used for advertising purposes by default; (2) "Data minimization" — collecting engagement metrics beyond what is needed for content delivery exceeds the minimum necessary; (3) "Detrimental use" — using a child's data to target advertising (even on a parent's profile) could be considered detrimental to the child's interests; (4) "Profiling" — the AADC requires that options which use profiling (such as building recommendations and advertising signals from a child's viewing history) be switched off by default unless there is a compelling reason and appropriate safeguards. **Recommended changes:** (a) Isolate the child's profile data from the advertising system entirely; (b) Apply privacy by default to the child's profile, disabling data-driven advertising by default; (c) Provide parents with granular controls to review and delete the child's data independently; (d) Limit data retention for children's profiles to the minimum necessary for service functionality.

18. A school district partners with an EdTech company to deploy an AI-powered "early warning system" that analyzes student behavioral data — attendance patterns, grade trends, disciplinary records, counselor visit frequency, and social media posts made on school networks — to predict which students are "at risk" of dropping out. The system flags students for intervention by school counselors. Analyze this system from the perspective of children's data governance, addressing: (a) the consent and power dynamics, (b) the potential for bias, (c) the appropriate governance mechanisms, and (d) the role of the student's own voice.

Answer **(a) Consent and power dynamics:** Students cannot meaningfully opt out of a school-mandated system — they are compelled to attend school and use school systems. Parental consent may exist formally but is constrained by the power asymmetry between families and school institutions. The system collects sensitive behavioral data (counselor visits, disciplinary records) that students may not have anticipated being used for predictive analytics. This represents the consent fiction at its most acute: subjects who cannot refuse, consenting to uses they do not understand. **(b) Potential for bias:** Predictive dropout models are susceptible to the same biases documented throughout the algorithmic chapters (13-19). Historical data on dropouts reflects historical inequities — students from marginalized communities are more likely to appear in dropout statistics due to systemic factors (poverty, housing instability, racism) rather than individual risk. The system may disproportionately flag students of color, students from low-income families, and students with disabilities — converting structural disadvantage into individual risk scores. Additionally, being flagged itself may create stigma that affects a student's educational experience. **(c) Appropriate governance mechanisms:** The system should be subject to: an algorithmic impact assessment before deployment; regular bias audits examining flag rates across demographic groups; clear data retention and deletion policies; restrictions on sharing flag data beyond school counselors (e.g., not with law enforcement or future employers); independent oversight by a body that includes parents and students. **(d) The student's voice:** Students — especially older adolescents — should have the right to know they have been flagged, to understand (in age-appropriate terms) why, and to contest the flag if they believe it is inaccurate. 
Denying students any voice in a system that affects their educational trajectory violates the principle of developing autonomy and treats students as objects of governance rather than participants in it.
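One governance mechanism named in part (c) — a regular bias audit comparing flag rates across demographic groups — can be sketched minimally. The function, group labels, and data below are entirely hypothetical; a real audit would use properly governed demographic categories and statistical significance testing.

```python
from collections import defaultdict

def flag_rates_by_group(students):
    """Compute the fraction of students flagged 'at risk' per demographic group.

    `students` is a list of (group, flagged) pairs. A bias audit compares
    these rates across groups and investigates large disparities, which may
    signal that the model is converting structural disadvantage into risk scores.
    """
    totals = defaultdict(int)
    flagged = defaultdict(int)
    for group, is_flagged in students:
        totals[group] += 1
        if is_flagged:
            flagged[group] += 1
    return {group: flagged[group] / totals[group] for group in totals}

# Hypothetical audit sample: the model flags group B at twice the rate of group A.
sample = [("A", True), ("A", False), ("A", False), ("A", False),
          ("B", True), ("B", True), ("B", False), ("B", False)]
rates = flag_rates_by_group(sample)
# rates["A"] == 0.25 and rates["B"] == 0.5 — a disparity warranting review
```

A disparity in flag rates does not by itself prove the model is biased, but it is the trigger for the deeper review the answer describes: examining whether the underlying training data encodes systemic factors rather than individual risk.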

Solutions

Selected solutions are available in appendices/answers-to-selected.md.