Case Study: Ed-Tech and Student Privacy: The Pandemic Acceleration

"The pandemic did not create the ed-tech privacy crisis. It accelerated a crisis that was already underway — and it stripped away the pretense that existing governance frameworks were adequate." — Amelia Vance, President, Public Interest Privacy Center

Overview

In March 2020, schools across the world closed their doors. Virtually overnight, education moved online. Learning management systems, video conferencing platforms, student engagement trackers, and online proctoring tools went from optional supplements to essential infrastructure. School districts that had spent years deliberating about ed-tech adoption made deployment decisions in days — often without privacy impact assessments, contract negotiations, or governance frameworks.

The result was the largest unplanned experiment in educational data collection in history. This case study examines what happened to student privacy during the pandemic — the governance failures that occurred, the lessons that emerged, and the reforms they demand.

Skills Applied:

  • Analyzing governance failures that result from emergency adoption of technology
  • Evaluating the adequacy of FERPA and COPPA for modern ed-tech
  • Assessing the power dynamics between school districts and ed-tech companies
  • Proposing governance frameworks for educational technology


The Pre-Pandemic Landscape

Ed-Tech's Quiet Expansion

Even before the pandemic, educational technology had become deeply embedded in American schools. By 2019:

  • 94% of US public schools provided at least one device per student or operated a bring-your-own-device program
  • The average school district used 1,403 distinct ed-tech applications per month
  • The global ed-tech market was valued at approximately $76 billion

This expansion occurred with minimal governance oversight. School districts — often understaffed, underfunded, and lacking technical expertise — adopted ed-tech tools based on pedagogical appeal and cost, with data governance as an afterthought. Privacy policies were rarely reviewed in depth. Data practices were rarely audited. And FERPA, the primary federal governance framework, was designed for paper records in filing cabinets, not for machine learning models trained on millions of student interactions.

The Governance Gap

Several structural factors created a pre-existing governance gap:

FERPA's limitations. FERPA governs "education records" — records maintained by the school or a party acting for the school. But many ed-tech tools collect data that arguably falls outside this definition: metadata (when a student logged in, how long they spent on each page), behavioral data (mouse movements, typing patterns), and environmental data (background audio during proctored exams). Whether these data types constitute "education records" under FERPA was — and remains — unclear.

The school official exception. Ed-tech companies accessed student data primarily through the "school official" exception, which was designed for school employees and narrow service contracts, not for data-intensive technology platforms.

Power asymmetry. Large ed-tech companies had armies of lawyers and data scientists; school districts had overworked administrators and limited legal counsel. Contract negotiations were heavily asymmetric, with ed-tech companies' standard terms often prevailing.

COPPA's partial coverage. The Children's Online Privacy Protection Act (COPPA) applies to commercial websites and apps directed at children under 13 but permits schools to consent on behalf of parents for tools used in the educational context — a provision that effectively offloads consent management from parents to already-burdened schools.


The Pandemic Pivot (March-September 2020)

Speed Over Governance

When schools closed in March 2020, the pressure to maintain educational continuity overrode governance processes. Districts adopted ed-tech tools with extraordinary speed:

  • Video conferencing: Zoom, Google Meet, and Microsoft Teams were deployed within days, often without privacy reviews. Zoom's initial deployments in schools raised security concerns (including "Zoom-bombing" — unauthorized users joining class sessions) that the company scrambled to address.
  • Learning management systems: Canvas, Google Classroom, Schoology, and others saw enrollment spikes of 200-400%. Schools that had been piloting these systems expanded them to entire districts overnight.
  • Online proctoring: With in-person exams impossible, schools adopted AI-powered proctoring tools (Proctorio, Respondus, ExamSoft) that used webcam monitoring, eye-tracking, keystroke analysis, and environmental audio to detect cheating. These tools collected some of the most intrusive data in the ed-tech ecosystem.
  • Student engagement tracking: Platforms like GoGuardian and Gaggle monitored students' online activity — including web browsing, search queries, and content typed on school-issued devices — ostensibly to ensure engagement and safety. Some of these tools operated on school devices 24/7, including when students used them at home for non-school purposes.

What Was Collected

Human Rights Watch conducted a global investigation of ed-tech products used during the pandemic and published findings in 2022 that revealed the scope of data collection:

  • Of 164 ed-tech products analyzed across 49 countries, 146 (89%) engaged in data practices that risked or undermined children's privacy.
  • 119 products sent children's personal data to advertising technology companies.
  • 73 products had the technical capability to monitor students outside of school hours.
  • Most products did not require informed parental consent and relied instead on school-level authorization.

The Proctoring Problem

Online proctoring tools became the pandemic's most controversial ed-tech category. These tools collected:

  • Video recordings of students during exams, including their faces, backgrounds, and anyone else visible in the room
  • Audio recordings of the testing environment
  • Eye-tracking data that flagged "suspicious" gaze patterns
  • Keystroke analysis to detect copy-paste behavior
  • Browser lockdown data showing any attempt to navigate away from the exam

Students reported feeling surveilled in their own homes. The tools produced disproportionate false positives for students of color (whose faces were less reliably detected by the underlying AI), students with disabilities (whose movements were flagged as "suspicious"), and students testing in small apartments with family members moving in the background.

The governance implications were severe. Proctoring tools collected biometric data (facial imagery processed for recognition), health-related data (some tools flagged students who appeared anxious), and home-environment data (economic indicators visible in the background), all with minimal consent processes and unclear data retention policies.
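The disparate-impact mechanism described above can be made concrete with a toy model. The sketch below is purely illustrative: the detection scores, group labels, threshold, and distributions are invented for this example and do not represent any real proctoring vendor's pipeline. It shows how a single fixed confidence cutoff, applied to a detector that is systematically less confident for one group of honest test-takers, automatically yields a higher false-positive rate for that group.

```python
# Illustrative sketch only: a toy model of how one fixed threshold in a
# proctoring pipeline produces disparate false-positive rates. All numbers
# and names here are hypothetical, not any vendor's actual logic.
import random

random.seed(42)

THRESHOLD = 0.6  # hypothetical "face reliably detected" cutoff


def detection_scores(mean: float, n: int = 1000) -> list[float]:
    """Simulate per-frame face-detection confidence for honest test-takers."""
    return [min(1.0, max(0.0, random.gauss(mean, 0.15))) for _ in range(n)]


def flag_rate(scores: list[float]) -> float:
    """Fraction of honest students flagged because confidence fell below the cutoff."""
    return sum(s < THRESHOLD for s in scores) / len(scores)


# Suppose the detector is, on average, less confident for one group of
# students (e.g. due to lighting, skin tone, or camera quality).
rate_a = flag_rate(detection_scores(mean=0.85))  # well-served group
rate_b = flag_rate(detection_scores(mean=0.65))  # poorly served group

print(f"false-positive flag rate, group A: {rate_a:.1%}")
print(f"false-positive flag rate, group B: {rate_b:.1%}")
```

No per-student intent differs between the two simulated groups; the gap in flag rates comes entirely from the detector's confidence distribution meeting a one-size-fits-all threshold.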


The Governance Reckoning

Regulatory Response

The pandemic exposed the inadequacy of existing governance frameworks, prompting several responses:

FTC enforcement. The Federal Trade Commission brought enforcement actions against several ed-tech companies over children's data practices. Most significantly, its 2022 action against Chegg (an education platform) required the company to limit data collection, allow users to delete their data, and implement a comprehensive data security program. The FTC's May 2022 Policy Statement on Education Technology and COPPA emphasized that ed-tech companies cannot collect more data than necessary for their educational purpose and cannot use student data for commercial purposes.

State legislation. States enacted a wave of student privacy laws. California's Student Online Personal Information Protection Act (SOPIPA), passed in 2014, prohibited ed-tech companies from using student data for non-educational purposes, and other states followed with similar legislation. By 2024, over 40 states had enacted some form of student privacy law, creating the same patchwork problem that characterizes US data protection generally.

Student Privacy Pledge. Industry's flagship self-regulatory effort, the Student Privacy Pledge, is a voluntary commitment by ed-tech companies not to sell student data, not to use student data for targeted advertising, and to maintain security standards. Over 400 companies signed — but the pledge remained voluntary, unenforceable, and criticized as a public relations exercise.

Institutional Response

School districts — having learned painful lessons from the pandemic's governance failures — began implementing more systematic ed-tech governance:

  • Privacy impact assessments: Some districts now require PIAs before adopting new ed-tech tools, evaluating data collection scope, security measures, and retention practices.
  • Approved vendor lists: Districts maintain vetted lists of ed-tech products that have passed privacy and security review, prohibiting teachers from independently adopting unapproved tools.
  • Contract standardization: Organizations like the Student Data Privacy Consortium developed standardized contract templates (the National Data Privacy Agreement) that establish baseline privacy protections in ed-tech contracts.
  • Parent communication: Districts improved transparency, providing parents with information about what ed-tech tools are used, what data they collect, and how to opt out where legally possible.

Ongoing Challenges

The Surveillance Normalization Problem

The pandemic normalized student surveillance to a degree that may be difficult to reverse. Students who spent years being monitored through engagement trackers and proctoring tools may have internalized the expectation that surveillance is a normal condition of education. Teachers who relied on monitoring tools to manage remote classrooms may continue using them in person. The infrastructure of surveillance — installed during an emergency — tends to persist after the emergency ends.

The AI Transition

Ed-tech is transitioning from data collection to AI-driven decision-making. Adaptive learning platforms adjust curriculum based on student performance data. Predictive analytics flag at-risk students. Automated essay grading evaluates written work. Each of these applications raises governance questions that neither FERPA nor COPPA was designed to address: What happens when an algorithm's assessment of a student is wrong? Who is accountable? Can a student challenge an algorithmic determination?

The Equity Dimension

The pandemic's ed-tech adoption was deeply inequitable. Students with reliable internet, private spaces for learning, and up-to-date devices had different experiences than students sharing devices, working from noisy homes, or relying on mobile hotspots. The data collected by ed-tech tools reflected these inequities — and the algorithmic systems trained on that data risk encoding them. A "student engagement" score based on login frequency and time-on-task may penalize students whose circumstances prevented consistent access, creating a data record that follows them through their educational career.
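A minimal sketch makes the engagement-score problem concrete. Everything here is hypothetical: the weights, the normalization targets (5 logins/week, 60 minutes/day), and the two student profiles are invented for illustration, not drawn from any real platform. The point is structural: a score built from login frequency and time-on-task rewards reliable access, not effort or learning.

```python
# Illustrative sketch only: a toy "engagement" score of the kind described
# in the text. Weights and cutoffs are invented for this example.
def engagement_score(logins_per_week: float, minutes_per_day: float) -> float:
    """Hypothetical equal-weight blend of normalized login frequency and time-on-task."""
    login_component = min(logins_per_week / 5.0, 1.0)  # 5 logins/week = full marks
    time_component = min(minutes_per_day / 60.0, 1.0)  # 60 min/day = full marks
    return 0.5 * login_component + 0.5 * time_component


# Two students completing the same coursework:
steady = engagement_score(logins_per_week=5, minutes_per_day=60)  # home broadband
shared = engagement_score(logins_per_week=2, minutes_per_day=25)  # shared device, hotspot

print(f"steady access: {steady:.2f}")  # 1.00
print(f"shared access: {shared:.2f}")  # 0.41
```

Identical academic effort yields a score less than half as high for the student with intermittent access, and if that number feeds downstream analytics (at-risk flags, teacher dashboards), the access gap becomes a persistent data record.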


Discussion Questions

  1. The pandemic forced schools to adopt technology without governance frameworks. Was this an acceptable trade-off — prioritizing educational continuity over data protection in a crisis? Or should schools have resisted deploying tools that had not been properly vetted, even at the cost of educational disruption?

  2. Online proctoring tools collect biometric data, monitor students' home environments, and produce disproportionate false positives for students of color and students with disabilities. Should these tools be prohibited? If not, what governance requirements should apply?

  3. The Human Rights Watch study found that 89% of pandemic-era ed-tech tools risked or undermined children's privacy. What does this tell us about the effectiveness of existing governance frameworks (FERPA, COPPA) for protecting student privacy?

  4. Ed-tech companies argue that data collection enables "personalized learning" that improves student outcomes. Privacy advocates argue that the data collected far exceeds what is necessary for educational purposes. How should governance frameworks balance these competing claims?


Your Turn: Mini-Project

Option A: Interview a teacher, school administrator, or IT director about their experience with ed-tech adoption during the pandemic. What governance challenges did they face? What tools were adopted without privacy review? What has changed since? Write a 1,000-word narrative.

Option B: Research one specific online proctoring tool (Proctorio, Respondus, or ExamSoft). Analyze its data collection practices, its accuracy claims, its impact on different student populations, and the governance concerns it raises. Write a 1,000-word assessment.

Option C: Draft a model ed-tech privacy policy for a school district. The policy should cover: vendor vetting requirements, data minimization standards, retention limits, parental transparency, student rights, and incident response procedures. Target length: 2-3 pages.


References

  • Human Rights Watch. "'How Dare They Peep into My Private Life?': Children's Rights Violations by Governments That Endorsed Online Learning During the Covid-19 Pandemic." May 2022.

  • Federal Trade Commission. "Policy Statement of the Federal Trade Commission on Education Technology and the Children's Online Privacy Protection Act." May 2022.

  • Regan, Priscilla M., and Jolene Jesse. "Ethical Challenges of Edtech, Big Data, and Personalized Learning: Twenty-First Century Student Sorting and Tracking." Ethics and Information Technology 21 (2019): 167–179.

  • Vance, Amelia, and Lee Tien. "Student Privacy and Ed-Tech." Electronic Frontier Foundation, 2020.

  • Kelley, Jason. "Proctoring Tools and Student Privacy." Electronic Frontier Foundation, September 2020.

  • Student Data Privacy Consortium. "National Data Privacy Agreement." Available at https://privacy.a4l.org/national-dpa/.

  • Marwick, Alice, and danah boyd. "Understanding Privacy at the Margins." International Journal of Communication 12 (2018): 1157–1165.