Exercises — Chapter 39: Designing for Privacy
Exercise 39.1 — Privacy by Design Audit (Group, 60–75 minutes)
Overview: Apply the Cavoukian framework to audit an existing application or platform.
Instructions:
Working in groups of three or four, select one of the following platforms for a Privacy by Design audit:
- Instagram (or any major social media platform)
- Google Maps with location history
- A school-issued monitoring platform (GoGuardian, Gaggle, or your institution's equivalent)
- A fitness tracker application with health data (Fitbit, Apple Health, or similar)
- Workplace monitoring software (Hubstaff, Teramind, or similar)
For each of Cavoukian's seven principles, assess whether the selected platform:
- Fully implements the principle (with evidence)
- Partially implements the principle (describe what is present and what is missing)
- Does not implement the principle (with evidence of non-implementation)
Scoring and analysis:
After your principle-by-principle assessment, write a 400–600 word analysis that:
1. Identifies the two principles the platform implements most adequately
2. Identifies the two principles the platform implements least adequately
3. Proposes one specific design change that would most significantly improve the platform's privacy properties
4. Assesses what business or technical barriers exist to implementing your proposed change
Class presentation: Each group presents their audit findings (5–7 minutes). After all presentations, identify patterns: which principles are most consistently implemented or violated across platforms? What does this pattern suggest about the surveillance economy's relationship to privacy by design?
Exercise 39.2 — Differential Privacy in Practice (Individual, 45–60 minutes)
Overview: Understand the epsilon trade-off through a practical calculation exercise.
Setup:
Your university wants to publish aggregate data about student mental health, derived from an annual survey. The survey asks 2,000 students whether they have experienced significant anxiety in the past year. You need to publish a differentially private version of the count.
Questions:
1. The true count of students reporting anxiety is 680. If you apply the Laplace mechanism with ε = 1.0 and sensitivity = 1.0, what is the expected noise added? (Recall: the Laplace distribution with scale b = sensitivity/ε has standard deviation √2 × b.)
2. Calculate the expected magnitude of the noise for ε values of 0.1, 0.5, 1.0, 2.0, 5.0, and 10.0. At what epsilon value does the noise become small enough that the published count is useful for policy-making (within ±25 students of the true count)?
3. The university is also interested in publishing the count broken down by college (Engineering, Arts, Sciences, Business — approximately 500 students each). How does the smaller sample size affect the utility of differential privacy at the same epsilon value?
4. A public health researcher argues that the university should use ε = 10.0 because the benefits of accurate data outweigh the privacy concerns. A privacy advocate argues for ε = 0.1. Evaluate both positions. What considerations should drive the choice of epsilon, and who should make it?
5. Reflection (300–400 words): The Laplace mechanism adds mathematical noise to produce a privacy guarantee. But the students who completed the survey did not consent to this specific form of privacy protection — they may not know what differential privacy is. Does the mathematical privacy guarantee make consent unnecessary? Or does privacy by design require consent in addition to technical protections?
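The noise calculations in questions 1 and 2 can be checked numerically. The sketch below is illustrative, not part of the assigned exercise: the sampling helper and the printed table are one possible way to explore the epsilon trade-off. It uses the facts stated in the exercise (scale b = sensitivity/ε, standard deviation √2 × b) plus one additional property of the Laplace distribution — its expected absolute deviation equals b.

```python
import math
import random

def laplace_scale(sensitivity: float, epsilon: float) -> float:
    """Noise scale of the Laplace mechanism: b = sensitivity / epsilon."""
    return sensitivity / epsilon

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float,
                      rng: random.Random) -> float:
    """Release true_value plus Laplace(0, b) noise (epsilon-DP for a count query)."""
    b = laplace_scale(sensitivity, epsilon)
    u = rng.random() - 0.5  # u ~ Uniform(-0.5, 0.5)
    # Inverse-CDF sample from Laplace(0, b)
    return true_value - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

TRUE_COUNT = 680   # students reporting anxiety, out of 2,000 (from the setup)
SENSITIVITY = 1.0  # adding or removing one respondent changes the count by at most 1

for eps in (0.1, 0.5, 1.0, 2.0, 5.0, 10.0):
    b = laplace_scale(SENSITIVITY, eps)
    print(f"eps={eps:>4}: scale b = {b:6.2f}, "
          f"expected |noise| = {b:6.2f}, std dev = {math.sqrt(2.0) * b:6.2f}")
```

Reading the printed table against the ±25-student utility threshold in question 2 gives one candidate answer; note that for the per-college counts in question 3, the noise scale is unchanged but the true counts are roughly four times smaller, so the relative error grows.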
Exercise 39.3 — Privacy Policy Design Project (Individual or Pair, Major Assignment, 800–1,200 words)
Overview: Design a complete privacy policy for a hypothetical application, applying all of Chapter 39's principles.
Scenario:
"CampusHealth" is a mobile application being considered by a university health services department. The application would enable students to:
- Schedule appointments with health center providers
- Message securely with their assigned counselor
- Track their mood and mental health over time (self-reported)
- Access mental health resources and crisis hotline information
- Optionally share mood data with their counselor to support appointment preparation
CampusHealth would be built by a third-party vendor. It would run on iOS and Android. Students would use their university email to register.
Your privacy policy should address all of the following:
1. Data inventory: What categories of data are collected, from whom, and by what mechanism?
2. Lawful basis: Under what legal basis is each category of data collected? (Use the GDPR framework, even if the institution is U.S.-based.)
3. Data minimization statement: For each data category, explain why the data is necessary for the application's function and what less privacy-invasive alternatives were considered and rejected.
4. Purpose limitation: For each data category, what specific purposes is the data used for? What uses are explicitly prohibited?
5. Security measures: What technical measures protect data in transit and at rest? (Reference specific standards: end-to-end encryption for messages, encryption at rest, etc.)
6. Retention schedule: How long is each category of data retained? What triggers deletion?
7. Student rights: What rights do students have regarding their data, and how do they exercise them?
8. Third-party sharing: Under what conditions, if any, is data shared with third parties? What terms govern the vendor relationship?
9. Law enforcement: What is the application's policy on responding to law enforcement requests? Who within the university receives such requests?
10. Sensitive data: Mental health data is a particularly sensitive category. What additional protections apply to mood tracking data and counselor communications?
After drafting the policy, write a 300–400 word reflection: What was the most difficult design decision you made? Where do the principles of Chapter 39 conflict with each other, and how did you resolve those conflicts?
Exercise 39.4 — CCOPS Ordinance Drafting (Group, Major Assignment, 75–90 minutes)
Overview: Draft a Community Control Over Police Surveillance ordinance for your municipality or campus.
Instructions:
Working in groups of four or five, draft a surveillance ordinance for one of the following:
- Your city or town
- Your university campus
- A fictional mid-size city of 200,000 residents
Your ordinance should include the following provisions:
1. Definitions: Define "surveillance technology," "personal information," "biometric identifier," and "surveillance program" for purposes of the ordinance.
2. Pre-acquisition approval process: What process is required before a law enforcement or security agency acquires new surveillance technology? Who must approve? What information must be disclosed? What community input is required?
3. Prohibited technologies: Are there any surveillance technologies that should be prohibited outright, regardless of the approval process? (Facial recognition? Predictive policing? Remote biometric identification?) Justify your inclusions and exclusions.
4. Use policies: What must a surveillance use policy contain? Who approves use policies?
5. Annual reporting: What information must be publicly reported annually for each approved surveillance technology in use?
6. Accountability and enforcement: What happens when the ordinance is violated? What penalties apply? Who has standing to enforce the ordinance?
7. Appeals: Can community members challenge the acquisition or use of a surveillance technology after approval? By what process?
Deliverable: A 700–1,000 word ordinance draft with section headings and brief explanatory notes for each provision. After submitting, be prepared to present and defend your ordinance choices in a 10-minute class session.
Exercise 39.5 — The Exceptional Access Debate (Structured Discussion, 45–60 minutes)
Setup:
The FBI has asked Congress to require that all end-to-end encrypted communications platforms maintain a technical mechanism enabling law enforcement access to message content when authorized by a court order. The platforms argue this is technically impossible to implement securely.
Three positions for structured discussion:
1. Law enforcement position: Criminals and terrorists use E2EE to coordinate; "going dark" — law enforcement's inability to access communications despite legal authority — is a serious public safety problem that requires a legislative solution.
2. Cryptographer/technologist position: There is no way to build exceptional access that is available only to authorized law enforcement and not to adversaries; any backdoor is a vulnerability; a communications infrastructure with a backdoor is insecure for everyone.
3. Civil liberties position: Even if exceptional access could be implemented securely (which the cryptographers deny), the power to access all private communications with legal authorization is subject to mission creep, political abuse, and disproportionate harm to marginalized communities.
Discussion format:
- Each position presents for 5 minutes
- Cross-examination: 10 minutes
- Open discussion: 15 minutes
- Synthesis: what would a genuinely satisfactory resolution look like? Is one available? (10 minutes)