Quiz — Chapter 39: Designing for Privacy
Total: 77 points.
Part A: Multiple Choice (3 points each)
1. Ann Cavoukian's "Privacy by Design" framework holds that privacy protection should be:
a) Implemented as a compliance layer after systems are designed and deployed
b) Proactively engineered into systems before data flows begin
c) Achieved primarily through user choice and consent mechanisms
d) Mandated by regulators who audit systems after deployment
2. The "privacy as the default" principle (Cavoukian's Principle 2) means:
a) Users must actively choose to share data; privacy protection requires no action from users
b) Users can choose to enable privacy settings, which are available in all application menus
c) Regulators assume that companies have implemented privacy unless evidence of violation is presented
d) Privacy is the default legal standard, and companies must demonstrate consent to deviate from it
3. Differential privacy works by:
a) Distributing data across multiple servers to prevent any single breach from exposing all user data
b) Adding carefully calibrated random noise to query results so that any individual's presence or absence has a provably bounded effect on the output
c) Requiring that each data subject's information be encrypted with their personal key
d) Limiting data access to users who have explicitly consented to specific research uses
4. The "epsilon" (ε) parameter in differential privacy controls:
a) The maximum number of queries that can be made against a differentially private dataset
b) The proportion of users whose data is excluded from analysis to protect outliers
c) The trade-off between privacy protection and data accuracy: smaller ε means stronger privacy and more noise
d) The minimum sample size required before differential privacy can be applied
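Instructor's note: the mechanism tested in questions 3 and 4 can be illustrated with the Laplace mechanism, in which noise drawn from a Laplace distribution with scale sensitivity/ε is added to a query result, so a smaller ε produces more noise and stronger privacy. A minimal Python sketch (illustrative only; the function names are our own, not any particular library's API):

```python
import random

def laplace_noise(scale):
    # A Laplace(0, scale) sample is the difference of two
    # independent exponential samples with rate 1/scale.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(true_count, epsilon, sensitivity=1.0):
    # Laplace mechanism: noise scale = sensitivity / epsilon,
    # so smaller epsilon -> larger noise -> stronger privacy.
    return true_count + laplace_noise(sensitivity / epsilon)
```

Releasing `private_count(n, epsilon)` instead of the exact count `n` bounds how much any one person's inclusion can shift the published result, which is the trade-off that ε tunes.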
5. Federated learning differs from conventional machine learning by:
a) Training models only on data that has been anonymized before being sent to the central server
b) Distributing model training across individual devices, with only model updates (not raw data) shared centrally
c) Using differential privacy to protect training data rather than storing it on device
d) Requiring explicit user consent before including any individual's data in model training
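Instructor's note: the distinction tested in question 5 can be made concrete with a toy federated-averaging round on a scalar model. All names here (`local_update`, `federated_round`) are illustrative, not any framework's API: each client trains on its own data locally, and only the updated weight, never the raw data, travels to the server.

```python
from statistics import mean

def local_update(weight, local_data, lr=0.1):
    # Each device nudges the model toward its own data mean;
    # this is one gradient step on the loss 0.5 * (w - mean(data))^2.
    grad = weight - mean(local_data)
    return weight - lr * grad

def federated_round(global_weight, client_datasets):
    # The server receives only the updated weights and averages them.
    updates = [local_update(global_weight, d) for d in client_datasets]
    return mean(updates)

w = 0.0
clients = [[1.0, 2.0], [3.0], [4.0, 5.0, 6.0]]  # raw data stays on-device
for _ in range(50):
    w = federated_round(w, clients)
# w approaches the average of the client means without the server
# ever observing any client's raw data.
```

The privacy claim is architectural, not absolute: model updates can still leak information about training data, which is why federated learning is often combined with the differential privacy techniques from questions 3 and 4.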
6. What surveillance governance implication does end-to-end encryption create?
a) The service provider can access communications with a valid court order, creating a legal pathway for law enforcement
b) The service provider cannot read communications even with legal authorization, because it holds no decryption keys
c) Communications are encrypted in transit but stored in readable form on company servers
d) Only the sender's device holds the encryption key, meaning the recipient's data is accessible to the service provider
7. The GDPR's data minimization requirement mandates that personal data must be:
a) Deleted within two years of collection regardless of purpose
b) Anonymized before being stored in any company database
c) Adequate, relevant, and limited to what is necessary for the purposes for which it is processed
d) Stored only in EU member state territory for the duration of its retention
8. The EU AI Act's approach to AI surveillance systems is best described as:
a) Prohibiting all AI-based surveillance in the European Union
b) Requiring government ownership of all AI surveillance systems as a condition of deployment
c) Risk-based classification, with prohibited applications for the highest-risk uses and accountability requirements for high-risk uses
d) Voluntary certification programs that AI companies may choose to participate in
9. The "Community Control Over Police Surveillance" (CCOPS) model requires:
a) Direct popular vote before any surveillance technology can be acquired by law enforcement
b) Pre-acquisition democratic approval, public disclosure, use policies, and annual reporting for police surveillance technology
c) Federal Department of Justice review of all local surveillance technology acquisitions
d) Civil society organizations to be given veto power over law enforcement surveillance decisions
Part B: True/False with Justification (4 points each)
10. True or False: Cavoukian's Principle 4 — that privacy and functionality are "positive-sum, not zero-sum" — accurately describes how current commercial surveillance markets actually operate.
Briefly justify your answer (2–3 sentences).
11. True or False: End-to-end encryption can be made "safe" for law enforcement access by building in an exceptional access mechanism that only authorized agencies can use.
Briefly justify your answer (2–3 sentences), drawing on the cryptographic community's position.
12. True or False: Privacy by design, if adopted universally, would be sufficient to resolve the surveillance problems documented throughout this book.
Briefly justify your answer (2–3 sentences).
Part C: Short Answer (8 points each)
13. Explain the "aggregation problem" in privacy analysis. How do data minimization and purpose limitation work together to address it? Use a concrete example in your explanation. (150–200 words)
14. Jordan's privacy policy exercise revealed several challenges in translating good privacy principles into design. Identify two specific challenges Jordan encountered and explain what they reveal about the relationship between privacy principles and privacy implementation. (150–200 words)
15. What is algorithmic auditing, and what are its current limitations as a surveillance governance mechanism? What would need to change for algorithmic auditing to function as effective accountability? (150–200 words)
Part D: Extended Response (choose one, 14 points)
16. The chapter argues that "privacy by design is necessary but not sufficient." Construct the argument for this claim in full: Why is privacy by design necessary (what specific problems does it address that other approaches cannot)? Why is it insufficient (what problems does it fail to address, and why)? What additional elements — technical, legal, economic, political — would a sufficient response to the surveillance economy require? Your response should draw on the political economy analysis from Chapter 34. (300–400 words)
OR
17. Compare the EU AI Act's approach to AI surveillance regulation with the CCOPS municipal ordinance model. What are the distinctive strengths of each approach? What problems does each fail to address? In what ways do they complement each other, and in what ways do they overlap or conflict? Which approach do you find more promising as a governance model, and why? (300–400 words)
Answer Key and Rubric available in Instructor Resources.