Chapter 23: Quiz — Data Privacy Fundamentals

20 questions. Each question is worth 5 points. Total: 100 points.


Question 1. Under GDPR's definition, which of the following is NOT personal data?

A) A user's IP address
B) A photograph of a person's face
C) A company's registered business address
D) A device's unique advertising identifier

Correct Answer: C. A company's registered business address relates to a legal entity, not a natural person. IP addresses, facial photographs (biometric data), and device advertising identifiers all relate to identifiable natural persons and are personal data under GDPR.


Question 2. The "aggregation problem" in data privacy refers to:

A) The difficulty of complying with multiple privacy laws simultaneously
B) How combining individually innocuous data points can create serious privacy intrusions
C) The challenge of managing data across multiple storage systems
D) The problem of aggregating consent from multiple jurisdictions

Correct Answer: B. The aggregation problem describes how data that is harmless in isolation can create serious privacy violations when combined. A name, employer, neighborhood, physical description, and daily schedule are individually innocuous but together enable stalking.
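
As a toy illustration (all data below is invented), the sketch joins two individually harmless datasets on shared quasi-identifiers; the combination attaches a daily schedule to a named person:

    # Invented data: neither dataset names-and-schedules anyone on its own.
    gym_checkins = [  # "anonymous" sensor log: no names
        {"zip": "02139", "birth_year": 1984, "routine": "06:15 daily"},
        {"zip": "02139", "birth_year": 1991, "routine": "19:30 weekdays"},
    ]
    voter_roll = [  # public record: names, but no schedules
        {"name": "A. Smith", "zip": "02139", "birth_year": 1984},
        {"name": "B. Jones", "zip": "02141", "birth_year": 1991},
    ]

    # Joining on shared quasi-identifiers re-identifies the "anonymous" log.
    for v in voter_roll:
        for g in gym_checkins:
            if (v["zip"], v["birth_year"]) == (g["zip"], g["birth_year"]):
                print(f"{v['name']} is predictably at the gym: {g['routine']}")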


Question 3. Helen Nissenbaum's contextual integrity framework holds that privacy is violated when:

A) Any personal information is shared without explicit written consent
B) Information flows in ways that violate the norms of the context in which it was originally shared
C) Personal data leaves its country of origin
D) Information is shared with parties who did not originally collect it

Correct Answer: B. Contextual integrity focuses on appropriate information flows within social contexts. Medical information shared with a treating physician flows appropriately; the same information flowing to an employer violates the norms of the healthcare context.


Question 4. GDPR's "right to erasure" (right to be forgotten) requires:

A) Immediate deletion of all personal data upon any request
B) Deletion of personal data in certain specific circumstances, such as when data is no longer necessary or consent is withdrawn
C) Deletion of personal data from all third-party systems within 30 days
D) Permanent suppression of all records from search engines

Correct Answer: B. The right to erasure applies in specific circumstances defined in GDPR Article 17. It is not an absolute right: it applies when data is no longer necessary for its original purpose, when consent is withdrawn, when data has been unlawfully processed, and in several other defined circumstances.
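
As a rough sketch only (the ground names and helper below are invented, and the Article 17(3) exemptions are not modeled), an erasure-request handler might gate deletion on the defined grounds:

    # Invented ground names keyed to Article 17(1); Art. 17(3) exemptions omitted.
    ARTICLE_17_GROUNDS = {
        "no_longer_necessary",   # 17(1)(a): original purpose exhausted
        "consent_withdrawn",     # 17(1)(b): consent was the basis and is revoked
        "objection_upheld",      # 17(1)(c): Art. 21 objection, no overriding grounds
        "unlawful_processing",   # 17(1)(d)
        "legal_obligation",      # 17(1)(e): EU/member-state law requires erasure
        "child_iss_data",        # 17(1)(f): data collected from a child online
    }

    def erasure_applies(claimed_grounds: set) -> bool:
        # The right is not absolute: at least one defined ground must hold.
        return bool(claimed_grounds & ARTICLE_17_GROUNDS)

    print(erasure_applies({"consent_withdrawn"}))  # True
    print(erasure_applies({"request_received"}))   # False: a request alone is not a ground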


Question 5. Which of the following is NOT one of GDPR's six lawful bases for processing personal data?

A) Consent
B) Legitimate interests
C) Commercial necessity
D) Legal obligation

Correct Answer: C. GDPR Article 6 identifies six lawful bases: consent, contract, legal obligation, vital interests, public task, and legitimate interests. "Commercial necessity" is not a lawful basis; commercial benefit to the controller is relevant only as part of the legitimate interests balancing test.


Question 6. The principle of "purpose limitation" under GDPR means that:

A) Organizations may only process a limited number of data fields per individual
B) Personal data may only be collected for specified, explicit, and legitimate purposes and may not be processed in incompatible ways
C) Processing must be limited to the EU unless adequacy decisions are in place
D) Organizations must limit the number of purposes disclosed in their privacy policy

Correct Answer: B. Purpose limitation prevents "function creep": using data collected for one purpose for a different, incompatible purpose. Organizations must specify their purposes at collection and cannot later use the data for unrelated purposes.
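
A minimal sketch of how this can be enforced in code, assuming an invented schema and compatibility table (real compatibility assessment under Article 6(4) is more nuanced than a lookup):

    # Invented schema: every record carries the purpose it was collected for.
    COMPATIBLE = {
        "order_fulfilment": {"order_fulfilment", "fraud_prevention"},
        # "ad_targeting" is deliberately absent: an incompatible new purpose
        # would need fresh disclosure and its own lawful basis.
    }

    def may_process(record: dict, new_purpose: str) -> bool:
        # Gate any later processing against the purpose stated at collection.
        return new_purpose in COMPATIBLE.get(record["purpose"], set())

    record = {"email": "user@example.com", "purpose": "order_fulfilment"}
    print(may_process(record, "fraud_prevention"))  # True: compatible use
    print(may_process(record, "ad_targeting"))      # False: function creep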


Question 7. A Data Protection Impact Assessment (DPIA) is required under GDPR when:

A) Any new IT system is implemented
B) Processing is likely to result in high risk to the rights and freedoms of natural persons
C) An organization employs more than 250 people
D) Personal data is being transferred outside the EU

Correct Answer: B. GDPR Article 35 requires DPIAs for processing "likely to result in a high risk." This includes large-scale processing of sensitive categories, systematic monitoring of publicly accessible areas, and automated decision-making with significant effects.


Question 8. The United States' current approach to personal data privacy is best described as:

A) A comprehensive federal framework modeled on GDPR
B) A sector-specific patchwork of federal laws supplemented by state legislation
C) A purely voluntary self-regulatory framework with no binding rules
D) An opt-in system requiring consent for all commercial data processing

Correct Answer: B. The US lacks a comprehensive federal privacy law. Instead, sector-specific laws (HIPAA for health data, COPPA for children's data, GLBA for financial data, FERPA for student records) are supplemented by FTC enforcement and a growing body of state law, most prominently California's CCPA/CPRA.


Question 9. HIPAA's privacy protections apply to:

A) Any company that collects health information about consumers
B) Healthcare providers, health plans, and healthcare clearinghouses (and their business associates)
C) All US employers who maintain employee health records
D) Pharmaceutical companies and medical device manufacturers

Correct Answer: B. HIPAA applies to "covered entities" (healthcare providers that transmit health information electronically, health plans, and healthcare clearinghouses) and their "business associates." A wellness app that does not act on behalf of a covered entity is not subject to HIPAA.


Question 10. Which of the following best describes a "membership inference attack" on an AI model?

A) An attack that extracts the model's parameters from API responses
B) An attack that manipulates model outputs by inserting malicious training data
C) A technique for determining whether a specific individual's data was used in the model's training set
D) An attack that impersonates legitimate users to gain unauthorized model access

Correct Answer: C. Membership inference attacks attempt to determine whether a specific data record was included in a model's training data. This can reveal sensitive information: for example, that a specific individual was a patient if the model was trained on medical records.
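
A toy sketch of the classic loss-threshold variant of the attack, using synthetic loss values in place of a real model; the intuition is that models tend to assign lower loss to records they were trained on:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic losses: memorized training points get lower loss than unseen ones.
    member_losses = rng.normal(loc=0.2, scale=0.1, size=1000)
    nonmember_losses = rng.normal(loc=0.9, scale=0.3, size=1000)

    # Loss-threshold attack: guess "member" whenever loss falls below a threshold.
    THRESHOLD = 0.5
    tpr = float(np.mean(member_losses < THRESHOLD))     # members correctly flagged
    fpr = float(np.mean(nonmember_losses < THRESHOLD))  # non-members wrongly flagged
    print(f"attack TPR={tpr:.2f}, FPR={fpr:.2f}")       # a large gap means leakage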


Question 11. Privacy by Design's principle of "privacy as the default setting" means:

A) All privacy features must be implemented in the software defaults
B) The default configuration of a system should maximize privacy protection; users who want less privacy must actively choose to reduce it
C) Privacy settings should be visible on the default home screen
D) All data processing must require affirmative opt-in

Correct Answer: B. Privacy as default means that without any user action, the system provides maximum privacy. Pre-checked consent boxes are the opposite of this principle: they require users to take action to opt out of privacy-invasive defaults.
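
One way to express the principle in code, as a hypothetical settings object (the field names are invented) whose zero-action state is the most protective one:

    from dataclasses import dataclass

    # Hypothetical settings object; field names invented for illustration.
    @dataclass
    class PrivacySettings:
        analytics_enabled: bool = False    # off until the user opts in
        personalized_ads: bool = False     # off until the user opts in
        location_sharing: bool = False     # off until the user opts in
        retention_days: int = 30           # shortest retention by default

    settings = PrivacySettings()               # a user who takes no action...
    assert settings.personalized_ads is False  # ...still gets maximum privacy
    # The anti-pattern is the reverse: True defaults plus a buried opt-out.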


Question 12. Brazil's data protection law (LGPD) is significant because:

A) It was the first comprehensive data protection law in the world
B) It provides GDPR-equivalent protection in Latin America's largest economy
C) It is the only law to explicitly address AI-specific privacy risks
D) It established the first global data protection authority

Correct Answer: B. Brazil's LGPD, which took effect in 2020, is closely modeled on GDPR and provides comprehensive data protection in Latin America's largest economy. It was not the world's first such law: GDPR alone predates it (enacted 2016, effective 2018).


Question 13. The "chilling effect" in the context of surveillance and privacy refers to:

A) The financial impact of data breaches on company valuations
B) The modification of behavior caused by awareness of surveillance, even without any specific threat
C) The cooling of innovation caused by overly restrictive privacy regulations
D) The dampening of consumer enthusiasm for privacy-invasive products

Correct Answer: B. The chilling effect describes how people modify their behavior (self-censoring speech, avoiding certain searches, refraining from political activity) when they know or suspect they are being watched, even if no specific consequences have materialized.


Question 14. Comparing GDPR with CCPA/CPRA: which right exists under GDPR but NOT under CCPA/CPRA?

A) Right to know what personal information is collected
B) Right to delete personal information
C) Right to not be subject to solely automated decisions with significant legal effects
D) Right to opt out of the sale of personal information

Correct Answer: C. GDPR Article 22 provides a right not to be subject to solely automated decisions with significant effects. While CPRA creates some rights related to automated decision-making, it does not include an equivalent comprehensive protection against fully automated decision-making.


Question 15. The key finding in the ICO's investigation of the Royal Free / DeepMind arrangement was that:

A) Google DeepMind had accessed data it was not authorized to receive
B) The Royal Free had shared patient data more broadly than the AKI detection purpose justified, and without adequately informing patients
C) The data sharing agreement violated NHS confidentiality rules
D) DeepMind had monetized patient data in violation of its agreement with the Royal Free

Correct Answer: B. The ICO found that the Royal Free had shared data beyond what was proportionate to the AKI detection purpose and had failed to adequately inform patients about the commercial partnership. The "direct care" justification was inadequate for 1.6 million patients, most of whom would never need AKI detection services from that trust.


Question 16. Which privacy risk is MOST specific to AI systems, as opposed to traditional databases?

A) Unauthorized access by external attackers
B) Failure to delete data when retention periods expire
C) The ability to infer sensitive attributes not directly present in the training data
D) Failure to notify individuals about data collection

Correct Answer: C. While all the listed risks apply to databases, the ability to infer sensitive attributes from non-sensitive data (deriving health status from purchasing behavior, or sexual orientation from browsing history) is particularly characteristic of AI systems. Traditional databases return what is stored; AI systems can generate new inferences.
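
A synthetic illustration of attribute inference (invented data; assumes numpy and scikit-learn are available): a sensitive attribute is never stored, yet a simple model recovers it from correlated, innocuous features:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000

    # A sensitive attribute (say, a health condition) is never stored...
    condition = rng.integers(0, 2, size=n)
    # ...but innocuous purchase features are correlated with it.
    purchases = np.column_stack([
        condition + rng.normal(0, 0.7, n),  # e.g. spend on one product category
        condition + rng.normal(0, 0.9, n),  # e.g. spend on another
        rng.normal(0, 1.0, n),              # unrelated noise feature
    ])

    # The "database" holds only purchases; a model recreates the secret.
    model = LogisticRegression().fit(purchases[:1500], condition[:1500])
    print(f"inferred sensitive attribute accuracy: "
          f"{model.score(purchases[1500:], condition[1500:]):.2f}")  # well above 0.5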


Question 17. Which of the following best describes the "data minimization" principle?

A) Organizations should collect the minimum number of data fields required to stay under regulatory thresholds
B) Personal data must be adequate, relevant, and limited to what is necessary for the purpose of processing
C) Data should be stored in the smallest possible database format to reduce breach risk
D) Organizations should collect data from the minimum number of sources

Correct Answer: B. GDPR's data minimization principle (Article 5(1)(c)) requires that personal data be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed." This principle challenges the AI development tendency to collect as much data as possible.
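
A minimal sketch of minimization enforced at the collection boundary, with invented field names; only the fields the stated purpose needs ever enter storage:

    # Invented purposes and field names, for illustration only.
    NEEDED_FOR = {
        "shipping": {"name", "street", "city", "postal_code"},
    }

    def minimize(raw: dict, purpose: str) -> dict:
        # Keep only the fields the stated purpose actually requires.
        allowed = NEEDED_FOR[purpose]
        return {k: v for k, v in raw.items() if k in allowed}

    raw_form = {
        "name": "Ada", "street": "1 Main St", "city": "Lund",
        "postal_code": "22100",
        "birth_date": "1990-01-01", "phone": "+46 70 000 0000",  # over-collected
    }
    print(minimize(raw_form, "shipping"))  # birth_date and phone are dropped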


Question 18. What is the maximum fine available under GDPR for the most serious violations?

A) 10 million euros
B) 500,000 pounds sterling
C) 20 million euros or 4% of global annual turnover, whichever is higher
D) 1% of EU revenue

Correct Answer: C. GDPR Article 83 provides for fines up to 20 million euros or 4% of global annual turnover, whichever is higher, for the most serious violations. A second tier of fines (10 million euros / 2% of turnover) applies to less serious violations.
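
The two-part cap is simple arithmetic; a small worked sketch of the upper tier:

    # Article 83(5) upper tier: 20 million euros or 4% of global annual
    # turnover, whichever is HIGHER. The fixed figure acts as a floor for
    # small firms; the percentage dominates for large ones.
    def max_fine_upper_tier(global_turnover_eur: float) -> float:
        return max(20_000_000, 0.04 * global_turnover_eur)

    print(max_fine_upper_tier(100_000_000))    # smaller firm -> 20,000,000 (floor)
    print(max_fine_upper_tier(5_000_000_000))  # large firm  -> 200,000,000 (4%)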


Question 19. "Machine unlearning" refers to:

A) The process of removing bias from machine learning models
B) Techniques for removing the influence of specific training examples from a trained model without full retraining
C) The degradation of model performance over time as data distributions change
D) Regulatory requirements to retrain models after data privacy violations

Correct Answer: B. Machine unlearning addresses the technical challenge posed by the right to erasure: once data has been used to train a model, simply deleting the raw data does not remove its influence on the model's parameters. Unlearning techniques attempt to remove that influence without the cost of retraining from scratch.
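
A toy sketch of one exact-unlearning idea, sharded training in the style of SISA (Bourtoule et al.): train independent sub-models per shard so that a deletion request retrains only the affected shard. The "model" here is just a shard mean, standing in for real training:

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=100)

    # Shard the data and "train" one sub-model per shard (here, its mean).
    shards = np.array_split(data, 5)
    models = [s.mean() for s in shards]

    def predict() -> float:
        # Inference aggregates the independent sub-models.
        return float(np.mean(models))

    def unlearn(value: float) -> None:
        # Erase one example by retraining only its shard, not everything.
        for i, s in enumerate(shards):
            if value in s:
                shards[i] = s[s != value]     # drop the example
                models[i] = shards[i].mean()  # retrain just this sub-model
                return

    before = predict()
    unlearn(float(data[3]))
    print(f"prediction before={before:.4f}, after={predict():.4f}")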


Question 20. Which of the following is NOT one of Ann Cavoukian's seven Privacy by Design foundational principles?

A) Privacy as the default setting
B) End-to-end security across the data lifecycle
C) Mandatory third-party privacy audits
D) Respect for user privacy through user-centric design

Correct Answer: C. Cavoukian's seven principles are: (1) proactive/preventive, (2) privacy as default, (3) privacy embedded in design, (4) full functionality (positive-sum), (5) end-to-end security, (6) visibility and transparency, and (7) respect for user privacy. Mandatory third-party audits are not one of the seven foundational principles, though audits may be a good practice.