Quiz: Health Data, Genetic Data, and Biometric Privacy

Test your understanding before moving to the next chapter. Target: 70% or higher to proceed.


Section 1: Multiple Choice (1 point each)

1. HIPAA's Privacy Rule applies to which of the following entities?

  • A) All companies that collect health-related data, including fitness apps and wellness platforms.
  • B) Covered entities (health plans, healthcare clearinghouses, and healthcare providers who transmit health information electronically) and their business associates.
  • C) Any organization that stores data on a server located in the United States.
  • D) Only hospitals and physician practices, not insurance companies or pharmacies.
**Answer: B)** Covered entities (health plans, healthcare clearinghouses, and healthcare providers who transmit health information electronically) and their business associates. *Explanation:* Section 12.1.1 defines HIPAA's scope. HIPAA applies to covered entities — health plans, healthcare clearinghouses, and healthcare providers who conduct certain electronic transactions — and their business associates (companies that handle PHI on behalf of covered entities). Critically, many entities that hold sensitive health data are NOT covered: fitness apps (unless they are business associates of a covered entity), wellness platforms, direct-to-consumer genetic testing companies, and health-related social media communities all fall outside HIPAA's scope. This gap is one of the chapter's central concerns.

2. Protected Health Information (PHI) under HIPAA includes:

  • A) Only a patient's medical diagnosis and treatment records.
  • B) Any individually identifiable health information held or transmitted by a covered entity or business associate, including demographic information, medical records, billing records, and any information that could identify the individual.
  • C) Only data stored in electronic health record systems.
  • D) Only information shared between doctors during a clinical consultation.
**Answer: B)** Any individually identifiable health information held or transmitted by a covered entity or business associate, including demographic information, medical records, billing records, and any information that could identify the individual. *Explanation:* Section 12.1.1 defines PHI broadly: it encompasses any information about health status, provision of healthcare, or payment for healthcare that can be linked to an individual, held or transmitted by a covered entity or business associate. This includes not just clinical data but also billing records, appointment records, insurance claims, and demographic details when linked to health information. The key qualifier is that it must be held by a covered entity — the same information held by a non-covered entity (e.g., a health app) is not PHI under HIPAA.

3. The Genetic Information Nondiscrimination Act (GINA) prohibits the use of genetic information in:

  • A) Health insurance underwriting and employment decisions, but not life insurance, disability insurance, or long-term care insurance.
  • B) All forms of insurance and all employment decisions.
  • C) Criminal investigations and law enforcement databases.
  • D) Direct-to-consumer genetic testing marketing.
**Answer: A)** Health insurance underwriting and employment decisions, but not life insurance, disability insurance, or long-term care insurance. *Explanation:* Section 12.2.2 describes GINA's scope and its critical gaps. GINA's Title I prohibits health insurers from using genetic information to make coverage or premium decisions, and Title II prohibits employers from using genetic information in hiring, firing, or promotion decisions. However, GINA does not cover life insurance, disability insurance, or long-term care insurance — leaving significant gaps. A person who undergoes genetic testing revealing a predisposition to a serious illness could be denied life insurance based on that information without violating GINA. This gap has been identified as a major limitation.

4. In the Golden State Killer case, law enforcement used which technique to identify the suspect?

  • A) They obtained a warrant for the suspect's 23andMe data.
  • B) They uploaded crime scene DNA to the public genealogy database GEDmatch and identified distant relatives of the suspect, then used traditional genealogy to narrow the suspect pool.
  • C) They required all residents of the suspect's neighborhood to provide DNA samples.
  • D) They used facial recognition to match the suspect's photo to a driver's license database.
**Answer: B)** They uploaded crime scene DNA to the public genealogy database GEDmatch and identified distant relatives of the suspect, then used traditional genealogy to narrow the suspect pool. *Explanation:* Section 12.2.3 describes the investigative technique: law enforcement uploaded crime scene DNA to GEDmatch, a public genealogy database where individuals voluntarily share genetic profiles for ancestry research. The DNA matched several distant relatives (third and fourth cousins) of the suspect. Investigators then used traditional genealogy — family trees, public records, geographic analysis — to narrow the candidates to Joseph James DeAngelo, who was confirmed through a discarded DNA sample. The technique relied on the genetic data of relatives who had never consented to law enforcement use of their profiles.

5. The Illinois Biometric Information Privacy Act (BIPA) requires:

  • A) All companies to delete biometric data after 30 days.
  • B) Informed consent before collecting biometric identifiers, a written policy on retention and destruction, and a prohibition on selling or otherwise profiting from biometric data.
  • C) Only government agencies to obtain consent before using facial recognition.
  • D) Companies to encrypt biometric data but does not restrict its collection.
**Answer: B)** Informed consent before collecting biometric identifiers, a written policy on retention and destruction, and a prohibition on selling or otherwise profiting from biometric data. *Explanation:* Section 12.3.2 describes BIPA's requirements: organizations must obtain informed, written consent before collecting biometric identifiers (fingerprints, facial geometry, iris scans, voiceprints); must publish a written policy establishing retention schedules and guidelines for permanent destruction; and must not sell, lease, trade, or otherwise profit from biometric data. Critically, BIPA includes a private right of action — individuals can sue for violations and recover $1,000 per negligent violation or $5,000 per intentional or reckless violation. This private right of action, unique among state biometric laws, has been the engine driving significant litigation.

6. The Robert Williams wrongful arrest case illustrates which specific problem with facial recognition technology?

  • A) Facial recognition is always inaccurate and should never be used.
  • B) Facial recognition systems have documented accuracy disparities by race and gender, with higher error rates for darker-skinned individuals, and their use in law enforcement can lead to wrongful arrests.
  • C) Facial recognition violates the Fourth Amendment in all circumstances.
  • D) The technology works perfectly but was applied to the wrong database.
**Answer: B)** Facial recognition systems have documented accuracy disparities by race and gender, with higher error rates for darker-skinned individuals, and their use in law enforcement can lead to wrongful arrests. *Explanation:* Section 12.3.3 describes the Robert Williams case: in January 2020, Williams, a Black man in Detroit, was wrongfully arrested based on a false facial recognition match. The system matched a grainy surveillance image to Williams' driver's license photo, and Detroit police used this match as the basis for an arrest warrant — without independent verification. Williams was handcuffed in front of his family, detained for 30 hours, and released when investigators realized the match was incorrect. Research by Joy Buolamwini and Timnit Gebru (the Gender Shades study) has documented that commercial facial recognition systems have error rates of up to 34.7% for darker-skinned women compared to less than 1% for lighter-skinned men.

7. Which of the following health-related data scenarios is NOT covered by HIPAA?

  • A) A hospital sharing patient records with an insurance company for claims processing.
  • B) A user entering their blood pressure readings into a smartphone app that has no relationship with a healthcare provider.
  • C) A pharmacy transmitting prescription records to a health plan electronically.
  • D) A physician's office sending lab results to a patient's referring specialist.
**Answer: B)** A user entering their blood pressure readings into a smartphone app that has no relationship with a healthcare provider. *Explanation:* Section 12.1.2 highlights this as one of HIPAA's most significant gaps. When a user enters health data into a consumer app that is not a covered entity or business associate, HIPAA does not apply. The app developer is free to share, sell, or use that data under its own privacy policy, subject only to general consumer protection law (such as FTC Section 5 authority over unfair or deceptive practices). This gap is increasingly consequential as more health-related data is generated outside the traditional healthcare system — through fitness trackers, mental health apps, fertility tracking apps, and wellness platforms.

8. Why does the chapter argue that biometric data poses "irreversible" privacy risks?

  • A) Biometric data is more expensive to collect than other types of personal data.
  • B) Biometric identifiers — fingerprints, facial geometry, iris patterns — are permanently tied to the body and cannot be changed or reset if compromised, unlike passwords, credit card numbers, or even Social Security numbers.
  • C) Biometric data is always more accurate than other forms of identification.
  • D) Biometric data is only collected by government agencies, which have permanent databases.
**Answer: B)** Biometric identifiers — fingerprints, facial geometry, iris patterns — are permanently tied to the body and cannot be changed or reset if compromised, unlike passwords, credit card numbers, or even Social Security numbers. *Explanation:* Section 12.3.1 makes this argument as the foundation for why biometric data requires special protection. If a password is stolen, you change it. If a credit card number is compromised, the bank issues a new one. But if your fingerprint template, facial geometry, or iris scan is stolen or leaked, there is no remediation — you cannot get new fingerprints. This permanence means that a single biometric data breach creates a lifelong vulnerability, which is why BIPA and other biometric privacy laws impose stricter requirements than general data protection laws.

9. VitraMed's first privacy incident, as described in the chapter, involved:

  • A) A cyberattack that exfiltrated patient records to a foreign server.
  • B) An employee accessing and leaking celebrity patient records to the media.
  • C) A configuration error that exposed patient portal data, combined with an analytics integration that shared health data with a third-party vendor outside HIPAA's scope.
  • D) A ransomware attack that encrypted all patient records.
**Answer: C)** A configuration error that exposed patient portal data, combined with an analytics integration that shared health data with a third-party vendor outside HIPAA's scope. *Explanation:* Section 12.1.3 describes VitraMed's first privacy incident as a compound problem: a misconfigured patient portal allowed unauthorized access to some patient records, and an analytics tool integrated into the portal was sharing page-view data — including URLs containing patient identifiers and diagnosis codes — with a third-party analytics vendor that was not a HIPAA business associate. The incident illustrates how health data can leak through technical integrations rather than dramatic breaches, and how the gap between HIPAA-covered and non-covered entities creates real-world exposure.

10. The chapter's discussion of DTC genetic testing highlights which fundamental consent problem?

  • A) Genetic testing companies make it too easy to consent, using simple checkboxes.
  • B) When one person shares their genetic data, they are also revealing genetic information about their biological relatives — who did not consent and may not even be aware.
  • C) DTC genetic testing companies refuse to obtain consent from users.
  • D) Consent is unnecessary for genetic data because it is not personally identifiable.
**Answer: B)** When one person shares their genetic data, they are also revealing genetic information about their biological relatives — who did not consent and may not even be aware. *Explanation:* Section 12.2.1 identifies this as a fundamental problem unique to genetic data. DNA is shared: siblings share approximately 50% of their DNA, first cousins approximately 12.5%, and even third cousins share detectable genetic similarities. When one person uploads their genetic profile to a database, they are effectively sharing partial genetic information about every biological relative. This creates a consent externality — the decision of one person imposes privacy consequences on others who had no role in the decision. The Golden State Killer case dramatically illustrates this: investigators identified the suspect through a distant relative's voluntarily uploaded DNA, not through the suspect's own data.

Section 2: True/False with Justification (1 point each)

For each statement, determine whether it is true or false and provide a brief justification.

11. "HIPAA requires healthcare providers to obtain patient consent before sharing medical records with other healthcare providers for treatment purposes."

**Answer: False (with nuance).** *Explanation:* Section 12.1.1 explains that HIPAA's Privacy Rule permits covered entities to use and disclose PHI without patient authorization for three core purposes: treatment, payment, and healthcare operations (TPO). A doctor can share a patient's records with a specialist for referral purposes without obtaining specific consent for that disclosure. While HIPAA requires a general "Notice of Privacy Practices" that informs patients about how their information may be used, it does not require separate consent for each treatment-related disclosure. Specific authorization is required for uses outside TPO — such as marketing, research (in most cases), or disclosure to employers.

12. "GINA protects individuals from genetic discrimination by life insurance companies."

**Answer: False.** *Explanation:* Section 12.2.2 explicitly identifies this as a major gap in GINA. GINA prohibits genetic discrimination in health insurance and employment, but it does not apply to life insurance, disability insurance, or long-term care insurance. Life insurers can legally ask applicants about genetic test results and use that information to deny coverage or set premiums. This gap has been widely criticized and has been cited as a reason some individuals avoid genetic testing — the fear that results could be used against them in non-health insurance contexts.

13. "The Gender Shades study by Joy Buolamwini and Timnit Gebru found that commercial facial recognition systems performed equally well across all demographic groups."

**Answer: False.** *Explanation:* Section 12.3.3 describes the Gender Shades study as demonstrating significant accuracy disparities. Buolamwini and Gebru tested commercial facial recognition systems from IBM, Microsoft, and Face++ and found error rates of up to 34.7% for darker-skinned women compared to less than 1% for lighter-skinned men. This disparity reflects bias in training data (datasets overrepresenting lighter-skinned faces), algorithmic design choices, and the underlying physics of how imaging systems capture different skin tones. The study was influential in prompting industry improvements and legislative action.

14. "Under Illinois BIPA, only government enforcement agencies can bring legal action for violations — individuals cannot sue directly."

**Answer: False.** *Explanation:* Section 12.3.2 highlights BIPA's private right of action as one of its most distinctive and powerful features. Unlike most state privacy laws, BIPA allows individuals to bring lawsuits directly against organizations that violate the statute. Plaintiffs can recover $1,000 per negligent violation or $5,000 per intentional or reckless violation, plus attorneys' fees. This private right of action has driven hundreds of lawsuits and class actions — including major cases against Facebook (settled for $650 million) and BNSF Railway (a jury verdict of $228 million) — making BIPA one of the most consequential privacy laws in the United States.
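The litigation engine here is partly just arithmetic: per-person, per-violation statutory damages multiply quickly. A back-of-envelope sketch, with a hypothetical class size and scan count (per-scan accrual reflects the Illinois Supreme Court's holding in Cothron v. White Castle):

```python
# Statutory damages under BIPA: $1,000 per negligent violation,
# $5,000 per intentional or reckless violation, plus attorneys' fees.
def bipa_exposure(class_size, violations_each, per_violation=1_000):
    """Total statutory damages if every class member recovers."""
    return class_size * violations_each * per_violation

# Hypothetical employer: 10,000 workers, one negligent violation each
# (e.g., a fingerprint enrollment performed without written consent).
print(f"${bipa_exposure(10_000, 1):,}")    # $10,000,000

# If claims accrue per scan (cf. Cothron v. White Castle): two clock-ins
# a day over 250 workdays -> 500 violations per worker in a single year.
print(f"${bipa_exposure(10_000, 500):,}")  # $5,000,000,000
```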

15. "Health data generated by consumer wellness apps and fitness trackers is generally subject to the same legal protections as health data in a hospital's electronic health record system."

**Answer: False.** *Explanation:* Section 12.1.2 makes this point as one of the chapter's central arguments. Health data in a hospital EHR is protected by HIPAA because the hospital is a covered entity. Health data in a consumer wellness app — even if it contains identical information (heart rate, blood pressure, medication logs) — is generally not protected by HIPAA because the app developer is typically not a covered entity or business associate. The app's use of the data is governed only by its own privacy policy and general consumer protection law. This creates a regulatory gap where some of the most sensitive health data — mental health app records, fertility tracking data, substance use treatment information — receives the weakest legal protection because it is held by entities outside HIPAA's scope.

Section 3: Short Answer (2 points each)

16. Explain the concept of "familial DNA searching" as used in the Golden State Killer case. Why does this technique raise privacy concerns that are fundamentally different from other law enforcement surveillance methods?

**Sample answer:** Familial DNA searching involves uploading crime scene DNA to a genetic genealogy database and searching for partial matches — people who share enough DNA with the unknown suspect to be biological relatives. In the Golden State Killer case, investigators uploaded the suspect's DNA to GEDmatch and found distant relatives (third and fourth cousins). They then built a family tree using traditional genealogy techniques to identify potential suspects, eventually zeroing in on Joseph James DeAngelo.

This technique raises unique privacy concerns because it implicates people who are not suspects and who never consented to law enforcement use of their data. When a person voluntarily uploads their DNA to GEDmatch or a similar platform, they consent to ancestry matching with other users — not to serving as an investigative lead for law enforcement. More fundamentally, the technique works because of shared biology: a person's decision to upload their DNA exposes their relatives to potential law enforcement scrutiny, regardless of those relatives' own choices. This is structurally different from other surveillance methods (phone tapping, location tracking, financial monitoring) that target specific individuals. Familial DNA searching is inherently communal — it searches through biological relationships, not individual actions — meaning that one person's privacy decision has inescapable consequences for their entire biological family.

*Key points for full credit:*
- Explains how familial DNA searching works (partial matches, genealogy narrowing)
- Identifies that it implicates non-consenting relatives
- Distinguishes it from individual-targeting surveillance methods
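The reach of this technique follows from simple arithmetic about shared DNA. As a rough illustration (the expected-sharing formula below is a standard population-genetics approximation, not a figure from the chapter):

```python
# Expected fraction of autosomal DNA shared with a full nth cousin:
# sharing halves twice for each generation back to the common ancestors,
# giving roughly (1/2) ** (2n + 1).
names = {1: "first", 2: "second", 3: "third", 4: "fourth"}
for n, name in names.items():
    shared = 0.5 ** (2 * n + 1)
    print(f"{name} cousins: ~{shared:.3%} of DNA shared")
# first: ~12.500%   second: ~3.125%   third: ~0.781%   fourth: ~0.195%
```

Even sub-1% overlaps are detectable with modern genotyping, and because the number of nth cousins grows rapidly with each degree, a database covering only a small share of the population can yield a usable match for most people in it, which is why a single relative's upload can expose an entire extended family.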

17. The chapter describes a "regulatory patchwork" for health-related data in the United States. What does this mean, and what practical problems does it create for individuals?

**Sample answer:** The "regulatory patchwork" refers to the fragmented, sector-specific nature of U.S. health data protection. HIPAA covers data held by healthcare providers, health plans, and their business associates. GINA covers genetic information in health insurance and employment. BIPA (in Illinois) and similar state laws cover biometric data. The FTC enforces against deceptive practices by apps and websites. State consumer protection laws add additional layers. But there is no comprehensive federal law covering all health-related data regardless of who holds it.

This creates practical problems: the same data (e.g., a heart rate reading) receives different levels of protection depending on who holds it. A cardiologist's record of your heart rate is PHI under HIPAA. The same heart rate reading from a Fitbit is governed only by Fitbit's privacy policy. A patient who shares their health data across clinical providers, consumer apps, genetic testing services, and insurance wellness programs is subject to multiple overlapping and gapped regulatory regimes — and cannot reasonably be expected to understand which protections apply where. For individuals, the practical result is confusion, inconsistent protection, and a false sense of security — many people assume that all health data is "protected" without understanding that protection varies dramatically based on who holds it.

*Key points for full credit:*
- Defines the patchwork as sector-specific, fragmented regulation
- Provides the example of the same data receiving different protection based on the holder
- Identifies practical harm to individuals (confusion, inconsistent protection)

18. Explain how the Robert Williams wrongful arrest case connects to both the technical problem of algorithmic bias and the governance problem of how law enforcement uses automated tools. Why is the case significant beyond its individual facts?

**Sample answer:** The Robert Williams case connects to algorithmic bias because the facial recognition system produced a false match — identifying Williams as a shoplifting suspect based on a grainy surveillance image. Research has shown that these systems have significantly higher error rates for darker-skinned individuals, and Williams, a Black man, was a predictable victim of this documented bias. But the case also reveals a governance failure: even if the technology had been more accurate, the Detroit Police Department's process was inadequate. Officers used the facial recognition output as the primary basis for an arrest warrant rather than as an investigative lead requiring independent corroboration. There was no policy requiring human review, no minimum confidence threshold, and no documentation of the role facial recognition played in the investigation.

The case is significant beyond its individual facts for three reasons. First, it was the first publicly documented wrongful arrest in the U.S. attributed to facial recognition, making visible a harm that may have occurred many times without being identified. Second, it demonstrated that accuracy disparities by race in facial recognition are not abstract statistics — they produce concrete, life-altering harms (Williams was handcuffed in front of his daughters and detained for 30 hours). Third, it catalyzed legislative and policy responses, including calls for bans on government use of facial recognition and requirements for accuracy standards and audit mechanisms.

*Key points for full credit:*
- Connects to algorithmic bias (accuracy disparities by race)
- Connects to governance (inadequate police procedures for using automated tools)
- Explains why the case is significant beyond the individual facts

19. If HIPAA were being written today, what three changes would you recommend to address the gaps identified in this chapter? For each, explain the specific gap it would close and the practical effect it would have.

**Sample answer:** First, I would expand HIPAA's scope to cover all entities that collect, store, or process health-related data — not just covered entities and business associates. This would bring fitness apps, wellness platforms, mental health apps, and DTC genetic testing companies under HIPAA's protections, closing the most significant gap in the current framework. The practical effect would be that a user's health data receives the same protection regardless of whether it was generated by a cardiologist or a smartwatch.

Second, I would add a data minimization requirement. Current HIPAA requires safeguards for the data that is collected but does not limit what is collected. A modernized HIPAA should require that health-related data collection be limited to what is necessary for the specified purpose — aligning with the GDPR's data minimization principle and Chapter 10's Privacy by Design framework. The practical effect would be to prevent the accumulation of unnecessary health data that increases both breach risk and potential for misuse.

Third, I would add a private right of action, modeled on BIPA, allowing individuals to sue for HIPAA violations. Currently, HIPAA enforcement is exclusively federal (through the HHS Office for Civil Rights), and individual patients cannot bring lawsuits for violations. Adding a private right of action would increase enforcement capacity and give individuals direct recourse when their health data is mishandled. The BIPA experience demonstrates that private enforcement can be a powerful driver of compliance.

*Key points for full credit:*
- Identifies three specific, distinct reforms
- Links each to a specific gap identified in the chapter
- Explains practical effects, not just abstract principles

Section 4: Applied Scenario (5 points)

20. Read the following scenario and answer all parts.

Scenario: MindWell Mental Health App

MindWell is a mental health app that provides guided meditation, mood tracking, and AI-powered "therapy chatbot" sessions. It has 5 million users. MindWell collects: user-entered mood logs (including free-text journal entries), voice recordings from guided meditation sessions, facial expressions captured through the phone's camera during check-ins (to assess emotional state), medication logs, and responses to standardized depression and anxiety screening questionnaires (PHQ-9 and GAD-7).

MindWell is not a HIPAA-covered entity. Its privacy policy states that it may "share de-identified and aggregated data with research partners and third parties." MindWell has a data-sharing agreement with a pharmaceutical company that is developing antidepressant medications — the pharma company receives aggregate (but granular) data on medication usage patterns, symptom severity scores, and user demographics.

A journalist discovers that MindWell's "de-identified" research dataset includes ZIP codes, age ranges (in 5-year bands), gender, specific medication names, and PHQ-9 scores — and that for some combinations of these quasi-identifiers, only one or two individuals in the dataset match.

(a) Identify all the categories of sensitive data MindWell collects. For each category, state whether it falls under HIPAA, GINA, BIPA, or no sector-specific federal law. (1 point)

(b) Analyze the re-identification risk of MindWell's "de-identified" research dataset, using the concepts from Chapter 10 (k-anonymity, quasi-identifiers). Is the dataset truly de-identified? What specific re-identification attacks are possible? (1 point)

(c) The pharmaceutical company's access to MindWell data is legal under current U.S. law. Evaluate whether it is ethical. Consider: the sensitivity of the data, the information asymmetry between MindWell and its users, and the power dynamics involved. (1 point)

(d) MindWell captures facial expressions to assess emotional state. Under BIPA (in Illinois), would this capture constitute collection of a "biometric identifier"? What consent requirements would apply? What about in states without BIPA-like laws? (1 point)

(e) Propose a comprehensive governance framework for MindWell that addresses the privacy gaps identified in parts (a) through (d). Your framework should include at least one technical measure (from Chapter 10), one economic consideration (from Chapter 11), and one sector-specific legal reform (from Chapter 12). (1 point)

**Sample answer:**

**(a)** Sensitive data categories:

- **Mood logs and journal entries** — mental health data. Not covered by HIPAA (MindWell is not a covered entity). No sector-specific federal protection.
- **Voice recordings** — biometric data (voiceprint) and potentially health data (voice analysis can reveal emotional states). Covered by BIPA in Illinois (voiceprint is explicitly listed as a biometric identifier). Not covered by HIPAA or any federal biometric law.
- **Facial expressions** — biometric data (facial geometry). Covered by BIPA in Illinois. Not covered by HIPAA or federal law.
- **Medication logs** — health data. Not covered by HIPAA (MindWell is not a covered entity). No sector-specific federal protection.
- **PHQ-9 and GAD-7 scores** — mental health screening data. Not covered by HIPAA. No sector-specific federal protection.
- **Demographics** — general personal data. Subject to state consumer privacy laws (CCPA in California) but no health-specific protection.

The critical gap: some of the most sensitive health data imaginable — mental health screenings, medication records, emotional expression data — falls completely outside HIPAA because it is held by a non-covered entity.

**(b)** The dataset is not truly de-identified. The quasi-identifiers — ZIP code, 5-year age range, gender, specific medication name, and PHQ-9 score — create a high-dimensional space where many combinations are unique or near-unique. For example, a 30-34-year-old female in ZIP code 48226 taking a specific medication with a PHQ-9 score of 19 may be the only person in the dataset matching that profile, achieving only 1-anonymity (see the first sketch following this answer). Cross-referencing with pharmacy records, insurance claims, or even social media posts about mental health could enable re-identification. The combination of specific medication names and depression severity scores is particularly dangerous — individuals taking uncommon medications for severe depression are a small population, and the combination of demographic and clinical attributes can narrow identification to a single person.

**(c)** The pharmaceutical company's access is legal but ethically problematic on several grounds. First, information asymmetry: MindWell's privacy policy uses vague language ("de-identified and aggregated data" shared with "research partners and third parties") that does not inform users that a pharmaceutical company is receiving detailed mental health data. Users who download a meditation app do not expect their depression scores to reach a drug manufacturer. Second, the sensitivity of the data is extreme — mental health data carries stigma and can affect employment, relationships, custody, and insurance. Users are in a vulnerable state (they sought help for mental health concerns) and are poorly positioned to evaluate downstream data flows. Third, the power dynamic is deeply asymmetric: the pharmaceutical company is a sophisticated commercial entity purchasing data from a population that is often in distress, for the purpose of commercializing treatments that those same users may eventually purchase. The users bear the privacy risk; the pharma company captures the commercial value.

**(d)** Under BIPA, facial geometry is explicitly listed as a biometric identifier. If MindWell operates in Illinois or collects biometric data from Illinois residents, BIPA would apply. MindWell would need to: (1) inform each user in writing that facial geometry is being collected and for what purpose, (2) obtain written consent before the first capture, (3) publish a retention and destruction schedule, and (4) refrain from selling or profiting from the biometric data. The current implementation — capturing facial expressions automatically during check-ins — almost certainly violates BIPA if Illinois residents are among its users, as the app likely does not obtain the specific, informed, written consent BIPA requires. In states without BIPA-like laws, there is no specific biometric consent requirement, and the collection is governed only by MindWell's general privacy policy — a significantly weaker protection.

**(e)** Comprehensive governance framework:

- **Technical (Chapter 10):** Implement differential privacy on any research data shared externally, with a managed privacy budget. Replace the current "de-identified" dataset with a differentially private query interface (see the second sketch following this answer) that allows the pharmaceutical company to ask aggregate statistical questions without receiving microdata. This prevents re-identification while preserving research utility.
- **Economic (Chapter 11):** Internalize the privacy externality by requiring MindWell to carry data breach insurance proportional to the sensitivity of the data it holds, and to disclose to users the economic value of the data partnerships it maintains. If users understood that their mental health data generates revenue for MindWell through pharmaceutical data sales, they would be better positioned to evaluate the trade-off.
- **Legal reform (Chapter 12):** Extend HIPAA-equivalent protections to all entities that collect health-related data, regardless of whether they are traditional healthcare providers. MindWell should be required to treat mood logs, PHQ-9 scores, and medication records with the same safeguards that a hospital would apply — including purpose limitations, minimum necessary standards, and individual rights of access and deletion. Additionally, enact federal biometric privacy legislation modeled on BIPA, requiring informed consent for facial and voice data collection by all entities, with a private right of action.
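A minimal sketch of the k-anonymity check described in part (b): group records by their quasi-identifier combination and report the size of the smallest group. All rows, ZIP codes, and medication names below are hypothetical, not from the MindWell dataset.

```python
from collections import Counter

# Hypothetical "de-identified" rows: (ZIP, age band, gender, medication,
# PHQ-9 score) -- the quasi-identifier combination from part (b).
records = [
    ("48226", "30-34", "F", "sertraline",  19),
    ("48226", "30-34", "F", "sertraline",  19),
    ("48226", "30-34", "F", "venlafaxine", 19),  # a unique combination
    ("60601", "25-29", "M", "sertraline",   8),
    ("60601", "25-29", "M", "sertraline",   8),
]

group_sizes = Counter(records)           # records per quasi-identifier combo
k = min(group_sizes.values())            # dataset-wide k-anonymity
print(f"The dataset is {k}-anonymous")   # -> 1-anonymous: not de-identified

# Any combination matching a single record can be linked to one person by
# anyone who observes those attributes elsewhere (pharmacy, social media):
for combo, size in group_sizes.items():
    if size == 1:
        print("re-identifiable:", combo)
```

Because k is set by the rarest combination, a dataset can look safely aggregated on average while still exposing its outliers.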
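And a minimal sketch of the differentially private query interface proposed in part (e), using the Laplace mechanism with a finite privacy budget. The class, parameter values, and budget accounting are illustrative assumptions, not a production design.

```python
import random

class PrivateCounts:
    """Toy epsilon-differentially-private counting interface.

    A count query has sensitivity 1 (adding or removing one user changes
    the count by at most 1), so Laplace noise with scale 1/eps gives
    eps-differential privacy for each query.
    """

    def __init__(self, rows, total_budget=1.0):
        self.rows = rows
        self.budget = total_budget  # finite privacy budget across all queries

    def count(self, predicate, eps=0.1):
        if eps > self.budget:
            raise RuntimeError("privacy budget exhausted")
        self.budget -= eps
        true_count = sum(1 for r in self.rows if predicate(r))
        # Laplace(0, 1/eps) noise, sampled as a difference of exponentials.
        noise = random.expovariate(eps) - random.expovariate(eps)
        return true_count + noise

# The pharma partner asks aggregate questions but never sees microdata:
db = PrivateCounts([{"med": "sertraline", "phq9": 19},
                    {"med": "sertraline", "phq9": 8}])
print(db.count(lambda r: r["med"] == "sertraline" and r["phq9"] >= 15))
```

Once the budget is exhausted the interface refuses further queries, which is what stops an analyst from averaging away the noise by asking the same question repeatedly.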

Scoring & Review Recommendations

| Score range | Assessment | Next steps |
| --- | --- | --- |
| Below 50% (< 14 pts) | Needs review | Re-read Sections 12.1-12.3, focusing on HIPAA scope, GINA, and BIPA |
| 50-69% (14-19 pts) | Partial understanding | Review the regulatory gaps and the case studies; redo Part B exercises |
| 70-85% (20-23 pts) | Solid understanding | Ready to proceed to Chapter 13; review any missed topics |
| Above 85% (24-28 pts) | Strong mastery | Proceed to Chapter 13: How Algorithms Shape Society |
| Section | Points available |
| --- | --- |
| Section 1: Multiple Choice | 10 points (10 questions × 1 pt) |
| Section 2: True/False with Justification | 5 points (5 questions × 1 pt) |
| Section 3: Short Answer | 8 points (4 questions × 2 pts) |
| Section 4: Applied Scenario | 5 points (5 parts × 1 pt) |
| **Total** | **28 points** |