Key Takeaways: Chapter 12 — Health Data, Genetic Data, and Biometric Privacy


Core Takeaways

  1. HIPAA provides important but incomplete protection for health data. The Health Insurance Portability and Accountability Act establishes a framework for protecting health information held by covered entities (healthcare providers, health plans, and clearinghouses) and their business associates. But HIPAA's scope is defined by who holds the data, not by what the data is. Identical health information — a heart rate reading, a medication log, a depression screening score — receives full HIPAA protection when held by a hospital and no sector-specific protection when held by a consumer app. This gap is growing as more health-related data is generated outside the traditional healthcare system.
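The holder-based scope described above can be captured in a toy model. This is an illustrative simplification, not a legal test: the entity categories and the `hipaa_protected` function are invented here to make the point that identical data receives different legal status depending on who holds it.

```python
# Toy model (illustration only, not legal advice): HIPAA coverage turns on
# who holds the data, not on what the data is.

COVERED_ENTITIES = {"healthcare provider", "health plan", "healthcare clearinghouse"}

def hipaa_protected(holder: str, is_business_associate: bool = False) -> bool:
    """Return True if this holder falls within HIPAA's scope."""
    return holder in COVERED_ENTITIES or is_business_associate

record = "depression screening score"  # identical data in both cases
print(f"{record} at a hospital -> HIPAA-protected: {hipaa_protected('healthcare provider')}")
print(f"{record} in a wellness app -> HIPAA-protected: {hipaa_protected('consumer wellness app')}")
```

The same record flips from protected to unprotected based solely on the first argument, which is the structural gap the takeaway describes.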

  2. The HIPAA gap is the defining challenge of digital health privacy. Fitness trackers, mental health apps, fertility tracking apps, wellness platforms, and telehealth startups collect some of the most sensitive health data imaginable — and most of them are not HIPAA-covered entities. The regulatory framework designed for the 1996 healthcare landscape has not kept pace with the 2026 digital health ecosystem. Users often assume that "health data is protected" without understanding that protection depends entirely on who holds it.

  3. Genetic data is categorically different from other personal data. DNA cannot be changed. It is shared with biological relatives who did not consent to its collection or disclosure. It reveals not only the individual's own health predispositions but those of their parents, children, siblings, and extended family. And it can be used for purposes far beyond its original collection — from law enforcement identification to insurance risk assessment to ancestry discovery. These properties mean that genetic data requires governance frameworks that go beyond individual consent.
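The claim that one person's disclosure reveals information about relatives can be made concrete with standard population-genetics expectations. The sketch below uses the textbook formula for expected identical-by-descent sharing (the `expected_shared` helper and the parameterization are ours, not the chapter's); the percentages themselves are well-established genetics figures.

```python
# Illustrative sketch: expected fraction of autosomal DNA shared identically
# by descent. Shows why one person's DNA test exposes non-consenting relatives.

def expected_shared(common_ancestors: int, meioses: int) -> float:
    """Expected IBD sharing given shared ancestors and total meioses on the path."""
    return common_ancestors * 0.5 ** meioses

relationships = {
    "parent/child":   expected_shared(1, 1),  # 50% (direct transmission)
    "full siblings":  expected_shared(2, 2),  # 50%
    "first cousins":  expected_shared(2, 4),  # 12.5%
    "second cousins": expected_shared(2, 6),  # ~3.1%
    "third cousins":  expected_shared(2, 8),  # ~0.8%, still detectable by matching tools
}

for name, frac in relationships.items():
    print(f"{name}: {frac:.2%} of autosomal DNA shared on average")
```

Even at the third-cousin level, sharing remains large enough for genealogy databases to detect, which is why a single voluntary test can implicate hundreds of relatives.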

  4. GINA protects against genetic discrimination but has critical gaps. The Genetic Information Nondiscrimination Act prohibits the use of genetic information in health insurance and employment decisions — a meaningful protection. But GINA does not cover life insurance, disability insurance, or long-term care insurance, leaving individuals exposed to discrimination in precisely the contexts where genetic information is most consequential. These gaps create a chilling effect on genetic testing, as individuals weigh the medical benefits of knowing their genetic risks against the financial risks of that knowledge being used against them.

  5. Direct-to-consumer genetic testing creates novel privacy challenges. Companies like 23andMe and AncestryDNA have tested tens of millions of people, creating the largest genetic databases in history. These databases sit outside traditional health data governance (they are not covered by HIPAA), are subject to corporate privacy policies that can change or be superseded by acquisition, and create consent externalities for genetic relatives. The question of what happens to genetic data when a DTC company is sold, goes bankrupt, or changes its policies remains insufficiently addressed by existing law.

  6. Biometric data poses irreversible privacy risks. Fingerprints, facial geometry, iris patterns, and voiceprints are permanently tied to the body. They cannot be changed if compromised. A stolen password can be reset; a stolen biometric template creates a lifelong vulnerability. This permanence is the foundation for the argument that biometric data requires stronger protection than other categories of personal data — and it is why BIPA's consent requirements and private right of action represent a significant advance in privacy law.

  7. Facial recognition accuracy varies dramatically by race and gender. The Gender Shades study and NIST evaluations have documented that commercial facial recognition systems have significantly higher error rates for darker-skinned individuals, particularly darker-skinned women. When these systems are deployed in law enforcement — as in the Robert Williams case — accuracy disparities translate directly into civil rights violations: wrongful identifications, wrongful arrests, and wrongful detentions that disproportionately affect Black Americans.
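Disparities of the kind documented above are measured by computing error rates separately per demographic group. The sketch below shows the shape of such an audit; the records and group labels are invented purely to demonstrate the computation, and the actual Gender Shades and NIST methodologies are far more rigorous.

```python
# Hypothetical audit sketch (invented data): per-group false match rates.
# Each record: (group, ground_truth_is_match, system_predicted_match).
records = [
    ("group A", False, False),
    ("group A", True,  True),
    ("group A", False, False),
    ("group B", False, True),   # false match
    ("group B", True,  True),
    ("group B", False, True),   # false match
    ("group B", False, False),
]

def false_match_rate(records, group):
    """Fraction of true non-matches the system wrongly declared a match."""
    preds = [pred for g, truth, pred in records if g == group and not truth]
    return sum(preds) / len(preds) if preds else 0.0

for g in sorted({g for g, _, _ in records}):
    print(f"{g}: false match rate = {false_match_rate(records, g):.0%}")
```

When the false match rate for one group is several times that of another, each deployment decision (match thresholds, investigative follow-up) inherits that disparity, which is how statistical error becomes civil rights harm.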

  8. The Robert Williams case reveals a governance failure, not merely a technology failure. Williams was arrested not only because the facial recognition system produced a false match but because the Detroit Police Department treated the system's output as an identification rather than an investigative lead, failed to conduct independent corroboration, and administered a flawed photo lineup. The case illustrates a pattern common to algorithmic decision-making across domains: automated outputs become de facto decisions, and human oversight becomes a rubber stamp.

  9. Biometric privacy law is fragmented and inadequate at the federal level. Illinois BIPA remains the strongest biometric privacy statute in the United States, with its informed consent requirements, ban on biometric data sales, and private right of action. A handful of other states have enacted weaker biometric privacy laws. There is no comprehensive federal biometric privacy law. This patchwork creates inconsistent protection depending on geography — a person's biometric privacy rights change when they cross state lines.

  10. Sensitive data categories are converging. Health data, genetic data, and biometric data increasingly overlap. A biometric scan can reveal health conditions. Genetic data predicts health outcomes. Health records contain biometric identifiers. This convergence challenges governance frameworks designed for separate data categories and argues for integrated, comprehensive approaches to sensitive data protection.


Key Concepts

HIPAA: The Health Insurance Portability and Accountability Act (1996), establishing privacy and security standards for protected health information held by covered entities and business associates.
Protected Health Information (PHI): Individually identifiable health information held or transmitted by a HIPAA-covered entity or business associate.
Covered entity: Under HIPAA, health plans, healthcare clearinghouses, and healthcare providers who conduct certain electronic transactions.
Business associate: An entity that performs functions or activities involving PHI on behalf of a HIPAA-covered entity.
GINA: The Genetic Information Nondiscrimination Act (2008), prohibiting genetic discrimination in health insurance and employment.
DTC genetic testing: Direct-to-consumer genetic testing services (23andMe, AncestryDNA) that provide genetic analysis directly to individuals without physician intermediation.
Familial DNA searching: An investigative technique using genetic databases to identify relatives of an unknown suspect through partial DNA matches.
GEDmatch: A publicly accessible genetic genealogy database where users voluntarily share DNA profiles for ancestry research; used by law enforcement in the Golden State Killer case.
Consent externality: A privacy cost imposed on individuals who did not consent to the data-sharing decision that affects them — particularly relevant for genetic data, where one person's disclosure reveals information about relatives.
Biometric data: Data derived from physical or behavioral characteristics that can be used to identify individuals: fingerprints, facial geometry, iris patterns, voiceprints, gait, typing patterns.
BIPA: The Illinois Biometric Information Privacy Act (2008), requiring informed consent before biometric data collection, prohibiting biometric data sales, and providing a private right of action.
Private right of action: A legal provision allowing individuals to bring lawsuits directly against violators, as opposed to relying solely on government enforcement.
Gender Shades study: Research by Joy Buolamwini and Timnit Gebru demonstrating significant accuracy disparities in commercial facial recognition systems across race and gender.
NIST FRVT: The National Institute of Standards and Technology's Face Recognition Vendor Test, evaluating demographic accuracy disparities across commercial facial recognition algorithms.
Investigative genetic genealogy (IGG): The use of consumer genetic databases and genealogical research to identify criminal suspects through their biological relatives' voluntarily shared DNA.

Key Debates

  1. Should HIPAA's scope be expanded to cover all health data? Expanding HIPAA to all entities holding health data would close the app-and-wearable gap but would impose significant compliance costs on startups, app developers, and technology companies. The tension between comprehensive protection and innovation incentives remains unresolved.

  2. Can facial recognition be made fair, or should it be banned? Some argue that the technology should be improved through better training data and accuracy standards. Others argue that even perfectly accurate facial recognition creates an infrastructure of ubiquitous face surveillance that is incompatible with civil liberties. The debate is not only about technical performance but about the kind of society we want to live in.

  3. How should law enforcement access to genetic databases be governed? The Golden State Killer case demonstrated the investigative power of genetic genealogy. But the technique implicates non-consenting relatives and could chill genetic testing participation. Should law enforcement be required to obtain a warrant? Should access be limited to violent crimes? Should genetic databases be required to offer meaningful opt-out for law enforcement searches?

  4. What happens to genetic and biometric data when companies fail? If 23andMe goes bankrupt, its database of 14+ million genetic profiles becomes a corporate asset. If a biometric security company is acquired, its fingerprint templates transfer to the new owner. Existing bankruptcy and acquisition law does not treat biological data as fundamentally different from other corporate assets — but perhaps it should.


Looking Ahead

Chapter 12 concludes Part 2 of this textbook — the deep dive into privacy in the digital age. We have examined what privacy is (Chapter 7), how surveillance operates (Chapter 8), the failures of consent (Chapter 9), technical privacy protections (Chapter 10), the economics of privacy (Chapter 11), and the special challenges of health, genetic, and biometric data (Chapter 12). Part 3, "Algorithmic Systems and AI Ethics," shifts focus from what data is collected to what is done with it — beginning with Chapter 13, "How Algorithms Shape Society," which examines how algorithmic decision-making systems sort, rank, recommend, and decide in ways that shape individual opportunities and social structures.


Use this summary as a study reference and a quick-access card for key vocabulary. The concepts of HIPAA scope, genetic consent externalities, biometric irreversibility, and facial recognition accuracy disparities will recur in later chapters on algorithmic bias (Chapter 14), fairness (Chapter 15), and sector-specific governance (Chapter 24).