> "The governance of data must be as specific as the harm it can cause. Financial data that can bankrupt a family, health data that can determine treatment, education data that can shape a child's future — each demands governance calibrated to its...
Learning Objectives
- Explain the rationale for sector-specific data governance and its relationship to general-purpose data protection law
- Describe the key financial data governance frameworks — Basel Committee, PCI-DSS, PSD2/open banking, and algorithmic trading regulation
- Analyze HIPAA's Privacy Rule, Security Rule, and Breach Notification Rule and their implications for health-tech companies
- Evaluate the European Health Data Space proposal and its significance for health data governance
- Describe FERPA's protections for student education records and the governance challenges posed by ed-tech
- Identify cross-sector patterns in data governance that apply regardless of industry
In This Chapter
- Chapter Overview
- 24.1 Why Sector-Specific Governance?
- 24.2 Financial Data Governance
- 24.3 Health Data Governance
- 24.4 Education Data Governance
- 24.5 Cross-Sector Patterns
- 24.6 VitraMed's Health Sector Compliance
- 24.7 NovaCorp's Financial Data Governance
- 24.8 Chapter Summary
- What's Next
- Chapter 24 Exercises → exercises.md
- Chapter 24 Quiz → quiz.md
- Case Study: Open Banking and Data Portability in Finance → case-study-01.md
- Case Study: Ed-Tech and Student Privacy: The Pandemic Acceleration → case-study-02.md
Chapter 24: Sector-Specific Governance: Finance, Health, Education
"The governance of data must be as specific as the harm it can cause. Financial data that can bankrupt a family, health data that can determine treatment, education data that can shape a child's future — each demands governance calibrated to its consequences." — Helen Nissenbaum, Privacy in Context
Chapter Overview
In Chapter 20, we surveyed the broad regulatory landscape — the general-purpose data protection laws like the GDPR and CCPA that apply across all sectors. In Chapter 22, we examined organizational data governance frameworks that any entity can adopt. This chapter turns to something different: the sector-specific governance regimes that exist because some data is so consequential that general-purpose rules are not enough.
Financial data determines who can borrow money, at what rate, and under what conditions. Health data shapes medical treatment, insurance coverage, and, increasingly, algorithmic predictions about who will get sick. Education data records what a child has learned, how they behaved, and what they struggled with — creating a permanent record that can follow them through life.
Each of these sectors has developed specialized governance frameworks — sometimes layered on top of general data protection law, sometimes preceding it by decades. Understanding these sector-specific regimes is essential for practitioners in any field, because the principles they embody — fiduciary duty, confidentiality, informed consent, minimum necessary access — are universal even when their implementation is sector-specific.
This chapter examines governance in three critical sectors and draws out the cross-sector patterns they reveal.
In this chapter, you will learn to:
- Navigate the major data governance frameworks in finance, health, and education
- Analyze how sector-specific governance interacts with general-purpose data protection law
- Evaluate the distinctive governance challenges of each sector
- Apply cross-sector governance principles to organizations operating in regulated industries
24.1 Why Sector-Specific Governance?
24.1.1 The Limits of General-Purpose Regulation
General-purpose data protection laws like the GDPR establish baseline principles — consent, purpose limitation, data minimization, security — that apply to all personal data processing. But some data processing contexts involve:
- Unique harm profiles: A breached medical record can lead to discrimination, stigma, or denial of insurance. A breached financial record can lead to identity theft and financial ruin. The consequences differ qualitatively from, say, the exposure of a shopping preference.
- Pre-existing regulatory infrastructure: Finance and health were regulated long before data protection law existed. These sectors have established regulators, compliance cultures, and industry-specific standards that data protection law must integrate with, not replace.
- Technical complexity: Financial transaction processing, clinical data interoperability, and educational record management involve domain-specific technical standards that general regulators may lack the expertise to address.
- Power dynamics specific to the sector: The relationships between a doctor and a patient, a bank and a borrower, and a school and a child are each distinctive, with particular asymmetries and duties of care.
24.1.2 Layered Governance
In most jurisdictions, sector-specific governance operates as a layer on top of general-purpose data protection law, not as a replacement. A US health-tech company must comply with HIPAA and applicable state privacy laws. A European fintech must comply with PSD2, PCI-DSS, and the GDPR.
This layering creates complexity but also depth. General-purpose law provides the floor; sector-specific governance raises it where the stakes are highest.
Connection: The US sectoral model (Chapter 20) is effectively built entirely from these sector-specific layers, without a general-purpose floor beneath them. The GDPR model adds sector-specific governance on top of a comprehensive foundation. Both approaches have strengths, but the absence of a US general-purpose floor means that data falling outside any sector-specific statute may have no protection at all.
24.2 Financial Data Governance
24.2.1 The Stakes
Financial data governance exists because money is power, and the data that determines access to money is therefore a determinant of power. A credit score, a risk assessment, a fraud detection flag — each of these data-driven outputs can determine whether a person can buy a home, start a business, or even open a bank account.
The financial sector has been a pioneer in data governance — partly because regulators recognized the consequences of data mismanagement early, and partly because financial crises (2008, in particular) demonstrated what happens when data systems fail at systemic scale.
24.2.2 The Basel Committee and Regulatory Capital
The Basel Committee on Banking Supervision — an international body of central bank representatives — has established data governance standards as part of its broader regulatory framework for banks.
BCBS 239 (Principles for Effective Risk Data Aggregation and Risk Reporting, 2013) established fourteen principles, including requirements for:
- Governance: Banks must establish clear data governance frameworks, with roles and responsibilities defined at the board level
- Data architecture and IT infrastructure: Banks must maintain data systems capable of accurate, complete, and timely risk data aggregation
- Accuracy and integrity: Risk data must be accurate and complete, with reconciliation processes between different data sources
- Completeness: Risk data must cover all material risks and be available at the firm-wide, business-line, and entity levels
- Timeliness: Data must be available quickly enough to support risk management decisions, including during crisis periods
- Adaptability: Systems must be able to produce data on an ad hoc basis and in response to emerging risks
BCBS 239 arose from a specific failure: during the 2008 financial crisis, many banks could not aggregate their risk exposures quickly enough to understand their own financial positions. The data existed — spread across dozens of systems in different formats with different definitions — but it could not be combined, reconciled, and analyzed in time to prevent catastrophic losses.
Ray Zhao had direct experience with BCBS 239 compliance: "When I was at a larger bank before NovaCorp, we spent three years and $40 million implementing BCBS 239. The regulatory requirement forced us to do what we should have been doing anyway — building a coherent data architecture, defining critical data elements, measuring data quality. The regulation was the catalyst. Good governance was the outcome."
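Ray's point about "measuring data quality" can be made concrete with a small sketch. The Python below reconciles exposure totals reported for the same counterparties by two hypothetical source systems and flags breaks beyond a tolerance; the system names, figures, and 1% threshold are invented for illustration and are not prescribed by BCBS 239.

```python
# Minimal sketch: reconcile risk exposures reported by two hypothetical
# source systems for the same counterparties (illustrative only).
TOLERANCE = 0.01  # assumed materiality threshold: 1% relative difference

# Hypothetical extracts keyed by counterparty ID; figures are invented.
loan_system = {"CP-001": 12_500_000, "CP-002": 4_200_000, "CP-003": 890_000}
risk_warehouse = {"CP-001": 12_500_000, "CP-002": 3_950_000, "CP-004": 1_100_000}

def reconcile(source_a: dict, source_b: dict) -> list[dict]:
    """Return reconciliation breaks: missing counterparties or amount mismatches."""
    breaks = []
    for cp in sorted(set(source_a) | set(source_b)):
        a, b = source_a.get(cp), source_b.get(cp)
        if a is None or b is None:
            breaks.append({"counterparty": cp, "issue": "missing in one source",
                           "loan_system": a, "risk_warehouse": b})
        elif abs(a - b) / max(abs(a), abs(b)) > TOLERANCE:
            breaks.append({"counterparty": cp, "issue": "amount mismatch",
                           "loan_system": a, "risk_warehouse": b})
    return breaks

for item in reconcile(loan_system, risk_warehouse):
    print(item)
```

A real aggregation program runs checks like this across hundreds of critical data elements on a schedule, with thresholds and escalation paths set by the governance body rather than hard-coded.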
24.2.3 PCI-DSS: Payment Card Security
The Payment Card Industry Data Security Standard (PCI-DSS) is an industry-developed standard (maintained by the PCI Security Standards Council, founded by Visa, Mastercard, American Express, Discover, and JCB) that applies to any organization that stores, processes, or transmits payment card data.
PCI-DSS specifies twelve requirements across six categories:
| Category | Requirements |
|---|---|
| Build and maintain a secure network | Install and maintain a firewall; change vendor-supplied defaults for system passwords |
| Protect cardholder data | Protect stored cardholder data; encrypt transmission across open networks |
| Maintain a vulnerability management program | Protect against malware; develop and maintain secure systems |
| Implement strong access controls | Restrict access on a need-to-know basis; identify and authenticate access; restrict physical access |
| Monitor and test networks | Track and monitor all access to network resources and cardholder data; regularly test security systems |
| Maintain a security policy | Maintain an information security policy |
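The "protect cardholder data" requirements above translate, in day-to-day engineering, into habits such as never keeping a full primary account number (PAN) where a truncated form will do. The sketch below shows one such habit, PAN truncation to the first six and last four digits; it is an illustration only and omits the encryption, tokenization, and key-management obligations that full PCI-DSS compliance involves.

```python
# Minimal sketch: truncate a primary account number (PAN) so that only
# the first six and last four digits survive, as is common for display.
# Illustrative only; real PCI-DSS scope also covers encryption at rest,
# tokenization, and key management, none of which appear here.
def truncate_pan(pan: str) -> str:
    digits = "".join(ch for ch in pan if ch.isdigit())
    if len(digits) < 13:
        raise ValueError("PAN appears too short to be a card number")
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]

print(truncate_pan("4111 1111 1111 1111"))  # 411111******1111
```

Even this much reduces the value of a stolen database: under the standard's masking rule, at most the first six and last four digits may ordinarily be displayed.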
PCI-DSS is notable as an example of co-regulation (Chapter 20): it was developed by the private sector but enforced through contractual relationships — banks require merchants to comply with PCI-DSS as a condition of accepting card payments. Non-compliance can result in fines, increased processing fees, or loss of the ability to accept cards.
Accountability Gap: PCI-DSS illustrates both the strengths and weaknesses of industry self-regulation. On the positive side, the standard is technically detailed, regularly updated, and enforced through economic incentives. On the negative side, compliance is assessed through self-assessment questionnaires for most small merchants, and the standard has been criticized for focusing on perimeter security rather than the entire data lifecycle. Major breaches of cardholder and customer data — Target (2013), Home Depot (2014), Capital One (2019) — have occurred at organizations that had been assessed as PCI-DSS compliant.
24.2.4 Open Banking: PSD2 and Data Portability
The EU's revised Payment Services Directive (PSD2, adopted in 2015 and applicable from 2018) and its proposed successor PSD3 represent a fundamentally different approach to financial data governance: requiring banks to open their data to authorized third parties.
Under PSD2:
- Banks must provide access to customer account data to regulated third-party providers (with customer consent)
- Third-party providers can initiate payments on behalf of customers (with customer consent)
- Strong customer authentication (SCA) is required for electronic payments
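Third-party access under PSD2 rests on the idea that a customer's consent authorizes a specific provider to read specific data for a limited time. The sketch below models that check in simplified form; the field names, scope strings, and 90-day validity window are assumptions for illustration, not PSD2's or any bank API standard's actual schema.

```python
# Minimal, hypothetical model of a consent-scoped data access check under
# an open-banking style API. Field names and the 90-day validity window
# are illustrative assumptions, not PSD2's or any bank API's actual schema.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Consent:
    customer_id: str
    provider_id: str          # the regulated third-party provider (TPP)
    scopes: frozenset         # e.g. {"accounts:read", "transactions:read"}
    granted_at: datetime
    valid_for: timedelta = timedelta(days=90)
    revoked: bool = False

def access_allowed(consent: Consent, provider_id: str, scope: str,
                   now: datetime | None = None) -> bool:
    """Allow access only for the named provider, within scope and validity."""
    now = now or datetime.now(timezone.utc)
    return (not consent.revoked
            and consent.provider_id == provider_id
            and scope in consent.scopes
            and now < consent.granted_at + consent.valid_for)

consent = Consent("cust-42", "tpp-budget-app",
                  frozenset({"accounts:read"}),
                  granted_at=datetime.now(timezone.utc))
print(access_allowed(consent, "tpp-budget-app", "accounts:read"))      # True
print(access_allowed(consent, "tpp-budget-app", "transactions:read"))  # False
```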
Open banking transforms the power dynamics of financial data. Historically, banks controlled customer financial data as a competitive asset. PSD2 reconceptualizes that data as belonging to the customer, who can authorize third parties to access and use it.
The promises:
- Increased competition and innovation in financial services
- Better products for consumers (e.g., apps that aggregate accounts across multiple banks, automated savings tools, price comparison services)
- Reduced switching costs — customers can more easily move between banks when their data is portable
The concerns:
- Consent complexity. Customers must understand and manage which third parties have access to their financial data — adding to consent fatigue (Chapter 9)
- Security risks. More parties with access to data means a larger attack surface
- Inequality. Sophisticated consumers may benefit from open banking; less tech-savvy consumers may face new risks without understanding them
Consent Fiction: Open banking simultaneously challenges and reinforces the Consent Fiction. It challenges it by giving consumers genuine control — the ability to direct their data to providers of their choice. But it reinforces it by requiring consumers to manage increasingly complex consent relationships, often through interfaces designed to make sharing easy and opting out difficult.
24.2.5 Algorithmic Trading Regulation
Financial markets have become increasingly algorithmic. High-frequency trading algorithms execute trades in microseconds, market-making algorithms provide liquidity, and portfolio management algorithms make investment decisions.
The governance challenge is significant:
- Speed outpaces oversight. Algorithmic trades execute faster than human oversight is possible. The "Flash Crash" of May 6, 2010, in which the Dow Jones lost nearly 1,000 points in minutes, demonstrated that algorithmic systems can cascade into systemic events.
- Opacity. Complex trading algorithms are difficult for regulators (and sometimes even their creators) to audit.
- Fairness. Algorithmic trading advantages firms with the fastest technology and the most data, potentially disadvantaging retail investors.
The EU's Markets in Financial Instruments Directive (MiFID II) and the US SEC's Regulation SCI address algorithmic trading governance through requirements for testing, circuit breakers, and regulatory reporting.
NovaCorp's trading desk operated within this framework, and Ray described the practical implications: "Every algorithm we deploy must be tested against stress scenarios, approved by compliance, logged for audit, and subject to kill switches. If an algorithm behaves unexpectedly, we can halt it immediately. That's governance with teeth — and in financial markets, the teeth are necessary."
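Ray's "kill switch" is, at its core, a simple control: a monitor that compares an algorithm's behavior against pre-approved limits and halts it when a limit is breached. The toy sketch below enforces two assumed limits (position size and orders per minute); production controls under MiFID II or Regulation SCI are considerably more elaborate.

```python
# Toy illustration of a pre-trade kill switch: halt an algorithm when it
# breaches assumed limits on position size or order rate. Real controls
# under MiFID II / Regulation SCI are far more elaborate than this.
import time

MAX_POSITION = 1_000_000      # assumed notional limit
MAX_ORDERS_PER_MINUTE = 500   # assumed message-rate limit

class KillSwitch:
    def __init__(self):
        self.halted = False
        self.position = 0.0
        self.order_times: list[float] = []

    def check_order(self, notional: float) -> bool:
        """Return True if the order may proceed; halt the algorithm otherwise."""
        now = time.monotonic()
        self.order_times = [t for t in self.order_times if now - t < 60]
        if self.halted:
            return False
        if abs(self.position + notional) > MAX_POSITION:
            self.halted = True          # position limit breached
            return False
        if len(self.order_times) + 1 > MAX_ORDERS_PER_MINUTE:
            self.halted = True          # order-rate limit breached
            return False
        self.order_times.append(now)
        self.position += notional
        return True

switch = KillSwitch()
print(switch.check_order(600_000))   # True
print(switch.check_order(500_000))   # False: would breach position limit, halts
print(switch.check_order(1_000))     # False: algorithm is already halted
```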
24.3 Health Data Governance
24.3.1 The Unique Sensitivity of Health Data
Health data occupies a special category in every major data protection framework. The GDPR classifies it as a "special category" requiring enhanced protection. HIPAA imposes sector-specific requirements. Even countries without comprehensive data protection laws often have health data legislation.
Why this heightened protection?
- Intimacy. Health data reveals some of the most private aspects of human life — mental health diagnoses, reproductive health, substance use, genetic predispositions, sexual health.
- Discrimination risk. Health information can be used to discriminate in employment, insurance, housing, and social relationships.
- Power asymmetry. Patients in a clinical setting are vulnerable. They cannot meaningfully "consent" to data practices when refusing might affect their care.
- Permanence. Unlike a credit card number that can be changed after a breach, health data — a diagnosis, a genetic trait — is permanent. Once exposed, it cannot be un-exposed.
24.3.2 HIPAA in Depth
The Health Insurance Portability and Accountability Act (1996) is the foundational US health data regulation. Despite its age, HIPAA remains the primary federal framework governing health information privacy and security.
Covered entities: HIPAA applies to health plans, healthcare clearinghouses, and healthcare providers who transmit health information electronically. It extends to business associates — organizations that perform functions involving protected health information (PHI) on behalf of covered entities. VitraMed, as an EHR provider, is a business associate to the clinics it serves.
The Privacy Rule establishes national standards for the protection of PHI:
- Patients have the right to access their health records
- Patients have the right to request corrections to inaccurate records
- Covered entities must provide notice of privacy practices
- PHI can be used or disclosed for treatment, payment, and healthcare operations without individual authorization
- Most other uses require individual authorization
- The minimum necessary standard requires entities to limit PHI access to the minimum needed for a particular purpose
The Security Rule establishes requirements for protecting electronic PHI (ePHI):
- Administrative safeguards: Security management processes, workforce training, contingency plans
- Physical safeguards: Facility access controls, workstation use policies, device and media controls
- Technical safeguards: Access controls, audit controls, integrity controls, transmission security
The Breach Notification Rule requires notification to affected individuals, the Department of Health and Human Services, and (for large breaches) the media when unsecured PHI is breached.
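Operationally, the Privacy Rule's minimum necessary standard and the Security Rule's access controls converge on one question: does this person need this part of this record for this purpose? The sketch below is a deliberately simplified policy check; the roles, purposes, and field groupings are illustrative assumptions, not categories defined by HIPAA.

```python
# Simplified, hypothetical "minimum necessary" check: which fields of a
# patient record a role may see for a stated purpose. Roles, purposes, and
# field groupings here are illustrative assumptions, not HIPAA categories.
ALLOWED_FIELDS = {
    ("treating_clinician", "treatment"): {"demographics", "diagnoses",
                                          "medications", "lab_results"},
    ("billing_clerk", "payment"):        {"demographics", "billing_codes"},
    ("data_analyst", "operations"):      {"diagnoses", "lab_results"},  # no direct identifiers
}

def minimum_necessary(record: dict, role: str, purpose: str) -> dict:
    """Return only the record fields permitted for this role and purpose."""
    allowed = ALLOWED_FIELDS.get((role, purpose), set())
    return {field: value for field, value in record.items() if field in allowed}

record = {"demographics": {"name": "J. Doe", "dob": "1980-01-01"},
          "diagnoses": ["E11.9"], "medications": ["metformin"],
          "lab_results": {"hba1c": 7.2}, "billing_codes": ["99213"]}

print(minimum_necessary(record, "billing_clerk", "payment").keys())
# dict_keys(['demographics', 'billing_codes'])
```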
24.3.3 HIPAA's Limitations
Despite its importance, HIPAA has significant gaps:
Scope limitations. HIPAA applies only to covered entities and their business associates. It does not apply to:
- Health apps (fitness trackers, period trackers, mental health apps) that are not connected to covered entities
- Data brokers who purchase and sell health-related information
- Employers who collect health data through wellness programs
- Social media platforms where people voluntarily share health information
This means that a patient's official medical record at a hospital is protected by HIPAA, but the same patient's health data on a fitness app, in a search history, or in a data broker's database may have no federal protection at all.
The consent gap. HIPAA permits the use and disclosure of PHI for "treatment, payment, and healthcare operations" without individual authorization. This is practically necessary — a doctor needs to access a patient's record without obtaining a separate authorization each time — but it creates a broad category of permissible uses that patients may not understand or anticipate.
De-identification as escape valve. HIPAA permits unrestricted use and disclosure of "de-identified" health data. But as we examined in Chapter 10, de-identification is an increasingly fragile protection. Health datasets that are technically de-identified under HIPAA's standards can often be re-identified through linkage with other data sources.
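The fragility of de-identification is easiest to see in a linkage attack: joining a "de-identified" dataset to an outside dataset on quasi-identifiers such as ZIP code, birth year, and sex. The sketch below uses entirely fabricated records to show the mechanics discussed in Chapter 10; it is not a real re-identification.

```python
# Illustrative linkage attack on invented data: a "de-identified" health
# extract is joined to a public-style roster on quasi-identifiers.
# All records are fabricated; this shows the mechanics only.
deidentified_health = [
    {"zip3": "481", "birth_year": 1987, "sex": "F", "diagnosis": "F32.1"},
    {"zip3": "482", "birth_year": 1954, "sex": "M", "diagnosis": "I10"},
]

public_roster = [  # e.g. a voter-file-like source with names attached
    {"name": "A. Rivera", "zip3": "481", "birth_year": 1987, "sex": "F"},
    {"name": "B. Okafor", "zip3": "482", "birth_year": 1954, "sex": "M"},
    {"name": "C. Lindqvist", "zip3": "481", "birth_year": 1990, "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip3", "birth_year", "sex")

def link(health_rows, roster_rows):
    """Re-attach names where quasi-identifiers match exactly one roster entry."""
    for h in health_rows:
        key = tuple(h[q] for q in QUASI_IDENTIFIERS)
        matches = [r for r in roster_rows
                   if tuple(r[q] for q in QUASI_IDENTIFIERS) == key]
        if len(matches) == 1:                     # unique match => re-identified
            yield {"name": matches[0]["name"], "diagnosis": h["diagnosis"]}

print(list(link(deidentified_health, public_roster)))
```

The point is not the code but the join: any additional dataset that shares the same quasi-identifiers can turn "de-identified" into "identifiable."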
Mira encountered these limitations directly through VitraMed: "Dad thinks HIPAA compliance means we're handling patient data responsibly. And within HIPAA's scope, we are. But HIPAA was written in 1996. It doesn't address machine learning models trained on patient data, it doesn't address algorithmic risk scoring, and it doesn't address the re-identification risk from combining de-identified datasets. HIPAA is the floor, not the ceiling — and the floor was built for a world that no longer exists."
24.3.4 The European Health Data Space
The EU's proposed European Health Data Space (EHDS) represents the most ambitious attempt to create a comprehensive health data governance framework.
Primary use: The EHDS would give patients the right to access their health data in digital format, to share it across borders within the EU, and to restrict specific health professionals' access to that data.
Secondary use: Crucially, the EHDS also creates a framework for the "secondary use" of health data — use for research, innovation, public health, and policymaking. A new category of authorized bodies ("health data access bodies") would evaluate and grant access to health data for secondary uses, with strict conditions on data security, data minimization, and prohibited uses.
Prohibited secondary uses:
- Decisions detrimental to individuals based on their health data
- Advertising or marketing
- Increasing insurance premiums
- Developing products or services that could be harmful
- Making decisions about employment based on health data
Implications for VitraMed: If VitraMed expands to the EU, the EHDS would layer additional requirements on top of the GDPR and AI Act. VitraMed's predictive analytics — which constitute a secondary use of health data — would need to be authorized through a health data access body, with patients having the right to opt out.
Power Asymmetry: The EHDS attempts to rebalance the power asymmetry in health data governance by giving patients meaningful control over both primary and secondary uses of their data. But critics note that the opt-out mechanism for secondary use (rather than opt-in) may not adequately protect patients who are unaware of how their data is being used for research and innovation.
24.3.5 Clinical Trial Data Governance
Clinical trials generate enormous volumes of sensitive health data under conditions of particular governance intensity. The governance framework includes:
- Institutional Review Boards (IRBs) / Ethics Committees must approve all research protocols before data collection begins
- Informed consent must be specific, voluntary, and documented — a standard far more rigorous than commercial consent
- Good Clinical Practice (GCP) standards govern data collection, management, and reporting
- Data integrity requirements ensure that clinical trial data is attributable, legible, contemporaneous, original, and accurate (ALCOA principles)
The clinical trial data governance model is often held up as a standard that commercial health-tech should aspire to. As Dr. Adeyemi observed: "Clinical trials demonstrate that rigorous data governance is achievable even for highly sensitive health data. The fact that commercial health-tech companies operate with far less rigorous governance is not a technical limitation — it's a choice."
24.4 Education Data Governance
24.4.1 The Distinctive Vulnerability of Student Data
Education data governance operates in a context of particular vulnerability: the data subjects are often children and adolescents, the power asymmetry between institutions and students (and their parents) is extreme, and the data — academic records, behavioral observations, disciplinary actions, learning difficulties — can shape a young person's trajectory for years.
24.4.2 FERPA: The Foundation
The Family Educational Rights and Privacy Act (1974) is the foundational US education data protection law. FERPA applies to educational institutions that receive funds from programs administered by the US Department of Education — which in practice includes virtually all public schools and most colleges and universities.
Key provisions:
- Parent/student rights: Parents (and students over 18) have the right to inspect education records, request corrections, and consent to disclosures
- Directory information exception: "Directory information" (name, address, phone number, enrollment dates, degrees received) can be disclosed without consent unless the student opts out
- Legitimate educational interest exception: Education records can be disclosed to school officials with a "legitimate educational interest" — a broad exception that institutions define internally
- Research exception: Education records can be disclosed for research purposes if certain conditions are met (including de-identification)
24.4.3 FERPA's Gaps in the Digital Age
FERPA was enacted in 1974 — before personal computers, before the internet, before learning management systems, before algorithmic student success predictions. Its gaps in the modern educational context are significant:
Ed-tech data. When a school adopts a third-party learning platform, students' interaction data — every click, every pause, every question answered — flows to the platform provider. FERPA's "school official" exception can be used to share data with ed-tech vendors designated as performing institutional functions, but the scope of data collected far exceeds what FERPA's drafters contemplated.
Learning analytics. Many institutions now use algorithmic systems to predict student success, flag at-risk students, and recommend interventions. These systems process education records at scale — generating predictions that can influence advising, financial aid, and even course enrollment. FERPA does not specifically address algorithmic decision-making about students.
The pandemic acceleration. The COVID-19 pandemic forced a rapid shift to remote learning, dramatically expanding the volume and sensitivity of education data flowing to third-party platforms. Proctoring software monitored students' faces, eyes, and environments during exams. Virtual classroom platforms recorded lectures and student participation. The governance infrastructure was not prepared for this expansion.
Consent Fiction: FERPA's consent model was designed for a paper-records world in which parents could review a physical file in the school office. In the digital age, education data flows through dozens of systems maintained by third-party vendors, many of which parents have never heard of. The formal consent structure exists; the practical ability to understand and control data flows does not.
24.4.4 Learning Analytics Governance
Learning analytics — the measurement, collection, analysis, and reporting of data about learners and their contexts — presents a governance challenge that sits at the intersection of education data, algorithmic decision-making, and student welfare.
The promise: Early warning systems can identify students at risk of dropping out and connect them with support services. Adaptive learning platforms can personalize instruction. Institutional research can identify systemic barriers to student success.
The concerns:
- Profiling and labeling. Algorithmic predictions can create self-fulfilling prophecies — a student labeled "at risk" may receive different treatment (less challenging coursework, more intrusive monitoring) that reinforces the prediction.
- Bias. Predictive models trained on historical data may encode historical disparities — predicting lower success for students from underrepresented backgrounds not because of individual characteristics but because of systemic barriers reflected in the training data.
- Consent and autonomy. Students rarely consent to being analyzed algorithmically. The power asymmetry between institution and student makes meaningful consent difficult even when it is formally sought.
- Purpose creep. Data collected for academic support may be repurposed for marketing, retention metrics, or institutional rankings.
Mira's work in the university's Office of Institutional Research made these concerns tangible: "We built a model to predict which first-year students were likely to struggle. The model was accurate — but its strongest predictive features were family income and parents' education level. We were essentially using a machine learning model to tell us that poor students and first-generation students face challenges. We didn't need an algorithm for that. What we needed was more financial aid and better advising."
"And the algorithm gave the institution something to point to," Eli added. "'We're using data to help students.' It sounds progressive. But if the intervention is monitoring rather than support, the algorithm becomes a tool for surveillance, not service."
24.5 Cross-Sector Patterns
Despite their differences, sector-specific governance frameworks reveal common patterns that illuminate general principles of data governance:
24.5.1 The Fiduciary Principle
In finance, health, and education, the entity handling data occupies a position of trust relative to the data subject. A bank's relationship with a depositor, a doctor's relationship with a patient, a school's relationship with a student — all involve an asymmetry of power, knowledge, and vulnerability that creates fiduciary-like obligations.
This principle has led legal scholars like Jack Balkin to propose the concept of information fiduciaries — organizations that, because of their access to personal data and the trust placed in them, should owe fiduciary duties (loyalty and care) to their data subjects.
24.5.2 The Minimum Necessary Principle
Each sector has developed a version of the minimum necessary principle:
- HIPAA: The minimum necessary standard limits PHI access to what is needed for a particular purpose
- Financial regulation: Need-to-know access controls are required by PCI-DSS and reinforced by banking regulators
- Education: FERPA's legitimate educational interest exception is, in principle, a minimum necessary standard
This principle — that access to sensitive data should be limited to what is necessary for a defined purpose — is universal, though its implementation varies by sector.
24.5.3 Breach Notification
All three sectors require some form of breach notification:
- HIPAA mandates notification to individuals, HHS, and (for large breaches) media
- Financial regulators (OCC, SEC, state regulators) require breach notification
- State student privacy laws and some FERPA interpretations require breach notification for education records
The trend is toward faster, more comprehensive notification — reflecting a recognition that timely information about breaches is essential for individuals to protect themselves.
24.5.4 The Challenge of Emerging Technology
Each sector is struggling to govern data practices that did not exist when its foundational regulatory framework was created:
| Sector | Foundational Law | Year | Emerging Challenge |
|---|---|---|---|
| Finance | GLBA | 1999 | Algorithmic trading, cryptocurrency, DeFi |
| Health | HIPAA | 1996 | Health apps, ML-based diagnostics, genetic data markets |
| Education | FERPA | 1974 | Learning analytics, proctoring software, ed-tech surveillance |
This pattern suggests a structural tension in sector-specific governance: foundational laws provide stability and predictability, but they struggle to keep pace with technological change. The result is a growing gap between what the law governs and what technology enables — a gap filled, inadequately, by guidance documents, enforcement actions, and industry self-regulation.
24.5.5 Regulatory Arbitrage
The layered, sector-specific nature of data governance creates opportunities for regulatory arbitrage — structuring activities to fall under the jurisdiction with the weakest requirements.
- A health app that avoids becoming a HIPAA business associate can process health data with no federal health privacy protection
- A financial services company that structures its AI system to avoid classification as a "consumer report" under FCRA can escape that statute's accuracy, dispute, and adverse-action requirements
- An ed-tech platform that processes learning data as "directory information" can avoid FERPA's consent requirements
Accountability Gap: Regulatory arbitrage is one of the most consequential manifestations of the Accountability Gap. When entities can structure their activities to avoid the governance framework designed to address the harms their activities create, accountability breaks down — not through violation of the law but through its circumvention. Closing this gap requires regulators to think in terms of harm rather than category — regulating what data practices do, not just what entities are.
24.6 VitraMed's Health Sector Compliance
24.6.1 The Compliance Landscape
VitraMed, as a health-tech company operating as a HIPAA business associate and considering EU expansion, faces a layered compliance landscape:
| Requirement | Source | Status |
|---|---|---|
| HIPAA Privacy Rule | US federal | Compliant — BAAs with all clinic clients |
| HIPAA Security Rule | US federal | Compliant — annual risk assessments conducted |
| HIPAA Breach Notification | US federal | Compliant — incident response plan in place |
| State health privacy laws | Various US states | Partially compliant — some states have requirements exceeding HIPAA |
| GDPR (if EU expansion) | EU | In progress — data architecture redesign underway |
| EU AI Act (if EU expansion) | EU | In progress — risk scoring system requires conformity assessment |
| EHDS (if adopted and VitraMed operates in EU) | EU | Not yet applicable — monitoring developments |
24.6.2 The HIPAA Audit
In the governance timeline, VitraMed faced its first HIPAA audit in the Part 4 period. The audit — conducted by a third-party firm under an HHS Office for Civil Rights initiative — examined VitraMed's compliance across all three HIPAA rules.
Findings included:
- Privacy Rule: Substantially compliant, with one finding: VitraMed's notice of privacy practices had not been updated to reflect the addition of predictive analytics capabilities. Patients were not informed that their data would be used for risk scoring.
- Security Rule: Substantially compliant, with two findings: one workstation lacked automatic screen lock, and the company's risk assessment had not been updated after migrating to a new cloud infrastructure configuration.
- Breach Notification: Compliant. No breaches had occurred, and the incident response plan was documented and tested.
The audit findings were minor but revelatory. The privacy notice issue — failing to inform patients about predictive analytics — connected directly to the Consent Fiction. VitraMed's consent forms, written when the company was a simple EHR tool, did not reflect what the company had become.
"We started as a digital filing cabinet for patient records," Vikram acknowledged to Mira. "Now we're a predictive analytics platform. But our consent forms still describe the filing cabinet."
24.7 NovaCorp's Financial Data Governance
24.7.1 Ray Zhao's Regulatory Stack
Ray Zhao's data governance program at NovaCorp had to navigate a complex regulatory stack:
- Basel Committee/BCBS 239: Risk data aggregation and reporting for banking regulators
- PCI-DSS: Payment card data security for the company's payment processing operations
- SOX (Sarbanes-Oxley): Financial reporting data integrity requirements
- SEC regulations: Regulatory reporting, algorithmic trading oversight, and customer data protection
- GLBA: Customer financial data privacy (Safeguards Rule)
- Fair Lending Laws: Anti-discrimination requirements for credit decision models
- State regulations: Various state-level financial data protection requirements
"Every regulation has its own definitions, its own reporting requirements, its own audit cycle," Ray told the class. "BCBS 239 defines 'critical data elements' one way. SOX defines 'financial reporting data' another way. PCI-DSS has its own scope definition. My team has to translate between frameworks constantly."
24.7.2 The Integration Challenge
Ray's key insight was that these frameworks, despite their different vocabularies and scopes, were asking similar questions:
- Is the data accurate?
- Is it accessible to those who need it and inaccessible to those who don't?
- Can we trace its lineage from source to report?
- Do we have governance structures with appropriate authority?
- Are we measuring and monitoring continuously?
Rather than building separate compliance programs for each regulation, Ray built a unified data governance program (Chapter 22) that addressed the common requirements, with sector-specific modules for the areas where frameworks diverged.
"Integration is the key," Ray said. "If you build separate compliance silos for each regulation, you end up with duplicated effort, inconsistent standards, and gaps between the silos. If you build one governance program and map each regulation to it, you get efficiency, consistency, and coverage."
Real-World Application: Ray's integration approach illustrates a best practice: rather than treating each regulatory requirement as a standalone project, build a governance foundation that addresses common principles (quality, security, lineage, access control, documentation) and map specific regulatory requirements to that foundation. This is more efficient, more sustainable, and produces better outcomes.
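One crude way to see the value of Ray's approach is to represent each governance control once and attach to it the regulations that rely on it. The mapping below is invented for illustration; a real control library would be far larger and maintained with legal and compliance review.

```python
# Invented illustration of a unified control library mapped to regulations.
# Control names and mappings are hypothetical; a real mapping would be far
# larger and maintained with legal and compliance review.
CONTROLS = {
    "data-lineage-documented":   {"BCBS 239", "SOX"},
    "access-on-need-to-know":    {"PCI-DSS", "GLBA", "BCBS 239"},
    "quality-metrics-monitored": {"BCBS 239", "SOX"},
    "encryption-in-transit":     {"PCI-DSS", "GLBA"},
    "model-change-approval":     {"SEC", "Fair Lending"},
}

def coverage_by_regulation(controls: dict) -> dict:
    """Invert the mapping: which controls does each regulation rely on?"""
    out: dict[str, set] = {}
    for control, regs in controls.items():
        for reg in regs:
            out.setdefault(reg, set()).add(control)
    return out

for reg, ctrls in sorted(coverage_by_regulation(CONTROLS).items()):
    print(f"{reg:12s} -> {sorted(ctrls)}")
```

Implementing each control once and inverting the mapping for each audit is the practical payoff Ray describes: one governance program, many regulatory views of it.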
24.8 Chapter Summary
Key Concepts
- Sector-specific governance exists because some data categories — financial, health, educational — carry risks that general-purpose data protection law alone cannot address.
- Financial data governance includes Basel Committee standards (BCBS 239), PCI-DSS, open banking regulation (PSD2), and algorithmic trading requirements.
- HIPAA provides the US foundational framework for health data through Privacy, Security, and Breach Notification Rules — but its scope excludes health apps, data brokers, and many modern health-tech practices.
- The European Health Data Space proposes a comprehensive framework for both primary and secondary health data use.
- FERPA protects student education records but was designed before ed-tech, learning analytics, and algorithmic student profiling.
- Cross-sector patterns — fiduciary principles, minimum necessary access, breach notification, regulatory lag, and regulatory arbitrage — reveal universal governance principles embedded in sector-specific frameworks.
- Integration across regulatory frameworks, rather than separate compliance silos, is a best practice for organizations operating under multiple governance regimes.
Key Debates
- Should health apps that collect health data be subject to HIPAA-level regulation?
- Does open banking genuinely empower consumers, or does it create new consent complexity and security risks?
- Should learning analytics require affirmative student consent, or is an opt-out model sufficient?
- Is the "information fiduciary" concept a useful framework for extending sector-specific duties of care to technology companies?
Applied Framework
When analyzing sector-specific data governance:
1. Identify the sector: What regulatory frameworks apply to this data based on its sector?
2. Layer the requirements: What general-purpose data protection law also applies?
3. Identify gaps: What data practices fall outside the sector-specific framework's scope?
4. Check for arbitrage: Is the entity structured to avoid governance requirements that its practices warrant?
5. Apply cross-sector principles: Regardless of sector, are fiduciary duties, minimum necessity, breach notification, and quality standards being met?
What's Next
In Chapter 25: Enforcement, Compliance, and the Limits of Law, we close Part 4 by examining what happens when governance frameworks meet reality — how data protection authorities enforce the law, what the major enforcement actions reveal about regulatory priorities, what happens when regulators are captured by the industries they regulate, and, fundamentally, what law can and cannot achieve in the governance of data. Eli testifies before the Detroit city council. Sofia advocates for stronger enforcement. And we confront the gap between compliance and ethics.
Before moving on, complete the exercises and quiz to solidify your understanding of sector-specific governance.