Quiz: Your Responsibility — From Knowledge to Action (Comprehensive)

This is a comprehensive quiz covering themes across the entire textbook. Target: 70% or higher.


Section 1: Multiple Choice — Core Concepts (1 point each)

1. The "consent fiction" — a recurring theme throughout this textbook — refers to:

  • A) The legal requirement that organizations obtain consent before collecting data
  • B) The gap between the formal mechanisms of consent (clicking "I agree," checking a box) and meaningful, informed, voluntary authorization — where consent creates an appearance of legitimacy without the substance
  • C) The practice of fabricating consent records
  • D) The philosophical impossibility of consent in any context
Answer **B)** The gap between the formal mechanisms of consent and meaningful, informed, voluntary authorization. *Explanation:* The consent fiction is introduced in Chapter 1 and traced through every subsequent chapter. It manifests as unreadable privacy policies (Ch. 9), consent under information asymmetry and structural coercion (Ch. 11), consent to black-box algorithmic decisions (Ch. 16), consent from children who cannot comprehend terms (Ch. 35), consent to surveillance programs whose existence is classified (Ch. 36), and consent to neural data collection that is involuntary (Ch. 38). In each case, the formal mechanism of consent exists, but the conditions for genuine consent do not.

2. The chapter identifies four structural causes of the accountability gap. Which of the following is NOT one of them?

  • A) Diffusion of responsibility across multiple actors
  • B) Temporal displacement between decisions and harms
  • C) Excessive regulatory enforcement
  • D) Informational asymmetry between those who cause harm and those who suffer it
Answer **C)** Excessive regulatory enforcement. *Explanation:* Section 40.1.3 identifies four structural causes of the accountability gap: diffusion of responsibility, temporal displacement, informational asymmetry, and jurisdictional fragmentation. Excessive enforcement is not one of them — in fact, under-enforcement is a contributing factor to the accountability gap throughout the textbook (Chapters 25, 30).

3. Mira's Reformed Governance Framework for VitraMed was adopted by the board as:

  • A) A permanent, company-wide mandate with all provisions approved
  • B) A two-year pilot with the five pillars accepted in principle but the executive compensation provision deferred
  • C) A voluntary suggestion with no binding authority
  • D) An external regulatory requirement imposed by a data protection authority
Answer **B)** A two-year pilot with the five pillars accepted in principle but the executive compensation provision deferred. *Explanation:* Section 40.3.1 describes the board's response to Mira's presentation. The five governance pillars were accepted, funding was allocated for implementation, but the provision linking executive compensation to governance outcomes was deferred. Mira recognized this as "not a revolution" but "a beginning" — illustrating the reality of institutional change: incremental, imperfect, and requiring sustained effort.

4. Dr. Adeyemi's final question to the class was:

  • A) "What technology will define the next decade?"
  • B) "What is your responsibility?"
  • C) "Which governance model is best?"
  • D) "Will you pursue a career in data governance?"
Answer **B)** "What is your responsibility?" *Explanation:* Section 40.4 describes Dr. Adeyemi's closing lecture, in which she told the class three things and asked one question. The question — "What is your responsibility?" — was deliberately left unanswered, because the chapter argues that the answer must be given not in words but in careers, choices, and lives.

5. The Practitioner's Oath proposed in this chapter includes all of the following provisions EXCEPT:

  • A) "I will respect persons"
  • B) "I will seek fairness"
  • C) "I will maximize shareholder value through data"
  • D) "I will speak up"
Answer **C)** "I will maximize shareholder value through data." *Explanation:* Section 40.8 presents the Practitioner's Oath with eight provisions: respect persons, seek fairness, practice transparency, accept accountability, consider power, anticipate, speak up, and keep learning. Maximizing shareholder value is not among them — the oath explicitly prioritizes the rights and dignity of data subjects over organizational interests, reflecting the textbook's argument that data governance must serve people, not systems.

6. Ray Zhao told Dr. Adeyemi's class that the moments that defined his ethical practice were:

  • A) The big, dramatic decisions that made headlines
  • B) The small, ordinary decisions — insisting on deleting a dataset, flagging a vendor contract, requiring an equity audit — that set precedents which became culture
  • C) The moments when he received awards for ethical leadership
  • D) The times he followed instructions from the CEO without question
Answer **B)** The small, ordinary decisions that set precedents which became culture. *Explanation:* Section 40.2.4 quotes Ray Zhao describing how ethical practice is built through thousands of ordinary decisions, not dramatic moments. This connects to the chapter's broader argument that ethical character is cultivated over time through consistent practice, not demonstrated in rare moments of crisis.

Section 2: True/False with Justification — Cross-Textbook (1 point each)

7. "The power asymmetry theme applies only to the relationship between corporations and individuals, not to government-citizen or Global North-Global South relationships."

Answer **False.** *Explanation:* The power asymmetry theme is traced across multiple domains: corporations and individuals (Chapters 3-4, 11), algorithms and subjects (Chapters 13-16), governments and citizens (Chapters 8, 36), the Global North and Global South (Chapter 37), platforms and children (Chapter 35), and emerging technology developers and affected populations (Chapter 38). The chapter explicitly states that power asymmetry connects to every other theme in the textbook.

8. "The textbook argues that individual ethics is sufficient for responsible data governance — if every individual acts ethically, governance structures are unnecessary."

Answer **False.** *Explanation:* Section 40.5.2 explicitly states: "Individual ethics without structural support is fragile — one principled individual cannot reform an unethical organization by force of character alone." The textbook consistently argues for structural governance — laws, institutions, audits, oversight mechanisms — as the essential complement to individual ethics. The Practitioner's Oath is a starting point, not a substitute for governance.

9. "Eli's Community Data Governance Charter has no practical power because it lacks legal authority."

Answer **False, with qualification.** *Explanation:* The charter lacks direct legal authority, and Eli acknowledges this: "The city can still ignore us. The sensors are still there. The power imbalance is still real." But the charter provides a coherent demand backed by community consensus, a negotiating position for engagement with the city government, and a template that DataRights Alliance will share with communities in twelve other cities. Its power is political and organizational, not legal — but as the textbook consistently demonstrates (Chapters 37, 39), governance change often begins with non-legal mechanisms that build political pressure for institutional change.

10. "The textbook presents data governance as a technical field requiring specialized expertise that most citizens cannot understand or participate in."

Answer **False.** *Explanation:* The textbook consistently argues the opposite — that data governance is a social and political question, not merely a technical one, and that affected communities can and should participate in governance decisions. Chapter 39 provides extensive evidence that citizen assemblies, data cooperatives, and participatory processes produce informed governance when citizens are given adequate time, information, and institutional support. Dr. Adeyemi's final lecture explicitly charges *every* student — not just the technically specialized ones — with the responsibility to engage in data governance.

Section 3: Short Answer — Synthesis (2 points each)

11. Explain the concept of "ethical debt" as it operates in the VitraMed thread. How does ethical debt accumulate, and what is the cost of deferred governance decisions?

Answer Ethical debt, like technical debt, accumulates through small governance shortcuts that are individually defensible but collectively dangerous. In the VitraMed thread: a consent mechanism that was slightly too perfunctory (early chapters), a bias audit that was slightly too cursory (Chapters 14-15), a data retention policy slightly too loose (Chapter 10), and internal ethics programs that existed on paper but hadn't been tested by adversity (Chapter 26). Each shortcut was individually defensible under time and resource constraints. Collectively, they created an organization that was one crisis away from catastrophic failure — and the data breach (Chapter 30) provided that crisis. The cost of ethical debt is not merely the direct harm of the eventual failure but the systemic degradation of governance capacity: once shortcuts become normalized, they are no longer recognized as shortcuts. The "debt" metaphor captures the compounding nature of governance failures — each deferred decision makes the next deferral easier and the eventual reckoning more severe.

12. The textbook identifies the Practitioner's Oath provision "I will speak up" as important but acknowledges that speaking up has costs. Using specific examples from the textbook, describe the costs of speaking up and explain why governance structures (rather than individual heroism) are the more sustainable solution.

Answer The textbook documents the costs of speaking up through real-world examples: Frances Haugen lost her job, faced legal threats, and endured years of public scrutiny after leaking Facebook documents. Timnit Gebru was fired from Google after co-authoring a paper on AI ethics risks. Christopher Wylie faced legal action and industry ostracism after exposing Cambridge Analytica. The chapter argues that while individual moral courage is necessary, relying on individual heroism is neither fair nor sustainable. Governance structures — mandatory algorithmic audits, independent oversight bodies, whistleblower protection laws, robust internal ethics programs — are more sustainable because they catch problems before they become crises, distribute the burden of accountability across institutions rather than placing it on individuals, and create institutional cultures where raising concerns is normal rather than heroic. The goal is a governance environment where whistleblowing is rarely needed because structural mechanisms identify and address problems proactively.

13. Dr. Adeyemi's three things in her final lecture are: (1) every data system is a human system that can be changed by different human decisions; (2) knowledge creates obligation; (3) the people in the room will determine the future of data governance. Connect each of these three statements to specific concepts from the textbook.

Answer (1) "Every data system is a human system" connects to the anti-technological-determinism argument developed throughout the textbook and stated explicitly in Chapter 38. The GDPR was written by people who made choices. VitraMed's algorithm was designed by engineers who made optimization decisions. Surveillance systems were deployed by officials who chose where to place sensors. If systems were designed by human choices, they can be redesigned by different choices. (2) "Knowledge creates obligation" connects to the epistemic obligation concept introduced in Section 40.7.2. Once students understand the consent fiction, the power asymmetry, the accountability gap, and the mechanisms of algorithmic bias, they cannot participate in systems that perpetuate these failures without moral cost. This draws on Simone Weil's principle that "attention is the rarest and purest form of generosity" — having attended to these questions obligates action. (3) "The people in the room will determine the future" connects to the participatory governance argument of Chapter 39 and the agency argument that runs through the entire book. The future of data governance is not determined by technology (technological determinism is false) or by markets (the economics of privacy, Chapter 11, shows that market outcomes are not natural but reflect power dynamics). It is determined by people — in legislatures, boardrooms, community centers, and classrooms — who choose to govern differently.

Section 4: Scenario Analysis — Comprehensive (3 points each)

14. You have completed this course and begun your first job as a data analyst at a healthcare company. In your second week, you discover that the company's patient risk-scoring model was trained on data from a single hospital system that serves a predominantly affluent, white patient population. The model is now being deployed to score patients across a network that includes hospitals serving diverse, lower-income communities. Your supervisor says the model "performs well on our test data." Drawing on Chapters 14, 15, 17, 26, 30, and 40, analyze this situation. What are the risks? What should you do? What governance structures should exist?

Answer **Risks:** The model is likely to perform poorly — and potentially harmfully — for patients whose demographic and clinical profiles differ from the training data. As documented in Chapters 14-15, models trained on unrepresentative data systematically under-serve underrepresented populations. In a healthcare context, this means patients from lower-income and minority communities may receive less accurate risk scores, leading to missed interventions for high-risk patients and unnecessary interventions for low-risk patients. This is the VitraMed pattern (Chapters 13-19, 30) — a model that "performs well" in aggregate while systematically failing specific populations. **What you should do:** Following the Practitioner's Oath's "speak up" provision, raise the concern with your supervisor — clearly, professionally, and with specific evidence. If the concern is not addressed, escalate to the ethics committee or data governance officer (if one exists). Document your concern in writing. If the company has no governance mechanism for addressing algorithmic bias, propose one — citing the governance frameworks from Chapters 17 and 26. **What governance structures should exist:** A pre-deployment bias audit (Chapter 17) that tests the model across demographic subgroups before deployment. An algorithmic impact assessment (Chapter 28) that evaluates the model's effects on different patient populations. An ongoing monitoring system that tracks model performance across demographic groups after deployment. An escalation pathway that allows any employee to raise bias concerns without retaliation. And — ideally — input from the communities the model will serve, through participatory governance mechanisms (Chapter 39).
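The pre-deployment bias audit described above can be sketched in code. This is a minimal illustration, not an implementation from the textbook: the group labels, records, and the 10-point recall gap threshold are all hypothetical assumptions chosen for the example.

```python
# Sketch of a pre-deployment subgroup audit (hypothetical data and threshold).
# Flags demographic groups whose recall trails the overall recall by more
# than `max_gap` -- the "performs well in aggregate, fails specific
# populations" pattern described in the answer above.
from collections import defaultdict

def subgroup_recall(records, max_gap=0.10):
    """Each record: (group, actually_high_risk, model_predicted_high_risk)."""
    hits = defaultdict(int)       # true positives per group
    positives = defaultdict(int)  # actual high-risk patients per group
    for group, actual, predicted in records:
        if actual:
            positives[group] += 1
            if predicted:
                hits[group] += 1
    overall = sum(hits.values()) / sum(positives.values())
    per_group = {g: hits[g] / positives[g] for g in positives}
    flagged = [g for g, r in per_group.items() if overall - r > max_gap]
    return overall, per_group, flagged

# Toy records: group "A" resembles the training population, group "B" does not.
records = [
    ("A", True, True), ("A", True, True), ("A", True, True), ("A", False, False),
    ("B", True, True), ("B", True, False), ("B", True, False), ("B", False, False),
]
overall, per_group, flagged = subgroup_recall(records)
print(flagged)  # -> ['B']: the model "performs well" overall yet misses group B
```

A real audit would use validated demographic data, multiple metrics (recall, false-positive rate, calibration), and statistical significance tests, but the core move is the same: never report aggregate performance without the disaggregated breakdown.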

15. Ten years from now, you are a senior data governance professional. You are asked to design the governance framework for a national digital twin that will model the economy, infrastructure, and population of your country. The digital twin will be used for economic planning, disaster response, and public health management. Drawing on Chapters 1, 8, 10, 22, 36, 38, 39, and 40, outline the key governance principles and structures you would include. Address: data collection, consent, access control, algorithmic governance, community participation, oversight, and limitations on use.

Answer **Data collection:** Apply data minimization (Ch. 10) at every layer. The twin should use the minimum granularity necessary for each application — aggregate data for economic planning, anonymized data for infrastructure modeling, with individual-level data used only when absolutely necessary and subject to heightened governance. Collect what is needed; do not build a comprehensive surveillance infrastructure "just in case." **Consent:** Individual consent is structurally inadequate for a national digital twin (Ch. 38 — ambient intelligence makes individual consent obsolete). Instead, implement democratic authorization: the twin's data collection parameters should be approved through a legislative process with public debate. A citizen assembly (Ch. 39) should review and recommend data governance policies. **Access control:** Tiered access reflecting the sensitivity of data and the purpose of use. Public dashboard for citizens showing aggregate information. Researcher access through a governed data access process (similar to Stats NZ's IDI governance). Government access for enumerated, legislatively authorized purposes only. Strict prohibition on law enforcement access without judicial authorization — learning from the FISA Court's failures (Ch. 36). **Algorithmic governance:** All predictive models used within the twin must be subject to algorithmic impact assessments (Ch. 17, 28), bias audits, and transparency requirements. Models that produce predictions about specific communities must be validated with those communities' input. **Community participation:** A permanent citizen oversight body — a standing citizen assembly for the digital twin — with meaningful authority to review governance policies, assess proposed new uses, and recommend limitations. Māori data sovereignty principles (Ch. 39 case study) provide a model for ensuring that indigenous communities have governance authority over data about their communities.
**Oversight:** Independent oversight body with access to the twin's data, algorithms, and usage logs. Regular public reporting. Mandatory periodic review (every three years) with public consultation. **Limitations on use:** Explicitly prohibited uses, legislatively defined: no use for political surveillance, no profiling of political or religious groups, no use in immigration enforcement, no sale or sharing with commercial entities. Purpose limitation enforced through technical architecture (not just policy), with audit trails for all access. Sunset clauses requiring reauthorization of the twin's mandate.
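The phrase "purpose limitation enforced through technical architecture (not just policy), with audit trails for all access" can be made concrete with a small sketch. The tier names, purposes, and log format below are illustrative assumptions, not a design from the textbook; the point is only that enumerated purposes are checked in code and every attempt — granted or denied — leaves a record.

```python
# Minimal sketch of purpose limitation enforced in the access layer itself.
# Tiers, purposes, and log fields are hypothetical examples.
from datetime import datetime, timezone

# Enumerated, legislatively authorized purposes per access tier.
ALLOWED = {
    "public":     {"aggregate-dashboard"},
    "researcher": {"aggregate-dashboard", "approved-research"},
    "government": {"aggregate-dashboard", "economic-planning", "disaster-response"},
}

audit_log = []  # in practice: an append-only store reviewed by the oversight body

def request_access(actor, tier, purpose):
    """Grant access only for an enumerated purpose; log every attempt."""
    granted = purpose in ALLOWED.get(tier, set())
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "actor": actor, "tier": tier, "purpose": purpose, "granted": granted,
    })
    return granted

assert request_access("stats-agency", "government", "economic-planning")
assert not request_access("police-dept", "government", "law-enforcement")
print(len(audit_log))  # -> 2: denied requests are logged too
```

The design choice worth noting: a prohibited use (here, law enforcement access) fails because it is simply absent from the allowlist, so adding a new use requires an explicit, auditable change to the enumerated purposes rather than a quiet policy interpretation.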

Solutions

Selected solutions are available in appendices/answers-to-selected.md.