Exercises: Emerging Technologies and Anticipatory Governance

These exercises progress from concept checks to challenging applications. Estimated completion time: 3-4 hours.

Difficulty Guide:

  • ⭐ Foundational (5-10 min each)
  • ⭐⭐ Intermediate (10-20 min each)
  • ⭐⭐⭐ Challenging (20-40 min each)
  • ⭐⭐⭐⭐ Advanced/Research (40+ min each)


Part A: Conceptual Understanding ⭐

Test your grasp of core concepts from Chapter 38.

A.1. Define the Collingridge dilemma. Why is it described as a "dilemma" rather than merely a "challenge"? What makes both horns of the dilemma genuinely difficult?

A.2. Explain the "pacing problem" in technology regulation. How does the pacing problem relate to the Collingridge dilemma? Are they the same concept, or do they describe different aspects of governance difficulty?

A.3. Define anticipatory governance. How does it differ from reactive regulation? What capabilities must a governance institution have to practice anticipatory governance effectively?

A.4. Explain why quantum computing threatens current encryption systems. What is "post-quantum cryptography," and why is preparation for the quantum threat an example of anticipatory governance?
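One way to make the anticipatory-governance angle of A.4 concrete is Mosca's inequality: if x (how long data must remain confidential) plus y (how long migration to post-quantum cryptography takes) exceeds z (time until a cryptographically relevant quantum computer exists), then data encrypted today is already at risk. A minimal sketch, where the year values are illustrative assumptions, not forecasts:

```python
# Mosca's inequality: data is at risk if x + y > z, where
#   x = years the data must remain confidential (security shelf-life)
#   y = years the organization needs to migrate to post-quantum crypto
#   z = years until a cryptographically relevant quantum computer (CRQC)
def at_risk(shelf_life_years: float, migration_years: float,
            years_to_crqc: float) -> bool:
    return shelf_life_years + migration_years > years_to_crqc

# Illustrative values only: medical records confidential for 25 years,
# a 10-year migration, and a hypothetical 15-year CRQC horizon.
print(at_risk(25, 10, 15))  # True: "harvest now, decrypt later" applies
print(at_risk(2, 3, 15))    # False: short-lived data is less exposed
```

The inequality explains why preparation must begin long before the threat materializes: by the time z arrives, it is too late to protect anything whose x + y already crossed the line.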

A.5. What is "neural data," and why do brain-computer interfaces pose consent challenges that go beyond those of any technology we have studied so far? Explain the concept of "cognitive liberty."

A.6. Define the precautionary principle. How does it differ from a standard risk-benefit analysis? Under what conditions is the precautionary principle most appropriate, and when might it be counterproductive?

A.7. What is a "regulatory sandbox"? Explain how sandboxes attempt to resolve the Collingridge dilemma. What are their limitations?


Part B: Applied Analysis ⭐⭐

Analyze scenarios, arguments, and real-world situations.

B.1. A technology company develops a brain-computer interface (BCI) that allows users to type text by thinking. The device collects raw neural activity data continuously while in use. The company's privacy policy states that "neural activity data is processed on-device and not transmitted to our servers." However, "aggregate usage patterns" and "de-identified neural metrics" are transmitted. Analyze the governance implications. Is the "on-device processing" claim meaningful? What does "de-identified neural metrics" likely mean, and why might it not provide adequate protection?
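A quick way to test intuitions for B.1 is to see how identifying even a few coarse "de-identified" metrics become in combination. A toy simulation, in which every field and range is invented for illustration:

```python
import random
from collections import Counter

random.seed(0)

# Toy "de-identified neural metrics" record. All fields are hypothetical;
# real BCI telemetry would be far higher-dimensional (and thus worse).
def make_record():
    return (
        random.randint(0, 23),               # typical hour of use
        random.randint(40, 120),             # mean typing speed, chars/min
        round(random.uniform(0.0, 1.0), 1),  # coarsened signal-quality score
        random.randint(1, 5),                # sessions per day
    )

records = [make_record() for _ in range(10_000)]
counts = Counter(records)
unique_share = sum(1 for r in records if counts[r] == 1) / len(records)
print(f"{unique_share:.0%} of 'de-identified' records are unique")
```

Even four coarse attributes leave most simulated records unique, so anyone holding a second dataset with the same attributes could link records back to individuals. This is why "de-identified" rarely means "anonymous" for high-dimensional behavioral data.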

B.2. Section 38.3 discusses the Internet of Things (IoT) at scale — billions of ambient sensors embedded in everyday objects and environments. Apply the concept of "ambient intelligence" to Eli's neighborhood in Detroit. If every lamppost, traffic signal, building entrance, and public bench contained sensors collecting environmental, acoustic, visual, and radio-frequency data, how would individual consent and privacy rights function? Propose an alternative governance framework appropriate for ambient intelligence environments.

B.3. A city government proposes creating a "digital twin" — a detailed computational model of the city that simulates traffic patterns, energy usage, emergency response, and population movement in real time. The digital twin would be used for urban planning. Analyze this proposal using the six-question framework: What data is collected? By whom? For what purpose? What unstated purposes? Who benefits and who bears the risk? What governance exists?

B.4. A pharmaceutical company uses a BCI clinical trial to collect neural data from patients with neurological conditions. The trial consent form states that "data may be used for research purposes." After the trial, the company discovers that the neural data can predict consumer preferences with high accuracy. The company considers licensing this data to an advertising firm. Analyze this scenario using the consent fiction and ethical debt frameworks. Where does governance fail?

B.5. Compare two approaches to governing autonomous vehicles: (a) the precautionary principle (ban autonomous vehicles from public roads until they are proven safe to a high standard of confidence) and (b) adaptive governance (allow deployment under restricted conditions, monitor outcomes, and adjust rules as evidence accumulates). For each approach, identify the strengths, weaknesses, and the populations most likely to bear the costs of its failures.

B.6. Section 38.5 discusses "technological determinism" — the belief that technology develops according to its own logic and cannot be meaningfully shaped by governance. Dr. Adeyemi challenges this belief. Present the strongest argument for technological determinism and then the strongest argument against it. Which position do you find more persuasive, and why?


Part C: Real-World Application Challenges ⭐⭐-⭐⭐⭐

C.1. ⭐⭐ Emerging Technology Governance Scan. Select one emerging technology not discussed in depth in the chapter (e.g., gene editing/CRISPR, deepfake detection, autonomous weapons, synthetic biology, augmented reality). Write a one-page governance assessment covering: (a) the technology's current state of development, (b) the data governance challenges it will create, (c) existing governance frameworks (if any), and (d) a recommendation for how anticipatory governance should be applied.

C.2. ⭐⭐ Regulatory Sandbox Inventory. Research two existing regulatory sandboxes (e.g., the UK's FCA fintech sandbox, Singapore's AI sandbox, the EU's AI Act sandbox provisions). For each, describe: (a) the scope, (b) the participants, (c) the governance rules, (d) the outcomes to date, and (e) your assessment of whether the sandbox model is effectively addressing the Collingridge dilemma.

C.3. ⭐⭐⭐ Post-Quantum Preparedness Assessment. Research the current state of post-quantum cryptography standardization (e.g., NIST's post-quantum cryptography project). Write a briefing (800-1,000 words) covering: (a) which algorithms have been standardized or are under consideration, (b) the timeline for implementation, (c) the risks of delayed transition, and (d) the governance implications for organizations that currently rely on encryption vulnerable to quantum attack.
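A practical starting point for part (d) of C.3 is a cryptographic inventory: enumerating where an organization relies on quantum-vulnerable algorithms. A minimal sketch; the classification tables are a simplification for illustration, not an authoritative list (NIST's FIPS 203, 204, and 205 name the standardized schemes ML-KEM, ML-DSA, and SLH-DSA):

```python
# Minimal sketch of a cryptographic-inventory classifier (illustrative only).
QUANTUM_VULNERABLE = ("RSA", "ECDSA", "ECDH", "DSA", "DH")  # broken by Shor
PQC_STANDARDIZED = ("ML-KEM", "ML-DSA", "SLH-DSA")          # FIPS 203/204/205

def classify(algorithm: str) -> str:
    name = algorithm.upper()
    if name.startswith(PQC_STANDARDIZED):
        return "post-quantum"
    if name.startswith(QUANTUM_VULNERABLE):
        return "quantum-vulnerable"
    return "review"  # e.g. AES, SHA-2: weakened by Grover; larger sizes suffice

# Hypothetical inventory entries for illustration.
for alg in ["RSA-2048", "ECDH-P256", "AES-256-GCM", "ML-KEM-768"]:
    print(f"{alg}: {classify(alg)}")
```

Even a rough classifier like this makes the governance point visible: most deployed public-key cryptography lands in the "quantum-vulnerable" bucket, and the migration cannot be planned until the inventory exists.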

C.4. ⭐⭐⭐ Digital Twin Governance Framework. Design a governance framework for a city-scale digital twin. Your framework should address: (a) what data is collected and from whom, (b) who controls the digital twin, (c) how affected communities participate in governance, (d) what limitations exist on data use, (e) how the framework handles accuracy, bias, and error, and (f) what transparency mechanisms are in place. Present your framework in a structured one-page document.


Part D: Synthesis & Critical Thinking ⭐⭐⭐

D.1. The Collingridge dilemma states that early governance is possible but uninformed, and late governance is informed but difficult. But the chapter also argues that anticipatory governance can partially resolve this dilemma. Evaluate this claim critically. Under what conditions can anticipatory governance succeed? Under what conditions does the Collingridge dilemma remain unresolved? Use at least two specific technologies from the chapter to illustrate your analysis.

D.2. Connect the emerging technologies discussed in this chapter to the four recurring threads:

  • Power asymmetry: How will quantum computing, BCIs, IoT, and digital twins affect the distribution of data power?
  • Consent fiction: Which emerging technology will make consent most fictional, and why?
  • Accountability gap: Which emerging technology will create the widest accountability gap between harm and remedy?
  • Ethical debt: Where is ethical debt currently accumulating in emerging technology development?

Write a 500-word synthesis.

D.3. Section 38.6 argues that the Global South (Chapter 37) will be disproportionately affected by emerging technologies because it has the least governance capacity to address them and the least influence over how they are developed. Evaluate this argument. Is the Global South inevitably a "taker" of technology governance frameworks, or can the participatory models discussed in Chapter 37 (data cooperatives, community governance) provide anticipatory governance from the bottom up?

D.4. Dr. Adeyemi asks: "Every technology we have studied in this course — social media, algorithmic decision-making, surveillance systems — was once 'emerging.' The governance failures we documented were foreseeable at the emerging stage. What would it have taken to govern them well from the beginning?" Write a response that draws on at least three specific technologies from earlier chapters.


Part E: Research & Extension ⭐⭐⭐⭐

E.1. Neurorights Legislation. Research Chile's 2021 constitutional amendment on neurorights, the first national constitutional provision specifically addressing neural data governance. Write a 1,000-word analysis covering: (a) the amendment's provisions, (b) the political context, (c) its enforcement mechanisms, (d) its limitations, and (e) its implications for other countries considering neural data governance.

E.2. Quantum Computing and Data Governance. Write a 1,200-word research report on the implications of quantum computing for data governance, covering: (a) the "harvest now, decrypt later" threat, (b) the NIST post-quantum cryptography standardization process, (c) the transition challenges for organizations, and (d) governance frameworks for managing the quantum transition.

E.3. Anticipatory Governance in Practice. Research one organization or initiative that practices anticipatory governance (e.g., the Foresight Institute, the World Economic Forum's Global Technology Governance initiative, the IEEE's Ethically Aligned Design). Write a 1,000-word assessment of: (a) the organization's methods, (b) specific governance recommendations it has produced, (c) evidence of policy impact, and (d) limitations of its approach.


Solutions

Selected solutions are available in appendices/answers-to-selected.md.