Exercises: Digital Divide, Data Justice, and Equity
These exercises progress from concept checks to challenging applications. Estimated completion time: 3-4 hours.
Difficulty Guide:
- Star-1 Foundational (5-10 min each)
- Star-2 Intermediate (10-20 min each)
- Star-3 Challenging (20-40 min each)
- Star-4 Advanced/Research (40+ min each)
Part A: Conceptual Understanding (Star-1)
Test your grasp of core concepts from Chapter 32.
A.1. Section 32.1.1 identifies three levels of the digital divide: access, skills, and outcomes. Explain each level and describe how Dr. Adeyemi's observation that "the three levels don't add — they multiply" applies to a specific population (e.g., elderly residents in a rural community, low-income students in an urban neighborhood, or immigrants with limited English proficiency).
A.2. Define digital redlining as presented in Section 32.2. How does it differ from historical physical redlining, and how is it similar? Identify at least two mechanisms through which digital redlining operates.
A.3. Section 32.3 introduces the concept of data colonialism. Using the comparison table in Section 32.3.1 (which maps colonial dynamics to data colonial dynamics), explain in your own words what it means to say that "terms of service" function similarly to the legal mechanisms of colonial land seizure.
A.4. What are the CARE Principles for Indigenous Data Governance (Section 32.4.2)? Explain each letter and describe how the CARE Principles complement rather than compete with the FAIR Principles.
A.5. List and briefly define at least five of the seven principles of data feminism from Section 32.5.1. For each, provide one concrete example of how the principle could be applied to a real data system.
A.6. Explain what "missing data" means in the context of data feminism (Section 32.5.2). Why does Dr. Adeyemi argue that "silence in data is not just an absence" but rather "a statement about whose experiences are worth measuring"?
A.7. Section 32.8.1 presents Linnet Taylor's three pillars of data justice: (in)visibility, engagement with technology, and non-discrimination. Explain the tension between the first pillar's two dimensions — being visible can enable access to services but also enable surveillance. Provide one example where a marginalized community might benefit from greater data visibility and one where greater visibility would cause harm.
Part B: Applied Analysis (Star-2)
Analyze scenarios, arguments, and real-world situations using concepts from Chapter 32.
B.1. Consider the following scenario:
A city government partners with a ride-hailing company to analyze anonymized trip data for the purpose of improving public transit. The analysis reveals that certain neighborhoods — predominantly low-income neighborhoods of color — generate significantly fewer ride-hailing trips. The city interprets this as evidence of "lower demand" for transit services in those neighborhoods and proposes reducing bus routes accordingly.
Using concepts from this chapter, identify at least three flaws in the city's reasoning. Explain how the data divide (Section 32.1.3) and digital redlining (Section 32.2) contribute to the misinterpretation.
B.2. Eli describes his neighborhood as "the missing data" (Section 32.1.3) — a place that "doesn't show up in Google Maps the way downtown Detroit does." Apply the data feminism concept of "missing data" (Section 32.5.2) to Eli's observation. What specific consequences does this invisibility have for residents? How does it connect to the algorithmic bias patterns examined in Chapter 14?
B.3. VitraMed's equity audit (Section 32.6.2) revealed that its predictive health models were 23% less sensitive for patients in rural ZIP codes and had 31% higher false negative rates for Black patients. Using the Data Equity Audit framework from Section 32.9, walk through all five steps (Representation, Access, Benefit, Harm, Governance) applied to VitraMed's situation. At which step is the most critical intervention needed?
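The two disparity figures in B.3 (sensitivity and false negative rate by subgroup) can be computed directly. The sketch below is a minimal, self-contained illustration: the patient records and the "urban"/"rural" groups are invented for demonstration and are not VitraMed's data.

```python
# Hypothetical illustration of the subgroup metrics cited in B.3.
# All records below are toy data, not real patient outcomes.

def sensitivity(y_true, y_pred):
    """Fraction of actual positives the model catches (recall)."""
    positives = [(t, p) for t, p in zip(y_true, y_pred) if t == 1]
    if not positives:
        return float("nan")
    return sum(p for _, p in positives) / len(positives)

def false_negative_rate(y_true, y_pred):
    """Fraction of actual positives the model misses: 1 - sensitivity."""
    return 1.0 - sensitivity(y_true, y_pred)

# Toy data: (group, true label, model prediction)
records = [
    ("urban", 1, 1), ("urban", 1, 1), ("urban", 1, 1), ("urban", 1, 0),
    ("rural", 1, 1), ("rural", 1, 0), ("rural", 1, 0), ("rural", 1, 1),
]

for group in ("urban", "rural"):
    yt = [t for g, t, _ in records if g == group]
    yp = [p for g, _, p in records if g == group]
    print(group,
          "sensitivity:", round(sensitivity(yt, yp), 2),
          "FNR:", round(false_negative_rate(yt, yp), 2))
```

Comparing these per-group values, rather than a single aggregate accuracy, is what surfaces disparities like the ones the equity audit found.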
B.4. Section 32.3.2 describes platform dependency in the Global South, where communities depend on platforms controlled by Global North companies for basic communication and economic activity. Analyze this relationship using the data colonialism framework. Then evaluate the following counterargument: "These platforms provide valuable services to communities that would otherwise have no digital tools. The relationship is mutually beneficial, not colonial." What does the counterargument get right, and what does it miss?
B.5. Section 32.4.3 describes the Māori Data Sovereignty Network (Te Mana Raraunga) and the First Nations Information Governance Centre (FNIGC). Compare these two institutional models. What principles do they share? How might their approaches inform data governance for other marginalized communities (e.g., Black communities in the US, Roma communities in Europe, or refugee populations)?
B.6. Section 32.5.3 describes "counter-data practices" — community-controlled data collection that challenges dominant narratives. The Anti-Eviction Mapping Project is cited as an example. Identify a data gap in your own community (a problem that is not well-captured by existing data systems) and propose a counter-data practice that could address it. Describe: (a) what data would be collected, (b) by whom, (c) using what methods, and (d) how it would challenge the existing data narrative.
Part C: Real-World Application Challenges (Star-2 to Star-3)
These exercises ask you to investigate digital equity in your own environment.
C.1. (Star-2) Broadband Access Audit. Using the FCC's Broadband Map (broadbandmap.fcc.gov) or an equivalent resource, look up broadband availability in two different neighborhoods: one affluent and one low-income. Compare: (a) the number of providers available, (b) the maximum speeds offered, (c) the approximate monthly cost for comparable service. Document your findings and analyze whether they are consistent with the digital redlining patterns described in Section 32.2.
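One way to organize the C.1 comparison is to tabulate the three measures and derive a cost-per-Mbps figure, which can reveal that nominally similar prices buy very different service. The sketch below uses placeholder numbers; substitute the figures you actually read off broadbandmap.fcc.gov.

```python
# Hypothetical worksheet for the C.1 broadband comparison.
# The two neighborhood entries are invented placeholders, not FCC data.

neighborhoods = {
    "affluent":   {"providers": 4, "max_mbps": 1000, "monthly_usd": 60},
    "low_income": {"providers": 1, "max_mbps": 100,  "monthly_usd": 55},
}

a, b = neighborhoods["affluent"], neighborhoods["low_income"]
print("provider gap:", a["providers"] - b["providers"])

# Cost per Mbps normalizes price by speed, exposing service-quality
# gaps that headline prices hide.
for name, n in neighborhoods.items():
    print(name, "USD per Mbps:",
          round(n["monthly_usd"] / n["max_mbps"], 3))
```

A large gap in cost per Mbps between the two neighborhoods would be one quantitative marker consistent with the digital redlining patterns in Section 32.2.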
C.2. (Star-2) Data Representation Check. Choose a dataset that affects public policy in your area (e.g., census data, crime statistics, health data, educational outcomes). Investigate: (a) which populations are well-represented in the data, (b) which populations are likely underrepresented, and (c) what consequences the underrepresentation might have for policy decisions. Write a one-page analysis.
C.3. (Star-3) Platform Dependency Analysis. Select a community you belong to or are familiar with (defined by geography, identity, profession, or interest). Map the digital platforms that community depends on for communication, economic activity, information access, and civic participation. For each platform, identify: (a) who controls it, (b) what data it extracts, (c) what value flows back to the community, and (d) whether alternatives exist. Present your analysis as a stakeholder map.
C.4. (Star-3) Equity Impact Assessment. Choose an algorithmic system that affects your life (a credit scoring system, a university admissions tool, a healthcare algorithm, a content recommendation system). Apply the Data Equity Audit framework (Representation, Access, Benefit, Harm, Governance) to evaluate whether the system is equitable. Write a two-page assessment with specific recommendations.
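For C.4 (and B.3), it can help to turn the five audit steps into a reusable worksheet. A minimal sketch follows; the step names come from the chapter's Data Equity Audit framework, but the guiding questions are my paraphrases, and the example system name is hypothetical.

```python
# Minimal worksheet for the five-step Data Equity Audit.
# Step names follow the framework; the questions are paraphrases,
# not quotations from the chapter.

AUDIT_STEPS = {
    "Representation": "Who appears in the data, and who is missing?",
    "Access": "Who can use the system, and what barriers exist?",
    "Benefit": "Who gains value from the system's outputs?",
    "Harm": "Who bears the cost of errors or misuse?",
    "Governance": "Who decides how the system is built and changed?",
}

def start_audit(system_name):
    """Return an empty audit worksheet for the named system."""
    return {"system": system_name,
            "findings": {step: None for step in AUDIT_STEPS}}

audit = start_audit("example credit-scoring model")  # hypothetical target
for step, question in AUDIT_STEPS.items():
    print(f"{step}: {question}")
```

Filling in `findings` step by step, with evidence for each answer, produces the skeleton of the two-page assessment the exercise asks for.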
Part D: Synthesis & Critical Thinking (Star-3)
These questions require you to integrate multiple concepts from Chapter 32 and think beyond the material presented.
D.1. The chapter argues that "individual rights — privacy rights, data access rights, consent rights — are necessary but not sufficient for data equity" (Section 32.8.2). Construct a detailed argument explaining why individual rights alone cannot achieve data justice. In your argument, reference at least three specific barriers (from the digital divide, data colonialism, or missing data) that prevent individual rights from producing equitable outcomes. Then propose one collective mechanism that could address these barriers.
D.2. Eli argues that "digital literacy must include not just technical skills but political literacy" (Section 32.7.2). What does Eli mean by "political literacy" in the context of digital technology? Design a brief curriculum (three to five learning objectives) for a "political digital literacy" program targeting a specific community. Explain what each objective covers and why it matters for data justice.
D.3. The data colonialism framework (Section 32.3) draws explicit parallels between historical colonialism and contemporary data extraction. Some scholars argue that this comparison is illuminating; others argue that it trivializes historical colonialism by metaphorical extension. Write a balanced evaluation of this debate. What insights does the data colonialism framework provide that other frameworks (e.g., surveillance capitalism, platform capitalism) do not? What are its limitations?
D.4. Section 32.6 describes VitraMed's equity audit and the steps the company took in response. Mira observes: "Our model works best for the patients who already have the best healthcare and worst for the patients who already have the worst." Using this observation as a starting point, argue that algorithmic equity audits should be legally required for all health technology systems that receive public funding. Address potential counterarguments (cost, competitive disadvantage, the difficulty of defining "equity") in your analysis.
Part E: Research & Extension (Star-4)
These are open-ended projects for students seeking deeper engagement. Each requires independent research beyond the textbook.
E.1. Indigenous Data Sovereignty in Practice. Research one specific indigenous data sovereignty initiative (the Māori Data Sovereignty Network, the FNIGC, the US Indigenous Data Sovereignty Network, or another). Write a 1,000-word report covering: (a) the organization's history and mission, (b) the governance principles it applies, (c) specific examples of how it has asserted data sovereignty in practice, (d) challenges it faces, and (e) lessons that non-indigenous data governance frameworks could learn from its approach. Use at least four sources beyond this textbook.
E.2. The Data Divide in Your Community. Conduct a small-scale investigation of the digital divide in your local community. Interview at least three people from different demographic backgrounds (varying by age, income, or geography) about their digital access, skills, and outcomes. Document: (a) their internet access quality and cost, (b) their comfort with digital tools, (c) how they use digital services for education, employment, healthcare, and civic participation, and (d) barriers they encounter. Write a report (800-1,200 words) connecting your findings to the three levels of the digital divide described in Section 32.1.
E.3. Data Feminism Applied. Select one of the seven principles of data feminism (Section 32.5.1) and apply it to a specific data system or practice in an organization you are familiar with (your university, employer, or a public service you use). Write a detailed analysis (800-1,200 words) that: (a) describes the data system, (b) applies the selected principle, (c) identifies what the analysis reveals, and (d) proposes specific changes that would bring the system into alignment with the principle.
Solutions
Selected solutions are available in appendices/answers-to-selected.md.