Chapter 35: Exercises

Global Disparities: How Algorithmic Addiction Hits Differently Around the World


Section A: Comprehension and Analysis

Exercise 1 [Definitional] Explain the "Facebook IS the internet" phenomenon in your own words. What specific conditions produce this situation, and what are its specific consequences for misinformation vulnerability, filter bubbles, and democratic discourse? How does it differ from the filter bubble conditions described for high-income country users?

Exercise 2 [Analysis] The Myanmar case involves several intersecting factors: pre-existing ethnic tension, Facebook's role as the primary information medium, the absence of Burmese-language content moderation, and an engagement-optimizing algorithm. Which of these factors do you consider most central to the catastrophic outcome? Could removing any single factor have prevented the outcome? Write a 300-word analysis.

Exercise 3 [Comparison] Compare the misinformation dynamics in Myanmar (Facebook feed-based) and Brazil (WhatsApp group-based). What are the key architectural differences between the two platforms, and how do those differences produce different misinformation dynamics? What countermeasures are available in each case that are not available in the other?

Exercise 4 [Critical Thinking] The chapter describes Facebook's investment in content moderation as following advertising revenue rather than social impact. Is this a legitimate business decision, a moral failure, or both? What alternative incentive structure would produce different investment patterns? Write a 300-word analysis.

Exercise 5 [Framework Application] Apply the concept of "context collapse" — the flattening of distinct audiences, each with its own contextual norms, into a single undifferentiated audience — to the global deployment of platforms designed for North American users. What specific contextual assumptions are embedded in Facebook's design that do not transfer to non-Western contexts?

Exercise 6 [Conceptual] Explain the digital colonialism critique in your own words. What are the strongest arguments for the parallel, and where does the analogy break down? What policy implications does the critique support that standard development economics frameworks would not?

Exercise 7 [Evidence Evaluation] The UN Fact-Finding Mission concluded that Facebook played a "determining role" in spreading hate speech that contributed to ethnic cleansing in Myanmar. What does "determining role" mean, and how would you evaluate this causal claim? What evidence would you need to accept or reject it? What alternative explanations would a Facebook defender offer?


Section B: Research and Investigation

Exercise 8 [Research] Research the current status of legal proceedings against Meta/Facebook related to Myanmar. Identify at least two different legal actions (by country or jurisdiction), the legal theories being argued, and the current procedural status. Write a 400-word summary assessing the prospects for legal accountability and what the barriers are.

Exercise 9 [Country Analysis] Choose one country in the Global South (not Myanmar or Brazil) that has experienced documented cases of social media-mediated violence or political manipulation. Research: (a) the specific platform(s) involved, (b) the specific harms documented, (c) the local context (media landscape, political environment, ethnic tensions, etc.), (d) platform and regulatory responses, and (e) what, if anything, changed as a result. Write a 500-word case analysis.

Exercise 10 [Language Disparity Research] Research the language distribution of Facebook's content moderation infrastructure. What languages have full moderation support? What languages are underserved? What are the documented consequences of moderation gaps for specific language communities? Write a 400-word analysis with attention to the gap between the number of languages with adequate moderation and the number of languages in which significant platform harm has been documented.

Exercise 11 [TikTok Investigation] Research the current legal and regulatory status of TikTok in the United States and in at least two other countries. What specific concerns have been raised? What regulatory actions have been taken? What has been the outcome of those actions as of your research date? Write a 400-word analysis assessing whether the regulatory responses you find are proportionate to the documented concerns.

Exercise 12 [Algorithmic Bias Research] Research the documented evidence of skin tone bias in social media image processing and recommendation systems. Sources might include academic papers by Joy Buolamwini, Timnit Gebru, and colleagues; internal platform documents that have been leaked; or journalistic investigations. Write a 400-word summary of the evidence and its implications.


Section C: Design and Policy

Exercise 13 [Policy Design] Design a "responsible global deployment" framework for social media platforms expanding into new markets. Your framework should specify: minimum viable content moderation requirements (by language and cultural context), required pre-deployment assessment processes, ongoing monitoring requirements, and accountability mechanisms for deployment failures. Present your framework in 500 words.

Exercise 14 [Regulatory Analysis] The European Union's Digital Services Act (DSA) includes provisions that apply to global platforms operating in EU markets. Research the DSA's specific requirements regarding content moderation, transparency, and risk assessment for very large online platforms. Write a 400-word analysis of whether the DSA's approach would, if applied globally, have been adequate to prevent the Myanmar or Brazil harms documented in this chapter.

Exercise 15 [Alternative Governance Design] Design an alternative governance framework for global platform deployment that addresses the language disparity in content moderation without requiring platforms to develop full moderation capacity in all of the world's roughly 7,000 living languages. Your design might involve: community-based moderation, regulatory partnerships, graduated deployment requirements, or other approaches. Present your design in 400 words with analysis of its limitations.

Exercise 16 [Data Sovereignty Analysis] Research the specific data localization and data sovereignty frameworks adopted by three different countries (from different regions). For each, identify: (a) what data localization is required, (b) what the stated justification is, (c) what the documented effects have been on platform operations and data privacy, and (d) the criticisms from platforms and digital rights advocates. Write a 500-word comparative analysis.


Section D: Perspective-Taking and Ethics

Exercise 17 [Perspective-Taking] Write a 400-word first-person account from the perspective of a Facebook product manager in 2014 who is responsible for Myanmar market expansion. What pressures are you operating under? What information do you have about Myanmar's conditions? What decisions do you make? This exercise asks you to take the institutional perspective seriously without excusing the outcomes.

Exercise 18 [Ethical Analysis — Platform Responsibility] The chapter documents that Facebook received specific warnings about genocide risk in Myanmar before the 2017 crisis. Write a 500-word ethical analysis of Facebook's responsibility given those warnings. Your analysis should address: the "should have known" vs. "did know" distinction, the relationship between economic incentives and ethical obligations, and what response would have been adequate to the warnings received.

Exercise 19 [Stakeholder Analysis — Myanmar] Map the stakeholders in the Myanmar Facebook crisis: Facebook, Myanmar military, ultra-nationalist Buddhist leaders, Rohingya community, international human rights organizations, UN agencies, Myanmar government, other governments, advertisers, WhatsApp users in Myanmar, etc. For each stakeholder, identify their interests, their power in the situation, and what they could have done differently. Write a 400-word analysis identifying where the most leverage for different outcomes existed.

Exercise 20 [Cultural Relativity and Universal Standards] The chapter discusses how what counts as harmful content varies across cultural contexts, and how moderation standards developed for Western markets may not translate appropriately to other contexts. At the same time, some harms (incitement to genocide, for example) seem to require universal standards that do not defer to local context. Write a 400-word analysis of where you draw the line between cultural relativity and universal standards in content moderation.


Section E: Extended Projects

Exercise 21 [Extended Research Paper] Write a 1,500-word research paper on the relationship between social media penetration and specific political or human rights outcomes in one country. Your paper should: (a) document the specific social media ecosystem in your chosen country, (b) examine evidence of platform-mediated harms or benefits, (c) analyze the role of local context (political environment, media landscape, cultural factors) in mediating platform effects, and (d) evaluate platform and regulatory responses.

Exercise 22 [Comparative Analysis — Platform Governance Models] Write a 1,200-word comparative analysis of three different national approaches to social media platform governance: the U.S. model (largely self-regulatory), the EU model (regulatory with extensive requirements), and the Chinese model (state-controlled). For each, describe the specific governance mechanisms, analyze the outcomes (what harms are addressed, what harms remain, what new harms the model creates), and evaluate the model's transferability to other contexts.

Exercise 23 [Historical Analysis — Media and Political Violence] Write a 1,000-word historical analysis comparing the role of media in facilitating political violence in two historical cases: one from the pre-digital era (the Rwandan genocide and Radio Télévision Libre des Mille Collines (RTLM), the Weimar Republic and propaganda, or another of your choice) and one from the social media era. What are the similarities and differences in mechanism? What does the historical comparison suggest about the novelty or continuity of social media's role in political violence?

Exercise 24 [Policy Brief — Language Parity] Write a 1,000-word policy brief for a multinational standards organization recommending specific requirements for language parity in social media content moderation. Your brief should: identify the problem with precision, propose specific standards (what constitutes adequate moderation in a language), identify who should set and enforce the standards, and address the resource challenges that would need to be overcome.

Exercise 25 [Synthesis — Digital Equity] Write a 1,000-word synthesis essay on the concept of "digital equity" — the principle that the benefits and harms of digital technologies should be distributed equitably across different populations globally. Your essay should: define what digital equity would require in the context of social media platforms, assess how far the current situation departs from this standard, identify the most significant barriers to digital equity, and evaluate what combination of regulatory, market, and civil society interventions would most effectively advance it.


Section F: Practical Engagement

Exercise 26 [Global Perspective Exercise] Identify a current social media misinformation crisis happening outside your home country (you may find examples through international news coverage or through organizations like First Draft or the Stanford Internet Observatory). Research: the country context, the platforms involved, the misinformation narratives, and what, if any, platform or regulatory response has occurred. Write a 300-word analysis from a global platform governance perspective.

Exercise 27 [Platform Bias Audit] Using a platform you regularly use, observe the representation of different national, linguistic, and demographic groups in your content recommendations over one week. Document: what languages appear in recommended content, what geographic regions are represented, what demographic groups are most represented in algorithmic recommendations. Write a 300-word analysis of what your observation suggests about the platform's global content curation.

Exercise 28 [News Desert Mapping] Using available resources (Penny Abernathy's news desert maps, the Google News Initiative's data journalism tools, or similar), identify a news desert in your own country. Research: how many newspapers have closed in the region, what the community's current news sources are, and what social media's role is in filling the information gap. Write a 300-word analysis.

Exercise 29 [WhatsApp Misinformation Simulation] Design a classroom exercise that simulates how misinformation spreads through WhatsApp group networks. Your design should: describe the setup, the "misinformation" message to be distributed, the forwarding rules, the intervention points, and what participants will learn about information spread in encrypted group environments. (Note: this is a design exercise — do not actually run an experiment involving real misinformation.)
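For instructors prototyping such a design, the forwarding dynamics can be dry-run in code before involving participants. The sketch below is purely illustrative and not part of the exercise itself: `simulate_spread` is a hypothetical helper, and the group sizes, overlap fraction, and forwarding probabilities are arbitrary assumptions rather than empirical WhatsApp parameters.

```python
import random
from collections import defaultdict

def simulate_spread(n_groups=50, group_size=20, overlap=0.1,
                    forward_prob=0.3, forward_limit=None,
                    rounds=10, seed=42):
    """Toy model of a message hopping across overlapping chat groups.

    Every member belongs to one home group; a fraction ('overlap') are
    "bridges" who also sit in a second, randomly chosen group. Each
    round, every informed member forwards the message into each of
    their groups with probability 'forward_prob', informing all of
    that group's members. 'forward_limit' caps how many times any one
    member may forward -- the intervention point the exercise probes.
    Returns the cumulative count of informed members per round.
    """
    rng = random.Random(seed)

    # Build membership: member id -> set of group ids.
    membership = {}
    mid = 0
    for g in range(n_groups):
        for _ in range(group_size):
            gs = {g}
            if rng.random() < overlap:   # bridge into a second group
                gs.add(rng.randrange(n_groups))
            membership[mid] = gs
            mid += 1

    # Reverse index: group id -> set of member ids.
    groups = defaultdict(set)
    for m, gs in membership.items():
        for g in gs:
            groups[g].add(m)

    informed = {0}                        # patient zero: member 0
    forwards_used = dict.fromkeys(membership, 0)
    history = []
    for _ in range(rounds):
        newly = set()
        for m in list(informed):
            for g in membership[m]:
                if forward_limit is not None and forwards_used[m] >= forward_limit:
                    break                 # member has hit the cap
                if rng.random() < forward_prob:
                    forwards_used[m] += 1
                    newly |= groups[g]    # whole group sees the message
        informed |= newly
        history.append(len(informed))
    return history

print("no cap :", simulate_spread())
print("cap = 1:", simulate_spread(forward_limit=1))
```

Comparing the two runs makes the intervention point concrete: a per-member forwarding cap, analogous in spirit to WhatsApp's real-world forward limits, slows diffusion without changing the encrypted nature of the groups, which is exactly the trade-off participants should surface in the debrief.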

Exercise 30 [Data Sovereignty Reflection] Research what data collection practices the social media platforms you use apply in your country specifically. Are there regulatory requirements that affect what data can be collected, where it can be stored, and how it can be used? How do these requirements differ from what the platform does in countries with different regulatory frameworks? Write a 300-word reflection on the implications.


Section G: Advanced Analysis

Exercise 31 [Algorithmic Justice Framework] Research the Algorithmic Justice League (founded by Joy Buolamwini) and its framework for auditing AI systems for bias. Apply this framework to one specific algorithmic system in social media — face recognition, content recommendation, or automated moderation. What would a bias audit of this system measure, and what would adequate remediation look like? Write a 600-word analysis.

Exercise 32 [Post-Colonial Theory Application] Read a key text in post-colonial theory (Frantz Fanon, Edward Said, or a contemporary thinker on technology and coloniality such as Ruha Benjamin or Nick Couldry). Write a 600-word analysis applying their framework to the global deployment of social media platforms. What does post-colonial theory add to the analysis that standard regulatory or market-based frameworks do not?

Exercise 33 [Cross-Chapter Synthesis] Connect the Myanmar case to the misinformation dynamics analyzed in Chapter 33. Specifically: how do the Vosoughi et al. findings about what makes false news spread fast apply to Myanmar? How do the structural platform dynamics analyzed in Chapter 33 interact with the context-specific factors analyzed in this chapter to produce the documented outcome? Write a 500-word synthesis essay.

Exercise 34 [International Law Analysis] Research the current state of international law regarding corporate responsibility for human rights abuses. The UN Guiding Principles on Business and Human Rights (Ruggie Principles) are a starting point. Specifically: what obligations do they create for technology companies? How have they been applied (or failed to be applied) in the Myanmar case? What reforms to international law would be necessary to provide meaningful accountability for platform-mediated harms? Write a 700-word analysis.

Exercise 35 [Future Scenarios — Global Platform Governance] Write a 900-word analysis of three possible futures for global social media platform governance over the next decade: (a) fragmentation into national/regional internet ecosystems ("splinternet"), (b) emergence of binding international governance standards through multilateral institutions, or (c) emergence of non-Western platform alternatives that reshape global competition. For each scenario, analyze: likelihood, implications for platform harms in the Global South, implications for free expression, and implications for the power dynamics between platforms and governments.