Chapter 24: Exercises

Facebook's News Feed: A Decade of Optimization Against Users


Reflection Exercises

1. (Reflection) Recall the last time you scrolled through a social media feed longer than you intended. What were you feeling at the moment you started scrolling? What were you feeling when you stopped? Write a 200-word reflection on the gap between your intention and your behavior, and consider what mechanisms might have produced that gap.

2. (Reflection) The 2006 Facebook users who protested News Feed were correct that something important had changed — and they were unsuccessful in changing it back. Think of a time when you objected to a change in a platform or service you depended on. What was the outcome? What leverage did you have, and what leverage did you lack?

3. (Reflection) Facebook's decision to retain News Feed despite user protest was justified, in part, by the argument that users were still using the feature even while protesting it. Do you find this argument persuasive? Under what circumstances, if any, is behavioral data a more reliable guide to user preferences than stated preferences?

4. (Reflection) The "angry" emoji was weighted five times more heavily than "Like" in Facebook's algorithm. Think about your own emoji usage on social platforms. Do you use certain reactions more than others? What drives your choice? Now consider: if the platform is rewarding some of your reactions more than others, how might that affect what you see?

5. (Reflection) Frances Haugen described her decision to become a whistleblower as a choice she made after concluding that Facebook was "incapable of holding itself accountable." What does it take to hold a powerful institution accountable? What role does individual conscience play, and what are the limits of relying on individual conscience as an accountability mechanism?


Research Exercises

6. (Research) Find the original 2014 Kramer et al. paper "Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks." Read the methods section and the authors' statement on ethics. Write a 300-word analysis of how the authors justified the study's ethical adequacy, and evaluate whether their justification holds up against the principles of research ethics you have studied.

7. (Research) Research the concept of "meaningful social interaction" (MSI) as Facebook defined it after the January 2018 algorithm change. Find at least two news articles published in 2018 and two published in 2019 or later that describe the outcomes of the MSI change. Compare the framing and findings across these articles. What changed in how journalists and researchers understood the MSI change as time passed?

8. (Research) Locate the BuzzFeed News analysis by Craig Silverman (November 2016) comparing Facebook engagement from fake news sites to engagement from major news outlets in the final three months of the 2016 election. Summarize the methodology and findings. What are the limitations of engagement metrics as a measure of influence?

9. (Research) Research the history of Facebook's video metric inflation controversy from 2016. Find the Wall Street Journal's reporting on the inflated metrics and the subsequent advertiser lawsuit. Write a 400-word summary of what happened, what Facebook admitted, and what the settlement terms were.

10. (Research) The Facebook Papers were disclosed in October 2021. Find reporting from at least three different news organizations about the Papers. Identify one finding from the Papers that each organization emphasized. Are there differences in what different outlets chose to highlight? What might explain those differences?

11. (Research) Research Albert Hirschman's concept of "exit, voice, and loyalty" from his 1970 book of the same name. In 200–300 words, apply this framework to the 2006 Facebook News Feed protest. Which option did users exercise? Why did it fail? What would the other options have required?

12. (Research) Find at least three peer-reviewed studies published between 2015 and 2023 examining the relationship between passive social media use and wellbeing. Summarize their findings. Are the findings consistent? Where do they diverge, and what methodological differences might explain divergences?


Analysis Exercises

13. (Analysis) EdgeRank's time decay factor penalized older content, creating pressure for continuous posting. Diagram the feedback loop this creates for content creators. Who benefits from this pressure? Who bears the costs? How does the time decay factor affect the types of content that are produced versus the types that would be produced in its absence?
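Before diagramming the feedback loop, it can help to make the scoring rule concrete. The public descriptions of EdgeRank give it as affinity x edge weight x time decay; the sketch below assumes an exponential half-life decay, which is an illustrative choice (the real decay function was never published), with made-up parameter values:

```python
def edgerank_score(affinity: float, edge_weight: float,
                   age_hours: float, half_life_hours: float = 6.0) -> float:
    """EdgeRank-style score: affinity x edge weight x time decay.

    The exponential half-life form and the 6-hour default are
    illustrative assumptions, not Facebook's actual parameters.
    """
    time_decay = 0.5 ** (age_hours / half_life_hours)
    return affinity * edge_weight * time_decay

# With a 6-hour half-life, a day-old post retains only 1/16 of its
# score, so a creator posting once a day is out-scored by one posting
# every few hours -- the continuous-posting pressure the exercise asks
# you to diagram.
fresh = edgerank_score(affinity=0.8, edge_weight=1.0, age_hours=0)
stale = edgerank_score(affinity=0.8, edge_weight=1.0, age_hours=24)
assert stale < fresh / 10
```

Varying `half_life_hours` in the sketch shows how the steepness of the decay, not just its existence, sets the posting cadence the system rewards.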

14. (Analysis) The emotional contagion experiment manipulated 689,003 users' feeds to test whether emotional states are contagious. Construct an alternative research design that could test the same hypothesis while meeting standard research ethics requirements. What would you sacrifice (in terms of ecological validity, sample size, cost) by using an ethical design?

15. (Analysis) Analyze the following chain of events as a system: (a) Facebook introduces the "angry" emoji; (b) the algorithm assigns anger 5x the weight of Like; (c) content producers learn what generates angry reactions; (d) content optimized for anger proliferates in feeds; (e) users see more anger-optimized content; (f) users' emotional state is affected. At which point(s) in this chain could the dynamic have been interrupted? What would interruption have required?
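One way to see why step (c) follows from step (b) is to make the weighting concrete. A minimal sketch, assuming only the 5x "angry" weight described in the chapter; the reaction counts are hypothetical:

```python
# Weights as described in the chapter: "angry" counted five times as
# much as "like". Any reaction type not listed defaults to 1.
WEIGHTS = {"like": 1, "angry": 5}

def engagement_score(reactions: dict) -> int:
    """Weighted engagement -- the quantity the ranking optimizes."""
    return sum(WEIGHTS.get(kind, 1) * count
               for kind, count in reactions.items())

# Two hypothetical posts with identical total reaction counts: the one
# that provokes anger scores over four times higher, so producers learn
# to optimize for it (step c) and it proliferates in feeds (step d).
neutral = engagement_score({"like": 100})
outrage = engagement_score({"like": 20, "angry": 80})
assert outrage > 4 * neutral
```

The sketch also suggests interruption points for the exercise: changing `WEIGHTS` intervenes at step (b), while capping or excluding "angry" from the sum intervenes between (b) and (c).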

16. (Analysis) The Velocity Media scenario in Section 6.3 describes Marcus Webb modeling the revenue cost of reducing harmful content amplification. Construct a counter-model from Dr. Aisha Johnson's perspective: what are the long-term costs of not reducing harmful amplification? Include reputational risk, user churn, regulatory risk, and any other factors you consider relevant. Do the long-term costs change the calculus?

17. (Analysis) Facebook's privacy controls, added after the 2006 News Feed revolt, were opt-out rather than opt-in by default. Research shows that most users accept default settings. Analyze the ethical significance of this design choice. What is the difference between giving users control and giving users the appearance of control?

18. (Analysis) The chapter describes the MSI change as an instance of "proxy metric failure" — the algorithm optimized for a measurable proxy (comments and reactions) that was not identical to the underlying thing it was meant to measure (meaningful social interaction). Identify two other examples of proxy metric failure from the chapter. For each, describe: (a) what was being optimized; (b) what the proxy was; (c) how the proxy diverged from the underlying goal; (d) what the consequences were.

19. (Analysis) The chapter argues that the "pivot to video" caused significant harm to the journalism industry. Evaluate this argument. What other factors contributed to the decline of text-based journalism in the same period? How would you separate the effects of Facebook's algorithm change from other concurrent trends?

20. (Analysis) Compare the 2006 protest against News Feed to the 2021 Haugen whistleblower disclosure as two different attempts to hold Facebook accountable. What kind of leverage did each attempt rely on? Which was more effective? What does the comparison reveal about who has the power to hold platforms accountable and under what conditions?


Creative Exercises

21. (Creative) Write a fictional internal memo from a Facebook product manager in 2019, responding to the integrity team's finding that the MSI change has amplified divisive political content. The memo must grapple honestly with the trade-off between engagement and integrity. What does the product manager recommend, and why? The memo should be 300–400 words and should feel plausible given what we know about the internal culture documented in the Facebook Papers.

22. (Creative) Design a "Wellbeing Dashboard" for a social media platform that gives users meaningful information about how the algorithm is shaping their experience. What would you show users? What data would the platform need to collect? What are the limitations of this approach? Present your design as a written description of the dashboard's key features, with a brief analysis of its likely effectiveness.

23. (Creative) Write the first two paragraphs of a news article published in an alternate timeline where Facebook removed News Feed in response to the 2006 protest. What would the platform have looked like? What might have been different about the subsequent history of social media?

24. (Creative) You are Dr. Aisha Johnson at Velocity Media, preparing to present your argument for reducing harmful content amplification to CEO Sarah Chen and Head of Product Marcus Webb. Write the opening 5 minutes of your presentation — the framing, the evidence, and the ask. How do you make the case in language that both a CEO focused on growth and an ethics professional would find compelling?


Group Discussion Exercises

25. (Group Discussion) Divide into two groups. Group A argues that Facebook's harmful outcomes were the product of individual choices by specific leaders (Zuckerberg, Sandberg, Webb, etc.) who could have chosen differently. Group B argues that the outcomes were structurally determined by the business model and would have been produced by any company in the same position. After 15 minutes of preparation, debate the proposition: "Facebook's leadership bears specific moral responsibility for the harms its algorithm caused." After the debate, discuss: are these positions actually incompatible?

26. (Group Discussion) As a group, read the description of the emotional contagion experiment from Section 3. Then vote: should the experiment have been conducted? Under what conditions, if any, would it have been ethical? Record the range of positions in the group and the reasons offered for each. What does the range of positions reveal about the underlying ethical disagreements?

27. (Group Discussion) Imagine your group is a regulatory committee tasked with designing rules governing the Facebook News Feed algorithm. You must address: (a) transparency requirements; (b) user consent for experimental manipulation; (c) constraints on engagement optimization that demonstrably harms users; (d) accountability mechanisms. Draft a one-page regulatory framework and present it to the class. Be prepared to defend your choices.

28. (Group Discussion) The chapter describes Maya's experience of Instagram at 11 PM as a product of algorithmic design rather than her own choices. As a group, discuss: to what extent does this analysis remove moral agency from Maya? Does understanding the systemic forces shaping her behavior change how we should think about individual responsibility for social media use? Where does structural constraint end and individual agency begin?

29. (Group Discussion) Examine the five statements in the "Voices from the Field" section. For each speaker, identify: (a) their relationship to Facebook/Meta; (b) the interest they have in their stated position; (c) whether their statement is supported by evidence in the chapter. Does the speaker's position or interest affect how you evaluate their claim? What does this tell you about how to evaluate sources of criticism and defense of platform practices?


Advanced Synthesis Exercises

30. (Analysis — Advanced) The chapter traces News Feed's evolution from a social feed to an AI-driven entertainment feed. Write a 600-word analysis of this trajectory as a case study in "mission drift" — the process by which an institution's operational practices diverge from its founding purposes. What was the original mission? What drove the drift? Was drift inevitable? What, if anything, could have prevented it?

31. (Research — Advanced) Research the concept of "Goodhart's Law": when a measure becomes a target, it ceases to be a good measure. Write a 500-word application of Goodhart's Law to at least three specific cases from the Facebook News Feed's history. In each case, identify the measure, how it became a target, and how optimization of the target undermined the measure's value as a proxy for the underlying goal.

32. (Analysis — Advanced) The chapter argues that the harms of engagement-optimization systems are often emergent properties of scale — that algorithms that produce minor distortions in small deployments produce serious harms at the scale of billions. Research the concept of emergent properties in complex systems. Apply this concept to two specific cases from the chapter. What does the concept of emergence imply for how we should regulate algorithmic systems?

33. (Creative — Advanced) Using the framework of the five themes identified in this book's introduction (Asymmetry of Power, Ethics of Attention Extraction, Individual Agency vs. Structural Constraint, Gap Between Intent and Effect, Historical Continuity of Persuasion Technology), write a comprehensive 800-word essay analyzing the Facebook News Feed through each of these lenses. The essay should show how the five themes are not independent but interconnected — each illuminating a different dimension of the same underlying dynamic.

34. (Research — Advanced) Research the status of social media regulation as of 2025, with particular attention to the EU's Digital Services Act (DSA) and any US legislation that has passed or been proposed. Evaluate these regulatory approaches against the structural problems identified in this chapter. Do these regulations address the root causes of the harms described? What do they miss? What would a more comprehensive regulatory response look like?

35. (Group Discussion — Advanced) The chapter ends with the claim that the Facebook News Feed story shows "not that technology companies are uniquely malevolent, but that systems optimized for engagement, operating at scale, in competitive markets, without meaningful accountability, tend to produce outcomes that are bad for people regardless of the intentions of the people who built them." As a group, evaluate this claim. Does it let Facebook's leadership off the hook too easily? Does it accurately describe the causal dynamics? What are the policy implications if the claim is correct? What are the implications if it is wrong?