Chapter 13 Exercises: Social Media as Observation Tower
Exercise 13.1 — The Disclosure Audit (Individual, 30–45 minutes)
Purpose: Map what social media platforms know about you through your own active disclosure.
Instructions:
Choose one social media platform you actively use. Conduct a systematic "disclosure audit" by examining your account settings, privacy dashboard, and data download.
Step 1: Most major platforms (Facebook, Instagram, Twitter/X, TikTok) offer a "Download Your Data" or "Request Archive" feature. Request a copy of your data. While you wait (preparation can take anywhere from a few hours to a few days), proceed with the following steps.
Step 2: Review your account profile settings. List everything you have explicitly declared: name, birthday, location, employment, education, relationship status, political views (if applicable), religious views (if applicable), phone number, email, linked accounts.
Step 3: Review your activity log (available on Facebook and Instagram). How many posts, reactions, shares, and comments have you made? Scroll back to your earliest activity. What does the aggregate look like?
Step 4: When your data archive arrives, open it and explore. What categories of data does it contain? Look specifically for: your search history on the platform, your ad targeting categories, your location history (if tracked), and your message metadata.
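If you are comfortable with a little Python, a short script can speed up Step 4. The sketch below assumes you have already extracted the archive into a folder; the exact folder and file names vary by platform, so treat the layout here (one subfolder per data category, JSON files inside) as a common pattern rather than a guarantee. It counts the files under each top-level category and previews the keys of a JSON file so you can see what fields were recorded without reading the whole thing.

```python
import json
import os
from collections import Counter

def summarize_archive(root):
    """Count files per top-level folder in an extracted data archive.
    Assumes the common layout of one subfolder per data category."""
    counts = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        # Attribute every file to its top-level category folder.
        category = rel.split(os.sep)[0] if rel != "." else "(top level)"
        counts[category] += len(filenames)
    return counts

def preview_json(path, n=3):
    """Return the first n keys of a JSON file (or of its first record,
    if the file holds a list of records), as a quick peek at the fields."""
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    if isinstance(data, dict):
        return list(data)[:n]
    if isinstance(data, list) and data and isinstance(data[0], dict):
        return list(data[0])[:n]
    return []
```

For example, `summarize_archive("my_archive")` might reveal that an "ads_information" folder holds far more files than your "posts" folder, which is itself a useful data point for Reflection Question (a).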
Reflection Questions:
a. What is the total volume of data you have disclosed to this platform over your lifetime as a user?
b. What is in your ad targeting profile? Do the categories match your self-understanding? Are there categories that surprised you?
c. If you were to describe who "you" are based solely on this data archive, what portrait would emerge? What does it capture accurately? What does it miss?
Exercise 13.2 — The Shadow Profile Problem (Pairs, 45–60 minutes)
Purpose: Understand how social media data about you is generated by others' participation.
Instructions:
Step 1 (Individual): With a willing partner (a friend or classmate), each of you should independently identify a third person you both know who either does not use social media or uses it minimally. Do not yet discuss who you are thinking of.
Step 2 (Individual): Using only other people's participation on social media (public posts, tags, photographs, mutual friend lists), list everything you can learn about your chosen third person, without looking at anything the third person posted directly.
Step 3 (Together): Share your findings. Together, construct a composite profile of the non-user third person based only on what other people's social media activity reveals about them.
Step 4: With the permission of the third person (important: always ask before sharing research about real people), share what you found and ask them to evaluate it. Was any of it surprising to them? Was any of it information they would have preferred remain private?
Discussion Questions:
a. How comprehensive was the profile you assembled of someone who chose not to be on social media?
b. What does this exercise suggest about the limits of individual privacy decisions in a networked social environment?
c. Should social media platforms have obligations to non-users who are captured in the data ecosystem through others' participation? What might those obligations look like?
Exercise 13.3 — Platform Comparison: What Consent Actually Says (Small Groups, 45–60 minutes)
Purpose: Analyze what social media platforms' terms of service and privacy policies actually disclose about data collection.
Instructions:
Working in groups of three to five, divide the following platforms among group members (assign more than one platform per person if needed): Facebook/Meta, Instagram, TikTok, Twitter/X, Snapchat.
Each group member reads their assigned platform's privacy policy (use the current version, available on each platform's website) and identifies answers to the following questions:
- What behavioral data does the platform collect beyond your posts (e.g., scroll behavior, hover time, session data)?
- Does the platform collect data about non-users?
- Does the platform share data with third parties, and if so, which categories?
- Does the platform collect data through off-platform tracking (pixels, APIs)?
- What data rights does the platform grant users (access, deletion, portability)?
- What is the most surprising disclosure in the privacy policy?
Group Synthesis: Compare findings across platforms. Where are they similar? Where do they differ? Does any platform's disclosure stand out as more or less transparent?
Discussion: Having read the actual privacy policies, do you now understand the full scope of what these platforms collect? Or does the policy language itself obscure as much as it reveals?
Exercise 13.4 — The Ethics of Emotional Contagion (Individual Written Assignment, 500–700 words)
Purpose: Apply ethical reasoning to the Facebook emotional contagion experiment.
Context: Read the abstract and "Methods" section of Kramer, Guillory, and Hancock, "Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks," PNAS, 2014. (Available freely online.)
Writing Prompt:
The Facebook emotional contagion experiment was conducted without explicit consent from participants and without disclosure to users during or after the study. It also received no prior institutional review board approval: Cornell University's IRB later determined that, because the experiment was run by Facebook before Cornell researchers became involved, it fell outside Cornell's human-subjects review.
In your essay:
1. Describe what the study did, in your own words.
2. Evaluate Facebook's defense that the Terms of Service covered the research. Is this a sufficient basis for ethical research involving human subjects? What are the strongest arguments for and against this position?
3. Apply at least two ethical frameworks from your course readings (e.g., utilitarian analysis, Kantian analysis, virtue ethics, or the Belmont Report's research ethics principles) to the study. Do different frameworks yield different conclusions?
4. Consider the broader implication: if platforms routinely adjust content feeds for engagement optimization, and if emotional contagion is a real effect of such adjustment, what ongoing ethical obligations do platforms have? What would compliance with those obligations look like in practice?
Exercise 13.5 — Geofence Warrant Scenario Analysis (Small Groups, 45–60 minutes)
Purpose: Apply Fourth Amendment and civil liberties reasoning to geofence warrants.
Scenario:
A serious crime — an armed robbery — occurred at 10:15 PM at a convenience store in a small city. Law enforcement applies for a geofence warrant covering a two-block radius around the store from 9:45 PM to 10:45 PM. Google, which maintains Location History data from users' Android devices and location-enabled Google accounts, receives the warrant.
Within the geofenced area and time window, Google's data shows 47 devices.
Questions for Group Discussion:
a. What are the likely categories of people whose devices appear in this geofence? (Think carefully about who might be within a two-block radius of a convenience store at 10:15 PM on a given night.)
b. Of the 47 devices, how many are likely to be connected to the crime? What proportion of people captured by the warrant are probably innocent bystanders?
c. The Fourth Amendment requires that search warrants "particularly describe" the places to be searched and the persons or things to be seized. Does a geofence warrant satisfy this requirement? Why or why not?
d. Now modify the scenario: the crime was participation in an unlawful assembly (a protest that turned disruptive), and the geofence covers the protest site. How does this change your analysis?
e. Google has begun pushing back on geofence warrants and restructuring its location data storage to limit their scope. What are Google's motivations for doing this, and is it an appropriate role for a private company to play in limiting law enforcement access?
Exercise 13.6 — Participatory Surveillance: What Would Andrejevic Say? (Individual or Pairs, 30–45 minutes)
Purpose: Apply Andrejevic's participatory surveillance framework to contemporary platforms.
Instructions:
Part A: Choose one social media platform that did not exist when Andrejevic wrote his 2010 framework (options: TikTok, BeReal, Threads, Discord's public channels, or any platform launched after 2010). Using Andrejevic's framework:
- Describe how "participatory surveillance" operates on your chosen platform
- Identify what makes this platform's surveillance structure similar to and different from the Facebook/YouTube platforms Andrejevic originally analyzed
- Does the concept of "digital enclosure" apply? Are there alternative platforms that provide the same functionality without the same surveillance structure?
Part B: Andrejevic argues that participatory surveillance is structural, not individual — it cannot be addressed by individual behavior change. Do you agree? Identify at least one example where individual behavior change DID meaningfully reduce participatory surveillance exposure for the person who made the change. Then identify at least one reason why the broader problem persists despite individual actions.
Chapter 13 | Part 3: Commercial Surveillance