Chapter 39 Quiz: Design Ethics and Humane Technology
Instructions: This quiz contains 20 questions in two sections. Section A contains 14 multiple choice questions (2 points each). Section B contains 6 short answer questions (variable points as noted). Total: 72 points.
Section A: Multiple Choice
Choose the best answer for each question.
Question 1 The "Time Well Spent" movement proposed reframing platform success metrics away from time-on-platform toward:
A) Daily active users and monthly growth rates
B) Whether the user's time on the platform served their actual goals and interests
C) The number of social connections a user makes through the platform
D) Revenue per user as a proxy for value delivered
Question 2 Tristan Harris's 2013 internal Google presentation was significant primarily because:
A) It led to Google immediately reforming its engagement-maximization practices
B) It was the first time anyone had written about persuasive technology
C) It went viral inside Google and demonstrated broad internal awareness of attention exploitation, while also demonstrating the limits of internal advocacy for structural change
D) It introduced the concept of dark patterns to the technology industry
Question 3 Which of the following best describes "autonomy-preserving defaults"?
A) Defaults that automatically disable all notifications for new users
B) Platform settings configured by default to give users maximum control over their experience, requiring opt-in for data collection and engagement features rather than opt-out
C) The practice of showing users only content they have explicitly requested
D) Defaults that maximize user engagement to ensure the platform remains financially viable
Question 4 Wikipedia's funding model is based primarily on:
A) Advertising revenue from contextual text ads
B) Licensing fees charged to businesses that use Wikipedia's API
C) Small donations from individual readers, solicited through periodic appeals
D) A subscription model in which premium users pay for ad-free access
Question 5 "Meaningful friction" in humane design refers to:
A) Any design element that slows down user interaction and reduces time-on-platform
B) Deliberately introduced resistance at decision points where users benefit from pausing to make conscious choices rather than acting impulsively
C) The natural difficulty of learning a new platform's interface
D) Technical delays introduced to reduce server load during peak usage
Question 6 A "federated social network" differs from centralized social media primarily in that:
A) It uses a subscription model rather than advertising
B) It has no moderation and allows all content
C) No single entity controls the network's incentive structure; multiple independently operated servers interconnect using an open protocol
D) It is governed by government regulation rather than corporate policy
Question 7 Signal's resistance to surveillance is primarily achieved through:
A) Its terms of service, which prohibit data misuse by employees
B) Government regulation of messaging applications
C) Technical architecture that ensures even Signal cannot read users' messages, combined with minimal metadata retention
D) Its advertising-free business model, which removes the financial incentive for data collection
Question 8 Which of the following is the strongest example of a dark pattern in consent architecture?
A) A cookie consent banner in which "Accept All" is a large, prominent button while "Manage Preferences" requires multiple additional clicks and is presented in smaller, lower-contrast text
B) A platform that asks users to verify their age before accessing certain content
C) A terms-of-service document that is long and contains legal language
D) A notification settings screen that lists all notification types alphabetically
Question 9 The business model problem that makes humane design structurally difficult for most large social media platforms is best described as:
A) Advertising revenue requires engagement maximization, which creates structural incentives that run counter to user wellbeing — a conflict that surface-level design changes cannot resolve
B) Users are unwilling to pay subscription fees for platforms they currently access for free
C) Platforms lack the technical capability to implement humane design features
D) Regulatory requirements prevent platforms from reducing engagement-maximizing features
Question 10 Which of the following platforms is the best example of a large-scale information platform that operates without advertising, algorithmic engagement optimization, or engagement manipulation?
A) Facebook (Meta)
B) Reddit
C) YouTube
D) Wikipedia
Question 11 Twitter's 2020 experiment that prompted users to read an article before sharing it was an example of:
A) Regulatory compliance under the EU's Digital Services Act
B) Meaningful friction — a pause inserted at a decision point where reflexive behavior was likely to outrun genuine consideration
C) Dark pattern design intended to reduce sharing of competing platforms' content
D) Autonomy-preserving defaults applied to the sharing feature
Question 12 The chapter's account of Tristan Harris's trajectory — internal advocacy at Google, departure, founding of CHT, public advocacy — suggests that:
A) Individual ethics is sufficient to produce meaningful change inside large technology companies
B) External pressure from former insiders is more structurally significant than internal advocacy, because external advocates are not constrained by institutional loyalty
C) The technology industry is incapable of ethical reform without government intervention
D) Design ethicist roles are structurally positioned to drive significant organizational change
Question 13 "Contextual advertising" differs from behavioral surveillance advertising primarily in that:
A) It generates higher revenue per impression due to better targeting precision
B) It matches ads to the content being viewed rather than to the individual viewer's behavioral profile, and does not require collecting personal data
C) It is only used by nonprofit organizations
D) It requires users to explicitly opt in to seeing advertisements
Question 14 According to the chapter, which statement about the "minimum viable humane platform" concept is most accurate?
A) It describes a platform too small to be commercially viable
B) It is a theoretical ideal that no existing platform has approached
C) It describes the smallest set of design and business model choices that would produce a platform serving users rather than exploiting them — and it represents a practical standard, not a utopian one, since elements of it exist in current platforms
D) It refers specifically to Wikipedia's governance model
Section B: Short Answer
Answer each question in the number of sentences indicated. Partial credit is available.
Question 15 (6 points) Answer in 3-4 sentences.
Explain what "consent architecture" means and why the structure of how choices are presented — not just the availability of choices — matters ethically. Use one specific example in your answer.
Question 16 (6 points) Answer in 3-4 sentences.
The chapter distinguishes between what individual designers can do inside extractive platforms and what they cannot do. What is the key structural limitation, and what does this suggest about the relationship between individual ethical agency and systemic change?
Question 17 (8 points) Answer in 4-5 sentences.
Evaluate the following claim: "Adding a usage dashboard (like Apple Screen Time) to a social media platform is a meaningful step toward humane design."
Your answer should acknowledge what is genuine about this claim while identifying its limitation, and should use at least two specific concepts from the chapter.
Question 18 (8 points) Answer in 4-5 sentences.
Dr. Aisha Johnson presents three paths to Sarah Chen and Marcus Webb at Velocity Media. Briefly describe what distinguishes Path 2 from Path 1 and Path 3, and explain why Path 2 might be considered a "middle path" between cosmetic change and structural transformation. What does Path 2 require that Path 1 does not?
Question 19 (8 points) Answer in 4-5 sentences.
Why does the chapter argue that Wikipedia and Signal, despite being much smaller than Facebook or TikTok, function as meaningful "proof of concept" for humane design? What specific claim do they refute, and what are the limits of this argument?
Question 20 (8 points) Answer in 5-6 sentences.
The chapter ends with the claim that "the choice between 'large platform' and 'non-manipulative platform' is a false one." Is this claim fully supported by the evidence in the chapter? Identify the strongest evidence for this claim and the most significant challenge to it. What would need to be true for this claim to hold at the scale of Facebook or TikTok?
Answer Key (Instructor Use)
Section A
1. B
2. C
3. B
4. C
5. B
6. C
7. C
8. A
9. A
10. D
11. B
12. B
13. B
14. C
Section B — Scoring Rubrics
Question 15 (6 points) Full credit requires: (1) a clear definition of consent architecture that covers both how consent is obtained and how choices are structured (2 pts); (2) an explanation of why the structure matters — not just the availability of choices but how they are framed, defaulted, and made accessible (2 pts); (3) a specific, accurate example (2 pts). Examples may include cookie banners, notification permission requests, data collection opt-outs, or similar.
Question 16 (6 points) Full credit requires: (1) identification of the structural limitation — that business model incentives (advertising revenue tied to engagement) create a ceiling on what internal advocacy can achieve regardless of individual persuasiveness (3 pts); (2) a clear statement about the implication for systemic change — that individual ethics without structural change has predictable limits, and systemic reform requires external pressure, regulation, or fundamental business model change (3 pts).
Question 17 (8 points) Full credit requires: (1) acknowledgment of what Screen Time genuinely provides — attention transparency, data that users did not previously have access to (2 pts); (2) identification of the limitation — that passive information display does not change the platform architecture, and that users confronting the data still face the same engagement-maximizing design (3 pts); (3) use of at least two chapter concepts correctly (e.g., "attention transparency but not attention budget support," or "useful but insufficient for consent architecture reform") (3 pts). Answers that only identify the limitation without acknowledging the genuine value, or vice versa, should receive partial credit.
Question 18 (8 points) Full credit requires: (1) accurate description of Path 1 as cosmetic — it adds features but does not change the underlying exploitative architecture (2 pts); (2) accurate description of Path 2 as structural redesign of specifically identified harmful features — it changes how specific systems work, not just what sits on top of them (2 pts); (3) accurate description of what Path 2 requires that Path 1 does not — cost, reduced engagement metrics, organizational commitment to changing features that drive revenue (2 pts); (4) clear statement of why it is a "middle path" relative to Path 3 — it addresses specific harms without requiring the fundamental business model rethinking that Path 3 demands (2 pts).
Question 19 (8 points) Full credit requires: (1) statement of the specific claim being refuted — that large-scale platforms require advertising and engagement manipulation to function (2 pts); (2) explanation of why Wikipedia and Signal refute it — they operate at significant scale (Wikipedia at billions of users/month) without these mechanisms (2 pts); (3) identification of at least one genuine limit of the argument — neither platform has achieved the network-effect scale of Facebook or TikTok; Wikipedia is an information resource rather than a social platform; Signal is a messaging app rather than a content platform; both operate with significantly less revenue than the platforms they are compared to (2 pts); (4) overall quality of reasoning — does the answer engage with the argument rather than just listing points (2 pts).
Question 20 (8 points) Full credit requires: (1) engagement with the claim rather than simple acceptance or rejection (2 pts); (2) identification of the strongest evidence — Wikipedia at top-10 global scale, Signal's growth, Mastodon's architecture (2 pts); (3) identification of a genuine challenge — network effects, revenue gap, the fact that no ethical platform has achieved Facebook/TikTok scale (2 pts); (4) a clear statement of what would need to be true — either regulatory pressure to level the playing field, a shift in user willingness to pay, or a technological or cultural change that reduces the advantage of the engagement-maximizing model (2 pts). Answers that simply summarize the chapter without engaging evaluatively should receive partial credit.