Chapter 35: Quiz

Law, Policy, and the Regulation of Propaganda


Part I — Multiple Choice

1. Under the Brandenburg v. Ohio (1969) standard, the government may restrict speech advocating the use of force only when that speech is:

a) Likely to cause psychological harm to a reasonable listener
b) Directed toward producing imminent lawless action AND likely to produce such action
c) Produced by a foreign government or its agents
d) Factually false and intended to mislead the public


2. The Smith-Mundt Act (1948) was designed primarily to:

a) Prohibit foreign governments from broadcasting propaganda in the United States
b) Prevent the U.S. government from directing its foreign information operations at domestic American audiences
c) Establish the Federal Communications Commission's authority over political advertising
d) Criminalize the production of propaganda by private organizations during wartime


3. The 2012 Smith-Mundt Modernization Act:

a) Completely prohibited all domestic government information operations
b) Removed the prohibition on domestic dissemination of materials originally produced for foreign audiences
c) Established criminal penalties for government disinformation campaigns
d) Created the Broadcasting Board of Governors to oversee all government information activities


4. In New York Times v. Sullivan (1964), the Supreme Court held that a public official cannot recover for defamation relating to official conduct unless:

a) The statement was broadcast on network television or published in a major newspaper
b) The official can prove that the statement caused measurable economic damages
c) The statement was made with knowledge of its falsity or reckless disregard for the truth ("actual malice")
d) The official can prove that the defendant intended to influence the outcome of an election


5. The Citizens United v. FEC (2010) decision held that:

a) Corporations may spend unlimited funds on political advertising because money is protected speech
b) Foreign nationals may contribute to U.S. election campaigns if the contributions are disclosed
c) The First Amendment prohibits government restrictions on independent political expenditures by corporations and unions
d) Dark money organizations must disclose their donors to the Federal Election Commission


6. Section 230 of the Communications Decency Act provides internet platforms with:

a) Immunity from liability for third-party content regardless of whether the platform exercises editorial discretion
b) Immunity from liability only if the platform refrains from all content moderation
c) A safe harbor from copyright infringement claims for user-generated content
d) Protection from government-compelled disclosure of user information


7. A key criticism of Germany's NetzDG (2017) is that the law's short compliance timelines create incentives for platforms to:

a) Hire fewer content moderators in Germany to reduce compliance costs
b) Systematically over-remove content, including legal speech, to avoid large fines
c) Lobby the German government for looser standards that reduce removal obligations
d) Move their servers outside Germany to avoid the jurisdiction of the law


8. The EU Digital Services Act (DSA) applies its most demanding obligations to "very large online platforms" (VLOPs) defined as platforms with:

a) More than 1 million registered users in the EU
b) More than 10 million monthly active users in the EU
c) More than 45 million monthly active users in the EU
d) Annual revenue exceeding €1 billion from EU operations


9. The DSA's systemic risk assessment obligations (Articles 34–35) are best characterized as:

a) Content regulation — requiring platforms to remove specified categories of harmful content
b) Proceduralist regulation — requiring platforms to assess risks, implement mitigation, and report transparently
c) Structural regulation — prohibiting specific platform business models that enable disinformation
d) Speech-plus regulation — covering only conduct that accompanies protected speech


10. Meta's Oversight Board can:

a) Review and overturn individual content moderation decisions AND change Meta's fundamental platform policies
b) Review and overturn individual content moderation decisions but NOT change Meta's fundamental platform policies
c) Impose fines on Meta for policy violations and publish the results in an annual public report
d) Require Meta to provide researchers with access to platform data and algorithms for independent audit


11. ICCPR Article 20 requires states to prohibit:

a) Only content produced by foreign governments targeting domestic populations
b) Any propaganda for war and advocacy of national, racial, or religious hatred constituting incitement
c) False factual statements that cause measurable harm to public health or democratic institutions
d) Speech that a supermajority of the UN Human Rights Council determines is harmful


12. The United States' position on ICCPR Article 20 is that:

a) The U.S. fully complies with Article 20 through its existing hate speech and incitement laws
b) The U.S. interprets Article 20 as applying only to state-produced propaganda, not private speech
c) Article 20 does not require any restrictions on speech protected under the First Amendment (formal reservation)
d) The U.S. has withdrawn from the ICCPR in order to avoid Article 20 obligations


13. In United States v. Alvarez (2012), the Supreme Court struck down the Stolen Valor Act because:

a) The false claim about military awards did not cause sufficient economic harm to warrant prosecution
b) The government did not show that criminalizing the false statement was necessary and narrowly drawn to address a compelling interest
c) The First Amendment categorically prohibits any criminal regulation of false statements of fact
d) The law violated the equal protection clause because it applied only to military honor claims


14. "Dark money" in political advertising refers to:

a) Political advertising that runs after legal disclosure deadlines
b) Political spending by nonprofit organizations that are not required to disclose their donors
c) Advertisements that are deliberately misleading about the candidate's record
d) Foreign-funded political advertising that violates campaign finance disclosure requirements


15. Germany's concept of "militant democracy" (streitbare Demokratie) holds that:

a) Democracies must maintain strong military forces to deter authoritarian adversaries
b) Democratic political culture requires vigorous, even aggressive, democratic discourse
c) Democracy must be able to restrict speech and political parties that threaten its democratic foundations
d) Democratic elections must be fully competitive without restriction on any political party


Part II — Short Answer

16. Explain in two to three sentences why the Brandenburg "imminent lawless action" standard makes most politically targeted disinformation constitutionally protected in the United States, even when the disinformation is demonstrably false and demonstrably harmful to democratic institutions.


17. Tariq argues that "every law meant to restrict harmful speech has eventually been used against the people it was supposed to protect." Identify one specific historical example that supports this argument and explain the mechanism by which the law was turned against its intended beneficiaries.


18. Distinguish between the following three types of platform conduct for purposes of Section 230 analysis: (a) hosting user content; (b) moderating user content; (c) algorithmically amplifying user content. Which is most clearly covered by Section 230 immunity? Which is most legally contested?


19. Why can the European Union impose DSA compliance obligations on U.S.-headquartered platforms in ways that the U.S. government could not impose equivalent speech regulations on those same platforms operating in the United States?


20. What is the "chilling effect" as it applies to speech regulation, and why does it make NetzDG-style mandatory removal regulations particularly problematic from a free speech perspective?


Part III — Applied Analysis

21. A U.S. Senator proposes legislation requiring social media platforms to label any content containing a verifiable false factual claim about a federal election as "disputed" within 24 hours of the claim being flagged by an independent fact-checking organization certified by the FEC. Analyze this proposal under First Amendment doctrine, including the Brandenburg standard where relevant, and identify two constitutional objections it would face and one way its drafters might respond to each objection.


22. Read the following statement from Ingrid Larsen: "The Swedish Fundamental Law is very protective, but it has always recognized that the state has a legitimate interest in prohibiting some kinds of speech. The question is always about the threshold and the process. In the U.S., it feels like the threshold is almost impossibly high, and the process question barely arises."

In 150 words, assess whether Ingrid's characterization is accurate. What specific doctrinal features of U.S. constitutional law make the threshold "almost impossibly high"? Is she right that the "process question barely arises"?


23. Position C in the Chapter 35 debate framework argues for "structural regulation" targeting the infrastructure of disinformation rather than its content. List three specific structural interventions mentioned in the chapter and explain why each is more constitutionally durable than a content-based approach. Then identify one limitation of the structural regulation approach — something it cannot address even if fully implemented.


Answer Key (Part I)

1. b — Brandenburg requires both the direction toward producing lawless action and the likelihood of producing it; neither element alone is sufficient.

2. b — Smith-Mundt's central purpose was to prevent the domestic weaponization of foreign-audience propaganda tools.

3. b — The 2012 modernization removed the domestic dissemination prohibition, which had been the law's primary protective firewall.

4. c — The actual malice standard requires knowledge of falsity or reckless disregard for the truth; negligence is insufficient.

5. c — Citizens United held that independent political expenditures by corporations are protected speech; it did not address contribution limits or foreign donations.

6. a — Section 230 immunity applies regardless of moderation; the legislative purpose was specifically to encourage moderation by removing the perverse incentive to avoid all moderation.

7. b — The asymmetric incentive structure (heavy fines for under-removal, no penalty for over-removal) predictably produces over-removal.

8. c — 45 million monthly active users in the EU is the DSA threshold for VLOP/VLOSE designation.

9. b — The DSA is fundamentally proceduralist; it does not specify what content must be removed.

10. b — The Oversight Board's jurisdiction covers individual content decisions but not Meta's underlying policies and business model.

11. b — Article 20 requires prohibition of war propaganda and advocacy of hatred constituting incitement.

12. c — The U.S. entered a formal reservation declaring that Article 20 does not require restrictions on First Amendment-protected speech.

13. b — The plurality held the government had not shown the law was necessary to directly advance a compelling interest; the Court did not categorically prohibit false statement regulation.

14. b — Dark money refers specifically to the lack of donor disclosure in 501(c)(4) political spending.

15. c — Militant democracy (streitbare Demokratie) is Germany's constitutional doctrine authorizing restriction of parties and speech that threaten the democratic constitutional order.


Short answer and applied analysis questions are graded for analytical depth and accuracy of legal citation, not for ideological position. Students are expected to engage seriously with multiple perspectives, including those with which they disagree.