Case Study 23-2: TikTok and the Montana Ban (2023)
The First State-Level Social Media Ban and What It Reveals About Algorithmic Governance
Background
On May 17, 2023, Montana Governor Greg Gianforte signed into law SB 419 — the first state law in American history to ban a specific social media platform for general consumers. The law, which would have gone into effect on January 1, 2024, prohibited TikTok from operating in Montana and barred app stores (including Apple's App Store and Google's Play Store) from offering TikTok for download within the state. Violations would carry civil penalties of $10,000 per day.
The Montana ban represented a genuinely novel legal moment: a state attempting to exercise jurisdiction over a global digital platform based on the national origin of that platform's owner. It raised questions — about constitutional authority, about the nature of information sovereignty, about what "banning" a digital platform even means technically — that reached beyond TikTok and spoke to fundamental issues of how democratic governments can or should govern algorithmic systems.
The Montana TikTok ban is a useful lens through which to examine a broader set of questions: What authority do states and nations have over globally distributed algorithmic systems? What is the relationship between platform ownership, algorithmic behavior, and national security? And what does it mean to try to legislate the relationship between citizens and recommendation systems?
The Montana Context
Montana's political context is relevant. The state has a conservative-leaning political environment; Governor Gianforte is a Republican who had previously been involved in technology entrepreneurship. The bill passed the state legislature on largely party-line votes. The stated rationale was national security: the concern that TikTok, as a ByteDance product, was subject to potential Chinese government demands for user data and could serve as a vector for Chinese intelligence operations against American citizens.
Montana has approximately 1.1 million residents, making it one of the least populous states in the country. Its small size matters: the economic and symbolic stakes of a Montana ban were considerably lower than they would be for a ban in a large state, making Montana a less politically risky testing ground for this kind of legislation.
Timeline
2022: Montana legislators begin expressing concern about TikTok amid escalating congressional scrutiny of the platform's data practices and Chinese ownership. State-level legislative interest in TikTok restrictions begins developing across multiple states, with Montana among the most aggressive.
January 2023: Montana SB 419 is introduced in the Montana state legislature. The bill is framed as a national security measure targeting TikTok specifically because of its Chinese ownership.
April 2023: SB 419 passes both chambers of the Montana legislature on largely party-line votes, with Republican majorities supporting and Democratic minorities largely opposing.
May 17, 2023: Governor Gianforte signs SB 419 into law. The law is scheduled to take effect on January 1, 2024. TikTok, five content creators based in Montana, and civil liberties organizations including the ACLU and EFF announce legal challenges.
May-September 2023: Multiple legal challenges filed in federal district court in Montana. The challengers argue that SB 419 is unconstitutional on multiple grounds: it violates the First Amendment by restricting speech; it violates the Commerce Clause by attempting to regulate foreign commerce; and it is preempted by federal law in the domain of foreign relations.
November 30, 2023: U.S. District Judge Donald Molloy issues a preliminary injunction blocking the Montana law from taking effect. The judge finds that the challengers are likely to succeed on the merits: the law raises serious First Amendment concerns, and Montana's attempt to act on national security grounds likely intrudes on the federal government's primacy in foreign affairs.
December 2023: Montana's Attorney General announces the state will appeal the injunction. The effective date of the law passes without enforcement.
2024: The federal government passes its own TikTok divestiture law, potentially rendering the Montana state law moot on preemption grounds. The Montana case continues on appeal but the center of the legal battle shifts to the federal divestiture law and its constitutional challenges.
January 2025: The Supreme Court unanimously upholds the federal TikTok divestiture law, providing the clearest statement to date on the constitutional authority to restrict a social media platform based on its national security implications.
The Constitutional Arguments
First Amendment Challenge
The primary constitutional challenge to the Montana ban was First Amendment-based. The argument: TikTok is a platform for communication. SB 419, by banning TikTok, restricts the ability of Montanans to communicate through TikTok. Content-based restrictions on speech face heightened constitutional scrutiny; even content-neutral restrictions on speech must be narrowly tailored to a substantial government interest.
Judge Molloy's preliminary injunction found this argument persuasive. The government cannot ban a general-purpose communication platform merely because some users might engage in harmful communication; the First Amendment requires more targeted means-ends fit. The fact that TikTok could hypothetically be used to spread propaganda or collect intelligence for a foreign power does not justify restricting all of TikTok's uses — the vast majority of which involve domestic users communicating about cooking, humor, art, politics, and the full range of human expression.
TikTok's own legal position emphasized the First Amendment dimensions. The platform argued that it was a forum for American speech, that its content was overwhelmingly created by American users expressing themselves on matters of American concern, and that banning it would silence those American voices.
This argument highlights a genuine complexity in social media regulation: social media platforms are simultaneously commercial products owned by corporations, communication platforms hosting the speech of millions of users, and algorithmic systems whose architecture shapes what speech is amplified and suppressed. The First Amendment analysis is complicated by this layered nature. Restricting a platform restricts the speech of its users, even if the intent is to restrict the platform's corporate behavior.
Commerce Clause and Preemption
The Commerce Clause argument was somewhat more straightforward: foreign commerce is a federal matter, not a state matter. Montana's attempt to regulate a Chinese-owned company's operations in the United States was argued to impinge on the federal government's exclusive authority over foreign commerce and foreign relations.
The preemption argument had particular force given that the federal government had been engaged in ongoing regulatory negotiations with TikTok, and that federal agencies (CFIUS, the Department of Justice) had claimed authority over the national security questions Montana was attempting to address through its own law. A state acting unilaterally in a domain where federal authorities are actively engaged faces strong preemption claims.
Judge Molloy found both the preemption argument and the First Amendment argument sufficient to justify preliminary relief.
Technical Realities of Platform Banning
The Montana ban also surfaced important technical questions about what it means to "ban" a globally distributed digital platform and what enforcement mechanisms are practically available.
The Geofencing Challenge
TikTok's platform is globally distributed. It does not have Montana-specific servers; it does not deliver Montana-specific content; its recommendation algorithm does not have a Montana-awareness setting. "Banning" TikTok in Montana requires either:
- App store enforcement: prohibiting Apple and Google from offering TikTok for download in Montana. This is achievable — app stores can geofence their offerings. But it affects only new users; existing TikTok installations continue functioning. And it is easily circumvented by changing device location settings or using a VPN.
- Network-level blocking: ISPs serving Montana customers could potentially block TikTok's servers. This is more comprehensive but is technically complex, would require ISP cooperation or compulsion, and is easily circumvented by VPN.
- Device enforcement: requiring that devices sold in Montana not offer TikTok. This would require enforcing against device manufacturers and would create extremely complex jurisdictional questions.
None of these enforcement mechanisms is both comprehensive and technically feasible without either significant government infrastructure (approaching national internet censorship architecture) or significant collateral restriction of legitimate internet activities.
SB 419 chose the app store mechanism — prohibiting app stores from offering TikTok downloads in Montana, with per-day penalties on the app store operators. But the law did not address existing installations, VPN circumvention, or web-based access to TikTok.
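The app-store mechanism's weakness can be made concrete. A minimal sketch, assuming a hypothetical IP-prefix geolocation table (no vendor's real geolocation API is shown here), illustrates how a listing would be withheld from requests that appear to originate in Montana — and why a VPN exit in another state defeats the check:

```python
from __future__ import annotations

# Hypothetical mapping of IP prefixes to U.S. states. Real geolocation
# relies on commercial IP databases with similar, imperfect granularity.
IP_PREFIX_TO_STATE = {
    "63.153.": "MT",   # illustrative prefix routed to a Montana ISP
    "104.28.": "CA",   # illustrative prefix of a VPN exit in California
}

BLOCKED_STATES = {"MT"}

def lookup_state(ip: str) -> str | None:
    """Best-effort IP geolocation: match a known prefix, else None."""
    for prefix, state in IP_PREFIX_TO_STATE.items():
        if ip.startswith(prefix):
            return state
    return None

def listing_visible(ip: str) -> bool:
    """Hide the listing only when the request appears to come from a
    blocked state. Unknown locations fail open — the store cannot
    block what it cannot locate."""
    return lookup_state(ip) not in BLOCKED_STATES

# A Montana user connecting directly is blocked...
print(listing_visible("63.153.10.20"))  # False
# ...but the same user behind an out-of-state VPN exit is not.
print(listing_visible("104.28.1.2"))    # True
```

The fail-open default is not a bug but a structural constraint: any enforcement keyed to apparent network location inherits the accuracy limits of IP geolocation and is bypassed the moment traffic exits elsewhere.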
The Enforcement Problem as Constitutional Evidence
The practical difficulty of enforcement became evidence in the legal proceedings. If Montana cannot actually prevent its residents from accessing TikTok — because existing installations continue working, because VPNs are freely available, because TikTok is accessible through mobile browsers — then the law's actual effect on national security is minimal while its burden on free speech is substantial. This makes the law's means-ends fit look poor under constitutional analysis: it burdens speech significantly without meaningfully advancing its stated national security interest.
This logic applies broadly to state-level social media regulation. A global platform operating on infrastructure distributed across dozens of countries is extremely difficult to meaningfully restrict at the state level. The tools of meaningful restriction — mandating ISP-level blocking, requiring app store compliance, enforcing against device manufacturers — generally require either federal authority or the kind of centralized internet governance infrastructure that characterizes authoritarian states like China. The legal and practical architecture for state-level social media regulation in a free and open internet environment is extremely limited.
Geopolitics and Algorithmic Governance
The Montana ban — and the broader federal effort to restrict TikTok — represents a genuinely novel challenge in information law: how should democratic governments govern algorithmic systems when those systems are owned by entities subject to adversarial foreign jurisdiction?
The Ownership-Algorithm Relationship
The national security concern about TikTok rests on a theory of the ownership-algorithm relationship: that ByteDance, as TikTok's owner, can direct TikTok's algorithm to serve Chinese government interests. This theory is coherent — a corporation can be required by law to modify its software — but it requires empirical evidence of actual algorithmic manipulation to be more than a hypothetical concern.
The evidence for actual algorithmic manipulation is limited but not zero. Research on content distribution differences between TikTok and Douyin on politically sensitive topics has documented asymmetries: Douyin, serving Chinese users, suppresses content about Tiananmen Square, Tibetan independence, and other topics sensitive to the Chinese government; TikTok does not apply comparable suppression. Whether this reflects neutral platform policy (different content standards for different markets) or strategic behavior (showing foreign users more contentious political content while protecting Chinese users from it) has not been established.
Research on TikTok's handling of content related to Uyghur human rights, Hong Kong protests, and other topics sensitive to the Chinese government has found some evidence of underrepresentation relative to comparable platforms. But "underrepresentation" is difficult to evaluate without knowing what the baseline should be, and platforms routinely make content distribution decisions on contested topics for reasons unrelated to foreign government influence.
What Algorithmic Governance Would Actually Require
If the genuine concern is that TikTok's algorithm is being or could be directed to serve adversarial geopolitical interests, addressing this concern technically would require:
Algorithmic auditing: independent technical experts examining TikTok's recommendation code and training data to detect whether politically sensitive topics are being systematically amplified or suppressed for political rather than engagement reasons. This requires unprecedented access to proprietary systems.
Data segmentation verification: confirming that U.S. user data is not accessible to ByteDance or Chinese government entities. Project Texas represents an attempt to achieve this; its adequacy is contested.
Ongoing monitoring: because algorithmic systems can be modified continuously, one-time verification of algorithm fairness provides limited assurance. Ongoing regulatory monitoring would be necessary.
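To make the auditing idea concrete, here is a toy sketch of one auditing primitive: comparing how often a topic surfaces in recommendation samples drawn from two platforms (or two regional variants of one platform) and flagging large asymmetries. The function names, topic labels, and counts are invented for illustration; a real audit would require sustained access to live recommendation output and a statistically grounded threshold rather than the arbitrary ratio used here.

```python
def exposure_rate(topic_hits: int, total_recs: int) -> float:
    """Fraction of sampled recommendations that touch the topic."""
    return topic_hits / total_recs

def flag_suppression(samples_a, samples_b, total_a, total_b, min_ratio=0.25):
    """Flag topics whose exposure on platform A falls far below platform B.

    samples_a / samples_b map topic -> hit count in each sample.
    A topic is flagged when its A:B exposure ratio drops under
    min_ratio — an arbitrary cutoff a real audit would replace with
    confidence intervals over repeated sampling runs.
    """
    flagged = []
    for topic in samples_b:
        rate_b = exposure_rate(samples_b[topic], total_b)
        rate_a = exposure_rate(samples_a.get(topic, 0), total_a)
        if rate_b > 0 and rate_a / rate_b < min_ratio:
            flagged.append(topic)
    return flagged

# Invented counts from 10,000 sampled recommendations per platform:
a = {"cooking": 480, "protest_coverage": 3}
b = {"cooking": 450, "protest_coverage": 60}
print(flag_suppression(a, b, total_a=10_000, total_b=10_000))
# → ['protest_coverage']
```

Even this toy version shows why the baseline problem noted above is hard: the flag only means "A shows this topic far less than B," and deciding whether that gap reflects political suppression, differing content policies, or differing user demand requires evidence the exposure numbers alone cannot supply.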
None of these mechanisms is provided for in the Montana law, which simply bans the platform rather than regulating its algorithmic behavior. This is a governance approach that addresses the instrument (TikTok as a platform) rather than the alleged harm (algorithmic influence or data access by Chinese authorities). A ban eliminates the hypothetical risk; it also eliminates the substantial legitimate value that TikTok provides to millions of American users and creators.
Analysis: What the Montana Ban Reveals About Algorithmic Governance
The Mismatch Between Risk and Remedy
The Montana ban illustrates a persistent mismatch between the nature of algorithmic risks and the remedies available to governments. The alleged risks are specific: that TikTok's algorithm might be directed to distribute particular content, or that user data might be accessible to Chinese intelligence. Appropriate remedies would target these specific risks: auditing the algorithm's treatment of politically sensitive topics, verifying data access controls, establishing ongoing monitoring.
The Montana remedy — banning the platform — addresses neither specific risk effectively (because the platform remains accessible via VPN and existing installations) while imposing substantial costs: restricting the speech of Montana TikTok users, eliminating livelihoods of Montana-based TikTok creators, and restricting consumer choice.
This mismatch reflects a broader challenge in algorithmic governance: governments lack both the technical expertise and the legal frameworks to regulate algorithmic behavior directly. Banning is blunt but tractable; auditing is targeted but technically demanding and legally complex. Governments default to the tools they have, even when those tools are poorly suited to the risks they're addressing.
The Precedent Problem
The Montana ban set a precedent that extends beyond TikTok. If states can ban social media platforms based on the national origin of their owner, this creates a template applicable to any foreign-owned platform. The Chinese government could, with identical legal logic, ban American social media platforms (it has). European governments could ban platforms based on U.S. ownership if U.S. surveillance laws create comparable concerns about data access. The result is an internet fragmentation that fundamentally changes the nature of global communication.
International human rights law recognizes freedom of information — the right to seek, receive, and impart information across frontiers — as a fundamental right. Platform bans based on ownership nationality are difficult to reconcile with this right. The Montana ban, had it been enforced, would have restricted the ability of Montana residents to seek and receive information from a platform used by a billion people globally.
The Federal-State Balance
The preliminary injunction blocking the Montana ban underscored that foreign policy, including policy toward foreign-owned technology companies, is a federal matter. State-level social media regulation faces fundamental jurisdictional constraints in the U.S. constitutional framework. The federal government's subsequent passage of its own TikTok divestiture law, and the Supreme Court's unanimous affirmation of its constitutionality, demonstrates that there is a constitutional path to restricting TikTok — but it requires federal action under federal foreign affairs authority, not state action under state police powers.
What This Means for Users
The relationship between users and platforms is mediated by geopolitics. The Montana ban illustrates that the platforms users rely on for communication, entertainment, and community are not merely products — they are geopolitical actors whose continued availability depends on the political relationships between nations. A platform ban is simultaneously a content restriction, a commercial action, and a diplomatic statement.
Platform banning is technically insufficient as a governance mechanism. A platform that is globally distributed, accessible via VPN, and installed on billions of devices cannot be meaningfully banned through app store restrictions alone. The technical infrastructure of the global internet, built for openness and resilience, actively resists the kind of localized platform restriction that governments might wish to impose. Meaningful restriction requires either the kind of centralized network-level control that authoritarian governments maintain, or the kind of coordinated international regulatory action that liberal democracies have struggled to achieve.
Creators are caught between platform risks and platform dependence. Montana-based TikTok creators faced a genuine dilemma: the platform that had become central to their livelihood might be banned in their state. This precarity — which is not merely algorithmic (will the algorithm distribute my content?) but now also political (will the platform remain available?) — illustrates the multiple dimensions of uncertainty that creators on algorithmically curated platforms face.
The governance challenge is real even if the Montana remedy was wrong. The concerns motivating the Montana ban — data collection by a Chinese-owned platform, potential algorithmic influence by a government with adversarial interests — are legitimate. The question is not whether they deserve regulatory attention but what governance mechanisms are proportionate, technically feasible, and constitutionally sound. The Montana ban's failure in court does not dissolve the underlying policy challenge; it merely demonstrates that a state-level platform ban was the wrong approach to it.
Discussion Questions
1. Judge Molloy found that the Montana ban raised serious First Amendment concerns because it restricted the speech of Montana TikTok users. But the state argued it was regulating a corporation, not user speech. How should courts analyze the relationship between platform restrictions and user speech rights? Does the distinction between "regulating the platform" and "restricting user speech" hold up under scrutiny?
2. The chapter argues that the Montana ban addresses the wrong unit — the platform — rather than the specific alleged harms (algorithmic manipulation, data access). Design an alternative regulatory approach that would target these specific alleged harms more precisely. What technical capabilities and legal authorities would this approach require?
3. If the United States can restrict TikTok based on its Chinese ownership, can China restrict Google, Facebook, and YouTube based on their American ownership? (China has done exactly this.) How should international law treat these parallel actions? Are they equivalent?
4. The technical obstacles to enforcing a state-level platform ban (VPNs, existing installations, web access) mean that such bans primarily burden legal, compliant users while being easily circumvented by sophisticated users. Does this distributional effect on enforcement matter for the policy analysis? Should it affect the constitutional analysis?
5. TikTok creators in Montana faced potential loss of livelihood from the proposed ban. What obligations, if any, do governments have to creators whose economic activities depend on platforms that may be restricted for security reasons? What remedies would be appropriate?