The First Amendment Implications of TikTok's Ban

by Legal PDF, May 9th, 2024

Too Long; Didn't Read

The court case delves into the constitutional debate over the TikTok ban, questioning its validity under the First Amendment and highlighting concerns regarding national security and online free speech rights.

Tiktok Inc., and ByteDance LTD., v. Merrick B. Garland Update Court Filing, retrieved on May 7, 2024, is part of HackerNoon’s Legal PDF Series. You can jump to any part in this filing here. This part is 7 of 11.

Grounds On Which Relief Is Sought

Petitioners seek review of the constitutionality of the Act on grounds that include, without limitation, the following.

Ground 1: Violation of the First Amendment

47. The First Amendment to the U.S. Constitution provides that “Congress shall make no law . . . abridging the freedom of speech.” U.S. Const., amend. I.

48. By banning all online platforms and software applications offered by “TikTok” and all ByteDance subsidiaries, Congress has made a law curtailing massive amounts of protected speech. Unlike broadcast television and radio stations, which require government licenses to operate because they use the public airwaves, the government cannot, consistent with the First Amendment, dictate the ownership of newspapers, websites, online platforms, and other privately created speech forums.

49. Indeed, in the past, Congress has recognized the importance of protecting First Amendment rights, even when regulating in the interest of national security. For example, Congress repeatedly amended IEEPA — which grants the President broad authority to address national emergencies that pose “unusual and extraordinary threat[s]” to the country — to expand protections for constitutionally protected materials. 50 U.S.C. §§ 1701–02. Accordingly, under IEEPA, the President does not have the authority to even indirectly regulate “personal communication” or the importation or exportation “of any information or informational materials,” id. § 1702(b)(1), (3) — limitations that are necessary “to prevent the statute from running afoul of the First Amendment,” Amirnazmi, 645 F.3d at 585. Yet Congress has attempted to sidestep these statutory protections aimed at protecting Americans’ constitutional rights, preferring instead to simply enact a new statute that tries to avoid the constitutional limitations on the government’s existing statutory authority. Those statutory protections were evidently seen as an impediment to Congress’s goal of banning TikTok, so the Act dispensed with them.

50. The Act’s alternative to a ban — a so-called “qualified divestiture” — is illusory to the point of being no alternative at all. As explained above, divesting TikTok Inc.’s U.S. business and completely severing it from the globally integrated platform of which it is a part is not commercially, technologically, or legally feasible.

51. The Act will therefore have the effect of shutting down TikTok in the United States, a popular forum for free speech and expression used by over 170 million Americans each month. And the Act will do so based not on any proof of a compelling interest, but on speculative and analytically flawed concerns about data security and content manipulation — concerns that, even if grounded in fact, could be addressed through far less restrictive and more narrowly tailored means.

52. *Petitioners’ protected speech rights.* The Act burdens TikTok Inc.’s First Amendment rights — in addition to the free speech rights of millions of people throughout the United States — in two ways.

53. First, Petitioner TikTok Inc. has a First Amendment interest in its editorial and publishing activities on TikTok. See Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of Bos., 515 U.S. 557, 570 (1995). TikTok “is more than a passive receptacle or conduit for news, comment, and advertising” of others; TikTok Inc.’s “choice of material” to recommend or forbid “constitute[s] the exercise of editorial control and judgment” that is protected by the First Amendment. Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974); see also Alario v. Knudsen — F. Supp. 3d —, 2023 WL 8270811, at *6 (D. Mont. Nov. 30, 2023) (recognizing TikTok Inc.’s First Amendment editorial rights).

54. As the government itself has acknowledged, “[w]hen [social media] platforms decide which third-party content to present and how to present it, they engage in expressive activity protected by the First Amendment because they are creating expressive compilations of speech.” Br. for United States as Amicus Curiae at 12–13, Moody v. NetChoice LLC, No. 22-277 (U.S.), 2023 WL 8600432; see also id. at 18–19, 25–26.

55. Second, TikTok Inc. is among the speakers whose expression the Act prohibits. TikTok Inc. uses the TikTok platform to create and share its own content about issues and current events, including, for example, its support for small businesses, Earth Day, and literacy and education.[18] When TikTok Inc. does so, it is engaging in core speech protected by the First Amendment. See Sorrell v. IMS Health Inc., 564 U.S. 552, 570 (2011); NetChoice, LLC v. Att’y Gen., Fla., 34 F.4th 1196, 1210 (11th Cir. 2022), cert. granted, 144 S. Ct. 478 (2023). The Act precludes TikTok Inc. from expressing itself over that platform.

56. Even if the U.S. TikTok platform could be divested, which it cannot for the reasons explained above, TikTok Inc.’s protected speech rights would still be burdened. Because the Act appears to conclusively determine that any application operated by “TikTok” — a term that Congress presumably meant to include TikTok Inc. — is a foreign adversary controlled application, Sec. 2(g)(3)(A), the President appears to lack the power to determine that a TikTok Inc.-owned application is “no longer being controlled by a foreign adversary” and has no “operational relationship” with “formerly affiliated entities that are controlled by a foreign adversary,” Sec. 2(g)(6)(A) & (B). The Act therefore appears to conclusively eliminate TikTok Inc.’s ability to speak through its editorial and publishing activities and through its own account on the TikTok platform.

57. For similar reasons, the Act burdens the First Amendment rights of other ByteDance subsidiaries to reach their U.S. user audiences, since those companies are likewise prohibited from speaking and engaging in editorial activities on other ByteDance applications.

58. The Act is subject to strict scrutiny. The Act’s restrictions on Petitioners’ First Amendment rights are subject to strict scrutiny for three independent reasons.

59. First, the Act represents a content- and viewpoint-based restriction on protected speech. The Act discriminates on a content basis because it exempts platforms “whose primary purpose” is to host specific types of content: “product reviews, business reviews, or travel information and reviews.” Sec. 2(g)(2)(B). The Act thus “distinguish[es] favored speech” — i.e., speech concerning travel information and business reviews — “from disfavored speech” — i.e., all other types of speech, including particularly valuable speech like religious and political content. Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 643 (1994).

60. The Act also discriminates on a viewpoint basis because it appears to have been enacted at least in part because of concerns over the viewpoints expressed in videos posted on TikTok by users of the platform. For example, the House Committee Report asserted, without supporting evidence, that TikTok “can be used by [foreign adversaries] to . . . push misinformation, disinformation, and propaganda on the American public”[19] — a concern that in any event could be raised about any platform for user-generated content. See infra ¶¶ 82, 87. Similarly, Rep. Raja Krishnamoorthi, who co-sponsored the Act, expressed the unsubstantiated concern that “the platform continued to show dramatic differences in content relative to other social media platforms.”[20]

61. Second, the Act discriminates between types of speakers. As explained above, TikTok Inc. is a protected First Amendment speaker with respect to the TikTok platform. The Act facially discriminates between TikTok Inc. and other speakers depending on the “primary purpose” of the platforms they operate. Any application offered by Petitioners is automatically deemed a “foreign adversary controlled application,” without any exclusions or exceptions. Sec. 2(g)(3)(A). By contrast, any other company’s application can be deemed a “foreign adversary controlled application” only if the company does not operate a website or application “whose primary purpose is to allow users to post product reviews, business reviews, or travel information and reviews.” Sec. 2(g)(2)(B). The Act thus favors speakers that do offer such websites or applications over speakers that do not.

62. Moreover, the Act singles out TikTok Inc. and other subsidiaries of ByteDance for unique disfavor in other ways. Whereas other companies with ownership in a country deemed a “foreign adversary” become subject to the Act’s restrictions only upon a presidential determination that the company poses “a significant threat to the national security of the United States,” Sec. 2(g)(3)(B), ByteDance Ltd. and its subsidiaries are automatically subject to the Act’s draconian restrictions by fiat, Sec. 2(g)(3)(A). The standard and process that the Act specifies for every other company likely fall short of what is required by the First Amendment and other applicable constitutional protections, but TikTok Inc. and ByteDance have been singled out for a dramatically different, even more clearly unconstitutional regime — with no public notice, no process for a presidential determination that there is a significant national security threat, no justification of that determination by a public report and submission of classified evidence to Congress, and no judicial review for statutory and constitutional sufficiency based on the reasons set forth in the presidential determination. The Act also draws a speaker-based distinction insofar as it specifically names ByteDance Ltd. and TikTok, and also exempts applications with fewer than 1 million monthly users (except if those applications are operated by ByteDance Ltd. or TikTok). Sec. 2(g)(2)(A)(ii), (3)(A).

63. A statutory restriction targeting specific classes of speakers is subject to strict scrutiny. See United States v. Playboy Ent. Grp., Inc., 529 U.S. 803, 812 (2000) (“Laws designed or intended to suppress or restrict the expression of certain speakers contradict basic First Amendment principles.”). And that is especially true when, as here, the Act singles out Petitioners by name for uniquely disfavored treatment and congressional statements indicate that the Act targets Petitioners in part because of concerns about the content on TikTok. Because the Act “target[s]” both “speakers and their messages for disfavored treatment,” strict scrutiny review is required. Sorrell, 564 U.S. at 565; see also Turner, 512 U.S. at 658–60.

64. Third, the Act is subject to strict scrutiny as an unlawful prior restraint. The Supreme Court has “consistently” recognized in a “long line” of cases that government actions that “deny use of a forum in advance of actual expression” or forbid “the use of public places [for plaintiffs] to say what they wanted to say” are prior restraints. Se. Promotions, Ltd. v. Conrad, 420 U.S. 546, 552–53 (1975). “[P]rior restraints on speech and publication are the most serious and the least tolerable infringement on First Amendment rights.” Nebraska Press Ass’n v. Stuart, 427 U.S. 539, 559 (1976). The Act suppresses speech in advance of its actual expression by prohibiting all U.S. TikTok users — including Petitioner TikTok Inc. — from communicating on the platform. See Backpage.com, LLC v. Dart, 807 F.3d 229 (7th Cir. 2015) (defendant’s conduct restricting the operator of classified advertising website was a prior restraint); Org. for a Better Austin v. Keefe, 402 U.S. 415, 418–19 (1971) (ban on distributing leaflets a prior restraint); U.S. WeChat Users All. v. Trump, 488 F. Supp. 3d 912, 926 (N.D. Cal. 2020) (ban on communications application a prior restraint). The same is true of other ByteDance subsidiaries and their platforms. Such restrictions “bear[] a heavy presumption against [their] constitutional validity.” Se. Promotions, 420 U.S. at 558.

65. The Act fails strict scrutiny because it does not further a compelling interest. Strict scrutiny “requires the Government to prove that the restriction [1] furthers a compelling interest and [2] is narrowly tailored to achieve that interest.” Reed v. Town of Gilbert, 576 U.S. 155, 171 (2015) (numerical alterations added). “If a less restrictive alternative would serve the Government’s purpose, the legislature must use that alternative.” Playboy, 529 U.S. at 813. The Act fails on both counts.

66. The Act does not further a compelling interest. To be sure, national security is a compelling interest, but the government must show that the Act furthers that interest. To do so, the government “must do more than simply posit the existence of the disease sought to be cured.” Turner, 512 U.S. at 664 (plurality op.). Rather, it “must demonstrate that the recited harms are real, not merely conjectural, and that the regulation will in fact alleviate these harms in a direct and material way.” Id.

67. Congress itself has offered nothing to suggest that the TikTok platform poses the types of risks to data security or the spread of foreign propaganda that could conceivably justify the Act. The Act is devoid of any legislative findings, much less a demonstration of specific harms that TikTok supposedly poses in either respect, even though the platform was first launched in 2017.

68. The statements of congressional committees and individual Members of Congress during the hasty, closed-door legislative process preceding the Act’s enactment confirm that there is at most speculation, not “evidence,” as the First Amendment requires. Instead of setting out evidence that TikTok is actually compromising Americans’ data security by sharing it with the Chinese government or spreading pro-China propaganda, the House Committee Report for an earlier version of the Act relies repeatedly on speculation that TikTok could do those things. See, e.g., House Committee Report at 6 (TikTok could “potentially [be] allowing the CCP ‘to track the locations of Federal employees and contractors’”) (emphasis added) (quoting Exec. Order 13,942, 85 Fed. Reg. 48637, 48637 (Aug. 6, 2020)); id. at 8 (discussing “the possibility that the [CCP] could use [TikTok] to control data collection on millions of users”) (emphasis added); id. (“TikTok has sophisticated capabilities that create the risk that [it] can . . . suppre[ss] statements and news that the PRC deems negative”) (emphasis added). Speculative risk of harm is simply not enough when First Amendment values are at stake. These risks are even more speculative given the other ways that the Chinese government could advance these asserted interests using a variety of intelligence tools and commercial methods. See infra ¶¶ 85–87.

69. The conjectural nature of these concerns is further underscored by President Biden’s decision to continue to maintain a TikTok account for his presidential campaign even after signing the Act into law.[21] Congressional supporters of the Act have also maintained campaign accounts on TikTok.[22] This continued use of TikTok by President Biden and Members of Congress undermines the claim that the platform poses an actual threat to Americans.

70. Further, even if the government could show that TikTok or another ByteDance-owned application “push[es] misinformation, disinformation, and propaganda on the American public,” House Committee Report at 2, the government would still lack a compelling interest in preventing Americans from hearing disfavored speech generated by TikTok users and shared on the platform just because the government considers it to be foreign “propaganda.” See Lamont v. Postmaster Gen. of U.S., 381 U.S. 301, 305 (1965).

71. The Act also offers no support for the idea that other applications operated by subsidiaries of ByteDance Ltd. pose national security risks. Indeed, the legislative record contains no meaningful discussion of any ByteDance-owned application other than TikTok — let alone evidence “proving” that those other applications pose such risks. Reed, 576 U.S. at 171.

72. The Act also provides neither support nor explanation for subjecting Petitioners to statutory disqualification by legislative fiat while providing every other platform, and users of other platforms, with a process that includes a statutory standard for disqualification, notice, a reasoned decision supported by evidence, and judicial review based on those specified reasons. Only Petitioners are subjected to a regime that has no notice and no reasoned decision supported by evidence — opening the door to, among other things, post-hoc arguments that may not have been the basis for the government action. The Supreme Court recently explained that the requirement of a “reasoned explanation” is “meant to ensure that [the government] offer[s] genuine justifications for important decisions, reasons that can be scrutinized by courts and the interested public. Accepting contrived reasons would defeat the purpose of the enterprise.” Dep’t of Com. v. New York, 139 S. Ct. 2551, 2576 (2019). Depriving Petitioners of those protections imposes a dramatically heavier burden on the free speech rights of Petitioners and TikTok users that is wholly unjustified and certainly not supported by a compelling interest.

73. The Act also fails strict scrutiny because it is not narrowly tailored. “Even where questions of allegedly urgent national security . . . are concerned,” the government must show that “the evil that would result from the [restricted speech] is both great and certain and cannot be mitigated by less intrusive measures.” CBS, Inc. v. Davis, 510 U.S. 1315, 1317 (1994). To satisfy narrow tailoring, the Act must represent the least restrictive means to further the government’s asserted data security and propaganda interests, Sable Commc’ns of Cal., Inc. v. FCC, 492 U.S. 115, 126 (1989), and be neither over- nor under-inclusive, Ark. Writers’ Project, Inc. v. Ragland, 481 U.S. 221, 232 (1987). The Act fails in each of these respects.

74. The Act opts for a wholesale prohibition on Petitioners offering online applications in lieu of a multitude of less restrictive measures it could have taken instead. As discussed above, Petitioners have been involved in negotiations with CFIUS since 2019 over a package of measures that would resolve the government’s concerns about data security and purported propaganda related to TikTok. The terms of that negotiated package are far less restrictive than an outright ban. The negotiations have resulted in the draft National Security Agreement, which TikTok Inc. is already in the process of voluntarily implementing to the extent it can do so without government action. That initiative includes a multi-billion-dollar effort to create a new TikTok U.S. subsidiary devoted to protecting U.S. user data and have U.S.-based Oracle Corporation store protected U.S. TikTok user data in the United States, run the TikTok recommendation system for U.S. users, and inspect TikTok’s source code for security vulnerabilities.

75. If executed by the government, the National Security Agreement would also give CFIUS a “shut-down option” to suspend TikTok in the United States in response to specified acts of noncompliance. The government has never meaningfully explained why the National Security Agreement (a far less restrictive alternative to an outright, total ban) is insufficient to address its stated concerns about data security and propaganda.

76. Even if the government’s dissatisfaction with the draft National Security Agreement were valid (despite the government never explaining why the agreement that the government itself negotiated is unsatisfactory), the CFIUS process in which Petitioners have participated in good faith is geared toward finding any number of other less restrictive alternatives to an outright, total ban. The CFIUS member agencies could return to working with Petitioners to craft a solution that is tailored to meet the government’s concerns and that is commercially, technologically, and legally feasible. Yet the government has not explained why the CFIUS process is not a viable alternative.

77. There are also a wide range of other less restrictive measures that Congress could have enacted. While many of these measures are themselves unjustified as applied to Petitioners, they nevertheless illustrate that the Act does not select the least restrictive means to further the national security goals that appear to have motivated it. For example, Congress could have addressed some members’ stated concern about TikTok allegedly “track[ing] the locations of Federal employees and contractors”[23] by expanding the existing ban on government-owned devices to cover personal devices of federal employees and contractors. Or Congress could have enacted legislation to regulate TikTok’s access to certain features on users’ devices — measures the Department of Homeland Security identified in 2020 as potential mitigations to “reduce the national security risks associated with” TikTok.[24]

78. Of course, Congress could also have decided not to single out a single speech platform (TikTok) and company (ByteDance Ltd.), and instead pursued any number of industry-wide regulations aimed at addressing the industry-wide issues of data security and content integrity. Congress could have enacted a data protection law governing transfers of Americans’ sensitive data to foreign countries, similar to the strategy President Biden is currently pursuing through executive order.[25] Indeed, Congress did enact such a data-transfer law — the similarly named “Protecting Americans’ Data from Foreign Adversaries Act of 2024” — as the very next division of the legislation that contains the Act. Yet it chose to prohibit only “data broker[s]” from “mak[ing] available personally identifiable sensitive data of a United States individual to any foreign adversary country or . . . any entity that is controlled by a foreign adversary.” H.R. 815, div. I, § 2(a), 118th Cong., Pub. L. No. 118-50 (Apr. 24, 2024).

79. There are also models for industry-wide regulation that Congress could have followed from other jurisdictions. For example, the European Union’s Digital Services Act requires certain platforms to make disclosures about their content-moderation policies and to provide regulators and researchers with access to their data so those researchers can assess if the platforms are systemically promoting or suppressing content with particular viewpoints.[26] Congress pursued none of these alternatives.

80. Congress did not even provide Petitioners with the process and fact-finding protections that the Act extends to all other companies — protections which themselves likely fall short of what the Constitution mandates. Other companies receive prior notice, followed by a presidential determination of (and public report on) the national security threat posed by the targeted application, and the submission to Congress of classified evidence supporting that determination, Sec. 2(g)(3)(B), which then is subject to judicial review based on the actual reasons for the decision, not post hoc rationalizations.

81. Because Congress failed to try any of these less restrictive measures, or at a minimum to explain why these alternatives would not address the government’s apparent concerns, the Act is not narrowly tailored.

82. The Act independently fails strict scrutiny because it is both under- and over-inclusive. The Act is under-inclusive because it ignores the many ways in which other companies — both foreign and domestic — can pose the same risks to data security and promotion of misinformation supposedly posed by Petitioners. The government “cannot claim” that banning some types of foreign owned applications is “necessary” to prevent espionage and propaganda “while at the same time” allowing other types of platforms and applications that may “create the same problem.” Reed, 576 U.S. at 172. Put differently, the Act’s “[u]nderinclusiveness raises serious doubts about whether the government is in fact pursuing the interest it invokes, rather than disfavoring a particular speaker or viewpoint.” Brown v. Ent. Merchants Ass’n, 564 U.S. 786, 802 (2011).

83. Most glaringly, the Act applies only to Petitioners and certain other platforms that allow users to generate and view “text, images, videos, real-time communications, or similar content.” Sec. 2(g)(2)(A). The Act’s coverage is thus triggered not by whether an application collects users’ data, but whether it shows them “content.” Accordingly, there is no necessary relationship between the Act’s scope and Congress’s apparent concern with risks to Americans’ data security, which could equally be posed by personal finance, navigation, fitness, or many other types of applications.

84. The Act also singles out Petitioners by exempting all other companies that operate any website or application “whose primary purpose is to allow users to post product reviews, business reviews, or travel information and reviews.” Sec. 2(g)(2)(B). But the Act does not explain why such applications, when (i) “foreign adversary controlled” under the Act’s broad definition; and (ii) determined by the President to be a significant national security threat, could not likewise be used to collect data from Americans — such as Americans’ location information — or to spread misinformation. Nor does the Act explain why an entire company presents no threat simply because it operates a single website or application the “primary purpose” of which is posting “product reviews, business reviews, or travel information and reviews.” Sec. 2(g)(2)(B). The Act’s differential treatment of this favored category of websites and applications also disregards the fact that there is voluminous content on TikTok containing product reviews, business reviews, and travel information and reviews. Yet TikTok and all ByteDance applications are ineligible for this exclusion.

85. More broadly, the Act ignores the reality that much of the data collected by TikTok is no different in kind from the data routinely collected by other applications and sources in today’s online world, including by American companies like Google, Snap, and Meta. The Act also ignores that foreign countries, including China, can obtain such information on Americans in other ways — such as through open-source research and hacking operations.

86. Likewise, the House Committee Report on an earlier version of the Act speculates that allowing source code development in China “potentially exposes U.S. users to malicious code, backdoor vulnerabilities, surreptitious surveillance, and other problematic activities tied to source code development.”[27] But those supposed risks arise for each of the many American companies that employ individuals in China to develop code. The Act, however, does not seek to regulate, much less prohibit, all online applications offered by companies that have offices in China or that otherwise employ Chinese nationals as software developers.[28]

87. Nor does the Act seek to cut off numerous other ways that Americans could be exposed to foreign propaganda. For instance, the Act leaves foreign nationals (and even adversarial governments themselves) free to operate cable television networks in the United States, spread propaganda through accounts on other online platforms that enable the sharing of user-generated content, or distribute copies of state-run newspapers physically or over the Internet (including by software applications) in the United States.[29]

88. The Act is also over-inclusive because it applies to other ByteDance Ltd.-owned applications that Congress has not shown — and could not possibly prove — pose the risks the Act apparently seeks to address.

89. At a minimum, the Act fails intermediate scrutiny. Even if strict scrutiny did not apply, the Act would still fail intermediate scrutiny as a time, place, and manner restriction: the Act prohibits speech activity on TikTok at all times, in all places, and in all manners anywhere across the United States. To pass intermediate scrutiny, a law must be “narrowly tailored to serve a significant governmental interest.” McCullen v. Coakley, 573 U.S. 464, 486 (2014). This means that it must not “burden substantially more speech than is necessary to further the government’s legitimate interests,” Turner, 512 U.S. at 661–62, and “leave open ample alternative channels for communication of the information,” Clark v. Cmty. for Creative Non-Violence, 468 U.S. 288, 293 (1984).

90. For many of the same reasons the Act cannot satisfy strict scrutiny, it also cannot satisfy intermediate scrutiny:

91. As discussed supra ¶¶ 67–69, the government has failed to establish that its apparent data security and propaganda concerns with TikTok are non-speculative. And as discussed supra ¶¶ 73–81, the Act burdens substantially more speech than necessary because there are many less restrictive alternatives Congress could have adopted to address any legitimate concerns. The Act also fails intermediate scrutiny because it “effectively prevents” TikTok Inc. “from reaching [its] intended audience” and thus “fails to leave open ample alternative means of communication.” Edwards v. City of Coeur d’Alene, 262 F.3d 856, 866 (9th Cir. 2001).

92. Regardless of the level of scrutiny, the Act violates the First Amendment for two additional reasons.

93. The Act forecloses an entire medium of expression. First, by banning TikTok in the United States, the Act “foreclose[s] an entire medium of expression.” City of Ladue v. Gilleo, 512 U.S. 43, 56 (1994). A “long line of Supreme Court cases indicates that such laws are almost never reasonable.” Anderson v. City of Hermosa Beach, 621 F.3d 1051, 1064–65 (9th Cir. 2010).

94. The Act is constitutionally overbroad. Second, the Act is facially overbroad. A law is “overbroad if a substantial number of its applications are unconstitutional, judged in relation to the statute’s plainly legitimate sweep.” United States v. Stevens, 559 U.S. 460, 473 (2010) (citation omitted). Here, for example, the government has never contended that all — or even most — of the content on TikTok (or any other ByteDance-owned application) represents disinformation, misinformation, or propaganda. Yet the Act shuts down all speech on ByteDance-owned applications at all times, in all places, and in all manners. That is textbook overbreadth. See, e.g., Bd. of Airport Comm’rs v. Jews for Jesus, Inc., 482 U.S. 569, 574–75 (1987).


About HackerNoon Legal PDF Series: We bring you the most important technical and insightful public domain court case filings.

This court case, retrieved on May 7, 2024, is part of the public domain. The court-created documents are works of the federal government and, under copyright law, are automatically placed in the public domain and may be shared without legal restriction.

[18] TikTok (@tiktok), TikTok, (last visited May 6, 2024); TikTok (@tiktok), TikTok, (last visited May 6, 2024); TikTok (@tiktok), TikTok, (last visited May 6, 2024).

[19] House Committee Report at 2.

[20] Sapna Maheshwari, David McCabe & Annie Karni, House Passes Bill to Force TikTok Sale From Chinese Owner or Ban the App, N.Y. Times (Mar. 13, 2024),

[21] Monica Alba, Sahil Kapur & Scott Wong, Biden Campaign Plans to Keep Using TikTok Through the Election, NBC News (Apr. 24, 2024),

[22] Tom Norton, These US Lawmakers Voted for TikTok Ban But Use It Themselves, Newsweek (Apr. 17, 2024), At least one Member created a TikTok account after the Act was enacted. See

[23] House Committee Report at 6.

[24] Cybersecurity and Infrastructure Security Agency, Critical Infrastructure Security and Resilience Note, Appendix B: Department of Homeland Security TikTok and WeChat Risk Assessment 4 (Sept. 2, 2020).

[25] See Exec. Order 14,117, 89 Fed. Reg. 15421 (Mar. 1, 2024).

[26] EU Reg. 2022/2065 arts. 15, 40(4), 42(2).

[27] House Committee Report at 5.

[28] See, e.g., Karen Freifeld & Jonathan Stempel, Former Google Engineer Indicted for Stealing AI Secrets to Aid Chinese Companies, Reuters (Mar. 6, 2024),

[29] The U.S. government has recognized that foreign government propaganda is an industry-wide challenge for online platforms. See, e.g., Nat’l Intel. Council, Declassified Intelligence Community Assessment, Foreign Threats to the 2020 US Federal Elections (Mar. 10, 2021), YouTube, for example, added disclaimers to certain channels that were reportedly being used to spread disinformation on behalf of the Russian government. Paresh Dave & Christopher Bing, Russian Disinformation on YouTube Draws Ads, Lacks Warning Labels - Researchers, Reuters (June 7, 2019), Like others in the industry, TikTok publishes transparency reports on attempts by users to use the platform for government propaganda purposes. See TikTok, Countering Influence Operations (last visited May 6, 2024),