23-356
`IN THE UNITED STATES COURT OF APPEALS
`FOR THE SECOND CIRCUIT
`
`EUGENE VOLOKH, LOCALS TECHNOLOGY INC.,
`RUMBLE CANADA INC.,
`
`PLAINTIFFS-APPELLEES,
`
`V.
`
`LETITIA JAMES, IN HER OFFICIAL CAPACITY
`AS ATTORNEY GENERAL OF NEW YORK,
`DEFENDANT-APPELLANT.
`
`
`
`
`
`
`
`
`
`On Appeal from the United States District Court
`for the Southern District of New York
Case No. 1:22-cv-10195
The Honorable Andrew L. Carter, United States District Judge
`
`
`BRIEF OF AMICI CURIAE ELECTRONIC FRONTIER FOUNDATION
`AND AMERICAN CIVIL LIBERTIES UNION IN SUPPORT OF
`PLAINTIFFS-APPELLEES AND AFFIRMANCE
`
`
`
`BRIAN HAUSS
`AMERICAN CIVIL LIBERTIES UNION
`FOUNDATION
`125 Broad St., 18th Fl.
`New York, NY 10004
`bhauss@aclu.org
`(212) 549-2500
`
`
`DAVID GREENE
`ELECTRONIC FRONTIER FOUNDATION
`815 Eddy Street
`San Francisco, CA 94109
`davidg@eff.org
`(415) 436-9333
`
`
`Counsel for Amici Curiae
`
`
`
`
`
`
`
`
`
`
`
`Case 23-356, Document 72, 09/26/2023, 3573797, Page2 of 38
`
`CORPORATE DISCLOSURE STATEMENT
`
`Pursuant to Rule 26.1 of the Federal Rules of Appellate Procedure, Amici
`
`Curiae Electronic Frontier Foundation (EFF) and American Civil Liberties Union
`
`(ACLU) state that they do not have a parent corporation and that no publicly held
`
`corporation owns 10% or more of their stock.
`
`Dated: September 26, 2023
`
`
`
`By: /s/ David Greene
`David Greene
`
`
`
`
`TABLE OF CONTENTS
`
`CORPORATE DISCLOSURE STATEMENT .......................................................... i
`
`TABLE OF AUTHORITIES.................................................................................... iv
`
`STATEMENT OF INTEREST ................................................................................. 1
`
`INTRODUCTION ..................................................................................................... 3
`
`ARGUMENT ............................................................................................................ 3
`
I.    The Content Moderation Systems Targeted by the Law Are Editorial
      Processes Protected by the First Amendment. ...................................... 3

      A.   Content Moderation Is an Inherently Expressive Editorial
           Process. ....................................................................... 4

           1.    Content Moderation Is an Historic and Widely
                 Employed Practice. .......................................................... 4

           2.    Social Media Platforms Have Rules, Standards and
                 Guidelines about What Content They Want and Don’t
                 Want on Their Sites. ......................................................... 7

           3.    Content Moderation Has Long Been and Remains a
                 Fraught and Controversial Process. .................................. 8

      B.   Content Moderation Is Protected by the First Amendment. ..... 14

           1.    The First Amendment protects the right to speak by
                 curating the speech of others. ......................................... 14

           2.    Social media and other interactive websites have a
                 constitutional right to adopt, define, and implement
                 their own editorial policies. ............................................ 17
`
`II. Government Involvement in Content Moderation Raises Serious
`First Amendment Concerns. ............................................................... 21
`
`III. The New York Law Unconstitutionally Coerces Interactive
`Websites to Adopt the State’s Hate Speech Philosophy at the
`Expense of Their Own Editorial Policies. ........................................... 22
`
`
`
`
`
`CONCLUSION ....................................................................................................... 25
`
`CERTIFICATE OF COMPLIANCE ...................................................................... 27
`
`CERTIFICATE OF SERVICE ................................................................................ 28
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
TABLE OF AUTHORITIES

Cases
`
`303 Creative LLC v. Elenis,
`600 U.S. ___, 143 S. Ct. 2298 (2023) ................................................................. 19
`
`Amer. Freedom Defense Initiative v. Lynch,
`217 F. Supp. 3d 100 (D.D.C. 2016) .................................................................... 18
`
Application of Consumers Union of U.S., Inc.,
`495 F. Supp. 582 (S.D.N.Y. 1980) ...................................................................... 15
`
`Backpage.com v. Dart,
`807 F.3d 229 (7th Cir. 2015) ......................................................................... 23, 24
`
Bantam Books, Inc. v. Sullivan,
`372 U.S. 58 (1963) ........................................................................................ 22, 23
`
`Bursey v. United States,
`466 F.2d 1059 (9th Cir. 1972) ............................................................................. 14
`
`Children's Health Def. v. Facebook Inc.,
`546 F. Supp. 3d 909 (N.D. Cal. 2021)................................................................. 18
`
`Herbert v. Lando,
`441 U.S. 153 (1974) ............................................................................................ 15
`
`Huber v. Biden,
`No. 21-CV-06580-EMC, 2022 WL 827248
`(N.D. Cal. Mar. 18, 2022) ................................................................................... 18
`
`Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of Boston,
`515 U.S. 557 (1995) ............................................................................................ 18
`
`In re Consumers Union of the U.S., Inc.,
`32 Fed. R. Serv. 2d 1373 (S.D.N.Y. 1981) ......................................................... 15
`
`Janus v. American Federation of State, County, & Municipal Employees,
`Council 31,
`138 S. Ct. 2448 (2018) ........................................................................................ 17
`
`
`
`
`
`Kenneally v. Suzuki Motor Co., Ltd.,
`No. M-8-85, 1994 WL 48840 (S.D.N.Y. Feb. 10, 1994) .................................... 15
`
`Kennedy v. Warren,
`66 F.4th 1199 (9th Cir. 2023) .............................................................................. 24
`
`La’Tiejira v. Facebook, Inc.,
`272 F. Supp. 3d 981 (S.D. Tex. 2017)................................................................. 18
`
`Langdon v. Google, Inc.,
`474 F. Supp. 2d 622 (D. Del. 2007) .................................................................... 18
`
`Los Angeles v. Preferred Communications, Inc.,
`476 U.S. 488 (1986) ............................................................................................ 14
`
`Manhattan Community Access Corp. v. Halleck,
`139 S. Ct. 1921 (2019) ........................................................................................ 14
`
`Masterpiece Cakeshop, Ltd. v. Colorado Civil Rights Comm’n,
`138 S. Ct. 1719 (2018) ........................................................................................ 17
`
`Miami Herald Publishing Co. v. Tornillo,
`418 U.S. 241 (1974) .....................................................................................passim
`
`Miller v. California,
`413 U.S. 15 (1973) ................................................................................................ 7
`
`Nat’l Inst. of Family & Life Advocates v. Becerra,
`138 S. Ct. 2361 (2018) ........................................................................................ 17
`
`National Rifle Ass’n v. Vullo,
`49 F.4th 700 (2d Cir. 2022) ................................................................................. 24
`
`NetChoice LLC v. Attorney General, Florida,
`34 F.4th 1196 (11th Cir. 2022) ...................................................................... 18, 19
`
NetChoice, LLC v. Paxton,
`49 F.4th 439 (5th Cir. 2022) .......................................................................... 19, 20
`
`New York Times Co. v. Sullivan,
`376 U.S. 254 (1964) ............................................................................................ 16
`
`
`O’Handley v. Padilla,
`No. 21-CV-07063- CRB, 2022 WL 93625
`(N.D. Cal. Jan. 10, 2022) ..................................................................................... 18
`
`Okwedy v. Molinari,
`333 F.3d 339 (2d Cir. 2003) ................................................................................ 24
`
`Pacific Gas & Elec. Co. v. Pub. Utilities Comm’n of California,
`475 U.S. 1 (1986) ................................................................................................ 18
`
`Prager Univ. v. Google LLC,
`951 F.3d 991 (9th Cir. 2020). .............................................................................. 19
`
`PruneYard Shopping Center v. Robins,
`447 U.S. 74 (1980) .............................................................................................. 19
`
`Rumsfeld v. Forum for Academic & Institutional Rights, Inc.,
`547 U.S. 47 (2006) .............................................................................................. 19
`
`State of Missouri v. Biden,
No. 23-30445, 2023 WL 5821788 (5th Cir. Sept. 8, 2023) ................................ 24
`
`Stossel v. Meta Platforms, Inc.,
`634 F. Supp. 3d 743 (N.D. Cal. 2022)................................................................. 18
`
`Turner Broadcasting Sys., Inc. v. F.C.C.,
`512 U.S. 622 (1994) ............................................................................................ 18
`
`Woodhull Freedom Foundation v. United States,
`72 F.4th 1286 (D.C. Cir. 2023) ........................................................................... 10
`
`Zhang v. Baidu.com Inc.,
10 F. Supp. 3d 433 (S.D.N.Y. 2014) .................................................. 18
`
`Statutes
`
`N.Y. Gen. Bus. Law § 394-ccc................................................................ 3, 20, 22, 23
`
`Other Authorities
`
`A New Policy Against Self-Harm Blogs, Tumblr (Feb. 23, 2012) ............................ 6
`
`Adult Nudity and Sexual Activity, Facebook ............................................................. 7
`
`
`
`
`
`Article 19, Sheikh Jarrah: Facebook and Twitter Silencing Protestors, Deleting
`Evidence (May 10, 2021) .................................................................................... 12
`
`Bennett Cyphers, Cory Doctorow, The New ACCESS Act Is a Good Start.
`Here’s How to Make Sure It Delivers, EFF ........................................................ 21
`
`Comment on Evaluating the Competitive Effects of Corporate Acquisitions and
`Mergers, EFF....................................................................................................... 21
`
`Community Guidelines, Instagram ............................................................................ 7
`
`Community Standards Enforcement Report: Bullying and Harassment, Meta....... 13
`
`Community Standards Enforcement Report: Hate Speech, Meta ........................... 13
`
`Danielle Blunt et al., Posting Into The Void, Hacking//Hustling (2020) ................ 10
`
`Defining the Public Interest, X Help Center ............................................................. 6
`
`Digital Well-being, TikTok ....................................................................................... 6
`
Eric Goldman, Content Moderation Remedies, 28 Mich. Tech. L. Rev. 1 (2021) ....... 6, 8
`
`Facebook Treats Punk Rockers Like Crazy Conspiracy Theorists, Kicks Them
`Offline, EFF ......................................................................................................... 11
`
`Gettr – Terms of Use, Gettr (May 17, 2023) ............................................................. 8
`
Hannah Bloch-Wehba, Automation in Moderation, 53 Cornell Int’l L.J. 41 (2020) ....... 4
`
`Hannah Denham, Another Fake Video of Pelosi Goes Viral on Facebook, Wash.
`Post (Aug. 3, 2020)................................................................................................ 7
`
`Instagram Stands Against Online Bullying, Instagram ............................................. 6
`
`Jack Shafer, The Op-Ed Page’s Back Pages: A Press Scholar Explains
`How the New York Times Op-Ed Page Got Started,
`Slate (Sept. 27, 2010) .......................................................................................... 16
`
`James Grimmelmann, The Virtues of Moderation, 17 Yale J. of Law & Tech 42
`(2015) .................................................................................................................... 5
`
`Jillian C. York & David Greene, How to Put COVID-19 Content Moderation Into
`Context, Brookings’ TechStream (May 21, 2020) ................................................ 4
`
`
`
`
`
`Jillian C. York, Silicon Values: The Future of Free Speech Under Surveillance
`Capitalism (Verso 2021) ....................................................................................... 9
`
`Kate Crawford & Tarleton Gillespie, What Is a Flag For? Social Media Reporting
`Tools and the Vocabulary of Complaint, 18 New Media & Soc’y 410 (2014) ... 13
`
`Kate Klonick, The New Governors: The People, Rules, And Processes Governing
`Online Speech, 131 Harv. L. Rev. 1598 (2018) ............................................ 5, 7, 8
`
`Kevin Anderson, YouTube Suspends Egyptian Blog Activist’s Account, The
`Guardian (Nov. 28, 2007)................................................................................ 9, 11
`
`Malachy Browne, YouTube Removes Videos Showing Atrocities in Syria, N.Y.
`Times (Aug. 22, 2017) ........................................................................................ 11
`
`Megan Farokhmanesh, YouTube Is Still Restricting and Demonetizing LGBT
`Videos—and Adding Anti-LGBT Ads to Some, The Verge (June 4, 2018) ......... 11
`
`Michael J. Socolow, A Profitable Public Sphere: The Creation of the New York
`Times Op-Ed Page, Commc’n & Journalism Fac. Scholarship (2010)............... 16
`
`Mike Masnick, Content Moderation At Scale Is Impossible: Recent Examples Of
`Misunderstanding Context, TechDirt (Feb. 26, 2021) ........................................ 10
`
`Moderator Code of Conduct, Reddit ....................................................................... 12
`
`Op-Ed, Wikipedia .................................................................................................... 16
`
`Pennysaver, Wikipedia ............................................................................................ 17
`
`Policy: Terms of Use, Wikimedia ........................................................................... 12
`
`Promoting Hate Based on Identity or Vulnerability, Reddit ..................................... 8
`
`Reddiquette, Reddit ................................................................................................. 13
`
`Report Content on Facebook, Facebook ................................................................. 13
`
`Role of Administrators and Moderators on Discord, Discord ................................ 12
`
`Rules Enforcement, Twitter (July 28, 2022)............................................................ 14
`
`Ryan Mac, Instagram Censored Posts About One of Islam’s Holiest Mosques,
`Drawing Employee Ire, BuzzFeed News (May 12, 2021) .................................. 11
`
`
`
`
`
`Santa Clara Principles ............................................................................................. 21
`
`Sensitive Media Policy, Twitter (March 2023) ......................................................... 7
`
`Seny Kamara et al., Outside Looking In: Approaches to Content Moderation in
`End-to-End Encrypted Systems, Ctr. for Democracy & Tech. (2021) .................. 5
`
`Taylor Wofford, Twitter Was Flagging Tweets Including the Word “Queer” as
`Potentially “Offensive Content”, Mic (June 22, 2017) ....................................... 12
`
`Twitter Suspends Accounts of Palestinian Quds News Network, Al Jazeera
`(Nov. 2, 2019) ..................................................................................................... 12
`
`Violent and Graphic Content, Meta Transparency Center ........................................ 6
`
`YouTube Community Guidelines Enforcement, Google .......................................... 13
`
`
`
`
`
`
`
`
`
`
`STATEMENT OF INTEREST1
`
`Amici curiae submit this brief to provide additional information and
`
`argument on two issues raised by this appeal that are within amici’s field of
`
`expertise: the nature of content moderation as practiced by social media and other
`
`interactive websites as constitutionally protected editorial and curatorial
`
expression; and the weighty First Amendment concerns raised by governmental
`
`interference with such curatorial expression. Amici curiae do so from the
`
`perspective of the Internet users who read and contribute to such sites, rather than
`
that of the sites themselves, which are represented by plaintiffs.
`
`The Electronic Frontier Foundation (EFF) is a member-supported, nonprofit
`
`civil liberties organization that has worked for over 30 years to protect free speech,
`
`privacy, security, and innovation in the digital world. EFF, with approximately
`
`30,000 members, represents the interests of technology users in court cases and
`
`broader policy debates surrounding the application of law to the Internet and other
`
`technologies. EFF frequently files briefs in cases addressing online intermediary
`
`content moderation, and studies and writes extensively on the issue.
`
`The American Civil Liberties Union (ACLU) is a nationwide, non-partisan,
`
`
`1 No counsel for a party authored this brief in whole or in part, and no person other
`than amici or their counsel has made any monetary contributions intended to fund
`the preparation or submission of this brief. The parties have consented to the filing
`of this brief.
`
`
`
`
`
`
`non-profit organization. The organization is dedicated to defending the principles
`
`embodied in the Constitution and our nation’s civil rights laws and, for over a
`
`century, has been at the forefront of efforts nationwide to protect the full array of
`
`civil rights and liberties, including freedom of speech and freedom of the press
`
`online. The ACLU has frequently appeared before courts throughout the country in
`
`First Amendment cases, both as direct counsel and as amicus curiae.
`
`
`
`
`
`
`
`
`
`
`INTRODUCTION
`
`Section 394-ccc is an unconstitutional intrusion into the editorial freedom
`
`that all publishers, including social media platforms and other websites, enjoy. The
`
`law’s clear purpose is to change the editorial policies and editorial methods of
`
`these publishers to accord with the state’s definition of hateful speech and what the
`
`state perceives to be its attendant harms. While the law purports to empower some
`
`internet users, it also aims to silence others and deny still others access to
`
`information.
`
Content moderation by online intermediaries is an already fraught
`
process, and any form of government intrusion into that process raises
`
`serious First Amendment, and broader human rights, concerns. Courts must
`
`generally scrutinize such interventions. Laws such as section 394-ccc that seek to
`
co-opt a portion of the process must survive First Amendment scrutiny. Section
`
`394-ccc does not. This Court should affirm the district court’s order enjoining the
`
`law’s enforcement.
`
`ARGUMENT
`
`I. The Content Moderation Systems Targeted by the Law Are Editorial
`Processes Protected by the First Amendment.
`
`As a threshold matter, this Court should join the other courts that recognize
`
`that the social media platforms and other websites subject to section 394-ccc have
`
`the First Amendment right to curate the speech of others that is published on their
`
`
`
`
`
`
`sites, regardless of whether they curate a lot or a little, and regardless of whether
`
`their editorial philosophy is readily discernible or consistently applied.
`
`A. Content Moderation Is an Inherently Expressive Editorial
`Process.
`
`1.
`
`Content Moderation Is an Historic and Widely Employed
`Practice.
`
Social media platforms, at least since their mass adoption, have
`
`rarely published all legal speech submitted to their sites. Instead, they engage in
`
`content moderation: the use of policies, systems, and tools to decide what user-
`
`generated content or accounts to publish, remove, amplify, or manage.2 Large-
`
`scale, outsourced content moderation first emerged in the early 2000s.3
`
`Platforms practice content moderation in phases: they define permissible and
`
`impermissible content; detect content that may violate their policies or the law;
`
`evaluate that content to determine whether it in fact violates their policies or the
`
`law; take an enforcement action against violative content; allow users to appeal or
`
`otherwise seek review of content moderation decisions that they believe are
`
`erroneous; and educate users about content moderation policies and their
`
`
`2 See Hannah Bloch-Wehba, Automation in Moderation, 53 Cornell Int’l L.J. 41,
`42, 48 (2020).
`3 Jillian C. York & David Greene, How to Put COVID-19 Content Moderation Into
`Context, Brookings’ TechStream (May 21, 2020),
`https://www.brookings.edu/articles/how-to-put-covid-19-content-moderation-into-
`context/.
`
`
`
`
`
`
`enforcement.4 In each phase, platforms make editorial judgments about what
`
`content they wish to allow or forbid on their services, or how to display or arrange
`
`it.
`
`For example, during the definitional phase, some platforms develop a
`
`content policy, i.e., a set of rules about what content is and is not allowed on their
`
`platforms.5 Platforms may engage in significant internal discussion and debate,
`
`conduct internal and external research, and write multiple drafts before
`
`determining their content policies.6 Smaller platforms might not adopt formal
`
`policies at all—particularly, for example, platforms like blogs, where the small
`
`community size could make it manageable for a single blogger to moderate the
`
`platform via ad hoc decision-making.7 Many platforms publish their content
`
`policies, but others do not, in order to maintain flexibility and for other reasons.
`
`
`4 Seny Kamara et al., Outside Looking In: Approaches to Content Moderation in
`End-to-End Encrypted Systems, Ctr. for Democracy & Tech. 9–11 (2021),
`https://cdt.org/wp-content/uploads/2021/08/CDT-Outside-Looking-In-Approaches-
`to-Content-Moderation-in-End-to-End-Encrypted-Systems-updated-20220113.pdf.
`5 Id. at 9.
`6 See Kate Klonick, The New Governors: The People, Rules, And Processes
`Governing Online Speech, 131 Harv. L. Rev. 1598, 1631-35 (2018).
`7 See James Grimmelmann, The Virtues of Moderation, 17 Yale J. of Law & Tech
`42, 73 (2015) (“[T]he larger a community is, the better it is at competing with
`external alternatives, but the more internal moderation it requires . . . . As a
`community grows, it becomes easier for individuals and groups to resist a norm.
`This breakdown makes it harder to use social norms to moderate large
`communities. A group of twenty can operate by unspoken consensus in a way that
`a group of twenty thousand cannot.”).
`
`
`
`
`
`
`Once a platform has decided, perhaps after deliberation and debate, during
`
`the evaluation phase that particular content violates its policies, the platform must
`
`decide what action to take in the enforcement phase. That is not a binary decision
`
`about whether to take down content or allow it to remain on a service, but also
`
`includes whether to change the manner or place in which content is displayed or to
`
`add the platform’s own affirmative speech.8 For example, depending on the nature
`
`of the violative content, a platform might choose to add its own content warning,
`
`public service announcement, health or safety resources, or “fact-checking”
`
`information to provide context and support for its users.9
`
`
`8 See Eric Goldman, Content Moderation Remedies, 28 Mich. Tech. L. Rev. 1, 23–
`39 (2021) (describing various enforcement options).
`9 Id. at 26-27, 30-31. See also, e.g., Instagram Stands Against Online Bullying,
`Instagram, https://about.instagram.com/community/anti-bullying (last visited Sept.
`22, 2023) (warnings before posting “potentially offensive” comments); Digital
`Well-being, TikTok, https://www.tiktok.com/safety/en-gb/well-being/ (last visited
`Sept. 22, 2023) (digital well-being and media literacy resource guide); Violent and
`Graphic Content, Meta Transparency Center,
`https://transparency.fb.com/policies/community-standards/violent-graphic-content/
`(last visited Sept. 22, 2023) (warning labels for graphic content); Defining the
`Public Interest, X Help Center, https://help.twitter.com/en/rules-and-
`policies/public-interest (last visited Sept. 21, 2023) (public click-through notice
`added to posts by government official that violate content policies and would
`otherwise be taken down, but are left up under X’s public interest exception); A
`New Policy Against Self-Harm Blogs, Tumblr (Feb. 23, 2012),
`https://staff.tumblr.com/post/18132624829/self-harm-blogs (proposing adding
`PSAs to self-harm-related search results).
`
`
`
`
`
`
`2.
`
`Social Media Platforms Have Rules, Standards and
`Guidelines about What Content They Want and Don’t
`Want on Their Sites.
`
`Social media platforms’ content policies commonly prohibit users from
`
`posting speech that a platform believes is detrimental to its users and the public, its
`
`business interests, its editorial preferences, or all of these, even if that speech is
`
`legal. For example, many platforms ban legal, non-obscene sexual content, even
`
`though such speech enjoys First Amendment protection, see Miller v. California,
`
`413 U.S. 15 (1973).10
`
`Content moderation differs from platform to platform.11 Some platforms
`
`detect potentially violating content only after it is posted; others screen some or all
`
`content ex ante.12 Platforms make different judgment calls about whether particular
`
`content violates their content policies, even if those policies are similar.13 They use
`
`
`10 See, e.g., Adult Nudity and Sexual Activity, Facebook,
`https://transparency.fb.com/policies/community-standards/adult-nudity-sexual-
`activity/ (last visited Sept. 21, 2023).
`11 Compare Community Guidelines, Instagram,
`https://help.instagram.com/477434105621119 (last visited Sept. 21, 2023)
`(prohibiting nudity except in the context of breastfeeding, birth-related moments,
`health-related situations, in paintings or sculptures, or as an act of protest), with
`Sensitive Media Policy, Twitter (March 2023), https://help.twitter.com/en/rules-
`and-policies/media-policy (permitting “consensually produced adult nudity”).
`12 Klonick, supra n.6, at 1635.
`13 See, e.g., Hannah Denham, Another Fake Video of Pelosi Goes Viral on
`Facebook, Wash. Post (Aug. 3, 2020),
`https://www.washingtonpost.com/technology/2020/08/03/nancy-pelosi-fake-video-
`facebook/ (reporting that TikTok, Twitter and YouTube removed a doctored video
`of Rep. Nancy Pelosi, while Facebook allowed it to remain with a label).
`
`
`
`
`
`
`different methods to enforce their content policies, such as labeling content,
`
`placing interstitial warnings over it, or removing the ability to make money from
`
`it.14 Some platforms allow users to appeal content moderation decisions, while
`
`others do not.15
`
`Although many platforms choose to prohibit speech expressing hatred based
`
`on race, ethnicity, religion, and other characteristics, they do so in divergent
`
`manners. For example, social media platform Gettr explains that it “holds freedom
`
`of speech as its core value and does not wish to censor your opinions,” while at the
`
`same time reserving the right to “address” content that attacks any religion or
`
`race.16 Reddit’s content policy prohibits content that promotes “hate based on
`
`identity or vulnerability,” including race, religion, and national origin.17
`
`3.
`
`Content Moderation Has Long Been and Remains a
`Fraught and Controversial Process.
`
Content moderation controversies are nothing new.
`
`In 2007, YouTube, only two years old at the time, shut down the account of
`
`Egyptian human rights activist Wael Abbas after receiving multiple reports that the
`
`
`14 Goldman, supra n.8, at 23–39.
`15 Klonick, supra n.6, at 1648.
`16 Gettr – Terms of Use, Gettr (May 17, 2023), https://gettr.com/terms.
`17 Promoting Hate Based on Identity or Vulnerability, Reddit,
`https://www.reddithelp.com/hc/en-us/articles/360045715951 (last visited Sept. 21,
`2023).
`
`
`
`
`
`
`account featured graphic videos of police brutality and torture.18 YouTube’s
`
`community standards at the time stated that “[g]raphic or gratuitous violence is not
`
`allowed.”19 Just one year before, Abbas became the first blogger to receive the
`
`Knight International Journalism Award.20
`
And government attempts to influence content moderation date back just
`
`as far: Abbas’s account was restored only after the U.S. State Department
`
`communicated with YouTube’s new owner, Google.21
`
`Content moderation remains a difficult and often fraught process that even
`
`the largest and best-resourced social media companies struggle with, often to the
`
`frustration of users. Even when using a set of precise rules or carefully articulated
`
`“community standards,” moderated platforms often struggle to draw workable
`
`lines between permitted and forbidden speech. Every online forum for user speech,
`
`not just the dominant social media platforms, struggles with this problem.
`
`Platforms’ content moderation decisions are thus sometimes inconsistent or
`
`seemingly contrary to their own policies. Some of that is inevitable. Given the
`
`
`18 Kevin Anderson, YouTube Suspends Egyptian Blog Activist’s Account, Guardian
`(Nov. 28, 2007),
`https://www.theguardian.com/news/blog/2007/nov/28/youtubesuspendsegyptianbl
`og.
`19 Id.
`20 Jillian C. York, Silicon Values: The Future of Free Speech Under Surveillance
`Capitalism 25-27 (Verso 2021).
`21 Id.
`
`
`
`
`
`
`staggering amounts of content posted on platforms every day and the subjective
`
`judgment calls that some content moderation decisions require, platforms make
`
`mistakes in either moderating or failing to moderate content.22
`
`Beyond mistakes, platforms have often aggressively removed content that is
`
`not prohibited by their content policies, especially when attempting to minimize
`
`legal or reputational risks arising from government regulation or criticism. For
`
`example, many platforms responded to the enactment of the Allow States and
`
`Victims to Fight Online Sex Trafficking Act/Stop Enabling Sex Traffickers Act
`
`(“FOSTA”) by removing content by sex workers and sex worker advocates that is
`
`not actually prohibited by FOSTA.23 See Woodhull Freedom Foundation v. United
`
`States, 72 F.4th 1286, 1299-1305 (D.C. Cir. 2023) (describing FOSTA’s very
`
`limited prohibitions). Government pressure to remove terrorist content from
`
`platforms has also led to over-removals of speech. For instance, in 2021, Instagram
`
`removed posts about one of Islam’s holiest mosques, Al-Aqsa, because its name is
`
`contained within the name of an organization the company had designated as a
`
`
`22 See Mike Masnick, Content Moderation At Scale Is Impossible: Recent
`Examples Of Misunderstanding Context, TechDirt (Feb. 26, 2021),
`https://www.techdirt.com/2021/02/26/content-moderation-scale-is-impossible-
`recent-examples-misunderstanding-context/.
`23 See Danielle Blunt et al., Posting Into The Void, Hacking//Hustling (2020),
`https://hackinghustling.org/wp-content/uploads/2020/09/Posting-Into-the-Void.pdf.
`
`
`
`
`
`
terrorist organization.
