Meta’s Fact-Check Farewell and the Section 230 Shield in a Misinformation Minefield

By Emily Pecknay; Photo Credit: REUTERS/Dado Ruvic

Last week, Meta CEO Mark Zuckerberg announced that the company would be ending its longstanding fact-checking program in the U.S., signaling a shift away from censorship on the eve of the change in administration.[1] The social media conglomerate will replace independent fact-checkers with user-generated “community notes,” similar to the system employed on X, formerly Twitter.[2]

Meta, which owns Facebook, Instagram, and WhatsApp, launched its original fact-checking program on Facebook in the wake of the heated 2016 presidential election.[3] The system routed content to third-party fact-checkers, which would verify its accuracy, assign it a “content rating,” such as “False,” “Altered,” “Partly False,” “Missing Context,” “Satire,” or “True,” and add relevant notices to the content.[4] If a piece of content was labeled as “False,” “Altered,” or “Partly False,” it was demoted and subsequently featured less prominently in users’ feeds.[5] Fact-checkers reviewed all forms of content on Meta’s platforms, including “ads, articles, photos, videos, Reels, audio and text-only posts.”[6]

In the announcement, Meta stated that despite being well-intentioned, this complex system had “gone too far,” “too many mistakes” were made, and “too much harmless content gets censored.”[7] Indeed, Meta has been subject to legal scrutiny in recent years for the labels and fact-checks this system affixed to some aggrieved users’ posts and content.[8] Most notably, Meta’s moderation practices were at the center of the U.S. Supreme Court’s decision in Murthy v. Missouri, which arose out of alleged pressure by the Biden administration on Meta and other social media platforms to suppress certain COVID-19-related speech.[9]

The new “community notes” program is fundamentally a crowd-sourcing system in which approved users add notes supplying context or clarification to specific posts, and other users “with a range of perspectives” must vote on the note to decide its utility.[10] Moreover, Meta stated that it will redirect its enforcement efforts toward moderating “illegal and high-severity violations” such as terrorism, child sexual exploitation, drugs, fraud, and scams.[11]

This overhaul will undoubtedly open the floodgates to an onslaught of dangerous misinformation on Meta-owned platforms, a fact that Zuckerberg himself acknowledged when he stated that there would be a “trade-off” and the platform would subsequently “catch less bad stuff.”[12] It is foreseeable that Meta will ultimately face an increase in litigation from users alleging harm caused by such misinformation or by Meta’s failure to moderate it. What current users may not realize, however, is that such suits are almost guaranteed to fail due to the protections afforded by Section 230 of the Communications Decency Act.[13] Section 230(c) shields “interactive computer services” (ICSs) such as Facebook and Instagram from liability for third-party content and for any measures taken to censor such content.[14] Courts have extended this immunity to both the presence and absence of content moderation by ICSs, creating an essentially insurmountable statutory barrier for plaintiffs.[15]

Ultimately, while Meta’s decision to abandon its fact-checking system may invite criticism and a new wave of legal challenges, the shield of Section 230 will ensure that it remains insulated from liability, leaving users to grapple with the consequences of a platform increasingly shaped by unverified, community-policed content.

 

Emily Pecknay is a 2L at Vanderbilt Law School. She plans on focusing on Litigation and White Collar Crime after law school.

 

[1] Claire Duffy, Meta Is Getting Rid of Fact Checkers. Zuckerberg Acknowledged More Harmful Content Will Appear on the Platforms Now, CNN (Jan. 7, 2025), https://www.cnn.com/2025/01/07/tech/meta-censorship-moderation/index.html.

[2] Id.

[3] Mike Isaac & Theodore Schleifer, Meta Says It Will End Its Fact-Checking Program on Social Media Posts, N.Y. Times (Jan. 9, 2025), https://www.nytimes.com/live/2025/01/07/business/meta-fact-checking.

[4] Content Ratings Fact-Checkers Use, Meta (May 10, 2024), https://transparency.meta.com/en-gb/features/content-ratings-fact-checkers-use/.

[5] Stossel v. Meta Platforms, Inc., 634 F. Supp. 3d 743, 749 (N.D. Cal. 2022).

[6] Bruna Horvath, Jason Abbruzzese & Ben Goggin, Meta Is Ending Its Fact-Checking Program in Favor of a ‘Community Notes’ System Similar to X’s, NBC News (Jan. 7, 2025), https://www.nbcnews.com/tech/social-media/meta-ends-fact-checking-program-community-notes-x-rcna186468.

[7] Joel Kaplan, More Speech and Fewer Mistakes, Meta (Jan. 7, 2025), https://about.fb.com/news/2025/01/meta-more-speech-fewer-mistakes/.

[8] Stossel, 634 F. Supp. 3d at 756–59 (holding, in part, that fact-checking labels applied by Meta to journalist’s videos about climate change and forest management were not defamatory because they reflected an assessment of the videos, not false statements of objective fact); Children’s Health Def. v. Meta Platforms, Inc., 112 F.4th 742, 761–63 (9th Cir. 2024) (holding, in part, that Section 230 did not operate to transform Meta’s moderation of vaccine-related content, including labeling posts as misinformation and suspending accounts, into state action).

[9] Murthy v. Missouri, 603 U.S. 43, 48–53 (2024).

[10] Anne Marie Lee, What Is Community Notes, and How Will It Work on Facebook and Instagram?, CBS News (Jan. 10, 2025), https://www.cbsnews.com/news/what-is-community-notes-twitter-x-facebook-instagram/; Kaplan, supra note 7.

[11] Kaplan, supra note 7.

[12] Duffy, supra note 1.

[13] See 47 U.S.C. § 230.

[14] See id.; Lillian H. Rucker, The End of an Era: The Uncertain Future of Section 230 Immunity for Social Media Platforms, 26 Vand. J. Ent. & Tech. L. 241, 254–55 (2023) (“interactive computer services” has since been interpreted to include social media platforms including Facebook and Twitter).

[15] Nina I. Brown & Jonathan Peters, Say This, Not That: Government Regulation and Control of Social Media, 68 Syracuse L. Rev. 521, 538 (2018).
