Content moderation in the wake of mass shootings

What’s happening this week: In the wake of two tragic mass shootings over the weekend, officials are once again turning their attention to the spread of problematic content on the Internet after hateful messages posted by the El Paso shooter were discovered on the online message board 8chan. President Donald Trump yesterday called on social media platforms to work with the Justice Department to identify potential mass shooters, saying that “we must shine light on the dark recesses of the Internet and stop mass murders before they start.”

Early Monday morning, Cloudflare CEO Matthew Prince announced in a company blog post that Cloudflare would no longer provide services to 8chan, a decision that temporarily took the website offline.

Why it matters to startups: Internet platforms of all sizes have to grapple with the incredibly difficult problem of content moderation. The Internet lets users post virtually unlimited amounts of content instantaneously, making it impossible for websites to identify and delete all illegal or otherwise objectionable content. That’s especially true for startups, which lack the resources to hire thousands of content moderators.

Section 230 of the Communications Decency Act recognizes the tension platforms face when trying to moderate legal but problematic content. Section 230 gives websites a narrow safe harbor from legal liability for 1) deleting content they find objectionable and 2) failing to delete user content that the websites did not themselves create. Critics of Section 230 assert that this safe harbor gives websites too much freedom to permit objectionable content—from hate speech to political misinformation to cyber harassment—to flourish on their platforms. But the law was specifically written to give companies like Cloudflare the ability to moderate user content as they see fit—in this case, refusing to provide service to 8chan—without worrying about whether they’ll face lawsuits for their moderation decisions. If a website could be sued whenever it decides to delete content, the threat of legal liability would force that site to host objectionable content it doesn’t want on its platform.

For months, policymakers have called for reforms to Section 230, but, given the practical realities of content moderation, most of the proposed changes would do little to curb problematic online speech, and many would actually increase it. The problem is complicated by the fact that much objectionable speech isn’t actually legally actionable, including hate speech and some political misinformation. Changing Section 230 to get at objectionable but legal speech could discourage platforms from the kind of moderation they already do, or make it impossible to host user content at all without fear of being sued over anything a user shares. As evidenced by Cloudflare’s response this week, preserving Section 230’s protections is the best way to ensure that platforms can continue to exist and moderate objectionable content.

On the Horizon

  • Congressional Startup Day, a nationwide celebration of entrepreneurial communities organized by Engine and other nonprofit organizations, is on Wednesday, August 21st. During Congressional Startup Day, we help lawmakers meet with startups in their districts to learn more about how government can work with startups to support innovation and entrepreneurship. Congressional Startup Day events will also be held throughout the week of August 19-23.