States Push Unworkable, Likely Unconstitutional Content Moderation Bills
TLDR: Across the country, state legislators are pushing bills that would make it harder for Internet companies to moderate content on their sites and services. In addition to posing constitutional and legal problems, the state-level push to limit moderation would disproportionately harm small and emerging companies attempting to compete in the Internet ecosystem.
What’s Happening This Week: State lawmakers across the country are pushing a variety of bills that would limit social media companies’ ability to moderate user-generated content on their sites and services. The state-level legislative efforts come as policymakers continue to criticize the content moderation practices of big tech companies, with lawmakers framing many of their complaints around Section 230—a bedrock Internet law that lets Internet companies of all sizes host and moderate user-generated content without being held liable for that content.
Republican lawmakers in a handful of states—seemingly driven by unfounded claims of anti-conservative bias on the part of Internet companies—have led the charge in introducing legislation that would make it harder for companies to engage in content moderation. Florida Gov. Ron DeSantis (R) recently proposed legislation that would require companies to host all politicians’ speech during elections and limit how companies enforce and update their content moderation policies. Yesterday, a Florida House panel advanced the bill in a largely party-line vote.
Similar GOP-led legislative efforts in Kentucky, North Dakota, and Texas also attempt to limit how companies moderate content or to punish companies that moderate content in ways lawmakers dislike. Democrats in Colorado, meanwhile, have introduced legislation that would open the door for investigations into Internet companies that host “hate speech,” “disinformation,” “conspiracy theories,” and more—most of which is constitutionally protected speech that is incredibly difficult, if not impossible, to find and moderate at scale. And, earlier this month, Utah lawmakers approved legislation that ignores the realities of content moderation: it would prohibit “inequitable” content moderation and create burdensome transparency and appeals requirements.
While many of the states’ content moderation proposals are likely unconstitutional, these efforts signal a willingness on the part of policymakers to make it more difficult for Internet companies to host user-generated content.
Why it Matters to Startups: Policymakers have framed their criticisms of Section 230 and content moderation around the actions of the largest tech companies, but state-level efforts to limit Internet companies’ ability to moderate user-generated content would have an outsized impact on startups.
While some Democrats claim that Internet companies are not doing enough to moderate harmful content on their websites, many Republicans say companies are taking down too much content. These opposing concerns fail to account for the inherent difficulties of content moderation for even the largest online companies, let alone startups. Content moderation will always be imperfect, especially as companies scale up. Teams of human moderators and technological moderation tools can never work with 100 percent accuracy—sometimes removing content that doesn’t violate acceptable use policies, sometimes failing to identify content that does—and both are expensive to build and maintain. The current legal framework—created by Section 230 as well as the First Amendment—encourages companies to devote the time and resources they have to moderating content in ways that make sense for their users, without having to worry that they’ll be held liable for what they don’t catch.
Even with Section 230’s liability limitations, a startup can still spend tens of thousands of dollars to get a lawsuit over user content dismissed. Removing those bedrock liability limitations would push such costs into the hundreds of thousands of dollars, meaning that just one lawsuit could bankrupt a small company before it even gets off the ground. And state-level efforts to limit content moderation and layer new obligations on top of Section 230 would only make it more difficult for startups with small teams to expand.
State lawmakers should drop their problematic and likely unconstitutional attempts to dictate Internet companies’ content moderation practices. Instead, policymakers should work with the startup community to better understand the importance of current policies and legal frameworks to competition and startup growth.
On the Horizon.
ITIF's Center for Data Innovation is holding a webinar tomorrow at 10 a.m. to discuss ways of using data to fight e-commerce counterfeits.
The Senate Budget Committee is holding a hearing at 11 a.m. tomorrow to examine America’s income and wealth inequality crisis.
The House Financial Services Subcommittee on Diversity and Inclusion is holding a hearing this Thursday at 10 a.m. to discuss “How Diversity Data Can Measure Commitment to Diversity, Equity and Inclusion.”
Join Engine next Wednesday, March 24th at 4 p.m. for a virtual policy seminar on Section 512 of the DMCA. We’ll discuss how Section 512 impacts startups and how entrepreneurs can get involved in the policy debate. You can RSVP here.