Startup Perspective Critical in Section 230 Review
TLDR: As the Justice Department convenes a public workshop today to discuss Section 230 of the Communications Decency Act, it’s important for policymakers to understand how the law’s critical intermediary liability protections have allowed Internet platforms of all sizes to grow and succeed.
What’s Happening This Week: The Department of Justice is holding a public workshop this morning to discuss Section 230 of the Communications Decency Act, a law that has given startups and Internet giants alike the freedom to moderate user-generated content shared on their sites without fear of potentially ruinous legal liability.
The three-panel workshop, which will include introductory remarks from Attorney General William Barr and FBI Director Christopher Wray, will discuss “the evolution of Section 230 from its original purpose in granting limited immunity to Internet companies, its impact on the American people, and whether improvements to the law should be made.” Some policymakers have attacked the intermediary liability protections for allowing harmful content to proliferate online, blaming Section 230 for all sorts of societal ills such as online opioid sales, child exploitation, and the spread of terrorist propaganda. In reality, the opposite is true: Section 230 is what allows companies to moderate content on their platforms, helping them diminish a wide range of problematic but legal content.
Why it Matters to Startups: Section 230’s limitation on liability is especially critical for startups with small teams, limited budgets, and little access to legal resources. As we noted in a report released last year, it can cost a startup as much as $80,000 to fight even a baseless lawsuit over its content moderation practices. While a larger Internet company might be able to absorb the cost of these lawsuits, the average startup launches with roughly $78,000 in outside funding, meaning that a single lawsuit could bankrupt a company before it even has the opportunity to get off the ground.
Beyond empowering platforms to proactively moderate problematic content, Section 230’s protections play an important role in promoting online competition. Large platforms have the resources to employ teams of content moderators to review user posts for illegal content, and they can more easily bear the litigation costs when illegal content inevitably slips through. Small startup platforms simply cannot afford to hire large moderation teams, making it impossible for them to review every piece of user-generated content. Automated content filtering tools are prohibitively expensive for startups, and they frequently fail to work reliably or effectively because they cannot detect nuance and context, while human reviewers are often forced to evaluate thousands of pieces of content in an expedited fashion. If even the largest tech companies cannot effectively police all user-generated content, it is unrealistic to expect startups to do so.
Despite the benefits of Section 230, policymakers and officials have many misconceptions about the role of intermediary liability protections and the ability of platforms to combat online harms, particularly illegal content such as child exploitation. One of the panels at today’s DOJ workshop will examine “whether Section 230 encourages or discourages platforms to address online harms.” But as Engine noted in our recent report with the Charles Koch Institute on “The Nuts and Bolts of Content Moderation,” Section 230 is what enables platforms to address problematic content uploaded by users. As the DOJ continues reviewing Section 230, it’s important for policymakers to be aware of the role these intermediary liability protections have played in allowing platforms, especially startups, to host and moderate user content.
On the Horizon.
The Center for Strategic and International Studies is holding an event at 2 p.m. this afternoon to discuss the NIST privacy framework with Dr. Walter Copan, NIST Director and Under Secretary of Commerce for Standards and Technology.