Intermediary liability protections have allowed startups to thrive

TLDR: As a House panel this morning prepares to discuss the merits of critical intermediary liability protections for online platforms, it’s important for lawmakers to understand the positive impact that these rules have on Internet platforms of all sizes, particularly startups. 

What’s happening this week: Two House Energy and Commerce subcommittees are holding a joint hearing this morning to discuss ways of “fostering a healthier Internet to protect consumers.”

In a memorandum to the participating subcommittee members, Chairman Frank Pallone (D-N.J.) highlighted content moderation practices and Section 230 of the Communications Decency Act as specific topics that will be discussed during the hearing. 

Six witnesses—including Reddit co-founder and CEO Steve Huffman—are scheduled to appear before the committee. It was reported last week that U.S. Trade Representative Robert Lighthizer declined an invitation from the panel to discuss the inclusion of Section 230-like language in international trade agreements.

Some lawmakers have criticized the inclusion of these critical intermediary liability provisions in recent trade agreements, such as the new United States-Mexico-Canada Agreement (USMCA) and the recently signed trade deal with Japan.

Why it matters to startups: Section 230 of the Communications Decency Act provides U.S. companies of all sizes with the flexibility to moderate content on their platforms while protecting them from ruinous litigation that might arise from user-generated content shared on their sites. 

Section 230 is especially crucial for startups because small online platforms with bootstrap budgets and few legal resources are far less equipped to take on costly lawsuits over user content than larger industry players. According to a report we released last year, it can cost up to $80,000 just to dismiss a meritless lawsuit against a platform over its content moderation practices, and that’s with the current Section 230 framework in place. While a large Internet company can likely handle $80,000 in legal costs, that could wipe out the budget of a startup still getting off the ground.

At the same time, the costs of content moderation are more easily shouldered by large tech companies with massive workforces. While a large company can hire tens of thousands of people to manually review user posts or spend millions on an algorithm to detect exact copies of already-flagged content, those kinds of outlays would be impossible for a small platform. Even with all of their resources, companies that can afford an army of content moderators or expensive detection tools still can’t achieve perfect content moderation; an algorithm can’t account for things like variation, context, and nuance, and human content moderators will still disagree on cases where user content is at the edge of what the platform permits.

As we explained in an op-ed out this morning, “Without Section 230, a startup that hosts user content would have to use its scarce resources to build out prohibitively expensive and largely ineffective moderation tools and be prepared to spend hundreds of thousands of dollars fighting lawsuits any time one user thinks another user’s content should be taken down, and any time a user disagrees when the platform removes that user’s speech.”

Despite the critical nature of this law for so much of what the Internet has to offer today, there are several misunderstandings among policymakers about the protections the law provides, especially around illegal content. Not only does Section 230 provide protections for platforms that moderate their users’ content, thereby incentivizing responsible moderation, but the law also does not protect platforms that violate federal law or substantially contribute to content that violates federal law. This is one of the many misperceptions Engine debunked in a recent report produced with the Charles Koch Institute on “The Nuts and Bolts of Content Moderation.” 

Some lawmakers have also recently called into question the inclusion of intermediary liability provisions in recent trade deals, including the United States-Mexico-Canada, or USMCA, agreement. As we noted in our op-ed this morning, the inclusion of Section 230-like language in international trade agreements simply ensures “that companies of all sizes can more seamlessly compete across the world.”

“Including intermediary liability protections in trade agreements isn’t a gift to ‘big tech,’” we wrote. “It’s to the benefit of platforms of all sizes, especially startups, and Internet users looking to create and share content around the world.”

On the Horizon.

  • Engine and the Charles Koch Institute will be holding the final panel in our series on the nuts and bolts of encryption this Friday at noon. We’ll be looking at the communities that would be impacted by law enforcement’s requests for backdoor access to encrypted services, including a “game” where attendees will wade through the costs and benefits of building intentional vulnerabilities.

  • The House Financial Services Committee is scheduled to hold a hearing at 10 am next Wednesday, Oct. 23, to examine Facebook’s proposed Libra cryptocurrency. Facebook CEO Mark Zuckerberg will be testifying at the hearing.

  • The House Small Business Committee’s Subcommittee on Economic Growth is scheduled to hold a hearing this Thursday at 10 am to discuss “what prospects the opportunity zones enacted in the Tax Cuts and Jobs Act provide for small businesses and local economic development.” The full House panel previously announced plans to invite officials from Amazon, Google, and Facebook to a hearing later this fall to discuss their impact on small businesses.