As Congress Politicizes Section 230, Startup Concerns Get Left Out
TLDR: Many members of Congress are calling for changes to Section 230—a bedrock Internet law that allows Internet platforms to host and moderate user content without having to worry about ruinous lawsuits. But Democrats and Republicans see very different problems with Section 230 and Internet platforms' content moderation practices, and most policymakers’ criticisms focus exclusively on a handful of large companies while ignoring the outsized impact that changes to Section 230 would have on startups.
What’s Happening This Week: Two different Senate panels are attempting to tackle the issue of Section 230 and content moderation this week through the narrow lens of big tech and alleged anti-conservative bias. On Thursday, the Senate Commerce Committee is holding an executive session to discuss issuing subpoenas to force the CEOs of Facebook, Google, and Twitter to testify at a hearing on Section 230 next month. The subpoenas come after months of largely unsupported claims from Republicans that Internet platforms unfairly censor conservative voices—claims that intensified after Twitter fact-checked President Donald Trump’s tweets containing misinformation about mail-in ballots earlier this year. Last week, Sen. Maria Cantwell (D-Wash.)—the top Democrat on the Senate Commerce Committee—called the threat of subpoenas “a partisan effort” and an “attempt to chill the efforts of these companies to remove lies, harassment, and intimidation from their platforms.”
Also on Thursday, members of the Senate Judiciary Committee plan to hold a mark-up on the Online Content Policy Modernization Act (OCPMA)—legislation that would increase abusive copyright litigation and make it easier to sue nascent startups out of existence as a result of their moderation decisions. Despite not hearing from any stakeholders about the bill, members of the committee are poised to rush through legislation that would change fundamental protections for startups and their users.
The criticisms of Section 230 and some of the reform proposals mirror those coming from the administration. Just last week, the Department of Justice sent Congress proposed legislation that would make platforms liable for moderating users’ content. And President Trump issued a May 28 executive order on “preventing online censorship” that asks the Federal Communications Commission to consider whether Internet firms should be allowed to qualify for Section 230’s liability limitations.
Why it Matters to Startups: Any changes to Section 230 as a result of these politically motivated attacks would disproportionately impact the startup community. Content moderation is a difficult task for even the largest Internet companies, but it’s especially difficult for small companies that can’t hire armies of human content moderators or invest in inherently limited content moderation technologies. Section 230 secures the long-term growth of these nascent firms by providing them with the ability to host and moderate user-generated content in a way that best serves their users and minimizes the threat of potentially devastating lawsuits.
Even though companies can already spend tens of thousands of dollars just to dismiss meritless lawsuits over user content, rolling back Section 230 to pressure companies into removing less content—as envisioned by the Senate Judiciary Committee’s OCPMA—could force Internet platforms to pay hundreds of thousands of dollars in legal fees. While these legislative efforts are ostensibly targeted at the largest social media companies, it’s the smallest Internet platforms that will be harmed the most by any changes to the law. And with state officials, federal agencies, and Congress already asking questions about competition in the online marketplace, it would be counterproductive for them to crack down on perceived anti-competitive practices while also changing the existing liability framework in a way that makes it tougher for new companies to compete.
As we noted in a recent blog post, the legislation being considered this Thursday by the Senate Judiciary Committee would essentially force Internet platforms to choose between engaging in expensive and time-consuming lawsuits or leaving up user-generated posts, videos, or photos that they would otherwise take down. While Section 230 currently includes liability limitations for platforms that make a “good faith” effort to remove problematic content, OCPMA would make it easier to sue startups for removing content unless they can show they have an “objectively reasonable belief” that the content is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or unlawful, or that the content promotes self-harm or terrorism.”
By politicizing Section 230 and using it as a cudgel to attack the largest tech firms, policymakers are opening the door to proposals that would harm small platforms. If policymakers truly care about promoting digital competition and combating harmful online content, then they would work with the tech community—especially startups—to defend Section 230 and avoid passing legislation that would allow only the largest firms to host users’ content.
On the Horizon:
The House Antitrust Subcommittee is holding a hearing tomorrow at noon to discuss the Copyright Office’s report on Section 512 of the DMCA.
The House antitrust panel is also holding a hearing this Thursday at 1 p.m. to discuss “proposals to strengthen the antitrust laws and restore competition online.”
New America’s Open Technology Institute is holding a virtual panel at 1:30 p.m. this Thursday to discuss how Internet platforms are working to address the spread of election misinformation on their websites.