Key Takeaways:
The Internet allows businesses to reach global audiences in ways never before possible. User-generated content can increase trust, facilitate transactions, and open new markets for anyone with Internet access.
Section 230 of the Communications Decency Act (CDA) enables online platforms to host user-generated content without being held legally responsible for the speech of their users.
What is Section 230 and why is it important?
The growth of the Internet has allowed anyone with a connected device to post practically their entire life online, and many people enjoy doing just that. Often described as the “26 words that created the Internet,” Section 230 of the CDA, adopted in 1996, shields websites from liability for user-generated speech and gives platforms breathing room to find and remove objectionable content without fear of burdensome litigation. Section 230 also establishes a uniform national regulatory regime rather than a 50-state patchwork, prevents frivolous litigation, and empowers platforms to proactively monitor for objectionable content.
Why is Section 230 crucial for startups?
For a startup, Section 230 ensures that a website can give users a forum to express themselves freely without facing ruinous legal liability if a bad actor posts something illegal. This means a startup can launch with fewer lawyers and content moderators. Section 230 also fosters innovation by reducing regulatory burdens and enabling free expression. Those protections are a key reason the United States has been home to the vast majority of top Internet companies. And startups continue to change the way we interact with one another: look at your phone and you’ll see apps from startups less than five years old that have reinvented the way we share photos, send money, date, order food, and rent our homes. All of these apps rely on user-generated content, and Section 230 has facilitated their growth in multiple ways.
Where are we now?
Some regulators have suggested rolling back Section 230’s free speech protections in an attempt to hold platforms liable for content posted by bad actors online, since it is easier to tell a social media platform to remove posts than to prosecute those responsible for the underlying crimes or libelous content. But making tech companies, including startups, the content police could have a significant chilling effect on online innovation and hurt the platforms that so many consumers have come to depend on. Instead of weakening a law that has been essential to digital innovation and freedom of expression, law enforcement should use existing tools to address the harms criminals cause online. Policymakers need to understand and fully weigh the risks of changing laws that enable user-generated content, as well as the costs for small businesses of policing the material posted on their websites and platforms.