At Engine, we believe that if policymakers could see how difficult it is to launch and run a startup, they'd be as invested in making sure the policy landscape works for startups as we are. That's why, last year, we launched The Startup Trail, a video game that puts players in the shoes of startup founders who have to make tough choices with limited time and resources.
Now we’ve worked with Copia and Leveraged Play to launch Moderator Mayhem, a video game that has players navigate the inherent tradeoffs around content moderation.
If you’ve shared or consumed user content online, odds are you’ve experienced content moderation, that is, the decisions a website or online service makes to remove, demote, amplify, or curate content created by its users. From having your own innocuous photo mistakenly labeled or removed to seeing a piece of policy-violating content remain accessible, most people have experienced frustration with how a platform moderates its user content.
We know that startups invest a disproportionately large share of their limited time and resources in content moderation to keep their corners of the Internet safe, healthy, and relevant for their users. Startups of all sizes, catering to all kinds of user communities and hosting all types of content (from photos and videos to reviews, messages, and more), need to be able to moderate their users’ content to grow and compete in the Internet industry.
And we know that startups would be the most severely impacted if intermediary liability frameworks were to change, opening them up to hundreds of thousands of dollars in litigation costs any time one person wanted to sue over something another user posted. Given that the average seed-stage startup has only about $55,000 to cover all of its costs every month (and many startups that haven’t raised significant outside funding have even less), even one legal bill for hundreds of thousands of dollars would put them out of business.
Too often, policy conversations, especially those around intermediary liability frameworks like Section 230, act as if content moderation isn’t a delicate balancing act and Internet companies can simply throw enough money, people, and technology at the problem so that only the good, productive content stays online and the bad, harmful content gets taken down. But policymakers can’t agree on what’s good content and what’s bad content, and no amount of employees or technological tools can ensure that content moderation policies are enforced perfectly across a wide range of languages, contexts, and cultures.
We hope Moderator Mayhem helps players understand these realities of content moderation and demonstrates what’s really at stake when policymakers propose legislation that would govern how Internet companies can host and moderate user content.