Every day, people use the Internet to create and share content with others across the globe, and those users, along with the Internet companies they rely on, depend on a copyright framework that some policymakers are looking to change. Specifically, a few members of Congress have indicated they may be willing to re-open Section 512 of the Digital Millennium Copyright Act (DMCA). Unfortunately, the policy debate mostly revolves around large players: big tech companies and Internet platforms on one side, and large rightsholders in traditional content industries, like mainstream music or movies, on the other. But this area of the law is critical to companies of all types, Internet users, and creators. This post attempts to unpack some recent proposals, in an effort to help more stakeholders, especially tech startups, understand what is happening.
Background: For Internet companies and individual websites handling allegations that a user committed copyright infringement, the law currently provides a relatively clear roadmap: the notice-and-takedown framework in Section 512 of the DMCA. Companies and websites, known as “service providers” under the law, can avoid automatic liability for user-generated infringement if they implement notice-and-takedown and comply with a few other requirements. When this so-called “safe harbor” applies, it generally means that companies are not hauled into court each time a user is accused of copyright infringement that the company had no knowledge of or direct involvement in. Those safe harbors have significant value when one considers the costs of litigation: it can easily cost over half a million dollars to resolve a case before trial, and damages in copyright cases can be as high as $150,000 for just one user post.
Practically speaking, Section 512 allows companies and websites to launch without first purchasing expensive filters and hiring a lot of moderators and lawyers to review every user-generated post. It likewise permits growth, because it is impossible for a startup to know or predict what its users will share. Among other things, this helps keep the cost of launching an Internet startup within reach, which allows innovators with new tech ideas or business models to provide more and different services to more Internet users and Internet-enabled creators.
For those users and creators, in turn, the DMCA lays the foundation for the online services they rely on to create, share, and connect. That said, there are well-documented instances of DMCA abuse, where users and creators have their non-infringing content improperly targeted for takedown. As policymakers consider changes to the law, it is essential that they not stifle innovation and that they avoid changes that would exacerbate abuse.
Ideas floated by some policymakers: As policymakers mention changes to Section 512 of the DMCA, these are some of the concerning ideas being discussed.[1]
Mandatory filtering: One idea is to require service providers to use upload filters, screening all user-generated content to determine whether it contains any (potential) copyright infringement.
So what? Filtering tools are very expensive and error-prone (with costs that can range from tens of thousands of dollars per month to tens of millions to develop and maintain a tool in-house). Many companies that currently use some form of filtering also employ dozens of content moderators and lawyers to conduct further human review. And when the filters fail (which they would), service providers could be on the hook for damages in court. A mandatory filtering requirement would also result in (or at least incentivize) the over-removal of non-infringing content, with filters tuned to err in favor of takedown whenever the infringement question is unclear. That obviously disadvantages users and Internet-enabled creators, whose posts are removed without justification, but it would also disproportionately impact startups, which need to attract and grow a loyal user base.
Notice-and-staydown: Under notice-and-takedown, when a service provider receives a notice of alleged infringement, it removes the accused content. Switching to notice-and-staydown would mean that service providers would have to not just remove the content flagged in the takedown notice, but go further and (1) search the entire platform or site for the same or similar content and remove it wherever else it appears, and (2) make sure that the same or similar content is never uploaded to the site again.
So what? This raises at least two problems. First, it would be impossible to implement staydown without using filtering technology to try to find the same or similar allegedly infringing content; even the earliest-stage service providers quickly encounter too much user-generated content to review each post manually. Second, the fact that one user’s post is (potentially) infringing says nothing about whether subsequent posts are. Copyright infringement is highly context-specific, and staydown ignores that nuance, blocking permissible, non-infringing uses like fair use or licensed content.
Holding platforms liable for user infringement they have no knowledge of: Currently, the DMCA requires platforms to remove infringing content if they have actual or so-called “red flag” knowledge of infringement; that is, if infringing activity is “apparent” and remains on the platform, the service provider can still be liable. Some policymakers, however, have considered lowering that bar and holding service providers liable for more (allegedly) infringing content they do not know about.
So what? Right now, service providers do not have to seek out potential infringement, but they do have to remove the infringement they know about and respond to allegations. Changing the knowledge threshold would mean companies have to spend more time and money reviewing user posts to try to find possible infringement. For startups and smaller sites in particular, there is little infringement (if any) to find, so that time and money would not catch much (if any) more, but the monitoring would still be costly.[2] In addition, it would be up to courts to decide whether a company was doing “enough” to try to find infringement, so more legal fees would be necessary to prove safe harbor eligibility. Neither startups nor investors want to spend hundreds of thousands of dollars on litigation.
“Reasonable” monitoring: Similarly, some policymakers have proposed requiring service providers to implement “reasonable” approaches to monitoring user-generated content.
So what? Reasonableness is fact-specific and has to be decided in court. What is reasonable for one company on one day would not necessarily be reasonable for all companies, or even for that same company a year later. Under such a shifting legal standard, companies may have to repeatedly re-establish safe harbor eligibility (and court proceedings in a different intermediary context can easily cost $500,000). In short, if a startup has to prove it is doing something reasonable, but making that proof in court costs more than the company has, the safe harbor has little practical value.
Allowing ambiguous or incomplete takedown notices: Under current law, a takedown notice is supposed to include at least a representative list of allegedly infringed works and enough information for a service provider to find and remove the accused content. Some have proposed allowing notice senders to be less specific, e.g., to identify only a few examples of infringement, while still expecting companies to remove a much broader scope of potential infringement.
So what? The current framework saves service providers from having to do too much guesswork in removing allegedly infringing content. Rightsholders know what they own and what infringes their rights. Startup service providers, on the other hand, cannot possibly know all copyrighted works and are ill-suited to identify what is and is not infringing, a question that often turns on facts only the rightsholder has. Allowing even less detailed notices would mean startups would have to learn the notice sender’s copyright catalog (which can be extremely complicated in areas like music and video, where multiple owners’ rights overlap) and then find any potentially infringing content on the platform, facing costly and time-consuming litigation if they fail.
What about Internet users and Internet-enabled creators? Service providers know that Internet users have an essential voice in any conversation about changes to these copyright laws. If startups cannot succeed and grow because copyright law throws up substantial barriers to entry, users lose out on those new services. Startups, in particular, are close to their users and understand their individual needs. They also witness the (well-documented) abuse of the current DMCA system. Congress created an incredibly strong incentive, deputizing private companies to quickly enjoin alleged infringement (something courts usually do) based merely on an email from a purported copyright owner, and the current law makes it risky for service providers or users to fight back. If changes in the law increased the pressure to remove content, or forced increased reliance on error-prone technology, the inevitable result would be that companies feel they have no choice but to remove more non-infringing, legitimate content from the web, because the alternative would be routinely ending up in court defending costly cases and facing damages of up to $150,000 per work infringed. There is room to adjust the current law to better combat abuse. At the very least, though, policymakers should avoid changes that make abuse easier.
Disclaimer: This post provides general information related to the law. It does not, and is not intended to, provide legal advice and does not create an attorney-client relationship. If you need legal advice, please contact an attorney directly.
—
[1] Various policy proposals discussed in this post have been raised in the following, e.g., Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC, available at https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32019L0790&rid=1; 12/18 Discussion Draft for Stakeholder Comments Only (Dec. 22, 2020), available at https://www.tillis.senate.gov/services/files/97A73ED6-EBDF-4206-ADEB-6A745015C14B; Section 512 of Title 17: A Report of the Register of Copyrights, United States Copyright Office (May 2020), available at https://www.copyright.gov/policy/section512/section-512-full-report.pdf.
[2] See, e.g., Is the DMCA's Notice-and-Takedown System Working in the 21st Century?: Hearing Before the Subcomm. on Intellectual Property of the S. Comm. on the Judiciary, 116th Congress 8-9 (2020) (testimony of Abigail A. Rives), available at https://www.judiciary.senate.gov/imo/media/doc/Rives%20Testimony.pdf (listing OSPs that receive relatively small numbers of takedown notices relative to the amount of content they host).