The push for a new federal law regulating digital replicas and what it means for startups

The recent legislative and agency efforts directed at regulating digital replicas of individuals’ name, image, and likeness (NIL) have the potential to harm startups’ access to the artificial intelligence (AI) ecosystem and chill innovation. Startups both develop the tools that could be used to make digital replicas and operate the platforms where users might post content, including digital replicas, so any NIL framework will impact a wide range of businesses. Any legislation should take a balanced approach that protects individuals without burdening startups.

What do we mean when we say digital replicas?

Digital replicas refer to videos, images, or audio recordings that have been digitally created or modified using AI to realistically depict an individual by drawing on their NIL and other indicators of identity. Policymakers’ concerns over NIL content have been sparked by notable controversies this year, such as the widely circulated explicit deepfakes of Taylor Swift on X (formerly known as Twitter). But digital replicas also have beneficial uses, including as a tool for individuals with disabilities. And startups are innovating in this space, for example by using generative AI to create personalized videos based on an individual’s NIL to facilitate customer service relationships and employee training. As policymakers consider crafting a federal NIL framework, they must avoid chilling startup innovation in the AI ecosystem.

What are the current legal frameworks regulating digital replicas?

The most salient legal protection currently covering NIL content is the piecemeal framework of state statutory and common laws granting individuals the right of publicity. (There is no federal right of publicity.) Rooted in privacy rights, the state protection guards against the misappropriation of an individual’s name and image, mainly in commercial contexts, allowing an individual to sue if their name or image is used without their consent in some circumstances. Courts weighing a right of publicity claim often consider economic impact, which makes celebrities the most likely plaintiffs when a product, service, or brand tries to benefit from an association with them without permission. Some unauthorized uses of NIL, such as parody, are protected by the First Amendment, so courts must weigh free expression rights against an individual’s interest in their identity when considering right of publicity claims.

Each state approaches the right of publicity differently: some extend protections to individuals’ voices, some offer post-mortem rights, and some do not recognize the right at all. Section 230, a foundational Internet law that shields Internet platforms from ruinous lawsuits over the content their users create, has historically been understood to exempt federal intellectual property rights. Because some states treat publicity rights as a form of intellectual property, federal courts have disagreed over whether Section 230 should bar lawsuits against Internet platforms that host user content violating a person’s right of publicity.

Current proposals to regulate digital replicas

Copyright Office recommendations and USPTO efforts

In a recent report, the Copyright Office recommended changes to federal law that would give individuals, regardless of fame, the right to authorize and license their NIL or digital replicas during their lifetime or post-mortem. Most concerningly, the report also recommends amending Section 230 and creating liability for Internet platforms that host user content. The report suggests that federal legislation include an “actual knowledge” requirement, meaning an Internet platform could be held liable for violations only if someone had reported the violative content and the platform failed to remove it. That would likely create a notice-and-takedown system similar to the one stemming from the safe harbor in Section 512 of the Digital Millennium Copyright Act, a system ripe for abuse, with improper takedown notices frequently filed to silence valid, noninfringing expression.
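For a concrete picture of how an “actual knowledge” standard operates, consider the minimal sketch below, written in Python purely for illustration. Every name in it is hypothetical, and the 48-hour removal window is an assumption borrowed from the TAKE IT DOWN Act discussed later in this post, not something taken from the Copyright Office report.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative sketch of an "actual knowledge" notice-and-takedown flow.
# All names are hypothetical; the 48-hour window is borrowed from the
# TAKE IT DOWN Act, not from the Copyright Office's recommendation.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class TakedownNotice:
    content_id: str
    claimant: str
    received_at: datetime
    resolved: bool = False

class Platform:
    def __init__(self) -> None:
        self.content: dict[str, str] = {}   # content_id -> user post
        self.notices: list[TakedownNotice] = []

    def receive_notice(self, content_id: str, claimant: str) -> None:
        # Under an actual-knowledge standard, the platform's exposure
        # begins only once a notice like this one is filed.
        self.notices.append(
            TakedownNotice(content_id, claimant, datetime.now())
        )

    def remove_content(self, content_id: str) -> None:
        self.content.pop(content_id, None)
        for notice in self.notices:
            if notice.content_id == content_id:
                notice.resolved = True

    def overdue_notices(self, now: datetime) -> list[TakedownNotice]:
        # Reported content the platform failed to remove in time is
        # where liability would attach under such a framework.
        return [
            n for n in self.notices
            if not n.resolved and now - n.received_at > REMOVAL_WINDOW
        ]
```

Even in a toy model, the asymmetry is visible: filing a notice costs a claimant almost nothing, while every unresolved notice is potential liability for the platform, which pushes resource-constrained startups toward removing reported content first and asking questions later.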

Meanwhile, the United States Patent and Trademark Office (USPTO) is preparing recommendations for potential executive action on AI and intellectual property issues. As Engine explained at a recent public roundtable on AI and protections for individuals’ NIL, a federal NIL framework needs to appropriately balance individuals’ protections with concerns about legitimate expression and innovation.

Bills introduced in Congress

As lawmakers try to get ahead of concerns about digital replicas, several have introduced legislation aimed at the creation and distribution of digital replica content. The Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act would hold individuals or companies liable for creating or sharing unauthorized digital replicas, with only limited First Amendment protections, a concern for both startup developers and Internet platforms. Like the Copyright Office’s proposed federal law, the NO FAKES Act creates a federal intellectual property right in an individual’s NIL that can be licensed (even post-mortem) and includes an actual knowledge requirement that would create a notice-and-takedown framework. The bill creates liability for producing digital replicas and contains a Section 230 carveout, so Internet platforms hosting individuals’ unauthorized digital replicas could be sued, with damages as high as $25,000 per violation.

The Content Origin Protection and Integrity from Edited and Deepfaked Media (COPIED) Act would have the National Institute of Standards and Technology create guidelines for detecting synthetic content and determining content origin, which involves attaching provenance information and digital watermarks to digital representations of copyrighted works. The bill prohibits both removing such watermarks or provenance information and using the content in AI model training data, which would harm the startup ecosystem by limiting access to the data inputs AI models rely on.
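To give a sense of what “provenance information” might look like in practice, here is a minimal Python sketch, loosely patterned on content-provenance manifests like those developed by the C2PA coalition. The field and function names are invented for illustration and are not drawn from the bill or from any standard.

```python
import hashlib

# Hypothetical, simplified provenance manifest; real standards (e.g., C2PA)
# are far richer and cryptographically signed. Field names are invented.
def attach_provenance(content: bytes, creator: str, tool: str) -> dict:
    return {
        "content": content,
        "manifest": {
            "creator": creator,
            "generator_tool": tool,
            # The hash binds the manifest to this exact content, so
            # altering the content invalidates the provenance record.
            "content_sha256": hashlib.sha256(content).hexdigest(),
        },
    }

def eligible_for_training(item: dict) -> bool:
    # Under the bill's approach, a compliant pipeline would have to drop
    # any content carrying provenance information from its training set,
    # shrinking the pool of data available to AI developers.
    return item.get("manifest") is None
```

The compliance check in eligible_for_training is the startup concern in miniature: because the prohibition keys off the presence of provenance data, wider adoption of such manifests would steadily shrink the pool of data available for AI model training.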

The Preventing Abuse of Digital Replicas Act would apply existing trademark law to digital replicas and require courts to presume that a digital replica is likely to cause confusion unless proven otherwise. This bill takes a narrower approach, focusing on consumer confusion caused by digital replicas rather than creating new federal NIL property rights that mimic copyright law’s protection of original works. However, the bill expands the definition of digital replicas beyond an individual’s NIL to any identifying characteristics, and it labels digital replicas as intellectual property for the purpose of a Section 230 carveout.

And a key Senate committee recently passed the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act, a bill aimed at “revenge porn” that would criminalize the publication of non-consensual intimate imagery (NCII), including imagery generated by AI, and require Internet platforms to remove NCII within 48 hours of being notified of the content.

Takeaways for startups

Startups that develop AI tools that could be used to create digital replicas, or that operate platforms that could host user content containing digital replicas, should pay attention to the ongoing conversations around digital replica policy. Provisions holding developers liable because their AI tools could be used to create digital replicas would chill innovation. Startups would have little incentive to build innovative tools that benefit users if a single bad actor’s misuse could expose them to liability.

If any of the proposed frameworks creating a new intellectual property right are enacted, Internet platforms would have to aggressively filter user content and could face costly, time-consuming lawsuits. Startups are particularly vulnerable to nuisance litigation, where bad actors bring baseless claims; with their limited resources, startups usually have to settle rather than litigate, even when they’re likely to win. That would harm both speech and competition by making it risky for startups to offer products and services that host content.

Provisions enabling individuals to transfer their NIL rights to third parties would further complicate the legal landscape and increase the risk of nuisance litigation against developers and platforms. Creating an NIL licensing regime, including for post-mortem rights, means the NIL rights holder may not even be the individual depicted in a digital replica, yet could still seek redress for unauthorized NIL usage. As with other intellectual property licensing, this creates an avenue for bad actors holding licenses to individuals’ NIL to file claims against Internet platforms and coerce settlements from startups unable to fight the legal battles.

As policymakers grapple with the potential problems caused by digital replicas, it’s critical that they keep in mind the impact these proposals would have on startups and the benefits of digital replicas. The potential chilling of innovation, competition, and expression underscores the need for a careful, balanced approach to federal NIL legislation.

Disclaimer: This post provides general information related to the law. It does not, and is not intended to, provide legal advice and does not create an attorney-client relationship. If you need legal advice, please contact an attorney directly.