By Corynne McSherry / Electronic Frontier Foundation (EFF)
There is a lot of anxiety around the use of generative artificial intelligence, some of it justified. But it seems like Congress thinks the highest priority is to protect celebrities – living or dead. Never fear, ghosts of the famous and infamous, the U.S. Senate is on it.
We’ve already explained the problems with the House’s approach, the No AI FRAUD Act. The Senate’s version, the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, isn’t much better.
Under NO FAKES, any person has the right to sue anyone who has either made, or made available, their “digital replica.” A replica is broadly defined as “a newly-created, computer generated, electronic representation of the image, voice or visual likeness” of a person. The right applies to the person themselves; anyone who has a license to use their image, voice, or likeness; and their heirs for 70 years after the person dies. It’s retroactive, meaning the post-mortem right would apply immediately to the heirs of, say, Prince, Tom Petty, or Michael Jackson, not to mention your grandmother.
Boosters talk a good game about protecting performers and fans from AI scams, but NO FAKES seems more concerned about protecting their bottom line. It expressly describes the new right as a “property right,” which matters because federal intellectual property rights are excluded from Section 230 protections. If courts decide the replica right is a form of intellectual property, NO FAKES will give people the ability to threaten platforms and companies that host allegedly unlawful content, which tend to have deeper pockets than the actual users who create that content. This will incentivize platforms that host our expression to be proactive in removing anything that might be a “digital replica,” whether its use is legal expression or not. While the bill proposes a variety of exclusions for news, satire, biopics, criticism, etc. to limit the impact on free expression, interpreting and applying those exceptions is even more likely to make a lot of lawyers rich.
This “digital replica” right effectively federalizes—but does not preempt—state laws recognizing the right of publicity. Publicity rights are an offshoot of state privacy law that give a person the right to limit the public use of her name, likeness, or identity for commercial purposes, and a limited version of it makes sense. For example, if Frito-Lay uses AI to deliberately generate a voiceover for an advertisement that sounds like Taylor Swift, she should be able to challenge that use. The same should be true for you or me.
Trouble is, in several states the right of publicity has already expanded well beyond its original boundaries. It was once understood to be limited to a person’s name and likeness, but now it can mean just about anything that “evokes” a person’s identity, such as a phrase associated with a celebrity (like “Here’s Johnny”) or even a cartoonish robot dressed like a celebrity. In some states, your heirs can invoke the right long after you are dead and, presumably, in no position to be embarrassed by any sordid commercial associations – or to care whether anyone believes you have actually endorsed a product from beyond the grave.
In other words, it’s become a money-making machine that can be used to shut down all kinds of activities and expressive speech. Public figures have brought cases targeting songs, magazine features, and even computer games. As a result, the right of publicity reaches far beyond the realm of misleading advertisements and courts have struggled to develop appropriate limits.
NO FAKES leaves all of that in place and adds a new national layer on top, one that lasts for decades after the person replicated has died. It is entirely divorced from the incentive structure behind intellectual property rights like copyright and patents—presumably no one needs a replica right, much less a post-mortem one, to invest in their own image, voice, or likeness. Instead, it effectively creates a windfall for people with a commercially valuable recent ancestor, even if that value emerges long after they died.
What is worse, NO FAKES doesn’t offer much protection for those who need it most. People who don’t have much bargaining power may agree to broad licenses, not realizing the long-term risks. For example, as Jennifer Rothman has noted, NO FAKES could actually allow a music publisher who had licensed a performer’s “replica right” to sue that performer for using her own image. Savvy commercial players will build licenses into standard contracts, taking advantage of workers who lack bargaining power and leaving the right to linger as a trap only for unwary or small-time creators.
Although NO FAKES leaves the question of Section 230 protection open, it’s been expressly eliminated in the House version, and platforms for user-generated content are likely to over-censor any content that is, or might be, flagged as containing an unauthorized digital replica. At the very least, we expect to see the expansion of fundamentally flawed systems like Content ID that regularly flag lawful content as potentially illegal and chill new creativity that depends on major platforms to reach audiences. The various exceptions in the bill won’t mean much if you have to pay a lawyer to figure out if they apply to you, and then try to persuade a rightsholder to agree.
Performers and others are raising serious concerns. As policymakers look to address them, they must take care to be precise, careful, and practical. NO FAKES doesn’t reflect that care, and its sponsors should go back to the drawing board.
Corynne McSherry
Corynne McSherry is the Legal Director at EFF, specializing in intellectual property, open access, and free speech issues. Her favorite cases involve defending online fair use, political expression, and the public domain. As a litigator, she has represented the Internet Archive, Professor Lawrence Lessig, Public.Resource.Org, the Yes Men, and a dancing baby, among others. She was named one of California’s Top Entertainment Lawyers and AmLaw’s “Litigator of the Week” for her work on Lenz v. Universal. Her policy work focuses on copyright, generative AI, and best practices for online expression. She has testified before Congress about the Digital Millennium Copyright Act and Section 230. Corynne comments regularly on digital rights issues and has been quoted in a variety of outlets, including NPR, CBS News, Fox News, the New York Times, Billboard, the Wall Street Journal, and Rolling Stone. Prior to joining EFF, Corynne was a litigator at the law firm of Bingham McCutchen, LLP. Corynne has a B.A. from the University of California at Santa Cruz, a Ph.D from the University of California at San Diego, and a J.D. from Stanford Law School. While in law school, Corynne published Who Owns Academic Work?: Battling for Control of Intellectual Property (Harvard University Press, 2001).
Originally Published: 2024-05-02 05:10:00