E (48:24)
Thank you so much. It is lovely to be here. I am Madhavi Sunder. I'm a professor of intellectual property law at Georgetown University Law Center in Washington, D.C., and just as Professors Kretschmer and Aplin did a bit of a tag team on copyright and AI, I am going to be the follow-up here with Professor McDonough on the right of publicity from the US perspective. So Professor McDonough introduced the popular controversy that you may have all heard of with respect to Scarlett Johansson's voice being appropriated to be the voice, or one of the voices, of OpenAI's ChatGPT. This controversy erupted last May. Well, as it so happens, I was in OpenAI's offices meeting with the General Counsel and the head of IP on the very day, if not the very moment, that the Johansson story broke. So our meeting was in fact interrupted by the Counsel's phones buzzing non-stop. And I learned in the cab ride on my way back to the hotel in San Francisco that day that Ms. Johansson had gone public with her dispute. So the first point, though, that I want to make about this dispute with respect to US law, kind of like the "option zero" that Professor Aplin talked about, is the contention that we might not really need that much new law in the United States with respect to this issue of digital replicas or deepfakes, specifically in the context of creative celebrities and producers. So one issue is that we actually already have pretty strong right of publicity law, particularly at least in the "Hollywood Circuit" of California, that was arguably strongly in Johansson's favor. Indeed, we had a four-decades-old case under California state right of publicity law that was strikingly on point. So the right of publicity is a tort recognized in most US states, some under specific statutes and others under common law. Broadly speaking, this right in the US recognizes the right of celebrities to control the commercial value in their identity. 
So on point to Johansson's case was a well-known case about the grande dame of music and entertainment back in the 1980s, Bette Midler. What happened was, in 1985, Ford Motor Company advertised its Lincoln Mercury using different popular songs of the time. But Midler rejected doing any kind of commercial for Ford. So what did Ford do? Well, they hired a previous backup singer for Midler to study Midler's style and to sing the song, quote, "to sound as much as possible like Bette Midler," to really just imitate her. And in a decision that significantly expanded right of publicity law in the United States, the Court of Appeals for the Ninth Circuit for the first time announced protection of a celebrity's voice against sound-alikes. So the court said a voice is as distinctive and personal as a face, and concluded that to impersonate Midler's voice was to pirate her identity. Now, the facts in Johansson's case were similar. In 2013, Johansson voiced an AI girlfriend in the film Her. And a decade later, OpenAI CEO Sam Altman, a fan of the film, asked Johansson to be the voice of ChatGPT. She refused. A year later, when OpenAI launched an expressive voice for ChatGPT named Sky, which arguably sounded similar to Johansson's, Johansson said, cease and desist. Now, to be sure, the Sky voice was just one of a few different voice options that ChatGPT had rolled out at the time, and a jury would have had to deliberate on how similar or not the voice really was to Johansson's. But California right of publicity law has been interpreted very broadly, covering not just exact imitations or sound-alikes, but even anything that evokes the identity of a celebrity. So there was a famous earlier case that involved Vanna White, the glamorous co-host of the game show Wheel of Fortune, who had won a right of publicity case against the use of a robot wearing a blonde wig and turning letters. 
And though the defendant in that case never used White's name or her actual image, the U.S. Court of Appeals said that it violated White's right of publicity just to evoke her image. So given this broad reading and some additional factors, OpenAI may have been implicated here. First, two days before the rollout of the Sky voice, Altman reached out again to Johansson and asked, have you changed your mind? And she hadn't. Second, and this is probably the most damning: Altman tweeted about the launch of the new ChatGPT voices with the simple tagline "her," plainly evoking Johansson's identity from her voice portrayal in the earlier film. So, as Professor McDonough said, the case ultimately was never litigated, because Altman retired the Sky voice soon after Johansson objected. But it's worth noting that the "do nothing" option zero would actually, at least in the Hollywood Circuit, probably go a pretty long way to protecting against voice appropriation like Scarlett Johansson's. Notably, there are many other gaps in US law with respect to deepfakes. US copyright law, for example, would not protect against appropriation of the deep, perhaps sexy style of Johansson's voice work in Her. Indeed, U.S. copyright law expressly says it protects neither voice nor style. And I'm going to come back to the style point later. So let me turn to the second point, and that is that, still, many critics and observers are arguing today that despite this broad right in California, the Hollywood Circuit, we really need more and stronger legal protection against unauthorized digital replicas, which include not just commercial appropriation of voice and image for advertisement and monetary gain, but also deepfake pornography. So these arguments are gaining traction. And basically the argument is, we can't do nothing. 
And I think it really echoes what Professor McDonough was also saying, that there's a lot of concern about other disruptions that may be on the periphery of intellectual property. But the call is to use intellectual property law to address these other disruptions, like deepfake pornography. Even Melania Trump has recently gotten in on the action, endorsing a new Take It Down Act that would create liability for unauthorized deepfake pornography and require social media hosts to remove the imagery. So where are these renewed calls coming from for stronger and federal protection against unauthorized digital replicas? Well, there are several arguments for it. First and foremost, as I think Professor McDonough said at the end of his remarks, there is no federal right of publicity in the United States. And meanwhile, coverage and monetary liability vary greatly from state to state, leaving a celebrity like Taylor Swift, for example, who saw deepfake pornographic images of her reach tens of millions of people before social media hosts like X took them down, without any clear and effective recourse across the country. Second, some states, like California, protect only the commercial value in one's likeness, including voice. But that right does not extend to ordinary people, who typically don't trade in their image, likeness or voice. Thus many states leave ordinary individuals without a remedy against misinformation and pornographic deepfakes, which have become a large threat. Another variation is with respect to whether a state right of publicity law applies to celebrities even after they're dead. So until very recently, the right in California was not descendible, meaning that the right was extinguished upon the celebrity's death. In addition, not all right of publicity laws state to state cover voice. Some do, like California, but others don't. 
And even where a statute might cover voice, like in California, it would again only apply to well-known, identifiable voices with commercial value. So it's this state of legal affairs that led the US Copyright Office to conclude in a report on digital replicas last summer that, quote, "new federal legislation is urgently needed" given the speed, precision and scale of AI-created digital replicas. So I'm going to highlight a couple of new pieces of legislation that we're seeing, both at the state level and then two at the federal level, and then make some comments about these. So first, Tennessee was the first state to rush in. Not surprisingly, it's the home of a vibrant music scene in Nashville. Tennessee passed the ELVIS Act last March, and this stands for the Ensuring Likeness, Voice and Image Security Act. This act is the first to address the rights of musicians in the age of AI, and the act makes voice appropriation a criminal misdemeanor. Now, this act is in response, no doubt, to the release of an AI-generated song called Heart on My Sleeve, simulating Drake and The Weeknd in 2023. Known as the "Fake Drake," before Kendrick Lamar called out Drake himself as fake, the AI-generated song drew some 15 million views before it was revealed as AI-generated and unauthorized. Now, I note with some irony that the ELVIS Act is named after a musician who himself was one of the great appropriators of other musicians' voice, style and movement, particularly Black musicians'. So California has also amended its already broad right of publicity statute in light of the AI threat. And it extended the law in late 2023 to allow for protection of a celebrity even after the celebrity's death. And already we see a lawsuit by the estate of the late comedian George Carlin under the new act. 
George Carlin's estate is suing a podcast called Dudesy under the new California law for using AI to impersonate his voice and style in a YouTube comedy special cheekily titled I'm Glad I'm Dead. So the case of Carlin, I would suggest, raises issues not just of sound-alikes, but also of metaphysical voice, more akin to style, the protection of which is much more controversial. It's not clear to what extent Carlin's style is protected under right of publicity law, though again, California's right of publicity has been interpreted broadly to include uses that merely evoke the celebrity. So we don't know whether that would be included. For its part, though, the U.S. Copyright Office, which has urged new federal legislation in this area, still stops short of recommending protection for style outright, for fear of stifling innovation. There are also two notable pieces of federal right of publicity legislation being considered by the US Congress, at least they were under the last administration, but I suspect they will continue to get close attention. One is the No AI FRAUD Act, introduced in early 2024. This would create a federal intellectual property right in voice and likeness and protect against the use of unauthorized digital voice replicas and digital depictions that readily identify an individual. So again, I think there are important questions to be raised here: if the concern is other disruptions that have historically been outside of intellectual property, harms to personal dignity from unauthorized pornographic AI-generated images, for example, is an intellectual property right, and now at the federal level, really the right answer? This bill would allow these rights to be transferred or licensed during an individual's lifetime, and they would endure at least 10 years after an individual's death, even if that person hadn't used their identity commercially during their lifetime. 
The act would also punish trafficking in a, quote, "personalized cloning service," unquote, designed to produce digital voice replicas, that is, the creators of this technology. And there would, of course, be secondary liability for the social media hosts who materially contribute to or facilitate infringement when there's knowledge that the subjects of the replica have not consented to them. The law would, of course, allow for some balancing of free speech considerations, though many critics are concerned that social media hosts will be pressured to limit liability by taking down otherwise speech-enhancing content. Then there's another proposed NO FAKES Act, which would similarly create a federal right to control image, voice and likeness. The key aspect that I want to highlight about the NO FAKES Act is that it expressly would apply to all individuals, not just celebrities. Again, there's an inclusion of exceptions for freedom of expression with respect to commentary, criticism, scholarship, satire and parody, et cetera, and also provisions for online service providers, which would be required to quickly remove all instances of infringing material once they obtain knowledge of it. The US Copyright Office has opined in favor of such laws, especially to the extent to which they extend federal right of publicity-type protection to all individuals. So I find this very interesting, that the Copyright Office seems quite accepting of extending federal intellectual property protections to traditionally non-copyrightable subject matter like deepfake pornography and privacy interests. And I think that we should be thinking about whether or not we're seeking to use intellectual property as a tool well beyond its traditional parameters. But I want to end with a final note, and that is that in the scramble of state and federal legislators to push for more legal protections, we shouldn't forget that digital replicas are not all bad. 
And I think that Professor McDonough alluded to this. A recent AI-generated parody of Zelensky and Trump's fateful Oval Office meeting last week makes this point. Look up "AI Zelensky and Trump Oval Office boxing match" and you'll see a parody of a hostile showdown that ends in an all-out brawl. Now, while such portrayals may raise concerns about disinformation, if properly tagged, AI-generated content can provide poignant social and political commentary that should not face the immediate threat of being taken down to limit the liability of service providers. And I'm going to end just on a final optimistic note, which is that alongside the right to prevent AI-generated likenesses lies the right to license such uses. And on that score, the future of digital avatars is already here. Last month, I had the privilege of seeing the ABBA Voyage concert here in London. And it was really a wonderful example of how AI concerts can revive musicians from the past and create beautiful new works full of emotion, nostalgia and innovation for new and old generations alike. AI technology allowed for the creation of life-sized CGI avatars, what they dub "ABBAtars," which replicate the pop stars in their prime, but performing new choreography and new routines to the old standbys. And in this case, all the members of ABBA are living and fully authorized the use of their voice, likeness and image. But it wasn't just the machine doing the work. Four decades after their last public concert together, the real ABBA recorded the show in a studio in Sweden over five weeks using motion capture technology. The real, live stars of ABBA sang the songs, danced the dances and chatted between the songs. But they did it all in a, quote, "NASA-style studio with monitors and cameras everywhere and 100 people capturing all the data," unquote. So as we think about regulations, we need to pay heed to the innovative and transformative possibilities of AI-generated replicas. 
As the Best Actor award this week to Adrien Brody perhaps helps us begin to recognize. So before we're too quick to ban, let's appreciate that hybridity between man and machine is our future. AI can allow us to bring back music legends of the past, or perhaps give new physical life and animation to disabled performers and individuals. In short, we have the technology to bring back the dead. All you need is a license. AI and law together may yet give us a new lease on life. Thanks.