
Nothing had prepared William Passano for a new technology introduced in 1960 that threatened to upend his medical publishing business: the Xerox 914 photocopier. The machine produced quick, cheap duplicates, which allowed the government’s National Library of Medicine to mass photocopy articles from his company’s copyrighted medical journals. Passano eventually sued in a case that went up to the Supreme Court, which ultimately took a cautious approach to regulating the emerging technology by ruling in 1975 that it was fair use for libraries to photocopy the originals. The justices left Congress to address the novel intellectual property issues raised in the case, leading to key revisions in copyright law a year later.

Fast forward half a century, and the emergence of generative artificial intelligence has similar potential to turn the entertainment industry on its head. Writers who spoke with The Hollywood Reporter in January weren’t initially spooked by ChatGPT, with some stressing that the tool is incapable of writing funny jokes or producing usable results without substantial creative input, but the tenor of the conversation has changed, and the technology has become a sticking point in negotiations in the writers strike. Creators and talent see the possibility for the technology to devalue their work, or even outright replace them down the line, absent intervention from regulators or the courts: royalty-free music generators can compose a film score or beat, AI can write scripts, and actors can be deepfaked into movie scenes.

But at the same time, Hollywood is embracing the technology, from using it to de-age actors to partnering with companies in the field to create AI-composed music, even as the industry pushes for regulations. Michael Nash, chief digital officer for Universal Music Group, tells THR that while the label objects to AI programs that train machine learning models on its artists’ copyrighted works without permission or payment, engaging with the technology “enables us to have a very important seat at the table around the evolution and use of these models, particularly with respect to developing new licensing opportunities.” He underscores that the point of adopting AI is to “put these tools in the hands of artists” and see “how far their vision can take this technology.”

In a hearing before the House Judiciary Subcommittee on Courts, Intellectual Property and the Internet examining the intersection of AI and copyright law, key players in Hollywood moved for guardrails to protect their work. “The rapid introduction of generative AI systems is seen as an existential threat to the livelihood and continuance of our creative professions unless immediate steps are taken on legal interpretive and economic fronts to address these emerging issues,” said Ashley Irwin, president of the Society of Composers and Lyricists (SCL), at the hearing on May 17. “It’s essential to prioritize policies and regulations to safeguard the intellectual property and copyright of creators and preserve the diverse and dynamic U.S. cultural landscape.”

The SCL, which counts creators of scores and songs for film, TV and theater as members, maintains that AI firms should have to secure consent from creators to use their works to train AI programs, compensate them at fair market rates for any new work subsequently created and provide proper credit, Irwin said. He stressed that any regulatory framework should not grant copyright protection to AI-generated works, since doing so could flood the market with them and dilute the value of original pieces.

At the hearing, Dan Navarro, a singer-songwriter who’s also on the national board of SAG-AFTRA, compared the status quo to the “Wild West” and the earliest days of sampling, when proper attribution or payment wasn’t given for the reuse of a recording in another song. “Now, it’s routine,” he said, noting that he opposes compulsory licenses giving AI companies free rein to use his work. “‘Can’t Touch This’ by MC Hammer credits Rick James for ‘Super Freak,’ and his estate gets compensated.” He urged lawmakers to adopt a similar framework of consent, credit and compensation.

The widespread dissemination of services transforming an unknown performer’s voice into a worldwide pop star’s has made the music industry one of the front lines in the battle against rampant and likely illegal use of generative AI. In April, “Heart on My Sleeve,” a track that used passable AI versions of the voices of Drake and the Weeknd, took the internet by storm. It was removed from streaming services after UMG, which represents the artists, stepped in by asserting its intellectual property rights. The record label has been directing streamers to cut off access to its music catalog for developers using its songs to train AI programs, according to Nash.

UMG has been sending requests to take down AI-generated songs, but it’s fighting an entire online community dedicated to making, sharing and teaching others how to create AI music. On Discord, members of a server called AI Hub in April released an album called UTOP-AI — a play on an upcoming project from Travis Scott — featuring the AI-generated voices of the rapper along with Drake, Playboi Carti and Baby Keem. It got nearly 200,000 views on YouTube and SoundCloud in just three hours before it was flagged for copyright infringement by Warner Music Group.

For the most part, artists are looking for a regulatory framework to guide the creation of AI-generated content. They don’t want their work to form the foundation of the tools that threaten to replace them, at least not without consent or payment. The legality of using copyrighted works to train AI programs remains unsettled: several lawsuits working their way through the courts are weighing whether the practice falls under the fair use defense, which allows the use of protected works without permission as long as the use is transformative.

In November, a proposed class action was filed against Microsoft, GitHub and OpenAI claiming that the billions of lines of computer code their AI technology analyzes to generate its own code constitute piracy. Getty Images is similarly suing Stability AI for copyright infringement, accusing the company of illegally processing millions of its images to train Stable Diffusion. Getty has stressed that Stability AI refuses to pay for any of its photos, unlike other tech firms that license its digital assets for AI-related purposes. And in late June, a proposed class action filed in San Francisco federal court by authors alleged that OpenAI “relied on harvesting mass quantities” of copyright-protected works “without consent, without credit, and without compensation.”

A finding of copyright infringement in any of these lawsuits would severely dampen the economic prospects of exploiting works generated solely by AI in Hollywood and put more control of the technology back into the hands of creators. (The U.S. Copyright Office has stated that works generated solely by AI aren’t eligible for protection.) While he’s confident the courts will back artists’ rights in these cases, Nash says, “We’re obviously going to be very vocal in our desire to see that interpretation of copyright law in jurisdictions around the world” and will be “lobbying to address issues around proposed legal changes.”

Another weapon of choice in combating AI content that pilfers a celebrity’s likeness: right of publicity and privacy laws. In April, a former cast member of the Big Brother reality show sued deepfake app developer NeoCortext in a proposed class action over the company’s app that allows users to paste their faces over photos and videos of celebrities. While the current copyright regime doesn’t address the legality of using copyrighted works to train AI programs, some states, like California and New York, have laws that bar the commercial exploitation of someone’s likeness without permission. “The core copyright concerns are our bedrock, but in addition there are other legal protections,” Nash says, referring to right of publicity laws. “Those are protections we’ve brought to the attention of platforms when we asked for content to be taken down, so there has been a recognition, but we think that there needs to be broader recognition. That’s where additional legislative support could be helpful.”

Still, artists want Congress to intervene despite the courts having settled similarly contentious copyright issues raised by the invention of the camera, photocopier and VCR in the past. The SCL has proposed adding a new section to the Copyright Act that would clarify that the fair use defense to infringement only applies to human-created works, minimizing the traditional avenues to exploit AI-generated works.

Amid this legal uncertainty, companies in the entertainment industry are looking to capitalize on AI tools before they miss out on opportunities to exploit the technology. In May, UMG announced a partnership with AI company Endel “to enable artists and labels to create soundscapes for daily activities like sleep, relaxation, and focus by harnessing the power of AI.” Several labels and music industry professionals have also reportedly reached out to a student hacker by the name of ak24, a member of the AI Hub Discord server who released UTOP-AI, about partnership opportunities around Musicfy, an AI music creation app he invented that allows users to clone the voices of popular artists.

On the movie side, Miramax is collaborating with AI firm Metaphysic, known for popularizing a deepfake TikTok account spoofing Tom Cruise, to de-age actors in Robert Zemeckis’ Here. The deal is part of Metaphysic’s larger strategic partnership with CAA to develop generative AI tools and services for talent. Metaphysic CEO Tom Graham says the increasing investment in and adoption of services from “generative AI startups popping up across a broad spectrum of entertainment applications” is “only just the beginning” as the industry moves away from “repetitive and easily automatable job functions.”

As AI is increasingly adopted in all corners of Hollywood, artists and talent stress that they aren’t categorically opposed to the technology. Most of them just want proper credit and compensation. “I don’t have a problem with the technology at all,” Irwin said. “I have a problem with the stealing of my material.”
