Not everyone in Hollywood is happy with the film industry's historic AI deal. A provision allowing for the creation of digital replicas and synthetic performers could, critics argue, decrease the number of jobs available to both performers and crew. The same provision could also allow big-name stars, along with their AI-generated clones, to feature in multiple projects at once, pushing out emerging actors as Hollywood becomes awash with synthetic performers.
Feelings are so strong that 14 percent of the national board of the Screen Actors Guild-American Federation of Television and Radio Artists, or SAG-AFTRA for short, actually voted against taking the deal to its general membership for ratification. Leaders of the Directors Guild of America and the Writers Guild of America, in contrast, overwhelmingly agreed to have their members accept the agreements they hammered out with the Alliance of Motion Picture and Television Producers (AMPTP).
In their deal with the AMPTP, writers were trying to wrest control of a tool that could learn to draft original scripts or alter human-written scripts without permission. For actors, one of the key issues was different: AI, they worried, could steal their very likeness. Tight controls seem existentially necessary. “In this agreement, there are indeed a lot of imagined uses going forward, both for minor characters, for major characters, and background actors,” says Joshua Glick, visiting associate professor of film and electronic arts at Bard College. “That is part of why there’s maybe more anxiety surrounding where the actors stand with AI versus the gains made for the writers.”
One of the loudest critics of the deal has been Family Ties actress Justine Bateman, who serves as an AI adviser to the SAG-AFTRA negotiating committee. In the days after SAG reached its tentative deal with the AMPTP, she posted a widely shared thread on X that ended with, “Bottom line, we are in for a very unpleasant era for actors and crew.”
Bateman’s biggest worry is the language in the agreement concerning “synthetic performers”—AI creations that resemble humans. “This gives the studios/streamers a green-light to use human-looking AI objects instead of hiring a human actor,” she wrote on X. “It’s one thing to use [generative AI] to make a King Kong or a flying serpent (though this displaces many VFX/CGI artists), it is another thing to have an AI object play a human character instead of a real actor.” This, she argued, would be akin to the Teamsters allowing their employers to use self-driving trucks instead of union drivers.
How you regulate the characteristics of these “synthetic performers” is another quandary. A summary of the new deal states that “if a producer plans to make a computer-generated character that has a main facial feature—like eyes, nose, mouth, or ears—that clearly looks like a real actor, and they use that actor’s name and face to prompt the AI system to do this, they must first get permission from that actor and agree on how this character will be used in the project.”
