GPT-4 Can’t Replace Striking TV Writers, But Studios Are Going to Try

The desire to replace writers with AI is a symptom of the larger problem that the guild is fighting for—which is that companies do not value writers and their work. 
Image: LEONARDO MUNOZ / Contributor via Getty Images

The Writers Guild of America is on strike, after six weeks of negotiating with a number of major entertainment companies, including Netflix, Amazon, Apple, and Disney, under the Alliance of Motion Picture and Television Producers (AMPTP). The walkout is the first Hollywood strike to occur in 15 years, and comes at an unprecedented moment—for the first time ever, writers are negotiating the studios’ use of generative AI tools like ChatGPT.  


The guild called the studios’ responses to their proposal “wholly insufficient given the existential crisis writers are facing,” as guild members walked off the job to join picket lines in Los Angeles and New York. In their strike announcement, the WGA included a list of their proposals and the AMPTP’s response. This list includes an unprecedented category: artificial intelligence. 

The WGA proposed to regulate the use of artificial intelligence on union projects, stipulating that AI can’t write or rewrite literary material, that AI-generated content can’t be used as source material, and that covered material can’t be used to train AI. The AMPTP rejected this proposal, telling the guild instead that there would be annual meetings to discuss “advancements in technology.” 

(The WGA represents thousands of members who write content for film, television, news, and online media, including VICE Union, which represents Motherboard staff.)

Since the strike began, a number of writers have been vocal about this specific proposal, saying that they do not want it to become an industry standard to rely on generative AI tools at the expense of writers. 


“Writers have always been valued for their ability to articulate uniquely the human experience. So if now you're saying that you got a machine that can do that better, not only is it ridiculous, but it's massively insulting,” Quinton Peeples, a screenwriter and producer, told Motherboard. To Peeples, the desire to use AI to do the job of screenwriters is a symptom of the larger problem that the guild is fighting for—which is that companies do not value writers and their work. 

“Initially, the WGA's AI proposals looked like outliers. Everything else on the list was talking about writer compensation, making sure writers were paid fairly to justify the immense value they were bringing to the studios. Over the negotiation, it became clear that the AI proposals are really part of a larger pattern. The studios would love to treat writers as gig workers. They want to be able to hire us for a day at a time, one draft at a time, and get rid of us as quickly as possible. I think they see AI as another way to do that,” John August, a screenwriter known for writing the films Charlie’s Angels and Charlie and the Chocolate Factory, told Motherboard.

“The idea that our concerns could be addressed by an annual meeting is absurd and honestly offensive. Everyone watching AI can tell you that these large language models are progressing at an incredible rate. AI-generated material isn't something that's going to become a factor in a few years. It's here now. It's lucky that we're negotiating our contract this year and not next year, before these systems become widely entrenched,” August said. 


August expanded on the guild’s two AI stipulations for Vox, saying, “First, the guild wants to make sure that ‘literary material’—the MBA term for screenplays, teleplays, outlines, treatments, and other things that people write—can’t be generated by an AI. If a movie made by a studio that has an agreement with the WGA has a writing credit—and that’s over 350 of America’s major studios and production companies—then the writer needs to be a person.”

“Second, the WGA says it’s imperative that ‘source material’ can’t be something generated by an AI, either. This is especially important because studios frequently hire writers to adapt source material (like a novel, an article, or other IP) into new work to be produced as TV or films,” August added. “It’s very easy to imagine a situation in which a studio uses AI to generate ideas or drafts, claims those ideas are ‘source material,’ and hires a writer to polish it up for a lower rate.”

“The immediate fear of AI isn’t that us writers will have our work replaced by artificially generated content. It’s that we will be underpaid to rewrite that trash into something we could have done better from the start. This is what the WGA is opposing and the studios want,” C. Robert Cargill, a screenwriter best known for writing the films Sinister and Doctor Strange, tweeted. “The same IP laws that prevent you from stealing our writing protects us from a machine doing it as well. Because AI is just cut and paste.” 


The AMPTP’s position is yet another instance of an overblown perception of the capabilities of AI, and follows a number of corporate media shake-ups where executives decided to prioritize AI content over human-created content. Last week, BuzzFeed CEO Jonah Peretti shuttered BuzzFeed News, claiming in a letter that the digital media company would pivot to a new strategy that includes “AI enhancements.” 

The reality is AI is still filled with misinformation and bias. Recently, Microsoft researchers acknowledged in a paper that GPT-4 has trouble distinguishing between true facts and guesses and personalizing outputs to users, and also tends to make far-fetched conceptual leaps. They also found that GPT-4 makes up facts that aren’t in its training data, is very sensitive to framing and wording of prompts, and inherits the prejudices and biases from its training data—something AI ethics researchers have proven time and again about machine learning systems in general. 

Generative AI systems are already facing a number of copyright challenges from writers and artists who claim that the systems were trained on their copyrighted work without permission. So far, Getty Images has filed a lawsuit against Stability AI, the company behind the text-to-image generator Stable Diffusion, for using a dataset that contains over 12 million photographs from Getty to train its AI model. Karla Ortiz, an artist and board member of the Concept Art Association, an advocacy organization for artists, is leading a fundraising effort to hire a lobbyist in Washington, D.C. to push for updates to IP law and stronger regulation of AI companies. No lawsuit has yet been filed by writers, but training AI on writers’ scripts would be implausible without their explicit permission, which, evidently and understandably, will not be freely given. 


Another issue that union writers are concerned about is the hardship faced by underpaid, often-foreign workers who are tasked with training, moderating, and maintaining many of the world’s largest AI models. AI experts have frequently pointed out that these systems are far less automated than they are often portrayed. Citing efficiency as a reason to reduce staff and rely on AI tools is therefore an oversimplification, one that further perpetuates a power imbalance and the exploitation of workers in countries with fewer workplace protections. 

“One of the things that I think that we're starting to rub up against is a laser focus on efficiency: what's the fastest way to get this done? How can we get this done cheaper? Well, that doesn't necessarily make life better,” Peeples said. “So we’re saying, before we render a judgment on whether AI is good or not, let’s talk about these things with a different lens on it—that efficiency and speed production is not of the highest value.” 

“I'm not around people who are saying, ‘oh, we're drawing a line in the sand and no AI and we don't want to even talk about that.’ What we are arguing is, can we slow this down a little bit and have a real thoughtful conversation about the realities and what its consequences might be?” Peeples added. 

Many authors and writers outside of Hollywood, who are not protected by the union, are rallying behind their striking colleagues. “Hollywood writers just went to bat for us authors. We should ALL be putting a clause in our contracts to prohibit the use of our writing to train AI! (To essentially replace us),” Kelly Yang, a writer of adult and children’s literature, tweeted. “But this is not just about money. I am deeply worried about where we are gonna go. Did u know that the WGA is fighting to make studios not buy AI source material? That may be our biggest protection if they win that, bc we authors don’t have ANYTHING protecting us from AI…” 

Hari Kunzru, a novelist who has written six novels including White Tears and Red Pill, tweeted that writers in other fields should be paying attention to the WGA negotiation surrounding AI because many questions have now been raised about the future of their work. 

“The question for writers is transparency about how work is generated and proper compensation for something that is being used for training, as opposed to just, you know, buying a draft of a script. I don't want a model out there that has been trained on my work unless I'm in control of that model,” Kunzru told Motherboard, adding that if writing is used beyond the scope of rights a corporation holds, writers should have recourse to address that.

Kunzru hopes that the industry can first fight for a good deal for the workers making a living on their art. “In the short term, we really need to maintain the already quite fragile ecology of the creative industries,” he said.