
Meet the Guy Using Robots to Write Award-Winning Films

How long will it take for his creepy bot to win an Oscar?

Over the Christmas break, every single thing Ross Goodwin said was recorded. For two months, he walked around with a lavalier mic hooked to his shirt, keeping tabs on every conversation, every private utterance. Ross wasn't on a reality show. The NYU-trained technologist was recording himself for a very specific purpose: all of those hours of recordings are going to be transcribed into text, and then he's going to feed them into a "neural net," basically a computer imitation of the human brain.


"I'm training a neural net on that transcript and making a bot that talks like me forever," Ross explains, matter-of-factly. "It's literally going to be a mannequin wearing my clothes that whispers in my voice, forever. And then every two to nine years I add a new mannequin… So that's one project I'm working on." There's a silence over the phone, as though he's expecting me to ask about the specifics of the arrangement. All I can think is, "Why, dear god, does it have to whisper?"

But there's no time to indulge my fear of impending robot domination. We're here to talk about another of Ross' projects: a robot that uses similar technology to absorb millions of lines of movie scripts, learn their structure and quirks, and spit out an original film. Two of these robo-written films, Sunspring and It's No Game, have already been made, directed by Ross' collaborator and college buddy, Oscar Sharp. The two met in a cross-discipline class at NYU, and the idea of designing a robot that could write screenplays actually came from Oscar, a film major.

"We started working on it… and didn't really get very far," Ross admits. "I was not using machine learning at that time, I was just using support algorithms like Markov chains and context-free grammars and template systems."

Googling "context-free grammars" lands you on an extremely complicated explainer that must require its own dictionary to read. And it's so complex because, basically, in order for a robot to be able to understand what it's being asked to write, the entirety of the English language needs to be broken down into a set of rules that a computer can decipher. Hundreds of thousands of words, grammar, tone, structure—all distilled into 1s and 0s.


Ross' first breakthrough was a project called Word Camera. "I was thinking about how to generate action descriptions that would describe a scene without going into any character's head—because in a screenplay you can only describe what can be seen by a viewer," he explains. "So I thought, Why not start with a photograph?" On the surface, it seems simple enough. But when you break it down, it's an insanely complicated ask. Take this gorgeous Autumnal still life:

A human can look at this picture and immediately say: there are three apples on a wooden tray, a pink-and-white rose, a lit yellow candle on a candlestick, some silky fabric, and a glass thing. But how could a robot know that? In milliseconds it could tell you the exact colour of each and every pixel in the image. How would it know, though, how those pixels stack up into forms: where one #B33029 red apple ends and a piece of #42493F grey silk begins?

This is where the neural net comes in. The technology can be taught to recognise what certain objects are by "reading" millions of images. And, depending on who you ask, the results are either nightmarish or psychedelic dreamscapes. There's perhaps no better example than the artificial intelligence that enables Google's reverse image search, and sees a world of eyeballs and dogs.

What fresh hell is this? Image by Google Deep Dream
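For a sense of what that training buys you, here's roughly how you'd ask an off-the-shelf pretrained network to label a photo in Python. This is only an illustrative sketch using torchvision's ImageNet-trained ResNet, not what Word Camera actually ran, and the filename is hypothetical.

```python
import torch
from PIL import Image
from torchvision import models

# Load a network pretrained on ImageNet (about 1.2 million labelled photos).
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
model.eval()

# Preprocess a hypothetical photo the same way the network was trained.
image = Image.open("still_life.jpg").convert("RGB")
batch = weights.transforms()(image).unsqueeze(0)

# Ask the network what it thinks is in the picture.
with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

top5 = probs.topk(5)
for p, idx in zip(top5.values.tolist(), top5.indices.tolist()):
    print(f"{weights.meta['categories'][idx]}: {p:.1%}")
```

A network like this only returns labels ("candle", "Granny Smith"), so a system like Word Camera still needs another layer on top to turn those labels into flowing description.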

For Oscar though, the technology always seemed to come back to film. So he asked Ross to use his new skill with neural nets to generate a film script. "I said, 'Yeah, I don't see why not.' And a few months later, we made Sunspring using that technology," Ross explains. Fronted by Silicon Valley's Thomas Middleditch, the strange sci-fi short traces a bleak future where mass unemployment forces young people to sell blood.


And the film's Beckettian dialogue reads like something out of Waiting For Godot. "When [these machines] start training they are literally spitting out just random sequences of letters, trying to make a word," Ross explains. "Then, over time—over the course of about a week—they learn what valid words are, how to string words together into sentences, and then how to string sentences together into coherent dialogue that makes sense for short periods." Take this exchange from the script:

H2
You should see the boys and shut up. I was the one who was going to be a hundred years old.

H
I saw him again. The way you were sent to me… that was a big honest idea. I am not a bright light.

C
Well, I have to go to the skull. I don't know.

"Sunspring is not total nonsense, actors were able to memorise those lines," Ross says of this first attempt. "It bears some resemblance at least to what we see in films generally. It's just a little weirder, I guess."

It's No Game, Ross and Oscar's second attempt, is a noticeable step up. It just won third prize in the Sci-Fi-London 48-Hour Film Challenge—facing off against 300 other teams, and more than 2,000 writers. And there are so many things to say about this short, and what it means for filmmaking as an art, but then there's also the fact it stars David Hasselhoff in the role he was born for: the Hoffbot.

The Hoffbot was trained on subtitles from Knight Rider and Baywatch. "To train a really solid long short-term memory (LSTM) neural network, I would say you need about 20 megabytes of text or more, which is about five million words," Ross says. "That's more than what most people have written in their lifetimes." If you wanted to train a machine to generate a sci-fi script, you'd feed in everything from 2001: A Space Odyssey to Ex Machina. Want to write an Aaron Sorkin teleplay? The scripts from a couple of episodes of The West Wing should give you the requisite five million words to build a Sorkinator.


Then, over the course of a week or two, the machine will go over these lines again and again, slowly building up a statistical model of what's most likely to be said next, given what's just been said. It's all just patterns, which is why the machine's scripts tend to be—by Ross' own admission—"riddled with cliches."
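As a very rough sketch of that idea, and emphatically not Benjamin's actual code, here's a minimal character-level LSTM in PyTorch. It learns, character by character, a statistical model of what's likely to come next given what came before, then samples from that model to spit out new text. The toy corpus and every number here are placeholders; a real run, as Ross describes, would chew through tens of megabytes of screenplays over days.

```python
import torch
import torch.nn as nn

# A toy corpus standing in for ~20 MB of screenplay text (illustrative only).
text = "INT. SHIP - NIGHT\nH: I am not a bright light.\nC: I don't know.\n" * 200
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text])

class CharLSTM(nn.Module):
    def __init__(self, vocab, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state

model = CharLSTM(len(chars))
optimiser = torch.optim.Adam(model.parameters(), lr=3e-3)
loss_fn = nn.CrossEntropyLoss()

# Training: predict each character from the characters that came before it.
seq_len = 64
for step in range(200):  # a real run would take days, not 200 steps
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i:i + seq_len].unsqueeze(0)          # input characters
    y = data[i + 1:i + seq_len + 1].unsqueeze(0)  # same sequence, shifted by one
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()

# Generation: feed each sampled character back in to pick the next one.
idx = torch.tensor([[stoi["H"]]])
state, script = None, "H"
for _ in range(200):
    logits, state = model(idx, state)
    probs = logits[0, -1].softmax(dim=-1)
    idx = torch.multinomial(probs, 1).unsqueeze(0)
    script += chars[idx.item()]
print(script)
```

Early in training a model like this really does emit near-random strings of letters; only after many passes over the text do word shapes, then sentence shapes, start to appear.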

"It's not that we're saying these film scripts are perfect, we're trying to ask the question, What is a perfect film script?" he says. It's really not a world away from how human screenwriters learn, watching hundreds if not thousands of films, studying screenplays line-by-line to figure out structure. Benjamin is just more formal.

Oh that's right, the robot's name is Benjamin. Or at least that's what it told a crowd—totally unprompted—last year at the Crunchies Awards. "What's next for you?" the interviewer asked during a talk about Sunspring. "My name is Benjamin," came the robot's reply.

"It's said lots of creepy things," Ross says. "There was one time I asked it—because when you train it, you can actually ask it questions and it'll respond—so I asked it, "Where did you grow up?" And it said, 'I didn't have to do that.' All the time I'm surprised and shocked."

So I ask Ross the question that Benjamin raises for any writer who constantly questions their ability and worth: How long is it going to be until this creepy robot takes my job and wins an Oscar for best original screenplay?

"I don't think anyone wants to see machines totally replace humans in creative pursuits, I think you want to see machines aid humans in creative pursuits," he says. "I can totally see a machine helping a human write best original screenplay. I mean, I think the fact that we've won an award this year with It's No Game is testament to that future being quite likely."

Ross Goodwin will be speaking in Sydney on June 2 at VIVID, and in Melbourne on June 5.

Follow Maddison on Twitter and Instagram