

Screenwriter's Block? This Artificially Intelligent Algorithm Could Help

Composer and filmmaker Dr. Alexis Kirke's 'Zenman' system allows for collaborative human/machine script writing.

Batwars: The Four Awaken. Image courtesy of Plymouth University

Everyone harbours ambitions to one day write that novel we all supposedly have in us, or maybe a screenplay for that Clerks-meets-Mad Max indie you've been incubating for years. But creative writing is tough, so wouldn't it be great if there was an AI that could help us?

Before you answer that question, take a look at what composer, filmmaker, and musical experimenter Dr. Alexis Kirke has been developing. Kirke, a senior research fellow at Plymouth University, where he runs the annual science-and-music collaborative jamming session that is the Peninsula Arts Contemporary Music Festival, has created a movie scriptwriting algorithm called Zenman. The name combines algorithmic composer Iannis Xenakis (pronounced Zen-a-kiss) and Oscar-winning screenwriter William Goldman, and Kirke has been using the system both to aid in writing film scripts and in musical performances.


"The computer works as a collaborator with the human writer," Kirke explains to The Creators Project. "A feature length script can be an overwhelming piece of material for a less experienced writer. So systems that can provide possible ideas to bounce off the writer could be helpful."

Sophisticated algorithms like Automated Insights' Wordsmith software are already being used in a journalistic capacity to help write articles, mainly financial, data-based stories, published by outlets like the Associated Press and Yahoo. Other algorithms are auto-writing reference books. Sure, these algorithms may struggle or fail at more complex, creative writing, but it's early days yet. And that's the thinking behind Kirke's project: it has to start somewhere.

Alexis Kirke. Image courtesy of Plymouth University

Kirke notes that algorithmic writing is a couple of decades behind algorithmic music composition—commonly used in creative collaborations—and that's something he hopes to address. "This is because musical notes are much more freeform than words. You can play a few random white notes on a piano with a nice rhythm and then if you repeat it, you have a tune!” Kirke says on his website, “Written language is far more complex, put a few nice words together and it will often mean nothing.”

It's not just words, either: words form sentences, sentences form narratives, and then there are characters and dialogue, which add further layers of complexity. “The truth is, we’re not as good at doing this as we are at using computers to write music. And this has put many researchers off. However, unless we begin to take steps in this direction, how will we ever make progress?"


Zenman works by being fed different scripts, which it analyzes. It can then create characters, either based on those in the scripts or by merging elements of several to form new hybrids, and from there it can generate dialogue. "I've incorporated emotions into the artificial characters," Kirke says. "Depending on what is said to them by other characters, it changes how they feel. And what they say next depends partly on how they feel. So if they're feeling happy they'll tend to say happier things. If they're feeling sad, it'll be sadder."
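Kirke hasn't published Zenman's internals, but the loop he describes (incoming lines nudge a character's mood, and mood biases what the character says next) is simple to sketch. Everything below, from the class name to the word lists, is an illustrative assumption, not Zenman's actual code:

```python
import random
import re

# Crude lexicon-based sentiment, purely for illustration.
HAPPY_WORDS = {"great", "wonderful", "friend", "love", "hope"}
SAD_WORDS = {"lost", "alone", "dark", "fear", "never"}

def sentiment(line):
    """Score a line in [-1, 1] by counting happy vs. sad words."""
    words = re.findall(r"[a-z]+", line.lower())
    score = sum(w in HAPPY_WORDS for w in words) - sum(w in SAD_WORDS for w in words)
    return max(-1.0, min(1.0, score / 3.0))

class MovieBot:
    def __init__(self, name, happy_lines, sad_lines):
        self.name = name
        self.mood = 0.0  # -1 (sad) .. +1 (happy)
        self.happy_lines = happy_lines
        self.sad_lines = sad_lines

    def hear(self, line):
        # Mood drifts partway toward the sentiment of what was just said.
        self.mood += 0.5 * (sentiment(line) - self.mood)

    def speak(self):
        # Happier mood picks from happier lines, as Kirke describes.
        pool = self.happy_lines if self.mood >= 0 else self.sad_lines
        return random.choice(pool)

luke = MovieBot("Luke",
                happy_lines=["There's still good in him, I know it."],
                sad_lines=["I can't do it. It's too big."])
luke.hear("You have lost, and all hope is dark")
print(luke.mood, luke.speak())
```

The drift in `hear` means a single gloomy line colors the next reply without permanently fixing the character's mood, which matches the moment-to-moment emotional responsiveness Kirke describes.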

At this year's Peninsula Arts Contemporary Music Festival, which took place a few weeks ago, Kirke staged a performance called Batwars: The Four Awaken, in which his moviebots took on the personalities of Luke Skywalker, Darth Vader, Superman, and Batman.

In the live performance, the bots react to the music (and to audience tweets linked to a unique hashtag), in turn generating dialogue. Their emotions are affected by the speed, loudness, and pitch of the specially composed music, played on piano and violin. The bots also respond to Kirke's brainwaves, read via an EEG headset, feeding off how relaxed or anxious he feels.
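The article doesn't say how Kirke translates those signals into emotion. A common approach in affective computing, and the assumption in this sketch, is to map them onto a valence/arousal model: tempo and loudness drive arousal, register drives valence, and the EEG relaxation reading damps the whole thing. The function name and all the weightings here are hypothetical:

```python
def music_to_emotion(tempo_bpm, loudness, pitch_height, eeg_relaxation):
    """
    Map live music features plus an EEG relaxation reading onto a
    (valence, arousal) point. The mapping is an illustrative
    assumption, not Kirke's published method.

    tempo_bpm:      beats per minute of the performance
    loudness:       0..1 normalized level
    pitch_height:   0..1, low to high register
    eeg_relaxation: 0..1, from the performer's EEG headset
    """
    # Fast, loud music raises arousal; a relaxed performer lowers it.
    arousal = 0.5 * min(tempo_bpm / 180.0, 1.0) + 0.5 * loudness
    arousal *= 1.0 - 0.5 * eeg_relaxation

    # A higher register reads as more positive valence here.
    valence = 2.0 * pitch_height - 1.0  # -1 .. +1

    return valence, arousal

# An agitated low-register passage with a fairly anxious performer:
v, a = music_to_emotion(tempo_bpm=160, loudness=0.9,
                        pitch_height=0.3, eeg_relaxation=0.1)
```

The resulting (valence, arousal) pair could then feed a mood variable like the one the characters carry, so the same piece of music pushes different bots in different emotional directions depending on where they start.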

Luke Skywalker. Screengrab

To create, say, Luke, Kirke fed the system the three original Star Wars scripts from episodes IV, V, and VI along with some fan scripts. Zenman then created the character from all the lines of dialogue spoken by Luke and all the actions referring to him. The system also stores, for context, the preceding lines of dialogue each of those lines responds to. "Also I could train the moviebots," Kirke explains. "This involved me kind of chatting to them and correcting them into being 'more like' their characters. Each moviebot had about five to seven training sessions."
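Building a character this way amounts to walking a screenplay and collecting every line attributed to that character, along with the line it responds to. A minimal sketch, assuming a simplified screenplay format (an all-caps character cue, then the dialogue line beneath it); this is the general idea, not Zenman's actual parser:

```python
import re

def collect_character_lines(script_text, character):
    """Return (previous_line, line) pairs for one character from a
    simplified screenplay where an ALL-CAPS cue precedes each line
    of dialogue. Real screenplay parsing is messier than this."""
    pairs = []
    prev_line = None  # the line this character is responding to
    cue = None
    for raw in script_text.splitlines():
        line = raw.strip()
        if not line:
            continue
        if re.fullmatch(r"[A-Z][A-Z .'-]*", line):  # a character cue
            cue = line
        elif cue:
            if cue == character.upper():
                pairs.append((prev_line, line))  # keep context
            prev_line = line
            cue = None
    return pairs

script = """
VADER
I am your father.

LUKE
That's not true. That's impossible!
"""
print(collect_character_lines(script, "Luke"))
# [('I am your father.', "That's not true. That's impossible!")]
```

Pairing each line with its predecessor is what lets a bot later learn not just what a character says, but what kinds of lines provoke which replies.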


Outside of the performance, Kirke interacts with the bots and their generative behaviors using musical instruments, in what sounds like an improvisational jazz setup, except that he's jamming with an algorithm rather than other musicians, and the output isn't music but film scripts. It all makes for a curious and fascinating way to generate movie ideas.

"I’d rate it as being like the early algorithmic composition systems," Kirke notes, “not great, but it can give me, as a screenwriter, ideas. I can feed in my own partially-completed scripts and mix them up with ones I like, to help in completing them.”

The next stage of the project is to create an algorithmic film and to develop Kirke's programming language, FILM, "which allows emotional agents to be specified by code in a similar format to a film script."

So where is all this heading: a future where bots autonomously write lines of dialogue and formulate stories, or one where human/machine creative writing collaborations are far more common? Both, it seems. Kirke thinks that in the distant future basic stories and dialogue will be created entirely by computer algorithms, but that in the nearer term algorithms will help with what he refers to as "Computer-aided Script Writing (CASW)": collaborations where the writer takes the lead, importing their own and other scripts into a system like Zenman and using it to generate ideas for characters.



"The truth is that in the next 20 years we'll be using a lot of machine created music, and in the next 35 years a lot of machine created stories. Who can deny this?" notes Kirke. "Language, character, and story are harder to deal with than musical form and harmony, but they are not impossible tasks. We are already seeing storytelling tasks the likes of which no one has had to accomplish before: take the writing of Star Wars VII—how many constraints and pre-elements had to be taken into account into generating a new part of a seven-part movie arc embedded so deeply in our modern culture? Computer-aided storytelling may help to enable such complex long-term storytelling projects to be more easily done. They may enable us to generate multiple versions of a story which can then be embedded into an adaptive film that adjusts during its screening. And, given that Toy Story could now probably be animated in real-time, the combining of computer-script-writing with computer direction and cinematography (already active areas of research in animation) could provide openings for unimaginably dynamic storytelling experiences. And when does a highly dynamic story become a game, become an environment, become a virtual reality? This of course is the dream of the director—to immerse the viewer so much in the reality of the protagonist that they become one with the director's intentions. Technology is driving towards a merging of story and virtual environment, until perhaps our whole life can become an augmented reality narrative embedded within everyone else's, and then we will discover just how real the shadows are on the wall of Plato's cave."



