

Emergence is the Perfect AI Dance Partner

John McCormick creates dances between performers and what he calls an "artificially intelligent performing agent."
November 13, 2014, 6:00pm
A still from Recognition

An interactive, virtual dance partner—it’s what we’ve always dreamed of. And while it may not take the form of our fleeting fantasies, there’s something decidedly alluring about the crystalline contortions of Grevillea Crystalis Incarnadine. It’s bright, it changes color—it even has its own unique sense of style. And what’s more? It’s actually intelligent.

What you’re watching is Emergence, a dance duet between a dancer and what artist, choreographer, and researcher John McCormick refers to as an "artificially intelligent performing agent." McCormick, the developer of the work and a specialist in dance technology, has his fingers in many pies, having already carved an extensive career across platforms including virtual reality, new media art, dance and choreography.

Emergence—one of his latest projects, in collaboration with dancer Steph Hutchison—is interactive and semi-improvised. The "digital agent," as McCormick calls it, forms one half of the performing duo, employing neural network technology to electronically mimic human brain cells. The performance also hinges on real-time motion capture, another of McCormick’s specialties. Sensors capture the dancer’s movement, which is fed to the agent, which in turn interprets these movements as a vibrant and kinetic visual projection. The human dancer then uses these visualisations to improvise a new and original dance piece.
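The pipeline described above—motion-capture data flowing into a neural network that drives the visuals—can be sketched in miniature. This is a hypothetical illustration only, not McCormick's actual system: the joint counts, parameter names, and randomly initialised weights (standing in for a trained network) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_JOINTS = 20          # x, y, z per joint -> 60 inputs per mocap frame
N_HIDDEN = 32
N_VISUAL_PARAMS = 4    # e.g. hue, brightness, scale, rotation (illustrative)

# Randomly initialised weights stand in for a network trained in rehearsal.
W1 = rng.normal(0, 0.1, (N_JOINTS * 3, N_HIDDEN))
W2 = rng.normal(0, 0.1, (N_HIDDEN, N_VISUAL_PARAMS))

def agent_response(joint_positions):
    """Map one mocap frame (N_JOINTS x 3 array) to visual parameters in [0, 1]."""
    x = joint_positions.reshape(-1)           # flatten the frame to a vector
    hidden = np.tanh(x @ W1)                  # non-linear hidden layer
    return 1 / (1 + np.exp(-(hidden @ W2)))  # squash outputs to [0, 1]

# One captured frame of dancer movement (random stand-in data).
frame = rng.normal(0, 1, (N_JOINTS, 3))
params = agent_response(frame)
print(params.shape)  # (4,)
```

In a live setting this mapping would run once per captured frame, so the projection reacts continuously as the dancer moves.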

In developing the work, McCormick turned to real-life techniques that human dancers use to create choreography and develop work together. Accordingly, two such tasks given to the AI agent were having one dancer copy another to learn new movements, and creating variations on another dancer’s movement phrases.

The Creators Project sat down with John McCormick to learn more about artificial intelligence, interfacing with our environment, and how to teach a robot to dance.

A still from Instrumental

The Creators Project: Are you more an artist working with technology or a scientist working with art?

John McCormick: Definitely an artist working with technology. My background is mainly in dance and visual media although also in science. All the things I do are with an artistic intention. The challenge is the artwork—how to design things to fulfil that aim.

Let’s talk about Emergence—what kinds of themes does it explore?

Mainly themes of how we are becoming embedded within the environment that we operate in. Within the dance performance we’re looking at the electronic and visual environment, but that can easily be translated to everyday life. The way that we are now, technology is all around us and we are constantly interfacing with lots of different systems. I was interested in those systems not simply being passive but in having them try to understand what the person—the dancer—was doing, and almost then to become a performing partner.

Is this the quality of "emergence" that the title refers to?

Emergent behaviour is almost like a byproduct—something that comes about from a system and the interactions of the elements within that system but isn’t programmed into it. Emergence is about the kind of behaviour that an artificially intelligent agent might come up with just through the process of learning and performing with the dancer and trying to be creative in its own right. Rather than programming the agent to react in a certain way it had to learn how to move for itself by rehearsing, in the same way that a less experienced dancer might learn from a more experienced one.
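The idea of an agent learning to move by rehearsing with a dancer, rather than being explicitly programmed, can be sketched as a toy imitation-learning loop. Everything here is an assumption for illustration: a synthetic 6-dimensional "pose" signal stands in for rehearsal data, a least-squares linear model stands in for learning, and small random perturbations stand in for the agent's own variations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fake "rehearsal" data: 200 frames of a smooth 6-dimensional pose signal.
t = np.linspace(0, 4 * np.pi, 200)
poses = np.stack([np.sin(t + k) for k in range(6)], axis=1)

# Learning by copying: fit a model that predicts the next pose from the
# current one, using ordinary least squares.
X, Y = poses[:-1], poses[1:]
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def improvise(start_pose, steps=5, variation=0.05):
    """Roll the learned model forward, perturbing each step to create variation."""
    pose, out = start_pose, []
    for _ in range(steps):
        pose = pose @ W + rng.normal(0, variation, pose.shape)
        out.append(pose)
    return np.array(out)

sequence = improvise(poses[0])
print(sequence.shape)  # (5, 6)
```

The perturbations mean the generated sequence resembles, but never exactly repeats, what was learnt in rehearsal—a crude analogue of the emergent, semi-improvised behaviour McCormick describes.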

Could you briefly take me through the process of developing the work?

I’d been working with dancers, projected 3D environments, and movement tracking for quite a number of years. But tracking the dancers’ movements and then generating sound and visuals based on that movement tends to be a one-way thing—basically a cause-and-effect type activity. I was interested to see if we could develop it a bit further so the environment could make its own decisions—rather than being told what to do, it could respond a bit more like an actual performing partner, more improvised.

From McCormick's Verbose Mode

Your installation Recognition @ Cube 37 last year involved a 3D projection of a large morphing eye that responded to the movements of passersby. How does Emergence build upon that work?

Recognition is one of the dance pieces in Emergence. Originally that was actually a dance performance as well—the eye would change shape according to the movements of the dancer or if she wasn’t there it would use movements it had learnt from her earlier. We wanted to take the work out into a public setting and see how it would react and how people would interact with and respond to it.

And was your 2013 work, Reproduction—very comprehensively described by you as an ‘8 screen, immersive, motion-sensitive, artificial life-based, mixed reality artwork’—a precursor, too?

That was some of my earlier work on artificially intelligent performing agents. Reproduction involved smaller entities created out of code: little audiovisual creatures that could sing and change color. When a person entered the installation they were assigned one of nine species, and the entities of that species would swim out to them, cluster, and sing, depending on what that person was doing. In that work the entities were smaller and more specific, while in Emergence we have started to look at human-scale, humanoid performances.

To call it an "artificially intelligent performing agent" seems kind of impersonal. Did you ever give it a name?

No we didn’t! I’m not sure why, we just called it "him" and "it." Or "her" sometimes—it changed gender all the time. I think we didn’t want to go too far in making it into a person. It’s still a beginning, a half step between a real performer and something that is still emerging and could go a lot further. Maybe if it starts becoming a lot smarter then we might give it a nickname.

Steph Hutchison dances in Instrumental

Is Emergence an ongoing work? Will we be seeing more parts?

Yeah, it is, we’ll definitely keep working on it. I think that’s the way with artwork in general. A lot of artists have a theme that they keep working on and developing, even if it does change dramatically. Emergence is the same—it’s opened up some possibilities that we’d like to follow.

On that note, what’s the next project for you? 

Right now, at this very moment, I’m working on taking that same process and applying it to robots. So instead of programming a robot to move, you try to get it to learn that movement from a person. It’s using similar techniques to those in Emergence, but applying them to a physical object, something real that you can touch and move around in physical, 3D space.

Why the move from virtual to physical?

We’re currently working on how to interpret dance works for vision-impaired people. One way we’re looking at doing that is by using a robot that a person could put their hands on, and that would describe the shapes the dancer is making and the dynamics of that movement. That way, the person could get a sense of what the dancer is doing. So it’s a system that interprets a dance performance for vision-impaired people through touch rather than through vision. It’s going to be an ongoing research project for us.

