A bedroom, two dancers, and an army of drones closing in: drone choreography master Daito Manabe's new music video for longtime collaborator Nosaj Thing's "Cold Stares," featuring Chance The Rapper + The O'My's, sounds like a dystopian nightmare, reads like Noh theater, and watches like a robot's perspective on a David Mamet play. It's at once intimate, alienating, and absolutely fascinating; set inside a sparsely decorated void, the Elevenplay dancers, with choreography from MIKIKO, enact a fever dream of transhumanist proportions. The first piece from Nosaj Thing and Manabe since 2012's cutting-edge projection-mapped dance music video, "Eclipse/Blue," it's a meditation on how we see each other—and how machines see us—that looks to the past for its emotional depth, and to the future for its place in human-drone history.
Curious as to how Nosaj Thing and Daito Manabe approached their collaboration this time around, The Creators Project reached out to both artists to talk production, improvisation, and how to get drones to dance not for you, but with you:
The Creators Project: The last time you collaborated, the result was definitely an outside-in sort of thing—a long shot of projections on dancers. This time, it feels wholly different. Can you tell me about how both of your ideas on technology evolved from "Eclipse/Blue" to "Cold Stares?"
Nosaj Thing (NT): It's moving so fast. I'm always changing the way I write beats now... Probably a few times a year.
Daito Manabe (DM): Last time, we expressed "Eclipse" using a projector, a screen, and two dancers; there were two dancers because of the concept, "light and shadow." This time, we focused on the themes of "real and virtual worlds" and "the interweaving of sight," based on the song "Cold Stares" and its lyrics. We wanted to switch POVs between drones with an intensity that takes the audience back and forth between the real and virtual worlds. But I guess what the audience feels comes through the choreography more than through the technical expression.
How much back-and-forth did you guys have between 2012 and 2015?
NT: Not too much actually... We see each other just a few times a year.
Can you tell me about how your artistic collaboration was mediated between the two videos? I.e. do you Skype, meet up in person to chat, visit each other's studios, or what?
NT: We usually communicate through email and talk ideas when we are in Tokyo or LA.
DM: When I went to LA during Perfume's concert tour, we hung out over dinner. Likewise, Jason stayed at my house when he visited Japan. I think both of us see each other as friends rather than just business associates.
At some point, he brought a demo and I listened to "Cold Stares." It gave me some inspiration and I came up with ideas for the video, so I asked if I could make it. After the initial stage, I set the technical direction. By that point, our team, Rhizomatiks Research, had built up the experience to handle it.
Then we moved into the actual production process. Rhizomatiks Research and Ray Inc. built the technical system and environments for production and post-production. As last time, Takcom, Ray, and P.I.C.S. created the VFX, MIKIKO choreographed, and the dancers came from Elevenplay. Finalizing the video took a few months of post-production work on the captured data, so we sent [Nosaj Thing] many links to half-finished videos, telling him what we had tried and what we wanted to do each time.
My biggest question regarding the video itself is—where does it take place? Is it in the real world, with drones swarming, in the drones' world, with humans as their subjects, or in a third place—somewhere in between the real and the virtual?
NT: It was filmed in a studio in Tokyo... I'm still processing it myself. Daito has the answers.
DM: It takes place in both worlds, "the real and the virtual," through techniques such as drone control, 3D scanning with 64 RGB cameras, motion capture, and AR. That is, we switch between the worlds as the video unfolds. By using scanning systems more sophisticated than a Kinect, we succeeded in making smooth transitions between the real and the virtual.
This is a method we improved on since Perfume's SXSW performance.
There's a certain poetry to the idea of drones seeing humans as human, rather than as lines of code, or however computers "see" things. Was this an attempt to humanize drones, or further our own feelings of detachment?
DM: If you watch the making-of videos, I think you can figure out how this structure works.
In some parts, the drones' movement is based on motion data captured by hand. It's pre-recorded, so maybe that's why it feels "human."
Other times, we animated CGI through the motion capture of an actor or dancer. There are still challenges in making an existing robot move exactly the same way as a real dancer. On the other hand, putting a drone in motion by hand and recording it under this system is much handier. There are a few limitations, but the drones can fly along the exact same lines and paths.
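The record-and-replay idea Manabe describes can be sketched in a few lines: a hand-guided motion is captured as timestamped waypoints, and interpolating between them lets every replay fly the exact same path. This is only an illustrative sketch — the function names, sampling format, and data here are assumptions, not Rhizomatiks' actual system:

```python
def record_path(samples):
    """Store hand-captured (t, x, y, z) samples as a replayable path.
    In practice `samples` would come from a motion-capture rig;
    here it is plain illustrative data."""
    return sorted(samples, key=lambda s: s[0])

def position_at(path, t):
    """Linearly interpolate the recorded path at time t, so every
    replay follows the exact same line through space."""
    if t <= path[0][0]:
        return path[0][1:]
    for (t0, *p0), (t1, *p1) in zip(path, path[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return tuple(q0 + a * (q1 - q0) for q0, q1 in zip(p0, p1))
    return path[-1][1:]

# A tiny hand-recorded arc: (time_s, x, y, z)
path = record_path([(0.0, 0, 0, 1), (1.0, 1, 0, 2), (2.0, 1, 1, 2)])
print(position_at(path, 0.5))  # → (0.5, 0.0, 1.5)
```

Because playback samples the same interpolated curve every time, the drone repeats the gesture exactly, which is the limitation-for-repeatability trade-off Manabe points to.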
How hands-on were the both of you during the actual making of the video?
NT: I wasn't there! I usually just send Daito a new track and he sends me test videos. I trip on all of them. In the video it feels like technology is trying to catch up with his ideas. It's raw while being highly technical.
DM: Before shooting, we had already calibrated the system and created the path lines for the drones' motion, so we didn't focus on interaction so much. That doesn't mean there is no interactivity, though: many calibration cameras analyzed the drones' positions in real time and controlled them with feedback. In other words, there is without a doubt some kind of interaction.
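What Manabe describes — cameras measuring each drone's position and feeding corrections back — is, in essence, closed-loop position control. A minimal one-axis sketch using a simple proportional controller (the gain, names, and numbers are illustrative assumptions, not the actual Rhizomatiks setup):

```python
def control_step(target, measured, gain=0.5):
    """One proportional-control update: the camera system measures the
    drone's position, and the controller commands a velocity correction
    toward the choreographed target. The gain value is an assumption."""
    error = target - measured
    return gain * error  # commanded velocity along this axis

# Simulate a drone being steered toward a choreographed waypoint at x = 2.0
x, dt = 0.0, 0.1
for _ in range(100):
    x += control_step(2.0, x) * dt  # apply the commanded velocity for dt
print(round(x, 2))  # converges toward 2.0
```

The feedback is what keeps the drones on their pre-planned paths despite drift, wind, and inertia — the "interaction" happens between the cameras and the drones even when the choreography itself is fixed.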
Of course the video is highly choreographed, but my last question has to do with how much room you allow for both human and machine error. Is it in mistakes, glitches, and improvisations that humans recognize our own machinery, and machines take on a more human nature?
NT: It's both. I get inspired by mistakes and improvisation all the time.
DM: In actual space, the dancers require 4 x 4 x 3 meters, and the drones require 6 x 6 x 6 meters. Compared with robots that move more mechanically, the drones constantly balance themselves in the air against aerodynamics and inertia, which might be what makes them seem as vivid as humans. I think this is a unique point of controlling actual objects in the real world, beyond simulations, and we composite augmented reality on top of it.
"Cold Stares" ft. Chance The Rapper + The O'My's is off Nosaj Thing's third album, Fated, out now via Innovative Leisure / Timetable. Click here to check out the album on iTunes.