Humanoid Robots Are Getting Really Good at Making Art

Social 'bots can dance, draw, act, and make music with humans. In other words, we’re getting really good at engineering art.
"Paul" the drawing robot sketches a portrait, via YouTube

Robots aren’t just taking over our mundane, tedious, menial labor jobs anymore. Increasingly, they are learning social intelligence, interacting realistically with humans, and adopting creepily human-like qualities. Today's machines are expressive, emotive, and even creative. In other words, we’re getting really good at engineering art.

Some of the most impressive life-like technologies were showcased yesterday at the Living Machines exhibition, part of a week-long conference on biological machines at London’s Natural History Museum and Science Museum. The conference looked at robots that mimic the natural world, from animals and plants to human psychology. Nowadays, machines can mimic what’s in our heads almost as well as how our bodies work, and one result is that robots are becoming seriously impressive artists.

You’ve probably heard of the iCub, one of the most advanced humanoids, known for being able to interact with the world and learn from it just like a toddler does. The iCub was on display yesterday, showing off its latest skills: dancing and making music.

The robot is open source, so anyone can "teach" it new skills. One project is a collaborative DJ game where man and machine create music together. The iCub jams with humans on a new turntable-style instrument called the Reactable.

Other researchers are using the iCub’s learning ability to teach it to dance to music. Instead of acting out a pre-programmed series of motions, as dancing robots have done in the past, the machine can respond in real time. It can also recognize and understand the movements of a human dance partner, and appear to show emotion, signaling that it’s having a good time.

With the iCub Simulator, the robot mimics the movements of anyone standing in front of its sensors, using face-tracking technology. This way, it can learn fluid movements instead of typically rigid ones (goodbye robot dance). To show off these new moves, researchers made a video of the iCub raving in real-time.
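To get a feel for how that kind of real-time mirroring works, here’s a minimal Python sketch of the idea: read the tracked person’s pose, ease the robot’s joints toward it, repeat. The track_person and send_to_robot functions are hypothetical stand-ins, not the iCub’s actual interfaces, and the exponential smoothing is just one simple way to get fluid motion.

```python
import time

def track_person():
    """Hypothetical stand-in for the body-tracking sensor feed.
    Returns normalized joint angles (radians) for the person in view."""
    return {"left_arm": 0.8, "right_arm": -0.3, "torso": 0.1}

def send_to_robot(pose):
    """Hypothetical motor interface; the real robot is driven through
    its own open-source middleware, which isn't shown here."""
    print(pose)

def smooth(previous, target, alpha=0.2):
    # Exponential smoothing: ease toward the observed pose instead of
    # snapping to it, which is what keeps the motion fluid rather than
    # stereotypically robotic.
    return {joint: previous.get(joint, 0.0)
            + alpha * (angle - previous.get(joint, 0.0))
            for joint, angle in target.items()}

pose = {}
for _ in range(100):        # ~5 seconds of mirroring at 20 Hz
    pose = smooth(pose, track_person())
    send_to_robot(pose)
    time.sleep(0.05)
```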

Also on display at the conference was “Paul,” a robot designed by French engineer and artist Patrick Tresset that uses the latest in artificial intelligence and cognitive computing to sketch a portrait almost perfectly, either by looking at a person posing in front of it or from memory.

As is the case with dancing, robots that can paint and draw are nothing new, but an element of autonomous creativity is being introduced. Art 'bots are moving beyond what is essentially a photocopy (recreating a digital image stored after a photograph is taken) and can now make creative decisions on their own.

The e-David made headlines earlier this month for being the first robotic painter that can adapt its strokes as it paints. The software takes a digital snapshot of its work in progress and continually analyzes it, judging lighting and shading to decide where the next stroke should go. It joins a growing crop of autonomous artistic bots making beautiful work, like the BNJMN and the Painting Fool.
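The core of that approach is a feedback loop: photograph the canvas, compare it to the target image, and put the next stroke wherever the difference is biggest. Here’s a deliberately crude Python sketch of that loop. It isn’t e-David’s actual software, just the idea in miniature, with a faked camera and a "brush" that darkens a patch of pixels.

```python
import numpy as np

# Stand-in for the source photograph the robot is told to paint.
target = np.random.rand(64, 64) * 255

# Start from a blank white canvas; in the real system this would be a
# fresh camera snapshot of the physical canvas after every stroke.
canvas = np.full((64, 64), 255.0)

def paint_stroke(canvas, y, x, tone, radius=2):
    """Crude stand-in for a brush stroke: set a small patch to a tone."""
    canvas[max(0, y - radius):y + radius + 1,
           max(0, x - radius):x + radius + 1] = tone
    return canvas

for _ in range(500):
    error = np.abs(target - canvas)       # re-analyze after every stroke
    y, x = np.unravel_index(error.argmax(), error.shape)
    canvas = paint_stroke(canvas, int(y), int(x), target[y, x])
```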

So, machines can dance, paint, draw, and jam. Then there are the dramatic arts, arguably the purest form of human expression, emotion, and psychology.

Robots have been appearing on stage alongside humans for a couple years now, but the RoboThespian, which made an appearance at the conference yesterday, is so advanced in this field it’s being used in academia to study human-robot interaction.

The life-sized humanoid actor speaks expressively and moves realistically. Its voice isn’t choppy and robotic-sounding, but smooth and animated, and its LCD eyes are designed to convey emotions that match the words it’s “speaking.”

The RoboThespian sings (not half bad), cries, and cracks jokes. It interacts with the audience: it can see 15 or so people at once, moving its gaze between them, and it recognizes gestures like waving goodbye and waves back.

The software behind this audience awareness is called SHORE. Basically, it stores a huge amount of data on human faces, which the RoboThespian can pull up instantly when it looks at someone, letting it recognize whether a person is a man or a woman, happy or sad, old or young.
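You can think of it as a classification step feeding the robot’s behavior: analyze the face, then pick a reaction. The sketch below fakes that pipeline in a few lines of Python; analyze_face is a hypothetical stand-in for a SHORE-style call, not the real engine’s API.

```python
from dataclasses import dataclass

@dataclass
class FaceReading:
    gender: str     # "man" / "woman"
    mood: str       # "happy" / "sad" / "neutral"
    age: str        # "young" / "old"

def analyze_face(image_patch):
    """Hypothetical stand-in for a SHORE-style analysis call. The real
    engine matches the face against a large trained dataset and returns
    attribute scores; here we just hard-code a plausible result."""
    return FaceReading(gender="woman", mood="happy", age="young")

def choose_reaction(reading):
    # The robot tailors its behavior to what the face analysis reports.
    if reading.mood == "happy":
        return "crack a joke"
    if reading.mood == "sad":
        return "sing something cheerful"
    return "wave hello"

print(choose_reaction(analyze_face(image_patch=None)))
```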

The machine is able to elicit such an emotional response from humans that a man once tried punching it, creator Will Jackson said in an interview with Humans Invent. “A guy took offense because he thought the robot had looked at his girlfriend’s tits, and basically tried to attack it."

The way Jackson sees it, engineers should focus on building machines with a simple but highly valuable quality: talent. “Why on earth would you make a really expensive robot to do low paid work? It doesn’t add up financially," he said. "We think it’s better to make robots to imitate very highly paid and easy to do jobs, rather than very low paid and difficult to do jobs."

Watch out, Brad Pitt and Katy Perry: they’re coming for you next.