That's not a phrase we're likely to be using soon with any regularity, even as robots continue to take jobs once filled by humans. While a robotic arm's screwdriver capabilities might be perfectly efficient, the act of grasping and manipulating an unfamiliar object at the whim of unpredictable humans remains a great challenge.
An EU project called PaCMan (Probabilistic and Compositional Representations for Object Manipulation) is trying to teach robots to grasp things they haven't grasped before, in a very literal sense. Boris the Robot recently made headlines for its great talent at stacking a dishwasher.
"The idea is that if you want in future to have robots that can manipulate objects in the regular world, they won't necessarily have models of those objects—they won't know what they are," Jeremy Wyatt, the project's coordinator based at the University of Birmingham, told me.
While humans can see something and pick it up without thinking, no matter where it is or whether they've seen it before, robots find it incredibly difficult to deal with the variability of the real world. That's not a problem on a robot-only production line, where everything is in its expected place and there aren't any humans around to mess up the order, but if we're going to work together, robots will have to adapt to humans' less structured quirks.
Boris has been taught a few different grasping techniques—like curving a hand around a cup-shaped object—and uses those as a base when it approaches unknown objects.
Wyatt explained that a computer model essentially remembers how each motorised finger segment on the robotic hand was placed in the grasp, while another model takes into account the shape of the whole hand. "The robot's able to transfer that grasping to lots of different objects," he said.
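To make that idea concrete, a grasp "memory" of the kind Wyatt describes can be sketched as a model that stores demonstrated finger-joint configurations and scores how closely a candidate grasp on a new object resembles one it has seen before. The sketch below is purely illustrative and not PaCMan's actual system; the class name, joint-angle values, and Gaussian-kernel scoring are all invented for the example.

```python
import math

class GraspModel:
    """Toy grasp model: remembers finger-joint configurations from
    demonstrated grasps and scores how well a candidate grasp on a
    new object matches what it has seen before."""

    def __init__(self):
        self.examples = []  # each example is a list of joint angles (radians)

    def learn(self, joint_angles):
        self.examples.append(list(joint_angles))

    def score(self, candidate):
        # Higher score = candidate is closer to some remembered grasp.
        # Uses a simple Gaussian kernel over joint-angle distance.
        best = 0.0
        for example in self.examples:
            dist2 = sum((a - b) ** 2 for a, b in zip(example, candidate))
            best = max(best, math.exp(-dist2))
        return best

# Teach it one "curl around a cup-shaped object" grasp, then compare
# two candidate grasps on an unfamiliar object.
cup_grasp = GraspModel()
cup_grasp.learn([0.9, 1.1, 0.8, 1.0])          # demonstrated joint angles
similar = cup_grasp.score([0.85, 1.05, 0.8, 1.0])
different = cup_grasp.score([0.1, 0.0, 0.2, 0.1])
assert similar > different                      # near-cup grasp scores higher
```

A real system would combine several such models (per-finger contacts plus whole-hand shape, as Wyatt describes) rather than a single joint-angle distance, but the transfer principle is the same: new grasps are judged by similarity to remembered ones.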
The machine learning method makes sense of the new objects by recognising the parts they're made of. For example, it could tell that a jug is different from a cup but nevertheless has a similar curvature and handle.
That means it doesn't need a model of each individual object; it uses depth sensors similar to a Kinect to get an idea of the shape and position of whatever it's reaching for.
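The part-based matching can be illustrated with a toy sketch: describe each object as a set of named parts with rough shape descriptors, then let a new object inherit grasp candidates from any known object whose parts look similar. Everything here, the (curvature, size) descriptors, the part names, and the similarity threshold, is invented for illustration; a real pipeline would derive the descriptors from depth-sensor point clouds.

```python
# Toy part-based matcher: a new object inherits grasp knowledge from
# known objects whose parts have similar shape descriptors.
# Descriptors are hypothetical (curvature, size) pairs.

KNOWN = {
    "cup": {"body": (0.8, 0.10), "handle": (0.9, 0.04)},
}

def part_similarity(a, b):
    # 1.0 = identical descriptors, 0.0 = completely dissimilar.
    return 1.0 - min(1.0, abs(a[0] - b[0]) + abs(a[1] - b[1]))

def transferable_grasps(new_parts, threshold=0.8):
    """Return known objects sharing enough similar parts with new_parts."""
    matches = []
    for name, parts in KNOWN.items():
        shared = sum(
            1 for p in new_parts.values()
            if any(part_similarity(p, q) >= threshold for q in parts.values())
        )
        if shared:
            matches.append((name, shared))
    return matches

# A jug: bigger body than a cup, but similar curvature and a similar handle.
jug = {"body": (0.75, 0.18), "handle": (0.88, 0.05)}
print(transferable_grasps(jug))  # prints [('cup', 2)]
```

In this sketch the jug matches the cup on both parts, so cup grasps become candidates for the jug, which mirrors the article's example of recognising a jug by its cup-like curvature and handle.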
And the €3.4 million ($4.4 million) project doesn't just have mundane domestic chores like robotic dishwasher-loading in its sights. Wyatt said the dishwasher example works well because it involves a wide variety of objects, though at the moment Boris isn't capable of manipulating things in its hand after picking them up. It can't turn a cup upside down to stack it properly, for instance.
As well as helping humans and robots work together on assembly lines, the grasping method could be useful in disaster efforts. Wyatt used the Deepwater Horizon oil spill and the Fukushima Daiichi nuclear disaster as examples of where a robot that could handle different objects without pre-programming could be really useful.
At Fukushima, for instance, a bot with this versatility could help turn different-shaped valves. "There were hundreds of valves of different sizes that needed to be closed in the plant and they couldn't send a person in to do it," said Wyatt.
The robot doesn't have any kind of fingertips—it can't "feel" the force it's applying—and it can only pick up a maximum weight of a few hundred grams. Wyatt said the hands, which cost "the price of a really nice BMW," are pretty fragile, and the first one they used broke its pinky. But the methods they're using are highly adaptable, so the robot can continue to use the same algorithms even when it's lost a finger.
For now, Wyatt and his team have just demonstrated that Boris can choose the type of grasp it should use on an object. They're working on getting it to pass things from one hand to the other, and expect to have it loading a dishwasher by next spring.
It's a step toward a robot that can work in human environments and with human colleagues. Though more likely than not, it will be giving the orders to you.