You probably heard the news over the weekend: Hitchbot, the hitchhiking robot, is dead. The Canadian experiment aimed at testing basic human decency, or at least willingness to play a quirky game of give-the-robot-a-ride, ended abruptly in the wee hours of Saturday morning when someone in Philadelphia beheaded and dismembered Hitchbot, and left it on the side of the road. The anonymous vandal apparently distributed a crime scene photo on social media.
The photo didn't show up in the official Hitchbot press release about the incident. "We decided not to publish it," Hitchbot creator Frauke Zeller told BuzzFeed. "It's upsetting—you can see how it has been taken apart and left in the street." She's right: Even though no media standards-and-practices rule sheet in the world would demand that journalists censor an image of plastic and rubber in a pile, it does seem like the destruction of Hitchbot was an act of cruelty.
Whether Hitchbot's family admits it or not, this outcome was what the experiment was all about. Leave a big hunk of plastic designed to ask people for a ride out by the side of the road, and you're not just asking whether people will go along with it, but whether they'll go out of their way to break it. That's because over the years, we humans have demonstrated hostility toward automatons, seemingly whenever we can find a reason, and sometimes when we can't.
But weirdly, we're starting to give a shit.
The early robot victims didn't have arms and legs.
Though the earliest example of humans killing machines that most people can think of is the Luddites, people had been smashing machines for years before that group came along. While there's no recorded account of any angry, unemployed, Bible-scribbling monks trashing a Gutenberg Press, the machines that came along after it weren't so lucky.
The Spinning Jenny, an eight-spool yarn-spinning device invented by James Hargreaves and named after his daughter, Jenny, essentially replaced seven people, and it got spinners plenty pissed. In 1768, only a year after the product launch, a gang of spinners broke into Hargreaves' house and killed all of his Jennies (except the daughter). Hargreaves left town and found a partner, and together they helped launch the Industrial Revolution.
Forty-eight years later, the Luddites started their own campaign of violence against machines (to be fair, they'd already killed about 50 humans). According to legend, Ned Ludd, their Robin Hood-style folk hero, had smashed a knitting machine in 1779, but it wasn't until 1816 that a historical record of Luddites committing machine murder shows up. On May 18 and June 18 of that year, lace makers, furious about a new knitting machine that could do precision work previously thought to be unique to humans, broke into two factories and demolished their lace-making machines. The guys who were arrested had covered their tracks by plotting out alibis in advance, and they got away with it.
In 1830, with the Industrial Revolution in full swing, tailors in Paris broke into a factory that made military uniforms and smashed 80 sewing machines.
After the invention of clockwork, people started engineering machines designed to look like humans or animals—the kind you might have seen in the movie Hugo, which can be wound up and then move around like herky-jerky zombies. Some could sit across from you and play chess, kind of. Some could digest food, kind of. They often got destroyed in fires, and would be written about like human victims. In a supposed eyewitness account of the fire that claimed the mechanical chess player, the anonymous author wrote:
We listened with painful anxiety. It might have been a sound from the crackling wood-work, or the breaking window panes, but, certain it is, that we thought we heard through the struggling flames, and above the din of outside thousands, the last words of our departed friend, the sternly whispered, oft repeated syllables, "echec! echec!!"
In 1920, the word "robot" came along in Karel Čapek's play "R.U.R.," a.k.a. "Rossum's Universal Robots." With it came the idea of a human-shaped automaton that could be used by industry as a cheap and efficient alternative to people. The play itself was also an important moment in the history of cruelty to robots, not just because it aroused our anger by featuring the first fictional robot overthrow of humanity, but also because it ended with a guy refusing to cut open a robot woman in an experiment. He decided she had feelings.
Office Space via 21st Century Fox
The concept stuck. For almost 100 years, we've been calling the machines designed to replace us "robots." And just like the American working class objected to the mythical steam-powered drill that threatened to replace folk hero John Henry, we haven't been huge fans of our coming obsolescence.
Vandalizing company equipment is a classic way for workers to express frustration. By the 1970s, auto workers in the US had a "folk history of shopfloor sabotage." When used as a bargaining strategy, sabotaging the heavily automated process of making cars could help them get what they wanted from their bosses. But they don't do much machine-smashing today. It's been replaced by subtler forms of sabotage that largely spare the equipment. Today, the killing of machines at work can involve death by a thousand cuts: Egyptian textile workers torture their work machines to death slowly, smashing the parts that fall off and then not telling anyone, until one day the thing stops working.
But usually these days, when we take our raw anger out on machines, it's less an act of revenge against our employers than a hot-blooded, random act of violence. In 2013, an Amazon employee trashed 15 or 20 hand-held scanners, citing "frustration with the job and other associates." Outside of work, soda machines malfunction, and sometimes we go medieval on them. Same goes for stamp machines and parking machines.
But at the same time, we're starting to like our robots. We own them, and give them names, and we're starting to get genuinely queasy about our own violence toward them. This can be harmless, as in a 2001 experiment in which children had a hard time holding a Furby upside down, because the robotic, talking toy would freak out and say "me scared." But this unlikely friendship can also interfere with how we use our machines.
GIF via Boston Dynamics
According to a 2013 study at the University of Washington, we're really starting to view robots the way fans viewed Hitchbot—as our pets, or even our friends. After interviews with 23 bomb disposal technicians who used robots in their work, author Julie Carpenter described a complicated relationship. Technicians "would say they were angry when a robot became disabled because it is an important tool," which is fine, but, she added, "then they would add 'poor little guy,' or they'd say they had a funeral for it." Carpenter worried that this might make workers hesitate to send a robot on a suicide mission.
In 2006, an army colonel working at a test range in Arizona had to supervise a robot designed to step on mines. With each mine the robot stepped on, a limb would be savagely blown off. The roboticist observing the test felt that it was going well, but according to The Washington Post, the colonel who supervised the operation called it off because he couldn't stand watching the increasingly disabled robot slowly drag itself to the next mine.
The experiment was, the colonel decided, inhumane.
Philosophers like Peter Singer and Agata Sagan are increasingly willing to entertain the idea that robots might someday be able to feel emotions. And last week, thousands of high-profile scientists and technologists signed an open letter calling for a ban on killer robots controlled by artificial intelligence—essentially, drones that can decide who they want to kill.
It might be good that we're starting to feel empathy for robots when we torture them. After all, when Skynet goes online, it's going to be bad enough. The last thing we need is for the robots' first emotion to be a thirst for vengeance.
Follow Mike Pearl on Twitter.