Tech by VICE

A '60s Japanese Cartoon Predicted Our Autonomous Car Dilemma

Automating cars means automating death.

by Jordan Pearson
Sep 30 2016, 1:00pm


On New Year's Day in 1963, a Japanese cartoon's debut episode illustrated a technological problem that we're only beginning to deal with now in 2016: what happens when a self-driving car kills somebody?

That cartoon was Astro Boy, whose eponymous protagonist's spiked black hair and friendly demeanor you may recognize as a cornerstone of Japanese and world pop culture. And it has a lot to teach us about the real costs of automated roadways.

Astro Boy's origin story begins and ends in a self-driving car accident. Astor, the son of the brilliant head of the Ministry of Science, is driving down the "world's safest road, Electronic Highway 66" in the year 2000. There was "no need for Astor to steer, speed up, or slow down—the highway controlled his car automatically." All the driver has to do is press a button, the show's narrator says, and because of this the highway is "as safe as man could make it."

But, suddenly, Astor's self-driving car crashes head-on into another vehicle, and he dies. The crowd gathered around the wreckage is visibly in shock, and his father is inconsolable (he later builds the robot Astro Boy in Astor's likeness). Their shock might arise from the fact that, in a world of autonomous vehicles, the activity of driving is supposed to be safe. People aren't supposed to die. But they still do.

Read More: When AI Goes Wrong, We Won't Be Able to Ask It Why

Who or what is to blame for his death? The car's CPU? Its GPS system or its cameras? The manufacturer? Perhaps Astor himself, since we see that the car is going quite fast and making sharp turns?

The show never gives us an answer, and here's why: in Astro Boy, just like in real life, the self-driving car constitutes a social contract. That contract says death by machine is the unavoidable and ultimately acceptable cost of a world that is statistically safer, a frequently touted promise of self-driving cars. Even as advanced technology makes violence itself rare, we remain shocked by any eruption of it. In a way, we're all like the bystanders gawking at Astor's corpse.

To be sure, somebody will eventually take the blame and assume the responsibility of paying damages to distraught families. But this is only a byproduct of our current financial and legal system. If Uber, or Tesla, or your insurance company is making money on your being shuttled around by robots, then they have to be ready to give some of it back when something goes wrong.

But these entities are newcomers to the social enterprise of developing a technology that will hopefully reduce human suffering overall. All of society has collaborated on their introduction. Self-driving cars, just like the internet, have been developed over the course of many decades in publicly funded universities and research institutions. And just like the internet, self-driving cars will require massive infrastructure investment on the part of governments: for example, to design highways made to guide autonomous cars, just like the one that Astor ended up splattered all over.

We've collectively agreed to fund self-driving cars by paying taxes, and we've agreed that they will make things safer. But there is a flipside.

By collaborating to put these things on the road in the hopes of making a statistical dent in the number of road deaths every year, we've also agreed that the people who do die by the grille, the tire, or the windshield are acceptable casualties on the way to progress.

This is true for any technology, and the added factor of death by autonomous robots, which makes deaths the result of computations gone bad rather than human error, only makes the morbid social contract embedded in all inventions explicit.

Take, for example, regular ol' human-piloted vehicles. Technology theorist Paul Virilio wrote that "to invent the family automobile is to produce the pile-up on the highway" to illustrate the idea that although technological disasters, from Chernobyl to a plane crash, seem to erupt out of the blue, we have really already agreed to them implicitly.

And here is what self-driving cars will give us: automated death with no explanation and nothing to blame except inanimate machinery.

We will mourn.

This is progress.

