A mock "killer robot" is pictured in central London on April 23, 2013, during the launch of the Campaign to Stop Killer Robots. Photo courtesy of CARL COURT / AFP
When we think of “killer robots”, we might be tempted to picture humanoids from sci-fi hits: the Decepticons from Transformers, the T-800 Terminator, or the Cylons from Battlestar Galactica. But chances are, the autonomous weapons of the near future, the so-called Lethal Autonomous Weapons Systems (LAWS) endowed with artificial intelligence and a complete ability to kill, might not look like anything we’re imagining. And if humans do not have explicit control over them, they might do things far worse than we can imagine, too.
At least that’s the idea behind the Campaign to Stop Killer Robots, run by the New York-based organisation Human Rights Watch (HRW) since 2012. Now, eight years into its push for new international legislation on LAWS, HRW has announced in a new report that 30 countries (including Argentina, Iraq, Pakistan, Palestine, Zimbabwe, and Colombia) are finally explicitly seeking a ban on the use of these weapons. The report, a compilation of 97 countries’ positions on fully autonomous weapons, says most of them want to “retain human control over the use of force". A growing number of policymakers, artificial intelligence experts, private companies, international and domestic organisations, and ordinary individuals have also endorsed the call for a ban. The authors explain that autonomous weapons “would decide who lives and dies, without … inherently human characteristics such as compassion that are necessary to make complex ethical choices.”

The good news is that killer robots of this form most likely do not exist yet. The bad news is that precursor weapons systems, such as semi-automated drones, have already been developed and deployed by several nations, clearly showing a trend towards increasing autonomy. Unsurprisingly, a small number of military powers, most notably Russia and the United States, have blocked progress towards regulation while investing heavily in military applications of artificial intelligence. But Steve Goose, director of HRW's arms division, told reporters at the United Nations in Geneva that it was not a question of whether there would be regulation, but when, and how comprehensive it would be. Mary Wareham, coordinator of the Campaign to Stop Killer Robots, called the removal of human control from weaponry a grave threat to humanity, comparing it to climate change and saying it deserves urgent multilateral action.
"While the pandemic has delayed diplomacy, it shows the importance of being prepared and responding with urgency to existential threats to humanity, such as killer robots," she said.Two years ago, Wareham had told VICE "To avoid a future where killer robots, not humans, call the shots, governments need to act now.” Today, this only gets more urgent.Follow Satviki on Instagram.