
A South Korean university is building killer robots — and AI experts are not happy

"It is regrettable that a prestigious institution like KAIST looks to accelerate the arms race to develop such weapons.”

Some 50 artificial intelligence experts signed a letter Wednesday criticizing a South Korean university for collaborating with a weapons manufacturer to build killer robots.

Defense company Hanwha Systems, which already deploys autonomous weapons on the North Korean border, recently teamed up with the state-run Korea Advanced Institute of Science and Technology to investigate further deployment of AI technology on the battlefield.


Written by Professor Toby Walsh of the University of New South Wales and signed by robotics and AI experts from around the globe, the letter points out the dangers of developing weapons that can act without human intervention.

“At a time when the United Nations is discussing how to contain the threat posed to international security by autonomous weapons, it is regrettable that a prestigious institution like KAIST looks to accelerate the arms race to develop such weapons,” Walsh wrote.

The letter states that the signatories will “boycott all collaborations with any part of KAIST” until the university provides assurances that it will not develop autonomous weapons lacking meaningful human control.

KAIST’s President Sung-Chul Shin said in a later statement: “I would like to reaffirm that KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots.”

This, however, contradicted the university’s stance in a now-deleted announcement describing the program’s focus: “AI-based command and decision systems, composite navigation algorithms for mega-scale unmanned undersea vehicles, AI-based smart aircraft training systems, and AI-based smart object tracking and recognition technology.”

Hanwha Systems has previously worked with Korea University to develop the autonomous SGR-A1 sentry gun, which has been in operation on the 38th Parallel dividing the Korean Peninsula.


The application of artificial intelligence to weapons has long been a concern for academics. Last August, more than 100 experts — including Tesla founder Elon Musk and physicist Stephen Hawking — signed a letter calling on the U.N. to ban the development and use of artificially intelligent weaponry.

And the opposition has not been restricted to academia.

It was revealed Wednesday that more than 3,000 Google employees had signed a petition asking the company to stop supplying the Pentagon with AI technology that interprets video feeds and could also be used to improve drone targeting systems.

“Amid growing fears of biased and weaponized AI, Google is already struggling to keep the public's trust," the petition read, according to the New York Times. "The argument that other firms, like Microsoft and Amazon, are also participating doesn't make this any less risky for Google. Google's unique history, its motto Don't Be Evil, and its direct reach into the lives of billions of users set it apart."

Next week the U.N. will host a meeting on autonomous weapons, with more than 20 countries already calling for a total ban. However, with China, Russia, the U.K. and the U.S. all actively developing autonomous weapons, it is unlikely such a ban will come into force any time soon.

Cover Image: The Terminator robot is seen in the paddock following qualifying for the Spanish Formula One Grand Prix at the Circuit de Catalunya on May 9, 2009 in Barcelona, Spain. (Paul Gilham/Getty Images)