
Meet the Man Building Autonomous Kamikaze Swarm Drones for the US Military

The LOCUST drones are launched out of a cannon, come together autonomously in a "swarm," and detonate on impact.

A big explosion, which is what it might look like if weaponized UAVs from the LOCUST project came to fruition and crashed into a target. Photo by Wikipedia user Staff Sgt. Eric Harris

A few days ago the US Navy released a video unveiling a new weapons project called LOCUST. In the space of a minute the system fires up to 30 drones out of a cannon. Once they've left the launching tube they come together autonomously in a "swarm" designed to attack and overwhelm their target through sheer numbers.

The video shows the drones flying in formation before cutting to a CGI animation of them destroying a small settlement that looks like it could be somewhere in the Middle East or Central Asia. From the ground you might expect an attack to look a bit like a colony of seagulls descending on your bag of chips, if all the birds were packed with explosives and detonated on impact.


The LOCUST project, short for Low-Cost UAV (unmanned aerial vehicle) Swarming Technology, is still in development, but the idea of armed drones with a mind of their own thundering directly into someone's home sounds pretty terrifying. Last week the UN held a conference in Geneva on how to regulate "killer robots" (autonomous weapons), but it has yet to set up any sort of legal framework to control them.

The promo video for LOCUST.

I wanted to find out whether we need to be worried about projects like LOCUST that seem to allow armed robots to operate with less and less human input. Is it a dangerous escalation of Obama's extremely controversial and allegedly illegal drone war, or a PR stunt designed to intimidate US adversaries and help justify huge defense budgets?

I got in touch with Lee Mastroianni at the Office of Naval Research in Virginia, who acts as Technical Manager on the LOCUST project.

VICE: What are the program objectives?
Lee Mastroianni: A lot of people talk about swarming UAVs—we wanted to show we are actually doing this. We are able to create the swarm, which involves the rapid launch of large numbers of UAVs, and then control the swarm in a way that is useful for military operations. You can spit 'em out very, very rapidly.

Are the drones designed to carry weapons or for reconnaissance?
They could be for reconnaissance; they could be weaponized. If you look at the LOCUST video we put online, I used a sample mission… you have a simultaneous strike where weaponized UAVs take [all their targets] out at the same time.


And do they operate in a kamikaze sort of way? They fly into the target and explode?
The UAVs would be the weapon as opposed to a Predator [UAV], which launches other weapons. These are one-way missions.

Once they're in the air, how are they controlled?
That's the second big piece of the demonstration: autonomous control. Once launched, I don't need to talk to the UAVs. They understand what the mission is. They're talking to one another. You want to know what it's up to. You want to control it. You need to. But it isn't a UAV pilot flying it like a remote-control aircraft.

A CGI image of LOCUST drones attacking a settlement.

I spoke to an individual from a military think tank after watching your video. She was skeptical that LOCUST would make it past the necessary trials to be used in the field, especially because of its autonomy.
I'm very confident that I'm addressing the risk adequately so that it will have success. In terms of the next steps to field such a capability there's quite a bit more work that needs to be done.

When I read the press release, the word that really jumped out was autonomous. The idea of autonomous, weaponized UAVs does seem like a departure from the UAVs we see at the moment.
I'm not sure I see as broad a distinction as you do. Safety, in terms of our sailors, our Marines, [is] paramount. This is a major first demonstration in this regard to take it from the idea of just cartoon sketches and people talking about it to, hey, there's a reality associated with this. If it drives the discussion on exactly the kind of things you're bringing up [around autonomy], then that's a good thing.


Is the kind of research going on at the ONR at the moment moving in the direction of more autonomy for UAVs?
Well, the Office of Naval Research is a leader in autonomy science and technology development. The future is manned platforms working with unmanned platforms on the battlefield.


As important as Lee's insight into the project was, I figured it would be worth redressing the balance a little by speaking to someone whose job isn't to design and promote the exact technology we were talking about.

Stephan Sonnenberg is a Clinical Supervising Attorney and Lecturer in Law at Stanford University. He helped coordinate, and co-authored, a report on the effects of the drone war on civilians, which can be found on the anti-drone-warfare website livingunderdrones.org.


VICE: Have you seen the LOCUST promotional video and, if so, what sort of angle are you coming at it from?
Stephan Sonnenberg: I'm concerned about how all this is going to be impacting civilians. You're expanding the capability, the range, of very lethal weapons systems into situations [where] you wouldn't currently use that kind of lethal force. It's amazing for a promotional video that the target for this is indiscriminate shelling of a village.

Yeah, putting a Middle Eastern–looking settlement in the video struck me as odd, from a PR point of view. Legally, is this idea of autonomy more cause for concern than the drone technology we see at the moment?
Human Rights Watch have taken the position, like many others, that the line should be drawn at autonomous weapons. You're abdicating ethical responsibility to some kind of a programmer to write code that's going to be consistent with humanitarian norms. I think there's a lot to be worried about.

Is there any sort of legal framework in place to differentiate between manned and unmanned flights?
The US will put forward its own justifications, many of which are classified, but if you really look at it, it's very scary. For example, kids that are 12 years old, or whatever, are going to be assumed to be targets unless posthumously proven otherwise, which is obviously outrageous.

Is there any legal framework in place to stop the US developing a fully autonomous drone?
No, I don't think there is. If I were having to argue that there was, I would come up short.

Follow Henry on Twitter.