Security camera footage of three people holding guns on a sidewalk outside a building
Omnilert

Gun Detection AI is Being Trained With Homemade ‘Active Shooter’ Videos

Companies are using bizarre methods to create algorithms that automatically detect weapons. AI ethicists worry they will lead to more police violence.

In Huntsville, Alabama, there is a room with green walls and a green ceiling. Dangling down the center is a fishing line attached to a motor mounted to the ceiling, which moves a procession of guns tied to the translucent line.

The staff at Arcarithm bought each of the 10 best-selling firearm models in the U.S.: Rugers, Glocks, Sig Sauers. Pistols and long guns are dangled from the line. The motor rotates them around the room, helping a camera mounted to a mobile platform photograph them from multiple angles. “It’s just like a movie set,” said Arcarithm president and CEO Randy E. Riley.

This process creates about 5,000 images of each gun floating ethereally. Arcarithm’s computer programmers then replace the green backdrop with different environments, like fields, forests, and city streets. They add rain or snow or fog or sun. A program then randomly distorts the images. The result is 30,000 to 50,000 images of the same gun, from multiple angles, in different synthetic settings and of varying degrees of visibility.
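As a rough sketch of the kind of compositing and distortion pipeline Riley describes (the file paths, chroma-key thresholds, and distortion ranges below are illustrative assumptions, not Arcarithm’s actual parameters), the green-screen replacement step can be expressed in a few lines of Python:

```python
# Illustrative green-screen compositing and random distortion, assuming
# simple chroma keying; not Arcarithm's actual code or parameters.
import random
import numpy as np
from PIL import Image, ImageEnhance, ImageFilter

def composite_on_background(gun_path: str, background_path: str) -> Image.Image:
    """Replace a roughly green backdrop with an arbitrary scene."""
    gun = np.asarray(Image.open(gun_path).convert("RGB")).astype(np.float32)
    bg = Image.open(background_path).convert("RGB").resize((gun.shape[1], gun.shape[0]))
    bg = np.asarray(bg).astype(np.float32)

    r, g, b = gun[..., 0], gun[..., 1], gun[..., 2]
    # Crude chroma key: pixels where green clearly dominates count as backdrop.
    is_green = (g > 90) & (g > 1.3 * r) & (g > 1.3 * b)
    mask = (~is_green)[..., None].astype(np.float32)

    out = gun * mask + bg * (1.0 - mask)
    return Image.fromarray(out.astype(np.uint8))

def random_distort(img: Image.Image) -> Image.Image:
    """Apply a few of the random distortions the article describes:
    rotation, brightness and contrast shifts, and blur standing in for fog."""
    img = img.rotate(random.uniform(-25, 25))
    img = ImageEnhance.Brightness(img).enhance(random.uniform(0.5, 1.4))
    img = ImageEnhance.Contrast(img).enhance(random.uniform(0.7, 1.3))
    if random.random() < 0.3:
        img = img.filter(ImageFilter.GaussianBlur(radius=random.uniform(1, 3)))
    return img

# Generating 30,000 to 50,000 variants per gun is then just a loop over
# background scenes and repeated calls to random_distort on each composite.
```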

The point of creating this vast portfolio of digital gun art is to synthetically generate tens of thousands of ways each gun might appear and feed them to an algorithm built to detect a firearm the moment a security camera catches it being drawn. Arcarithm is one of several companies developing automated active shooter detection technology in the hopes of selling it to schools, hotels, entertainment venues, and the owners of any location that could be the site of one of America’s 15,000 annual gun murders and 29,000 gun injuries.

Among the other sellers are Omnilert, a longtime vendor of safety notification software, and newcomers ZeroEyes, Defendry, and Athena Securities. Some cities employ a surveillance system of acoustic sensors to instantly detect gunshots. These companies promise to do one better and save precious minutes by alerting police or security personnel before the first shot is fired.

They are all maneuvering around a problem: at their most basic level, these algorithms learn from data that has already been categorized, so that they can independently determine whether something new belongs to that category. In the tech industry, it’s generally believed that more data means a sharper algorithm. For companies that want to detect gunmen, therein lies one dilemma.
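Stripped to its simplest form, that categorization step is supervised learning: a model is fit on labeled examples and then asked to judge something it has never seen. The two-number feature vectors below are invented purely to illustrate the idea; a real detector learns from image pixels, not hand-picked measurements.

```python
# Toy illustration of learning from categorized data (not any vendor's system).
from sklearn.linear_model import LogisticRegression

# Each row is a made-up feature vector [length in cm, metallic fraction];
# each label is 1 for "gun" and 0 for "not a gun" (phone, hairbrush, ...).
X_train = [[18.0, 0.9], [60.0, 0.8], [15.0, 0.1], [20.0, 0.2], [75.0, 0.7]]
y_train = [1, 1, 0, 0, 1]

clf = LogisticRegression().fit(X_train, y_train)

# "Something new": the model independently assigns it to a category.
print(clf.predict([[17.0, 0.85]]))        # likely [1], i.e. flagged as a gun
print(clf.predict_proba([[17.0, 0.85]]))  # the confidence behind that call
```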

Screenshot from a promotional video for Omnilert's Gun Detect software

Visual detection machine learning has been developed for a wide range of uses, including diagnosing medical conditions and identifying pedestrians in a roadway. Researchers behind those efforts have access to nearly limitless pictures of tumors and inflammation and videos of joggers or dog walkers.

However, because of the sensitivity of the material, little footage from the start of shootings is readily available, and certainly not enough to train a system that is supposed to reliably differentiate a gun from a cell phone or a hairbrush hundreds or thousands of times a day. Such footage is scrubbed from all but the darkest corners of the internet. There’s no inventory of it on Roboflow, Amazon Mechanical Turk, or other sources of training images for machine learning (though Roboflow does have a supply of still photos of guns).

The reliability of gun detection systems is of serious consequence to the people they monitor. This year, the Lockport City School District, in Upstate New York, implemented an algorithmic system to recognize faces and detect weapons. The technology misidentified Black children at a higher rate, and emails between employees of its creator, SN Technologies, show the Canadian company was struggling to stop the system from mistaking broom handles for guns after it was implemented.

“I have concerns about the reliability of the object detection system and that system misidentifying a student holding a baseball bat and [police] will go and harass that student with a baseball bat,” said Daniel Lawrence, a researcher at the Urban Institute’s Justice Policy Center, who has studied technology in crime detection.

Alternatively, Lawrence said, police tend to take these alerts less seriously if the systems are constantly flagging low-priority activity or generating false positives. “Everything depends on accuracy,” he said.

To train a computer program to recognize a gun as soon as it’s drawn—and then to test that program—companies have to get creative. And a bit weird.

Arcarithm, founded as a military and security contractor by three former Lockheed Martin employees, started by programming cameras to detect drones overhead. A client challenged them to come up with a system to detect guns. “If we can do drones, we can do anything,” said Riley, “so we spent the next ten years trying to tell if a guy has a gun or a broom and it turns out we can.”

Theoretically, the vast array of distortions and alterations in the images feeding Arcarithm’s algorithm would account for the ways a gun is obscured in real footage: by hands, by weather, or by distance. Through seeing so many common guns in so many ways, the algorithm would supposedly become so familiar with guns that it could spot one instantly.
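One generic way such pipelines simulate partial occlusion during training is “random erasing,” which blanks out a random rectangle of each image so the model learns to recognize a gun it can only partly see. The sketch below uses torchvision’s built-in transform and is illustrative only; it is not a description of Arcarithm’s code.

```python
# Generic occlusion augmentation via random erasing; parameters are arbitrary.
from PIL import Image
from torchvision import transforms

occlusion_aug = transforms.Compose([
    transforms.ToTensor(),                       # PIL image -> CxHxW tensor
    transforms.RandomErasing(p=0.5,              # occlude half the samples
                             scale=(0.02, 0.2),  # cover 2-20% of the image
                             ratio=(0.3, 3.3),   # allow tall or wide patches
                             value="random"),    # fill with random noise
])

# Hypothetical usage on one synthetic training frame.
occluded_tensor = occlusion_aug(Image.open("synthetic_gun_frame.jpg"))
```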

To test whether their algorithm responds to the intended stimuli, Arcarithm staffers have staged armed invasions of their own headquarters using airsoft guns, which use compressed gas to shoot tiny, non-lethal plastic pellets. They’ve also taken to a nearby field to record themselves: programmers and desk employees cosplaying as criminals or militiamen. “All the guys are doing it,” said Riley. “They usually work on the development end.” He added that they warn the sheriff’s department, which usually sends an observer.

Arcarithm has not found any buyers outside the U.S. military, which seeks an alert system for armed people coming towards a base. Riley said he has approached the operator of a theme park and a school system near Huntsville.

Of the other U.S. companies selling gun detection technology, Athena did not respond to an interview request from Motherboard, and seems to have pivoted to making a dubiously marketed technology that monitors people’s temperature amidst the COVID-19 pandemic. A spokesperson for ZeroEyes said its technology is proprietary so they would not discuss it. A representative for Defendry said the company declined because it did not want its name in an article published by VICE Media.

Omnilert has supplied notification systems, mainly to colleges and universities, since 2003, and unveiled its Gun Detect software in October.

CEO Dave Fraser describes a kitchen sink approach to the data-to-feed-the-algorithm problem. His company has used technology made to produce video games to create CGI simulations of the first moments of hold-ups and shooting sprees. They’ve trained the algorithm on Hollywood movies (he named John Wick). And there is also what Fraser dubbed “pajama videos,” homemade clips of employees walking around with guns (real and toy) recorded in their homes during the COVID-19 remote work months. He’s also outsourced the task to a few video content creators.

“We’ve built up an internet database of ourselves and our contractors brandishing guns,” said Fraser. “We have thousands of hours of data we created and we own.” Homemade videos are used to both feed and test the algorithm.

The videos fill up the company Slack channel, he said. And programmers and other desk employees are tasked with creating them.

Even the company’s director of marketing, Elizabeth Venafro, has contributed self-filmed clips of herself marching through her home toting a toy rifle, which “felt very weird, as a non-gun-owner,” she said.

A graphic demonstrating Arcarithm's Exigent-GR gun detection system

Experts in academia say that machine learning can now identify objects, even from a distance, but the process hinges on sufficient data.

“Today, we are much better than we were five years ago,” Ali Farhadi, an associate professor at the University of Washington working in computer vision and machine learning, told Motherboard. “We can detect objects fairly reliably.” Each year, smaller and more specialized objects are detectable and computer scientists can program algorithms to identify the body motions and context around them. “Not only can we see scissors but we know how people act when they are cutting things,” he said.
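As a hedged sketch of what that off-the-shelf capability looks like, the snippet below runs torchvision’s pretrained Faster R-CNN over a single frame. Its COCO categories include no “gun” class, so this illustrates the general detection pipeline researchers like Farhadi describe, not any vendor’s product; the image path is hypothetical.

```python
# Generic pretrained object detection with torchvision (>= 0.13 for "weights").
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")  # pretrained on COCO
model.eval()

frame = Image.open("camera_frame.jpg").convert("RGB")  # hypothetical frame
tensor = transforms.ToTensor()(frame)

with torch.no_grad():
    detections = model([tensor])[0]  # dict with boxes, labels, scores

for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score > 0.8:                  # arbitrary confidence threshold
        print(label.item(), round(score.item(), 2), box.tolist())
```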

Visual identification requires a vast amount of varied data. Even differences in the sun’s path between the northern and southern hemispheres, or subtle differences in background scenery, can make the program less effective, he said. “You want something that works as well in American cities as Indian cities,” said Farhadi. Ideally, he added, training footage should come from the same types of cameras that will be capturing the feed in the field.

Karthik Ramani, a professor of mechanical engineering at Purdue University, completed a project that used machine learning to identify mechanical objects, so as to help engineers find exact matches and replacements. Machine learning is capable of identifying detailed objects, he said, but synthetic data is no replacement for the real thing.

In CGI-created images, “I was seeing a loss of energy,” said Ramani. “You don’t get the real-world noise and reflections and metals are shiny and things can get confused. As humans, we see this and we get used to it. The machine doesn’t know these things yet.”

Some false positives are inevitable, Fraser and Riley both conceded. But both claim the technology can give first responders a few precious minutes, or seconds, to save lives.

Lawrence, of the Urban Institute, said once any surveillance or analytic technology comes into the hands of police departments, it's inevitably used to target poor, minority areas. “It is over-applied in communities with persons of color,” he said. Such neighborhoods are disproportionately policed, and the use of technology like predictive policing is a major driver of those statistics, creating a feedback loop.

“This technology is very expensive and it makes no sense to have it applied to the entire city,” he said.

However, Lawrence does not think cities will buy gun detection software in the near future. The summer racial justice protests and the “defund the police” movement have caused cities to shrink from buying expensive, futuristic equipment for police purposes. “I think as a society, we are redefining what policing is and how much money should be allotted to what and how much money should go to the police,” he said. “I think we are on the precipice of using money to combat crime and the causes of crime in a different way.”

He thinks the buyers of the next generation of gun detection software will be private companies, but once a gun is thought to be detected, “the call will go to the police.”

It is widely acknowledged that the ubiquity of guns in the United States is one reason the number of police killings in the U.S. dwarfs those of other countries. Police shootings of Black people sometimes begin with the excuse that the officer thought the person had a gun, including the deaths of Casey Goodson, Stephon Clark, Tamir Rice, and Amadou Diallo. During a traffic stop, Philando Castile informed an officer that he possessed a legal gun and the cop immediately opened fire.

Like many companies that make automated systems, both vendors defend their gun-detection technology by noting that the final decision is made by a human being. “It could automatically lock the door on a suspect,” said Riley. “Now it’s up to the police to show up and see what this person does.”

As for a police overreaction, Fraser said, “It’s a possibility. We tend to look at this as ‘no technology is perfect.’ We tend to think it’s a positive to put this technology in our customer’s hands rather than have them rely on hearsay or gunshots when it’s too late.”

The possibility is enough for Meredith Whittaker, faculty director of the AI Now Institute at New York University, to reject the use of the technology outright. Whittaker and other AI ethicists and scholars have noted that all algorithmic systems contain bias, and this fundamental flaw can't simply be fixed with more data or a software update.

“They shouldn’t purchase anything like this,” she said of those who would buy gun-detection technology. “There is no dataset that would make this work. They are flawed, they are racist and they are being put into schools.”