We Asked 43 Facial Recognition Companies if They'll Refuse to Work With Cops

After IBM and Amazon pulled access to their facial recognition software from law enforcement, we asked other companies that advertise the technology whether they'll follow suit.
June 11, 2020, 3:43pm
Image: Francesco Ungaro/Pexels

In a letter to Congress Monday, IBM CEO Arvind Krishna said the company will no longer offer general purpose facial recognition technology, and that the company would oppose its use—or the use of any technology—for “mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values.”

In the letter, spurred by the Black Lives Matter protests and the police killing of George Floyd, Krishna said now is the time to begin a national dialogue on whether and in what capacity domestic law enforcement agencies should use facial recognition technology.

And Wednesday evening, Amazon announced a one-year moratorium on police use of its Rekognition face recognition technology.

Bias in artificial intelligence is well documented; predictive policing algorithms disproportionately target majority Black neighborhoods, facial recognition systems often can’t recognize Black people, and the Black community is surveilled at a disproportionate rate, as well.

“Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe,” Krishna said in the letter. “But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.”

To see if any other facial recognition companies agreed with IBM’s suggestion that there’s a “shared responsibility” to use AI responsibly, Motherboard reached out to 43 companies developing facial recognition technology, as compiled by Medium’s OneZero and supplemented with a few additional companies and federal agencies that have a focus on surveillance.

We asked whether their facial recognition technology was used by the police, whether they supported the Black Lives Matter protests, and whether they would commit to stopping development of facial recognition technology and to not working with the police.

Here are their answers, listed alphabetically:

3DiVi

Did not respond.

Adera Global PTE Ltd.

Did not respond.

AiUnion

Did not respond.

Alchera

Did not respond.

AllGoVision

Did not respond.

Amazon

Did not respond, but issued a statement about Rekognition Wednesday.

AnyVision

Adam Devine, chief marketing officer for AnyVision, offered the following response:

1. Is our facial recognition technology used by the police?

AnyVision's facial recognition technology is not being used by the police, but we believe that it can and should be used by law enforcement to eliminate the bias that is clearly inherent in many organizations.

2. Do we support Black Lives Matter protests?

It would be easy to respond to this question with a careful, generic, PR-approved statement void of any real opinion or meaning. F#ck that. We're a startup with roots in Israel, a country that had to fight its way into existence, faces constant threat, and is populated by a people who have been persecuted for thousands of years. We support every effort—large and small, quiet and loud, peaceful and violent—that demands and earns equality, respect and safety for every race, sexual orientation, gender and religion.

We support the mission of BLM, and we applaud the protests and the significant achievements that have already come from the time, risk and perseverance of every individual who has taken to the streets of cities around the world.

3. Have we seen IBM's decision and will we stop developing facial recognition technology?

We have seen IBM's decision, and we disagree with it.

We believe IBM would have made a bigger, more powerful and frankly more effective statement if it had ceased doing business with law enforcement organizations and dedicated time and budget to legislative efforts to establish accuracy criteria for facial recognition algorithms and set forth guidelines for its ethical use rather than ceasing whatever development it was pursuing.

Their decision fails to recognize a critical point: just as bad people within bad organizations use force unjustly, bad people use technology unjustly. [...]

We stand by our technology and the good that can come of its ethical use. We believe that facial recognition with sufficient training is in fact a safer and more effective tool than human perception, because it's easier to train the bias out of a line of code than it is a line of people.

Kevin, it would be easy to say that all facial recognition is bad because inferior versions of the technology are used in bad ways by bad people. But that's a short-sighted, knee-jerk reaction.

Nuclear power can illuminate a city or flatten it. Facial recognition can be biased or it can be used to protect people from threats.

Awidit Systems

Did not respond.

ClearLink

Did not respond.

Cyberextruder

Did not respond.

CyberLink Corp.

As a matter of policy, CyberLink generally does not comment on client relationships, and as a Taiwanese technology company, we also generally refrain from commenting on the local politics of other countries. Facial recognition is capable of providing solutions that go far beyond surveillance purposes, such as highly secure authentication for contactless payments, access control, personnel authentication, and device or account login. FaceMe was designed to deliver solutions such as these, and with privacy protection at the forefront. We feel this discussion is an important exemplification of why regulation is deeply needed around the use of facial recognition technology, specifically when it comes to fulfilling a public safety role, to ensure this technology is employed ethically, is free of algorithmic bias and does not violate any individual’s privacy or personal freedom.

Dahua Technology

Did not respond.

Dynamic Imaging Systems

Did not respond.

FaceFirst

Did not respond.

Federal Bureau of Investigation

Did not respond.

Idemia

Did not respond.

Imagus Technology

Did not respond.

Innovatrics

1. We don't have any projects with any of the US police departments and neither are we planning to.

2. Many of our employees have joined Black Lives Matter support demonstrations here in Bratislava, Slovakia, and we fully endorse that.

3. It's difficult to comment on the decision of IBM, since we don't know anything about the quality of their algorithms. Unlike most face recognition developers, IBM has not been participating in the independent FRVT (Face Recognition Vendor Tests) done by the US NIST, which are considered industry standards.

The reason is that we provide face recognition services in many countries in Africa, Asia and South America (most recently for electoral register in Guinea or provision of consumer loans in the Philippines and Vietnam). We need those algorithms to be accurate and reliable and therefore we use a very comprehensive training dataset that doesn't prefer one region or skin complexion over others.

We don't plan to stop developing face recognition software, since we see and endorse its use for empowering people and making their life more comfortable. There are many positive ways how to use it while upholding basic human rights. In many African countries, for example, biometrics and face recognition enabled many people to actually have a vote in elections.

Intel

Did not respond.

Intellivision

Did not respond.

KanKan Ai

Did not respond.

Luxand Inc.

Did not respond.

MicroFocus

Did not respond.

NEC Global

Did not respond.

Neurotechnology

Did not respond.

Nodeflux

Did not respond.

Palantir

We do not build facial recognition algorithms nor do we work with US law enforcement agencies on facial recognition applications.

Realnetworks Inc.

Did not respond.

Rokid Corporation Ltd.

Did not respond.

Smilart

Did not respond.

Tech5 SA

Did not respond.

Tevian

Did not respond.

Toshiba

Did not respond.

Trueface

1. Is your facial recognition technology used by the police? No, it is not.

2. Do you support Black Lives Matter protests? Yes, we support the First Amendment and the right for citizens to express views through protest.

3. We do not work with any police forces but it is critically important that technology like face recognition works across genders and ethnicities and that companies like Trueface strive to continue to raise the bar. We have made our gender and ethnicity information publicly available and we continue to drive accountability in the space by encouraging the industry to do the same.

U.S. Customs and Border Protection

Did not respond.

U.S. Department of Homeland Security (DHS)

Did not respond.

U.S. Immigration and Customs Enforcement (ICE)

U.S. Immigration and Customs Enforcement’s (ICE) use of facial recognition technology is primarily by Homeland Security Investigations (HSI) special agents investigating child exploitation, human trafficking and other types of criminal investigations. HSI’s work to combat online child sexual exploitation and human trafficking has been widely recognized by law enforcement agencies around the world, and facial recognition technology is critical to identifying the perpetrators of these crimes. ICE does not routinely use facial recognition technology for civil immigration enforcement.

Via Technologies Inc.

Did not respond.

Videonetics Technology

Did not respond.

Vigilant Solutions

Did not respond.

Vision-Box

Did not respond.

VisionLabs

Did not respond.

Yitu Technology

Did not respond.

This article originally appeared on VICE US.