
Amnesty Uses Warped, AI-Generated Images to Portray Police Brutality in Colombia

The images got Colombia's flag wrong and invented fake humans in place of real protesters, adding a level of abstraction to real-world horrors.

Last weekend, Amnesty Norway—an arm of human rights NGO Amnesty International—tweeted a series of AI-generated images to draw attention to the two-year anniversary of Colombia’s National Strike.

“Police officials raped and insulted women and LGBTI people taking part in protests,” reads text superimposed over an AI-generated woman standing in front of AI-generated flames. Another image shows AI-generated cops pulling an AI-generated woman away from a protest. The woman is draped in a flag that shows Colombia’s red, gold, and blue stripes out of order.


The events described by Amnesty did happen. On April 28, 2021, tens of thousands of Colombians protested a proposed tax increase, corruption, and police brutality in the streets of Bogota, Medellin, Cali, and other Colombian cities and towns. Dozens of people died and hundreds were injured and arrested during the protests, which lasted for months. There are thousands of photos of real people actually protesting. And yet Amnesty chose to remember these protests with AI-generated images that don’t even manage to get Colombia’s flag correct, creating a public narrative that turns reality into a dystopian, AI-generated fiction.

The images contain hallmarks of AI generation. Beyond the Colombian flag’s colors appearing in the wrong order, the same image is full of unrecognizably warped human faces and features like hair, the officers’ name tags are gibberish, and hands are distorted. In another image shared by Amnesty Norway, depicting an AI-generated Colombian police officer, the officer’s helmet is a nonsensical, mangled design and his anatomy is strangely distorted.

Amnesty has access to real photos of the depicted events; interspersed among the AI-generated images is an interview with Leidy Cadena Torres, a Colombian political scientist who lost an eye during the protests and ultimately had to seek asylum in Norway. Amnesty International has also put out various reports about police brutality during the Colombian protests, and has worked with specific photojournalists and videographers to capture on-the-ground media depicting the real horrors of police brutality in the country.

Considering this, using AI-generated images that get basic details like Colombia’s flag wrong and invent fake human beings in place of the real protesters undermines Amnesty's mission and adds a level of abstraction to the real-world atrocities that happened in Colombia. It also undermines Amnesty’s position as an organization that can be trusted to provide on-the-ground details about human rights violations around the world.

“As part of its campaign for police reform in Colombia, Amnesty International decided to use artificial intelligence images as a means of illustrating the grave human rights violations committed during the 2021 National Strike without endangering anyone who was present,” an Amnesty International spokesperson told Motherboard in a statement. “Many people who participated in the National Strike covered their faces because they were afraid of being subjected to repression and stigmatization by state security forces. Those who did show their faces are still at risk and some are being criminalized by the Colombian authorities.”

“Amnesty International works frequently with photojournalists, photographers and visual artists to develop campaign materials and often uses real-life photography of the human rights violations that it is denouncing. The organization has done this in the past in Colombia, including in materials related to the repression of protests in the context of the 2021 National Strike. On this latest occasion, Amnesty International decided to use AI-generated images as a means of depicting victims while still protecting their identities. The intention was for this artwork to serve as a symbolic example that highlights the fear that many of the victims still feel about publicly denouncing the repression they suffered,” they continued. “All of the AI-generated images are clearly marked ‘illustrations produced by artificial intelligence.’ The organization used this disclaimer precisely to avoid misleading anyone. The images were also edited in a way so that they are clearly distinguishable from real-life photography, including the use of vivid colours and a more artistic style to honor the victims. Amnesty International’s intention was never to create photorealistic images that could be mistaken for real life. Leaving imperfections in the AI-generated images was another way to distinguish these images from genuine photographs.”