Adobe is selling AI-generated images depicting fake scenes of the bombardment of cities in both Gaza and Israel. Some are photorealistic, others are obviously computer-made, and at least one has already begun circulating online, passed off as a real image.
As first reported by Australian news outlet Crikey, the photo is labeled “conflict between Israel and palestine generative ai” and shows a cloud of dust swirling above a cityscape. It’s remarkably similar to actual photographs of Israeli airstrikes in Gaza, but it isn’t real. Despite being an AI-generated image, it ended up on a few small blogs and websites without being clearly labeled as AI.
Motherboard found multiple AI-generated images currently for sale on Adobe Stock that claim to depict the bloody conflict, which has seen the Israel Defense Forces bombard the Palestinian territory and kill thousands of civilians in the weeks since an attack by Hamas killed 1,400 Israelis. One AI-generated image is titled “Destroyed buildings as a result of a rocket fired by militants from the Gaza Strip” and shows devastated streets; another is called “Destroyed buildings in Gaza town of Gaza strip in Israel, Affected by war.” Other AI-generated images for sale on Adobe Stock show fake explosions inside Israel.
The images for sale were submitted, and presumably generated, by users. Some disclose in their titles that they are AI-generated, while others do not, though all are marked as AI-generated in the site’s interface. However, proliferating, and profiting from, fake imagery of Israel-Palestine is arguably a dubious practice even if it’s labeled at the point of sale.
As numerous experts have pointed out, the collapse of social media and the proliferation of propaganda have made it hard to tell what’s actually going on in conflict zones. AI-generated images have only muddied the waters, including over the last several weeks, as both sides have used AI-generated imagery for propaganda purposes. Further compounding the issue, many publicly available AI generators launch with few guardrails, and the companies that build them don’t seem to care.
Adobe Stock is a contributor network: individual artists submit works to the website, which then sells them to users for a fee. Last year, Adobe began accepting AI-generated image submissions. It’s probable that the Israel-Palestine images on Adobe were generated using an AI trained on photographs of actual war zones, but it’s impossible to know. In its content guidelines, Adobe tells users not to submit AI-generated images with iffy rights. “Don’t submit any work created with prompts referring to or based on people, places, or property unless you have the legal right to do so,” it says. “If you are unsure if you have the necessary rights, do not submit content made using that prompt.” The guidelines make no mention of whether users should upload images depicting ongoing violent conflicts.
“Adobe Stock is a marketplace that requires all generative AI content to be labeled as such when submitted for licensing,” an Adobe spokesperson told Motherboard in an email. “These specific images were labeled as generative AI when they were both submitted and made available for license in line with these requirements. We believe it’s important for customers to know what Adobe Stock images were created using generative AI tools.”
“Adobe is committed to fighting misinformation, and via the Content Authenticity Initiative, we are working with publishers, camera manufacturers and other stakeholders to advance the adoption of Content Credentials, including in our own products,” the statement continued. “Content Credentials allows people to see vital context about how a piece of digital content was captured, created or edited including whether AI tools were used in the creation or editing of the digital content.”