Artists Are Suing Over Stable Diffusion Stealing Their Work for AI Art

"They will replace the very artists whose stolen works power these AI products with whom they are competing."
Images generated by Midjourney

Three artists have teamed up in a class action lawsuit against Stability AI, DeviantArt, and Midjourney, alleging that the text-to-image AI tools have infringed the rights of thousands of artists and other creatives “under the guise of ‘artificial intelligence.’” 


The lawsuit, announced on Saturday, claims that the Stable Diffusion tool used by Stability AI, DeviantArt, and Midjourney was trained on billions of copyrighted images scraped from the internet and contained within the LAION-5B dataset, which were downloaded and used by the companies “without compensation or consent from the artists.” 

Artists Sarah Andersen, Kelly McKernan, and Karla Ortiz have teamed up with The Joseph Saveri Law Firm and Matthew Butterick, the same team that filed a class-action lawsuit on behalf of GitHub programmers against GitHub, Microsoft, and OpenAI for using their code without permission to train Microsoft’s AI coding tool, GitHub Copilot. 

“If Stable Diffusion and similar products are allowed to continue to operate as they do now, the foreseeable result is they will replace the very artists whose stolen works power these AI products with whom they are competing,” the law firm wrote in a press release. “AI image products are not just an infringement of artists' rights; whether they aim to or not, these products will eliminate ‘artist’ as a viable career path. In addition to obtaining redress for the wrongful conduct, this lawsuit seeks to prevent that outcome and ensure these products follow the same rules as any other new technology that involves the use of massive amounts of intellectual property. If streaming music can be accomplished within the law, so can AI products.” 


As text-to-image generators have surged in popularity over the past year—with Stability AI’s Stable Diffusion, DeviantArt’s DreamUp, and Midjourney among the most popular—many artists have voiced their opposition to AI art after seeing their own mangled signatures appear in generated images and discovering that their work was used to train the tools without their permission.

Not only are many artists furious at the possible copyright violations of AI art, but they are also frustrated by how AI tools sidestep the labor and creative processes that artists must go through. As a result, many artists have been trying to ban AI art from art-sharing platforms, including ArtStation, an art portfolio website where users protested against the platform’s promotion of AI art alongside their human-created work. 

Artists aren't the only ones suing over Stable Diffusion. According to The Verge, Getty Images announced on Tuesday that it is also suing Stability AI over copyright violation. In a press release, Getty claimed that Stability AI did not seek any licenses from Getty Images and “copied and processed millions of images protected by copyright and the associated metadata owned or represented by Getty Images.” 

As AI innovation continues at a rapid pace, it seems that artists and lawmakers are trying to put a brake on further developments before it's too late.

Many tech enthusiasts are still rallying around the promise of AI tools to create detailed images cheaply and quickly—anonymous tech supporters have already published a response to Butterick and Joseph Saveri’s class-action lawsuit on a website, citing the “fair use” doctrine as the basis for why copyrighted images could be used by the tools.

Fair use, a legal doctrine that permits limited use of copyrighted material without permission from the original creators for purposes such as teaching and news reporting, is what many creators of generative models claim covers their training data. However, because fair use is assessed case by case, it is difficult to make a blanket claim that every work in a dataset qualifies.

“Please note that we take these matters seriously. Anyone that believes that this isn’t fair use does not understand the technology and misunderstands the law,” a spokesperson from Stability AI told Motherboard.

Midjourney and DeviantArt did not immediately respond to Motherboard’s request for comment. The Joseph Saveri Law Firm also did not immediately respond.

Update: This article was updated with comment from Stability AI.