Over the last few months we have seen several platforms capable of creating images from text. DALL·E is the most impressive, but it cannot yet be tested publicly. We do have DALL·E 2, though, as well as Stable Diffusion v1.5, which is no less surprising.
A few weeks ago I wrote Stable Diffusion Vs DALL·E mini, comparing two text-to-image generators so you can see the potential they have.
The fact is that many people manage to do wonders with these AI engines, and then upload the generated images to stock image banks such as Getty Images, among others.
That’s not a good idea.
Getty Images has banned the uploading and sale of illustrations generated with AI art tools like DALL-E, Midjourney, and Stable Diffusion, something sites like Newgrounds, PurplePort, and FurAffinity have already done. The reason is clear: many of these images are generated from components of already published, copyright-protected images, so they may infringe copyright.
There is concern about the legality of AI-generated content: unaddressed rights issues regarding the images themselves, their metadata, and the people depicted in them. A user who uploads an AI-generated image could end up being served with a lawyer's notice demanding damages.
Stable Diffusion, for one, creates images using content pulled from the web, including personal art blogs, news sites, and stock photo sites like Getty Images itself. The "fair use" doctrine provides some protection, but it does not cover the sale of images.
At the moment it is not known whether any legal proceedings involving AI-created images have taken place, but Getty Images prefers to block them now rather than face a problem in the future.
Shutterstock, on the other hand, has not published specific policies prohibiting the material, but it has limited search results.
Creating content from what hundreds of artists have done before is not a good idea if your goal is to sell it. Artificial intelligence does not create from scratch; it creates from what it has learned, and that source material may be protected by copyright.
The problem is that these platforms cannot tell whether an image was created by AI, so they will have to rely on reports from other users to flag such images. They are now working with the C2PA (Coalition for Content Provenance and Authenticity) to create filters, but it seems it will take time for these to become truly effective.