Glaze: the technology that confounds artificial intelligence and protects art

Since the advent of “AI art” platforms almost a year ago, artists have watched helplessly as their creations are scraped into machine learning systems and used to generate new images without credit or compensation.

A team of researchers at the University of Chicago, working with artists, has developed a tool called Glaze that is designed to let artists actively protect their work.

How does Glaze work?

Glaze overlays an almost invisible second layer on top of an artwork, one that encodes a piece with a similar composition in a completely different style. This layer is not made of noise or random shapes, but of another work of art. The change cannot be detected with the naked eye, yet any machine learning model analyzing the artwork is thrown off, because the Glaze layer pulls the model’s reading of the piece toward that other style.

Rather than letting these machine learning platforms generate images based on the style of a human artist, as currently happens, Glaze shields the artist’s style. For example, if a user asks for an illustration in the style of Karla Ortiz, instead of the platform lifting her work to mimic it, Glaze diverts the platform’s attention to another piece with a similar composition but in a different style.
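To make the idea more concrete, the sketch below shows one way such a “style cloak” could be built in principle: a tiny perturbation is optimized so that a feature extractor reads the cloaked image as closer to a reference work in a different style, while the pixel changes stay within an invisibility budget. This is only an illustrative sketch, not the Glaze team’s code; the ResNet encoder, the loss, the 4/255 budget, and the `cloak` function are all placeholder assumptions.

```python
# Conceptual sketch of a "style cloak": a small perturbation delta is optimized so
# that a feature extractor sees the cloaked image as closer to a target-style piece,
# while the per-pixel change stays below a perceptibility budget.
# NOTE: illustration only -- the encoder, loss, and budget are placeholder
# assumptions, not Glaze's actual implementation.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Stand-in feature extractor (Glaze targets the feature space used by generative
# models; a ResNet trunk is used here purely for illustration).
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()
encoder.eval()

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def cloak(original_path, target_style_path, budget=4 / 255, steps=200, lr=0.01):
    """Return a cloaked version of the original image as a tensor in [0, 1]."""
    x = preprocess(Image.open(original_path).convert("RGB")).unsqueeze(0)
    target = preprocess(Image.open(target_style_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        target_feat = encoder(target)

    delta = torch.zeros_like(x, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        cloaked = (x + delta).clamp(0, 1)
        # Pull the cloaked image's features toward the target style...
        loss = torch.nn.functional.mse_loss(encoder(cloaked), target_feat)
        loss.backward()
        opt.step()
        # ...while keeping each pixel change within the invisibility budget.
        with torch.no_grad():
            delta.clamp_(-budget, budget)
    return (x + delta).clamp(0, 1).detach()
```

In practice, an artist would run something like `cloaked = cloak("my_painting.png", "target_style.png")` on each new piece before posting it; Glaze itself wraps its own version of this process in a downloadable application.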

Short-term protection

The tool cannot protect images these platforms have already ingested, but it does give artists a way to actively protect any new work they post online. The Glaze team acknowledges that it is not a permanent defense against AI mimicry: AI evolves quickly, and systems like Glaze face the inherent challenge of staying ahead of it.

The artist Karla Ortiz, who took part in creating Glaze, said the tool can keep an artist’s style from being imitated without consent. When a machine learning platform analyzed one of Ortiz’s glazed paintings, her stylistic signature was effectively erased from what the model saw, demonstrating Glaze’s effectiveness.

“Stable Diffusion today can learn to create images in Karla’s style after viewing just a few pieces of Karla’s original artwork (taken from Karla’s online portfolio). However, if Karla uses our tool to cloak her artwork, adding small changes before posting it to her online portfolio, Stable Diffusion will not learn Karla’s art style. Instead, the model will interpret her art as a different style (for example, that of Vincent van Gogh). Someone who prompted Stable Diffusion to produce ‘Karla Ortiz-esque’ artwork would get Van Gogh-esque (or hybrid) imagery. This protects Karla’s style from being reproduced without her consent,” the team explains on the tool’s website, where Glaze is also available for download.

While Glaze is not a permanent solution, it is a promising tool for artists who want to shield their creative work from AI imitation, offering a short-term measure of protection for any new artwork released online. As AI continues to evolve, it will be interesting to see how protection tools adapt to keep pace with this ever-changing technology.