This is not about any specific case. It’s just a theoretical scenario that popped into my mind.

For context, in many places it is required to label AI-generated content as such; in other places it is not required, but it is considered good etiquette.

But imagine the following: an artist is going to make an image. The normal first step is to search for references online, then do the drawing using those as reference. But this artist cannot find proper references online, or maybe wants to experiment, so the artist decides to use a diffusion model to generate a bunch of AI images for reference. The artist then proceeds to draw the image using those AI images as references.

The picture is 100% handmade; every line was drawn manually. But AI was used in the process of making it. Should it carry some kind of “AI warning label”?

What do you think?

  • Andrzej3K [none/use name]@hexbear.net
    16 hours ago

    This is like in the 00s, when every comic book artist was obviously using 3D modeling software but none of them would admit to it. You can’t expect artists to be honest about this sort of stuff, because they’d just be putting themselves at a disadvantage to the ones who lie about it. Furthermore, it doesn’t really matter, does it?