I love Generative Fill. There, I said it. When Adobe dropped it into Photoshop a couple years back, I immediately used it to remove a coffee cup from a client shot, fix some awkward shadows, and extend a landscape that was framed just slightly wrong. It took seconds. It worked beautifully. It made me more efficient, and my client was thrilled.

But here’s the thing that’s been nagging at me ever since: I have no idea whose work trained that tool to do those things.

The Convenience vs. The Cost

Let’s be honest about what Generative Fill actually does. You draw a selection, hit fill, and Adobe’s AI, powered by Firefly and trained on hundreds of millions of images, generates pixels that match your scene. It’s not magic. It’s pattern recognition trained on a dataset so massive that Adobe can’t (or won’t) tell us exactly what’s in it.

The problem isn’t that AI exists. The problem is that these models were trained on imagery harvested from the open internet without explicit consent from the artists who created it. A landscape photographer’s portfolio. A concept artist’s ArtStation page. Stock photos. All of it potentially fed into the machine that now does work that used to require hours of manual skill.

Adobe says Firefly was trained on licensed content and public domain work. Great. But here’s what we know: other AI models—DALL-E, Stable Diffusion, Midjourney—were trained on scraped internet imagery. Some artists found their work used to train models they didn’t authorize and couldn’t opt out of. Several lawsuits are ongoing. The legal situation is genuinely messy.

What This Actually Means for Working Artists

I’ve got a friend—solid digital painter, years of experience—who recently lost a gig to a generative workflow. The client said the turnaround was faster and the cost was lower. My friend’s rate didn’t change. The market just shifted beneath her feet while she was still mastering her craft.

That’s not hypothetical. It’s happening right now. Junior designers and concept artists are feeling it hardest. The stuff that used to be an entry point—background work, asset generation, compositional adjustment—is increasingly being handled by tools that don’t need to learn, don’t need to be paid, and don’t need to build a portfolio.

And here’s the bitter irony: those tools got good partly by learning from portfolios like the ones those artists are trying to build.

The Skill Question Nobody Wants to Admit

I’ll say it plainly: using Generative Fill doesn’t make you a Photoshop artist. It makes you someone who knows how to use Generative Fill. There’s a meaningful difference.

Learning to hand-paint a sky teaches you how light works, how colors shift, how to blend. You develop an eye. You understand form and atmosphere in a way that clicking and waiting doesn’t replicate. When you use Generative Expand to extend a canvas, you’re not learning composition—you’re betting that an algorithm trained on composition will guess what you want.

This matters because skills compound. A designer who understands perspective, color theory, and form can solve problems in ways that someone who’s only learned to prompt an AI cannot. But why spend two years learning that when you can produce something acceptable in minutes?

The market doesn’t always reward the harder path. That’s real, and that’s the actual threat.

So What Now?

I’m not going to tell you not to use these tools. They’re built into Photoshop. They’re convenient. They’re useful. I still use them for legitimate production work—removing power lines, extending skies, fixing problems that would genuinely waste time.

But I also think we should be honest about what we’re accepting when we use them. We’re accepting that the training happened without everyone’s permission. We’re accepting that this might compress opportunities for working artists. We’re accepting that the definition of “skill” in our industry is shifting in real time.

The better approach? Use these tools where they make sense—production work, problem-solving, efficiency. But don’t let them be the only thing you know. Learn the fundamentals. Understand why something works, not just how to make it work. Build skills that can’t be easily replaced by the next software update.

And maybe—just maybe—when you’re using an AI tool, think about the artists whose work trained it. They didn’t get asked, and they didn’t get paid.

That’s not preachy. It’s just how it is.