AI Art: Beyond the Legalese and Into the Soul
Every artist I know is seeing red. Illustrators, novelists, poets – the creative wellspring is overflowing with righteous fury. Why? Because their life’s work, the very essence of their artistic soul, is being slurped up by AI companies without so much as a ‘by your leave,’ let alone a royalty check.
Remember when OpenAI showed off how ChatGPT could mimic Studio Ghibli’s animation style? Cue the internet explosion of Ghibli-fied selfies. Then there’s Meta, apparently hoovering up millions of books to feed its AI beast. No wonder artists are feeling a tad…violated.
The typical artist outcry usually revolves around “permission” and “payment,” or perhaps a deeper philosophical lament about the “erosion of human creativity.” Perfectly valid points, of course. But these are also precisely the arguments AI evangelists love to swat down like pesky flies.
The defense usually goes something like this: "Training AI on copyrighted material falls under 'fair use,' so chill out, creatives." (OpenAI, naturally, claims fair use covers its training, permitting imitation of a studio's general style but not of individual living artists. Lawyers are currently having a field day wading through those murky waters.) And even if it isn't fair use, IP rights shouldn't get in the way of progress, dammit! Innovation for the good of humanity!
But what if this ‘innovation’ actually…harms humanity? What if artists are being forced into a role they fundamentally oppose, causing something far deeper than mere financial loss? We’re talking about moral injury.
Moral Injury: A Wound Deeper Than Copyright Infringement
Moral injury. It’s what happens when you’re forced to betray your own values. Psychiatrists first stumbled on it in Vietnam vets who had carried out orders that clashed violently with their conscience. Think doctors forced to ration care, or teachers made to enforce draconian discipline. It leaves a mark – a lingering sense of shame that can lead to severe mental distress.
AI-generated art and moral injury? Seems like a stretch, right? How can having your drawings used to train an algorithm cause that? The crux of the matter is being forced to contribute, however passively, to a project artists fundamentally reject. It’s not about lost revenue; it’s about being complicit in something they believe is actively harmful, even if they can’t quite articulate why. Yet.
Framing the argument as moral injury hits harder. It’s a direct challenge to the ‘AI is progress, therefore shut up and be grateful’ narrative.
The Luddite Fallacy and the Pornography Analogy
Ah, the Luddites. Always trotted out to shame anyone questioning technological advancement. But let’s get real: the Luddites weren’t anti-technology; they were anti-exploitation. They opposed using machines to replace skilled workers with children earning pennies. The tech wasn’t the problem; the application of the tech was.
AI, narrowly tailored for specific purposes (drug discovery, for example), holds immense potential for good. But Artificial General Intelligence (AGI)? The hypothetical system that surpasses human intelligence? The very people building it are openly worried it might collapse the economy or even wipe us out. So, spare us the ‘progress’ argument.
And what about fair use? The doctrine hinges on the purpose and character of the use. Is it commercial? Will it harm the market for the original creator’s work? This isn’t just remixing music or sharing files. This is closer to pornography.
Wait, what? Hear me out. A court might allow someone to take your photograph in public. But they sure as hell won’t let them turn it into non-consensual porn. AI art, for many artists, is a form of digital defilement. It’s transforming their work into something they find morally repugnant because they believe it contributes to the “enshittification” of the world (even if it doesn’t end it).
That’s where we find ourselves: staring down the barrel of moral injury.
Artists are reaching for the familiar language of copyright and intellectual property. But what they’re really talking about is human agency. Creativity and originality matter because they are essential to who we are as people. And AI, in many artists’ views, is eroding that agency. It homogenizes taste, addicts us to AI companions, and tricks us into surrendering our capacity for ethical thought.
Forcing artists to be complicit in a project that undermines their very humanity? That’s not just unfair; it’s a profound moral violation. And that’s the argument that needs to be screamed from the rooftops.