The art world, bless its bewildering heart, is at it again. This time, the culprit isn’t a urinal signed by a pseudonym, but algorithms churning out ‘art’ at a rate that would make Warhol blush. And, predictably, it’s both generating eye-watering sums of money and triggering existential dread among actual artists.
Christie’s, the auction house known for flogging dead masters and dresses worn by dead princesses, recently held its first-ever AI art sale. The loot? Over a million US dollars. The backlash? A petition signed by thousands of artists demanding the whole thing be cancelled. Seems fair.
The crux of the issue? Copyright, originality, and the looming specter of Skynet replacing Bob Ross. Copyright law, in its infinite wisdom, doesn’t recognize AI as an author. So, who owns the ‘art’? Nobody? Everyone? Prepare for legal battles as exciting as watching paint dry. (Except, you know, the paint was applied by a robot.)
One side argues that AI art is just another tool, like photography before it, destined for eventual acceptance. After all, photography was once considered cheating. Now, we have Instagram influencers and family portraits. Progress, right?
Sue Beyer, a multi-disciplinary artist, sums it up nicely. She uses AI like an “oracle,” feeding it prompts and treating the back-and-forth as a collaborative process. Asking an AI ‘Who is Sue Beyer?’ is peak 21st-century navel-gazing, and we’re here for it.
But not everyone’s thrilled with the digital brushstroke. Birrunga Wiradyuri, a Wiradjuri man and founder of Birrunga Gallery, boycotted the Brisbane Portrait Prize over its AI policy. His concern? Cultural appropriation on steroids. AI scraping Indigenous art, reproducing stereotypes, and generally colonizing culture at warp speed. “Another level of invisible-ing us,” he says. Ouch. He’s got a point. Imagine an algorithm trained on Van Gogh paintings suddenly churning out “Sunflowers, but make it beige.” That’s the risk, amplified a thousandfold.
Dr. Louise Buckingham from the Arts Law Centre Australia highlights the existing problem of fake Indigenous art. AI, she argues, will only exacerbate cultural flattening and theft. Suddenly, those dodgy dot paintings sold to tourists become a full-blown existential threat.
Adobe Stock’s depictions of “Indigenous Australians” don’t help, either. Feather headdresses and generic body markings? It’s a cultural mishmash that’s offensive and lazy. And they’re charging $90 a pop for the privilege of perpetuating stereotypes. Thanks, Adobe.
Researchers at the University of Chicago are fighting back with tools like Glaze and Nightshade. Glaze subtly alters an artist’s images so AI models can’t mimic their style; Nightshade goes further and “poisons” images, turning them into digital landmines for AI training models. Ask for a cow in space, get a handbag. It’s digital sabotage, and frankly, hilarious.
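For the curious, here is a minimal sketch of the general idea, not the actual Glaze or Nightshade algorithm (those optimise the perturbation against a model’s feature space, and their internals aren’t described here): a change to the pixels small enough that a human doesn’t notice, large enough that a model training on the file learns something other than what the picture shows. The file names and the random-noise perturbation are purely illustrative.

```python
# Conceptual sketch only -- NOT the Nightshade/Glaze algorithm.
# Shows the *shape* of a poisoning perturbation: a small, bounded change to
# pixel values that is invisible to people but ends up in any training set
# that scrapes the image. Real tools compute the perturbation by optimising
# the image's features toward a decoy concept (e.g. "handbag" instead of
# "cow"); here it is just random noise as a stand-in.
import numpy as np
from PIL import Image

def poison_image(path_in: str, path_out: str, epsilon: float = 4.0) -> None:
    """Write a copy of the image with a small, bounded perturbation added."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)

    # Placeholder perturbation: uniform noise clipped to +/- epsilon per channel.
    delta = np.random.uniform(-epsilon, epsilon, size=img.shape)

    poisoned = np.clip(img + delta, 0.0, 255.0).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    poison_image("artwork.png", "artwork_poisoned.png")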
Ultimately, the AI art debate boils down to control. Who controls the technology? Who benefits from it? And who gets erased in the process? As Dr. Pfefferkorn notes, artists have been grappling with the ethical implications of technology for centuries. Maybe, just maybe, we can navigate this digital minefield without accidentally blowing up creativity itself.