Remember that time you accidentally tagged your mom in a meme? Now imagine that meme is a hyperrealistic, AI-generated nude of you circulating the school. Welcome to the new reality for some teenagers.
Elliston Berry, a 14-year-old from North Texas, found herself in this digital nightmare when a classmate used AI to create deepfake images of her, grafting her face onto a nude body. The pictures spread like wildfire. “It was like my whole life turned upside down,” she says. Apparently, that’s just Tuesday for Gen Z now.
It’s Not Just Elliston: The Rise of AI-Fueled Harassment
According to a recent report from Thorn, a nonprofit focused on child safety online, one in eight teens knows someone victimized by deepfake nudes. Let that sink in. Deepfake technology is becoming increasingly accessible, and the results are often indistinguishable from reality. It’s not just a theoretical problem; it’s a weaponized tool for abuse and harassment, disproportionately targeting teenage girls and women. Because of course it is.
Bullying 2.0: Now With Extra Pixels
The images of Elliston and eight other girls were spread through an anonymous Snapchat account. The perpetrator? A male student, naturally. The lack of a clear playbook for dealing with such incidents left the girls and their families in limbo for months. Some even transferred schools. Nothing says ‘justice’ like uprooting the victims’ lives while the perpetrator probably still thinks it was a hilarious prank.
The mental health toll is staggering. Shame, guilt, fear – all the hallmarks of traditional sexual assault cases, amplified by the viral nature of the internet. Each time the image resurfaces, it’s a fresh wound. As RAINN’s Stefan Turkheimer puts it, “Each time the image comes up and a friend calls and says, ‘I’ve seen you online,’ it’s a new transgression, it’s a new piece of shame.”
The Numbers Don’t Lie (But Sometimes They’re Deepfakes)
Thorn’s survey reveals a disturbing pattern: beyond the one in eight who know a victim, plenty of teens are directly or indirectly affected themselves, and many suffer in silence, hesitant to seek help. The shame and guilt are real, even if the image isn’t.
The National Center for Missing and Exploited Children (NCMEC) has seen a massive surge in reports of generative AI being used to create abusive sexual content. Reports skyrocketed from 4,700 in 2023 to 67,000 in 2024. We’re officially living in the future, and it’s a dystopian hellscape of manipulated images and shattered lives.
Teenage Wasteland: Body Image Edition
Adolescence is a crucial period for developing self-esteem and body image. Throwing deepfake nudes into the mix is like tossing a grenade into an already fragile ecosystem. The trauma can disrupt a young person’s sense of self and their ability to build healthy relationships. Anxiety, depression, PTSD – it’s a buffet of mental health issues waiting to happen.
Elliston’s story is a stark reminder of the real-world consequences of this technology. She worried about the images resurfacing, about how they might affect her future. She even started wearing sweatshirts and sweatpants to hide her body. The perpetrator didn’t just create fake images; he stole her sense of security and self-worth.
The Legal Limbo: Where’s the Justice?
The parents of the victimized girls filed a Title IX complaint, which triggered an investigation, and the perpetrator was charged with a misdemeanor. But there’s no federal law requiring online platforms to remove non-consensual intimate images. Getting Snapchat to take down the images required intervention from a U.S. Senator. It’s clear the legal system is playing catch-up.
Elliston is now advocating for the Take It Down Act, which would criminalize the publication of non-consensual, sexually exploitative images, including AI-generated deepfakes. Even Melania Trump is on board. If that’s not a sign of the apocalypse, I don’t know what is.
What Can Be Done?
If you’re a victim of deepfake nudes, report it to law enforcement and to NCMEC’s CyberTipline. Don’t delete anything; the images, messages, and account names could be crucial evidence.
More importantly, talk to your kids. Have open conversations about deepfakes, consent, and the potential for harm. Make sure they know they can come to you if something happens. Being “behind the ball,” like Elliston’s mom said, is no longer an option. We are all behind the ball. Get ahead of it. Or at least, try to duck.
Because in the digital age, ignorance isn’t bliss; it’s a vulnerability waiting to be exploited.