Up to 50 female students from Victorian co-ed private school Bacchus Marsh Grammar have had their likenesses used in a series of AI-generated nude images, leading to the arrest of a male teenager.
The images were reportedly created using photographs taken from the girls’ social media accounts, which were downloaded and manipulated with artificial intelligence (AI) tools to create explicit content.
These ‘deepfaked’ images were then circulated online, effectively victimising around 50 students in years 9 through 12.
As reported by the ABC, the students are receiving support while Bacchus Marsh Grammar works with police to get the images removed and determine the person(s) responsible for producing the illicit content.
The mother of a 16-year-old girl, after learning her daughter’s classmates’ social media photos had been “mutilated” into fake nudes, described the incident as “very traumatising”.
"I went and picked my daughter up from a sleepover and she was very upset, and she was throwing up and it was incredibly graphic," she told ABC Radio Melbourne.
The woman explained she’d seen the images over the weekend, and lamented the technology which produced them.
"I almost threw up when I saw it. It was really, really awful," she said.
"It's just too much. It's too much intensity with all the images and the AI is getting out of hand and we're obviously seeing the implications."
While a teenage boy was arrested following the scandal, comments from the school’s principal Andrew Neal suggest the culprit has not yet been confirmed.
“Logic would suggest that the most likely individual is someone at the school … or a group of people from the school, however, the police and the school are not ruling out any other possibility as well," Neal told the ABC.
Meanwhile a Victoria Police spokesperson has confirmed an investigation is underway.
AI empowering non-consensual porn
Although ‘deepfaking’ is a term which first surfaced as early as 2017, the ability to actually create deepfake images has historically required some technical know-how, particularly surrounding machine learning and AI tools.
Over the past few years, however, more user-accessible tools have gradually appeared and proliferated online, to the extent that many teenagers can now use deepfake technology and apps with ease.
In August 2023, the eSafety Commissioner received its first reports of sexually explicit content generated by students to “bully other students”, prompting the commissioner to call for immediate protections.
Since then, non-consensual ‘nudify’ apps have spread like wildfire – enabling lay users to turn virtually any photograph into an AI-generated explicit image for a small fee – while other apps allow full, custom AI models to be built in a person’s likeness.
In June, the federal government introduced strict legislation carrying jail sentences of up to six years for those who share non-consensual, AI-generated pornography, with aggravated offences for those who create such content.
The legislation was introduced both as a stopgap against the uptake of deepfake tools in Australia and as a means of addressing the national crisis of gender-based violence.
“This insidious behaviour can be a method of degrading, humiliating and dehumanising victims,” said Attorney-General Mark Dreyfus.
“Such acts are overwhelmingly targeted towards women and girls, perpetuating harmful gender stereotypes and contributing to gender-based violence.”
In a statement, Victorian premier Jacinta Allan said there is “no place for this disgraceful and misogynistic conduct in Victoria”.
“My thoughts are with the young women of Bacchus Marsh Grammar and their families,” she said.
“Women and girls deserve respect in class, online and everywhere else in our community, which is why we have made laws against this behaviour and we are teaching respectful relationships in schools to stop violence before it starts.”