An individual who shares non-consensual, AI-generated pornography could face a jail sentence of up to six years under federal government legislation introduced to Parliament on Wednesday.

Attorney-General Mark Dreyfus introduced the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 to the House of Representatives on Wednesday morning.

The bill introduces new criminal penalties for the sharing of non-consensual deepfake sexually explicit material, and an aggravated offence for those who also created the content.

“Digitally created and altered sexually explicit material that is shared without consent is a damaging and deeply distressing form of abuse,” Dreyfus said.

“This insidious behaviour can be a method of degrading, humiliating and dehumanising victims.

“Such acts are overwhelmingly targeted towards women and girls, perpetuating harmful gender stereotypes and contributing to gender-based violence.”

The legislation was flagged last month following a National Cabinet meeting to address the national crisis of gender-based violence.

The bill replaces the existing aggravated offence of using a carriage service to menace, harass or cause offence involving private sexual material with a new offence involving the non-consensual sharing of sexually explicit material online.

It also clarifies that this applies to real images and digitally created ones that resemble individuals, such as deepfakes.

“The issue that arises when dealing with such material under the current framework is that because the victim is not involved in the creation of the fictional ‘deepfake’ version of themselves, an expectation of privacy may not attach to the depiction of the victim,” the bill’s explanatory memorandum said.

“This issue does not arise with the new offences, which do not rely on this definition and instead turn on whether the person depicted in the material consents to its transmission.”

The legislation defines deepfakes as “images, videos or sound files, generated using AI, of a real person that has been edited to create an extremely realistic but false depiction of them doing or saying something that they did not actually do or say”.

It also clarifies that the deepfake material must be a depiction that “reasonably or closely resemble[s] an individual to the point that it could be mistaken for them”.

The sharing of such deepfake images may now result in criminal penalties of up to six years’ imprisonment, and if the person also created the content, they could receive a seven-year sentence.

Difficult to prosecute

Dreyfus acknowledged that it may be difficult to identify those who share deepfakes online.

“People do hide behind fake identities, people do hide behind their accounts, but that does not mean that there are not technological means of tracing,” Dreyfus said in an interview on The Briefing.

“The fact that it’s difficult to find someone is not a reason for not legislating.”

Criminalising the sharing of this sort of content will also send a message, Dreyfus said.

“There’s also a value in the Parliament of Australia sending a message to the community that these activities are criminalised,” he said.

“Criminal law always serves that function of sending a message to the community, drawing lines to say this behaviour is unacceptable and it’s so unacceptable that we are going to criminalise it and provide serious penalties for it.

“We hope that that message will also have a beneficial effect.”

The use of AI technology to generate fake sexually explicit content hit the mainstream earlier this year when deepfake pornographic images of pop star Taylor Swift went viral on social media.

The deepfakes circulated widely on platforms such as X, with one such image viewed 47 million times before it was taken down.

Australia’s eSafety Commissioner warned last year that AI-generated child sexual abuse materials and deepfakes were on the rise, with the agency receiving reports of “sexually explicit content generated by students using this technology to bully other students” and a “growing number of distressing and increasingly realistic deepfake porn reports”.