A Victorian MP is pushing for improved laws to protect women from the weaponisation of AI tools after Nine News blamed Photoshop for digitally sexualising an image of her for a TV bulletin.

On Monday Nine News used an edited photo of Victorian Animal Justice Party MP Georgie Purcell as part of a bulletin on duck hunting in the state. The image appeared to have been altered to make Purcell’s breasts larger and to expose her midriff.

“Having my body and outfit photoshopped by a media outlet was not on my bingo card,” Purcell posted on X.

“Note the enlarged boobs and outfit to be made more revealing. Can’t imagine this happening to a male MP.”

In a follow-up statement, Purcell called out the sexist nature of the edits.

“Unfortunately, the difference for women is that they also have to deal with the constant sexualisation and objectification that comes with having images leaked, distorted and AI generated,” she said.

“The message this sends to young women and girls across Victoria is that even at the top of your field, your body is always up for grabs.”

Nine News director Hugh Nailon apologised to Purcell in a statement addressing the photo, blaming the error on an automation tool in Photoshop.

“Our graphics department sourced an online image of Georgie to use in our story on duck hunting,” Nailon said.

“As is common practice, the image was resized to fit our specs. During that process, the automation by Photoshop created an image that was not consistent with the original. This did not meet the high editorial standards we have and for that we apologise to Ms Purcell unreservedly.”

A spokesperson for Adobe, the company behind Photoshop, said its generative artificial intelligence features require “human intervention”.

“Any changes to this image would have required human intervention and approval,” an Adobe spokesperson told The Guardian.

Speaking on ABC RN Breakfast on Wednesday, Purcell said stronger protections were needed against the use of AI tools to target women.

“This has happened to me this week, it also actually happened to Taylor Swift this week with deepfake [images] using AI, happening to her all over Twitter on a much larger scale,” she said.

“I think we need to seriously consider that our laws are probably not keeping pace with emerging technologies like AI and the risks they pose not just to women in public life, but everyday women as well.

“I’ve heard stories from young women who have had just enough photos on their Instagram profiles for them to be taken and altered to appear naked without their consent, and it’s deeply, deeply concerning.”

Victorian Premier Jacinta Allan said the edited photo was concerning.

“That’s no way to represent any woman, let alone a woman who holds a position in public office, represents a community and is in the public discourse every single day,” Allan told the media on Tuesday.

Earlier this week X blocked all searches for Taylor Swift after AI-generated explicit images of the pop star spread across the platform.

One of the pornographic fake images of Swift had 47 million views before it was taken down by X.

US lawmakers have proposed legislation that would require creators to include a digital watermark on any content produced using AI, and that would criminalise sharing deepfake pornography online.

Australia’s eSafety Commissioner has warned that AI tools are being used by children to bully other students, and that AI-generated child sexual abuse material was on the rise last year.

eSafety Commissioner Julie Inman Grant said there had been a “growing number of distressing and increasingly realistic deepfake porn reports” and her agency had received reports of “sexually explicit content generated by students using this technology to bully other students”.