Apple has announced a new feature that will detect and report known child sexual abuse material to law enforcement agencies.

The new feature, arriving as part of the iOS 15 update, will scan photo libraries stored on iPhones, initially only in the US.

Known as NeuralMatch, the tool will detect known images of child sexual abuse without decrypting people’s messages.

If Apple finds a match, the image will be reviewed by a staff member, who will notify law enforcement if necessary.
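Apple has not published the internals of NeuralMatch, but the broad approach it describes is comparing fingerprints of photos against a database of known images, with a human reviewer brought in only after matches occur. The short Python sketch below is purely illustrative of that general idea; the placeholder hash value, the match threshold, and the use of a cryptographic hash in place of the perceptual hashing real systems rely on are all assumptions for demonstration, not Apple's implementation.

    import hashlib

    # Illustrative only: real matching systems use perceptual hashes that
    # survive resizing and re-encoding; SHA-256 here merely stands in for
    # "a fingerprint of the image".
    KNOWN_HASHES = {
        # Supplied in practice by child-safety organisations; this value is a placeholder.
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    MATCH_THRESHOLD = 3  # hypothetical: escalate only after several matches

    def fingerprint(image_bytes: bytes) -> str:
        """Return a fingerprint of the image (stand-in for a perceptual hash)."""
        return hashlib.sha256(image_bytes).hexdigest()

    def should_escalate(library: list[bytes]) -> bool:
        """True if enough images match the known database to warrant human review."""
        matches = sum(1 for img in library if fingerprint(img) in KNOWN_HASHES)
        return matches >= MATCH_THRESHOLD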

The tech juggernaut will also warn children and parents if a child sends or receives sexually explicit photos through the Messages app, and will block potentially explicit photos sent to or received by a child’s iMessage account.

Apple says in its announcement that the new feature is secure and is designed to preserve user privacy.

The company also says that the risk of the system incorrectly flagging an account is extremely low.

The new update has formed the basis of a significant new ad campaign and has drawn praise from child protection groups.

However, the move has raised the ire of Australian privacy and digital experts.

Apple was one of the first major companies to embrace end-to-end encryption, in which messages are scrambled so that only senders and recipients can read them.

Privacy advocates suggest the update could have unforeseen ramifications.

Katina Michael is a Professor in the School of Information Systems and Technology at the University of Wollongong.

She says that while Apple should be patted on the back for helping law enforcement tackle a heinous crime, the potential for privacy breaches is of concern.

For example, private messages about a child’s christening could trigger an alert even though they are not remotely illegal, the past board member of the Australian Privacy Foundation told Information Age.

“It all depends how the data is being treated, pre-processed, and stored. Where iPhone’s app is sitting to screen is also important – on the device, or elsewhere. I’m worried about red flags when there are innocent photos in context,” Michael says.

An Australian digital rights advocate also raised concerns about the lack of detail in the announcement, which he says has created confusion in the market.

“The announcement has been a bit clumsy, and Apple hasn’t done a good job of explaining themselves here. We deserve more clarification on this.”

The comments come from Justin Warren, a board member of Electronic Frontiers Australia (EFA), which has been promoting and protecting digital rights since 1994.

He is concerned the update could be used to spy on innocent people, perhaps by government agencies, without consumers’ knowledge.

Apple is sourcing the child sexual abuse material from a secret database provided by a private company, he says.

“It opens a can of worms,” he says.

“Ultimately, the question is, is the mobile phone a private device, or is it not?

“What if you’re in the pub and you leave your phone unlocked, and someone takes a dick pic on your phone? What happens then?” he adds.

The new update looks like a backflip given that Apple has been resolute about the importance of user privacy in recent years, even refusing requests by the Federal Bureau of Investigation to unlock the iPhone of a suspected terrorist, he says.

For decades, there have been all sorts of well-meaning but harmful proposals and legislation intended to protect children.

But the problem is that child sexual abuse material cannot be removed by waving a magic wand, Warren says.

He is concerned that a global private corporation like Apple is making censorship decisions that affect millions of people, seemingly without any intervention from authorities, and is calling for greater clarity on the tool.

“These are complicated societal issues that require careful handling. This needs much more careful thought before it’s rolled out.”

Apple has released an FAQ about its new photo scanning technology.