Apple is delaying a controversial program to automatically scan iOS and iPadOS devices for child sexual abuse material (CSAM) following privacy concerns and questions about potential problems with the technology.

In an update to its original statement about the child safety features, Apple said it would hold off on releasing its device scanning software for a few months.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said.

Under the plan, Apple would check every photo uploaded to iCloud for CSAM. It would do this by hashing the images and comparing them against a database of confirmed CSAM image hashes stored locally on the device. Once the number of matches crossed a certain threshold, the system would refer the flagged images to a human reviewer for verification, and the Apple account could then be disabled.
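
In rough outline, the logic would look something like the sketch below. This is a hypothetical illustration of the threshold idea only, not Apple's implementation; the function names and the threshold value are assumptions made for the example.

```python
# Hypothetical sketch of threshold-based perceptual hash matching.
# Not Apple's code: the names and the threshold value are illustrative only.
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # illustrative value, not a published figure


def count_matches(photo_hashes: Iterable[bytes], known_csam_hashes: Set[bytes]) -> int:
    """Count uploaded photos whose perceptual hash appears in the local database."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)


def should_escalate(photo_hashes: Iterable[bytes], known_csam_hashes: Set[bytes]) -> bool:
    """Nothing is escalated until the number of matches crosses the threshold."""
    return count_matches(photo_hashes, known_csam_hashes) >= MATCH_THRESHOLD
```

The point of a threshold in this kind of design is to avoid escalating an account over a single stray match; only an accumulation of matches would trigger human review.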

The move to start scanning every image uploaded to a user’s iCloud library seemed like a radical reversal of Apple’s typical pro-privacy position, which has seen the company regularly collide with law enforcement over its refusal to unlock iPhones and hand over encrypted information.

Apple has also used device and data privacy as a point of differentiation from its Silicon Valley rivals, sparking a battle with Mark Zuckerberg over privacy features.

While there is a very real and clear need to stop the abhorrent spread of child abuse material online, digital rights advocates were concerned Apple was heading down a slippery slope toward enabling further technological surveillance of free citizens.

Slippery slope

The Electronic Frontier Foundation (EFF), which vehemently opposed Apple’s phone scanning features from the outset, welcomed Apple’s delay but said it wanted to see the phone scanning plans scrapped entirely.

“The features Apple announced a month ago, intending to help protect children, would create an infrastructure that is all too easy to redirect to greater surveillance and censorship,” Cindy Cohn, executive director of the EFF, said in a blog post.

“These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a new mass surveillance system to spy on citizens.”

Computer scientists offered similar warnings about the potential misuse of this technology.

Jonathan Mayer and Anunay Kulshrestha, researchers from Princeton University, independently explored a “privacy-preserving perceptual hash matching” system similar to the one Apple was considering.

Like Apple’s, Mayer and Kulshrestha’s system would check images against a database of hashed material and flag any inappropriate content, as a way for law enforcement to monitor end-to-end encrypted services for illicit material.

But they quickly realised the potential ramifications for user privacy and the right to free speech.

“Our system could be easily repurposed for surveillance and censorship,” the pair wrote in a Washington Post op-ed last month.

“The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

“We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.”
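
Their repurposing concern follows directly from how such a matcher is built: it is content-agnostic. The hypothetical sketch below, which is not the researchers’ code, shows that nothing in the matching step encodes what kind of material the database contains, so quietly swapping in a different hash list changes what gets flagged without any visible change to the user.

```python
# Hypothetical illustration of a content-agnostic matcher: the same code path
# flags whatever happens to be in the database it is handed.
from typing import Set


def flag_if_listed(image_hash: bytes, hash_database: Set[bytes]) -> bool:
    """Return True if the image's perceptual hash appears in the supplied database."""
    return image_hash in hash_database


# The call site is identical whether the database holds CSAM hashes or hashes
# of, say, political imagery a government has demanded be tracked.
csam_hashes: Set[bytes] = {bytes.fromhex("ab" * 32)}
other_hashes: Set[bytes] = {bytes.fromhex("cd" * 32)}

print(flag_if_listed(bytes.fromhex("ab" * 32), csam_hashes))   # True
print(flag_if_listed(bytes.fromhex("ab" * 32), other_hashes))  # False
```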

Hash collisions

After Apple announced its original plans, a community of researchers and computer scientists began hunting for potential flaws in the system.

Almost immediately, people reverse engineered the NeuralHash algorithm Apple was planning to use for its on-device detection.

This led to further exploration as people hunted for innocuous images that the NeuralHash model would flag as child exploitation material.

One such hash collision was found early on, suggesting the model could be fooled into treating images that look like meaningless noise to a human as CSAM. That raised the prospect that similar images could easily be disseminated and used to overwhelm Apple’s human verification system.
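
To see why collisions are possible at all, it helps to remember that perceptual hashes deliberately throw away most of an image’s detail so that near-duplicates hash alike. The toy example below uses a simple average hash rather than NeuralHash; the real collisions were crafted adversarially against a neural network, but the underlying issue is the same, in that many different images map to the same short hash.

```python
# Toy collision against a simple perceptual hash (average hash), purely for
# illustration -- this is not NeuralHash.
import numpy as np


def average_hash(img: np.ndarray) -> int:
    """aHash: one bit per pixel, set when the pixel is brighter than the image mean."""
    bits = (img > img.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)


rng = np.random.default_rng(0)
pattern = rng.integers(0, 2, size=(8, 8))  # which pixels are "bright"

# Two images with different pixel values (plus noise) but the same bright/dark
# layout, so every hash bit comes out identical.
img_a = np.where(pattern == 1, 200.0, 50.0)
img_b = np.where(pattern == 1, 255.0, 10.0) + rng.normal(0, 3, size=(8, 8))

assert not np.array_equal(img_a, img_b)            # visibly different pixels
assert average_hash(img_a) == average_hash(img_b)  # identical perceptual hash
print(hex(average_hash(img_a)))
```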