Google banned a man’s account and referred him to the police after its automated systems flagged potential child sexual abuse material (CSAM) on his phone – material that turned out to be photos he was sending to a doctor.

A US-based software engineer named Mark recently spoke to the New York Times about the bizarre incident that took place early last year and saw him lose access to his email account, phone number, personal photos and files, and the ability to authenticate access to other online services.

It all started when Mark noticed his toddler’s penis had become swollen and was causing pain.

He took photos on his Android device to keep track of the problem and was advised during a telehealth triage session – this was at the height of the COVID pandemic – to send the pictures to a doctor for review.

The doctor soon diagnosed Mark’s son with an infection, prescribed antibiotics, and the toddler was on his way to being healthy again.

But a couple of days later, Mark got a notification that his Google account had been disabled due to a “severe violation” of the tech giant’s policies.

“Oh, God,” Mark thought, according to the New York Times. “Google probably thinks that was child porn.”

He appealed the ban and explained the situation, but to no avail: Google kept Mark’s account locked permanently, cutting him off from the online life he had built using the company’s services.

“The more eggs you have in one basket, the more likely the basket is to break,” he reflected.

A few months later, still without access to his account, Mark received an envelope from the San Francisco Police Department explaining that Google had referred him to the police, which had conducted an investigation.

Google handed every digital detail of Mark’s life over to the police, which, in turn, found no wrongdoing.

“I determined that the incident did not meet the elements of a crime and that no crime occurred,” one of the police’s investigators wrote.

Even with the police clearing Mark of possessing child exploitation material, Google refused to turn his account back on, standing by its noble statement that it is “committed to preventing the spread of [child sexual abuse material] on our platforms”.

Google uses technology that automatically scans photos when they are uploaded to Google Photos – which Android devices do by default – and checks whether they have CSAM-like characteristics.
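To illustrate the general idea – this is a minimal sketch, not Google’s actual pipeline, whose details are not public – detection of known abuse material is commonly described as comparing a perceptual hash of each uploaded image against a database of hashes of previously verified material, with machine-learning classifiers used separately for new content. Every name, value and threshold below is hypothetical.

```python
# Minimal sketch of perceptual-hash matching, a technique commonly used to
# detect *known* abuse imagery. This is not Google's implementation; Google
# is also understood to use machine-learning classifiers for unseen material.
# All names, values and thresholds here are hypothetical.
from PIL import Image
import imagehash  # third-party perceptual hashing library

# In a real system this would be loaded from a database of hashes of
# previously verified abuse images maintained by clearinghouses.
KNOWN_ABUSE_HASHES: set[imagehash.ImageHash] = set()

MAX_HAMMING_DISTANCE = 5  # hypothetical similarity threshold


def should_flag_for_review(path: str) -> bool:
    """Return True if an uploaded image is close enough to a known hash
    that it would be escalated to human review."""
    upload_hash = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects gives their Hamming distance.
    return any(upload_hash - known <= MAX_HAMMING_DISTANCE
               for known in KNOWN_ABUSE_HASHES)
```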

Apple announced a similar program for scanning photos stored on its mobile devices but paused it after community backlash.

Writer and digital rights activist Cory Doctorow said the incident with Mark’s Google account is more evidence that the power and influence of big tech needs to be curtailed.

“The tech giants set out to become utilities, as important to your life as your electricity and water – and they succeeded,” he said in a blog post.

“However, they continue to behave as though they are simply another business, whose commercial imperatives – including the arbitrary cancellation of your services without appeal – are private matters.”

Mark’s account remains locked after Google said it spotted more problematic material – specifically, a video in which a child was lying in bed with a naked woman.

He told the New York Times he doesn’t remember this specific video – which he can’t review, since his account is locked – but said it was likely him recording an intimate, personal moment between his wife and son.

“If only we slept with pyjamas on, this all could have been avoided,” he said.