Microsoft has developed an automated tool that relies on artificial intelligence techniques to identify online conversations in which child grooming by sexual predators may be taking place.

Code-named “Artemis”, the project draws on the Greek word “artemes”, meaning “safe” (not to be confused with NASA’s crewed spaceflight program of the same name).

It relies on monitoring text-based exchanges, looking for words and sentence structures that may indicate child grooming.

The exact nature of the algorithms that sound the alarm has not been revealed, but the system computes a score indicating the likelihood that grooming is taking place and, when that score crosses a predetermined threshold, a human moderator is notified.
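Microsoft has not published how Artemis scores a conversation, but the described flow – score each exchange, escalate to a human above a threshold – can be sketched. Below is a minimal Python illustration; the patterns, weights and 0.8 cut-off are all hypothetical placeholders, not Artemis’s actual logic.

```python
# Hypothetical sketch of the described score-and-escalate flow.
# None of the patterns, weights or threshold values here come from
# Project Artemis, whose algorithms have not been made public.

RISK_PATTERNS = {
    "are you alone": 0.4,             # isolation probing (illustrative)
    "don't tell your parents": 0.5,   # secrecy pressure (illustrative)
    "send me a picture": 0.4,         # image solicitation (illustrative)
}

ALERT_THRESHOLD = 0.8  # assumed cut-off for notifying a moderator


def grooming_risk_score(messages: list[str]) -> float:
    """Return a naive risk score in [0, 1] for a conversation."""
    score = 0.0
    for message in messages:
        text = message.lower()
        for pattern, weight in RISK_PATTERNS.items():
            if pattern in text:
                score += weight
    return min(score, 1.0)


def notify_moderator(messages: list[str], score: float) -> None:
    # In the real system a human moderator reviews the flagged
    # conversation before any authorities are contacted.
    print(f"Flagged for human review (score={score:.2f})")


def review_conversation(messages: list[str]) -> None:
    score = grooming_risk_score(messages)
    if score >= ALERT_THRESHOLD:
        notify_moderator(messages, score)
```

A production system would use a trained language model rather than a keyword list, but the escalation shape is the same: automated scoring first, human judgment before any action.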

If the moderator determines that grooming is occurring, they can contact the relevant authorities and hopefully prevent escalation.

Free for all

Microsoft is making this tool available, at no cost, to companies operating online chat technologies, while ongoing testing and refinement of Artemis takes place on the Xbox and within Skype.

Distribution of the software will be managed by Thorn, a US-based non-profit anti-human-trafficking organisation founded by Ashton Kutcher and Demi Moore that works to address the sexual exploitation of children.

Development of Artemis began at a Microsoft 360 Cross-Industry Hackathon in November 2018, where teams led by Dartmouth College academic Professor Hany Farid analysed tens of thousands of text conversations to gain insight into the linguistic patterns used by online predators.

The teams then built prototypes that could accurately detect the patterns.

Professor Farid has long worked on preventing child exploitation and was a key developer of Microsoft’s PhotoDNA, which detects online child exploitation material by computing hash codes that can uniquely identify images and videos.
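PhotoDNA’s hashing method is proprietary, so the sketch below is not PhotoDNA itself; it uses a simple “average hash” over a grayscale pixel grid purely to illustrate the general hash-and-match idea. The 8×8 grid size and the Hamming-distance threshold are arbitrary choices for the example.

```python
# Illustrative perceptual "average hash" over a grayscale pixel grid.
# This is NOT PhotoDNA, whose algorithm is proprietary; it only shows
# the general idea: reduce an image to a compact fingerprint, then
# compare fingerprints by Hamming distance so that near-duplicates
# (resized or re-encoded copies) still match.

def average_hash(pixels: list[list[int]]) -> int:
    """Compute a 64-bit hash from an 8x8 grayscale pixel grid (0-255)."""
    flat = [p for row in pixels for p in row]
    assert len(flat) == 64, "expects an 8x8 grid (downsample first)"
    mean = sum(flat) / 64
    bits = 0
    for value in flat:
        # Each pixel contributes one bit: above or below the mean.
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_material(image_hash: int,
                           known_hashes: set[int],
                           max_distance: int = 5) -> bool:
    """Flag if the hash is close to any hash of known illegal imagery."""
    return any(hamming_distance(image_hash, h) <= max_distance
               for h in known_hashes)
```

The key property is robustness: unlike a cryptographic hash, a small change to the image (compression, resizing) changes only a few bits of the fingerprint, so near-copies can still be matched against a database of known material.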

For Project Artemis, issues still to be addressed include detection accuracy, communication privacy, and where the moderators will come from and how they will be supported. Even so, the initiative may help to curb the rise in child exploitation that has been an alarming by-product of online chat capabilities.

In a recent blog post, Microsoft’s Chief Digital Safety Officer Courtney Gregoire said, “Project Artemis is a significant step forward, but it is by no means a panacea.”

Every parent’s nightmare

With private one-on-one online chat capability being a feature of many apps and video games, this analysis tool is more important than ever.

As parents juggle work and family commitments, how many know exactly what their children are doing during every moment of online interaction, whether it is through an app or even a chat window on their Xbox?

And is there enough awareness of what “grooming” is?

Although the term usually describes an adult forming a relationship with a child with the intention of meeting them for sex, grooming can be much more subtle than that.

Online grooming often involves manipulating a child into sending sexualised images or videos of themselves to an online predator via chat channels.

These predators often pose as children themselves and will try to befriend the child, gaining their trust, playing on the child’s insecurities, and making them feel special.

Alternatively, they may emotionally blackmail the child into compliance with threats such as “I will strangle your cat” or “I will send the naked pictures of you to your friends”.

Given the ease with which predators can “virtually” enter a home using technology that seems innocent, online chat grooming can be so subtle that it may not even be noticed by adults in the same room.

Potential victims are manipulated into secrecy, and there have been reports of sexual predators taking advantage of Australian children as young as four, a trend that is reportedly on the rise.