Facebook is making it harder for advertisers and strangers to reach young people on Instagram, in response to mounting pressure over how children interact with the platform.

Instagram users under the age of 16 will be defaulted into private accounts, giving them more control over who sees the content they post, such as personal photographs.

Along with the private-account default, Instagram is making it harder for adults flagged for “potentially suspicious behaviour”, such as those recently reported by young users, to find and contact young people.

Finally, the platform will stop allowing advertisers to target users under the age of 18 based on anything other than their gender, age, and location.

“We already give people ways to tell us that they would rather not see ads based on their interests or on their activities on other websites and apps, such as through controls within our ad settings,” Instagram said.

“But we’ve heard from youth advocates that young people may not be well equipped to make these decisions.

“We agree with them, which is why we’re taking a more precautionary approach in how advertisers can reach young people with ads.”

Facebook came under fire earlier this year when digital rights group Reset Australia published a report showing how advertisers could target young people based on their profiled interest in age-inappropriate topics such as gambling and alcohol.

Reset Australia’s executive director, Chris Cooper, remained sceptical of Facebook’s changes to Instagram.

“Facebook isn’t saying it will stop profiling kids based on dubious interests, just that it will not let advertisers target them based on them,” he said.

“There is no commitment Facebook itself won’t keep using this profiling for its own purposes.

“This just underscores the need for meaningful public oversight about how these platforms collect and use young people’s data.”

Cooper noted Facebook’s changes coincide with the UK’s Age Appropriate Design Code, which comes into force in September.

The UK code will require digital platforms to build stricter controls into their services to ensure children’s data is appropriately protected. It covers areas such as profiling, default settings, and so-called ‘nudge techniques’ designed to manipulate behaviour.

Cooper wants to see a similar regulatory effort in Australia, saying Facebook and other digital platforms shouldn’t be trusted to decide how they operate when it comes to the digital rights of children.

“We shouldn’t rely on Facebook to self-regulate or other countries to dictate standards – Australia needs a regulatory code governing how children and young people’s data is collected and used,” he said.