Unscrupulous mobile app developers are collecting personal data about Australian children younger than five, a privacy watchdog group has warned, after a nine-month analysis of 186 popular game apps found that 59 per cent conduct “problematic data collection behaviour”.
Part of a campaign called Apps Can Track, the technical analysis – commissioned by Children and Media Australia (CMA) with the assistance of Dr Serge Egelman, research director of the Usable Security and Privacy Group at the University of California, Berkeley's International Computer Science Institute (ICSI) – used the ICSI-backed AppCensus tool to examine the data collection practices of 208 Android apps.
Of those, 22 were specifically designed to facilitate school-home or parent-child communication and the remaining 186 were popular games – 54 of which were classified as engaging in “risky” privacy practices.
This means that despite being targeted at very young children – think Dr Panda’s Swimming Pool – the apps collect and send unique Android ID (AID) and Android Advertising Identifier (AAID) codes to as many as six ad-linked companies.
A further 40 apps, including Icecream Cone Cupcake Baking Maker, were flagged as needing caution because they transmitted the AAID to at least one ad-linked company.
And seven apps – including Star Wars Pinball 7 – were classified as “very risky” because they transmitted personal data to ad-linked companies “in an insecure manner”.
AID and AAID codes allow advertisers to identify individual mobile handsets and track users’ behaviour across apps – behaviour that has been tightly regulated for children under 13 since the United States passed the Children’s Online Privacy Protection Act (COPPA) in 1998.
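For readers curious about the mechanics, the sketch below (in Kotlin; the function name is hypothetical, and it assumes the standard Google Play Services ads-identifier library) illustrates how little code an app – or an advertising SDK bundled inside it – needs to read both identifiers:

```kotlin
import android.content.Context
import android.provider.Settings
import com.google.android.gms.ads.identifier.AdvertisingIdClient
import kotlin.concurrent.thread

// Hypothetical helper for illustration: reads the two identifiers
// discussed above using standard Android and Play Services APIs.
fun readDeviceIdentifiers(context: Context) {
    // The Android ID (AID): persistent, readable with no permission
    // prompt and no visible indication to the user.
    val androidId = Settings.Secure.getString(
        context.contentResolver, Settings.Secure.ANDROID_ID
    )

    // The Advertising ID (AAID): user-resettable, but stable until reset.
    // getAdvertisingIdInfo() performs blocking IPC, so it must be called
    // off the main thread; on Android 13+ the app must also declare the
    // com.google.android.gms.permission.AD_ID permission in its manifest.
    thread {
        val adInfo = AdvertisingIdClient.getAdvertisingIdInfo(context)
        // Sending these values to an ad network is the behaviour AppCensus
        // flags: together they let a tracker recognise the same handset
        // across every app that embeds the same SDK.
        println("AID=$androidId AAID=${adInfo.id} " +
                "limitAdTracking=${adInfo.isLimitAdTrackingEnabled}")
    }
}
```

Nothing in this sketch is exotic: the same few calls sit inside many off-the-shelf advertising SDKs, which is why identifier collection can happen without the app developer writing any tracking code themselves.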
The absence of legislation to protect Australian children motivated CMA, which has separately reviewed the content of over 1,500 movies and 990 apps since 2002, to expand its remit to include privacy protections.
The results of the project are being incorporated into CMA’s growing database of mobile app privacy reviews, which the organisation has been building since June 2021 and now includes 208 app reviews.
CMA’s goal is “to support parents who may not be aware of the risks in apps their children play, nor be able to detect or prevent them,” the organisation said, warning that commonly used privacy practices create “an extensive digital footprint that can be used by predators and marketers, and may jeopardise prospects later in life such as jobs, university places, health insurance and more.”
“Big Tech’s business model is to gather users’ data to sell them things,” the organisation added, “and [it] will not be easily budged from this stance.”
The clock is TikTok-ing
Detailed policies from Apple and Google set clear expectations around what data apps will and won’t collect, but developers have been repeatedly caught out ignoring those restrictions and collecting far more data about users than they are supposed to – or need to.
Earlier this year, for example, Google banned more than a dozen apps after AppCensus analysis identified secret data-harvesting practices.
In March, the US Federal Trade Commission fined weight loss firm WW International (formerly Weight Watchers) and subsidiary Kurbo, Inc. $2.2 million ($US1.5 million) for illegally collecting data about children as young as eight without their parents’ consent.
Ubiquitous pandemic-era remote learning upped the ante, with Human Rights Watch (HRW) last year reporting that 89 per cent of 163 analysed EdTech products “put at risk or directly violated children’s privacy and other children’s rights, for purposes unrelated to their education.”
The products “monitored or had the capacity to monitor children, in most cases secretly and without the consent of children or their parents,” HRW found – with most online learning platforms using tracking technologies that “trailed children outside of their virtual classrooms and across the internet, over time”.
Nearly all of the apps shared children’s data with third parties, HRW found, warning that “in doing so, they appear to have permitted the sophisticated algorithms of AdTech companies the opportunity to stitch together and analyse this data to guess at a child’s personal characteristics and interests, and to predict what a child might do next and how they might be influenced.”
The yawning gulf between the theoretical protection of children and the actual practices of app developers has set off alarm bells at the highest levels.
Earlier this month, US Federal Communications Commission (FCC) commissioner Brendan Carr asked Apple and Google to remove the ubiquitous TikTok app from their app stores after investigations suggested it is “a sophisticated surveillance tool that harvests extensive amounts of personal and sensitive data.”
For now, CMA is focused on advocating for “the urgent need for effective regulatory protections”, pushing developers to adopt “child-centred safety by design”, and empowering parents – including by securing additional funding to make the AppCensus findings more accessible to non-technical parents.