It may have raised privacy concerns around the world, but increasingly desperate Western governments are taking lessons from China’s Close Contact Detector (CCD) app as they fight to contain the coronavirus.

The CCD app, which draws on a range of data sources to track users’ movements, warns them if they have been in close proximity to someone who later tested positive for the virus.

In China, where mass surveillance is a fact of everyday life, such an app hardly raised an eyebrow – and it fostered a similar approach in South Korea, where legislation allowed government authorities to harvest location data, credit card records and other transactional data to track citizens’ movements.

Given the relatively positive containment results in South Korea – which has just 8,413 cases and 84 deaths in a population of 52m – other countries are examining its response carefully.

Emergency legislation in Israel, for example, has given the country’s Shin Bet internal security service authority to use mobile-phone data to track individuals.

The application of such spying tools to domestic citizens, unanimously approved by the government at the request of the country’s Health Ministry, has triggered High Court challenges and raised concerns amongst privacy advocates, who call it a “slippery slope”.

But could it become the new normal?

Automating contact tracing

With infection and exposure numbers skyrocketing, traditional manual methods of contact tracing to identify at-risk individuals have become untenable.

Automated location tracking and notification have become the only way of keeping up, Mike Parker, a bioethicist who is director of the Wellcome Centre for Ethics and Humanities at the University of Oxford, told Information Age.

“Because at least half the infections come from people who are not yet symptomatic, by the time you’ve gotten to them and started your traditional public health tracking, they have already done their transmission of the disease,” he explained.

“Normal methods of contact tracing aren’t going to be effective – and the lag time means you are never going to catch up.”

Parker’s team has been working with the UK government to explore the ethical constructs that would be required to make a CCD-like app – which cross-matches individuals’ movements, shares data between devices using promiscuous Bluetooth connections, and conveys infection risk using a stoplight mechanism – viable in countries with strong privacy frameworks.
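How such an app might work can be illustrated with a minimal sketch. The Python below is a hypothetical illustration, not code from the CCD app or the Oxford project: phones broadcast anonymous, rotating Bluetooth tokens, log the tokens they overhear from nearby devices, and – once health authorities publish the tokens of confirmed cases – cross-match their own log and translate cumulative exposure into a stoplight rating. All names and thresholds here are assumptions made for illustration.

```python
import secrets
import time
from dataclasses import dataclass, field

def new_rotating_token() -> str:
    """Anonymous identifier a phone broadcasts over Bluetooth; rotated
    frequently so contacts can be matched later without exposing identity."""
    return secrets.token_hex(16)

@dataclass
class ContactLog:
    """Tokens this phone has overheard from nearby devices."""
    heard: list = field(default_factory=list)  # (token, timestamp, minutes in range)

    def record(self, token: str, minutes: float) -> None:
        self.heard.append((token, time.time(), minutes))

def stoplight_risk(log: ContactLog, infected_tokens: set,
                   amber_minutes: float = 5.0, red_minutes: float = 15.0) -> str:
    """Cross-match overheard tokens against tokens published for confirmed
    cases and convert cumulative exposure time into a stoplight rating.
    The 5- and 15-minute thresholds are illustrative, not epidemiological."""
    exposure = sum(minutes for token, _, minutes in log.heard
                   if token in infected_tokens)
    if exposure >= red_minutes:
        return "red"    # prolonged close contact: self-isolate, seek testing
    if exposure >= amber_minutes:
        return "amber"  # brief contact: monitor symptoms
    return "green"      # no recorded contact with a confirmed case

# Example: two tokens overheard; one is later linked to a positive test.
token_a, token_b = new_rotating_token(), new_rotating_token()
log = ContactLog()
log.record(token_a, minutes=20)
log.record(token_b, minutes=2)
print(stoplight_risk(log, infected_tokens={token_a}))  # -> "red"
```

Because only opaque tokens ever leave the phone, a design along these lines can defer the privacy question to whoever holds the mapping between tokens and identities – which is precisely where the governance issues discussed below arise.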

It’s a field of research that has moved from the purely academic to the urgently practical – and researchers highlight three key considerations that must be addressed for such activities to be widely accepted.

The first, one team of researchers noted, is addressing how the collected data will be interpreted and the disciplinary biases that shape that interpretation; the second is the potential risk to data subjects in low- and middle-income countries.

Finally, governments need to address the risk of ‘function creep’ – the possibility that intrusive surveillance implemented for pandemic monitoring would be expanded to uses such as tracking political rivals or members of minority groups.

A matter of trust

An earlier UK app called FluPhone, which spawned a multi-disciplinary research effort called the FluPhone Project, explored many of the issues around location-based disease tracking.

Escalating concern about COVID-19 means many users would happily opt in to a location-sharing app, Parker says, but widespread adoption would require governments to build long-term trust by addressing issues such as the security of the data and controls to ensure its eventual destruction.

“You would need a respected, credible group of people to oversee this,” Parker explains, “and there needs to be a very clear commitment that the uses of this data would be limited to addressing the epidemic.”

Although hundreds of millions of Americans regularly donate private information to social-media giants, privacy is paradoxically still a major concern in that country.

There, all options are on the table. Location-based tools such as MIT Media Lab’s Private Kit: Safe Paths actively collect location data, while the Medical College of Georgia is developing an AI-based tool that determines risk exposure using in-home surveys of mobile users.

Despite growing concerns about COVID-19 in Australia, a similar app could face an uphill battle to get traction amidst ongoing discussions about data privacy and surveillance.

Some 58 per cent of respondents to a recent Privacy Australia survey said they were not confident that most companies take adequate steps to protect their data, while 29 per cent said they are more concerned about privacy than they were a year ago.

And 30 per cent said Australian government surveillance was the biggest online privacy threat – just ahead of the 27 per cent that named companies collecting or trafficking in data.

Taken together, those two threats (57 per cent) were seen as a bigger danger than hackers and cybercriminals, which 39 per cent of respondents nominated.

With 88 per cent of the Australian respondents saying they value privacy more than convenience, unilateral government surveillance efforts would likely be poorly received – but current exceptional circumstances might change that for many people.

“Everyone recognises that we have to work together to solve this problem,” Parker says, “and one way of working together is to stay in your house and not go out.”

“This sort of app offers the possibility of a more intelligent approach to social distancing, which enables us to be more socially responsible but also to participate in our lives more meaningfully.”