The government needs to make sure its reform of the Privacy Act treats personal and technical data in a way that protects individuals’ right to privacy, the Australian Computer Society (ACS) has told the Attorney General’s department.
The department is currently undertaking a significant review of the Privacy Act with a view to modernising it to better fit within the current technological landscape.
In its submission to the review's discussion paper, ACS said it was important to treat technical information that could be used to identify an individual as information ‘about an individual’, so that it is protected by the Privacy Act.
“Electronic fingerprinting techniques are becoming more sophisticated and effective with very sparse information,” ACS said in its submission.
“This, coupled with individuals using or being monitored by electronic devices that are always connected to the internet, means that technical information is possibly more invasive than personal information consciously input by the individual.”
ACS also said that the act of inferring personal information was “an attempt to identify or re-identify an individual without their consent” and that such actions should likewise fall within the threshold of information ‘about an individual’.
Digital Rights Watch also called for inferred data to be included in the definition of “personal information”, warning that unchecked data analysis can have real-world consequences.
“Inferences, predictions, or other assumptions made about people based on the data collected about them can have very real negative consequences, and should be considered as a form of privacy harm,” the organisation said in its discussion paper submission.
“A failure to include generated or inferred information within the scope of the Privacy Act would represent a failure to acknowledge the modern technical realities of data processing.”
It’s an important distinction for the reformed Privacy Act to get right, and one that creates tension between those looking out for the rights of individuals and citizens, and businesses that profit from gathering and analysing user data.
Disagreement over inferred data
ACS’ submission is at odds with parts of the technology and business sectors that wish to continue using user data for their own benefit.
Microsoft said, in its submission to the Privacy Act issues paper last year, that it “does not support” including “inferences” as part of the Act’s definition of personal information, saying it “could have a chilling effect on the use of AI in Australia”.
“This could create confusion and result in significant implementation challenges regarding the collection and use of inferences and predictions generated through artificial intelligence or machine learning that are not tied to a natural person,” Microsoft said.
Facebook similarly diverges from ACS’ position that inferences drawn from individuals’ data should be protected by the Act, saying that inferred information “is not usually generated by or on behalf of an individual” but rather by organisations which, by extension, own the inferred data.
“It is entities who draw inferences about an individual using their proprietary data analysis tools,” Facebook said in its submission last year.
“Companies invest significant amounts of time, money and resources to make these tools incisive, to make their product better and suited to their users' unique needs.
“These datasets are a company’s intellectual property.”
Digital Rights Watch addressed Facebook’s argument, saying it “would only serve their business model and not the protection of individuals’ rights”.