Big data is watching: new power not without its dangers

The NSW government’s plan for a whole-of-government data analytics centre has highlighted the potential of “big data” to influence and inform government policy decisions.

Having spent the past three years integrating its data repositories on a state-of-the-art platform, the government sees the introduction of analytics as a logical next step in its desire to leverage that data more effectively.

Data analytics offers clear benefits for improving services, so it is good to see that a steering committee, composed of NSW’s privacy commissioner, chief scientist, customer service commissioner and information commissioner, has been created to oversee the centre’s establishment.

The involvement of privacy and customer service advocates is absolutely crucial because harnessing big data does have its complications, especially when it comes to the rights and best interests of consumers.

There’s no denying the power of big data. As we move towards a hyper-connected world, with millions of sensors and data feeds measuring everything from social media activity and buying habits to traffic flow, the sheer volume, variety and quality of the data collected will be astronomical.

While much of this data will be anonymous, data analytics increasingly gives organisations the ability to drill down to monitor and understand individual behaviour, often without the awareness of those being observed.

There was a huge kerfuffle in February when it was revealed Samsung smart TVs had the capacity to eavesdrop on private conversations, albeit only if the voice response function was activated by the user.

While much of the controversy was fuelled by poorly worded policy documents from the technology giant, the subsequent media attention did highlight the potential for data-sensing applications to infringe on personal privacy.

Data collection is one side of the coin; the other is how that data is analysed and used to inform business and operational decisions.

Just last month, Senator Arthur Sinodinos published an article entitled “Big data and the Australian government” in which he said: “The right to privacy is fundamental. Parliament must be stringent in overseeing the use of any personal data, ensuring that it is handled appropriately, stored securely and properly de-identified where necessary.”

I’m sure those 37 million subscribers on Canadian infidelity site Ashley Madison, whose personal details were hacked last month, are wishing the company had followed through on its promise to delete their data.

With the volume and variety of data being collected set to increase exponentially, organisations must also be held accountable for how they store and use data.

It’s essential anyone whose privacy is breached by an Australian company has the protection of an up-to-date and relevant legislative framework.

These days, we expect supermarkets and department stores to track our spending habits. We even sign up for loyalty cards to make it easy for them, on the promise of cheap flights or in-store discounts. But how much do consumers understand about what happens to that data and where the boundaries lie?

To what degree do retailers consider the ethical implications of how that data is used?

What are our rights as consumers and what are our responsibilities as ICT professionals and business people?

Back in 2012, as retail data-mining was coming into vogue, Target in the US hit the news for sending catalogues promoting baby products to a 16-year-old girl.

The girl’s father was outraged at the marketing campaign, accusing the store of encouraging the high school student to get pregnant.

The retailer had employed a statistician to develop algorithms targeting newly pregnant women based on changes in their buying patterns so it could market its baby products.
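In essence, that kind of approach scores a shopper’s recent purchases against a set of products whose purchase is correlated with pregnancy, and flags anyone whose score crosses a marketing threshold. The sketch below is a minimal illustration of the idea only; the product list, weights and threshold are invented for the example and are not Target’s actual model.

```python
# Minimal sketch of purchase-pattern scoring. The categories, weights and
# threshold below are illustrative assumptions, not a real retailer's model.

# Hypothetical weights: how strongly each product category is assumed to be
# associated with the behaviour being predicted (here, an expected baby).
SIGNAL_WEIGHTS = {
    "unscented lotion": 0.30,
    "calcium supplements": 0.25,
    "zinc supplements": 0.20,
    "large handbag": 0.15,
    "cotton balls (bulk)": 0.10,
}

ALERT_THRESHOLD = 0.5  # assumed cut-off for adding a shopper to a campaign


def prediction_score(purchases: list[str]) -> float:
    """Sum the weights of any signal products found in a shopper's recent purchases."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in set(purchases))


def flag_for_campaign(purchases: list[str]) -> bool:
    """Return True if the shopper's score crosses the (assumed) marketing threshold."""
    return prediction_score(purchases) >= ALERT_THRESHOLD


if __name__ == "__main__":
    basket = ["unscented lotion", "calcium supplements", "bread", "zinc supplements"]
    print(f"score: {prediction_score(basket):.2f}")  # score: 0.75
    print(f"flagged: {flag_for_campaign(basket)}")   # flagged: True
```

The point is not the arithmetic but the consequence: a shopper can be singled out for marketing on the basis of inferences they never knowingly disclosed.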

As it turned out, the store had identified the girl was pregnant before she told her father.

Given that the girl was just 16 and not yet an adult, the ethics of such targeted marketing are open to debate.

Indeed, Target has reportedly since modified its approach. But in the absence of clear policy guidelines or regulations around the ethical application of data-mining and analytics, people will revert to their personal values, which can vary significantly.

What one person thinks makes good business sense, another person might find inappropriate.

Some analysts recommend the “what would Mum say” test as a way of checking whether a planned action would be considered acceptable by the community.

Unfortunately, there are today too few women in positions of influence for such a test to be applied as widely as it would need to be.

This may lead to a new job emerging from the disruption that technology promises to bring, and one I would argue will be among the most important going forward: that of the ethicist.

This article was first published by The Australian.