Data algorithms help you to find the cheapest petrol, map the fastest route to work and get the best deal on flights.

But are they helping big companies rip you off?

That is what the ACCC is looking to find out, following the announcement of its Data Analytics Unit on Thursday.

Speaking at an event in Sydney, ACCC Chair Rod Sims outlined the changes that will protect against big-data e-collusion.

“Concern has been raised by some that the way prices are determined, and potentially collusive outcomes are achieved, is changed by machine learning algorithms,” he said.

“It is said that a profit-maximising algorithm will work out the oligopolistic pricing game and, being logical and less prone to flights of fancy, stick to it.”
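To make that concern concrete, here is a minimal, purely hypothetical sketch in Python. Nothing in it is drawn from any real pricing system; the rule, prices and sellers are all illustrative assumptions. It shows how two independent profit-maximising pricing rules, each simply refusing to undercut a rival that is not undercutting them, can drift up to a high price and hold it without any agreement being made.

```python
# Purely hypothetical sketch of tacit algorithmic price coordination.
# All numbers and the pricing rule are illustrative assumptions.

HIGH, LOW, STEP = 10.0, 8.0, 0.5   # ceiling price, competitive floor, increment

def next_price(mine, rival):
    """Naive rule each seller runs for itself: if the rival is not undercutting
    me, edge my price up toward the ceiling; if it is, match the rival rather
    than start a price war."""
    if rival >= mine:
        return min(HIGH, mine + STEP)
    return max(LOW, rival)

def simulate(rounds=12, price_a=LOW, price_b=LOW):
    history = [(price_a, price_b)]
    for _ in range(rounds):
        # Sellers reprice one after the other, each reacting to the latest
        # price it can observe from the other.
        price_a = next_price(price_a, price_b)
        price_b = next_price(price_b, price_a)
        history.append((price_a, price_b))
    return history

if __name__ == "__main__":
    for step, (a, b) in enumerate(simulate()):
        print(f"round {step:2d}: seller A = {a:.2f}  seller B = {b:.2f}")
    # Within a few rounds both sellers sit at the ceiling price and stay there:
    # a collusive-looking outcome reached with no explicit deal between them.
```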

The Data Analytics Unit will work specifically to protect consumers from incidents of e-collusion.

The announcement comes after the Productivity Commission earlier this year released its Inquiry Report into Data Availability and Use, which highlighted the ways in which data is changing decision-making and the need for stronger regulation of how data is used.

Sims called on Australia to follow the advice of the report and protect consumers where needed.

“The Productivity Commission states unequivocally that ‘marginal changes to existing structures and legislation will not suffice’ and consumers would need to embrace any such right,” he said.

“The ACCC considers that while industry and experts might be best placed to start the journey, it will need strong legislative backing and a regulator prepared to put competition and consumer issues to the forefront.”

NSW Chief Data Scientist Ian Oppermann said that regulation of the industry was needed to avoid e-collusion.

"The world of data analytics is opening new frontiers of possibility. As smart algorithms find new insights in data, it becomes ever more important to ensure we maintain the quality of these algorithms," he said.

"Just as the advent of cars, planes and even the legal entity of businesses have changed our lives, an algorithm should be scrutinisable to ensure fairness and performance."

Creating liability

Sims pointed out that, as deep learning and artificial intelligence systems continue to develop, the question of liability becomes murkier.

“There is one observation I will make in relation to this discussion,” he said about the accountability of automated systems. “In Australia, we take the view that you cannot avoid liability by saying ‘my robot did it.’”

He cited the 2013 Google AdWords case, in which sponsored links for certain companies were displayed when a user searched for a rival company: a search for ‘Harvey World Travel’, for instance, would return a sponsored link for STA Travel.

While the High Court ruled in favour of Google, on the basis that it did not create or produce any of the sponsored links, the companies that had misused them were still found liable, and Google was ultimately forced to ensure the practice no longer happened worldwide.

The Data Analytics Unit will aim to prevent such anti-competitive algorithms from arising.

Harper changes offer protection

The 'legal hook' that would allow the ACCC to act if such collusion did occur in the future relates to the recent Harper changes.

In October this year, Parliament passed the Competition and Consumer Amendment (Competition Policy Review) Bill, following recommendations from the 2015 Harper Competition Policy Review.

The changes strengthen the prohibition on the misuse of market power by introducing a purpose-or-effects test, which bars corporations from “engaging in conduct with the ‘purpose, effect or likely effect’ of substantially lessening competition.”

Sims explained that under these new changes, it would be easier for the ACCC to prohibit robots from colluding.

“I expect that, no matter how anti-competitive conduct occurs, there will now be a legal hook allowing the ACCC to take appropriate enforcement action to address the resulting competitive detriment,” he said.