With artificial intelligence increasingly acting like a boss for some workers, efforts to explain the decisions made by algorithms could actually be making things worse for employees in the gig economy, according to a new Australian study.

The research by a team of academics from Sydney’s Macquarie University found that efforts to make the use of AI in management more transparent and explainable can make workers more stressed if not done properly.

The large experimental study, which included a survey of more than 1,100 gig economy workers, found that while improving AI transparency was meant to help workers, doing so with too much information could be detrimental.

“We tend to think ‘the more information the better’ … but for a delivery rider weaving through traffic, that’s not true,” report co-author and Macquarie Business School associate professor Miles Yang told Information Age.

“Our research shows that if you give workers too many types of explanations at once — like dumping a detailed data log plus a complex hypothetical scenario on them — it backfires; it creates ‘cognitive overload’.

“Instead of being treated fairly, they just feel overwhelmed and confused.

“It’s a classic case of good intentions leading to a bad user experience.”

‘Less is more’

When it comes to the gig economy, AI systems now often act as management, making important decisions about workers’ hours, deactivations, and complaints.

“We see gig workers everywhere – delivering food, driving cars – and they are all managed by algorithms, not humans,” Yang said.

Gig workers are increasingly required to track and manage their work using mobile apps with AI integrations. Image: Shutterstock

The Macquarie University study looked at the types of information provided to gig economy workers when a decision was made about them by AI tools.

Some explanations offered factual details, such as exactly how late a delivery was, while others outlined hypothetical alternatives, such as showing how a rider could have taken a different route to avoid being late.

The research found that while either type of explanation could be useful on its own, providing both simultaneously could be problematic.

“Less is more,” Yang said.

“Don’t just data-dump on workers in the name of transparency — platforms need to be smart about how they explain things.

“Our study suggests that counterfactual explanations are super effective, but they need to be delivered simply.

“Don’t clutter the screen with every single data point — give them the insight they need to improve and strip away the rest.”
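As a rough illustration of the distinction the researchers draw — between a factual “data dump” and a single counterfactual message — here is a minimal Python sketch. The Delivery fields, route names, and timings are hypothetical examples, not data or code from the study or any delivery platform.

```python
from dataclasses import dataclass

@dataclass
class Delivery:
    """Hypothetical record of a completed gig delivery (illustrative only)."""
    promised_minutes: int
    actual_minutes: int
    route_taken: str
    faster_route: str            # assumed alternative route from map data
    faster_route_minutes: int

def factual_explanation(d: Delivery) -> str:
    """'Data dump' style: every raw number behind the lateness decision."""
    return (
        f"Promised: {d.promised_minutes} min, actual: {d.actual_minutes} min, "
        f"delay: {d.actual_minutes - d.promised_minutes} min, route: {d.route_taken}."
    )

def counterfactual_explanation(d: Delivery) -> str:
    """One actionable 'what could have been different' message."""
    saved = d.actual_minutes - d.faster_route_minutes
    on_time = (
        " and kept the delivery on time"
        if d.faster_route_minutes <= d.promised_minutes
        else ""
    )
    return (
        f"Taking {d.faster_route} instead of {d.route_taken} "
        f"would have saved about {saved} minutes{on_time}."
    )

if __name__ == "__main__":
    late_job = Delivery(
        promised_minutes=30, actual_minutes=42,
        route_taken="Main St", faster_route="River Rd", faster_route_minutes=29,
    )
    # The study's finding, loosely paraphrased: send one clear, actionable
    # message rather than both explanations at once, to avoid overload.
    print(counterfactual_explanation(late_job))
```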

Dealing with a ‘hostile AI gatekeeper’

Alexi Edwards, a gig economy worker and the national food delivery delegate at the Transport Workers Union (TWU), said there were “really big problems” when it came to AI management, and he often had to deal with a “hostile AI gatekeeper” in order to reach a human support officer.

“The support AI has been trained to minimise escalations to human support and will attempt to get the [DoorDash delivery worker] to accept a worry-free unassign without pay but also unaffected completion rate,” Edwards told Information Age.

“Attempts to escalate to a human support officer are met with such hostility from the AI as it insists that it will be able to resolve the issue.

“Upon being asked to escalate to a human, the AI will try to pretend to be a human before you ask it to escalate again.

“This is a predatory practice that isn’t the AI’s fault but comes from the way that it has been programmed and trained.”

DoorDash did not respond by deadline to a request for comment.

There have been some landmark moments in recent years in the push for better rights for gig economy workers in Australia.

In November 2025, gig economy giants Uber Eats and DoorDash agreed to a deal with the TWU to provide minimum pay to their delivery workers, cover insurance, and provide dispute resolution.

New protections introduced by the federal government in 2024 allowing “employee-like workers” such as Uber drivers to challenge unfair deactivations have also been used several times at the Fair Work Commission.

Last year Uber was ordered to repay the lost wages of a driver who the Commission ruled had been unfairly kicked off the platform.