The widespread and escalating use of high-tech workplace surveillance is harming employees, with laws around the country failing to keep pace, a new inquiry has found.
The Victorian Legislative Assembly Economy and Infrastructure Committee has handed down its report into workplace surveillance, detailing a range of tech-based ways Australian companies are monitoring their workers, and the impact of this.
These methods include requiring the use of an app that continuously hoovers up private data on workers, an AI model that analyses facial expressions to work out whether someone is concentrating, and neurotechnology tools that claim to measure a person’s level of attention.
The government-led committee called for new workplace surveillance laws that require all monitoring to be reasonable, necessary and proportionate to achieving a legitimate aim, and for workers to be notified of, and consulted about, any workplace surveillance practices.
“In a short space of time, surveillance has advanced beyond camera footage and the recording of telephone calls to incorporate keylogging, wearable trackers, biometrics, neurotechnology and artificial intelligence,” Labor MLA and committee chair Alison Marchant said.
“Victoria should not wait to strengthen protections around workplace surveillance but instead lead the way with dedicated, principles-based legislation.”
The impact on workers
The inquiry found that the most common forms of workplace surveillance involve the use of computers, webcams, mobile phones and handheld scanners to gather data on work activities and to track a worker’s location and the speed at which they complete tasks.
More sophisticated tools are on offer too, such as the use of AI to process workplace data to reach conclusions about an individual’s performance.
The use of these tools can be harmful for employees and create an unsafe workplace, the inquiry found.
“Workplace surveillance that is excessive and lacks transparency has been shown to have a negative impact on employees’ morale, job satisfaction and commitment to their organisation,” Marchant said.
“It has also been shown that it can intensify work, adversely affect employees’ mental and physical health and exacerbate the power imbalance between employers and employees.”
High-tech tactics
The committee inquiry examined the Commonwealth Bank’s internal staff app, Navigate, which employees must install on their personal devices to access buildings, book workstations, register visitors and report faults.
The inquiry also found that the app continuously collects data on workers’ precise location, and that some workers have even been asked to apply for leave when they are away from their desks or appear to be unproductive.
In its response to questions from the committee, the Commonwealth Bank acknowledged that it uses the app to monitor staff locations, and that it also uses a range of other surveillance measures to monitor its buildings, web browsing, emails, and interactions with customers.
The inquiry also heard about an AI model created by Fujitsu that detects small changes in a person’s facial expressions to work out whether they are concentrating, and about wearable devices that monitor conversations, including how enthusiastically someone is speaking.
Workplaces are also using neurotechnology to assess their employees’ cognitive states, such as their levels of attention and effort.
A submission to the inquiry also detailed the use of AI to analyse conversations employees were having.
On one occasion, while building rapport, a financial adviser said that “unfortunately it’s been really rainy lately”.
The AI tool marked this conversation with a “sad face”, leading to a disciplinary hearing because of the use of the word “unfortunately”.
Laws lagging
The Victorian committee found that the state’s workplace surveillance laws have not been updated since 2006 and have not kept pace with technologies such as these.
It recommended the introduction of “principles-based workplace surveillance laws” that are technology-neutral and require employers to show that any workplace surveillance is “reasonable, necessary and proportionate to achieving a legitimate aim”.
These laws should also require companies to notify their workers of this surveillance and consult with them before any changes are made.
And when AI is used in this process, the laws should require a human to review any significant decisions made by an automated system, the committee said.
In its report, the committee also criticised a lack of participation in the inquiry by large employers and companies, with only one firm making a submission and none accepting an invitation to appear at a public hearing.
The committee said that the likes of Amazon, Australia Post, Optus, Uber and the big four banks did not respond to a call for submissions and declined to appear at a hearing.
There have been some recent examples of workers fighting back over digital surveillance.
Late last year, Woolworths warehouse workers went on strike for more than two weeks in part over AI surveillance and automation, eventually claiming victory.
Also last year, an Apple worker filed a lawsuit accusing the tech giant of “systematic invasion of employee privacy” and the monetisation of the personal data of its workers.