Big tech companies’ search algorithms are closely guarded secrets, but Australian researchers hope to see inside the black box by recruiting thousands of ‘citizen scientists’ for a year-long research project searching for bias in search results.
The project – driven by Queensland University of Technology (QUT) and Monash University researchers within the ARC Centre of Excellence for Automated Decision-Making + Society, in conjunction with global non-profit research organisation AlgorithmWatch – emerged from concerns that social media and search giants were being manipulated by special interests to give different results to different users.
“Different people entering the same search terms on computers that had compiled a history about their activity would sometimes get different information,” Professor Mark Andrejevic of the Monash University School of Media, Film, and Journalism told Information Age.
“That was perceived to be a concern in the political realm: what if information was being customised in ways that might increase political polarisation or give people completely different views of the world?”
“It seems important to be able to understand whether people are getting the same information or different information.”
Some 490 participants have already donated over 16.2 million search results to The Australian Search Experience (TASE) project since it began beta-testing its browser plugin in March, and Andrejevic hopes thousands more will sign up over the next year.
That plugin – which only works on desktop Chrome and Firefox browsers at this point – does not collect any personal information and does not monitor users’ personal web activity.
Rather, it has been designed to test the hypothesis – that search algorithms bias results based on individual users’ profiles – by running predetermined web searches using participants’ browsers, collating the results and sending them back to researchers.
By running the same search from computers with thousands of different search histories and individual user profiles, the investigators hope to identify inconsistencies in search results that could elucidate whether and how search algorithms shape the results they deliver.
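The collection approach described above can be sketched in a few lines of code. This is a minimal illustrative sketch only, not the project's actual plugin: the query list, function, and field names are assumptions made for the example.

```python
from datetime import datetime, timezone

# A fixed query list issued from every participant's browser (illustrative).
PREDETERMINED_QUERIES = ["australian election", "climate change", "covid vaccine"]

def collect_donation(search_fn, queries=PREDETERMINED_QUERIES):
    """Run the fixed query list through `search_fn` and collate the results.

    `search_fn` stands in for the participant's browser performing a search;
    it takes a query string and returns an ordered list of result URLs.
    No personal browsing history is read -- only the results of these
    predetermined searches are packaged for the researchers.
    """
    donation = []
    for query in queries:
        results = search_fn(query)
        donation.append({
            "query": query,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            # Rank order is what lets researchers compare result sets
            # across participants with different profiles.
            "ranked_urls": list(results),
        })
    return donation

# Example with a stubbed search function (no real network calls):
fake_results = lambda q: [f"https://example.com/{q.replace(' ', '-')}/{i}" for i in range(3)]
payload = collect_donation(fake_results)
print(len(payload), payload[0]["query"])
```

Because every participant runs the same predetermined queries, any divergence in the ranked result lists can be attributed to differences in user profiles rather than differences in what was searched.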
“Large search engines and social media platforms have quite clear visibility into how their systems are working,” Andrejevic said, “but we who use these providers to inform ourselves about the world have got very little idea what takes place behind the scenes when we enter a search term.”
Many versions of the truth?
Claims of search-engine bias have persisted for years: the Wall Street Journal, for one, reported in 2019 that Google “has increasingly re-engineered and interfered with search results” based on pressure from businesses, interest groups, and individual governments.
“Far from being autonomous computer programs oblivious to outside pressure,” the WSJ noted, “Google’s algorithms are subject to regular tinkering from executives and engineers who are trying to deliver relevant search results, while also pleasing a wide variety of powerful interests.”
Tweaks to Google’s algorithms would, for example, favour larger businesses over smaller ones, and the company was spotted making changes that favoured major web properties such as eBay, Amazon, and Facebook.
Concerns over search manipulation led the Australian Competition and Consumer Commission (ACCC) to propose a digital platforms regulator whose responsibilities would, chair Rod Sims said, include “serious scrutiny of the algorithms of Facebook and Google” and an inquiry into ad-technology systems “so that we can bring some clarity as to who is making money at what level”.
Google isn’t alone: Facebook, for one, tightened advertising restrictions following revelations by Reset Australia that the company was profiling young children – and allowing advertisers to target them – based on their interest in age-inappropriate activities such as smoking, gambling, extreme weight loss, and alcohol.
Allegations of search tampering took on a life of their own during the propaganda-soaked Trump presidency, as extremists from both sides took their ideological battles online with alacrity.
One psychologist, for example, claimed that Google results were favouring leftist agendas – an argument supported by an ex-Google whistleblower – while one academic study into the search engine manipulation effect (SEME) concluded that “biased search rankings can shift the voting preferences of undecided voters by 20% or more”.
SEME’s potential role in biasing people’s understanding of key issues has been demonstrated many times – and Andrejevic expects any such bias to be visible in the tens of millions of data points he hopes will be collected during TASE’s year-long run.
“Concerns have been voiced that commercial platforms may prioritise stickiness and engagement over things like accuracy and commitment to democratic or civic values,” he said.
“By opening the black box to see behind the screen, the goal is to get some accountability for quite powerful companies that have such an important role in shaping our information worlds.”