YouTube’s ban on anti-vaccination content has been hailed by some as a victory for evidence-based policy, but a leading artificial intelligence (AI) researcher has warned that it will take time before the content is cleansed from the public discourse.
Responding to the ocean of COVID misinformation – including suggestions that vaccines cause cancer or diabetes, are ineffective, or contain tracking devices – the streaming-video behemoth recently committed to a “significant” update that will see it taking down such content and terminating the accounts of anti-vaccine influencers.
Based on its work with experts to develop 10 COVID-19-related policies to prevent misinformation spreading through its site, YouTube said in a blog post that it has removed over 130,000 videos for violating its COVID-19 vaccine policies since last year.
The new guidance “maps to public vaccine resources provided by health authorities and backed by medical consensus”, YouTube said, noting exceptions for videos such as personal testimonials and discussion of “historical vaccine successes or failures”.
Just how the ban will work is not fully clear, since the billions of videos on YouTube are actively and regularly indexed and cross-referenced to fuel the company’s proprietary recommendation algorithms.
Those algorithms have been amplifying anti-vaccination efforts by creating an ‘echo chamber’ effect that, one recent UK study found, was actively increasing vaccine hesitancy among social media users.
That same effect – in which YouTube’s algorithms favour similar content in an iterative process that excludes balancing perspectives – means that simply banning videos won’t necessarily banish every possible recommendation overnight.
Retraining AI in the anti-vax era
YouTube can approach the challenge in two ways, said Professor Juerg von Kaenel, a 36-year IBM Research veteran who was recently appointed director of RMIT University’s newly established Centre for Industrial AI Research and Innovation (CIAIRI).
“Fundamentally, YouTube can do two things [to enforce its policy change],” von Kaenel said. “Changing the learning algorithms will take time, because relearning everything is a humongous effort – and that will happen slowly, and over time.”
A more efficient method would be to develop algorithms that proactively cleanse YouTube’s content index of offending videos, scrubbing the site clean in a process that would need to outpace efforts by anti-vaccine activists to keep uploading new content.
Targeting videos by keywords “gets you 80 per cent there”, he said – but many online posters use euphemisms, sarcasm, or parochial cultural references that are likely to slip past automated tools.
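The limitation von Kaenel describes is easy to see in miniature. The sketch below is a hypothetical illustration only – the blocklist phrases and video titles are invented, and YouTube’s actual systems are proprietary and far more sophisticated – but it shows why naive keyword matching catches explicit misinformation while a euphemism sails straight through.

```python
# Hypothetical example: keyword-based flagging and its blind spot.
# The phrases and titles below are invented for illustration.
BLOCKLIST = {"vaccine hoax", "microchip vaccine", "vaccines cause cancer"}

def flag_by_keywords(title: str) -> bool:
    """Flag a video title if it contains any blocklisted phrase."""
    lowered = title.lower()
    return any(phrase in lowered for phrase in BLOCKLIST)

# Explicit wording is caught by the blocklist...
print(flag_by_keywords("PROOF the Vaccines Cause Cancer"))  # True
# ...but a euphemism carrying the same message slips past.
print(flag_by_keywords("What they won't tell you about the jab juice"))  # False
```

Closing that gap is what pushes platforms toward machine-learned classifiers – which, as von Kaenel notes, take time and enormous effort to retrain.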
There may also be challenges for academics and researchers who use the examples to illustrate the dangers of misinformation, von Kaenel added.
“Some of it is still in there and you have to allow people to explain that it’s misinformation, and to allow samples” for analysis.
“It gets really complicated,” he said, “and that’s where you will probably never be able to do a 100 per cent perfect job.”
Whatever technological controls are put in place, the moves have been heralded as a step in the right direction – even if they are likely to miss many channels in Australia, where vaccine hesitancy has gained particular traction as anti-vaxxers exploit a populace fatigued by ongoing lockdowns.
A May report by Reset Australia correlated a rise in vaccine hesitancy with a 280 per cent increase in membership of Facebook anti-vax groups, whose members have been observed actively targeting minority communities across Australia.
As vaccine-hesitant Australians get ill with COVID-19 and one COVID denier after another admits being brainwashed by online conspiracy theorists, there are signs that science is slowly making progress.
The latest Melbourne Institute Vaccine Hesitancy Tracker found that nationwide vaccine hesitancy fell from 16.7 per cent in early September to 15 per cent by the month’s end.
Just how much YouTube’s ban will change this figure remains to be seen.
“As with any significant update,” YouTube said, “it will take time for our systems to fully ramp up enforcement.”