Mandatory face masks have helped contain the COVID-19 surge in Victoria – but as masks increasingly become part of the ‘new normal’, businesses and authorities face fresh challenges, and face-recognition vendors are racing to update algorithms that can’t cope with covered faces.
Masks were contentious well before COVID-19: Hong Kong authorities banned protestors from wearing them to evade the watchful eye of artificial intelligence (AI)-powered surveillance cameras.
Those surveillance technologies have become a fixture of public safety, with AI-powered cameras increasingly helping police monitor crowds and rapidly identify potential offenders.
But ongoing testing by the US National Institute of Standards and Technology (NIST) has shown that the accuracy of conventional face-recognition algorithms – now widely used in everything from smartphones to Customs systems and surveillance cameras – declines dramatically when simulated masks are digitally added to test photos of subjects.
False non-match rates – the rate at which an algorithm fails to recognise two photos of the same person as a match – surged tenfold or more in NIST’s Ongoing Face Recognition Vendor Test (FRVT) when 138 different face-recognition algorithms were tested against a data set of more than 6.2 million photographs.
A range of digitally applied mask shapes, colours, and levels of nose coverage was tested, and recently submitted algorithms showed wildly different results as developers tried – with varying degrees of success – to adapt to increasingly common face-mask wearing.
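To picture how such digital masking can be simulated, consider the short Python sketch below – a rough illustration only, not NIST’s actual tooling. It uses the Pillow imaging library to paint a flat-coloured polygon over the lower portion of a face photo; the landmark positions, mask colour, and file names are assumed placeholders rather than anything drawn from the NIST protocol.

# Illustrative only: overlay a flat-coloured 'mask' polygon on a face photo.
# Landmark positions are crude fractions of the image size; a real pipeline
# would place the polygon using detected facial landmarks.
from PIL import Image, ImageDraw

def apply_synthetic_mask(photo_path, out_path, colour=(70, 70, 70)):
    img = Image.open(photo_path).convert("RGB")
    draw = ImageDraw.Draw(img)
    w, h = img.size

    # Placeholder anchor points for the mask polygon.
    nose_bridge = (0.50 * w, 0.55 * h)
    left_jaw = (0.25 * w, 0.68 * h)
    right_jaw = (0.75 * w, 0.68 * h)
    chin = (0.50 * w, 0.92 * h)

    # Raising the nose_bridge point covers more of the nose; changing
    # `colour` mimics the light-blue versus black comparison in the tests.
    draw.polygon([left_jaw, nose_bridge, right_jaw, chin], fill=colour)
    img.save(out_path)

apply_synthetic_mask("probe.jpg", "probe_masked.jpg")  # placeholder file names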
False non-match rates of around 0.3 per cent increased to an average of around 5 per cent when masks were applied, with some algorithms failing to authenticate up to half of the images they were presented with.
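To make those figures concrete: a false non-match rate is simply the proportion of genuine, same-person comparisons whose similarity score falls below an algorithm’s decision threshold. The Python sketch below uses made-up similarity scores – not NIST data – to show how a modest drop in scores for masked photos can push that rate from a fraction of a per cent to several per cent.

# Illustrative only: compute a false non-match rate (FNMR) from hypothetical
# similarity scores for same-person photo pairs, unmasked versus masked.
import numpy as np

def false_non_match_rate(genuine_scores, threshold):
    # Fraction of same-person comparisons rejected at the given threshold.
    return float(np.mean(np.asarray(genuine_scores) < threshold))

rng = np.random.default_rng(0)
unmasked_scores = rng.normal(loc=0.85, scale=0.08, size=10_000)  # made-up
masked_scores = rng.normal(loc=0.70, scale=0.12, size=10_000)    # made-up

# Thresholds are typically fixed so that the false MATCH rate stays very low.
threshold = 0.55
print(f"FNMR unmasked: {false_non_match_rate(unmasked_scores, threshold):.2%}")
print(f"FNMR masked:   {false_non_match_rate(masked_scores, threshold):.2%}")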
Interestingly, one analysis of the testing concluded, full-face-width masks – such as widely used disposable surgical masks – blocked twice as many recognition attempts as rounder N95-type masks, which don’t obscure as much of the face.
Black masks were more likely to block face-recognition attempts than light blue masks, for reasons the researchers couldn’t identify but flagged as important for future research.
Anonymity in the COVID-19 era
The results of ongoing face-mask testing are more than academic: as societies grapple with the ‘new normal’ and cities like Melbourne seem poised to require universal mask wearing for some time, businesses must revisit their security protocols around identifying individuals.
In the US, reports suggested security-conscious businesses, including banks and casinos, were requiring customers to briefly lower their face masks for identification by security cameras – and calling police when customers failed to comply.
Such businesses have long banned motorcycle helmets and many other face coverings to ensure that patrons can be identified; however, given the strict public-health directives requiring masks be continuously worn while outside, they have gradually moved towards a compromise.
The Commonwealth Bank, for example, asks customers to “please wear a mask and maintain physical distancing” if they visit a branch, while NAB advises that customers are “welcome to come into our branches with a face mask, but we may ask you to remove it for identification purposes.”
Competitors like ANZ and Westpac offer no specific guidance on mask wearing – but the issue extends far beyond banks, because similar concerns apply in any public place, as highlighted in a 2017 bill proposed by Senator Jacqui Lambie.
“Full face coverings conceal the identity of the wearer, disrupting the authorities’ ability to track down a perpetrator in the event of a crime,” notes the bill, which would have criminalised those who “intimidate or force an adult into wearing identity concealing garments”.
Fast-forward three years, and millions of Australians are now required to wear identity-concealing masks on a daily basis – and face-recognition vendors face a mammoth task in adapting to the post-COVID normal.
Research has already shown that the algorithms can be tricked in certain circumstances, but firms like Tryolabs are making progress by refining surveillance techniques to automatically monitor whether individuals are complying with public-health guidance.
Mask awareness is being built into everyday devices that use face recognition, with even Apple recently releasing a software update that can detect when a user is wearing a mask.
Greater use of the technology, Tryolabs’ founders believe, could make monitoring of public-health compliance easier – helping single out individuals who refuse to wear face coverings, or providing data to shape public-health authorities’ policymaking.
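For a rough sense of what automated compliance monitoring involves – a toy heuristic, not any vendor’s actual method – the Python sketch below detects faces in an image with OpenCV and guesses whether each lower face is covered by comparing its average colour against the upper face. Production systems instead rely on detectors and classifiers trained specifically on masked faces.

# Toy illustration only: flag faces whose lower half differs sharply in
# colour from the upper half, as a crude stand-in for a trained mask detector.
import cv2
import numpy as np

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def looks_masked(face_bgr, diff_threshold=25.0):
    # Compare the average colour of the upper and lower halves of the face crop.
    h = face_bgr.shape[0]
    upper = face_bgr[: h // 2].reshape(-1, 3).mean(axis=0)
    lower = face_bgr[h // 2 :].reshape(-1, 3).mean(axis=0)
    return float(np.linalg.norm(upper - lower)) > diff_threshold

def compliance_flags(frame_bgr):
    # Note: this generic face detector itself misses many masked faces,
    # which is part of why purpose-trained models are used in practice.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [looks_masked(frame_bgr[y:y + h, x:x + w]) for (x, y, w, h) in faces]

frame = cv2.imread("crowd.jpg")  # placeholder image path
flags = compliance_flags(frame)
print(f"{sum(flags)} of {len(flags)} detected faces appear to be masked")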
Yet as the surveillance state adapts to mask wearing, surging concern over widespread facial recognition may reignite a debate that has already seen the likes of Facebook and Instagram sued, tech giants pause their facial-recognition efforts, and police back away from overuse of the technology in public.