You may not have realised it, but if you participated in Sydney’s recent City2Surf running event, a biometric face-matching system has already picked you out from thousands of participants, waiting for you to request your photos – unless someone else beats you to it.

Organisers of this year’s race contracted Sportograf, a German sports photography company that has long managed photography for large public events and enables participants to find themselves in photos by entering their bib number.

To increase the likelihood of finding your photos, Sportograf recently added a face recognition feature that lets you take a selfie or upload a photo of yourself, then have it matched against the thousands of photos taken during the event.
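Sportograf keeps its matching algorithm secret, but face-matching systems of this kind generally work by converting each face into a numeric "embedding" vector and declaring a match when two vectors are close enough. The sketch below illustrates that general idea only — the embeddings, photo IDs, and threshold are made-up stand-ins, not anything from Sportograf's actual system.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matches(selfie_embedding, event_photos, threshold=0.8):
    """Return IDs of event photos whose face embedding is close to the selfie's."""
    return [
        photo_id
        for photo_id, embedding in event_photos.items()
        if cosine_similarity(selfie_embedding, embedding) >= threshold
    ]

# Hypothetical embeddings: in a real system these come from a face model.
photos = {
    "IMG_001": [0.9, 0.1, 0.4],
    "IMG_002": [-0.2, 0.8, 0.1],
    "IMG_003": [0.8, 0.2, 0.5],
}
selfie = [0.85, 0.15, 0.45]
print(find_matches(selfie, photos))  # IMG_001 and IMG_003 are close to the selfie
```

Note that nothing in this pipeline verifies *whose* selfie is being uploaded — which is precisely the access-control gap the security experts quoted below are worried about.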

Sportograf calls the face recognition technique “one of our most important company secrets” and only says that it “use[s] modern techniques and algorithms”, crediting the technique with increasing the number of matches for any search by up to 120 per cent.

For all its benefits in helping runners find their photos, some security experts worry that a lack of access controls could leave the capability open to abuse: people could upload other runners' photos to track them and the people they run with, locate former spouses, or even use multiple source images to create deepfakes.

“From a user’s perspective it’s pretty cool because you get all the images of yourself as you run the course,” says Blair Crawford, founder and CEO of Sydney-based biometric identity company Daltrey.

Crawford was among the runners on the day and – despite the natural caution that comes from working as a privacy professional for over 12 years – had no idea he had been “enrolled in some sort of biometric database that is located in Germany” until he received a follow-up email after the race.

Crawford’s experiments with the service revealed a lack of controls over access to the images: someone could upload an image of anybody they were interested in tracking, and the system would bring up all of the photos of that person.

Sportograf – which has photographed over 10 million athletes and takes 30 million images per year – argues that participants in these events implicitly consent to the processing of their personal and biometric data, but Crawford believes organisers could be clearer in letting participants know what will happen to their photographs.

“From a privacy and security perspective,” he told Information Age, “you have to be able to not have consented to these things passively, and [know] the scope for which the data can be used, and by which other people.”

“If you don’t have the appropriate scope defined, then you don’t have the ability to control that scope and not consent to it. It’s a very slippery slope when you start to have a significantly large amount of data about persons without necessarily having the appropriate gates for access.”

They probably are watching you

Increasing use of face recognition has raised hackles amongst privacy groups, which have repeatedly voiced concerns about moves by the likes of Woolworths and Coles to expand their use of video to monitor customers in-store.

Despite increasing their use of cameras, Woolworths and Coles are not, a recent Choice survey found, using facial recognition technology to identify and track individual customers – a position echoed by 15 other major brands.

Those retailers that have dipped into indiscriminate face recognition have quickly retreated, as when Bunnings, Kmart, and whitegoods retailer The Good Guys were recently caught using facial recognition technology to identify their customers.

That investigation was welcomed by consumer group Choice, which said it is concerned that rampant use of facial recognition technology “pose[s] significant risks to individuals” including invasion of privacy, misidentification, discrimination, profiling, and exclusion.

Although Bunnings chief operating officer Simon McDowell initially said the firm was “disappointed by … inaccurate characterisation” of the facial recognition technology – and that the technology was important for staff and customer safety – the outlets suspended the programs after their activities triggered a formal investigation by privacy watchdog the Office of the Australian Information Commissioner (OAIC).

The technology “is used solely to keep team and customers safe and prevent unlawful activity,” McDowell said, calling it “an important tool in helping us to prevent repeat abuse and threatening behaviour towards our team and customers.”

Choice consumer data advocate Kate Bower was unconvinced, flagging retailers’ reliance on online privacy policies and small in-store signage as “insufficient and non-compliant”, arguing that the revelations “struck a nerve with the Australian community”, and saying the outcome “could set the standard for how businesses treat customers and their data.”

Widespread surveillance remains highly controversial, with privacy concerns raised about the use of automatic face matching for the recent City2Surf running event and mass facial-matching firm Clearview AI sanctioned by regulators after conducting widespread face recognition on large numbers of images.