AI facial recognition oversight lagging far behind technology, watchdogs warn
Exclusive: Biometrics commissioners say face-scanning not as effective as claimed and new laws needed to regulate use
Britain’s biometrics watchdogs have warned that national oversight of AI-powered face scanning to catch criminals is lagging far behind the technology’s rapid growth.
With the Metropolitan police almost doubling the number of faces they scan in London over the past 12 months and a rising use of the technology by retailers in the UK, Prof William Webster, the biometrics commissioner for England and Wales, said the “slow pace of legislation was trying to catch up with the real world” and “the horse had gone before the cart”.
Dr Brian Plastow, who holds the same role in Scotland, warned the technology was “nowhere near as effective as the police claim it is” and said there was a “patchwork legal framework” throughout the UK. He said in England and Wales, police were “really just marking their own homework”.
The watchdogs said new laws were needed to govern when and how police forces used live facial recognition technology, with a new regulator to clamp down on misuse.
Several bodies have oversight of the technology, including the Information Commissioner’s Office (ICO) and the Equality and Human Rights Commission.
The Home Office is considering a new legal framework for the technology as it also plans to introduce nationally what it calls “the biggest breakthrough for catching criminals since DNA matching”.
Members of the public wrongly labelled as suspected criminals by shops using AI cameras said there was no accountability or recourse to complain. They said the system had left them feeling “guilty until proven innocent”.
They described the ICO, which is responsible for monitoring facial recognition tech and the biometric data it uses, as “toothless” and unresponsive.
British police forces and high street retailers claim the technology makes streets safer, but others criticise it as Big Brother-style mass surveillance, with risks for civil liberties and data privacy.
So far this year the Met has scanned more than 1.7 million faces in London hunting for suspects on watchlists, up 87% on the same period last year.
It has also emerged:
An independent audit of the Met’s use of facial recognition technology (FRT) has been indefinitely postponed after the police requested delays.
Polling shows 57% of people believe the systems are “another step towards turning the UK into a surveillance society”.
A whistleblower claimed shop-based face-scanning systems had sometimes been misused by shop or security staff “maliciously” adding members of the public to watchlists.
Webster said: “We could be talking three years, at a minimum, before regulation is in place and active. And we already have a rollout of live face recognition in a dozen different police forces.
“The technology is becoming cheaper and cheaper, and in time we will see it everywhere, including in the static surveillance camera network.”
In February, the Guardian revealed how police arrested a man for a burglary in a city he had never visited after face-scanning software deployed across the UK confused him with another person of south Asian heritage.
Several other people have told the Guardian about the impact of being misidentified by face-scanning software increasingly used by retailers to fight shoplifting.
Further concern about limited scrutiny of the fast-developing technology has been caused by the postponement of the ICO’s planned audit of the Met’s use of AI-powered face scanning to find wanted criminals.
The ICO, which is the UK’s data regulator, had scheduled the investigation for October last year, but the Met asked for it to be pushed back, according to emails obtained by the Guardian under the Freedom of Information Act.
They show the Met cited three reasons for the delay: the need to handle a legal challenge to its face-scanning policy, which a court decided in its favour last week; officers taking Christmas leave; and the burden of policing new year festivities.
The ICO accepted the Met’s request, and the investigation is no longer certain to go ahead, prompting claims the regulator is being “insufficiently aggressive”.
David Davis MP, the former shadow home secretary and a civil liberties campaigner, said: “[FRT] is a massive development with all sorts of implications. The ICO should be the defender of the ordinary citizen and should be far more aggressive in what it does.”
The ICO and the Met said the timing of the judicial review meant it was appropriate to postpone the proposed audit.
The Met said: “We have always been transparent about our use of facial recognition technology and welcome independent scrutiny.” The ICO said it was reviewing whether the audit would be rescheduled.
Polling of 2,000 adults last month by Opinium found that nearly a third opposed the use of facial recognition by retailers. In addition, 62% worried about the technology getting people into trouble for things they had not done, according to the poll, commissioned by Face Int, a biometric security company.
Face-scanning software is increasingly being used by retail chains to target shoplifting and antisocial and violent behaviour in stores. Sainsbury’s, Budgens and Sports Direct are among the chains using Facewatch in some shops.
The technology analyses CCTV footage and compares faces against a private database of known offenders, alerting staff when a match is made.
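In broad terms, systems of this kind reduce each face in the video feed to a numerical embedding and raise an alert only when its similarity to an embedding stored on the watchlist clears a threshold. The sketch below is a minimal illustration of that matching step; the function names, the 0.6 threshold and the use of cosine similarity are assumptions invented for the example, not details of Facewatch’s or any other vendor’s implementation.

```python
# Illustrative sketch only: a generic embedding-and-threshold matcher.
# All names and values are assumptions, not any vendor's actual system.
import numpy as np
from typing import Dict, Optional

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity between two face-embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: Dict[str, np.ndarray],
                            threshold: float = 0.6) -> Optional[str]:
    # Return the watchlist entry whose stored embedding best matches the
    # captured face, provided the similarity clears the alert threshold.
    # None means no alert is raised.
    best_id, best_score = None, threshold
    for entry_id, stored_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = entry_id, score
    return best_id

# Toy usage: random vectors stand in for embeddings from a face-recognition model.
rng = np.random.default_rng(0)
watchlist = {"entry-001": rng.normal(size=128), "entry-002": rng.normal(size=128)}
captured = rng.normal(size=128)
print(check_against_watchlist(captured, watchlist))  # likely None for random vectors
```

The threshold governs the trade-off the article describes: set it lower and the system flags more genuine matches but also produces more misidentifications of the kind reported by shoppers.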
Big Brother Watch, a civil liberties campaign group, said it had been contacted by 21 people during the past year who believed they had been wrongly placed on watchlists or misidentified.
Ian Clayton, a retired health and safety professional from Chester, was asked to leave Home Bargains in February after being told he had been flagged on a facial recognition system as a thief. He later found out he had been wrongly associated with a shoplifter he had happened to stand next to on a previous visit.
“It feels very Orwellian,” he said. “We’re constantly being recorded and put on these systems but should we be there? It feels like spying without cause. It left me feeling vulnerable, exposed and a little bit helpless. I’m hyper-aware of cameras now.”
The same thing happened to Warren Rajah, a data strategist in south London, on a visit to Sainsbury’s. “This is a civil rights issue that we are slow-waltzing into,” he said. “We know cameras cannot pick up features of people that have darker features with as much accuracy.”
Meanwhile, a whistleblower has claimed the systems have sometimes been misused by shop or security staff “maliciously” adding members of the public to watchlists even though they have not been caught doing anything wrong.
Paul Fyfe, a former security guard who worked using Facewatch cameras in Stockton-on-Tees until last September, said in some cases staff had tagged members of the public on watchlists even when they had not been caught shoplifting or committing violence.
“If you’ve got someone there that you’re pissed off with, that you can’t catch or you’re getting chew off [being hassled] or they are threatening you, the easiest way to harm them is to upload them on the system,” he said. “[On] 10 to 15 occasions, I know people have been tagged for malicious reasons.”
The result was that security guards in other stores using the same software would be alerted whenever those people entered.
Facewatch’s CEO, Nick Fisher, said: “We do not recognise the claims that the incident reporting system is being misused, including the serious allegation that individuals are being added maliciously.
“The system has been purposely designed not to allow misuse, and we have strict rules governing how the system can be used, with safeguards and controls built in. Retailers must meet clear evidential standards before submitting a record, and every submission is subject to human review before any individual is added to the database. If a submission does not meet the required standard, it is rejected and returned to the retailer.”