The London Metropolitan Police’s use of facial recognition for what it calls “precision policing,” based on “bespoke” watchlists, has resulted in 17 arrests, according to the BBC. The arrests come on the heels of 42 face biometrics-assisted arrests in February, leading critics to level accusations of Orwellian surveillance tactics.
The Met police force, which has benefited from a government investment of £230 million (approximately US$295 million) in drones and facial recognition technology, has responded to those allegations in a mildly hurt, how-could-you tone.
“We do not keep your data. If there is no match, your data is immediately and automatically deleted in seconds,” says Lindsey Chiswick, director of intelligence for the Met and national lead for facial recognition. “An independent study has confirmed the algorithm we use is accurate. Through this testing we also understand how to use the technology in a fair way with no bias in relation to race or gender.”
“Ultimately live facial recognition technology is here to keep Londoners safe through accurately identifying people the police want to talk to.”
The question of who, exactly, police want to talk to is what privacy watchdogs are concerned about. One 23-year-old man flagged by the system for possession of points and blades was later found to have six rounds of ammunition, stolen mobile phones, a large quantity of cannabis and a stolen Oyster card linked to a robbery in 2022. Other arrests were variously for assault, burglary, theft, pickpocketing, breaching court-imposed conditions, fraud, threatening behavior and obstructing a constable.
Calling these arrests “precision policing” is dubious at best, says Madeleine Stone, senior advocacy officer for Big Brother Watch.
“Rather than actively pursuing people who pose a risk to the public, police officers are relying on chance and hoping that wanted people happen to walk in front of a police camera.”
A news release on the Met police’s website says the facial recognition system “identifies people who are on a bespoke watchlist which can include those who are wanted, have outstanding arrest warrants as issued by the court, or to ensure a person is complying with their conditions.”
But how bespoke is a watchlist that identifies pickpockets and those “obstructing a constable” – and what size is the database that the bespoke system draws upon for its biometric data?
Scottish commish says police are holding too many faces
Scottish Biometrics Commissioner Dr. Brian Plastow has raised concerns about how many images police are holding for law enforcement purposes. An article in Police Professional says Plastow explained his issue in an assurance review report presented to the Scottish Parliament on March 25. The databases for facial recognition data and face biometrics, he says, are not as well defined as those for DNA or fingerprints, meaning no one is quite sure of the total volume of biometric facial images held by UK police.
The commissioner, however, has a guess. “A key finding from this review is that the volumes of images held are significantly higher than the total volumes of all other biometric data types combined. And while the total number is not possible to determine, I would estimate that there are at least three million images being held by Police Scotland.” Despite the large number of images held, Plastow says none of the agencies covered in the assurance review hold “meaningful management information on image volumes, including any metrics pointing to their effectiveness.”
Plastow intends to push the issue of retention in an upcoming Review of the Laws of Retention of Biometric Data in Scotland, which is being conducted in partnership with the Scottish Government. But it is not the only issue. Plastow also pointed to the elimination of the Biometrics and Surveillance Camera Commissioner for England and Wales as an example of a move that risks further eroding the public’s trust in how law enforcement uses biometric data.
The commissioner has been outspoken in his criticism of the use of facial recognition by law enforcement.
Ireland adds assaults on police to proposed facial recognition uses
In Ireland, politicians are making the case that the schedule of offences eligible for investigation using facial recognition should include assaults on police, given the increased level of abuse that Garda officers face in the line of duty.
The Irish Times reports that Minister for Justice Helen McEntee is moving to enable the use of facial recognition technology to investigate attacks against Garda members. An amendment to the Facial Recognition Technology Bill would add assault against a Garda or Defence Forces member to the list of crimes for which facial recognition can be employed. That list also includes child sexual abuse, child kidnapping or abduction, drug crime and human trafficking (still a far cry from pickpocketing).
“The time for facial recognition technology has come,” says McEntee, who argues that the days of gardaí trawling through hours of footage, using up time and resources and delaying arrests and prosecutions, need to come to an end.
Critics, naturally, have raised concerns about false positives, algorithmic bias and mission creep.