Dan Kitwood/Getty Images

The Met can’t say how many businesses received facial recognition images

London’s Metropolitan Police Service (MPS) has refused to rule out the possibility that more businesses have received copies of facial recognition images of suspected criminals.

The admission came after the force revealed on Wednesday (4 September) that it had shared images with the developers of a 67-acre site in King’s Cross, having initially denied doing so.

A Met spokesperson said that the images had been shared at a local level and had “only just come to light to the central team managing police imagery”, adding that “MPS has not shared any images with the Kings Cross Estate, or any related company, for facial recognition purposes since March 2018”.

But he did not directly respond to questions about the number of similar arrangements, revealing only that the force had contacted “all local chief superintendents … to reinforce that there should be no local agreements or use of live facial recognition”.

Earlier this week, Sadiq Khan, the Mayor of London, welcomed the King’s Cross Estate’s decision to scrap its facial recognition roll-out, which was first reported in August by the Financial Times.

“I’m pleased they responded quickly to my concerns about the use of facial recognition at the site,” Khan said in a tweet. “London’s public spaces should be open for all Londoners to access & enjoy without fear of separation or segregation.”

The mayor has also called on the Met to disclose whether other businesses have received facial recognition images.

The High Court, sitting in Cardiff, ruled on Wednesday that South Wales Police had not acted unlawfully when it tested live facial recognition software in the Welsh capital in 2017 and 2018.

But during a speech in Sydney this week, Cressida Dick, the Met’s commissioner, warned that “we’re now tiptoeing into a world of robotics, AI and machine learning”.

“The next step might be predictive policing. People are starting to get worried about that … particularly because of the potential for bias in the data or the algorithm, [like] live facial recognition software,” she warned.

The Information Commissioner’s Office has launched a broad investigation into the use of the technology in the UK. In light of Wednesday’s ruling, the watchdog said: “This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.”