Oscar Williams

News editor

The ICO is threatening to take legal action over police use of facial recognition technology

The Information Commissioner has threatened to take legal action over the increasing use of facial recognition technology by police forces across the UK.

In a blogpost published on Monday evening, Elizabeth Denham challenged the legality and effectiveness of the technology, which was rolled out at last year’s Notting Hill Carnival and the Champions League final in Cardiff.

“For the use of FRT to be legal,” Denham wrote, “the police forces must have clear evidence to demonstrate that the use of FRT in public spaces is effective in resolving the problem that it aims to address, and that no less intrusive technology or methods are available to address that problem.”

The data protection regulator has identified the issue as a priority area for her team and has recently written to the Home Office and the National Police Chiefs’ Council setting out her concerns: “Should my concerns not be addressed I will consider what legal action is needed to ensure the right protections are in place for the public.”

Last year, the Biometrics Commissioner, Paul Wiles, challenged the legitimacy of a database of custody photographs containing 19 million images, which is used to inform facial recognition software. Hundreds of thousands of the images feature people who were later acquitted or never charged with a crime.

Denham said she would be “considering the transparency and proportionality of retaining [the custody] photographs as a separate issue, particularly for those arrested but not charged for certain offences.”

The effectiveness of facial recognition technology has recently been called into question by privacy groups in both the UK and US. A member of staff from Liberty who observed the Met Police’s operation at Notting Hill Carnival last year claimed the technology led to at least 35 false positives, five people being unduly stopped and one wrongful arrest. No legitimate arrests were made as a result of the deployment.

Wired revealed earlier this month that during the UEFA Champions League Final last June, there were 2,470 alerts of possible matches, 2,297 false positives and 173 accurate identifications.

“At another level, I have been deeply concerned about the absence of national level co-ordination in assessing the privacy risks and a comprehensive governance framework to oversee FRT deployment,” Denham said. “I therefore welcome Baroness Williams’ recent confirmation of the establishment of an oversight panel which I, alongside the Biometrics Commissioner and the Surveillance Camera Commissioner (SCC), will be a member of.”

The Home Office has previously faced a barrage of criticism for failing to explain why, more than four years after it was promised, a framework for the use of biometric data in policing has not been published.

Speaking to NS Tech last year, the chair of the science and technology select committee, Norman Lamb, warned that the government’s failure to publish the framework “massively increases the risk of abuse”. The MP said it was “extraordinary” that police forces are starting to apply facial recognition technology “in a policy-free vacuum”.