
Facial recognition use by South Wales Police ruled unlawful in landmark case

Laurie Clarke, Reporter

Activists have chalked up the world’s first successful legal challenge against police use of automatic facial recognition (AFR) technology, with the Court of Appeal ruling that South Wales Police’s (SWP) use of the technology is unlawful.

The legal challenge was brought by civil rights group Liberty on behalf of Ed Bridges, based in Cardiff. The court upheld three out of the five points raised in the appeal.

The use of facial recognition tech was ruled unlawful because the legal framework – comprising the Data Protection Act 2018 and the Surveillance Camera Code of Practice, alongside SWP’s local policies – gave insufficient guidance on where AFR Locate could be used and who could be put on a watchlist. The court found that this left police too broad a discretion to meet appropriate legal standards when deploying the tech.

Because it did not reflect these deficiencies, the Data Protection Impact Assessment (DPIA) carried out by SWP – a legally required piece of documentation for high-risk data processing – was also ruled deficient.

The court also found that SWP failed to comply with the Public Sector Equality Duty (PSED) under section 149 of the Equality Act 2010, the purpose of which is to ensure that public authorities assess whether a policy has discriminatory potential. SWP did not take the requisite steps to find out whether the facial recognition technology in question exhibited racial or gender bias, although the court said there was no clear evidence that the technology was in fact biased.

Less encouraging for anti-facial recognition activists was the court’s finding that, despite being deployed unlawfully, the police’s use of the technology was proportionate, because in the court’s view its potential benefits outweighed the impact on Bridges.

London’s High Court initially dismissed the case in September 2019, finding that the use of the technology was lawful. Bridges had argued that the technology breached his human rights, given that his highly sensitive biometric data had been collected and analysed without his consent.

“I’m incredibly, ecstatically pleased by today’s judgement on the case I brought with [Liberty] against the use of automatic facial recognition technology by South Wales Police,” Ed Bridges wrote on Twitter. “The Court of Appeal agreed that this technology threatens our rights.”

He highlighted that the Equality and Human Rights Commission has previously called for a moratorium on the use of AFR by police forces in England and Wales until an independent assessment can be carried out. “Today’s ruling should prompt them to go further,” he said.

SWP said it wouldn’t appeal the outcome. The upshot of the ruling is that the police force must stop using the technology – at least until the legal questions are addressed.

“This is a major victory that will have ramifications for any force trying to use privacy-abusing, racist [facial recognition technology] in future,” Liberty said on Twitter. “But it’s not the end of the fight. It must be banned.”

Hannah Couchman, policy and campaigns officer at Liberty, previously told NS Tech: “Anyone can be included on a facial recognition watch list – you do not have to be suspected of any wrongdoing, but could be put on a list for the ludicrously broad purpose of police ‘intelligence interest’. And even if you’re not on a watch list, your personal data is still being captured and processed without your consent – and often without you knowing it’s happening.”

In response to the ruling, South Wales Police Chief Constable Matt Jukes told the BBC: “The test of our ground-breaking use of this technology by the courts has been a welcome and important step in its development. I am confident this is a judgment that we can work with.”

The facial recognition technology used by the Met police was found to be reliable in only 19 per cent of cases by an independent study carried out by surveillance expert Peter Fussey at the University of Essex, who scrutinised two years’ worth of trial data. In spite of this, police have tended to shrug off critiques of the invasive technology.

The Court of Appeal wrote in the judgement: “We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.”

This could pave the way for more legal action against AFR in future, given that such technology has widely been shown to exhibit racial and gender bias.