
Oscar Williams

News editor

Some of the UK’s biggest police forces are using algorithms to predict crime

More than a dozen of the UK’s biggest police forces are at risk of reinforcing discrimination by using biased algorithms to predict crime, campaigners have warned.

Researchers at Liberty found that 14 forces across the country, including the Met, West Midlands and Greater Manchester Police, have used or are considering using predictive programs to identify crime hot spots.

The software has been deployed with the aim of helping stretched forces to more efficiently allocate resources, but Liberty warns it could be used to dispatch officers to areas which may already be over-policed.

Other applications include assessing whether individuals are likely to commit crime, as the Met Police’s controversial Gangs Matrix attempts to do, and using profiling to assess people’s risk of becoming victims.

The author of the report, Hannah Couchman, warned that “the algorithms are driven by data already imbued with bias, firmly embedding discriminatory approaches in the system while adding a ‘neutral’ technological veneer that affords false legitimacy”.

“Life-changing decisions are being made about us that are impossible to challenge,” Couchman added. “In a democracy which should value policing by consent, red lines must be drawn on how we want our communities to be policed.”

The Information Commissioner’s Office reported in November that the Met’s Gangs Matrix had led to “multiple and serious breaches of data protection laws”. The investigation revealed that the matrix failed to clearly distinguish between the approaches taken to victims and perpetrators of gang-related crime.

James Dipple-Johnstone, the deputy Information Commissioner, said at the time that “clear and rigorous oversight and governance is essential,” so “the community can have confidence that their information is being used in an appropriate way”.

A lack of transparency over how citizens are targeted was identified as one of the key shortcomings of the way the software is used. Liberty called for risk-assessment and predictive-mapping programs to be scrapped, and argued that, wherever they are used, their use should at least be communicated to those they affect.