Cash-strapped police chiefs defend their use of AI as campaign group finds 14 forces across the UK are using algorithms to supplement human officers
Police have said they are seeking “balance” in the use of artificial intelligence to predict crimes, after freedom of information requests found that 14 UK police forces were deploying, testing or investigating predictive AI techniques.
The report by Liberty, “Policing by Machine”, warned that the tools risk entrenching existing biases and delivering inaccurate predictions.
The civil liberties group urged police to end the use of predictive AI, saying mapping techniques rely on “problematic” historical arrest data, while individual risk assessment programmes “encourage discriminatory profiling”.
The forces using or trialling predictive mapping programmes are Avon and Somerset Constabulary, Cheshire Constabulary, Dyfed-Powys Police, Greater Manchester Police, Kent Police, Lancashire Police, Merseyside Police, the Metropolitan Police Service, Norfolk Constabulary, Northamptonshire Police, Warwickshire Police, West Mercia Police, West Midlands Police and West Yorkshire Police. A further three forces – Avon and Somerset, Durham and West Midlands – are using or trialling individual risk-assessment programmes.
Norfolk Constabulary, for instance, is trialling a system for identifying whether burglaries should be investigated; Durham Constabulary’s Harm Assessment Risk Tool (Hart) advises custody officers on individuals’ risk of re-offending; and West Midlands Police uses hotspot mapping and a data-driven analysis project.
Liberty said that at a minimum, police should be more transparent about their use of algorithms in policing.
A West Midlands Police spokesperson said the force was determined to ensure that its data science work had “ethics at its heart”.
A Durham Constabulary spokesperson said the force was “proud” of Hart, which it said advised officers as part of an intervention programme to help offenders “turn their lives around”.
“All decisions are ultimately made by an experienced custody officer, but the Hart advisory tool gives them a clear indication as to who might be more at risk of re-offending – not so they are stereotyped, but so we can give them more support to turn away from crime,” the spokesperson said.
The National Police Chiefs’ Council said such programmes were part of ongoing “innovative” use of technology and were deployed with a background of “strong” ethical standards and legislation.
“Policing is underpinned in the UK by a strong set of values and ethical standards, as well as a significant amount of legislation,” said Assistant Chief Constable Jon Drake, the group’s lead for intelligence.
“At all times we seek to balance keeping people safe with people’s rights. This includes the way in which we police hot-spots for crime.
“For many years police forces have looked to be innovative in their use of technology to protect the public and prevent harm and we continue to develop new approaches to achieve these aims.”
But Hannah Couchman, author of the Liberty report, said predictive policing adds to the notion of “pre-criminality” and puts a “glossy sheen” of technology on existing biases.
“It fails us because it focuses on technology and big data as the solution to policing problems which are deeper, systemic issues requiring a much more considered, radical and compassionate response,” Couchman said on Twitter.
Liberty also called on the Metropolitan Police to carry out a review of its Gangs Matrix database, which the Information Commissioner’s Office recently called “unjustifiably excessive”.
The use of AI comes amid ongoing budget cuts that have significantly reduced police staffing.
Durham Constabulary’s chief constable, Michael Barton, said he has had to cut staff every year since taking up the role.
“I have now got a third less money than I had in 2010,” he told the Financial Times. “We have had the fifth biggest cut, pro rata, of any police force in the UK.”
Barton said the force was testing the use of AI because it had the potential to be free of the “inherent biases” that affect the judgements of human custody sergeants.