The United Kingdom’s most senior police officer Cressida Dick has strongly defended the use of facial recognition by police forces
The use of facial recognition by British police forces is once again in the headlines after the UK’s most senior police officer defended its use against critics.
Indeed, Metropolitan Police Commissioner Cressida Dick labelled critics of the use of ‘Live Facial Recognition’ (LFR) as “highly inaccurate or highly ill informed.”
In September last year, the use of automated facial recognition (AFR) by South Wales Police was found to be lawful by the High Court in London. That ruling came after a legal challenge was issued against the police by a shopper from Cardiff, who claimed his human rights were breached when he was photographed while Christmas shopping.
Last July, then Home Secretary Sajid Javid gave his backing to police forces using facial recognition systems. He said it was important that police made use of the latest tools to help them solve crimes.
That came after the Information Commissioner’s Office (ICO) warned that any organisation using facial recognition technology, and which then scans large databases of people to check for a match, is processing personal data, and that “the potential threat to privacy should concern us all.”
Despite that, a number of police forces are currently trialling the use of the technology, including the Met in London.
Live facial recognition (LFR) technology is also being used by South Wales Police, who in 2017 used facial recognition software at the Champions League Final in Cardiff to scan the face of every fan attending the game.
And the use of facial recognition systems by South Wales police in shopping centres is also under judicial review.
But matters were not helped when an academic study last year found that 81 percent of ‘suspects’ flagged by the Met’s facial recognition technology were innocent, and that the overwhelming majority of people identified were not on police wanted lists.
But now the Metropolitan Police Commissioner has responded to a report from the Royal United Services Institute, which called for tighter rules on police use of technology.
The report looked at the use of data and algorithms by police in England and Wales. Among its recommendations were that police should issue new national guidelines in this area.
But Dame Cressida used her speech at the report’s launch to issue a strong defence of the use of facial recognition by the police.
She said AFR does not store biometric data, and that human police officers will always make the final decisions on whether or not to intervene, not the machine.
Dame Cressida also rejected claims that AFR can be racially biased, saying that the technology the police are deploying is proven not to have an ethnic bias.
“This is about using a tool that can augment intelligence rather than replace it,” she said. “The Met’s trials of LFR resulted in the arrest of eight wanted individuals whom we would otherwise have been very unlikely to identify. Without LFR, those eight individuals who were wanted for having caused harm would probably not have been arrested.”
“So I and others have been making the case for the proportionate use of tech in policing but right now the loudest voices in the debate seem to be the critics,” she said. “Sometimes highly inaccurate or highly ill informed. I would say it is for critics to justify to the victims of those crimes why police should not be allowed to use tech lawfully and proportionally to catch criminals.”
“It is not for me and the police to decide where the boundary lies between security and privacy, it is right for the police to contribute to the debate,” she said. “But speaking as a member of public, I will be frank. In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through LFR and not being stored, feels much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest.”
“The only people who benefit from us not using lawfully and proportionately are the criminals, the rapists, the terrorists and all those who want to harm you, your family and friends,” she warned.
“We know there are some cheap technologies that do have bias, but as I have said, ours doesn’t,” Dame Cressida said. “Currently, the only bias in it is that it shows it is slightly harder to identify a wanted woman than a wanted man.”
Dame Cressida proposed that the best way to ensure that the “police use new and emerging tech in a way that has the country’s support is for the Government to bring in an enabling legislative framework that is debated through Parliament, consulted on in public, and which will outline the boundaries for how police should or should not use tech.”
Despite her defence, it is fair to say there are doubts as to the effectiveness of facial recognition systems.
These systems have been previously criticised in the US, after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, as well as being more likely to misidentify black people.
Microsoft for example has refused to install facial recognition technology for a US police force, due to concerns about artificial intelligence (AI) bias. The Met use a facial recognition system from Japanese firm NEC.
And Redmond reportedly deleted a large facial recognition database, which was said to have contained 10 million images used to train facial recognition systems.
San Francisco meanwhile has banned the use of facial recognition technology, meaning that city agencies, including the local police force and the transportation authority, are not able to utilise the technology in any of their systems.
And facial recognition can also be fooled. In 2017 a Vietnamese cybersecurity firm said it had tricked the facial recognition feature on the iPhone X using a 3D-printed mask.