Police Use Of Facial Recognition Breached Privacy, Court Rules


Milestone ruling. The UK Court of Appeal rules use of automatic facial recognition (AFR) tech by South Wales Police is unlawful

The Court of Appeal has issued a significant ruling on the use of automatic facial recognition (AFR) in the United Kingdom.

The court’s three judges unanimously ruled that the use of AFR by South Wales Police had breached privacy rights, data protection laws and equality legislation.

South Wales Police has used AFR technology since 2017, when it deployed facial recognition software at the Champions League Final in Cardiff to scan the face of every fan attending the game.


South Wales Police also used facial recognition systems in shopping centres, but this was challenged by civil rights group Liberty and Ed Bridges, 37, from Cardiff.

Bridges began his legal challenge after his face was scanned while he was out shopping.

Police use AFR to compare the faces of passers-by, including pedestrians not suspected of any crime, against a database of persons of interest.
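In principle, such systems work by converting each detected face into a numerical “embedding” and flagging anyone whose embedding is sufficiently similar to one on a watchlist. The short Python sketch below illustrates that comparison step only; the random embeddings, the match_against_watchlist helper and the 0.6 similarity threshold are illustrative assumptions, not details of the system South Wales Police deployed.

```python
# Minimal sketch of watchlist matching in an AFR-style system.
# All data here is made up for illustration; a real system derives
# embeddings from a face-recognition neural network, not random vectors.
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # illustrative value; real deployments tune this


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(probe: np.ndarray, watchlist: dict) -> list:
    """Return (name, score) pairs for watchlist entries above the threshold."""
    hits = []
    for name, embedding in watchlist.items():
        score = cosine_similarity(probe, embedding)
        if score >= SIMILARITY_THRESHOLD:
            hits.append((name, score))
    return sorted(hits, key=lambda h: h[1], reverse=True)


# Toy example: five watchlist entries, plus a noisy re-capture of one of them.
rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(5)}
probe = watchlist["person_3"] + rng.normal(scale=0.1, size=128)
print(match_against_watchlist(probe, watchlist))
```

The choice of threshold is where accuracy and bias concerns meet: if the underlying model produces less reliable embeddings for some demographic groups, the same threshold will generate more false matches for those groups, which is precisely the kind of risk the court said the force had never independently verified.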

This Court of Appeal ruling now means the use of AFR by the police is effectively outlawed until it can be brought before parliament.

According to the court, there are “fundamental deficiencies” in the legal framework supporting the police’s use of facial recognition technology, which caused the breaches of these rights, Sky News reported.

“The fact remains, however, that [the police] have never sought to satisfy themselves, either directly or by way of independent verification, that the software programme in this case does not have an unacceptable bias on grounds of race or sex,” the ruling stated on Tuesday.

“I’m incredibly, ecstatically pleased by today’s judgement on the case I brought with Liberty against the use of automatic facial recognition technology by South Wales Police,” said Bridges. “Automatic facial recognition technology is an intrusive and discriminatory mass surveillance tool.”

“It has been used without the public’s consent and often without their knowledge,” he reportedly added. “We should all be able to use public spaces without being subjected to oppressive surveillance.”

South Wales Police reportedly said it is not going to appeal Tuesday’s ruling.

Controversial tech

The use of AFR has been controversial for several years.

Last year the Information Commissioner’s Office (ICO) warned that any organisation using facial recognition technology, and which then scans large databases of people to check for a match, is processing personal data, and that “the potential threat to privacy should concern us all.”

Indeed, a 2019 academic study found that 81 percent of ‘suspects’ flagged by the Metropolitan Police’s facial recognition technology were innocent: the overwhelming majority of people it identified were not on police wanted lists.

And in August 2019, the American Civil Liberties Union (ACLU), a civil rights campaign group in the United States, ran a demonstration to show how inaccurate facial recognition systems can be.

It ran a picture of every California state legislator through a facial-recognition program that matched facial pictures against a database of 25,000 criminal mugshots.

That test saw the program falsely flag 26 of the state’s 120 legislators as criminals, a false-positive rate of more than one in five.

Despite such findings, in July 2019 the then Home Secretary Sajid Javid gave his backing to police forces using facial recognition systems, even amid growing concern about the technology.

Tech boycott

Facial recognition systems have previously been criticised in the US after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, and were more likely to misidentify black people.

And tech firms have begun refusing to supply the technology to police forces.

Microsoft was the first to refuse, declining in 2019 to install facial recognition technology for a US police force due to concerns about artificial intelligence (AI) bias.

This boycott was subsequently joined by Amazon and IBM, among others.

Microsoft has also deleted a large facial recognition database, said to have contained 10 million images used to train facial recognition systems.

San Francisco has banned the use of facial recognition technology by local agencies, meaning the city’s police force and other departments, such as transportation, cannot use it in any of their systems.

But the police remain in favour of its use.

In February this year, the UK’s most senior police officer, Metropolitan Police Commissioner Cressida Dick, said criticism of the tech was “highly inaccurate or highly ill informed”.

She also said facial recognition was less concerning to many than a knife in the chest.
