The use of automated facial recognition (AFR) by South Wales Police was lawful, the High Court has ruled.
It comes after a legal challenge was brought against the force by shopper Ed Bridges, from Cardiff, who claimed his human rights were breached when he was photographed while Christmas shopping.
But the High Court has refused his request for a judicial review on all grounds, finding South Wales Police had followed the rules and their use of AFR was justified.
The High Court heard the case in May this year and, according to the BBC, said it was the first time any court in the world had considered the use of the technology.
Civil rights group Liberty was quoted as saying that its client would appeal against the ruling. Liberty had reportedly argued that the use of AFR was akin to the unregulated taking of DNA or fingerprints without consent, and it is campaigning for an outright ban on the practice.
Among the High Court’s reported key findings are that South Wales Police’s use of AFR met the requirements of the Human Rights Act and its actions were subject to sufficient legal controls.
On the data protection argument, the Court ruled that the force was processing personal data, but in a lawful manner that met the conditions set out in the legislation.
The Court also found that South Wales Police had complied with equality laws, and that the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR.
“This disappointing judgement does not reflect the very serious threat that facial recognition poses to our rights and freedoms,” lawyer Megan Goulding was quoted by the BBC as saying.
“Facial recognition is a highly intrusive surveillance technology that allows the police to monitor and track us all,” she added.
“South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent,” said the plaintiff, Ed Bridges.
“This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance,” said Bridges.
A spokesperson for the Information Commissioner said the office would be reviewing the judgement carefully.
In July the Information Commissioner's Office (ICO) said it was carrying out an investigation into trials of facial recognition by police forces, including those in London and South Wales.
Last December the Metropolitan Police carried out voluntary trials of the technology in central London, saying that those who preferred not to be scanned would not be viewed as under suspicion.
In July the House of Commons Science and Technology Committee argued that police trials should be halted until the relevant regulations were in place.
These systems have also previously been criticised in the US, after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, and were more likely to misidentify black people.
Microsoft, for example, has refused to supply facial recognition technology to a US police force, due to concerns about artificial intelligence (AI) bias.
It also reportedly deleted a large facial recognition database said to have contained 10 million images used to train facial recognition systems.
San Francisco, meanwhile, has banned the use of facial recognition technology, meaning that local agencies, including the police force and city departments such as transportation, cannot use it in any of their systems.