Lawmakers in California have banned the use of facial-recognition software in police body cams
Concern over the use of facial-recognition technology continues, as California lawmakers ban its use in the body cameras worn by state and local law enforcement officers.
It comes after the American Civil Liberties Union (ACLU), a US civil rights campaign group, ran a picture of every California state legislator through a facial-recognition program that matches facial images against a database of 25,000 criminal mugshots.
The test saw the program falsely flag 26 legislators as criminals. To make matters worse, more than half of the falsely matched lawmakers were people of colour, according to the ACLU.
Officials in San Francisco have already banned the use of facial-recognition technology, meaning that city agencies, including the local police force and the transportation authority, cannot use it in any of their systems.
But now this ban looks set to be expanded to the whole of California, which is the most populous state in America.
The bill, AB 1215, also referred to as the Body Camera Accountability Act, has been officially passed by the California Assembly, and it is now up to Governor Gavin Newsom to decide whether or not to sign it into law by 13 October.
If he signs it, the bill will take effect in January 2020 and will last three years.
During this period, the bill prohibits the use of biometric surveillance technology, which includes facial-recognition software, in police body cameras. It also prohibits police from taking body-camera footage and running it through facial-recognition software at a later time, CNN reported.
But it does not prevent state and local police from using facial-recognition technology in other ways, such as in stationary cameras, and it does not apply to federal agencies such as the FBI.
The use of facial-recognition technology continues to cause privacy concerns around the world.
Earlier this month the High Court in the UK ruled the use of automated facial recognition (AFR) by South Wales Police was lawful.
Facial recognition has previously been criticised in the US, after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, as well as being more likely to misidentify black people.
Microsoft, for example, has refused to install facial-recognition technology for a US police force, due to concerns about artificial intelligence (AI) bias.
It also reportedly deleted a large facial-recognition database that was said to have contained 10 million images used to train facial-recognition systems.