Information Commissioner concerned that live facial recognition could be linked to social media to create ‘supercharged CCTV’
The UK Information Commissioner has said she is “deeply concerned” at the potential for misuse of real-time facial recognition in public places.
The technology has “benefits” such as allowing people to easily unlock their phones, set up a bank account online or go through passport control, Information Commissioner Elizabeth Denham said in a blog post.
But when the technology is used in real time and in more public contexts, the risk to privacy increases, she said.
She said live facial recognition (LFR) could be used “inappropriately, excessively or even recklessly”.
“When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant,” Denham wrote.
“We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take.”
LFR can be used to instantly profile a person to display personalised adverts, or match shoppers’ images against a database of known shoplifters.
It could be combined with social media or other data analytics systems to create “supercharged CCTV”.
Denham said there is also an “opportunity” to ensure the emerging technology doesn’t expand without due regard for data protection.
To this end she has released a Commissioner’s Opinion on the use of LFR in public places by private companies and public organisations.
Data protection and privacy must be “at the heart” of any decision to deploy LFR, she said, adding that the law sets a “high bar” to justify the use of LFR in places where people shop, socialise or gather.
She said that six ICO investigations into planned or deployed LFR systems found that none were fully compliant with data protection law.
The organisations involved chose to stop their use of LFR.
“With any new technology, building public trust and confidence in the way people’s information is used is crucial so the benefits derived from the technology can be fully realised,” she wrote.
If used properly, the technology could have benefits, such as helping in an emergency search for a missing child, she said.
But organisations will need to demonstrate high standards of governance and accountability from the outset, including being able to justify that LFR usage is “fair, necessary and proportionate” in each context in which it is to be deployed.
“They need to demonstrate that less intrusive techniques won’t work,” Denham wrote.
“We will work with organisations to ensure that the use of LFR is lawful, and that a fair balance is struck between their own purposes and the interests and rights of the public,” she wrote.
“We will also engage with government, regulators and industry, as well as international colleagues to make sure data protection and innovation can continue to work hand in hand.”