ICO Investigates King's Cross Facial Recognition Use

King’s Cross Central Limited Partnership

UK data protection watchdog to investigate why King's Cross developer is using facial recognition

The Information Commissioner's Office (ICO) is to investigate why a private company that manages the King's Cross area of London is using facial recognition to track individuals.

The ICO said it remains "deeply concerned" at the use of the technology, and said that it will "inspect the system and its operation on-site" to ensure it complies with data protection laws.

It comes amid growing concern after it was revealed earlier this week that the 67-acre King's Cross area, which is developed and managed by the King's Cross Central Limited Partnership (KCCLP) and Argent, is using the technology for unknown purposes.


ICO investigation

The area is the site of Google’s UK headquarters and other businesses, Central Saint Martins college, schools and retailers.

The ICO said its investigation was prompted by concerns about the use of the technology, as reported in the media.

“We have launched an investigation following concerns reported in the media regarding the use of live facial recognition in the King’s Cross area of central London, which thousands of people pass through every day,” explained Elizabeth Denham, Information Commissioner.

“As well as requiring detailed information from the relevant organisations about how the technology is used, we will also inspect the system and its operation on-site to assess whether or not it complies with data protection law,” Denham said.

“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way,” said Denham. “They must have documented how and why they believe their use of the technology is legal, proportionate and justified.”

“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights,” she added.

Earlier this week a facial recognition system in the United States wrongly identified 26 Californian lawmakers as criminals. Most of those misidentified were people of colour.

Facial concerns

In July the ICO warned the police about the use of the technology, after a study found that 81 percent of 'suspects' flagged by the Metropolitan Police's facial recognition technology are innocent, and that the overwhelming majority of people identified are not on police wanted lists.

The mayor of London, Sadiq Khan, also announced this week that he is questioning the developer about its use of facial-recognition systems.

In July the House of Commons Science and Technology Committee argued that police trials should be halted until the relevant regulations were in place.

The UK biometrics commissioner Professor Paul Wiles has also expressed concern at the use of the technology.
