King's Cross Facial Recognition Cameras Shut Down

The private firm that developed and manages the 67-acre King's Cross area has confirmed that it is no longer using facial recognition cameras.

It comes after it was revealed last month that King’s Cross Central Limited Partnership (KCCLP) was using facial recognition cameras for unknown purposes.

The discovery prompted questions from London's mayor, Sadiq Khan, as well as an investigation by the UK's data protection watchdog, the Information Commissioner's Office (ICO), which said it was “deeply concerned” about the use of the technology.

Facial recognition

After weeks of silence, KCCLP has now issued a statement on the matter.

“There has been considerable media interest recently regarding the use of facial recognition technology (FRT) at the King’s Cross Estate, located next to St Pancras International and King’s Cross stations,” it stated.

It said it was co-operating with the ICO investigation and would not comment while the probe is ongoing, except to clear up a few matters for the public record.

It confirmed that the King's Cross Estate does not currently use FRT. It said there were two FRT cameras, covering a single location at King's Boulevard, that were operational between May 2016 and March 2018.

It said that during that period, all FRT data was “regularly deleted, with the final deletion taking place in March 2018.”

It said that the King’s Cross Estate team has since undertaken work on the potential introduction of new FRT, but this work has stopped.

“There has been no operational FRT system at the King’s Cross Estate since March 2018,” it insisted, and said the tech was never used for marketing or other commercial purposes.

“The system was used only to help the Metropolitan Police and British Transport Police prevent and detect crime in the neighbourhood and ultimately to help ensure public safety,” it stated. “In the meantime, KCCLP has no plans to reintroduce any form of FRT at the King’s Cross Estate.”

Facial concerns

The use of FRT remains hugely controversial due to concerns about its accuracy and racial bias.
Last month a facial recognition system in the United States wrongly identified 26 Californian lawmakers as criminals.

Most of those misidentified were people of colour.

In July the ICO warned police about the use of the technology, after a study found that 81 percent of ‘suspects’ flagged by the Met Police’s facial recognition technology were innocent, and that the overwhelming majority of people identified were not on police wanted lists.

That same month the House of Commons Science and Technology Committee argued that police trials of the technology should be halted until the relevant regulations were in place.


Tom Jowitt @TJowitt

Tom Jowitt is a leading British tech freelancer and long-standing contributor to TechWeek Europe
