King's Cross Facial Recognition Cameras Shut Down

The private firm that developed and manages the 67-acre King's Cross area has confirmed that it is no longer using facial recognition cameras.

It comes after it was revealed last month that King’s Cross Central Limited Partnership (KCCLP) was using facial recognition cameras for unknown purposes.

The discovery prompted questions from London's mayor Sadiq Khan, as well as an investigation by the UK's data protection watchdog, the Information Commissioner's Office (ICO), which said it was “deeply concerned” about the use of the technology.

Facial recognition

After weeks of silence, KCCLP has now issued a statement on the matter.

“There has been considerable media interest recently regarding the use of facial recognition technology (FRT) at the King’s Cross Estate, located next to St Pancras International and King’s Cross stations,” it stated.

It said it was co-operating with the ICO investigation and would not comment while the probe is ongoing, except to clear up a few matters for the public record.

It confirmed that the King's Cross Estate does not currently use FRT. It said there were two FRT cameras, covering a single location on King's Boulevard, that were operational between May 2016 and March 2018.

It said that during that period, all FRT data was “regularly deleted, with the final deletion taking place in March 2018.”

It said that the King’s Cross Estate team has since undertaken work on the potential introduction of new FRT, but this work has stopped.

“There has been no operational FRT system at the King’s Cross Estate since March 2018,” it insisted, and said the tech was never used for marketing or other commercial purposes.

“The system was used only to help the Metropolitan Police and British Transport Police prevent and detect crime in the neighbourhood and ultimately to help ensure public safety,” it stated. “In the meantime, KCCLP has no plans to reintroduce any form of FRT at the King’s Cross Estate.”

Facial concerns

The use of FRT remains hugely controversial due to concerns about its accuracy and racial bias. Last month a facial recognition system in the United States wrongly identified 26 Californian lawmakers as criminals.

Most of those misidentified were people of colour.

In July the ICO warned the police about the use of the technology, after a study found that 81 percent of ‘suspects’ flagged by the Metropolitan Police’s facial recognition technology were innocent, and that the overwhelming majority of people identified were not on police wanted lists.

That same month the House of Commons Science and Technology Committee argued that police trials of the technology should be halted until the relevant regulations were in place.

Can you protect your privacy online? Take our quiz!
