The private company that manages the King’s Cross area of London is using face recognition to track individuals and Canary Wharf is considering adding such capabilities, in spite of controversy around the technology.
The 67-acre King’s Cross area, which is developed and managed by the King’s Cross Central Limited Partnership, is the site of Google’s UK headquarters and other businesses, Central Saint Martins college, schools and retailers.
An unnamed spokesperson for the King’s Cross group told the Financial Times there are multiple cameras installed across the area to observe passers-by using technologies including face recognition.
“These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public,” the spokesperson said.
The person did not indicate how many cameras are in use or for how long face recognition had been utilised in the area.
The King’s Cross Central Limited Partnership did not immediately respond to a request for comment.
The Canary Wharf Group, which owns private offices and public spaces in the area, is also speaking to face recognition suppliers to pilot the technology in the area, the FT said, citing unnamed sources.
The technology would not operate continuously to monitor pedestrians and office workers, but would be limited to specific purposes or threats, the sources said.
The development currently operates at least 1,750 CCTV cameras as well as automated licence plate recognition systems, said Genetec, a Canadian company that provides security software to the district.
The paper noted that Canary Wharf covers 97 acres, meaning the plans would bring more than 160 acres of London under face recognition surveillance in total.
King’s Cross and Canary Wharf are unusual in London in that both form large, privately managed zones visited by large numbers of people daily.
Canary Wharf houses the offices of many financial services companies, including Barclays, Credit Suisse and HSBC, and is visited by some 140,000 people daily.
The operation of such private face recognition systems would be controversial since European data protection law requires organisations to obtain the explicit consent of individuals before gathering their personal data, including biometric characteristics such as fingerprints or face identification patterns.
The Information Commissioner's Office (ICO) is already carrying out an investigation into trials of face recognition by police forces, including those in London and South Wales.
Through a series of freedom of information requests, privacy advocacy group Big Brother Watch found that face recognition matches in the London trials were up to 98 percent inaccurate, while those in South Wales were up to 91 percent inaccurate.
During a trial covering Cardiff Airport as well as the city's train station and city centre during UEFA Champions League Final week in 2017, for instance, the systems generated 2,554 false positives and 173 true positives, the group's investigation found.
Last December the Metropolitan Police carried out voluntary trials of the technology in central London, saying that those who preferred not to be scanned would not necessarily be considered under suspicion.
In July the House of Commons Science and Technology Committee argued that police trials should be halted until the relevant regulations were in place.