Police acknowledge working with private King’s Cross developer on face recognition scheme, amid ongoing concern about the use of the tracking technology
The Metropolitan Police has acknowledged sharing the images of seven people as part of a data-sharing agreement with a private firm in the King’s Cross area of London, but said it had no records of how the images were used.
The privately managed King’s Cross estate first admitted in August that it had deployed face recognition systems in two cameras on pedestrian boulevards to “help ensure public safety”.
The deployment of the tracking technology was controversial because it was operated by a private firm, without the knowledge or consent of members of the public.
In September the estate, which is managed by a consortium led by property developer Argent, said it had cancelled plans to use the technology in the future.
The Met acknowledged last month that it had shared images of people with King’s Cross, after previously denying the arrangement.
In its new report, the force said only seven images had been shared under the agreement, and that it did not believe similar arrangements were in place with other private organisations.
The pictures provided by Camden Borough Police were of “persons who had been arrested and charged/cautioned/reprimanded or given a formal warning” and were supplied in order to prevent crime, the Met said.
It said it had no record of whether the estate’s system had come up with any matches to the images or whether police action had been taken as a result.
The Met confirmed that the estate used a system made by NEC, which also makes face recognition systems used in trials by the Met itself as well as South Wales Police.
The agreement between police and King’s Cross lasted from May 2016 to March 2018, with a new agreement put into place at the beginning of 2019. No images were shared under the new deal.
The Met has apologised for misinforming the London mayor and the London Assembly over the matter, saying the mistake was due to the deal having been made at a borough level.
London deputy mayor Sophie Linden said police had informed the city’s basic command units that “there should be no local level agreements on the use of live facial recognition”.
“The Mayor and I are committed to holding the Met to account on its use of facial recognition technology,” she said in a statement.
British Transport Police has also confirmed it shared images with King’s Cross for use with live face recognition.
‘Out of control’
The arrangement raised concerns in part because it was not reported to the surveillance camera commissioner.
“The fact that police initially denied involvement and have few records about it shows how out of control facial recognition use is in this country,” said Silkie Carlo, director of Big Brother Watch.
The group said it had obtained further details of Argent’s use of face recognition after submitting a data subject access request to the firm.
Argent said it retained data from face recognition checks for up to 30 days in the case of a match, instantly deleting other scan data.
At least two security officers, one of whom was often a police employee, were used to confirm the matches, Argent said. The Ministry of Justice’s Criminal Justice Secure email service was used to transfer data to the two police forces involved in the checks.
Face recognition data, like other biometric data, is considered personal information and as such is covered by data protection rules such as the GDPR.