Facial Recognition Lawsuit After Man Is Falsely Jailed

Racial bias in facial recognition is again in the headlines, as another black plaintiff files a lawsuit over being falsely jailed

Facial recognition's misidentification of people of colour has prompted another lawsuit to be filed in the United States.

The Associated Press reported that Randal Quran Reid on 8 September filed a lawsuit in federal court in Atlanta against Jefferson Parish Sheriff Joseph Lopinto and detective Andrew Bartholomew.

The lawsuit accuses detective Bartholomew of false arrest, malicious prosecution and negligence. Sheriff Lopinto meanwhile is alleged to have failed to implement adequate policies around the use of facial recognition technology.


Facial recognition

The lawsuit seeks unspecified damages after Randal Quran Reid was misidentified by facial recognition technology and jailed for six days.

Randal Quran Reid had been arrested because he was allegedly wanted for crimes in Louisiana – a US state that he had never visited.

“I was confused and I was angry because I didn’t know what was going on,” Quran told The Associated Press. “They couldn’t give me any information outside of, ‘You’ve got to wait for Louisiana to come take you,’ and there was no timeline on that.”

Reid, 29, had been arrested after detective Bartholomew, allegedly working from surveillance video, relied on a match generated by facial recognition technology to seek an arrest warrant for Reid. The warrant related to a stolen credit card that was used to buy two purses for more than $8,000 from a consignment store outside New Orleans in June 2022, the lawsuit said.

“Bartholomew did not conduct even a basic search into Mr. Reid, which would have revealed that Mr. Reid was in Georgia when the theft occurred,” the lawsuit reportedly states.

In an affidavit seeking the warrant, Detective Bartholomew allegedly cited still photographs from the surveillance footage, but did not mention the use of facial recognition technology, according to the lawsuit.

The detective said he was advised by a “credible source” that one of the suspects in the video was Reid. A Department of Motor Vehicles photograph of Reid appeared to match the description of the suspect from the surveillance video, Bartholomew reportedly said.

“The use of this technology by law enforcement, even if standards and protocols are in place, has grave civil liberty and privacy concerns,” said Sam Starks, a senior attorney with The Cochran Firm in Atlanta, which is representing Reid. “And that’s to say nothing about the reliability of the technology itself.”

The Associated Press reported that Reid’s family hired an attorney in Louisiana, who presented photos and videos of Reid to the sheriff’s office. The person in the surveillance footage was considerably heavier and did not have a mole like Reid’s, according to his lawsuit.

The sheriff’s office then asked a judge to withdraw the warrant. Six days after his arrest, sheriff’s officials in Georgia’s DeKalb County released Reid.

Reached by phone, Bartholomew said he had no comment, the Associated Press reported.

A spokesman for the sheriff’s office, Capt. Jason Rivarde, said the office does not comment on pending litigation.

Racial bias

Reid is among at least five black plaintiffs who have filed lawsuits against US law enforcement in recent years, the Associated Press reported. Those lawsuits allege the plaintiffs were misidentified by facial recognition technology and then wrongly arrested.

In December 2020, a New Jersey man, Nijeer Parks, filed a lawsuit against local police, the prosecutor and the City of Woodbridge in New Jersey, after he was arrested for a crime he did not commit based on a bad face recognition match.

Three of those five lawsuits, including one by a woman who was eight months pregnant and accused of a carjacking, are against Detroit police.

Facial recognition technology allows law enforcement agencies to feed images from video surveillance into software that can search government databases or social media for a possible match.
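At its core, such a system typically reduces each face image to a numeric embedding and then searches a database for embeddings that exceed a similarity threshold. The sketch below is purely illustrative (the embedding model, 128-dimensional vectors, and 0.8 threshold are assumptions, not any agency’s actual system), but it shows why a threshold set too loosely can surface false matches like those alleged in these lawsuits:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_database(probe: np.ndarray, database: dict,
                    threshold: float = 0.8) -> list:
    """Return (identity, score) pairs whose similarity exceeds the threshold,
    best match first. Lowering the threshold yields more candidate matches,
    and with them a higher risk of misidentification."""
    hits = [(name, cosine_similarity(probe, emb))
            for name, emb in database.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)

# Toy example: random vectors stand in for a real model's face embeddings.
rng = np.random.default_rng(0)
db = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe = db["person_42"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
top = search_database(probe, db)[0]
print(top[0])  # the strongest candidate match
```

In a real deployment the vectors would come from a trained neural network, and – as the critics quoted here argue – the error rate of that network can differ across demographic groups, which is why a candidate match is meant to be an investigative lead rather than proof of identity.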

But critics say it results in a higher rate of misidentification of people of colour than of white people.

In August 2019, the ACLU civil rights campaign group in the United States ran a demonstration to show how inaccurate facial recognition systems can be.

It ran a picture of every California state legislator through a facial-recognition program that matches facial pictures to a database of 25,000 criminal mugshots.

That test saw the facial recognition program falsely flag 26 legislators as criminals.

Facial recognition systems were also previously criticised in the US after research by the Government Accountability Office found that FBI algorithms were inaccurate 14 percent of the time, as well as being more likely to misidentify black people.

And some tech firms have refused to supply the technology to police forces.

Microsoft was the first to refuse to provide facial recognition technology to a US police force, citing concerns about artificial intelligence (AI) bias.

This boycott was subsequently joined by Amazon and IBM, among others.

Microsoft also deleted a large facial recognition database, said to have contained 10 million images used to train facial recognition systems.

San Francisco has banned the use of facial recognition technology, meaning that local agencies – including the police force and other city departments such as transportation – cannot use it in any of their systems.

But the police remain in favour of its use.

In February 2020, the UK’s then most senior police officer, Metropolitan Police Commissioner Cressida Dick, said criticism of the tech was “highly inaccurate or highly ill informed.”

She also said facial recognition was less concerning to many than a knife in the chest.