Microsoft Deletes Facial Recognition Database Over Privacy Fears – Report

Microsoft continues to react to growing alarm at the use of facial recognition (FR) technology in everyday life, with a report that it has deleted a large FR database.

The database in question was said to have contained 10 million images that were used to train facial recognition systems, the Financial Times reported.

This is not the first time that Microsoft has expressed its unease about the growing use of facial recognition. In April Redmond refused to install facial recognition technology for a US police force, due to concerns about artificial intelligence (AI) bias.

Facial databases

At the time Microsoft President Brad Smith explained how the company had rejected a California law enforcement agency’s request to install facial recognition technology in officers’ cars and body cameras, due to human rights concerns.

That move came as governments and firms around the world increasingly grapple with the ethical use of AI and facial recognition, amid reports that some researchers are scraping people’s images from social media and CCTV cameras.

The Microsoft database in question was used to train facial recognition systems around the world, including by military researchers and Chinese firms such as SenseTime and Megvii, the FT reported.

The database, reportedly called MS Celeb, was published in 2016 and was described by the company as the largest publicly available facial recognition data set in the world.

The people whose photos were used were not asked for their consent, and the data set is widely believed to have been made up of the faces of public figures such as celebrities.

“The site was intended for academic purposes,” the FT quoted Microsoft as saying in a statement. “It was run by an employee that is no longer with Microsoft and has since been removed.”

The FT also reported that two other databases have been removed: the Duke MTMC surveillance data set, built by Duke University researchers, and a Stanford University data set called Brainwash.

AI, FR ethics

This display of corporate ethics from Microsoft comes amid an intense debate about the use of facial recognition and artificial intelligence.

The British government for example has launched an inquiry into the use of AI and potential bias in legal matters.

Google meanwhile has tied itself in knots over the issue.

In March this year Google created the ‘Advanced Technology External Advisory Council (ATEAC)’, to offer guidance on the ethical use of AI.

But only a week later it disbanded the council, amid controversy over a couple of its female members.

Google also caused deep anger among its employees (some of whom resigned) over its involvement in a Pentagon drone project, codenamed Project Maven, which utilised Google’s AI technology.

Following that, Google CEO Sundar Pichai last year set out new principles for AI use at the company, pledging not to use AI in technologies that cause injury to people.


Tom Jowitt
