In an unprecedented move, Apple has announced that it will scan iPhone photo libraries in the US for known images of child sexual abuse.

The move has been praised by child safety campaigners, but slammed by privacy campaigners worried that the technology could be abused by authoritarian governments.

Campaigners have also accused the iPhone maker of creating an encryption backdoor by scanning encrypted messages for red flags.

The developments were confirmed in an Apple blog post entitled ‘Expanded Protections for Children’. Apple will scan photo libraries stored on iPhones in the US for known images of child sexual abuse (Child Sexual Abuse Material, or CSAM) before they are uploaded to iCloud.

Photo scanning

Apple is using a tool called neuralMatch, the Guardian reported, and before the images are uploaded to iCloud they will be compared against a database of known child abuse imagery.

If a strong enough match is flagged, then Apple staff will be able to manually review the reported images, and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.

Apple says this will work on iOS and iPadOS operating systems, and since the tool only looks for images that are already in NCMEC’s database, parents taking photos of a child in the bath, for example, apparently need not worry.

“Apple’s method of detecting known CSAM is designed with user privacy in mind,” the firm stated. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organisations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
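In outline, the mechanism Apple describes amounts to hashing each photo on the device and looking the result up in a locally stored database of known CSAM hashes before upload. The following is a minimal sketch of that idea only: it uses an ordinary SHA-256 digest in place of Apple’s perceptual hashing (reported as ‘neuralMatch’), and a plain Swift set in place of the ‘unreadable’ encrypted database Apple describes; every name and value in it is hypothetical.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: SHA-256 stands in for Apple's perceptual hash,
// and a plain Set stands in for the encrypted, on-device hash database.
// Every name and value below is hypothetical.
let knownCSAMHashes: Set<String> = [
    "placeholder-hash-from-ncmec-database"
]

/// Hash a photo's bytes and look the result up in the local database.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownCSAMHashes.contains(hex)
}

/// Run the check before an iCloud upload; a match would be queued for
/// manual review rather than acted on automatically.
func checkBeforeUpload(_ imageData: Data) {
    if matchesKnownDatabase(imageData) {
        print("Match against known database - flag for manual review")
    } else {
        print("No match - proceed with iCloud upload")
    }
}

checkBeforeUpload(Data("example photo bytes".utf8))
```

What the sketch necessarily leaves out is the cryptographic blinding Apple says keeps the hash database unreadable on the device, and the manual-review step that sits between a match and any report to NCMEC.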

Despite the fine-sounding rhetoric about privacy, it should be noted that tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images.

Apple itself has used those fingerprints to scan user files stored in its iCloud service for child abuse images.

Unprecedented development

But this new direction marks the first time a tech company will actively scan images on the device itself, an unprecedented development.

And it has caused serious concerns among privacy campaigners.

Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter on Wednesday night.

“This sort of tool can be a boon for finding child pornography in people’s phones,” Green tweeted. “But imagine what it could do in the hands of an authoritarian government?”

Green pointed out that this type of system relies on a database of “problematic media hashes” that you, as a consumer, can’t review.

“Imagine someone sends you a perfectly harmless political media file that you share with a friend,” he tweeted. “But that file shares a hash with some known child porn file?”

Encrypted message scanning

And in another worrying development for privacy campaigners, Apple also plans to scan users’ encrypted messages as they are sent and received.

An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children’s inboxes.

That system, which Apple says is purely aimed at providing tools to “warn children and their parents when receiving or sending sexually explicit photos”, will not result in sexually explicit images being sent to Apple or reported to the authorities.

But parents can choose to be notified if their child decides to send or receive sexually explicit photos.
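Pieced together from Apple’s description, the Messages feature is an on-device check whose only consequences are local: a blurred image, a warning to the child, and an optional alert to the parent. A minimal sketch of that flow, assuming a hypothetical isSexuallyExplicit classifier and notification helpers (none of these are published Apple APIs), might look like this.

```swift
import Foundation

// Hypothetical sketch of the Messages flow described above. The classifier
// and helper functions are assumptions; Apple has not published this code.
struct IncomingPhoto {
    let data: Data
}

/// Stand-in for Apple's on-device machine-learning classifier.
func isSexuallyExplicit(_ photo: IncomingPhoto) -> Bool {
    // A real implementation would score the image with a local model.
    return false
}

/// Blur the image and show the child a warning before it can be viewed.
func blurAndWarn(_ photo: IncomingPhoto) {
    print("Image blurred; warning shown to the child")
}

/// Alert the parent within the family account; nothing is sent to Apple.
func notifyParent() {
    print("Parent notified")
}

func handleIncoming(_ photo: IncomingPhoto,
                    isChildAccount: Bool,
                    parentalAlertsEnabled: Bool,
                    childChoosesToView: Bool) {
    guard isChildAccount, isSexuallyExplicit(photo) else { return }

    blurAndWarn(photo)

    // Per Apple's description, parents are only told if they turned the
    // feature on and the child decides to view or send the photo anyway.
    if parentalAlertsEnabled && childChoosesToView {
        notifyParent()
    }
}
```

The point of the sketch is simply that, as Apple describes it, nothing in this path leaves the device or reaches the authorities.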

Apple also plans updates to Siri and Search to provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

Encryption backdoor?

The Electronic Frontier Foundation (EFF) was quick to wade in and called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security”.

It said Apple’s move on encryption will also open a backdoor into people’s private lives.

“Apple has announced impending changes to its operating systems that include new ‘protections for children’ features in iCloud and iMessage,” said the EFF. “If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.”

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” it added.

“To say that we are disappointed by Apple’s plans is an understatement,” said the EFF. “Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Tom Jowitt

