In an unprecedented move, Apple has announced that it will scan iPhone photo libraries in the US for known images of child sexual abuse.

The move has been praised by child safety campaigners, but slammed by privacy advocates worried that the technology could be used by authoritarian governments.

Campaigners also accuse the iPhone maker of creating an encryption backdoor by scanning encrypted messages for red flags.

Apple confirmed the developments in a blog post entitled ‘Expanded Protections for Children’. The company will scan photo libraries stored on iPhones in the US for known child sexual abuse material (CSAM) before the images are uploaded to iCloud.

Photo scanning

Apple is using a tool called neuralMatch, the Guardian reported. Before images are uploaded to iCloud, they will be compared against a database of known child abuse imagery.

If a strong enough match is flagged, Apple staff will manually review the reported images; if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.

Apple says this will work on the iOS and iPadOS operating systems, and since the tool only looks for images that are already in NCMEC’s database, parents taking photos of a child in the bath, for example, apparently need not worry.

“Apple’s method of detecting known CSAM is designed with user privacy in mind,” the firm stated. “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organisations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
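In outline, the matching step is simple, even though Apple wraps it in cryptography so the phone cannot read the list it is matching against. The sketch below is a minimal illustration in Python, not Apple’s implementation: the hash set, the threshold value and the use of a byte-level SHA-256 in place of a perceptual hash are all assumptions made to keep the example short.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-CSAM hashes (hex strings). In Apple's
# design this set is transformed into an unreadable form on the device;
# a plain Python set is used here purely for illustration.
KNOWN_HASHES: set[str] = set()

# Hypothetical threshold: only flag an account for human review once
# several independent matches accumulate, reducing false positives.
MATCH_THRESHOLD = 30

def image_hash(path: Path) -> str:
    # Byte-level SHA-256 stands in for a perceptual hash, which
    # (unlike SHA-256) would survive resizing and re-encoding.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def pending_review(photo_dir: Path) -> bool:
    # Count photos whose hash appears in the known-CSAM database and
    # report whether the account would be queued for manual review.
    matches = sum(
        1 for photo in photo_dir.glob("*.jpg")
        if image_hash(photo) in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

A real system has to use a perceptual hash that tolerates resizing and re-encoding, which is also what makes the collision concern raised below possible.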

Despite the fine-sounding rhetoric about privacy, it should be noted that tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images.

Apple itself has used those fingerprints to scan user files stored in its iCloud service for child abuse images.

Unprecedented development

But this new direction is the first time a tech company will actively scan images on the device itself, rather than in the cloud.

And it has caused serious concerns among privacy campaigners.

Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter on Wednesday night.

“This sort of tool can be a boon for finding child pornography in people’s phones,” Green tweeted. “But imagine what it could do in the hands of an authoritarian government?”

Green pointed out that this type of system relies on a database of “problematic media hashes” that you, as a consumer, can’t review.

“Imagine someone sends you a perfectly harmless political media file that you share with a friend,” he tweeted. “But that file shares a hash with some known child porn file?”
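Green’s collision worry stems from the nature of perceptual hashing: unlike a cryptographic hash, a perceptual hash deliberately maps visually similar images to similar values, so a matcher has to accept near-matches, and two unrelated files can land within that tolerance. The toy example below, in Python, assumes a hypothetical 64-bit perceptual hash compared by Hamming distance; the actual hash format and matching rule have not been published.

```python
def hamming_distance(h1: int, h2: int) -> int:
    # Number of bits that differ between two 64-bit hash values.
    return bin(h1 ^ h2).count("1")

# Hypothetical 64-bit perceptual hashes of two *different* images.
harmless_political_image = 0x9F3A5C0177E2B4D8
known_abuse_image        = 0x9F3A5C0177E2B4D9  # differs by one bit

# Some tolerance is unavoidable: without it, merely re-encoding or
# resizing a known image would defeat the scanner. But the same
# tolerance means an innocent file can fall inside the match radius.
MATCH_RADIUS = 4

if hamming_distance(harmless_political_image, known_abuse_image) <= MATCH_RADIUS:
    print("Flagged for review, even though the images are unrelated.")
```

And because the database lives on the device only as an unreadable set of hashes, the user has no way to check what is actually in it.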

Encrypted message scanning

And in another worrying development for privacy campaigners, Apple also plans to scan users’ encrypted messages as they are sent and received.

An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children’s inboxes.

That system, which Apple says is purely aimed at providing tools to “warn children and their parents when receiving or sending sexually explicit photos”, will not result in sexually explicit images being sent to Apple or reported to the authorities.

But parents can opt to be notified if their child sends or receives sexually explicit photos.
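Conceptually, the Messages feature is a local gate: an on-device model scores each image, and only the resulting decision (show, blur-and-warn, notify a parent) acts on it. Below is a minimal sketch of that decision logic in Python; the classifier stub, the score threshold and the field names are all assumptions, since Apple has not published the model or its cut-offs.

```python
from dataclasses import dataclass

# Hypothetical cut-off above which an image is treated as explicit.
EXPLICIT_THRESHOLD = 0.9

@dataclass
class ChildAccount:
    parental_alerts_enabled: bool

def explicitness_score(image_bytes: bytes) -> float:
    # Stub for the on-device ML model; the real model is not public.
    # Returns a score in [0, 1], where higher means more explicit.
    return 0.0

def handle_incoming_image(image_bytes: bytes, account: ChildAccount) -> dict:
    # Everything happens on the device: nothing is sent to Apple and
    # nothing is reported to the authorities, per Apple's description.
    score = explicitness_score(image_bytes)
    if score < EXPLICIT_THRESHOLD:
        return {"blur": False, "warn_child": False, "notify_parent": False}
    # Explicit: blur the image behind a warning, and alert the parent
    # only if that option has been switched on for this account.
    return {
        "blur": True,
        "warn_child": True,
        "notify_parent": account.parental_alerts_enabled,
    }
```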

Apple also plans updates to Siri and Search to provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

Encryption backdoor?

The Electronic Frontier Foundation (EFF) was quick to wade in and called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security”.

It said Apple’s move on encryption will also open a backdoor into people’s private lives.

“Apple has announced impending changes to its operating systems that include new ‘protections for children’ features in iCloud and iMessage,” said the EFF. “If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.”

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” it added.

“To say that we are disappointed by Apple’s plans is an understatement,” said the EFF. “Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Tom Jowitt
