Apple Defends Photo Checking On iPhones

Apple promises to reject any government demands to use, or alter, its controversial child abuse image detection system

Apple has pledged not to allow repressive governments to abuse its controversial child sexual abuse image detection system.

Last week Apple surprised many when it suddenly announced, in an unprecedented move, that it will scan iPhone photo libraries for known images of child sexual abuse.

Apple on Sunday evening defended the move, but did not respond to the specific concern that it will carry out checks of images on people’s handsets themselves, unlike its competitors, who only scan images after they have been uploaded.

CSAM scanning

Apple’s move last week was praised by child safety campaigners, but was slammed by privacy campaigners worried that the technology could be abused by authoritarian governments.

Campaigners have also accused the iPhone maker of creating an encryption backdoor, by scanning encrypted messages for red flags.

Apple calls the development ‘Expanded Protections for Children’, and it means that Apple will scan photo libraries stored on iPhones in the US for known child sexual abuse material (CSAM) before the images are uploaded to iCloud.

Before the images are uploaded to iCloud, they will be compared against a database of known child abuse imagery compiled by child protection agencies.

If a strong enough match is flagged, then Apple staff will be able to manually review the reported images, and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.
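
To make that matching-and-threshold flow concrete, here is a minimal, hypothetical sketch in Swift. The names (`KnownHashList`, `AccountMatchState`, `reviewThreshold`) and the plain integer hashes are illustrative assumptions only; Apple’s actual system uses perceptual hashing and cryptographic safeguards that are not reproduced here.

```swift
// Minimal sketch of the matching flow described above, with invented names.
// A plain Set lookup and a simple counter stand in for Apple's real
// perceptual-hash matching and cryptographic threshold mechanism.

struct KnownHashList {
    // Numeric fingerprints of known CSAM images, as distributed to the device.
    let hashes: Set<UInt64>

    func matches(_ imageHash: UInt64) -> Bool {
        hashes.contains(imageHash)
    }
}

struct AccountMatchState {
    let reviewThreshold: Int        // account is only flagged past this count
    private(set) var matchCount = 0

    init(reviewThreshold: Int) {
        self.reviewThreshold = reviewThreshold
    }

    // Called for each photo before it is uploaded to iCloud Photos.
    mutating func record(imageHash: UInt64, against list: KnownHashList) -> Bool {
        if list.matches(imageHash) {
            matchCount += 1
        }
        // Only a "strong enough" match (enough hits) triggers human review.
        return matchCount >= reviewThreshold
    }
}

// Usage: the hashes here are placeholders, not real fingerprints.
let list = KnownHashList(hashes: [0x1A2B, 0x3C4D])
var state = AccountMatchState(reviewThreshold: 2)
for hash in [0x1A2B, 0x9999, 0x3C4D] as [UInt64] {
    if state.record(imageHash: hash, against: list) {
        print("Flag account for manual review before any report to NCMEC")
    }
}
```

The key design point, as Apple describes it, is that a single stray match does nothing on its own; only an accumulation of matches to known images leads to human review, and only confirmed abuse leads to a report.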

Apple says this will work on iOS and iPadOS operating systems, and since the tool only looks for images that are already in NCMEC’s database, parents taking photos of a child in the bath, for example, apparently need not worry.

Apple defence

Apple said in an FAQ document posted to its website on Sunday that governments cannot force it to add non-CSAM images to the hash list. This is the file of numbers corresponding to known child sexual abuse images that Apple will distribute to iPhones to enable the system.

“Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app,” said Apple.

“It works only on images sent or received in the Messages app for child accounts set up in Family Sharing,” it added. “It analyses the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo.”

“As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it,” Apple said.
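
The Messages feature Apple describes above amounts to an on-device decision about whether to show, blur or escalate an image. The Swift sketch below is illustrative only: the types, the `imageLooksExplicit` flag and the parent-notification step are assumptions standing in for Apple’s unpublished on-device analysis.

```swift
// Illustrative-only sketch of the Messages "communication safety" flow quoted above.
// The classifier result and notification step are assumptions, not Apple's API.

enum ExplicitImageDecision {
    case showNormally
    case blurAndWarn(notifyParents: Bool)
}

struct CommunicationSafetyPolicy {
    let isChildAccount: Bool       // account set up in Family Sharing
    let isYoungChild: Bool         // younger children can trigger a parent alert

    func decision(imageLooksExplicit: Bool) -> ExplicitImageDecision {
        // Feature only applies to child accounts; adult accounts are unaffected.
        guard isChildAccount, imageLooksExplicit else { return .showNormally }
        return .blurAndWarn(notifyParents: isYoungChild)
    }
}

// Usage with a pretend on-device classifier result.
let policy = CommunicationSafetyPolicy(isChildAccount: true, isYoungChild: true)
switch policy.decision(imageLooksExplicit: true) {
case .showNormally:
    print("Display the image as usual")
case .blurAndWarn(let notifyParents):
    print("Blur the image, warn the child, offer resources")
    if notifyParents {
        print("Tell the child their parents will be notified if they view it")
    }
}
```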

“The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images,” said Apple.

“This feature only impacts users who have chosen to use iCloud Photos to store their photos,” said Apple. “It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.”

iPhone scanning?

It should be noted that other tech companies including Microsoft, Google and Facebook have for years been sharing digital fingerprints of known child sexual abuse images.

Apple itself has used those fingerprints to scan user files stored in its iCloud service for child abuse images.

Apple, however, believes its system is more private than those used by other companies, because it uses both Apple’s servers and software that will be installed on people’s iPhones through an iOS update.

But this new direction is the first time a tech company will actively scan images on the device itself, an unprecedented development.

And it has caused serious concerns among privacy campaigners.

Yet Apple insisted that it is not going to scan all the photos stored on a person’s iPhone.

“By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM,” said Apple. “The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.”
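
Read literally, that statement amounts to a simple gating condition: nothing is evaluated unless iCloud Photos is enabled and the photo is queued for upload. The short sketch below (with invented names such as `ScanGate`) illustrates the claim, not Apple’s implementation.

```swift
// Hypothetical sketch of the gating Apple describes: nothing is checked
// unless the user has iCloud Photos enabled and the photo is queued for upload.
struct ScanGate {
    let iCloudPhotosEnabled: Bool

    func shouldCheck(photoQueuedForUpload: Bool) -> Bool {
        // Photos kept only in the on-device library are never evaluated.
        return iCloudPhotosEnabled && photoQueuedForUpload
    }
}

let gate = ScanGate(iCloudPhotosEnabled: false)
print(gate.shouldCheck(photoQueuedForUpload: true))   // false: iCloud Photos disabled
```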

And Apple categorically denied that governments could force it to add non-CSAM images to the hash list.

The fear is that repressive governments could add political images to the hash list, so they could track or identify political opponents.

“Apple will refuse any such demands,” Apple said. “Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups.”

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” Apple said. “We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

“Furthermore, Apple conducts human review before making a report to NCMEC,” it added. “In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.”