Apple Acknowledges ‘Confusion’ Over ‘iPhone Scanning’ System

Apple has acknowledged that its announcement of tools to scan for illegal images on iPhones and iPads was “jumbled pretty badly”.

Following criticism from privacy campaigners, the company has now given more details on the system, saying device-level scanning would allow independent experts to verify how Apple was using the system and what was being scanned for.

On 5 August Apple announced it would scan images uploaded from iPhones and iPads to its iCloud storage, looking for matches against a database of known child sex abuse material (CSAM) maintained by the US National Center for Missing and Exploited Children (NCMEC).

Companies that operate cloud-based services, including Facebook, Google and Microsoft, commonly scan for CSAM, but do so remotely.

Upload scanning

Apple said it plans to add hashes from the CSAM database directly to iPhones and iPads in an operating system update later this year, and that devices will scan images before they are uploaded to iCloud.

An image is to be scanned only when a user uploads it to iCloud, and the system only detects exact matches against the database.

The system would not flag images of a person’s children in the bath, or search for pornography, Apple’s head of software, Craig Federighi, told The Wall Street Journal.

He said the announcement was “misunderstood” and that people had become concerned that Apple was scanning iPhones for images.

“That is not what is happening,” Federighi said.

“We feel very positively and strongly about what we’re doing and we can see that it’s been widely misunderstood.”

Account review

If a user tries to upload several CSAM images, the account will be flagged for review by Apple staff.

Federighi said this would only happen if the user tried to upload in the region of 30 matching images.
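The mechanism described above can be sketched in miniature. This is a hypothetical illustration only: Apple's actual system reportedly uses a perceptual hash and cryptographic matching techniques on-device, not the plain SHA-256 lookup shown here, and the 30-image threshold is the approximate figure Federighi cited.

```python
import hashlib

# Approximate review threshold cited by Federighi.
REVIEW_THRESHOLD = 30

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual image hash; here simply SHA-256 of the bytes."""
    return hashlib.sha256(data).hexdigest()

def check_uploads(uploads, known_hashes):
    """Count uploads whose hash appears in the known-CSAM database and
    report whether the account would be flagged for human review."""
    matches = sum(1 for img in uploads if image_hash(img) in known_hashes)
    return matches, matches >= REVIEW_THRESHOLD
```

In this sketch, nothing about an account is surfaced until the match count reaches the threshold, mirroring the design Apple described in which isolated or accidental matches do not trigger a review.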

Apple said it plans to add the same database to all versions of iOS and iPadOS, but that it would only be used for scanning in the US initially, with rollouts in other countries to be considered on a case-by-case basis.

Apple said putting the database on the device would add accountability and that an independent auditor would be able to verify what was being scanned for.

‘Confusion’

The company is also rolling out a separate parental control that involves image-scanning, and Federighi said there had been “confusion” between the two.

If activated by a parent, the second feature scans messages sent or received by a child using the iMessage app. If nudity is detected the tool obscures the photo and warns the child.

Parents can also choose to receive an alert if the child chooses to view the photo.

Privacy groups said the tool could be expanded and used by authoritarian governments to spy on citizens.

Will Cathcart, head of WhatsApp, said Apple’s tools were “very concerning”, and whistleblower Edward Snowden called the iPhone a “spyPhone”.

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
