Apple delays plans to roll out image-scanning technology on iPhones to search for child-abuse images before material is uploaded to iCloud
Apple has delayed plans to roll out image-scanning technology on iPhones that proved controversial after it was announced last month.
The company said it wanted to take additional time to “collect input and make improvements” to the technology.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a statement.
The image-scanning tool was designed to scan images on an iPhone for matches with known child sexual abuse material, or CSAM, before the images were uploaded to iCloud.
It’s standard practice for large companies that operate cloud services to scan material for such images, but Apple’s approach would have placed the scanning technology on the device, rather than in the cloud.
The company said this would ensure a greater degree of privacy as third parties would be able to verify exactly what was being scanned for.
Companies search against a CSAM database maintained by the National Center for Missing & Exploited Children.
Under Apple’s system, any matches would be flagged for review by a human, who could, if necessary, disable the user’s account and make a report to law enforcement.
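In highly simplified form, the matching step described above amounts to checking image fingerprints against a set of known hashes and flagging an account once enough matches accumulate. Apple’s actual design used a perceptual “NeuralHash” and cryptographic private-set-intersection techniques; the plain SHA-256 lookup, placeholder hash values, and threshold below are illustrative assumptions, not Apple’s implementation:

```python
import hashlib

# Placeholder stand-in for a database of hashes of known CSAM images.
# (Real systems use perceptual hashes so near-duplicates still match.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Hypothetical threshold: require several matches before flagging,
# so a single false positive does not trigger human review.
MATCH_THRESHOLD = 2

def scan_before_upload(image_bytes_list):
    """Return True if this upload batch should be flagged for human review."""
    matches = sum(
        1 for data in image_bytes_list
        if hashlib.sha256(data).hexdigest() in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD

# Two images match the database here, so the batch would be flagged.
flagged = scan_before_upload([b"known-image-1", b"known-image-2", b"holiday-photo"])
```

The controversy centred less on this matching logic than on where it runs: placing the check on the device, rather than on the server, is what critics argued could be repurposed.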
It was due to launch sometime this year.
Privacy groups were concerned that the on-device scanning system could be repurposed by authoritarian governments to spy on users.
The Electronic Frontier Foundation said Apple’s system amounted to an attempt to “build a backdoor” into its data storage and messaging systems.
It gathered 25,000 signatures from consumers opposed to the plan.
More than 90 privacy groups urged Apple in an open letter to cancel the scheme.
Apple has previously presented itself as an advocate of privacy and end-to-end encryption.