Apple To Scan Children's Messages In UK For Nudity

Nudity-blurring Messages feature to protect children will roll out to additional countries, including the UK, Canada, Australia and NZ

Apple’s Messages feature designed to blur images containing nudity sent to children is being rolled out to additional countries.

The feature, which Apple calls “Communication Safety in Messages”, launched in the United States last year and is designed to automatically blur nudity in images sent via the company’s messaging service.

The feature is now coming to the Messages apps on iOS, iPadOS, and macOS for users in the UK, Canada, New Zealand, and Australia, the Guardian reported. Exact timing is unclear, but the newspaper said the feature is coming to the UK “soon.”


No nudity

Apple describes its “Communication Safety in Messages” option as a safety feature that uses AI technology to scan messages sent to and from children.

It appears the feature is opt-in: parents activate it and turn on warnings for their children’s iPhones.

When enabled, all photos sent or received by the child using the Messages app will be scanned for nudity, the Guardian reported.

If nudity is found in a photo received by a child with the setting turned on, the photo will be blurred, the child will be warned that it may contain sensitive content, and they will be nudged towards resources from child safety groups, the Guardian reported.

If nudity is found in photos sent by a child, similar protections kick in: the child will be encouraged not to send the images, and given an option to “Message a Grown-Up”.

All the scanning is carried out “on-device”, meaning that the images are analysed by the iPhone itself, and Apple never sees either the photos being analysed or the results of the analysis, Apple said.

“Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” Apple is quoted as saying in a statement. “The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”
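Apple has not published the internals of the feature, but the description above maps onto a straightforward on-device flow: decrypt the message as normal, run a local classifier over the attachment, and use the verdict only to decide how the photo is displayed. The Swift sketch below is purely illustrative; `analyseImageLocally`, `blurAndWarn` and `showNormally` are hypothetical placeholders, not Apple APIs.

```swift
import Foundation

// Hypothetical verdict from an on-device classifier; Apple's actual model
// and API are not public, so this is illustrative only.
struct SensitivityVerdict {
    let containsNudity: Bool
}

// Placeholder for the local ML model. In Apple's design the analysis
// happens entirely on the device.
func analyseImageLocally(_ imageData: Data) -> SensitivityVerdict {
    // A real implementation would run an image classifier here.
    return SensitivityVerdict(containsNudity: false)
}

func blurAndWarn(_ imageData: Data) { /* blur the photo, show the warning, surface resources */ }
func showNormally(_ imageData: Data) { /* display as usual */ }

// Sketch of the receive path: the attachment is analysed locally and only
// the UI decision (blur or show) is made from the result.
func presentIncomingAttachment(_ imageData: Data) {
    let verdict = analyseImageLocally(imageData)
    if verdict.containsNudity {
        blurAndWarn(imageData)
    } else {
        showNormally(imageData)
    }
    // Nothing about `verdict` leaves the device; no report is sent to
    // Apple or to the parent.
}
```

Because the classification happens after decryption on the handset itself, the end-to-end encryption of the conversation is untouched and the result never needs to be transmitted anywhere.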

The Guardian reported that Apple has also dropped a number of controversial options from the update, including an automatic alert to parents if young children (those under 13) send or receive such images.

Apple is also reportedly introducing a set of features intended to intervene when content related to child exploitation is searched for in Spotlight, Siri or Safari.

CSAM Scanning

Apple has already encountered pushback from campaigners, when in August last year more than 90 privacy groups urged the tech giant to abandon plans to scan children’s messages, and the iPhones of adults, for child sexual abuse material (CSAM).

Apple had surprised many earlier in August 2021, when it suddenly announced that it would scan iPhone photo libraries being uploaded to iCloud for known images of child sexual abuse.

The move was immediately slammed by campaigners, who also accused the iPhone maker of creating an encryption backdoor by using AI to scan children’s encrypted messages for sexual red flags.

Apple called the development ‘Expanded Protections for Children’, and it meant that Apple would scan photo libraries stored on the iPhones of adults in the US for known images of child sexual abuse (Child Sexual Abuse Material, or CSAM) before they are uploaded to iCloud.

Before the images are uploaded to iCloud, they are compared against a database of known child abuse imagery compiled by child protection agencies.

If a strong enough match is flagged, Apple staff will be able to manually review the reported images and, if child abuse is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children (NCMEC) notified.
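The comparison Apple described is hash-based rather than direct content inspection: each photo is reduced to a fingerprint and checked against fingerprints of known abuse imagery, with human review only after enough matches accumulate. The Swift sketch below is a heavily simplified illustration of that idea; `perceptualHash`, `knownCSAMHashes` and `reviewThreshold` are made-up names, and the cryptographic machinery Apple actually proposed (its NeuralHash algorithm, private set intersection and threshold secret sharing) is omitted.

```swift
import Foundation

// Hash list supplied by child protection agencies (illustrative, empty here).
let knownCSAMHashes: Set<String> = []

// Number of matches required before any human review (illustrative value).
let reviewThreshold = 30

// Placeholder: a real perceptual hash maps visually similar images to the
// same value, so re-encoded or resized copies still match.
func perceptualHash(of imageData: Data) -> String {
    return String(imageData.hashValue)  // NOT a perceptual hash; illustration only
}

// Before upload, each image's fingerprint is checked against the known list.
func matchCount(for images: [Data]) -> Int {
    images.filter { knownCSAMHashes.contains(perceptualHash(of: $0)) }.count
}

// Only once enough matches accumulate is the account flagged for manual
// review; a confirmed finding leads to the account being disabled and
// NCMEC being notified.
func shouldFlagForReview(images: [Data]) -> Bool {
    matchCount(for: images) >= reviewThreshold
}
```

The threshold is the key design choice in this description: a single false match on one photo is never enough, by itself, to trigger a report.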

Apple said that governments cannot force it to add non-CSAM images to a hash list, meaning the feature cannot be used for political purposes.