
Apple Child Safety update will scan photos for abusive material, warn parents

Apple has announced a raft of new measures aimed at keeping children safe on its platform and limiting the spread of child sexual abuse images.

As well as new safety tools in iMessage, Siri and Search, Apple is planning to scan users’ iCloud uploads for Child Sexual Abuse Material (CSAM). That’s sure to be controversial among privacy advocates, even if many will argue the ends justify the means.

The company is planning on-device scanning of images that will take place before a photo is uploaded to the cloud. Each image will be checked against a database of known ‘image hashes’ that can detect offending content. Apple says this approach ensures the privacy of everyday users is protected.

Should the tech discover CSAM images, the iCloud account in question will be frozen and the images will be reported to the National Center for Missing and Exploited Children (NCMEC), which can then refer the matter to law enforcement agencies.

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple writes in an explainer.

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
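To make that abstract description more concrete, here is a deliberately simplified Swift sketch of the general idea: hash a photo on-device, check it against a known list, and attach a voucher before upload. This is not Apple’s implementation. The real system uses a perceptual “NeuralHash” and private set intersection so the match result stays encrypted, and every name and type below is invented for illustration.

```swift
import Foundation
import CryptoKit

// Simplified sketch of "hash on-device, check against a known list,
// attach a voucher before upload". Apple's real pipeline uses a
// perceptual NeuralHash and private set intersection; none of that
// cryptography is reproduced here, and all names are hypothetical.

/// Stand-in for the blinded database of known CSAM hashes shipped to devices.
let knownHashes: Set<String> = []

/// Digest of the photo bytes. A real perceptual hash survives resizing and
/// recompression; SHA-256 here is only a placeholder for the concept.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// The "safety voucher" uploaded alongside the image. In Apple's design the
/// match result is encrypted and only becomes readable once an account
/// crosses a threshold of matches; a plain Bool is used here for clarity.
struct SafetyVoucher {
    let imageDigest: String
    let matchedKnownHash: Bool
}

func makeVoucher(for imageData: Data) -> SafetyVoucher {
    let d = digest(of: imageData)
    return SafetyVoucher(imageDigest: d, matchedKnownHash: knownHashes.contains(d))
}
```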

Elsewhere, the new iMessage tools are designed to keep children safe from online exploitation. If a child receives an image that on-device image-detection tech deems inappropriate, it will be blurred and the child will be warned and “presented with helpful resources, and reassured it is okay if they do not want to view this photo.”

Depending on the parental settings, parents will be informed if the kid goes ahead and views the image. “Similar protections are available if a child attempts to send sexually explicit photos,” Apple says. “The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.” Again, on-device image detection tech is used.
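The flow Apple describes for Messages boils down to a simple decision: if on-device detection flags an image as explicit, blur it and warn the child, and notify parents only if the relevant setting is enabled. The sketch below models that logic with invented names; it is not Apple’s API.

```swift
// Hypothetical model of the Messages flow described above; every type and
// name here is invented for illustration and is not an Apple API.

struct ChildSafetySettings {
    let notifyParentsOnView: Bool   // parent alert if the child views anyway
    let notifyParentsOnSend: Bool   // parent alert if the child sends anyway
}

enum IncomingImageAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

func handleIncomingImage(flaggedAsExplicit: Bool,
                         settings: ChildSafetySettings) -> IncomingImageAction {
    guard flaggedAsExplicit else { return .showNormally }
    // Blur the photo, warn the child and offer resources; whether parents
    // hear about a subsequent view depends on the account's settings.
    return .blurAndWarn(notifyParentsIfViewed: settings.notifyParentsOnView)
}
```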

Finally, new guidance in Siri and Search will help iPhone and iPad owners stay safe online and file reports with the relevant authorities.

The company adds: “Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

These updates are coming in iOS/iPadOS 15.

