Trusted Reviews is supported by its audience. If you purchase through links on our site, we may earn a commission. Learn more.

Controversial iMessage child safety tools are coming to the UK

Apple is bringing a new child safety feature for its iMessage app to the UK. The “communications safety in iMessages” feature, already available to users in the United States, is now rolling out to the UK and Canada.

The idea behind the feature is to prevent younger iPhone users from being exposed to age-inappropriate content shared via the platform. It’s an opt-in feature for parents that scans all photos sent and received via iMessage for nudity, blurring offending content from young eyes.

The sender will be cautioned against sharing the image and prompted to message an adult. Recipients will get a sensitive content warning and be provided with means of contacting relevant child safety groups. Apple insists that everything is done on the device, with nothing uploaded to the cloud for scanning. The company is also adding expanded guidance in Spotlight search, Safari search and Siri.

When announcing the feature last year, Apple said it provided “additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report child exploitation will be pointed to resources for where and how to file a report.”

However, one of the more controversial elements of the overall proposition, initially announced by Apple last year, will not be part of this stage of the rollout. Apple had planned to scan all photos on the device for known images of child sexual abuse prior to their upload to iCloud, but received considerable pushback. Apple tells Trusted Reviews there is no update on whether and when this element will roll out, or in what form.

The earlier announcement from the company also said parents would be alerted to the offending content, but that is no longer the case.

“Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” Apple explains on its child safety website.

“The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”

The features will roll out in a future software update in the next few weeks, Apple says.

Why trust our journalism?

Founded in 2003, Trusted Reviews exists to give our readers thorough, unbiased and independent advice on what to buy.

Today, we have millions of users a month from around the world, and assess more than 1,000 products a year.


Editorial independence

Editorial independence means being able to give an unbiased verdict about a product or company, while avoiding conflicts of interest. To ensure this is possible, every member of the editorial staff follows a clear code of conduct.


Professional conduct

We also expect our journalists to follow clear ethical standards in their work. Our staff members must strive for honesty and accuracy in everything they do. We follow the IPSO Editors’ code of practice to underpin these standards.

