Apple admits ‘jumbled messages’ over controversial photo-scanning tech

Apple executive Craig Federighi said the company failed to effectively explain the new child safety protections that scan iPhone users' photos for evidence of abuse.

In an interview with the Wall Street Journal, the personable Federighi said the “messages got jumbled” over the photo scanning policy, which has been met with fierce criticism from privacy advocates. Some have said the system, which aims to prevent child sexual abuse material (CSAM) being uploaded to iCloud, is tantamount to surveillance.

Apple published a Q&A on the matter last week, but it didn’t quell the criticism of the policy from rivals like WhatsApp.

Federighi says Apple wishes it could have been clearer over the policy rollout, which will come with iOS 15 in the United States, and which will also include tools to restrict the sharing of child sexual abuse material via iMessage.

“It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” Apple’s senior vice president of software engineering told the WSJ. “We wish that this would’ve come out a little more clearly for everyone because we feel very positive and strongly about what we’re doing.”

The announcement of the CSAM policy has somewhat damaged Apple’s reputation as a privacy-first company, with many worried about the ramifications if the company’s security were penetrated by a government, for instance.

Apple has said that images users are attempting to upload to iCloud are scanned against a list of known CSAM images from the National Center for Missing and Exploited Children (NCMEC) in the United States. All searches take place on the device rather than in the cloud.

Federighi also sought to assure innocent users that they won’t get flagged by false positives and find themselves in trouble with the law through no fault of their own. He said an account will only be flagged if the scans detect around 30 images that are known to the authorities.
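To make the threshold idea concrete, here is a minimal, hypothetical sketch of threshold-based flagging. It is not Apple’s actual system, which uses a perceptual hash (NeuralHash) and a cryptographic private set intersection protocol; this toy version uses plain SHA-256 exact matching purely to illustrate how a ~30-match threshold keeps isolated false positives from triggering a report.

```python
import hashlib

# Hypothetical illustration only: Apple's real system uses perceptual
# hashing and threshold secret sharing, not exact SHA-256 matching.
FLAG_THRESHOLD = 30  # roughly the figure Federighi cited


def hash_image(data: bytes) -> str:
    """Stand-in for a perceptual hash: here, a plain SHA-256 digest."""
    return hashlib.sha256(data).hexdigest()


def count_matches(image_blobs, known_hashes):
    """Count how many of a user's uploads match the known-image hash list."""
    return sum(1 for blob in image_blobs if hash_image(blob) in known_hashes)


def should_flag(image_blobs, known_hashes) -> bool:
    """An account is only flagged once matches reach the threshold."""
    return count_matches(image_blobs, known_hashes) >= FLAG_THRESHOLD


# Example with made-up data: 29 matches stay below the threshold, 30 cross it.
known = {hash_image(f"bad-{i}".encode()) for i in range(100)}
print(should_flag([f"bad-{i}".encode() for i in range(29)], known))  # False
print(should_flag([f"bad-{i}".encode() for i in range(30)], known))  # True
```

The point of the threshold is that one or two coincidental matches never surface anywhere; only a sustained pattern of known-image matches crosses the reporting line.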

There are currently no plans to roll out the system in the UK or other countries, Federighi says in the interview, but it will be considered on a case-by-case basis. The “hashes” used to detect the images will ship with all versions of iOS 15, but they won’t be used for scanning anywhere but the US.

Federighi said the system will have “multiple levels of auditability” depending on the country the policy rolls out in. That will mean “you don’t have to trust any one entity, or even any one country, as far as what images are part of this process.”

