
WhatsApp says Apple’s Child Safety tools are a dangerous ‘surveillance system’

Facebook is continuing its war of words with Apple, with the head of the company’s WhatsApp chat app taking aim at Apple’s newly announced Child Safety features.

In a lengthy thread on Twitter, WhatsApp’s Will Cathcart said he was “concerned” about the approach, which will include scanning iPhone users’ photos to check for child sexual abuse material (CSAM) before they are uploaded to iCloud.

Cathcart said the new feature amounted to a “surveillance system” and hit out at software that can “scan all the private photos on your phone.” He claimed the system could eventually become a back door for governments to spy on citizens, something Apple has vehemently opposed in the past.

The WhatsApp executive said: “Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.”

He went on to say: “This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.”

In an explainer on Friday, Apple said it had built tech that can scan photos earmarked for iCloud uploads on the device, in a manner that protects user privacy.

The firm said: “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

“This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
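To make the idea of on-device matching concrete, here is a deliberately simplified Python sketch. It is not Apple’s actual system: Apple describes a perceptual hash (“NeuralHash”) combined with private set intersection, so that the device does not learn the match result directly. The hash function, database contents, and function names below are all hypothetical placeholders used purely for illustration.

```python
import hashlib

# Hypothetical database of known image hashes (placeholder values only).
# In Apple's described system these would be perceptual hashes supplied
# in blinded form, not plain SHA-256 digests.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image").hexdigest(),
}

def matches_known_hashes(image_bytes: bytes) -> bool:
    """Hash the image on-device and check it against the known set.

    Apple's real design uses private set intersection, so neither the
    device nor the server sees a plain yes/no like this; the result is
    instead encoded into an encrypted "safety voucher" attached to the
    iCloud upload.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_hashes(b"example-known-image"))  # True
print(matches_known_hashes(b"some-other-photo"))     # False
```

The key difference from this sketch is that, per Apple’s explainer, the matching result stays cryptographically hidden on both sides until a threshold of matches is reached.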

The features also include new image recognition tools in iMessage and guidance within Siri and Search pertaining to CSAM material.

While the features may help identify offending and illegal material and bring perpetrators and abusers to justice, it’s clear there is widespread concern over the approach and the potential for collateral damage. Apple has long held the high ground over companies like Facebook when it comes to user privacy, but it may be at risk of ceding some of that ground with the new Child Safety tools.

Cathcart added: “There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.”

The entire thread is certainly worth a read. Cathcart defended WhatsApp’s approach, saying it was able to report a worrying 400,000 cases to the authorities without breaking encryption.

Why trust our journalism?

Founded in 2003, Trusted Reviews exists to give our readers thorough, unbiased and independent advice on what to buy.

Today, we have millions of users a month from around the world, and assess more than 1,000 products a year.


Editorial independence

Editorial independence means being able to give an unbiased verdict about a product or company, while avoiding conflicts of interest. To ensure this is possible, every member of the editorial staff follows a clear code of conduct.


Professional conduct

We also expect our journalists to follow clear ethical standards in their work. Our staff members must strive for honesty and accuracy in everything they do. We follow the IPSO Editors’ code of practice to underpin these standards.
