Trusted Reviews is supported by its audience. If you purchase through links on our site, we may earn a commission. Learn more.

EFF has started a petition to stop Apple from scanning iPhones

Earlier in August, Apple unveiled a controversial plan to scan user photos for child abuse images. Now, the Electronic Frontier Foundation is fighting back with a petition addressed to Apple.

The update will involve scanning user images on-device for Child Sexual Abuse Material (CSAM) by matching them against a database of known CSAM image hashes.
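At its simplest, this kind of matching amounts to checking each image's hash against a set of known hashes. The sketch below illustrates the idea in Python; note that Apple's actual system uses a perceptual "NeuralHash" (which tolerates resizing and re-encoding) rather than a cryptographic hash, and the hash set and function names here are purely illustrative.

```python
import hashlib

# Illustrative stand-in for a database of known image hashes.
# Apple's real system matches perceptual NeuralHash values derived
# from NCMEC's database; SHA-256 is used here only for demonstration,
# and would only match byte-identical copies of an image.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

A byte-identical copy of a listed image matches, while any other image does not, which is why a perceptual hash, rather than a cryptographic one, is needed in practice to catch edited or re-compressed copies.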

If a match is found, Apple will create a cryptographic safety voucher and upload it to the user’s iCloud account alongside the image. Once an account crosses a threshold of matches, Apple can review the flagged images; confirmed matches result in the account being disabled and a report being sent to the National Center for Missing and Exploited Children (NCMEC), which can then alert US law enforcement agencies. 

Apple is also rolling out safety tools in iMessage which will detect if an inappropriate image has been sent to a child. iMessage will then blur the image and warn the child before asking if they still want to view it. 

If a parent opts into certain parental settings, they’ll also be alerted if the child chooses to view the image. The same process applies if a child attempts to send an explicit image. 

The update has been met with criticism by privacy advocates and rivals alike, with WhatsApp CEO Will Cathcart calling it an “Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.” 

Now, the Electronic Frontier Foundation (EFF) – a non-profit organisation dedicated to defending civil liberties in the digital world – has started a petition urging Apple not to scan phones. 

“Apple has abandoned its once-famous commitment to security and privacy,” writes EFF in the description of the petition. “The next version of iOS will contain software that scans users’ photos and messages. Under pressure from U.S. law enforcement, Apple has put a backdoor into their encryption system.”

EFF also warns that Apple could be pressured into expanding the system to search for additional types of content. 

“The system will endanger children, not protect them—especially LGBTQ kids and children in abusive homes. Countries around the world would love to scan for and report matches with their own database of censored material, which could lead to disastrous results, especially for regimes that already track activists and censor online content.”

Trusted Reviews has reached out to both EFF and Apple for comment.

Why trust our journalism?

Founded in 2003, Trusted Reviews exists to give our readers thorough, unbiased and independent advice on what to buy.

Today, we have millions of users a month from around the world, and assess more than 1,000 products a year.

Editorial independence

Editorial independence means being able to give an unbiased verdict about a product or company, while avoiding conflicts of interest. To ensure this is possible, every member of the editorial staff follows a clear code of conduct.

Professional conduct

We also expect our journalists to follow clear ethical standards in their work. Our staff members must strive for honesty and accuracy in everything they do. We follow the IPSO Editors’ code of practice to underpin these standards.
