
Apple confirms that it has stopped plans to roll out CSAM detection system

Back in 2021, Apple announced a number of new child safety features, including Child Sexual Abuse Material (CSAM) detection for iCloud Photos. However, the move was widely criticized due to privacy concerns. After putting it on hold indefinitely, Apple has now confirmed that it has stopped its plans to roll out the CSAM detection system.

Apple will no longer scan for CSAM in iCloud Photos

On the same day that the company announced Advanced Data Protection with end-to-end encryption for all iCloud data, it also put an end to the never-released CSAM scan. The news was confirmed by Apple’s senior vice president of software engineering Craig Federighi in an interview with WSJ’s Joanna Stern.

When the CSAM scan was announced, Apple said that iCloud Photos would be able to detect known CSAM in users’ photos based on a database of CSAM image hashes. That way, the company would be able to flag such photos using on-device processing without Apple ever viewing users’ photos.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. Private set intersection (PSI) allows Apple to learn if an image hash matches the known CSAM image hashes, without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match.
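The matching flow described above can be sketched in vastly simplified form. The following Python sketch uses hypothetical stand-in values and an ordinary cryptographic hash; the real system uses NeuralHash (a perceptual hash) and performs the comparison inside a private set intersection protocol so that neither side learns the result of any single check, none of which is reproduced here:

```python
import hashlib

# Hypothetical stand-in for the on-device database of known CSAM hashes.
# In the real system this set is shipped in an unreadable (blinded) form.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-1").hexdigest(),
    hashlib.sha256(b"example-flagged-image-2").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    # Placeholder: the real system derives a NeuralHash from image content,
    # so visually similar images hash alike; SHA-256 does not have that property.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_set(image_bytes: bytes) -> bool:
    # In the real protocol this membership test happens inside a private
    # set intersection, so the match/no-match outcome is not revealed
    # to the device or, below a threshold of matches, to the server.
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known_set(b"example-flagged-image-1"))  # True
print(matches_known_set(b"holiday-photo"))            # False
```

The key property the sketch cannot show is the cryptography: a plain set lookup reveals the result to whoever runs it, which is exactly what PSI is designed to prevent.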

Even so, the CSAM scan resulted in a lot of criticism from users. In a statement to 9to5Mac last year, the company said it chose to “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

It turns out that Apple has now decided it would be better to put an end to CSAM scanning in iCloud Photos. It’s worth noting, however, that other child safety features, such as Communication Safety in Messages, are still available in iOS.

What are your thoughts on this decision? Let us know in the comments below.


You’re reading 9to5Mac — experts who break news about Apple and its surrounding ecosystem, day after day. Be sure to check out our homepage for all the latest news, and follow 9to5Mac on Twitter, Facebook, and LinkedIn to stay in the loop. Don’t know where to start? Check out our exclusive stories, reviews, how-tos, and subscribe to our YouTube channel.


Author

Filipe Espósito

Filipe Espósito is a Brazilian tech journalist who started covering Apple news on iHelp BR with some exclusive scoops — including the reveal of the new Apple Watch Series 5 models in titanium and ceramic. He joined 9to5Mac to share even more tech news around the world.