
Apple Cancels System to Detect Child Sexual Abuse Material on iPhones

The controversial system would have flagged child sexual abuse material uploaded to iCloud from iPhones. Critics said it posed a major privacy risk that could ensnare innocent users.

By Michael Kan
December 7, 2022
(Photo by Alexander Pohl/NurPhoto via Getty Images)

Apple is dropping a controversial plan to use consumers' iPhones to flag child sexual abuse material being uploaded to iCloud. 

Apple confirmed the news to The Wall Street Journal and Wired as it announced plans to expand the end-to-end encryption found in iMessage to iCloud Backups, Photos, and Notes. 

Cupertino is ceasing work on the CSAM-detection system after consumers, privacy groups, and even WhatsApp argued that it posed a major privacy risk that could ensnare innocent users.  

Apple’s detection system was first introduced in August 2021 as a way to help keep child sexual abuse material from proliferating online. It was designed to run on the iPhone itself, detecting suspected child sexual abuse images and flagging an iCloud account once at least 30 such images had been uploaded to it. Apple could then investigate and notify the National Center for Missing and Exploited Children if illegal activity was found.
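To illustrate only the threshold idea, here is a minimal Python sketch that counts how many uploaded image hashes match a list of known hashes and flags an account only once 30 or more matches accumulate. This is a hypothetical simplification, not Apple's actual design, which relied on a perceptual hash (NeuralHash) and cryptographic safeguards rather than plain comparisons; all names and data below are invented.

# Illustrative sketch only -- not Apple's actual implementation.
MATCH_THRESHOLD = 30  # an account is flagged only after this many suspected matches

def count_matches(uploaded_hashes, known_hashes):
    """Count how many uploaded image hashes appear in the known-image hash set."""
    return sum(1 for h in uploaded_hashes if h in known_hashes)

def should_flag_account(uploaded_hashes, known_hashes, threshold=MATCH_THRESHOLD):
    """Return True only once the number of matches reaches the threshold."""
    return count_matches(uploaded_hashes, known_hashes) >= threshold

# Hypothetical example data: the hash values are arbitrary placeholders.
known = {"hash_a", "hash_b", "hash_c"}
uploads = ["hash_x"] * 100 + ["hash_a"] * 10  # only 10 matches, below the threshold

print(should_flag_account(uploads, known))  # False: 10 matches < 30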

Apple iPhones on display.
(Photographer: Jason Alden/Bloomberg via Getty Images)

Cupertino built various safeguards to prevent the system from making errors. Nevertheless, privacy groups feared the same detection system could be abused by foreign governments to scan for other content. NSA leaker Edward Snowden even chimed in and described Apple’s approach as spyware.

The backlash was enough for Apple to hit pause on the system while it gathered more feedback and worked on improvements to the technology. Then, in December 2021, Apple quietly scrubbed mention of the detection system from its website, although a company spokesperson said “nothing” had changed with Cupertino’s plans.

Wired now reports that feedback from privacy experts prompted Apple to shelve the system. Instead, the company has focused its efforts on other tools to stop the sharing of child sexual abuse material, including warning children when they receive or send photos that contain nudity.

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” Apple tells Wired. “Children can be protected without companies combing through personal data.”

The child sexual abuse material detection system also risked undermining Apple’s promise of providing unmatched privacy on the iPhone. On Tuesday, the company introduced new end-to-end encryption features, which can prevent hackers, law enforcement, and even Apple itself from accessing data in your iCloud Backups. 

“At Apple, we are unwavering in our commitment to provide our users with the best data security in the world,” Apple SVP Craig Federighi said in unveiling the new features.


About Michael Kan

Senior Reporter

I've been with PCMag since October 2017, covering a wide range of topics, including consumer electronics, cybersecurity, social media, networking, and gaming. Prior to working at PCMag, I was a foreign correspondent in Beijing for over five years, covering the tech scene in Asia.
