Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.

Apple's full statement:

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available, but CSAM detection never launched.

Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." Now, after a year of silence, Apple has abandoned the CSAM detection plans altogether.

Apple promised its CSAM detection system was "designed with user privacy in mind." The system would have performed "on-device matching using a database of known CSAM image hashes" from child safety organizations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."

Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple said there would be a "threshold" ensuring "less than a one in one trillion chance per year" of an account being incorrectly flagged, plus human review of any flagged account.
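For illustration only, here is a minimal Swift sketch of the threshold idea described above: photo hashes are checked on-device against a stored set of known hashes, and an account only becomes a candidate for review once the match count crosses a threshold. The type names, the API, and the threshold value of 30 are assumptions made up for this sketch, not Apple's implementation, which layered cryptographic protocols (blinded hash databases and threshold secret sharing) on top of this basic idea.

import Foundation

// Illustrative sketch only: names and the threshold value are assumptions,
// not Apple's actual design.
struct KnownHashDatabase {
    // Stands in for the "unreadable set of hashes" Apple described storing on users' devices.
    private let hashes: Set<Data>

    init(hashes: Set<Data>) {
        self.hashes = hashes
    }

    func contains(_ imageHash: Data) -> Bool {
        hashes.contains(imageHash)
    }
}

struct AccountScanState {
    private(set) var matchCount = 0
    let reportThreshold: Int

    init(reportThreshold: Int = 30) { // assumed value, for illustration
        self.reportThreshold = reportThreshold
    }

    // Records one on-device check; only once the threshold is crossed would an
    // account even become a candidate for human review.
    mutating func record(imageHash: Data, against database: KnownHashDatabase) -> Bool {
        if database.contains(imageHash) {
            matchCount += 1
        }
        return matchCount >= reportThreshold
    }
}

One notable difference from the real proposal: in Apple's design the device would not learn whether any individual photo matched, and match information would only have become readable server-side after the threshold was exceeded.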

Apple's plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that the feature would have created a "backdoor" into devices, which governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person's iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion regarding this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Top Rated Comments

Populus
21 months ago
This is the right decision for Apple to make, in my opinion. I’m glad they recognized that there are better ways to prevent the spread of this type of content.

I’m sincerely surprised Apple backtracked on something as big as this (and under such heavy pressure from governments).
Score: 64 Votes
TheYayAreaLiving
21 months ago

Quoting another commenter: "Yeah, but now we can't catch the pedophiles."
That's law enforcement's and the government's job, not Apple's.
Score: 62 Votes
TheYayAreaLiving
21 months ago
Thank you, Apple. CSAM detection was a joke. "If privacy matters in your life, it should matter to the phone your life is on." Long live!

Score: 44 Votes
Realityck
21 months ago

Quoting the article: "Apple today announced that it has abandoned its plans to detect known CSAM stored in iCloud Photos, according to a statement shared with WIRED (https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/)."
Everybody should be happy CSAM detection is DOA.
Score: 30 Votes
aPple nErd
21 months ago
Great news, but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Score: 27 Votes
aPple nErd
21 months ago

Quoting earlier replies in the thread:
"[S]After extensive consultation with experts[/S] After extensive public pressure..."
"Never should have succumbed to the public feedback! Perhaps a botched introduction, but no one else would have done it 'right' like Apple because of the scrutiny they are under."
Truly an unhinged take.
Score: 26 Votes