Apple exec: the company is "the greatest platform for distributing child porn"

Apple has confirmed that it has been scanning iCloud Mail for child sexual abuse material (CSAM) since 2019. The company does not, however, scan iCloud Photos or iCloud backups.
This information came out after 9to5Mac asked Apple for comment on a bizarre statement by Eric Friedman, the head of the company's anti-fraud unit, in which he remarked that Apple is "the greatest platform for distributing child porn." If Apple doesn't scan users' photos, what is that assessment based on?

Apple is committed to protecting children throughout our ecosystem, wherever our products are used, and we continue to support innovation in this area. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image-matching technology to help find and report child exploitation. Much like email spam filters, our systems use electronic signatures to identify suspected child exploitation. We validate each match with individual review. Accounts with child-exploitation content violate our terms of service, and any accounts we find with this material will be disabled. - Apple

So, Apple has confirmed that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not end-to-end encrypted, so scanning attachments as mail passes through Apple's servers is a trivial task.
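
To illustrate the kind of signature matching Apple's statement describes, here is a minimal Python sketch that checks mail attachments against a set of known-bad digests. All names here are hypothetical, and the use of exact SHA-256 hashing is an assumption for simplicity; Apple has not disclosed its actual mechanism, and production systems generally rely on perceptual hashes so that resized or re-encoded copies of an image still match.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known abusive images. In a
# real deployment this list would come from a clearinghouse such as NCMEC;
# the hash format Apple actually uses has not been disclosed.
KNOWN_BAD_DIGESTS = {
    "9f2b5c0000000000000000000000000000000000000000000000000000000000",  # placeholder, not a real digest
}

def attachment_matches(data: bytes) -> bool:
    """Return True if this attachment's digest is on the blocklist."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_DIGESTS

def scan_message(attachments: list[bytes]) -> list[int]:
    """Return the indices of attachments that match.

    In the system Apple describes, a match would be escalated to
    individual human review rather than triggering automatic action.
    """
    return [i for i, data in enumerate(attachments) if attachment_matches(data)]
```

Note that an exact digest breaks as soon as an image is recompressed or resized, which is why systems such as Microsoft's PhotoDNA, and Apple's later NeuralHash proposal for iCloud Photos, use perceptual hashing instead; the sketch only shows the overall match-then-review flow.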
Apple also indicated that it performs limited scanning of certain other data, without specifying what that is. It did note, however, that this "other data" does not include iCloud backups or iCloud Photos.
