Apple already scans iCloud Mail for Child Sexual Abuse Materials (CSAM)

Apple has confirmed it is already scanning iCloud Mail, but not iCloud Photos, for Child Sexual Abuse Materials

23 August 2021 | Ben Lovejoy | 9to5Mac

Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning iCloud Photos or iCloud backups.

The clarification followed my querying a rather odd statement by the company’s anti-fraud chief, Eric Friedman: that Apple was “the greatest platform for distributing child porn.” That immediately raised the question: if the company wasn’t scanning iCloud Photos, how could it know this?

There are a couple of other clues that Apple had to have been doing some kind of CSAM scanning. An archived version of Apple’s child safety page said this:

Apple is dedicated to protecting children throughout our ecosystem wherever our products are used, and we continue to support innovation in this space. We have developed robust protections at all levels of our software platform and throughout our supply chain. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.
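Apple has never published the details of that image matching technology, but the description above, electronic signatures compared against known material with each match validated by a human reviewer, maps onto a straightforward signature lookup. Here is a rough, purely illustrative sketch in Python; the signature function, the database, and every name in it are placeholders rather than anything Apple has disclosed, and a real system would likely use perceptual hashes rather than a cryptographic hash.

```python
# Illustrative sketch only: Apple has not said how its image matching works.
# This assumes a simple lookup of image signatures against a database of known
# signatures, with matches queued for the human review Apple describes.
import hashlib

# Hypothetical database of "electronic signatures" of known material,
# supplied by a child-safety organization; contents are placeholders.
KNOWN_SIGNATURES: set[str] = set()


def signature(image_bytes: bytes) -> str:
    """Stand-in signature function; a real system would likely use a
    perceptual hash rather than a cryptographic one."""
    return hashlib.sha256(image_bytes).hexdigest()


def check_image(image_bytes: bytes, review_queue: list[str]) -> bool:
    """Return True if the image matches a known signature, queueing it
    for manual validation before any account action is taken."""
    sig = signature(image_bytes)
    if sig in KNOWN_SIGNATURES:
        review_queue.append(sig)  # "We validate each match with individual review."
        return True
    return False
```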

Additionally, the company’s chief privacy officer said the same thing back in January 2020:

Jane Horvath, Apple’s chief privacy officer, said at a tech conference that the company uses screening technology to look for the illegal images. The company says it disables accounts if Apple finds evidence of child exploitation material, although it does not specify how it discovers it.

Apple wouldn’t comment on Friedman’s quote, but it did tell me that the company has never scanned iCloud Photos.

Apple scans iCloud Mail

However, Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not end-to-end encrypted, so scanning attachments as mail passes through Apple’s servers is a trivial task.
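To illustrate just how trivial, here is a minimal sketch of what server-side attachment scanning could look like, again in Python and again purely hypothetical: it assumes the server can read each message in the clear and has a list of hashes of known material to check against. None of the names here are Apple’s.

```python
# Minimal sketch of server-side mail scanning, assuming the server can read
# each message in plaintext as it passes through (i.e. no end-to-end
# encryption). Function and variable names are illustrative, not Apple's.
import hashlib
from email import message_from_bytes, policy

# Placeholder set of hashes of known CSAM attachments.
KNOWN_BAD_HASHES: set[str] = set()


def flagged_attachments(raw_message: bytes) -> list[str]:
    """Parse a raw RFC 822 message and return the filenames of attachments
    whose hashes match the known-bad list."""
    msg = message_from_bytes(raw_message, policy=policy.default)
    flagged = []
    for part in msg.iter_attachments():
        payload = part.get_payload(decode=True)  # decoded attachment bytes
        if payload is None:
            continue
        digest = hashlib.sha256(payload).hexdigest()
        if digest in KNOWN_BAD_HASHES:
            flagged.append(part.get_filename() or "<unnamed attachment>")
    return flagged
```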

Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale. It did tell me that the “other data” does not include iCloud backups.

Although Friedman’s statement sounds definitive, as though it were based on hard data, it now looks likely that it wasn’t. It’s our understanding that the total number of CSAM reports Apple files each year is measured in the hundreds, meaning that email scanning would not provide evidence of a large-scale problem on Apple’s servers.

The explanation probably lies in the fact that other cloud services were scanning photos for CSAM, and Apple wasn’t. If other services were disabling accounts for uploading CSAM, and iCloud Photos wasn’t (because the company wasn’t scanning there), then the logical inference would be that more CSAM exists on Apple’s platform than anywhere else. Friedman was probably doing nothing more than drawing that conclusion.

The controversy over Apple’s CSAM plans continues, with two Princeton academics stating that they prototyped a scanning system based on exactly the same approach as Apple’s, but abandoned the work due to the risk of governmental misuse.

HT to Jesse Hollington for the email discussion that led me to contact Apple
