As Apple and privacy groups debate the pros and cons of the company's new policy to scan its devices and cloud storage for known images of child sexual abuse material (CSAM), a report from 9to5Mac has now confirmed that the Cupertino-based company has been scanning email for over two years.
A few weeks ago, we reported how an upcoming update will enable Apple to scan iPhones for CSAM images. In its bid to clarify how exactly the technology would work, Apple said that it would not scan actual images but only their hashes – digital fingerprints – and compare them against the hashes of known CSAM. It also clarified that a manual review would be conducted if a match was found, and that no automated reporting was involved at any step that could erroneously flag a user and suspend their account.
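The hash-matching idea Apple describes can be sketched in a few lines of Python. This is only an illustration: Apple's actual system uses a perceptual "NeuralHash" designed to survive resizing and re-compression, plus cryptographic protocols to keep the comparison private, whereas this sketch uses a plain SHA-256 lookup against a hypothetical set of known fingerprints (the sample entry below is simply the SHA-256 digest of the bytes `b"foo"`).

```python
import hashlib

# Hypothetical database of known fingerprints. In Apple's system these
# would come from child-safety organizations; here the single entry is
# just SHA-256(b"foo") for demonstration purposes.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest standing in for an image fingerprint.

    A real scanner would use a perceptual hash rather than a
    cryptographic one, so near-identical images also match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_hash(image_bytes: bytes) -> bool:
    """Flag a file only if its fingerprint is in the known set.

    Note that the image content itself is never interpreted --
    only the digest is compared.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES

if __name__ == "__main__":
    print(matches_known_hash(b"foo"))            # True: digest is in the set
    print(matches_known_hash(b"holiday photo"))  # False: unknown digest
```

The key property Apple emphasized follows from the design: an unmatched file contributes nothing but a non-matching digest, so ordinary photos are never "seen" by the system.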
However, the clarification did not placate privacy groups, who continue to stress that the technology could easily be modified to track dissenters and serve the interests of those in power, leaving very little scope for individual privacy. An international coalition of privacy advocates has written an open letter to CEO Tim Cook, asking him not to roll out these updates, which could potentially be misused and could also impact children's rights. The revelation in this new report will only infuriate them further.
The scanning came to light thanks to the lawsuit filed against Apple by Epic Games over the 30 percent cut the App Store takes on in-app purchases. The Verge published 107 highlights from the internal data that Apple submitted during the trial. In an internal iMessage thread, Apple's anti-fraud chief, Eric Friedman, had stated that the company was "the greatest platform for distributing child porn." This caught the attention of 9to5Mac writer Ben Lovejoy, who wondered how the company would know this unless it was scanning photos in its cloud storage.
Apple responded stating that it had never scanned iCloud Photos but has been scanning all iCloud Mail attachments for CSAM since 2019. Unlike modern messaging platforms, iCloud Mail does not promise end-to-end encryption, so scanning it is straightforward. Interestingly, Forbes had made a similar revelation in February 2020, but since Apple wasn't planning to scan iCloud at the time, it did not make much noise.
In the private messages, Friedman also says that Facebook does a better job at reporting CSAM and suspending accounts. This suggests that the social media company scans all images on its servers, while Apple has so far been scanning only its emails. Lovejoy also remarks that Apple reports only a few hundred cases of CSAM every year, which implies that most of the CSAM on its servers sits in unscanned photos. Unsurprisingly, Apple now plans to scan photos as well.
It remains to be seen whether the company can roll out additional features that balance the concerns of privacy advocates while still reporting CSAM. We have also reached out to Apple to ask whether this affects users of the Mail app on their Apple devices and will update the story if or when we receive a response.