Apple to check iPhones for child abuse pics; a ‘backdoor’, claim digital privacy bodies
Apple is rolling out a two-pronged mechanism that scans photographs on its devices to check for content that could be classified as Child Sexual Abuse Material (CSAM). While the move is being welcomed by child protection agencies, advocates of digital privacy and industry peers are raising red flags, suggesting the technology could have broad ramifications for user privacy.
As part of the mechanism, Apple’s tool neuralMatch will check photos before they are uploaded to iCloud — its cloud storage service — and examine the content of messages sent on its end-to-end encrypted iMessage app. “The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple,” the company said.
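In broad strokes, the iCloud-side check works by comparing a fingerprint of each photo against a database of known flagged images before upload. The sketch below is purely illustrative of that idea, assuming a precomputed perceptual hash and hypothetical type names (`PhotoFingerprint`, `UploadScreener`); it is not Apple’s actual neuralMatch implementation.

```swift
import Foundation

// Illustrative sketch only: a simplified stand-in for on-device matching of
// image fingerprints against a list of known hashes before a cloud upload.
// The hash format, threshold, and type names are hypothetical.

struct PhotoFingerprint: Hashable {
    let bytes: [UInt8]   // perceptual-hash digest of an image, assumed precomputed
}

struct UploadScreener {
    let knownHashes: Set<PhotoFingerprint>   // fingerprints of known flagged images
    let reportThreshold: Int                 // matches required before anything is flagged

    /// Counts how many photos in the batch match the known-hash set.
    func matchCount(for photos: [PhotoFingerprint]) -> Int {
        photos.filter { knownHashes.contains($0) }.count
    }

    /// A batch is flagged for review only if the match count reaches the threshold;
    /// otherwise the upload proceeds untouched.
    func shouldFlag(_ photos: [PhotoFingerprint]) -> Bool {
        matchCount(for: photos) >= reportThreshold
    }
}

// Usage: screen a batch of fingerprints before uploading to cloud storage.
let known = Set([PhotoFingerprint(bytes: [0xAB, 0xCD])])
let screener = UploadScreener(knownHashes: known, reportThreshold: 1)
let batch = [PhotoFingerprint(bytes: [0x01, 0x02]), PhotoFingerprint(bytes: [0xAB, 0xCD])]
print(screener.shouldFlag(batch) ? "flag for review" : "upload proceeds")
```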