After WhatsApp, Fortnite maker Epic's CEO slams Apple's iPhone child sexual abuse photo-scanning move, calls it 'spyware'

Apple has revealed that it may scan photos on users' iPhones for child sexual abuse imagery as they are uploaded to iCloud, in an attempt to stamp out this horrific online crime. While addressing the issue has been on the agenda of every major tech company, this particular move by Apple has been blasted by others in the industry over the privacy risks it poses to users. The first to react was WhatsApp head Will Cathcart, followed by Fortnite maker Epic Games' CEO Tim Sweeney.

What has Apple done? On Thursday, Apple announced a set of tools aimed at curbing the spread of child sexual abuse material (CSAM). These include changes to iMessage, Siri and Search, along with a system that scans iCloud Photos for known CSAM imagery, all intended to help protect children online.
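How does scanning for "known" imagery work? Conceptually, such systems compare a fingerprint (hash) of each photo against a database of hashes of previously identified abuse images, rather than trying to judge what a new photo depicts. The Python sketch below is a minimal, hypothetical illustration of that hash-matching idea, not Apple's actual implementation; the average_hash function, the KNOWN_HASHES database and the matching logic are all assumptions made purely for illustration.

# Conceptual sketch only (not Apple's actual system): illustrates matching a
# photo's fingerprint against a database of hashes of previously identified
# images, rather than analysing what a new photo depicts.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    # Simple 64-bit perceptual "average hash": shrink to 8x8 greyscale and
    # record which pixels are brighter than the mean.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of hashes of already-identified abuse images,
# supplied by child-safety organisations.
KNOWN_HASHES: set[int] = set()

def is_known_image(path: str) -> bool:
    # A photo is flagged only if its hash matches an entry in the database.
    return average_hash(path) in KNOWN_HASHES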
