Explained: How Apple will scan for child exploitation images on devices, and why it’s raising eyebrows

Apple has announced that software updates later this year will bring new features that will “help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)”.

Expected to go live initially in the United States, the features include new technology to limit the spread of CSAM online, especially via Apple's platforms.

There will also be on-device protection for children against sending or receiving sensitive content, with mechanisms to alert parents if the user is under the age of 13. Apple will also intervene when Siri or Search is used to look up CSAM-related topics.
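
Apple's public technical summary describes the detection step as on-device matching: before a photo is uploaded to iCloud Photos, the device computes a perceptual fingerprint (a "NeuralHash") and compares it against a database of fingerprints of known CSAM supplied by child-safety organisations such as NCMEC. The Swift sketch below illustrates only that matching idea. The function names are hypothetical, SHA-256 stands in for Apple's unreleased perceptual hash, and the real system adds cryptographic layers (safety vouchers and a match threshold) not shown here.

    import Foundation
    import CryptoKit

    // Placeholder loader: in Apple's design the fingerprint list comes
    // from child-safety organisations such as NCMEC and ships with the
    // OS. (Function names here are illustrative, not Apple's API.)
    func loadKnownHashDatabase() -> Set<Data> {
        return []
    }

    // Stand-in for a perceptual hash such as Apple's NeuralHash, which
    // is designed so that resized or re-encoded copies of a photo map
    // to the same fingerprint. SHA-256 is used only to keep this sketch
    // runnable; a cryptographic hash would not survive re-encoding.
    func fingerprint(of imageBytes: Data) -> Data {
        Data(SHA256.hash(data: imageBytes))
    }

    let knownHashes = loadKnownHashDatabase()

    // Check a photo before upload: the comparison happens on the
    // device, and only a match (never the photo itself) is flagged.
    func matchesKnownFingerprint(_ imageBytes: Data) -> Bool {
        knownHashes.contains(fingerprint(of: imageBytes))
    }

The design choice worth noting is that the comparison runs entirely on the device rather than on Apple's servers, and that is exactly what has raised eyebrows: the scanning infrastructure lives on the user's own phone.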
