Read what Apple has to say about not scanning iCloud Photos
iPhone maker Apple announced plans to roll out three new child safety features in August 2021. These included a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and expanded guidance in Siri offering resources related to child exploitation.
In December 2021, Apple launched the Communication Safety feature in the US with iOS 15.2. The feature was later expanded to other regions, including the UK, Canada, Australia, and New Zealand. The company also made the Siri resources available, but the CSAM detection feature was never rolled out.