Apple says feature to find child abuse images doesn’t create backdoor

Apple Inc pushed back on concerns about its upcoming child safety features, saying it doesn’t believe its tool for locating child sexual abuse images on a user’s device creates a backdoor that reduces privacy. The Cupertino, California-based technology giant made the comments in a briefing Friday, a day after revealing new features for iCloud, Messages and Siri to combat the spread of sexually explicit images of children.

The company reiterated that it doesn’t scan a device owner’s entire photo library to look for abusive images, but instead uses cryptography to compare image hashes against a database of known abusive images provided by the National Center for Missing and Exploited Children.
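Apple’s published design is considerably more elaborate than simple lookup (it relies on a perceptual “NeuralHash” and a private set intersection protocol so that non-matching photos reveal nothing), but the core idea of checking images only against a fixed list of known hashes can be sketched roughly as follows. This is a minimal, hypothetical illustration, not Apple’s implementation: the hash function, database contents, and function names here are assumptions for demonstration only.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abusive images.
# In Apple's described system, NCMEC supplies the known-image
# database, and the hash is a perceptual NeuralHash rather than
# a simple cryptographic digest as used here.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Toy stand-in for a perceptual hash: digest the raw file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """Flag an image only if its hash appears in the known database.
    Photos whose hashes are not in the database are never inspected
    or reported, which is the distinction Apple is drawing."""
    return image_hash(path) in KNOWN_HASHES
```

The design choice this sketch highlights is that matching is membership testing against a pre-existing list, not content analysis of arbitrary photos, which is the basis of Apple’s claim that the feature does not amount to general scanning.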
