Apple to scan all iPhones for "child abuse" images

Apple will scan all iPhones and other Apple devices for images of child sexual abuse.

According to recent reports by major news outlets, Apple will scan all images on iPhones and other Apple devices using its new neuralMatch tool.

The neuralMatch tool will detect images of "child sexual abuse" before they're uploaded to iCloud. This will include all:

  • iPhones
  • iPads
  • Other Apple devices

If the neuralMatch program flags a suspicious image, the picture will be reviewed by an Apple staff member.

If the content is deemed child abuse, Apple will notify the police as well as the National Center for Missing and Exploited Children.
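
For anyone wondering how this sort of on-device scanning generally works: the reports describe neuralMatch comparing photos against a database of hashes of known abuse imagery before upload. Apple hasn't published the internals, so this is just a minimal Python sketch of the general hash-matching idea; the database, the folder name, and the use of SHA-256 in place of a real perceptual hash are all placeholder assumptions of mine, not Apple's actual design.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse imagery, e.g. one
# distributed by NCMEC. A real system would use perceptual hashes that
# survive resizing/recompression; SHA-256 is shown here only for simplicity.
KNOWN_IMAGE_HASHES: set[str] = set()

def image_hash(path: Path) -> str:
    """Stand-in for a perceptual hash function."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_before_upload(path: Path) -> bool:
    """True if the image matches the database and should be held back
    for human review instead of being uploaded."""
    return image_hash(path) in KNOWN_IMAGE_HASHES

# Scan a (hypothetical) camera roll before syncing to cloud storage.
for photo in Path("camera_roll").glob("*.jpg"):
    if flag_before_upload(photo):
        # Per the reports, a flagged match goes to a human reviewer,
        # who decides whether to report it.
        print(f"Flagged for human review: {photo}")
    else:
        print(f"Cleared for upload: {photo}")
```

The key point of a scheme like this is that it only matches against *known* images in the database; it isn't (at least as described) classifying brand-new photos from scratch.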

Now, I'm all for cracking down on pedophiles, but for a company that claims to be privacy-friendly, what's going on here? :think:

Will you be happy with Apple snooping through your personal and private pictures?

More information on the neuralMatch tool can be found in the attachment below.