The tool designed to detect known images of child sexual abuse, called "NeuralHash," will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.

Parents snapping innocent photos of a child in the bath presumably need not worry: the detection system will only flag images that are already in the center's database of known child pornography. But researchers say the matching tool - which doesn't "see" such images, just mathematical "fingerprints" that represent them - could be put to more nefarious purposes.

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which has also alarmed privacy advocates.
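The matching logic described above - comparing a fingerprint of an image against a database of known fingerprints, without ever inspecting the image content itself - can be illustrated with a minimal sketch. This is not Apple's actual NeuralHash (which is a perceptual hash designed so that visually similar images collide); here a cryptographic hash stands in as a placeholder, and all function and database names are hypothetical:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash. A real perceptual
    # hash maps visually similar images to the same value; SHA-256 only
    # matches exact byte-for-byte duplicates, so this is illustrative.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_for_review(image_bytes: bytes, known_db: set) -> bool:
    # The matcher sees only the fingerprint, never the image content.
    # Only images already present in the database of known material
    # produce a match; anything else passes through unflagged.
    return fingerprint(image_bytes) in known_db

# Hypothetical database of fingerprints of known abuse images.
known_db = {fingerprint(b"previously-catalogued-image")}

print(should_flag_for_review(b"previously-catalogued-image", known_db))  # True
print(should_flag_for_review(b"innocent-family-photo", known_db))        # False
```

A match in this scheme is only a trigger for human review, mirroring the pipeline the article describes: fingerprint match first, human confirmation second, and account action only after confirmation.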