Apple will scan U.S. iPhones for images of child abuse
C/P
Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.
The tool designed to detect known images of child sexual abuse, called "neuralMatch," will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.
Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.
The detection system will only flag images that are already in the center's database of known child pornography. Parents snapping innocent photos of a child in the bath, for example, presumably need not worry. But researchers say the matching tool — which doesn't "see" such images, just mathematical "fingerprints" that represent them — could be put to more nefarious purposes.
end c/p
August 6, 2021
The Associated Press
full article
https://www.npr.org/2021/08/06/1025402725/apple-iphone-for-child-sexual-abuse-privacy
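The article says the "neuralMatch" tool never "sees" photo content, only mathematical "fingerprints" that represent it, and flags a photo when its fingerprint matches one in the known database. Apple's actual fingerprinting scheme isn't described here, so as a rough intuition only, here is a minimal Python sketch of generic perceptual-hash matching (a simple average hash plus a Hamming-distance comparison); the file names and the distance threshold are made up for illustration and are not Apple's.

```python
# Toy sketch of fingerprint-based image matching. This is NOT Apple's system;
# it illustrates the general idea of comparing perceptual hashes instead of
# raw image content.

from PIL import Image

HASH_SIZE = 8  # 8x8 grid -> 64-bit fingerprint


def average_hash(path: str) -> int:
    """Reduce an image to a 64-bit fingerprint.

    Shrink to an 8x8 grayscale grid, then set each bit to 1 if that pixel
    is brighter than the mean. Visually similar images tend to produce
    similar fingerprints even after resizing or re-compression.
    """
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")


def matches_known_hash(candidate: int, known_hashes: set, threshold: int = 5) -> bool:
    """Flag the candidate if it lies within `threshold` bits of any known fingerprint."""
    return any(hamming_distance(candidate, k) <= threshold for k in known_hashes)


if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    known = {average_hash("known_image.jpg")}
    print(matches_known_hash(average_hash("uploaded_photo.jpg"), known))
```

The point of a perceptual hash, as opposed to an exact cryptographic hash like SHA-256, is that a re-encoded or slightly altered copy of a known image still lands close to the original fingerprint, which is also why researchers worry the same matching machinery could be pointed at other databases of images.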