Last week, Apple announced that it would begin analyzing images on its devices before they are uploaded to the cloud in order to identify child pornography and report it to the authorities, sending privacy advocates through the roof. Edward Snowden responded:

> No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.
>
> They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk
>
> — Edward Snowden (@Snowden) August 6, 2021

Apple defended the decision, claiming there is a ‘1 in 1 trillion chance of false positives,’ and described how the system works:
> “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of […]”
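Apple’s description amounts to perceptual-hash matching: each photo is reduced to a compact fingerprint that is compared against a database of known hashes before upload. The sketch below illustrates only that matching idea. It substitutes a toy average-hash for Apple’s NeuralHash, omits the cryptographic blinding and safety-voucher machinery Apple describes, and the `KNOWN_HASHES` set and `matches_known_image` function are hypothetical names for illustration, not Apple’s API.

```python
from PIL import Image  # Pillow; any image library would work here


def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: downscale to grayscale, threshold at the mean.

    A stand-in for Apple's NeuralHash, which is a neural-network-based
    perceptual hash; only the matching concept is the same.
    """
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits


# Hypothetical database of known-image hashes. In the system Apple describes,
# this set is transformed into an unreadable (blinded) form on the device.
KNOWN_HASHES: set = set()


def matches_known_image(path: str) -> bool:
    """On-device check performed before a photo is uploaded."""
    return average_hash(path) in KNOWN_HASHES
```

Unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce similar fingerprints, which is why resized or re-encoded copies of a known image can still match.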