Apple Delays Plan To Snoop on Users’ Pictures
A controversial plan by Apple to scan users’ photos for the possible presence of child pornography has been shelved (for now) after tech privacy advocates warned of a future of oppressive government surveillance.
In early August, Apple announced it was rolling out a tool that would scan everybody's iPhone and iPad photos as they were uploaded to iCloud storage. Apple promised that no human would actually look at the photos unless an automated system, which compares hashes of a user's images against a database of known child pornography, flagged them as matches. If the system found matches, Apple would disable the account and send a report to the National Center for Missing and Exploited Children, a quasi-private nonprofit created and primarily funded by Congress.
The announcement was greeted with shock and dismay by tech privacy and security organizations like the Electronic Frontier Foundation (EFF). One massive problem is that there's no reason to believe that, once implemented, this system would remain confined to child pornography. A system built to detect child pornography could just as easily be repurposed to detect other types of images if a government pressured or ordered Apple to do so. Authoritarian leaders could use this technology to figure out who is transmitting images they don't approve of, or images associated with protests and activism. Will Chinese authorities demand to know who has images of Winnie the Pooh on their phones, given President Xi Jinping's apparent sensitivity to the comparison?
Apple had initially planned to implement the scanning system in updates to iOS 15 later this year. On Friday, however, the company announced it would take "additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Article from Latest – Reason.com