Apple To Search Users’ iPhones For Illegal Photos
Apple famously touts itself as the pro-privacy tech company, but plans to roll out new surveillance software that some experts find deeply problematic.
Specifically, Apple will install a program on users’ iPhones and iPads that will scan users’ libraries for photos that violate child pornography laws. The company confirmed its plans Thursday afternoon, after news of them leaked on Twitter.
At launch, the software will only examine photos in local storage that users have also uploaded to iCloud. The company will compare those photos to hashed versions of known illegal images and, if it finds matches, will disable users’ accounts and notify the National Center for Missing and Exploited Children.
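To illustrate the matching step described above: Apple has not published the implementation details of its system, but the basic idea of comparing photo hashes against a database of hashes of known images can be sketched as follows. Everything here is an invented, simplified stand-in (including the use of SHA-256, where a real system would use a perceptual hash), not Apple’s actual code.

```python
# Hypothetical sketch of hash-database matching. All names and data
# are invented for illustration; Apple's actual scheme is proprietary.

import hashlib

# Stand-in for the database of hashes of known illegal images
# (supplied, in Apple's case, by child-safety organizations).
KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"placeholder bytes for a known image").hexdigest(),
}

def photo_hash(photo_bytes: bytes) -> str:
    """Hash a photo's raw bytes. A production system would use a
    perceptual hash rather than SHA-256, so that resized or
    re-encoded copies of an image still match."""
    return hashlib.sha256(photo_bytes).hexdigest()

def flag_if_known(photo_bytes: bytes) -> bool:
    """Return True when the photo's hash appears in the known set."""
    return photo_hash(photo_bytes) in KNOWN_IMAGE_HASHES
```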
Apple — which prohibits advertisers from tracking mobile users without their consent — isn’t asking consumers for permission to install this software.
The company argues that the move provides “significant privacy benefits,” because it “only learns about users’ photos if they have a collection of known [child sexual abuse material] in their iCloud Photos account.”
Others disagree.
Matthew Green, the security researcher and Johns Hopkins professor who first reported news of Apple’s plans, argued on Twitter that the move has troubling implications for civil rights.
“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal,” Green said in a Twitter post. “In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content. That’s the message they’re sending to governments, competing services, China, you.”
Green adds that the types of “prohibited content” Apple could scan for aren’t limited to images of children.
“The way this will work is that your phone will download a database of ‘fingerprints’ for all of the bad images (child porn, terrorist recruitment videos etc.) It will check each image on your phone for matches. The fingerprints are ‘imprecise’ so they can catch close matches,” he said in a tweet.
“Whoever controls this list can search for whatever content they want on your phone, and you don’t really have any way to know what’s on that list because it’s invisible to you,” he adds.
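The “imprecise” fingerprints Green describes correspond to perceptual hashing, where matching tolerates small differences between images. A minimal sketch of that idea, assuming 64-bit fingerprints compared by Hamming distance with an arbitrary threshold (both parameters invented here, not Apple’s published design):

```python
# Sketch of "imprecise" fingerprint matching as Green describes it:
# fingerprints are compared by Hamming distance, so a slightly
# altered copy of an image can still match. The threshold below is
# an illustrative assumption only.

MATCH_THRESHOLD = 5  # max differing bits still counted as a match

def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions on which two fingerprints differ."""
    return bin(a ^ b).count("1")

def is_close_match(photo_fp: int, database_fps: list) -> bool:
    """True if the photo's fingerprint falls within the threshold of
    any entry on the (opaque, externally supplied) list."""
    return any(hamming_distance(photo_fp, fp) <= MATCH_THRESHOLD
               for fp in database_fps)

# Example: a fingerprint differing in only two bits still matches.
db = [0b1011_0011_0100_1010]
assert is_close_match(0b1011_0011_0100_1001, db)
```

As the sketch suggests, the phone never needs to see the original images behind the fingerprints, which is precisely Green’s point: whoever supplies the list controls what gets matched.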
Security expert Alec Muffett raised similar concerns.
“Absolutely the most important story that broke overnight is that Apple apparently intend to launch a feature enabling image detection & government surveillance on everyone’s iPhones in the name of child protection,” he tweeted Thursday morning.
The digital rights group Electronic Frontier Foundation also weighed in against Apple’s move.
“Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy,” the group wrote Thursday afternoon.
“Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement,” the organization adds.