Apple Faces Growing Criticism Over Smartphone Scanning Plan
Apple’s plan to scan customers’ smartphones for contraband continues to draw criticism from a broad array of groups, ranging from the international digital rights organization Access Now to the editorial board of the Los Angeles Times to the New York Public Library.
“Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases,” states an online petition signed by the New York Public Library and other groups, as well as thousands of security experts.
The signatories are urging Apple to reverse course.
Last week, Apple unveiled several new measures aimed at combating child sex abuse. One of those measures involves scanning children’s iMessage accounts for nude photos, but parents can turn that feature off.
Another piece of Apple’s plan involves comparing images that users attempt to upload to iCloud with a database of known photos depicting the sexual abuse of children.
To accomplish this, Apple will download hashed fingerprints of photos in the database to users’ devices, then scan for matches among photos users want to place in iCloud. Critically, the scan will occur on users’ devices — not on Apple’s servers.
If the software finds a match, Apple will manually review the photos and, if it determines the photos are illegal, will notify the National Center for Missing and Exploited Children, which will alert the authorities.
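The matching process described above can be illustrated with a deliberately simplified sketch. This is not Apple’s implementation: the real system uses a perceptual “NeuralHash” and cryptographic protocols so that neither side learns about non-matches, whereas here an ordinary SHA-256 digest stands in for the fingerprint, and the function names are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: a plain SHA-256 digest.
    # (A real perceptual hash also matches visually similar images.)
    return hashlib.sha256(image_bytes).hexdigest()

def find_matches(upload_queue: list[bytes], known_hashes: set[str]) -> list[bytes]:
    # Compare each photo queued for cloud upload against the
    # downloaded database of known fingerprints. Crucially, this
    # comparison runs on the user's device, not on a server.
    return [img for img in upload_queue if fingerprint(img) in known_hashes]

# Hypothetical example: one queued photo matches the database.
database = {fingerprint(b"known-flagged-image")}
queue = [b"vacation-photo", b"known-flagged-image"]
matches = find_matches(queue, database)  # only the known image is flagged
```

In the sketch, only exact byte-for-byte copies match; the point of a perceptual hash in the real system is to also catch resized or re-encoded copies of the same image.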
While some tech companies have long scanned material in the cloud for child sex-abuse imagery, those scans take place on the companies’ servers — not users’ property.
With this technology, Apple will be the first major tech company to search for contraband on users’ devices. What’s more, there is no technological reason why Apple couldn’t use this technology to search for material other than child sex-abuse images.
A growing chorus of observers is raising alarms over that prospect.
“While Apple may mean well, it’s not hard to imagine how this safety feature could lead to even greater privacy incursions in the future,” legal scholar Tiffany Li writes for MSNBC.com. “Today, the justification is child safety. Tomorrow, the justification might be counterterrorism or public health or national security. When we begin giving up our digital rights, it is hard to turn the clock back and bring back past protections.”
“While Apple has vowed to use this technology to search only for child sexual abuse material, and only if your photos are uploaded to iCloud Photos, nothing in principle prevents this sort of technology from being used for other purposes and without your consent,” Alex Stamos, former chief security officer at Facebook, and Matthew Green, the Johns Hopkins computer scientist who first revealed Apple’s plan, argue in The New York Times.
Access Now echoes those concerns, warning that Apple will “put everyone’s privacy and security at risk by … reducing individuals’ control over their own device.”
For its part, Apple insists it will refuse any demands by governments to detect images other than ones that are in a database of child sex abuse photos.
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” the company says. “We will continue to refuse them in the future.”
But there are questions about whether Apple will be able to keep that promise.
“Apple is relying on several policy choices designed to limit the reach of the new features, but policy choices can fall to government mandates to expand and repurpose the new technical capabilities,” the Center for Democracy & Technology’s Emma Llansó, director of the Free Expression Project, writes.
The Los Angeles Times issues a similar warning.
“Now that there is a door, won’t it be even easier for the government to open it even wider and to demand access to images that hint at other activities, criminal or otherwise?” the newspaper asks this week in an editorial.
“Private communication that cannot be accessed by the prying eyes and ears of governments, companies or crooks is an essential element of freedom and Apple has in the past been right to promote it. The change in direction is a very serious setback,” the newspaper adds.