Should Apple be allowed to rummage through the photos on your phone?
What’s to stop governments around the world from demanding that Apple look for and hand over images related to political dissent and pro-democracy movements?
Whenever new useful personal tech comes along, someone seems to find ways to turn it against its users.
Email inboxes groan under the weight of spam. Robocalls bedevil phone owners. Hackers make off with personal data stored on computers.
Now Apple is invading another hitherto sacrosanct area — photo archives on iPhones and other devices that back up to the cloud.
Earlier this month, Apple announced it would begin scanning the photos users keep on their iPhones, images those users quite naturally assumed were personal and private unless they chose to share them. Apple also said it will flag sexually explicit photos sent or received by children under 13 through its Messages app, with an option to notify parents of what it finds.
Apple’s explanation for the high-tech snooping: It is looking for illicit child sexual abuse material, which often hides on encrypted platforms.
It’s hard to argue against putting the clamps on images of child abuse. And Apple says other companies scan photos on the cloud with fewer protections. Apple also may believe that its new system, which required an immense amount of engineering, is the only way to keep government from demanding the encryption keys to every device.
But now that Apple has this new tool, how long will it be before governments or other interests insist on using it to look for other material? In 2014, the U.S. Supreme Court recognized the importance of smartphone privacy by ruling that police cannot search through phones without a warrant. Now Apple is showing that tech companies can search those same phones without a warrant, any time a company decides to change its policies.
With its new cryptographic tools, Apple will compare all photos on iCloud-enabled devices (think iPads, iPhones, Macs) with databases of known child sexual abuse images flagged by the National Center for Missing and Exploited Children. It will do so by comparing “hashes,” digital fingerprints derived from each image. If Apple finds matches, it may notify law enforcement.
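For readers curious about the mechanics, the matching idea can be sketched in a few lines of code. This is a deliberately simplified illustration using an ordinary cryptographic hash; Apple’s actual system is understood to use a perceptual “NeuralHash” and a private set intersection protocol, neither of which is reproduced here, and all names and data below are made up for illustration.

```python
import hashlib

def image_fingerprint(data: bytes) -> str:
    """Derive a fixed-length fingerprint ("hash") from raw image bytes.

    A toy stand-in: plain SHA-256, not Apple's perceptual NeuralHash.
    """
    return hashlib.sha256(data).hexdigest()

# A hypothetical database of fingerprints for known flagged images.
flagged_fingerprints = {image_fingerprint(b"known-flagged-image-bytes")}

def matches_database(data: bytes) -> bool:
    """Return True if this image's fingerprint appears in the flagged set."""
    return image_fingerprint(data) in flagged_fingerprints

print(matches_database(b"known-flagged-image-bytes"))   # True
print(matches_database(b"an-ordinary-vacation-photo"))  # False
```

The key point for the privacy debate: the device never needs to “see” the flagged images themselves, only their fingerprints, and the same lookup mechanism works identically no matter what kind of content the database happens to contain.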
Big tech companies, such as Google, Microsoft and Facebook, already scan email and messages sent through their systems. What’s different is that Apple is now prying into the phones of its billion-plus users around the world to examine material that has never been sent anywhere. It would be a small step for Apple, under pressure, to start checking phone photo libraries for any other types of messages or images a particular government doesn’t like.
If Apple can look for evidence of child sexual abuse material, governments around the world might demand that it look for content related to political dissent, pro-democracy movements or national security. Who knows what snoops might enter through this new backdoor into your iPhone content?
Device owners don’t have to back up to the cloud, but then they risk losing all their photos and other personal information if a device is lost or goes belly up.
We’ve already learned hackers can turn smartphones against us by secretly switching on cameras and microphones and monitoring us. Phones track our locations and other personal information. Now, even a phone that appears quiescent in a pocket or purse might be rummaged through by Apple if it is backed up to the cloud. Privacy advocates worry this could launch a new trend of privacy violations.
Trying to clamp down on misuse of computers and phones is laudable. And apprehending child abusers is an obviously worthy goal. But Washington and Big Tech should start putting more value on guarding personal data. Individuals’ privacy should not be put at risk.
Send letters to firstname.lastname@example.org.