Apple’s Privacy Mythology Doesn’t Match Reality

This year, Apple has cast itself as the world’s superhero of privacy. Its leadership insists, “Privacy has been central to our work … from the very beginning” and that it’s a “fundamental human right.” Its new advertising even boasts that privacy and the iPhone are the same thing.

This past spring, Apple rolled out a software update (iOS 14.5) that empowers users to say no to apps surveilling their activity across the internet. The rollout demonstrated something important: People choose privacy when they don’t have to struggle for control over their information. Now only 25 percent of users consent to tracking; before, when tracking was the default, nearly 75 percent consented by omission to having their information fuel targeted advertising. As Apple prepares to add more privacy protections in iOS 15, due out next month, it continues to brand itself as a force potentially capable of slowing growth at Facebook, a paragon of surveillance capitalism. Unfortunately, Apple’s privacy promises don’t show the full picture.

The company’s most alarming privacy failing may also be one of its most profitable: iCloud. For years, the cloud-based storage service has further entrenched hundreds of millions of Apple customers in its ecosystem. iCloud is an internet-enabled extension of your hard drive, designed for effortlessly offloading photos, movies, and other files to an unseen backup drive. Unfortunately, iCloud makes it nearly as easy for the police to access all of those files.

In the past, Apple has been adamant that it won’t weaken the security of its own devices to build in a back door. But for older devices, the door is already built. According to Apple’s law enforcement manual, anyone running iOS 7 or earlier is out of luck if they fall into the crosshairs of the police or ICE: With a simple warrant, Apple will unlock the phone. It can do so because of security vulnerabilities that were only addressed in later versions of the operating system. This may seem par for the course in Silicon Valley, but most tech giants’ CEOs haven’t previously proclaimed that warrants for their devices endanger “the data security of hundreds of millions of law-abiding people … setting a dangerous precedent that threatens everyone’s civil liberties.”

Since 2015, Apple has drawn the ire of the FBI and the Justice Department with each new round of security enhancements, ultimately building a device that’s too safe for even Apple to crack. But the dirty little secret of nearly all of Apple’s privacy promises is that there has been a backdoor all along. Whether it’s iPhone data from Apple’s latest devices or the iMessage data the company has constantly championed as “end-to-end encrypted,” all of it becomes vulnerable when backed up to iCloud.

Apple’s seemingly simple design choice to hold onto the iCloud encryption keys has complex consequences. It doesn’t do this with your iPhone (despite government pleas). It doesn’t do this with iMessage. Some benefits of making an exception for iCloud are clear: If Apple didn’t hold the keys, account holders who forgot their password would be out of luck, since with truly secure cloud storage the company would be no more able to reset your password than a random attacker would. But retaining that power also allows Apple to hand over your entire iCloud backup when ordered.

iCloud data goes beyond photos and files; it also includes location data, such as from Find My or AirTags, Apple’s controversial new tracking devices. With a single court order, all of your Apple devices could be turned against you and made into a weaponized surveillance system. Apple could fix this, of course. Plenty of companies offer secure file-sharing platforms. The Swiss firm Tresorit provides true “end-to-end encryption” for its cloud service: Users still see their files uploaded to the cloud in real time and synced across multiple devices, but the encryption keys are held by the users, not by Tresorit. This does mean that users who forget their password lose their files as well. But as long as providers have the power to recover or change passwords, they have the power to hand that information to the police.
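To make the distinction concrete, here is a minimal sketch of the user-held-key model in Python. It is not Tresorit’s actual protocol, and the function names, the PBKDF2 key derivation, and the Fernet cipher are illustrative assumptions. The architectural point is that the key is derived from the passphrase on the user’s own device, so the provider stores only ciphertext it can neither read nor reset:

```python
import os
from base64 import urlsafe_b64encode

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Key derivation happens on the client, so the provider never sees
    # the key and therefore can neither reset a forgotten passphrase
    # nor decrypt the stored files.
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=600_000)
    return urlsafe_b64encode(kdf.derive(passphrase.encode()))


def encrypt_for_upload(path: str, passphrase: str) -> tuple[bytes, bytes]:
    # Encrypt locally; only the salt and the ciphertext leave the device.
    salt = os.urandom(16)
    with open(path, "rb") as f:
        token = Fernet(derive_key(passphrase, salt)).encrypt(f.read())
    return salt, token  # the provider stores both but can decrypt neither


def decrypt_after_download(salt: bytes, token: bytes, passphrase: str) -> bytes:
    # Decryption works only with the original passphrase; a provider
    # served with a warrant could hand over nothing but ciphertext.
    return Fernet(derive_key(passphrase, salt)).decrypt(token)
```

The trade-off described above falls directly out of this design: Lose the passphrase and the key can never be re-derived, so the files are gone for good. Apple chose the opposite trade-off for iCloud.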

The threat is only growing. Under a new suite of content moderation tools, Apple will scan iCloud uploads and iMessage communications for suspected child sexual abuse material (CSAM). While the company once searched only photos uploaded to iCloud for suspected CSAM, the new tools can turn any photo or text you’ve sent or received against you. Thwarting CSAM is a noble goal, but the consequences could be disastrous for those wrongly accused when the AI fails. And even when the software works as intended, it could be deadly. As Harvard Law School instructor Kendra Albert noted on Twitter, these “features are going to get queer kids kicked out of their homes, beaten, or worse.” Software launched in the name of “child safety” could be a deadly threat to LGBTQ+ children outed to homophobic and transphobic parents. Just as chilling, the tools used to track CSAM today could easily be retargeted to flag political and religious content tomorrow.
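Why are false positives baked in? Systems like this rest on perceptual hashing: image fingerprints designed so that a resized or re-encoded copy of a known photo still matches. Apple’s actual pipeline (its NeuralHash function plus a private-set-intersection protocol) is far more elaborate and not public in code form, so the sketch below uses the open-source imagehash library purely to illustrate the matching idea; the threshold value is an assumption:

```python
from PIL import Image
import imagehash

# Maximum Hamming distance still treated as a "match". The value is an
# assumption, and it is exactly the knob where false positives creep in.
MATCH_THRESHOLD = 5


def is_flagged(photo_path: str, known_hashes: list[imagehash.ImageHash]) -> bool:
    # Unlike a cryptographic hash, a perceptual hash changes very little
    # when an image is resized or re-encoded. That robustness is the
    # point, but it is also why an unrelated photo can land within the
    # threshold of a blocklisted one.
    photo_hash = imagehash.phash(Image.open(photo_path))
    return any(photo_hash - known < MATCH_THRESHOLD for known in known_hashes)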