What Default Phone Encryption Really Means For Law Enforcement

Contrary to what some law enforcement officials are suggesting, default encryption now on mobile operating systems won't prevent them from solving crimes.
October 8, 2014, 6:50pm
Photo by Karlis Dambrans

After Apple announced it was expanding the scope of what types of data would be encrypted on devices running iOS 8, the law enforcement community was set ablaze with indignation. When Google followed suit and announced that Android L would also come with encryption on by default, it added fuel to the fire.

Law enforcement officials have angrily decried Apple and Google's decisions, using all sorts of arguments against the idea of default encryption (including the classic "Think of the children!" line of reasoning). One former NSA and Department of Homeland Security official even suggested that because China might forbid Apple from selling a device with default encryption, the US should forbid Apple from doing so here.

A former high-ranking American security official claiming the US should match China in restricting the use of privacy-enhancing technology is disconcerting, to put it mildly.

Apple's decision, first and foremost, is about protecting the security of its customers. Before this change, if your iPhone was stolen or lost, a criminal could break into it with relative ease, accessing your private information through the same backdoor that law enforcement used. Now that Apple has sealed that backdoor, you no longer have to worry. In an era when our mobile devices contain extremely private information, manufacturers have listened to their customers and made mobile security as strong as they know how, bringing it in line with laptop and desktop security.

The common misconception amid the hysteria is that this decision will put vital evidence outside the reach of law enforcement. But nothing in this encryption change will stop law enforcement from seeking a warrant for the contents of a phone, just as they seek warrants for the contents of a laptop or desktop computer. Whether or not a person can be required to unlock the device is a complicated question — intertwined with the right of due process and the right to avoid self-incrimination — that ought to be carefully considered by a court in the context of each individual situation.

It's also important to note that the amount of information available to law enforcement about someone's electronic communications, movements, transactions, and relationships is staggering, even if they never analyze a suspect's mobile device. Law enforcement can still seek a phone company's call records, any text messages stored by the phone company, and a suspect's email accounts or any other information stored in the cloud — which for many people these days is a lot, as several Hollywood actresses recently learned. Some of those investigative tools have insufficient protections and go too far, and turning on encryption on mobile devices by default doesn't change that.

Unfortunately, that hasn't stopped law enforcement from twisting the nature of the Apple and Google announcements in order to convince the public that default encryption on mobile devices will bring about a doomsday scenario of criminals using "technological fortresses" to hide from the law. And sadly, some people seem to be buying this propaganda. Last week, the Washington Post published an editorial calling for Apple and Google to use "their wizardry" to "invent a kind of secure golden key they would retain and use only when a court has approved a search warrant."

While the Post's Editorial Board may think technologists can bend cryptography to their every whim, it isn't so. Cryptography is about math, and math is made up of fundamental laws that nobody, not even the geniuses at Apple and Google, can break. One of those laws is that any key, even a golden one, can be stolen by ne'er-do-wells. Simply put, there is no such thing as a key that only law enforcement can use — any key creates a new backdoor that becomes a target for criminals, industrial spies, or foreign adversaries.

So the next time a law enforcement official demands that Apple and Google put backdoors back into their products, remember what they're really demanding: that everyone's security be sacrificed in order to make their jobs marginally easier. Given that decreased security is only one of several problems raised by the prospect of cryptography regulation, you should ask yourself: Is that trade worth making?

Cindy Cohn is the legal director and general counsel of the Electronic Frontier Foundation (EFF), Jeremy Gillula is a staff technologist at EFF, and Seth Schoen is a senior staff technologist at EFF.

_Photo via Flickr_